The disclosure of Japanese Patent Application No. 2011-185031, which was filed on Aug. 26, 2011, is incorporated herein by reference.
1. Field of the Invention
The present invention relates to an electronic camera and in particular, relates to an electronic camera which has a function of shooting a document page.
2. Description of the Related Art
According to one example of this type of camera, a manuscript of a readout target is supported by a copy holder. A manuscript image is converted into an electric signal by an imager. An open space for setting the manuscript exists between the copy holder and the imager. A ranging sensor is placed on an upper side of the copy holder so as to measure an objective distance in a direction toward the copy holder. The manuscript image is read in response to a change of the objective distance measured by the ranging sensor. Thereby, it becomes possible to reduce a work burden for reading a plurality of manuscript images.
However, in the above-described camera, executing/suspending a page turning operation is determined based on the output of the ranging sensor arranged separately from the imager, and therefore, there is a problem in that the configuration becomes complicated.
An electronic camera according to the present invention, comprises: an imager which repeatedly outputs an image representing a scene captured on an imaging surface; a definer which executes a process of defining a document page region within the scene captured on the imaging surface, corresponding to a document page photographing mode; a searcher which searches for one or at least two characteristic images including a page edge from a partial image belonging to the document page region defined by the definer out of the image outputted from the imager; a detector which detects a termination of a page turning operation based on a search result of the searcher; and an extractor which extracts the image outputted from the imager corresponding to a detection of the detector.
According to the present invention, an imaging control program recorded on a non-transitory recording medium in order to control an electronic camera provided with an imager which outputs an image representing a scene captured on an imaging surface causes a processor of the electronic camera to perform steps comprising: a defining step of executing a process of defining a document page region within the scene captured on the imaging surface, corresponding to a document page photographing mode; a searching step of searching for one or at least two characteristic images including a page edge from a partial image belonging to the document page region defined by the defining step out of the image outputted from the imager; a detecting step of detecting a termination of a page turning operation based on a search result of the searching step; and an extracting step of extracting the image outputted from the imager corresponding to a detection of the detecting step.
According to the present invention, an imaging control method executed by an electronic camera provided with an imager which outputs an image representing a scene captured on an imaging surface comprises: a defining step of executing a process of defining a document page region within the scene captured on the imaging surface, corresponding to a document page photographing mode; a searching step of searching for one or at least two characteristic images including a page edge from a partial image belonging to the document page region defined by the defining step out of the image outputted from the imager; a detecting step of detecting a termination of a page turning operation based on a search result of the searching step; and an extracting step of extracting the image outputted from the imager corresponding to a detection of the detecting step.
The above described features and advantages of the present invention will become more apparent from the following detailed description of the embodiment when taken in conjunction with the accompanying drawings.
With reference to
When the document photographing mode is selected, the document page region is defined within the scene captured on the imaging surface, and one or at least two characteristic images including the page edge are searched from the partial image belonging to the document page region. When the termination of the page turning operation is detected based on the search result, corresponding thereto, the image outputted from the imager 1 is extracted. Thereby, a complication of the configuration is inhibited, and an imaging performance for the document page is improved.
With reference to
The CPU 32 executes a plurality of tasks in parallel on a multi-task operating system such as μITRON. When a power source is applied, under a main task, the CPU 32 executes a process of determining an operation mode being selected at a current time point and a process of activating a task corresponding to the determined operation mode. When the determined operation mode is a normal photographing mode, a normal photographing task is activated, whereas when the determined operation mode indicates the document page photographing mode, a page photographing task is activated. When a mode selector button 34sw arranged in a key input device 34 is operated, the task that is being activated is stopped, and a task corresponding to the operation mode selected by the operation of the mode selector button 34sw is activated instead.
It is noted that, in the document photographing mode, it is assumed that a dedicated jig FX1 fixed on a desk DSK1 is prepared as shown in
When the normal photographing task is activated, in order to execute a moving-image taking process, the CPU 32 commands a driver 20d to repeat an exposure procedure and an electric-charge reading-out procedure. In response to a vertical synchronization signal Vsync periodically generated, the driver 20d exposes the imaging surface of the imager 18 and reads out the electric charges produced on the imaging surface in a raster scanning manner. From the imager 18, raw image data that is based on the read-out electric charges is cyclically outputted.
A signal processing circuit 22 performs processes such as a white balance adjustment, a color separation, and a YUV conversion on the raw image data outputted from the imager 18. YUV-formatted image data generated thereby is written into a YUV image area 26a of an SDRAM 26 through a memory control circuit 24. An LCD driver 28 repeatedly reads out the image data stored in the YUV image area 26a through the memory control circuit 24, and drives an LCD monitor 30 based on the read-out image data. As a result, a real-time moving image (live view image) representing a scene captured on the imaging surface is displayed on a monitor screen.
Moreover, the signal processing circuit 22 applies Y data forming the image data to the CPU 32. The CPU 32 performs a simple AE process on the applied Y data so as to calculate an appropriate EV value and set an aperture amount and an exposure time period that define the calculated appropriate EV value to the drivers 20c and 20d, respectively. Thereby, the raw image data outputted from the imager 18, by extension, a brightness of a live view image displayed on the LCD monitor 30 is adjusted approximately.
When a zoom button 34zm arranged in the key input device 34 is operated, the CPU 32 controls the driver 20a so as to move the zoom lens 12 in an optical-axis direction. As a result, a magnification of an optical image irradiated on the imaging surface, by extension, a magnification of a live view image displayed on the LCD monitor 30 is changed.
When a shutter button 34sh arranged in the key input device 34 is half-depressed, the CPU 32 performs a strict AE process on the Y data applied from the signal processing circuit 22 so as to calculate an optimal EV value. An aperture amount and an exposure time period that define the calculated optimal EV value are set to the drivers 20c and 20d, respectively. As a result, a brightness of a live view image is adjusted strictly. Moreover, the CPU 32 performs an AF process on a high-frequency component of the Y data applied from the signal processing circuit 22. Thereby, the focus lens 14 is placed at a focal point, and as a result, the raw image data outputted from the imager 18, by extension, a sharpness of a live view image displayed on the LCD monitor 30 is improved. When the shutter button 34sh is fully depressed, the CPU 32 executes a still-image taking process, and concurrently, commands a memory I/F 36 to execute a recording process.
Image data representing a scene at a time point at which the shutter button 34sh is fully depressed is evacuated from the YUV image area 26a to a still-image area 26b by the still-image taking process. The memory I/F 36 commanded to execute the recording process reads out the image data evacuated to the still-image area 26b through the memory control circuit 24 so as to record an image file containing the read-out image data on a recording medium 38.
When the document photographing task is activated in a state where the digital camera 10 is attached to the jig FX1 shown in
When the shutter button 34sh is operated in this state, the CPU 32 regards that a document-page photographing-start operation is performed, and searches for a document page from the image data stored in the YUV image area 26a. When the document page is detected, the CPU 32 defines a region covering the detected document page as a document page region PR1 (see
Upon completion of adjusting the zoom magnification, a center-page spread state detecting task is activated. Under the center-page spread state detecting task, page-turning determination processes 1 to 3 are executed every time the vertical synchronization signal Vsync is generated. The page-turning determination process 1 is executed with reference to a page edge, the page-turning determination process 2 is executed with reference to a finger of a person, and the page-turning determination process 3 is executed with reference to a color of a hand.
However, when a determined result indicating a “page-turning-operation stopped state” is acquired in the page-turning determination process 1, the page-turning determination process 2 is complementarily executed in order to verify the reliability of the determined result. Furthermore, when a determined result indicating a “page-turning-operation stopped state” is acquired in the page-turning determination process 2, the page-turning determination process 3 is complementarily executed in order to verify the reliability of the determined result.
In the page-turning determination process 1, firstly, a line segment equivalent to the longest portion of a vertical edge forming the document page is searched from the image data belonging to the document page region PR1. Specifically, a searching target is the longest line segment among one or at least two line segments each of which has an inclination θ1 equal to or less than 45 degrees and a length equal to or more than 40 percent of a vertical size of the document page region PR1. A length of the detected line segment is set to a variable EhL1.
In an example shown in
Subsequently, a line segment being on an extended line of the detected line segment is detected from the image data belonging to the document page region PR1. The detected line segment is another portion of the line segment forming the same vertical edge, and a length of the detected line segment is set to a variable EhL2. In the example shown in
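The vertical-edge search described above can be sketched as follows. This is a minimal illustration in Python; the segment representation, function name, and inputs are assumptions rather than part of the embodiment, and the collinear remainder whose length becomes the variable EhL2 would be searched in a similar manner.

```python
def select_vertical_edge_candidate(segments, v_size):
    """Pick the longest detected line segment whose inclination θ1 is at most
    45 degrees and whose length is at least 40 percent of the vertical size
    of the document page region PR1; its length becomes the variable EhL1.

    segments: hypothetical list of dicts, each with "inclination" (degrees
    from vertical) and "length" (pixels) of a detected line segment.
    v_size: vertical size of the document page region PR1, in pixels.
    """
    candidates = [s for s in segments
                  if s["inclination"] <= 45 and s["length"] >= 0.4 * v_size]
    if not candidates:
        return None  # no vertical edge of the document page detected
    return max(candidates, key=lambda s: s["length"])
```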
When a total sum of the variables EhL1 and EhL2 is equal to or more than 50 percent and less than 70 percent of the vertical size of the document page region PR1, a line segment equivalent to a horizontal edge of the document page is additionally searched. Specifically, a searching target is a line segment whose intersection angle θ2 with the vertical edge detected in the above-described manner belongs to a range from 60 degrees to 100 degrees and whose length is equal to or more than 70 percent of a horizontal size of the document page. In the example shown in
The determined result of the page-turning determination process 1 indicates a “page-turning-operation executed state” when the total sum of the variables EhL1 and EhL2 is equal to or more than 70 percent of the vertical size of the document page region.
Moreover, the determined result of the page-turning determination process 1 is regarded as the “page-turning-operation executed state” when the total sum of the variables EhL1 and EhL2 is equal to or more than 50 percent and less than 70 percent of the vertical size of the document page region and the line segment equivalent to the horizontal edge of the document page is detected.
In contrast, when the vertical edge of the document page is not detected, when a length of the detected vertical edge (=EhL1+EhL2) is less than 50 percent, or when the length of the detected vertical edge is in a range from 50 percent to 70 percent and the horizontal edge is not detected, the determined result of the page-turning determination process 1 indicates the “page-turning-operation stopped state”.
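The decision logic of the page-turning determination process 1 described above can be condensed into the following sketch; the function name and argument forms are hypothetical, and all lengths are in the same units as the vertical size of the document page region PR1.

```python
def page_turning_process_1(ehl1, ehl2, v_size, horizontal_edge_found):
    """Return True for the "page-turning-operation executed state".

    ehl1, ehl2: lengths of the two portions of the detected vertical edge
    (the variables EhL1 and EhL2); zero when no portion was detected.
    v_size: vertical size of the document page region PR1.
    horizontal_edge_found: whether a qualifying horizontal edge was detected.
    """
    total = ehl1 + ehl2
    if total >= 0.7 * v_size:
        return True                   # a long vertical edge alone suffices
    if 0.5 * v_size <= total < 0.7 * v_size:
        return horizontal_edge_found  # needs the horizontal edge as well
    return False                      # "page-turning-operation stopped state"
```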
In the page-turning determination process 2, an image representing the finger (=finger image) is searched from the document page region PR1. Upon searching, dictionary images FG1 to FG15 shown in
In the example of
The determined result of the page-turning determination process 2 indicates the “page-turning-operation executed state” when the finger image is detected from the document page region PR1, and indicates the “page-turning-operation stopped state” when the finger image is not detected from the document page region PR1.
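The search of the page-turning determination process 2 can be sketched as follows. The matching predicate is left abstract because the embodiment specifies only that the dictionary images FG1 to FG15 are referred to upon searching; the function name and inputs are assumptions for illustration.

```python
def page_turning_process_2(page_region, dictionary_images, matches):
    """Return True for the "page-turning-operation executed state", i.e.
    when a finger image is found in the document page region PR1.

    dictionary_images: the finger dictionary images (FG1 to FG15).
    matches(region, fg): hypothetical matching predicate, e.g. template
    matching against the dictionary image with a score threshold.
    """
    for fg in dictionary_images:
        if matches(page_region, fg):
            return True   # finger image detected: executed state
    return False          # no finger image: stopped state
```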
In the page-turning determination process 3, a group image having the same color as the color specified by the variable HandColor is extracted from the image data belonging to the document page region PR1, and a dimension of the extracted group image is compared with a threshold value THdm. In the example of
When the dimension exceeds the threshold value THdm, the determined result of the page-turning determination process 3 indicates the “page-turning-operation executed state”. In contrast, when the dimension of the group is equal to or less than the threshold value THdm, or when the variable HandColor is not set, the determined result of the page-turning determination process 3 indicates the “page-turning-operation stopped state”.
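A minimal sketch of the page-turning determination process 3 follows. The pixel representation and the simple pixel count standing in for the dimension of the extracted group image are assumptions for illustration, not the embodiment's actual extraction method.

```python
def group_dimension(region_pixels, hand_color):
    """Count the pixels in the document page region PR1 whose color matches
    the variable HandColor (a crude stand-in for extracting the connected
    same-color group image)."""
    return sum(px == hand_color for row in region_pixels for px in row)

def page_turning_process_3(region_pixels, hand_color, th_dm):
    """Return True for the "page-turning-operation executed state"."""
    if hand_color is None:  # the variable HandColor is not set
        return False        # stopped state
    return group_dimension(region_pixels, hand_color) > th_dm
```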
In the document page photographing task, the termination of the page turning operation is detected by noting a temporal change of the determined results. While the termination of the page turning operation is not detected, the CPU 32 repeatedly executes the simple AE process. As a result, a brightness of a live view image is adjusted approximately.
In contrast, when the termination of the page turning operation is detected, the CPU 32 executes the strict AE process and the AF process, and concurrently, executes the still-image taking process. As a result, image data representing a scene at a time point at which the page turning operation is ended and in which a brightness and a sharpness are strictly adjusted is evacuated from the YUV image area 26a to a still-image area 26b.
Upon completion of an evacuating process, an image modifying process is executed. In the image modifying process, a region surrounding the document page region PR1 is set as an unnecessary-image detection region DR1, and a color of the unnecessary-image detection region DR1 is changed to the color of the margin of the document page. As a result, in the example of
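The image modifying process can be sketched as follows, assuming for illustration that the unnecessary-image detection region DR1 is everything in the frame outside the rectangle of the document page region PR1; the frame layout and function name are hypothetical.

```python
def modify_image(frame, page_top, page_bottom, page_left, page_right, margin_color):
    """Overwrite every pixel outside the document page region PR1 (i.e. the
    unnecessary-image detection region DR1) with the color of the margin of
    the document page, so that unnecessary objects caught around the page
    are erased. The frame is a list of rows of pixel values."""
    for y, row in enumerate(frame):
        for x in range(len(row)):
            inside = page_top <= y <= page_bottom and page_left <= x <= page_right
            if not inside:
                row[x] = margin_color
    return frame
```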
The process is executed every time the document page is turned, and as a result, one or at least two frames of image data are evacuated to the still-image area 26b. When the shutter button 34sh is operated again in order to end photographing the document page, the CPU 32 commands the memory I/F 36 to execute the recording process. The memory I/F 36 reads out the one or at least two frames of image data evacuated to the still-image area 26b through the memory control circuit 24 so as to record a single image file containing the read-out image data on the recording medium 38.
The CPU 32 executes the following tasks: the main task shown in
With reference to
Upon completion of the process in the step S5, S7 or S9, in a step S11, it is repeatedly determined whether or not the mode selector button 34sw is operated. When a determined result is updated from NO to YES, the task that is being activated is stopped in a step S13, and thereafter, the process returns to the step S1.
With reference to
In the step S25, the simple AE process is executed. As a result, a brightness of a live view image is adjusted approximately. Upon completion of the simple AE process, in a step S27, it is determined whether or not the zoom button 34zm is operated. When a determined result is NO, the process directly returns to the step S23 whereas when the determined result is YES, in a step S29, a zoom magnification is changed (=the zoom lens 12 is moved in an optical-axis direction). Thereafter, the process returns to the step S23. As a result of the process in the step S29, a magnification of a live view image is changed.
When the shutter button 34sh is half-depressed, in a step S31, the strict AE process is executed, and in a step S33, the AF process is executed. As a result, a brightness and a sharpness of a live view image are adjusted strictly. In a step S35, it is determined whether or not the shutter button 34sh is fully depressed, and in a step S37, it is determined whether or not the operation of the shutter button 34sh is cancelled. When a determined result of the step S37 is YES, the process directly returns to the step S23. When a determined result of the step S35 is YES, in a step S39, the still-image taking process is executed, and in a step S41, the memory I/F 36 is commanded to execute the recording process. Thereafter, the process returns to the step S23.
As a result of the process in the step S39, image data representing a scene at a time point at which the shutter button 34sh is fully depressed is evacuated from the YUV image area 26a to the still-image area 26b. Moreover, as a result of the process in the step S41, the memory I/F 36 reads out the image data evacuated to the still-image area 26b through the memory control circuit 24 so as to record an image file containing the read-out image data on the recording medium 38.
With reference to
In a step S55, a document page is searched from the image data stored in the YUV image area 26a, and in a step S57, it is determined whether or not the document page is detected. When a determined result is NO, the process returns to the step S55, whereas when the determined result is YES, the process advances to a step S59 in which a region covering the detected document page is defined as the document page region PR1.
In a step S61, a zoom magnification (=a position of the zoom lens 12) is adjusted so that the defined document page region PR1 accounts for 90 percent of the image data, and in a step S63, the center-page spread state detecting task is activated. In a step S65, a flag FLG_Page_PR is set to “0”, and in a step S67, it is determined whether or not a logical AND condition under which the flag FLG_Page_PR indicates “0” and a flag FLG_Page_CR indicates “1” is satisfied.
Here, the flag FLG_Page_PR is a flag for identifying whether the page turning operation is in the executed state or the stopped state at a timing equivalent to a prior frame. Moreover, the flag FLG_Page_CR is a flag for identifying whether the page turning operation is in the executed state or the stopped state at a timing equivalent to a current frame. In both of the flags, “0” indicates the executed state whereas “1” indicates the stopped state. Moreover, a value of the flag FLG_Page_CR is controlled by the center-page spread state detecting task.
When a determined result is NO, it is regarded that a state at a current time point is a state during the page turning (FLG_Page_PR=FLG_Page_CR=0) or a state after the page turning (FLG_Page_PR=FLG_Page_CR=1), and in a step S69, the simple AE process is executed. Thereafter, the process advances to a step S79.
In contrast, when the determined result of the step S67 is YES, it is regarded that a state at a current time point is a state immediately after the page turning, and in steps S71 and S73, the strict AE process and the AF process are executed. Concurrently, in a step S75, the still-image taking process is executed. As a result, image data representing a scene at a time point at which the page turning operation is ended and in which a brightness and a sharpness are strictly adjusted is evacuated from the YUV image area 26a to the still-image area 26b. Upon completion of the process in the step S75, in a step S77, the image modifying process is executed, and thereafter, the process advances to the step S79. As a result of the process in the step S77, an image of the unnecessary-image detection region DR1 surrounding the document page region PR1 is filled with the color of the margin of the document page.
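The flag handling of steps S65 to S79 amounts to detecting a falling edge from the executed state ("0") to the stopped state ("1"); the following sketch illustrates this, with the function name and the list-of-flag-values input being assumptions.

```python
def detect_terminations(flags_per_frame):
    """Given the FLG_Page_CR value produced for each frame by the
    center-page spread state detecting task, return the frame indices at
    which the termination of the page turning operation is detected, i.e.
    at which the still-image taking process (steps S71 to S75) runs."""
    flg_page_pr = 0                                # step S65: initialized to "0"
    captures = []
    for i, flg_page_cr in enumerate(flags_per_frame):
        if flg_page_pr == 0 and flg_page_cr == 1:  # step S67: AND condition
            captures.append(i)                     # steps S71-S75: AE/AF + capture
        flg_page_pr = flg_page_cr                  # step S79: carry over
    return captures
```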
In a step S79, the value of the flag FLG_Page_CR is set to the FLG_Page_PR. In a step S81, it is determined whether or not the shutter button 34sh is operated again, and when a determined result is NO, the process returns to the step S67, whereas when the determined result is YES, the process advances to a step S83.
In a step S83, it is determined whether or not one or at least two frames of image data are evacuated to the still-image area 26b, and when a determined result is NO, the process returns to the step S53 whereas when the determined result is YES, in a step S85, the memory I/F 36 is commanded to execute the recording process. The memory I/F 36 reads out the one or at least two frames of image data evacuated to the still-image area 26b through the memory control circuit 24 so as to record a single image file containing the read-out image data on the recording medium 38. Upon completion of the recording process, the process returns to the step S53.
The image modifying process in the step S77 is executed according to a subroutine shown in
With reference to
In a step S109, it is determined whether or not the flag FLG_Edge_PageTurning indicates “0”. When a determined result is NO, in a step S111, the flag FLG_Page_CR is set to “0”, and thereafter, the process returns to the step S105. In contrast, when the determined result is YES, in a step S113, the page-turning determination process 2 is executed with reference to the finger of the person. A flag FLG_Finger_PageTurning is set to “1” when the finger image is detected from the image data belonging to the document page region PR1, whereas it is set to “0” when the finger image is not detected.
In a step S115, it is determined whether or not the flag FLG_Finger_PageTurning indicates “0”. When a determined result is NO, in a step S123, the flag FLG_Page_CR is set to “0”, and thereafter, the process returns to the step S105. In contrast, when the determined result is YES, in a step S117, the page-turning determination process 3 is executed with reference to the color of the hand. A flag FLG_HandColor_PageTurning is set to “1” when a dimension of the group image having the same color as the color of the finger image and existing in the document page region PR1 exceeds the threshold value THdm, whereas it is set to “0” when the condition is not satisfied.
In a step S119, it is determined whether or not the flag FLG_HandColor_PageTurning indicates “0”. When a determined result is NO, in the step S123, the flag FLG_Page_CR is set to “0”, and thereafter, the process returns to the step S105. In contrast, when the determined result is YES, in a step S121, the flag FLG_Page_CR is set to “1”, and thereafter, the process returns to the step S105.
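The cascade of steps S105 to S123 described above can be condensed into the following sketch; each later stage is consulted only to verify a "stopped" result of the previous one, and the callable inputs are assumptions for illustration.

```python
def center_page_spread_detect(process_1, process_2, process_3):
    """One iteration of steps S105 to S123. Each process_* callable returns
    True for the "page-turning-operation executed state". Returns the value
    of FLG_Page_CR: 0 = executed state, 1 = stopped state."""
    if process_1():   # page edge (steps S107-S109)
        return 0
    if process_2():   # finger of a person, complementary (steps S113-S115)
        return 0
    if process_3():   # color of the hand, complementary (steps S117-S119)
        return 0
    return 1          # all three indicate the stopped state (step S121)
```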
The page-turning determination process 1 in the step S107 shown in
In a step S133, it is determined whether or not the searching target is detected. When a determined result is NO, in a step S145, the flag FLG_Edge_PageTurning is set to “0”, and thereafter, the process returns to the routine in an upper hierarchy. On the other hand, when the determined result is YES, the process advances to a step S135, and a line segment being on an extended line of the line segment detected in the step S131 is detected from the image data belonging to the document page region PR1. A length of the detected line segment is set to the variable EhL2.
In a step S137, it is determined whether or not the total sum of the variables EhL1 and EhL2 is equal to or more than 70 percent of the vertical size of the document page region PR1. Moreover, in a step S139, it is determined whether or not the total sum of the variables EhL1 and EhL2 is equal to or more than 50 percent of the vertical size of the document page region PR1.
When a determined result of the step S137 is YES, in a step S147, the flag FLG_Edge_PageTurning is set to “1”, and thereafter, the process returns to the routine in an upper hierarchy. When both of the determined result of the step S137 and a determined result of the step S139 are NO, the process returns to the routine in an upper hierarchy via the process in the step S145.
When the determined result of the step S137 is NO whereas the determined result of the step S139 is YES, in a step S141, the horizontal edge forming the document page is detected. Specifically, a searching target is a line segment whose intersection angle θ2 with the vertical edge detected in the above-described manner belongs to a range from 60 degrees to 100 degrees and whose length is equal to or more than 70 percent of a horizontal size of the document page. In a step S143, it is determined whether or not the line segment is detected, and when a determined result is NO, the process returns to the routine in an upper hierarchy via the process in the step S145 whereas when the determined result is YES, the process returns to the routine in an upper hierarchy via the process in the step S147.
The page-turning determination process 2 of the step S113 shown in
When the determined result is YES, in a step S157, the color of the detected finger image (a skin color of the finger, exactly) is detected, and it is determined whether or not the detected color approximates the color of the margin of the page (whether or not a parameter value defining the detected color belongs to a predetermined region including a parameter value defining the color of the margin). When a determined result is YES, the process advances to a step S165 whereas when the determined result is NO, the process advances to the step S165 via processes in steps S161 to S163. In the step S161, the variable HandColor_set is set to “1”. In the step S163, a numerical value indicating the color detected in the step S157 is set to the variable HandColor. In the step S165, the flag FLG_Finger_PageTurning is set to “1”. Upon completion of the setting, the process returns to the routine in an upper hierarchy.
The page-turning determination process 3 in the step S117 shown in
As can be seen from the above-described explanation, the imager 18 repeatedly outputs an image representing a scene captured on the imaging surface. When the document photographing mode is selected, the CPU 32 defines the document page region within the scene captured on the imaging surface (S55 to S59), and searches for one or at least two characteristic images including the page edge from the partial image data belonging to the document page region (S107, S113 and S117). Moreover, the CPU 32 detects the termination of the page turning operation based on the search result (S67), and evacuates, to the still-image area 26b, the YUV-formatted image data that is based on the raw image data outputted from the imager 18 at a timing of the detection (S71 to S75). Thereby, the imaging performance for the document page is improved.
Moreover, in this embodiment, the control programs equivalent to the multi-task operating system and the plurality of tasks executed thereby are previously stored in the flash memory 40. However, a communication I/F 42 may be arranged in the digital camera 10 as shown in
Moreover, in this embodiment, the processes executed by the CPU 32 are divided into a plurality of tasks in a manner described above. However, these tasks may be further divided into a plurality of small tasks, and furthermore, a part of the plurality of divided small tasks may be integrated into another task. Moreover, when each of the tasks is divided into the plurality of small tasks, the whole task or a part of the task may be acquired from the external server.
Although the present invention has been described and illustrated in detail, it is clearly understood that the same is by way of illustration and example only and is not to be taken by way of limitation, the spirit and scope of the present invention being limited only by the terms of the appended claims.
Number | Date | Country | Kind |
---|---|---|---|
2011-185031 | Aug 2011 | JP | national |