The present invention relates to an image displaying device that enables a user to visually recognize a focusing state of a subject, to a corresponding method, and to a computer readable medium for realizing the method.
In general, a digital still camera is provided with an LCD (Liquid Crystal Display), for example, on a back side of a camera body thereof. A user is able to check various photographing conditions, such as an angle of view, white balance, exposure and a focusing state, via a so-called through image displayed on the LCD. However, the resolution of the through image is generally low. Therefore, the user is not able to adequately recognize the focusing state when adjusting the focus through a manual operation, and it has been pointed out that accurate manual focus adjustment is difficult.
In view of such a problem, Japanese Patent Provisional Publication No. 2007-28380A (hereafter referred to as patent document 1) discloses a digital still camera having a function of assisting a user's manual operation for adjusting a focusing state. Specifically, the digital still camera described in patent document 1 is configured to assign a predetermined color to pixels along a contour line of a subject displayed on an LCD so that the contour line is appropriately highlighted regardless of the resolution of the image or the pixel size.
When the digital still camera described in patent document 1 is set to the manual focusing mode, the representation on the LCD is changed to a colored contour image. Therefore, the user is not able to visually recognize the true subject image during the manual focusing operation. Even when the user achieves focusing by adjusting the contour image, the user may fail to shoot the subject in the intended composition, or may fail to shoot the subject properly, because the true subject image cannot be visually recognized.
The present invention is advantageous in that it provides an image displaying device configured to suitably assist a user's manual focusing operation while enabling the user to visually recognize the true subject image. In addition, the invention is directed to a corresponding method and to a computer readable medium for conducting the method.
According to an aspect of the invention, there is provided an image displaying device which comprises a photographing image generating unit configured to generate a photographing image by processing photographing data outputted by an image pickup device, an area designation unit configured to designate a cut-out area in the photographing image, a contour image generating unit configured to generate a contour image of the cut-out area based on a high frequency component of the cut-out area, a combined image generating unit configured to generate a combined image in which the contour image is combined with a part of the photographing image, and a combined image displaying unit configured to display the combined image on a display screen, e.g., an LCD, a TFT LCD or an OLED display.
According to the above described configuration, a user is able to conduct a manual focusing operation by checking the degree of emphasis of a contour line in the contour image while visually recognizing the whole image, that is, while recognizing the composition of the subject and the whole composition including the subject displayed in a normal style (i.e., in true color, luminance, tone, etc.).
The image displaying device may further comprise a ranging point combining unit configured to combine and display a predetermined ranging point on the photographing image. In this case, the area designation unit may be configured to designate an area centered at the predetermined ranging point as the cut-out area.
The image displaying device may further comprise an operation unit. In this case, the area designation unit may be configured to designate the cut-out area in accordance with an operation to the operation unit by a user.
The image displaying device may further comprise a ranging point combining unit configured to combine and display a predetermined ranging point on the photographing image. In this case, a position of the predetermined ranging point in the photographing image may be determined in accordance with an operation to the operation unit by the user. The area designation unit may be configured to designate, as the cut-out area, an area centered at the position of the predetermined ranging point determined in accordance with the operation to the operation unit by the user.
The combined image generating unit may be configured to combine the contour image on an area including the designated cut-out area centered at the predetermined ranging point.
The image displaying device may further comprise a ranging point combining unit configured to combine and display a plurality of predetermined ranging points on the photographing image. In this case, the area designation unit may be configured to designate, as the cut-out area, an area centered at a ranging point which is one of the plurality of ranging points and is used for focusing.
The combined image generating unit may be configured to combine the contour image on an area which includes the designated cut-out area and is centered at the ranging point used for focusing.
The image displaying device may further comprise an operation unit configured to designate an area on which the contour image is combined with the photographing image. In this case, the combined image generating unit may be configured to combine the contour image on the area in the photographing image designated by an operation to the operation unit by a user.
The image displaying device may further comprise an operation unit configured to designate a displaying scale factor for the contour image, and a size conversion unit configured to change a displaying size of the contour image at the displaying scale factor designated by an operation to the operation unit by a user. In this case, the combined image generating unit may be configured to combine the contour image whose displaying size is changed by the size conversion unit with the photographing image.
The contour image generating unit may be configured to generate the contour image based on a high frequency component of a luminance signal corresponding to the cut-out area designated by the area designation unit.
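As a concrete illustration of this high-frequency-based contour generation, a minimal Python sketch follows. The Laplacian kernel and the threshold value are illustrative assumptions; the text does not specify which high-pass filter the contour image generating unit actually applies.

```python
import numpy as np

def extract_contour(y_patch: np.ndarray, threshold: float = 16.0) -> np.ndarray:
    """Derive an edge (contour) signal from the high-frequency component of a
    luminance patch.  A 3x3 Laplacian kernel stands in for the unspecified
    high-pass filter of the embodiment; the threshold is likewise illustrative."""
    kernel = np.array([[0, -1, 0],
                       [-1, 4, -1],
                       [0, -1, 0]], dtype=np.float32)
    padded = np.pad(y_patch.astype(np.float32), 1, mode="edge")
    high_freq = np.zeros_like(y_patch, dtype=np.float32)
    for dy in range(3):
        for dx in range(3):
            high_freq += kernel[dy, dx] * padded[dy:dy + y_patch.shape[0],
                                                 dx:dx + y_patch.shape[1]]
    # Keep only the components whose magnitude exceeds the threshold as the edge signal.
    return np.where(np.abs(high_freq) > threshold, np.abs(high_freq), 0.0)
```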
According to another aspect of the invention, there is provided an image displaying method, comprising the steps of: generating a photographing image by processing photographing image data outputted by an image pickup device; designating a cut-out area in the photographing image; generating a contour image of the cut-out area based on a high frequency component of the cut-out area; generating a combined image in which the contour image is combined with a part of the photographing image; and displaying the combined image.
According to the above described configuration, a user is able to conduct a manual focusing operation by checking the degree of emphasis of a contour line in the contour image while visually recognizing the whole image, that is, while recognizing the composition of the subject and the whole composition including the subject displayed in a normal style (i.e., in true color, luminance, tone, etc.).
The method may further comprise: combining and displaying a predetermined ranging point on the photographing image, and designating an area centered at the predetermined ranging point as the cut-out area.
In the step of designating the cut-out area, the cut-out area may be designated in accordance with an operation to an operation unit by a user. In this case, the method may further comprise: combining and displaying a predetermined ranging point on the photographing image, wherein a position of the predetermined ranging point in the photographing image is determined in accordance with an operation initiated by the user; and designating, as the cut-out area, an area centered at the position of the predetermined ranging point determined in accordance with the operation initiated by the user.
According to another aspect of the invention, there is provided a non-transitory computer readable medium having computer readable instructions stored thereon which, when executed by a processor of an image displaying device, configure the processor to perform the steps of the above described method.
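For illustration only, the following Python sketch strings the claimed steps together for a single frame. The helper `demosaic_and_convert` is hypothetical, `extract_contour` refers to the high-pass sketch above, and the fixed patch size and the decision to let the contour replace the luminance of the target area are assumptions, not features recited above.

```python
import numpy as np

def focusing_assist_frame(raw_frame: np.ndarray,
                          ranging_point: tuple[int, int],
                          patch_size: int = 128) -> np.ndarray:
    """One pass of the claimed method, with hypothetical helpers standing in for
    the photographing-image generation and contour extraction stages."""
    # Step 1: generate the photographing image (whole image) from the pickup data.
    whole_yc = demosaic_and_convert(raw_frame)          # hypothetical helper
    y = whole_yc[..., 0]
    # Step 2: designate a cut-out area centred at the ranging point
    # (border clipping omitted for brevity).
    cy, cx = ranging_point
    half = patch_size // 2
    y_patch = y[cy - half:cy + half, cx - half:cx + half]
    # Step 3: generate the contour image from the high-frequency component.
    contour = extract_contour(y_patch)
    # Step 4: combine the contour image with a part of the photographing image.
    combined = whole_yc.copy()
    combined[cy - half:cy + half, cx - half:cx + half, 0] = contour
    # Step 5: hand the combined (focusing-assisting) image to the display.
    return combined
```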
Hereinafter, an image displaying device, a method and a computer readable medium according to an embodiment of the invention are described with reference to the accompanying drawings. In the embodiment described below, the image displaying device is mounted on a photographing apparatus capable of obtaining a digital image. The photographing apparatus is, for example, a digital single-lens reflex camera (having a quick return mirror) or a mirror-less camera of a lens interchangeable type, a compact digital camera, a camcorder, a mobile phone, a PHS (Personal Handyphone System), a smartphone, a feature phone or a portable game machine. Furthermore, any kind of programming language can be used to implement the computer readable instructions.
(Configuration of Photographing Apparatus)
The operation unit 104 includes various switches for allowing the user to operate the photographing apparatus 1, such as a power switch, a release switch, a mode setting switch, a cross key, a dial key and a zoom key. When the user presses the power switch, power is supplied from a battery (not shown) to the various circuits of the photographing apparatus 1 via power lines. After power is supplied, the CPU 100 accesses the ROM 116 to read out a control program, and loads the control program onto an internal memory (not shown). Then, by executing the loaded control program, the CPU 100 controls the entire photographing apparatus 1.
The DSP 102 drives and controls the aperture stop 108 and the shutter 110 via the aperture stop and shutter drive circuit 114 so that proper exposure measured by a TTL (Through The Lens) photometer (not shown) provided in the photographing apparatus 1 can be achieved. More specifically, the drive and control of the aperture stop 108 and the shutter 110 is executed based on the AE function mode, such as program AE (Automatic Exposure), shutter-speed-priority AE or aperture-priority AE, designated by the mode setting switch. The DSP 102 executes AF (Autofocus) control as well as the AE control, and drives and controls the photographing lens 106 via the focusing lens drive circuit 122. As the AF control, active autofocus, phase detection autofocus or contrast detection autofocus is used. Since the AE and AF control and configurations are known, detailed explanations thereof are omitted.
A light beam from a subject passes through the photographing lens 106, the aperture stop 108 and the shutter 110, and is received by the image sensor 112. The image sensor 112 is, for example, a single-chip color CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor having a Bayer type pixel array, and is configured to accumulate charges corresponding to the light amount of an optical image formed at each pixel on an imaging surface thereof, and to convert the accumulated charges into an electric signal. The DSP 102 executes predetermined signal processing including color interpolation, a matrix operation and Y/C separation, generates a luminance signal Y and color-difference signals Cb and Cr, and then compresses them in a predetermined format such as JPEG (Joint Photographic Experts Group). The compressed image signal (photographed image data) is then stored in a memory card 200 inserted into the memory card adapter 118. The DSP 102 buffers the signal which has been subjected to the Y/C separation on a frame-by-frame basis by storing the signal in a frame memory (not shown). The DSP 102 reads out the buffered signal from each frame memory at predetermined timing, converts it into an image signal to generate an image, and then displays the image on the LCD 120. The user is thus able to visually recognize a real time through image photographed at proper luminance and focus based on the AE control and the AF control.
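As a rough sketch of the Y/C separation stage, the conversion below assumes that the color interpolation has already produced an RGB frame and uses the standard BT.601 (JPEG) matrix; the actual coefficients used by the DSP 102 are not stated in the text.

```python
import numpy as np

def rgb_to_ycbcr(rgb: np.ndarray) -> np.ndarray:
    """Y/C separation for an already colour-interpolated frame.  BT.601
    full-range coefficients are assumed; the matrix used by the DSP 102
    is not specified in the text."""
    rgb = rgb.astype(np.float32)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b + 128.0
    cr =  0.5 * r - 0.418688 * g - 0.081312 * b + 128.0
    return np.stack([y, cb, cr], axis=-1)
```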
(Generation and Display of Focusing-Assisting Image by Photographing Apparatus)
When the AF function is stopped by a user operation through the operation unit 104, the photographing apparatus 1 moves to a manual focus mode. As a result, the user becomes able to adjust the focus through a manual operation. In this embodiment, when the photographing apparatus 1 moves to the manual focus mode, a focusing-assisting image is displayed on the LCD 120. The focusing-assisting image is an image for assisting the manual focusing operation by the user. Hereafter, generation and display of the focusing-assisting image are explained.
(Step S1: Image Conversion Process)
The image conversion processing circuit 102a converts the photographing data outputted from the image sensor 112 into the luminance signal Y and the color difference signals Cb and Cr for each frame.
(Step S2: Area Cut-Out Process)
The cut-out processing circuit 102b extracts a part of the luminance signal Y inputted from the image conversion processing circuit 102a for each frame. Specifically, the cut-out processing circuit 102b designates an area in the subject image as a cut-out area, and extracts the part of the luminance signal Y corresponding to the designated area. Let us consider, for example, the case where the ranging point is a single point situated at the center. In this case, the cut-out processing circuit 102b designates, as the cut-out area, an area (a central area of the subject image) having a predetermined size and centered at the ranging point (the single point at the center), and extracts the part of the luminance signal Y corresponding to the designated area.
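A minimal sketch of this cut-out step is shown below; the default cut-out size and the clamping of the window to the image border are illustrative assumptions.

```python
import numpy as np

def cut_out_area(y: np.ndarray, center: tuple[int, int],
                 size: tuple[int, int] = (120, 160)) -> np.ndarray:
    """Extract the part of the luminance signal Y inside a cut-out area of a
    predetermined size centred at the ranging point.  The default size and the
    border clamping are assumptions for illustration."""
    h, w = y.shape
    cy, cx = center
    top = max(0, min(h - size[0], cy - size[0] // 2))
    left = max(0, min(w - size[1], cx - size[1] // 2))
    return y[top:top + size[0], left:left + size[1]]
```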
(Step S3: Contour Extraction Process)
The edge signal E, which constitutes the contour image, is generated based on a high frequency component of the part of the luminance signal Y extracted in step S2.
(Step S4: Size Conversion Process)
The size conversion processing circuit 102d converts the displaying size of the edge signal E in accordance with the designated displaying scale factor.
(Step S5: Combining Process)
The combining circuit 102e receives the edge signal E outputted from the size conversion processing circuit 102d as well as the luminance signal Y and the color difference signals Cb and Cr outputted from the image conversion processing circuit 102a. The edge signal E is a signal of the contour image constituting a partial area in the photographing angle of view. The luminance signal Y and the color difference signals Cb and Cr are signals constituting the image in the whole photographing angle of view. In the following, an image constituted by the edge signal E is referred to as a “contour image (E)”, and an image constituted by the luminance signal Y and the color difference signals Cb and Cr is referred to as a “whole image (YC)”. The combining circuit 102e generates a combined image (a focusing-assisting image) by combining the contour image (E) with the whole image (YC). The area on the whole image (YC) on which the contour image (E) is combined is, for example, an area including the cut-out area designated by the cut-out processing circuit 102b in the area cut-out process of step S2.
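A possible rendering of this combining step is sketched below. Replacing the luminance of the target area with the edge signal and neutralising its chroma is only one plausible way to overlay the contour image (E); the text does not specify how the combining circuit 102e actually mixes the two signals.

```python
import numpy as np

def combine(whole_yc: np.ndarray, contour_e: np.ndarray,
            top_left: tuple[int, int]) -> np.ndarray:
    """Combine the contour image (E) with the whole image (YC) at the given
    position.  The overlay style is an assumption, not the claimed mixing rule."""
    out = whole_yc.copy()
    top, left = top_left
    h, w = contour_e.shape
    region = out[top:top + h, left:left + w]
    region[..., 0] = contour_e            # overwrite Y with the edge signal
    region[..., 1:] = 128.0               # neutral chroma so the contour shows in grey
    return out
```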
(Step S6: Displaying of Focusing-Assisting Image)
The combined image generated by the combining circuit 102e is displayed on the LCD 120 as the focusing-assisting image.
By thus executing steps S1 (image conversion process) to S6 (displaying of focusing-assisting image) for each frame, the focusing-assisting image is displayed on the LCD 120 in real time, and the user is able to adjust the focus through a manual operation while checking the degree of emphasis of the contour line in the contour image (E).
(Operation for Changing Cut-Out Area)
Let us consider, for example, the case where one of the directional parts of the cross key 104a is long-pressed by the user. In this case, the combining circuit 102e stops the combining process of combining the whole image (YC) and the contour image (E), and alternatively outputs a combined image in which a cursor C is combined with the whole image (YC). As a result, the onscreen representation of the LCD 120 is changed from the focusing-assisting image in which the contour image (E) is combined with the whole image (YC) to the combined image in which the cursor C is combined with the whole image (YC).
The position of the cursor C on the whole image (YC) represents the ranging point (here, one ranging point), and can be moved by a user operation on the cross key 104a. When a confirmation key (not shown) is pressed by the user, the cut-out processing circuit 102b designates an area having a predetermined size and centered at the cursor C as a cut-out area, and extracts a part of the luminance signal Y corresponding to the designated area. As a result, the focusing-assisting image having the contour image (E) of the area designated by the user is displayed on the LCD 120.
The size (range) of the cut-out area cut out by the cut-out processing circuit 102b is not limited to the predetermined size (range). Let us consider, for example, the case where the dial key 104b is operated by the user. In this case, the cut-out processing circuit 102b changes the size (range) of the cut-out area in response to the operation to the dial key 104b. As a result, the size (range) of the contour image (E) displayed on the LCD 120 is changed.
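The cursor movement and the size change can be pictured with the small state-update sketch below; the key names and the step widths are hypothetical and merely mirror the cross-key and dial-key behaviour described above.

```python
def update_cut_out(cursor, size, key, step=8):
    """Update the cursor (ranging point) position and the cut-out size from a
    key event.  Key names and step widths are hypothetical."""
    cy, cx = cursor
    if key == "up":
        cy -= step
    elif key == "down":
        cy += step
    elif key == "left":
        cx -= step
    elif key == "right":
        cx += step
    elif key == "dial_plus":
        size = (size[0] + step, size[1] + step)
    elif key == "dial_minus":
        size = (max(step, size[0] - step), max(step, size[1] - step))
    return (cy, cx), size
```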
By changing (the position and the range of) the cut-out area to be cut out by the cut-out processing circuit 102b, the user is able to display, on the LCD 120, the contour image (E) of the subject for which the focusing should be achieved in the photographing angle of view.
(Operation for Changing Displaying Position of Contour Image (E))
Even when the position and the size (range) of the contour image (E) are changed, the area on the whole image (YC) at which the contour image (E) is combined is an area including the cut-out area designated by the cut-out processing circuit 102b, as explained in step S5 (combining process). The displaying position of the contour image (E) on the whole image (YC) can, however, be changed in accordance with a further operation to the operation unit 104 by the user, in which case the combining circuit 102e combines the contour image (E) on the area of the whole image (YC) designated by that operation.
By changing the displaying position of the contour image (E) formed by the combining circuit 102e, the user becomes able to visually recognize, in a normal style (i.e., in true color, luminance, tone, etc.), for example, the subject which has been hidden behind the contour image (E), while checking the degree of emphasis of the contour line of the contour image (E).
(Operation for Changing Displaying Scale Factor (Display Size) of Contour Image (E))
Let us consider, for example, the case where the zoom key 104c is operated by the user. In this case, the size conversion processing circuit 102d changes the settings of the displaying scale factor (display size) of the contour image (E) in response to an operation to the zoom key 104c. The combining circuit 102e combines the contour image (E), whose displaying scale factor (the displaying size) has been changed, with the whole image (YC). As a result, on the whole image (YC), the contour image (E) is displayed in an enlarged size, in an equal size or in a reduced size.
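As an illustration of this size conversion, the sketch below rescales the contour image (E) by the designated scale factor using nearest-neighbour resampling; the interpolation actually used by the size conversion processing circuit 102d is not stated in the text.

```python
import numpy as np

def resize_contour(contour_e: np.ndarray, scale: float) -> np.ndarray:
    """Change the displaying size of the contour image (E) by the designated
    scale factor.  Nearest-neighbour resampling is an assumption."""
    h, w = contour_e.shape
    new_h = max(1, int(round(h * scale)))
    new_w = max(1, int(round(w * scale)))
    rows = (np.arange(new_h) * h / new_h).astype(int)
    cols = (np.arange(new_w) * w / new_w).astype(int)
    return contour_e[rows[:, None], cols]
```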
By enlarging the contour image (E), the user becomes able to visually recognize the degree of emphasis of a contour line more easily. Furthermore, by reducing the contour image (E), the user becomes able to visually recognize, in a normal style (i.e., in true color, luminance, tone, etc.), for example, the subject which has been hidden behind the contour image (E).
The foregoing is the exemplary embodiment of the invention. It is understood that the invention is not limited to the above described embodiment, but can be varied within the scope of the invention. For example, combinations of the above described exemplary embodiment and variations derived therefrom are also included in the scope of the invention.
For example, in the above described embodiment, the focusing-assisting image is displayed during the manual operation for focus adjustment; however, in another embodiment the focusing-assisting image may be displayed during autofocus.
Let us consider, for example, the case where the AF ranging point is a single point at the center. In this case, the cut-out processing circuit 102b designates an area having a predetermined size and centered at the AF ranging point (the single point at the center) as the cut-out area. The combining circuit 102e combines the contour image (E), for example, on an area including the cut-out area. Let us further consider the case of multipoint ranging, where a plurality of AF ranging points is laid out. In this case, the cut-out processing circuit 102b designates, as the cut-out area, an area having a predetermined size and centered at the AF ranging point which is one of the plurality of AF ranging points and is used for focusing. The combining circuit 102e then combines the contour image (E) on an area including the cut-out area.
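During autofocus, selecting the cut-out area reduces to picking the ranging point actually used for focusing, as in the short sketch below, which reuses the `cut_out_area` helper from the earlier sketch; the list of ranging points and the in-focus index are illustrative inputs, not recited features.

```python
def cut_out_for_af(y, ranging_points, in_focus_index, size=(120, 160)):
    """Designate the cut-out area around the AF ranging point used for focusing.
    ranging_points is a list of (row, col) positions; in_focus_index identifies
    the point selected by the autofocus (both hypothetical inputs)."""
    return cut_out_area(y, ranging_points[in_focus_index], size)
```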
As described above, when the focusing-assisting image is displayed during execution of autofocus, the user is able to check the AF focusing state while recognizing the composition of the subject and the whole composition including the subject displayed in a normal style (i.e., in true color, luminance, tone, etc.).
This application claims priority of Japanese Patent Application No. P2013-035633, filed on Feb. 26, 2013. The entire subject matter of the application is incorporated herein by reference.