This nonprovisional application claims priority under 35 U.S.C. §119(a) on Patent Application No. 2009-174006 filed in Japan on Jul. 27, 2009 and on Patent Application No. 2010-130763 filed in Japan on Jun. 8, 2010, the entire contents of which are hereby incorporated by reference.
1. Field of the Invention
The present invention relates to an image reproducing apparatus which performs reproduction of images and an image sensing apparatus which obtains images by photography.
2. Description of Related Art
In a conventional image reproducing apparatus, in order to view a part of a reproduction target image in an enlarged manner, the user specifies the position and the size of the region to be viewed on the reproduction target image. In response to this instruction, the image reproducing apparatus clips an image of the specified position and size from the reproduction target image and enlarges the clipped image so as to output it to the monitor. Thus, in the conventional image reproducing apparatus, a user who wants to clip a part of the reproduction target image and view the clipped image in an enlarged manner must operate an operating key or the like so as to specify the position and the size of the region to be clipped separately. Therefore, it is difficult to display a desired image quickly and intuitively.
The same is true when taking an image. For example, in a conventional image sensing apparatus, if the user wants to record in the recording medium only the image signal inside a noted region on the image sensor in which a noted subject exists, the position and the size of the noted region must be set separately.
Further, methods of performing an electronic zoom or an optical zoom in accordance with an operation on a touch panel have also been proposed. However, no specific method of the touch panel operation has been proposed, and an operation method that can be performed intuitively is desired. In addition, as conventional methods concerning an image clipping operation, an operation of inputting a circle enclosing the subject and an operation of tracing the periphery of the subject have been proposed, but these proposals are not sufficient, and an operation method that can be performed intuitively is desired.
An image reproducing apparatus according to the present invention includes a touch panel monitor which has a display screen and receives a touch panel operation performed with an operation member touching the display screen, in which an output image, obtained by extracting from an input image an image inside an extraction region that is a part of the entire image area of the input image, is displayed on the touch panel monitor or on a monitor of an external display device. The touch panel monitor receives a region specifying operation of specifying a position and a size of the extraction region as one type of the touch panel operation when the entire image of the input image is displayed on the display screen. In the region specifying operation, the position and the size of the extraction region are specified on the basis of a position at which the operation member touches the display screen and a period of time while the operation member is touching the display screen, or on the basis of an initial point and a terminal point of a movement locus of the operation member on the display screen, or on the basis of a plurality of positions at which a plurality of operation members serving as the operation member touch the display screen.
Further, for example, an image sensing apparatus including the image reproducing apparatus may be constituted. An input image to the image reproducing apparatus can be obtained by photography with the image sensing apparatus.
A first image sensing apparatus according to the present invention includes a touch panel monitor which has a display screen and receives a touch panel operation performed with an operation member touching the display screen, an image sensor which outputs an image signal indicating an incident optical image of a subject, and an extracting unit which extracts an image signal inside an extraction region that is a part of an effective pixel region of the image sensor. The touch panel monitor receives a region specifying operation of specifying a position and a size of the extraction region as one type of the touch panel operation when the entire image based on the image signal inside the effective pixel region is displayed on the display screen. In the region specifying operation, the position and the size of the extraction region are specified on the basis of a position at which the operation member touches the display screen and a period of time while the operation member is touching the display screen, or on the basis of an initial point and a terminal point of a movement locus of the operation member on the display screen, or on the basis of a plurality of positions at which a plurality of operation members serving as the operation member touch the display screen.
A second image sensing apparatus according to the present invention includes a touch panel monitor which has a display screen and receives a touch panel operation performed with an operation member touching the display screen, an image pickup unit which has an image sensor to output an image signal indicating an incident optical image of a subject and which generates an image by photography from an output signal of the image sensor, a view angle adjustment unit which adjusts an imaging angle of view in the image pickup unit, and an incident position adjustment unit which adjusts an incident position of the optical image on the image sensor. The touch panel monitor receives a view angle and position specifying operation for specifying the imaging angle of view and the incident position as one type of the touch panel operation when a taken image obtained by the image pickup unit is displayed on the display screen. In the view angle and position specifying operation, the imaging angle of view and the incident position are specified on the basis of a position at which the operation member touches the display screen and a period of time while the operation member is touching the display screen, or on the basis of an initial point and a terminal point of a movement locus of the operation member on the display screen, or on the basis of a plurality of positions at which a plurality of operation members serving as the operation member touch the display screen.
A third image sensing apparatus according to the present invention includes an image pickup unit which has an image sensor to output an image signal indicating an incident optical image of a subject and which generates an image by photography from an output signal of the image sensor, a view angle adjustment unit which adjusts an imaging angle of view in the image pickup unit, and an incident position adjustment unit which adjusts an incident position of the optical image on the image sensor. The view angle and position specifying operation for specifying the imaging angle of view and the incident position is received as a single operation.
Meanings and effects of the present invention will become apparent from the following description of embodiments. However, the embodiments described below are merely example embodiments of the present invention, and the meanings of the present invention and of the terms of the individual elements are not limited to those described in the following embodiments.
Hereinafter, embodiments of the present invention will be described specifically with reference to the attached drawings. In the drawings to be referred to, the same part is denoted by the same reference numeral or symbol, so that overlapping description of the same part is omitted as a rule.
A first embodiment of the present invention will be described.
The digital camera 1 includes a main casing 2 shaped like a rounded rectangular solid and a monitor casing 3 shaped like a plate, which are connected to each other via a connection part. The monitor casing 3 is equipped with a camera monitor 17 as a display device. The monitor casing 3 is attached to the main casing 2 in an openable and closable manner, so that the relative position of the monitor casing 3 to the main casing 2 is variable.
Incident light from the subject enters the image sensor 33 via the individual lenses constituting the optical system 35 and the aperture stop 32. The lenses constituting the optical system 35 form an optical image of the subject on the image sensor 33. The image sensor 33 is constituted of a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) image sensor. The image sensor 33 performs photoelectric conversion of the optical image of the subject received via the optical system 35 and the aperture stop 32, and outputs an electric signal obtained by the photoelectric conversion as an image signal.
The driver 34 moves the zoom lens 30, the focus lens 31 and the correction lens 36 on the basis of a lens control signal from a photography control unit 13. When the position of the zoom lens 30 is changed, the focal length of the image pickup unit 11 and the angle of view of imaging with the image pickup unit 11 (hereinafter referred to simply as the “imaging angle of view”) are changed. At the same time, the optical zoom magnification is changed. When the position of the focus lens 31 is changed, the focal position of the image pickup unit 11 is adjusted. When the position of the correction lens 36 is changed, the optical axis is shifted, so that the incident position of the optical image on the image sensor 33 is changed. In addition, the driver 34 controls the opening amount of the aperture stop 32 (the size of its opening) on the basis of an aperture stop control signal from the photography control unit 13. As the opening amount of the aperture stop 32 increases, the amount of light incident on the image sensor 33 per unit time increases.
An analog front end (AFE) that is not illustrated amplifies the analog image signal output from the image sensor 33 and converts the signal into a digital signal (digital image signal). The obtained digital signal is recorded as image data of a subject image in an image memory 12 such as a synchronous dynamic random access memory (SDRAM). The photography control unit 13 adjusts the imaging angle of view, the focal position, and the amount of light incident on the image sensor 33 on the basis of the image data, a user's instruction, or the like. Note that the image data is a type of video signal which includes, for example, a luminance signal and a color difference signal.
An image processing unit 14 applies necessary image processing (a noise reduction process, an edge enhancement process, and the like) to the image data of the subject image stored in the image memory 12. A recording medium 15 is a nonvolatile memory constituted of a magnetic disk, a semiconductor memory, or the like. Image data after the image processing by the image processing unit 14 or image data before the image processing (so-called RAW data) can be recorded in the recording medium 15.
A record controller 16 performs record control necessary for recording various data in the recording medium 15. The camera monitor 17 displays images obtained by the image pickup unit 11 or images recorded in the recording medium 15. An operating part 18 is a part for the user to perform various operations on the digital camera 1 and, as illustrated in the drawings, includes members such as an operating key 43.
A display controller 20 controls display contents of the camera monitor 17 or the monitor of the external display device (the TV monitor 7 described later).
Operation modes of the digital camera 1 include an imaging mode in which images (still images or moving images) can be taken and recorded, and a reproducing mode in which images (still images or moving images) recorded in the recording medium 15 are reproduced and displayed on the camera monitor 17 or the monitor of the external display device. The operation mode is changed between the individual modes in accordance with the operation of the operating key 43.
In the imaging mode, imaging of a subject is performed periodically at a predetermined frame period, so that the image pickup unit 11 outputs the image signal indicating the photographed image sequence of the subject. The image sequence such as the photographed image sequence means a set of images arranged in time sequence. Image data of one frame period expresses one image. The one image expressed by the image data of one frame period is also referred to as a frame image.
The camera monitor 17 is equipped with a touch panel.
As illustrated in the drawings, an XY coordinate plane having an X axis and a Y axis is defined for the display screen 51 and for a two-dimensional image 300 displayed thereon, and an arbitrary noted point on them is expressed by coordinates (x,y).
In the display screen 51 and the two-dimensional image 300, as the X axis coordinate value x of the noted point increases, the position of the noted point moves to the right side, which is the positive side of the X axis on the XY coordinate plane; as the Y axis coordinate value y increases, the position moves to the lower side, which is the positive side of the Y axis. Conversely, as x decreases, the position of the noted point moves to the left side, and as y decreases, the position moves to the upper side.
When the two-dimensional image 300 is displayed on the display screen 51 (when the two-dimensional image 300 is displayed on the entire display screen 51), the image at the position (x,y) on the two-dimensional image 300 is displayed at the position (x,y) on the display screen 51.
When the operation member touches the display screen 51, the touch detection unit 52 illustrated in the drawings detects the touched position and outputs touch operation information indicating that position.
The digital camera 1 performs a characteristic operation according to the touch panel operation in the reproducing mode. When an image is reproduced, the digital camera 1 works as an image reproducing apparatus. The digital camera 1 can also display images (still images or moving images) recorded in the recording medium 15 on the monitor of an external display device such as a television receiver or the like.
In the first embodiment, hereinafter, an operation of the digital camera 1 in the reproducing mode and the display contents of the camera monitor 17 and the TV monitor 7 will be described. In the reproducing mode, a person who performs operations including the touch panel operation on the digital camera 1 is referred to as the “operator”, and a person who views the TV monitor 7 is referred to as the “viewer”. The operator can also be one of the viewers. An image recorded in the recording medium 15 that is to be reproduced is referred to as the “reproduction target image”. The reproduction target image can be obtained by photography with the digital camera 1. The reproduction target image is a still image or a moving image.
When the clipped image is displayed on the TV monitor 7 or the camera monitor 17, a resolution of the clipped image is converted into a resolution that is suitable as a resolution of the TV monitor 7 or the camera monitor 17. For instance, if the numbers of pixels of the image inside the clipping frame on the reproduction target image in the horizontal and the vertical directions are respectively 640 and 360, and if the numbers of pixels of the display screen of the TV monitor 7 in the horizontal and the vertical directions are respectively 1920 and 1080, the number of pixels of the image inside the clipping frame is multiplied by three in each of the horizontal and the vertical directions by a resolution conversion method using a known pixel interpolation method or the like, and then the image data is given to the TV monitor 7.
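As a rough sketch of this resolution conversion, the following Python snippet enlarges a 640×360 clip by a factor of three in each direction. The specification only refers to a known pixel interpolation method, so the nearest-neighbor interpolation and the names here are illustrative assumptions:

```python
import numpy as np

def upscale_nearest(img, sx, sy):
    """Enlarge an H x W (x C) image by integer factors by repeating
    each source pixel (nearest-neighbor interpolation)."""
    return np.repeat(np.repeat(img, sy, axis=0), sx, axis=1)

# A 360 x 640 clip tripled in each direction fills a 1080 x 1920 screen.
clipped = np.zeros((360, 640, 3), dtype=np.uint8)
for_tv = upscale_nearest(clipped, sx=3, sy=3)
assert for_tv.shape[:2] == (1080, 1920)
```

An actual apparatus would more likely use bilinear or bicubic interpolation to avoid blockiness.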
A block diagram of the portion that realizes the above-mentioned generation and display of the clipped image is illustrated in the drawings; this portion includes a clip setting unit 61, a clip processing unit 62, and a track processing unit 63.
The clip setting unit 61 generates clipping information for clipping the clipped image from the input image that is the reproduction target image on the basis of touch operation information from the touch detection unit 52, camera motion information from the camera motion decision unit 21, and track result information from the track processing unit 63. The clip processing unit 62 generates the clipped image (in other words, extracts the image inside the clipping frame from the reproduction target image) by actually clipping the image inside the clipping frame from the reproduction target image on the basis of the clipping information. The generated clipped image itself, or the image after a predetermined process is performed on it, can be displayed on the TV monitor 7 as an output image. In this case, the entire image of the reproduction target image is displayed on the camera monitor 17. However, it is also possible to display on the camera monitor 17 the same image as the image displayed on the TV monitor 7. Note that the display controller 20 also performs timing control of image reproduction on the TV monitor 7 and the camera monitor 17 (details will be described later in other embodiments).
The clipping information defines a condition for generating the clipped image as the output image from the input image as the reproduction target image. As long as the output image can be generated from the input image, any form of the clipping information can be adopted. For instance, as illustrated in the drawings, the clipping information can include the position and the size of the clipping frame on the input image.
Conversion from a coordinate system of the input image to a coordinate system of the output image can be realized by using a geometric conversion (e.g., affine conversion). Therefore, it is possible that the clipping information includes a conversion parameter of the geometric conversion for generating the output image from the input image. In this case, the clip processing unit 62 performs the geometric conversion in accordance with the conversion parameter in the clipping information so that the output image is generated from the input image.
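A minimal sketch of such a conversion parameter, assuming the clipping information holds the frame's upper-left corner and size (the names and layout below are hypothetical, not the apparatus's actual format):

```python
import numpy as np

def affine_params(frame_x, frame_y, frame_w, frame_h, out_w, out_h):
    """2x3 affine matrix mapping input-image coordinates inside the
    clipping frame onto output-image coordinates (pure scale and
    translation, the special case needed for clipping)."""
    sx, sy = out_w / frame_w, out_h / frame_h
    return np.array([[sx, 0.0, -sx * frame_x],
                     [0.0, sy, -sy * frame_y]])

# Sanity check: the frame corners land on the output corners.
M = affine_params(100, 50, 640, 360, 1920, 1080)
assert np.allclose(M @ np.array([100, 50, 1.0]), [0, 0])
assert np.allclose(M @ np.array([740, 410, 1.0]), [1920, 1080])
```

Cropping plus scaling is thus the degenerate case of an affine conversion with no rotation or shear; a fuller geometric conversion would simply use a more general matrix.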
The camera motion information and the track result information are additional information for setting the clipping information, and it is possible that the camera motion information and/or the track result information are not reflected on the clipping information at all (in this case, the camera motion decision unit 21 and/or the track processing unit 63 are unnecessary). Methods that also use the camera motion information or the track result information will be described later in other embodiments; this embodiment describes a method of setting the clipping information in accordance with the touch operation information.
The operator can specify a position and a size of the clipping frame by a plurality of operation methods. As the plurality of operation methods, first to fifth operation methods are exemplified as follows.
[First Operation Method]
A first operation method will be described. The touch panel operation according to the first operation method is an operation of pressing one point on the display screen 51 with a finger continuously for a necessary period of time. While the point is being pressed, the touch detection unit 52 outputs to the clip setting unit 61 the touch operation information indicating the pressed position (x1,y1). The clip setting unit 61 sets the clipping information in accordance with the touch operation information so that the position (x1,y1) becomes the center position of the clipping frame and the size of the clipping frame corresponds to the period of time during which the position (x1,y1) is pressed.
In the first operation method, it is supposed that the aspect ratio of the clipping frame is fixed in advance. If the above-mentioned pressing time period is zero or substantially zero, the width and the height of the clipping frame are the same as those of the input image. As the pressing time period increases from zero, the width and the height of the clipping frame are decreased from those of the input image. During the period in which the touch panel operation according to the first operation method is performed, the clipping frame being set is actually displayed on the display screen 51 (the display screen 401 illustrated in the drawings).
After the clipping information is set, the output image is generated from the image inside the clipping frame according to the clipping information and is displayed on the TV monitor 7 (the same is true in the second to fifth operation methods).
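A sketch of how the first operation method could translate a press into clipping information, assuming a linear shrink rate, a fixed aspect ratio, and clamping of the frame to the image bounds (the constants and names are hypothetical):

```python
def frame_from_press(x1, y1, press_sec, in_w=1920, in_h=1080,
                     shrink_per_sec=0.25, min_scale=0.2):
    """First operation method: the pressed point (x1, y1) becomes the
    frame center, and the frame shrinks from full size as the press
    lasts longer, keeping the input image's aspect ratio."""
    scale = max(min_scale, 1.0 - shrink_per_sec * press_sec)
    w, h = in_w * scale, in_h * scale
    # Clamp the center so the frame never leaves the input image.
    cx = min(max(x1, w / 2), in_w - w / 2)
    cy = min(max(y1, h / 2), in_h - h / 2)
    return cx, cy, w, h
```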
[Second Operation Method]
A second operation method will be described. The touch panel operation according to the second operation method is an operation of pressing two points on the display screen 51 with two fingers simultaneously. The touch detection unit 52 outputs to the clip setting unit 61 the touch operation information indicating the two pressed positions (x2A,y2A) and (x2B,y2B). Here, on the display screen 51 and on the XY coordinate plane, it is supposed that the position (x2A,y2A) is located on the upper left side of the position (x2B,y2B) (see the drawings).
Simply, for example, the clip setting unit 61 sets the clipping information in accordance with the touch operation information so that the position (x2A,y2A) becomes the start point of the clipping frame and the position (x2B,y2B) becomes the end point of the clipping frame (see the drawings).
In addition, for example, it is possible to set the clipping information so that |x2A−x2B| becomes a width of the clipping frame and that the aspect ratio of the clipping frame agrees with a desired aspect ratio. In this case, the position of the start point of the clipping frame is (x2A,y2A). Alternatively, the position of the end point of the clipping frame is (x2B,y2B). Alternatively, the center position of the clipping frame is set to ((x2A+x2B)/2,(y2A+y2B)/2).
In addition, for example, the clipping information may be set so that |y2A−y2B| becomes a height of the clipping frame and that the aspect ratio of the clipping frame agrees with a desired aspect ratio. In this case, the position of the start point of the clipping frame is (x2A,y2A). Alternatively, the position of the end point of the clipping frame is (x2B,y2B). Alternatively, the center position of the clipping frame is ((x2A+x2B)/2,(y2A+y2B)/2).
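A sketch of the width-based variant just described, where |x2A−x2B| becomes the frame width, the height is refit to the desired aspect ratio, and the start point is kept as the anchor (the 16:9 ratio and the anchor choice are assumptions):

```python
def frame_from_two_points(xa, ya, xb, yb, aspect=16 / 9):
    """Second operation method, width-based variant: |x2A - x2B| becomes
    the frame width, the height is refit to the desired aspect ratio,
    and the upper-left touch point is kept as the start point."""
    w = abs(xa - xb)
    h = w / aspect
    return xa, ya, w, h  # (start point x, start point y, width, height)

# The fourth operation method can reuse the same computation, with the
# two corners taken from the initial and terminal points of the drag.
```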
[Third Operation Method]
A third operation method will be described. The touch panel operation according to the third operation method is an operation of touching the display screen 51 with a finger and enclosing a particular region (desired by the operator) on the display screen 51 by moving the finger. In this case, the fingertip drawing the figure enclosing the particular region does not leave the display screen 51. In other words, the operator's finger draws the figure enclosing the particular region with a single stroke.
In the touch panel operation according to the third operation method, the operator's finger first touches a position (x3A,y3A) on the display screen 51, and then moves from the position (x3A,y3A) to a position (x3B,y3B) on the display screen 51 so as to enclose the periphery of the particular region. Until the finger reaches the position (x3B,y3B) from the position (x3A,y3A), the finger does not leave the display screen 51. The operator releases the finger from the display screen 51 when it reaches the position (x3B,y3B). Therefore, a movement locus of the finger from the position (x3A,y3A) as an initial point to the position (x3B,y3B) as a terminal point is specified by the touch operation information from the touch detection unit 52. Ideally, the positions (x3A,y3A) and (x3B,y3B) agree with each other, but in practice they often do not. If they do not agree, a straight line or a curve connecting the two positions may be added to the movement locus, for example.
The clip setting unit 61 sets the clipping information in accordance with the movement locus specified by the touch operation information so that the barycenter of the figure enclosed by the movement locus becomes the center of the clipping frame and the size of the clipping frame corresponds to the size of the enclosed figure. In this case, as the size of the figure increases, the size of the clipping frame is set larger. For instance, the smallest rectangular frame that includes the figure and whose center is the barycenter of the figure is set as the clipping frame. In this case, the aspect ratio of this rectangular frame should agree with a desired aspect ratio (the aspect ratio of the TV monitor 7 in this example).
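A sketch of this computation from the raw locus samples; taking the mean of the locus points as the barycenter, instead of the true centroid of the enclosed area, is a simplifying assumption:

```python
import numpy as np

def frame_from_locus(points, aspect=16 / 9):
    """Third operation method: from the closed movement locus (a list
    of (x, y) touch samples) take the barycenter of the enclosed figure
    as the frame center, and the smallest rectangle including the
    figure, widened to the desired aspect ratio, as the frame."""
    pts = np.asarray(points, dtype=float)
    cx, cy = pts.mean(axis=0)          # vertex mean as an approximation
    w = max(pts[:, 0].max() - pts[:, 0].min(), 1.0)
    h = max(pts[:, 1].max() - pts[:, 1].min(), 1.0)
    # Grow the short side so the frame still contains the figure while
    # matching the desired aspect ratio.
    if w / h < aspect:
        w = h * aspect
    else:
        h = w / aspect
    return cx, cy, w, h
```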
[Fourth Operation Method]
A fourth operation method will be described. The touch panel operation according to the fourth operation method is an operation of touching the display screen 51 with a finger and tracing, with the finger, a diagonal of the region to be clipped.
In the touch panel operation according to the fourth operation method, the operator's finger first touches a position (x4A,y4A) on the display screen 51, and then moves linearly from the position (x4A,y4A) to a position (x4B,y4B) on the display screen 51. Until the finger reaches the position (x4B,y4B) from the position (x4A,y4A), the finger does not leave the display screen 51. The operator releases the finger from the display screen 51 when it reaches the position (x4B,y4B). Therefore, a movement locus of the finger from the position (x4A,y4A) as an initial point to the position (x4B,y4B) as a terminal point is specified by the touch operation information from the touch detection unit 52. Here, on the display screen 51 and on the XY coordinate plane, it is supposed that the position (x4A,y4A) is located on the upper left side of the position (x4B,y4B) (see the drawings).
The clip setting unit 61 sets the clipping information in accordance with the touch operation information so that the position (x4A,y4A) becomes the start point of the clipping frame and the position (x4B,y4B) becomes the end point of the clipping frame (see the drawings).
[Fifth Operation Method]
A fifth operation method will be described. The touch panel operation according to the fifth operation method is an operation of touching the display screen 51 with a finger and tracing, with the finger, a half diagonal of the region to be clipped.
In the touch panel operation according to the fifth operation method, the operator's finger first touches a position (x5A,y5A) on the display screen 51, and then moves linearly from the position (x5A,y5A) to a position (x5B,y5B) on the display screen 51. Until the finger reaches the position (x5B,y5B) from the position (x5A,y5A), the finger does not leave the display screen 51. The operator releases the finger from the display screen 51 when it reaches the position (x5B,y5B). Therefore, a movement locus of the finger from the position (x5A,y5A) as an initial point to the position (x5B,y5B) as a terminal point is specified by the touch operation information from the touch detection unit 52. Here, on the display screen 51 and on the XY coordinate plane, it is supposed that the position (x5A,y5A) is located on the upper left side of the position (x5B,y5B) (see the drawings).
The clip setting unit 61 sets the clipping information in accordance with the touch operation information so that the position (x5A,y5A) becomes the center position of the clipping frame and the position (x5B,y5B) becomes the end point of the clipping frame (see the drawings).
Specifically, if the positions (x5A,y5A) and (x5B,y5B) specified by the operator are used as they are as the center position and the position of the end point of the clipping frame so as to generate the clipped image, the aspect ratio of the clipped image may not agree with a desired aspect ratio. In this case, for example, it is possible to expand the image in the clipping frame in the horizontal or the vertical direction so that the aspect ratio of the clipped image and the desired aspect ratio agree with each other, and to display the expanded image on the TV monitor 7. Alternatively, it is possible to reset the center and the end point of the clipping frame in which the aspect ratio agrees with the desired aspect ratio, in accordance with the positions (x5A,y5A) and (x5B,y5B).
In addition, for example, the clipping information may be set so that (|x5A−x5B|×2) becomes a width of the clipping frame and that the aspect ratio of the clipping frame agrees with a desired aspect ratio. In this case, the center position of the clipping frame is set to (x5A,y5A). In addition, for example, the clipping information may be set so that (|y5A−y5B|×2) becomes a height of the clipping frame and that the aspect ratio of the clipping frame agrees with a desired aspect ratio. In this case, too, the center position of the clipping frame is set to (x5A,y5A).
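A sketch of the half-diagonal variant in which (|x5A−x5B|×2) becomes the width and the height is refit to the desired aspect ratio (the 16:9 ratio is an assumption):

```python
def frame_from_center_and_corner(xc, yc, xb, yb, aspect=16 / 9):
    """Fifth operation method, width-based variant: the traced half
    width gives the frame width (|x5A - x5B| * 2), the height is refit
    to the desired aspect ratio, and (x5A, y5A) stays the center."""
    w = 2 * abs(xc - xb)
    h = w / aspect
    return xc, yc, w, h  # (center x, center y, width, height)
```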
Although the operation described above is the case where the center position (x5A,y5A) and the lower right corner position (x5B,y5B) of the clipping frame are specified, it is possible to specify not the lower right corner position but a position of the upper left corner, the upper right corner or the lower left corner in the clipping frame by the touch panel operation.
In the digital camera 1, any one of the first to the fifth operation methods may be adopted. It is also possible to configure the digital camera 1 so that a plurality of the first to the fifth operation methods can be used for the touch panel operation, and so that the digital camera 1 automatically decides which operation method is being used by determining, from the touch operation information, the number of fingers touching the display screen 51 and the moving state of the finger touching the display screen 51.
As described above, a position and a size of the clipping frame can be specified by an intuitive touch panel operation (the region specifying operation). Therefore, the operator can quickly and easily set the angle of view and the like of the reproduced image as desired. In each operation method, while the operator gives the touch panel operation to the digital camera 1, the finger does not leave the display screen 51 of the touch panel. In other words, a position and a size of the clipping frame are specified by a single operation performed without separating the finger from the display screen 51 (the single operation is finished when the finger is separated from the display screen 51). Therefore, the operation is easier and finishes in a shorter time than in the conventional apparatus, which requires a position and a size of the clipping frame to be specified separately.
For instance, in the conventional apparatus, a first operation (with a cursor key, for example) is performed for specifying the center position of the clipping frame, and a second operation (with a zoom button, for example) for specifying a size of the clipping frame is performed separately from the first operation, so that specifying a position and a size of the clipping frame is completed. In other words, in the conventional apparatus, the operation of specifying the center position of the clipping frame and the operation of specifying a size of the clipping frame are performed at different timings by different operation methods. In contrast, in this embodiment, the operation of specifying the position of the clipping frame and the operation of specifying the size of the clipping frame are unified. Therefore, when the operation of specifying the position of the clipping frame is completed, the operation of specifying the size of the clipping frame is completed at the same time, and vice versa. In other words, in this embodiment, a position and a size of the clipping frame are designated by a single indivisible operation.
Further, in comparison with “an operation of inputting a circle enclosing a subject and an operation of tracing a periphery of the subject” described in JP-A-2010-062853, paragraph 0079, the individual operation methods described above can be performed more easily and intuitively.
In the first operation method, the position at which the finger contacts the display screen 51 is set as the center position of the clipping frame. Therefore, the user can precisely set the center position of the clipping frame to a desired position.
In the second operation method, a position and a size of the clipping frame are determined the moment the two fingers touch the display screen 51. Therefore, the user can instantly complete specifying a position and a size of the clipping frame.
In the fourth operation method, diagonal corners of the clipping frame are located at the positions of the initial point and the terminal point of the movement locus of the finger. Therefore, the user can set a position and a size of the clipping frame to the desired position and size correctly and easily. In addition, the setting of the position and the like of the clipping frame can be completed more quickly than with the operation of inputting a circle enclosing the subject or the operation of tracing the periphery of the subject.
The same applies to the fifth operation method: the user can easily and correctly set a position and a size of the clipping frame to the desired position and size, and can complete the setting more quickly and easily than with the operation of inputting a circle enclosing the subject or the operation of tracing the periphery of the subject.
Further, in the above description, it is supposed that the clipped image generated by the touch panel operation is displayed on the TV monitor 7, but it is also possible to display the clipped image on the camera monitor 17 (the same is true in the second to sixth embodiments described later). Specifically, for example, before the touch panel operation is performed, the entire image of the reproduction target image (e.g., the image 310 illustrated in the drawings) is displayed on the camera monitor 17, and after the touch panel operation is performed, the clipped image according to the operation is displayed on the camera monitor 17.
A second embodiment of the present invention will be described. The second embodiment and the third to sixth embodiments described later are based on the description of the first embodiment, and the description of the first embodiment also applies to the second to sixth embodiments as long as no contradiction arises. Also in the second to sixth embodiments, similarly to the first embodiment, an operation of the digital camera 1 in the reproducing mode will be described. In addition, also in the second to sixth embodiments, similarly to the first embodiment, it is supposed that the television receiver 6 is connected to the digital camera 1.
In the second embodiment, it is assumed that the reproduction target image as the input image is a moving image. In addition, it is supposed that the number of pixels in the display screen of the TV monitor 7 is larger than the number of pixels of the image in the clipping frame (therefore, the image in the clipping frame is enlarged when it is clipped and displayed on the TV monitor 7).
The reproduction target image that is a moving image is constituted of a plurality of frame images arranged in time sequence. Each of the frame images constituting the input image as the reproduction target image is particularly referred to as an input frame image, and the n-th input frame image is denoted by the symbol Fn (n is an integer). The first, second, third, and subsequent input frame images F1, F2, F3, and so on are sequentially displayed, so that the reproduction target image as a moving image is reproduced. Note that in this specification, for simplicity of description, a symbol may be used by itself, with the name corresponding to the symbol omitted or shortened. For instance, “input frame image Fn” may be referred to simply as “image Fn”; both indicate the same thing.
In the second embodiment, after a position and the like of the clipping frame are determined by the touch panel operation, the subject in the clipping frame is tracked so that the position of the clipping frame is updated. Therefore, a position and a size of the clipping frame are determined in accordance with not only the touch operation information from the touch detection unit 52 but also the track result information from the track processing unit 63 illustrated in the drawings.
A specific example of this operation will be described below with reference to the drawings.
Right after the touch panel operation, the clip setting unit 61 generates clipping information according to the touch panel operation and supplies it to the clip processing unit 62. Thus, right after the touch panel operation, the enlarged image of the image in the clipping frame set on the input frame image Fn is displayed as the clipped image on the TV monitor 7. The position and the size of the clipping frame set on the input frame image Fn are determined on the basis of the touch operation information without depending on an output of the track processing unit 63. After that, the positions and sizes of the clipping frames set on the input frame images Fn+1, Fn+2, and so on may be the same as those on the input frame image Fn. In this embodiment, however, they are updated on the basis of the output of the track processing unit 63 (i.e., the track result information).
In order to realize this update, after the touch panel operation, the track processing unit 63 performs a track process of tracking a target object over the input frame image sequence on the basis of the image data of the sequence. Here, the input frame image sequence is constituted of the input frame image Fn and the individual input frame images after it. If the reproduction target image is one obtained by photography with the digital camera 1, the target object is a target subject of the digital camera 1 at the time the reproduction target image was photographed. The target object to be tracked by the track process is referred to as a tracking target in the following description.
In the track process, positions and sizes of the tracking target in the individual input frame images are sequentially detected on the basis of the image data of the input frame image sequence. Actually, an image area (in other words, an image region) in which image data indicating the tracking target exists is set as a tracking target region in each input frame image, and the center position (or barycenter position) and the size of the tracking target region are detected as the position and the size of the tracking target. The track processing unit 63 outputs the track result information containing information indicating the position and the size of the tracking target in each input frame image. As a method of the track process, any tracking method including known methods can be used. For instance, a mean shift method, a block matching method, or a tracking method based on an optical flow may be used for realizing the track process.
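As a minimal sketch of one of these options, block matching, the tracking target region found in the previous frame is compared by sum of absolute differences against nearby candidate positions in the next frame. Grayscale frames, the search radius, and all names here are assumptions:

```python
import numpy as np

def track_block_matching(prev_frame, next_frame, box, search=16):
    """One step of a block-matching track process: the tracking target
    region `box` = (x, y, w, h) found in the previous frame is compared
    against nearby positions in the next frame, and the position with
    the smallest sum of absolute differences wins."""
    x, y, w, h = box
    template = prev_frame[y:y + h, x:x + w].astype(float)
    best_sad, best_xy = np.inf, (x, y)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            nx, ny = x + dx, y + dy
            if (nx < 0 or ny < 0 or ny + h > next_frame.shape[0]
                    or nx + w > next_frame.shape[1]):
                continue
            cand = next_frame[ny:ny + h, nx:nx + w].astype(float)
            sad = np.abs(cand - template).sum()
            if sad < best_sad:
                best_sad, best_xy = sad, (nx, ny)
    return (*best_xy, w, h)  # updated tracking target region
```

The clip setting unit can then recenter the clipping frame on the returned region.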
The clip setting unit 61 updates the clipping information on the basis of the track result information so that the tracking target region is included in the clipping frame set in each input frame image after the input frame image Fn. Simply, for example, the clipping information is sequentially updated on the basis of the track result information so that the center of the tracking target region and the center of the clipping frame agree or substantially agree with each other. The size of the clipping frame may be constant, or it may be updated in accordance with the size of the tracking target region.
If the tracking target can no longer be detected from the input frame image because it goes out of the frame, the clipping control should be canceled. The clipping control means the control of generating the clipped image and displaying it on the TV monitor 7 and/or the camera monitor 17. Canceling the clipping control means ceasing the generation of the clipped image and its display on the TV monitor 7 and/or the camera monitor 17. After the clipping control is canceled, the entire image of the input frame image is displayed on the TV monitor 7 and the camera monitor 17.
In addition, the clipping control may also be canceled in the case where it is decided that the tracking target has not moved for a predetermined time after the tracking target is set. This is because, if a non-moving object is continuously displayed in an enlarged manner, the displayed moving image may become monotonous and bore the viewer. If the level of movement of the center position of the tracking target region in the input frame image remains lower than a predetermined value for a predetermined time, it can be decided that the tracking target has not moved for the predetermined time.
Various methods may be used for setting the tracking target.
For instance, a contour extracting process based on the image data can be used to extract an object that exists at or near the center of the clipping frame set in the image Fn, and the extracted object can be set as the tracking target.
Alternatively, for example, it is also possible to determine a main color of the image inside the clipping frame on the image Fn on the basis of the image data of that image, and to set an object having the main color inside the clipping frame on the image Fn as the tracking target. The barycenter of the image area having the main color in the clipping frame on the image Fn can be set as the center of the tracking target region, and the image area having the main color can be set as the tracking target region. The main color means, for example, the dominant color or the most frequent color in the image in the clipping frame on the image Fn. The dominant color of an image means the color that occupies the largest part of the image area of the image. The most frequent color of an image means the color that has the highest frequency in a color histogram of the image (the dominant color and the most frequent color may be the same).
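A sketch of determining the most frequent color with a coarse RGB histogram, given the image inside the clipping frame as an H×W×3 array (the bin count is an assumption):

```python
import numpy as np

def most_frequent_color(region, bins=8):
    """Vote the pixels of the image inside the clipping frame (an
    H x W x 3 uint8 array) into a coarse RGB histogram and return a
    representative color for the fullest bin."""
    step = 256 // bins
    q = (region.astype(np.int64) // step).reshape(-1, 3)  # quantize
    codes = q[:, 0] * bins * bins + q[:, 1] * bins + q[:, 2]
    top = np.bincount(codes, minlength=bins ** 3).argmax()
    r, g, b = top // (bins * bins), (top // bins) % bins, top % bins
    # Bin centers stand in for the exact most frequent color.
    return (r * step + step // 2, g * step + step // 2,
            b * step + step // 2)
```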
Alternatively, for example, it is possible to detect the face of a person in the image on the basis of the image data of the image in the clipping frame on the image Fn, and to set the detected face, or the person having the face, as the tracking target.
Still alternatively, for example, it is also possible to set the tracking target on the basis of the touch operation information generated by the touch panel operation. For instance, it is possible to set, as the tracking target, an object existing at the position on the display screen 51 that is touched first by the finger in the touch panel operation.
Specifically, for example, it is also possible to use the following tracking target setting method based on the touch panel operation. Consider a case where the input frame image displayed on the camera monitor 17 when the touch panel operation is made is the image 430 illustrated in the drawings. In this touch panel operation, the operator moves a finger on the display screen 51 so as to enclose a noted subject on the image 430, and the object inside the enclosed figure is set as the tracking target.
This touch panel operation is a variation of that according to the third operation method described above in the first embodiment (see the drawings).
By updating the position and the like of the clipping frame by using the track process, it is possible to continuously display the enlarged image of the object noted by the operator (and the viewer).
A third embodiment of the present invention will be described. Also in the third embodiment, similarly to the second embodiment, it is assumed that the reproduction target image as the input image is a moving image, and it is supposed that the number of pixels in the display screen of the TV monitor 7 is larger than the number of pixels of the image in the clipping frame.
In the third embodiment, after a position and the like of the clipping frame are determined by the touch panel operation, cancellation or the like of the clipping control is performed as needed on the basis of the camera motion information from the camera motion decision unit 21 illustrated in the drawings.
A specific example of this operation will be described below with reference to the drawings.
Right after the touch panel operation, the clip setting unit 61 generates clipping information according to the touch panel operation and supplies the same to the clip processing unit 62. Thus, right after the touch panel operation, the enlarged image of the image in the clipping frame set on the input frame image Fn is displayed as the clipped image on the TV monitor 7. Similar clipping control is performed also for the individual input frame images after the input frame image Fn, but the clipping control can be cancelled on the basis of the camera motion information.
The camera motion decision unit 21 decides the state of the camera movement at the time the input frame image sequence was photographed on the basis of the image data of the sequence. Here, the input frame image sequence is constituted of the input frame image Fn and the individual input frame images after it. The camera movement means a movement of the main casing 2 by a panning operation (an operation of turning the main casing 2 in the yawing direction) or the like. Other camera movements include a movement of turning the main casing 2 in the tilting or rolling direction and a movement of translating the main casing 2. In the following description, however, for convenience, it is supposed that the camera movement is a movement of the main casing 2 by the panning operation.
For instance, the camera motion decision unit 21 estimates the presence or absence of a panning operation on the basis of an optical flow between first and second frame images that are adjacent in time, detected from their image data, so as to decide the presence or absence of the camera movement. The method of estimating the presence or absence of a panning operation on the basis of the optical flow is known. The first and the second frame images are, for example, the input frame image Fn+9 and the input frame image Fn+10. If it is estimated that there is a panning operation, it is decided that there is a camera movement; if it is estimated that there is no panning operation, it is decided that there is no camera movement.
In addition, for example, the camera motion decision unit 21 can decide the presence or absence of a camera movement by using a scene change decision based on color histograms. For instance, color histograms of the first and the second frame images are generated from the image data of the first and the second frame images (e.g., the images Fn+9 and Fn+10), and a difference degree of the color histograms between the first and the second frame images is calculated. If the difference degree is relatively large, it is decided that there is a camera movement; if the difference degree is relatively small, it is decided that there is no camera movement. If an image of a sea scene is taken by the panning operation after an image of a mountain landscape was taken, the color histogram changes greatly between before and after the panning operation. From this change of the color histogram, the presence or absence of the camera movement can be decided.
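A sketch of this histogram-based decision; the bin count and the difference-degree threshold below are assumptions, not values from the specification:

```python
import numpy as np

def camera_moved(frame_a, frame_b, bins=16, threshold=0.3):
    """Scene-change style decision: compare normalized color histograms
    of two temporally adjacent frames; a large difference degree is
    taken to indicate a camera movement such as a panning operation."""
    def hist(frame):
        h, _ = np.histogramdd(frame.reshape(-1, 3),
                              bins=(bins,) * 3,
                              range=[(0, 256)] * 3)
        return h / h.sum()
    # Half the L1 distance between the histograms lies in [0, 1].
    diff = np.abs(hist(frame_a) - hist(frame_b)).sum() / 2
    return diff > threshold
```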
Further, if a camera movement sensor for detecting a movement of the main casing 2 is provided and its detection data is recorded in the recording medium 15 when the input frame image sequence is taken, the camera motion decision unit 21 can detect the presence or absence of a panning operation from the detection data so as to decide the presence or absence of the camera movement. The camera movement sensor is, for example, an angular velocity sensor for detecting the angular velocity of the main casing 2 or an acceleration sensor for detecting the acceleration of the main casing 2.
If it is decided that there is a camera movement while the clipping control is performed, the clip setting unit 61 and the clip processing unit 62 cancel the clipping control at the decision time point. For instance, if the panning operation is performed in the period between the input frame images Fn+9 and Fn+10 and it is decided that there is a camera movement between the input frame images Fn+9 and Fn+10, the clipping control for the input frame image Fn+10 is canceled so that the entire image of the input frame image Fn+10 is displayed on the TV monitor 7 (and the camera monitor 17) (see the drawings).
However, after the clipping control has once been cancelled, it is also possible to proceed as follows. Suppose that the first panning operation has been performed between the input frame images Fn+9 and Fn+10 and, as a result, the clipping control that had been performed for the input frame images Fn to Fn+9 is cancelled at the time point of displaying the input frame image Fn+10. In this case, the camera motion decision unit 21 stores the clipping information set for the input frame image Fn+9, a taken image before the first panning operation, as initial clipping information. Then, regarding the input frame image Fn+9 as a background image, the similarity between each input frame image (Fn+11, Fn+12, and so on) obtained after the input frame image Fn+10 and the background image is evaluated. When an input frame image giving a high similarity is supplied to the camera motion decision unit 21, the clipping control is restarted by using the initial clipping information.
More specifically, for example, a binary differential image between the input frame image and the background image is generated for each input frame image obtained after the input frame image Fn+10, and the similarity between the input frame image and the background image is evaluated from the binary differential image. The smaller the sum of the absolute values of the pixel signals of the individual pixels of the binary differential image, the higher the similarity. The clipping control is not restarted until a similarity higher than a predetermined reference similarity is obtained. For instance, if the similarities corresponding to the input frame images Fn+11 to Fn+19 are lower than the reference similarity and the similarity corresponding to the input frame image Fn+20 is higher than the reference similarity, it is decided that a second panning operation was performed right before the input frame image Fn+20 was taken and that the imaging direction of the digital camera 1 has returned to that before the first panning operation, so the clipping control is restarted. The initial clipping information is applied to the input frame images (including the input frame image Fn+20) after the clipping control is restarted.
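A sketch of this similarity evaluation and restart decision, assuming grayscale frames and hypothetical threshold values:

```python
import numpy as np

def similarity_to_background(frame, background, pixel_thresh=30):
    """Binary differential image: mark pixels whose absolute difference
    from the background exceeds a threshold; the fewer marked pixels,
    the higher the similarity (1.0 means identical)."""
    diff = np.abs(frame.astype(int) - background.astype(int))
    binary = (diff > pixel_thresh).astype(np.uint8)
    return 1.0 - binary.mean()

def should_restart_clipping(frame, background, reference=0.9):
    """Restart the clipping control once the similarity to the stored
    background image exceeds the reference similarity."""
    return similarity_to_background(frame, background) > reference
```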
This method can be adapted to a situation in which, for example, the imaging direction is returned by a second panning operation to the direction before the first panning operation, as illustrated in the drawings.
Note that it is possible to restart the clipping control on the basis of the initial clipping information according to an instruction from the operator. For instance, in the above-mentioned example, after the clipping control is cancelled by the first panning operation, the initial clipping information is stored while the entire images of the individual input frame images after the input frame image Fn+10 are sequentially displayed on the TV monitor 7 and the camera monitor 17. In this case, a particular icon is also displayed on the display screen 51 of the camera monitor 17. When the operator touches the icon with a finger, the initial clipping information is applied to the input frame images after the time point of the touching operation, and the clipping control is restarted.
If the clipping control were simply continued when the panning operation or the like is performed, an image area that is not noted by the operator or the viewer would be enlarged and displayed, which is usually undesirable. This problem can be solved by the method of the third embodiment.
A fourth embodiment of the present invention will be described. In the fourth embodiment, the timing at which the touch panel operation is reflected on the display image will be described. Operations performed by the operator on the digital camera 1 for displaying a desired image on the TV monitor 7 or the camera monitor 17 include the above-mentioned touch panel operation and other setting operations. A series of periods of the touch panel operation and the setting operation is referred to as an operation period. The operation period can be considered to start at the same time as the touch panel operation. However, it is also possible to start the operation period by a predetermined operation performed by the operator. The operation period may be finished simultaneously with the end of the touch panel operation (i.e., the operation period may be finished when the finger is released from the display screen 51), or the operation period may be finished in accordance with a predetermined operation performed by the operator.
In a first method, when the touch panel operation is started, the clipped image is promptly displayed on the TV monitor 7 or the camera monitor 17 in accordance with the touch panel operation, without waiting for the end of the operation period. More specifically, for example, in the case where the above-mentioned first operation method is used (see the drawings), the clipped image according to the clipping frame at each time point is displayed and updated while the position is being pressed.
In a second method, the result of the touch panel operation is not reflected on the display image until the touch panel operation is completed, or until the operation period is finished. For instance, when the first operation method is utilized (see the drawings), the clipped image is displayed only after the finger is released from the display screen 51.
A fifth embodiment of the present invention will be described. In the fifth embodiment, the display content control of the camera monitor 17 or the TV monitor 7 during the operation period will be described. As display content control methods, first to fifth display control methods are described below. In the digital camera 1, any one of the first to the fifth display control methods can be performed. One display control method may also be combined with another, as long as no contradiction arises.
The first display control method will be described. In the first and the second display control methods, it is assumed that the clipped image is displayed on the camera monitor 17 after the clipping information is generated. In the first display control method, the contents of the touch panel operation are promptly reflected on the camera monitor 17. Specifically, when the touch panel operation is performed, the clipped image according to the clipping information corresponding to the touch panel operation is promptly generated from the input image and displayed on the camera monitor 17 (in other words, when the touch panel operation for specifying a position and a size of the clipping frame is performed, the specified contents are promptly reflected on the display contents of the camera monitor 17). With the first display control method, the responsiveness of the display contents to the touch panel operation can be improved.
A second display control method will be described. In the second display control method, the contents of the touch panel operation performed during the operation period are reflected on the camera monitor 17 step by step (in other words, when the touch panel operation for specifying a position and a size of the clipping frame is performed, the specified contents are reflected on the display contents of the camera monitor 17 step by step). For instance, suppose that the size of the clipping frame is set to ⅓ of that of the input image by the touch panel operation from an initial state in which the entire image of the input image is displayed on the camera monitor 17. In this case, right after the touch panel operation, the clipped image using a clipping frame having a size of ⅔ of the input image is displayed on the camera monitor 17. Then, after a predetermined time passes, the clipped image using a clipping frame having a size of ½ of the input image is displayed on the camera monitor 17. Further, after a predetermined time passes, the clipped image using the clipping frame having the specified size of ⅓ of the input image is displayed on the camera monitor 17. The center position of each clipping frame agrees with that specified by the touch panel operation. According to the second display control method, the result of the touch panel operation is reflected on the image gradually, so that the operator can easily set the display image to the desired one.
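A sketch of such a stepwise schedule; linear interpolation of the frame scale is an assumption (the example above steps through ⅔, ½, and then ⅓ of the input image instead):

```python
def stepwise_scales(target_scale, steps=3):
    """Return the intermediate clipping-frame scales used to reflect a
    newly specified frame size step by step instead of all at once."""
    return [1.0 + (target_scale - 1.0) * k / steps
            for k in range(1, steps + 1)]

# stepwise_scales(1 / 3) -> approximately [0.78, 0.56, 0.33]; the frame
# shown on the camera monitor shrinks once per step, keeping the center
# position specified by the touch panel operation.
```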
A third display control method will be described. In the third display control method, it is assumed that the entire image of the reproduction target image is displayed on the camera monitor 17 during the operation period. In the third display control method, the clipping frame is actually displayed on the camera monitor 17 during the operation period. Specifically, for example, if the reproduction target image supplied to the clip processing unit 62 is the reproduction target image 310 illustrated in
A fourth display control method will be described. In the fourth display control method, clip setting information is displayed only on the camera monitor 17. The clip setting information means information that helps the operator determine a position and the like of the clipping frame.
In the fourth display control method, for example, an image 450 as illustrated in
The clip setting information is not limited to that described above. The clipping frame displayed on the camera monitor 17 as described above in the third display control method is also one type of the clip setting information. In addition, a numeric value or the like indicating a position or a clipping size of the clipping frame may be included in the clip setting information. In addition, it is also possible to display any icon that supports setting of the clipping frame (e.g., the icon described above in the third embodiment) as the clip setting information on the camera monitor 17.
Further, it is also possible to perform the display according to the third or the fourth display control method at times other than during the operation period. For instance, it is possible to display on the camera monitor 17 the entire image of the reproduction target image on which the clipping frame is superposed, or an image such as the image 450 illustrated in
A fifth display control method will be described. In the fifth display control method, it is assumed that the reproduction target image is a moving image. In the case where the reproduction target image is a moving image, after the touch panel operation, the clipped images are sequentially generated from the frame images constituting the reproduction target image so that the moving image constituted of the clipped image sequence is displayed on the TV monitor 7. In addition, the moving image constituted of the clipped image sequence or the moving image constituted of the frame image sequence (i.e., the reproduction target image) is displayed on the camera monitor 17. In the fifth display control method, during the operation period, reproduction of the moving image displayed on the camera monitor 17 is temporarily stopped. Specifically, for example, the image displayed on the camera monitor 17 at the start time of the operation period (a clipped image or a frame image as a still image) is displayed fixedly on the camera monitor 17 during the operation period. By temporarily stopping the reproduction of the moving image, the operator can easily perform various operations. When the operation period is finished, the reproduction of the moving image to be displayed on the camera monitor 17 is restarted.
In each display control method described above, attention is paid particularly to the display contents of the camera monitor 17, but a whole or a part of those display contents may also be displayed on the TV monitor 7, as long as no contradiction arises.
A sixth embodiment of the present invention will be described. Display content control of the TV monitor 7 after the operation period is finished will be described. As display content control methods according to the sixth embodiment, sixth and seventh display control methods are described as follows. In the digital camera 1, the sixth or the seventh display control method can be performed.
A sixth display control method will be described. In the sixth and the seventh display control methods, it is assumed that the contents of the touch panel operation performed during the operation period are not reflected on the TV monitor 7 in the operation period. In the sixth display control method, the contents of the touch panel operation performed during the operation period are promptly reflected on the TV monitor 7 right after the operation period is finished. In other words, display of the clipped image corresponding to the touch panel operation is not performed on the TV monitor 7 during the operation period; when the operation period is finished, the clipped image based on the clipping information corresponding to the touch panel operation is promptly generated from the reproduction target image and is displayed on the TV monitor 7. The sixth display control method improves the responsiveness of the display contents to the touch panel operation.
A seventh display control method will be described. In the seventh display control method, the contents of the touch panel operation performed during the operation period are reflected step by step on the TV monitor 7 right after the operation period is finished. For instance, it is supposed that a size of the clipping frame is set to ⅓ of that of the input image by the touch panel operation from an initial state in which the entire image of the input image is displayed on the camera monitor 17. In this case, right after the touch panel operation, the clipped image using the clipping frame having a size of ⅔ of the input image is displayed on the TV monitor 7. Then, after a predetermined time passes, the clipped image using the clipping frame having a size of ½ of the input image is displayed on the TV monitor 7. Further, after a predetermined time passes, the clipped image using the clipping frame having a size of ⅓ of the input image is displayed on the TV monitor 7. The center position of each clipping frame agrees with that specified by the touch panel operation. According to the seventh display control method, the viewer can easily understand the relationship between the display images before and after the clipping control is performed.
It is possible to combine any one of the first to the fifth operation methods described above in the first embodiment with any method described above in the second to the sixth embodiments.
A seventh embodiment of the present invention will be described. In the above description, a still image or a moving image taken in the imaging mode is temporarily recorded in the recording medium 15, and after that, in the reproducing mode, the still image or the moving image read from the recording medium 15 is supplied to the clip processing unit 62 as the input image. In contrast, in the seventh embodiment, the process of generating a still image or a moving image of clipped images from the still image or the moving image obtained by photography is performed in real time in the imaging mode. Note that as to the digital camera 1 according to the seventh embodiment, the clip setting unit 61 and the clip processing unit 62 illustrated in
An operation when a still image is taken in the imaging mode will be described. When a still image is taken in the imaging mode, one frame image indicating the still image is displayed on the camera monitor 17 and is supplied to the clip processing unit 62 illustrated in
After the clipped image is generated, the display controller 20 can display the clipped image on the camera monitor 17 (or the TV monitor 7). In addition, the image data of the clipped image can be recorded in the recording medium 15. It is possible to record also the entire image data of the input image together with the image data of the clipped image in the recording medium 15.
An operation when a moving image is taken in the imaging mode will be described. When a moving image is taken in the imaging mode, the individual frame images forming the moving image are sequentially displayed on the camera monitor 17 and are supplied as the input frame images to the clip processing unit 62 illustrated in
The display controller 20 can display the clipped image sequence generated from the input frame image sequence on the camera monitor 17 (or the TV monitor 7). In addition, the image data of the clipped image sequence can be recorded on the recording medium 15. Together with the image data of the clipped image sequence, the entire image data of the input frame image sequence may also be recorded in the recording medium 15.
In addition, when a moving image is photographed, the image data of each input frame image may also be supplied to the track processing unit 63 and/or the camera motion decision unit 21 illustrated in
The seventh embodiment is an embodiment based on the description in the first embodiment, and the description in the first embodiment can also be applied to the seventh embodiment as long as no contradiction arises. Further, the descriptions in the second to the sixth embodiments can also be applied to the seventh embodiment as long as no contradiction arises. When the descriptions in the first to the sixth embodiments are applied to the seventh embodiment, some terms adapted to the reproducing mode should be read as the corresponding terms adapted to the imaging mode. Specifically, for example, when the descriptions in the first to the sixth embodiments are applied to the seventh embodiment, “operator” and “reproduction target image” in the descriptions in the first to the sixth embodiments should be read as “photographer” and “record target image” (or simply “target image”), respectively.
Further, the image sensor 33 illustrated in
The entire image of the frame image is formed of the output image signals of the individual light receiving pixels arranged in the effective pixel region. Since the clipped image is a part of the entire image of the frame image, the clipped image is formed of the output image signals of the light receiving pixels in a part of the effective pixel region. The region where that part of the light receiving pixels is arranged can be regarded as the clipping region on the image sensor 33. The clipping region on the image sensor 33 is a part of the effective pixel region. It can therefore be said that the clip processing unit 62 is a unit that extracts the output image signals of the light receiving pixels in the clipping region set on the effective pixel region.
Considering this, when a moving image is taken, after the clipping information is set, it is possible to define a clipping region (clipping frame) according to the clipping information on the image sensor 33 so as to read only the output image signal of the light receiving pixels in the clipping region from the image sensor 33. Here, the image formed of the read image signal is equivalent to the clipped image described above, and this image may be displayed as the output image on the camera monitor 17 (or the TV monitor 7) and may be recorded in the recording medium 15.
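As a minimal sketch, assuming the full-sensor image is available as an array, the extraction just described amounts to the following; the function name, the tuple layout, and the image size are illustrative only.

    import numpy as np

    def read_clipping_region(sensor_image, frame):
        # frame = (x, y, width, height) in effective-pixel coordinates.
        # With a sensor supporting windowed readout, this crop would be
        # done at readout time, so that pixels outside the clipping
        # region are never transferred from the image sensor 33.
        x, y, w, h = frame
        return sensor_image[y:y + h, x:x + w]

    # An illustrative effective pixel region and clipping region:
    full = np.zeros((1080, 1920), dtype=np.uint16)
    clipped = read_clipping_region(full, (640, 360, 640, 360))
    print(clipped.shape)  # (360, 640)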
In this embodiment too, the same effect as that of the embodiments described above can be obtained. Specifically, since the clipping position and size can be specified by an intuitive touch panel operation (region specifying operation), the operator can set an angle of view and the like of the display image or the record image to desired ones quickly and easily.
An eighth embodiment of the present invention will be described. As a method of extracting a subject image in a particular region, the method using so-called electronic zoom is described above in the seventh embodiment, while in the eighth embodiment, a method using optical zoom will be described. The following description in the eighth embodiment is a description of an operation of the digital camera 1 in the imaging mode.
In the imaging mode, the frame image sequence obtained by sequential photography is displayed as a moving image on the camera monitor 17 under control of the display controller 20. In the eighth embodiment, the image displayed on the camera monitor 17 is the entire image of the frame image. In the state where the entire image of the frame image is displayed on the camera monitor 17, the photographer can perform the same touch panel operation as that described above. When the touch panel operation is performed to the camera monitor 17, the clip setting unit 61 illustrated in
In the eighth embodiment, a frame corresponding to the clipping frame described above in each embodiment is referred to as an expansion specifying frame. Now, for specific description, it is supposed that when a frame image 500 illustrated in
In this case, the photography control unit 13 adjusts the imaging angle of view and adjusts an incident position of the optical image of the subject on the image sensor 33, on the basis of the clipping information to be said as expansion specifying information, so that the optical images of the subjects that have been formed at positions (xA1,yA1) and (xA2,yA2) on the image sensor 33 when the frame image 500 is taken are formed at the upper left corner and the lower right corner in the effective pixel region after a time period necessary for optical control passes. The adjustment of the imaging angle of view is realized by movement of the zoom lens 30 illustrated in
A frame image 510 obtained by photography after the time period necessary for optical control passes is illustrated in
The photography control unit 13 can realize the above-mentioned optical control as follows. The movement direction and the movement amount of the correction lens 36 that are necessary for forming, at the center position of the effective pixel region, the optical image of the subject that has been formed at the center position ((xA1+xA2)/2,(yA1+yA2)/2) of the expansion specifying frame are determined by using a lookup table or a conversion expression that is prepared in advance. In addition, a ratio of a size (width or height) of the effective pixel region to a size (width or height) of the expansion specifying frame is determined, as is a ratio of the imaging angle of view when the frame image 500 is taken to the imaging angle of view when the frame image 510 is taken. Then, the movement amount of the zoom lens 30 necessary for matching the former ratio with the latter ratio is determined by using a lookup table or a conversion expression that is prepared in advance (the movement direction of the zoom lens 30 is known). Then, in the period after photography of the frame image 500 is finished until the exposure of the frame image 510 is started, the correction lens 36 is actually moved in accordance with the determined movement direction and movement amount of the correction lens 36, and the zoom lens 30 is actually moved in accordance with the determined movement amount of the zoom lens 30. Thus, the above-mentioned optical control is realized.
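The two movement amounts can be summarized in the following Python sketch. The lookup tables are represented by placeholder functions, since their contents are prepared in advance and are not specified in the text; all names and numbers are illustrative.

    def plan_optical_control(frame, effective_size, shift_lut, zoom_lut):
        # frame: ((xA1, yA1), (xA2, yA2)), the expansion specifying frame.
        (x1, y1), (x2, y2) = frame
        eff_w, eff_h = effective_size  # size of the effective pixel region

        # Correction lens 36: bring the subject imaged at the frame center
        # ((xA1 + xA2) / 2, (yA1 + yA2) / 2) to the center of the region.
        cx, cy = (x1 + x2) / 2.0, (y1 + y2) / 2.0
        shift = shift_lut(cx - eff_w / 2.0, cy - eff_h / 2.0)

        # Zoom lens 30: match the view-angle ratio to the ratio of the
        # effective-region size to the expansion-specifying-frame size.
        ratio = eff_w / float(x2 - x1)
        zoom_move = zoom_lut(ratio)
        return shift, zoom_move

    # Placeholder "lookup tables" for demonstration only:
    shift, zoom = plan_optical_control(
        ((600, 300), (1000, 600)), (1920, 1080),
        shift_lut=lambda dx, dy: (0.01 * dx, 0.01 * dy),
        zoom_lut=lambda r: 0.5 * (r - 1.0))
    print(shift, zoom)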
The display controller 20 can display each of the frame images 500 and 510 as a still image on the camera monitor 17 (and the TV monitor 7), and can display the frame image sequence including the frame images 500 and 510 as a moving image on the camera monitor 17 (and the TV monitor 7). The record controller 16 can record each of the frame images 500 and 510 as a still image in the recording medium 15, and can record the frame image sequence including the frame images 500 and 510 as a moving image in the recording medium 15.
The eighth embodiment is an embodiment based on the description in the first embodiment, and the description in the first embodiment can also be applied to the eighth embodiment as long as no contradiction arises. Further, the descriptions in the second to the seventh embodiments can also be applied to the eighth embodiment as long as no contradiction arises. When the descriptions in the first to the sixth embodiments are applied to the eighth embodiment, some terms adapted to the reproducing mode should be read as another term adapted to the imaging mode. Specifically, for example, when the descriptions in the first to the sixth embodiments are applied to the eighth embodiment, “operator” in the descriptions in the first to the sixth embodiments should be read as “photographer”.
In this embodiment too, the same effect as that of the embodiments described above can be obtained. Specifically, since a position and a size of the expansion specifying frame can be specified by an intuitive touch panel operation (view angle and position specifying operation), the operator can set an angle of view and the like of the display image or the record image to desired ones quickly and easily. In addition, since the enlarged image of the target subject is obtained by the optical control, image quality of the display image or the record image is improved compared with the seventh embodiment in which it is obtained by electronic zoom.
Although the method of realizing the adjustment of the incident position of the optical image by moving the correction lens 36 is described above, it is also possible to realize the adjustment of the incident position by disposing, in the optical system 35 instead of the correction lens 36, a variangle prism (not shown) that can adjust a refraction angle of the incident light from the subject, and by driving the variangle prism. Alternatively, instead of driving an optical member such as the correction lens 36 or the variangle prism, it is possible to move the image sensor 33 in the direction perpendicular to the optical axis so as to realize the adjustment of the incident position. The function of driving the variangle prism or the function of moving the image sensor 33 may be performed by the photography control unit 13 illustrated in
A ninth embodiment of the present invention will be described. The ninth embodiment is an embodiment based on the description in the first embodiment, and as to matters that are not particularly described in this embodiment, the description in the first embodiment is also applied to this embodiment as long as no contradiction arises. Further, descriptions in the second to the sixth embodiments can also be applied to this embodiment as long as no contradiction arises. In the ninth embodiment too, similarly to the first embodiment, an operation of the digital camera 1 in the reproducing mode will be described. In the first embodiment, the first to the fifth operation methods for specifying a position and a size of the clipping frame (e.g., the clipping frame 311 illustrated in
In the following description, the input image that is a reproduction target image itself is also referred to as an original input image, for convenience sake. It is also possible that the clipped image extracted from the original input image by using the method Ai is set as a new input image, and the method Ai is further applied to the new input image. In
When the touch panel operation according to the method Ai described above is performed, the angle of view of the display image is decreased. By utilizing another touch panel operation, it is also possible to increase the angle of view of the display image. Specifically, for example, as illustrated in
In this embodiment, a method of increasing and decreasing the angle of view of the display image in a switching manner by the touch panel operation will be described. As apparent from the above description, the decrease in the size of the clipping frame causes a decrease in the angle of view of the display image. The increase in the size of the clipping frame causes an increase in the angle of view of the display image. Therefore, the method of increasing and decreasing the angle of view of the display image in a switching manner can be said to be a method of increasing and decreasing the size of the clipping frame in a switching manner. In the following description, for convenience sake, the state where the clipping frame 601 illustrated in
The user can use the touch panel so as to perform the operation of changing the clipping frame set on the original input image 600 from the clipping frame 601 to the clipping frame 601A (hereinafter referred to as an increasing operation) and the operation of changing the clipping frame set on the original input image 600 from the clipping frame 601 to the clipping frame 601B (hereinafter referred to as a decreasing operation). The former change corresponds to the increase (i.e., expansion) in a size of the clipping frame, while the latter change corresponds to the decrease (i.e., reduction) in a size of the clipping frame. Each of the increasing operation and the decreasing operation is one type of the touch panel operation. The touch panel operation according to the method Ai described above in the first embodiment is one type of the decreasing operation. Each of the various methods of increasing a size of the clipping frame as follows is one type of the increasing operation. When a size of the clipping frame is increased in accordance with the increasing operation, the center position of the clipping frame may be kept the same before and after the increase, or the center position of the clipping frame after the increase may be determined on the basis of the increasing operation (the same is true in other embodiments described later).
—Increase/Decrease Switching Method—
First, as a method of switching between increase and decrease of a size of the clipping frame, a plurality of switching methods will be described. By each of the switching methods, a change direction of a size of the clipping frame is determined. The plurality of switching methods include the following methods B1 to B6.
[Method B1]
In the method B1, a change direction of a size of the clipping frame is determined in advance by an increasing or decreasing direction setting operation as one type of the touch panel operation or an increasing or decreasing direction setting operation with respect to the operating part 18 illustrated in
The method B1 can be performed in combination with any one of the methods A1 to A5. For instance, in the case where it is combined with the method A1, if the determined direction is the decrease direction, a touch position is set to the center so that a size of the clipping frame (clipping frame 651 in the example illustrated in
In the case where the method B1 is combined with any one of the methods A2 to A5, if the determined direction is the decrease direction, the clipping frame should be set in accordance with the methods A2 to A5. By this setting, a size of the clipping frame is decreased. In the case where the method B1 is combined with any one of the methods A2 to A5, if the determined direction is the increase direction, a size of the clipping frame should be increased by a predetermined touch panel operation (e.g., an operation of pressing a specific point on the display screen 51 with a finger). A method of setting an increase rate will be described later (the same is true for the methods B2 to B6).
[Method B2]
In a method B2, a change direction of a size of the clipping frame is determined by a movement direction from an initial point to a terminal point in a movement locus of a touch position (it can be said that a change direction of a size of the clipping frame is determined on the basis of a positional relationship between the initial point and the terminal point). In a section corresponding to the method B2 in
[Method B3]
In a method B3, a change direction of a size of the clipping frame is determined on the basis of a positional relationship between the initial point and the terminal point on the movement locus of the touch position. The method B3 can also be performed in combination with the method A4 or A5. For instance, if the terminal point is closer to the center of the display screen 51 than the initial point, the clipping frame should be set in accordance with the method A4 or A5. By this setting, a size of the clipping frame can be decreased. On the contrary, if the initial point is closer to the center of the display screen 51 than the terminal point, a size of the clipping frame should be increased. It is possible to set the relationship between the positional relationship and a change direction of a size of the clipping frame in the opposite manner.
[Method B4]
In a method B4, a change direction of a size of the clipping frame is determined on the basis of whether or not a movement direction of the touch position is reversed while the touch position is moved. The method B4 can also be performed in combination with the method A4 or A5. For instance, in the process of moving from the initial point to the terminal point, if the touch position moves only in the same direction, a movement direction of the touch position is not reversed. In this case, the clipping frame should be set in accordance with the method A4 or A5. By this setting, a size of the clipping frame is decreased. On the contrary, if the touch position moves from the initial point in a certain direction and then moves in the opposite direction to reach the terminal point, it is decided that a movement direction of the touch position is reversed. In this case, a size of the clipping frame should be increased. Further, even if there is a reverse, if a movement of the touch position after the reverse is small, a change direction of a size of the clipping frame may be set to the decrease direction. It is possible to set the relationship between presence or absence of the reverse and a change direction of a size of the clipping frame in the opposite manner.
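A minimal sketch of the reversal test follows, assuming the touch locus is sampled as a list of positions; the margin used to ignore a small movement after the reverse is an assumed value.

    def is_reversed(locus, margin=10.0):
        # locus: (x, y) touch positions from the initial point to the
        # terminal point.  Returns True if the touch position, after
        # moving one way, comes back the other way by more than margin.
        if len(locus) < 3:
            return False
        (x0, y0), (x1, y1) = locus[0], locus[1]
        dx0, dy0 = x1 - x0, y1 - y0          # initial movement direction
        norm = (dx0 ** 2 + dy0 ** 2) ** 0.5
        backtrack = 0.0
        for (xa, ya), (xb, yb) in zip(locus[1:], locus[2:]):
            # Component of each later segment against the initial direction.
            proj = (xb - xa) * dx0 + (yb - ya) * dy0
            if proj < 0:
                backtrack += -proj / norm
        return backtrack > margin

    # The finger moves right, then back left: the direction is reversed,
    # so a size of the clipping frame is increased.
    print(is_reversed([(0, 0), (80, 0), (20, 0)]))  # True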
[Method B5]
When a method B5 is used, it is supposed that when the touch position moves from the initial point to the terminal point, the touch position moves in the clockwise direction or in the counterclockwise direction. On the display screen 51, the direction in which the touch position moves from the left side region via the upper side region to the right side region corresponds to the clockwise direction (see
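One possible decision rule, offered only as a sketch since the text does not specify the detection algorithm: summing the cross products of consecutive movement vectors gives the turning sense, and in screen coordinates whose y axis points downward a positive sum corresponds to the clockwise direction.

    def rotation_direction(locus):
        # locus: (x, y) touch positions in screen coordinates (y grows
        # downward).  Returns "cw" or "ccw" from the accumulated turn.
        total = 0.0
        for (x0, y0), (x1, y1), (x2, y2) in zip(locus, locus[1:], locus[2:]):
            # z component of the cross product of consecutive segments;
            # with a downward y axis, positive means a clockwise turn.
            total += (x1 - x0) * (y2 - y1) - (y1 - y0) * (x2 - x1)
        return "cw" if total > 0 else "ccw"

    # Left side region -> upper side region -> right side region:
    print(rotation_direction([(100, 300), (300, 100), (500, 300)]))  # cw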
[Method B6]
When a method B6 is used, it is supposed that the finger is still at the initial point or the terminal point for a certain time period. One of a time period while the finger is still at the initial point keeping a contact state with the display screen 51 and a time period while the finger is still at the terminal point keeping a contact state with the display screen 51 can be adopted as a target still period. In the method B6, a change direction of a size of the clipping frame is determined in accordance with a time length of the target still period. The method B6 can be performed in combination with any one of the methods A1 to A5. Since a movement of the touch position is not expected in the methods A1 and A2, if the method A1 or A2 is used, the touch position itself in the method A1 or A2 should be regarded as the initial point or the terminal point. For instance, a counter (not shown) which outputs a reset signal every time a constant unit time passes is used, and a change direction of a size of the clipping frame is set to the decrease direction if the number of the reset signals output during the target still period is an odd number, while the change direction is set to the increase direction if the number is an even number. The relationship between the number and the change direction may be set in the opposite manner.
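As a sketch of this parity rule, assuming a 0.5-second unit time (an assumed value, not from the text):

    def direction_from_still_period(still_seconds, unit_seconds=0.5):
        # Number of reset signals output by the counter during the
        # target still period; odd -> decrease, even -> increase.
        resets = int(still_seconds / unit_seconds)
        return "decrease" if resets % 2 == 1 else "increase"

    print(direction_from_still_period(0.7))  # 1 reset  -> "decrease"
    print(direction_from_still_period(1.2))  # 2 resets -> "increase"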
A specific example will be described. For instance, in the case where the method B6 is combined with the method A1, as illustrated in
In a combination example of the methods B6 and A1 corresponding to
In addition, with reference to
—Setting Method of Changing Rate (Increase Rate and Decrease Rate)—
Next, a setting method of a changing rate in a size of the clipping frame will be described. A size of the clipping frame before the change is represented by SIZEBF, and a size of the clipping frame after the change is represented by SIZEAF. Then, the changing rate is expressed by “SIZEAF/SIZEBF”. The size of the clipping frame is expressed by, for example, the number of pixels in the clipping frame. A degree of change in the size of the clipping frame is referred to as a “change degree”. If the change direction of a size of the clipping frame is the decrease direction, the changing rate is the decrease rate having a value smaller than one. In this case, if the changing rate (SIZEAF/SIZEBF) is closer to zero, the change degree (change degree of decrease) is larger. If the changing rate (SIZEAF/SIZEBF) is closer to one, the change degree (change degree of decrease) is smaller. If the change direction of a size of the clipping frame is the increase direction, the changing rate is the increase rate having a value larger than one. In this case, if the changing rate (SIZEAF/SIZEBF) is larger, the change degree (change degree of increase) is larger. If the changing rate (SIZEAF/SIZEBF) is closer to one, the change degree (change degree of increase) is smaller.
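For illustration, the following sketch applies a changing rate to a clipping frame; since the size is expressed by the number of pixels in the frame, each side scales by the square root of the rate. The tuple layout and the default of keeping the center are assumptions.

    def resize_frame(frame, changing_rate, center=None):
        # frame: (x, y, width, height).  changing_rate = SIZEAF / SIZEBF,
        # a ratio of pixel counts: below one is a decrease, above one an
        # increase.  By default the center position is kept unchanged.
        x, y, w, h = frame
        if center is None:
            center = (x + w / 2.0, y + h / 2.0)
        scale = changing_rate ** 0.5          # per-side scale factor
        new_w, new_h = w * scale, h * scale
        return (center[0] - new_w / 2.0, center[1] - new_h / 2.0, new_w, new_h)

    # Halving the pixel count of a 600 x 400 frame (changing rate 0.5):
    print(resize_frame((100, 100, 600, 400), 0.5))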
As a setting method of the changing rate, methods C1 to C7 will be described below.
[Method C1]
In the method C1, a changing rate for one operation is set fixedly in advance. Specifically, if a change direction of a size of the clipping frame is determined to be the decrease direction by the method Bi described above, a size of the clipping frame is decreased at a decrease rate determined in advance regardless of a moving state or the like of the finger. On the contrary, if the change direction is determined to be the increase direction, a size of the clipping frame is increased at an increase rate determined in advance regardless of a moving state or the like of the finger. The method C1 can be performed in combination with any one of the methods A1 to A5.
[Method C2]
In a method C2, a changing rate is set in accordance with the movement amount of the finger on the display screen 51; the changing rate corresponding to each movement amount is fixedly set in advance. Therefore, once the movement amount is determined, the changing rate is automatically determined.
[Method C3]
A method C3 is used in combination with the method A3. In the method A3, the movement locus of the touch position draws an arc. In the method C3, if a length of the arc is larger, the change degree of decrease or increase is set to a larger value. If the length of the arc is smaller, the change degree of decrease or increase is set to a smaller value. Alternatively, if a central angle of the arc is larger, the change degree of decrease or increase is set to be larger. If the central angle of the arc is smaller, the change degree of decrease or increase is set to be smaller. When the method C3 is used, the touch position may be moved along a circumference on the display screen 51 a plurality of turns. When the touch position is moved along a circumference on the display screen 51 just one turn, a length of the arc agrees with a length of the circumference and the central angle of the arc is decided to be 360 degrees. When the touch position is moved along a circumference on the display screen 51 just two turns, a length of the arc agrees with twice a length of the circumference and the central angle of the arc is decided to be 720 degrees.
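A sketch of the central-angle computation follows, accumulating signed steps so that two full turns yield 720 degrees; the mapping from angle to decrease rate shown here (one half per turn) is an assumption for illustration.

    import math

    def swept_angle_deg(locus, center):
        # Accumulate the signed angle the touch position sweeps about
        # `center`; the magnitude keeps growing over multiple turns.
        cx, cy = center
        total = 0.0
        for (x0, y0), (x1, y1) in zip(locus, locus[1:]):
            step = (math.atan2(y1 - cy, x1 - cx)
                    - math.atan2(y0 - cy, x0 - cx))
            step = (step + math.pi) % (2 * math.pi) - math.pi
            total += step
        return math.degrees(abs(total))

    def decrease_rate(angle_deg, rate_per_turn=0.5):
        # Illustrative mapping: each full turn halves the frame size.
        return rate_per_turn ** (angle_deg / 360.0)

    quarter_arc = [(200, 100), (171, 171), (100, 200)]  # arc about (100, 100)
    print(decrease_rate(swept_angle_deg(quarter_arc, (100, 100))))  # ~0.84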
[Method C4]
A method C4 is used in combination with the method A4 or A5. In the method C4, a length of the movement locus of the touch position by the method A4 or A5 is determined. If the determined length is larger, the change degree of decrease or increase is set to be larger. If the determined length is smaller, the change degree of decrease or increase is set to be smaller.
[Method C5]
When a method C5 is used, there is a turning point between the initial point and the terminal point on the movement locus of the touch position. Specifically, in the method C5, it is assumed that the touch position moves in a certain direction from the initial point to the turning point and then the touch position moves in another direction from the turning point to the terminal point. Then, a distance between the turning point and the terminal point is determined. If the determined distance is shorter, the change degree of decrease or increase is set to be larger. If the determined distance is longer, the change degree of decrease or increase is set to be smaller. The method C5 can be used in combination with the method A4 or A5. In this combination, contents of the method A4 or A5 may be corrected a little. For instance, if the method C5 is combined with the method A4, a rectangular frame that is as small as possible to include the initial point, the turning point and the terminal point should be regarded as the clipping frame.
[Method C6]
When a method C6 is used, it is assumed that a turning point exists between the initial point and the terminal point on the movement locus of the touch position, and the direction of moving from the initial point to the turning point is opposite to the direction of moving from the turning point to the terminal point (here, the terminal point may be substantially the same as the turning point). The touch position moves from the initial point to the turning point, and after that, the touch position goes back toward the initial point side. In accordance with this going back degree, the changing rate is determined. Specifically, for example, a distance dSM between the initial point and the turning point, and a distance dME between the turning point and the terminal point are determined. If a distance ratio (dME/dSM) is larger, the change degree of decrease or increase is set to be larger. If the distance ratio (dME/dSM) is smaller, the change degree of decrease or increase is set to be smaller. The method C6 can be used in combination with the method A4 or A5. In this combination, the turning point may be regarded as the terminal point in the method A4 or A5.
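A sketch of the distance-ratio computation, taking the three characteristic points as given; the mapping from the ratio dME/dSM to a concrete change degree would be fixed in advance and is not specified in the text.

    import math

    def going_back_ratio(initial, turning, terminal):
        # d_SM: initial point -> turning point (outward movement).
        # d_ME: turning point -> terminal point (movement back).
        d_sm = math.dist(initial, turning)
        d_me = math.dist(turning, terminal)
        return d_me / d_sm

    # The finger moves 100 px outward and comes back 60 px, so the
    # going-back degree, and hence the change degree, is fairly large.
    print(going_back_ratio((0, 0), (100, 0), (40, 0)))  # 0.6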
[Method C7]
In the method C7, it is assumed that the finger is still at the initial point or the terminal point for a certain period of time. One of a time period while the finger is still at the initial point keeping a contact state with the display screen 51 and a time period while the finger is still at the terminal point keeping a contact state with the display screen 51 can be adopted as a target still period. In the method C7, the changing rate is determined in accordance with a time length of the target still period. Specifically, for example, if the time length of the target still period is longer, the change degree of decrease or increase is set to be larger. If the time length of the target still period is shorter, the change degree of decrease or increase is set to be smaller. The method C7 can be performed in combination with any one of the methods A1 to A5. Since a movement of the touch position is not expected in the methods A1 and A2, when the method A1 or A2 is used, the touch position itself in the method A1 or A2 should be regarded as the initial point or the terminal point.
—Notification of Information about Increase or Decrease of a Size of the Clipping Frame—
Next, a method of notifying the user of information about increase or decrease of a size of the clipping frame or the like will be described. As processes concerning this notification, notification processes D1 to D5 will be described below.
[Notification Process D1]
The notification process D1 will be described. In the notification process D1, before a change direction of a size of the clipping frame is fixed, and during the period while the touch panel operation for setting the change direction of a size of the clipping frame is being performed, the icon ICD illustrated in
For instance, in the case where the method B5 illustrated in
[Notification Process D2]
When a change direction of a size of the clipping frame is fixed, the user can be informed that the change direction is fixed. In this case, the notification process for informing that the change direction is fixed is included in the notification process D2. Any method can be adopted for the notification performed by the notification process D2. For instance, if it is fixed that a change direction of a size of the clipping frame becomes the decrease direction, the icon ICD on the display screen 51 may be blinked so as to notify that the change direction is fixed. Alternatively, any method working on the human five senses (sight, hearing and the like) may be used for notifying that the change direction is fixed. The same is true in the case where the change direction is fixed to be the increase direction. The notification in the notification process D1 (e.g., a display of the icon ICD or ICU) and the notification in the notification process D2 (e.g., a blink display of the icon ICD or ICU) correspond to the notification for informing the user about which of the increasing operation and the decreasing operation the touch panel operation performed to the camera monitor 17 corresponds to. By this notification, the user can perform a desired operation easily.
[Notification Process D3]
A notification process D3 will be described. In the notification process D3, an index indicating a current changing rate is displayed before a changing rate of a size of the clipping frame is fixed and during the period while the touch panel operation for determining a changing rate of a size of the clipping frame is performed. Any method of indicating a changing rate may be adopted. For instance, it is possible to notify the user about a current changing rate by using an icon having a bar shape, a numerical value, a color or the like.
For instance, in the case where the method B5 illustrated in
[Notification Process D4]
When a changing rate of a size of the clipping frame is fixed, the user can be notified that the changing rate is fixed. In this case, the notification process of notifying that the changing rate is fixed is included in the notification process D4. It is possible to notify that the changing rate is fixed by a display of a particular icon, or by any other method working on the human five senses (sight, hearing and the like). In addition, the fixed changing rate itself is notified to the user by the notification process D4. Any method of indicating the fixed changing rate may be adopted. For instance, it is possible to notify the user of the fixed changing rate by using an icon having a bar shape, a numerical value, a color or the like.
[Notification Process D5]
It is possible to display a cancel icon or a cancel gesture icon for demonstrating a canceling gesture for a cancel acceptance period having a constant time length (e.g., a few seconds) after a change direction of a size of the clipping frame and a changing rate are fixed. The display of the cancel icon and the cancel gesture icon is included in the notification process D5. The icons 681 and 682 illustrated in
For instance, it is supposed that execution of the process of decreasing a size of the clipping frame is fixed by a certain touch panel operation in the state where the display image of the camera monitor 17 is the input image 620 (see
Next, specific operational examples of the digital camera 1 in which each method and each process described above are used will be described.
A first operational example will be described with reference to
It is supposed that the time TA(i+1) is after the time TAi. The positions 711 to 715 are touch positions at the times TA1 to TA5, respectively. The positions 711 to 715 are positions that are different from each other, and the locus formed by connecting the positions 711 to 715 in order draws an arc. The positions 711 and 715 are respectively a position of the initial point and a position of the terminal point of the locus. It is supposed that the touch position moves in a clockwise direction as it moves from the position 711 to the position 715. For instance, the input image 620 is displayed on the display screen 51 from the time TA1 to the time TA5, and the clipped image 630 is displayed on the display screen 51 at the time TA6 and the time TA7 (see
Specific description will be added along time sequence. A finger touches a position 711 on the display screen 51 at time TA1, and the touch position moves from the position 711 to the position 712 during the period from time TA1 to time TA2. In this case, the display controller 20 performs the notification process D1. Specifically, it estimates that a change direction of a size of the clipping frame will be determined to be the decrease direction with high probability from the movement locus between positions 711 and 712 on the basis of the method B5 illustrated in
Next, the touch position moves from the position 712 to the position 713 in the period from time TA2 to time TA3. In this case, the display controller 20 performs the notification process D2. Specifically, the central angle of the arc formed by the movement locus between the position 711 and the position 713 exceeds 180 degrees at time TA3. Therefore, a change direction of a size of the clipping frame is fixed to be the decrease direction, and in order to notify the user about that a change direction of a size of the clipping frame is fixed, the icon ICD is blinked at time TA3. This blink display is continued for a constant period of time.
Next, the touch position moves from the position 713 to the position 714 during the period from time TA3 to time TA4. In this case, the display controller 20 performs the notification process D3. Specifically, at time TA4, the changing rate described above is calculated on the basis of the assumption that the position 714 is the terminal point, and the calculated changing rate (90% in the example illustrated in
The cancel acceptance period starts from time TA5 and ends right before time TA7. The time TA6 falls within the cancel acceptance period. Therefore, at time TA6, the icon 680 that is the cancel icon 681 or the cancel gesture icon 682 is displayed. When the cancel acceptance period is finished, the display of the icon 680 is deleted so as to reach a state in which other touch panel operations can be received. Note that the changing rate displayed at time TA4 or the like may be a changing rate based on a size of the original input image 600 or may be a changing rate based on a size of the input image 620. In addition, as described above, the touch position may be moved a plurality of turns along the circumference on the display screen 51 for determining the changing rate.
With reference to
It is supposed that time TB(i+1) is after time TBi. Positions 731 to 733 are touch positions at times TB1 to TB3, respectively. The positions 731 to 733 are positions different from each other, and a locus formed by connecting the positions 731 to 733 in order draws an arc. It is supposed that the touch position moves in a counterclockwise direction as it moves from the position 731 to the position 733. For instance, in the period from time TB1 to time TB5, the input image 620 is displayed on the display screen 51, and the original input image 600 is displayed on the display screen 51 at time TB6 and time TB7 (see
Specific description will be added along time sequence. A finger touches a position 731 on the display screen 51 at time TB1, and the touch position moves from the position 731 to the position 732 during the period from time TB1 to time TB2. In this case, the display controller 20 performs the notification process D1. Specifically, it estimates that a change direction of a size of the clipping frame will be determined to be the increase direction with high probability from the movement locus between positions 731 and 732 on the basis of the method B5 illustrated in
Next, the touch position moves from the position 732 to the position 733 during the period from time TB2 to time TB3. In this case, the display controller 20 performs the notification process D2. Specifically, at time TB3, a central angle of the arc formed by the movement locus between the positions 731 and 733 exceeds 180 degrees. Therefore, a change direction of a size of the clipping frame is fixed to be the increase direction, and in order to notify the user about that a change direction of a size of the clipping frame is fixed, the icon ICU is blinked at time TB3. This blinking display is continued for a constant period of time.
In the second operational example, since the method C6 (see
The cancel acceptance period starts from time TB5 and ends right before time TB7. The time TB6 falls within the cancel acceptance period. Therefore, at time TB6, the icon 680 that is the cancel icon 681 or the cancel gesture icon 682 is displayed. When the cancel acceptance period is finished, the display of the icon 680 is deleted so as to reach a state in which other touch panel operations can be accepted.
With reference to
It is supposed that time TC(i+1) is after time TCi. Positions 751 to 753 are touch positions at times TC1 to TC3, respectively. It is supposed that the direction from the position 751 to the position 752 is the right direction, while the direction from the position 752 to the position 753 is the left direction. For instance, the input image 620 is displayed on the display screen 51 in the period from time TC1 to TC4, and the original input image 600 is displayed on the display screen 51 at time TC5 (see
Specific description will be added along time sequence. A finger touches a position 751 on the display screen 51 at time TC1, and the touch position moves from the position 751 to the position 752 during the period from time TC1 to time TC2. In this case, the display controller 20 performs the notification process D1. In the process that the touch position moves from the position 751 to the position 752, there is no reverse in the movement direction of the touch position. Therefore, at time TC2, it is estimated that a change direction of a size of the clipping frame is determined to be the decrease direction with high probability on the basis of the method B4 illustrated in
The movement direction of the touch position is reversed at time TC2, and the touch position moves from the position 752 to the position 753 during the period from time TC2 to time TC3. The display controller 20 detects the reverse so as to estimate that a change direction of a size of the clipping frame is determined to be the increase direction, and the icon ICU is displayed at time TC3 (see
When the finger is released from the display screen 51 at time TC4, the above-mentioned change direction and changing rate are fixed. Then, the display controller 20 performs the notification processes D2 and D4. Specifically, the icon ICU is blinked at time TC4 so as to notify the user that the change direction is fixed to be the increase direction. This blink display is continued for a constant period of time. Further, the fixed changing rate (110% in the example illustrated in
According to this embodiment, not only a decrease of the angle of view of the display image but also an increase of the angle of view of the display image can be instructed by an intuitive touch panel operation.
A tenth embodiment of the present invention will be described. The tenth embodiment is an embodiment based on the description in the seventh embodiment, and as to matters that are not particularly described in this embodiment, the description in the seventh embodiment is also applied to this embodiment as long as no contradiction arises. Therefore, the following description in the tenth embodiment is an operational description of the digital camera 1 in the imaging mode. The matters described in the ninth embodiment can be applied to the seventh embodiment. The tenth embodiment corresponds to a combination of the seventh and the ninth embodiments.
An operation in which a still image is taken in the imaging mode will be described. When a still image is taken in the imaging mode, one frame image indicating the still image is displayed on the camera monitor 17 and is supplied to the clip processing unit 62 illustrated in
An operation in which a moving image is taken in the imaging mode will be described. When a moving image is taken in the imaging mode, frame images forming the moving image are sequentially displayed on the camera monitor 17 and are supplied to the clip processing unit 62 illustrated in
Now, as described above in the ninth embodiment, the clipped image 610 is regarded as a new input image 620, and the state where the input image 620 is displayed is regarded as a reference state (see
If the increasing operation is performed, the image inside the clipping frame 601A can be displayed as the clipped image, and the image data inside the clipping frame 601A can be recorded in the recording medium 15 as the image data of the clipped image. If the decreasing operation is performed, the image inside the clipping frame 601B can be displayed as the clipped image, and the image data inside the clipping frame 601B can be recorded in the recording medium 15 as the image data of the clipped image.
As described above in the seventh embodiment, the entire image of the frame image is formed of output image signals of individual light receiving pixels arranged in the effective pixel region of the image sensor 33 (see
According to this embodiment, not only the decrease of an angle of view of the display image and the record image but also the increase of the angle of view of the display image and the record image can be instructed by the intuitive touch panel operation.
The eleventh embodiment of the present invention will be described. The eleventh embodiment is an embodiment based on the description in the eighth embodiment, and as to matters that are not particularly described in this embodiment, the description in the eighth embodiment is also applied to this embodiment as long as no contradiction arises. Therefore, similarly to the eighth embodiment, the following description in the eleventh embodiment is an operational description of the digital camera 1 in the imaging mode. The matters described in the ninth embodiment can be applied to the eighth embodiment. The eleventh embodiment corresponds to a combination of the eighth and the ninth embodiments.
As described above in the eighth embodiment, an imaging angle of view and an incident position on the image sensor 33 can be adjusted by the touch panel operation according to the method Ai. The adjustment of an imaging angle of view described above in the eighth embodiment corresponds to the decrease of the imaging angle of view. The user can perform an imaging view angle decrease instruction operation and an imaging view angle increase instruction operation by using the touch panel. Each of the imaging view angle decrease instruction operation and the imaging view angle increase instruction operation is one type of the touch panel operation. The touch panel operation for decreasing an imaging angle of view described above in the eighth embodiment corresponds to the imaging view angle decrease instruction operation.
The method of the imaging view angle decrease instruction operation is similar to the decreasing operation for decreasing a size of the clipping frame described above in the ninth embodiment, and the method of the imaging view angle increase instruction operation is similar to the increasing operation for increasing a size of the clipping frame described above in the ninth embodiment. When the matter described above in the ninth embodiment is applied to this embodiment, the clipping frame (or size of the clipping frame) in the ninth embodiment should be read as “imaging angle of view”, and the changing rate in the ninth embodiment should be read as “imaging angle of view changing rate”. The imaging angle of view changing rate is expressed by “ANGAF/ANGBF”, for example. ANGBF represents an imaging angle of view before the imaging angle of view is changed, and ANGAF represents an imaging angle of view after the imaging angle of view is changed.
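For illustration, an angle-of-view changing rate can be converted into a zoom-lens target through the standard optical relation tan(θ/2) = w/(2f) between a view angle θ, sensor width w, and focal length f; this relation is general optics rather than anything specified in the text, and the numbers below are assumptions.

    import math

    def focal_length_for_rate(current_f_mm, current_angle_deg, rate):
        # rate = ANGAF / ANGBF; a rate below one narrows the view angle.
        half_bf = math.radians(current_angle_deg) / 2.0
        half_af = half_bf * rate                     # ANGAF = rate * ANGBF
        sensor_w = 2.0 * current_f_mm * math.tan(half_bf)  # fixed width
        return sensor_w / (2.0 * math.tan(half_af))

    # Halving a 60-degree imaging angle of view from a 30 mm focal length:
    print(round(focal_length_for_rate(30.0, 60.0, 0.5), 1))  # 64.6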
When the imaging view angle decrease instruction operation is performed, an imaging angle of view changing rate is determined in accordance with the method described above in the ninth embodiment, and the photography control unit 13 illustrated in
When the imaging view angle increase instruction operation is performed, the imaging angle of view changing rate is determined in accordance with the method described above in the ninth embodiment, and the photography control unit 13 illustrated in
According to this embodiment, not only the decrease of the angle of view of the display image and the record image but also the increase of the angle of view of the display image and the record image can be instructed by an intuitive touch panel operation.
<<Variations>>
Specific numerical values indicated in the above description are merely examples, and they can be changed to various values as a matter of course. As variations or annotations of the embodiments described above, Note 1 and Note 2 are described below. Descriptions in the Notes can be combined in any way as long as no contradiction arises.
[Note 1]
In each embodiment described above, the touch panel is used as an example of a pointing device for specifying a position and a size of the clipping frame and the expansion specifying frame. However, it is possible to use a pointing device other than the touch panel (e.g., a pen tablet or a mouse) so as to specify a position and a size of the clipping frame and the expansion specifying frame.
[Note 2]
The digital camera 1 according to the embodiments can be constituted of hardware or a combination of hardware and software. If software is used for constituting the digital camera 1, a block diagram of a portion realized by software indicates a functional block diagram of the portion. The function realized by using software may be described as a program, and the program may be executed by a program execution device (e.g., a computer) so as to realize the function.
Number | Date | Country | Kind |
2009-174006 | Jul 2009 | JP | national |
2010-130763 | Jun 2010 | JP | national |