1. Field of the Invention
The present invention relates to an imaging apparatus, such as a video camera or a digital camera, that records moving images, and to a reproducing apparatus that reproduces the moving images.
2. Description of the Related Art
In recent years, recording and reproducing apparatuses capable of recording/reproducing high-quality video signals such as high definition (HD) video signals, and video display apparatuses capable of displaying such signals as images, have come into widespread use. Background techniques of an image processing system used in these apparatuses include techniques of miniaturizing image sensors, increasing the number of pixels, increasing the speed of transfer processing, and improving coding efficiency, as in the H.264 standard. In addition, various types of video display apparatuses have been made available, such as those of a liquid crystal type, a plasma type, and an electroluminescent (EL) type. Further, video display apparatuses having a further decreased thickness, a higher definition, and/or a larger screen are being developed, as are mobile-type video display apparatuses.
Since display apparatuses of such various types have become available, users' viewing styles have changed significantly. Users handle images in various manners depending on the purpose; for example, some users prefer high-quality images while others wish to exchange short movies with ease.
Techniques for storing large amounts of video data play an important role in addressing these needs. In particular, remarkable progress has been made in increasing the capacity of hard disks and decreasing their cost, with semiconductor memories following a similar trend. Additionally, large-capacity removable media and servers on the order of terabytes have become available at low cost.
Against this background, in the field of imaging techniques, while the frame rate of the National Television System Committee (NTSC) standard is 29.97 frames/second, higher frame rates such as 120 frames/second and 240 frames/second have been proposed. Further, techniques for multi-screen reproduction and reproduction with multiple audio channels are also being developed. Based on these techniques, more realistic and dynamic images can be displayed.
While video signals are conventionally recorded at a fixed frame rate, techniques have been proposed for changing the frame rate as appropriate, thereby efficiently suppressing the total amount of recorded data while still recording high-quality video when necessary.
Japanese Patent Application Laid-Open No. 2003-274360 (U.S. Pat. No. 7,456,875) discusses a technique for controlling the frame rate of captured video signals without requiring a photographer's control. More specifically, based on this technique, when an image is captured, the surrounding audio volume level is monitored, and if it is detected that the monitored volume level exceeds a predetermined level, the frame rate is increased, so that important scenes are recorded with high quality. Further, based on this technique, when the available memory capacity of a recording medium falls to or below a predetermined level, the frame rate is decreased to suppress consumption of the recording medium. In addition, when the available power of a battery falls to or below a predetermined level, the frame rate is decreased to suppress consumption of the battery.
Japanese Patent Application Laid-Open No. 2007-134991 (USPA 2007/0104462) discusses a technique in which motion of a captured image is detected. Based on this technique, when the detected motion is more than or equal to a threshold, the captured image is recorded at a high frame rate, and when the detected motion is less than the threshold, the captured image is recorded at a low frame rate.
However, based on the above techniques discussed in Japanese Patent Application Laid-Open No. 2003-274360 (U.S. Pat. No. 7,456,875) and Japanese Patent Application Laid-Open No. 2007-134991 (USPA 2007/0104462), the change of the frame rate does not reflect the photographer's intention. Namely, based on the technique discussed in Japanese Patent Application Laid-Open No. 2003-274360 (U.S. Pat. No. 7,456,875), the frame rate is changed depending on conditions surrounding an imaging apparatus, such as the surrounding audio volume level, the available capacity of a recording medium, or the available power of a battery. Thus, the imaging condition is changed without reflecting the photographer's intention.
In addition, the technique discussed in Japanese Patent Application Laid-Open No. 2007-134991 (USPA 2007/0104462) uses motion information acquired from an image signal as an index. For example, when the ratio of monotonous scenes including the sky or night scenes is increased, it is determined that the amount of motion is small, and as a result, the image is recorded at a low frame rate. On the other hand, in a wide-angle image whose focal length is short, the motion amount tends to be detected as large because of camera shake, and consequently, the image is recorded at a high frame rate. In either case, the frame rate is determined irrespective of the photographer's intention.
Normally, TV receivers or video monitors support only a fixed frame rate. Thus, when a video signal is input and the frame rate thereof is changed in the middle of processing, even if the change of the frame rate can be followed, the video may be displayed at an unnatural rendering speed misaligned with the actual time. For example, if a monitor receives an input video signal and displays the signal at a fixed rate, images captured at a frame rate higher than the standard frame rate are rendered in slow motion, and images captured at a lower frame rate are rendered at high speed.
Since compatibility with such viewing and reproduction environments is not taken into consideration, use of the technique of variable frame rate recording is limited.
The present invention is directed to an imaging apparatus capable of changing a frame rate in view of a photographer's intention and maintaining compatibility with existing viewing and reproduction environments.
According to an aspect of the present invention, an imaging apparatus includes an imaging unit including an image sensor configured to convert an optical image to an image signal, and a zoom unit configured to optically zoom an optical image incident on the image sensor or electronically zoom an image signal output from the image sensor, a recording unit configured to record a video signal including an image signal captured by the imaging unit in a recording medium, and configured to record a recording frame rate, at which the video signal is recorded in the recording medium, and zoom operation information for the zoom unit in the recording medium, a zoom operation unit configured to operate the zoom unit, a control unit configured to control the recording frame rate at which the video signal is recorded in the recording medium based on an operation of the zoom operation unit, and configured to increase the recording frame rate to be higher than a normal frame rate during a period including a period when the zoom operation unit is operated, and a reproducing unit configured to reproduce the video signal from the recording medium based on a set reproduction mode, and configured to carry out thinning processing on the video signal during the period including the period when the zoom operation unit is operated based on the zoom operation information and reproduce the processed video signal at the normal frame rate.
With an imaging apparatus and a reproducing apparatus according to the present invention, during periods including zoom operation periods, videos are recorded at a frame rate higher than the normal frame rate, and when the videos are reproduced, the frame rate is reduced to the standard frame rate. Since videos are reproduced at the standard frame rate, compatibility with existing video display apparatuses can be ensured, and high-quality reproduction videos can be displayed.
Further features and aspects of the present invention will become apparent from the following detailed description of exemplary embodiments with reference to the attached drawings.
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate exemplary embodiments, features, and aspects of the invention and, together with the description, serve to explain the principles of the invention.
Various exemplary embodiments, features, and aspects of the invention will be described in detail below with reference to the drawings.
An imaging unit 10 (camera unit) has the following configuration and functions. The imaging optical system of the imaging unit 10 includes a front lens 12, a zoom lens 14, a diaphragm 16, and a focus lens 18. The front lens 12 is fixed to a lens barrel, and the zoom lens 14 and the focus lens 18 can move along the optical axis. The diaphragm 16 is arranged between the zoom lens 14 and the focus lens 18.
An exposure control unit 20 receives a control signal from a camera control unit 30 and controls the aperture of the diaphragm 16 based on the control signal. A zoom position sensor 22 detects the position of the zoom lens 14. Based on the position of the zoom lens 14 detected by the zoom position sensor 22, a lens control unit 24 carries out feedback control on the zoom lens 14. The lens control unit 24 also controls the zoom position of the zoom lens 14 and the focus position of the focus lens 18, based on instruction signals from the camera control unit 30 to be described later.
An image sensor driving unit 28 drives an image sensor 26 that converts an optical image formed by the imaging optical system into an electric image signal. The image sensor driving unit 28 drives the image sensor 26 based on a timing signal supplied from the camera control unit 30 and controls the image sensor 26 to output each pixel signal for the image signal. By increasing the driving signal frequency of the image sensor driving unit 28, a frame rate higher than a standard frame rate can be handled.
A sample-and-hold (S/H) unit 32 amplifies the image signal output from the image sensor 26, and samples and holds the signal based on a timing signal supplied from the camera control unit 30 (or the image sensor driving unit 28). An analog-to-digital (A/D) converter 34 converts the signal supplied from the S/H unit 32 into a digital signal, and supplies the digital signal to a camera signal processing unit 36.
Based on parameters set by the camera control unit 30, the camera signal processing unit 36 carries out known camera signal processing, such as color separation, gradation correction, and white balance adjustment, on the image data output from the A/D converter 34. The camera signal processing unit 36 supplies the processed image data to a compression/expansion unit 44 via a data bus 38 and a memory 40.
While an audio input unit and a system for processing an audio signal supplied from the audio input unit are not illustrated in
A recording medium control unit 46 writes the image and audio data compressed by the compression/expansion unit 44 in a recording medium 48. Examples of the recording medium 48 include a nonvolatile semiconductor memory such as a flash memory, a hard disk apparatus, and a recordable optical disk.
Based on instructions from the system control unit 50, the camera control unit 30 controls operations of the lens control unit 24, the exposure control unit 20, the image sensor driving unit 28, and the camera signal processing unit 36.
The system control unit 50 includes a central processing unit (CPU) and controls individual units according to instructions from an operation unit 52 operated by a user, setting values, or an operation status. For example, the system control unit 50 controls an imaging operation via the camera control unit 30 and controls recording/reproducing data in the recording medium 48 via the recording medium control unit 46. Needless to say, the system control unit 50 comprehensively controls the entire imaging apparatus by executing predetermined programs.
An external interface (I/F) 54 sends and receives signals in a predetermined format to and from an external device connected to an external terminal 56. Examples of the external terminal 56 include terminals that comply with standards such as the Institute of Electrical and Electronics Engineers (IEEE) 1394, the universal serial bus (USB), and/or the high-definition multimedia interface (HDMI). Examples of the signal format include analog video signals (audio/video) and digital video signals of various resolutions and frame rates.
The operation unit 52 is used as a user interface and includes a shutter button, a zoom button, a power-source on/off button, and a record start/stop button used during imaging or recording. The operation unit 52 also includes a select button, a determination button, and a cancel button used to make selection on the menu, set parameters, and the like.
A display control unit 58 drives a display apparatus 60, which is used as a monitor displaying captured images and reproduced images and is used as a display unit displaying various types of management information to users.
The memory 40 is shared by each function block via the data bus 38, and is formed by a read only memory (ROM) and/or a random-access memory (RAM). Each function block connected to the data bus 38 can write and read data in and from the memory 40 via a memory control unit 42.
Image and audio data recorded in the recording medium 48 is reproduced as follows. The recording medium control unit 46 supplies compressed data read from the recording medium 48 to the compression/expansion unit 44. The compression/expansion unit 44 decompresses the input compressed data to restore video data and audio data.
The display control unit 58 supplies the restored video data to the display apparatus 60, which displays a reproduced image using the supplied video data. Examples of the display apparatus 60 include a liquid crystal display panel and an organic light-emitting device. A speaker (not illustrated) outputs reproduced audio. When necessary, reproduced video data and audio data is output to the outside via the external I/F 54 and the external terminal 56.
An operation for controlling the frame rate at which video data is recorded in the recording medium 48 will be described below.
The camera control unit 30 controls the recording medium control unit 46 to write frame images temporarily stored in the memory 40 in the recording medium 48, as illustrated in
If the system control unit 50 detects a zoom operation (YES in step S1), in step S2, the system control unit 50 determines the zoom speed. For example, the zoom speed is determined based on the force applied by the user to the zoom switch on the operation unit 52 or the length of time for which the zoom switch is pushed. If a zoom speed is previously or initially set on the menu screen or the like, the system control unit 50 refers to that zoom speed. In other words, the system control unit 50 also functions as a zoom speed detection unit.
In step S3, based on the zoom speed determined in step S2, the system control unit 50 sets a recording frame rate.
As the relationship between the zoom speed and the recording frame rate of
In step S4, the system control unit 50 carries out a thinning process on the image frame data in the memory 40 based on the recording frame rate set in step S3, and records the remaining frames in the recording medium 48. In step S5, the system control unit 50 records zoom magnification information and frame rate management information in the recording medium 48 as metadata, along with video data of each frame. In step S6, if instructions to stop recording are input, the processing ends. If not, steps S1 to S5 are repeated.
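The control flow of steps S3 and S4 can be sketched as follows. The function names are illustrative, and the linear mapping from zoom speed to recording frame rate is an assumption; the actual relationship is given by the figure referenced above.

```python
def recording_frame_rate(zoom_speed, normal_fps=60, max_fps=240, max_speed=1.0):
    # Step S3: map the detected zoom speed to a recording frame rate.
    # A linear mapping between the normal and maximum rates is assumed
    # here for illustration.
    speed = min(max(zoom_speed, 0.0), max_speed)
    return normal_fps + (max_fps - normal_fps) * (speed / max_speed)

def thin_frames(captured_frames, capture_fps, record_fps):
    # Step S4: thin the frames held in the memory down to the recording
    # frame rate, keeping every (capture_fps // record_fps)-th frame.
    step = max(int(capture_fps // record_fps), 1)
    return captured_frames[::step]
```

For instance, frames captured at 1/240-second intervals with no zoom operation would be thinned by a factor of four to the normal 1/60-second cadence before being written to the recording medium.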
Since the recording frame rate is changed based on the zoom speed, when images are captured at a low zoom speed, they are recorded at a low frame rate. Thus, since unnecessary captured frames are not recorded, consumption of the available capacity of the recording medium 48 can be suppressed. While an object moves rapidly within the frame during a high-speed zoom operation, since the object is followed at a high frame rate, captured scenes in which the angle of view changes greatly can be recorded with good quality at a high sampling rate.
The metadata portion stores information that is updated depending on various imaging conditions, such as a date, time, a recording frame rate, and a shutter speed. The data portion stores image data and audio data encoded in a predetermined format. The metadata is embedded in the video data in the metadata format using a description language such as extensible markup language (XML) or hypertext markup language (HTML). The metadata may be added as binary data or a watermark.
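As a sketch of the structure described above, the metadata fields of one frame might be assembled and serialized with a description language such as XML as follows; all field names and values here are hypothetical, not the format actually used.

```python
import xml.etree.ElementTree as ET

# Hypothetical per-frame metadata; the field names are illustrative only.
frame_metadata = {
    "recording_frame_rate": 240,   # frames/second during the zoom period
    "shutter_speed": "1/500",
    "zoom_magnification": 2.5,
}

# Serialize the metadata portion so it can be embedded alongside the
# data portion of the file.
root = ET.Element("metadata")
for name, value in frame_metadata.items():
    ET.SubElement(root, name).text = str(value)
serialized = ET.tostring(root)
```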
In the example illustrated in
To ensure compatibility for reproduction, according to the present exemplary embodiment, storage of the video data in the recording medium 48 is controlled as follows.
Frames captured at t1, t2, t3, and the like at a normal recording frame rate of 1/60 second are stored in a main storage area A of the recording medium 48. On the other hand, if a zoom operation is carried out and the frame rate is changed, the frames captured at a higher frame rate between t3, t3a, t3b, t3c, t4, . . . , and t9 are stored in a sub-storage area Sub of the recording medium 48.
Thus, by separating video data depending on the frame rate and separately storing the data in different areas of the recording medium 48, even when an apparatus without special reproduction functions is used, the video data stored in the main storage area A can be reproduced. Thus, compatibility for reproduction can be ensured. The main storage area A and the sub-storage area Sub are distinguished logically in terms of file management, and thus physical recording locations of these areas may be identical.
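The logical separation described above can be sketched as follows, assuming for illustration timestamped frames captured at 1/240-second intervals and a normal cadence of 1/60 second; the function name is hypothetical.

```python
def split_storage(frames, normal_interval=1.0 / 60):
    # Frames on the normal 1/60-second cadence go to the main storage
    # area A; the extra high-rate frames captured during a zoom
    # operation go to the sub-storage area Sub.
    main_area, sub_area = [], []
    next_main_time = frames[0][0]
    for timestamp, image in frames:
        if timestamp >= next_main_time - 1e-9:
            main_area.append((timestamp, image))
            next_main_time += normal_interval
        else:
            sub_area.append((timestamp, image))
    return main_area, sub_area
```

An apparatus without special reproduction functions would read only the main area and still obtain a valid normal-rate stream, which is the compatibility property described above.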
In
A reproduction operation of the present exemplary embodiment will be described below. According to the present exemplary embodiment, before reproducing videos of different frame rates, an interpolation or thinning process is carried out. In this way, scenes captured during a zoom operation can be provided with special reproduction effects.
In step S11, based on reproduction instructions from the operation unit 52, the system control unit 50 reads metadata about a specified video file from the file management information of the recording medium 48. In step S12, the system control unit 50 refers to the metadata and determines whether videos of different frame rates are recorded in the video file to be reproduced.
If images are not recorded at different recording frame rates (NO in step S12), the processing proceeds to step S16. In step S16, normal reproduction processing is carried out.
If images are recorded at different recording frame rates (YES in step S12), in step S13, the system control unit 50 reads special reproduction effects previously set by the user. In steps S14 and S15, the system control unit 50 controls displayed frames and voice output as described below.
A linear smoothing mode (mode 1) is a reproduction interpolation mode in which, even when the speed of a zoom operation carried out by the user is not constant, frames are selected during reproduction processing so that reproduced images are displayed (rendered) at a constant zoom speed. In the linear smoothing mode, among the frames recorded in the recording medium 48, the system control unit 50 selects frames so that the zoom magnification of the selected frames changes proportionally. The selected frames are then displayed on the display apparatus 60 at a normal frame rate.
When the zoom operation is not carried out, the captured frames are thinned and recorded at 1/60-second intervals. As illustrated in
The linear smoothing mode will be described in detail with reference to
The frame at time t5c is shifted as a frame at time t4 (frame 70-1 represented as a black circle) to be displayed. The frame at time t6c is shifted as a frame at time t5 (frame 70-2 represented as a black circle) to be displayed. The frame at time t7c is shifted as a frame at time t6 (frame 70-3 represented as a black circle) to be displayed. The frame at time t8a is shifted as a frame at time t7 (frame 70-4 represented as a black circle) to be displayed. The frame at time t8c is shifted as a frame at time t8 (frame 70-5 represented as a black circle) to be displayed.
Thus, in the linear smoothing mode, captured frames (recorded frames) whose zoom magnification changes linearly from the wide end to the telephoto end are re-arranged on the time axis so that the frames are displayed at a constant zoom speed.
A dynamic shooting mode (mode 2) is a reproduction interpolation mode in which an image is zoomed increasingly rapidly as the object comes closer, irrespective of the zoom operation speed by the user. In other words, the dynamic shooting mode provides a zoom effect of gradually increasing the zoom speed.
Among the recorded frames in the recording medium 48, the system control unit 50 selects frames so that the speed, at which the zoom magnification changes, is gradually increased. The selected frames are displayed on the display apparatus 60 at a normal frame rate.
During the period between time t3 and time t9 when the zoom magnification changes, captured frames are recorded at 1/240-second intervals. When the zoom operation is not carried out, the captured frames are thinned and recorded at 1/60-second intervals. In the example illustrated in
The dynamic shooting mode will be described in detail with reference to
The frame at time t3 is temporally shifted as a frame at time t4 (frame 72-1 represented as a black circle) to be displayed. The frame at time t3a is temporally shifted as a frame at time t5 (frame 72-2 represented as a black circle) to be displayed. The frame at time t3b is temporally shifted as a frame at time t6 (frame 72-3 represented as a black circle) to be displayed. The frame at time t4 is temporally shifted as a frame at time t7 (frame 72-4 represented as a black circle) to be displayed. The frame at time t5a is temporally shifted as a frame at time t8 (frame 72-5 represented as a black circle) to be displayed.
Thus, in the dynamic shooting mode, frames whose zoom magnification changes nonlinearly from the wide end to the telephoto end are displayed at a normal frame rate, so that the zoom speed is gradually increased.
A soft landing mode (mode 3) is a reproduction interpolation mode in which an image is zoomed at a decreasing speed as the object comes closer, irrespective of the zoom operation speed by the user. Namely, the soft landing mode provides a zoom effect of gradually decreasing the zoom speed.
Among the recorded frames in the recording medium 48, the system control unit 50 selects frames so that the zoom speed gradually decreases as the object comes closer. The selected frames are displayed on the display apparatus 60 at a normal frame rate.
During the period between time t3 and time t9 when the zoom magnification changes, captured frames are recorded at 1/240-second intervals. When the zoom operation is not carried out, the captured frames are thinned and recorded at 1/60-second intervals. In the example illustrated in
The soft landing mode will be described in detail with reference to
The frame at time t6c is temporally shifted as a frame at time t4 (frame 74-1 represented as a black circle) to be displayed. The frame at time t8 is temporally shifted as a frame at time t5 (frame 74-2 represented as a black circle) to be displayed. The frame at time t8b is temporally shifted as a frame at time t6 (frame 74-3 represented as a black circle) to be displayed. The frame at time t8c is temporally shifted as a frame at time t7 (frame 74-4 represented as a black circle) to be displayed. The frame at time t9 is temporally shifted as a frame at time t8 (frame 74-5 represented as a black circle) to be displayed.
Thus, in the soft landing mode, frames whose zoom magnification changes nonlinearly from the wide end to the telephoto end are interpolated and displayed, so that the zoom speed is gradually decreased.
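The three interpolation modes described above differ only in the curve along which the displayed zoom magnification is made to progress. A minimal sketch of the common frame selection follows, assuming each recorded frame carries a timestamp and a zoom magnification; the function and curve definitions are illustrative, not the actual implementation.

```python
def select_display_frames(recorded, num_slots, curve):
    # `recorded`: (time, zoom_magnification) pairs captured at the high
    # frame rate; `curve`: maps normalized display progress (0..1) to
    # normalized magnification progress (0..1).
    z_start, z_end = recorded[0][1], recorded[-1][1]
    selected = []
    for k in range(num_slots):
        progress = k / (num_slots - 1) if num_slots > 1 else 1.0
        target = z_start + (z_end - z_start) * curve(progress)
        # Pick the recorded frame whose magnification is closest to the
        # target, then display it in the k-th normal-rate slot.
        selected.append(min(recorded, key=lambda f: abs(f[1] - target)))
    return selected

linear_smoothing = lambda p: p                  # mode 1: constant zoom speed
dynamic_shooting = lambda p: p ** 2             # mode 2: speed increases
soft_landing = lambda p: 1.0 - (1.0 - p) ** 2   # mode 3: speed decreases
```

Because the selection works on magnification rather than capture time, it re-arranges the recorded frames on the time axis exactly as described for the black-circle frames above.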
A slow motion mode (mode 4) is a reproduction mode in which all the frame images recorded at a high frame rate during a zoom operation are rendered at a preset magnification. For example, frame images captured and recorded at a high frame rate of 1/240-second intervals during a zoom period are reproduced at a normal frame rate of 1/60-second intervals. As a result, the images are displayed at a slow speed (¼ of the original speed). Namely, the video recorded during a zoom operation is reproduced slowly and can be observed easily.
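The slow-motion factor in mode 4 follows directly from the two rates; a brief sketch, using the example values above:

```python
def slow_motion_factor(capture_fps=240, display_fps=60):
    # Every high-rate frame is shown in a normal-rate display slot, so
    # playback runs at display_fps / capture_fps of real time.
    return display_fps / capture_fps

def playback_duration(num_recorded_frames, display_fps=60):
    # A one-second zoom captured at 240 frames/second is rendered over
    # four seconds of display time.
    return num_recorded_frames / display_fps
```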
A skip mode (mode 5) is a reproduction mode that may be used when a user makes a mistake in a zoom operation. For example, when a user increases the zoom magnification excessively and then hurriedly decreases it, such an erroneous operation is automatically detected, and reproduction of the images captured by the erroneous operation is skipped. When a zoom-in or zoom-out operation is repeated in a short period of time because of an erroneous operation, this mode is effective in removing the images captured by the erroneous operation.
During the period between time t3 and time t9 when the zoom magnification is changing, captured frames are recorded at 1/240-second intervals. When the zoom operation is not carried out, the captured frames are thinned and recorded at 1/60-second intervals. In the example illustrated in
The skip mode will be described in detail with reference to
In
Thus, when a user captures a video in which the zoom magnification overshoots because of an erroneous zoom operation or adjustment, the user can reproduce and display the video without the overshoot portion by using the skip mode.
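A minimal sketch of the skip mode follows, assuming a zoom-in whose magnification overshoots above the value at which the zoom finally settles; the detection rule here is a deliberate simplification of the automatic detection described above.

```python
def skip_overshoot(frames):
    # `frames`: (time, zoom_magnification) pairs. Frames whose
    # magnification exceeds the finally settled value are treated as
    # the erroneous overshoot portion and skipped during reproduction.
    settled = frames[-1][1]
    return [(t, z) for (t, z) in frames if z <= settled + 1e-9]
```

A zoom-out overshoot would be handled symmetrically, skipping frames whose magnification falls below the settled value.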
In the case of the example illustrated in
While five special reproduction modes have thus been described, any one of these special reproduction modes may be specified for each scene as a display attribute. A plurality of these special reproduction modes may be used in combination. Further, the above special zoom reproduction effects have been described based on examples where a video is zoomed in from the wide end to the telephoto end. However, needless to say, the same processing is possible when a video is zoomed out from the telephoto end to the wide end.
In step S15, the system control unit 50 controls reproduction and output of the recorded audio so that the audio remains continuous, irrespective of the selection of reproduced and output frames. For example, the system control unit 50 appropriately adjusts the time axis and/or deletes unnecessary portions (silent or prolonged portions, for example).
Referring back to
In step S17, if the system control unit 50 receives instructions to stop reproduction from the operation unit 52 or completes reproduction of the object to be reproduced (YES in step S17), the system control unit 50 ends the reproduction processing. If not (NO in step S17), the processing returns to step S11.
The system control unit 50 is connected to the operation unit 52 and also to the compression/expansion unit 44, the recording medium control unit 46, the memory 40, and the memory control unit 42 via the data bus 38. The system control unit 50 includes a zoom operation detection unit 50-1, a frame rate control unit 50-2, and a file management data generation unit 50-3 as a recording system. Further, the system control unit 50 includes a file management data detection unit 50-4, a display frame control unit 50-5, and an audio control unit 50-6 as a reproducing system.
During recording, the zoom operation detection unit 50-1 detects whether a zoom operation is carried out by the user and the zoom speed during a zoom operation. The zoom operation detection unit 50-1 supplies the detected zoom operation information to the frame rate control unit 50-2. The frame rate control unit 50-2 controls the recording frame rate, based on the zoom operation information supplied from the zoom operation detection unit 50-1.
The file management data generation unit 50-3 stores frame rate and zoom operation information about video data stored in the recording medium 48 in the file management data as metadata. Needless to say, the file management data generation unit 50-3 may store the recording frame rate and zoom operation information in another file different from an image file, as long as each frame of video data is associated with the recording frame rate and zoom operation information.
During reproduction, the file management data detection unit 50-4 reads the metadata about video data to be reproduced from the recording medium 48, and supplies the metadata to the display frame control unit 50-5 and the audio control unit 50-6. The display frame control unit 50-5 controls display frames as described above, based on the metadata supplied from the file management data detection unit 50-4 and a specified reproduction mode. Similarly, the audio control unit 50-6 controls output audio as described above, based on the metadata supplied from the file management data detection unit 50-4 and a specified reproduction mode.
As described above in detail, according to the present exemplary embodiment, frames captured during a zoom operation are recorded at a high frame rate, and during reproduction, based on a specified effect, certain frames are re-arranged on the time axis. In this way, videos during zoom operations can be reproduced and displayed with various display effects.
Since videos are captured at a high frame rate, frame images of a desired zoom magnification can be extracted, and high-quality and smooth interpolation can be realized. Additionally, even when a video captured during a zoom operation is visually undesirable because of a human error or the like during the zoom operation, by carrying out an interpolation process, the video can be reproduced without the undesirable portion. By separately processing the video and audio during such interpolation process, while some frames are skipped in the video, continuity of the audio can be maintained and reproduced without a break.
In the above exemplary embodiments, the camera unit constantly captures frames at a high frame rate, and the frame rate at which the frames are recorded in the recording medium 48 is decreased when necessary. However, the present invention is not limited to such an example. For example, similar effects can be obtained by changing the frequency of the timing signal output from the image sensor driving unit 28 so that the frame rate at which the camera unit captures frames is itself changed more flexibly.
While a video captured during a zoom operation often includes important scenes, videos captured before and after a zoom operation similarly often include important scenes. Thus, frames are captured at a high frame rate irrespective of whether a zoom operation is on or off, so that video data captured during certain periods before and after the zoom operation can be recorded in the recording medium 48 at the high frame rate. During reproduction, special reproduction effects similar to those of the first exemplary embodiment are applied to a zoom operation period and the periods before and after it. In this way, a video can be reproduced smoothly during the zoom operation period and the periods before and after it.
As seen from
To record frames captured during period D1, which is before the zoom operation, in the recording medium 48, a buffer memory capable of storing video data output from the camera signal processing unit 36 for more than period D1 is set in the memory 40 in advance.
If the start of a zoom operation is detected, the frames captured during period D1, the zoom operation, and period D2 and stored in the buffer memory are encoded without changing the frame rate and recorded in the recording medium 48. If the start of a zoom operation is not detected, the frames captured after period D1 and stored in the buffer memory are thinned to 1/60-second intervals; the resulting frames are then encoded and recorded in the recording medium 48.
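The pre-roll buffering scheme above can be sketched as follows. This is a hypothetical model under stated assumptions (capture at 1/240-second intervals, normal recording at 1/60-second intervals, a ring buffer sized to period D1); the class and member names are illustrative, and a simple list stands in for the recording medium 48.

```python
from collections import deque

class PreRollBuffer:
    """Sketch of pre-zoom buffering: frames are held long enough to
    cover period D1. On zoom start, buffered frames are flushed at the
    full capture rate; otherwise frames leaving the buffer are thinned
    to the normal rate before recording."""

    CAPTURE_FPS = 240
    NORMAL_FPS = 60
    THIN_STEP = CAPTURE_FPS // NORMAL_FPS  # keep every 4th frame

    def __init__(self, d1_seconds):
        self.buffer = deque(maxlen=int(d1_seconds * self.CAPTURE_FPS))
        self.recorded = []   # stands in for the recording medium 48
        self._phase = 0      # counts evicted frames for thinning

    def push(self, frame, zoom_started):
        if zoom_started:
            # Flush period-D1 frames at the full 240 fps rate,
            # then continue recording at full rate.
            self.recorded.extend(self.buffer)
            self.buffer.clear()
            self.recorded.append(frame)
            return
        if len(self.buffer) == self.buffer.maxlen:
            # Oldest frame falls outside period D1: thin it to 60 fps.
            oldest = self.buffer.popleft()
            if self._phase % self.THIN_STEP == 0:
                self.recorded.append(oldest)
            self._phase += 1
        self.buffer.append(frame)
```

With a buffer covering eight capture intervals, frames that age out without a zoom start are recorded one-in-four, while a detected zoom start causes the entire buffered pre-roll to be recorded at full rate.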
Double circles represent frames captured at 1/240-second intervals, and single circles represent frames captured at 1/60-second intervals. According to the present exemplary embodiment, the frames represented by single circles, captured at 1/60-second intervals, are recorded in the recording medium 48 irrespective of whether a zoom operation is in progress. In addition to the period between time t3 and time t9, during which the zoom magnification changes, video data captured during the period between time t1 and time t3, before the zoom operation, and during the period between time t9 and time t11, after the zoom operation, is also recorded in the recording medium 48 at 1/240-second intervals, a frame rate higher than the normal rate.
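The relationship between the two intervals reduces to simple arithmetic: recording a 1/240-second capture stream at 1/60-second intervals means keeping one of every four captured frames. The following sketch (frame indices are illustrative) checks this:

```python
capture_interval = 1 / 240   # high-rate capture period, seconds ("double circles")
record_interval = 1 / 60     # normal recording period, seconds ("single circles")

# Number of captured frames per normally recorded frame.
step = round(record_interval / capture_interval)  # step == 4

captured = list(range(12))        # frame indices captured at 240 fps
normal_stream = captured[::step]  # frames kept for the normal-rate stream
# normal_stream == [0, 4, 8]
```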
Thus, according to the present exemplary embodiment, video is recorded in the recording medium 48 at a high frame rate not only during a zoom operation period but also during certain periods before and after it. Generally, when a user carries out a zoom operation, the video captured before and after a zoom-in or zoom-out operation includes the target object. Such video therefore contains important scenes that reflect the photographer's intention, for example goal scenes in sports games or close-up facial expressions. By recording the video captured before and after a zoom operation at a high frame rate as described above, that video can be displayed effectively.
While the above exemplary embodiments use an optical zoom, which optically zooms the optical image incident on the image sensor 26, the present invention is similarly applicable to an imaging apparatus that uses an electronic zoom, which electronically zooms the image signal generated by the image sensor 26.
In the above description, the rate of a video signal has been described as a frame rate. In the case of an interlaced signal, however, similar effects can be obtained by replacing the frame rate with a field rate.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all modifications, equivalent structures, and functions.
Number | Date | Country | Kind |
---|---|---|---|
2009-121535 | May 2009 | JP | national |
This application is a continuation of U.S. Pat. No. 8,508,627 filed Sep. 6, 2012, which is a continuation of U.S. Pat. No. 8,264,573 filed May 11, 2010, which claims the benefit of and priority to Japanese Patent Application No. 2009-121535 filed May 20, 2009, the contents of each of which are hereby incorporated by reference herein in their entirety.
Number | Name | Date | Kind |
---|---|---|---|
6897896 | Mizumura | May 2005 | B1 |
7362352 | Ueyama | Apr 2008 | B2 |
7450162 | Shioji et al. | Nov 2008 | B2 |
7456875 | Kashiwa | Nov 2008 | B2 |
7679657 | Morita | Mar 2010 | B2 |
7940324 | Hibino et al. | May 2011 | B2 |
RE42728 | Madrane | Sep 2011 | E |
8154606 | Tokuyama | Apr 2012 | B2 |
8204355 | Saito et al. | Jun 2012 | B2 |
8577161 | Suzuki | Nov 2013 | B2 |
20040027367 | Pilu | Feb 2004 | A1 |
20050052553 | Kido et al. | Mar 2005 | A1 |
20050068339 | Lipsky et al. | Mar 2005 | A1 |
20050068340 | Lipsky et al. | Mar 2005 | A1 |
20050081247 | Lipsky et al. | Apr 2005 | A1 |
20080043123 | Shimomura et al. | Feb 2008 | A1 |
20090046175 | Ozawa et al. | Feb 2009 | A1 |
20090195675 | Haneda | Aug 2009 | A1 |
20100002002 | Lipsky et al. | Jan 2010 | A1 |
20100026848 | Kakehi | Feb 2010 | A1 |
20100039536 | Dahllof et al. | Feb 2010 | A1 |
20100079620 | Kuriyama | Apr 2010 | A1 |
20110199496 | Muraki et al. | Aug 2011 | A1 |
20120092525 | Kusaka | Apr 2012 | A1 |
Number | Date | Country |
---|---|---|
2001-358984 | Dec 2001 | JP |
2002-300457 | Oct 2002 | JP |
2002-354329 | Dec 2002 | JP |
2003-274360 | Sep 2003 | JP |
2004-253904 | Sep 2004 | JP |
2004-357118 | Dec 2004 | JP |
2005-086499 | Mar 2005 | JP |
2007-134991 | May 2007 | JP |
2008-005051 | Jan 2008 | JP |
2011-119854 | Jun 2011 | JP |
2008050806 | May 2008 | WO |
2010017852 | Feb 2010 | WO |
Entry |
---|
Japanese Office Action issued in corresponding application No. 2009-121535 on May 14, 2013. |
Number | Date | Country | |
---|---|---|---|
20130343732 A1 | Dec 2013 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 13605065 | Sep 2012 | US |
Child | 13954888 | US | |
Parent | 12778017 | May 2010 | US |
Child | 13605065 | US |