1. Field of the Invention
The present invention relates to an image processing apparatus, an image processing method, and a storage medium storing a program appropriate for displaying thumbnail images corresponding to frames constituting a moving image.
2. Description of the Related Art
Conventionally, there is a technique for an imaging apparatus to record image data together with positioning information indicating a position and an altitude at which an image is captured. The imaging apparatus then displays the image on a map based on the positioning information (as in Japanese Laid-Open Patent Application No. 2006-157810).
However, according to the conventional technique, if the images are captured at positions and altitudes close to one another, such as during continuous shooting, the images are displayed overlapping one another and thus become difficult to view. In particular, if a moving image is captured underwater, the images are often captured within a small area over a long period of time, so that the shooting positions cannot be appropriately expressed.
The present invention provides an image processing apparatus comprising a generation unit configured to generate thumbnail images from frames included in a moving image and a display unit configured to arrange and display the thumbnail images based on shooting time and shooting position, with respect to a direction of gravity, of the frames corresponding to the thumbnail images, wherein the generation unit generates the thumbnail images from the frames captured at each of a plurality of predetermined levels in the direction of gravity.
One aspect of the present invention is directed to appropriately expressing the shooting position, with respect to the direction of gravity, of a scene in a moving image when the imaging apparatus moves in the direction of gravity while capturing the moving image. A user can thus easily recognize the shooting position.
Further features of the present invention will become apparent from the following description of exemplary embodiments (with reference to the attached drawings).
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate exemplary embodiments, features, and aspects of the invention and, together with the description, serve to explain the principles of the invention.
In the exemplary embodiments of the present invention to be described below, the moving image captured underwater is an example of the images captured at different altitudes.
Referring to the drawings, a configuration example of the imaging apparatus 110 will be described below. The imaging apparatus 110 includes an imaging lens 111 and an image sensor 112. The image sensor 112 performs photoelectric conversion of the object image formed by the imaging lens 111 and outputs the result as an electric signal.
A camera signal processing unit 113 performs predetermined signal processing on the electric signal output from the image sensor 112 and outputs the result as a camera signal. A recording/reproducing signal processing unit 114 performs predetermined signal processing, such as a compression process, on the camera signal output from the camera signal processing unit 113. The recording/reproducing signal processing unit 114 then records the processed signal as image data in a recording medium 115 such as a memory card. Further, when the imaging apparatus 110 is in a playback mode, the recording/reproducing signal processing unit 114 reproduces the image data recorded in the recording medium 115.
A control unit 116 is a microcomputer for controlling the imaging apparatus 110. A memory 117 stores parameters which the control unit 116 uses to control the imaging apparatus 110. A display unit 118 displays, under the control of the control unit 116 functioning as a display control unit, a through image in a shooting mode, a reproduced image in the playback mode, and icons and text serving as a user interface. A main body operation unit 119, which functions as an instruction unit, is operated by the user to instruct the imaging apparatus 110 to perform operations. An interface unit 120 mediates information input to the imaging apparatus 110 from outside.
A configuration example of the underwater pack 100 will be described below. The underwater pack 100 includes a water pressure sensor 101 and an external operation unit 102. The water pressure sensor 101 detects water pressure. The external operation unit 102 is operated by the user to instruct, via the underwater pack 100, the imaging apparatus 110 housed inside it to perform operations.
The operation of the imaging apparatus 110 according to an exemplary embodiment will be described below.
The image sensor 112 performs photoelectric conversion of the object image formed by the imaging lens 111, and the result is output to the camera signal processing unit 113 as the electric signal. The camera signal processing unit 113 then performs predetermined signal processing, such as gamma correction and white balance processing, on the electric signal output from the image sensor 112. The camera signal processing unit 113 outputs the processed result to the recording/reproducing signal processing unit 114 as the camera signal. The memory 117 stores the parameters used by the camera signal processing unit 113 for performing predetermined signal processing. The control unit 116 thus controls the camera signal processing unit 113 to appropriately perform signal processing according to the parameters stored in the memory 117.
The recording/reproducing signal processing unit 114 then performs predetermined signal processing, such as setting a recording size in a recording mode, on the camera signal output from the camera signal processing unit 113. As a result, the recording/reproducing signal processing unit 114 acquires frames and outputs the frames as moving image data to the recording medium 115. Further, the recording/reproducing signal processing unit 114 outputs the moving image data to be displayed as the through image to the control unit 116. The recording medium 115 thus records, as the moving image data, the signal processed by the recording/reproducing signal processing unit 114. The control unit 116 outputs the moving image data output from the recording/reproducing signal processing unit 114 to the display unit 118. The display unit 118 thus functions as a monitor while the imaging apparatus 110 is capturing the moving images. At the same time, the display unit 118 displays the through image together with an operation mode and a shooting time of the imaging apparatus 110, which belong to the user interface. The above-described series of operations is performed in response to the user operating the main body operation unit 119 in the imaging apparatus 110.
The shooting operation performed when the underwater pack 100 is attached to the imaging apparatus 110 will be described below.
As described above, the underwater pack 100 includes the external operation unit 102, and the user performs the shooting operation and a playback operation of the imaging apparatus 110 from outside the underwater pack 100. For example, if the user operates a zoom lever (not illustrated) in the external operation unit 102, a member (not illustrated) coupled with a zoom key in the imaging apparatus 110 inside the underwater pack 100 operates the zoom key. The user can thus change a shooting angle.
Further, as described above, the underwater pack 100 includes the water pressure sensor 101, which detects the water pressure. The imaging apparatus 110 is thus capable of acquiring, via the interface unit 120, the water pressure detected by the water pressure sensor 101, i.e., water pressure information. More specifically, since the interface unit 120 in the imaging apparatus 110 is a jack connector, the water pressure sensor 101 is connectable by inserting a wire plug, i.e., an output line, thereof into the interface unit 120. The connection between the imaging apparatus 110 and the water pressure sensor 101 is not limited to the wire plug. Other methods, such as wireless communication and short-range wireless communication, may be employed for the connection, as long as signals can be transmitted and received. According to an exemplary embodiment, the underwater pack 100 includes the water pressure sensor 101. However, a similar operation may be realized when the water pressure sensor 101 is installed in the main body of the imaging apparatus 110.
The process in which the imaging apparatus 110 converts the water pressure information acquired from the water pressure sensor 101 to water depth information, and records the moving image data by attaching the water depth information as metadata will be described below with reference to the flowchart illustrated in
When the user switches the imaging apparatus 110 to the shooting mode, the process starts. In step S201, the control unit 116 determines whether the current shooting mode is an underwater shooting mode. Since, unlike in the air, an infrared component of sunlight is absorbed underwater, it is important to appropriately control the white balance of the imaging apparatus 110 when capturing images underwater. The control unit 116 therefore determines whether the user has set the imaging apparatus 110 to the underwater shooting mode before capturing images underwater. If the underwater shooting mode is set (YES in step S201), the process proceeds to step S202. If the underwater shooting mode is not set (NO in step S201), the process proceeds to step S206.
In step S202, the control unit 116 stands by until detecting that the user has pressed a trigger key, for example, a shooting start key, in the imaging apparatus 110. If the control unit 116 detects that the user has pressed the trigger key (YES in step S202), the process proceeds to step S203. In step S203, the control unit 116 acquires, via the interface unit 120, the water pressure detected by the water pressure sensor 101 as the water pressure information.
In step S204, the control unit 116 functions as an acquisition unit, and converts the water pressure information acquired in step S203 to the water depth information. Since the water pressure is proportional to the water depth, the control unit 116 calculates the water depth information by multiplying the water pressure information by a constant. The control unit 116 selects the constant to be used in the calculation appropriately from constants stored in a data table in the memory 117.
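The conversion in step S204 amounts to a single multiplication. The following is a minimal sketch, assuming the sensor reports gauge pressure and that the constant table distinguishes freshwater from seawater; the names and values are illustrative assumptions, not taken from the patent.

```python
# Gauge pressure grows linearly with depth (P = rho * g * d), so the stored
# constant can simply be 1 / (rho * g), in meters per pascal. This table
# stands in for the data table described as held in the memory 117.
PRESSURE_TO_DEPTH = {
    "freshwater": 1.0 / (1000.0 * 9.8),
    "seawater": 1.0 / (1025.0 * 9.8),
}

def water_depth_m(gauge_pressure_pa: float, water_type: str = "seawater") -> float:
    """Convert gauge water pressure (Pa) to water depth (m)."""
    return gauge_pressure_pa * PRESSURE_TO_DEPTH[water_type]

# About 100 kPa of gauge pressure corresponds to roughly 10 m of seawater.
print(round(water_depth_m(100_000.0), 1))  # -> 10.0
```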
In step S205, the control unit 116 generates the moving image data by performing the above-described procedure. The control unit 116 then attaches to each frame in the moving image data the shooting mode information and the water depth information converted in step S204 as the metadata, and records the resulting moving image data in the recording medium 115. The process thus ends.
On the other hand, in step S206, the control unit 116 stands by until detecting that the user has pressed the trigger key, i.e., the shooting start key, in the imaging apparatus 110. If the control unit 116 detects that the user has pressed the trigger key (YES in step S206), the process proceeds to step S207. In step S207, the control unit 116 generates the moving image data by performing the above-described procedure. The control unit 116 then causes the recording/reproducing signal processing unit 114 to record the generated moving image data in the recording medium 115. The process thus ends.
As described above, according to an exemplary embodiment, the imaging apparatus 110 becomes capable of recording the moving image data with the detection result of the water pressure sensor 101 attached thereto as the water depth information. If the user captures images underwater without setting the imaging apparatus 110 to the underwater shooting mode, the imaging apparatus 110 may display a warning to prompt the user to switch the shooting mode.
The process performed for reproducing the moving image captured by the imaging apparatus 110 will be described below with reference to the flowchart illustrated in
Further, the process for reproducing the moving image described below is not only performed by the imaging apparatus 110; the process may be similarly realized by an information processing apparatus, such as a computer apparatus or a mobile communication apparatus, capable of importing the moving images from the imaging apparatus 110. In such a case, the information processing apparatus is set to the playback mode by a control unit in the apparatus, which activates software stored in a storage medium, such as an operating system (OS) and a moving image reproduction application program.
If the control unit 116 detects that the imaging apparatus 110 has been switched to the playback mode, the control unit 116 displays on the display unit 118 a screen as illustrated in
The processes illustrated in
When the user has selected the moving image on the screen illustrated in
In step S302, the control unit 116 determines whether the selected moving image has been captured underwater. The control unit 116 makes the determination by confirming whether the water depth information is attached to the moving image data as metadata. If the moving image has been captured underwater (YES in step S302), the process proceeds to step S303. If the moving image has not been captured underwater (NO in step S302), the process proceeds to step S306.
In step S303, the control unit 116 acquires, from among the water depth information attached to the moving image data recorded in the recording medium 115, the water depth information of the predetermined time intervals synchronous with the frames corresponding to the thumbnail images generated in step S301. In step S304, the control unit 116 calculates y-coordinates, or vertical positions, based on the water depth information acquired in step S303.
In step S305, the control unit 116 displays on the display unit 118 the thumbnail images generated in step S301. More specifically, the control unit 116 displays the thumbnail images in a predetermined display area in the screen, arranged at the positions corresponding to the y-coordinates, or vertical positions, calculated in step S304. The control unit 116 also displays in the display area a scale mark indicating the water depth. The process then ends.
The moving image is divided into scenes at predetermined time intervals along a time axis. A selection cursor 409 indicates an area corresponding to one scene in the moving image. A darkened portion in the selection cursor 409 indicates the area of the scene in the moving image corresponding to the currently displayed thumbnail image. The thumbnail images are generated from the top frame of each scene corresponding to the predetermined time interval.
If the user moves the selection cursor 409, the displayed thumbnail images 404, 405, 406, 407, and 408 are also updated, so that the thumbnail images displayed on the screen change.
Scale marks 410 indicate the y-coordinates according to a scale calculated from the range of the water depth information. Water depth values based on the water depth information are displayed alongside the scale marks, and the thumbnail images 404, 405, 406, 407, and 408 are arranged at the y-coordinate positions corresponding to their water depths. In the screen illustrated in
Further, the range of the water depth information differs according to the shooting condition of each moving image. The maximum value and the minimum value of the water depth are thus acquired, and the fineness of the scale of the water depth set in the display area is changed according to the range of the water depth information.
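The following is a minimal sketch, assuming a linear mapping, of the layout described in steps S303 to S305: per-thumbnail water depths are mapped to y-coordinates in a fixed display area, and a round scale step is chosen from the depth range. All names and pixel values are illustrative assumptions, not taken from the patent.

```python
def depth_to_y(depth_m, min_depth, max_depth, area_top_px=40, area_height_px=400):
    """Linearly map a water depth to a y pixel position (shallowest at top)."""
    span = max(max_depth - min_depth, 1e-6)  # guard against a zero range
    return area_top_px + (depth_m - min_depth) / span * area_height_px

def scale_step(min_depth, max_depth, target_marks=5):
    """Pick a round step (1, 2, 5, 10, ... m) so the mark count suits the range."""
    span = max(max_depth - min_depth, 1e-6)
    for step in (1, 2, 5, 10, 20, 50, 100):
        if span / step <= target_marks:
            return step
    return 100

depths = [3.2, 8.5, 14.0, 21.7, 18.3]  # depths synchronous with five thumbnails
lo, hi = min(depths), max(depths)
print(scale_step(lo, hi))                              # -> 5 (a mark every 5 m)
print([round(depth_to_y(d, lo, hi)) for d in depths])  # per-thumbnail y positions
```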
Returning to
As described above, according to the present exemplary embodiment, the water depth information is recorded in association with each frame of the moving image captured underwater using the imaging apparatus 110 covered by the underwater pack 100. The thumbnail images are then generated at predetermined time intervals from the moving image. Further, the y-coordinates in the display area are calculated based on the water depth information of the predetermined time intervals synchronous with the generated thumbnail images. The thumbnail images are thus arranged and displayed in the display area at the calculated y-coordinate positions in chronological order. As a result, the imaging apparatus 110 becomes capable of explicitly and simply notifying the user, in the playback standby state, of the change in the water depth of the moving image along the time axis. The user can thus visually recognize the water depth at which the moving image has been captured, along with the change in time. Further, since the imaging apparatus 110 displays the thumbnail images arranged according to the water depth information, the user can recognize from the captured object images the approximate water depth at which organisms and plants live. Furthermore, the user can easily recognize from the thumbnail images the time or the scene at which an object unique to each water depth has been captured.
Moreover, the display method on the screen is switched according to whether the imaging apparatus 110 is set to the underwater shooting mode or a normal shooting mode. The user can thus easily recognize whether the moving image has been captured underwater or in the normal shooting mode. For example, the user can easily identify whether the moving image has been captured by a user of the imaging apparatus diving underwater, or by shooting an aquarium from the outside in the normal shooting mode.
The control unit 116 starts the process of the flowchart illustrated in
In step S602, the control unit 116 acquires from the metadata recorded in the recording medium 115 the water depth information of the scene currently being reproduced. More specifically, the control unit 116 acquires the stored water depth information associated with the first frame of the scene in the moving image being reproduced. Further, the control unit 116 acquires the water depth information of the plurality of frames at predetermined time intervals, and the water depth information and a scene number of each scene recorded in the recording medium 115. The control unit 116 thus generates reproducing process extension information from the acquired information.
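One possible shape for the reproducing process extension information assembled in step S602 is sketched below; the type and field names are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class SceneEntry:
    scene_number: int
    start_frame: int       # top frame of the scene
    water_depth_m: float   # depth stored with the scene's first frame

# Built once when playback starts, then searched by the up/down-key (depth)
# and left/right-key (scene-number) jump logic described below.
extension_info = [
    SceneEntry(scene_number=1, start_frame=0,    water_depth_m=2.0),
    SceneEntry(scene_number=2, start_frame=900,  water_depth_m=11.5),
    SceneEntry(scene_number=3, start_frame=1800, water_depth_m=11.5),
]
```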
In step S603, the control unit 116 determines whether the user has pressed the up key/down key in the main body operation unit 119. If the user has pressed the up key/down key (YES in step S603), the process proceeds to step S604. If the user has not pressed the up key/down key (NO in step S603), the process proceeds to step S606.
In step S604, the control unit 116 determines whether there is a scene captured at a water depth less than or greater than the water depth of the scene currently being reproduced. For example, if the user has operated the up key/down key to instruct an upward move, the control unit 116 searches the reproducing process extension information for a scene of lesser water depth than the current scene. On the other hand, if the user has operated the up key/down key to instruct a downward move, the control unit 116 searches the reproducing process extension information for a scene of greater water depth than the current scene. If there is a scene of lesser or greater water depth (YES in step S604), the process proceeds to step S605. If there is no such scene (NO in step S604), the process proceeds to step S606.
In step S605, the control unit 116 jumps to the scene captured at a water depth less than or greater than that of the current scene, according to the key operation in step S603. The control unit 116 then starts reproducing from the top frame of that scene. In such a case, the control unit 116 updates the water depth information of the scene currently being reproduced to the water depth information of the scene which has been jumped to.
In step S606, the control unit 116 determines whether the scene currently being reproduced has reached the end. If the scene currently being reproduced has reached the end (YES in step S606), the process ends. On the other hand, if the control unit 116 is still in the process of reproducing the scene (NO in step S606), the process returns to step S603.
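A sketch of the step S604 search over the reproducing process extension information (the SceneEntry list sketched earlier) follows. Choosing the scene nearest in depth in the requested direction is an assumption; the patent only requires that some lesser- or greater-depth scene be found.

```python
def find_depth_jump(scenes, current, direction):
    """Return the scene nearest in depth above ("up") or below ("down")."""
    if direction == "up":   # up key: look for a scene of lesser water depth
        shallower = [s for s in scenes if s.water_depth_m < current.water_depth_m]
        return max(shallower, key=lambda s: s.water_depth_m, default=None)
    deeper = [s for s in scenes if s.water_depth_m > current.water_depth_m]
    return min(deeper, key=lambda s: s.water_depth_m, default=None)

# A None result corresponds to NO in step S604: playback simply continues.
```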
As described above, according to the present exemplary embodiment, the imaging apparatus 110 can jump between scenes based on the water depth information while reproducing the moving image. As a result, the user can view the scene in the moving image captured at the water depth in which a target object image exists.
Thus, the water depth information in underwater image capturing can be used as the information indicating the shooting position with respect to the direction of gravity. Further, when the imaging apparatus 110 performs normal image capturing, a predefined value is used as the altitude information. However, when normal image capturing is performed, the altitude may instead be measured and recorded, and the recorded altitude information may be used as the information indicating the shooting position with respect to the direction of gravity, similarly to underwater image capturing described above. The thumbnail images may then be arranged at positions corresponding to the altitudes.
According to a second exemplary embodiment, when the imaging apparatus 110 is in the playback standby state, the imaging apparatus 110 allows the user to visually recognize the water depth at which the moving image is captured. Further, the imaging apparatus 110 allows the user to visually recognize timing at which the imaging apparatus 110 switches between underwater image capturing and normal image capturing. Descriptions for the configurations and the processes similar to those described with respect to the first exemplary embodiment are omitted.
The process performed by the imaging apparatus 110 according to the second exemplary embodiment is described with reference to the flowchart illustrated in
The user selects the moving image to be reproduced on the screen illustrated in
In step S802, the control unit 116 classifies the moving image based on the shooting mode information acquired in step S801. In step S803, the control unit 116 acquires the frames corresponding to predetermined time intervals in the moving image captured in the underwater shooting mode. The control unit 116 then generates the thumbnail images from the acquired frames. The method for generating the thumbnail images is similar to the method described with respect to the first exemplary embodiment.
In step S804, the control unit 116 acquires the water depth information of predetermined time intervals synchronous with the frames corresponding to the thumbnail images generated in step S803. The control unit 116 acquires such water depth information from among the water depth information attached to the moving image data recorded in the recording medium 115. In step S805, the control unit 116 calculates the y-coordinates based on the water depth information of the moving image captured in the underwater shooting mode.
In step S806, the control unit 116 generates the thumbnail images from the frames corresponding to predetermined time intervals in the moving image captured in the normal shooting mode classified in step S802. In step S807, the control unit 116 reads the defined y-coordinate information from the memory 117. The process of step S807 is performed so that the y-coordinates of the display positions of the thumbnail images are not changed for the moving image captured in the normal shooting mode, unlike for the moving image captured in the underwater shooting mode.
In step S808, the control unit 116 arranges the thumbnail images generated in step S803 in the predetermined display area in the screen, at the positions indicated by the y-coordinates calculated based on the water depth information in step S805. The control unit 116 then displays the thumbnail images on the display unit 118. In such a case, the control unit 116 also displays in the display area the scale marks indicating the water depths. In step S809, the control unit 116 arranges the thumbnail images generated in step S806 in the predetermined display area in the screen, at the defined y-coordinate positions read in step S807. The control unit 116 then displays the thumbnail images on the display unit 118. The process then ends.
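The following sketch illustrates the step S802 to S809 flow: frames tagged with their shooting mode receive depth-based y-coordinates when captured underwater and a fixed y-coordinate otherwise. It reuses depth_to_y from the earlier sketch; the frame records and the fixed coordinate are illustrative assumptions.

```python
FIXED_Y_PX = 240  # stands in for the defined y-coordinate read in step S807

def layout_thumbnails(frames, min_depth, max_depth):
    """Return (frame, y) pairs for display, kept in chronological order."""
    placed = []
    for frame in frames:
        if frame["mode"] == "underwater":
            y = depth_to_y(frame["depth_m"], min_depth, max_depth)
        else:
            y = FIXED_Y_PX  # normal-mode thumbnails stay at a constant height
        placed.append((frame, y))
    return placed

frames = [
    {"time_s": 0,  "mode": "normal",     "depth_m": None},  # e.g. an aquarium shot
    {"time_s": 30, "mode": "underwater", "depth_m": 4.0},
    {"time_s": 60, "mode": "underwater", "depth_m": 12.5},
]
print(layout_thumbnails(frames, 4.0, 12.5))
```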
Referring to
In the example illustrated in
As described above, according to the present exemplary embodiment, the moving images captured by the imaging apparatus 110 covered by the underwater pack 100 are recorded with the shooting mode information and the water depth information attached thereto. When the imaging apparatus 110 is then switched to the playback mode, the moving images are classified into those captured in the underwater shooting mode and those captured in the normal shooting mode. The imaging apparatus 110 generates from each of the classified moving images the thumbnail images corresponding to predetermined time intervals. Further, the imaging apparatus 110 reads the water depth information of the predetermined time intervals synchronous with the thumbnail images, calculates the y-coordinates in the display area, and displays the thumbnail images in the display area in the screen at the positions indicated by the y-coordinates.
As a result, when the imaging apparatus 110 captures the moving images in the normal shooting mode and the underwater shooting mode, the imaging apparatus 110 is capable of explicitly and simply notifying the user of the change in the water depth during underwater image capturing. Further, the user can visually recognize the scene in which the shooting mode has been switched from the normal shooting mode to the underwater shooting mode. Likewise, the imaging apparatus 110 can visually indicate the scene in which the user has switched the shooting mode from the underwater shooting mode to the normal shooting mode. Moreover, the imaging apparatus 110 can convey to the user, through the object images included in the thumbnail images, the approximate water depth at which organisms and plants live.
According to a third exemplary embodiment, an example in which the thumbnail images are generated at predetermined water depth intervals from the moving image captured underwater will be described below. The descriptions relevant to the third exemplary embodiment which are similar to those already described above for the first and second exemplary embodiments will be omitted.
A file format of the metadata in the moving image data recorded in the recording medium 115 will be described below with reference to
When the user selects the moving image on the screen, as illustrated in
In step S1103, the control unit 116 displays on the display unit 118 the maximum value and the minimum value of the water depth calculated in step S1102. The process then ends. For example, if the user has selected the representative image displayed in the selection frame 401 in the screen illustrated in
The control process performed for displaying the thumbnail images with respect to each water depth of the position at which the moving image has been captured will be described below with reference to the flowchart illustrated in
In step S1201, the control unit 116 reads from the recording medium 115 the metadata of the moving image. In step S1202, the control unit 116 analyzes the metadata and calculates the change in the water depth of the moving image as illustrated in
For example, if the predetermined water depth interval is 10 meters, the control unit 116 generates the thumbnail images from each of the frames constituting the moving image, captured at the water depths of 10 meters, 20 meters, and 30 meters respectively. The control unit 116 then groups the thumbnail images according to the moving image from which the thumbnail images are generated, and displays the grouped thumbnail images. A group 1301 illustrated in
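A sketch of the idea behind this grouping follows: for each multiple of the depth interval, the first frame whose recorded depth falls in that level is selected. Treating "captured at 10 meters" as the first frame in the 10 m band is an illustrative assumption.

```python
def frames_at_depth_intervals(depth_per_frame, interval_m=10.0):
    """Map each reached depth level (10 m, 20 m, ...) to a frame index."""
    levels = {}
    for i, depth in enumerate(depth_per_frame):
        level = int(depth // interval_m) * interval_m
        if level > 0 and level not in levels:
            levels[level] = i  # first frame recorded at this depth level
    return levels

depths = [2.0, 6.5, 11.0, 18.0, 23.5, 31.2, 27.0]  # per-frame water depths
print(frames_at_depth_intervals(depths))  # {10.0: 2, 20.0: 4, 30.0: 5}
```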
As described above, the imaging apparatus 110 generates the thumbnail images at predetermined water depth intervals from the moving image and displays the thumbnail images in groups. The imaging apparatus 110 is thus capable of explicitly notifying the user of the range of the water depth in which one moving image has been captured. Further, the imaging apparatus 110 displays a scroll bar 1302 in the screen illustrated in
The positions for displaying the thumbnail images may also be changed according to time. The control performed for changing the positions at which the thumbnail images are displayed according to time will be described below with reference to
Since the processes performed in step S1401 to step S1403 are similar to those performed in step S1201 to step S1203 illustrated in
In step S1404, the control unit 116 calculates the y-coordinates at which the thumbnail images generated from the frames captured at predetermined time intervals are to be displayed. The control unit 116 may calculate the y-coordinates from the water depths of the frames, captured at predetermined time intervals, from which the thumbnail images are generated. Further, the control unit 116 may calculate the y-coordinates for frames captured at arbitrary times, based on the change in the water depth of the moving image with respect to time. In step S1405, the control unit 116 displays, at the positions corresponding to the y-coordinates calculated in step S1404, the thumbnail images generated from the frames captured at the designated times.
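A sketch of the arbitrary-time variant follows: the water depth at time t is estimated from the recorded depth-versus-time samples, here by linear interpolation (the interpolation method is an assumption), and can then be mapped to a y-coordinate with depth_to_y from the earlier sketch.

```python
def depth_at_time(samples, t):
    """samples: (time_s, depth_m) pairs in ascending time order."""
    if t <= samples[0][0]:
        return samples[0][1]
    for (t0, d0), (t1, d1) in zip(samples, samples[1:]):
        if t0 <= t <= t1:
            return d0 + (d1 - d0) * (t - t0) / (t1 - t0)
    return samples[-1][1]  # past the last sample: hold the final depth

samples = [(0, 0.0), (60, 12.0), (120, 8.0)]
print(depth_at_time(samples, 90))  # -> 10.0 m at the 90-second mark
```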
According to the present exemplary embodiment, the imaging apparatus 110 changes the thumbnail images and shifts the arrangement of the thumbnail images according to time. The imaging apparatus 110 is thus capable of explicitly notifying the user of the change in the water depth and the range of the water depth when capturing the moving image underwater.
According to the first exemplary embodiment, if the user operates the up key/down key while the imaging apparatus 110 is reproducing the moving image, the scene jumps to a scene of a different water depth. According to a fourth exemplary embodiment, if the user further operates a left key/right key while the imaging apparatus 110 is reproducing the moving image, the scene jumps to a scene of the same water depth. The descriptions corresponding to the fourth exemplary embodiment which are similar to those already provided above with respect to the first, second, or third exemplary embodiments will be omitted.
In step S1501, the control unit 116 sets a flag indicating that the scene corresponding to the water depth information designated by the user operating the up key/down key in step S605 is being reproduced. In step S1502, the control unit 116 determines whether the user has operated the left key/right key. If the user has operated the left key/right key (YES in step S1502), the process proceeds to step S1503. If the user has not operated the left key/right key (NO in step S1502), the process proceeds to step S606.
In step S1503, the control unit 116 determines whether there are other scenes having the same water depth information as the scene currently being reproduced. The control unit 116 makes this determination by searching the reproducing process extension information. If there are other scenes having the same water depth information (YES in step S1503), the process proceeds to step S1504. On the other hand, if there is no other scene having the same water depth information (NO in step S1503), the process proceeds to step S606.
In step S1504, the control unit 116 causes playback to jump to the scene having the same water depth information, according to the direction in which the user has operated the left key/right key in step S1502. The control unit 116 then starts reproducing from the top frame of that scene. In other words, if the user operates the left key/right key towards the right side, playback jumps to a scene of a greater scene number having the same water depth information as the scene currently being reproduced. In contrast, if the user operates the left key/right key towards the left side, playback jumps to a scene of a smaller scene number having the same water depth information as the scene currently being reproduced.
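A sketch of the step S1503/S1504 search follows, again over the SceneEntry list sketched earlier: among scenes whose stored depth matches the current scene's, playback jumps to the nearest scene number in the key direction. The equality tolerance is an illustrative assumption; the patent compares the stored water depth information directly.

```python
def find_same_depth_jump(scenes, current, direction, tol_m=0.5):
    """Return the nearest same-depth scene to the right or left, if any."""
    same = [s for s in scenes
            if s is not current
            and abs(s.water_depth_m - current.water_depth_m) <= tol_m]
    if direction == "right":  # right key: greater scene number, nearest first
        later = [s for s in same if s.scene_number > current.scene_number]
        return min(later, key=lambda s: s.scene_number, default=None)
    earlier = [s for s in same if s.scene_number < current.scene_number]
    return max(earlier, key=lambda s: s.scene_number, default=None)
```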
In step S1505, the control unit 116 determines whether the flag is ON. If the flag is ON (YES in step S1505), the process proceeds to step S1501. If the flag is OFF (NO in step S1505), the process proceeds to step S1506.
In step S1506, the control unit 116 determines whether the user has operated the left key/right key. If the user has operated the left key/right key (YES in step S1506), the process proceeds to step S1507. If the user has not operated the left key/right key (NO in step S1506), the process proceeds to step S606.
In step S1507, if the user has operated the left key/right key towards the right side, playback jumps to the scene of a greater scene number than the scene currently being reproduced. If the user has operated the left key/right key towards the left side, playback jumps to the scene of a smaller scene number than the scene currently being reproduced.
According to the present exemplary embodiment, the user can easily confirm the scene which has been captured at the same water depth at a different time.
According to a fifth exemplary embodiment, if the user operates the left key/right key while the imaging apparatus 110 is reproducing the moving image, the imaging apparatus 110 selectively decides whether to jump to the scene of the same water depth or to the next scene regardless of the water depth. The descriptions corresponding to the fifth exemplary embodiment which are similar to those already provided above for the first, second, third, or fourth exemplary embodiments will be omitted.
In step S1601, the control unit 116 determines while reproducing an arbitrary scene, whether the user has operated the left key/right key. If the user has operated the left key/right key (YES in step S1601), the process proceeds to step S1602. If the user has not operated the left key/right key (NO in step S1601), the process proceeds to step S606.
In step S1602, the control unit 116 acquires operation setting information of the left key/right key in the playback mode, which has been preset by the user. The control unit 116 then determines the operation setting of the left key/right key. The operation setting information is generated by the display unit 118 displaying a setting menu and the user selecting a predetermined item on the setting menu. The user can select on the setting menu either “jump to normal scene” or “jump to scene of same water depth” as the operation to be performed by the left key/right key.
If the user has selected “jump to normal scene” and then operates the left key/right key, playback jumps to the scene of the scene number immediately before or immediately after that of the current scene. On the other hand, if the user has selected “jump to scene of same water depth”, playback jumps to the scene of the scene number closest to that of the current scene among the scenes having the same water depth information as the current scene. If the operation setting information indicates “jump to normal scene” (NO in step S1602), the process proceeds to step S1603. If the operation setting information indicates “jump to scene of same water depth” (YES in step S1602), the process proceeds to step S1503.
In step S1603, the control unit 116 determines whether there is a subsequent scene to be reproduced according to the user operation on the left key/right key. If there is a subsequent scene to be reproduced (YES in step S1603), the process proceeds to step S1604. If there is no subsequent scene to be reproduced (NO in step S1603), the process proceeds to step S606.
In step S1604, if the user has operated the left key/right key towards the right side in step S1601, the control unit 116 causes playback to jump to the scene whose scene number is larger than that of the scene currently being reproduced. If the user has operated the left key/right key towards the left side in step S1601, the control unit 116 causes playback to jump to the scene whose scene number is smaller than that of the scene currently being reproduced.
According to the present exemplary embodiment, if the user instructs the imaging apparatus 110 to jump from the scene in the moving image currently being reproduced, the imaging apparatus 110 selectively reproduces, according to the intention of the user, either another scene of the same water depth or another scene regardless of the water depth. User friendliness is thus improved.
Aspects of the present invention can also be realized by a computer of a system or apparatus (or devices such as a CPU or MPU) that reads out and executes a program recorded on a memory device to perform the functions of the above-described embodiment(s), and by a method, the steps of which are performed by a computer of a system or apparatus by, for example, reading out and executing a program recorded on a memory device to perform the functions of the above-described embodiment(s). For this purpose, the program is provided to the computer for example via a network or from a recording medium of various types serving as the memory device (e.g., computer-readable medium).
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all modifications, equivalent structures, and functions. Further, portions of the above-described exemplary embodiments may be combined as appropriate.
This application claims priority from Japanese Patent Application No. 2011-139628 filed Jun. 23, 2011, which is hereby incorporated by reference in its entirety.