The present disclosure relates to display control apparatuses, imaging apparatuses, and display control methods.
Among imaging apparatuses which can capture a moving image (video), such as digital video cameras and the like, there are some that can switch between a plurality of recording modes, depending on a relationship between the input timing of a trigger signal corresponding to a trigger operation, such as pressing down of a record button or the like, and a period of time during which a video is recorded. The recording modes include, for example, a recording mode corresponding to a so-called “start trigger,” in which a video during a predetermined period of time following the input timing of a trigger signal is recorded, and a recording mode corresponding to a so-called “end trigger,” in which a video during a predetermined period of time preceding the input timing of a trigger signal is recorded. For example, Patent Literature 1 discloses a technique of displaying a selected recording mode, as an icon which the user can easily and intuitively recognize, on a display of an imaging apparatus.
Here, a user who is using an imaging apparatus typically performs a trigger operation, depending on the period of time during which recording is desired, while viewing a so-called live view image showing a current state of an object. Therefore, in a recording mode in which a preceding video is recorded, such as the above recording mode corresponding to end trigger, the user cannot perform a trigger operation while directly viewing the video at the start of recording. Consequently, unless the user checks the recorded video after the end of recording, the user cannot verify whether or not a desired video has been recorded, which may be inconvenient for the user.
With the above in mind, the present disclosure proposes a novel and improved display control apparatus, imaging apparatus, and display control method which can further improve the convenience of the user.
According to the present disclosure, there is provided a display control apparatus including: a display control unit configured to delay a captured video for a predetermined time, and display the delayed video. The predetermined time for which the video is delayed is determined on the basis of a timing at which recording is started according to a trigger signal.
According to the present disclosure, there is provided an imaging apparatus including: an imaging unit configured to capture a video; a storage unit configured to store a video spanning a certain period of time captured by the imaging unit; and a display control unit configured to delay the video captured by the imaging unit for a predetermined time, and display the delayed video, using the video stored in the storage unit. The predetermined time for which the video is delayed is determined on the basis of a timing at which recording is started according to a trigger signal.
According to the present disclosure, there is provided a display control method including: causing a processor to delay a captured video for a predetermined time, and display the delayed video. The predetermined time for which the video is delayed is determined on the basis of a timing at which recording is started according to a trigger signal.
According to the present disclosure, a captured video which is delayed for a predetermined time is displayed. Therefore, when a video preceding the user's trigger operation is recorded, the user can perform a trigger operation at a more appropriate timing while checking a delayed video. As a result, the convenience of the user is further improved.
As described above, according to the present disclosure, the convenience of the user can be further improved. Note that the effects described above are not necessarily limitative. With or in the place of the above effects, there may be achieved any one of the effects described in this specification or other effects that may be grasped from this specification.
Hereinafter, (a) preferred embodiment(s) of the present disclosure will be described in detail with reference to the appended drawings. In this specification and the appended drawings, structural elements that have substantially the same function and structure are denoted with the same reference numerals, and repeated explanation of these structural elements is omitted.
Note that description will be provided in the following order.
An imaging apparatus according to this embodiment (corresponding to an imaging apparatus 10 shown in
Note that, although it is data (video information) corresponding to a video that is actually recorded in the first recording medium and the second recording medium, “record (save) video information” may also be herein referred to as “record (save) a video” or the like for the sake of simplicity. Also, in the description that follows, “record a video into the first recording medium” is referred to as “capture a video (image)” or “shoot a video (image),” and “record a video into the second recording medium” is referred to as “record,” as distinct from each other, for the sake of convenience. In other words, in this embodiment, a portion of a captured video is recorded as a video which is intended to be finally recorded.
Here, the imaging apparatus according to this embodiment has a plurality of recording modes, depending on a relationship between the input timing of a trigger signal corresponding to a trigger operation which is performed by the user to start recording (also hereinafter simply referred to as a “trigger operation”) and a period of time during which a video is recorded. The recording modes used in the imaging apparatus according to this embodiment will be described with reference to
A recording mode shown in
A recording mode shown in
Note that, in the description that follows, a predetermined period of time preceding the input timing of a trigger signal in the first recording mode and the second recording mode is also referred to as a “preceding period of time.” In other words, a preceding period of time refers to a period of time from a timing at which a recording period of time starts (recording start timing) to the input timing of a trigger signal. The first recording mode and the second recording mode are both recording modes in which the recording period of time includes a preceding period of time. Also, the first recording mode is a recording mode in which the preceding period of time is the same as the recording period of time.
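By way of a purely illustrative, non-limiting example, the timing relationships of the three recording modes described above may be summarized as in the following sketch; the function name, the string mode identifiers, and the parameters are assumptions introduced only for this illustration and do not represent an actual implementation of the imaging apparatus according to this embodiment.

```python
def recording_period(mode, trigger_time_s, preceding_s, following_s):
    """Return (start, end) of the recording period, in seconds on the
    shooting timeline, for a trigger signal input at trigger_time_s.

    mode is "first" (end trigger), "second", or "third" (start trigger).
    """
    if mode == "first":
        # The recording period of time is the same as the preceding period.
        return trigger_time_s - preceding_s, trigger_time_s
    if mode == "second":
        # The recording period includes the preceding period of time and a
        # predetermined period of time following the trigger signal.
        return trigger_time_s - preceding_s, trigger_time_s + following_s
    # Third recording mode: the recording start timing is the trigger timing.
    return trigger_time_s, trigger_time_s + following_s
```

For example, with a preceding period of 3 seconds and a trigger signal input at t = 10 s, the first recording mode gives the recording period from t = 7 s to t = 10 s.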
A shooting mode shown in
In the foregoing, the recording modes used in the imaging apparatus according to this embodiment have been described with reference to
Note that, in this embodiment, when the recording mode is the first recording mode or the second recording mode, the effect of improving the convenience of the user is particularly exhibited. Therefore, in the description that follows, a case where the recording mode is the first recording mode or the second recording mode will be mainly described.
A functional configuration of the imaging apparatus according to this embodiment will be described with reference to
Referring to
The imaging unit 110 includes an imaging element, such as a charge coupled device (CCD) image sensor, complementary metal-oxide-semiconductor (CMOS) image sensor, or the like, and optical members which control incident light to the imaging element, such as a lens, diaphragm, shutter, and the like. The imaging unit 110 images an object, and acquires an image signal (image information) corresponding to the captured image. Operations of the imaging unit 110, such as driving of the imaging element or the optical members according to shooting conditions, are performed by an imaging control unit 161 described below of the control unit 160. In this embodiment, the imaging unit 110 can acquire image information corresponding to a moving image (video) by continuously shooting an object at a predetermined frame rate. The frame rate may have a value corresponding to so-called high-speed shooting. The imaging unit 110 provides image information corresponding to an acquired video to a captured video processing unit 162 described below of the control unit 160.
Note that the imaging apparatus 10 may include an audio input unit (not shown) together with the imaging unit 110. The audio input unit includes a sound collection device which can acquire surrounding sounds, such as a microphone or the like, and can acquire audio information that accompanies the image information corresponding to a video acquired by the imaging unit 110.
The operation unit 120 is an input interface for detecting various operation inputs made by the user. The operation unit 120 includes various input devices, such as various buttons, such as a recording start button, cross button, menu button, and the like, a touchscreen, a remote controller, and the like. Note that the configuration of the operation unit 120 is not limited to this example, and the operation unit 120 may include various known input devices which can be included in commonly used digital video cameras. Also, the operation unit 120 includes an input control circuit which generates an input signal based on the user's operation and provides the input signal to the control unit 160.
The user can input various kinds of information or instructions to the imaging apparatus 10 through the operation unit 120. In this embodiment, the user can input an instruction to set a recording mode to a recording mode setting unit 163 described below of the control unit 160 through the operation unit 120. Also, the user can input an instruction to start shooting to the imaging control unit 161 described below of the control unit 160 through the operation unit 120. Also, the user can input an instruction (i.e., a trigger signal) to start recording to a captured video recording unit 164 described below of the control unit 160 by performing a trigger operation through the operation unit 120 (e.g., the above recording start button).
The display unit 130 is an output interface which displays various kinds of information in various forms, such as text, an image, a figure, a graph, or the like, on a display screen to visually notify the user of the information. The display unit 130 includes various display devices, such as a liquid crystal display (LCD), organic electro-luminescence (EL) display, and the like. In this embodiment, the display unit 130 displays a video which is delayed for a predetermined time, under the control of a display control unit 165 described below of the control unit 160. The predetermined period of time for which a video is delayed may be a period of time corresponding to the preceding period of time in the recording mode which is set. As a result, the video whose display is delayed is the same as the video at the recording start timing. Also, the display unit 130 may display a video showing a current object (so-called live view image) together with a video which is delayed for a predetermined time. Note that the function of the display unit 130 is not limited to this example, and the display unit 130 can display various kinds of information which may be displayed by commonly used imaging apparatuses, such as a setting screen for setting shooting conditions, a recorded video, and the like.
Note that the imaging apparatus 10 may include an audio output unit (not shown) together with the display unit 130. The audio output unit includes an audio output device, such as a loudspeaker or the like, and can output audio corresponding to a video displayed on the display unit 130.
The first storage unit 140 includes a recording medium which can temporarily record information, such as a memory or the like, and stores a video spanning a certain period of time during shooting. In other words, the first storage unit 140 buffers a captured video. During shooting, a video corresponding to a recording start timing corresponding to the recording mode is read from the first storage unit 140, and is provided to the display control unit 165. Also, a video corresponding to a recording period of time is read from the first storage unit 140 according to an operation input indicating start of recording, and is stored into the second storage unit 150. The processes of storing and reading a video to and from the first storage unit 140 are controlled by the captured video recording unit 164 described below of the control unit 160.
A video saved in the first storage unit 140 is updated as appropriate during the time that shooting is continued so that videos are erased in chronological order with the oldest first. The storage capacity of the first storage unit 140 is set so that at least a video corresponding to a recording period of time can be stored. Preferably, the storage capacity of the first storage unit 140 is designed to be equal to the amount of information of a video corresponding to a preceding period of time (a recording period of time in the first recording mode). When the storage capacity of the first storage unit 140 is equal to the amount of information of a video corresponding to a preceding period of time, a video immediately before being erased in the first storage unit 140 corresponds to a video at a recording start timing in the first and second recording modes.
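The buffering behavior of the first storage unit 140 described above may be sketched, again purely as a non-limiting illustration, as follows; the class name and the frame representation are assumptions introduced only for this example, and a frame may be any object, for example one frame of video information.

```python
from collections import deque


class FirstStorageBuffer:
    """Illustrative buffer whose capacity corresponds to the preceding
    period of time (frame_rate * preceding_seconds frames)."""

    def __init__(self, frame_rate, preceding_seconds):
        self.capacity = int(frame_rate * preceding_seconds)
        # deque(maxlen=...) erases frames in chronological order, oldest
        # first, once the capacity is reached (sequential updating).
        self._frames = deque(maxlen=self.capacity)

    def push(self, frame):
        """Store one captured frame, discarding the oldest frame when full."""
        self._frames.append(frame)

    def frame_at_recording_start(self):
        """Return the oldest buffered frame; with the capacity above, this is
        the frame at the recording start timing in the first and second
        recording modes."""
        return self._frames[0] if self._frames else None

    def read_preceding_period(self):
        """Return all buffered frames, i.e., the video of the preceding
        period of time."""
        return list(self._frames)
```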
The second storage unit 150 includes various storage devices, such as a magnetic storage device, semiconductor storage device, optical storage device, magneto-optical storage device, and the like, and stores a video corresponding to a recording period of time, i.e., a video which is intended to be finally recorded. The video storage process performed in the second storage unit 150 is controlled by the captured video recording unit 164 described below of the control unit 160. The storage capacity of the second storage unit 150 is set to be sufficiently large compared to that of the first storage unit 140. For example, the second storage unit 150 can store a plurality of different videos. Note that the second storage unit 150 may include a removable recording medium, such as an optical disc, magneto-optical disc, semiconductor memory, or the like.
Note that the imaging apparatus 10 may further include a storage unit for storing, for example, various kinds of information other than video information processed by the control unit 160, a program for operating the control unit 160, and the like, in addition to the first storage unit 140 and the second storage unit 150. Alternatively, the second storage unit 150 may further store these kinds of information in addition to video information.
The control unit 160 includes various processors, such as a central processing unit (CPU), digital signal processor (DSP), and the like, and performs various calculation processes to control operations of the imaging apparatus 10. The control unit 160 may correspond to a display control apparatus of the present disclosure. As shown, the control unit 160 has, as its functions, the imaging control unit 161, the captured video processing unit 162, the recording mode setting unit 163, the captured video recording unit 164, and the display control unit 165. Note that functions of the control unit 160 including these functions are achieved by a processor included in the control unit 160 operating according to a predetermined program stored in, for example, the second storage unit 150 or other storage devices or the like.
The imaging control unit 161 controls operations of the imaging unit 110. For example, the imaging control unit 161 drives the imaging element and optical members provided in the imaging unit 110, according to preset shooting conditions (e.g., exposure, shutter speed, etc.). The imaging control unit 161 drives the imaging element and the optical members to start shooting, according to a signal indicating start of shooting which is input through the operation unit 120. When shooting is thus started by the imaging unit 110 under the control of the imaging control unit 161, image information corresponding to a video is provided from the imaging unit 110 to the captured video processing unit 162.
The captured video processing unit 162 performs various signal processes on image information acquired by the imaging unit 110 to generate video information corresponding to a video which is shot (captured video). The signal processes performed by the captured video processing unit 162 include, for example, gamma correction, auto gain control (AGC), adjustment of white balance, exposure correction, magnification control corresponding to digital zooming, and the like, which are performed to process image information in commonly used imaging apparatuses. The image information after the signal processes is, for example, information (video information) corresponding to a video which is displayed on the display unit 130 and may be finally visually recognized by the user. The captured video processing unit 162 provides the image information after the processes, i.e., video information, to the captured video recording unit 164 and the display control unit 165.
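Merely as an illustration of the kind of signal processes listed above, the following sketch applies example white-balance gains and gamma correction to a single frame; the gain and gamma values are arbitrary example values and are not parameters of the captured video processing unit 162.

```python
import numpy as np


def process_frame(raw_frame, wb_gains=(1.0, 1.0, 1.0), gamma=2.2):
    """Apply example white-balance gains and gamma correction to an
    H x W x 3 uint8 frame and return the processed uint8 frame."""
    frame = raw_frame.astype(np.float32) / 255.0
    frame = frame * np.asarray(wb_gains, dtype=np.float32)   # white balance
    frame = np.clip(frame, 0.0, 1.0) ** (1.0 / gamma)         # gamma correction
    return (frame * 255.0 + 0.5).astype(np.uint8)
```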
The recording mode setting unit 163 sets the recording mode of the imaging apparatus 10. The recording mode of the imaging apparatus 10 may be selected from, for example, the first recording mode to the third recording mode described above (1. Recording modes). The recording mode setting unit 163 can determine the recording mode of the imaging apparatus 10 according to the user's instruction input through the operation unit 120. The recording mode setting unit 163 provides information about the set recording mode to the captured video recording unit 164.
The captured video recording unit 164 records video information corresponding to a captured video. The captured video recording unit 164 saves video information generated by the captured video processing unit 162 into the first storage unit 140. Here, as described above, the storage capacity of the first storage unit 140 is designed to be capable of storing a video spanning a certain period of time. Also, when shooting is being continued, the captured video processing unit 162 may sequentially generate video information. Therefore, the captured video recording unit 164 erases videos stored in the first storage unit 140 in chronological order with the oldest first, i.e., performs a sequential updating process. As a result, the first storage unit 140 invariably stores, while continually updating it, the video of the predetermined period of time (a period of time corresponding to the storage capacity of the first storage unit 140) immediately preceding the current time.
Also, the captured video recording unit 164 determines a recording period of time on the basis of information about the set recording mode during shooting. For example, when the recording mode is the first recording mode, a predetermined period of time preceding the current time is a recording period of time. Also, for example, when the recording mode is the second recording mode, a combination of a predetermined period of time preceding the current time and a predetermined period of time following the current time is a recording period of time. Also, for example, when the recording mode is the third recording mode, a predetermined period of time following the current time is a recording period of time. The captured video recording unit 164 reads a video corresponding to a recording start timing corresponding to the recording mode from the first storage unit 140, on the basis of the determined recording period of time, and provides the video to the display control unit 165. Also, the captured video recording unit 164 reads a video during the recording period of time corresponding to the set recording mode from the first storage unit 140, and saves the video into the second storage unit 150, according to a signal indicating start of recording (i.e., a trigger signal) which is input through the operation unit 120.
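The handling of a trigger signal described above may be sketched, purely as a non-limiting illustration, as follows; the function name and arguments are assumptions introduced only for this example, and the second storage unit is represented here simply as a list. In the second recording mode, only the preceding portion of the recording period is taken from the buffered video, as described later for step S109.

```python
def save_recording_period(mode, buffered_preceding_frames,
                          capture_following_frames, second_storage):
    """Save the video of the recording period upon input of a trigger signal.

    mode is "first", "second", or "third"; buffered_preceding_frames are the
    frames of the preceding period read from the first storage unit; calling
    capture_following_frames() returns the frames captured after the trigger.
    """
    recorded = []
    if mode in ("first", "second"):
        # The preceding period of time is taken from the buffered video.
        recorded.extend(buffered_preceding_frames)
    if mode in ("second", "third"):
        # The period following the trigger signal is captured afterwards.
        recorded.extend(capture_following_frames())
    second_storage.append(recorded)
    return recorded
```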
Note that the captured video recording unit 164 may have the function of performing an encoding process, and may encode video information using a predetermined technique (e.g., MPEG2, etc.), and store the encoded video information into the first storage unit 140 and the second storage unit 150. The encoding technique is not limited to this example, and various known techniques may be employed.
Processes of the captured video recording unit 164 corresponding to the recording modes will now be described in sequence.
Firstly, a case where the recording mode is the first recording mode will be described. In the first recording mode, a recording period of time is the same as a preceding period of time (see
Next, a case where the recording mode is the second recording mode will be described. In the second recording mode, a recording period of time includes a preceding period of time and a predetermined period of time following the input timing of a trigger signal (see
Next, a case where the recording mode is the third recording mode will be described. In the third recording mode, a recording period of time does not include a preceding period of time (see
Here, a process of the captured video recording unit 164 in the first recording mode will be described in greater detail with reference to
Referring to
When shooting is started, the captured video recording unit 164 reads a video corresponding to a timing at which a recording period of time starts (in
In the foregoing, the process of the captured video recording unit 164 in the first recording mode has been described in greater detail with reference to
The display control unit 165 controls driving of the display unit 130 so that various kinds of information are displayed on the display unit 130. In this embodiment, when the recording mode is set to the first recording mode or the second recording mode, the display control unit 165 drives the display unit 130 to display a video which is delayed for a predetermined time. In other words, the display control unit 165 has a function of delaying a live view image for a predetermined time and displaying the delayed image. The display control unit 165 can display a delayed video by using a video temporarily stored in the first storage unit 140 (i.e., a buffered video). Here, a time for which the display of a video is delayed may be a time corresponding to a preceding period of time in the set first recording mode or second recording mode. Thus, the display control unit 165 can drive the display unit 130 to display a video which is at a recording start timing corresponding to the recording mode.
Here, in a commonly used existing imaging apparatus, the user performs a trigger operation while viewing a current video (so-called live view image). Therefore, when the recording mode is the first recording mode or the second recording mode, i.e., when a video preceding the timing of a trigger operation is recorded, the user has to perform a trigger operation on the basis of a prediction of the video at the start of recording, made while viewing a live view image that differs from that video, and while taking the preceding period of time into account. A certain skill is therefore necessary for starting recording from a desired timing. Also, unless the recorded video is subsequently checked, it cannot be verified whether or not a video has been recorded from the desired timing, which is inconvenient. Meanwhile, in this embodiment, as described above, the display control unit 165 displays, on the display unit 130, a video which is delayed for a predetermined time, i.e., the video at the recording start timing. Therefore, the user can perform a trigger operation while checking the video at the recording start timing displayed on the display unit 130, and can thereby more easily record a desired video.
Note that the display control unit 165 may drive the display unit 130 to display a live view image together with a video which is delayed for a predetermined time. The display control unit 165 can drive the display unit 130 to display a live view image, by acquiring video information indicating a current state of an object from the captured video processing unit 162. When the user determines when to perform a trigger operation, i.e., during what period of time a video is to be recorded, a current condition of an object may also be a criterion. Therefore, if a live view image is displayed together with a video which is delayed for a predetermined time on the display unit 130, the user can perform a trigger operation at a more appropriate timing while viewing both of the videos. In particular, when the recording mode is the first recording mode, a video which is delayed for a predetermined time corresponds to a video at a recording start timing, and a live view image corresponds to a video at a timing at which a recording period of time ends. Therefore, if a live view image is displayed together with a video which is delayed for a predetermined time on the display unit 130, the user can perform a trigger operation while checking both a video at the start timing of a recording period of time and a video at the end timing of the recording period of time, resulting in a further improvement in the convenience of the user.
Here, when the recording mode is the third recording mode, the display control unit 165 may drive the display unit 130 to display only a live view image without displaying a video which is delayed for a predetermined time. This is because, in the third recording mode, the timing of a trigger operation is a recording start timing, and a video displayed as a live view image is a video at the recording start timing. Thus, in this embodiment, it can be said that the display control unit 165 has the function of driving the display unit 130 to display a video which is at a recording start timing, depending on the recording mode.
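The mode-dependent display behavior of the display control unit 165 described above may be illustrated, in simplified and non-limiting form, as follows; the function name and the dictionary returned are assumptions introduced only for this example, and the return value merely stands in for whatever the display unit 130 is driven to show.

```python
def video_to_display(mode, live_frame, delayed_frame):
    """Select what the display unit is driven to show.

    In the first and second recording modes, the delayed frame (the frame at
    the recording start timing) is shown together with the live view image;
    in the third recording mode, only the live view image is shown."""
    if mode in ("first", "second"):
        return {"live_view": live_frame, "delayed": delayed_frame}
    return {"live_view": live_frame}
```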
Note that the function of the display control unit 165 is not limited to the above example. The display control unit 165 can drive the display unit 130 to display various kinds of information which may be displayed on commonly used digital video cameras, such as a setting screen for setting shooting conditions, a recorded video, and the like. For example, the display control unit 165 can drive the display unit 130 to display a recorded video, by reading recorded video information from the second storage unit 150.
In the foregoing, the functional configuration of the imaging apparatus 10 according to this embodiment has been described with reference to
Also, the configuration shown in
Also, a computer program for implementing each of the above functions of the imaging apparatus 10 according to this embodiment, particularly each function of the control unit 160, can be created and installed in a personal computer (PC) or the like. Also, a computer-readable recording medium storing such a computer program can also be provided. Examples of the recording medium include a magnetic disk, optical disc, magneto-optical disc, flash memory, and the like. Also, the above computer program may be distributed through, for example, a network without using a recording medium.
Next, a display example in this embodiment carried out by the display control unit 165 will be described in greater detail. Here, for comparison, a commonly used display example and a display example according to this embodiment will both be described. Also, as an example, a display example in a case where the recording mode is the first recording mode will be described.
Also, in
Firstly, display examples in a commonly used imaging apparatus will be described with reference to
The user, when recording a video, performs a trigger operation while viewing the display screen 210. In the examples shown in
(3-2. Display Examples According to this Embodiment)
Next, display examples in the imaging apparatus 10 according to this embodiment will be described with reference to
As shown in
For example, as shown in
As shown in
As shown in
As shown in
A procedure for a display control method according to this embodiment will be described with reference to
Referring to
Next, a recording period of time is determined on the basis of the set recording mode (step S103). For example, when the recording mode is the first recording mode, the recording period of time is a predetermined period of time preceding the current time. Also, for example, when the recording mode is the second recording mode, the recording period of time is a combination of a predetermined period of time preceding the current time and a predetermined period of time following the current time. Also, for example, when the recording mode is the third recording mode, the recording period of time is a predetermined period of time following the current time. Note that, for example, the process shown in step S103 corresponds to a process performed by the captured video recording unit 164 shown in
Next, a video corresponding to a recording start timing is read from the first storage unit (first recording medium) on the basis of the determined recording period of time (step S105). When the recording mode is the first recording mode or the second recording mode, and the storage capacity of the first storage unit is a storage capacity corresponding to a video during a preceding period of time in each recording mode, the video corresponding to a recording start timing is a video immediately before being erased from the first storage unit. Meanwhile, when the recording mode is the third recording mode, the video corresponding to a recording start timing is a video at the current time (i.e., a live view image), and therefore, may not be read from the first storage unit. Note that, for example, the process shown in step S105 corresponds to a process performed by the captured video recording unit 164 shown in
Next, the video corresponding to a recording start timing is displayed (step S107). For example, when the recording mode is the first recording mode or the second recording mode, a video which is delayed for a predetermined time is displayed as the video corresponding to a recording start timing. Also, in this case, as shown in
Next, a video during the recording period of time is read from the first storage unit, and is saved into the second storage unit (second recording medium), according to the input of a trigger signal (step S109). For example, when the recording mode is the first recording mode or the second recording mode, the recording period of time includes a preceding period of time. The first storage unit temporarily stores a video spanning a certain period of time. Therefore, by acquiring the video corresponding to the preceding period of time from the first storage unit, the video during the recording period of time in these recording modes (in the second recording mode, the video during a portion of the recording period of time) can be saved into the second storage unit. Meanwhile, when the recording mode is the third recording mode, the recording period of time does not include a preceding period of time; therefore, the process of moving a video from the first storage unit to the second storage unit is not performed, and the most recent video, which is sequentially acquired and corresponds to a live view image, is saved into the second storage unit. Note that, for example, the process shown in step S109 corresponds to a process performed by the captured video recording unit 164 shown in
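Putting the above steps together, one purely illustrative, non-limiting rendering of steps S101 to S109 is the following sketch; all names, the default durations, and the representation of the storage units are assumptions introduced only for this example.

```python
from collections import deque
from itertools import islice


def display_control_procedure(mode, frames, trigger_at,
                              frame_rate=60, preceding_s=3.0, following_s=1.0):
    """Illustrative walk through steps S101 to S109 for one recording.

    mode ("first", "second", or "third") corresponds to step S101; frames is
    an iterator of captured frames; trigger_at is the frame index at which
    the trigger signal is input."""
    # S103: the recording period follows from the mode and the two durations.
    preceding_n = int(frame_rate * preceding_s)
    following_n = int(frame_rate * following_s)
    buffer = deque(maxlen=preceding_n)   # first storage unit (oldest erased first)
    second_storage = []                  # second storage unit

    for index, frame in enumerate(frames):
        buffer.append(frame)
        # S105/S107: display the video at the recording start timing; in the
        # first and second modes this is the oldest buffered (delayed) frame,
        # in the third mode it is the live view image itself.
        to_display = buffer[0] if mode in ("first", "second") else frame
        # (driving of the display unit with to_display is omitted here)
        if index == trigger_at:
            # S109: save the video of the recording period into the second
            # storage unit according to the trigger signal.
            recorded = list(buffer) if mode in ("first", "second") else []
            if mode in ("second", "third"):
                recorded += list(islice(frames, following_n))
            second_storage.append(recorded)
            break
    return second_storage
```

For example, with mode "first", 60 frames per second, a 3-second preceding period, and a trigger at frame index 300, the 180 frames buffered immediately before the trigger are saved.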
In the foregoing, the procedure for the display control method according to this embodiment has been described with reference to
In the above embodiment, as a display example of a video which is delayed for a predetermined time, a case has been described in which a smaller delayed display screen displaying the delayed video is provided in a partial region of a display screen which displays a live view image.
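One purely illustrative way of composing such a partial-region (picture-in-picture) display from a live view frame and a delayed frame is sketched below; the corner position and the size ratio are arbitrary example choices, not features of the embodiment itself.

```python
import numpy as np


def compose_picture_in_picture(live_frame, delayed_frame, ratio=0.3):
    """Place a reduced delayed frame in the lower-right corner of the live
    view frame (both H x W x 3 uint8 arrays)."""
    h, w, _ = live_frame.shape
    small_h = max(1, int(h * ratio))
    small_w = max(1, int(w * ratio))
    # Nearest-neighbour reduction of the delayed frame (no external image
    # library is assumed here).
    rows = np.arange(small_h) * delayed_frame.shape[0] // small_h
    cols = np.arange(small_w) * delayed_frame.shape[1] // small_w
    small = delayed_frame[rows][:, cols]
    composed = live_frame.copy()
    composed[h - small_h:, w - small_w:] = small
    return composed
```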
As in the display examples shown in
However, this embodiment is not limited to this example. A video which is delayed for a predetermined time may be displayed using other methods. Here, as a variation of the display example in this embodiment, a case will be described in which a video which is delayed for a predetermined time is displayed using other methods. Note that a method of displaying a video which is delayed for a predetermined time may be selected, as appropriate, from the display methods according to the above embodiment, display methods according to variations described below, and in addition, various known display methods, taking into consideration the hardware configuration of the imaging apparatus 10, the convenience of the user, or the like.
(5-1. Variation in which Plurality of Display Devices are Provided)
A variation in which a plurality of display devices are provided will be described with reference to
Referring to
The display screen 250 of the display device 240 displays a live view image. Meanwhile, the display screen 280 of the display device 270 displays a video which is delayed for a predetermined time. Here, in the above embodiment, as shown in
(5-2. Variation in which Only Video Delayed for Predetermined Time is Displayed)
A variation in which only a video which is delayed for a predetermined time is displayed will be described with reference to
Referring to
In this variation, a video which is delayed for a predetermined time is displayed, instead of a live view image, on the display screen 250 of the display device 240. According to this variation, compared to the display example shown in
(5-3. Variation in which Live View Image and Video Delayed for Predetermined Time are Superimposed and Displayed Together)
In the above embodiment or the variation described above (5-1. Variation in which plurality of display devices are provided), a plurality of display screens are provided, and a live view image is displayed on one display screen, and a video which is delayed for a predetermined time is displayed on another display screen. However, this embodiment is not limited to this example, and both a live view image and a video which is delayed for a predetermined time may be displayed on a single display screen.
Such a variation in which a live view image and a video which is delayed for a predetermined time are placed and displayed on top of each other (superimposition display) will be described with reference to
Referring to
In this variation, a live view image and a video which is delayed for a predetermined time are placed and displayed on top of each other (so-called superimposition display) on the display screen 250 of the display device 240. In the example shown, an object 251a involved with a live view image and an object 251b involved with a video which is delayed for a predetermined time are both displayed on the display screen 250 of the display device 240. At this time, a display effect which allows the user to intuitively understand that the object 251b is not an image indicating a current state of the object 251, such as transparent display or the like, may be applied to the object 251b involved with a video which is delayed for a predetermined time.
Such superimposition display may be achieved by, for example, combining a video 290 corresponding to a live view image with a video 295 which is delayed for a predetermined time. This combination process may, for example, be performed by the display control unit 165 shown in
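A minimal, non-limiting sketch of such a combination process is shown below; the blending ratio is an arbitrary example value, and a smaller ratio for the delayed video corresponds to the transparent display effect mentioned above.

```python
import numpy as np


def superimpose(live_frame, delayed_frame, alpha=0.4):
    """Blend the delayed video onto the live view image (both H x W x 3
    uint8 arrays of the same size); a smaller alpha makes the delayed video
    appear more transparent."""
    blended = ((1.0 - alpha) * live_frame.astype(np.float32)
               + alpha * delayed_frame.astype(np.float32))
    return blended.astype(np.uint8)
```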
According to this variation, compared to the display example shown in
A hardware configuration of the imaging apparatus according to this embodiment will be described with reference to
The imaging apparatus 900 includes a central processing unit (CPU) 901, a read only memory (ROM) 903, and a random access memory (RAM) 905. In addition, the imaging apparatus 900 may include a host bus 907, a bridge 909, an external bus 911, an interface 913, an input device 915, an output device 917, a storage device 919, a drive 921, a connection port 923, a communication device 925, an imaging mechanism 933, and a sensor 935. The imaging apparatus 900 may include a processing circuit called a digital signal processor (DSP) or application specific integrated circuit (ASIC), instead of or in addition to the CPU 901.
The CPU 901 functions as a calculation processor and a controller, and controls a portion of or all operations in the imaging apparatus 900 in accordance with various programs recorded in the ROM 903, the RAM 905, the storage device 919, or a removable recording medium 927. The ROM 903 stores programs, calculation parameters, and the like which are used by the CPU 901. The RAM 905 serves as primary storage for programs which are used in execution of the CPU 901, parameters which change, as appropriate, in the execution, and the like. The CPU 901, the ROM 903, and the RAM 905 are connected to each other by the host bus 907 which includes an internal bus, such as a CPU bus or the like. In addition, the host bus 907 is connected to the external bus 911, such as a peripheral component interconnect/Interface (PCI) bus or the like, via the bridge 909. In this embodiment, for example, the CPU 901 may be included in the control unit 160 shown in
The input device 915 is a device which is operated by the user, such as a mouse, keyboard, touchscreen, button, switch, lever, or the like. For example, the input device 915 may be a remote controller using infrared light or other radio waves, or may be an externally connected device 929 operable in response to the operation of the imaging apparatus 900, such as a mobile phone or the like. The input device 915 includes an input control circuit which generates an input signal on the basis of information which is input by the user, and outputs the input signal to the CPU 901. By operating the input device 915, the user can input various types of data to the imaging apparatus 900 or instruct the imaging apparatus 900 to perform a processing operation. The input device 915 may, for example, be included in the operation unit 120 shown in
The output device 917 includes a device capable of visually or audibly notifying the user of acquired information. The output device 917 may, for example, be a display device, such as a liquid crystal display (LCD), plasma display panel (PDP), organic EL display, lamp, projector, light, or the like, an audio output device, such as a loudspeaker, headphone, or the like, a printer, or the like. The output device 917 may output a result obtained from the process of the imaging apparatus 900 in the form of video, such as text or an image, or in the form of audio, such as voice or sound. In this embodiment, the display device may, for example, be included in the display unit 130 shown in
The storage device 919 is a device for data storage which is configured as an example of a storage unit of the imaging apparatus 900. The storage device 919 includes, for example, a magnetic storage device, such as a hard disk drive (HDD) or the like, a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like. The storage device 919 stores programs to be executed by the CPU 901, various kinds of data, various kinds of data obtained from the outside, and the like. In this embodiment, for example, the storage device 919 may be included in the first storage unit 140 and the second storage unit 150 shown in
The drive 921 is a reader/writer for the removable recording medium 927, such as a magnetic disk, optical disk, magneto-optical disk, semiconductor memory, or the like, and is internal or external to the imaging apparatus 900. The drive 921 reads information recorded in the removable recording medium 927 attached thereto, and outputs the read information to the RAM 905. Also, the drive 921 writes records to the removable recording medium 927 attached thereto. In this embodiment, for example, the drive 921 can read and write various kinds of information which are processed by the control unit 160 shown in
The connection port 923 is a port used to directly connect a device to the imaging apparatus 900. The connection port 923 may be a Universal Serial Bus (USB) port, an IEEE 1394 port, a Small Computer System Interface (SCSI) port, or the like. Alternatively, the connection port 923 may be an RS-232C port, optical audio terminal, High-Definition Multimedia Interface (HDMI) (registered trademark) port, or the like. The connection of the externally connected device 929 to the connection port 923 allows exchange of various kinds of data between the imaging apparatus 900 and the externally connected device 929. In this embodiment, for example, various kinds of information which are processed by the control unit 160 shown in
The communication device 925 is, for example, a communication interface including a communication device for connection to a communication network 931, and the like. The communication device 925 may be, for example, a communication card for a wired or wireless local area network (LAN), Bluetooth (registered trademark), or wireless USB (WUSB), or the like. Alternatively, the communication device 925 may be a router for optical communication, a router for Asymmetric Digital Subscriber Line (ADSL), various modems for communication, or the like. For example, the communication device 925 transmits and receives a signal to and from the Internet or other communication devices in accordance with a predetermined protocol, such as TCP/IP or the like. In addition, the communication network 931 connected to the communication device 925 may be a network connected in a wired or wireless manner, and may be, for example, the Internet, a home LAN, infrared communication, radio wave communication, satellite communication, or the like. In this embodiment, for example, the communication device 925 may transmit and receive various kinds of information which are processed by the control unit 160 shown in
The imaging mechanism 933 is a mechanism which generates a captured image by imaging a real space using an imaging device, such as a CCD image sensor, CMOS image sensor, or the like, and various members, such as a lens for controlling the formation of an image of an object on the image sensor, and the like. The imaging mechanism 933 may be a device which preferably captures a moving image (video). In this embodiment, for example, the imaging mechanism 933 may be included in the imaging unit 110 shown in
The sensor 935 is any of various sensors, such as an acceleration sensor, gyroscopic sensor, geomagnetic sensor, optical sensor, sound sensor, ranging sensor, and the like. The sensor 935 acquires information about a state of the imaging apparatus 900 itself, such as the orientation of the housing of the imaging apparatus 900, or the like, or information about an environment around the imaging apparatus 900, such as brightness or noise around the imaging apparatus 900, or the like. The sensor 935 may also include a global positioning system (GPS) sensor which receives a GPS signal, and measures the latitude, longitude, and altitude of the apparatus. In this embodiment, various kinds of information acquired by the sensor 935 may be used to achieve various known functions of commonly used imaging apparatuses, such as a so-called camera-shake correction function, a function of automatically determining imaging conditions (exposure, shutter speed, etc.) in an auto mode, and the like.
In the foregoing, a hardware configuration example of the imaging apparatus 900 has been described. Each of the above components may be configured using general-purpose members, or alternatively, may be configured by hardware specialized in the function of each component. Such a configuration may be modified as appropriate according to the state of the art at the time of the implementation.
Note that it is also possible to develop a computer program for realizing the respective functions of the imaging apparatus 900 as discussed above, and implement the computer program in a personal computer or the like. In addition, a computer-readable recording medium storing such a computer program may also be provided. The recording medium may be a magnetic disc, an optical disc, a magneto-optical disc, or flash memory, for example. Furthermore, the above computer program may also be delivered via a network, for example, without using a recording medium.
The preferred embodiment(s) of the present disclosure has/have been described above with reference to the accompanying drawings, whilst the present disclosure is not limited to the above examples. A person skilled in the art may find various alterations and modifications within the scope of the appended claims, and it should be understood that they will naturally come under the technical scope of the present disclosure.
Further, the effects described in this specification are merely illustrative or exemplified effects, and are not limitative. That is, with or in the place of the above effects, the technology according to the present disclosure may achieve other effects that are clear to those skilled in the art based on the description of this specification.
Additionally, the present technology may also be configured as below.
(1)
A display control apparatus including:
The display control apparatus according to (1), further including:
The display control apparatus according to (2),
The display control apparatus according to any one of (1) to (3), further including:
The display control apparatus according to any one of (1) to (4),
The display control apparatus according to (5),
The display control apparatus according to (6),
The display control apparatus according to (6),
The display control apparatus according to (5),
An imaging apparatus including:
A display control method including:
This application is a U.S. National Phase of International Patent Application No. PCT/JP2015/066269 filed on Jun. 4, 2015, which claims priority benefit of Japanese Patent Application No. JP 2014-149785 filed in the Japan Patent Office on Jul. 23, 2014. Each of the above-referenced applications is hereby incorporated herein by reference in its entirety.