The present disclosure relates to an image processing apparatus, and more particularly, to an image processing apparatus and an image processing method of displaying a stereoscopic image and a program allowing a computer to execute the method.
In recent years, image capturing apparatuses such as a digital still camera or a digital video camera (for example, a camera-integrated recorder) which capture an image of a subject such as a person or an animal to generate a captured image (image data) and record the captured image as image content have become widespread.
In addition, recently, a number of stereoscopic image displaying methods of displaying a stereoscopic image that can be viewed stereoscopically by using the parallax between the left and right eyes have been disclosed. In addition, image capturing apparatuses such as a digital still camera or a digital video camera (for example, a camera-integrated recorder) which record image data used for displaying a stereoscopic image as image content (stereoscopic image content) have been disclosed.
Since stereoscopic image content is recorded by the image capturing apparatus in this manner, it is conceivable that, for example, the recorded stereoscopic image content is displayed on a display unit of the image capturing apparatus. For example, an information apparatus having a stereoscopic image display mode of displaying a stereoscopic image, which is configured by using two images generated through an image capturing operation, on a display unit has been disclosed (refer to, for example, Japanese Unexamined Patent Application Publication No. 2004-112111 (FIGS. 4A and 4B)).
In the related art, the stereoscopic image generated through the image capturing operation can be displayed on the display unit of the information apparatus.
However, the user may, for example, rotate the stereoscopic image that is to be displayed on the display unit. In this case, the parallax direction of the stereoscopic image and the parallax direction of the display unit, on which the stereoscopic image is to be displayed, may not be coincident with each other, so that the stereoscopic image may not be displayed properly. In this manner, due to rotation or the like of the stereoscopic image according to user manipulation, the stereoscopic image may not be displayed properly, and the image may not be properly viewed by the user.
It is desirable to properly display an image at the time of displaying a stereoscopic image.
According to a first embodiment of the present disclosure, there are provided an image processing apparatus including: an acquisition unit which acquires stereoscopic image information used for displaying a stereoscopic image on a display unit; a controller which performs one of a first control of allowing the stereoscopic image to be displayed as a planar image on the display unit and a second control of performing an image process on the stereoscopic image so that a parallax direction of the stereoscopic image and a parallax direction of the display unit are coincident with each other and allowing the stereoscopic image, which is subject to the image process, to be displayed on the display unit in the case where a parallax direction of the stereoscopic image displayed on the display unit and a parallax direction of the display unit are not coincident with each other based on the stereoscopic image information, an image processing method thereof, and a program allowing a computer to execute the method. Accordingly, in the case where the parallax direction of the stereoscopic image displayed on the display unit and the parallax direction of the display unit are not coincident with each other, it is possible to obtain a function of performing one of the first control of allowing the stereoscopic image to be displayed as a planar image and the second control of performing an image process on the stereoscopic image so that the parallax direction of the stereoscopic image and the parallax direction of the display unit are coincident with each other and allowing the stereoscopic image, which is subject to the image process, to be displayed.
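The branch among these controls can be sketched as follows; this is only an illustration of the decision described above, and the function and value names are hypothetical rather than taken from the disclosure.

```python
def select_display_control(image_parallax: str,
                           display_parallax: str,
                           prefer_stereoscopic: bool) -> str:
    """Choose how to display the stereoscopic image when the parallax
    directions may not be coincident (illustrative sketch only)."""
    if image_parallax == display_parallax:
        # parallax directions coincide: display the stereoscopic image as-is
        return "display_stereoscopic_as_is"
    if prefer_stereoscopic:
        # second control: rotate the image so the directions coincide, then display
        return "second_control_rotate_then_display"
    # first control: display the stereoscopic image as a planar image
    return "first_control_display_planar"
```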
In addition, in the first embodiment of the present disclosure, in the case of performing the second control, the controller may perform control of performing a rotation process on the stereoscopic image so that the parallax direction of the stereoscopic image and the parallax direction of the display unit are coincident with each other and allowing the stereoscopic image, which is subject to the rotation process, to be displayed on the display unit. Accordingly, in the case where the second control is performed, it is possible to obtain a function of performing a rotation process on the stereoscopic image so that the parallax direction of the stereoscopic image and the parallax direction of the display unit are coincident with each other and allowing the stereoscopic image, which is subject to the rotation process, to be displayed on the display unit.
In addition, in the first embodiment of the present disclosure, the image processing apparatus may further include a manipulation receiving unit which receives rotation command manipulation for rotating the stereoscopic image which is to be displayed on the display unit, wherein if the rotation command manipulation is received in the case where the stereoscopic image is displayed on the display unit, the controller performs the first control in the case where the parallax direction of the stereoscopic image and the parallax direction of the display unit are not coincident with each other. Accordingly, if the rotation command manipulation is received in the case where the stereoscopic image is displayed on the display unit, it is possible to obtain a function of performing the first control in the case where the parallax direction of the stereoscopic image and the parallax direction of the display unit are not coincident with each other.
In addition, in the first embodiment of the present disclosure, the manipulation receiving unit may receive returning command manipulation for returning the rotation, which is based on the rotation command manipulation after receiving the rotation command manipulation, to an original state, and after the returning command manipulation is received in the case where the stereoscopic image is displayed on the display unit, the controller may perform the second control in the case where the parallax direction of the stereoscopic image and the parallax direction of the display unit are not coincident with each other. Accordingly, in the case where the stereoscopic image is displayed on the display unit, after the returning command manipulation is received, it is possible to obtain a function of performing the second control in the case where the parallax direction of the stereoscopic image and the parallax direction of the display unit are not coincident with each other.
In addition, in the first embodiment of the present disclosure, the stereoscopic image may be configured by multi-viewing-point images, and in the case where the first control is performed, the controller may perform control of allowing at least one viewing-point image among the multi-viewing-point images to be displayed on the display unit. Accordingly, in the case where the first control is performed, it is possible to obtain a function of allowing at least one viewing-point image among the multi-viewing-point images to be displayed.
In addition, in the first embodiment of the present disclosure, the stereoscopic image information may include parallax information indicating the parallax direction of the stereoscopic image, which is displayed on the display unit based on the stereoscopic image information, at an image capturing operation time, and the controller may determine based on the parallax information included in the acquired stereoscopic image information whether or not the parallax direction of the stereoscopic image and the parallax direction of the display unit are coincident with each other. Accordingly, it is possible to obtain a function of determining based on the parallax information included in the stereoscopic image information whether or not the parallax direction of the stereoscopic image and the parallax direction of the display unit are coincident with each other.
In addition, in the first embodiment of the present disclosure, the image processing apparatus may further include: a first casing having the display unit; a second casing which is a casing different from the first casing; a rotating member which rotatably connects the first casing and the second casing; and a detection unit which detects a rotation state of the first casing with respect to the second casing, wherein the stereoscopic image information includes parallax information indicating the parallax direction of the stereoscopic image, which is displayed on the display unit based on the stereoscopic image information, at an image capturing operation time, and wherein the controller determines based on the parallax information included in the acquired stereoscopic image information and the detected rotation state of the first casing whether or not the parallax direction of the stereoscopic image and the parallax direction of the display unit are coincident with each other. Accordingly, it is possible to obtain a function of determining based on the parallax information included in the stereoscopic image information and the detected rotation state of the first casing whether or not the parallax direction of the stereoscopic image and the parallax direction of the display unit are coincident with each other.
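A minimal sketch of this determination is shown below, assuming that both the parallax information recorded at the image capturing time and the detected rotation state of the casing are expressed as angles; this encoding and the function name are assumptions made only for illustration.

```python
def parallax_directions_coincide(capture_parallax_deg: int,
                                 casing_rotation_deg: int) -> bool:
    """Return True when the parallax direction recorded at capture time
    (0 = horizontal) matches the parallax direction of the display unit
    after the casing has been rotated (detected in units of 90 degrees)."""
    # a parallax axis repeats every 180 degrees, so compare modulo 180
    return (capture_parallax_deg % 180) == (casing_rotation_deg % 180)
```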
In addition, in the first embodiment of the present disclosure, the display unit may be set so that one of a specific direction of the display screen and a direction orthogonal thereto on the display screen is the parallax direction, and the controller may perform control of changing the parallax direction of the display unit based on the detected rotation state of the first casing. Accordingly, it is possible to obtain a function of changing the parallax direction of the display unit based on the detected rotation state of the first casing.
In addition, in the first embodiment of the present disclosure, in the case where the parallax direction of the stereoscopic image displayed on the display unit and the parallax direction of the display unit are not coincident with each other, the controller may perform one of the first control, the second control, and a third control of changing the parallax direction of the display unit so that the parallax direction of the display unit is coincident with the parallax direction of the stereoscopic image and allowing the stereoscopic image to be displayed on the display unit. Accordingly, in the case where the parallax direction of the stereoscopic image displayed on the display unit and the parallax direction of the display unit are not coincident with each other, it is possible to obtain a function of performing one of the first control, the second control, and the third control of changing the parallax direction of the display unit so that the parallax direction of the display unit is coincident with the parallax direction of the stereoscopic image and allowing the stereoscopic image to be displayed on the display unit.
In addition, in the first embodiment of the present disclosure, the display unit may be set so that one of a specific direction of the display screen and a direction orthogonal thereto on the display screen is the parallax direction, and the controller may change the parallax direction of the display unit based on user manipulation or a posture of the display unit and determine whether or not the changed parallax direction of the display unit and the parallax direction of the stereoscopic image are coincident with each other. Accordingly, it is possible to obtain a function of changing the parallax direction of the display unit based on the user manipulation or the posture of the display unit and determining whether or not the changed parallax direction of the display unit and the parallax direction of the stereoscopic image are coincident with each other.
In addition, in the first embodiment of the present disclosure, the image processing apparatus may further include a manipulation receiving unit which receives selection manipulation for selecting whether the controller is allowed to perform the first control or the controller is allowed to perform the second control in the case where the parallax direction of the stereoscopic image displayed on the display unit and the parallax direction of the display unit are not coincident with each other, wherein in the case where the parallax direction of the stereoscopic image displayed on the display unit and the parallax direction of the display unit are not coincident with each other, the controller allows the image corresponding to the acquired stereoscopic image information to be displayed on the display unit according to the selected control. Accordingly, in the case where the parallax direction of the stereoscopic image displayed on the display unit and the parallax direction of the display unit are not coincident with each other, it is possible to obtain a function of displaying the image corresponding to the acquired stereoscopic image information according to the selected control.
In addition, in the first embodiment of the present disclosure, the image processing apparatus may further include: a detection unit which detects movement amounts and movement directions of a plurality of areas of the first image with respect to the second image based on the first image and the second image; and a composition unit which moves images of a plurality of areas of the second image based on the detected movement amounts and movement directions of the areas of the first image and generates a composed image based on the moved images, wherein in the case where the second control is performed, the controller allows the generated composed image and the second image to be displayed as the stereoscopic image on the display unit. Accordingly, it is possible to obtain a function of detecting the movement amounts and movement directions of the plurality of areas of the first image with respect to the second image based on the first image and the second image, moving the images of the plurality of areas of the second image based on the detected movement amounts and movement directions of the areas of the first image, generating the composed image based on the moved images, and allowing the generated composed image and the second image to be displayed as the stereoscopic image in the case where the second control is performed.
In addition, in the first embodiment of the present disclosure, the image processing apparatus may further include an image capturing unit which image-captures a subject to generate a first image and a second image used for displaying the stereoscopic image for stereoscopically viewing the subject; a detection unit which detects movement amounts and movement directions of a plurality of areas of the first image with respect to the second image based on the generated first and second images; a composition unit which moves images of a plurality of areas of the second image based on the detected movement amounts and movement directions of the areas of the first image and generates a composed image based on the moved images; and a recording control unit which allows the generated composed image and the second image to be recorded as multi-viewing-point images included in the stereoscopic image information on a recording medium. Accordingly, it is possible to obtain a function of detecting the movement amounts and movement directions of the plurality of areas of the first image with respect to the second image based on the first image and the second image, moving the images of the plurality of areas of the second image based on the detected movement amounts and movement directions of the areas of the first image, generating the composed image based on the moved images, and allowing the generated composed image and the second image to be recorded as the multi-viewing-point image.
In addition, in the first embodiment of the present disclosure, the image processing apparatus may further include an image capturing unit which image-captures a subject to generate multi-viewing-point images used for displaying the stereoscopic image for stereoscopically viewing the subject; an image cutting unit which cuts a predetermined area of at least one end portion side among the two end portions in the longitudinal direction in each of the generated multi-viewing-point images; and a recording control unit which allows the multi-viewing-point images, in which the predetermined area is cut, to be included in the stereoscopic image information and to be recorded on a recording medium. Accordingly, it is possible to obtain a function of cutting a predetermined area of at least one end portion side among the two end portions in the longitudinal direction in each of the generated multi-viewing-point images and allowing the multi-viewing-point images, in which the predetermined area is cut, to be recorded.
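The cutting of an end portion can be illustrated roughly as follows; the longitudinal direction is assumed to be the width axis of the image, and the cut width is an arbitrary illustrative parameter, since the disclosure does not specify concrete values.

```python
import numpy as np

def cut_longitudinal_end(view: np.ndarray, cut_ratio: float = 0.1,
                         both_ends: bool = False) -> np.ndarray:
    """Remove a predetermined area at one (or both) end portion(s) in the
    longitudinal direction of a viewing-point image (illustrative sketch)."""
    height, width = view.shape[:2]
    cut = int(width * cut_ratio)
    if both_ends:
        return view[:, cut:width - cut]
    return view[:, :width - cut]
```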
In addition, in the first embodiment of the present disclosure, the image processing apparatus may further include an image capturing unit which image-captures a subject to generate a plurality of sets of image groups where sets of multi-viewing-point images used for displaying the stereoscopic image for stereoscopically viewing the subject are consecutively disposed in a time sequence; a composition unit which performs composition by using at least a portion of each of the plurality of the generated sets of the image groups to generate a plurality of composed images used for displaying the stereoscopic image for stereoscopically viewing the subject; and a recording control unit which allows the plurality of generated composed images to be recorded as multi-viewing-point images in the stereoscopic image information on a recording medium. Accordingly, it is possible to obtain a function of performing the composition by using at least a portion of each of the plurality of the generated sets of the image groups to generate the plurality of composed images and allowing the plurality of generated composed images to be recorded as the multi-viewing-point images.
In addition, according to a second embodiment of the present disclosure, there are provided an image processing apparatus including: a parallax direction acquisition unit which acquires a parallax direction of a user; an acquisition unit which acquires stereoscopic image information used for displaying a stereoscopic image on a display unit; and a controller which performs one of a first control of allowing the stereoscopic image to be displayed as a planar image on the display unit, a second control of performing an image process on the stereoscopic image so that a parallax direction of the stereoscopic image and the acquired parallax direction are coincident with each other and allowing the stereoscopic image, which is subject to the image process, to be displayed on the display unit, and a third control of changing the parallax direction of the display unit so that the parallax direction of the stereoscopic image and the acquired parallax direction are coincident with each other and allowing the stereoscopic image to be displayed on the display unit in the case where the parallax direction of the stereoscopic image displayed on the display unit and the acquired parallax direction are not coincident with each other based on the stereoscopic image information, an image processing method, and a program allowing a computer to execute the method. Accordingly, in the case where the parallax direction of the stereoscopic image displayed on the display unit and the parallax direction of the user are not coincident with each other, it is possible to obtain a function of performing one of the first control of allowing the stereoscopic image to be displayed as a planar image, the second control of performing an image process on the stereoscopic image so that the parallax direction of the stereoscopic image and the parallax direction of the user are coincident with each other and allowing the stereoscopic image, which is subject to the image process, to be displayed, and the third control of changing the parallax direction of the display unit so that the parallax direction of the stereoscopic image and the parallax direction of the user are coincident with each other and allowing the stereoscopic image to be displayed.
In addition, according to a third embodiment of the present disclosure, there are provided an image processing apparatus including: an acquisition unit which acquires stereoscopic image information used for displaying a stereoscopic image on a display unit; and a controller which performs control of allowing the stereoscopic image to be displayed as a planar image on the display unit in the case where the parallax direction of the stereoscopic image displayed on the display unit and the parallax direction of the display unit are not coincident with each other based on the stereoscopic image information, an image processing method, and a program allowing a computer to execute the method. Accordingly, in the case where the parallax direction of the stereoscopic image displayed on the display unit and the parallax direction of the display unit are not coincident with each other, it is possible to obtain a function of allowing the stereoscopic image to be displayed as a planar image.
In addition, according to a fourth embodiment of the present disclosure, there are provided an image processing apparatus including: an acquisition unit which acquires stereoscopic image information used for displaying a stereoscopic image on a display unit; and a controller which performs control of performing an image process on the stereoscopic image so that a parallax direction of the stereoscopic image and a parallax direction of the display unit are coincident with each other and allowing the stereoscopic image, which is subject to the image process, to be displayed on the display unit in the case where the parallax direction of the stereoscopic image displayed on the display unit and the parallax direction of the display unit are not coincident with each other based on the stereoscopic image information, an image processing method, and a program allowing a computer to execute the method. Accordingly, in the case where the parallax direction of the stereoscopic image displayed on the display unit and the parallax direction of the display unit are not coincident with each other, it is possible to obtain a function of performing the image process on the stereoscopic image so that the parallax direction of the stereoscopic image and the parallax direction of the display unit are coincident with each other and allowing the stereoscopic image, which is subject to the image process, to be displayed.
According to the present disclosure, at the time of displaying the stereoscopic image, it is possible to obtain an excellent effect in that the image can be displayed properly.
Hereinafter, embodiments for carrying out the present disclosure (hereinafter, referred to as embodiments) will be described. The description will be made in the following order.
1. First Embodiment (Display Control: Example of Displaying Image Based on User Settings in Case where Parallax Direction of Display Unit and Parallax Direction of Stereoscopic Image Displayed on Display Unit are Not Coincident with each other)
2. Modified Example
3. Modified Example
The image capturing apparatus 100 includes a shutter button 111, a display unit 170, a left-eye image capturing unit 210, and a right-eye image capturing unit 220. The image capturing apparatus 100 is an image capturing apparatus capable of image-capturing the subject to generate a captured image (image data) and recording the generated captured image as image content (still image content or moving image content) in a content storage unit 200 (illustrated in
The image capturing apparatus 100 includes a first casing 101 and a second casing 102. In addition, the first casing 101 and the second casing 102 are rotatably connected to each other by using a rotating member 103 (indicated by a dotted line) as a rotation reference. Accordingly, a relative position relationship of the second casing 102 with respect to the first casing 101 may be changed. For example, in the case where the second casing 102 is rotated by 90 degrees in the direction of arrow 104 illustrated in
Herein, in the first embodiment of the present disclosure, as illustrated in
The first casing 101 includes a shutter button 111, a left-eye image capturing unit 210, and a right-eye image capturing unit 220.
The shutter button 111 is a manipulation member for commanding the start of image recording. For example, in the case where a still image capturing mode is set, the shutter button 111 is pressed to record the image data generated by the left-eye image capturing unit 210 and the right-eye image capturing unit 220 as a still image file on a recording medium.
The left-eye image capturing unit 210 and the right-eye image capturing unit 220 are configured to image-capture the subject to generate image data. As illustrated in
The second casing 102 includes a display unit 170. The display unit 170 is a display unit for displaying various images. For example, an image corresponding to the image content stored in the content storage unit 200 (illustrated in
In addition, the left-eye image capturing unit 210 and the right-eye image capturing unit 220 are described in detail with reference to
The content storage unit 200 is configured to store the images, which are output from the captured-image signal processing unit 230, in a correspondence manner as an image file (the image content) based on control of the recording control unit 260. In addition, the content storage unit 200 supplies the stored image content to the content acquisition unit 130. In addition, as the content storage unit 200, for example, a removable recording medium (one or a plurality of recording media) such as a disc such as a DVD (Digital Versatile Disc) or a semiconductor memory such as a memory card may be used. In addition, such a recording medium may be built into the image capturing apparatus 100, or may be detachably provided to the image capturing apparatus 100.
The left-eye image capturing unit 210 and the right-eye image capturing unit 220 are configured so that a pair of left and right optical systems and a pair of left and right image capturing devices are disposed in order to generate the left-eye viewing image and the right-eye viewing image. In addition, configurations (lens, image capturing device, and the like) of the left-eye image capturing unit 210 and the right-eye image capturing unit 220 are substantially the same except that the arrangement positions are different. Therefore, hereinafter, with respect to one of the left and right configurations, some portions thereof are omitted in the description. In addition, the left-eye image capturing unit 210 and the right-eye image capturing unit 220 are examples of an image capturing unit disclosed in the embodiments of the present disclosure.
The left-eye image capturing unit 210 includes a lens 211 and an image capturing device 212. In addition, the right-eye image capturing unit 220 includes a lens 221 and an image capturing device 222. In addition, in
The lens 211 is a lens group (for example, a focus lens and a zoom lens) which condenses light incident from a subject. The light condensed by the lens group is incident on the image capturing device 212 with the amount (light amount) being adjusted by a diaphragm (not shown).
The image capturing device 212 is an image capturing device which performs a photoelectric conversion process on incident light transmitting through the lens 211 and supplies the photoelectrically-converted electrical signal (image signal) to the captured-image signal processing unit 230. In other words, the image capturing device 212 receives light incident from the subject through the lens 211 and performs photoelectric conversion to generate an analog image signal according to a received light amount. In addition, the image capturing device 212 and the image capturing device 222 (the right-eye image capturing unit 220) form images through synchronization driving with respect to the subject images incident through the lenses to generate the analog image signals. In this manner, the analog image signal generated by the image capturing device 212 and the analog image signal generated by the image capturing device 222 are supplied to the captured-image signal processing unit 230. In addition, as the image capturing devices 212 and 222, a CCD (Charge Coupled Device), a CMOS (Complementary Metal-Oxide Semiconductor), or the like may be used.
The captured-image signal processing unit 230 is a captured-image signal processing unit which applies various signal processes to the analog image signals supplied from the image capturing devices 212 and 222 based on control of the controller 120. Next, the captured-image signal processing unit 230 outputs digital image signals (left-eye viewing image and right-eye viewing image), which are generated through the various signal processes, to the recording control unit 260. For example, the captured-image signal processing unit 230 generates the stereoscopic image (vertically long stereoscopic image) of which the parallax direction is the horizontal direction and of which the longitudinal direction is the vertical direction based on the control of the controller 120. In addition, a vertically long stereoscopic image generating method will be described in detail with reference to
The image capturing parallax direction detection unit 240 detects the parallax direction at the image capturing operation time and outputs the detected parallax direction (image capturing parallax direction) to the recording control unit 260. In addition, in the image capturing operation at the normal time, the horizontal direction of the captured image is detected as the parallax direction.
The image capturing posture detection unit 250 detects acceleration, motion, tilt, or the like of the image capturing apparatus 100 to detect a change of the posture of the image capturing apparatus 100 at the image capturing operation time and acquires the posture information (image capturing posture) of the image capturing time based on a result of the detection. Next, the image capturing posture detection unit 250 outputs the acquired image capturing posture (for example, a rotation angle (for example, 0 degree, 90 degrees, 180 degrees, or 270 degrees) using the optical axis direction as a rotation axis) to the recording control unit 260. In addition, the image capturing posture detection unit 250 may be implemented by a gyro sensor (angular velocity sensor) or an acceleration sensor.
The recording control unit 260 is configured to record the images, which are output from the captured-image signal processing unit 230, as an image file (image content) in the content storage unit 200 based on control of the controller 120. For example, in the case where the still image recording command manipulation is received by the manipulation receiving unit 110, the recording control unit 260 allows the left-eye viewing image and the right-eye viewing image to be recorded in a correspondence manner as a still image file (still image content) in the content storage unit 200. At the recording time, attribute information including the date information, the image capturing parallax direction (parallax information), the image capturing posture, and the like of the image capturing time is recorded in the image file (for example, as rotation information or the like of Exif (Exchangeable Image File Format)). In addition, the still image recording command manipulation is performed, for example, by the pressing manipulation of the shutter button 111 (illustrated in
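The attribute information recorded with each file might be modeled as in the following sketch; the field names are assumptions for illustration, and the concrete Exif tags actually used are not reproduced here.

```python
def build_attribute_information(date_info: str,
                                parallax_direction_deg: int,
                                capture_posture_deg: int) -> dict:
    """Illustrative structure of the attribute information recorded together
    with the still image file."""
    return {
        "date": date_info,                              # date information of the image capturing time
        "parallax_direction": parallax_direction_deg,   # image capturing parallax direction (parallax information)
        "capture_posture": capture_posture_deg,         # rotation angle about the optical axis (0/90/180/270)
    }
```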
In addition, for example, the case where the moving image recording command manipulation is received by the manipulation receiving unit 110 is considered. In this case, the recording control unit 260 allows the left-eye viewing image and the right-eye viewing image which are output at a predetermined frame rate from the captured-image signal processing unit 230 to be sequentially recorded as a moving image file (moving image content) in the content storage unit 200. In addition, the moving image recording command manipulation is performed, for example, by the pressing manipulation of the recording button.
The manipulation receiving unit 110 is a manipulation receiving unit which receives manipulation input of the user and supplies a manipulation signal according to the content of the received manipulation input to the controller 120. For example, in the stereoscopic image display mode, the manipulation receiving unit 110 receives setting manipulation for setting content of control which is to be preferentially performed at the time of displaying the stereoscopic image on the display unit 170. In addition, for example, the manipulation receiving unit 110 receives setting manipulation for setting the stereoscopic image recording mode or command manipulation for commanding image recording.
In addition, for example, the manipulation receiving unit 110 receives rotation command manipulation for rotating the stereoscopic image which is to be displayed on the display unit 170. In addition, for example, the manipulation receiving unit 110 receives returning command manipulation for returning the rotation based on the rotation command manipulation to the original state after the reception of the rotation command manipulation. The image processing unit 150 performs an image process on the stereoscopic image, which is to be displayed on the display unit 170, based on the command manipulation.
The controller 120 is configured to control components of the image capturing apparatus 100 based on the manipulation content from the manipulation receiving unit 110. For example, in the case where the setting manipulation for setting the content of control which is to be preferentially performed is received by the manipulation receiving unit 110, the controller 120 allows preference information according to the setting manipulation to be retained in the preference information retention unit 121.
In addition, for example, in the case where the stereoscopic image is displayed on the display unit 170 based on the image content (stereoscopic image content), the controller 120 determines whether or not the parallax direction of the stereoscopic image displayed on the display unit 170 and the parallax direction of the display unit 170 are coincident with each other. For example, the controller 120 determines whether or not the two parallax directions are coincident with each other based on the image capturing parallax direction included in the attribute information (attribute information included in the image content) acquired by the attribute information acquisition unit 140 and the rotation state of the display unit 170 (the first casing 101). In addition, in the case where the display unit 170 (the first casing 101) is in the horizontally long state, it is determined based on the image capturing parallax direction included in the attribute information whether or not the parallax direction of the stereoscopic image and the parallax direction of the display unit 170 are coincident with each other.
Next, if the parallax direction of the stereoscopic image displayed on the display unit 170 and the parallax direction of the display unit 170 are not coincident with each other, the controller 120 performs one of a first control and a second control. The first control is a control for allowing the stereoscopic image to be displayed as a planar image on the display unit 170. In addition, the second control is a control for allowing the image processing unit 150 to perform an image process on the stereoscopic image so that the parallax direction of the stereoscopic image and the parallax direction of the display unit 170 are coincident with each other and for allowing the stereoscopic image, which is subject to the image process, to be displayed on the display unit 170. In addition, in the case of performing the second control, for example, the image processing unit 150 is allowed to perform the rotation process on the stereoscopic image so that the parallax direction of the stereoscopic image and the parallax direction of the display unit 170 are coincident with each other, and the stereoscopic image, which is subject to the rotation process, is allowed to be displayed on the display unit 170.
In addition, in the case where the parallax direction of the stereoscopic image displayed on the display unit 170 and the parallax direction of the display unit 170 are not coincident with each other, which one of the first control and the second control is to be performed may be set, for example, through a setting screen 330 illustrated in
Herein, with respect to the display unit 170, any one of a specific direction (for example, the longitudinal direction) of the display screen and a direction orthogonal thereto on the display screen may be set as the parallax direction. Whether the parallax direction is changed according to the posture of the display unit 170 or fixed irrespective of the posture of the display unit 170 may be set by the user manipulation. For example, in the case where changing according to the posture of the display unit 170 is set, the controller 120 performs control for changing the parallax direction of the display unit 170 based on the rotation state of the display unit 170 (the first casing 101) detected by the posture-of-display-unit detection unit 180. In this manner, in the case where the parallax direction of the display unit 170 is changed, it is determined whether or not the changed parallax direction of the display unit 170 and the parallax direction of the stereoscopic image are coincident with each other.
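The setting described above might be modeled as follows; the posture encoding and the names are assumptions for illustration.

```python
def display_unit_parallax_direction(posture: str,
                                    follow_posture: bool,
                                    fixed_direction: str = "horizontal") -> str:
    """Return the parallax direction of the display unit: follow the detected
    posture of the display unit when so configured, otherwise keep the fixed
    setting (illustrative sketch; postures are 'horizontally_long' or
    'vertically_long')."""
    if not follow_posture:
        return fixed_direction
    return "horizontal" if posture == "horizontally_long" else "vertical"
```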
In addition, in the case where the fixing thereof irrespective of the posture of the display unit 170 is set, the case where the parallax direction of the stereoscopic image displayed on the display unit 170 and the parallax direction of the display unit 170 are not coincident with each other is considered. In this case, besides the first control and the second control, a third control for changing the parallax direction of the display unit 170 so that the parallax direction of the display unit 170 is coincident with the parallax direction of the stereoscopic image and for allowing the stereoscopic image to be displayed on the display unit 170 may be performed.
In addition, for example, in the case where the stereoscopic image is displayed on the display unit 170, if the parallax direction of the stereoscopic image and the parallax direction of the display unit are not coincident with each other through reception of the rotation command manipulation, the controller 120 performs the first control. On the other hand, in the case where the stereoscopic image is displayed on the display unit 170, if the parallax direction of the stereoscopic image and the parallax direction of the display unit 170 are not coincident with each other after reception of the returning command manipulation, the controller 120 performs the second control.
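A highly simplified model of this manipulation handling is given below; it only records whether the displayed image has been rotated by the user and returns which control would apply when the parallax directions do not coincide, with all names being hypothetical.

```python
class RotationManipulationHandler:
    """Illustrative sketch of the rotation / returning command handling."""

    def __init__(self):
        self.rotated_by_user = False

    def on_rotation_command(self) -> str:
        # rotating the image makes the parallax directions non-coincident
        self.rotated_by_user = True
        return "first_control"   # display the image as a planar image

    def on_returning_command(self) -> str:
        # the rotation is returned to the original state
        self.rotated_by_user = False
        return "second_control"  # rotate so that the parallax directions coincide
```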
The preference information retention unit 121 retains the content of control, which is to be preferentially performed at the time of displaying the stereoscopic image on the display unit 170, as the preference information and supplies the retained preference information to the controller 120. In addition, the preference information retained in the preference information retention unit 121 is updated by the controller 120 every time the setting manipulation for setting the preference information is received by the manipulation receiving unit 110. In addition, the retained content of the preference information retention unit 121 will be described with reference to
The content acquisition unit 130 is configured to acquire the image content (the stereoscopic image information) stored in the content storage unit 200 and to supply the acquired image content to the attribute information acquisition unit 140 and the image processing unit 150 based on control of the controller 120. In addition, the content acquisition unit 130 is an example of an acquisition unit disclosed in the embodiments of the present disclosure.
The attribute information acquisition unit 140 is configured to acquire the attribute information included in the image content acquired by the content acquisition unit 130 and to supply the acquired attribute information to the controller 120 and the image processing unit 150. The attribute information includes, for example, the date information, the image capturing parallax direction, the image capturing posture, and the like of the image capturing time.
The image processing unit 150 is configured to perform various image processes for displaying the images on the display unit 170 on the images corresponding to the image content acquired by the content acquisition unit 130 based on control of the controller 120. For example, the image processing unit 150 performs an image process for displaying the stereoscopic image on the display unit 170 based on the image content acquired by the content acquisition unit 130 and the attribute information acquired by the attribute information acquisition unit 140. In addition, in the case where the changing manipulation (for example, the rotation command manipulation) for changing the stereoscopic image, which is to be displayed on the display unit 170, is performed by the manipulation receiving unit 110, the image processing unit 150 performs an image process according to the changing manipulation. In addition, the image processing unit 150 is an example of a detection unit and a composition unit disclosed in the embodiments of the present disclosure.
The display control unit 160 is configured to allow the images, on which the image process is performed by the image processing unit 150, to be displayed on the display unit 170 based on control of the controller 120. For example, in the case where the command manipulation for displaying the stereoscopic image (the still image) is received by the manipulation receiving unit 110, the display control unit 160 allows the stereoscopic image, on which the image process is performed by the image processing unit 150, to be displayed on the display unit 170. In addition, the display control unit 160 allows various screens (for example, a setting screen 330 illustrated in
The display unit 170 is a display unit for displaying the image content stored in the content storage unit 200 based on control of the display control unit 160. In addition, various menu screens or various images are displayed on the display unit 170.
The posture-of-display-unit detection unit 180 is configured to detect the posture of the display unit 170 and to output a result of the detection to the controller 120. In other words, the posture-of-display-unit detection unit 180 detects the rotation state of the second casing 102 with respect to the first casing 101. For example, the posture-of-display-unit detection unit 180 detects an angle formed by the first casing 101 and the second casing 102 as a rotation state of the second casing 102 with respect to the first casing 101 and outputs a result of the detection to the controller 120. For example, an angle detection switch which is not pressed in the case where the rotation angle of the second casing 102 with respect to the first casing 101 is less than a predetermined value and which is pressed in the case where the rotation angle is equal to or more than the predetermined value is disposed at a portion of the rotating member 103. Next, the posture-of-display-unit detection unit 180 detects the angle formed by the first casing 101 and the second casing 102 by using the angle detection switch. For example, the posture-of-display-unit detection unit 180 detects the angle formed by the first casing 101 and the second casing 102 in units of 90 degrees. In addition, as the posture-of-display-unit detection unit 180, an aspect detection sensor (for example, an acceleration sensor) for detecting the posture of the display unit 170 (for example, the vertical state or the horizontal state) irrespective of the rotation state with respect to the first casing 101 may be used. In addition, the posture-of-display-unit detection unit 180 is an example of a detection unit disclosed in the embodiments of the present disclosure.
In addition, as described above, although the image capturing apparatus 100 may perform the recording process on any one of the moving image and the still image, hereinafter, the generation process and the recording process for the still image are mainly described.
In addition, in
Herein, as described above, with respect to the parallax direction of the display unit 170, it may be set by the user manipulation whether the parallax direction of the display unit 170 is changed according to the posture of the display unit 170 or the parallax direction of the display unit 170 is fixed irrespective of the posture of the display unit 170. For example, in the case where the parallax direction is set to be changed according to the posture of the display unit 170, the controller 120 performs control of changing the parallax direction of the display unit 170 based on the rotation state of the display unit 170 (first casing 101) detected by the posture-of-display-unit detection unit 180. For example, in the case where the display unit 170 is in the horizontally long state, the parallax direction (directions indicated by arrow 305) illustrated
The selection button 331 and the selection button 332 are buttons which are pressed at the time of setting the content of the control which is to be preferentially performed when the stereoscopic image is to be displayed on the display unit 170. Herein, the selection button 331 is a button which is pressed at the time of setting the performance of the first control in the case where the parallax direction of the stereoscopic image displayed on the display unit 170 and the parallax direction of the display unit 170 are not coincident with each other. On the other hand, the selection button 332 is a button which is pressed at the time of setting the performance of the second control in the case where the parallax directions are not coincident with each other. For example, in the case where the display unit 170 is constructed with a touch panel, the to-be-preferentially-performed control content may be set as preference information by performing pressing manipulation of a desired button in the display unit 170. The preference information will be described in detail with reference to
The enter button 333 is a button which is pressed at the time of determining the selection after the pressing manipulation of selecting the to-be-preferentially-performed control content is performed. In addition, the preference information (to-be-preferentially-performed control content) which is determined by the pressing manipulation of the enter button 333 is retained in the preference information retention unit 121. The return button 334 is a button which is pressed, for example, in the case of returning to the display screen which is displayed just before.
The setting items 122 are items which are the object of the user setting manipulation on the setting screen 330 illustrated in
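The preference information retention unit 121 can be modeled minimally as below; the attribute and value names are assumptions for illustration.

```python
class PreferenceInformationRetention:
    """Keeps the content of control to be preferentially performed and is
    updated each time a setting manipulation is received (sketch)."""

    def __init__(self):
        self.preferred_control = "stereoscopic_display_preferred"

    def update(self, selected_control: str) -> None:
        # called when the enter button of the setting screen is pressed
        self.preferred_control = selected_control
```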
The example illustrated in
In addition, in the case where the display unit 170 (the second casing 102) is rotated by 90 degrees in the direction of arrow 105 in the state illustrated in
In this manner, in the case where the posture of the display unit 170 is changed, the image displayed on the display unit 170 is reduced or magnified to be displayed on the display unit 170 according to a change in the posture, in the state where the direction of the image displayed on the display unit 170 is maintained. In other words, even in the case where the posture of the display unit 170 is changed, the image may be displayed in the state where the direction of the image displayed on the display unit 170 is maintained. Therefore, even in the case where the posture of the display unit 170 is changed, before and after the change, the horizontal direction of the user and the horizontal direction of the displayed image may be coincident with each other.
In addition, in the case where the posture of the display unit 170 is changed, the image may be displayed so that the longitudinal direction of the display area of the display unit 170 and the longitudinal direction of the image displayed on the display unit 170 are coincident with each other. In other words, in the case where the display unit 170 (the second casing 102) is rotated by 90 degrees in the direction of arrow 104 in the state illustrated in
In addition, in the state illustrated in
As illustrated in
As illustrated in
As illustrated in
As illustrated in
Therefore, as illustrated in
[Example of Displaying Stereoscopic Image Captured so that Vertical Direction Becomes Parallax Direction]
In
In the state illustrated in
As illustrated in
As illustrated in
As illustrated in
In addition, in the case where the parallax direction of the stereoscopic image displayed on the display unit 170 and the parallax direction of the person are not coincident with each other, the user may not be able to properly stereoscopically view the stereoscopic image. Therefore, the first embodiment of the present disclosure is configured so that the stereoscopic image may be properly stereoscopically viewed by the user even in the case where the parallax direction of the stereoscopic image displayed on the display unit 170 and the parallax direction of the person are not coincident with each other. More specifically, in the case where the parallax direction of the stereoscopic image displayed on the display unit 170 and the parallax direction of the person are not coincident with each other, the parallax direction of the display unit 170 is changed so that the parallax directions are coincident with each other. In this case, the parallax direction of the person may be acquired based on, for example, the posture of the display unit 170. In other words, since the parallax direction of the person corresponds to the posture of the display unit 170, the parallax direction of the person may be estimated. For example, the longitudinal direction of the display unit 170 and the parallax direction of the person are estimated to be the same direction. In addition, the parallax direction of the person may be acquired by a parallax direction acquisition unit (for example, a parallax direction acquisition unit 722 of special-purpose glasses 720 illustrated in
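The estimation of the viewer's parallax direction described here might look as follows; the accessor on the special-purpose glasses and the function name are assumptions, not interfaces defined in the disclosure.

```python
def acquire_user_parallax_direction(display_longitudinal_direction: str,
                                    glasses=None) -> str:
    """Estimate the viewer's parallax direction: use an external parallax
    direction acquisition unit (for example, one mounted on special-purpose
    glasses) when available, otherwise assume it equals the longitudinal
    direction of the display unit (illustrative sketch)."""
    if glasses is not None:
        return glasses.get_parallax_direction()  # assumed accessor
    return display_longitudinal_direction
```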
More specifically, the controller 120 acquires the preference information which is retained in the preference information retention unit 121 and determines the to-be-preferentially-performed control content. In this example, as described above, the preference information which is retained in the preference information retention unit 121 is set to “displaying of stereoscopic image is preferred”. Subsequently, the controller 120 acquires attribute information (attribute information acquired by the attribute information acquisition unit 140) included in the content which becomes a display object and determines whether or not the parallax direction of the display unit 170 and the parallax direction of the stereoscopic image corresponding to the content which becomes the display object are coincident with each other. As a result of the determination, in the case where the parallax direction of the display unit 170 and the parallax direction of the stereoscopic image corresponding to the content which becomes the display object are not coincident with each other, the controller 120 performs display control according to the preference information which is retained in the preference information retention unit 121. In other words, the controller 120 performs control of changing the direction of the stereoscopic image so that the parallax direction of the display unit 170 and the parallax direction of the stereoscopic image are coincident with each other. For example, as illustrated in
In
In other words, as illustrated in
[Example of Display Control in Case where Direction of Image is Preferred]
In other words, as illustrated in
In other words, as illustrated in
[Example of Displaying Planar Image in Case where Direction is Set to be Preferred]
In this manner, in the first embodiment of the present disclosure, whether the displaying of the stereoscopic image is preferred or the direction of the image is preferred may be easily set by the selection manipulation of the user. For example, in the case where the displaying of the stereoscopic image is set to be preferred, the displaying of the stereoscopic image is preferred to the direction of the stereoscopic image. Therefore, for example, in the case where it is necessary to rotate the stereoscopic image which becomes the display object, the stereoscopic image is rotated and then displayed. In addition, in the case where the direction of the image is set to be preferred, the stereoscopic image is displayed with the direction preferred, within the available range. Accordingly, it is possible to display a proper stereoscopic image according to the user's preference.
Hereinbefore, the example of mainly displaying the horizontally long stereoscopic image (that is, the stereoscopic image of which the horizontal direction at the time of the image capturing becomes the longitudinal direction) is illustrated. However, a user may desire to view the vertically long stereoscopic image displayed on the display unit 170 in the vertically long state. Therefore, hereinafter, an example of generating a stereoscopic image that allows the vertically long stereoscopic image to be displayed and viewed on the display unit 170 in the vertically long state is illustrated.
In
In the state illustrated in
In
In this manner, the vertically long stereoscopic image (the stereoscopic image of which the horizontal direction is the parallax direction) may be generated by using the image capturing apparatus 100.
In addition, although this example illustrates the example of generating the vertically long stereoscopic image by composing the two captured images which are consecutively disposed, the vertically long stereoscopic image may be generated by composing three or more captured images which are consecutively disposed.
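Composing consecutively captured frames into a vertically long viewing image can be sketched as a simple vertical concatenation; a real composition would also need alignment and overlap removal, which are omitted here, and the function name is an assumption.

```python
import numpy as np

def compose_vertically_long_view(frames) -> np.ndarray:
    """Stack two or more consecutively captured frames along the vertical axis
    to obtain one vertically long image (illustrative sketch)."""
    return np.concatenate(list(frames), axis=0)
```

Applying this separately to the left-eye viewing images and to the right-eye viewing images would yield a vertically long stereoscopic pair whose parallax direction remains the horizontal direction.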
In addition, in the example illustrated in
The image capturing apparatus 100 includes a lens control unit 551, a drive unit 552, and hand shake correction lenses 553 and 554.
The lens control unit 551 is configured to control the hand shake correction lenses 553 and 554 for correcting hand shake based on the control of the controller 120. The drive unit 552 is configured to move the hand shake correction lenses 553 and 554 based on the control of the lens control unit 551, so that the hand shake correction is performed.
Herein, the case of generating the two images, which are consecutively disposed or overlapped in the vertical direction, by using the image capturing apparatus 100 having the hand shake correction mechanism is described. For example, in the case where a subject of interest (for example, a person) is located at a position which is relatively far from the image capturing apparatus 100, the two images (the two images which are consecutively disposed or overlapped) in which the subject of interest is shifted in the vertical direction may be generated by using the hand shake correction mechanism.
In addition, the user shakes the image capturing apparatus 100 having the hand shake correction mechanism in the vertical direction to generate the two images which are consecutively disposed or overlapped in the vertical direction, so that the shifting due to the shaking may be corrected by using the hand shake correction mechanism.
As described hereinbefore, in the case of displaying the vertically long stereoscopic image (stereoscopic image of which the horizontal direction is the parallax direction) generated by the image capturing apparatus 100, an example of displaying the vertically long stereoscopic image is illustrated
As illustrated in
In this manner, the vertically long stereoscopic image may be generated and recorded through the image process at the time of the image capturing. In addition, the vertically long stereoscopic image may be generated and displayed by performing the aforementioned image process on the horizontally long stereoscopic image at the time of displaying the stereoscopic image.
Hereinbefore, the example of generating the vertically long stereoscopic image by performing the composing process or the cutting process on the left-eye viewing image and the right-eye viewing image which constitute the horizontally long stereoscopic image is described. Hereinafter, an example of changing the parallax direction of the stereoscopic image by generating new images based on shift amounts of the images which constitute the stereoscopic image is described.
In
In
In
In
As illustrated in
In other words, as illustrated in
In this manner, in the block matching method, one motion vector is calculated with respect to one object area. In other words, a correlation determination (matching determination) process between the images is performed in units of the divided block, so that the motion vector of each block is obtained.
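For illustration, the following is a minimal sketch, written in Python with NumPy, of such a block matching process; the function name, block size, and search range are assumptions introduced here for illustration and are not part of the disclosed configuration.

import numpy as np

def block_matching(left_img, right_img, block=16, search=8):
    # left_img, right_img: 2-D grayscale arrays of identical shape.
    # Returns one motion vector (dy, dx) per block (object area) of the
    # left-eye viewing image, obtained by finding the position in the
    # right-eye viewing image with the highest correlation (here, the
    # lowest sum of absolute differences).
    h, w = left_img.shape
    vectors = {}
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            ref = left_img[by:by + block, bx:bx + block].astype(np.int32)
            best, best_cost = (0, 0), np.inf
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    if y < 0 or x < 0 or y + block > h or x + block > w:
                        continue
                    cand = right_img[y:y + block, x:x + block].astype(np.int32)
                    cost = np.abs(ref - cand).sum()  # matching determination
                    if cost < best_cost:
                        best_cost, best = cost, (dy, dx)
            vectors[(by, bx)] = best  # one motion vector per object area
    return vectors

In this sketch, the sum of absolute differences is used for the correlation determination between the images; other correlation measures may equally be used.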
In
As illustrated in
In
In this manner, the left-eye image capturing unit 210 and the right-eye image capturing unit 220 generate the left-eye viewing image and the right-eye viewing image. Subsequently, the captured-image signal processing unit 230 detects the movement amount and the movement direction of each of the plurality of areas of the left-eye viewing image with respect to the right-eye viewing image based on the generated left-eye viewing image and the generated right-eye viewing image. Subsequently, the captured-image signal processing unit 230 moves the images of the plurality of areas of the right-eye viewing image based on the detected movement amount and the detected movement direction of each of the areas of the left-eye viewing image and generates a composed image (a new left-eye viewing image) based on the after-movement images.
In this manner, the vertically long stereoscopic image of which the horizontal direction becomes the parallax direction may be generated and recorded by changing the parallax direction through the image process at the time of the image capturing. In addition, the vertically long stereoscopic image of which the horizontal direction becomes the parallax direction may be generated and displayed by performing the aforementioned image process on the vertically long stereoscopic image of which the vertical direction becomes the parallax direction at the time of displaying the stereoscopic image.
In addition, in the case of generating the vertically long stereoscopic image through the composing process or the cutting process or the case of generating the vertically long stereoscopic image through the parallax direction changing process, the before-process stereoscopic images and the after-process stereoscopic images may be recorded in a correspondence manner. Accordingly, the before-process stereoscopic image and the after-process stereoscopic image may be used at the time of displaying.
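As one possible way of retaining such a correspondence, the following hypothetical record is shown; the field names and file names are illustrative assumptions introduced here and do not represent the disclosed recording format.

# A hypothetical correspondence record linking a before-process stereoscopic
# image with its after-process counterpart so that either may be selected
# at the time of displaying. All field names and values are illustrative.
content_record = {
    "content_id": "stereo_0001",
    "before_process": {"left": "0001_L.jpg", "right": "0001_R.jpg",
                       "parallax_direction": "horizontal"},
    "after_process": {"left": "0001_L_v.jpg", "right": "0001_R_v.jpg",
                      "parallax_direction": "horizontal",
                      "process": "compose_and_cut"},
}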
Next, operations of the image capturing apparatus 100 according to the first embodiment of the present disclosure are described with reference to the drawings.
First, the content acquisition unit 130 acquires the image content, which becomes the display object, from the content storage unit 200 (Step S901). In addition, Step S901 is an example of an acquisition procedure disclosed in the embodiments of the present disclosure. Subsequently, the posture-of-display-unit detection unit 180 detects the posture of the display unit 170 (Step S902), and the controller 120 acquires a result of the detection. Subsequently, the controller 120 acquires the rotation amount (the rotation amount of the stereoscopic image) according to the rotation command manipulation received by the manipulation receiving unit 110 (Step S903). Subsequently, the attribute information acquisition unit 140 acquires the posture (the image capturing posture) at the time of the image capturing included in the image content acquired by the content acquisition unit 130 (Step S904), so that the controller 120 acquires the image capturing posture. Subsequently, the attribute information acquisition unit 140 acquires the parallax direction (the image-capturing-time parallax direction) at the time of the image capturing included in the image content acquired by the content acquisition unit 130 (Step S905), so that the controller 120 acquires the image-capturing-time parallax direction.
Subsequently, it is determined based on the posture of the display unit 170 whether or not the changing of the parallax direction of the display unit 170 is set (Step S906). In the case where the changing of the parallax direction of the display unit 170 is set (Step S906), the controller 120 changes the parallax direction of the display unit 170 based on the posture of the display unit 170 (Step S907), so that the changed parallax direction is acquired (Step S908). For example, in the case where the display unit 170 (the second casing 102) is changed from the horizontally long state to the vertically long state, the changing of the parallax direction of the display unit 170 is performed. For example, the parallax direction illustrated in
Subsequently, the controller 120 performs the rotation process on the parallax direction of the stereoscopic image based on the acquired image-capturing-time parallax direction, the posture of the display unit 170, the rotation amount of the stereoscopic image, and the image capturing posture (Step S909). In other words, similarly to the rotation process for the stereoscopic image, the rotation process for the parallax direction of the stereoscopic image is performed according to the setting content with respect to the displaying, which are set by the user.
Subsequently, the image processing unit 150 performs the image process for displaying the stereoscopic image corresponding to the image content acquired by the content acquisition unit 130 based on the control of the controller 120 (Step S910). In this case, the rotation process is performed on the stereoscopic image based on the acquired image-capturing-time parallax direction, the posture of the display unit 170, the rotation amount of the stereoscopic image, and the image capturing posture.
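For illustration, the rotation process for the parallax direction (Step S909) may be sketched as follows, assuming that parallax directions, postures, and rotation amounts are represented as angles in degrees measured in the display plane; this angle representation and the function name are assumptions and not part of the disclosed configuration.

def rotate_parallax_direction(capture_parallax_deg, capture_posture_deg,
                              rotation_amount_deg):
    # Parallax direction of the stereoscopic image after the rotation process,
    # assuming all quantities are angles measured in the display plane.
    # Directions that differ by 180 degrees are equivalent for parallax.
    return (capture_parallax_deg + capture_posture_deg + rotation_amount_deg) % 180

For example, rotate_parallax_direction(0, 0, 90) yields 90, indicating that a stereoscopic image whose parallax direction was horizontal at the time of the image capturing no longer has a horizontal parallax direction after a 90-degree rotation.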
Subsequently, it is determined whether or not the parallax direction of the stereoscopic image displayed on the display unit 170 and the parallax direction of the display unit 170 are coincident with each other (Step S911). Next, if the parallax directions are coincident with each other (Step S911), the stereoscopic image which is subject to the image process for display is allowed to be displayed on the display unit 170 (Step S913). On the other hand, if the parallax direction of the stereoscopic image displayed on the display unit 170 and the parallax direction of the display unit 170 are not coincident with each other (Step S911), the image process (rotation process) is performed on the stereoscopic image so that the parallax directions are coincident with each other (Step S912). Next, the stereoscopic image, which is subject to the image process, is allowed to be displayed on the display unit 170 (Step S913).
It is determined whether or not the parallax direction of the stereoscopic image displayed on the display unit 170 and the parallax direction of the display unit 170 are coincident with each other (Step S911). Next, if the parallax directions are coincident with each other (Step S911), the stereoscopic image which is subject to the image process for display is allowed to be displayed on the display unit 170 (Step S921). On the other hand, if the parallax direction of the stereoscopic image displayed on the display unit 170 and the parallax direction of the display unit 170 are not coincident with each other (Step S911), the stereoscopic image is allowed to be displayed as a planar image on the display unit 170 (Step S922).
It is determined whether or not the parallax direction of the stereoscopic image displayed on the display unit 170 and the parallax direction of the display unit 170 are coincident with each other (Step S911). Next, if the parallax directions are not coincident with each other (Step S911), it is determined whether or not the preference information retained in the preference information retention unit 121 indicates the setting that the displaying of the stereoscopic image is preferred (Step S931). In the case where the preference information indicates the setting that the displaying of the stereoscopic image is preferred (Step S931), the image process (rotation process) is performed on the stereoscopic image so that the parallax direction of the stereoscopic image displayed on the display unit 170 and the parallax direction of the display unit 170 are coincident with each other (Step S933). Next, the stereoscopic image which is subject to the image process is displayed on the display unit 170 (Step S921). In addition, in the case where the preference information does not indicate the setting that the displaying of the stereoscopic image is preferred (Step S931), the stereoscopic image is displayed as a planar image on the display unit 170 (Step S932). In addition, Steps S911 to S913, S921, S922, and S931 to S933 are examples of a control procedure disclosed in the embodiments of the present disclosure.
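For illustration, the branching among Steps S911, S921, S922, and S931 to S933 may be sketched as follows; the function name and the angle representation of the parallax directions are illustrative assumptions.

def control_display(stereo_parallax_deg, display_parallax_deg,
                    prefer_stereoscopic):
    # When the parallax directions are coincident, the stereoscopic image is
    # displayed as it is; otherwise, either the image is rotated so that the
    # directions coincide (displaying of the stereoscopic image is preferred)
    # or it is displayed as a planar image (direction of the image is
    # preferred).
    if stereo_parallax_deg % 180 == display_parallax_deg % 180:
        return "display_stereoscopic"                # Steps S911, S921
    if prefer_stereoscopic:
        return "rotate_then_display_stereoscopic"    # Steps S933, S921
    return "display_as_planar_image"                 # Step S932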
First, the left-eye image capturing unit 210 and the right-eye image capturing unit 220 perform image capturing processes of generating the left-eye viewing image and the right-eye viewing image which are used to generate the stereoscopic image (Step S971). Subsequently, it is determined whether or not the vertically long stereoscopic image recording mode is set (Step S972).
In the case where the vertically long stereoscopic image recording mode is set (Step S972), the captured-image signal processing unit 230 generates the vertically long image by cutting predetermined left and right areas of each of the left-eye viewing image and the right-eye viewing image (Step S973). Subsequently, the captured-image signal processing unit 230 performs an image process for recording on the generated vertically long image (the left-eye viewing image and the right-eye viewing image) (Step S974). Subsequently, the recording control unit 260 performs the recording process for recording the vertically long image (the left-eye viewing image and the right-eye viewing image), which is subject to the image process, in the content storage unit 200 (Step S975).
In addition, in the case where the vertically long stereoscopic image recording mode is not set (Step S972), the captured-image signal processing unit 230 performs an image process for recording on the generated left-eye viewing image and the generated right-eye viewing image (Step S976), and the procedure proceeds to Step S975. In other words, a normal image process for recording the stereoscopic image is performed.
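For illustration, the cutting of the predetermined left and right areas (Step S973) may be sketched as follows, assuming the viewing image is a 2-D or 3-D NumPy array and the target aspect ratio is an illustrative value; the same cutting is applied to both the left-eye viewing image and the right-eye viewing image.

def cut_to_vertically_long(image, target_aspect=3 / 4):
    # Cuts predetermined left and right areas of a horizontally long viewing
    # image so that the remaining center region becomes vertically long.
    # image: a NumPy array of shape (height, width[, channels]).
    # target_aspect: desired width-to-height ratio (an illustrative value).
    h, w = image.shape[:2]
    new_w = int(h * target_aspect)
    left = (w - new_w) // 2
    return image[:, left:left + new_w]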
First, it is determined whether or not the vertically long stereoscopic image recording mode is set (Step S980). In the case where the vertically long stereoscopic image recording mode is set (Step S980), the left-eye image capturing unit 210 and the right-eye image capturing unit 220 perform the image capturing process for generating one set of the first image group (the left-eye viewing image and the right-eye viewing image) (Step S981). Subsequently, the left-eye image capturing unit 210 and the right-eye image capturing unit 220 perform the image capturing process for generating one set of the second image group (the left-eye viewing image and the right-eye viewing image) used for generating the stereoscopic image (Step S982).
Subsequently, the captured-image signal processing unit 230 generates the composed images (the left-eye viewing image and the right-eye viewing image) by composing the two consecutive images to be overlapped at each of the left and right sides based on the correlation between the images included in the generated first and second image groups (Step S983). Subsequently, the captured-image signal processing unit 230 generates the vertically long image by cutting predetermined left and right areas of each of the generated composed images (the left-eye viewing image and the right-eye viewing image) (Step S984). Subsequently, the captured-image signal processing unit 230 performs an image process for recording on the generated vertically long image (the left-eye viewing image and the right-eye viewing image) (Step S985). Subsequently, the recording control unit 260 performs the recording process for recording the vertically long image (the left-eye viewing image and the right-eye viewing image), which is subject to the image process, in the content storage unit 200 (Step S986).
In addition, in the case where the vertically long stereoscopic image recording mode is not set (Step S980), the left-eye image capturing unit 210 and the right-eye image capturing unit 220 perform image capturing processes for generating the left-eye viewing image and the right-eye viewing image (one set of left and right image groups) (Step S987). Subsequently, the captured-image signal processing unit 230 performs an image process for recording on the generated left-eye viewing image and the generated right-eye viewing image (Step S988), and the procedure proceeds to Step S986. In other words, a normal image process for recording the stereoscopic image is performed.
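For illustration, the composing of the two consecutive images based on the correlation between them (Step S983) may be sketched as follows for one of the left and right sides, assuming the two captures are vertically offset NumPy arrays; the helper names and the search range are assumptions. The subsequent cutting (Step S984) may then be performed as in the preceding sketch.

import numpy as np

def estimate_vertical_offset(first, second, max_shift=200):
    # Finds the vertical overlap that best aligns the bottom of the first
    # image with the top of the second image (correlation between the images).
    best, best_cost = 1, np.inf
    for s in range(1, max_shift + 1):
        a = first[-s:, :].astype(np.int32)
        b = second[:s, :].astype(np.int32)
        cost = np.abs(a - b).mean()
        if cost < best_cost:
            best_cost, best = cost, s
    return best

def compose_vertically(first, second):
    # Composes the two consecutive images so that they overlap by the
    # estimated amount, yielding a vertically longer composed image.
    overlap = estimate_vertical_offset(first, second)
    return np.vstack([first, second[overlap:, :]])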
First, it is determined whether or not the vertically long stereoscopic image recording mode is set (Step S1001). In the case where the vertically long stereoscopic image recording mode is set (Step S1001), it is determined based on the result of the detection from the image capturing posture detection unit 250 whether or not the posture of the image capturing apparatus 100 is the posture which is rotated by 90 degrees by using the optical axis direction as a rotation axis (Step S1002). In the case where the posture of the image capturing apparatus 100 is the 90-degree-rotated posture (Step S1002), the left-eye image capturing unit 210 and the right-eye image capturing unit 220 perform the image capturing process for generating the left-eye viewing image and the right-eye viewing image (one set of the image group) (Step S1003).
Subsequently, the captured-image signal processing unit 230 divides the left-eye viewing image into a plurality of areas (Step S1004). Subsequently, the captured-image signal processing unit 230 extracts one area (object area) as an object of comparison from the divided areas of the left-eye viewing image (Step S1005). Subsequently, the captured-image signal processing unit 230 searches the area of the right-eye viewing image, of which the correlation to the image included in the object area is highest, and detects the motion vector based on a result of the searching (Step S1006).
Subsequently, it is determined whether or not the detection process for the motion vector is ended with respect to all the areas of the left-eye viewing image (Step S1007). In the case where the detection process for the motion vector is not ended with respect to all the areas, the process returns to Step S1005. On the other hand, in the case where the detection process for the motion vector is ended with respect to all the areas of the left-eye viewing image (Step S1007), the rotation process for the motion vectors is performed (Step S1008). In other words, the captured-image signal processing unit 230 performs the rotation process for rotating the motion vectors, which are detected in the areas of the left-eye viewing image, by a predetermined angle (for example, 90 degrees clockwise) (Step S1008). Subsequently, the captured-image signal processing unit 230 divides the right-eye viewing image into areas of which the size is equal to the size of the areas into which the left-eye viewing image is divided (Step S1009).
Subsequently, the captured-image signal processing unit 230 moves each of the areas of the right-eye viewing image based on the motion vector detected with respect to each of the areas of the left-eye viewing image to generate a new left-eye viewing image (Step S1010). Subsequently, the captured-image signal processing unit 230 performs an interpolation process on the newly generated left-eye viewing image (Step S1011). Subsequently, the captured-image signal processing unit 230 performs an image process for recording on the new left-eye viewing image which is subject to the interpolation process and the original right-eye viewing image (Step S1012). Subsequently, the recording control unit 260 performs a recording process for recording the two images (the vertically long images (the left-eye viewing image and the right-eye viewing image)), which are subject to the image process, in the content storage unit 200 (Step S1013).
In addition, in the case where the vertically long stereoscopic image recording mode is not set (Step S1001), or in the case where the posture of the image capturing apparatus 100 is not the 90-degree-rotated posture (Step S1002), image capturing processes for capturing one set of the left and right image groups are performed (Step S1014). In other words, the left-eye image capturing unit 210 and the right-eye image capturing unit 220 perform the image capturing processes for generating the left-eye viewing image and the right-eye viewing image (one set of the left and right image groups). Subsequently, the captured-image signal processing unit 230 performs an image process for recording on the generated left-eye viewing image and the generated right-eye viewing image (Step S1015), and the procedure proceeds to Step S1013. In other words, a normal image process for recording the stereoscopic image is performed.
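For illustration, Steps S1008 to S1011 may be sketched as follows, assuming that the motion vectors have already been detected per area (for example, by a block matching process such as the one sketched earlier) and that the viewing images are 2-D NumPy arrays; the hole-filling interpolation shown here is deliberately simple and is only an assumption.

import numpy as np

def generate_new_left_image(right_img, motion_vectors, block=16,
                            rotate_deg=90):
    # Each motion vector detected for an area of the left-eye viewing image is
    # rotated by a predetermined angle (90 degrees clockwise by default), the
    # corresponding area of the right-eye viewing image is moved by the
    # rotated vector, and holes left by the movement are filled by a simple
    # interpolation (here, copying the original pixel).
    h, w = right_img.shape
    new_left = np.zeros_like(right_img)
    filled = np.zeros((h, w), dtype=bool)
    for (by, bx), (dy, dx) in motion_vectors.items():
        if rotate_deg == 90:          # clockwise rotation of the vector
            dy, dx = dx, -dy
        src = right_img[by:by + block, bx:bx + block]
        ty, tx = by + dy, bx + dx
        if 0 <= ty and 0 <= tx and ty + block <= h and tx + block <= w:
            new_left[ty:ty + block, tx:tx + block] = src
            filled[ty:ty + block, tx:tx + block] = True
    new_left[~filled] = right_img[~filled]   # simple interpolation of holes
    return new_left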
In addition, in the examples illustrated in
In addition, in the examples illustrated in
In addition, the first embodiment of the present disclosure illustrates the example where the display unit 170 and the main body (the first casing 101) are configured as different cases and the posture-of-display-unit detection unit 180 detects the rotation state of the display unit 170 with respect to the main body. However, the first embodiment of the present disclosure may be adapted to an image capturing apparatus or an image processing apparatus such as a mobile phone apparatus where the display unit and the main body are configured as an integral body. For example, a posture detection unit (for example, an acceleration sensor) which detects the posture (for example, the vertical state or the horizontal state) of the display unit (the main body of the apparatus) may be provided to the image processing apparatus, so that the various controls may be performed by using a result of the detection by the posture detection unit. For example, based on the result of the detection, the parallax direction of the display unit may be changed, or it may be determined whether or not the parallax direction of the stereoscopic image and the parallax direction of the display unit are coincident with each other.
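For illustration, the determination of the posture of the display unit from an acceleration sensor may be sketched as follows, assuming the sensor reports the gravity components along the display's native horizontal and vertical axes; the function name and the simple thresholding are assumptions and not part of the disclosed configuration.

def display_parallax_direction(accel_x, accel_y):
    # accel_x, accel_y: gravity components along the display's native
    # horizontal and vertical axes (an assumption about the sensor output).
    # Returns the parallax direction of the display unit as seen by a user
    # holding the apparatus in the detected posture.
    if abs(accel_y) >= abs(accel_x):
        return "horizontal"   # apparatus held in its native orientation
    return "vertical"         # apparatus rotated by about 90 degrees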
The first embodiment of the present disclosure is described with respect to the example of using the parallax barrier type as a display type for displaying the stereoscopic image. However, the first embodiment of the present disclosure may be adapted to types other than the parallax barrier type. Therefore, hereinafter, a modified example of the first embodiment of the present disclosure is illustrated. The configuration of the image capturing apparatus according to the modified example is substantially the same as the example illustrated in
The image processing apparatus 700 includes a display unit 710, a synchronization signal transmitting unit 711, and a parallax direction receiving unit 712. In addition, the special-purpose glasses 720 include a synchronization signal receiving unit 721 and a parallax direction acquisition unit 722.
Herein, the case where the user wears the special-purpose glasses 720 and sees the stereoscopic image is considered. In this case, the image processing apparatus 700 (the display control unit 160 illustrated in
In addition, the parallax direction acquisition unit 722 detects a change of the posture of the special-purpose glasses 720 by detecting acceleration, motion, tilt, and the like of the special-purpose glasses 720 and acquires the parallax direction of the user based on the result of the detection. Next, the acquired parallax directions of the user are sequentially transmitted from the special-purpose glasses 720 to the parallax direction receiving unit 712. Accordingly, it may be determined whether or not the parallax direction of the user (the posture of the special-purpose glasses 720) and the parallax direction of the stereoscopic image (the posture of the display unit 710) are coincident with each other. In addition, the parallax direction acquisition unit 722 may be implemented by a gyro sensor (angular velocity sensor) or an acceleration sensor.
More specifically, with respect to the stereoscopic image (the displayed image 731) displayed on the display unit 710, the left-eye viewing image and the right-eye viewing image are schematically illustrated by "L" and "R" along the time axis. In addition, with respect to the images reaching the user through the special-purpose glasses 720, the image (the image 732 transmitting through the right lens) which reaches the user's right eye through the right-eye lens is schematically illustrated by "R" along the time axis. Similarly, the image (the image 733 transmitting through the left lens) which reaches the user's left eye through the left-eye lens is schematically illustrated by "L" along the time axis.
In other words, in the case where the right-eye image "R" is displayed on the display unit 710, the left-eye lens of the special-purpose glasses 720 is closed. On the other hand, in the case where the left-eye image "L" is displayed on the display unit 710, the right-eye lens of the special-purpose glasses 720 is closed. In this manner, the images displayed on the display unit 710 are seen by the user through the special-purpose glasses 720, so that the stereoscopic image may be properly seen.
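For illustration, the alternate displaying and shutter control described above may be sketched as follows; the generator shown here only expresses the schedule and assumes hypothetical frame sequences as input, not any particular display or glasses driver interface.

def frame_sequential_schedule(left_frames, right_frames):
    # Yields, for each display period, the image to be displayed on the
    # display unit 710 together with the shutter states of the special-purpose
    # glasses 720: while the left-eye image "L" is displayed the right-eye
    # lens is closed, and while the right-eye image "R" is displayed the
    # left-eye lens is closed.
    for left, right in zip(left_frames, right_frames):
        yield left, {"left_open": True, "right_open": False}
        yield right, {"left_open": False, "right_open": True}

For example, list(frame_sequential_schedule(["L1", "L2"], ["R1", "R2"])) yields the frames in the order L1, R1, L2, R2 together with the corresponding shutter states.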
Herein, in the case where the stereoscopic image is seen by using the special-purpose glasses 720, the parallax direction of the display unit 710 is changed according to a change of the parallax direction of the user (that is, a change of the posture of the special-purpose glasses 720). Therefore, the image processing apparatus 700 (the controller 120 illustrated in
Herein, for example, in the case where the user's head is tilted, it may be considered that the parallax direction of the user is rotated by 45 degrees by using the eye direction as a rotation axis. In this manner, in the case where the change in the parallax direction of the user is the rotation of less than 90 degrees, for example, the example illustrated in
In the first embodiment of the present disclosure, the example of generating the two images (the two images used for displaying the stereoscopic image) by using the two optical systems and the two image capturing devices is illustrated. However, the two images may be configured to be generated by using one image capturing device. In addition, the first embodiment of the present disclosure may be adapted to a case of generating a multi-viewing-point image by using an image capturing apparatus having other configurations. Therefore, hereinafter, a modified example of the first embodiment of the present disclosure is illustrated. The configuration of the image capturing apparatus of the modified example is substantially the same as the example illustrated in
As illustrated in
As illustrated in
Hereinbefore, the examples of using the image capturing apparatus are described. However, the first embodiment of the present disclosure may be adapted to other image processing apparatuses having a display unit. In addition, the first embodiment of the present disclosure may be adapted to an image processing apparatus capable of displaying a stereoscopic image or a planar image on an external display apparatus. For example, the first embodiment of the present disclosure may be adapted to a mobile phone apparatus having a display unit. The mobile phone apparatus is illustrated in
The mobile phone apparatus 800 includes a first casing 801 and a second casing 802. In addition, the first casing 801 and the second casing 802 are rotatably connected to each other by using a rotating member 803 as a rotation reference. The mobile phone apparatus 800 is implemented, for example, by a mobile phone apparatus (a so-called camera-attached mobile phone apparatus) having a plurality of image capturing functions. In addition, in
The first casing 801 includes a left-eye image capturing unit 810, a right-eye image capturing unit 820, and a manipulation unit 840. The second casing 802 includes a display unit 830. The left-eye image capturing unit 810 and the right-eye image capturing unit 820 correspond to the left-eye image capturing unit 210 and the right-eye image capturing unit 220 illustrated in
As described above, the first casing 801 and the second casing 802 are rotatably connected to each other. In other words, the second casing 802 may be rotated with respect to the first casing 801 by using the rotating member 803 (indicated by a dotted line) as a rotation reference. Accordingly, a relative position relationship of the second casing 802 with respect to the first casing 801 may be changed. For example, the state where the second casing 802 is rotated by 90 degrees in the direction of arrow 804 illustrated in
In addition, the mobile phone apparatus 800 illustrated in
As described hereinbefore, according to the first embodiment of the present disclosure, in the case where the stereoscopic image (multi-viewing-point image) is displayed, the parallax direction of the display unit and the parallax direction of the stereoscopic image may be allowed to be coincident with each other, so that it is possible to prevent the stereoscopic image which arouses uncomfortable feelings in a user from being displayed.
In addition, in the case where the direction of the image is set to be preferred, the stereoscopic image which arouses uncomfortable feelings in a user is not displayed, but the stereoscopic image may be displayed as a planar image.
In addition, the stereoscopic image may be displayed at the time of the image capturing operation for the vertically long stereoscopic image, and the multi-viewing-point image of which the parallax direction is appropriate may be generated without adding mechanical or optical mechanisms dedicated only to the vertically long image capturing operation.
In addition, although the embodiment of the present disclosure is described with respect to the example using the two-viewing-point image as a multi-viewing-point image, the embodiment of the present disclosure may be adapted to a multi-viewing-point image having three or more viewing points.
In addition, the embodiment of the present disclosure illustrates an example for embodying the present disclosure, and as clarified in the embodiment, the components therein and the components specified in the claims of the present disclosure have a relationship of correspondence. Similarly, the components specified in the claims of the present disclosure and the components in the embodiment of the present disclosure to which the same names are allocated have a relationship of correspondence. However, the present disclosure is not limited to the embodiment, but various modifications of the embodiment are available for embodying the present disclosure within a range without departing from the spirit of the present disclosure.
In addition, the process procedure described in the embodiment of the present disclosure may be considered to be a method having a series of procedures. In addition, the process procedure may be considered to be a program for allowing a computer to execute the series of procedures or a recording medium storing the program. As the recording medium, for example, a CD (Compact Disc), an MD (Mini Disc), a DVD (Digital Versatile Disc), a memory card, a Blu-ray Disc (registered trademark), or the like may be used.
The present disclosure contains subject matter related to that disclosed in Japanese Priority Patent Application JP 2010-176897 filed in the Japan Patent Office on Aug. 6, 2010, the entire contents of which are hereby incorporated by reference.