The present disclosure relates to an image processing apparatus, an image processing method, and a program, and particularly to a technology for generating a three-dimensional image.
In recent years, image display apparatuses that display images perceived by a user as three-dimensional (so-called 3D images) have been released and are beginning to spread (for example, Patent Literature 1). Apparatuses that can display three-dimensional images are not limited to televisions and other image display apparatuses; some personal computers can also display three-dimensional images.
Among the applications that run on a personal computer, some can generate contents having three-dimensional images. When contents are generated by such applications and viewed in a predetermined manner, the user can perceive the images included in the contents as three-dimensional images.
PTL 1: JP 2010-210712A
However, according to the related art, generating contents having three-dimensional images requires setting positional relations using dedicated software. Accordingly, it is difficult for an end user to generate such contents.
In light of the foregoing, it is desirable to provide a novel and improved image processing apparatus, image processing method, and computer program that enable contents having three-dimensional images to be generated easily.
An image processing apparatus of the present disclosure comprises an image generating unit that generates a plurality of plane images and sets virtual distances in a depth direction to the plurality of generated plane images, respectively.
The image processing apparatus further comprises a three-dimensional image converting unit that converts the plurality of plane images into a three-dimensional image in which the positions in space of the objects in each of the plurality of plane images are set, on the basis of the virtual distances set to the plurality of plane images generated by the image generating unit.
The image processing apparatus further comprises a three-dimensional image generating unit that outputs data of the three-dimensional image converted by the three-dimensional image converting unit.
The image processing apparatus further comprises an editing screen generating unit that displays the plurality of plane images generated by the image generating unit individually or in an overlapped manner and generates display data of an editing screen in which a tab is provided to each of the plane images.
The image processing apparatus further comprises an input unit that receives an operation to generate or edit images in the editing screen generated by the editing screen generating unit.
An image processing method of the present disclosure comprises: generating a plurality of plane images and setting virtual distances in a depth direction to the plurality of generated plane images, respectively; converting the plurality of plane images into a three-dimensional image in which the positions in space of the objects in each of the plurality of plane images are set, on the basis of the virtual distances set to the plurality of generated plane images; outputting data of the converted three-dimensional image; displaying the plurality of generated plane images individually or in an overlapped manner and generating display data of an editing screen in which a tab is provided to each of the plane images; and accepting an operation to generate or edit images in the generated editing screen.
A program of the present disclosure causes a computer to execute an image process, the program causing the computer to execute: generating a plurality of plane images and setting virtual distances in a depth direction to the plurality of generated plane images, respectively; converting the plurality of plane images into a three-dimensional image in which the positions in space of the objects in each of the plurality of plane images are set, on the basis of the virtual distances set to the plurality of generated plane images; outputting data of the converted three-dimensional image; displaying the plurality of generated plane images individually or in an overlapped manner and generating display data of an editing screen in which a tab is provided to each of the plane images; and accepting an operation to generate or edit images in the generated editing screen.
According to the present disclosure, a plurality of plane images are appropriately displayed so that a user can generate and display a desired three-dimensional image through a simple operation.
According to the present disclosure, there are provided a novel and improved image processing apparatus, image processing method, and computer program that enable contents having three-dimensional images to be generated easily.
Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the appended drawings. Note that, in this specification and the appended drawings, structural elements that have substantially the same function and structure are denoted with the same reference numerals, and repeated explanation of these structural elements is omitted.
Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the appended drawings, in the following order.
1-1. Configuration of an image processing apparatus
1-2. Outline of a three-dimensional image
1-3. Display example of an editing screen
1-4. Display example of a depth adjustment screen
1-5. Display example of a ground surface setting screen
1-6. Example where a camera image is captured
1-7. Example where an image file is imported
1-8. Display example of a generated image list
1-9. Specific example of hardware configuration
1-1. Configuration of an image processing apparatus
First, the configuration of an image processing apparatus according to an embodiment (hereinafter, called “this example”) of the present disclosure will be described.
An image processing apparatus 100 according to this example includes an image generating/processing unit 110, an image storage unit 120, an input unit 130, and an image display unit 140.
The image generating/processing unit 110 provides the user with an image generation screen through the image display unit 140 and generates a three-dimensional image from the images generated by the user. The image generating/processing unit 110 includes an image generating unit 112, a three-dimensional image converting unit 114, a three-dimensional image generating unit 116, and an editing screen generating unit 118.
The image generating/processing unit 110 generates a plurality of plane images (for example, three plane images) on the basis of user operations in a state in which an editing screen for image generation is displayed on the image display unit 140, and generates a three-dimensional image from the plurality of generated plane images (two-dimensional images). The editing screen for the image generation will be described below.
In addition, the image generating/processing unit 110 supplies the image data of the generated three-dimensional image to the image display unit 140, and the image display unit 140 displays the three-dimensional image. The user views the display using a predetermined method (for example, wearing time-division driven shutter glasses) and perceives the image displayed on the image display unit 140 as a three-dimensional image.
The image generating unit 112 displays the editing screen for the image generation on the image display unit 140, and images are generated there by user operations. When images including a plurality of layers are generated using the image generation screen provided by the image generating unit 112, those images are converted into a three-dimensional image by the three-dimensional image converting unit 114 and the three-dimensional image generating unit 116. The images including the plurality of layers generated by the image generating unit 112 are stored in the image storage unit 120 according to the user operation.
The three-dimensional image converting unit 114 executes a conversion process for displaying the images including the plurality of layers transmitted from the image generating unit 112 as a three-dimensional image on the image display unit 140. The image processing apparatus 100 according to this example assumes in advance the distance between the eyes of the user and the distance between the user and the display surface, and executes the conversion process on the basis of the virtual distances between the layers (information on the depth of each layer of the images). Specifically, in order to generate the three-dimensional image by performing a coordinate conversion on the images including the plurality of layers, the three-dimensional image converting unit 114 executes a coordinate conversion process on those images.
If, during the conversion process in the three-dimensional image converting unit 114, the user adjusts the depth of a layer while the three-dimensional image is displayed on the image display unit 140 and changes the depth state, the three-dimensional image converting unit 114 executes the conversion process in real time according to the change. Thereby, the user can adjust the depth of each layer of the images and confirm the adjusted three-dimensional image through the display on the editing screen in real time. An example of the process for adjusting the depth of each layer of the images will be described in detail below.
When the image processing apparatus 100 generates the three-dimensional image from the planar images including the plurality of layers generated by the user, the image processing apparatus 100 executes a preview display of the three-dimensional image. By the preview display, the user can see in advance how the images will appear three-dimensionally before storing the generated images as a three-dimensional image.
The three-dimensional image generating unit 116 generates the three-dimensional image from the images including the plurality of layers, on the basis of the conversion process executed by the three-dimensional image converting unit 114.
The three-dimensional image generated by the three-dimensional image generating unit 116 is displayed on the image display unit 140 and is stored in the image storage unit 120 according to the operation of the input unit 130 from the user.
The editing screen generating unit 118 generates display data of the editing screen, on the basis of a reception state of an input operation in the input unit 130. The editing screen generating unit 118 supplies the display data generated by the editing screen generating unit 118 to the image display unit 140 and displays the editing screen.
The image storage unit 120 stores the images including the plurality of layers generated by the image generating/processing unit 110 or the three-dimensional image generated by converting the images including the plurality of layers. The images stored in the image storage unit 120 are read from the image storage unit 120 according to the operation of the input unit 130 by the user, are processed by the image generating/processing unit 110, and are displayed by the image display unit 140.
The input unit 130 includes various input devices with which the user performs input operations on the image processing apparatus 100, for example, a keyboard, a mouse, a graphic tablet, and a touch panel. By operating the input unit 130, the user can generate the images including the plurality of layers and adjust the depth of each layer of the images when they are converted into the three-dimensional image.
The image display unit 140 is a display that displays an image. For example, the image display unit 140 displays the images including the plurality of layers generated by the image generating/processing unit 110 or the three-dimensional image generated by converting those images. The image display unit 140 also displays the screens on which the user of the image processing apparatus 100 generates images; an example of the display screen will be described below. A touch panel may be disposed on the image display surface of the image display unit 140 so that the user can directly operate buttons in a displayed image; such a touch panel functions as a part of the input unit 130. The image display unit 140 may also be a device separate from the image processing apparatus 100.
The image display unit 140 may be configured using a display device that can display the three-dimensional image. The method of displaying the three-dimensional image is not limited to a specific display method. For example, a method is known that displays an image for the right eye and an image for the left eye by switching them at high speed. As methods of transmitting the three-dimensional image to the image display unit 140, a frame sequential method, a side-by-side method, a top-and-bottom method, and the like are known.
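As a purely illustrative aside (not part of the disclosed apparatus), the following Python sketch shows how a side-by-side frame can be packed from a left-eye image and a right-eye image using NumPy; the array shapes and the simple column subsampling are assumptions for illustration.

```python
import numpy as np

def pack_side_by_side(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Pack a left-eye frame and a right-eye frame (H x W x 3 arrays)
    into a single side-by-side frame of the same total resolution.

    Each view keeps every other column, which is how the side-by-side
    format fits two views into one frame at half horizontal resolution.
    """
    if left.shape != right.shape:
        raise ValueError("left and right frames must have the same shape")
    return np.concatenate([left[:, ::2], right[:, ::2]], axis=1)

# Example: two 1080x1920 frames become one 1080x1920 side-by-side frame.
l = np.zeros((1080, 1920, 3), dtype=np.uint8)
r = np.ones((1080, 1920, 3), dtype=np.uint8)
print(pack_side_by_side(l, r).shape)  # (1080, 1920, 3)
```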
The images generated by the image generating/processing unit 110 may also be output to a television receiver or another display device that is connected to the image processing apparatus 100 and can display the three-dimensional image.
1-2. Outline of generation of a three-dimensional image
Next, an outline of the generation of the three-dimensional image by the image processing apparatus 100 according to the embodiment of the present disclosure will be described.
First, the arrangement of the layer images in the depth direction will be described. In this example, three layer images, that is, a first layer image 311, a second layer image 312, and a third layer image 313, are disposed at different depth positions along a depth axis 301. The depth, which is expressed as the distance from a reference position, is set to the image of each layer.
The setting state of the depth shown here is one example, and the depth of each layer can be changed by the user, as described below.
An image frame illustrated by a broken line indicates the position of the virtual display surface 304.
In this example, the first layer image 311 is disposed as a long-distance view, the second layer image 312 as a middle-distance view, and the third layer image 313 as a short-distance view.
As such, when the three layer images are prepared, the first layer image 311 may be automatically set, in an initial state, to a predetermined depth position suitable for the long-distance view. Likewise, the second layer image 312 may be automatically set to a predetermined depth position suitable for the middle-distance view, and the third layer image 313 to a predetermined depth position suitable for the short-distance view.
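As a purely illustrative sketch of such an initial assignment (not the disclosed implementation), the following Python fragment assigns hypothetical default depths on the scale used later for the depth bar, where 0 is the virtual display surface and negative values are the inner side; the specific numbers are assumptions.

```python
# Hypothetical default depths on the depth-bar scale used later in this
# description: 0 = virtual display surface, negative values = inner side.
DEFAULT_DEPTHS = [-50, -25, -10]  # long-, middle-, short-distance view

def assign_initial_depths(layers):
    """Give each newly created layer a default depth position.

    The first (innermost) layer gets the long-distance depth, the next
    layer the middle-distance depth, and so on; any extra layers are
    placed at the frontmost default.
    """
    return {layer: DEFAULT_DEPTHS[min(i, len(DEFAULT_DEPTHS) - 1)]
            for i, layer in enumerate(layers)}

# Example: three layers receive -50, -25, and -10, respectively.
print(assign_initial_depths(["layer1", "layer2", "layer3"]))
```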
Each layer image is basically disposed as a vertical plane at its set depth position.
However, when the position of a horizontal line c is set to the horizontal line position 321 with respect to the layer image 311 of the long-distance view, the image portion below the horizontal line position 321 of the layer image 311 is disposed so as to be inclined in the three-dimensional space and to protrude to the front side. That is, the image portion below the horizontal line c becomes an inclined surface 303 that gradually protrudes to the front side, from the depth position 301b of the layer image 311 to the front edge portion 302, as the image portion approaches the lower side.
As such, when the horizontal line is set with respect to the first layer image 311 of the long-distance view and settings matched with the horizontal line are made with respect to the objects in the other layer images 312 and 313, each object is automatically and appropriately disposed on the inclined surface 303, or a process for erasing inappropriate portions of the objects is executed. A specific example of the setting of the horizontal line and the processes associated with it will be described below.
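One way to read the inclined surface is as a linear interpolation of depth over the image rows below the horizontal line: at the horizon row the depth equals the layer's own depth position, and at the bottom row it reaches the front edge. The following Python sketch illustrates that reading; the row indices, depth values, and the linear interpolation itself are assumptions for illustration, not a definitive implementation.

```python
def ground_depth(row, horizon_row, bottom_row, layer_depth, front_depth):
    """Depth of the inclined ground surface at a given image row.

    Rows above the horizon stay at the layer's own depth position;
    rows from the horizon down to the bottom edge are linearly
    interpolated from the layer depth toward the front edge depth.
    """
    if row <= horizon_row:
        return layer_depth
    t = (row - horizon_row) / (bottom_row - horizon_row)
    return layer_depth + t * (front_depth - layer_depth)

# Example: horizon at row 120 of a 240-row image, layer at depth -50,
# front edge at depth 0; halfway down the ground surface the depth is -25.
print(ground_depth(180, 120, 240, -50.0, 0.0))
```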
Next, an example of a process for converting the images into the three-dimensional image in the three-dimensional image converting unit 114 will be described.
Next, an example of a specific method to calculate the drawing positions of the image for the right eye and the image for the left eye will be described.
The drawing positions are obtained by projecting each point of the layer images, disposed at their virtual depth positions, onto the virtual display surface 259 along the lines of sight from the assumed positions of the left eye and the right eye.
As such, the three-dimensional image converting unit 114 executes the projection coordinate conversion with respect to the virtual display surface 259, and the image processing apparatus 100 can thereby convert the normal two-dimensional images including the plurality of layers into a three-dimensional image.
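For illustration only, the following Python sketch shows the standard similar-triangles form of such a projection, assuming a viewer centered in front of the display at a viewing distance with a given interocular distance, and depths measured as signed offsets from the virtual display surface (positive toward the viewer); all names and the sample units are assumptions, not the disclosed implementation.

```python
def project_to_screen(x_p, z, eye_x, viewing_distance):
    """Project one point of a layer image onto the virtual display
    surface as seen from one eye (similar-triangles construction).

    x_p: horizontal position of the point, in screen coordinates
    z: depth offset from the display surface (positive = toward viewer)
    eye_x: horizontal eye position (+e/2 or -e/2)
    viewing_distance: assumed distance from the viewer to the display
    """
    scale = viewing_distance / (viewing_distance - z)
    return eye_x + (x_p - eye_x) * scale

def left_right_positions(x_p, z, eye_separation=6.5, viewing_distance=200.0):
    """Drawing positions of the same point in the left-eye and the
    right-eye images (centimeters are assumed as units here)."""
    left = project_to_screen(x_p, z, -eye_separation / 2, viewing_distance)
    right = project_to_screen(x_p, z, +eye_separation / 2, viewing_distance)
    return left, right

# A point behind the display surface (z = -50) lands at x = -0.65 in the
# left-eye image and x = +0.65 in the right-eye image: uncrossed
# disparity, which is perceived as depth behind the screen.
print(left_right_positions(0.0, -50.0))
```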
1-3. Display example of the editing screen
Next, an image generation process using the editing screen that is needed to generate the three-dimensional image will be described.
The editing screen is displayed in the following flow. When image generation starts, the images of all of the layers are displayed in an overlapped manner as a 3D editing screen, together with the layer thumbnails (steps S11 and S12).
It is then determined whether an operation to select any one of the displayed layer thumbnails has been received through the input unit 130 (step S13). When no such operation is received, a waiting state is maintained and the image of the editing screen is not changed.
When an operation to select one of the layer thumbnails is received, the image of the layer corresponding to the selected layer thumbnail is displayed on the editing screen (step S14). In this display, the images of the other layers are displayed simultaneously with decreased display brightness so that the entire three-dimensional image can still be recognized.
In a state in which the image of the specific layer is displayed in step S14, it is determined whether there is an operation to change to another layer by selecting its layer thumbnail (step S15). When it is determined that such an operation exists, a process for changing the layer displayed on the editing screen is executed (step S16), the process returns to step S14, and the image of the layer after the change is displayed.
When it is determined in step S15 that there is no operation to change to another layer, it is determined whether there is an operation to change the display to the overlapped display of all of the layers (step S17). When such an operation exists, the display is changed to the overlapped display of all of the layers and the process returns to the 3D editing screen display of step S12.
When it is determined in step S17 that there is no operation to change the display to the overlapped display of all of the layers, the screen display of the layer selected in step S14 is continued.
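Read as a whole, steps S11 to S17 form a small display state machine: the screen is either in the overlapped display of all layers or in a single-layer display, and thumbnail selections move between these states. The following Python sketch summarizes that reading; the event names and the generator structure are hypothetical.

```python
def editing_screen_states(events):
    """Walk through the editing-screen display states of steps S11-S17.

    events is an iterable of ("select_layer", n) or ("show_all", None)
    tuples; the display state after each event is yielded.
    """
    state = "all_layers"  # step S12: overlapped display of all layers
    for kind, layer in events:
        if kind == "select_layer":   # steps S13-S14 and S15-S16
            state = f"layer_{layer}"
        elif kind == "show_all":     # step S17: back to overlapped display
            state = "all_layers"
        yield state

# Selecting layer 2, changing to layer 1, then returning to the
# overlapped display of all of the layers.
for state in editing_screen_states([("select_layer", 2),
                                    ("select_layer", 1),
                                    ("show_all", None)]):
    print(state)
```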
Next, a specific display example of the editing screen will be described.
In addition, a plurality of tabs 201, 202, and 211 to 213 for selecting a display image are disposed on the upper side of the intermediate image display 203 in edition processing in the editing screen. When a user operation to select the display place of one of the tabs 201, 202, and 211 to 213 exists (by a touch panel operation or the like), the image allocated to the selected tab is displayed.
The image allocated to each tab will be described. The layer thumbnail tabs 211, 212, and 213 are tabs for displaying the layers individually. Specifically, the first layer thumbnail tab 211 displays the first layer image, the second layer thumbnail tab 212 displays the second layer image, and the third layer thumbnail tab 213 displays the third layer image. In the thumbnail tabs 211 to 213, the images of the layers are reduced and displayed as thumbnail images. Therefore, when the image of a layer is modified, its thumbnail image is modified in the same way.
The tab 201, displayed adjacent to the layer thumbnail tabs 211, 212, and 213, is a tab for adding a layer image. That is, when the tab 201 is selected, a layer image is newly added and an editing screen for the newly added layer image is displayed.
In addition, the 3D state thumbnail tab 202, disposed near the right end of the upper side of the intermediate image display 203 in edition processing, is a tab that displays the depth state of the image of each layer. When the 3D state thumbnail tab 202 is selected, a display screen of the depth state is displayed.
Various operation buttons are disposed in a peripheral portion of the intermediate image display 203 in edition processing of the editing screen.
If a user operation to select the file importing button 221 exists, a process for importing an image file prepared in the image storage unit 120 or an external memory starts.
If a user operation to select the camera capturing button 222 exists, a process for capturing image data from a camera device connected to the apparatus starts.
If a user operation of the stamp button 223 or the character input button 224 exists, a process for inputting prepared figures or characters starts.
If a user operation to select the depth operation button 225 exists, a depth adjustment screen is displayed on the editing image display 203. A display process of the depth adjustment screen will be described below.
On the lower side of the right end of the editing image display 203, a pen tool 290 is displayed. The pen tool 290 of this example displays a first pen 291, a second pen 292, a third pen 293, and an eraser 294. If a user operation to select the display place of one of the pens 291, 292, and 293 exists, a line can be drawn with the color or line type allocated to that pen. If a user operation to select the display place of the eraser 294 exists, drawn lines are erased. The pen tool 290 may also support other drawing or erasing functions.
When the third layer thumbnail tab 213 is selected, the image 253 of the third layer is displayed.
When the image 253 of the third layer is displayed, the objects e and f in the image of the third layer are displayed with the set colors and brightness. The objects of the images of the other layers are displayed in an overlapped manner with decreased display brightness. That is, only the image of the third layer is highlighted, and the images of the other layers are grayed out.
In this state, by performing an operation to add objects to the image or to draw in the intermediate image display 203 in edition processing, the user can edit the image of the third layer.
When the image 252 of the second layer is displayed, the object d in the image of the second layer is displayed with the set colors and brightness. The objects of the images of the other layers are displayed in an overlapped manner with decreased display brightness. In this state, by performing an operation to add objects to the image or to draw in the intermediate image display 203 in edition processing, the user can edit the image of the second layer.
When the image 251 of the first layer is displayed, the objects a, c, and g in the image of the first layer are displayed with the set colors and brightness. At the time of the display, the objects of the images of the other layers are displayed in an overlapped manner with decreased display brightness.
In this state, by performing an operation to add objects to the image or to draw in the intermediate image display 203 in edition processing, the user can edit the image of the first layer.
In these display examples, the layer being edited is always highlighted while the other layers remain visible, so that the user can edit each layer while grasping the entire image.
1-4. Display example of a depth adjustment screen
Next, a flow of the depth adjustment process for the image of each layer will be described.
The depth adjustment process is started by a user operation to select the depth operation button 225 of the editing screen. It is first determined whether such an operation exists (step S21).
When it is determined in step S21 that the operation to select the depth operation button 225 exists, the images of all of the layers are displayed in an overlapped manner as the intermediate image display 203 in edition processing, and a depth bar is displayed on the upper side of the display (step S22). The depth bar is a scale that illustrates the depth of each layer. When, as in this example, the images of three layers are used, the depth positions of the three layers are displayed on the depth bar. On the lower side of the editing image display 203, depth adjustment buttons are displayed. A specific display example will be described below.
In this example, for each layer, an adjustment button to move the depth of the image to the inner side and an adjustment button to move the depth of the image to the front side are prepared and displayed.
The editing screen generating unit 118 determines whether a user operation of any adjustment button exists (step S23). When it is determined that no adjustment button has been operated, the display of step S22 is continued. When it is determined that an adjustment button has been operated, the image of the layer corresponding to the operated button is displayed as the intermediate image display 203 in edition processing (step S24). The depth position set to the image of the corresponding layer is changed according to the operation of the adjustment button, and the depth position on the depth bar display is changed to the corresponding position (step S25). After the setting or the display is changed in step S25, the display returns to that of step S22. The intermediate image display 203 in edition processing may, however, display only the operated layer until a next operation exists.
A specific display example when the depth operation button 225 of the editing screen is operated will now be described.
In this example, the depth positions of the images of the three layers are illustrated on one depth bar 401. That is, on the depth bar 401, the depth position 401a of the image of the first layer, the depth position 401b of the image of the second layer, and the depth position 401c of the image of the third layer are illustrated in different display colors.
On the scale given to the depth bar 401, “0” indicates the depth position of the virtual display surface, the inner side is indicated by minus values, and the front side is indicated by plus values (plus display is not illustrated).
On the lower side of the editing image display 203, buttons that adjust the depth position are displayed for the image of each layer. Specifically, a depth adjustment button 411 that moves the depth to the front side and a depth adjustment button 412 that moves the depth to the inner side are displayed as adjustment buttons for the image of the first layer. A depth adjustment button 421 that moves the depth to the front side and a depth adjustment button 422 that moves the depth to the inner side are displayed as adjustment buttons for the image of the second layer. A depth adjustment button 431 that moves the depth to the front side and a depth adjustment button 432 that moves the depth to the inner side are displayed as adjustment buttons for the image of the third layer.
If the user operation to select the display places of the depth adjustment buttons 411 to 432 exists, the setting of the depth of the image of each layer is changed. For example, the depth position of the image of the first layer is changed by the operations of the depth adjustment buttons 411 and 412 and the display of the depth position 401a of the image of the first layer in the depth bar 401 is moved as illustrated by an arrow La.
In addition, the depth position of the image of the second layer is changed by the operations of the depth adjustment buttons 421 and 422 and the display of the depth position 401b of the image of the second layer in the depth bar 401 is moved as illustrated by an arrow Lb.
In addition, the depth position of the image of the third layer is changed by the operations of the depth adjustment buttons 431 and 432 and the display of the depth position 401c of the image of the third layer in the depth bar 401 is moved as illustrated by an arrow Lc.
The movement by the depth adjustment buttons 411 to 432 is limited by the depth positions of the images of the adjacent layers. For example, the range of the movement La of the image of the first layer extends from the deepest position to the position adjacent to the depth position of the image of the adjacent second layer.
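That restriction can be expressed as clamping each layer's requested depth between the depths of its neighboring layers. The following Python sketch illustrates one such clamping rule; the depth scale, the minimum gap, and the list representation are illustrative assumptions, not the disclosed implementation.

```python
def clamp_layer_depth(depths, index, new_depth, deepest=-100, front=0, gap=1):
    """Move the layer at `index` toward `new_depth` without letting it
    pass its neighbors.

    depths is ordered from the innermost (first) layer to the frontmost
    layer; a minimal gap is kept between adjacent layers.
    """
    lower = depths[index - 1] + gap if index > 0 else deepest
    upper = depths[index + 1] - gap if index < len(depths) - 1 else front
    depths[index] = max(lower, min(new_depth, upper))
    return depths

# The innermost layer at -50 cannot be moved past the second layer at
# -25: an attempt to set it to -10 stops at -26.
print(clamp_layer_depth([-50, -25, -10], 0, -10))
```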
In another display example, the image of the selected layer (in this example, the image of the third layer) is highlighted in the editing image display 203, and the depth bar 501 illustrates only the depth position 503 of the highlighted layer, together with the virtual display surface position 502. In still another display example, a depth bar 601 illustrates only the movable range of the depth of the selected layer.
Therefore, when the depth position is adjusted by the operations of the depth adjustment buttons 611 and 612, the depth position 602 moves in the range of the displayed depth bar 601, as illustrated by an arrow Le.
Yet another display example of the depth adjustment screen is as follows.
When the depth of a specific layer is adjusted by the operations of the depth adjustment buttons 711, 712, 721, 722, 731, and 732, the position of the image of the layer subject to the depth adjustment is indicated in the editing image display 203 by an image frame 704 marking its four corners.
At almost the center of the image, a display 705 of a numerical value (in this example, “−25”) indicating the set depth position is performed.
In this way, a display from which the setting of the depth can be recognized may be performed.
1-5. Display example of a ground surface setting screen
Next, a flow of the ground surface setting process of an image will be described.
The ground surface setting process starts when the user operates a button (not illustrated in the drawings) to instruct the setting of the ground surface in the editing screen (step S31).
When it is determined in step S31 that the operation for the setting of the ground surface exists, the images of all of the layers are displayed in an overlapped manner as the intermediate image display 203 in edition processing, and a slider bar for horizontal line adjustment is vertically displayed at one end of the display (step S32). A slider handle indicating the position of the horizontal line is displayed on the slider bar, and it is determined whether a drag operation of the slider handle exists (step S33).
When it is determined that an operation of the slider handle exists, a process for changing the position of the horizontal line is executed according to the operation (step S34). According to the change of the position of the horizontal line, the portion below the horizontal line of the image of the innermost layer (the first layer) is set as the inclined surface in the three-dimensional space. The setting of the inclined surface is the process already described above.
Next, it is determined whether a mode to erase the portions below the ground surface in the images of the layers other than the first layer is set (step S35). When the mode to erase the lower side of the ground surface is set, the objects at positions below the inclined surface (ground surface) in the images of the layers other than the first layer are erased (step S36).
When it is determined in step S35 that the mode to erase the lower side of the ground surface is not set, it is determined whether an object set to be disposed on the ground surface exists among the objects of the images of the individual layers (step S37). When it is determined that such an object exists, the position of the lower end of the object is adjusted to the position where it crosses the inclined surface, in the image of the layer where the object exists (step S38).
When it is determined in step S33 that a drag operation does not exist, after the processes of steps S36 and S38 are executed, or when it is determined in step S37 that no object set to be disposed on the ground surface exists, the process returns to the horizontal line slider bar display process of step S32.
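The two modes of steps S35 to S38 can be sketched as follows in Python; the object structure (a bottom row and an on-ground flag), the whole-object erasure (a simplification of the partial erasure described below), and the inversion of the inclined-surface interpolation sketched earlier are all assumptions for illustration.

```python
def ground_row_at_depth(depth, horizon_row, bottom_row,
                        innermost_depth, front_depth):
    """Image row at which the inclined ground surface reaches a given
    depth (the inverse of the interpolation sketched earlier)."""
    t = (depth - innermost_depth) / (front_depth - innermost_depth)
    return horizon_row + t * (bottom_row - horizon_row)

def apply_ground_mode(objects, layer_depth, erase_below_ground, **ground):
    """Steps S35-S38: either erase objects lying below the ground
    surface, or snap ground-standing objects onto it.

    Each object is a dict with 'bottom_row' and 'on_ground' entries
    (a hypothetical structure for illustration).
    """
    ground_row = ground_row_at_depth(layer_depth, **ground)
    kept = []
    for obj in objects:
        if erase_below_ground and obj["bottom_row"] > ground_row:
            continue                        # step S36: erase below ground
        if not erase_below_ground and obj["on_ground"]:
            obj["bottom_row"] = ground_row  # step S38: snap onto the ground
        kept.append(obj)
    return kept

# A ground-standing object in the layer at depth -25 is snapped to the
# row where the inclined surface crosses that depth (row 180 here).
objs = [{"bottom_row": 230, "on_ground": True}]
print(apply_ground_mode(objs, -25, False, horizon_row=120, bottom_row=240,
                        innermost_depth=-50, front_depth=0))
```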
Next, a specific display example at the time of adjusting the horizontal line will be described.
As such, the ground surface below the horizontal line of the image of the first layer is set as the inclined surface described above.
For example, when an object in a layer other than the first layer extends below the set ground surface, the portion of the object below the ground surface is set as a non-display portion and is not displayed.
In this way, the portion of an object below the ground surface is not displayed, so that an unnatural display in which the object appears below the ground surface when the generated image is viewed three-dimensionally can be prevented. This partial erasure process of the object is executed when the erasure mode of step S36 described above is set.
In another example, buttons for modifying the position of a selected object (for example, the object e) are displayed, such as position movement buttons and a ground surface adjustment button 285.
The user performs operations to select these buttons, and the position of the object e is modified. When the operation to select the ground surface adjustment button 285 exists, the editing screen generating unit 118 executes a process for automatically matching the position of the lower end of the object e with the surface crossing the ground surface.
Therefore, by selecting the ground surface adjustment button 285, the object e is automatically disposed on the ground surface and an appropriate three-dimensional image can be generated.
1-6. Example where a camera image is captured
For example, if an operation to select the camera capturing button 222 in the editing screen exists, a camera capturing operation screen 810 is displayed, a camera capturing image 811 is captured from the camera device, and an extraction image 812 is extracted from the captured image 811 by a user operation.
By disposing the extraction image 812 on the image of any layer, the extraction image 812 can be disposed as one of the objects in the intermediate image in generation processing.
1-7. Example where an image file is imported
For example, if an operation to select the file importing button 221 in the editing screen exists, an image file importing operation screen 820 is displayed, an imported image 821 is displayed on the screen, and an extraction image 822 is extracted from the imported image 821 by a user operation.
By disposing the extraction image 822 on the image of any layer, the extraction image 822 can be disposed as one of the objects in the intermediate image in generation processing.
1-8. Display example of a list of generated images
The three-dimensional image that is generated using the editing screen in the process described above is stored in the image storage unit 120 of the image processing apparatus 100. A list of data of the stored three-dimensional images can be displayed on one screen.
In the columns of the generated image display, displays 11a, 12a, 13a, . . . of the number of layers are performed, in which the number of layers is indicated by a figure. For example, when the number of layers is three, a figure in which three images are overlapped is displayed.
By displaying the list of generated images, a generated image can be easily selected. The selected image may be displayed in the editing screen described above and edited again.
1-9. Specific example of hardware configuration
Next, a specific example of the hardware configuration of the image processing apparatus 100 according to this example will be described.
The image processing apparatus 100 mainly includes a CPU 901, a ROM 903, a RAM 905, a host bus 907, a bridge 909, an external bus 911, an interface 913, an input device 915, an output device 917, an image capturing device 918, a storage device 919, a drive 921, a connection port 923, and a communication device 925.
The CPU 901 functions as an operation processing device and a control device and controls all or a part of the operations in the image processing apparatus 100 according to various programs stored in the ROM 903, the RAM 905, the storage device 919, or a removable recording medium 927. The ROM 903 stores programs and operation parameters used by the CPU 901. The RAM 905 temporarily stores programs used in the execution of the CPU 901 and parameters that change appropriately during the execution. These devices are connected to each other by the host bus 907, which is configured by an internal bus such as a CPU bus.
The host bus 907 is connected to an external bus 911 such as a peripheral component interconnect/interface (PCI) bus through the bridge 909.
The input device 915 is an operation unit such as a mouse, a keyboard, a touch panel, a button, a switch, or a lever operated by the user. The input device 915 may be a remote control unit (a so-called remote controller) using infrared rays or other radio waves, or an external connection apparatus 929 such as a mobile phone or a PDA that is compatible with the operation of the image processing apparatus 100. The input device 915 is configured using an input control circuit that generates an input signal on the basis of the information input by the user using the operation unit and outputs the input signal to the CPU 901. By operating the input device 915, the user of the image processing apparatus 100 can input various data to the image processing apparatus 100 and instruct it to execute processing operations.
The output device 917 is configured using a display device such as a liquid crystal display device, a plasma display device, an EL display device, or a lamp, a sound output device such as a speaker or headphones, or a device, such as a printer, a mobile phone, or a facsimile machine, that can visually or audibly notify the user of acquired information. The output device 917 outputs the results obtained by the various processes executed by the image processing apparatus 100. Specifically, the display device displays those results as text or images, while the sound output device converts an audio signal composed of reproduced sound data or acoustic data into an analog signal and outputs it.
For example, the image capturing device 918 is provided on the display device and the image processing apparatus 100 can capture a still image or a moving image of the user with the image capturing device 918. The image capturing device 918 includes a charge coupled device (CCD) image sensor or a complementary metal oxide semiconductor (CMOS) image sensor and converts light condensed by a lens into an electric signal and can capture a still image or a moving image.
The storage device 919 is a data storage device that is configured as an example of a storage unit of the image processing apparatus 100. For example, the storage device 919 is configured using a magnetic storage device such as a hard disk drive (HDD), a semiconductor storage device, an optical storage device, or a magneto-optical storage device. The storage device 919 stores programs executed by the CPU 901, various data, and acoustic signal data or image signal data acquired from the outside.
The drive 921 is a reader/writer for a storage medium and is incorporated in the image processing apparatus 100 or attached to its outside. The drive 921 reads information recorded in the mounted removable recording medium 927, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and outputs the information to the RAM 905. The drive 921 can also record information on the mounted removable recording medium 927. The removable recording medium 927 is, for example, a DVD medium, a Blu-ray medium, a CompactFlash (registered trademark) (CF) card, a memory stick, or a secure digital (SD) memory card. The removable recording medium 927 may also be an integrated circuit (IC) card or an electronic apparatus on which a non-contact IC chip is mounted.
The connection port 923 is a port for directly connecting an apparatus to the image processing apparatus 100, such as a universal serial bus (USB) port, an IEEE 1394 port such as i.Link, a small computer system interface (SCSI) port, an RS-232C port, an optical audio terminal, or a high-definition multimedia interface (HDMI) port. By connecting the external connection apparatus 929 to the connection port 923, the image processing apparatus 100 acquires acoustic signal data or image signal data directly from the external connection apparatus 929 or provides such data to the external connection apparatus 929.
The communication device 925 is a communication interface configured by a communication device for connection with a communication network 931. The communication device 925 is, for example, a communication card for a wired or wireless local area network (LAN), Bluetooth, or wireless USB (WUSB), a router for optical communication, a router for an asymmetric digital subscriber line (ADSL), or a modem for various types of communication. The communication device 925 can transmit and receive signals based on a predetermined protocol such as TCP/IP to and from the Internet or other communication apparatuses. The communication network 931 connected to the communication device 925 is configured by a network connected by wire or wirelessly; for example, it may be the Internet, a home LAN, infrared communication, radio wave communication, or satellite communication.
The example of the hardware configuration that can realize the functions of the image processing apparatus 100 according to this example has been described above. Each of the components described above may be configured using general-purpose members or by hardware specialized for the function of that component. Therefore, the hardware configuration to be used may be changed appropriately according to the technological level at the time this embodiment is carried out.
A program (software) that executes each process step executed by the image processing apparatus 100 according to this example may be created and deployed on a general-purpose computer device, and the same processes may thereby be executed. The program may be stored in various media or downloaded to the computer device from a server through the Internet.
It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.
Note that the following configurations are within the scope of the present disclosure.
(1)
An image processing apparatus comprising:
(2)
The image processing apparatus according to (1),
(3)
The image processing apparatus according to (1) or (2),
(4)
The image processing apparatus according to any one of (1) to (3),
(5)
The image processing apparatus according to any one of (1) to (4),
(6)
The image processing apparatus according to any one of (1) to (5),
(7)
The image processing apparatus according to any one of (1) to (6),
(8)
The image processing apparatus according to any one of (1) to (7),
(9)
The image processing apparatus according to any one of (1) to (8),
(10)
The image processing apparatus according to any one of (1) to (9),
(11)
The image processing apparatus according to any one of (1) to (10),
(12)
An image processing apparatus comprising:
(13)
The image processing apparatus according to (12),
(14)
The image processing apparatus according to (12) or (13),
(15)
The image processing apparatus according to any one of (12) to (14),
(16)
The image processing apparatus according to any one of (12) to (15),
(17)
The image processing apparatus according to any one of (12) to (16),
(18)
The image processing apparatus according to any one of (12) to (17),
(19)
The image processing apparatus according to any one of (12) to (18),
(20)
The image processing apparatus according to any one of (12) to (19),
(21)
The image processing apparatus according to any one of (12) to (20),
(22)
The image processing apparatus according to any one of (12) to (21),
(23)
The image processing apparatus according to any one of (12) to (22),
(24)
The image processing apparatus according to any one of (12) to (23),
(25)
The image processing apparatus according to any one of (12) to (24), further comprising:
(26)
An image processing method comprising:
(27)
A program that causes a computer to execute an image process, the program causing the computer to execute:
a to f display object
11, 12, 13 generated image
11a, 12a, 13a number of layers display
100 image processing apparatus
110 image generating/processing unit
112 image generating unit
114 three-dimensional image converting unit
116 three-dimensional image generating unit
118 editing screen generating unit
120 image storage unit
130 input unit
140 image display unit
201 tab
202 3D state thumbnail tab
203 image display in edition processing
203a first layer edge portion
203b second layer edge portion
203c third layer edge portion
211 to 213 layer thumbnail tab
221 file importing button
222 camera capturing button
223 stamp button
224 character input button
225 depth operation button
231 to 237 object display unit
241 generation start button
242 save button
243 three-dimensional display button
250L image for left eye
250R image for right eye
251, 251′ first layer image
252, 252′ second layer image
253, 253′ third layer image
259, 259′ display surface
261 horizontal line adjustment bar
262 ground surface setting position display
271 ground surface position display in layer
272 non-display portion
281, 282 position movement button
283 returning button
284 erasure button
285 ground surface adjustment button
290 pen tool
291 first pen
292 second pen
293 third pen
294 eraser
301 depth axis
301a to 301d depth position
302 front edge portion
303 ground surface setting position
304 virtual display surface
311 first layer image
312 second layer image
313 third layer image
321 horizontal line position
401 depth bar display
401a, 401b, 401c layer position
411, 412, 421, 422, 431, 432 depth adjustment button
501 depth bar display
502 virtual display surface position
503 layer position
511, 512 depth adjustment button
601 depth bar display
603 layer position
611, 612 depth adjustment button
701 depth bar display
702 virtual display surface position
703 layer position
704 depth frame
705 depth value display
711, 712, 721, 722, 731, 732 depth adjustment button
810 camera capturing operation screen
811 camera capturing image
812 extraction image
820 image file importing operation screen
821 imported image
822 extraction image
Priority claim: JP 2011-126792, filed June 2011 (national).
International filing: PCT/JP2012/001012 (WO), filed February 16, 2012; 371(c) date November 26, 2013.