This application is based on and claims priority under 35 U.S.C. 119 from Japanese Patent Application No. 2011-093398, filed Apr. 19, 2011.
The present invention relates to an image processing apparatus, an image processing system, an image processing method, and a computer readable medium.
According to an aspect of the invention, there is provided an image processing apparatus including an imaged image acquiring unit, an identifying unit, an acquiring unit, and a display control unit. The imaged image acquiring unit acquires an imaged image. The identifying unit identifies an object imaged in the imaged image. The acquiring unit acquires image information on the object stored in association with the object identified by the identifying unit. The display control unit displays at least part of an image area of an object image based on the image information on the object.
An exemplary embodiment of the present invention will be described in detail based on the following figures, wherein:
Hereinafter, an exemplary embodiment of the present invention will be described with reference to the drawings. In this exemplary embodiment, an example in which an image processing apparatus according to an exemplary embodiment is applied to a portable information terminal having an imaging function and a display function will be explained.
As illustrated in the drawings, the portable information terminal 100 includes a touch panel 102, which serves as an input/display section, and a camera 104, on which an infrared camera adaptor 110 is mounted so that an infrared image can be acquired.
Furthermore, encoded patterns 202 (dot codes), in which positional information corresponding to positions in the object 200 is encoded, are formed on the object 200 using an infrared absorbing material, as illustrated in the drawings.
Functions of the individual units provided in the portable information terminal 100 may be implemented when a computer including a control section such as a central processing unit (CPU), a storing section such as a memory, a communication section that transmits and receives data to and from an external device, an imaging section such as a camera, and an input/display section such as the touch panel 102 reads and executes a program stored in a computer-readable information storage medium. The program may be supplied to the portable information terminal 100, which is, for example, a computer, by an information storage medium such as an optical disk, a magnetic disk, a magnetic tape, a magneto-optical disk, or a flash memory, or via a data communication network such as the Internet.
The storing unit 150 stores data and a program and is also used as a work memory. For example, an object data table is stored in the storing unit 150; in the table, an object image ID is stored in association with each object ID, and object image data is stored in association with each object image ID.
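Although the exemplary embodiment does not prescribe a concrete layout for the object data table, its contents may be organized, for example, as in the following Python sketch; the field names and the two-step association from object ID to object image ID to object image data are illustrative assumptions based on the description of the object image information acquiring unit 164 below.

    # Sketch of the object data table held in the storing unit 150.
    # Field names are illustrative assumptions, not prescribed by the embodiment.
    object_table = {
        "obj-0001": {"object_image_id": "img-0001"},  # object ID -> object image ID
    }
    object_image_table = {
        "img-0001": "images/img-0001.png",  # object image ID -> image data (here, a path)
    }

    def get_object_image(object_id):
        """Resolve object ID -> object image ID -> object image data."""
        object_image_id = object_table[object_id]["object_image_id"]
        return object_image_table[object_image_id]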
The imaging unit 152 images an object using the camera 104 provided in the portable information terminal 100. For example, the imaging unit 152 may image, with the camera 104 of the portable information terminal 100 on which the infrared camera adaptor 110 is mounted, encoded patterns printed in infrared absorbing ink on the surface of an object.
The imaged image acquiring unit 154 acquires the image imaged by the imaging unit 152. For example, the imaged image acquiring unit 154 may acquire an imaged image obtained by imaging encoded patterns printed on the surface of the object.
The decoding unit 156 decodes the encoded patterns contained in the imaged image acquired by the imaged image acquiring unit 154. For example, the decoding unit 156 may sequentially extract unit areas each having a predetermined size from the imaged image, identify the arrangement of plural dots contained in the extracted unit area, and acquire data encoded into an encoded pattern on the basis of the identified arrangement. For example, the encoded data may include an object ID and positional information in the object.
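The format of the encoded patterns is not limited in the exemplary embodiment. The following sketch illustrates one way the decoding unit 156 might operate, assuming a grid of dots in each unit area and a payload layout (object ID in the high bits, in-object coordinates in the low bits) that is purely an assumption for illustration.

    import numpy as np  # the imaged image is assumed to be a grayscale NumPy array

    UNIT = 32   # assumed pixel size of one unit area
    GRID = 8    # assumed number of dots per side within a unit area

    def decode_unit(unit):
        """Decode one unit area (a grayscale array) into an integer payload."""
        step = UNIT // GRID
        bits = 0
        for row in range(GRID):
            for col in range(GRID):
                # Sample the centre of each grid cell; a dark sample
                # (infrared-absorbing ink) is read as a 1 bit.
                sample = unit[row * step + step // 2, col * step + step // 2]
                bits = (bits << 1) | int(sample < 128)
        return bits

    def decode_patterns(image):
        """Yield (object_id, x, y) for each unit area of the imaged image."""
        h, w = image.shape
        for top in range(0, h - UNIT + 1, UNIT):
            for left in range(0, w - UNIT + 1, UNIT):
                payload = decode_unit(image[top:top + UNIT, left:left + UNIT])
                # Assumed payload layout: high 32 bits = object ID,
                # low bits = x and y coordinates in the object.
                yield payload >> 32, (payload >> 16) & 0xFFFF, payload & 0xFFFF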
The object identifying unit 158 identifies the object imaged in the imaged image acquired by the imaged image acquiring unit 154, on the basis of the result obtained by decoding the encoded pattern by the decoding unit 156. For example, in the case where the encoded pattern contains an object ID, the object identifying unit 158 may identify the object on the basis of the object ID contained in the encoded pattern. In the case where identification information on the object is contained in a predetermined area of the imaged image, the object identifying unit 158 may scan the predetermined area to acquire the identification information.
The imaged area identifying unit 160 identifies an area of the object imaged in the imaged image acquired by the imaged image acquiring unit 154. For example, the imaged area identifying unit 160 may identify an area of the object imaged in the imaged image, on the basis of positional information in the object decoded by the decoding unit 156.
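For example, when positional information has been decoded from several unit areas, the imaged area may be approximated by the smallest rectangle covering the decoded positions, as in the following sketch; the rectangle approximation is an assumption, since the embodiment only requires that an area of the object be identified.

    def identify_imaged_area(positions):
        """Approximate the imaged area of the object.

        `positions` is an iterable of (x, y) coordinates in the object's own
        coordinate system, as decoded from the encoded patterns.
        Returns (left, top, right, bottom).
        """
        positions = list(positions)
        xs = [x for x, y in positions]
        ys = [y for x, y in positions]
        return min(xs), min(ys), max(xs), max(ys)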
The object rotation angle identifying unit 162 identifies the rotation angle of the object imaged in the imaged image, on the basis of the imaged image acquired by the imaged image acquiring unit 154. The rotation angle around each of three axes (X-axis, Y-axis, and Z-axis) of the object may be defined as the rotation angle of the object. For example, the object rotation angle identifying unit 162 may compute the rotation angle of the object on the basis of the expansion and contraction ratios of the encoded pattern in the vertical and horizontal directions in the imaged image and the angle by which the encoded pattern is rotated from its upright position.
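The exemplary embodiment does not fix a formula for this computation. One plausible reading, sketched below, is that tilt about the X and Y axes is inferred from vertical and horizontal foreshortening of the pattern, and that rotation about the Z axis equals the pattern's in-plane rotation from upright; treating foreshortening as the cosine of the tilt angle is an assumption of the sketch.

    import math

    def estimate_rotation(h_ratio, v_ratio, in_plane_angle):
        """Estimate (X, Y, Z) rotation angles of the object in degrees.

        h_ratio, v_ratio: observed-to-nominal width and height of an encoded
        pattern (at most 1.0 when the pattern is foreshortened).
        in_plane_angle: rotation of the pattern from upright, in degrees.
        """
        x_angle = math.degrees(math.acos(min(1.0, v_ratio)))  # tilt about X shrinks height
        y_angle = math.degrees(math.acos(min(1.0, h_ratio)))  # tilt about Y shrinks width
        z_angle = in_plane_angle
        return x_angle, y_angle, z_angle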
The object image information acquiring unit 164 acquires image information on the object identified by the object identifying unit 158. For example, the object image information acquiring unit 164 may refer to the object image ID stored in the storing unit 150 in association with the object ID identified by the object identifying unit 158, and acquire the object image data stored in the storing unit 150, on the basis of the object image ID.
The display area setting unit 166 sets an area (display area) of the object image to be displayed, based on the image information on the object acquired by the object image information acquiring unit 164. For example, the display area setting unit 166 may set, on the basis of the area of the object identified by the imaged area identifying unit 160, a display area corresponding to that area. The display area setting unit 166 may also set a display area that corresponds to the area of the object identified by the imaged area identifying unit 160 and that is sized in accordance with the size of the touch panel 102 of the portable information terminal 100.
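For example, the display area may be chosen so that it covers the identified area of the object while matching the aspect ratio of the touch panel 102, as in the following sketch; the centring policy is an assumption, since the embodiment only requires correspondence between the display area, the imaged area, and the panel size.

    def set_display_area(imaged_area, panel_size):
        """Set a display area over the object image.

        imaged_area: (left, top, right, bottom) in object coordinates.
        panel_size: (width, height) of the touch panel 102 in pixels.
        Returns a display area with the panel's aspect ratio, centred on
        and covering the imaged area.
        """
        left, top, right, bottom = imaged_area
        cx, cy = (left + right) / 2.0, (top + bottom) / 2.0
        panel_w, panel_h = panel_size
        # Scale so the imaged area fits the panel while keeping its aspect ratio.
        scale = max((right - left) / panel_w, (bottom - top) / panel_h)
        w, h = panel_w * scale, panel_h * scale
        return cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2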
The display image generating unit 168 generates a display image on the basis of the image information on the object acquired by the object image information acquiring unit 164, the display area set by the display area setting unit 166, and the rotation angle of the object identified by the object rotation angle identifying unit 162. For example, the display image generating unit 168 may generate a display image by extracting an image in the display area set by the display area setting unit 166 from the image information on the object acquired by the object image information acquiring unit 164 and performing image processing on the extracted image on the basis of the rotation angle of the object identified by the object rotation angle identifying unit 162.
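A minimal sketch of these two steps is given below using the Pillow imaging library; only rotation about the Z axis is compensated here, and correcting the X and Y tilt, which would require a perspective transform, is omitted for brevity.

    from PIL import Image  # Pillow; used only to illustrate the two steps

    def generate_display_image(object_image, display_area, z_angle):
        """Crop the display area out of the object image and counter-rotate it.

        object_image: a PIL.Image of the whole object.
        display_area: (left, top, right, bottom) set by the display area
        setting unit. z_angle: rotation of the object about the Z axis.
        """
        cropped = object_image.crop(tuple(int(v) for v in display_area))
        # Rotate opposite to the object's rotation so the image appears upright.
        return cropped.rotate(-z_angle, expand=True)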
The control unit 170 controls each unit of the portable information terminal 100. For example, the control unit 170 may control the display unit 172 to display the display image generated by the display image generating unit 168.
The display unit 172 is implemented by, for example, the touch panel 102, and displays an image under the control of the control unit 170. For example, the display unit 172 may display the display image generated by the display image generating unit 168.
The input unit 174 is implemented by, for example, the touch panel 102, and receives an operation instruction from a user.
A process performed by the portable information terminal 100 will be explained by way of a specific example with reference to a flowchart illustrated in the drawings.
As illustrated in the flowchart, the portable information terminal 100 first images an object with the camera 104 (step S1) and extracts, from the imaged image, an image area containing the encoded patterns (step S2).
The portable information terminal 100 decodes the encoded patterns contained in the image area extracted in step S2 (step S3), and identifies the object, the imaged area of the object, and the rotation angle of the object on the basis of the decoded data (step S4).
The portable information terminal 100 acquires image information on the object identified in step S4 (step S5), and sets a display area of the acquired image information on the object to be displayed (step S6). The portable information terminal 100 generates a display image on the basis of the image information on the object acquired in step S5, the display area set in step S6, and the rotation angle of the object identified in step S4 (step S7), and displays the generated display image (step S8).
When receiving an instruction for an operation such as movement, expansion/contraction, or rotation of the image displayed on the touch panel 102 (YES in step S9), the portable information terminal 100 updates the display area on the basis of the received operation instruction (step S10). Then, the portable information terminal 100 generates a display image on the basis of the updated display area (step S7), and displays the generated display image (step S8).
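For example, the update in step S10 may adjust the display area as in the following sketch; the gesture vocabulary ("pan" and "zoom") and the function names are illustrative assumptions, and rotation may be handled analogously.

    def update_display_area(area, op, amount):
        """Update a (left, top, right, bottom) display area for an operation.

        op is "pan" with amount=(dx, dy) in object coordinates, or "zoom"
        with amount given as a scale factor (> 1.0 zooms in).
        """
        left, top, right, bottom = area
        if op == "pan":
            dx, dy = amount
            return left + dx, top + dy, right + dx, bottom + dy
        if op == "zoom":
            # Shrink the display area about its centre to magnify the image.
            cx, cy = (left + right) / 2.0, (top + bottom) / 2.0
            w, h = (right - left) / amount, (bottom - top) / amount
            return cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2
        return area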
When a link in the display image is selected (YES in step S11), the portable information terminal 100 performs processing based on the link (step S12).
When the portable information terminal 100 determines that the process is not to be ended (NO in step S13), the process returns to step S9. When the portable information terminal 100 determines that the process is to be ended (YES in step S13), the process is ended.
The present invention is not limited to the exemplary embodiment described above. For example, in the portable information terminal 100, display based on an imaged image and display based on an object image may be selectively performed.
In the exemplary embodiment described above, an example in which an image processing apparatus according to an exemplary embodiment is applied to the portable information terminal 100 including an imaging section and a display section has been explained. However, the image processing apparatus may be applied to an apparatus not including an imaging section or a display section. Furthermore, in the exemplary embodiment described above, the infrared camera adaptor 110 is mounted on the camera of the portable information terminal 100 so that an infrared image can be acquired. However, an infrared camera may be provided instead of the camera 104.
In the exemplary embodiment described above, an example in which image information on an object is stored in the portable information terminal 100 has been explained. However, image information on an object may be stored in a server different from the portable information terminal 100, and in this case, the portable information terminal 100 may acquire the image information on the object from the server. Here, the server may perform processing from acquisition of an imaged image from the portable information terminal 100 to generation of a display image or only part of the processing.
Furthermore, the display area set for an object image by the display area setting unit 166 is not limited to an area corresponding to the area of the object contained in the imaged image. For example, the display area may be set on the basis of the distance between the object and the portable information terminal 100, the positional relationship between the portable information terminal 100 and a user, and the like.
The foregoing description of the exemplary embodiments of the present invention has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise forms disclosed. Obviously, many modifications and variations will be apparent to practitioners skilled in the art. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, thereby enabling others skilled in the art to understand the invention for various embodiments and with the various modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the following claims and their equivalents.