The present invention relates to the embedding of information in printed matter and to a method of displaying the embedded information.
In recent years, techniques such as AR (Augmented Reality) have come into use. AR is a technique that produces an augmented representation by adding another piece of information to the information a person perceives in the real world. As an example of the practical use of AR, there is a case where additional information (also referred to as multiplexed information or embedded information) is displayed in a superimposing manner on an image obtained by scanning with a camera included in a mobile terminal, such as a smartphone or a tablet.
Further, there is a technique to embed additional information in printed matter (hereinafter referred to as an information multiplexing technique). Additional information embedded in printed matter by the information multiplexing technique can then be extracted by reading (scanning) the printed matter with a camera or the like.
By using the AR technique and the information multiplexing technique, it is made possible to display additional information for augmented reality in a superimposing manner on a scanned image (see Japanese Patent Laid-Open No. 2013-026922).
However, for example, a case is supposed where the scan-target printed matter is vertically long while the animation to be superimposed on the scanned image is suited to a landscape orientation. In such a case, the direction of the superimposed animation does not match the direction of the printed matter.
Consequently, an object of the present invention is to provide an image processing apparatus capable of embedding additional information so that an appropriate object is displayed in a display direction in accordance with the direction of an original image.
The image processing apparatus according to the present invention is an image processing apparatus that embeds, in print data of printed matter as additional information, information relating to an object that is displayed in a superimposing manner on a captured image, the image processing apparatus including: an embedment unit configured to embed, in the print data as the additional information, information capable of specifying at least a display direction in accordance with a direction of an original image of the printed matter and a type of the object in a case of displaying the object in a superimposing manner on the captured image, wherein the captured image includes an image obtained by capturing the printed matter.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
In the following, preferred embodiments of the present invention are explained in detail with reference to the attached drawings. The following embodiments are not intended to limit the present invention according to the scope of the claims, and not all combinations of features explained in the embodiments are necessarily indispensable to the solution of the present invention.
The main board 110 includes a CPU 111, a program memory 112, a data memory 113, a wireless LAN control circuit 115, an NFC control circuit 116, a line connection control circuit 117, an operation unit control circuit 118, a camera 119, and a nonvolatile memory 120. These components are connected to one another via an internal bus 121.
The CPU 111 in the form of a microprocessor operates in accordance with control programs stored in the program memory 112 in the form of a ROM and contents of data stored in the data memory 113 in the form of a RAM. Further, the CPU 111 communicates with another apparatus connected to the wireless LAN 12 by controlling the wireless LAN unit 102 via the wireless LAN control circuit 115. Further, the CPU 111 is capable of detecting another NFC terminal connected to the NFC 11 and performing transmission and reception of data with that terminal by controlling the NFC unit 101 via the NFC control circuit 116. Further, the CPU 111 is capable of connecting to the mobile telephone line network 13 and performing transmission and reception of voice data and the like by controlling the line connection unit 103 via the line connection control circuit 117. Further, the CPU 111 is capable of displaying information about a printer on the touch panel display 104 and receiving a user operation on the touch panel display 104 by controlling the operation unit control circuit 118. Further, the CPU 111 is capable of capturing an image by controlling the camera 119 and stores image data obtained by image capturing in an image memory 114 of the data memory 113. Further, the CPU 111 is also capable of storing image data acquired from the outside through the mobile telephone line network 13, the wireless LAN 12, or the NFC 11, in addition to image data obtained by image capturing, in the image memory 114, and of transmitting image data to the outside.
The nonvolatile memory 120 includes a memory, such as a flash memory, and is capable of retaining stored data even in the case where the power source is turned off. In the nonvolatile memory 120, for example, image data and programs, such as software for causing the terminal apparatus 100 to implement various functions, are stored, in addition to telephone directory data, various kinds of communication connection information, information on devices that were connected in the past, and so on. In the nonvolatile memory 120, an image processing application, to be described later, as well as the animation management information and the animation data managed by the image processing application, are also stored. Further, an OS for causing the image processing application to operate is also stored.
The main board 210 includes a CPU 211, a program memory 212, a data memory 213, a scanner 215, a printing unit 216, a wireless LAN control circuit 217, an NFC control circuit 218, and an operation unit control circuit 219. These components are connected to one another via an internal bus 220.
The CPU 211 in the form of a microprocessor operates in accordance with control programs stored in the program memory 212 in the form of a ROM and contents of data stored in the data memory 213 in the form of a RAM. Further, the CPU 211 reads a document by controlling the scanner 215 and stores the read image in an image memory 214 of the data memory 213. Further, the CPU 211 is capable of printing image data in the image memory 214 of the data memory 213 on a printing medium by controlling the printing unit 216. Further, the CPU 211 is also capable of transmitting image data obtained by scanning a document to the terminal apparatus 100 or another apparatus connected to the wireless LAN 12 by controlling the wireless LAN unit 203 via the wireless LAN control circuit 217. Further, the CPU 211 is capable of detecting another NFC terminal connected to the NFC 11 and performing transmission and reception of data with that terminal by controlling the NFC unit 202 via the NFC control circuit 218. Further, the CPU 211 is capable of displaying the state of the printing apparatus 200 and a function selection menu on the operation panel 201 and receiving a user operation on the operation panel 201 by controlling the operation unit control circuit 219. It is assumed that the printing apparatus 200 in the present embodiment is capable of printing data in the JPEG file format to which a printer control command is given.
Next, the processing flow of the image processing application that runs on the terminal apparatus 100 is explained.
A user activates the image processing application by operating the touch panel display 104 of the terminal apparatus 100. Then, the image processing application displays a top screen on the touch panel display 104.
In the case where the “Create” button 402 is tapped (YES at step S302), the image processing application makes a transition into decoration message embedment processing (also referred to as decoration message multiplexing processing) (step S303). The decoration message embedment processing is processing to generate print data in which a decoration message for decorating a scanned image is embedded, and it will be described later.
The image processing application manages the type of animation, the display method, and the animation data by using the animation management information. In the animation management information, an IndexNo that uniquely identifies each entry, the type (category) of the animation, the display method (portrait or landscape), and the corresponding animation data are registered in association with one another.
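As a concrete illustration, the animation management information can be pictured as a simple lookup table keyed by IndexNo. The following is a minimal sketch in Python; the IndexNo values 1 to 4 follow the wedding examples given later in the embodiment, while the birthday rows, the field layout, and all file names are assumptions made only for illustration.

```python
# Minimal sketch of the animation management information.
# IndexNo 1-4 match the wedding examples in the embodiment; the birthday
# rows and every file name are illustrative assumptions.
ANIMATION_MANAGEMENT_INFO = {
    # IndexNo: (type/category, display method, animation data file)
    1: ("wedding #1",  "portrait",  "wedding1_portrait.anim"),
    2: ("wedding #1",  "landscape", "wedding1_landscape.anim"),
    3: ("wedding #2",  "portrait",  "wedding2_portrait.anim"),
    4: ("wedding #2",  "landscape", "wedding2_landscape.anim"),
    5: ("birthday #1", "portrait",  "birthday1_portrait.anim"),
    6: ("birthday #1", "landscape", "birthday1_landscape.anim"),
    7: ("birthday #2", "portrait",  "birthday2_portrait.anim"),
    8: ("birthday #2", "landscape", "birthday2_landscape.anim"),
}
```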
In the case where a user selects animation by operating the animation selection screen, the image processing application displays an animation check screen (step S502).
Next, the image processing application determines whether the direction of the printing-target image selected on the image selection screen is the vertical direction or the horizontal direction (step S504). The direction of an image is determined based on direction information (for example, the Exif information of JPEG) included in the image data. In the case where direction information is not included in the image data, it may also be possible to perform the determination based on the aspect ratio of the image. For example, the direction may be determined to be the horizontal direction in the case where the aspect ratio of the image (for example, a value whose denominator is the number of pixels in the vertical direction of the image and whose numerator is the number of pixels in the horizontal direction) is 1 or larger, and to be the vertical direction in the other cases.
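A sketch of this determination (step S504) might look as follows. The function name and the Exif handling are assumptions; a real implementation would read the Orientation tag with an image library rather than receive it as an argument.

```python
def determine_image_direction(width_px, height_px, exif_orientation=None):
    """Determine whether the printing-target image is vertical or horizontal.

    Prefers the direction information contained in the image data (the
    Exif Orientation tag of a JPEG) and falls back to the aspect-ratio
    rule described above when no direction information is present.
    """
    if exif_orientation is not None:
        # Orientation values 5-8 mean the image is displayed rotated by
        # 90 or 270 degrees, so the effective width and height swap.
        if exif_orientation in (5, 6, 7, 8):
            width_px, height_px = height_px, width_px
    # Aspect ratio = horizontal pixels / vertical pixels:
    # 1 or larger -> horizontal, otherwise -> vertical.
    return "horizontal" if width_px / height_px >= 1 else "vertical"
```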
In the case where the printing-target image is in the vertical direction (YES at step S504), the image processing application specifies, from the animation management information, the animation that belongs to the category selected on the animation selection screen and whose display method is the portrait. Then, the image processing application saves the IndexNo corresponding to the specified animation in the data memory 113 (step S505). At this time, for example, in the case where the animation selected on the animation selection screen is wedding #1, 1 is saved as IndexNo; in the case where it is wedding #2, 3 is saved as IndexNo. In the case where the printing-target image is in the horizontal direction (NO at step S504), the image processing application specifies, from the animation management information, the animation that belongs to the selected category and whose display method is the landscape, and saves the corresponding IndexNo in the data memory 113 (step S506). At this time, for example, in the case where the selected animation is wedding #1, 2 is saved as IndexNo; in the case where it is wedding #2, 4 is saved as IndexNo. As described above, a user selects a desired category from among a plurality of categories of animation by selecting one of the buttons 601 to 604. Then, whether animation data of the portrait display method or animation data of the landscape display method is selected for the desired category is determined automatically at steps S504 to S506. Because of this, appropriate animation data corresponding to the desired category and in accordance with the direction of the printing-target image is selected without the user having to take the direction of the printing-target image into consideration, as sketched below.
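Steps S505 and S506 then reduce to a lookup against the management table sketched earlier. The following minimal sketch (the function name is an assumption) reproduces the IndexNo examples given above.

```python
def select_index_no(category, image_direction):
    """Select the IndexNo whose type matches the user-selected category
    and whose display method matches the direction of the printing-target
    image (steps S504 to S506)."""
    display_method = "portrait" if image_direction == "vertical" else "landscape"
    for index_no, (cat, method, _data_file) in ANIMATION_MANAGEMENT_INFO.items():
        if cat == category and method == display_method:
            return index_no
    raise LookupError(f"no animation registered for {category} / {display_method}")

# Matches the examples in the embodiment:
# select_index_no("wedding #1", "vertical")   -> 1
# select_index_no("wedding #2", "horizontal") -> 4
```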
Next, the image processing application displays an editing screen (step S507).
In the case where the selected animation check area 904 is tapped on the editing screen, the image processing application determines that the user is going to input a message (YES at step S508) and displays a message input screen.
In the case where a “Next” button 905 is tapped on the editing screen (NO at step S508), the image processing application displays a printing setting screen.
A user gives instructions to perform printing by tapping a Print button 1105 after configuring the desired printing settings on the printing setting screen. Upon receipt of the instructions to perform printing, the image processing application generates print data (step S511) and transmits the print data to the printer (here, the printing apparatus 200) (step S512). Due to this, printed matter in which the additional information is embedded is output by the printer.
Here, the processing (print data generation processing) at step S511 is explained.
Upon receipt of the instructions to perform printing, the image processing application generates RGB raster data by performing rendering on the image data in the SVG format generated by the processing preceding step S511 (step S1201).
Next, the image processing application embeds the additional information in the raster data by using the information multiplexing technique (step S1202). At this time, the image processing application includes, in the additional information, the IndexNo corresponding to the animation saved in the data memory 113 at step S505 or S506 and the text message saved in the data memory 113 at step S508. In the present embodiment, an information multiplexing method that embeds the additional information in a state where it is difficult to visually recognize the additional information is used. By adopting such a multiplexing method, the appearance of a printed image is not marred, unlike the multiplexing method of including an AR tag or the like in a partial area of printed matter as described in Japanese Patent Laid-Open No. 2013-026922. A well-known technique may be used as the method of embedding additional information in printed matter and as the method of extracting additional information from a scanned image, so detailed explanation thereof is omitted.
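The embodiment deliberately leaves the multiplexing method to well-known techniques. Purely to make the data flow concrete, the sketch below hides the payload (one byte of IndexNo followed by the UTF-8 text message) in the least significant bit of the blue channel, which is hard to perceive visually. Note that a production method must be robust to JPEG compression and to the print-and-scan cycle, which plain LSB embedding is not; everything here, including the function name, is an illustrative assumption.

```python
import numpy as np

def embed_additional_info(raster_rgb, index_no, message=""):
    """Illustrative sketch only: hide the additional information in the
    least significant bits of the blue channel.  raster_rgb is assumed
    to be an HxWx3 uint8 array."""
    payload = bytes([index_no]) + message.encode("utf-8")
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    blue = raster_rgb[:, :, 2].flatten()
    if bits.size > blue.size:
        raise ValueError("image too small for the payload")
    # Clear each carrier pixel's lowest bit and write one payload bit.
    blue[:bits.size] = (blue[:bits.size] & 0xFE) | bits
    raster_rgb[:, :, 2] = blue.reshape(raster_rgb.shape[:2])
    return raster_rgb
```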
Next, the image processing application converts the raster data in which the additional information is embedded into print data (JPEG data) (step S1203). In the case where the printer is capable of printing data in a file format other than JPEG (for example, PDF), it may also be possible to convert the raster data into data in that file format. Lastly, the image processing application gives a print command (print control command) to the generated print data (step S1204).
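Steps S1203 and S1204 then amount to encoding the raster data as JPEG and prepending the printer control command. A minimal sketch using Pillow follows; the function name is an assumption, and the command bytes are printer-specific, so they are passed in as an opaque argument.

```python
from io import BytesIO
from PIL import Image

def to_print_data(raster_rgb, print_command=b""):
    """Convert the raster data into JPEG print data (step S1203) and
    give it a print control command (step S1204).  The command format
    depends on the target printer and is left opaque here."""
    buf = BytesIO()
    Image.fromarray(raster_rgb, mode="RGB").save(buf, format="JPEG")
    return print_command + buf.getvalue()
```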
Next, the decoration message display processing is explained.
In the case where a transition is made into the decoration message display processing, the image processing application activates a camera application and brings about a state where it is possible to capture printed matter (step S1301). Then, a user operates the camera application and scans printed matter in which additional information is embedded (more specifically, printed matter printed based on print data in which additional information is embedded). Here, the scan is the operation of reading an image printed on printed matter by capturing the printed matter; consequently, in the following, a scanned image may also be referred to as a captured image.
Next, the image processing application extracts the additional information from the scanned image (step S1302). The image processing application specifies the type (here, wedding #1, wedding #2, birthday #1, or birthday #2) of the animation embedded in the scanned image from the extracted IndexNo. Further, the image processing application specifies the display method (here, portrait or landscape) of the animation from the extracted IndexNo. The image processing application then reads the animation data corresponding to the specified type and display method from the nonvolatile memory 120. Because the animation is specified uniquely by IndexNo, it may also be possible to read the animation data corresponding to IndexNo from the nonvolatile memory 120 without referring to the type of the animation. The image processing application displays the scanned image of the printed matter on the screen (touch panel display 104) of the terminal apparatus 100 with the animation superimposed thereon by using the read animation data (step S1303). Further, in the case where a text message is extracted at step S1302, the image processing application displays the text message in a superimposing manner on the scanned image in accordance with the display method of the animation.
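For steps S1302 and S1303, the counterpart sketch below recovers the payload and resolves the animation through the management table sketched earlier. As with the embedding sketch, the LSB reading is illustrative only: it assumes a losslessly recovered raster and a known payload length, whereas a real scan requires a multiplexing method robust to printing and capture.

```python
import numpy as np

def extract_additional_info(raster_rgb, payload_len):
    """Illustrative counterpart of embed_additional_info: read back
    payload_len bytes from the blue-channel LSBs (step S1302)."""
    bits = raster_rgb[:, :, 2].flatten()[:payload_len * 8] & 1
    payload = np.packbits(bits).tobytes()
    return payload[0], payload[1:].decode("utf-8")  # IndexNo, text message

def resolve_animation(index_no):
    """Specify the type and display method uniquely from the extracted
    IndexNo and return the animation data to superimpose (step S1303)."""
    category, display_method, data_file = ANIMATION_MANAGEMENT_INFO[index_no]
    return category, display_method, data_file
```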
As described above, in the present embodiment, the display direction (display method) of the animation that is displayed in a superimposing manner on a captured image (scanned image) obtained by capturing printed matter is determined in accordance with the direction of the printing-target image. Then, information capable of specifying at least the type of the animation and the determined display direction of the animation is embedded in the print data of the printed matter as additional information. Further, the additional information (IndexNo) is extracted from the scanned image obtained by scanning the printed matter, and the type and the display direction of the animation that should be superimposed on the scanned image are specified. Due to this, an appropriate animation whose display direction accords with the direction of the original image is displayed in a superimposing manner on the scanned image.
In the present embodiment, the example is explained in which the IndexNo corresponding to the animation and a text message are embedded as additional information. However, it may also be possible to embed, as the additional information, information indicating the direction of the printing-target image.
Further, in the present embodiment, explanation is given by taking as an example the image processing application that performs both the decoration message embedment processing and the decoration message display processing. However, an application that performs the decoration message embedment processing and another application that performs the decoration message display processing may each be installed in the terminal apparatus 100. Then, it may also be possible for the CPU 111 of the terminal apparatus 100 to perform the processing in accordance with each of the applications. Further, the application that performs the decoration message embedment processing and the application that performs the decoration message display processing may be installed in different terminal apparatuses. In such a case, each of the terminal apparatuses holds the above-described animation management information. It may also be possible to store the animation management information in a common storage device that can be accessed from each terminal apparatus.
Further, for example, a table capable of uniquely identifying the type of an object (the category selected by a user on the animation selection screen) and the display direction of the object may be used as the animation management information.
Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
According to the present invention, it is possible to embed additional information so that an appropriate object is displayed in a display direction in accordance with the direction of an original image.
Further, it may also be possible for an operating system (OS) or the like running on a computer to perform part or all of the actual processing based on instructions of a program, and one or more of the functions of the embodiment described previously may be implemented by the processing.
Furthermore, it may also be possible to write a program read from a storage medium into a memory included on a function extension board inserted into a computer or in a function extension unit connected to the computer. Then, it may also be possible for a CPU or the like included on the function extension board or in the function extension unit to perform part or all of the actual processing based on instructions of the program, and one or more of the functions of the embodiment described previously may be implemented by the processing.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2017-126770, filed Jun. 28, 2017, which is hereby incorporated by reference wherein in its entirety.