The present application claims priority under 35 U.S.C. §119 to Japanese Patent Application No. 2012-277324, filed Dec. 19, 2012. The contents of this application are incorporated herein by reference in their entirety.
1. Field of the Invention
The present invention relates to an image processing terminal, an image processing system, and a computer-readable storage medium storing a control program of an image processing terminal.
2. Discussion of the Background
In recent years, in the field of smartphones, tablet terminals, and other image processing terminals capable of image processing, active research has been under way on technology for obtaining various kinds of information from a database on a server and displaying the obtained information on a terminal, in an attempt to minimize the amount of data retained at the terminal side. For example, Japanese Unexamined Patent Application Publication No. 2007-264992 discloses use of this technology to facilitate document search.
Another example is recited in Japanese Unexamined Patent Application Publication No. 2009-230748, which discloses placing a two-dimensional bar code stamp on a printed material, such as a paper medium, and scanning the stamp with an image processing terminal, for the purpose of using the stamp in document authentication.
For further example, Japanese Unexamined Patent Application Publication No. 2009-301350 discloses a technique associated with an imaginary image display system. In the imaginary image display system, a photographing section photographs a real image, and in response, an image object is read from an image object storage. The image object is superimposed over the real image, resulting in an imaginary image. By displaying the resulting imaginary image, an expanded sense of reality is generated.
Many image processing terminals such as smartphones and tablet terminals are provided with a camera capability such as a CMOS image sensor. The camera capability is used to photograph a particular subject on a paper medium or other printed material. In accordance with the photographed real image of the subject, an image object (which is herein referred to as air tag) is superimposed over the real image, resulting in an imaginary image. The resulting imaginary image is displayed on the display of an image processing terminal. In this manner, it is possible to generate an expanded sense of reality.
A possible air-tag application is that while a printed material such as an operation manual of any of various appliances is photographed by an image processing terminal, the display of the image processing terminal displays a real-time image of the printed material and an air tag superimposed over the real-time image. In this case, air tags may be superimposed over a particular position (for example, a particular word and a particular drawing) on the printed material displayed in real time on the display. The air tags indicate item names of electronic information to be added (which will be hereinafter referred to as electronic additional information, examples including help information explaining the word, a detailed configuration of the drawing, and moving image information on appliance operation statuses).
Specifically, using image analysis technology, the image processing terminal analyzes a two-dimensional code or a similar identifier printed in advance on a printed material, thus specifying the printed material. When electronic additional information exists that is related in advance to the printed material, the electronic additional information is read and the display displays air tags in the vicinity of a particular position on the printed material specified in the electronic additional information. A possible manner of displaying the air tags to display item names indicating the content of the electronic additional information is by the use of text in what is called a balloon often used in cartoons. Then, provided the display has a touch screen capability, a user can touch any of the air tags displayed on the display to have the electronic additional information displayed on the display.
Here, it is possible to store the electronic additional information not in the image processing terminal but in a database on a server that is connectable to the image processing terminal through a network. In this case, where necessary, the user can handle the image processing terminal to call up the electronic additional information. This ensures a reduced memory capacity of the image processing terminal, and additionally, ensures collective update of the electronic additional information at the server side.
Unfortunately, with an image processing terminal capable of displaying air tags, displaying a large number of air tags on the touch screen display can cause an erroneous touch on an adjacent air tag on the touch screen display. Thus, it can be difficult to select a desired air tag. This problem applies particularly in smartphones and similar image processing terminals that have a limited display area.
The present invention has been made in view of the above-described circumstances, and it is an object of the present invention to provide an image processing terminal, an image processing system, and a control program of an image processing terminal that ensure selection of a desired air tag even when a large number of air tags exist.
According to one aspect of the present invention, an image processing terminal includes a photographing section, a display section, an electronic additional information obtaining section, a display control section, and an image analysis section. The photographing section is configured to photograph an image of a subject. The display section is configured to display the image of the subject photographed by the photographing section. The electronic additional information obtaining section is configured to obtain electronic additional information to be added to the image of the subject photographed by the photographing section. The display control section is configured to control the display section to display the image of the subject with at least one air tag superimposed over the image of the subject. The at least one air tag corresponds to the electronic additional information obtained by the electronic additional information obtaining section. The image analysis section is configured to analyze the image of the subject photographed by the photographing section. The at least one air tag includes a plurality of air tags. In the case where, while the display section is displaying the plurality of air tags, the photographing section photographs an operation of pointing to a particular position within the subject by a fingertip of a user of the image processing terminal, the image analysis section is configured to specify a position of the fingertip in the subject photographed by the photographing section so as to specify the particular position within the subject pointed to by the fingertip, and the display section is configured to display an air tag, among the plurality of air tags, corresponding to the particular position specified by the image analysis section.
According to another aspect of the present invention, an image processing terminal includes a photographing section, a display section, an electronic additional information obtaining section, a display control section, and an image analysis section. The photographing section is configured to photograph an image of a subject. The display section is configured to display the image of the subject photographed by the photographing section. The electronic additional information obtaining section is configured to obtain electronic additional information to be added to the subject photographed by the photographing section. The display control section is configured to control the display section to display the image of the subject with at least one air tag superimposed over the image of the subject. The at least one air tag corresponds to the electronic additional information obtained by the electronic additional information obtaining section. The image analysis section is configured to analyze the image of the subject photographed by the photographing section. The at least one air tag includes a plurality of air tags. In the case where, while the display section is displaying the plurality of air tags, the photographing section executes zoom-in or closeup with respect to a particular position within the subject, the image analysis section is configured to specify the particular position within the subject that has been subjected to the zoom-in or the closeup by the photographing section, and the display section is configured to display an air tag, among the plurality of air tags, corresponding to the particular position specified by the image analysis section.
According to another aspect of the present invention, an image processing system includes any of the above-described image processing terminals, and a server connectable to the image processing terminal through a network. The server includes a database that stores electronic additional information of a subject in relation to a particular code to specify the subject. The image processing terminal is configured to transmit to the server a particular code to specify the subject photographed by a photographing section of the image processing terminal, and is configured to receive from the server the electronic additional information of the subject so that an electronic additional information obtaining section of the image processing terminal obtains the electronic additional information.
According to another aspect of the present invention, a computer-readable storage medium stores a control program of an image processing terminal. The image processing terminal includes a photographing section and a display section. The photographing section is configured to photograph an image of a subject. The display section is configured to display the image of the subject photographed by the photographing section. The control program causes a computer to perform obtaining electronic additional information to be added to the subject photographed by the photographing section. The display section is controlled to display the image of the subject with at least one air tag superimposed over the image of the subject. The at least one air tag corresponds to the electronic additional information obtained in the obtaining step. In the case where, while the display section is displaying a plurality of air tags in the controlling step, the photographing section photographs an operation of pointing to a particular position within the subject by a fingertip of a user of the image processing terminal, a position of the fingertip in the subject photographed by the photographing section is specified so as to specify the particular position within the subject pointed to by the fingertip. The display section is controlled to display an air tag, among the plurality of air tags, corresponding to the particular position specified in the specifying step.
According to the other aspect of the present invention, a computer-readable storage medium stores a control program of an image processing terminal. The image processing terminal includes a photographing section and a display section. The photographing section is configured to photograph an image of a subject. The display section is configured to display the image of the subject photographed by the photographing section. The control program causes a computer to perform obtaining electronic additional information to be added to the subject photographed by the photographing section. The display section is controlled to display the image of the subject with at least one air tag superimposed over the image of the subject. The at least one air tag corresponds to the electronic additional information obtained in the obtaining step. In the case where, while the display section is displaying a plurality of air tags in the controlling step, the photographing section executes zoom-in or closeup with respect to a particular position within the subject, the particular position within the subject that has been subjected to the zoom-in or the closeup by the photographing section is specified. The display section is controlled to display an air tag, among the plurality of air tags, corresponding to the particular position specified in the specifying step.
A more complete appreciation of the invention and many of the attendant advantages thereof will be readily obtained as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings, wherein:
This embodiment is regarding an image processing system and a method for processing an image. In the image processing system and the method, a user of the image processing terminal performs an operation of pointing to a particular position within the subject at a fingertip of the user on the subject instead of on the display section. Based on this operation, an image processing section specifies the position of the fingertip on the display section, thereby specifying the particular position within the subject. Among a plurality of air tags, only an air tag corresponding to the particular position is displayed on the display section.
The image processing terminal 100 is connectable to the network 200 through a wireless LAN access point 201. The image processing terminal 100 is also connectable to the server 300 through the network 200. The server 300 includes a database 301.
The image processing terminal 100 according to this embodiment is capable of, using the photographing section 102, photographing a subject MN, which is a paper medium or other printed material indicated as “original” in
Here is a possible example of how to use air tags. In the case where the subject MN is an operation manual or a similar printed material of any of various kinds of appliances, while the photographing section 102 is photographing the subject MN, the display section 101 may display air tags in such a manner that the air tags are superimposed over a real-time image of the subject MN, which is a printed material, or that the air tags are superimposed over electronic data of the subject MN (for example, PDF (Portable Document Format) data or JPEG (Joint Photographic Experts Group) data, which would be obtained by scanning the subject MN in advance). In this case, in accordance with a particular position within the printed material (for example, a particular word and a particular drawing) displayed on the display section 101, the image processing section 103 may show the superimposed air tags as item indicators of electronic additional information that are electronic information corresponding to the particular position and to be added to the subject MN (examples of the electronic additional information including help information explaining the word, a detailed configuration of the drawing, and moving image information on appliance operation statuses).
Here, the electronic additional information is not necessarily stored in the image processing terminal 100. The electronic additional information may also be stored, together with the electronic data of the subject MN, in a database 301 on the server 300, which is connectable to the image processing terminal 100 through the network 200. This ensures that the user, as necessary, can handle the image processing terminal 100 to call up the electronic additional information and the electronic data of the subject MN. Thus, the electronic additional information is stored not at the image processing terminal 100 side but in the database 301 on the server 300. This ensures a reduced memory capacity of the image processing terminal 100, and additionally, ensures collective update of the electronic additional information at the server 300 side. The following description takes, as an example, storing the electronic additional information in the database 301 on the server 300.
As shown in
The air tag ATa is disposed on the display section 101 at a position specified by the position information in the area electronic additional information 3a. Here, the air tag ATa is an item indicating the content of the electronic additional information, and is displayed in text form in what is called a balloon often used in cartoons (in this example, the indication “Additional information”).
First, the user handles the image processing terminal 100 to activate an application program according to this embodiment (step S1). Next, the subject MN is photographed (step S2).
Here, the image processing section 103 includes an image analysis section 103a, a display control section 103b, an electronic additional information obtaining section 103c, and finger detection data 103d.
The image analysis section 103a determines, by a known image analysis technique, whether the photographed subject MN contains a particular code CD, such as a two-dimensional code. When a particular code CD is contained, the image analysis section 103a specifies a content indicated by the particular code CD (in this example, the ID information 1, which indicates the first page of the document A) (step S3). Then, the electronic additional information obtaining section 103c receives, from the image analysis section 103a, the ID information 1 obtained using the particular code CD as a key, and transmits the ID information 1 to the server 300 from the network interface 104 through the network 200 (step S4). Thus, the image processing terminal 100 requests, from the database 301 on the server 300, the electronic data of the first page, MNa, of the document A and the electronic additional information ESa of the first page, MNa, of the document A, which are stored in the database 301 in correspondence with the ID information 1 indicated by the particular code CD.
The server 300 receives the ID information 1 transmitted from the image processing terminal 100 (step S5), and searches the database 301 for electronic data and electronic additional information that correspond to the ID information 1 (step S6). When electronic data and electronic additional information that correspond to the ID information 1 exist, the server 300 transmits the electronic data and electronic additional information to the image processing terminal 100 (step S7). Thus, the network interface 104 receives the electronic data and electronic additional information from the server 300, and the electronic additional information obtaining section 103c obtains the electronic data and electronic additional information of the first page, MNa, of the document A (step S8). When the search in the database 301 finds no relevant electronic data or electronic additional information, the server 300 may transmit to the image processing terminal 100 error information or similar information notifying search failure, and the image processing terminal 100 may display an error message.
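The server-side exchange at steps S5 to S7 can be sketched as follows. This is a minimal illustration only; the names DATABASE and lookup, the dictionary layout, and the file name are hypothetical assumptions, not part of the disclosure.

```python
# Hypothetical in-memory stand-in for the database 301, keyed by the
# ID information decoded from the particular code CD.
DATABASE = {
    "ID1": {  # first page, MNa, of the document A
        "electronic_data": "document_A_page_1.pdf",
        "additional_info": [
            {"position": (120, 80), "item": "Additional information"},
        ],
    },
}

def lookup(id_info):
    """Search the database for the received ID information (step S6) and
    build the response transmitted back to the terminal (step S7)."""
    entry = DATABASE.get(id_info)
    if entry is None:
        # No relevant electronic data or additional information found:
        # return error information notifying the terminal of search failure.
        return {"status": "error", "message": "no data for " + id_info}
    return {"status": "ok",
            "electronic_data": entry["electronic_data"],
            "additional_info": entry["additional_info"]}
```

A lookup for an unknown ID returns the error response instead of raising, matching the behavior in which the server transmits error information on search failure.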
The image analysis section 103a determines, by a known image analysis technique, the paper area and paper orientation of the first page, MNa, of the photographed document A. Then, based on the determination, the image analysis section 103a specifies the origin position (for example, the upper left corner of the paper) of the first page, MNa, of the document A (step S9). Next, based on the position information contained in each of the pieces of area electronic additional information contained in the electronic additional information, the display control section 103b checks a position relative to the origin position specified at step S9. Thus, the display control section 103b recognizes the display positions of all the air tags respectively corresponding to the pieces of area electronic additional information obtained from the electronic additional information on the subject MN displayed on the display section 101. The display control section 103b then displays all the air tags in such a manner that the air tags are superimposed over the image of the subject MN at their respective display positions (step S10).
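The position check against the origin at steps S9 and S10 amounts to a simple coordinate translation, which can be sketched as follows. The function name and the dictionary shape of the area electronic additional information are illustrative assumptions.

```python
def display_positions(origin, area_info):
    """For each piece of area electronic additional information, convert
    its position (stored relative to the paper origin specified at step
    S9, e.g. the upper left corner) into absolute coordinates at which
    the corresponding air tag is superimposed (step S10)."""
    ox, oy = origin
    return {name: (ox + rx, oy + ry) for name, (rx, ry) in area_info.items()}
```

For example, with the origin detected at (50, 40) and an air tag stored at relative position (120, 80), the tag is superimposed at (170, 120).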
Specifically, when the first page, MNa, of the document A shown in
Regarding how to display the air tags, it is possible to superimpose the air tags over the first page, MNa, of the document A displayed in real time on the display section 101. It is also possible to superimpose the air tags over the electronic data of the first page, MNa, of the document A obtained by the electronic additional information obtaining section 103c.
Here, displaying a large number of air tags on the display section 101 as shown in
In view of this, in this embodiment, as shown in
Specifically, referring to the flowchart in
Here, the image analysis section 103a uses the finger detection data 103d to perform known image analysis, thereby specifying the position of the fingertip of the user pointing to the display section 101. The finger detection data 103d contains known data about fingertip image detection such as image analysis color data indicating the contrast between the fingertip color (usually flesh color) and the color (usually white) of the printed material, and image analysis shape data indicating the fingertip shape. Then, the image analysis section 103a converts the position of the fingertip FG into two-dimensional coordinates on the subject MN. In the operations at steps S11 and S12 to determine the position to which the fingertip FG points, it is possible to make the determination when, for example, the fingertip FG detected based on the finger detection data 103d stays within a predetermined distance range for over a predetermined period of time.
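The dwell criterion described above (the fingertip staying within a predetermined distance range for over a predetermined period of time) can be sketched as follows. The function name, the sample format, and the thresholds are illustrative assumptions; actual fingertip detection would rely on the color and shape analysis using the finger detection data 103d.

```python
import math

def fingertip_dwells(samples, max_radius, min_duration):
    """Determine the pointing operation at steps S11-S12: True when the
    fingertip positions detected over successive frames stay within
    max_radius of the first sample for at least min_duration seconds.
    samples: list of (timestamp, (x, y)) pairs in subject coordinates."""
    if not samples:
        return False
    t0, (x0, y0) = samples[0]
    for _, (x, y) in samples:
        if math.hypot(x - x0, y - y0) > max_radius:
            return False
    return samples[-1][0] - t0 >= min_duration
```

A fingertip that wanders outside the radius, or that has not yet stayed long enough, is not treated as a pointing operation.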
Next, the display control section 103b specifies, as the particular position pointed to for air tag display, one of the air tag displayable positions PTa to PTc within the subject MNa that is closest to the converted coordinates (step S13). Then, the display control section 103b displays on the display section 101 only the air tag, among the plurality of air tags, corresponding to the particular position specified at step S13 (step S14). This state corresponds to the right end representation of
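The closest-position selection at step S13 can be sketched as a nearest-neighbor pick over the displayable positions. The function name and the dictionary format mapping position names to coordinates are illustrative assumptions.

```python
import math

def nearest_displayable_position(fingertip, positions):
    """Step S13: among the air-tag displayable positions within the
    subject (e.g. PTa to PTc), return the name of the one closest to
    the fingertip coordinates converted at step S12."""
    fx, fy = fingertip
    return min(positions,
               key=lambda name: math.hypot(positions[name][0] - fx,
                                           positions[name][1] - fy))
```

Only the air tag at the returned position would then be displayed at step S14.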
Then, the display control section 103b displays on the display section 101 the content of the electronic additional information (for example, moving image information on appliance operation statuses) corresponding to the air tag corresponding to the particular position.
When in the state ST1 an air tag is specified by the fingertip, the state ST1 shifts to a state ST2, in which only the designated air tag is displayed. When in the state ST2 the user moves the fingertip FG off the subject MN and the fingertip FG is no longer displayed on the display section 101, or when the user acts as if the user were undecided, moving the fingertip FG restlessly, then it is possible to delete the specified air tag and return the state ST2 to the state ST1. Then, when the user points to another air tag at the fingertip FG, the state ST1 again shifts to the state ST2.
In the state ST2, when the user performs a double tap operation (which is an operation of quickly tapping the fingertip on the paper twice) while pointing to the particular position on the subject MN, then the image analysis section 103a recognizes this operation by known image analysis using the finger detection data 103d, and the state ST2 shifts to a state ST3, in which the content of the electronic additional information is displayed while being superimposed over the subject MN displayed in real time.
In the state ST3, the display control section 103b deletes the designated air tag that is on display, and superimposes the content of the electronic additional information (for example, help information explaining the word, a detailed configuration of the drawing, and moving image information on appliance operation statuses) corresponding to the air tag over the subject MN displayed in real time. Thus, the content of the electronic additional information is displayed in an overlaid manner. Then, when the display of the content of the electronic additional information ends, or when the user makes a force-quit command with respect to the display of the content of the electronic additional information (by performing, for example, a pinch-in operation (which is an operation of two fingertips on the paper shifting from a state in which the two fingertips are apart from one another to a state in which they are close to one another)), then the state ST3 returns to the state ST2.
In the state ST3, the content of the electronic additional information is displayed in an overlaid manner, that is, the content of the electronic additional information is superimposed over the subject MN displayed in real time. With the subject MN displayed in real time, however, viewing difficulties can occur due to hand jiggling or similar occurrences. In this case, in the state ST2, the user may perform a pinch-out operation using two fingers while pointing to the particular position on the subject MN. (The pinch-out operation is an operation of two fingertips on the paper shifting from a state in which the two fingertips are close to one another to a state in which they are apart from one another.) When the pinch-out operation is performed, the image analysis section 103a recognizes this operation by known image analysis using the finger detection data 103d, and the state ST2 shifts to a state ST4, in which the content of the electronic additional information is displayed while being superimposed over the electronic data of the subject MN.
In the state ST4, the display control section 103b deletes the designated air tag that is on display and ends the real-time display of the real image. Then, the display control section 103b superimposes the content of the electronic additional information (for example, help information explaining the word, a detailed configuration of the drawing, and moving image information on appliance operation statuses) corresponding to the air tag over the electronic data of the subject MN (for example, PDF data or JPEG data obtained by scanning the subject MN in advance) called up from the database 301 in advance. Thus, the content of the electronic additional information is displayed in an overlaid manner. In this manner, use of the electronic data of the subject MN to display the content of the electronic additional information in an overlaid manner ensures clear display of the subject MN on the display section 101 regardless of hand jiggling or similar occurrences. Then, when the display of the content of the electronic additional information ends, or when the user makes a force-quit command with respect to the display of the content of the electronic additional information (by performing, for example, the pinch-in operation), then the state ST4 returns to the state ST2.
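The transitions among the states ST1 to ST4 described above form a small state machine, which can be sketched as follows. The gesture event names (point, release, double_tap, pinch_out, pinch_in, end) are hypothetical labels for the operations in the text, not terms from the disclosure.

```python
# Transition table for the states ST1 to ST4. Gestures not listed for
# the current state leave the state unchanged.
TRANSITIONS = {
    ("ST1", "point"): "ST2",       # fingertip specifies an air tag
    ("ST2", "release"): "ST1",     # fingertip leaves the subject MN
    ("ST2", "double_tap"): "ST3",  # overlay content on the real-time image
    ("ST2", "pinch_out"): "ST4",   # overlay content on the electronic data
    ("ST3", "pinch_in"): "ST2",    # force-quit command
    ("ST3", "end"): "ST2",         # display of the content ends
    ("ST4", "pinch_in"): "ST2",
    ("ST4", "end"): "ST2",
}

def next_state(state, gesture):
    """Return the state after the recognized gesture."""
    return TRANSITIONS.get((state, gesture), state)
```

For instance, a double tap recognized in ST2 moves to ST3, while a pinch-in in ST3 or ST4 returns to ST2.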
In the above example, prior to the operation of pointing by the user of the image processing terminal 100 at the fingertip FG, the display control section 103b of the image processing section 103 displays on the display section 101 all the plurality of air tags to be displayed. (This example corresponds to
In this case, with all the air tags displayed, the user is able to, in advance, grasp the whole picture of the plurality of pieces of electronic additional information corresponding to the subject MN.
In this state, when the user of the image processing terminal 100 performs the operation of pointing at the fingertip FG, the display control section 103b may highlight only the specified air tag such as by changing the text size or the color of the specified air tag.
Then, the display control section 103b displays only the specified air tag, while deleting the display of the other air tags than the specified air tag. (This state corresponds to the right end representation of
It should be noted, however, that the present invention does not exclude a configuration in which the display control section 103b of the image processing section 103 does not display the plurality of air tags on the display section 101 prior to the operation of pointing by the user of the image processing terminal 100 at the fingertip FG.
It is of course possible to display all the plurality of air tags on the display section 101 at the time when the image analysis section 103a specifies the particular code CD and the electronic additional information obtaining section 103c obtains, using the particular code CD as a key, the electronic additional information and the electronic data of the subject MN from the database 301. In this case, however, a large number of air tags would be displayed as in
In view of this, in this case, instead of displaying the plurality of air tags to be displayed on the display section 101, it is possible to display only the air tag corresponding to the position specified by the user's operation of pointing on the subject MN at the fingertip FG. This ensures a simple display screen with minimized air tag display.
With the image processing system and the method for processing an image according to this embodiment, based on the operation of pointing, on the subject MN instead of on the display section 101, to a particular position within the subject MN by the user of the image processing terminal 100 at the fingertip FG, the image processing section 103 specifies the position of the fingertip FG on the display section 101 by image analysis, thereby specifying the particular position within the subject MN. Then, the image processing section 103 displays on the display section 101 only the air tag, among the plurality of air tags, corresponding to the particular position. This eliminates or minimizes difficulty in selecting an air tag, and realizes an image processing system and a method for processing an image that ensure selection of a desired air tag even when a large number of air tags exist.
Also in the image processing system and the method for processing an image according to this embodiment, it is possible, after the operation of pointing at the fingertip FG, to delete the display of the air tags other than the air tag corresponding to the particular position, and display only the air tag corresponding to the particular position. In this case, the user is able to, in advance, grasp the whole picture of the plurality of pieces of electronic additional information corresponding to the subject MN, while at the same time the user will not be annoyed by the display of unnecessary pieces of electronic additional information when the user needs access to a necessary piece of electronic additional information.
This embodiment is a modification of the image processing system and the method for processing an image according to embodiment 1. In this embodiment, instead of specifying the particular position based on the user's pointing to the particular position within the subject at a fingertip of the user, the image processing section 103 specifies the particular position within the subject MN based on an operation, performed by the user of the image processing terminal 100, of zooming in on a particular position within the subject MN displayed on the display section 101 or of making the photographing section 102 close to the particular position within the subject MN. Then, the image processing section 103 displays on the display section 101 only the air tag, among the plurality of air tags, corresponding to the particular position.
Here, displaying a large number of air tags on the display section 101 as shown in
In view of this, in this embodiment, the desired position at which to display the electronic additional information is designated among the air tag displayable positions PTa to PTc within the subject MN in the following manner. Specifically, the user of the image processing terminal 100 performs an operation of zooming in on a particular position within the subject MN displayed on the display section 101 so that only one air tag displayable position is contained in the display area of the display section 101, or performs an operation of moving the photographing section 102 closer to the particular position within the subject MN so that only one air tag displayable position is contained in the field angle of the photographing section 102. The image analysis section 103a of the image processing terminal 100 detects whether the user has performed the zoom-in operation or the closeup operation (step S11a). When the image analysis section 103a detects the user's zoom-in operation or closeup operation (Yes at step S11a), the image analysis section 103a specifies where the particular position targeted for the zoom-in or closeup is within the subject MN (step S12a), and displays on the display section 101 only the air tag, among the plurality of air tags, corresponding to the particular position (step S13).
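The flow of steps S11a to S13 can be sketched as below. This is an illustrative sketch under assumed names: the zoom threshold, the use of the view center as the target, and the per-frame function are assumptions for illustration, not details recited in the specification.

```python
# Hypothetical per-frame handler for steps S11a to S13.
ZOOM_THRESHOLD = 1.5  # assumed zoom factor at which one displayable
                      # position fills the display area / field angle

def on_frame(zoom_factor, view_center, displayable_positions):
    # Step S11a: has a zoom-in or closeup operation occurred?
    if zoom_factor < ZOOM_THRESHOLD:
        return None  # "No" branch: keep monitoring
    # Step S12a: take the displayable position nearest the center of the
    # zoomed view as the particular position targeted by the user.
    particular = min(displayable_positions,
                     key=lambda p: (p[0] - view_center[0]) ** 2 +
                                   (p[1] - view_center[1]) ** 2)
    # Step S13: only the air tag at this position would be displayed.
    return particular

print(on_frame(2.0, (50, 50), [(10, 10), (55, 48)]))  # → (55, 48)
```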
When the above-described operation is performed to display only the air tag corresponding to the particular position, the image analysis section 103a has already determined, at step S9, the paper area and paper orientation of the photographed subject MN by a known image analysis technique. Based on this determination, the image analysis section 103a has already specified the origin (for example, the upper left corner of the paper) of the subject MN. Thus, when the user handles the image processing terminal 100 to zoom in on the particular position within the subject MN displayed on the display section 101, or to move the photographing section 102 closer to the particular position within the subject MN, the coordinates of the position displayed on the display section 101 after the zoom-in operation or the closeup operation are calculated by a known image processing technique.
Based on this calculation, the image analysis section 103a converts the position into two-dimensional coordinates on the subject MN. Next, the display control section 103b specifies, as the particular position, the one of the air tag displayable positions PTa to PTc within the subject MN that is closest to the converted coordinates. Then, the display control section 103b displays on the display section 101 only the air tag, among the plurality of air tags, corresponding to the particular position. This state corresponds to the right end representation of
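The conversion and nearest-position selection above can be sketched as follows. This is an illustrative sketch, not the specification's implementation: it assumes the paper origin, rotation angle, and on-screen scale determined at step S9 are available as simple parameters, and all function names are hypothetical.

```python
# Hypothetical sketch: map a display-section pixel to two-dimensional
# coordinates on the subject MN using the paper origin and orientation,
# then pick the closest of the displayable positions PTa to PTc.
import math

def display_to_subject(point_px, origin_px, angle_rad, scale):
    """Convert a display pixel to (x, y) coordinates on the paper."""
    dx, dy = point_px[0] - origin_px[0], point_px[1] - origin_px[1]
    # Undo the paper's rotation on screen, then its scale.
    x = (dx * math.cos(-angle_rad) - dy * math.sin(-angle_rad)) / scale
    y = (dx * math.sin(-angle_rad) + dy * math.cos(-angle_rad)) / scale
    return (x, y)

def nearest_displayable(subject_xy, displayable):
    """displayable: mapping such as {'PTa': (x, y), 'PTb': (x, y), ...}."""
    return min(displayable,
               key=lambda k: math.dist(displayable[k], subject_xy))

pt = display_to_subject((420, 310), origin_px=(100, 100),
                        angle_rad=0.0, scale=2.0)
print(nearest_displayable(pt, {"PTa": (30, 40), "PTb": (160, 105),
                               "PTc": (300, 220)}))  # → PTb
```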
It is noted that since no fingertip detection is performed in this embodiment, the finger detection data 103d is not necessary in the configuration of the image processing terminal 100 shown in
At the time of specifying the air tag, it is possible to change the display form of the air tag in conjunction with the operation of zooming in on the particular position within the subject MN displayed on the display section 101 or the operation of moving the photographing section 102 closer to the particular position within the subject MN. For example, it is possible to gradually increase the text size of the displayed item in the air tag in conjunction with the zoom-in operation or the closeup operation, and to gradually diminish the text size in conjunction with an operation of zooming out from the particular position within the subject MN displayed on the display section 101 or an operation of moving the photographing section 102 away from the particular position within the subject MN.
When the electronic additional information is displayed as text information after the air tag has been specified, it is possible to change how the electronic additional information is displayed in conjunction with the operation of further zooming in on the particular position within the subject MN displayed on the display section 101 or the operation of moving the photographing section 102 still closer to the particular position within the subject MN. For example, it is possible to gradually increase the text size of the electronic additional information and the number of text words to be displayed in conjunction with the further zoom-in operation or closeup operation, and to gradually diminish the text size or reduce the number of text words to be displayed in conjunction with an operation of zooming out from the particular position within the subject MN displayed on the display section 101 or an operation of moving the photographing section 102 away from the particular position within the subject MN.
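The zoom-linked display change can be sketched as below. This is an illustrative sketch only: the base text size, base word count, and linear scaling rule are assumptions for illustration, not values recited in the specification.

```python
# Hypothetical sketch: the text size and the number of displayed words of
# the electronic additional information grow as the zoom factor increases
# (zoom-in or closeup) and shrink as it decreases (zoom-out or moving the
# photographing section away).
BASE_SIZE_PT = 10  # assumed base text size in points
BASE_WORDS = 5     # assumed base number of words shown

def display_form(zoom_factor, text):
    """Return (text size, truncated text) for the current zoom factor."""
    size = round(BASE_SIZE_PT * zoom_factor)
    words = text.split()[: max(1, int(BASE_WORDS * zoom_factor))]
    return size, " ".join(words)

info = "meeting notes for the December design review held in Osaka"
print(display_form(2.0, info))  # zoomed in: larger text, more words
print(display_form(0.5, info))  # zoomed out: smaller text, fewer words
```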
Embodiment 2 is otherwise similar to the image processing system and the method for processing an image according to embodiment 1, and will not be further elaborated here.
With the image processing system and the method for processing an image according to this embodiment, based on the operation, performed by the user of the image processing terminal 100, of zooming in on the particular position within the subject MN displayed on the display section 101 or of moving the photographing section 102 closer to the particular position within the subject MN, the image processing section 103 specifies the particular position within the subject MN and displays on the display section 101 only the air tag, among the plurality of air tags, corresponding to the particular position. This eliminates or minimizes the difficulty in selecting an air tag, and realizes an image processing system and a method for processing an image that ensure selection of a desired air tag even when a large number of air tags exist.
While the above-described embodiments relate to an image processing system and a method for processing an image, these embodiments should not be construed as limiting the present invention. The present invention also encompasses a program (application program) that causes various image processing terminals implemented by various computers to perform the steps of the method for processing an image according to the above-described embodiments. This realizes a program that ensures selection of a desired air tag even when a large number of air tags exist.
Obviously, numerous modifications and variations of the present disclosure are possible in light of the above teachings. It is therefore to be understood that within the scope of the appended claims, the present disclosure may be practiced otherwise than as specifically described herein.
Number | Date | Country | Kind |
---|---|---|---|
2012-277324 | Dec 2012 | JP | national |