This application is based on and claims priority under 35 USC 119 from Japanese Patent Application No. 2013-010400 filed Jan. 23, 2013.
1. Technical Field
The present invention relates to an information processing apparatus and a computer-readable medium.
2. Summary
According to an aspect of the present invention, there is provided an information processing apparatus including a feature extraction unit, and a storage unit. The feature extraction unit extracts an extracted feature value indicating a characteristic of a target image, from a feature extraction area which has been set by a user. The storage unit stores the extracted feature value extracted by the feature extraction unit in a database. The storage unit includes a determination unit, a second storage unit, and a notification unit. The determination unit calculates a degree of similarity to the extracted feature value extracted by the feature extraction unit, for each of feature values stored in the database, and determines whether or not a feature value whose degree of similarity to the extracted feature value extracted by the feature extraction unit is equal to or more than a certain value is stored in the database. The second storage unit stores the extracted feature value extracted by the feature extraction unit in the database when it is determined that a feature value whose degree of similarity to the extracted feature value extracted by the feature extraction unit is equal to or more than the certain value is not stored in the database. The notification unit outputs predetermined notification information to the user without storing the extracted feature value extracted by the feature extraction unit in the database, when it is determined that a feature value whose degree of similarity to the extracted feature value extracted by the feature extraction unit is equal to or more than the certain value is stored in the database.
Exemplary embodiments of the present invention will be described in detail based on the following figures, wherein:
An exemplary embodiment of the present invention will be described in detail below on the basis of the drawings.
The main memory 2b stores information necessary for various types of information processing, and serves also as a work memory.
The network interface 2c is an interface for connecting the server 2 to a network. The network interface 2c is used to receive/transmit information from/to the network in accordance with instructions from the controller 2a. As illustrated in
In practice, any number of working information terminals 4, one for each user U1, may be connected to the network.
The hard disk 2d stores various types of information. In the present exemplary embodiment, the hard disk 2d stores multiple databases. The data stored in the databases will be described below.
The server 2 is provided with a web server function, and provides a web application. The user U1 accesses the server 2 by using a browser implemented in the working information terminal 4, and uses the web application. The user U1 uses the web application to upload document data indicating a document, for example, a pamphlet, which is created for advertisement of a product of the manufacturer, to the server 2.
When the document data is uploaded, an image of the document indicated by the uploaded document data (hereinafter, referred to as a target image) is displayed in the browser. The user U1 selects a desired database (for example, a database related to the product), and then sets a feature-extraction target area in the target image while referring to the target image displayed in the browser. For example, the user U1 sets an area to which attention is to be given (for example, a surrounding area of the image of the product of the manufacturer) as a feature-extraction target area. The user U1 not only sets a feature-extraction target area, but also inputs a uniform resource locator (URL) for content about a display component in the feature-extraction target area. For example, the user U1 inputs a URL for movie content for viewing a state in which the product of the manufacturer is operating. Thus, the user U1 associates the display component in the feature-extraction target area with the content. The content corresponds to an “information resource”, and the URL corresponds to “address information”.
In the present exemplary embodiment, the user U1 specifies a position at which a marker 10 (see
The data for identifying the database selected by the user U1 (hereinafter, referred to as a database Y), the data for specifying the feature-extraction target area which has been set by the user U1, and the URL which has been input by the user U1 are transmitted to the server 2. In the server 2 that receives these pieces of data, the controller 2a performs the process illustrated in
That is, the controller 2a (feature extraction unit) specifies the feature-extraction target area on the basis of the data received from the working information terminal 4 for the user U1, and extracts a feature value indicating characteristics of the target image, from the feature-extraction target area (in step S101). In the present exemplary embodiment, in step S101, the controller 2a extracts one or more feature points of the target image from the feature-extraction target area, as a feature value in accordance with the scale-invariant feature transform (SIFT) algorithm.
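The cropping part of step S101 can be sketched as follows. This is a minimal illustration in which the target image is modelled as a 2D list of pixel values and the feature-extraction target area is a user-set rectangle; the actual SIFT extraction applied to the cropped region (for example, with an OpenCV detector) is outside the scope of this sketch, and all names are illustrative rather than taken from the source.

```python
# Sketch of step S101: crop the user-set feature-extraction target area
# from the target image before feature extraction. The SIFT step that
# would consume the cropped region is assumed and not shown.

def crop_area(image, x, y, width, height):
    """Return the sub-image covered by the feature-extraction target area."""
    return [row[x:x + width] for row in image[y:y + height]]

# 4x4 "image" whose pixel values are 0..15
image = [[r * 4 + c for c in range(4)] for r in range(4)]

# User-set area: top-left corner (1, 1), 2 pixels wide, 2 pixels high
area = crop_area(image, 1, 1, 2, 2)
print(area)  # [[5, 6], [9, 10]]
```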
In addition, the controller 2a generates an image with a marker, in which the marker 10 is disposed in the target image, on the basis of the data received from the working information terminal 4 for the user U1 (in step S102).
The controller 2a (storage unit) specifies a database Y on the basis of the data received from the working information terminal 4 for the user U1, and then executes a storage routine (in step S103). The detail will be described below. In short, in step S103, the controller 2a basically associates the feature value extracted in step S101, the URL which has been input by the user U1, and the image with a marker which is generated in step S102 with each other so as to store them in the database Y (see step S303A in
A large number of copies of the image with a marker are printed as pamphlets for advertising the product of the manufacturer. The printed pamphlets are distributed to any number of persons.
In the server 2, a mechanism for efficiently advertising a product to the user U2 who obtains a pamphlet is implemented. That is, the user U2 focuses the digital camera included in the portable terminal 6 (second information processing apparatus) on the marker 10, and photographs an area including the marker 10, so that the content associated with the display component (for example, a product) in the area is automatically displayed on the portable terminal 6. Specifically, the user U2 selects a database specified in the pamphlet, and then photographs an area including the marker 10. Then, the above-described application is used to cut out an image of the circumscribed rectangle area of the marker 10 as a search target image from the photographed image captured by using the digital camera, and data for identifying the database selected by the user U2 and data indicating the search target image are transmitted to the server 2. Upon receiving these pieces of data, the server 2 performs the process illustrated in
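The client-side cut-out of the circumscribed rectangle can be sketched as follows. Since the marker 10 is a ring, its circumscribed rectangle is the bounding square of the detected circle; marker detection itself is assumed here, and the coordinate values are illustrative.

```python
# Sketch of the search-target-image cut-out: given the detected marker's
# centre and radius, compute the circumscribed rectangle (here, the
# bounding square) that is cropped from the photographed image.

def circumscribed_rect(cx, cy, radius):
    """Bounding square of a circle, as (x, y, width, height)."""
    return (cx - radius, cy - radius, 2 * radius, 2 * radius)

print(circumscribed_rect(50, 40, 10))  # (40, 30, 20, 20)
```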
The process illustrated in
The controller 2a extracts a feature value indicating characteristics of the search target image (in step S201).
In the present exemplary embodiment, in step S201, the controller 2a extracts one or more feature points as a feature value from the search target image in accordance with the SIFT algorithm.
Then, the controller 2a (search unit) searches the feature values stored in the database X for a feature value whose degree of similarity to the feature value extracted from the search target image is equal to or more than a predetermined threshold TH (in steps S202 and S203).
That is, the controller 2a (search unit) sequentially selects feature values stored in the database X, that is, feature values associated with the database name of the database X, one by one as a feature value X, and calculates a degree of similarity between the feature value extracted from the search target image and a feature value X every time the feature value X is selected (in step S202). In the present exemplary embodiment, in step S202, the controller 2a compares the feature points extracted from the search target image with the feature points indicated by a feature value X, and calculates the number of combinations of feature points between which a correspondence is present, as a degree of similarity.
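The correspondence counting of step S202 can be sketched as follows, under simplifying assumptions: each feature point is reduced to a small descriptor vector, two points "correspond" when their Euclidean distance falls below a match threshold, and each point may be matched at most once. The descriptor sizes and threshold are illustrative, not taken from the source.

```python
# Sketch of step S202: the degree of similarity is the number of
# feature-point correspondences between two feature values. A greedy
# one-to-one nearest-neighbour match stands in for real SIFT matching.
import math

def similarity(points_a, points_b, match_dist=1.0):
    """Count one-to-one correspondences between two sets of descriptors."""
    unmatched = list(points_b)
    count = 0
    for a in points_a:
        best = min(unmatched, default=None, key=lambda b: math.dist(a, b))
        if best is not None and math.dist(a, best) < match_dist:
            unmatched.remove(best)  # each point is matched at most once
            count += 1
    return count

a = [(0.0, 0.0), (5.0, 5.0), (9.0, 1.0)]
b = [(0.2, 0.1), (5.1, 4.9), (20.0, 20.0)]
print(similarity(a, b))  # 2
```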
In step S203, the controller 2a (search unit) specifies feature values whose degrees of similarity to the feature value extracted from the search target image are equal to or more than the threshold TH, from the feature values stored in the database X, on the basis of the degrees of similarity calculated in step S202.
Then, the controller 2a (transmitting unit) transmits the URL which is stored in the database X in such a manner that the URL is associated with a feature value specified in step S203, to the portable terminal 6 (in step S204). In the exemplary embodiment, the controller 2a transmits, to the portable terminal 6, the URL associated with the feature value whose degree of similarity to the feature value extracted from the search target image is the highest among the feature values specified in step S203. In the portable terminal 6 which receives the URL, the content of the link indicated by the URL is obtained, and the obtained content is output. As a result, the user U2 views, for example, the movie showing a state in which the product described in the pamphlet actually operates.
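Steps S202 to S204 taken together can be sketched as a search over database entries. The similarity measure is abstracted to a numeric score, database entries are (feature value, URL) pairs, and the toy scores, URLs, and threshold below are illustrative only.

```python
# Sketch of steps S202-S204: scan database X, keep entries whose
# similarity to the query feature value is at least TH, and return the
# URL of the best-scoring match (or None when nothing qualifies).

def search_url(database, query, similarity, th):
    """Return the URL whose feature value best matches the query, or None."""
    scored = [(similarity(query, feat), url) for feat, url in database]
    candidates = [(score, url) for score, url in scored if score >= th]
    if not candidates:
        return None
    return max(candidates, key=lambda c: c[0])[1]

# Toy similarity: negative absolute difference of scalar "feature values"
sim = lambda q, f: -abs(q - f)
db = [(10, "http://example.com/a"), (14, "http://example.com/b")]
print(search_url(db, 11, sim, th=-2))  # http://example.com/a
```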
In the case where the feature values for images similar to each other are stored in the same database, that is, in the case where the feature values for images similar to each other belong to the same feature value group, when the process illustrated in
In the storage routine, the controller 2a sequentially selects the feature values stored in the database Y which is a database selected by the user U1, that is, the feature values associated with the database name of the database Y, one by one as a feature value Y. Every time the controller 2a selects a feature value Y, the controller 2a calculates a degree of similarity between the feature value Y and the feature value extracted from the feature-extraction target area in step S101, as in step S202 in
The controller 2a (determination unit) determines whether or not a feature value whose degree of similarity to the feature value extracted from the feature-extraction target area is equal to or more than the above-described threshold TH is present in the database Y, on the basis of the degree of similarity calculated in step S301 (in step S302). If a feature value whose degree of similarity to the feature value extracted from the feature-extraction target area is equal to or more than the above-described threshold TH is not present (NO in step S302), the controller 2a associates the feature value extracted from the feature-extraction target area in step S101, the URL which has been input by the user U1, and the image with a marker generated in step S102 with each other, and stores them in the database Y (in step S303A). Then the storage routine is ended.
If a feature value whose degree of similarity to the feature value extracted from the feature-extraction target area is equal to or more than the above-described threshold TH is present (YES in step S302), the controller 2a sets the number of updates ‘N’ to ‘1’ (in step S303). In addition, the controller 2a (update unit) updates the feature-extraction target area (in step S304). Herein, in step S304, the controller 2a enlarges the feature-extraction target area by using a predetermined scale of enlargement. In step S304, the controller 2a may move the feature-extraction target area by a predetermined distance.
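The enlargement of step S304 can be sketched as a rectangle grown about its own centre by a fixed scale. The area representation (x, y, width, height) and the scale of enlargement are illustrative assumptions, not values from the source.

```python
# Sketch of step S304: enlarge the feature-extraction target area about
# its centre by a predetermined scale of enlargement.

def enlarge_area(area, scale=1.5):
    x, y, w, h = area
    new_w, new_h = w * scale, h * scale
    # Keep the centre fixed while growing the rectangle
    return (x - (new_w - w) / 2, y - (new_h - h) / 2, new_w, new_h)

print(enlarge_area((10, 10, 20, 20), 1.5))  # (5.0, 5.0, 30.0, 30.0)
```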
Unless otherwise specified below, a “feature-extraction target area” means an “updated feature-extraction target area”. An “initial feature-extraction target area” means a “feature-extraction target area which is set by the user U1”.
Then, the controller 2a (re-extraction unit) extracts a feature value indicating the characteristics of the target image, from the feature-extraction target area, as in step S101 in
As in step S301, the controller 2a sequentially selects the feature values stored in the database Y, one by one as a feature value Y. Every time the controller 2a selects a feature value Y, the controller 2a calculates a degree of similarity between the feature value Y and the feature value extracted from the feature-extraction target area in step S305 (in step S306).
As in step S302, the controller 2a (redetermination unit) determines whether or not a feature value whose degree of similarity to the feature value extracted from the feature-extraction target area in step S305 is equal to or more than the above-described threshold TH is present in the database Y (in step S307). If a feature value whose degree of similarity to the feature value extracted from the feature-extraction target area is equal to or more than the above-described threshold TH is not present (NO in step S307), the controller 2a performs the following processes. Since the feature-extraction target area is enlarged from the initial area, the controller 2a generates again an image with a marker by disposing the anchor image 12 and the marker 10, which is a ring of a circle inscribed in the feature-extraction target area, in the target image. Then, the controller 2a associates the feature value extracted from the feature-extraction target area in step S305, the URL which has been input by the user U1, and the image with a marker generated again with each other, and stores them in the database Y (in step S308A). Then, the storage routine is ended.
If a feature value whose degree of similarity to the feature value extracted from the feature-extraction target area in step S305 is equal to or more than the above-described threshold TH is present (YES in step S307), the controller 2a determines whether or not the number of updates ‘N’ is equal to an upper limit, for example, ‘5’ (in step S308). If the number of updates ‘N’ is less than the upper limit (NO in step S308), the controller 2a increments the number of updates ‘N’ by ‘1’ (in step S309A), and performs step S304 and its subsequent steps again.
If the number of updates ‘N’ is equal to the upper limit (YES in step S308), without storing the feature value extracted from the feature-extraction target area in step S305, the controller 2a (notification unit) transmits predetermined notification data to the working information terminal 4 for the user U1 (in step S309). In the working information terminal 4 for the user U1 which receives the notification data, for example, a screen for displaying a message that the feature value is not stored is displayed. In addition, for example, a screen for providing a guide to select another database is displayed.
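The storage routine as a whole (steps S301 through S309) can be sketched as the following loop. Extraction and similarity are abstracted to plug-in functions; in this toy setting a "feature value" is simply the area's width, so that enlarging the area changes the extracted value. Every concrete value here is illustrative, not from the source.

```python
# Sketch of the storage routine: reject a near-duplicate, retry with an
# enlarged area up to the update limit, store on the first miss, and
# fall back to notifying the user when the limit is reached.

def storage_routine(database_y, area, url, extract, similarity, th,
                    max_updates=5):
    def has_duplicate(feat):
        return any(similarity(feat, stored) >= th
                   for stored, _ in database_y)

    feat = extract(area)                       # feature value from step S101
    if not has_duplicate(feat):                # steps S301-S302
        database_y.append((feat, url))         # step S303A
        return "stored"
    for _ in range(max_updates):               # steps S303, S308-S309A
        area = enlarge(area)                   # step S304
        feat = extract(area)                   # step S305
        if not has_duplicate(feat):            # steps S306-S307
            database_y.append((feat, url))     # step S308A
            return "stored"
    return "notified"                          # step S309

def enlarge(area, scale=1.5):
    x, y, w, h = area
    return (x, y, w * scale, h * scale)

extract = lambda area: area[2]                     # toy feature: area width
sim = lambda a, b: 1.0 if abs(a - b) < 1 else 0.0  # toy similarity

db = [(20.0, "http://example.com/old")]            # width 20 already stored
result = storage_routine(db, (0, 0, 20.0, 20.0), "http://example.com/new",
                         extract, sim, th=1.0)
print(result, len(db))  # stored 2
```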
Exemplary embodiments of the present invention are not limited to the above-described exemplary embodiment.
(1) For example, if a feature value whose degree of similarity to the feature value extracted from the “initial” feature-extraction target area is equal to or more than the above-described threshold TH is present in the database Y (YES in step S302), the controller 2a (notification unit) may immediately perform step S309.
(2) For example, if the number of updates ‘N’ is equal to the upper limit (YES in step S308), without performing step S309, the controller 2a may perform the storage routine again by using another database as the database Y.
(3) For example, after step S309, the controller 2a may associate the feature value extracted from the “initial” feature-extraction target area, the URL which has been input by the user U1, and the image with a marker generated in step S102 with each other, and may store them in another database. For example, in the case where, in the working information terminal 4 for the user U1 which receives the notification data, a screen for providing a guide to select another database is displayed, and where, as a result, the user U1 selects a second database, the controller 2a may associate these pieces of data and may store them in the second database selected by the user U1.
(4) For example, to reduce the processing load in step S306 (see
(5) As long as the “address information” is data indicating an address of an information resource such as content, the “address information” is not limited to a URL, and may be any information. For example, the “address information” may be a file path of an information resource.
(6) In the case where the server 2 is used by multiple companies, databases corresponding to the respective companies may be provided. In the case where multiple companies register information in the same database, information registered by a company other than an intended company may be retrieved in searching the database. In this regard, if databases are provided for the respective companies, occurrence of such a situation is suppressed.
The foregoing description of the exemplary embodiments of the present invention has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise forms disclosed. Obviously, many modifications and variations will be apparent to practitioners skilled in the art. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, thereby enabling others skilled in the art to understand the invention for various embodiments and with the various modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the following claims and their equivalents.
Number | Date | Country | Kind |
---|---|---|---
2013-010400 | Jan 2013 | JP | national |