Technical Field
The present invention relates to an object identification system and method, and in particular, to a system and method for identifying a main target object according to a specific appearance and changing features of the object.
Related Art
At present, technologies for identifying objects in an image may be classified into two categories: one directly compares the appearance of the object, and the other marks the object with a distinct tag. Direct appearance comparison is affected by shooting angles, and every angle at which a changeable feature may appear must be compared, so a considerable amount of time is consumed during data processing because an excessively large amount of data needs to be evaluated.
In addition, in a case where a tag is taken as a feature of an object, a related technology for performing comparison by using a barcode as a feature is described in U.S. Pat. No. 2,612,994. Object features can be compared by a rapid algorithm only if the tag has a special appearance. However, a barcode has the disadvantage that it cannot provide sufficient information, because the amount of information it carries is too limited, and it can be identified only when its image is clear.
Furthermore, a dynamic tag technology that changes the feature information of an object by color changing, provided in U.S. Pat. No. 3,935,432, can provide object identification by using only several lamp signals, solving the problem of the earlier prior art that features can be identified only when a high-resolution image is available. However, this prior art can still only provide simple numbered data, and must rely on a database established in advance to find related object information by comparing the numbered data with data in the database.
United States Patent Publication No. US20060054695 further proposes using a dynamic barcode to transmit feature information, but the disadvantage remains that a barcode can be accurately determined only if a high-resolution image is available. This prior art can therefore be applied only over a short distance, and because each image transmission carries only a small amount of information, the barcode cannot effectively provide two-way communication between the tag end and the identification end.
Chinese Patent CN201111094 proposes disposing different light signal emission sources in a space, and a tag automatically receives a light signal to determine its own location. However, in this prior art each light emitting unit can provide only one unrepeatable location, and since a light signal with a greater encoding range needs a longer read time, both the usable spatial range and the precision are limited. In addition, this prior art obtains the location of an object in a space, but the object may be covered by another object, so the location cannot be associated with image data.
The present invention provides an object identification system and method by which an object of interest may be easily identified according to a tag attached to the object.
A main objective of the present invention is to provide an object identification system that achieves object identification by using an existing wireless communication technology and simple light changing.
The present invention provides a tag for identifying an object in an image, comprising: a feature changing module, comprising one or more light sources and changing the one or more light sources according to a feature signal; and a communication module, receiving or sending the feature signal, and receiving or sending a radio signal related to the feature signal.
The present invention further provides a device for identifying an object in an image, comprising: a communication module, where the communication module receives a feature signal; a processing unit, where the processing unit receives the feature signal and an image signal from an image sensor, and generates an image identification result according to the image signal and the feature signal; and a storage module, used for storing the image identification result.
The disclosure will become more fully understood from the detailed description given herein below, which is for illustration only and thus is not limitative of the disclosure, and wherein:
To make the foregoing features and advantages of the present invention clearer and more comprehensible, specific embodiments are described in detail below as examples with reference to the accompanying drawings.
Referring to
In this embodiment, the feature changing module 12 includes multiple light sources (not shown) and generates a feature signal 121. The feature signal 121 includes a command for controlling the light sources, and the feature changing module 12 controls, according to the command included in the feature signal 121, the wavelengths or intensity of the multiple light sources at a specific time. After receiving the feature signal 121, the communication module 16 properly processes the feature signal 121 and sends an output signal 161 related to the feature signal 121, where the signal 161 carries information related to the command that is in the feature signal 121 and controls the multiple light sources in a light emitting module 14. The location module 18 is used for acquiring location information of the tag 1, where the location information may include but is not limited to the geographic coordinates of the tag 1. The location module 18 transmits the location information to the communication module 16, which sends the location information out. The location module 18 may be but is not limited to a global positioning system (GPS).
In another embodiment, the communication module 16 may receive a feature signal (not shown) from the exterior of the tag 1, and transmit the feature signal to the feature changing module 12. The feature changing module 12 controls, according to the command included in the feature signal, wavelengths or intensity of the multiple light sources at a specific time.
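The feature-changing behavior described above can be sketched in code. This is an illustrative sketch only: the patent leaves the command format unspecified, so the per-time-slot intensity encoding and the `LightSource` class used here are hypothetical.

```python
class LightSource:
    """Minimal hypothetical stand-in for one light source in the feature changing module."""
    def __init__(self):
        self.intensity = 0

    def set_intensity(self, value):
        self.intensity = value


def apply_feature_signal(feature_signal, light_sources):
    """Drive each light source per time slot as the command in the feature signal dictates.

    feature_signal: {"command": [slot, slot, ...]}, each slot a list of
    (source_index, intensity) pairs (a hypothetical encoding).
    Returns the per-slot intensity pattern, i.e. the optical feature that an
    identification device could later match against an image.
    """
    pattern = []
    for slot_commands in feature_signal["command"]:
        for index, intensity in slot_commands:
            light_sources[index].set_intensity(intensity)
        # Record a snapshot of all sources for this time slot.
        pattern.append([source.intensity for source in light_sources])
    return pattern
```

For example, a two-slot command that lights source 0 and then switches to source 1 yields a two-frame pattern, one intensity snapshot per slot.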
The feature changing module 12, the communication module 16 and the location module 18 may be implemented by hardware or software, where the former has an advantage in operating speed and the latter has lower cost and design complexity. If implemented in hardware, the modules 12, 16 and 18 may be mounted in the tag 1, and the tag 1 may be a device such as a portable computer, a tablet computer, a mobile phone, a smartphone, and the like. If implemented in software, the modules 12, 16 and 18 may be an executable program or application installed in the tag 1.
Referring to
The communication module 22 may receive a signal 221, where the signal 221 includes a command, or related information, for controlling one or more light sources in a feature changing light emitting module. In another embodiment, the signal 221 may be the same as the signal 161 in
In an embodiment, the processing unit 24 may receive, from a pairing side through the communication module 22, a list of tags that are possibly within a sensing range of an image sensor, so as to reduce the number of tags that need to be compared.
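The candidate-list idea in the paragraph above can be sketched as a simple range filter. This is a hypothetical illustration: the patent does not define how the pairing side builds the list, so the planar coordinates and radius test below are assumptions.

```python
import math


def tags_in_sensing_range(tags, sensor_location, sensing_radius):
    """Keep only tags whose reported location lies within the sensor's range.

    tags: iterable of (tag_id, (x, y)) pairs, with coordinates and
    sensing_radius in the same (hypothetical) planar unit.
    Returns the reduced list of tag ids the processing unit must compare.
    """
    cx, cy = sensor_location
    return [
        tag_id
        for tag_id, (x, y) in tags
        if math.hypot(x - cx, y - cy) <= sensing_radius
    ]
```

Shrinking the candidate set this way is what lets the processing unit avoid comparing every known tag against the image.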
In an embodiment, the location module 35 may be but is not limited to a global positioning system (GPS).
The processing unit 24 generates an image identification result 241 according to the image from the image sensor module 33 and the signal 222, and stores the image identification result 241 in the storage module 26.
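The identification step can be sketched under an assumed encoding: the processing unit compares the light-change pattern observed at each image region with the pattern announced in each tag's feature signal. The exact-match rule and the on/off sample representation below are hypothetical, since the patent does not specify a matching algorithm.

```python
def identify_objects(observed_patterns, feature_signals):
    """Map each image region to the tag whose feature signal matches it.

    observed_patterns: {region_id: tuple of on/off samples seen in the image}
    feature_signals:   {tag_id: tuple of on/off samples the tag reported}
    Returns {region_id: tag_id} for exact pattern matches (assumed rule);
    unmatched regions are simply omitted.
    """
    result = {}
    for region_id, pattern in observed_patterns.items():
        for tag_id, signal_pattern in feature_signals.items():
            if pattern == signal_pattern:
                result[region_id] = tag_id
                break
    return result
```

The returned mapping plays the role of the image identification result 241 that is handed to the storage module.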
Referring to
Referring to
The processing unit 44 generates an image identification result according to an image from the image sensor module and a feature signal (for example, the signal 222 shown in
Referring to
The communication module 62, the image sensor module 63 and the location module 65 may be implemented by hardware or software, where the former has an advantage in operating speed and the latter has lower cost and design complexity. If implemented in hardware, the modules 62, 63 and 65 may be mounted in an identification device 6, and the identification device 6 may be a device such as a computer, a server, a mobile phone, a smartphone, and the like. If implemented in software, the modules 62, 63 and 65 may be an executable program or application installed in the identification device 6.
In an embodiment, the light source of the feature changing module is a lamp. In another embodiment, the light source of the feature changing module comprises pixels on a screen. In another embodiment, the light source of the feature changing module is an infra-red source.
In an embodiment, the communication modules 16, 22, 31, 42, 51, 51′ and 62 are wireless radio frequency communication modules.
In another embodiment, the communication modules 16, 22, 31, 42, 51, 51′ and 62 are infra-red communication modules.
In another embodiment, the communication module 16 is a wireless network module, and the communication modules 22, 31, 42, 51, 51′, and 62 are network modules.
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
In step 804, the processing unit generates an identification result according to the image signal and the feature signal. In step 806, the processing unit stores the identification result in a storage module.
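Steps 804 and 806 above can be sketched end to end. The dict-based storage module and the pattern-equality match are hypothetical simplifications; the patent only requires that a result be generated from the image signal and the feature signal and then stored.

```python
def run_identification(image_signal, feature_signal, storage_module):
    """Step 804: generate the identification result; step 806: store it.

    image_signal:   {"pattern": ...} — the light pattern extracted from the image.
    feature_signal: {"tag_id": ..., "pattern": ...} — the pattern the tag announced.
    storage_module: a dict standing in for the storage module.
    """
    matched = image_signal["pattern"] == feature_signal["pattern"]
    result = {"tag_id": feature_signal["tag_id"], "matched": matched}
    storage_module[feature_signal["tag_id"]] = result  # step 806
    return result
```

A two-line driver suffices to exercise both steps: build the two signals, call the function, and read the stored result back out of the storage dict.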
When describing exemplary examples of the present invention, the present specification may set out the method of the present invention in a specific order of steps.
However, the method is not limited to the specific step orders provided herein; a person skilled in the art will understand that other step orders are also feasible. Therefore, the specific step orders provided in the present specification shall not be taken as limitations on the claims. In addition, the claims related to the method of the present invention should not be limited to performance of the steps in the written order, and a person skilled in the art should understand that the orders may be altered while remaining within the spirit and scope of the present invention.
A person skilled in the art should understand that the foregoing examples may be varied as long as such variations do not depart from the inventive concept of the present invention in the broad sense. Therefore, it should be understood that the present invention is not limited to the specific examples disclosed in the present specification, but includes modifications within the spirit and scope defined by the following claims of the present invention.
Although the present invention is described above by using the foregoing embodiments, the present invention is not limited thereto. Any alterations and modifications made by a person skilled in the art without departing from the spirit and scope of the present invention shall fall within the protection scope of the present invention.
Filing Document | Filing Date | Country | Kind | 371(c) Date
---|---|---|---|---
PCT/CN2012/070655 | 1/20/2012 | WO | 00 | 7/18/2014
Publishing Document | Publishing Date | Country | Kind
---|---|---|---
WO2013/107039 | 7/25/2013 | WO | A
Number | Name | Date | Kind
---|---|---|---
2612994 | Woodland et al. | Oct 1952 | A
3935432 | Maynard | Jan 1976 | A
6031585 | Stevens, III | Feb 2000 | A
6715676 | Janning | Apr 2004 | B1
6882274 | Richardson | Apr 2005 | B2
7131584 | Stephenson | Nov 2006 | B2
20010050731 | An | Dec 2001 | A1
20040198555 | Anderson | Oct 2004 | A1
20060054695 | Owada | Mar 2006 | A1
20060208892 | Ehrman | Sep 2006 | A1
20070040683 | Oliver et al. | Feb 2007 | A1
20070279368 | Shefter | Dec 2007 | A1
20080012722 | Moseley | Jan 2008 | A1
20090231135 | Chaves | Sep 2009 | A1
20110024500 | McReynolds | Feb 2011 | A1
20150317896 | Planton | Nov 2015 | A1
20200005105 | Herranen | Jan 2020 | A1
Number | Date | Country
---|---|---
101046842 | Oct 2007 | CN
201111094 | Sep 2008 | CN
202600754 | Dec 2012 | CN
1697811 | Apr 2011 | EP
Number | Date | Country
---|---|---
20150123768 A1 | May 2015 | US