This application is a national phase filing of PCT Application Serial No. IL03/00045, filed Jan. 16, 2003, which claims priority from U.S. patent application Ser. No. 10/052,427, filed Jan. 18, 2002, now issued as U.S. Pat. No. 6,801,245, entitled “METHOD FOR AUTOMATIC IDENTIFICATION AND CAPTURE”.
The present invention relates to logistics systems and methods generally and more particularly to systems and methods for tracking objects.
Various systems and techniques are known for tracking the location of objects. These involve the use of bar-codes and other optically sensible codes.
The following U.S. patents and Published PCT Applications are believed to represent the current state of the art:
U.S. Pat. Nos. 6,460,769; 6,456,239; 6,445,810; 6,442,476; 6,433,732; 6,427,913; 6,424,260; 6,405,924; 6,400,830; 6,384,409; 6,381,509; 6,339,745; 6,317,044; 6,286,763; 6,265,977; 6,285,342; 6,283,375; 6,259,408; 6,252,508; 6,206,286; 6,127,928; 6,070,801; 6,098,877; 6,027,022; 5,996,895; 5,988,508; 5,912,980; 5,828,049; 5,828,048; 5,825,012; 5,646,389; 5,621,864; 5,600,121; 5,528,228; 5,468,949; 5,384,859; 5,224,373; 4,924,799; 4,844,509; 4,794,238; 4,636,950; 4,340,810; 4,340,008; 4,268,179; 4,044,227 and 3,961,323.
Published PCT Applications WO 00/04711 and WO 98/10358.
The disclosures of the foregoing U.S. patents and Published PCT Applications are hereby incorporated by reference.
The present invention seeks to provide highly efficient and cost effective systems and methodologies for tracking objects.
There is thus provided in accordance with a preferred embodiment of the present invention a methodology for tracking objects including affixing at least one imagable identifier onto each of a multiplicity of objects to be tracked, imaging at least a portion of at least one of the multiplicity of objects at a known location to provide an at least partial image of the at least one of the multiplicity of objects containing the at least one imagable identifier, and employing the at least partial image of the object containing the at least one imagable identifier to provide an output indication of the location of the at least one of the multiplicity of objects.
There is also provided in accordance with another preferred embodiment of the present invention an object tracking system including at least one imagable identifier affixed onto each of a multiplicity of objects to be tracked, an imager imaging at least a portion of at least one of the multiplicity of objects at a known location to provide an at least partial image of the at least one of the multiplicity of objects containing the at least one imagable identifier, and a processor employing the at least partial image of the object containing the at least one imagable identifier to provide an output indication of the location of the at least one of the multiplicity of objects.
Preferably, the methodology also includes communicating at least one of the at least partial image and the output indication to a remote location.
Affixing preferably includes adhesively attaching the at least one imagable identifier onto a surface of each of the multiplicity of objects. Alternatively or additionally, affixing includes molding the at least one imagable identifier onto a surface of each of the multiplicity of objects. Alternatively or additionally, affixing includes printing the at least one imagable identifier onto a surface of each of the multiplicity of objects.
In accordance with a preferred embodiment of the present invention, the at least one imagable identifier includes a multi-color identifier. Preferably, the at least one imagable identifier includes a multi-segment, multi-color identifier.
In accordance with a preferred embodiment of the present invention, the multi-segment, multi-color identifier is capable of identifying and distinguishing a plurality of objects at least equal to approximately:
Plurality of objects = (n × (n−1)^(p−2) × (n−2)) / p
where n is the number of different colors and p is the number of segments.
More preferably, the multi-segment, multi-color identifier is capable of identifying and distinguishing a plurality of objects at least equal to approximately:
Plurality of objects = n × (n−1)^(p−2) × (n−2)
where n is the number of different colors and p is the number of segments.
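By way of a worked illustration of the two formulas above (using hypothetical values of n = 6 colors and p = 8 segments, which are not specified hereinabove), the two capacities may be computed directly:

```python
def capacity_oriented(n: int, p: int) -> int:
    # Identifier capacity when the orientation is known or constant:
    # n * (n - 1)^(p - 2) * (n - 2)
    return n * (n - 1) ** (p - 2) * (n - 2)

def capacity_unoriented(n: int, p: int) -> int:
    # Without a predetermined orientation, each of the p cyclic rotations
    # of a segment sequence reads as the same tag, so approximately 1/p of
    # the oriented codes remain: (n * (n - 1)^(p - 2) * (n - 2)) / p
    return capacity_oriented(n, p) // p

# Hypothetical example: 6 colors, 8 segments
print(capacity_oriented(6, 8))    # 6 * 5**6 * 4 = 375000
print(capacity_unoriented(6, 8))  # 375000 // 8 = 46875
```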
In accordance with a preferred embodiment of the present invention, the multi-segment, multi-color identifier has an inherent orientation.
Preferably, imaging includes photographing and may include imaging a plurality of the objects together within a single image. Additionally or alternatively, imaging may include sequentially imaging a plurality of the objects passing a given imaging location.
Preferably, the at least one imagable identifier includes a plurality of imagable identifiers arranged in at least predetermined propinquity to each other.
In accordance with another preferred embodiment of the present invention, employing the at least partial image of the object containing the at least one imagable identifier includes extracting an identification code from the at least partial image.
Preferably, the object tracking system also includes a communicator, communicating at least one of the at least partial image and the output indication to a remote location.
In accordance with still another preferred embodiment of the present invention, the processor is operative to extract an identification code from the at least partial image.
Preferably, output from the imager of the object tracking system, as well as output from the imaging of the methodology, can be stored for future retrieval.
There is further provided in accordance with a preferred embodiment of the present invention a system for tracking objects including visually sensible indicators on objects being tracked, at least one imager capturing images of the objects being tracked which images also show the visually sensible indicators and at least one image processor receiving outputs of the at least one imager and extracting from the outputs coded information indicated by the visually sensible indicators.
There is still further provided in accordance with yet another preferred embodiment of the present invention a system for monitoring objects including a plurality of sensors associated with objects being monitored, visually sensible indicators associated with each of the objects, which receive sensor outputs of the plurality of sensors and provide visually sensible indications of the sensor outputs, at least one imager capturing images of the visually sensible indicators and at least one image processor receiving image outputs of the at least one imager and extracting from the image outputs coded information indicated by the visually sensible indicators.
In accordance with another preferred embodiment of the present invention, the at least one monitor is remotely located from the objects being monitored.
In accordance with yet another preferred embodiment of the present invention the system also includes at least one monitor receiving and displaying the coded information from the image processor. Alternatively, the at least one monitor is also operative to display the images of the objects in conjunction with the coded information.
Preferably, the visually sensible indicator indicates object identity information. Alternatively or additionally, the visually sensible indicator also indicates at least one additional parameter relating to the object.
In accordance with another preferred embodiment of the present invention the visually sensible indicator is a dynamic indicator. Additionally, the visually sensible indicator provides non-alphanumeric indications of multiple parameters relating to an object onto which the indicator is mounted. Alternatively or additionally, the visually sensible indicator provides a coded indication of at least two of the following parameters: object location, object identity, object maximum temperature history, object maximum humidity history, object minimum temperature history, object minimum humidity history, object tilt history, and object G-force history.
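The coded indication of multiple parameters described above can be sketched, for illustration only, as a simple bit-packed integer code; the field names and bit widths below are assumptions introduced here, not part of the present disclosure:

```python
# Illustrative field layout (hypothetical): (name, width in bits)
FIELDS = [
    ("object_id", 16),
    ("max_temp_exceeded", 1),
    ("min_temp_exceeded", 1),
    ("max_humidity_exceeded", 1),
    ("tilt_event", 1),
    ("g_force_event", 1),
]

def pack(values: dict) -> int:
    # Pack the fields, most significant first, into one integer code.
    code = 0
    for name, bits in FIELDS:
        v = values.get(name, 0)
        assert 0 <= v < (1 << bits), f"{name} out of range"
        code = (code << bits) | v
    return code

def unpack(code: int) -> dict:
    # Extract the fields in reverse order, least significant first.
    values = {}
    for name, bits in reversed(FIELDS):
        values[name] = code & ((1 << bits) - 1)
        code >>= bits
    return values

sample = {"object_id": 1234, "tilt_event": 1}
assert unpack(pack(sample))["object_id"] == 1234
```

Such an integer code could then be rendered by any visually sensible indicator, static or dynamic, as a sequence of colored segments.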
In accordance with another preferred embodiment of the present invention the at least one imager includes a plurality of imagers, which plurality is greater than the number of the at least one image processor.
Preferably, the at least one imager includes at least one scanning imager.
There is still further provided in accordance with yet another preferred embodiment of the present invention a method for tracking objects including associating visually sensible indicators with objects being tracked, employing at least one imager to capture images of the objects being tracked, which images also show the visually sensible indicators, and image processing outputs of the at least one imager to extract from the outputs coded information indicated by the visually sensible indicators.
There is also provided in accordance with still another preferred embodiment of the present invention a method for monitoring objects including associating a plurality of sensors with objects being monitored, associating visually sensible indicators with each of the objects, providing to the visually sensible indicators, sensor outputs of the plurality of sensors, operating the visually sensible indicators to provide visually sensible indications of the sensor outputs, employing at least one imager to capture images of the visually sensible indicators and employing at least one image processor to receive image outputs of the at least one imager and extract from the image outputs coded information indicated by the visually sensible indicators.
In accordance with another preferred embodiment of the present invention the method for tracking objects also includes remotely receiving and displaying the coded information from the image processor. Additionally or alternatively, the method also includes displaying the images of the objects in conjunction with the coded information.
Preferably, the visually sensible indicator indicates object identity information. Additionally, the visually sensible indicator also indicates at least one additional parameter relating to the object.
In accordance with yet another preferred embodiment the visually sensible indicator changes its visual display in real time in accordance with the parameters indicated thereby. Additionally, the visually sensible indicator provides non-alphanumeric indications of multiple parameters relating to an object onto which the indicator is mounted. Alternatively or additionally, the visually sensible indicator provides a coded indication of at least two of the following parameters: object location, object identity, object maximum temperature history, object maximum humidity history, object minimum temperature history, object minimum humidity history, object tilt history, and object G-force history.
In accordance with still another preferred embodiment of the present invention the image processor processes images captured at plural locations.
Preferably, the capturing of images employs at least one scanning imager.
The present invention will be understood and appreciated more fully from the following detailed description in which:
Reference is now made to
As seen in
U.S. patent application Ser. No. 09/508,300 (now abandoned).
Published PCT Patent Application WO 00/04711.
It is a particular feature of the present invention that the imagable identifiers on a plurality of objects may be imaged together, as in a single photograph, by a conventional imager 14, such as a digital camera. This is principally because the various colors appear in the imagable identifier as two-dimensional areas which are relatively easily differentiated from each other both spatially and in color space.
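The differentiation of identifier colors in color space can be illustrated with a minimal nearest-palette-color classification sketch; the palette values and color names below are hypothetical and are not taken from the present disclosure:

```python
# Hypothetical indicator palette: name -> nominal RGB value
PALETTE = {
    "red":    (200, 30, 30),
    "green":  (30, 180, 60),
    "blue":   (40, 60, 200),
    "yellow": (220, 210, 40),
}

def classify(rgb: tuple) -> str:
    # Assign a measured region color to the nearest palette color
    # by squared Euclidean distance in RGB space.
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(PALETTE, key=lambda name: dist2(PALETTE[name], rgb))

# A slightly washed-out red region still classifies as "red":
assert classify((180, 60, 50)) == "red"
```

Because each segment occupies a two-dimensional area, the mean color of the region can be used as the input to such a classifier, making the decoding relatively tolerant of illumination changes.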
The image output of the imager is preferably provided to a computer 16, which may process the image output locally and provide an output indication 18 representing a plurality of numerical or alphanumerical identifiers corresponding to all of the imagable identifiers imaged in a given image or series of images. Alternatively or additionally, computer 16 may communicate via any suitable computer network, such as the Internet, with a remote tracking center 20, which may receive either image outputs for processing or alternatively may receive the plurality of numerical or alphanumerical identifiers corresponding to all of the imagable identifiers imaged in a given image or series of images. The image outputs may also be stored for future retrieval, either locally in computer 16 or in remote tracking center 20.
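One straightforward way to map an ordered sequence of segment colors, as recovered from an image, to a numerical identifier is to read the sequence as a base-n number. This is offered only as an illustrative sketch, not as the method of the invention, and it assumes the segments have already been located and each classified to one of n palette colors indexed 0 to n−1:

```python
def segments_to_code(colors: list, n: int) -> int:
    # Interpret the ordered segment colors as digits of a base-n number.
    code = 0
    for c in colors:
        assert 0 <= c < n
        code = code * n + c
    return code

def code_to_segments(code: int, n: int, p: int) -> list:
    # Inverse mapping: recover the p base-n digits of the code.
    colors = []
    for _ in range(p):
        colors.append(code % n)
        code //= n
    return colors[::-1]

seq = [2, 0, 5, 1, 3, 4]          # hypothetical 6-segment read, n = 6 colors
code = segments_to_code(seq, 6)
assert code_to_segments(code, 6, len(seq)) == seq
```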
The remote tracking center 20 preferably compiles records of tracked numerical or alphanumerical identifiers from a multiplicity of geographically disparate locations so as to enable ready counting, tracking and locating of objects identified thereby. Remote tracking center 20 preferably maintains a database which is updated based on communications received from various geographically disparate locations.
Reference is now made to
U.S. patent application Ser. No. 09/508,300
Published PCT Patent Application WO 00/04711.
It is a particular feature of the present invention that the imagable identifiers on a plurality of objects may be automatically imaged together, as in a single photograph or a series of photographs, by a conventional imager 34, such as a panoramic digital camera. This is principally because the various colors appear in the imagable identifier as two-dimensional areas which are relatively easily differentiated from each other both spatially and in color space.
The arrangement of
The image output of the imager is preferably provided to a computer 36, which may process the image output locally and provide an output indication 38 representing a plurality of numerical or alphanumerical identifiers corresponding to all of the imagable identifiers imaged in a given image or series of images. Alternatively or additionally, computer 36 may communicate via any suitable computer network, such as the Internet, with a remote tracking center 40, which may receive either image outputs for processing or alternatively may receive the plurality of numerical or alphanumerical identifiers corresponding to all of the imagable identifiers imaged in a given image or series of images. The image outputs may also be stored for future retrieval, either locally in computer 36 or in remote tracking center 40.
The remote tracking center 40 preferably compiles records of tracked numerical or alphanumerical identifiers from a multiplicity of geographically disparate locations so as to enable ready counting, tracking and locating of objects identified thereby. Remote tracking center 40 preferably maintains a database which is updated based on communications received from various geographically disparate locations.
Reference is now made to
U.S. patent application Ser. No. 09/508,300
Published PCT Patent Application WO 00/04711.
It is a particular feature of the present invention that multiple imagable identifiers on one or more objects may be automatically imaged together, as in a single photograph or a series of photographs, by a conventional imager 54, such as a digital camera. This is principally because the various colors appear in the imagable identifier as two-dimensional areas which are relatively easily differentiated from each other both spatially and in color space.
The arrangement of
The image output of the imager is preferably provided to a computer 56, which may process the image output locally and provide an output indication 58 representing a plurality of numerical or alphanumerical identifiers corresponding to all of the pluralities of imagable identifiers imaged in a given image or series of images. Alternatively or additionally, computer 56 may communicate via any suitable computer network, such as the Internet, with a remote tracking center 60, which may receive either image outputs for processing or alternatively may receive the plurality of numerical or alphanumerical identifiers corresponding to all of the imagable identifiers imaged in a given image or series of images. The image outputs may also be stored for future retrieval, either locally in computer 56 or in remote tracking center 60.
The remote tracking center 60 preferably compiles records of tracked numerical or alphanumerical identifiers from a multiplicity of geographically disparate locations so as to enable ready counting, tracking and locating of objects identified thereby. Remote tracking center 60 preferably maintains a database which is updated based on communications received from various geographically disparate locations.
In accordance with a preferred embodiment of the present invention, the multi-segment, multi-color identifiers 12, 32 and 52 are capable of identifying and distinguishing a plurality of objects at least equal to approximately:
Plurality of objects = (n × (n−1)^(p−2) × (n−2)) / p
where n is the number of different colors and p is the number of segments.
The foregoing calculation does not assume any predetermined orientation of the imagable identifier.
More preferably, the multi-segment, multi-color identifier is capable of identifying and distinguishing a plurality of objects at least equal to approximately:
Plurality of objects = n × (n−1)^(p−2) × (n−2)
where n is the number of different colors and p is the number of segments.
This calculation assumes a known or constant orientation of the imagable identifier.
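The division by p in the orientation-free formula reflects the fact that all p cyclic rotations of a segment sequence are reads of the same physical identifier. One illustrative way (not taken from the present disclosure) to handle an identifier lacking inherent orientation at decode time is to normalize each read to a canonical rotation before lookup:

```python
def canonical_rotation(colors: list) -> tuple:
    # Map a segment-color sequence to its lexicographically smallest
    # cyclic rotation, so that all rotations of the same tag produce
    # the same canonical key.
    p = len(colors)
    rotations = [tuple(colors[i:] + colors[:i]) for i in range(p)]
    return min(rotations)

# All rotations of the same tag canonicalize identically:
assert canonical_rotation([3, 1, 2]) == canonical_rotation([1, 2, 3]) == (1, 2, 3)
```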
In accordance with a preferred embodiment of the invention, the multi-segment, multi-color identifier has an inherent orientation. It is appreciated that this need not necessarily be the case. When the multi-segment, multi-color identifier does not have an inherent orientation, the methodology exemplified in
Reference is now made to
One technique for ensuring correct affixation orientation is to mold or otherwise form onto a container 70, a three-dimensionally defined affixation location 72 and to provide an imagable identifier carrier 74, such as an adhesive backed sticker, which has a configuration, such as a notched configuration, which allows it to be placed in the three-dimensionally defined affixation location 72 only in one unique orientation relative to the container. Clearly, the structure and methodology shown in
It is noted that for the sake of clarity, the features of
Reference is now made to
In accordance with one preferred embodiment of the present invention, the identification indicators 100 are static indicators, such as those shown and described in assignee's Published PCT Application WO 00/04711, the disclosure of which is hereby incorporated by reference. Alternatively or additionally, the identification indicators 100 are dynamic indicators, such as those described hereinbelow with reference to
The outputs of camera 102 or multiple cameras 102 are preferably supplied to a computer 104 which analyzes the images produced thereby to extract a code corresponding to the combination of colored regions on each optically sensible identification indicator. As will be described hereinbelow in greater detail, this code may contain object identification information as well as information relevant to various additional parameters relevant to the object. A preferred embodiment of the methodology of decoding the identification indicators is described hereinbelow with respect to
Preferably, as illustrated in
It is appreciated that a suitable device (not shown) for the generation of identification indicators 100 is preferably provided as part of the multi-parameter object tracking system. In accordance with one embodiment of the present invention, static identification indicators 100 may be generated by any suitable printing device equipped with encoding software, so that an operator could print an encoded identification indicator by selecting the applicable parameters, such as from a menu of available parameters or by any other suitable method. In another preferred embodiment, dynamic identification indicators 100 may be generated as described hereinbelow with reference to
Additionally, the identification indicator 100 may be used in conjunction with an electronic article surveillance (EAS) device or a radio frequency identification (RFID) device. In this embodiment, the visually sensible identification indicator 100 preferably contains an indication that the item has an EAS or RFID device and may also indicate the type of the device. Such a combination of identification indicator and EAS/RFID device may also be used to control asset flow, where the EAS device indicates that an item is leaving a location, requiring the operator to utilize a camera, such as camera 102, to image the visually sensible identification indicator 100. Suitable EAS and RFID devices are manufactured by HID Corporation, 9292 Jeronimo Road, Irvine, Calif. 92618 USA and by Sensormatic, 6600 Congress Ave., P.O. Box 310700, Boca Raton, Fla. 33431-0700 USA.
Reference is now made to
Reference is now made to
Reference is now made to
Reference is now made to
Reference is now made to
It is appreciated that processor 612 is preferably also operable to receive additional input parameters, by any suitable method, such as a wired or wireless input device, which are also encoded as part of LCD display 614.
Visually sensible output sensors 616 and 618 may include, but are not limited to, the following environmental sensors:
TILTWATCH tilt indicator and SHOCKWATCH shock indicator, commercially available from Switched On I & T Services of Braeside, Victoria, Australia;
Humidity Indicator Cards and Plugs, commercially available from Sud-Chemie Performance Packaging—The Americas, Rio Grande Industrial Park, 101 Christine Drive, Belen, N.Mex. 87002 USA;
COLDMARK and WARMMARK temperature history indicators, commercially available from IntroTech BV, P.O. Box 3, NL-7370 AA Loenen, the Netherlands.
It is appreciated that sensors 602, 606, 608 and 610 may also be selected from the above list or may be any other suitable sensor.
It is appreciated that while processor 612 is preferably operative to change LCD display 614 in real time, processor 612 may alternatively be operative to change LCD display 614 on a periodic basis, not necessarily in real time.
Reference is now made to
The camera outputs from one or multiple cameras are transmitted wirelessly or via a wired network to a central image processor which processes the images, inter alia to locate and decode the identification indicators in order to extract identification and parameter data. This data, preferably together with the original images showing not only the identification indicators, but also the objects, may be transmitted to one or more monitors, which provide a display of the images and of the decoded data.
It is appreciated that the system and method of the present invention enable real time or non-real time monitoring of objects at multiple locations which may be remote from the monitor and from the image processor.
It will be appreciated by persons skilled in the art that the present invention is not limited to what has been particularly described hereinabove. Rather the scope of the present invention includes both combinations and subcombinations of the various embodiments described hereinabove as well as modifications and additions thereto which would occur naturally to a person of skill in the art upon reading the foregoing description and which are not in the prior art.
Filing Document | Filing Date | Country | Kind | 371c Date |
---|---|---|---|---|
PCT/IL03/00045 | 1/16/2003 | WO | 00 | 3/15/2005 |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO03/060626 | 7/24/2003 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
3636317 | Torrey | Jan 1972 | A |
3646624 | Waugh | Mar 1972 | A |
3959629 | Specht et al. | May 1976 | A |
4044227 | Holm et al. | Aug 1977 | A |
4268179 | Long et al. | May 1981 | A |
4345274 | Bambara | Aug 1982 | A |
4794238 | Hampton | Dec 1988 | A |
4844509 | Kasprzak et al. | Jul 1989 | A |
4858000 | Lu | Aug 1989 | A |
4924088 | Carman et al. | May 1990 | A |
5113349 | Nakamura et al. | May 1992 | A |
5153842 | Dlugos et al. | Oct 1992 | A |
5468949 | Swart et al. | Nov 1995 | A |
5539394 | Cato et al. | Jul 1996 | A |
5565858 | Guthrie | Oct 1996 | A |
5587906 | Melver et al. | Dec 1996 | A |
5600121 | Kahn et al. | Feb 1997 | A |
5621864 | Benade et al. | Apr 1997 | A |
5635403 | Bailey | Jun 1997 | A |
5698833 | Skinger | Dec 1997 | A |
5708470 | Holford | Jan 1998 | A |
5780826 | Hareyama et al. | Jul 1998 | A |
5825012 | Rockstein et al. | Oct 1998 | A |
5828048 | Rockstein et al. | Oct 1998 | A |
5828049 | Knowles et al. | Oct 1998 | A |
5914477 | Wang | Jun 1999 | A |
5963134 | Bowers et al. | Oct 1999 | A |
5988508 | Bridgelall et al. | Nov 1999 | A
5996895 | Himan et al. | Dec 1999 | A |
6023530 | Wilson | Feb 2000 | A |
6032861 | Lemelson et al. | Mar 2000 | A |
6070801 | Watanabe et al. | Jun 2000 | A |
6076023 | Sato | Jun 2000 | A |
6088482 | He et al. | Jul 2000 | A |
6127928 | Issacman et al. | Oct 2000 | A |
6142375 | Belka et al. | Nov 2000 | A |
6164541 | Dougherty et al. | Dec 2000 | A |
6206286 | Watanabe et al. | Mar 2001 | B1 |
6252508 | Vega et al. | Jun 2001 | B1 |
6259408 | Brady et al. | Jul 2001 | B1 |
6265977 | Vega et al. | Jul 2001 | B1 |
6283375 | Wilz, Sr. et al. | Sep 2001 | B1 |
6285342 | Brady et al. | Sep 2001 | B1 |
6286763 | Reynolds et al. | Sep 2001 | B1 |
6294997 | Paratore et al. | Sep 2001 | B1 |
6317044 | Maloney | Nov 2001 | B1 |
6342830 | Want et al. | Jan 2002 | B1 |
6418235 | Morimoto et al. | Jul 2002 | B1 |
6496806 | Horowitz et al. | Dec 2002 | B1 |
6526158 | Goldberg | Feb 2003 | B1 |
6531675 | Faitel | Mar 2003 | B2 |
6563417 | Shaw | May 2003 | B1 |
6600418 | Francis et al. | Jul 2003 | B2 |
6685094 | Cameron | Feb 2004 | B2 |
6787108 | Ribi | Sep 2004 | B2 |
6801245 | Shniberg et al. | Oct 2004 | B2 |
6830181 | Bennett | Dec 2004 | B1 |
20030118216 | Goldberg | Jun 2003 | A1 |
20030160096 | Morimoto | Aug 2003 | A1 |
20040046643 | Becker et al. | Mar 2004 | A1 |
Number | Date | Country |
---|---|---|
WO 0004711 | Jan 2000 | WO |
Number | Date | Country
---|---|---
20050162274 A1 | Jul 2005 | US |