A portion of the disclosure of this patent document contains material that is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent files or records, but otherwise reserves all copyright rights whatsoever.
It is not unusual for parts, including relatively large parts, to be misplaced in a production environment. Misplacing parts or even large pallets may occur when an employee, such as a forklift operator, moves a part or a pallet containing a part or assembly to another location in order to access another part or pallet. Over time, a part or pallet can continue to be moved farther away without being returned to its original location. Such a scenario causes considerable time to be wasted later in attempting to locate the part or pallet. It should be understood that the smaller the part being searched for, the greater the difficulty in locating the misplaced part.
What is needed is a tracking system that can monitor the location of a part and pinpoint its exact location on a display so that a user can more easily locate the desired part. What also is needed is a tracking system that can identify a part or item and provide a user with information concerning the identified part or item.
Briefly, and in general terms, various embodiments are directed to a system for tracking an object in a defined area, such as a warehouse or facility. In one embodiment, the system includes a tracking tag attached to the object and a tracking reader that monitors the location of the tracking tag. The tracking reader is located in the defined area. The tracking technology may include Global Positioning Satellite, Radio Frequency Identification, Near Field Communicator, or the like.
The system also includes an imaging device installed in the defined area, and the imaging device includes a field of view. Multiple imaging devices may be installed. The imaging device may provide at least a partial three-dimensional image of the object. As an example, the imaging device may be a Z-depth camera, and the Z-depth camera can capture and provide a node structure or wireframe structure of the object, along with an image of the object. It also has been contemplated that a stereoscopic camera can be used to capture and provide a node structure or wireframe structure of the object. Further, a stereoscopic camera may be used in combination with a Z-depth camera.
The tracking reader and imaging device are in communication with a server. In one embodiment, the tracking reader provides the server with information concerning the location of the object in relation to the tracking reader, and the imaging device provides an image of the object to the server. It has been contemplated that the tracking reader will provide a general location of the object and the server can then determine which imaging device has the object in its field of view. By receiving images or a live feed from the imaging device that has the object in its field of view, the exact location of the object can be determined as shown in the image of the object. The image of the object may be a live video image or a still image.
In addition, a receiving device having a display is in communication with the server. The server can provide an image of the defined area to the display of the receiving device. The server can then highlight or provide indicia on the image of the defined area indicating the location of the object in the defined area.
A database storing a computer-generated model of the object may also be included in the tracking system. The system may compare the computer-generated model of the object to the node structure or wireframe structure of the object captured by the imaging device in order to identify the object using image recognition software.
Another embodiment is directed to a system for tracking an object in a defined area that includes an imaging device installed in the defined area. The imaging device has a field of view and provides at least a partial three-dimensional image of the object and an image of the object. The image of the object may be a live video image or a still image. The imaging device may be a Z-depth or similar type of camera. The Z-depth camera may provide a node or wireframe structure of the object along with the image of the object to the server. Furthermore, the Z-depth camera can provide information concerning the distance the object is located away from the Z-depth camera.
A database storing a computer-generated three-dimensional model of the object also is included in the system. There is a server in communication with the imaging device and the database of the system. The server compares the three-dimensional image from the imaging device to the three-dimensional model stored in the database to identify the object. The location of the object can also be determined by analyzing the location of the imaging device, the field of view of the imaging device, and the distance the object is located away from the imaging device. Further, viewing the surrounding area of the object helps to determine the location of the object. In this embodiment, the server includes image recognition software.
A receiving device having a display is also in communication with the server of the system. The system provides an image of the defined area to the display of the receiving device and the system provides indicia on the image of the defined area indicating the location of the object in the defined area. Along with the image of the location of the object on the receiving device, a written description of the location of the object may also be sent to the receiving device.
Yet another embodiment is directed to a method for tracking an object in a defined area. In this method, the object is identified with a computer server by comparing at least a partial three-dimensional image of the object taken by an imaging device to a three-dimensional model stored in a database. The method includes determining the location of the object from the image of the object and the area surrounding the object provided by the imaging device. Also, the location of the object can be determined by analyzing the location of the imaging device, the field of view of the imaging device, and the distance the object is located away from the imaging device.
The location of the object is stored in memory or a database that is in communication with the server. Additional information about the object, including product and installation information, can be stored in the database and associated with the object.
Further, information concerning the location of the object may be sent to a receiving computer device having a display that is in communication with the server. The image of the object and the area surrounding the object may be sent to the receiving computer device to help identify the location of the object. Additional information concerning the object can be sent to the receiving device as well. The method also may include marking the location of the object on the video image of the object and the area surrounding the object that is sent to the receiving computer device. A description of the location of the object can be sent to the receiving device.
In one embodiment of the method of tracking the object, the method further includes monitoring the location of the object by comparing at least a partial three-dimensional image of the object taken by the imaging device to a three-dimensional model stored in the database and determining the location of the object from the video image of the object and the area surrounding the object from the imaging device. The monitoring can be done continuously, on request, or at any desired interval of time. If during the method it is determined that the object has moved to a new location, this new location of the object is tracked and stored in the database.
Other features and advantages will become apparent from the following detailed description, taken in conjunction with the accompanying drawings, which illustrate by way of example, the features of the various embodiments.
In one embodiment, a tracking system utilizes location tracking technology coupled with an imaging device to provide an accurate real-time visual display of a location of an object, item, part, product, or the like. This may be accomplished by tracking a location of an object in relation to a particular reader, whether that is a Global Positioning Satellite ("GPS"), Radio Frequency Identification ("RFID") Reader, Near Field Communicator, or the like. Further, the tracking system realizes the location of the imaging device and the location of the object in relation to the imaging device. Combining information received from the tracking technology with information obtained by the imaging device allows the tracking system to determine a coordinate location of the object and allows any object located within the field of view of the imaging device to be visually displayed on a screen. The system may also store and provide information associated with objects or a group of objects. This tracking system may be set up as a permanent installation with a control hub, or may be mobile.
It has also been contemplated that the tracking system can be used to track a single object or part through a manufacturing process of a larger product, such as an aircraft, furniture, automobiles, or the like. More specifically, the tracking system is able to pinpoint the location of an object from the point it is produced and entered into the system, through its installation on a larger product.
In another embodiment, the tracking system utilizes a Z-depth or depth-sensing camera or other type of three-dimensional imaging device to provide an accurate visual display (real time video feed or still image) of a location of an object, item, part, product, or the like. It has also been contemplated that a stereoscopic camera alone or in combination with a Z-depth (depth-sensing) camera may be used. Using a Z-depth camera and a stereoscopic camera together may increase the accuracy of identifying an object because more information concerning the identity and location of the object will be gathered using both types of cameras simultaneously. In this embodiment, the tracking system identifies the object by comparing node and/or wireframe images of an object sent from the Z-depth camera to a database of computer-generated ("CG") models of objects. Once the object is identified, the location of the object also can be monitored using information sent from the Z-depth camera as discussed more below.
By way of example only, the tracking system allows users to search for objects, including objects that may be lost or misplaced in an area of interest, such as a facility, warehouse, store, storage facility, building, or any other type of area. By using an imaging device, a tracking tag, and/or a database storing CG models of objects to be tracked, the tracking system may utilize information collected from the imaging device along with tracking information provided by the tracking tag to identify the object, establish the location of the object within a defined or undefined area, and monitor the location/movement of the object. The imaging device may recognize and delineate "zones" of the area of interest, and an alphabetical/numerical coordinate can be created using a series of zones with multiple sub-quadrants to pinpoint and record the location of a part or assembly that has a tracking device attached. These coordinates also may be sent to a receiving computer device to help locate the object.
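By way of illustration only, the sketch below shows one way such a zone/sub-quadrant coordinate could be computed from a floor position. The zone size, the lettering scheme, and the function name are assumptions of this sketch rather than features of any particular embodiment.

```python
# Hypothetical sketch: map a facility floor position to a zone/sub-quadrant
# code such as "C7-2". Zone size, lettering, and quadrant numbering are
# illustrative assumptions, not a prescribed format.
ZONE_SIZE_FT = 20.0  # assumed width/length of each square zone

def zone_coordinate(x_ft: float, y_ft: float) -> str:
    col = int(x_ft // ZONE_SIZE_FT)          # zone column -> letter
    row = int(y_ft // ZONE_SIZE_FT)          # zone row -> number
    # Sub-quadrant within the zone: 0=SW, 1=SE, 2=NW, 3=NE (assumed order)
    east = (x_ft % ZONE_SIZE_FT) >= ZONE_SIZE_FT / 2
    north = (y_ft % ZONE_SIZE_FT) >= ZONE_SIZE_FT / 2
    quadrant = (2 if north else 0) + (1 if east else 0)
    return f"{chr(ord('A') + col)}{row + 1}-{quadrant}"

print(zone_coordinate(45.0, 131.5))  # e.g. "C7-2"
```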
Referring now to the drawings, wherein like reference numerals denote like or corresponding parts throughout the drawings and, more particularly to
Also, the tracking system may include a tracking tag 26, such as an RFID tag, attached to a part or object 28 to be tracked. A tracking reader, such as an RFID reader 30, can monitor the location of the tracking tag. Memory associated with the server 20 or a separate database 24 can be used to store information relating to the objects being tracked by the system. A back-end computer or control panel 32 may also be in communication with the server and/or database to access, create, and edit information stored on the server and/or database. In certain embodiments, the back-end computer may function as a central hub for operating the tracking system.
The server 20, back-end computer 32, or a separate module (not shown) in communication with the server may analyze information received from the tracking reader 30 to determine the location of the object. The server may also send information concerning the location of the object and descriptive information of the object to a receiving computer device 34 via a network 36. The network may be any local area network, wide area network, cellular network, cloud-based network, or the Internet. The receiving computer device 34 may be any type of computer having a display, including stand-alone desktop computers, mobile devices, smart phones, tablets, laptops, or the like. If the receiving computer device 34 includes an imaging device, then the receiving computer device can be used to capture images of objects to be sent to the server for analysis.
Also, the server 20 may be any computer. By way of example, and not by way of limitation, the server may include 32 GB of GDDR5 RAM and a minimum 12-core processor, and may be capable of approximately 7 teraflops of data processing. In other embodiments, multiple single-core processors may be connected to achieve lower energy consumption and higher processing power than a single 12-core processor. Still further, a cloud-based server with appropriate capabilities could function as the server 20.
The server 20, back-end computer 32, or the separate module associated with the server must include image recognition software to identify objects in the field of view of the imaging devices 22 or of a receiving computer device 34 having an imaging device. The server, back-end computer, or separate modules associated with the tracking system can be programmed to utilize the image recognition software and the imaging devices together. The tracking system can identify viewed objects by comparing images of objects taken by the imaging device to stored CG models of objects. The configuration and location of each imaging device 22 may be unique to the facility utilizing the tracking system, and the configurations and locations will be known and input into the tracking system. Furthermore, in one embodiment, the locations of the tracking readers 30 will also be known and input into the tracking system. Proprietary software, designed specifically for each facility using the tracking system, may also be used. By way of example only, and not by way of limitation, the proprietary software may use algorithms that identify objects to be located, search for the location of an object, provide descriptive information (including assembly information) about the object, monitor the locations of all or some objects, send alerts when an object is moved or a defect is detected, and display search results in a desired format.
In one embodiment, the database 24 may store CG models for each part or object to be tracked by the tracking system 10. The CG models stored in the database generally are created in a wireframe or node structure, similar to the wireframe structure of an object shown in
Also, it is possible that the server 20 can reference the CG model database and determine if the part or object being viewed by the imaging device 22 has any defects. This may be helpful to prevent a faulty part from being installed into the larger product. If only a certain percentage of nodes of the object visible to the imaging device match the nodes of a stored CG model for the object, then the tracking system can alert the user that the part may be defective. This percentage or percentage range will depend on the structure of the object.
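A minimal sketch of such a node-matching check is shown below. The distance tolerance, the 90% threshold, and the assumption that the scanned nodes are already aligned to the model's coordinate frame are illustrative only; as noted above, the acceptable match range depends on the structure of the object.

```python
import numpy as np

def node_match_fraction(observed, model, tolerance_mm=5.0):
    """Fraction of observed nodes that lie within `tolerance_mm` of some
    node of the stored CG model. Both inputs are (N, 3) arrays of points
    assumed to be expressed in the same coordinate frame (in practice the
    partial scan would first be aligned to the model)."""
    observed = np.asarray(observed, dtype=float)
    model = np.asarray(model, dtype=float)
    # Distance from every observed node to its nearest model node.
    dists = np.linalg.norm(observed[:, None, :] - model[None, :, :], axis=2)
    nearest = dists.min(axis=1)
    return float((nearest <= tolerance_mm).mean())

def flag_possible_defect(observed, model, required_fraction=0.9):
    # The 90% figure is illustrative; the acceptable percentage range
    # depends on the structure of the object.
    return node_match_fraction(observed, model) < required_fraction
```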
The database 24 may also store information concerning the objects or parts. For example, any information concerning the object may be stored, including a description of the object, shipping information, assembly information concerning the object and the like. This information will be associated or linked with the CG model of the object or part. The tracking system can send and display this information to a requesting user.
In one embodiment, as shown in
The tracking tag technology provides an approximate location of the object to the server 20 of the tracking system 10. This information can be used to identify the imaging device 22 in the best location to scan and view the area where the object is located, so that the tracking system can identify the object and show its exact location on a display of the receiving computer device 34. With the location information supplied by the tracking technology, the tracking system can identify a general location of the object, within approximately five feet of its exact location. The accuracy of the approximate location may differ depending on the technology being used. In one embodiment, the RFID reader 30 or other tracking technology can be in a centralized location of the facility or area of interest. The RFID reader is then capable of determining the location of an RFID tag 26 in relation to the RFID reader itself. As an example, the reader can determine whether an RFID tag is twenty meters due east of the location of the reader. The reader sends this location information to the server of the tracking system, which has the locations of the imaging devices and the location of the reader stored in memory. The system cross-references the location information of the object sent from the RFID reader 30 with the locations of the imaging devices 22, and is able to identify an imaging device with a field of view covering the approximate location of the object. In another embodiment, if the user is tracking an object using a mobile imaging device, the GPS coordinates of the mobile imaging device can be compared to those of the RFID reader in order to provide directions from the mobile device to the approximate location of the object. An image taken from the imaging device closest to the approximate location of the object can also be sent to the mobile imaging device of the user. This allows the user to stream a live feed from the depth-sensing camera to their mobile device. In this embodiment, the mobile device does not need a depth sensor installed to track the object.
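The cross-referencing step described above could be sketched as follows, assuming reader and camera positions are kept as facility floor coordinates and each camera's field of view is approximated by a simple coverage radius. The names, bearing convention, and values are illustrative assumptions of this sketch.

```python
import math
from dataclasses import dataclass

@dataclass
class ImagingDevice:
    device_id: str
    x: float                # facility floor coordinates, meters (assumed)
    y: float
    coverage_radius: float  # simplified stand-in for the true field of view

def tag_position(reader_x, reader_y, bearing_deg, distance_m):
    """Convert 'twenty meters due east of the reader' style data into
    facility coordinates (bearing measured clockwise from north, assumed)."""
    rad = math.radians(bearing_deg)
    return reader_x + distance_m * math.sin(rad), reader_y + distance_m * math.cos(rad)

def best_imaging_device(devices, tag_x, tag_y):
    """Pick the camera whose (simplified) coverage area contains the tag
    and whose center is closest to it."""
    in_view = [d for d in devices
               if math.hypot(d.x - tag_x, d.y - tag_y) <= d.coverage_radius]
    return min(in_view, key=lambda d: math.hypot(d.x - tag_x, d.y - tag_y),
               default=None)

# Example: tag reported 20 m due east (bearing 90 degrees) of the reader.
cams = [ImagingDevice("cam-01", 5.0, 0.0, 15.0), ImagingDevice("cam-02", 22.0, 2.0, 15.0)]
tx, ty = tag_position(0.0, 0.0, 90.0, 20.0)
print(best_imaging_device(cams, tx, ty).device_id)  # cam-02
```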
However, in another embodiment, a mobile device may include a built-in depth-sensing camera to scan the area and track an object. As an example only, a handheld three-dimensional mapping device may be used in place of or in combination with the imaging devices 22. It has been contemplated that the handheld three-dimensional mapping device could be used to scan certain areas that are not in the field of view of an imaging device 22. The handheld three-dimensional mapping device may include an integrated camera, integrated depth-sensing, and an integrated motion tracking camera. In this embodiment, the handheld three-dimensional mapping device can send a live feed or still images back to the server, and provide node and wireframe information of objects and the surrounding area. This information can then be used to locate or identify objects in the field of view of the handheld three-dimensional mapping device.
To identify the one or more objects, a marking or indicia 40 may be displayed on a real time image or schematic image of the area of interest (facility) to highlight or identify the location of the object as shown in
Any number of imaging devices 22 can be located around the area of interest, and it is preferred that enough imaging devices are installed to cover and view the entire area of interest, such as the entire floor of a warehouse. Also, the location and identity of each imaging device 22 installed around the area of interest is known to the server 20 and can be plotted on a map. Further, the system is programmed with the scope of the field of view of each imaging device 22, which allows the tracking system 10 to utilize a specific imaging device with a field of view covering the general location of a part established by the tracking tag. By using Z-depth cameras, stereoscopic cameras, or other three-dimensional imaging devices, additional information can be collected from the imaging devices, including the distance objects are located away from the imaging device.
As an example only, and not by way of limitation, one imaging device faces north, is mounted 12 ft in the air, and is angled downward toward the floor. The camera continually projects a single line, or "center line," out from the camera. This center line runs to the floor of the area and provides a first reference point for the imaging device. In this example, the distance from the imaging device to the floor along the center line is 20 ft. With this information, the system may construct a right triangle for its field of view. This triangle is 12 ft on its vertical leg and 20 ft on the hypotenuse, and using the Pythagorean theorem, the system knows that the distance along the floor from a point directly beneath the imaging device to the first reference point is 16 ft. This provides an exact distance from the camera to the first reference point along the floor. In this example, the software communicates with the imaging device to create a similar triangle every 1/1000 of a degree to the right and left of the center line until it fills the entire field of view of the imaging device. The system performs the same steps every 1/1000 of a degree above and below the center line. The number of iterations needed to create a node and/or wireframe structure will depend on the lens/camera of the imaging device and the desired definition of the three-dimensional image created by the system. This effectively maps the depth of the entire space.
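A brief sketch of this geometry is shown below. It assumes a flat floor and a known mounting height; the helper names and the stand-in for the camera's per-angle depth reading are illustrative assumptions only.

```python
import math

MOUNT_HEIGHT_FT = 12.0    # camera height above the floor (from the example)
CENTER_LINE_FT = 20.0     # slant distance along the center line (from the example)
STEP_DEG = 1.0 / 1000.0   # angular step of the sweep described in the text

# Horizontal (along-the-floor) distance to the first reference point:
# sqrt(20^2 - 12^2) = 16 ft.
floor_dist = math.sqrt(CENTER_LINE_FT**2 - MOUNT_HEIGHT_FT**2)

def floor_distance(slant_ft, height_ft=MOUNT_HEIGHT_FT):
    """Convert a measured slant range into distance along the floor,
    assuming the floor is flat and the camera height is known."""
    return math.sqrt(max(slant_ft**2 - height_ft**2, 0.0))

def sweep(measure_slant, half_width_deg, step_deg=STEP_DEG):
    """Sketch of the sweep: sample slant ranges every `step_deg` to either
    side of the center line and record the corresponding floor distances.
    `measure_slant(angle_deg)` stands in for whatever depth reading the
    imaging device returns at that bearing (an assumption of this sketch)."""
    n = int(half_width_deg / step_deg)
    return {i * step_deg: floor_distance(measure_slant(i * step_deg))
            for i in range(-n, n + 1)}

print(round(floor_dist, 1))  # 16.0
```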
Based on the information about the viewed area that the imaging devices 22 provide to the system, the system can reconstruct the area, and objects within the area, in a node and/or wireframe structure. As described above, node and/or wireframe information collected by the imaging devices is delivered to the tracking system to create at least a partial three-dimensional image of the object. Using image recognition software, the tracking system is able to compare the node/wireframe information of the object from the imaging device to CG models in the database 24 in order to identify the object. The CG models in the database 24 are three-dimensional images of the entire object, and the system compares the partial three-dimensional object acquired from the imaging device to all sides, views, or angles of the complete three-dimensional image of the CG model. The system will recognize if the partial three-dimensional object acquired from the imaging device is a match to any part of the complete three-dimensional image of the CG model.
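One hedged sketch of this comparison is shown below. It searches a set of trial orientations of each stored model and is only a stand-in for a full registration and image-recognition pipeline; translation, scale, and occlusion handling are omitted, and the thresholds and angle step are assumptions.

```python
import numpy as np

def nearest_point_rmse(partial, model_points):
    """Root-mean-square distance from each scanned point to its nearest
    model point (a crude stand-in for a full registration step)."""
    partial = np.asarray(partial, dtype=float)
    model_points = np.asarray(model_points, dtype=float)
    d = np.linalg.norm(partial[:, None, :] - model_points[None, :, :], axis=2)
    return float(np.sqrt((d.min(axis=1) ** 2).mean()))

def rotate_z(points, angle_deg):
    a = np.radians(angle_deg)
    rot = np.array([[np.cos(a), -np.sin(a), 0.0],
                    [np.sin(a),  np.cos(a), 0.0],
                    [0.0,        0.0,       1.0]])
    return np.asarray(points, dtype=float) @ rot.T

def identify(partial, cg_models, angles=range(0, 360, 15), max_rmse=10.0):
    """Return the ID of the stored CG model that best explains the partial
    scan over a set of trial orientations, or None if nothing fits.
    cg_models: dict of model_id -> (N, 3) array of node coordinates."""
    best_id, best_err = None, max_rmse
    for model_id, pts in cg_models.items():
        for ang in angles:
            err = nearest_point_rmse(partial, rotate_z(pts, ang))
            if err < best_err:
                best_id, best_err = model_id, err
    return best_id
```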
One embodiment of a method of using the tracking system 10 is described with reference to
At step 56, the system determines a more accurate location of the object 28. In this step, the server 20 can reference a CG model of the object 28 in the associated database 24 based on the identification of the part provided by the user. The server may then communicate with the imaging device 22 that is scanning or has a field of view covering the general location of the tracking tag 26 attached to the object 28. Designated imaging device(s) 22 transmit images to the server 20 to identify the requested object 28 by comparing images in the field of view of the imaging device to the stored CG image of the object obtained from the database 24. Using image recognition software installed on the server 20 or on a module associated with the server, the server can identify the exact location of the object 28 in the area of interest. Further, using information gathered by the imaging device 22, such as a Z-depth camera, the distance to the object from the imaging device can be established, and the viewing location and viewing angle of the imaging device will be known by the system to help pinpoint the location of the object 28.
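By way of example, the sketch below converts a camera's known position, viewing direction, and a measured slant distance into facility coordinates for the object. The pan/tilt conventions and the helper name are assumptions of this sketch.

```python
import math

def object_position(cam_x, cam_y, cam_z, pan_deg, tilt_deg, range_ft):
    """Project a depth reading out from the camera to estimate where the
    object sits in facility coordinates. Pan is measured clockwise from
    north and tilt downward from horizontal (both assumptions of this
    sketch); range_ft is the slant distance reported by the depth camera."""
    pan = math.radians(pan_deg)
    tilt = math.radians(tilt_deg)
    horizontal = range_ft * math.cos(tilt)        # distance along the floor
    return (cam_x + horizontal * math.sin(pan),   # east-west
            cam_y + horizontal * math.cos(pan),   # north-south
            cam_z - range_ft * math.sin(tilt))    # height above the floor

# Camera 12 ft up, pointed due north, tilted down to match the 12 ft / 20 ft example:
tilt = math.degrees(math.asin(12.0 / 20.0))
print(tuple(round(v, 1) for v in object_position(0.0, 0.0, 12.0, 0.0, tilt, 20.0)))
# -> approximately (0.0, 16.0, 0.0): on the floor, 16 ft due north of the camera
```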
Once the tracking system 10 pinpoints the location of the object 28 by identifying the object with the imaging device 22, the object can be displayed or highlighted on a display of the receiving computer device 34 at step 58 of
Furthermore, the tracking system 10 may include filtering options available to allow a user to view a description or other information of the object 28, and display one or multiple objects on a screen. Such filtering options may include viewing all objects in the facility, viewing one or more objects by part number or installation number, and viewing objects in a certain area of the facility. The system may display the filtering options 44 on a display 46 of the receiving computer device 34 as shown in
It has also been contemplated that the tracking system 10 may not require the use of the tracking tag 26. This embodiment of a tracking system 50 shown in
In one embodiment, the tracking system 50 tracks parts or objects 28 through a facility during a manufacturing process of a larger product, such as an aircraft, where the parts or objects are installed to create the larger product. It has also been contemplated that the tracking system can be used to monitor whether the parts have been correctly installed during manufacture of the larger product. This may be achieved by using the receiving computer device 34, including technology that provides a heads-up display ("HUD") on any screen or glass, or similar devices, coupled with a Z-depth, stereoscopic, or similar camera capable of viewing and providing images in three dimensions. As an example only, the HUD device may also include an integrated camera, integrated depth-sensing, or an integrated motion tracking camera for use with the system and the imaging devices 22. In one embodiment, the HUD device would use the integrated camera to recognize a part or object from a predetermined CG database. Information associated with the part would be linked to the CG model and include information on how the part is to be installed into the larger product, along with information on the progress of the manufacturing of the larger product. This information can be displayed to the user on the HUD device.
By way of example only, the tracking system 50 can be used when manufacturing an aircraft; however, it should be understood that the tracking system can be adapted to the construction of any larger product, so long as the final product has been constructed in the CG realm beforehand, including all of the smaller parts that are used to complete manufacturing of the larger product.
In one embodiment, the receiving computer device 34, such as a mobile imaging device, can take an image of the part or object 28. The server 20 can receive the image of the part 28 from the receiving computer device 34 or any imaging device 22, then recognize the part by comparing the captured image to the CG database. After identifying the part, the server 20 can display information about the part to the user on the receiving computer device 34, including information on installing the part. Since the final product has been digitally modeled, the server is able to recognize the area surrounding where the part is to be installed. An image of the surrounding area can be taken with the receiving computer device 34 (mobile imaging device), and the system can recognize the surrounding area where the part is to be installed and visually show the user the exact rotation and angle at which the part is to be installed. After installation is complete, an image of the installed part can be taken with the receiving computer device 34 (mobile imaging device or HUD device), and because the entire aircraft was created using CG models, the server can reference the CG model and, based on the angle/rotation of the part and its surroundings, determine whether or not the part is installed correctly. If the part is not installed correctly, an alert can appear on the display of the receiving computer device 34 and possibly offer solutions to fix the problem before progressing any further in manufacturing of the aircraft. If the part is installed correctly, the server can update the system that the part was installed. The system can do so for every part, ensuring that each part is installed correctly. Such a tracking system may result in major savings for a company that would otherwise spend additional time and money correcting the installation of an incorrectly installed part.
An example of a method of using the tracking system 50 is described with reference to
After initializing the tracking system 50, the server 20 uses algorithms based on close-proximity pixel edge relation (contrast, color difference/similarity, etc.) to identify like pixels that make up edges or sides of objects that are visible within the field of view of the imaging device 22 or devices at step 92 of
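A minimal sketch of this pixel-relation idea is shown below. The color-difference threshold and the use of only immediate horizontal/vertical neighbors are illustrative assumptions, not the actual algorithm.

```python
import numpy as np

def edge_mask(image, threshold=30.0):
    """Mark pixels whose color differs sharply from an immediate neighbor.
    `image` is an (H, W, 3) RGB array; the threshold (in 0-255 color units)
    is an illustrative assumption of this sketch."""
    img = np.asarray(image, dtype=float)
    diff_x = np.zeros(img.shape[:2])
    diff_y = np.zeros(img.shape[:2])
    diff_x[:, :-1] = np.linalg.norm(img[:, 1:] - img[:, :-1], axis=2)  # horizontal neighbor
    diff_y[:-1, :] = np.linalg.norm(img[1:, :] - img[:-1, :], axis=2)  # vertical neighbor
    return np.maximum(diff_x, diff_y) > threshold
```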
If the server 20 does not find matching outlines (wireframe or node structure) of the object 28 in the stored CG models of the database 24, the server 20 sends an alert to pre-approved personnel. The alert notifies the personnel that the object 28 at a certain location is not referenced in the database at step 96. The server continues to scan the area of interest until a tracked object is identified. When the server 20 identifies an outline from the database that matches the wireframe image of the object taken by the imaging device(s), the server determines and stores the location of the object in the database at step 98. In this step, it is the location of the object 28 within the field of view of the imaging device 22 that is being stored in the database 24. This location is stored until the object moves, in which case the imaging device(s) 22 identifies the movement and stores the next resting location of the object 28. If the object 28 is scanned by the receiving computer device 34 (mobile imaging device), the location of the object can be attached to the current location of the mobile device based on the GPS location of the mobile imaging device.
Once a location of the object 28 is established by the server 20, the server begins to continually track or monitor the location of the object at step 100. If it is determined by the server 20 that the object leaves its "safe zone" (pre-determined areas where the object should be, from manufacturing through installation) for five minutes (or any other designated amount of time), the server communicates with the imaging device(s) 22 in the area where the object was last detected and records the live video feed from about thirty seconds (or any other designated amount of time) before the object was last detected in the safe zone until predetermined personnel stop the recording. The server sends out alerts to the predetermined personnel about a lost object at step 104.
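The timing logic described above could be sketched as follows. The callables are assumed hooks into the rest of the system, and the interval values simply mirror the five-minute and thirty-second figures given as examples.

```python
import time
from collections import deque

GRACE_SECONDS = 5 * 60   # time allowed outside the safe zone (illustrative)
PRE_ROLL_SECONDS = 30    # earlier footage to keep (illustrative)

def monitor(get_location, in_safe_zone, get_frame, start_recording, send_alert,
            poll_seconds=1.0):
    """Sketch of the safe-zone watchdog. The five callables are assumed hooks:
    get_location() -> current position, in_safe_zone(pos) -> bool,
    get_frame() -> latest video frame, start_recording(frames) hands buffered
    pre-roll footage to the recorder, send_alert(pos) notifies personnel."""
    pre_roll = deque(maxlen=int(PRE_ROLL_SECONDS / poll_seconds))
    left_zone_at, snapshot = None, None
    while True:
        pos = get_location()
        pre_roll.append(get_frame())              # rolling pre-roll buffer
        if in_safe_zone(pos):
            left_zone_at, snapshot = None, None   # reset the timer
        elif left_zone_at is None:
            left_zone_at = time.monotonic()       # object just left the zone
            snapshot = list(pre_roll)             # footage from ~30 s before it left
        elif time.monotonic() - left_zone_at >= GRACE_SECONDS:
            start_recording(snapshot)             # recording includes the pre-roll
            send_alert(pos)
            left_zone_at = None
        time.sleep(poll_seconds)
```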
When the server 20 finds an outline (wireframe) from the database 24 that matches the outline (wireframe) of the image of the object 28 taken by the imaging device(s) 22, the server may display these outlines or other indicia on top of the live video stream generated by the imaging device on a display of the receiving computer device 34. The live video stream and overlaid image of the object 28 may also be sent to any web browser, application, mobile web browser, or the like. Also, a still image taken from the imaging device 22 may be used, and in other embodiments, a schematic drawing or other representation of the area of interest may be displayed. An example of a screen shot taken from the receiving computer device 34 is shown in
In one embodiment, the server 20 may display filtering options 44, such as searching for and viewing one particular object, searching for and viewing a group of particular objects, searching for and viewing objects in a specific area, searching for and viewing objects needed for a particular project or build, searching for and viewing the next object required for a particular project or build, searching for and viewing all objects in an area of interest, and the like. An example of filtering options 44 is also shown in
Also, the tracking system 50 may monitor parts or objects 28 throughout the entire manufacturing process of a larger product, such as an aircraft. The steps of this method are shown in
A user may also select an information box of the particular object or objects for display. This action causes a custom window or box 42 to open displaying the information of a selected object as shown in
On the display shown in
At step 124 of
At step 126 of
If the server 20 does not recognize the part or object 28, the server may display an alert box to the user stating the part is not recognized at step 130. The server or designated personnel can then determine whether the part does not belong in the manufacturing process or is simply defective, by referencing the imaged part and comparing it to its CG counterpart already stored in the database 24. The server 20 can also direct the user to the next part or object that is to be installed and provide its location in the safe area at step 132.
The user may use the tracking system 50 to help install parts during the manufacturing process of a larger product as described in the chart shown in
If the part is next in line to be installed, the tracking system 50 opens and displays the same full-scale real time CG model of the product (not shown) being built as used in the “view progress” menu discussed above. On the display of the full-scale real time model, the system may highlight the imaged or selected part in the exact location where the part is to be installed on the final product in step 150. The server may cause a dialog box to be displayed describing where the part is to be installed and instructions for installing the part. The server may also allow the user to zoom, rotate, and manipulate the CG model to assist the user in finding the installation location of the part. When in front of the installation location of the part, the system may prompt the user to take an image using the mobile imaging device of the installation location of the part at step 152. Utilizing the same process as described above and using the real time model being built as a reference, the tracking system is able to recognize the parts surrounding the part to be installed at the designated location. Comparing identified installed parts to the full-scale CG model, the system may determine if the surrounding installed parts have been installed correctly at step 154.
If the surrounding parts have not been installed correctly, the server may send an alert to predetermined personnel that a particular part or parts have not been installed correctly and that the image of the surrounding parts does not match the previously built full-scale CG model of the product being built. The system instructs the designated user to remove the incorrectly installed part, verify that it is the correct part for this location, and verify that it has no defects. If the system does not recognize the identified part or the part is recognized to have a defect by comparing the node/wireframe structure of the imaged part to the node/wireframe structures of the CG models in the database, the user is instructed to set the part aside. The system may also send out alerts to predetermined personnel of an unidentified or defective part, including the current location of the part. The system may then direct the user to the proper part and highlight its location within the facility so the user can easily find the proper part for installation. If the surrounding parts have not been installed correctly, but the system recognizes the parts to be the correct parts with no defects, the server can instruct the user to correctly re-install the parts.
If the surrounding parts have been installed correctly, the tracking system 50 may cause an instruction box to be displayed to the user, describing exactly how the next part in the installation process is to be installed at step 156 of
The system 50 may then verify that the part was installed correctly. If the part was installed correctly, the system digitally installs the part on the “view progress” model and updates the progress of the manufacturing process for the product. The system may then identify the next part to be installed based on the pre-installation guide at step 160. The server continues to track the installation of all parts, through the same installation process until the installation phase of the product is complete and all parts have been installed and verified. Depending on the facility and the workflow, the tracking system 50 may be used to repeat the production of the same product or may be updated and installed with information for completing manufacturing of a different product.
In yet another embodiment, the tracking systems 10 or 50 can be used to identify certain indicia affixed to an object, container, or area storing parts or objects. By way of example only, a QR code may be affixed to the object or container and read by the imaging devices 22. By reading the QR code, the system can track a box of small parts through a facility. Other types of indicia affixed to the object or container could be specific colors and/or shapes on the container, such as a yellow triangle or blue square. A barcode could be affixed to the container if it is large enough to be read by the imaging devices 22. Further, relatively large colored letters, numbers, or symbols could also be affixed to the object or container so that the imaging device 22 could identify the object/container and its contents. In one embodiment, ultraviolet ink could form the indicia. The ultraviolet ink may be readily identifiable by the system. The system may recognize any type of indicia, color, or symbol. The database 24 would store the indicia, color, or symbol associated with any object or container of objects. This will allow the system to reference information concerning the object or container of objects and allow the system to track the objects. In these embodiments, any type of high-resolution camera could be used as the imaging device to identify any indicia, color, or symbol.
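By way of example only, reading a QR code from a camera frame could be sketched with OpenCV (assumed to be available) as follows; the indicia database is represented here as a plain dictionary standing in for database 24.

```python
import cv2  # OpenCV, assumed available

def identify_container(frame, indicia_db):
    """Try to decode a QR code in the camera frame and look the result up
    in the indicia database (a plain dict here, standing in for the
    system's database 24)."""
    data, points, _ = cv2.QRCodeDetector().detectAndDecode(frame)
    if not data:
        return None              # no readable QR code in this frame
    return indicia_db.get(data)  # e.g. {"PALLET-0042": {"contents": "hinge kits"}}
```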
One of ordinary skill in the art will appreciate that not all tracking systems will have all these components and may have other components in addition to, or in lieu of, those components mentioned here. Furthermore, while these components are viewed and described separately, various components may be integrated into a single unit in some embodiments.
The various embodiments described above are provided by way of illustration only and should not be construed to limit the claimed invention. Those skilled in the art will readily recognize various modifications and changes that may be made to the claimed invention without following the example embodiments and applications illustrated and described herein, and without departing from the true spirit and scope of the claimed invention, which is set forth in the following claims.
This application claims the benefit of U.S. provisional patent application No. 61/791,323, filed Mar. 15, 2013, which is herein incorporated by reference in its entirety.