The present invention relates to an object detection and tracking system to identify and determine the location of objects in an area, and more particularly to an object detection system including different sensor systems to detect objects in the area.
Many existing object detection systems include cameras to identify objects (e.g., humans, animals, etc.) moving through or within an area. Other systems include a thermal sensor system or a motion detection system to identify the presence of an object in an area, but thermal sensor systems and motion detection systems are limited in their ability to identify what the object is or to accurately determine the object's location. In addition, existing systems often falsely identify objects, or miss objects entirely. For object detection systems with a camera, these failures can be attributed to viewing limitations of the camera when trying to detect, identify, or determine the position of an object. Moreover, a camera system alone cannot accurately locate a person in an area when the person's physical size is used to estimate their location, because human height and size vary widely. Existing systems also tend to be complicated to install, calibrate, and update as the environment changes (e.g., when machinery or equipment is moved around).
The invention provides an object detection and tracking system that detects objects in an area using, in one example, a visual image sensor system, a thermal sensor system, and an object location mapping system or location mapping system (e.g., a three-dimensional (3D) mapping or identification system). The object detection system is constructed to identify, determine the position of, and track an object with a thermal heat range that is consistent with the thermal heat range of a human. The object detection and tracking system may automatically define and adapt to a work space of the area being monitored.
In one aspect, the invention provides an object detection and tracking system including a visual image system that captures one or more visual images of an area, and a thermal image system that captures one or more thermal images of the area concurrently with the visual image system capturing the visual image(s). The thermal image system detects a thermal signature of an object in the area, and the visual image system and the thermal image system cooperate to determine that the object is the same in the visual image and in the thermal image. The systems also identify the object as being different from the thermal background in the thermal image. The object detection and tracking system also includes a location mapping system that determines a location of the object in the area. The visual image system, the thermal image system, and the location mapping system facilitate identification of the object and tracking of the object in the area. The coordinated use of the visual image and the thermal image for the same object makes determining the location of the object substantially more reliable and consistent. The addition of LiDAR, 3D cameras (including time-of-flight cameras), 3D stereo cameras, other types of 3D sensor imaging, or other point cloud methods makes determining the object's location very accurate regardless of the object's specific shape.
In another aspect, the invention provides an object detection and tracking system including a visual image system positioned relative to an area to capture a visual image of at least a portion of the area, and a thermal image system positioned relative to the area to capture a thermal image of at least a portion of the area concurrently with capture of the visual image to cooperatively identify an object in the area having a thermal signature. The portion of the area captured in the thermal image conforms to the portion of the area captured in the visual image. A location mapping system is positioned relative to the area to determine a location of the object in the area.
In another aspect, the invention provides an object detection and tracking system including a visual image system positioned relative to an area to capture a visual image of at least a portion of the area, a thermal image system positioned relative to the area to capture a thermal image of at least a portion of the area, and a location mapping system positioned to determine a working space of the area (e.g., in 3D). The working space is defined by one or more portions of the area that are unobstructed by one or more inanimate objects relative to a first viewpoint of the visual image system and a second viewpoint of the thermal image system. The location mapping system generates a map of the working space of the area.
Before any embodiments of the present invention are explained in detail, it should be understood that the invention is not limited in its application to the details of construction and the arrangement of components as set forth in the following description or as illustrated in the drawings. The invention is capable of other embodiments and of being practiced or of being carried out in various ways. It should be understood that the description of specific embodiments is not intended to limit the disclosure from covering all modifications, equivalents, and alternatives falling within the spirit and scope of the disclosure. Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting.
With reference to
The thermal image system 18 (e.g., an infrared image system operating in the mid-infrared (“MIR”) or far-infrared (“FIR”) band, etc.) facilitates identification of the objects 34 and works simultaneously or concurrently with the visual image system 14 to do so. More specifically, the thermal image system 18 captures a thermal image 42 of the same or a similar portion of the area 26 that is captured by the visual image system 14. For example, the visual image system 14 may have a first viewpoint to capture the visual image 38 of at least a portion of the area 26, and the thermal image system 18 may have a second viewpoint to capture the thermal image 42 conforming to that portion of the area 26. In other words, the visual image 38 and the thermal image 42 might have slightly different extents (e.g., left/right and top/bottom borders of the captured images 38, 42), but the visual and thermal images 38, 42 each capture generally the same portion of the area 26.
The thermal image system 18 includes one or more thermal sensors 40 that detect thermal attributes of objects 34 (e.g., via the thermal signature associated with the objects 34) that are in the area 26. For purposes of the description and the claims, the term “thermal object” refers to an object that has thermal attributes or a thermal signature that falls within a predefined thermal heat range. Also for purposes of the description and the claims, the term “object of interest” may refer to the thermal object. For example, mammals typically have thermal heat signatures in a range from 97° F. to 105° F., birds have thermal heat signatures of approximately 105° F., and cold-blooded animals have thermal heat signatures in a range from 50° F. to approximately 100° F. In addition, vehicles may include motors or engines that operate in a range from approximately 100° F. to 200° F., or even higher. As a result, the predefined thermal range may be from 50° F. to 200° F. to detect any thermal object from cold-blooded animals to vehicles. In some constructions, the predefined thermal range may be from 90° F. to 200° F. to detect mammals, birds, and vehicles. In other constructions, the predefined thermal range may be from 70° F. to 150° F. to ensure that mammals (e.g., people) with thermally-insulating clothing, as well as vehicles, can be detected. The ODT system 10 may determine the type of thermal object that is detected by comparing the thermal signature to the predefined thermal range. In some constructions, the ODT system 10 may determine, via the processing unit 30, whether the thermal signature of the object 34 differs from the background in the thermal image based on a comparison of a color of the thermal signature relative to a color of the background, or some other comparison of the object 34 relative to the background of the thermal image. When the thermal signature of the object 34 differs from the background, the ODT system 10 communicates with the visual image system 14 and the location mapping system 22 to facilitate identifying and locating the object.
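By way of a non-limiting illustration, the range comparison described above may be implemented along the lines of the following Python sketch. The code is not part of the disclosed system; the category names and exact range boundaries are assumptions drawn from the example temperatures given in this paragraph.

```python
# Illustrative range tables; boundaries follow the example temperatures in the
# text above and are assumptions, not disclosed values.
THERMAL_RANGES_F = {
    "cold-blooded animal": (50.0, 100.0),
    "mammal": (97.0, 105.0),
    "bird": (104.0, 106.0),      # "approximately 105° F."
    "vehicle": (100.0, 200.0),   # motors/engines; may run even higher
}

PREDEFINED_RANGE_F = (50.0, 200.0)  # broadest example range in the text

def classify_thermal_object(temperature_f: float) -> list[str]:
    """Return candidate object types whose heat range contains the reading."""
    low, high = PREDEFINED_RANGE_F
    if not (low <= temperature_f <= high):
        return []  # outside the predefined thermal heat range: not of interest
    return [kind for kind, (lo, hi) in THERMAL_RANGES_F.items()
            if lo <= temperature_f <= hi]
```

Note that a reading such as 98.6° F. falls in both the mammal and cold-blooded ranges; as described below, the visual image system 14 cooperates with the thermal image system 18 to resolve such ambiguity.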
Although the example described in detail and illustrated in the Figures is focused on the objects as people, it will be appreciated that the thermal object(s) 34 may include other mammals or operating equipment (e.g., vehicles). In addition, it should be appreciated that the thermal object 34 may be only a portion of a person (e.g., a head, leg, arm, etc.) or a portion of another thermal object. In general, the ODT system 10 identifies the object of interest by differentiating the object from the visual background in the visual image and from the thermal background in the thermal image. The processing unit 30, or another controller or processor of the system 10, communicates with the visual image system and the location mapping system to facilitate identifying and locating the object when the thermal signature of the object is within the predefined thermal heat range or when the heat signature of the object 34 is differentiated relative to the background of the thermal image.
The location mapping system 22 (e.g., Light Detection and Ranging (“LiDAR”), 3D camera technology, stereo camera technology, or other 3D imaging or mapping technology) determines a location or position of the objects in the area 26, including objects 34, on an X-Y map of the area 26 using one or more light sensors 44 (e.g., pulsed lasers 50). The light sensor 44 determines the distance to the objects from the position of the location mapping system 22 (e.g., in the same vicinity as the systems 14, 18). By determining the location of the objects 34 and other objects in the area 26, as well as portions of the area 26 without any objects, the location mapping system 22 utilizes information from the light sensors 44 to generate an X-Y map of the working space (e.g., where people can move about) of the area 26 and the location of inanimate objects (e.g., objects not of interest). The location mapping system 22 can define the working space prior to or concurrent with determining a location of an object 34 in the area 26.
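As a minimal, hypothetical sketch of the X-Y mapping step (assuming a single planar LiDAR whose returns arrive as (distance, bearing) pairs; the function names are illustrative only):

```python
import math

def lidar_return_to_xy(distance_m: float, bearing_deg: float,
                       sensor_xy: tuple[float, float] = (0.0, 0.0)) -> tuple[float, float]:
    """Project a single range/bearing return onto the X-Y floor map."""
    theta = math.radians(bearing_deg)
    return (sensor_xy[0] + distance_m * math.cos(theta),
            sensor_xy[1] + distance_m * math.sin(theta))

def build_xy_map(returns: list[tuple[float, float]]) -> list[tuple[float, float]]:
    """Sweep every (distance_m, bearing_deg) return into floor-map coordinates;
    the unobstructed space between the sensor and each return, taken over the
    full sweep, corresponds to the working space described above."""
    return [lidar_return_to_xy(d, b) for d, b in returns]
```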
As shown in
In general, the visual image system 14 does not require a high resolution optical sensor 32 to independently identify objects in the area 26, and the thermal image system 18 does not require high resolution to identify objects based on their thermal image. Instead, the two systems 14, 18 work together to more precisely identify the object(s) 34 to be tracked. More specifically, the ODT system 10 uses the signals and data from the visual image system 14 and the thermal image system 18 to determine whether a thermal object (e.g., the people 34) that is in a particular location in the visual image 38 is also in the same or a similar location in the thermal image 42. If so, the ODT system 10 identifies the object as a thermal object 34 to be tracked in the area 26. For example, when the thermal image system 18 detects that the heat signature of the thermal object 34 is within the predefined thermal heat range, the thermal image system 18 may communicate with the visual image system 14 and the location mapping system 22 to facilitate identification of the thermal object. That is, the ODT system 10 compares the position of the thermal object 34 in the visual image 38 to the position of the thermal object 34 in the thermal image 42. If the positions in each image 38, 42 are the same or substantially the same (e.g., the object 34 in the visual image 38 coincides with the object in the thermal image 42), the ODT system 10 determines that the visual profile in the visual image and the thermal signature in the thermal image belong to the same thermal object 34, which is then tracked by the system 10.
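The position comparison between the visual image 38 and the thermal image 42 may, for example, be realized as a bounding-box overlap test. The sketch below assumes both detections have been registered to a common pixel frame; the 0.5 overlap threshold is an illustrative assumption, not a disclosed value.

```python
Box = tuple[float, float, float, float]  # (x1, y1, x2, y2) in a shared frame

def iou(a: Box, b: Box) -> float:
    """Intersection-over-union of two axis-aligned bounding boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0.0 else 0.0

def is_same_object(visual_box: Box, thermal_box: Box,
                   overlap_threshold: float = 0.5) -> bool:
    """True when the visual and thermal detections coincide closely enough
    to be treated as one thermal object to be tracked."""
    return iou(visual_box, thermal_box) >= overlap_threshold
```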
The ODT system 10 also leverages the information or signals from the location mapping system 22 (via the sensor(s) 44) to detect the precise position of the object 34 on the X-Y map based on the identification of the object 34 by the visual image system 14 and the thermal image system 18. Because the visual image system 14 and the thermal image system 18 cooperatively identify the object(s) 34, the location mapping system 22 does not require high resolution to separately or independently identify the object by shape, size, or thermal signature. The ODT system 10 overlays the cooperative information from each of the three independent systems 14, 18, 22 to identify an object 34, locate the object 34, and track the object 34. For example, when the identification information from the visual image system 14 and the thermal image system 18 is used with the 3D location information of the object 34 to be tracked from the location mapping system 22, the Y-location on the X-Y map of the image being analyzed can be used to determine the precise location or position of the object in the 3D area 26. The Y-location information may be helpful when tracking a person who is wearing thermally-insulated clothing or footwear, which can make thermal detection of the person more difficult. In some constructions, the ODT system 10 may alert personnel or equipment in the area 26 (e.g., via the processing unit 30) that an object 34 is in the area 26, and provide the location of the object 34, even when the object 34 is moving.
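One hypothetical way to overlay the identification with the 3D location information is to match the bearing of the identified object against the LiDAR returns and read the position off the X-Y map; the tolerance value below is an assumption for illustration only.

```python
import math

def locate_identified_object(object_bearing_deg: float,
                             lidar_returns: list[tuple[float, float]],
                             tolerance_deg: float = 2.0) -> tuple[float, float] | None:
    """Among (distance_m, bearing_deg) returns, pick the one whose bearing best
    matches the identified object's bearing and project it onto the X-Y map."""
    matches = [(d, b) for d, b in lidar_returns
               if abs(b - object_bearing_deg) <= tolerance_deg]
    if not matches:
        return None  # no return in view; the object may sit in a blind zone
    d, b = min(matches, key=lambda r: abs(r[1] - object_bearing_deg))
    theta = math.radians(b)
    return (d * math.cos(theta), d * math.sin(theta))
```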
The ODT system 10 can dynamically capture images 38 or record a video of the area 26 so that personnel can actively view or monitor the area 26 relative to equipment that may be operating in the area 26 to inhibit adverse interactions between the equipment and the object(s) being monitored. It will be appreciated that the captured images 38 may be analyzed by the ODT system 10 for real-time pedestrian detection (e.g., to determine potential or real collisions with equipment or other objects, for security purposes, etc.). For example, when a person 34 is in the area 26, the ODT system 10 may send a signal to the equipment or other personnel that the person 34 and the equipment may collide. In the illustrated construction, the signal may be sent by the processing unit 30 via an output 54 (e.g., a wireless output or alarm on equipment). The ODT system 10 may also alert personnel monitoring the area 26 that a person or other object 34 is in the area 26 (e.g., when the person is not supposed to be in the area 26). In some constructions, the visual image system 14, the thermal image system 18, and the location mapping system 22 may communicate with the processing unit 30 to analyze and store (e.g., via a cloud based storage system) the images or video, including the processed images 38, 42 and information. The output 54 may include a display (e.g., a computer, a mobile device, etc.) that allows a user to view the processed information (including the image 38, the image 42, and the information from the sensor(s) 44). In some constructions, the output 54 may perform other functions (e.g., tie an identification or location signal to an alarm, warning, or alert system on a forklift).
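A collision warning of the kind described might be reduced to a simple distance check on the X-Y map; the safe-distance threshold in the sketch below is purely illustrative and not a disclosed value.

```python
import math

def collision_warning(person_xy: tuple[float, float],
                      equipment_xy: tuple[float, float],
                      safe_distance_m: float = 3.0) -> str | None:
    """Return an alert string when a tracked person comes within the (assumed)
    safe distance of operating equipment on the X-Y map; otherwise None."""
    if math.dist(person_xy, equipment_xy) < safe_distance_m:
        return "ALERT: possible collision between person and equipment"
    return None
```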
The facility map created by the ODT system 10 allows for predictive tracking data to be collected because the object(s) may move in or between one or more of the first workspace zone 112a, the second workspace zone 112b, and the third workspace zone 112c. As shown in
The working space 112 may change due to the inanimate objects 116 being maneuvered in the area 126 during typical work flow within the area 126. As a result, the ODT system(s) 10 may automatically adjust the working space 112 when any changes within the area 126 occur. For example, a temporary load of stacked pallets may be positioned in a spot that blocks the view of some or all of the systems 14, 18, 22 of the ODT system 10. The ODT system 10 may determine that the area behind the pallets is a blind zone and use the adjusted working space 112 to predict the next location of an object to be tracked. By dynamically determining the working space 112, the ODT system 10 uses the images and information from the visual image system 14, the thermal image system 18, and the location mapping system 22 to identify and track objects 34 in the working space 112.
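One possible (assumed, not disclosed) realization of this dynamic adjustment is an occupancy-grid refresh in which cells the sensors cannot currently see become blind zones; the cell states and function name below are hypothetical.

```python
import numpy as np

FREE, OCCUPIED, BLIND = 0, 1, 2  # illustrative cell states

def refresh_working_space(shape: tuple[int, int],
                          occupied_cells: set[tuple[int, int]],
                          visible_cells: set[tuple[int, int]]) -> np.ndarray:
    """Rebuild the working-space grid from the latest scan: cells the sensors
    can currently see are free or occupied; everything else (e.g., behind a
    temporary stack of pallets) is treated as a blind zone."""
    grid = np.full(shape, BLIND, dtype=np.int8)
    for r, c in visible_cells:
        grid[r, c] = FREE
    for r, c in occupied_cells:
        grid[r, c] = OCCUPIED
    return grid
```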
It should be appreciated that the ODT system 10 can be implemented as a self-learning, adaptive system that updates the working space 112 based on movement of objects within the area 26 (e.g., for real-time or near-real-time monitoring of the area 26). The visual image system 14 and the thermal image system 18 are visual systems that use background detection and edge detection, and that produce images that are analyzed to determine whether an object is an object of interest. The location mapping system 22 then provides information regarding the location of the object of interest relative to the place where the system 22 is implemented (e.g., a distance from the sensor(s) 44 to the object of interest). In this way, the ODT system 10 provides a map so that, after the system 10 determines that an object is an object 34 to be tracked, the location of the object 34 can be accurately determined within the X-Y map. The location mapping system 22 indicates where all objects and physical boundaries are located within the area 26, including the floor, walls, and any objects (e.g., racking or shelves, etc.) on or above the floor. This 3D mapping can be accomplished using LiDAR, 3D cameras, stereo cameras, sound, or radar. The system can automatically calibrate and set up the X-Y floor map and adapt as the area 26 changes. In this way, the ODT system 10 can intelligently scan the floor and determine active tracking areas as well as blind areas within the space being monitored.
While the example described in detail herein relates to monitoring a warehouse or manufacturing facility, and aspects related to a warehouse or manufacturing facility, it should be appreciated that the ODT system 10 may be used to monitor any area 26, 126. By combining 3D mapping characteristics with optical and thermal imaging, the system can be automatically set up and can easily adapt to changes in the area that is being monitored.
The embodiment(s) described above and illustrated in the figures are presented by way of example only and are not intended as a limitation upon the concepts and principles of the present disclosure. As such, it will be appreciated that variations and modifications to the elements and their configurations and/or arrangement exist within the scope of one or more independent aspects as described.
This application claims priority to U.S. Provisional Patent Application No. 63/040,936, filed on Jun. 18, 2020, and entitled “Object Detection and Tracking System,” the entire contents of which are hereby incorporated by reference.