The present disclosure relates to robot navigation systems, and more particularly, to a sensor arrangement for use in robotic devices, such as a robotic surface cleaning device, that utilizes time of flight (ToF) sensors for navigation and localization.
Some navigation approaches for robotic devices include utilizing imaging systems to identify objects in an environment for mapping and localization purposes. Such systems may include one or more image sensors to perform object detection, wall tracking, and so on. For example, such systems may include multiple image sensors that each have a different field of view.
One such navigation and localization approach includes utilizing a simultaneous localization and mapping (SLAM) algorithm with image sensor data as an input. Often, multiple image sensors are utilized to ensure that front, back, and side views are captured so that environmental features/obstructions are factored into navigation decisions. Multiple image sensors can be particularly important when a robotic device can move in potentially any direction based on rotation about a center axis of the same. This ensures that the robotic device collects a sufficient amount of environmental data from each field of view to prevent collisions, falling down stairs, and so on. However, image sensors increase both the cost and complexity of manufacturing robotic devices, and necessitate having sufficient hardware/software resources to capture and process multiple simultaneous image data streams.
These and other features and advantages will be better understood by reading the following detailed description, taken together with the drawings wherein:
In general, the present disclosure is directed to a time of flight (ToF) sensor arrangement that may be utilized by a robot device, e.g., a robotic surface cleaning device (or vacuum) or other robotic device, to identify and detect objects in a surrounding environment for mapping and localization purposes. In an embodiment, a robot is disclosed that includes a plurality of ToF sensors disposed about a housing of the robot. Two or more ToF sensors may be angled/aligned to establish at least partially overlapping fields of view to form redundant detection regions around the robot. Objects that appear simultaneously, or nearly simultaneously, in the overlapping fields of view may then be detected by the robot and utilized to positively identify, e.g., with a high degree of confidence, the presence of an object. The identified objects may then be utilized as data points by the robot to build/update a map. The identified objects may also be utilized during pose routines that allow the robot to orient itself within the map with a high degree of confidence.
Although the following aspects and embodiments specifically reference robotic vacuums, this disclosure is not limited in this regard. The following disclosure is equally applicable to any type of robot that seeks to intelligently understand and traverse an environment by identifying objects/features, e.g., furniture, walls, toys, people, pets, and so on, and their position in the environment. In addition, the ToF sensor arrangement disclosed variously herein may be utilized in a robot without an image sensor system for object identification/tracking, or alternatively, may be used in combination with an image sensor system.
The ToF sensor arrangement disclosed herein allows environmental information to be collected in a relatively simple manner and used alone or in combination with other sensor types such as image sensors. When used as a replacement for image sensors, the ToF sensor arrangement advantageously and significantly reduces the cost, complexity, and computational load on the hardware resources of the robotic device.
As generally referred to herein, a ToF sensor refers to any sensor device capable of measuring the relative distance between the sensor and an object in an environment. Preferably, infrared-type ToF sensors may be utilized, wherein each infrared ToF sensor includes an IR transmitter and receiver. However, other sensor types may be utilized such as acoustic ToF sensors that emit and receive sound waves, e.g., ultrasound, for measurement purposes.
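By way of illustration only (a minimal sketch of the measurement principle, not tied to any particular sensor's interface), a light-based ToF reading converts a round-trip time into distance as half the round-trip time multiplied by the speed of light:

SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def tof_distance_m(round_trip_time_s: float) -> float:
    """Distance implied by the time for emitted light to return to the sensor."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2.0

# Example: a round trip of about 6.67 nanoseconds corresponds to roughly 1 m.
print(tof_distance_m(6.67e-9))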
Referring now to
The housing 102 may have any shape and is not necessarily limited to the shape shown in the figures (e.g., circular). For example, the housing 102 may have a square shape, a D-shape, a triangular shape, a circular shape, a hexagonal shape, a pentagonal shape, and/or any other suitable shape. In some instances, the positioning of the ToF sensors 104, relative to the housing 102, may be based, at least in part, on the shape of the housing 102.
Each of the plurality of time of flight sensors 104-1 to 104-n may comprise any sensor capable of measuring relative distance between the sensor and an object in a surrounding environment and converting the same into a representational electrical signal. For example, the time of flight sensors 104-1 to 104-n may comprise infrared laser-type sensors that utilize infrared wavelengths to output a measurement distance signal, which may be referred to herein as simply a measurement signal. In other examples, the time of flight sensors 104-1 to 104-n may comprise sensors capable of measuring distance acoustically via soundwaves, e.g., ultrasound. In any event, the time of flight sensors 104-1 to 104-n may comprise short-range sensors capable of measurements from a few centimeters to a meter, or long-range sensors capable of measurements from 1 meter to hundreds of meters, or a combination of both short and long-range ToF sensors.
As discussed in further detail below, the navigation controller 103 may receive measurement signals from the ToF sensors 104-1 to 104-n to identify objects in the environment of the robot. In an embodiment, the location of the identified objects relative to a known position of the robot may be utilized to update/build a map from a point cloud, e.g., a plurality of points that may be utilized to generate a map. The identified objects may also be utilized to calculate robot odometry and pose in order to localize the robot within the map. The ToF sensors 104-1 to 104-n may be used exclusively to identify objects in an environment, e.g., without the aid of image sensors or other like devices, or may be used in combination with image sensor(s).
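As a rough illustration of how such measurement signals might feed a point cloud, the sketch below (hypothetical function and parameter names, simplified two-dimensional geometry) projects distance readings from sensors at known mounting angles into map coordinates relative to a known robot pose:

import math
from typing import List, Optional, Tuple

def tof_points_to_map(robot_x_m: float, robot_y_m: float, robot_heading_rad: float,
                      sensor_angles_rad: List[float],
                      distances_m: List[Optional[float]]) -> List[Tuple[float, float]]:
    """Project each ToF distance reading into map coordinates.

    Each sensor is treated as mounted at the robot center and aimed at a fixed
    angle relative to the robot heading (a simplification for illustration).
    """
    points = []
    for angle, dist in zip(sensor_angles_rad, distances_m):
        if dist is None:  # no return from this sensor on this cycle
            continue
        bearing = robot_heading_rad + angle
        points.append((robot_x_m + dist * math.cos(bearing),
                       robot_y_m + dist * math.sin(bearing)))
    return points

# Example: two sensors aimed 30 degrees to either side of the heading report
# objects at 1.0 m and 0.8 m; each reading becomes one point in the cloud.
print(tof_points_to_map(0.0, 0.0, 0.0,
                        [math.radians(30), math.radians(-30)],
                        [1.0, 0.8]))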
The ToF sensor 104-1 includes a field of view (FOV) 108, which can have a generally conical shape. Each FOV may also be referred to herein as a detection region. When observed from the top, e.g., as shown in
Turning to
This configuration shown in
Thus, in response to the object appearing in the overlapping region 210, a height determination for the object 212 may then be calculated. The calculated height for the object 212 may then be used by the robot navigation system 100 during localization and navigation. For example, height determinations by the robot navigation system 100 using the vertically-stacked ToF arrangement may be advantageously utilized to distinguish between objects/obstructions in an environment that can be navigated around, e.g., furniture, toys, and so on, by the robot 109 versus objects/obstructions that cannot be navigated around such as walls and windows.
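As a loose illustration of how such a height determination could be made with vertically stacked sensors, the following sketch (hypothetical names and simplified geometry, not necessarily the specific method of the disclosure) bounds an object's height using the vertical coverage band of the upper sensor at the measured range:

import math

def fov_band_at_range(mount_height_m, tilt_rad, half_angle_rad, dist_m):
    """Vertical extent (low, high) covered by a conical FOV at a given range."""
    low = mount_height_m + dist_m * math.sin(tilt_rad - half_angle_rad)
    high = mount_height_m + dist_m * math.sin(tilt_rad + half_angle_rad)
    return low, high

def min_object_height_m(upper_sensor, dist_m):
    """An object detected by the upper of two stacked sensors must reach at least
    the bottom of that sensor's coverage band at the measured range."""
    upper_low, _ = fov_band_at_range(*upper_sensor, dist_m)
    return max(0.0, upper_low)

# Example: upper sensor mounted 0.12 m above the floor, aimed level, with a
# 12 degree vertical half-angle; an object detected at 0.5 m must be at least
# roughly 0.016 m tall to appear in that sensor's FOV at that range.
upper = (0.12, 0.0, math.radians(12))
print(min_object_height_m(upper, 0.5))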
Each of the first and second ToF sensors 104-1, 104-2 may optionally be part of first and second ToF arrays 202-1, 202-2, respectively. Each of the first and second ToF arrays 202-1, 202-2 can include ToF sensors disposed in a uniform manner relative to each other about the housing 102 of the robot, such as shown, or may be disposed at varying distances relative to each other. As further shown in
In addition, each of the first and second ToF sensors 104-1, 104-2 may optionally be part of first and second ToF arrays 202-1, 202-2, respectively. In this embodiment, the first and second ToF arrays 202-1, 202-2 may be disposed in a staggered manner. To this end, the robot navigation system 100 may utilize overlapping FOVs from the first and second ToF sensors 104-1, 104-2, or from the first and third ToF sensors 104-1, 104-3. Accordingly, this staggered configuration provides the robot navigation system 100 with flexibility as to which combinations of sensors, and by extension, which overlapping FOVs, to utilize when calculating heights of objects/obstructions within the same. Note while the embodiments of
As is known, light-based ToF sensors measure relative distances to objects by reflecting light off of an object and measuring the duration of time for the light to be reflected back to the sensor. These calculations rely, in part, on the speed of light remaining constant. As a robot travels forward, the change in reported distance combined with the time interval between the reported changes in distance may be used to calculate the real-time speed of the robot, for instance. Therefore, the speed of the robot may be given by the following equation:
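Consistent with that description, the relationship is presumably speed = Δd/Δt, i.e., the change in reported distance divided by the elapsed time between readings. A minimal sketch, assuming that relationship and using illustrative values:

def robot_speed_m_per_s(prev_dist_m, curr_dist_m, dt_s):
    """Speed inferred from consecutive ToF readings to a fixed object ahead:
    speed = |change in reported distance| / time between readings."""
    return abs(prev_dist_m - curr_dist_m) / dt_s

# Example: the reported distance to a wall ahead drops from 1.00 m to 0.95 m
# over 0.2 s of forward travel, implying a speed of 0.25 m/s.
print(robot_speed_m_per_s(1.00, 0.95, 0.2))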
In an embodiment, the angles of the FOVs may allow the FOVs of two or more ToF sensors to at least partially overlap. These overlapped regions may also be referred to as redundant detection regions. For example, the embodiment of
In one specific example embodiment, the ToF 104-1 may be initially relied upon to track odometry. If the ToF 104-1 is not registering any objects, e.g., measurements are at or below a threshold floor value for distance, or tracking an object that has yet to enter the FOV of the other ToF sensors 104-2 to 104-7, the navigation controller 103 may utilize the other sensors that have overlapping FOVs to positively identify objects through multi-sensor detection, e.g., by sensors 104-3 and 104-6. In response to the detection of an object, the navigation controller 103 may “hand off” tracking to the ToF sensor with the FOV that is more likely to continue to have the object in view.
For example, consider a scenario where the navigation controller 103 detects object 112 simultaneously, or nearly simultaneously, entering the FOV 111-3 of ToF sensor 104-3 and the FOV 111-6 of ToF sensor 104-6. The robot 109 may then “hand off” tracking to the sensor 104-6 as the associated FOV 111-6 can continue to detect the object 112 over the entire distance D as the robot 109 moves along direction 110. On the other hand, the sensor 104-3 only remains capable of detecting object 112 for distance d before the same is outside the detection range of FOV 111-3. Accordingly, distance d represents the extent of the redundant region by which both ToF sensors 104-3 and 104-6 can track object 112 while distance D represents the extent of the entire region by which ToF sensor 104-6 can detect presence/distance of the object 112 (assuming forward movement along direction 110). In a general sense, this “handing off” allows for objects to be tracked by a single sensor with a relatively high degree of confidence that the object is present, and the relative distance of the robot 109 to the objects. This can be particularly advantageous when attempting to maintain a particular distance from an object such as a wall, furniture or other obstruction, or to otherwise adjust operation of the robot 109 based on the presence and distance of the object 112.
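A rough sketch of this hand-off idea, using hypothetical data structures: once an object is confirmed in a redundant detection region, tracking passes to whichever sensor is expected to keep the object in view the longest along the current direction of travel.

from dataclasses import dataclass
from typing import List

@dataclass
class TofDetection:
    sensor_id: str
    distance_m: float         # current reported distance to the object
    remaining_track_m: float  # travel distance before the object exits this FOV
                              # (analogous to distances D and d described above)

def hand_off(detections: List[TofDetection]) -> str:
    """Pick the sensor expected to keep the confirmed object in view the longest."""
    return max(detections, key=lambda det: det.remaining_track_m).sensor_id

# Example: the object is confirmed by sensors "104-3" and "104-6"; "104-6" can
# track it over the longer distance, so tracking is handed off to it.
detections = [TofDetection("104-3", 0.60, remaining_track_m=0.25),
              TofDetection("104-6", 0.62, remaining_track_m=1.10)]
print(hand_off(detections))  # "104-6"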
Alternatively, or in addition to the “hand off” scenario discussed above, the navigation controller 103 may continue to track objects via two or more ToF sensors. For example, in the embodiment of
In any such cases, and in accordance with an embodiment, objects may be detected by the navigation controller 103 when two or more ToF sensors with overlapping FOVs output a measurement signal that indicates the presence of an object and its relative distance from the robot 109. In some cases, multiple measurements from each of the ToF sensors may be utilized by the navigation controller 103 to minimize or otherwise reduce instances of false positives. Thus, upon successive data points from each FOV (e.g., derived from the respective measurement signals), the navigation controller 103, and by extension the robot 109, can detect an object in the environment, optionally continue to track that object via one or more ToF sensors, and use those data points to increase the confidence of the robot odometry processes.
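The sketch below (hypothetical class name and threshold values) illustrates one way a controller might require several successive in-range readings from both sensors of a redundant detection region before declaring an object present, reducing false positives:

from collections import deque

class RedundantRegionDetector:
    """Declare an object present only after N successive in-range readings from
    both sensors whose FOVs form the redundant detection region."""

    def __init__(self, max_range_m=1.0, required_hits=3):
        self.max_range_m = max_range_m
        self.required_hits = required_hits
        self._hits_a = deque(maxlen=required_hits)
        self._hits_b = deque(maxlen=required_hits)

    def update(self, dist_a_m, dist_b_m):
        # Record whether each sensor currently reports an object within range.
        self._hits_a.append(dist_a_m < self.max_range_m)
        self._hits_b.append(dist_b_m < self.max_range_m)
        full = len(self._hits_a) == self.required_hits
        return full and all(self._hits_a) and all(self._hits_b)

# Example: both sensors report an object inside 1 m for three cycles in a row.
detector = RedundantRegionDetector()
for dist_a, dist_b in [(0.80, 0.85), (0.78, 0.83), (0.75, 0.80)]:
    confirmed = detector.update(dist_a, dist_b)
print(confirmed)  # True after the third consistent pair of readings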
In an embodiment, ToF redundant detection regions (labeled Regions 1 to 3) can be utilized during pose calculations. Consider a scenario where the robot 109 of
In accordance with an aspect of the present disclosure a robotic surface cleaning device is disclosed. The robotic surface cleaning device comprising a housing, a motor coupled to at least one wheel to drive the robot, at least a first Time of Flight (ToF) sensor and a second ToF sensor coupled to the housing, the first and second ToF sensors having a first and a second field of view (FOV), respectively, the first and second FOV at least partially overlapping each other to form a first redundant detection region, a navigation controller disposed in the housing, the navigation controller to receive a first and second measurement signal from the first and second ToF sensors, respectively, and detect an object based, at least in part, on the first and second measurement signals indicating a presence of an object within the first redundant detection region.
In accordance with another aspect of the present disclosure a robotic surface cleaning device to navigate in a surrounding environment to perform cleaning operations is disclosed. The robotic surface cleaning device comprising a housing with a first plurality of Time of Flight (ToF) sensors to identify and/or track objects in an environment surrounding the housing, wherein at least a first and a second ToF sensor of the first plurality of ToF sensors have detection regions at least partially overlapping each other to form a first redundant detection region, and a controller disposed in the housing to determine a location of an object in the surrounding environment relative to the housing based at least in part on the first redundant detection region.
In accordance with another aspect of the present disclosure a computer-implemented method for navigation of a robotic surface cleaning device is disclosed. The method comprising establishing at least a first redundant detection region based at least in part on first and second Time of Flight (ToF) sensors having associated detection regions that at least partially overlap each other, receiving, by a controller, first and second measurement signals from the first and second ToF sensors, respectively, and detecting, by the controller, a location of an object relative to the robotic surface cleaning device based on the first and second measurement signals indicating presence of the object within the first redundant detection region.
While the principles of the disclosure have been described herein, it is to be understood by those skilled in the art that this description is made only by way of example and not as a limitation as to the scope of the disclosure. Other embodiments are contemplated within the scope of the present disclosure in addition to the exemplary embodiments shown and described herein. Modifications and substitutions by one of ordinary skill in the art are considered to be within the scope of the present disclosure, which is not to be limited except by the following claims.
The present application claims the benefit of U.S. Provisional Patent Application Ser. No. 62/651,766 filed on Apr. 3, 2018, which is fully incorporated herein by reference.