Mobile devices, such as mobile robots, are typically connected to a single communications network while being operated or controlled by a remote device. Data is transmitted between the mobile device and the remote device via the single communications network. When the mobile device becomes disconnected from the communications network, the mobile device typically searches for another communications network to connect to.
According to an implementation of the disclosed subject matter, a method may include connecting, via a communications interface of an autonomous mobile device, to both a first communications network and a second communications network of a plurality of communications networks. A third communications network may be connected to when the communications interface is disconnected from one of the first communications network and the second communications network. The method may include storing, at a memory device communicatively coupled to the communications interface, a map that includes a first one or more locations of the autonomous mobile device where the plurality of communications networks are accessible, and a second one or more locations of the autonomous mobile device where one or more of the plurality of communications networks have been disconnected. At a different time, when the autonomous mobile device is within a predetermined distance of one of the second one or more locations where the plurality of communications networks have been disconnected, the third communications network or another one of the plurality of communications networks may be switched to based on the map.
Additional features, advantages, and implementations of the disclosed subject matter may be set forth or apparent from consideration of the following detailed description, drawings, and claims. Moreover, it is to be understood that both the foregoing summary and the following detailed description are illustrative and are intended to provide further explanation without limiting the scope of the claims.
The accompanying drawings, which are included to provide a further understanding of the disclosed subject matter, are incorporated in and constitute a part of this specification. The drawings also illustrate implementations of the disclosed subject matter and together with the detailed description serve to explain the principles of implementations of the disclosed subject matter. No attempt is made to show structural details in more detail than may be necessary for a fundamental understanding of the disclosed subject matter and various ways in which it may be practiced.
Implementations of the disclosed subject matter may provide an autonomous mobile device, such as a mobile robot, that may be communicatively connected to a plurality of different access points and/or communications networks at the same time. The connection of the autonomous mobile device to the plurality of networks (e.g., two networks, three networks, or the like) may increase the available bandwidth for the autonomous mobile device. For video communications (e.g., video communications from the autonomous mobile device to a remote computer), having a plurality of network connections may increase the frames per second (FPS) of the video stream. When a connection to one of the plurality of networks is interrupted and/or disconnected, the frame rate of the video stream may be reduced to the FPS of a single network until another network connection can be made to restore bandwidth.
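The bandwidth effect described above can be illustrated with a minimal sketch. It assumes, purely for illustration, that the achievable frame rate scales linearly with the number of active network links; the function name and numbers are hypothetical, not part of the disclosure:

```python
def estimated_fps(per_network_fps, active_links):
    """Illustrative only: assume the video frame rate scales linearly
    with the number of connected networks, and falls back to the
    single-network rate when all but one link drop."""
    return per_network_fps * max(active_links, 1)

estimated_fps(10, 2)   # two networks -> 20 FPS
estimated_fps(10, 1)   # one remaining network -> 10 FPS
```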
In implementations of the disclosed subject matter, the autonomous mobile device may connect to a first communications network and a second communications network. When one of the networks is disconnected, the autonomous mobile device may use information from a map to determine a third communications network or another one of a plurality of communications networks to connect to. The map may indicate areas where disconnections from one or more networks may occur, and/or areas where connections to one or more communications networks may be available. When the autonomous mobile device is disconnected from and/or connects to one of the plurality of communications networks, the area of disconnection and/or connection may be stored on the map.
At a different time, when the autonomous mobile device approaches the same area where a disconnection occurred and/or a connection to another communications network was made, the autonomous mobile device may connect to the same network previously connected to before experiencing a disconnection, based on the map.
In some implementations, if the autonomous mobile device takes a similar path as taken before, the autonomous mobile device may use map information and/or sensor information to predict areas to change connections to a communications network before experiencing a disconnection from one or more communications networks. The network that is switched to may not have the highest signal strength of the available networks, but may remain available longer along the path that the autonomous mobile device plans to follow. This may minimize the number of network switches.
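The map-based behavior described above can be sketched as follows. This is a simplified illustration under assumed structures — the class name, the flat list of recorded points, and the 2-meter "predetermined distance" are all hypothetical:

```python
import math

class ConnectivityMap:
    """Hypothetical sketch: record where networks were dropped or
    acquired, and look up disconnection zones near a position."""

    def __init__(self, radius=2.0):
        self.radius = radius       # assumed "predetermined distance" (meters)
        self.disconnects = []      # [(x, y, network_id)]
        self.connects = []         # [(x, y, network_id)]

    def record_disconnect(self, x, y, network_id):
        self.disconnects.append((x, y, network_id))

    def record_connect(self, x, y, network_id):
        self.connects.append((x, y, network_id))

    def nearby_disconnect(self, x, y):
        """Return the network that previously dropped near (x, y), if any."""
        for dx, dy, net in self.disconnects:
            if math.hypot(x - dx, y - dy) <= self.radius:
                return net
        return None

m = ConnectivityMap(radius=2.0)
m.record_disconnect(10.0, 5.0, "network_131")
m.nearby_disconnect(10.5, 5.2)   # -> "network_131": switch pre-emptively
```

A controller could consult `nearby_disconnect` along the planned route and switch networks before entering a known dead zone, rather than reacting to the drop.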
In some implementations, the autonomous mobile device may adjust a speed of movement in an area when switching connections to a communications network to minimize and/or avoid latencies.
Implementations of the disclosed subject matter may provide communications (e.g., video, text, data, and the like) from the autonomous mobile device to one or more remote devices, where communications may continue when there is a disconnection of one of the plurality of networks that the autonomous mobile device is connected to.
In some implementations, the autonomous mobile device may select a different communications network, based on a task of the autonomous mobile device. For example, the mobile device may operate autonomously with a reduced communications signal strength (e.g., which may not be able to transmit video). In another example, the mobile device may operate autonomously, and may transmit data, video, text, or the like. In this example, the path of the mobile device may be chosen based on communications network coverage for one or more areas.
In some implementations, the mobile device may operate in a manual mode, where the path of the mobile device may be estimated based on prior user history with the mobile device (e.g., where the user may control the operation of the mobile device from a remote device).
The autonomous mobile device may operate so as to minimize and/or avoid being disconnected from communications networks (i.e., where there are no communications signals).
The autonomous mobile device may connect to one or more communications networks based on information of a stored map. When the mobile device operates in an area where no signal is available from a communications network, and the mobile device is operating in an autonomous mode, it may continue to operate. If the mobile device is operating in a telepresence mode, the mobile device may follow a person for a predetermined period of time, and output a notification that there is no connection to a communications network. If there is no connection for a period that is longer than the predetermined period of time, the mobile device may move to the last known place where a communications network was available.
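The telepresence fallback described above amounts to a small decision rule. A sketch under assumed names (the state strings, the time limit, and the return value shapes are illustrative only):

```python
def telepresence_step(connected, time_disconnected, follow_limit, last_known_pos):
    """Hypothetical telepresence-mode rule: while briefly disconnected,
    keep following the person and notify; past the predetermined limit,
    head back to the last location where a network was available."""
    if connected:
        return "operate"
    if time_disconnected <= follow_limit:
        return "follow_person_and_notify"
    return ("move_to", last_known_pos)

telepresence_step(False, 10, 30, (3, 4))   # still within limit: keep following
telepresence_step(False, 45, 30, (3, 4))   # limit exceeded: return to (3, 4)
```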
At operation 12, a communications interface (e.g., network interface 116 shown in
At operation 14, the communications interface of the autonomous mobile device may connect to a third communication network (e.g., network 133 shown in
In some implementations, the communications interface may connect to one of the plurality of communications networks (e.g., network 133 and/or 134 shown in
At operation 16, a memory device communicatively coupled to the communications interface may store a map that includes a first one or more locations of the autonomous mobile device where the plurality of communications networks are accessible. The memory device may be memory 118 and/or fixed storage 120 of the autonomous mobile device 100 shown in
At a different time, when the autonomous mobile device is within a predetermined distance of one of the second one or more locations where the plurality of communications networks have been disconnected, the third communications network (e.g., network 133 shown in
In some implementations, operation 18 may include selecting the third communications network and/or another one of the plurality of communications networks based on an available duration of signal time, regardless of signal strength. The duration of signal time may include the amount of time that a network may be available to the autonomous mobile device based on a route of the autonomous mobile device. The selection of the third communications network and/or another one of the plurality of communications networks may be based on minimizing the switching between the plurality of communication networks.
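Selecting by duration of availability rather than signal strength, as in operation 18 above, can be sketched as follows. The data layout is an assumption for illustration: each candidate network maps to the set of route waypoints where it is predicted to be reachable, and the network covering the longest contiguous stretch from the start of the route is chosen, which minimizes switching:

```python
def select_network(networks, route):
    """Hypothetical selector: pick the network predicted to stay
    available for the longest unbroken stretch of the route,
    regardless of instantaneous signal strength."""
    def contiguous_coverage(covered):
        count = 0
        for waypoint in route:
            if waypoint not in covered:
                break
            count += 1
        return count
    return max(networks, key=lambda n: contiguous_coverage(networks[n]))

route = ["a", "b", "c", "d"]
nets = {"network_133": {"a", "b", "c"}, "network_134": {"a", "b"}}
select_network(nets, route)   # -> "network_133" (covers 3 waypoints vs 2)
```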
In some implementations, operation 18 may include switching to the third communications network (e.g., network 133 shown in
The autonomous mobile device may switch to the third communications network (e.g., network 133 shown in
In some implementations, the autonomous mobile device may switch to a low priority network (e.g., network 134 shown in
A bandwidth for the autonomous mobile device may be reduced by the first communication network (e.g., network 131 shown in
In some implementations, the example method 10 may include switching to the first communication network (e.g., network 131 shown in
In some implementations, the method 10 may include storing a disconnection point of the first communications network (e.g., network 131) and/or the second communications network (e.g., network 132) on the map. A connection point of the third communications network (e.g., network 133) and/or another one of the plurality of communications networks (e.g., network 134) may be stored on the map. The map may be stored in memory 118 and/or fixed storage 120 of the autonomous mobile device 100 shown in
A rate of speed of the autonomous mobile device may be adjusted (e.g., by controller 114) within a predetermined distance of the locations where one or more of the plurality of communications networks have been disconnected, or within a switching area. A switching area may be an area where the autonomous mobile device switches a connection from one communications network to another (e.g., from network 131 to network 133, as shown in
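The speed adjustment near a switching area reduces to a simple rule. The speeds and the distance threshold below are assumed values, not taken from the disclosure:

```python
def adjust_speed(distance_to_switch_zone, normal_speed=1.0,
                 slow_speed=0.3, threshold=2.0):
    """Hypothetical rule: slow down within the predetermined distance
    of a switching area so the network handoff completes before the
    device moves out of overlapping coverage, avoiding latency spikes."""
    if distance_to_switch_zone <= threshold:
        return slow_speed
    return normal_speed

adjust_speed(1.5)   # inside the zone -> 0.3 (slow)
adjust_speed(5.0)   # clear of the zone -> 1.0 (normal)
```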
In some implementations, the communications interface (e.g., network interface 116 shown in
At operation 24, a controller (e.g., controller 114 shown in
At operation 26, the controller may predict when to change from at least one of the first communications network (e.g., network 131) and the second communications network (e.g., network 132) to the third communications network (e.g., network 133) or another one of the plurality of communications networks (e.g., network 134). In some implementations, the controller may use the stored map to predict when to change networks. In some implementations, the controller may predict when to switch networks based on reduction of signal strength of the first communications network and/or the second communications network.
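Predicting a switch from a falling signal, as in operation 26 above, can be sketched with a simple trend check over recent signal samples. The RSSI floor and slope threshold are assumed values for illustration:

```python
def should_switch(rssi_history, floor=-75.0, slope_limit=-2.0):
    """Hypothetical predictor: switch when the signal is both weak
    (below an assumed RSSI floor, in dBm) and falling, estimated by
    a linear trend over the last few samples."""
    if len(rssi_history) < 2:
        return False
    recent = rssi_history[-5:]
    slope = (recent[-1] - recent[0]) / (len(recent) - 1)
    return recent[-1] < floor and slope < slope_limit

should_switch([-60, -65, -70, -76, -82])   # weak and falling -> True
should_switch([-60, -61, -62, -61, -60])   # strong and stable -> False
```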
Optionally, at operation 28, the example method 10 may include connecting the autonomous mobile device to a low priority network from among the plurality of communications networks (e.g., networks 131, 132, 133, 134) based on the priority level of the mission of the autonomous mobile device. In some implementations, the priority level of the mission may be received by the autonomous mobile device via network 130 (e.g., which may include one or more of networks 131, 132, 133, and/or 134) from the server 140 and/or the remote platform 160, as shown in
In some implementations, the autonomous mobile device may be operated without signals received by the communications interface (e.g., network interface 116 shown in
At operation 32, a controller (e.g., controller 114 shown in
Optionally, at operation 34, the autonomous mobile device may output an indicator or message that the communications interface is disconnected from the plurality of communication networks after the first predetermined period of time. For example, the autonomous mobile device may output an audible message via a speaker that is part of the user interface 110 shown in
Optionally, at operation 36, a controller (e.g., controller 114 shown in
In some implementations of the example method 10, when at least one sensor (e.g., sensor 102, 102a, 102b, 106) of the autonomous mobile device detects another autonomous mobile device (e.g., autonomous mobile device 200 shown in
The communications interface (e.g., network interface 116) may receive data from another autonomous mobile device (e.g., autonomous mobile device 200 shown in
The at least one first sensor 102 (including sensors 102a, 102b shown in
In some implementations, the at least one first sensor 102 may have a field of view of 70 degrees diagonally. The at least one sensor 102 may have a detection distance of 0.2-4 meters. As shown in
The at least one first sensor 102 may include a first side sensor disposed on a first side of the autonomous mobile device 100 and a second side sensor that may be disposed on a second side of the device. For example, as shown in
The light source 104 may be one or more bulbs, one or more lamps, and/or an array of light emitting diodes (LEDs) or organic light emitting diodes (OLEDs) to emit UV light (e.g., light having a wavelength of 10 nm-400 nm). The intensity (i.e., optical power output) may be controlled by the controller 114, which may also turn on or off a portion or all of the devices (e.g., bulbs, lamps, LEDs, OLEDs) of the light source 104.
The at least one second sensor 106 may be communicatively coupled to the controller 114 shown in
In some implementations, the sensor 102, 106 may be a time-of-flight sensor, an ultrasonic sensor, a two-dimensional (2D) Light Detection and Ranging (LiDAR) sensor, a three-dimensional (3D) LiDAR sensor, a radar (radio detection and ranging) sensor, a stereo vision sensor, a 3D camera, a structured light camera, or the like. The sensor 106 may have a field of view of 20-27 degrees. In some implementations, the sensor 106 may have a detection distance of 0.05-4 meters. The sensors 102, 106 may be used to detect objects, surfaces, people, animals, or the like.
The autonomous mobile device 100 may include a motor to drive the drive system 108 to move the autonomous mobile device in an area, such as a room, a building, or the like. The drive system 108 may include wheels, which may be adjustable so that the drive system 108 may control the direction of the autonomous mobile device 100.
In some implementations, the autonomous mobile device 100 may include a base with the drive system 108, and the sensor 102, 106 may be disposed on the base.
The controller 114 may control and/or operate the autonomous mobile device 100 in an operation mode which may be a manual mode, an autonomous mode, and/or a tele-operation mode. In the manual mode, the controller 114 may receive one or more control signals from the user interface 110 and/or the stop button 112. For example, a user may control the movement, direction, and/or stop the motion of the autonomous mobile device 100 by making one or more selections on the user interface 110. The stop button 112 may be an emergency stop (ESTOP) button which may stop all operations and/or movement of the autonomous mobile device 100 when selected. In some implementations, the controller 114 may receive at least one control signal via a network interface 116 (shown in
In some implementations, when the autonomous mobile device 100 is moving in a direction, the sensor 102, 106 may detect a geometry of one or more surfaces and/or objects. The output of the at least one first sensor 102 may be, for example, a point cloud of the one or more objects in the path of the autonomous mobile device 100. When the sensor 102 and/or sensor 106 is a stereo vision sensor, images from two sensors (i.e., where the two sensors may be part of the stereo vision sensor of the sensor 102 and/or sensor 106) within a known distance from one another may be captured at a predetermined point in time, and/or at predetermined time intervals with a global shutter. The global shutter may be configured so that the two sensors of the stereo vision sensor may capture images approximately simultaneously. One or more features may be determined from the captured images, and may be compared to one another to determine portions that are matching. As the focal length of the two sensors of the stereo vision sensor and the distance between the two sensors (e.g., about 6 cm) may be stored in memory 118 and/or fixed storage 120 (shown in
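The stereo distance computation described above follows the standard relation depth = focal length × baseline / disparity. A minimal sketch — the numeric values are illustrative (apart from the roughly 6 cm baseline mentioned above), and the function name is not from the disclosure:

```python
def stereo_depth(focal_length_px, baseline_m, disparity_px):
    """Standard stereo relation: depth (meters) = focal length (pixels)
    * baseline (meters) / disparity (pixels). The disparity is the
    horizontal offset of a matched feature between the two images."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# A feature matched with a 21-pixel disparity, 700 px focal length,
# 6 cm baseline:
stereo_depth(700.0, 0.06, 21.0)   # -> 2.0 meters
```

Note the inverse relationship: nearby objects produce large disparities, so depth resolution degrades with distance, consistent with the bounded detection ranges given above.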
When detecting the surface and/or object, the sensor 102, 106 may be a time-of-flight (TOF) sensor. At least one photon of light may be output by the sensor 102, 106, and may be transmitted through the air. When the at least one photon of light strikes a surface and/or an object, a portion of the light may be reflected by the surface and/or the object and may return to a receiver portion of the sensor 102, 106. The sensor 102, 106 may calculate the time between sending the at least one photon of light and receiving the reflection, multiply this value by the speed of light in air, and divide by two to account for the round trip, to determine the distance between the sensor 102, 106 and the surface and/or object. This may be used to generate the map of the area that the autonomous mobile device is operating within.
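The time-of-flight arithmetic above can be shown directly. The measured interval covers the out-and-back path, so the time-times-speed product is halved:

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s in vacuum; air is ~0.03% slower

def tof_distance(round_trip_time_s):
    """Time-of-flight range: the measured interval covers the round
    trip, so halve the time * speed product to get the one-way range."""
    return round_trip_time_s * SPEED_OF_LIGHT / 2.0

# A 20-nanosecond round trip corresponds to a target ~3 meters away:
tof_distance(2.0e-8)   # -> ~2.998 meters
```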
The bus 122 allows data communication between the controller 114 and one or more memory components, which may include RAM, ROM, and other memory, as previously noted. Typically, RAM is the main memory into which an operating system and application programs are loaded. A ROM or flash memory component can contain, among other code, the Basic Input-Output System (BIOS), which controls basic hardware operation such as the interaction with peripheral components. Applications resident with the autonomous mobile device 100 are generally stored on and accessed via a computer readable medium (e.g., fixed storage 120), such as a solid state drive, a hard disk drive, an optical drive, or other storage medium.
The network interface 116 may provide a direct connection to a remote server (e.g., server 140, database 150, and/or remote platform 160 shown in
Many other devices or components (not shown) may be connected in a similar manner. Conversely, all of the components shown in
More generally, various implementations of the presently disclosed subject matter may include or be embodied in the form of computer-implemented processes and apparatuses for practicing those processes. Implementations also may be embodied in the form of a computer program product having computer program code containing instructions embodied in non-transitory and/or tangible media, such as solid state drives, DVDs, CD-ROMs, hard drives, USB (universal serial bus) drives, or any other machine readable storage medium, such that when the computer program code is loaded into and executed by a computer, the computer becomes an apparatus for practicing implementations of the disclosed subject matter. Implementations also may be embodied in the form of computer program code, for example, whether stored in a storage medium, loaded into and/or executed by a computer, or transmitted over some transmission medium, such as over electrical wiring or cabling, through fiber optics, or via electromagnetic radiation, such that when the computer program code is loaded into and executed by a computer, the computer becomes an apparatus for practicing implementations of the disclosed subject matter. When implemented on a general-purpose microprocessor, the computer program code segments configure the microprocessor to create specific logic circuits.
In some configurations, a set of computer-readable instructions stored on a computer-readable storage medium may be implemented by a general-purpose processor, which may transform the general-purpose processor or a device containing the general-purpose processor into a special-purpose device configured to implement or carry out the instructions. Implementations may include using hardware that has a processor, such as a general purpose microprocessor and/or an Application Specific Integrated Circuit (ASIC) that embodies all or part of the techniques according to implementations of the disclosed subject matter in hardware and/or firmware. The processor may be coupled to memory, such as RAM, ROM, flash memory, a hard disk or any other device capable of storing electronic information. The memory may store instructions adapted to be executed by the processor to perform the techniques according to implementations of the disclosed subject matter.
The foregoing description, for purpose of explanation, has been described with reference to specific implementations. However, the illustrative discussions above are not intended to be exhaustive or to limit implementations of the disclosed subject matter to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The implementations were chosen and described in order to explain the principles of implementations of the disclosed subject matter and their practical applications, to thereby enable others skilled in the art to utilize those implementations as well as various implementations with various modifications as may be suited to the particular use contemplated.
Published as US Patent Application Publication No. 2022/0038966 A1, Feb. 2022 (United States).