PROACTIVE MAINTENANCE IN AN AUTONOMOUS MOBILE ROBOT

Information

  • Patent Application
  • Publication Number: 20240164605
  • Date Filed: November 21, 2022
  • Date Published: May 23, 2024
Abstract
A mobile cleaning robot system can include a mobile cleaning robot, processing circuitry, and memory circuitry. The memory circuitry can include instructions, which when executed by the processing circuitry, can cause the processing circuitry to perform operations to receive a maintenance indication from a remote device indicative of whether maintenance on the mobile cleaning robot is recommended, where the maintenance indication can be based at least in part on factory test data. The processing circuitry can also transmit a maintenance instruction to a user device when maintenance on the mobile cleaning robot is recommended.
Description
BACKGROUND

Autonomous mobile cleaning robots can traverse floor surfaces to perform various operations in an environment, such as vacuuming of one or more rooms of the environment. A cleaning robot can include a controller configured to autonomously navigate the robot about an environment such that the robot can ingest debris as it moves. As an autonomous mobile robot traverses a floor surface, the robot can produce and record information about the environment and the robot.


SUMMARY

Autonomous mobile cleaning robots can be used to automatically or autonomously clean a portion, such as a room or rooms, of an environment. After many missions, robots can require maintenance and can sometimes require service or replacement of components. In some cases, components can fail, requiring replacement of the components. However, replacement of components can be a time-consuming process if parts are not available or take time to be shipped, and failure to replace components can degrade cleaning performance.


This disclosure describes examples of approaches that can help to address this problem, such as by including systems that can determine whether a component is likely to require replacement. The determination can be made based on one or more factors, such as factory test data, fleet data, robot telemetry (robot data), or the like. When a determination is made that maintenance on a robot will likely be required, one or more systems can transmit a maintenance recommendation or instruction to the robot, a user device, or other device, to provide instructions for the user to replace the component, such as by ordering a new component. This process can be used to help a user proactively maintain the robot, such as by helping to deliver a replacement component to the user prior to component failure, helping to reduce robot downtime.


For example, a mobile cleaning robot system can include a mobile cleaning robot, processing circuitry, and memory circuitry. The memory circuitry can include instructions, which when executed by the processing circuitry, can cause the processing circuitry to perform operations to receive a maintenance indication from a remote device indicative of whether maintenance on the mobile cleaning robot is recommended, where the maintenance indication can be based at least in part on factory test data. The processing circuitry can also transmit a maintenance instruction to a user device when maintenance on the mobile cleaning robot is recommended.


The above discussion is intended to provide an overview of subject matter of the present patent application. It is not intended to provide an exclusive or exhaustive explanation of the invention. The description below is included to provide further information about the present patent application.





BRIEF DESCRIPTION OF THE DRAWINGS

In the drawings, which are not necessarily drawn to scale, like numerals may describe similar components in different views. Like numerals having different letter suffixes may represent different instances of similar components. The drawings illustrate generally, by way of example, but not by way of limitation, various embodiments discussed in the present document.



FIG. 1 illustrates a plan view of a mobile cleaning robot in an environment.



FIG. 2A illustrates a bottom view of a mobile cleaning robot.



FIG. 2B illustrates an isometric view of a mobile cleaning robot.



FIG. 3 illustrates a cross-section view across indicators 3-3 of FIG. 2A of a mobile cleaning robot.



FIG. 4A illustrates a diagram of an example of a communication network in which a mobile cleaning robot operates, and of data transmission in the network.



FIG. 4B illustrates a schematic view of a network.



FIG. 5 illustrates a schematic view of a method.



FIG. 6 illustrates a perspective view of a user device.



FIG. 7 illustrates a perspective view of a user device.



FIG. 8 illustrates a schematic view of a system.





DETAILED DESCRIPTION


FIG. 1 illustrates a plan view of a mobile cleaning robot 100 in an environment 40, in accordance with at least one example of this disclosure. The environment 40 can be a dwelling, such as a home or an apartment, and can include rooms 42a-42e. Obstacles, such as a bed 44, a table 46, and an island 48 can be located in one or more of the rooms 42 of the environment. Each of the rooms 42a-42e can have a floor surface 50a-50e, respectively. Some rooms, such as the room 42d, can include a rug, such as a rug 52. The floor surfaces 50 can be of one or more types of flooring, such as hardwood, ceramic, low-pile carpet, medium-pile carpet, long (or high)-pile carpet, stone, or the like.


The mobile cleaning robot 100 can be operated, such as by a user 60, to autonomously clean the environment 40 in a room-by-room fashion. In some examples, the robot 100 can clean the floor surface 50a of one room, such as the room 42a, before moving to the next room, such as the room 42d, to clean the surface of the room 42d. Different rooms can have different types of floor surfaces. For example, the room 42e (which can be a kitchen) can have a hard floor surface, such as wood or ceramic tile, and the room 42a (which can be a bedroom) can have a carpet surface, such as a medium pile carpet. Other rooms, such as the room 42d (which can be a dining room) can include multiple surfaces where the rug 52 is located within the room 42d.


During cleaning or traveling operations, the robot 100 can use data collected from various sensors and calculations (such as odometry and obstacle detection) to develop a map of the environment 40. Once the map is created, the user 60 can define rooms or zones (such as the rooms 42) within the map. The map can be presentable to the user 60 on a user interface, such as a mobile device, where the user 60 can direct or change cleaning preferences.


During operation, the robot 100 can detect surface types within each of the rooms 42, which can be stored in the robot or another device. The robot 100 can update the map (or data related thereto) such as to include or account for surface types of the floor surfaces 50a-50e of each of the respective rooms 42 of the environment. In some examples, the map can be updated to show the different surface types such as within each of the rooms 42.


In some examples, the user 60 can define a behavior control zone 54 using, for example, the methods and systems described herein. In response to the user 60 defining the behavior control zone 54, the robot 100 can move toward the behavior control zone 54 to confirm the selection. After confirmation, autonomous operation of the robot 100 can be initiated. In autonomous operation, the robot 100 can initiate a behavior in response to being in or near the behavior control zone 54. For example, the user 60 can define an area of the environment 40 that is prone to becoming dirty to be the behavior control zone 54. In response, the robot 100 can initiate a focused cleaning behavior in which the robot 100 performs a focused cleaning of a portion of the floor surface 50d in the behavior control zone 54.


Components of the Robot


FIG. 2A illustrates a bottom view of the mobile cleaning robot 100. FIG. 2B illustrates an isometric view of the mobile cleaning robot 100. FIG. 3 illustrates a cross-section view across indicators 3-3 of FIG. 2A of the mobile cleaning robot 100. FIG. 3 also shows orientation indicators Front and Rear. FIGS. 2A-3 are discussed together below.


The cleaning robot 100 can be an autonomous cleaning robot that can autonomously traverse the floor surface 50 while ingesting the debris 75 from different parts of the floor surface 50. As shown in FIGS. 2A and 3, the robot 100 can include a body 202 movable across the floor surface 50. The body 202 can include multiple connected structures to which components of the cleaning robot 100 are mounted. The connected structures can include, for example, an outer housing to cover internal components of the cleaning robot 100, a chassis or frame to which the drive wheels 210a and 210b and the cleaning rollers 205a and 205b (of a cleaning assembly 204) are mounted, and a bumper 238. The bumper 238 can be removably secured to the body 202 and can be movable relative to the body 202 while mounted thereto. In some examples, the bumper 238 can form part of the body 202.


As shown in FIG. 2A, the body 202 includes a front portion 202a that has a substantially semicircular shape and a rear portion 202b that has a substantially semicircular shape. These portions can have other shapes in other examples, such as a square front (or rounded square front). As shown in FIG. 2A, the robot 100 can include a drive system including actuators 208a and 208b, which can be, for example, motors. The actuators 208a and 208b can be mounted in the body 202 and can be operably connected to the drive wheels 210a and 210b, which can be rotatably mounted to the body 202 to support the body 202 above the floor surface 50. The actuators 208a and 208b, when driven, can rotate the drive wheels 210a and 210b to enable the robot 100 to autonomously move across the floor surface 50.


The controller (or processor) 212 can be located within the housing and can be a programmable controller, such as a single or multi-board computer, a direct digital controller (DDC), a programmable logic controller (PLC), or the like. In other examples the controller 212 can be any computing device, such as a handheld computer, for example, a smart phone, a tablet, a laptop, a desktop computer, or any other computing device including a processor, memory, and communication capabilities. The memory 213 can be one or more types of memory, such as volatile or non-volatile memory, read-only memory (ROM), random-access memory (RAM), magnetic disk storage media, optical storage media, flash-memory devices, and other storage devices and media. The memory 213 can be located within the body 202, connected to the controller 212 and accessible by the controller 212.


The controller 212 can operate the actuators 208a and 208b to autonomously navigate the robot 100 about the floor surface 50 during a cleaning operation. The actuators 208a and 208b can be operable to drive the robot 100 in a forward drive direction, in a backwards direction, or to turn the robot 100. The robot 100 can include a caster wheel 211 that can support the body 202 above the floor surface 50. The caster wheel 211 can support the front portion 202a of the body 202 above the floor surface 50, and the drive wheels 210a and 210b can support the rear portion 202b of the body 202 above the floor surface 50.
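
How two independently driven wheels yield forward, backward, and turning motion can be illustrated with standard differential-drive kinematics. The sketch below is a generic illustration, not code from the robot 100; the wheel radius and track width are assumed values.

    WHEEL_RADIUS_M = 0.035  # assumed wheel radius
    TRACK_WIDTH_M = 0.23    # assumed spacing between the drive wheels 210a and 210b

    def body_velocity(left_rad_s: float, right_rad_s: float) -> tuple:
        """Return (linear m/s, angular rad/s) for the given wheel angular speeds."""
        v_left = WHEEL_RADIUS_M * left_rad_s
        v_right = WHEEL_RADIUS_M * right_rad_s
        linear = (v_left + v_right) / 2.0             # equal speeds: straight drive
        angular = (v_right - v_left) / TRACK_WIDTH_M  # unequal speeds: turning
        return linear, angular

    print(body_velocity(10.0, 10.0))   # forward drive: (0.35, 0.0)
    print(body_velocity(-10.0, 10.0))  # turn in place: (0.0, ~3.04)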


As shown in FIG. 3, a vacuum assembly 218 can be located at least partially within the body 202 of the robot 100, e.g., in the rear portion 202b of the body 202. The controller 212 can operate the vacuum assembly 218 to generate an airflow that flows through the air gap near the cleaning rollers 205, through the body 202, and out of the body 202. The vacuum assembly 218 can include, for example, an impeller that generates the airflow when rotated. The airflow and the cleaning rollers 205, when rotated, can cooperate to ingest debris 75 into a suction duct 348 of the robot 100. The suction duct 348 can extend down to or near a bottom portion of the body 202 and can be at least partially defined by the cleaning assembly 204.


The suction duct 348 can be connected to the cleaning head 204 or cleaning assembly and can be connected to a cleaning bin 322. The cleaning bin 322 can be mounted in the body 202 and can contain the debris 75 ingested by the robot 100. A filter 349 can be located in the body 202, which can help to separate the debris 75 from the airflow before the airflow 220 enters the vacuum assembly 218 and is exhausted out of the body 202. In this regard, the debris 75 can be captured in both the cleaning bin 322 and the filter before the airflow 220 is exhausted from the body 202.


The cleaning rollers 205a and 205b can be operably connected to one or more actuators 214a and 214b, e.g., motors, respectively. The cleaning head 204 and the cleaning rollers 205a and 205b can be positioned forward of the cleaning bin 322. The cleaning rollers 205a and 205b can be mounted to a housing 224 of the cleaning head 204 and mounted, e.g., indirectly or directly, to the body 202 of the robot 100. In particular, the cleaning rollers 205a and 205b can be mounted to an underside of the body 202 so that the cleaning rollers 205a and 205b engage debris 75 on the floor surface 50 during the cleaning operation when the underside faces the floor surface 50.


The housing 224 of the cleaning head 204 can be mounted to the body 202 of the robot 100. In this regard, the cleaning rollers 205a and 205b can also be mounted to the body 202 of the robot 100, such as indirectly mounted to the body 202 through the housing 224. Alternatively, or additionally, the cleaning head 204 can be a removable assembly of the robot 100 where the housing 224 (with the cleaning rollers 205a and 205b mounted therein) is removably mounted to the body 202 of the robot 100.


A side brush 242 can be connected to an underside of the robot 100 and can be connected to a motor 244 operable to rotate the side brush 242 with respect to the body 202 of the robot 100. The side brush 242 can be configured to engage debris to move the debris toward the cleaning assembly 204 or away from edges of the environment 40. The motor 244 configured to drive the side brush 242 can be in communication with the controller 212. The brush 242 can be a side brush laterally offset from a center of the robot 100 such that the brush 242 can extend beyond an outer perimeter of the body 202 of the robot 100. Similarly, the brush 242 can also be forwardly offset from the center of the robot 100 such that the brush 242 also extends beyond the bumper 238 or an outer periphery of the body 202.


The robot 100 can further include a sensor system with one or more electrical sensors. The sensor system can generate one or more signals indicative of a current location of the robot 100, and can generate one or more signals indicative of locations of the robot 100 as the robot 100 travels along the floor surface 50.


For example, cliff sensors 234 (shown in FIG. 2A) can be located along a bottom portion of the body 202. The cliff sensors 234 can include an optical sensor that can be configured to detect a presence or absence of an object below the optical sensor, such as the floor surface 50. The cliff sensors 234 can be connected to the controller 212.


The bump sensors 239a and 239b (the bump sensors 239) can be connected to the body 202 and can be engageable or configured to interact with the bumper 238. The bump sensors 239 can include break beam sensors, Hall Effect sensors, capacitive sensors, switches, or other sensors that can detect contact between the robot 100 (e.g., the bumper 238) and objects in the environment 40. The bump sensors 239 can be in communication with the controller 212.


An image capture device 240 can be connected to the body 202 and can extend at least partially through the bumper 238 of the robot 100, such as through an opening 243 of the bumper 238. The image capture device 240 can be a camera, such as a front-facing camera, configured to generate a signal based on imagery of the environment 40 of the robot 100. The image capture device 240 can transmit the image capture signal to the controller 212 for use for navigation and cleaning routines.


Obstacle follow sensors 241 (shown in FIG. 2B) can include an optical sensor facing outward or downward from the bumper 238 that can be configured to detect the presence or the absence of an object adjacent to a side of the body 202. The optical sensor can include an optical emitter and an optical detector. The obstacle follow sensor 241 can emit an optical beam horizontally in a direction perpendicular (or nearly perpendicular) to the forward drive direction of the robot 100. The optical emitter can emit the optical beam outward from the robot 100, e.g., outward in a horizontal direction, and the optical detector can detect a reflection of the optical beam that reflects off an object near the robot 100. The robot 100, e.g., using the controller 212, can determine a time of flight of the optical beam and thereby determine a distance between the optical detector and the object, and hence a distance between the robot 100 and the object.
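
The time-of-flight distance computation reduces to halving the round-trip distance the beam travels. A minimal sketch, assuming the beam propagates at the speed of light and the sensor reports the round-trip interval (the function and its interface are illustrative, not the robot's firmware):

    SPEED_OF_LIGHT_M_S = 299_792_458.0

    def distance_from_time_of_flight(round_trip_s: float) -> float:
        """Distance to the reflecting object: the beam travels out and back."""
        return SPEED_OF_LIGHT_M_S * round_trip_s / 2.0

    # A 2-nanosecond round trip corresponds to an object roughly 0.3 m away.
    print(distance_from_time_of_flight(2e-9))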


The robot 100 can also optionally include one or more dirt sensors 245 connected to the body 202 and in communication with the controller 212. The dirt sensors 245 can be a microphone, piezoelectric sensor, optical sensor, or the like, and can be located in or near a flow path of debris, such as near an opening of the cleaning rollers 205 or in one or more ducts within the body 202. This can allow the dirt sensor(s) 245 to detect how much dirt is being ingested by the vacuum assembly 218 (e.g., via the extractor 204) at any time during a cleaning mission. Because the robot 100 can be aware of its location, the robot 100 can keep a log or record of which areas or rooms of the map are dirtier or where more dirt is collected. The robot 100 can also include a battery 245 operable to power one or more components (such as the motors) of the robot.
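
The per-room dirt record described above can be kept as a simple mapping from room identifier to accumulated dirt-sensor events. A minimal sketch, with hypothetical room names:

    from collections import Counter

    dirt_log = Counter()

    def record_dirt_event(room_id: str, event_count: int = 1) -> None:
        """Accumulate dirt-sensor detections against the robot's current room."""
        dirt_log[room_id] += event_count

    record_dirt_event("kitchen", 5)
    record_dirt_event("dining_room", 12)
    # Rooms sorted dirtiest-first, e.g., to prioritize focused cleaning.
    print(dirt_log.most_common())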


Operation of the Robot

In operation of some examples, the robot 100 can be propelled in a forward drive direction or a rearward drive direction. The robot 100 can also be propelled such that the robot 100 turns in place or turns while moving in the forward drive direction or the rearward drive direction.


When the controller 212 causes the robot 100 to perform a mission, the controller 212 can operate the motors 208 to drive the drive wheels 210 and propel the robot 100 along the floor surface 50. In addition, the controller 212 can operate the motors 214 to cause the rollers 205a and 205b to rotate, can operate the motor 244 to cause the brush 242 to rotate, or can operate the motor of the vacuum system 218 to generate airflow. The controller 212 can also execute software stored on the memory 213 to cause the robot 100 to perform various navigational and cleaning behaviors by operating the various motors or components of the robot 100.


The various sensors of the robot 100 can be used to help the robot navigate and clean within the environment 40. For example, the cliff sensors 234 can detect obstacles such as drop-offs and cliffs below portions of the robot 100 where the cliff sensors 234 are located. The cliff sensors 234 can transmit signals to the controller 212 so that the controller 212 can redirect the robot 100 based on signals from the cliff sensors 234.


In some examples, the bump sensor 239a can be used to detect movement of the bumper 238 in one or more directions of the robot 100. For example, the bump sensor 239a can be used to detect movement of the bumper 238 from front to rear or the bump sensors 239b can detect movement along one or more sides of the robot 100. The bump sensors 239 can transmit signals to the controller 212 so that the controller 212 can redirect the robot 100 based on signals from the bump sensors 239.


In some examples, the obstacle follow sensors 241 can detect objects, including obstacles such as furniture, walls, persons, and other objects in the environment of the robot 100. In some implementations, the sensors 241 can be located along a side surface of the body 202, and the obstacle follow sensor 241 can detect the presence or the absence of an object adjacent to the side surface. The one or more obstacle follow sensors 241 can also serve as obstacle detection sensors, similar to proximity sensors. The controller 212 can use the signals from the obstacle follow sensors 241 to follow along obstacles such as walls or cabinets.


The robot 100 can also include sensors for tracking a distance travelled by the robot 100. For example, the sensor system can include encoders associated with the motors 208 for the drive wheels 210, and the encoders can track a distance that the robot 100 has travelled. In some implementations, the sensor system can include an optical sensor facing downward toward a floor surface. The optical sensor can be positioned to direct light through a bottom surface of the robot 100 toward the floor surface 50. The optical sensor can detect reflections of the light and can detect a distance travelled by the robot 100 based on changes in floor features as the robot 100 travels along the floor surface 50.
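
The encoder-based distance tracking reduces to converting accumulated ticks into wheel revolutions and multiplying by the wheel circumference. A minimal sketch follows; the wheel radius and encoder resolution are assumed values, not the robot's actual parameters.

    import math

    WHEEL_RADIUS_M = 0.035      # assumed wheel radius
    TICKS_PER_REVOLUTION = 508  # assumed encoder resolution

    def distance_travelled_m(ticks: int) -> float:
        """Convert accumulated encoder ticks on a drive wheel to metres."""
        revolutions = ticks / TICKS_PER_REVOLUTION
        return revolutions * 2.0 * math.pi * WHEEL_RADIUS_M

    print(distance_travelled_m(10_000))  # roughly 4.3 m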


The image capture device 240 can be configured to generate a signal based on imagery of the environment 40 of the robot 100 as the robot 100 moves about the floor surface 50. The image capture device 240 can transmit such a signal to the controller 212. The image capture device 240 can capture images of wall surfaces of the environment so that features corresponding to objects on the wall surfaces can be used for localization.


The controller 212 can use data collected by the sensors of the sensor system to control navigational behaviors of the robot 100 during the mission. For example, the controller 212 can use the sensor data collected by obstacle detection sensors of the robot 100 (e.g., the cliff sensors 234, the bump sensors 239, and the image capture device 240) to help the robot 100 avoid obstacles when moving within the environment of the robot 100 during a mission.


The sensor data can also be used by the controller 212 for simultaneous localization and mapping (SLAM) techniques in which the controller 212 extracts or interprets features of the environment represented by the sensor data and constructs a map of the floor surface 50 of the environment. The sensor data collected by the image capture device 240 can be used for techniques such as vision-based SLAM (VSLAM) in which the controller 212 can extract visual features corresponding to objects in the environment 40 and can construct the map using these visual features. As the controller 212 directs the robot 100 about the floor surface 50 during the mission, the controller 212 can use SLAM techniques to determine a location of the robot 100 within the map by detecting features represented in collected sensor data and comparing the features to previously stored features. The map formed from the sensor data can indicate locations of traversable and non-traversable space within the environment. For example, locations of obstacles can be indicated on the map as non-traversable space, and locations of open floor space can be indicated on the map as traversable space.
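
The traversable/non-traversable map can be pictured as an occupancy grid in which sensed obstacles mark cells as non-traversable and crossed floor marks them traversable. The sketch below is a toy illustration of that bookkeeping, not the SLAM implementation described here; the grid size and coordinates are assumptions.

    from enum import Enum

    class Cell(Enum):
        UNKNOWN = 0
        TRAVERSABLE = 1
        NON_TRAVERSABLE = 2

    GRID_SIZE = 20  # assumed 20 x 20 cell map
    grid = [[Cell.UNKNOWN] * GRID_SIZE for _ in range(GRID_SIZE)]

    def mark(x: int, y: int, obstacle: bool) -> None:
        """Record a sensed cell: obstacles become non-traversable space."""
        grid[y][x] = Cell.NON_TRAVERSABLE if obstacle else Cell.TRAVERSABLE

    mark(5, 5, obstacle=True)   # e.g., a bump-sensor contact
    mark(5, 6, obstacle=False)  # open floor the robot has crossed
    print(grid[5][5], grid[6][5])  # Cell.NON_TRAVERSABLE Cell.TRAVERSABLE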


The sensor data collected by any of the sensors can be stored in the memory 213. In addition, other data generated for the SLAM techniques, including mapping data forming the map, can be stored in the memory 213. These data can include persistent data that are produced during the mission and that are usable during further missions. In addition to storing the software for causing the robot 100 to perform its behaviors, the memory 213 can store data resulting from processing of the sensor data for access by the controller 212. For example, the map can be a map that is usable and updateable by the controller 212 of the robot 100 from one mission to another mission to navigate the robot 100 about the floor surface 50.


The persistent data, including the persistent map, helps to enable the robot 100 to efficiently clean the floor surface 50. For example, the map enables the controller 212 to direct the robot 100 toward open floor space and to avoid non-traversable space. In addition, for subsequent missions, the controller 212 can use the map to optimize paths taken during the missions to help plan navigation of the robot 100 through the environment 40.


Network Examples


FIG. 4A is a diagram illustrating by way of example and not limitation a communication network 410 that can enable networking and communication between the mobile robot 100 and one or more other devices, such as a mobile device 404, a cloud computing system 406, or another autonomous robot 408 separate from the mobile robot 100. Using the communication network 410, the robot 100, the mobile device 404, the robot 408, and the cloud computing system 406 can communicate with one another to transmit and receive data, instructions, or other information to and from one another. The cloud computing system 406 can include processing circuitry and memory circuitry, including instructions, executable by the processing circuitry to cause the processing circuitry to perform operations.


In some examples, the robot 100, the robot 408, or both the robot 100 and the robot 408 communicate with the mobile device 404 through the cloud computing system 406. Alternatively or additionally, the robot 100, the robot 408, or both the robot 100 and the robot 408 communicate directly with the mobile device 404. Various types and combinations of wireless networks (e.g., Bluetooth, radio frequency, optical based, etc.) and network architectures (e.g., one or more of a distributed network, local area network (LAN), wide area network (WAN), a mesh network, or the like) may be employed by the communication network 410.


In some examples, the mobile device 404 can be a remote device that can be linked to the cloud computing system 406 and can enable a user to provide inputs, such as a smartphone, tablet, computer, or other computing device. The mobile device 404 can include user input elements such as, for example, one or more of a touchscreen display, buttons, a microphone, a mouse, a keyboard, or other devices that respond to inputs provided by the user. The mobile device 404 can also include immersive media (e.g., virtual reality) with which the user can interact to provide input. The mobile device 404, in these examples, can be a virtual reality headset or a head-mounted display.


The user can provide inputs corresponding to commands for the mobile robot 100. In such cases, the mobile device 404 can transmit a signal to the cloud computing system 406 to cause the cloud computing system 406 to transmit a command signal to the mobile robot 100. In some implementations, the mobile device 404 can present augmented reality images. In some implementations, the mobile device 404 can be a smart phone, a laptop computer, a tablet computing device, or other mobile device.


In some examples, the communication network 410 can include additional nodes. For example, nodes of the communication network 410 can include additional robots, such as a fleet of robots. Alternatively or additionally, nodes of the communication network 410 can include network-connected devices that can generate information about the environment 40. Such a network-connected device can include one or more sensors, such as an acoustic sensor, an image capture system, or other sensor generating signals, to detect characteristics of the environment 40 from which features can be extracted. Network-connected devices can also include home cameras, smart sensors, or the like.


In the communication network 410, the wireless links may utilize various communication schemes, protocols, etc., such as, for example, Bluetooth classes, Wi-Fi, Bluetooth-low-energy, also known as BLE, 802.15.4, Worldwide Interoperability for Microwave Access (WiMAX), an infrared channel, satellite band, or the like. In some examples, wireless links can include any cellular network standards used to communicate among mobile devices, including, but not limited to, standards that qualify as 1G, 2G, 3G, 4G, 5G, or the like. The network standards, if utilized, qualify as, for example, one or more generations of mobile telecommunication standards by fulfilling a specification or standards such as the specifications maintained by the International Telecommunication Union. For example, the 4G standards can correspond to the International Mobile Telecommunications Advanced (IMT-Advanced) specification. Examples of cellular network standards include AMPS, GSM, GPRS, UMTS, LTE, LTE Advanced, Mobile WiMAX, and WiMAX-Advanced. Cellular network standards may use various channel access methods, e.g., FDMA, TDMA, CDMA, or SDMA.



FIG. 4B is a diagram illustrating an exemplary process 401 of exchanging information among devices in the communication network 410 or performing one or more calculations or determinations on the devices, such as the mobile robot 100, the cloud computing system 406, the mobile device 404, a robot fleet 414, or other devices.


In some examples, the robot 100 can be tested at a factory (e.g., a manufacturing facility or test facility thereof). One or more sensors of the robot or sensors of one or more components or systems in the factory can produce sensor signals during testing. For example, the drive wheel motors can be tested, the roller motors can be tested, or the blower motor can be tested. In each test, data from one or more sensors from the robot or the factory can be collected, analyzed, and stored as factory test data 412. Such factory test data can be stored using an identifier of the robot, such as a serial number. The factory test data can include values from tests, such as a sensor value, e.g., motor current or an amount of time passed during testing. The factory test data can also include whether a tested component of the robot passed or failed. The factory test data can also include a trajectory of testing of the robot, such as a rate at which the robot passed each test, or a number of tries to pass each test.
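
One plausible shape for a per-robot factory test record keyed by serial number is sketched below; the field names and values are illustrative assumptions, not the actual schema of the factory test data 412.

    from dataclasses import dataclass

    @dataclass
    class FactoryTestRecord:
        """Per-robot factory test data, stored under the robot's serial number."""
        serial_number: str
        component: str           # e.g., "blower_motor" or "drive_wheel_motor"
        motor_current_a: float   # sensor value captured during the test
        test_duration_s: float   # amount of time passed during testing
        passed: bool             # pass/fail outcome for the tested component
        attempts: int = 1        # number of tries needed to pass the test

    record = FactoryTestRecord(
        serial_number="RBT-000123",
        component="blower_motor",
        motor_current_a=1.42,
        test_duration_s=30.0,
        passed=True,
    )
    print(record)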


Also, a fleet of robots 414, which can be mopping robots, vacuuming robots, two-in-one robots, or other smart-home devices, can be in the field, such as in various homes or environments. Each of the robots 414 can include one or more sensors (such as those of the robot 100) and each sensor can produce a signal that can be received by a controller of the robots of the fleet 414. The fleet robots 414 can store or process the signals and can transmit data from the signals or determinations made therefrom, which can be fleet data 416. The fleet data 416 can be transferred from each of the fleet robots 414 to the cloud computing system 406. The fleet data 416 can optionally include factory test data for each fleet robot.


The factory test data 412 and the fleet data 416 can together be used as a training pipeline 418 for a machine learning model 420 (or trained model or classifier) of the cloud computing system 406. Optionally, the training pipeline can include additional data, such as robot age, component life expectancy, or the like. The data received by the machine learning model 420 can be labeled with an indication, such as a source or category of the data, including robot model, event frequency, event type, component type, component life expectancy, or the like. The various data can be stored and analyzed to train the machine learning model 420. The specific machine learning algorithm used for training can be selected from among many different potential supervised or unsupervised machine learning algorithms. Examples of supervised learning algorithms include artificial neural networks, Bayesian networks, instance-based learning, support vector machines, decision trees (e.g., Iterative Dichotomiser 3, C4.5, Classification and Regression Tree (CART), Chi-squared Automatic Interaction Detector (CHAID), and the like), random forests, linear classifiers, quadratic classifiers, k-nearest neighbor, linear regression, logistic regression, and hidden Markov models. Examples of unsupervised learning algorithms include K-means clustering, principal component analysis, expectation-maximization algorithms, vector quantization, and information bottleneck method. Unsupervised models may not have a training engine. In an example embodiment, a classifier can be used and the model 420 can be one or more coefficients corresponding to a learned importance for each of the features.
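
As a concrete instance of one of the supervised options listed above, the sketch below fits a scikit-learn random forest to a handful of made-up rows combining factory-style and fleet-style features. The feature layout, values, and labels are illustrative assumptions, not the actual training pipeline 418.

    from sklearn.ensemble import RandomForestClassifier

    # Hypothetical rows drawn from factory test data 412 and fleet data 416:
    # [motor_current_a, test_duration_s, robot_age_days, event_frequency]
    features = [
        [1.4, 30.0, 100, 0.2],
        [2.9, 45.0, 700, 3.1],
        [1.5, 31.0, 250, 0.4],
        [3.2, 50.0, 800, 4.0],
    ]
    labels = [0, 1, 0, 1]  # 1 = component later required maintenance

    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(features, labels)

    # Binary maintenance indication for a newly observed robot.
    print(model.predict([[3.0, 48.0, 650, 2.8]]))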


The resulting machine learning model 420 can be a model configured to predict an outcome based on input data, which can be used to help a user proactively maintain the robot 100. The machine learning model 420 can be a classifier, such as a binary classifier or a multi-class classifier. Once trained, the model 420 can output a correlated data-based outcome from an input of one or more data inputs. For example, the machine learning model 420 can output a result 422 following one or more inferences or determinations 424 based on robot data 426.


The robot data 426 can be robot telemetry or other data from the robot 100. For example, the robot 100 can produce one or more sensor signals from one or more sensors of the robot 100. The sensor signals can be used by the controller 212 of the robot to make one or more determinations, which can be stored by the controller 212, such as in memory 213 of the robot 100. Also, the controller 212 can store data from the signals along with other correlated data, such as time or frequency of occurrence. Any or all of this information collected by the robot 100 can be transmitted to the cloud computing system 406 for use with the machine learning model 420 in the inference pipeline 424.


Upon receipt of the robot data 426, the machine learning model 420 can use the robot data 426 to predict whether maintenance of one or more components of the robot 100 is required or will be required, which can be a maintenance determination or indication. For example, the machine learning model 420 can use data of the robot data 426 relating to operation of the vacuum assembly 218. Upon receipt of the vacuum assembly data, for example data from a motor of the vacuum assembly (such as a motor current sensor), the machine learning model 420 can use the data as an input into the inference pipeline 424 to determine whether maintenance on the blower of the vacuum assembly 218 is required or is likely to be required. When the machine learning model 420 makes a determination, the machine learning model 420 can output the result 422, which can be transmitted to one or more other devices, such as the mobile device 404, an analytics device 428, or a customer care device 430. Either of the analytics device 428 or the customer care device 430 can be a device, such as a computer, tablet, or the like, configured to be operated or accessed by a user. For example, the customer care device 430 can be a computer or system accessible by a customer care or support specialist.
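
The path from robot data 426 through the inference pipeline 424 to the result 422 might look like the sketch below. Everything here is a hypothetical stand-in: a simple threshold rule replaces the trained model 420, the telemetry field names are invented, and DeviceStub merely prints where a real system would transmit to the mobile device 404, analytics device 428, or customer care device 430.

    class ThresholdModel:
        """Hypothetical stand-in for the trained machine learning model 420."""
        def failure_probability(self, features: list) -> float:
            # Toy rule: elevated blower motor current suggests a failing blower.
            return 0.9 if features[0] > 2.5 else 0.1

    def extract_features(robot_data: dict) -> list:
        """Pull model inputs (e.g., blower motor current) from robot telemetry."""
        return [robot_data["blower_current_a"], robot_data["mission_hours"]]

    class DeviceStub:
        """Stand-in for the mobile device 404, analytics device 428, or care device 430."""
        def notify(self, message: str) -> None:
            print("ALERT:", message)

    def infer_and_dispatch(model, robot_data: dict, devices: list) -> bool:
        """Inference pipeline 424: produce the result 422 and transmit it."""
        recommended = model.failure_probability(extract_features(robot_data)) > 0.5
        if recommended:
            for device in devices:
                device.notify("Maintenance recommended: check the vacuum blower.")
        return recommended

    infer_and_dispatch(ThresholdModel(),
                       {"blower_current_a": 3.0, "mission_hours": 400.0},
                       [DeviceStub()])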


Optionally, the inference pipeline 424 can receive robot data 426 that includes only factory test data, and the machine learning model 420 can use the factory test data pertaining to a specific robot in the inference pipeline 424 to make a maintenance determination or indication as to whether a component will require service.


When the machine learning model 420 makes a determination that maintenance is required, the result 422 can be transmitted to one of the devices, such as the mobile device 404 for receipt by the user 402, such as in the form of a maintenance instruction, as discussed in further detail below. The maintenance indication can optionally be transmitted to any of the mobile device 404, the analytics device 428, or the customer care device 430, where such devices can make a maintenance recommendation or maintenance instruction based on the maintenance indication. The device (any of the mobile device 404, the analytics device 428, or the customer care device 430) can then transmit a maintenance instruction to another of the devices or can display a maintenance instruction. Upon receipt of the indication or instruction, the device can produce an alert or instructions for its user. For example, as discussed in further detail below with respect to FIG. 6, the mobile device 404 can produce instructions for checking the component predicted to fail or instructions for ordering a replacement component.


In another example, the machine learning model 420 can use data from the motor 244 of the side brush 242 (such as a motor current sensor of the motor 244) as an input into the inference pipeline 424 to determine whether maintenance on the motor 244 of the brush 242 is required or is likely to be required. When the machine learning model 420 makes a determination that maintenance is required, the machine learning model 420 can output the result 422, which can be transmitted to one or more other devices, such as the mobile device 404, an analytics device 428, or a customer care device 430. For example, the machine learning model 420 can transmit the maintenance indication or the maintenance instruction to the mobile device 404 that the motor 244 be replaced.


In another example, the machine learning model 420 can use data from the roller actuator or motor 214 (such as a motor current sensor) as an input into the inference pipeline 424 to determine whether maintenance on the motors 214 is required or is likely to be required. When the machine learning model 420 makes a determination that maintenance is required, the machine learning model 420 can output the result 422, which can be transmitted to one or more other devices, such as the mobile device 404, an analytics device 428, or a customer care device 430. For example, the machine learning model 420 can transmit the maintenance indication or the maintenance instruction to the mobile device 404 that one or more of the motors 214 be replaced.


In another example, the machine learning model 420 can use data from the drive wheel actuator or motor 208 (such as a motor current sensor) as an input into the inference pipeline 424 to determine whether maintenance on either of the motors 208 is required or is likely to be required. When the machine learning model 420 makes a determination that maintenance is required, the machine learning model 420 can output the result 422, which can be transmitted to one or more other devices, such as the mobile device 404, an analytics device 428, or a customer care device 430. For example, the machine learning model 420 can transmit the maintenance indication or the maintenance instruction to the mobile device 404 that one or more of the motors 208 be replaced.


Similarly, the machine learning model 420 can use data from a sensor (such as the cliff sensors 234, the bump sensors 239, or the image capture device 240) as an input into the inference pipeline 424 to determine whether maintenance on one or more of the sensors is required or is likely to be required. When the machine learning model 420 makes a determination that maintenance is required, the machine learning model 420 can output the result 422, which can be transmitted to one or more other devices, such as the mobile device 404, an analytics device 428, or a customer care device 430. For example, the machine learning model 420 can transmit the maintenance indication or the maintenance instruction to the mobile device 404 that one or more of the sensors be replaced.


Similarly, the machine learning model 420 can use data from a sensor of the robot or from a docking station or a user device indicative of an amount of average robot time between charges as an input into the inference pipeline 424 to determine whether maintenance on the battery 245 of the robot is required. The machine learning model 420 can also receive data regarding a capacity of the battery 245 as an input into the inference pipeline 424 to determine whether maintenance on the battery 245 of the robot is required. When the machine learning model 420 makes a determination that maintenance is required, the machine learning model 420 can output the result 422, which can be transmitted to one or more other devices, such as the mobile device 404, an analytics device 428, or a customer care device 430. For example, the machine learning model 420 can transmit the maintenance indication or the maintenance instruction to the mobile device 404 that the battery 245 be replaced.


The machine learning model 420 can use other data that is not sensor data in the inference pipeline 424 to predict that a component will require service or replacement. For example, the robot 100 can transmit robot data 426 that includes a cleaning frequency of the robot 100 within the environment 40, such as once per day or once per week. The machine learning model 420 can use such information, along with sensor data or along with factory data, to determine the maintenance indication, such as whether the component is likely to fail or require service. Also, the machine learning model 420 can use the cleaning frequency to determine when the component is likely to fail or when the component is likely to require service. The cloud computing system 406 can transmit to one or more devices (e.g., the mobile device 404) an indication that the component is likely to fail and when the component is likely to fail. The mobile device 404 can produce a maintenance instruction indicating the component in need of service and when the component will likely need service. Such an indication can provide the user 402 with a timeline for obtaining replacement parts.
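
The timing estimate described above reduces to simple arithmetic once the model supplies a remaining-mission figure: dividing predicted remaining missions by the cleaning frequency yields a timeline. A minimal sketch, with the remaining-mission value assumed rather than produced by a real model:

    def weeks_until_service(remaining_missions: float, missions_per_week: float) -> float:
        """Estimate when a component will likely need service, in weeks."""
        return remaining_missions / missions_per_week

    # A component with ~20 predicted missions left on a robot that cleans daily:
    print(weeks_until_service(remaining_missions=20, missions_per_week=7))  # ~2.9 weeks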


Replacement parts can also be automatically ordered by the mobile device 404 or the cloud computing system 406 (or the analytics device 428 or the customer care device 430). For example, when the machine learning model 420 determines that a component is likely to fail, the cloud computing system 406 can transmit an order (such as to the customer care device 430) for a replacement component to be shipped to the user 402. Optionally, the cloud computing system 406 can transmit, to the mobile device 404, the maintenance indication (or a maintenance instruction) and the mobile device 404 can present an indication that is user selectable to place an order for a replacement component of the robot 100. Upon user selection of the indication, the mobile device 404 can transmit acceptance of the order to the cloud computing system 406 or the customer care device 430 and the component can be ordered or prepared for shipment in response.


Operations for the process 401 and other processes described herein, such as one or more steps of the method 500, can be executed in a distributed manner. For example, the cloud computing system 406, the mobile robot 100, and the mobile device 404 may execute one or more of the operations in concert with one another. Operations described as executed by one of the cloud computing system 406, the mobile robot 100, and the mobile device 404 are, in some implementations, executed at least in part by two or all of the cloud computing system 406, the mobile robot 100, and the mobile device 404.


Further Operations of the Robot


FIG. 5 illustrates a schematic view of the method 500, in accordance with at least one example of this disclosure. The method 500 can be a method of predicting component failure or maintenance for a mobile cleaning robot. The robot 100 can transmit data to the cloud computing system 406, which can use the machine learning model 420 to analyze the data (optionally along with other data) to determine whether maintenance of the robot will be required or to predict component replacement. More specific examples of the method 500 are discussed below. The steps or operations of the method 500 are illustrated in a particular order for convenience and clarity; many of the discussed operations can be performed in a different sequence or in parallel without materially impacting other operations. The method 500 as discussed includes operations performed by multiple different actors, devices, or systems. It is understood that subsets of the operations discussed in the method 500 that are attributable to a single actor, device, or system could be considered separate standalone processes or methods.


The method 500 can begin at step 502 where factory test data of a mobile cleaning robot can be produced, transmitted, or received. For example, the robot 100 or components of the factory can produce test data, which can be transmitted to the cloud computing system 406 and received thereby.


At step 504, one or more mobile cleaning robots of a fleet can produce fleet data regarding each robot, which can be transmitted by the robots and received by a remote device or network, such as the cloud computing system 406. At step 506, a model, such as a machine learning model, classifier, or other predictive algorithm, can be trained using the factory test data or the fleet data. For example, the machine learning model 420 can be trained on one or more of the fleet data 416 and the factory test data 412.


At step 508, the robot 100 can transmit robot data to another device, such as the mobile device 404 or the cloud computing system 406. The robot data can be data from one or more sensors of the robot 100 or one or more determinations made by the controller 212 of the robot 100. At step 510, a maintenance indication can be determined. For example, the machine learning model 420 trained at step 506 can determine or produce a maintenance indication based on the robot data or other data. The maintenance indication can be indicative of whether maintenance on the mobile cleaning robot 100 is recommended.


At step 512, a maintenance instruction can be transmitted. The instruction can be the maintenance indication or can be an instruction or recommendation based thereon. The instruction can be transmitted from the cloud computing system 406 to another device, such as the mobile device 404. In some examples, the maintenance indication or instruction can be transmitted only when maintenance on the mobile cleaning robot 100 is recommended.
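
Pulling the steps together, the following is a minimal end-to-end sketch of the method 500; every function, field name, and the trivial threshold rule standing in for the machine learning model 420 are hypothetical illustrations, not the actual implementation.

    def train_model(factory_test_data: list, fleet_data: list):
        """Steps 502-506: stand-in for training the machine learning model 420."""
        class TrivialModel:
            def recommend_maintenance(self, robot_data: dict) -> bool:
                # Hypothetical rule in place of the inference pipeline 424.
                return robot_data.get("blower_current_a", 0.0) > 2.5
        return TrivialModel()

    def method_500(factory_test_data: list, fleet_data: list,
                   robot_data: dict, notify) -> bool:
        """Steps 508-512: infer a maintenance indication, then transmit an instruction."""
        model = train_model(factory_test_data, fleet_data)
        recommended = model.recommend_maintenance(robot_data)  # step 510
        if recommended:                                        # step 512
            notify("Maintenance instruction: component service is recommended.")
        return recommended

    print(method_500([], [], {"blower_current_a": 3.0}, print))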



FIG. 6 illustrates a perspective view of a user device 600. FIG. 7 illustrates another perspective view of the user device 600. FIGS. 6-7 are discussed together below and illustrate, by way of non-limiting example, a user interface of a smart phone, which can be an example of the mobile device 404. The user device 600 can include a display screen 602, which can be configured to display text or images and can be configured to receive user input, such as touch input.


As shown in FIG. 6, an alert 604 regarding an error, such as an error 26, can be presented on the display screen 602. The display screen 602 can also display alert subtext 606, which can provide additional information about the alert, such as that the vacuum suction is underperforming. Also, cleaning instructions 608 can be displayed on the display screen 602, which can include instructions 610 and 612, such as recommended cleaning instructions for the user to try, which can help to ensure that a component is actually failing. For example, the instruction 610 can instruct the user to remove the dust bin and filter from the robot. The instructions 612 can instruct the user to empty the bin and clean off the filter by tapping it on a trash bin.


The display screen 602 can also display component care notes 614, which can include notes 616 and 618 on how to properly maintain the robot. For example, the note 616 can indicate proper filter cleaning frequency to the user, and the note 618 can indicate proper filter replacement frequency to the user.


Both the alert 604 and the alert subtext 606 can vary based on the error the robot is experiencing. For example, when the vacuum system is not performing correctly, the alert 604 and the alert subtext 606 can be displayed. However, when the drive wheel motors are not performing correctly, a drive wheel alert and subtext can be displayed. A different alert 604 and subtext 606 can be displayed for each error or detected issue. Similarly, the cleaning instructions 608 and the care notes 614 can also be tailored based on the specific error of the robot.


The user device 600 can also be configured to display a learn more indication 620 on the display screen 602 that is user-selectable to present a new screen with additional information or selectable indications, such as the display of FIG. 7. On the display screen 602 of FIG. 7, an order alert 622 can be produced, which can explain to the user that a replacement component can be ordered. The display screen 602 can also show subtext 624 of the alert 622, which can include additional details, such as explaining the specific component (e.g., a cleaning head module) that requires replacement.


The display screen 602 can also include instructions 626-630, which can explain steps for replacement or preparation. For example, the instruction 630 can instruct a user to place an order by selecting an order indication 632. The order indication 632 can be presented on the display screen 602 and can be user-selectable to transmit an order (or a message for an order) to the cloud computing system 406 or another device. Upon receipt of the order from the user device 600, the cloud computing system 406 can instruct a user or worker to prepare a replacement order for shipment to the user (e.g., the user 402). In this way, the user device 600 can instruct the user to maintain a component that is predicted by the cloud computing system 406 to need maintenance or replacement, and the user device 600 can interact with the cloud computing system 406 to quickly and easily place an order for a replacement component before failure of the component, helping to reduce robot downtime.



FIG. 8 illustrates a block diagram of an example machine 800 upon which any one or more of the techniques (e.g., methodologies) discussed herein may perform. Examples, as described herein, may include, or may operate by, logic or a number of components, or mechanisms in the machine 800. Circuitry (e.g., processing circuitry) is a collection of circuits implemented in tangible entities of the machine 800 that include hardware (e.g., simple circuits, gates, logic, etc.). Circuitry membership may be flexible over time. Circuitries include members that may, alone or in combination, perform specified operations when operating. In an example, hardware of the circuitry may be immutably designed to carry out a specific operation (e.g., hardwired). In an example, the hardware of the circuitry may include variably connected physical components (e.g., execution units, transistors, simple circuits, etc.) including a machine readable medium physically modified (e.g., magnetically, electrically, moveable placement of invariant massed particles, etc.) to encode instructions of the specific operation. In connecting the physical components, the underlying electrical properties of a hardware constituent are changed, for example, from an insulator to a conductor or vice versa. The instructions enable embedded hardware (e.g., the execution units or a loading mechanism) to create members of the circuitry in hardware via the variable connections to carry out portions of the specific operation when in operation. Accordingly, in an example, the machine readable medium elements are part of the circuitry or are communicatively coupled to the other components of the circuitry when the device is operating. In an example, any of the physical components may be used in more than one member of more than one circuitry. For example, under operation, execution units may be used in a first circuit of a first circuitry at one point in time and reused by a second circuit in the first circuitry, or by a third circuit in a second circuitry at a different time. Additional examples of these components with respect to the machine 800 follow.


In alternative embodiments, the machine 800 may operate as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine 800 may operate in the capacity of a server machine, a client machine, or both in server-client network environments. In an example, the machine 800 may act as a peer machine in peer-to-peer (P2P) (or other distributed) network environment. The machine 800 may be a personal computer (PC), a tablet PC, a set-top box (STB), a personal digital assistant (PDA), a mobile telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein, such as cloud computing, software as a service (SaaS), other computer cluster configurations.


The machine (e.g., computer system) 800 may include a hardware processor 802 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), a hardware processor core, or any combination thereof), a main memory 804, a static memory (e.g., memory or storage for firmware, microcode, a basic-input-output (BIOS), unified extensible firmware interface (UEFI), etc.) 806, and mass storage 808 (e.g., hard drive, tape drive, flash storage, or other block devices) some or all of which may communicate with each other via an interlink (e.g., bus) 830. The machine 800 may further include a display unit 810, an alphanumeric input device 812 (e.g., a keyboard), and a user interface (UI) navigation device 814 (e.g., a mouse). In an example, the display unit 810, input device 812 and UI navigation device 814 may be a touch screen display. The machine 800 may additionally include a storage device (e.g., drive unit) 808, a signal generation device 818 (e.g., a speaker), a network interface device 820, and one or more sensors 816, such as a global positioning system (GPS) sensor, compass, accelerometer, or other sensor. The machine 800 may include an output controller 828, such as a serial (e.g., universal serial bus (USB), parallel, or other wired or wireless (e.g., infrared (IR), near field communication (NFC), etc.) connection to communicate or control one or more peripheral devices (e.g., a printer, card reader, etc.).


Registers of the processor 802, the main memory 804, the static memory 806, or the mass storage 808 may be, or include, a machine readable medium 822 on which is stored one or more sets of data structures or instructions 824 (e.g., software) embodying or utilized by any one or more of the techniques or functions described herein. The instructions 824 may also reside, completely or at least partially, within any of registers of the processor 802, the main memory 804, the static memory 806, or the mass storage 808 during execution thereof by the machine 800. In an example, one or any combination of the hardware processor 802, the main memory 804, the static memory 806, or the mass storage 808 may constitute the machine readable media 822. While the machine readable medium 822 is illustrated as a single medium, the term “machine readable medium” may include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) configured to store the one or more instructions 824.


The term “machine readable medium” may include any medium that is capable of storing, encoding, or carrying instructions for execution by the machine 800 and that cause the machine 800 to perform any one or more of the techniques of the present disclosure, or that is capable of storing, encoding or carrying data structures used by or associated with such instructions. Non-limiting machine readable medium examples may include solid-state memories, optical media, magnetic media, and signals (e.g., radio frequency signals, other photon based signals, sound signals, etc.). In an example, a non-transitory machine readable medium comprises a machine readable medium with a plurality of particles having invariant (e.g., rest) mass, and thus are compositions of matter. Accordingly, non-transitory machine-readable media are machine readable media that do not include transitory propagating signals. Specific examples of non-transitory machine readable media may include: non-volatile memory, such as semiconductor memory devices (e.g., Electrically Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM)) and flash memory devices; magnetic disks, such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.


The instructions 824 may be further transmitted or received over a communications network 826 using a transmission medium via the network interface device 820 utilizing any one of a number of transfer protocols (e.g., frame relay, internet protocol (IP), transmission control protocol (TCP), user datagram protocol (UDP), hypertext transfer protocol (HTTP), etc.). Example communication networks may include a local area network (LAN), a wide area network (WAN), a packet data network (e.g., the Internet), mobile telephone networks (e.g., cellular networks), Plain Old Telephone Service (POTS) networks, and wireless data networks (e.g., the Institute of Electrical and Electronics Engineers (IEEE) 802.11 family of standards known as Wi-Fi®, the IEEE 802.16 family of standards known as WiMax®), the IEEE 802.15.4 family of standards, and peer-to-peer (P2P) networks, among others. In an example, the network interface device 820 may include one or more physical jacks (e.g., Ethernet, coaxial, or phone jacks) or one or more antennas to connect to the communications network 826. In an example, the network interface device 820 may include a plurality of antennas to wirelessly communicate using at least one of single-input multiple-output (SIMO), multiple-input multiple-output (MIMO), or multiple-input single-output (MISO) techniques. The term “transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding, or carrying instructions for execution by the machine 800, and includes digital or analog communications signals or other intangible medium to facilitate communication of such software. A transmission medium is a machine-readable medium.
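

As a concrete illustration of such a transmission, the following is a minimal sketch of sending a maintenance instruction to a user device as JSON over HTTP (carried over TCP/IP, one of the transfer protocols listed above). The endpoint URL, payload fields, and instruction format are hypothetical illustrations only and are not specified by this disclosure.

    # Minimal sketch (Python): POST a maintenance instruction as JSON over HTTP.
    # The URL and payload shape below are hypothetical, for illustration only.
    import json
    import urllib.request

    def send_maintenance_instruction(robot_id, component, url):
        """Transmit a maintenance instruction and return the HTTP status code."""
        payload = json.dumps({
            "robot_id": robot_id,
            "component": component,         # e.g., "blower_motor"
            "action": "order_replacement",  # hypothetical instruction type
        }).encode("utf-8")
        request = urllib.request.Request(
            url,
            data=payload,
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        with urllib.request.urlopen(request) as response:
            return response.status

    # Hypothetical usage:
    # send_maintenance_instruction("robot-0042", "blower_motor",
    #                              "https://example.com/api/maintenance")

In practice, the same payload could be carried by any of the listed protocols; HTTP over TCP is shown only because it is widely deployed.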


Notes and Examples

The following non-limiting examples detail certain aspects of the present subject matter to solve the challenges and provide the benefits discussed herein, among others.

    • Example 1 is a mobile cleaning robot system including a mobile cleaning robot, the system comprising: processing circuitry; and memory circuitry, including instructions, which when executed by the processing circuitry, cause the processing circuitry to perform operations to: receive factory test data; transmit a maintenance indication indicative of whether maintenance on the mobile cleaning robot is recommended, the maintenance indication based at least in part on the factory test data; and transmit a maintenance instruction to a user device when the maintenance indication is that maintenance on the mobile cleaning robot is recommended.
    • In Example 2, the subject matter of Example 1 optionally includes the mobile cleaning robot comprising: a blower motor configured to operate a vacuum blower to ingest debris from an environment; and a motor sensor connected to the blower motor and configured to produce a motor signal based on operation of the blower motor, wherein the factory test data is based at least in part on the motor signal.
    • In Example 3, the subject matter of Example 2 optionally includes wherein the maintenance indication indicates whether the blower motor is recommended to receive maintenance.
    • In Example 4, the subject matter of any one or more of Examples 1-3 optionally include wherein the memory circuitry includes instructions, which when executed by the processing circuitry, further cause the processing circuitry to perform operations to: produce a sensor signal using a sensor of the mobile cleaning robot; produce sensor data based on the sensor signal; and transmit the sensor data from the mobile cleaning robot, the maintenance indication based at least in part on the sensor data.
    • In Example 5, the subject matter of any one or more of Examples 1-4 optionally include wherein the maintenance indication is determined based on fleet data from a fleet of mobile cleaning robots.
    • In Example 6, the subject matter of any one or more of Examples 1-5 optionally include wherein the memory circuitry includes instructions, which when executed by the processing circuitry, further cause the processing circuitry to perform operations to: determine a cleaning frequency of the mobile cleaning robot in one or more portions of an environment; and transmit the cleaning frequency to a remote device, wherein the maintenance indication is determined based at least in part on the cleaning frequency.
    • Example 7 is at least one non-transitory machine-readable medium, including instructions, which when executed, cause processing circuitry to perform operations to: receive factory test data of a mobile cleaning robot; determine a maintenance indication indicative of whether maintenance on the mobile cleaning robot is recommended, the maintenance indication based at least in part on the factory test data of the mobile cleaning robot; and transmit a maintenance instruction to a user device when the maintenance indication is that maintenance on the mobile cleaning robot is recommended.
    • In Example 8, the subject matter of Example 7 optionally includes the instructions to further cause the processing circuitry to: receive fleet data from a fleet of mobile cleaning robots; and determine the maintenance instruction based at least in part on the fleet data of the fleet of mobile cleaning robots.
    • In Example 9, the subject matter of Example 8 optionally includes the instructions to further cause the processing circuitry to: determine the maintenance instruction using a trained machine learning model, the trained machine learning model trained using at least one of the factory test data or the fleet data as input to the trained machine learning model (a minimal illustrative sketch of this determination appears after this list).
    • In Example 10, the subject matter of Example 9 optionally includes wherein the factory test data includes vacuum system test data, and wherein the fleet data includes vacuum system operational data.
    • In Example 11, the subject matter of any one or more of Examples 7-10 optionally include the instructions to further cause the processing circuitry to: present an order indication on the user device that is user selectable to place an order for a replacement component of the mobile cleaning robot; and transmit, to a remote device, the order for the replacement component upon user selection of the order indication.
    • In Example 12, the subject matter of any one or more of Examples 7-11 optionally include the instructions to further cause the processing circuitry to: receive a motor signal based on operation of a blower motor of the mobile cleaning robot; and determine the maintenance instruction based at least in part on the motor signal.
    • In Example 13, the subject matter of Example 12 optionally includes wherein the maintenance indication indicates whether the blower motor is recommended to receive maintenance.
    • In Example 14, the subject matter of any one or more of Examples 7-13 optionally include the instructions to further cause the processing circuitry to: receive a sensor signal from a sensor of the mobile cleaning robot; generate sensor data based on the sensor signal; transmit the sensor data to a remote device; and determine the maintenance instruction based at least in part on the sensor data of the mobile cleaning robot.
    • Example 15 is a method of predicting maintenance for a mobile cleaning robot, the method comprising: receiving factory test data of a mobile cleaning robot; determining a maintenance indication indicative of whether maintenance on the mobile cleaning robot is recommended, the maintenance indication based at least in part on the factory test data of the mobile cleaning robot; and transmitting a maintenance instruction to a user device when the maintenance indication is that maintenance on the mobile cleaning robot is recommended.
    • In Example 16, the subject matter of Example 15 optionally includes receiving fleet data from a fleet of mobile cleaning robots; and determining the maintenance instruction based at least in part on the fleet data of the fleet of mobile cleaning robots.
    • In Example 17, the subject matter of Example 16 optionally includes determining the maintenance instruction using a trained machine learning model, the trained machine learning model trained using at least one of the factory test data or the fleet data as input to the trained machine learning model.
    • In Example 18, the subject matter of any one or more of Examples 16-17 optionally include wherein the factory test data includes vacuum system test data, and wherein the fleet data includes vacuum system operational data.
    • In Example 19, the subject matter of Example 18 optionally includes receiving a sensor signal from a sensor of the mobile cleaning robot; generating sensor data based on the sensor signal; transmitting the sensor data to a remote device; and determining the maintenance instruction based at least in part on the sensor data of the mobile cleaning robot.
    • In Example 20, the subject matter of any one or more of Examples 15-19 optionally include presenting an order indication on the user device that is user selectable to place an order for a replacement component of the mobile cleaning robot; and transmitting, to a remote device, the order for the replacement component upon user selection of the order indication.
    • Example 21 is at least one machine-readable medium including instructions that, when executed by processing circuitry, cause the processing circuitry to perform operations to implement any of Examples 1-20.
    • Example 22 is an apparatus comprising means to implement any of Examples 1-20.
    • Example 23 is a system to implement any of Examples 1-20.
    • Example 24 is a method to implement any of Examples 1-20.
    • In Example 25, the system, apparatus(es), or method of any one or any combination of Examples 1-24 can optionally be configured such that all elements or options recited are available to use or select from.
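
To make the determination of Examples 7-10 concrete, the following is a minimal sketch of a maintenance-indication pipeline. The feature names, hand-set weights, and the 0.5 decision threshold are illustrative assumptions standing in for a model actually trained on factory test data and fleet data; the disclosure does not prescribe a particular model form or feature set.

    # Minimal sketch (Python): scoring maintenance risk from factory test
    # data and fleet/robot operational data, per Examples 7-10. Weights and
    # threshold are hypothetical stand-ins for a trained model.
    import math
    from dataclasses import dataclass

    @dataclass
    class BlowerTelemetry:
        factory_current_a: float      # blower motor current at factory test
        fleet_mean_current_a: float   # typical operating current from fleet data
        mission_count: int            # missions completed by this robot

    def maintenance_probability(t):
        """Logistic score of blower-motor failure risk (illustrative)."""
        drift = t.fleet_mean_current_a - t.factory_current_a  # wear indicator
        score = 0.8 * drift + 0.001 * t.mission_count - 0.5   # hypothetical weights
        return 1.0 / (1.0 + math.exp(-score))

    def maintenance_recommended(t):
        """Maintenance indication: True when risk crosses the threshold."""
        return maintenance_probability(t) > 0.5

    # Hypothetical usage: a robot whose blower current has drifted upward.
    # t = BlowerTelemetry(factory_current_a=1.2, fleet_mean_current_a=1.9,
    #                     mission_count=350)
    # maintenance_recommended(t)  # -> True; transmit a maintenance instruction

When the indication is affirmative, the system of Example 11 would then present an order indication on the user device for the affected component, such as a replacement blower motor.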


The above detailed description includes references to the accompanying drawings, which form a part of the detailed description. The drawings show, by way of illustration, specific embodiments in which the invention can be practiced. These embodiments are also referred to herein as “examples.” Such examples can include elements in addition to those shown or described. However, the present inventors also contemplate examples in which only those elements shown or described are provided. Moreover, the present inventors also contemplate examples using any combination or permutation of those elements shown or described (or one or more aspects thereof), either with respect to a particular example (or one or more aspects thereof), or with respect to other examples (or one or more aspects thereof) shown or described herein.


In the event of inconsistent usages between this document and any documents so incorporated by reference, the usage in this document controls. In this document, the terms “including” and “in which” are used as the plain-English equivalents of the respective terms “comprising” and “wherein.” Also, in the following claims, the terms “including” and “comprising” are open-ended; that is, a system, device, article, composition, formulation, or process that includes elements in addition to those listed after such a term in a claim is still deemed to fall within the scope of that claim.


The above description is intended to be illustrative, and not restrictive. For example, the above-described examples (or one or more aspects thereof) may be used in combination with each other. Other embodiments can be used, such as by one of ordinary skill in the art upon reviewing the above description. The Abstract is provided to comply with 37 C.F.R. § 1.72(b), to allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. Also, in the above Detailed Description, various features may be grouped together to streamline the disclosure. This should not be interpreted as intending that an unclaimed disclosed feature is essential to any claim. Rather, inventive subject matter may lie in less than all features of a particular disclosed embodiment. Thus, the following claims are hereby incorporated into the Detailed Description as examples or embodiments, with each claim standing on its own as a separate embodiment, and it is contemplated that such embodiments can be combined with each other in various combinations or permutations. The scope of the invention should be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.

Claims
  • 1. A mobile cleaning robot system including a mobile cleaning robot, the system comprising: processing circuitry; and memory circuitry, including instructions, which when executed by the processing circuitry, cause the processing circuitry to perform operations to: receive factory test data; transmit a maintenance indication indicative of whether maintenance on the mobile cleaning robot is recommended, the maintenance indication based at least in part on the factory test data; and transmit a maintenance instruction to a user device when the maintenance indication is that maintenance on the mobile cleaning robot is recommended.
  • 2. The mobile cleaning robot system of claim 1, the mobile cleaning robot comprising: a blower motor configured to operate a vacuum blower to ingest debris from an environment; and a motor sensor connected to the blower motor and configured to produce a motor signal based on operation of the blower motor, wherein the factory test data is based at least in part on the motor signal.
  • 3. The mobile cleaning robot system of claim 2, wherein the maintenance indication indicates whether the blower motor is recommended to receive maintenance.
  • 4. The mobile cleaning robot system of claim 1, wherein the memory circuitry includes instructions, which when executed by the processing circuitry, further cause the processing circuitry to perform operations to: produce a sensor signal using a sensor of the mobile cleaning robot; produce sensor data based on the sensor signal; and transmit the sensor data from the mobile cleaning robot, the maintenance indication based at least in part on the sensor data.
  • 5. The mobile cleaning robot system of claim 1, wherein the maintenance indication is determined based on fleet data from a fleet of mobile cleaning robots.
  • 6. The mobile cleaning robot system of claim 1, wherein the memory circuitry includes instructions, which when executed by the processing circuitry, further cause the processing circuitry to perform operations to: determine a cleaning frequency of the mobile cleaning robot in one or more portions of an environment; and transmit the cleaning frequency to a remote device, wherein the maintenance indication is determined based at least in part on the cleaning frequency.
  • 7. At least one non-transitory machine-readable medium, including instructions, which when executed, cause processing circuitry to perform operations to: receive factory test data of a mobile cleaning robot; determine a maintenance indication indicative of whether maintenance on the mobile cleaning robot is recommended, the maintenance indication based at least in part on the factory test data of the mobile cleaning robot; and transmit a maintenance instruction to a user device when the maintenance indication is that maintenance on the mobile cleaning robot is recommended.
  • 8. The at least one non-transitory machine-readable medium of claim 7, the instructions to further cause the processing circuitry to: receive fleet data from a fleet of mobile cleaning robots; and determine the maintenance instruction based at least in part on the fleet data of the fleet of mobile cleaning robots.
  • 9. The at least one non-transitory machine-readable medium of claim 8, the instructions to further cause the processing circuitry to: determine the maintenance instruction using a trained machine learning model, the trained machine learning model trained using at least one of the factory test data or the fleet data as input to the trained machine learning model.
  • 10. The at least one non-transitory machine-readable medium of claim 9, wherein the factory test data includes vacuum system test data, and wherein the fleet data includes vacuum system operational data.
  • 11. The at least one non-transitory machine-readable medium of claim 7, the instructions to further cause the processing circuitry to: present an order indication on the user device that is user selectable to place an order for a replacement component of the mobile cleaning robot; and transmit, to a remote device, the order for the replacement component upon user selection of the order indication.
  • 12. The at least one non-transitory machine-readable medium of claim 7, the instructions to further cause the processing circuitry to: receive a motor signal based on operation of a blower motor of the mobile cleaning robot; and determine the maintenance instruction based at least in part on the motor signal.
  • 13. The at least one non-transitory machine-readable medium of claim 12, wherein the maintenance indication indicates whether the blower motor is recommended to receive maintenance.
  • 14. The at least one non-transitory machine-readable medium of claim 7, the instructions to further cause the processing circuitry to: receive a sensor signal from a sensor of the mobile cleaning robot; generate sensor data based on the sensor signal; transmit the sensor data to a remote device; and determine the maintenance instruction based at least in part on the sensor data of the mobile cleaning robot.
  • 15. A method of predicting maintenance for a mobile cleaning robot, the method comprising: receiving factory test data of a mobile cleaning robot; determining a maintenance indication indicative of whether maintenance on the mobile cleaning robot is recommended, the maintenance indication based at least in part on the factory test data of the mobile cleaning robot; and transmitting a maintenance instruction to a user device when the maintenance indication is that maintenance on the mobile cleaning robot is recommended.
  • 16. The method of claim 15, comprising: receiving fleet data from a fleet of mobile cleaning robots; and determining the maintenance instruction based at least in part on the fleet data of the fleet of mobile cleaning robots.
  • 17. The method of claim 16, comprising: determining the maintenance instruction using a trained machine learning model, the trained machine learning model trained using at least one of the factory test data or the fleet data as input to the trained machine learning model.
  • 18. The method of claim 16, wherein the factory test data includes vacuum system test data, and wherein the fleet data includes vacuum system operational data.
  • 19. The method of claim 18, comprising: receiving a sensor signal from a sensor of the mobile cleaning robot; generating sensor data based on the sensor signal; transmitting the sensor data to a remote device; and determining the maintenance instruction based at least in part on the sensor data of the mobile cleaning robot.
  • 20. The method of claim 15, further comprising: presenting an order indication on the user device that is user selectable to place an order for a replacement component of the mobile cleaning robot; and transmitting, to a remote device, the order for the replacement component upon user selection of the order indication.