This invention relates to mapping an area to be mowed by a robot lawnmower.
Autonomous robots that perform household functions such as floor cleaning and lawn cutting are now readily available consumer products. Some robots are generally confined within (i) touched walls and other obstacles within the rooms of a dwelling; (ii) IR-detected staircases (cliffs) leading downward; and/or (iii) user-placed detectable barriers such as directed IR beams, physical barriers, or magnetic tape. Walls often provide much of the confinement perimeter. Other robots may try to map the dwelling using a complex system of sensors and/or active or passive beacons (e.g., sonar, RFID or bar code detection, or various kinds of machine vision).
Some autonomous robotic lawn mowers use a continuous boundary marker (e.g., a boundary wire) for confining random motion robotic mowers. The boundary wire is intended to confine the robot within the lawn or other appropriate area, so as to avoid damaging non-grassy areas of the yard or intruding onto a neighboring property. The boundary wire is typically a continuous electrically conductive loop around the property to be mowed. Although the boundary wire can be drawn into the property in peninsulas to surround gardens or other off-limit areas, it remains a continuous loop, and is energized with an AC current detectable as a magnetic field at a distance of a few feet. The boundary wire loop must be supplied with power, usually from a wall socket. Within the bounded area, a mowing robot may “bounce” randomly as the robot nears the guide conductor, or may follow along the guide conductor. Some mowers also touch and bounce from physical barriers.
In some implementations of this disclosure, a robot lawnmower system includes: a plurality of beacons positioned with respect to an area to be mowed; a robot lawnmower comprising: a detection system configured to detect the beacons; and a controller configured to, while traversing the area to be mowed, detect the beacons using the detection system and collect mapping data; one or more computer readable mediums storing instructions that, when executed by a system of one or more computing devices, cause the system to perform operations comprising: receiving the mapping data from the robot lawnmower; receiving at least first and second geographic coordinates for first and second reference points that are within the area and are specified in the mapping data; aligning the mapping data to a coordinate system of a map image of the area using the first and second geographic coordinates; and displaying the map image of the area based on aligning the mapping data to the coordinate system.
These and other implementations can optionally include the following features. The operations include receiving confirmation by a user of the area to be mowed. The operations include configuring the controller of the robot lawnmower to autonomously mow the area. The robot lawnmower comprises a global positioning system (GPS) receiver, and the controller is configured to move the robot lawnmower to the first and second reference points within the area and determine the first and second geographic coordinates for the first and second reference points using the GPS receiver at the first and second reference points. The system of one or more computing devices comprises a mobile device, the mobile device comprises a global positioning system (GPS) receiver, and receiving the first and second geographic coordinates for the first and second reference points comprises: displaying instructions to a user to move the mobile device to the first reference point; in response to receiving user input indicating that the mobile device is at the first reference point, determining the first geographic coordinates using the GPS receiver; displaying instructions to the user to move the mobile device to the second reference point; and in response to receiving user input indicating that the mobile device is at the second reference point, determining the second geographic coordinates using the GPS receiver.
The system includes a docking station for the robot lawnmower at the first or second reference point. The robot lawnmower comprises a first global positioning system (GPS) receiver; the docking station comprises a second GPS receiver; receiving the first geographic coordinates comprises receiving the first geographic coordinates from the robot lawnmower using the first GPS receiver; and receiving the second geographic coordinates comprises receiving the second geographic coordinates from the docking station using the second GPS receiver. The docking station comprises a first global positioning system (GPS) receiver; the system of one or more computing devices comprises a mobile device that comprises a second GPS receiver; receiving the first geographic coordinates comprises receiving the first geographic coordinates from the docking station using the first GPS receiver; and receiving the second geographic coordinates comprises receiving the second geographic coordinates from the mobile device using the second GPS receiver. The system of one or more computing devices comprises a mobile device; receiving the mapping data from the robot lawnmower comprises receiving the mapping data over a wired or wireless communications link between the robot lawnmower and the mobile device; and aligning the mapping data to a coordinate system of the map comprises supplying the mapping data to a mapping server system of one or more computers and receiving the map image from the mapping server system.
The mapping data includes a mowing path, and wherein displaying the map image comprises displaying the map image with an overlaid visual indicator of the mowing path. Displaying the map image comprises displaying beacon indicators of locations of the beacons within the area using the mapping data. Aligning the mapping data to the coordinate system of the map image comprises one or more of: shifting, rotating, and scaling the mapping data so that first and second locations on the map of the area match the first and second reference points. The controller is configured to cause the robot lawnmower to traverse the area starting from the first or second reference point. The operations can include: supplying the mapping data and the first and second geographic coordinates for the first and second reference points to a remote server system of one or more computers; receiving, from the remote server system, one or more suggested positions within the area for the beacons; and displaying, on the map, indicators for the suggested positions for the beacons.
The operations can include: for at least one beacon, determining first and second distances to first and second nearest neighbor beacons to the at least one beacon; determining that a difference between the first and second distances is greater than a threshold distance; determining a suggested beacon location for the at least one beacon at a mid-point between the first and second nearest neighbor beacons along a perimeter of the area to be mowed; and displaying, on the map, an indicator for the suggested beacon location of the at least one beacon. The operations can include: receiving tracking data from the robot lawnmower while the robot lawnmower is mowing the area; and displaying, on the map, a graphic overlay indicating progress of the robot lawnmower. The operations can include: projecting a path of the robot lawnmower to complete mowing the area; and displaying, on the map, a graphic overlay indicating the projected path of the robot lawnmower. The operations can include: displaying a user interface element indicating an estimated amount of time to completion. The detection system comprises an emitter/receiver configured to emit a signal, and wherein the beacons are configured to reflect an emitted signal from the detection system back onto the detection system.
The details of one or more embodiments of the invention are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the invention will be apparent from the description and drawings, and from the claims.
Like reference symbols in the various drawings indicate like elements.
In this example, surface treater 200 includes a reciprocating symmetrical grass cutter floating on a following wheel 410. In some examples, the wheel can be a continuous track or tank tread. In other examples, surface treater 200 may comprise a rotary cutter, a spreader, or a gatherer. A grass comber 510 may also be carried by the body 100. The robot body 100 supports a power source 106 (e.g., a battery) for powering any electrical components of the robot lawnmower 10, including the drive system 400.
A computing device, e.g., a wireless operator feedback unit 502, sends a signal to an emitter/receiver 151 on the robot lawnmower 10 that is in communication with a controller 150. The wireless operator feedback unit 502 can be a mobile device comprising a processor, memory, and a digital communications system. The drive system 400 is configured to follow the signal received from the operator feedback unit 502. The robot lawnmower 10 may be docked at a base station or dock 12. In some examples, the dock 12 includes a charging system for charging the power source 106 housed by the robot body 100. In some implementations, the robot 10 includes a magnetometer 315. The magnetometer can be useful, e.g., for rotationally aligning a map of the lawn 20.
To prepare for the use of the robot lawnmower 10, a perimeter 21 of the lawn 20 to be mowed is defined. In some implementations, as a safety measure autonomous use of the robot lawnmower 10 can only be executed once a perimeter has been determined and stored in non-transitory memory of the robot lawnmower 10. In some implementations, a human operator 500 manually defines a perimeter 21 by pushing the robot 10 using a handle 116 attached to the robot body 100, as shown in
Referring to
In some implementations, the push bar 116 includes one or more pressure or strain sensors, monitored by the robot lawnmower 10 to move or steer in a direction of pressure (e.g., two sensors monitoring left-right pressure or bar displacement to turn the robot lawnmower 10). In some other implementations, the push bar 116 includes a dead man or kill switch 117A in communication with the drive system 400 to turn off the robot lawnmower 10. The switch 117A may be configured as a dead man switch to turn off the robot lawnmower 10 when an operator of the push bar 116 ceases to use, or no longer maintains contact with, the push bar 116. The switch 117A may be configured to act as a kill switch when the push bar 116 is stowed, allowing the robot lawnmower 10 to operate in autonomous mode. The dead man or kill switch 117A may include a capacitive sensor or a lever bar.
In some implementations, the push bar 116 includes a clutch 117B to engage/disengage the drive system 400. The robot lawnmower 10 may be capable of operating at a faster speed while manually operated by the push bar 116. For example, the robot lawnmower 10 may operate at an autonomous speed of about 0.5 m/sec and a manual speed greater than 0.5 m/sec (including a “turbo” speed actuatable to 120-150% of normal speed). In some examples, the push bar 116 may be foldable or detachable during the robot's autonomous lawn mowing. Alternatively, the push bar 116 can be configured as one of a pull bar, pull leash, rigid handle, or foldable handle. In some embodiments, the push bar 116 can be stowed on or in the robot body 100.
As noted above, prior to autonomously mowing the lawn, the robot lawnmower 10 completes a teaching phase. During the perimeter teaching phase, the human operator 500 may pilot the robot lawnmower 10 in a manner that requires correction, thus putting the robot lawnmower 10 in an unteachable state. When the robot lawnmower 10 detects that it is in an unteachable state during a teach run, the robot lawnmower 10 alerts the operator (e.g., via operator feedback unit 502 such as a display on a mobile device or a display integrated in a handle 116) to change a direction or speed of the robot lawnmower 10 to enable the robot lawnmower 10 to continue to record the perimeter 21 and/or return to traveling on traversable terrain. For instance, the robot lawnmower 10 may enter the unteachable state when the operator pushes the robot lawnmower 10 into an area of the lawn 20 where the robot lawnmower 10 loses ability to determine its location, when the user is on a second teaching path that varies from a first teaching path, or when the user pushes the robot lawnmower 10 too fast or over terrain that is too bumpy or tilted.
For example, the human operator may try to push the robot lawnmower 10 between a divot and a rock, causing the robot lawnmower 10 to tilt at an excessive angle (e.g., over 30 degrees). Or the operator may attempt to teach the robot lawnmower 10 a path that goes through topography that the robot lawnmower 10 cannot traverse in the autonomous mode. In such cases, the robot lawnmower 10 alerts the operator (e.g., via the operator feedback unit 502) to select a different path. As previously described, the robot lawnmower 10 may alert the operator via the operator feedback unit 502 by any appropriate feedback mechanism, e.g., a visual signal on a display, an audible signal through a speaker, an olfactory feedback signal, and/or a tactile signal, such as a vibration from a vibrational unit of the operator feedback unit 502.
If the human operator is pushing the robot lawnmower 10 too fast or too slow during the teaching mode, thus placing the robot in the unteachable state, the robot lawnmower 10 prompts the user to either increase or decrease the speed of the robot lawnmower 10. In some examples, operator feedback unit 502 includes a speed indicator that will light or flash (green, yellow, red light) when the robot lawnmower 10 is going at a speed greater or lower than a threshold speed.
As will be discussed below in reference to
In some examples, the teaching routine requires the operator to traverse the perimeter 21 of the lawn 20 a second time (or more). Once the operator completes a first teaching run, completing a closed loop about the perimeter of the area to be mowed, the robot lawnmower 10 may alert the operator that a second run is needed. In some examples, the operator hits a STOP button to affirmatively indicate completion of a teaching run around the perimeter 21 of the lawn 20. In some implementations, the teaching routine determines a taught-path grade or score, e.g., on a scale, to aid the operator in understanding how close a previous traversal of the lawn 20 was to being acceptable.
In some examples, the robot lawnmower 10 allows the operator to either complete the second teaching run right after the first teaching run or wait until later. If the operator completes a second or subsequent teaching run and the robot lawnmower detects a variance between the two determined perimeters that is greater than a threshold variance, the robot lawnmower 10 alerts the user to the apparent discrepancy and prompts another teaching run to learn the perimeter 21 of the lawn 20.
When the perimeter-teaching process is complete, the user may dock the robot lawnmower 10 in its dock 12 (see
In scan matching, the robot lawnmower 10 can match scans taken at a given time while driving with scans stored in memory that are characteristic of each boundary marker 805, and the robot lawnmower 10 is thus able to determine its position relative to each of the individually identifiable boundary markers 805. In some implementations, the boundary markers 805 include other individual identification means perceptible to the robot lawnmower 10, such as a bar code or encoded signal to enable the robot lawnmower 10 to determine its relative position.
As shown in
The boundary markers 805 may include a home marker that an operator can place in a position indicating a global origin (e.g., dock 12 or two boundary markers placed side by side). The operator distributes the boundary markers 805 as evenly as possible along the perimeter 21 of the lawn 20.
In some examples, beacons can be placed in the environment, and the robot can use the beacons to localize its position. The beacons can communicate using, e.g., Wide-Band (WB) or Ultra-wide Band (UWB) technology, 2.4 GHz (802.11v) technology, or other types of radio-frequency time of flight technology. These beacons can be placed inside the mowable area (e.g., beacon 810b), on the boundary (e.g., beacon 810a), or outside the boundary (e.g., beacon 810c). These beacons 810 (
Respective WB or UWB transceivers are placed on the robot lawnmower 10 (e.g., the robot lawnmower 10 includes a receiver/emitter 151 communicating with each of the beacons 810a-c), each of the beacons 810a-c, and optionally the dock 12. Several beacons 810a-c are placed about a mowable area and are spaced apart from each other and from the dock 12. As shown by the solid lines emanating from the robot lawnmower 10 in
If WB or UWB signals from WB or UWB beacons 810a-c positioned about a yard are to be used to determine the robot lawnmower's location within the yard, the location of the WB or UWB beacons 810a-c can be established by the robot lawnmower. In general, upon initial setup of a WB or UWB system, an initialization process is performed. The process can be based, in part, on a multidimensional scaling algorithm used to determine the location of the WB or UWB beacons 810a-c relative to one another, which in turn can be used to establish the location of the robot 10 relative to the beacons.
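For illustration, the following Python sketch shows one way a classical multidimensional scaling step could recover relative beacon positions from pairwise UWB range measurements. It is a minimal example assuming a complete, noise-free distance matrix; the function and variable names are illustrative and not part of the disclosed system.

```python
import numpy as np

def locate_beacons_mds(distances: np.ndarray) -> np.ndarray:
    """Classical multidimensional scaling (MDS).

    distances: symmetric (n x n) matrix of pairwise UWB range
    measurements between beacons (and optionally the dock).
    Returns an (n x 2) array of relative 2-D beacon coordinates,
    unique only up to rotation, reflection, and translation.
    """
    n = distances.shape[0]
    # Double-center the squared distance matrix.
    d2 = distances ** 2
    j = np.eye(n) - np.ones((n, n)) / n
    b = -0.5 * j @ d2 @ j
    # Keep the two dominant eigenpairs for a planar layout.
    vals, vecs = np.linalg.eigh(b)
    order = np.argsort(vals)[::-1][:2]
    return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0.0))

# Example: three beacons at the corners of a 3-4-5 right triangle.
d = np.array([[0.0, 3.0, 5.0],
              [3.0, 0.0, 4.0],
              [5.0, 4.0, 0.0]])
print(locate_beacons_mds(d))
```

Because the recovered layout is only relative, a known anchor such as the dock or the geographic reference points described below is still needed to fix the map's absolute position and orientation.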
Thus, a human operator 500 is not required to place the WB or UWB beacons 810a-c at particular locations because the system automatically determines the locations of the WB or UWB beacons 810a-c upon initialization. In some implementations, the robot 10 determines the quantity of beacons deployed, e.g., by user input from the human operator 500. In those cases, the robot 10 can compare the quantity of beacons deployed with the number of beacons that are visible. If the number of beacons that are visible is fewer than the number of beacons deployed, the robot 10 can alert the human operator 500, and may suggest relocation, e.g., by identifying the beacons that are not visible by beacon identifier numbers.
This flexibility in positioning of the WB or UWB beacons 810a-c can provide the advantage of simplifying the installation and setup procedure for the autonomous lawn mowing robot system. Additionally, due to the omni-directional nature of the signal, the WB or UWB beacons 810a-c can be lower to the ground than in certain line-of-sight based systems because the robot 10 does not need to align (e.g., in a line-of-sight arrangement) with the beacon in order for a signal to be received from the beacon. The omni-directional nature of the signal also allows the beacons 810a-c to be placed off-plane and/or be rotated/tilted with respect to one another, the dock 12, and/or the robot 10.
In some examples, the beacons have a height of between about 12 inches and about 24 inches from the ground (e.g., between about 12 inches and about 24 inches; between about 16 inches and about 20 inches; about 18 inches). Upon subsequent use (e.g., prior to each time the robot lawnmower mows the lawn), a calibration or confirmation process can be performed to confirm that the WB or UWB beacons 810a-c are still in their expected, previously determined locations.
After collecting mapping data defining the perimeter 21, the human operator 500 may wish to confirm the location of the perimeter using a computing device 502, which can be a mobile device. The human operator 500 can launch a mapping application on the mobile device 502. The mobile device 502, in executing the mapping application, displays a map of the area to be mowed with the perimeter overlaid on the map for the human operator 500 to confirm.
To display the map, the mobile device 502 receives the mapping data from the robot 10. The mobile device 502 also receives first and second geographic coordinates for first and second reference points within the area. Then, the mobile device 502 aligns the mapping data to a coordinate system of a map image of the area using the first and second reference points. The mobile device 502 displays the map image of the area with an indicator of the perimeter overlaid on the map image based on the aligned mapping data. Example systems and methods for displaying the map are described further below with reference to
The robot lawnmower 10 includes a controller 150, a detection system 160 configured to detect beacons, and an optional location system 152, e.g., a Global Positioning System (GPS) receiver. The mobile device 502 includes a processor system 510 and an optional location system 512, e.g., a GPS receiver. The docking station 12, in some implementations, can be a charging station that does not communicate on the network 158, and in some other implementations, can have a controller 154 and/or a location system 156, e.g., a GPS receiver.
In operation, the robot lawnmower 10 traverses an area to be mowed. To train the robot lawnmower 10, a human operator can push the robot lawnmower 10 around a perimeter of the area to be mowed, as described above with reference to
After the robot lawnmower 10 collects the mapping data, the robot lawnmower 10 transmits the mapping data to the mobile device 502. For example, the robot lawnmower 10 can communicate with the mobile device 502 over a Bluetooth connection or over a local wireless network. The mobile device 502 can initiate the transmission by launching a mapping application executed by the processor system 510.
The mobile device 502 receives at least two reference geographic coordinates for reference points within the area. In some examples, the geographic coordinates include a latitude and a longitude. However, other geolocation data could be used. The reference points correspond to positions specified by the mapping data. The mobile device 502 can acquire the reference coordinates using any of various appropriate techniques. For purposes of illustration, consider the following three examples.
In a first example, suppose that the robot lawnmower 10 and the docking station 12 do not have or do not use location systems 152, 156. The mobile device 502 will use its location system 512 to acquire the reference coordinates. For example, the mobile device 502, in executing a mapping application, can instruct a human operator to move the mobile device 502 to the docking station 12, which can be a point specified in the mapping data by virtue of the robot lawnmower 10 starting at the docking station 12 while collecting the mapping data. When at the docking station 12, the human operator provides input to the mobile device 502 indicating that the mobile device 502 is at the docking station, and the mobile device 502 uses its location system 512 to obtain a first geographic reference coordinate.
Then, the mobile device 502 instructs the human operator to move the mobile device 502 to another point in the area that is specified in the mapping data. For example, the mobile device 502 can instruct the human operator to move the mobile device 502 to the closest beacon or to another beacon, or the mobile device 502 can instruct the human operator to walk a certain distance along the path taken while the robot lawnmower 10 was collecting the mapping data. When the human operator reaches the second point, the mobile device 502 uses its location system 512 to obtain the second geographic reference coordinate.
In a second example, suppose that the robot lawnmower 10 does have a location system 152. The robot lawnmower 10, while collecting the mapping data, can also obtain the reference coordinates. For example, when the robot lawnmower 10 starts collecting the mapping data at the location of the docking station 12, the robot lawnmower 10 uses the location system 152 to obtain the first geographic reference coordinates (e.g., latitude and longitude coordinates). Then, after the robot lawnmower 10 moves to another location in the area that is at least a certain distance from the docking station 12, the robot lawnmower 10 uses the location system 152 to obtain additional geographic reference coordinates. This process can be repeated to obtain any number of additional geographic reference coordinates. The robot lawnmower 10 sends the mapping data and the geographic reference coordinates to the mobile device 502. In some cases, e.g., where the location of the reference points is not already specified, the robot lawnmower 10 can send data specifying how the geographic reference coordinates correspond to the mapping data.
In a third example, suppose that the docking station 12 has and uses a location system 156. The docking station 12 can supply the first geographic reference coordinates (e.g., latitude and longitude coordinates), and robot lawnmower 10 or the mobile device 502 can obtain the second geographic reference coordinates at a point at least a certain distance away from the docking station 12 within the area. It may be useful to have a location system 156 in the docking station 12 instead of the robot lawnmower 10, e.g., to reduce the weight of the robot lawnmower 10.
The mobile device 502 uses the geographic reference coordinates and the mapping data to display a map image of the area to be mowed. In some implementations, the mobile device 502 obtains a map image from the mapping system 600, orients the mapping data to a coordinate system of the map image from the mapping system 600, and then displays the map image with a graphic overlay of the perimeter of the area to be mowed. In some other implementations, the mobile device 502 sends the mapping data and the geographic reference coordinates to the mapping system 600, and the mapping system 600 generates a map image with a graphic overlay of the perimeter of the area to be mowed. In one particular example, the first geographic reference coordinate is used to identify a common location between the map image and the boundary and the second geographic reference is used to rotationally align the map image with the mapping data.
In adjusting the mapping data, the perimeter path 450 is translated to the same coordinate frame as the map image 452. The difference between the first reference point location in the robot coordinate system 454 and the first reference point location in the image coordinate system 456 is calculated. All data within the robot coordinate system 454, including beacon locations 805 and the perimeter path 450, can be shifted by that difference, resulting in translated data. The first reference point in the image coordinate system can be used as the vertex to calculate the angle between the second reference point from the translated robot coordinate system 454 and the second reference point from the image coordinate system 456. This angle can be used to rotate all of the translated data into the image coordinate system.
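As a concrete sketch of the shift-then-rotate procedure just described, the following Python function (illustrative names, assuming planar (x, y) coordinates) translates the robot-frame data so the first reference points coincide, then rotates about that point by the angle between the second reference points.

```python
import math
import numpy as np

def align_to_image_frame(points_robot, ref1_robot, ref2_robot,
                         ref1_image, ref2_image):
    """Map points (n x 2, e.g., perimeter path and beacon locations)
    from the robot coordinate system into the map-image coordinate
    system using two shared reference points."""
    p = np.asarray(points_robot, dtype=float)
    r1r, r2r = np.asarray(ref1_robot, dtype=float), np.asarray(ref2_robot, dtype=float)
    r1i, r2i = np.asarray(ref1_image, dtype=float), np.asarray(ref2_image, dtype=float)
    # 1) Shift so the first reference point coincides in both frames.
    shift = r1i - r1r
    translated = p + shift
    # 2) Rotate about the first image reference point by the angle
    #    between the translated and image copies of the second point.
    v_robot = (r2r + shift) - r1i
    v_image = r2i - r1i
    angle = math.atan2(v_image[1], v_image[0]) - math.atan2(v_robot[1], v_robot[0])
    c, s = math.cos(angle), math.sin(angle)
    rot = np.array([[c, -s], [s, c]])
    return (translated - r1i) @ rot.T + r1i
```

Scaling is omitted here because the two frames are assumed to share metric units; a scale factor could be added in the same way if they do not.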
The mobile device 502 initializes communication with the robot lawnmower 10 (5002). For example, the mobile device 502 can establish a Bluetooth connection with the robot lawnmower 10, or the mobile device 502 can connect with the robot lawnmower 10 over a wireless network. In some implementations, the mobile device 502 prompts a user for information to correctly identify the robot lawnmower 10; in some other implementations, the mobile device 502 is configured to wirelessly probe for the robot lawnmower 10. In some implementations, the controller 150 of the robot lawnmower 10 authenticates to the mobile device 502.
If the robot lawnmower 10 has not yet traversed the lawn to collect mapping data, the mobile device 502 can prompt the user to cause the robot lawnmower 10 to collect the mapping data. For example, the mobile device 502 can display instructions for the user to interact with robot lawnmower 10, and the mobile device 502 can place a phone call to a customer service department if the user desires to get help from a person. In some implementations, the user pushes the robot lawnmower 10 about the yard, e.g., as described above with reference to
The mobile device 502 receives the mapping data from the robot lawnmower 10 (5004). The mapping data can define a perimeter of the area to be mowed and other data, e.g., locations of beacons placed about the area to be mowed. The mapping data can be stored in any appropriate data structure, e.g., as a list of coordinates with respect to a starting point, or as vector data.
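As one hypothetical illustration of such a data structure (the field names below are illustrative and not the robot's actual wire format), the mapping data might be held as a perimeter coordinate list plus a beacon-location table:

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class MappingData:
    # Perimeter as (x, y) coordinates relative to a starting point
    # (e.g., the dock), in the robot's local coordinate system.
    perimeter: List[Tuple[float, float]] = field(default_factory=list)
    # Beacon identifier -> (x, y) location in the same local frame.
    beacons: Dict[str, Tuple[float, float]] = field(default_factory=dict)
```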
The mobile device 502 receives geographic coordinates of reference points within the area (5006). For example, the mobile device 502 can receive the geographic coordinates of reference points as described above with reference to
The mobile device 502 receives a map image of the area to be mowed (5007). For example, the mobile device 502 can use one of the reference points (e.g., the latitude and longitude) to retrieve the map image from a database of map images. In another example, the mobile device can request that the user enter an address and use the address to retrieve the map image from the database of map images. The received map image can be an overhead photograph of the lawn and surrounding area, such as a satellite image.
The mobile device 502 aligns the mapping data to a coordinate system of a map image of the area to be mowed (5008). For example, the mobile device 502 can obtain the map image and data associated with the map image that specifies a correspondence between portions of the map image and geographic coordinates. The mobile device can then align the mapping data by shifting, rotating, and/or scaling the mapping data so that the first and second reference points match first and second locations on the map image as specified by the data specifying the correspondence between portions of the map image and geographic coordinates.
In the case where there is insufficient information (e.g., one of the reference points is missing or believed to be noisy or incorrect) the mobile device 502 and/or the mapping system can search for features in the map that correspond with the perimeter 450. Image classifiers can be used on the map image 452 to find commonly traced features, such as the corner of a house or building, the edge of a lawn, or any other distinct path that matches well with the perimeter 450. The mobile device 502 can also incorporate other sources of information with the same geodetic coordinate system as the map image 452, such as public boundary and property lines.
In implementations where the robot 10 includes a magnetometer 315, the robot 10 can include directional data with the mapping data. The mobile device 502 and/or the mapping system can then use the directional data, e.g., instead of one of the reference points or in addition to the two reference points to improve the alignment. For example, the robot 10 can include, with the mapping data, data indicating which direction is north. The mapping system can also then determine which direction, for the map image, faces north, and the mobile device 502 and/or the mapping system can align the mapping data and/or the map image so that the two directions match.
In some implementations, image registration techniques from computer vision are used to align the map data with the map image. The map is converted to a grid representation (if it is not already in that format), where each cell is marked as either inside or outside the mowable region, and each cell value can be represented by a pixel with a different intensity. For example, mowable cells can be marked by white pixels and non-mowable cells can be marked by black pixels. The map image is also processed by an algorithm that converts colors to intensities based on the probability that each color corresponds to a mowable region. For example, green pixels may be converted to white pixels (high likelihood of being mowable), yellow and brown pixels may be converted to gray pixels (medium likelihood of being mowable), and pixels of other colors may be converted to black pixels (low likelihood of being mowable).
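A minimal sketch of such a color-to-intensity conversion is shown below; the specific color rules and thresholds are assumptions for illustration, not values specified by this disclosure.

```python
import numpy as np

def map_image_to_gpi(rgb: np.ndarray) -> np.ndarray:
    """Convert an aerial map image (H x W x 3, uint8 RGB) into a
    grass-probability image: white = likely mowable, gray = possibly
    mowable, black = unlikely mowable."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    gpi = np.zeros(rgb.shape[:2], dtype=np.uint8)           # default: black
    greenish = (g > r + 20) & (g > b + 20)                   # lawn-like pixels
    yellow_brown = (r > b + 20) & (g > b + 10) & ~greenish   # dry grass / dirt
    gpi[yellow_brown] = 128                                  # gray
    gpi[greenish] = 255                                      # white
    return gpi
```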
In some implementations, a machine learning algorithm (e.g., a support vector machine or neural network) can be used to learn which colors of the map image are most likely to represent mowable areas based on example images. The machine learning algorithm can then be used with the map image to create two grass probability images (GPIs), one for the mowable area and one for the map image, which can then be aligned using standard image registration techniques.
These techniques can include intensity-based algorithms and feature-based algorithms. In both approaches, the georeferenced points measured using, e.g., GPS (on the robot 10, dock 12, or mobile device 502) can be used as an initial guess of the alignment between the map GPI and the map image GPI. In an intensity-based approach, the map GPI is then transformed (using translation and rotation) and correlated with the map image GPI, and the transformation with the highest correlation value is used to align the images. The search of transformation space can either be exhaustive in a limited region near the initial guess, or non-linear optimization techniques such as hill climbing, gradient descent, or simulated annealing can be used to reduce the time required for alignment.
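The following sketch illustrates an exhaustive intensity-based search of this kind near an initial guess, using unnormalized correlation as the score; the parameter names and search ranges are illustrative assumptions rather than values from this disclosure.

```python
import numpy as np
from scipy import ndimage

def register_intensity(map_gpi, image_gpi, init_shift=(0, 0), init_angle=0.0,
                       shift_range=10, angle_range=10.0, angle_step=1.0):
    """Search translations/rotations near an initial guess (e.g., from
    the GPS reference points) and return the (dy, dx, angle) transform
    with the highest correlation between the two same-shape GPIs."""
    best_transform, best_score = None, -np.inf
    map_f = map_gpi.astype(float)
    image_f = image_gpi.astype(float)
    for angle in np.arange(init_angle - angle_range,
                           init_angle + angle_range + 1e-9, angle_step):
        rotated = ndimage.rotate(map_f, angle, reshape=False, order=1)
        for dy in range(init_shift[0] - shift_range, init_shift[0] + shift_range + 1):
            for dx in range(init_shift[1] - shift_range, init_shift[1] + shift_range + 1):
                moved = ndimage.shift(rotated, (dy, dx), order=1)
                score = float(np.sum(moved * image_f))  # unnormalized correlation
                if score > best_score:
                    best_transform, best_score = (dy, dx, angle), score
    return best_transform
```

A non-linear optimizer (hill climbing, gradient descent, simulated annealing) could replace the inner loops when the search region is large.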
In a feature-based approach, local features are detected in the map GPI and the map image GPI. These features can include lines and corners, as well as scale-invariant features such as SIFT (Scale-Invariant Feature Transform), SURF (Speeded Up Robust Features), or HOG (Histogram of Oriented Gradients). Once the locations of these features are detected in both GPIs, the feature locations in the map GPI can be transformed and matched with those in the map image GPI using the same techniques as described above for intensity-based approaches. The best transformation is then used for the map alignment.
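For illustration, a feature-based alignment of the two GPIs might look like the following sketch, which uses OpenCV's ORB detector as a stand-in for the SIFT/SURF/HOG features mentioned above and fits a partial affine (translation, rotation, uniform scale) transform with RANSAC.

```python
import cv2
import numpy as np

def register_features(map_gpi: np.ndarray, image_gpi: np.ndarray):
    """Estimate a transform between two uint8 GPIs by detecting and
    matching local features. Returns a 2x3 matrix or None on failure."""
    orb = cv2.ORB_create(nfeatures=1000)
    kp1, des1 = orb.detectAndCompute(map_gpi, None)
    kp2, des2 = orb.detectAndCompute(image_gpi, None)
    if des1 is None or des2 is None:
        return None
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:50]
    if len(matches) < 3:
        return None
    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    # RANSAC rejects mismatched feature pairs while fitting the transform.
    matrix, _inliers = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)
    return matrix
```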
In some implementations, the mobile device 502 receives the map image from a mapping system 600 and aligns the mapping data to the map image. In some other implementations, the mobile device 502 supplies the mapping data and geographic coordinates for the reference points to the mapping system 600, and the mapping system aligns the mapping data and supplies a map image using the aligned mapping data. The mapping system can select the map image in any appropriate way, e.g., using the geographic coordinates for the reference points, or by using the user's street address.
The mobile device 502 displays the map image using the aligned mapping data (5010). For example, the mobile device 502 can display the map image with a graphic overlay illustrating a perimeter of the area to be mowed. In another example, the mobile device 502 can display the map image with the area to be mowed highlighted or shaded.
Once the mapping data has been aligned to the coordinate system of the map image, the map image and the mapping data can be used for various optional tasks (5012). For example, the mobile device 502 can prompt the user to confirm the area to be mowed. The mobile device 502 can present a display that shows the progress of the robot lawnmower 10 in real-time or near real-time while it is mowing the lawn, e.g., by having the robot lawnmower 10 communicate its position to the mobile device 502 while it mows the lawn. The mobile device 502 can present a display that shows the projected remaining path of the robot lawnmower 10 and/or the projected time remaining for the robot lawnmower 10 to complete mowing the lawn.
The mobile device 502 receives mapping data defining a perimeter around the area to be mowed (6002). The mobile device 502 receives tracking data specifying a portion of the area that the robot lawnmower 10 has already mowed (6004).
The mobile device 502 plots a projected path for the robot lawnmower 10 from the robot lawnmower's current location that could be followed by the robot lawnmower 10 to finish mowing the area inside of the perimeter that has not been mowed (6006). In some implementations, the mobile device 502 plots the projected path by simulating a navigational algorithm stored on the controller 150 of the robot lawnmower 10.
In some implementations, the controller generates a coverage path based on a Boustrophedon (e.g., cornrow) algorithm applied to a grid map representation of the lawn. In this grid map, cells are marked as being inside, outside, or on the boundary of the mowable area. In some cases, the robot may start at one corner of the lawn (as in the example in
The user could also select the orientation of these ranks (vertical, horizontal, or an arbitrary angle, as they prefer). The user can also block off an area of the lawn that they would like the robot lawnmower to not cover during its current mission or designate an area that should be more thoroughly mowed (by way of slower movement or multiple passes). In some implementations, the boundary of this “keep-out” zone is marked in the grid map as boundary cells, while the cells inside the keep-out zone are marked as non-mowable. Paths can have other configurable metrics that allow the user to select between different mow patterns or styles or can have pseudo-random attributes that allow for varying paths to be generated.
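A minimal sketch of a cornrow (Boustrophedon) traversal over such a grid map follows; it assumes a boolean mowable/non-mowable grid and ignores detours around interior obstacles, which a full coverage planner would handle.

```python
from typing import List, Tuple

def boustrophedon_path(mowable: List[List[bool]]) -> List[Tuple[int, int]]:
    """Generate a simple cornrow coverage path over a grid map.
    mowable[row][col] is True for cells inside the mowable area;
    keep-out cells are simply marked False. Rows are the ranks, and
    the direction alternates each rank so the path snakes across."""
    path: List[Tuple[int, int]] = []
    for row, cells in enumerate(mowable):
        cols = [c for c, ok in enumerate(cells) if ok]
        if not cols:
            continue
        if row % 2:               # reverse every other rank
            cols = cols[::-1]
        path.extend((row, c) for c in cols)
    return path

# Example: a 4 x 6 lawn with a 2 x 2 keep-out zone in the middle.
grid = [[True] * 6 for _ in range(4)]
for r in (1, 2):
    for c in (2, 3):
        grid[r][c] = False
print(boustrophedon_path(grid))
```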
The mobile device 502 generates a map image of the area to be mowed with a graphic overlay of the projected path (6008). The mobile device can align the mapping data and the tracking data to a coordinate system of a map image, e.g., as described further above with reference to
The mapping system 600 receives mapping data defining a perimeter of an area to be mowed (7002). The mapping data also specifies the locations of several navigation beacons. The mapping system 600 receives the mapping data from the robot lawnmower 10. For example, the robot lawnmower 10 can provide the mapping data to the mobile device 502, which can then provide the mapping data to the mapping system 600, e.g., at the request of a user.
The mapping system 600 checks each beacon location for a possible suggested location (7004). For example, the mapping system can determine, for each beacon, the distance to the two nearest beacons. If the beacon is closer to one of the nearest beacons than the other, the mapping system can determine a suggested location that is closer to a mid-point between the two beacons along the perimeter. In another example, the mapping system can use elevation data to determine that two neighboring beacons are at two different elevations with a height distance that exceeds a threshold. The mapping system can determine a suggested location that is at an elevation between the two different elevations.
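The nearest-neighbor spacing check described above might be sketched as follows; the elevation test is omitted, and a straight-line midpoint stands in for the midpoint along the taught perimeter, so the names and simplifications here are illustrative only.

```python
import math

def suggest_beacon_moves(beacons, threshold):
    """For each beacon, compare the distances to its two nearest
    neighbors; if the spacing is lopsided by more than `threshold`,
    suggest moving the beacon toward the midpoint of those neighbors.
    beacons: dict of beacon id -> (x, y); returns id -> suggested (x, y)."""
    suggestions = {}
    for bid, (x, y) in beacons.items():
        others = sorted(
            (math.hypot(x - ox, y - oy), (ox, oy))
            for obid, (ox, oy) in beacons.items() if obid != bid)
        if len(others) < 2:
            continue
        (d1, n1), (d2, n2) = others[0], others[1]
        if abs(d1 - d2) > threshold:
            suggestions[bid] = ((n1[0] + n2[0]) / 2, (n1[1] + n2[1]) / 2)
    return suggestions
```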
The robot lawnmower 10 can identify places where position estimates from the localization system are less confident. The mapping system can suggest placing a beacon near these places of lower confidence. Suggested locations can also be based on the beacon location map by analyzing long gaps in distance between beacons or beacons that are only able to communicate with a few other beacons because of occlusions.
The mapping system 600 provides any suggested beacon locations for display on a map image (7006). For example, the mapping system 600 can generate a map image, by aligning the mapping data to a map image, with graphic indicators of the current beacon locations and the suggested beacon locations. In another example, the mapping system 600 provides the suggested locations to a mobile device 502, and the mobile device 502 generates a map image showing the suggested beacon locations.
While this specification contains many specific details, these should not be construed as limitations on the scope of the disclosure or of what may be claimed, but rather as descriptions of features specific to particular implementations of the disclosure. Certain features that are described in this specification in the context of separate implementations can also be implemented in combination in a single implementation. Conversely, various features that are described in the context of a single implementation can also be implemented in multiple implementations separately or in any suitable sub-combination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a sub-combination or variation of a sub-combination.
Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multi-tasking and parallel processing may be advantageous. Moreover, the separation of various system components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.
Accordingly, other embodiments are within the scope of the following claims.
This application is a continuation application of and claims priority to U.S. application Ser. No. 15/229,674, filed on Aug. 5, 2016, which is a continuation of and claims priority to U.S. application Ser. No. 14/570,616, filed on Dec. 15, 2014, the entire contents of which are hereby incorporated by reference.
Number | Name | Date | Kind |
---|---|---|---|
2751030 | Null | Jun 1956 | A |
3128840 | Barrett | Apr 1964 | A |
3385041 | Douglas | May 1968 | A |
3457575 | Bienek | Jul 1969 | A |
3550714 | Bellinger | Dec 1970 | A |
3674316 | De Brey | Jul 1972 | A |
3924389 | Kita | Dec 1975 | A |
3937174 | Haaga | Feb 1976 | A |
3946543 | Templeton | Mar 1976 | A |
4119900 | Kremnitz | Oct 1978 | A |
4133404 | Griffin | Jan 1979 | A |
4163977 | Polstorff | Aug 1979 | A |
4306329 | Yokoi | Dec 1981 | A |
4318266 | Taube | Mar 1982 | A |
4369543 | Chen et al. | Jan 1983 | A |
4513469 | Godfrey et al. | Apr 1985 | A |
4545404 | Yoshimura et al. | Oct 1985 | A |
4545453 | Yoshimura et al. | Oct 1985 | A |
4556313 | Miller et al. | Dec 1985 | A |
4603753 | Yoshimura et al. | Aug 1986 | A |
4626995 | Lofgren et al. | Dec 1986 | A |
4674048 | Okumura | Jun 1987 | A |
4679152 | Perdue | Jul 1987 | A |
4696074 | Cavalli et al. | Sep 1987 | A |
4700301 | Dyke | Oct 1987 | A |
4700427 | Knepper | Oct 1987 | A |
4716621 | Zoni | Jan 1988 | A |
4733431 | Martin | Mar 1988 | A |
4756049 | Uehara | Jul 1988 | A |
4767237 | Cosman et al. | Aug 1988 | A |
4777416 | George, II et al. | Oct 1988 | A |
4782550 | Jacobs | Nov 1988 | A |
4811228 | Hyyppa et al. | Mar 1989 | A |
4854000 | Takimoto | Aug 1989 | A |
4887415 | Martin | Dec 1989 | A |
4893025 | Lee | Jan 1990 | A |
4909024 | Jones et al. | Mar 1990 | A |
4912643 | Beirne | Mar 1990 | A |
4918441 | Bohman | Apr 1990 | A |
4919224 | Shyu et al. | Apr 1990 | A |
4933864 | Evans et al. | Jun 1990 | A |
4962453 | Pong et al. | Oct 1990 | A |
4974283 | Holsten et al. | Dec 1990 | A |
5002145 | Waqkaumi et al. | Mar 1991 | A |
5017415 | Cosman et al. | May 1991 | A |
5086535 | Grossmeyer et al. | Feb 1992 | A |
5093955 | Blehert et al. | Mar 1992 | A |
5109566 | Kobayashi et al. | May 1992 | A |
5142985 | Stearns | Sep 1992 | A |
5163202 | Kawakami et al. | Nov 1992 | A |
5163273 | Wojtkowski et al. | Nov 1992 | A |
5165064 | Mattaboni | Nov 1992 | A |
5204814 | Noonan et al. | Apr 1993 | A |
5208521 | Aoyama | May 1993 | A |
5216777 | Moro et al. | Jun 1993 | A |
5239720 | Wood et al. | Aug 1993 | A |
5261139 | Lewis | Nov 1993 | A |
5279672 | Betker et al. | Jan 1994 | A |
5284522 | Kobayashi et al. | Feb 1994 | A |
5293955 | Lee | Mar 1994 | A |
5303448 | Hennessey et al. | Apr 1994 | A |
5319828 | Waldhauser et al. | Jun 1994 | A |
5321614 | Ashworth | Jun 1994 | A |
5324948 | Dudar et al. | Jun 1994 | A |
5341540 | Soupert et al. | Aug 1994 | A |
5353224 | Lee et al. | Oct 1994 | A |
5369347 | Yoo | Nov 1994 | A |
5410479 | Coker | Apr 1995 | A |
5438721 | Pahno et al. | Aug 1995 | A |
5440216 | Kim | Aug 1995 | A |
5444965 | Colens | Aug 1995 | A |
5446356 | Kim | Aug 1995 | A |
5454129 | Kell | Oct 1995 | A |
5455982 | Armstrong et al. | Oct 1995 | A |
5465525 | Mifune et al. | Nov 1995 | A |
5467273 | Faibish et al. | Nov 1995 | A |
5483346 | Butzer | Jan 1996 | A |
5497529 | Boesi | Mar 1996 | A |
5507067 | Hoekstra et al. | Apr 1996 | A |
5515572 | Hoekstra et al. | May 1996 | A |
5528888 | Miyamoto et al. | Jun 1996 | A |
5534762 | Kim | Jul 1996 | A |
5537017 | Feiten et al. | Jul 1996 | A |
5539953 | Kurz | Jul 1996 | A |
5542146 | Hoekstra et al. | Aug 1996 | A |
5548511 | Bancroft | Aug 1996 | A |
5553349 | Kilstrom et al. | Sep 1996 | A |
5555587 | Guha | Sep 1996 | A |
5560077 | Crotchett | Oct 1996 | A |
5568589 | Hwang | Oct 1996 | A |
5611106 | Wulff | Mar 1997 | A |
5611108 | Knowlton et al. | Mar 1997 | A |
5613261 | Kawakami et al. | Mar 1997 | A |
5621291 | Lee | Apr 1997 | A |
5622236 | Azumi et al. | Apr 1997 | A |
5634237 | Paranjpe | Jun 1997 | A |
5634239 | Tuvin et al. | Jun 1997 | A |
5650702 | Azumi | Jul 1997 | A |
5652489 | Kawakami | Jul 1997 | A |
5682313 | Edlund et al. | Oct 1997 | A |
5682839 | Grimsley et al. | Nov 1997 | A |
5709007 | Chiang | Jan 1998 | A |
5761762 | Kubo et al. | Jun 1998 | A |
5781960 | Kilstrom et al. | Jul 1998 | A |
5787545 | Colens | Aug 1998 | A |
5794297 | Muta | Aug 1998 | A |
5812267 | Everett, Jr. et al. | Sep 1998 | A |
5819008 | Asama et al. | Oct 1998 | A |
5825981 | Matsuda | Oct 1998 | A |
5839156 | Park et al. | Nov 1998 | A |
5841259 | Kim et al. | Nov 1998 | A |
5867800 | Leif | Feb 1999 | A |
5916111 | Colens | Jun 1999 | A |
5926909 | McGee | Jul 1999 | A |
5935179 | Kleiner et al. | Aug 1999 | A |
5940927 | Haegermarck et al. | Aug 1999 | A |
5940930 | Oh et al. | Aug 1999 | A |
5942869 | Katou et al. | Aug 1999 | A |
5943730 | Boomgaarden | Aug 1999 | A |
5943733 | Tagliaferri | Aug 1999 | A |
5959423 | Nakanishi et al. | Sep 1999 | A |
5974348 | Rocks | Oct 1999 | A |
6009358 | Angott et al. | Dec 1999 | A |
6041471 | Charkey et al. | Mar 2000 | A |
6049745 | Douglas et al. | Apr 2000 | A |
6073427 | Nichols | Jun 2000 | A |
6076025 | Ueno et al. | Jun 2000 | A |
6076227 | Schalig et al. | Jun 2000 | A |
6108076 | Hanseder | Aug 2000 | A |
6112143 | Allen et al. | Aug 2000 | A |
6124694 | Bancroft et al. | Sep 2000 | A |
6133730 | Winn | Oct 2000 | A |
6140146 | Brady et al. | Oct 2000 | A |
6140231 | Brady et al. | Oct 2000 | A |
6166706 | Gallagher et al. | Dec 2000 | A |
6226830 | Hendriks et al. | May 2001 | B1 |
6240342 | Fiegert et al. | May 2001 | B1 |
6255793 | Peless | Jul 2001 | B1 |
6259979 | Holmquist | Jul 2001 | B1 |
6285930 | Dickson et al. | Sep 2001 | B1 |
6300737 | Bergvall et al. | Oct 2001 | B1 |
D451931 | Abramson et al. | Dec 2001 | S |
6339735 | Peless et al. | Jan 2002 | B1 |
6374155 | Wallach et al. | Apr 2002 | B1 |
6385515 | Dickson et al. | May 2002 | B1 |
6408226 | Byrne et al. | Jun 2002 | B1 |
6417641 | Peless et al. | Jul 2002 | B2 |
6438456 | Feddema et al. | Aug 2002 | B1 |
6442476 | Poropat | Aug 2002 | B1 |
6443509 | Levin et al. | Sep 2002 | B1 |
6444003 | Sutcliffe | Sep 2002 | B1 |
6463368 | Feiten et al. | Oct 2002 | B1 |
6465982 | Bergvall et al. | Oct 2002 | B1 |
6493613 | Peless et al. | Dec 2002 | B2 |
6496754 | Song et al. | Dec 2002 | B2 |
6496755 | Wallach et al. | Dec 2002 | B2 |
6507773 | Parker et al. | Jan 2003 | B2 |
6525509 | Petersson et al. | Feb 2003 | B1 |
6532404 | Colens | Mar 2003 | B2 |
6535793 | Allard | Mar 2003 | B2 |
6548982 | Papanikolopoulos et al. | Apr 2003 | B1 |
6571415 | Gerber et al. | Jun 2003 | B2 |
6574536 | Kawagoe et al. | Jun 2003 | B1 |
6580246 | Jacobs | Jun 2003 | B2 |
6580978 | McTamaney | Jun 2003 | B1 |
6584376 | Van Kommer | Jun 2003 | B1 |
6586908 | Petersson et al. | Jul 2003 | B2 |
6594844 | Jones | Jul 2003 | B2 |
6604022 | Parker | Aug 2003 | B2 |
6605156 | Clark et al. | Aug 2003 | B1 |
6611120 | Song et al. | Aug 2003 | B2 |
6611734 | Parker et al. | Aug 2003 | B2 |
6611738 | Ruffner | Aug 2003 | B2 |
6615108 | Peless et al. | Sep 2003 | B1 |
6658693 | Reed, Jr. | Dec 2003 | B1 |
6661239 | Ozick | Dec 2003 | B1 |
6671592 | Bisset et al. | Dec 2003 | B1 |
6690134 | Jones et al. | Feb 2004 | B1 |
6741054 | Koselka et al. | May 2004 | B2 |
6748297 | Song et al. | Jun 2004 | B2 |
6764373 | Osawa et al. | Jul 2004 | B1 |
6781338 | Jones et al. | Aug 2004 | B2 |
6809490 | Jones et al. | Oct 2004 | B2 |
6830120 | Yashima et al. | Dec 2004 | B1 |
6841963 | Song et al. | Jan 2005 | B2 |
6845297 | Allard | Jan 2005 | B2 |
6850024 | Peless et al. | Feb 2005 | B2 |
6870792 | Chiappetta | Mar 2005 | B2 |
6883201 | Jones et al. | Apr 2005 | B2 |
6885912 | Peless et al. | Apr 2005 | B2 |
6901624 | Mori et al. | Jun 2005 | B2 |
D510066 | Hickey et al. | Sep 2005 | S |
6938298 | Aasen | Sep 2005 | B2 |
6940291 | Ozick | Sep 2005 | B1 |
6956348 | Landry et al. | Oct 2005 | B2 |
6971140 | Kim | Dec 2005 | B2 |
6984952 | Peless et al. | Jan 2006 | B2 |
6999850 | McDonald | Feb 2006 | B2 |
7024278 | Chiappetta et al. | Apr 2006 | B2 |
7069124 | Whittaker et al. | Jun 2006 | B1 |
7076348 | Bucher et al. | Jul 2006 | B2 |
7085624 | Aldred et al. | Aug 2006 | B2 |
7155309 | Peless et al. | Dec 2006 | B2 |
7203576 | Wilson et al. | Apr 2007 | B1 |
7206677 | Hulden | Apr 2007 | B2 |
D559867 | Abramson | Jan 2008 | S |
7349759 | Peless et al. | Mar 2008 | B2 |
D573610 | Abramson | Jul 2008 | S |
7441392 | Lilliestielke et al. | Oct 2008 | B2 |
7481036 | Lilliestielke et al. | Jan 2009 | B2 |
7525287 | Miyashita et al. | Apr 2009 | B2 |
7729801 | Abramson | Jun 2010 | B2 |
8046103 | Abramson et al. | Oct 2011 | B2 |
8069639 | Fancher, III | Dec 2011 | B2 |
D652431 | Naslund | Jan 2012 | S |
D656163 | Johansson et al. | Mar 2012 | S |
8136333 | Levin et al. | Mar 2012 | B1 |
8306659 | Abramson et al. | Nov 2012 | B2 |
8413616 | Bergquist | Apr 2013 | B2 |
8515578 | Chiappetta | Aug 2013 | B2 |
8532822 | Abramson et al. | Sep 2013 | B2 |
8634960 | Sandin et al. | Jan 2014 | B2 |
8635841 | Fiser et al. | Jan 2014 | B2 |
8781627 | Sandin et al. | Jul 2014 | B2 |
8868237 | Sandin et al. | Oct 2014 | B2 |
8924144 | Forstall | Dec 2014 | B2 |
8954193 | Sandin et al. | Feb 2015 | B2 |
8996171 | Anderson | Mar 2015 | B2 |
9002535 | Powers et al. | Apr 2015 | B2 |
9043952 | Sandin et al. | Jun 2015 | B2 |
9043953 | Sandin et al. | Jun 2015 | B2 |
9420741 | Balutis | Aug 2016 | B2 |
9788481 | Das et al. | Oct 2017 | B2 |
10274954 | Balutis et al. | Apr 2019 | B2 |
20010022506 | Peless et al. | Sep 2001 | A1 |
20010047231 | Peless et al. | Nov 2001 | A1 |
20020011813 | Koselka et al. | Jan 2002 | A1 |
20020016649 | Jones | Feb 2002 | A1 |
20020120364 | Colens | Aug 2002 | A1 |
20020140393 | Peless et al. | Oct 2002 | A1 |
20020156556 | Ruffner | Oct 2002 | A1 |
20020160845 | Simonsen | Oct 2002 | A1 |
20020173877 | Zweig | Nov 2002 | A1 |
20030019071 | Field et al. | Jan 2003 | A1 |
20030023356 | Keable | Jan 2003 | A1 |
20030025472 | Jones et al. | Feb 2003 | A1 |
20030055337 | Lin | Mar 2003 | A1 |
20030060928 | Abramson et al. | Mar 2003 | A1 |
20030120389 | Abramson et al. | Jun 2003 | A1 |
20030137268 | Papanikolopoulos et al. | Jul 2003 | A1 |
20030144774 | Trissei et al. | Jul 2003 | A1 |
20030182914 | Shibata et al. | Oct 2003 | A1 |
20030192144 | Song et al. | Oct 2003 | A1 |
20030208304 | Peless et al. | Nov 2003 | A1 |
20030216834 | Allard | Nov 2003 | A1 |
20030233177 | Johnson et al. | Dec 2003 | A1 |
20030234325 | Marino et al. | Dec 2003 | A1 |
20040020000 | Jones | Feb 2004 | A1 |
20040030448 | Solomon | Feb 2004 | A1 |
20040030449 | Solomon | Feb 2004 | A1 |
20040030450 | Solomon | Feb 2004 | A1 |
20040030571 | Solomon | Feb 2004 | A1 |
20040031113 | Wosewick et al. | Feb 2004 | A1 |
20040036618 | Ku et al. | Feb 2004 | A1 |
20040049877 | Jones et al. | Mar 2004 | A1 |
20040068351 | Solomon | Apr 2004 | A1 |
20040068415 | Solomon | Apr 2004 | A1 |
20040068416 | Solomon | Apr 2004 | A1 |
20040076324 | Burl et al. | Apr 2004 | A1 |
20040088079 | Lavarec et al. | May 2004 | A1 |
20040111184 | Chiappetta et al. | Jun 2004 | A1 |
20040111196 | Dean | Jun 2004 | A1 |
20040134336 | Solomon | Jul 2004 | A1 |
20040134337 | Solomon | Jul 2004 | A1 |
20040156541 | Jeon et al. | Aug 2004 | A1 |
20040158357 | Lee et al. | Aug 2004 | A1 |
20040187457 | Colens | Sep 2004 | A1 |
20040200505 | Taylor et al. | Oct 2004 | A1 |
20040204792 | Taylor et al. | Oct 2004 | A1 |
20040211444 | Taylor et al. | Oct 2004 | A1 |
20040220000 | Falone et al. | Nov 2004 | A1 |
20040236468 | Taylor et al. | Nov 2004 | A1 |
20040244138 | Taylor et al. | Dec 2004 | A1 |
20050000543 | Taylor et al. | Jan 2005 | A1 |
20050007057 | Peless et al. | Jan 2005 | A1 |
20050010331 | Taylor et al. | Jan 2005 | A1 |
20050020374 | Wang | Jan 2005 | A1 |
20050097952 | Steph | May 2005 | A1 |
20050108999 | Bucher | May 2005 | A1 |
20050113990 | Peless et al. | May 2005 | A1 |
20050156562 | Cohen et al. | Jul 2005 | A1 |
20050204717 | Colens | Sep 2005 | A1 |
20050251292 | Casey et al. | Nov 2005 | A1 |
20050278094 | Swinbanks et al. | Dec 2005 | A1 |
20050287038 | Dubrovsky et al. | Dec 2005 | A1 |
20060293794 | Harwig et al. | Dec 2006 | A1 |
20070016328 | Ziegler et al. | Jan 2007 | A1 |
20070142964 | Abramson | Jun 2007 | A1 |
20070150109 | Peless et al. | Jun 2007 | A1 |
20070188318 | Cole et al. | Aug 2007 | A1 |
20080039974 | Sandin et al. | Feb 2008 | A1 |
20080097645 | Abramson et al. | Apr 2008 | A1 |
20080167753 | Peless et al. | Jul 2008 | A1 |
20080183349 | Abramson et al. | Jul 2008 | A1 |
20090254218 | Sandin et al. | Oct 2009 | A1 |
20100059000 | Bergquist | Mar 2010 | A1 |
20100102525 | Fancher | Apr 2010 | A1 |
20110130875 | Abramson | Jun 2011 | A1 |
20110190931 | Anderson et al. | Aug 2011 | A1 |
20110234153 | Abramson | Sep 2011 | A1 |
20120041594 | Abramson et al. | Feb 2012 | A1 |
20120095619 | Pack et al. | Apr 2012 | A1 |
20120226381 | Abramson et al. | Sep 2012 | A1 |
20120265391 | Letsky | Oct 2012 | A1 |
20120290165 | Ouyang | Nov 2012 | A1 |
20130006419 | Bergstrom et al. | Jan 2013 | A1 |
20130024025 | Hung | Jan 2013 | A1 |
20130030609 | Jagenstedt | Jan 2013 | A1 |
20130066484 | Markusson et al. | Mar 2013 | A1 |
20130076304 | Andersson et al. | Mar 2013 | A1 |
20130110322 | Jagenstedt et al. | May 2013 | A1 |
20130152538 | Fiser et al. | Jun 2013 | A1 |
20130184924 | Jagenstedt et al. | Jul 2013 | A1 |
20130249179 | Burns | Sep 2013 | A1 |
20130274920 | Abramson et al. | Oct 2013 | A1 |
20140058611 | Borinato | Feb 2014 | A1 |
20140102061 | Sandin et al. | Apr 2014 | A1 |
20140102062 | Sandin et al. | Apr 2014 | A1 |
20140117892 | Coates | May 2014 | A1 |
20140277900 | Anker | Sep 2014 | A1 |
20150006015 | Sandin et al. | Jan 2015 | A1 |
20150234385 | Sandin et al. | Aug 2015 | A1 |
20150253757 | Ikeda | Sep 2015 | A1 |
20160057925 | Letsky | Mar 2016 | A1 |
Number | Date | Country |
---|---|---|
101251592 | Aug 2008 | CN |
102890507 | Jan 2013 | CN |
19932552 | Feb 2000 | DE |
0774702 | May 1997 | EP |
0792726 | Sep 1997 | EP |
1331537 | Jul 2003 | EP |
1704766 | Sep 2006 | EP |
2828589 | Aug 2001 | FR |
2142447 | Jan 1985 | GB |
2283838 | May 1995 | GB |
2382157 | May 2003 | GB |
62120510 | Jun 1987 | JP |
62154008 | Jul 1987 | JP |
63183032 | Jul 1988 | JP |
63241610 | Oct 1988 | JP |
26312 | Jan 1990 | JP |
03051023 | Mar 1991 | JP |
04320612 | Nov 1992 | JP |
06327598 | Nov 1994 | JP |
07129239 | May 1995 | JP |
07295636 | Nov 1995 | JP |
0816776 | Jan 1996 | JP |
08089451 | Apr 1996 | JP |
08152916 | Jun 1996 | JP |
09179625 | Jul 1997 | JP |
9185410 | Jul 1997 | JP |
11508810 | Aug 1999 | JP |
11510935 | Sep 1999 | JP |
2001258807 | Sep 2001 | JP |
2001275908 | Oct 2001 | JP |
2001525567 | Dec 2001 | JP |
2002078650 | Mar 2002 | JP |
2002204768 | Jul 2002 | JP |
3356170 | Oct 2002 | JP |
2002532178 | Oct 2002 | JP |
3375843 | Nov 2002 | JP |
2002323925 | Nov 2002 | JP |
2002355206 | Dec 2002 | JP |
2002360471 | Dec 2002 | JP |
2002360482 | Dec 2002 | JP |
2003005296 | Jan 2003 | JP |
2003010076 | Jan 2003 | JP |
200305296 | Feb 2003 | JP |
2003036116 | Feb 2003 | JP |
2003505127 | Feb 2003 | JP |
2003061882 | Mar 2003 | JP |
2003310489 | Nov 2003 | JP |
2003038401 | Feb 2003 | JP |
2003038402 | Feb 2003 | JP |
199502220 | Jan 1995 | WO |
199526512 | Oct 1995 | WO |
199740734 | Nov 1997 | WO |
199741451 | Nov 1997 | WO |
199853456 | Nov 1998 | WO |
199916078 | Apr 1999 | WO |
199928800 | Jun 1999 | WO |
199938056 | Jul 1999 | WO |
199938237 | Jul 1999 | WO |
199959042 | Nov 1999 | WO |
200004430 | Jan 2000 | WO |
200036962 | Jun 2000 | WO |
200038026 | Jun 2000 | WO |
200038029 | Jun 2000 | WO |
200078410 | Dec 2000 | WO |
200106904 | Feb 2001 | WO |
200106905 | Feb 2001 | WO |
200239864 | May 2002 | WO |
200239868 | May 2002 | WO |
2002058527 | Aug 2002 | WO |
2002062194 | Aug 2002 | WO |
2002067744 | Sep 2002 | WO |
2002067745 | Sep 2002 | WO |
2002074150 | Sep 2002 | WO |
2002075356 | Sep 2002 | WO |
2002075469 | Sep 2002 | WO |
2002075470 | Sep 2002 | WO |
2002101477 | Dec 2002 | WO |
2003026474 | Apr 2003 | WO |
2003040845 | May 2003 | WO |
2003040846 | May 2003 | WO |
200365140 | Aug 2003 | WO |
2004004533 | Jan 2004 | WO |
2004006034 | Jan 2004 | WO |
2004058028 | Jan 2004 | WO |
2005077244 | Jan 2004 | WO |
2006068403 | Jan 2004 | WO |
2005055795 | Jun 2005 | WO |
2010077198 | Jul 2010 | WO |
Entry |
---|
Kimura et al., “Stuck Evasion Control for Active Wheel Passive-Joint Snake-like Mobile Robot ‘Genbu’,” Proceedings of the 2004 IEEE International Conference on Robotics & Automation, New Orleans, LA, Apr. 2004. |
Kozlowski and Pazderski. Modeling and Control of a 4-wheel Skid-steering Mobile Robot. International J. of Applied Mathematics and Computer Science, 14:477-496, 2004. |
Angle et al., U.S. Appl. No. 60/177,703, 16 pages, published Feb. 7, 2002, available at http://portal.uspto.gov/external/portal/pair, accessed Jul. 11, 2012. |
Bohn et al. “Super-distributed RFID Tag Infrastructures”, Lecture Notes in Computer Science, Springer Verlag, Berlin, DE, vol. 3295, pp. 1-12, Nov. 11, 2004. |
Campbell et al., U.S. Appl. No. 60/741,442, 113 pages, published Jun. 7, 2007, available at http://patentscope.wipo.int/search/docservicepdf_pct/id00000005206306.pdf, accessed Jul. 11, 2012. |
Caracciolo et al. (1999): Trajectory Tracking Control of a Four-wheel Differentially Driven Mobile Robot. IEEE Int. Conf. Robotics and Automation, Detroit, MI, pp. 2632-2638. |
Casey et al., U.S. Appl. No. 60/582,992, 24 pages, published Nov. 10, 2005, available at http://portal.uspto.gov/external/portal/pair, accessed Jul. 11, 2012. |
Communication from a foreign patent office in counterpart application PCT/US2007/064326, dated Jul. 17, 2008. |
Domnitcheva, “Smart Vacuum Cleaner: An Autonomous Location-Aware Cleaning Device,” Proceedings of the International Conference on Ubiquitous Computing, pp. 1-2, Sep. 10, 2004. |
Doty et al., “Sweep Strategies for a Sensory-Driven, Behavior-Based Vacuum Cleaning Agent” AAAI 1993 Fall Symposium Series Instantiating Real-World Agents Research Triangle Park, Raleigh, NC, Oct. 22-24, 1993, pp. 1-6. |
Electrolux designed for the well-lived home, website: http://www.electroluxusa.com/node57.asp?currentURL=node142.asp%3F, accessed Mar. 18, 2005. |
eVac Robotic Vacuum S1727 Instruction Manual, Sharper Image Corp, Copyright 2004. |
Everyday Robots, website: http://www.everydayrobots.com/index.php?option=content&task=view&id=9, accessed Apr. 20, 2005. |
Facts on the Trilobite webpage: “http://trilobiteelectroluxse/presskit_en/node11335asp?print=yes&pressID=” accessed Dec. 12, 2003. |
Friendly Robotics Robotic Vacuum RV400-The Robot Store website: http://www.therobotstore.com/s.nl/sc.9/category,-109/it.A/id.43/.f, accessed Apr. 20, 2005. |
Gat, Erann, Robust Low-computation Sensor-driven Control for Task-Directed Navigation, Proceedings of the 1991 IEEE, International Conference on Robotics and Automation, Sacramento, California, Apr. 1991, pp. 2484-2489. |
Hicks et al., “A Survey of Robot Lawn Mowers”, http://www.robotics.uc.edu/papers/paper2000/lawnmower.pdf (8 pages). |
HITACHI: News release: The home cleaning robot of the autonomous movement type (experimental machine) is developed, website: http://www.i4u.com/japanreleases/hitachirobot.htm., accessed Mar. 18, 2005. |
International Preliminary Report on Patentability dated Sep. 23, 2008 from International Application No. PCT/US2007/064323. |
International Preliminary Report on Patentability dated Sep. 23, 2008 from International Application No. PCT/US2007/064326. |
International Preliminary Report on Patentability in International Application No. PCT/US2015/050477, dated Jun. 20, 2017, 8 pages. |
International Search Report and Written Opinion in International Application No. PCT/US2015/050477, dated Dec. 1, 2015, 12 pages. |
Kahney, “Robot Vacs are in the House,” Retrieved from the Internet: URL<www.wired.com/news/technology/0.1282.59237.00.html> 5 pages, Jun. 2003. |
Karcher Product Manual Download webpage: “http://wwwkarchercom/bta/downloadenshtml?ACTION=SELECTTEILENR&ID=rc3000&submitButtonName=Select+Product+Manual” and associated pdf file “5959-915enpdf (47 MB) English/English” accessed Jan. 21, 2004. |
Karcher RC 3000 Cleaning Robot—user manual Manufacturer: Alfred-Karcher GmbH & Co, Cleaning Systems, Alfred Karcher-Str 28-40, PO Box 160, D-71349 Winnenden, Germany, Dec. 2002. |
Karcher RoboCleaner RC 3000 Product Details webpages: “http://wwwrobocleanerde/english/screen3html” through “. . . screen6html” accessed Dec. 12, 2003. |
Karcher USA, RC3000 Robotic Cleaner, website: http://www.karcher-usa.com/showproducts.php?op=view prod&param1=143&param2=&param3=, accessed Mar. 18, 2005. |
Koolvac Robotic Vacuum Cleaner Owner's Manual, Koolatron, Undated. |
Kubitz et al. “Application of radio frequency identification devices to support navigation of autonomous mobile robots,” Vehicular Technology Conference, vol. 1, pp. 126-130, May 4, 1997. |
Matthies et al., “Detecting Water Hazards for Autonomous Off-Road Navigation,” Proceedings of SPIE Conference 5083: Unmanned Ground Vehicle Technology V, Orlando, FL, Apr. 2003, pp. 231-242. |
Morland, “Autonomous Lawnmower Control”, Downloaded from the internet at: http://cns.bu.edu/˜cimorlan/robotics/lawnmower/report.pdf, 10 pages, Jul. 2002. |
NorthStar Low-Cost, Indoor Localization, Evolution Robotics, Powering Intelligent Products. |
On Robo, “Robot Reviews Samsung Robot Vacuum (VC-RP30W),” Retrieved from the Internet: URL <www.onrobo.com/reviews/AT Home/vacuumcleaners/on00vcrb30rosam/index.htm>. 2 pages, 2005. |
Partial International Search Report from counterpart application PCT/US2007/064323 dated Mar. 14, 2008. |
Put Your Roomba . . . On “Automatic” Roomba Timer> Timed Cleaning-Floorvac Robotic Vacuum webpages: http://cgi.ebay.com/ws/eBayISAPI.dll?ViewItem&category=43575198387&rd=1, accessed Apr. 20, 2005. |
RoboMaid Sweeps Your Floors So You Won't Have To, the Official Site, website: http://www.thereobomaid.com/, accessed Mar. 18, 2005. |
Robotic Vacuum Cleaner-Blue, website: http://www.sharperimage.com/US/en/catalog/productview.jhtml?sku=S1727BLU, accessed Mar. 18, 2005. |
Schofield, Monica, “Neither Master nor Slave”: A Practical Study in the Development and Employment of Cleaning Robots, Proceedings of the 7th IEEE International Conference on Emerging Technologies and Factory Automation (ETFA '99), Barcelona, Spain, Oct. 18-21, 1999, pp. 1427-1434. |
Thrun, Learning Occupancy Grid Maps With Forward Sensor Models, School of Computer Science, Carnegie Mellon University, pp. 1-28. |
Wigley, M., “The Electric Lawn”, in The American Lawn, Princeton Architectural Press, New York, with Canadian Centre for Architecture, Montreal, pp. 155-195 (1999). |
Wired News: Robot Vacs Are in the House, website: http://www.wired.com/news/print/0,1294,59237,00.html, accessed Mar. 18, 2005. |
Zoombot Remote Controlled Vacuum-RV-500 NEW Roomba 2, website: http://cgi.ebay.com/ws/eBayISAPI.dll?ViewItem&category=43526&item=4373497618&rd=1, accessed Apr. 20, 2005. |
United States Office Action issued in U.S. Appl. No. 11/688,213, dated Nov. 2, 2011, 12 pages. |
United States Office Action issued in U.S. Appl. No. 11/688,213, dated Aug. 16, 2012, 14 pages. |
United States Office Action issued in U.S. Appl. No. 11/688,213, dated May 22, 2013, 12 pages. |
United States Office Action issued in U.S. Appl. No. 11/688,213, dated Oct. 3, 2013, 12 pages. |
United States Notice of Allowance issued in U.S. Appl. No. 11/688,213, dated Jun. 18, 2014, 5 pages. |
United States Notice of Allowance issued in U.S. Appl. No. 12/488,094, dated Apr. 12, 2012, 11 pages. |
United States Notice of Allowance issued in U.S. Appl. No. 12/488,094, dated Aug. 24, 2012, 13 pages. |
United States Notice of Allowance issued in U.S. Appl. No. 12/488,094, dated Feb. 14, 2013, 10 pages. |
United States Notice of Allowance issued in U.S. Appl. No. 12/488,094, dated Jun. 19, 2013, 11 pages. |
United States Notice of Allowance issued in U.S. Appl. No. 12/488,094, dated Jan. 31, 2014, 6 pages. |
United States Notice of Allowance issued in U.S. Appl. No. 12/488,094, dated May 9, 2014, 6 pages. |
Number | Date | Country |
---|---|---|
20190250604 A1 | Aug 2019 | US |
Relation | Number | Date | Country |
---|---|---|---|
Parent | 15229674 | Aug 2016 | US |
Child | 16397653 | | US |
Parent | 14570616 | Dec 2014 | US |
Child | 15229674 | | US |