Method of detecting a difference in level of a surface in front of a robotic cleaning device

Information

  • Patent Grant
  • Patent Number
    11,474,533
  • Date Filed
    Friday, June 2, 2017
  • Date Issued
    Tuesday, October 18, 2022
Abstract
A method for a robotic cleaning device of detecting a difference in level of a surface in front of the robotic cleaning device. The method includes illuminating the surface with light, capturing an image of the surface, detecting a luminous section in the captured image caused by the light, identifying at least a first segment and a second segment representing the detected luminous section, and detecting, from a positional relationship between the identified first and second segment, the difference in level of the surface.
Description

This application is a U.S. National Phase application of PCT International Application No. PCT/EP2017/063468, filed Jun. 2, 2017, which is incorporated by reference herein.


TECHNICAL FIELD

The invention relates to a method of detecting a difference in level of a surface in front of a robotic cleaning device, and a robotic cleaning device performing the method.


Further provided is a computer program comprising computer-executable instructions for causing a robotic cleaning device to perform the steps of the method when the computer-executable instructions are executed on a controller included in the robotic cleaning device.


Yet further provided is a computer program product comprising a computer readable medium, the computer readable medium having the computer program embodied thereon.


BACKGROUND

In many fields of technology, it is desirable to use robots with an autonomous behaviour such that they can move freely around a space without colliding with possible obstacles. Robotic vacuum cleaners are known in the art, which are equipped with drive means in the form of a motor for moving the cleaner across a surface to be cleaned. The robotic vacuum cleaners are further equipped with intelligence in the form of microprocessor(s) and navigation means for causing an autonomous behaviour such that the robotic vacuum cleaners can move freely around and clean a surface in the form of e.g. a floor. Thus, these prior art robotic vacuum cleaners have the capability of more or less autonomously vacuum cleaning a room in which objects such as tables and chairs and other obstacles such as walls and stairs are located.


A problem with prior art robotic vacuum cleaners is that they tend to get stuck on obstacles such as doorsteps or thick rugs.


A particularly problematic obstacle encountered is a ledge, typically in the form of a stair leading down to a lower floor. If such a ledge is not detected by the robotic cleaner, there is a risk that the robot drops off the ledge, falls down the stair and becomes permanently damaged.


SUMMARY

An object of the present invention is to solve, or at least mitigate, this problem in the art and to provide an improved method of detecting a difference in level of a surface in front of the robotic cleaning device.


This object is attained in a first aspect of the invention by a method for a robotic cleaning device of detecting a difference in level of a surface in front of the robotic cleaning device. The method comprises illuminating the surface with light, capturing an image of the surface, detecting a luminous section in the captured image caused by the light, identifying at least a first segment and a second segment representing the detected luminous section, and detecting, from a positional relationship between the identified first and second segment, the difference in level of the surface.


This object is attained in a second aspect of the invention by a robotic cleaning device configured to detect a difference in level of a surface in front of the robotic cleaning device. The robotic cleaning device comprises a propulsion system configured to move the robotic cleaning device, a camera device configured to record images of a vicinity of the robotic cleaning device, and at least one light source configured to illuminate a surface in front of the robotic cleaning device. The robotic cleaning device further comprises a controller configured to control the at least one light source to illuminate a surface in front of the robotic cleaning device, control the camera device to capture an image of the illuminated surface, detect a luminous section in the captured image caused by the at least one light source illuminating the surface, identify at least a first segment and a second segment representing the detected luminous section, and to detect, from a positional relationship between the identified first and second segment, the difference in level of the surface.


Advantageously, by measuring a difference in level of the surface in front of the robotic device, it is possible to timely plan how the robotic device should move for performing a cleaning programme—well before an object or surface is encountered—and further to be able to avoid traversing an object such as for example a doorstep, the height of which may be too great for the robotic device to traverse.


Further advantageous is that by detecting height of objects or surfaces in front of the robotic cleaning device, for instance a thick rug, the robot may be controlled to move along the periphery of the rug thereby making efficient use of any side brushes with which the robot may be equipped. After having cleaned along the periphery of the rug, the robotic cleaning device may move up onto the rug, where it typically would refrain from using the side brush. Instead, the robot could alternately rotate a rotatable brush roll, with which the robot may be equipped, in a forward and backward direction to avoid having the fibers of the rug being entangled in the brush roll. Hence, the movement of any brush roll and any side brush(es) can advantageously be controlled by taking into account the determined height of the surfaces and objects in front of the robotic cleaning device.


In an embodiment, the robotic device detects that a discontinuity occurs between the identified first and second segment in the captured image, wherein the difference in level is detected to comprise a ledge.


In this embodiment, which advantageously can be implemented for less complex robotic devices, the difference in level of the floor in front of the robotic device is detected by concluding from the captured image that the first and second segment are discontinuous.


If such a discontinuity occurs, the robotic cleaning device concludes in one embodiment that a floor at a distance in front of the robotic cleaning device is located at a lower level than a floor immediately in front of the robotic cleaning device. However, if such a method is used, the robotic device only knows that there is a difference in level, but cannot assess a height of the difference in level.


In another embodiment, the robotic device determines if the second segment is located vertically below the first segment in the captured image, in which case the difference in level constitutes a ledge.


In a further embodiment, which advantageously can be implemented for less complex robotic devices, the difference in level of the floor in front of the robotic device is detected by concluding from the image that the first and second segment are linked by a third segment.


If such a linking segment between the first and the second segment is present in the captured image, the robotic device concludes that the floor at a distance from the robotic cleaning device is located at a higher level than a floor immediately in front of the robotic cleaning device. Hence, an elevation has been encountered.


In another embodiment, the robotic device determines if the second segment is located vertically above the first segment, in which case the difference in level constitutes an elevation.


In yet an embodiment, the robotic device determines a distance to e.g. a ledge and a height of the ledge.


Thus, a more elaborate robotic cleaning device can determine, from x and y image coordinates, the distance to the ledge and the height of the ledge, by converting image coordinates to actual real-world distances.


This information may advantageously be used for sectionalizing an area to be cleaned. For instance, the robotic device may determine that it should finish cleaning the room in which it currently is located before moving on to the next room, i.e. before descending the ledge down to a lower floor.


It may further be envisaged that the robotic device does not descend the ledge at all after having determined its height. If, for instance, the height of the ledge exceeds a predetermined level threshold value, this indicates that the ledge is followed by a stairway, in which case the robotic device is controlled not to cross the ledge to avoid the robot falling over.


Further provided is a computer program comprising computer-executable instructions for causing a robotic cleaning device to perform the steps of the method when the computer-executable instructions are executed on a controller included in the robotic cleaning device.


Yet further provided is a computer program product comprising a computer readable medium, the computer readable medium having the computer program embodied thereon.


Generally, all terms used in the claims are to be interpreted according to their ordinary meaning in the technical field, unless explicitly defined otherwise herein. All references to “a/an/the element, apparatus, component, means, step, etc.” are to be interpreted openly as referring to at least one instance of the element, apparatus, component, means, step, etc., unless explicitly stated otherwise. The steps of any method disclosed herein do not have to be performed in the exact order disclosed, unless explicitly stated.





BRIEF DESCRIPTION OF THE DRAWINGS

The invention is now described, by way of example, with reference to the accompanying drawings, in which:



FIG. 1 illustrates detection of objects on a surface over which the robotic cleaning device moves in accordance with an embodiment of the present invention;



FIG. 2a illustrates an image captured by the camera of the robotic cleaning device when in position P1 of FIG. 1 according to an embodiment;



FIG. 2b illustrates an image captured by the camera of the robotic cleaning device when in position P2 of FIG. 1 according to an embodiment;



FIG. 2c illustrates a side view of the robotic cleaning device of FIG. 1 when in position P2;



FIG. 3 shows a flowchart illustrating an embodiment of a method of detecting a difference in level of a surface over which the robotic cleaning device moves;



FIG. 4a illustrates an enlarged view of the captured image of FIG. 2b;



FIG. 4b illustrates computing distance to an encountered object according to an embodiment;



FIG. 4c illustrates computing height of an encountered object according to an embodiment;



FIG. 5 illustrates a side view of the robotic cleaning device encountering an elevation;



FIG. 6 illustrates an image captured by the robotic cleaning device in the position shown in FIG. 5;



FIG. 7 shows a bottom view of a robotic cleaning device according to an embodiment; and



FIG. 8 shows a front view of a robotic cleaning device according to an embodiment.





DETAILED DESCRIPTION

The invention will now be described more fully hereinafter with reference to the accompanying drawings, in which certain embodiments of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided by way of example so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. Like numbers refer to like elements throughout the description.


The invention relates to robotic cleaning devices, or in other words, to automatic, self-propelled machines for cleaning a surface, e.g. a robotic vacuum cleaner, a robotic sweeper or a robotic floor washer. The robotic cleaning device according to the invention can be mains-operated and have a cord, be battery-operated or use any other kind of suitable energy source, for example solar energy.



FIG. 1 illustrates detection of objects on a surface over which the robotic cleaning device moves in accordance with an embodiment of the present invention.


In this particular exemplifying embodiment, the robotic device 100 uses one vertical line laser 127 for illuminating the surface (i.e. floor 31) over which it moves. However, any appropriate light source may be envisaged for illuminating the surface 31. Further, a smaller or greater part of the surface may be illuminated.


As can be seen in FIG. 1, the line laser 127 projects a laser beam 30a onto the floor 31 and a first wall 32 of a room to be cleaned, while the robotic device 100 uses its camera 123 to capture images of the illuminated surface.



FIG. 2a illustrates an image 40a captured by the camera 123 of the robotic device 100 when in first position P1.


As can be seen, the laser beam 30a will fall onto the floor 31 and the wall 32 and cause a corresponding illuminated section 30a′ to be present in the image. The location in the image 40a where the floor 31 meets the wall 32 is indicated with a dashed line only for illustrational purposes; the dashed line is not present in the captured image 40a. From the captured image 40a, the robotic device 100 detects that the laser beam 30a impinges on an obstacle 32, such as a wall, a sofa, a door or the like. By capturing a number of images, the robotic device 100 is capable of identifying the particular obstacle 32 with high reliability. In case a different type of light source is used, it may even be envisaged that the illuminated section covers the entire captured image. An advantage of using the laser beam as exemplified in FIG. 1 is that a smaller amount of image data needs to be processed by the robotic device 100.


As is illustrated in the image 40a, since the surface over which the robotic cleaning device 100 moves when in the first position P1 is a flat and even floor 31 and thus no difference in level is present, the illuminated section 30a′ caused by the laser line 30a illuminating the floor 31 is composed of a single segment in the image 40a.


Now, again with reference to FIG. 1, when the robotic cleaning device 100 moves into a second position P2, it encounters a doorway 33 where a step leads down to a lower floor 35. Again, the robotic device 100 captures an image of the illuminated surface. FIG. 2b illustrates the image 40b captured by the camera 123 of the robotic device 100 when in the second position P2.


In this image, the laser beam 30b will fall onto the (upper) floor 31 and the ledge 34, the latter being indicated with a dashed line in the image 40b for illustrational purposes. Further, the part of the laser beam 30b falling beyond the ledge 34 will be incident on the lower floor 35. This results in two line segments in the image 40b: a first segment 30b′ falling onto the upper floor 31 and a second segment 30b″ falling onto the lower floor 35.


For illustration, FIG. 2c shows a side view of the robotic device 100 when in position P2, where the laser beam 30a falls onto the upper floor 31, over the ledge 34, and onto the lower floor 35, where the ledge causes a shadow; i.e. the section of the lower floor 35 obscured by the ledge 34 will not be illuminated. The first line segment 30b′ and second line segment 30b″ that will appear in the captured image 40b are further indicated in FIG. 2c.


Hence, with reference to the flowchart of FIG. 3, a method of detecting a difference in level of a surface in front of the robotic cleaning device 100 will now be described.


As was discussed, the robotic device 100 illuminates in step S101 the surface 31 in front of it with light, in this exemplifying embodiment with structured vertical light using the line laser 127, and captures images of the surface using the camera 123.


When the robotic device 100 is in position P2 as shown in FIG. 1, the captured image will have the appearance of image 40b illustrated in FIG. 2b.


From the captured image 40b, the robotic device 100 detects in step S103 a luminous section caused by the emitted laser line 30b, and identifies the first segment 30b′ and the second segment 30b″ representing the detected luminous section.


Now, after having identified the two segments 30b′, 30b″ in step S103, the robotic device 100 detects in step S104, from a positional relationship between the identified first and second segment 30b′, 30b″, a difference in level of the surface 31.
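
As an illustration of how steps S103 and S104 might be strung together, the following Python sketch detects the luminous section as the pixels exceeding a brightness threshold and splits them into segments wherever consecutive points are far apart; the threshold value, the gap criterion and the function names are assumptions made for illustration only and are not taken from the patent.

```python
import numpy as np

def detect_laser_segments(image, brightness_threshold=200, gap_px=3):
    """Sketch of step S103: treat pixels brighter than the threshold as the
    luminous section caused by the laser line, then split the detected
    points into segments wherever consecutive points are separated by a
    jump larger than gap_px (a discontinuity such as the one in FIG. 2b)."""
    ys, xs = np.nonzero(image >= brightness_threshold)
    points = sorted(zip(xs.tolist(), ys.tolist()))
    if not points:
        return []
    segments, current = [], [points[0]]
    for p, q in zip(points, points[1:]):
        if abs(q[0] - p[0]) > gap_px or abs(q[1] - p[1]) > gap_px:
            segments.append(current)
            current = []
        current.append(q)
    segments.append(current)
    return segments  # e.g. [first segment 30b', second segment 30b'']
```

The positional relationship between the first two returned segments (step S104) can then be evaluated as described in the embodiments below.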


Advantageously, by measuring a difference in level of the surface in front of the robotic device 100, it is possible to timely plan how the robotic device 100 should move for performing a cleaning programme—well before an object or surface is encountered—and further to be able to avoid traversing an object such as for example a doorstep, the height of which may be too great for the robotic device 100 to traverse.


In a basic embodiment, which advantageously can be implemented for less complex robotic devices, the difference in level of the floor 31 over which the robotic device 100 moves is detected by concluding from the 2D image 40b that the first and second segment 30b′, 30b″ are discontinuous.


If such a discontinuity occurs, the robotic device 100 concludes that the (lower) floor 35 is located at a lower level than the (upper) floor 31. However, if such a method is used, the robotic device 100 only knows that there is a difference in level, but cannot assess a height of the difference in level.


In another embodiment, the robotic device 100 determines if the second segment 30b″ is located vertically below the first segment 30b′ in which case the difference in level constitutes a ledge.
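
As a concrete, non-authoritative sketch of the two embodiments above, the segments may be represented as lists of (x, y) pixel coordinates, with the image y-coordinate growing downwards so that "below" means a larger y value; the gap threshold and the averaging used here are assumptions made for illustration:

```python
import math

def indicates_ledge(first_seg, second_seg, gap_px=3):
    """Return True if the two identified segments are discontinuous and
    the second segment is located vertically below the first segment in
    the captured image, which is interpreted as a ledge."""
    # discontinuity: the closest endpoints of the two segments do not meet
    gap = math.dist(first_seg[-1], second_seg[0])
    discontinuous = gap > gap_px

    # positional relationship: compare the mean vertical positions
    mean_y_first = sum(y for _, y in first_seg) / len(first_seg)
    mean_y_second = sum(y for _, y in second_seg) / len(second_seg)
    return discontinuous and mean_y_second > mean_y_first
```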


In a further embodiment, with reference to FIG. 4a illustrating an enlarged view of the captured image 40b of FIG. 2b, the robotic device 100 determines a distance l to the ledge 34 and a height h of the ledge 34, the height h being determined as being the vertical distance between the identified first and second segment 30b′, 30b″.


Thus, a more elaborate robotic cleaning device 100 can determine, from x and y image coordinates, the distance l to the ledge and the height h of the ledge, by converting image coordinates to actual real-world distances. Advantageously, the robotic device 100 will know the distance l to the ledge 34, as well as the height h of the ledge 34.


This information may further advantageously be used for sectionalizing an area to be cleaned. For instance, the robotic device 100 may determine that it should finish cleaning the room in which it currently is located before moving on to the next room, i.e. before descending the ledge 34 down to the lower floor 35.


It may further be envisaged that the robotic device 100 does not descend the ledge 34 after having determined its height h. If, for instance, the height h of the ledge 34 exceeds a predetermined level threshold value, this indicates that the ledge 34 is followed by a stairway, in which case the robotic device 100 is controlled not to cross the ledge 34 to avoid the robot falling over.



FIG. 4b illustrates in more detail how the robotic device 100 computes distance to a point referred to as “OBJECT”.


The point is illuminated by the laser 127 in order for the camera 123 to see the point. An image coordinate of the point (measured in pixels) will be denoted xi on a sensor 130 of the camera 123, the sensor 130 being arranged behind an appropriate lens 131.


During manufacturing of the robotic cleaning device 100, a calibration is performed as to where on the camera sensor 130 the laser line would impinge if the camera 123 were to observe an object located at an infinite distance from the camera 123, the corresponding coordinate being denoted xi,∞.


From this information, the following relation can be deduced:







D/A = d/a = d/(xi,∞ - xi)

D = dA/(xi,∞ - xi)


It is noted that this relation is inversely proportional, meaning that the computed distance becomes increasingly sensitive to measurement errors at longer distances to the object. Hence, the distance D to a given point can be computed.
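
For illustration, the relation can be transcribed directly into code; here d and A stand for the fixed geometric quantities of the camera and laser arrangement shown in FIG. 4b, and the parameter names are merely illustrative:

```python
def distance_to_point(x_i, x_i_inf, d, A):
    """Compute D = d*A / (x_i_inf - x_i), where x_i is the pixel coordinate
    of the illuminated point on the sensor 130 and x_i_inf is the calibrated
    pixel coordinate for an object at an infinite distance. Because the
    relation is inversely proportional, the result becomes increasingly
    sensitive to pixel errors as the distance grows."""
    return d * A / (x_i_inf - x_i)
```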



FIG. 4c illustrates how the height of an object can be computed using the teachings of FIG. 4b.


Lines that are parallel in the “real” world will intersect in one and the same point v in an image, known as the vanishing point. If a point p located at a height h above the floor over which the robot 100 moves is illuminated by the laser 127, a dotted line can be drawn through the point p and the vanishing point v. This dotted line is parallel to the floor in the real world.


An x-coordinate xi,0 can be selected in the image where the height Δyi of the dotted line over the floor is measured. If the height Δyi of the dotted line in the image is always measured at the same coordinate xi,0 (or at least close to that coordinate), a proportionality factor k can be calibrated (typically in units of m/pixel) to compute the height h in the real world:

h=k×Δyi
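
A minimal sketch of this computation, assuming the illuminated point p, the vanishing point v and the image y-coordinate of the floor at the reference column are already known (the names below are illustrative, not taken from the patent):

```python
def object_height(p, v, floor_y_at_xref, x_ref, k):
    """Draw the line through the illuminated point p and the vanishing
    point v, evaluate its y-coordinate at the reference column x_ref,
    measure its vertical offset over the floor in pixels (delta_y) and
    scale by the calibrated factor k (metres per pixel): h = k * delta_y."""
    px, py = p
    vx, vy = v
    slope = (vy - py) / (vx - px)        # assumes p and v lie in different columns
    y_line = py + slope * (x_ref - px)   # y of the dotted line at x_ref
    delta_y = abs(floor_y_at_xref - y_line)
    return k * delta_y
```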



FIG. 5 illustrates another scenario where the robotic device 100 encounters an elevation 36 instead of a ledge. For instance, it may be envisaged that the robotic device 100 encounters a thick rug, which may have a height of several centimetres, or a doorstep.



FIG. 6 illustrates the image 40c captured by the camera 123 of the robotic device 100 upon encountering the elevation 36.


In this image, the laser beam 30c will fall onto the floor 31 in front of the robotic device 100 and further onto the edge of the rug causing the elevation 36 and the upper side 37 of the rug. This will result in three segments 30c′, 30c″ and 30c′″ visible in the captured image 40c.


From the captured image 40c of FIG. 6, the robotic device 100 again detects a luminous section caused by the emitted laser line 30c, and identifies the first segment 30c′, the second segment 30c″, and further a third segment 30c′″ linking the first and second segments, which together represent the detected luminous section.


Now, after having identified the first and the second segment 30c′, 30c″, the robotic device 100 detects, from a positional relationship between the identified first and second segment 30c′, 30c″, a difference in level of the surface 31.


In a basic embodiment, which advantageously can be implemented for less complex robotic devices, the difference in level of the floor 31 over which the robotic device 100 moves is detected by concluding from the 2D image 40c that the first and second segment 30c′, 30c″ are linked by a third segment 30c′″.


If such a linking segment 30c′″ between the first and the second segment 30c′, 30c″ is present in the captured image 40c, the robotic device 100 concludes that the (upper) side 37 of the rug is located at a higher level than the floor 31. However, if such a method is used, the robotic device 100 only knows that there is a difference in level, but cannot assess an actual height.


In another embodiment, the robotic device 100 determines if the second segment 30c″ is located vertically above the first segment 30c′ in which case the difference in level constitutes an elevation.
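
Mirroring the ledge check given earlier, the elevation case might be sketched as follows; again the segments are lists of (x, y) pixel coordinates with y growing downwards, and the gap threshold is an illustrative assumption:

```python
import math

def indicates_elevation(first_seg, second_seg, third_seg=None, gap_px=3):
    """Return True if a third segment links the first and second segments
    without a visible gap, or if the second segment is located vertically
    above the first segment in the captured image (smaller y values)."""
    if third_seg is not None:
        links_first = math.dist(first_seg[-1], third_seg[0]) <= gap_px
        links_second = math.dist(third_seg[-1], second_seg[0]) <= gap_px
        if links_first and links_second:
            return True
    mean_y_first = sum(y for _, y in first_seg) / len(first_seg)
    mean_y_second = sum(y for _, y in second_seg) / len(second_seg)
    return mean_y_second < mean_y_first
```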


In a further embodiment, again with reference to FIG. 6, the robotic device 100 determines a distance l to the elevation 36 and a height h of the elevation 36, as has been described hereinabove.


This information may advantageously be used for sectionalizing an area to be cleaned. For instance, the robotic device 100 may determine that it should finish cleaning the floor of the room in which it currently is located before moving up on the rug. Alternatively, in case the elevation is caused by a doorstep, the robotic device 100 may determine that the room is to be finished before it crosses the doorstep into the next room.


It may further be envisaged that the robotic device 100 does not ascend the rug/doorstep at all after having determined its height h. If, for instance, the height h of an elevation exceeds a predetermined level threshold value, this may indicate that e.g. a doorstep is too high to ascend, in which case the robotic device 100 is controlled not to ascend it in order to avoid having the robot become stuck on the doorstep.
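
The threshold decisions described for ledges and elevations might, purely for illustration, be collected into a simple traversal policy; the threshold values and names below are placeholders, not values taken from the patent:

```python
def may_traverse(kind, height_m, max_drop_m=0.02, max_climb_m=0.02):
    """Decide whether the robot should cross a detected difference in level:
    a ledge deeper than max_drop_m (e.g. one followed by a stairway) is not
    crossed, and an elevation higher than max_climb_m (e.g. a too-high
    doorstep) is not ascended, to avoid falling over or getting stuck."""
    if kind == "ledge":
        return height_m <= max_drop_m
    if kind == "elevation":
        return height_m <= max_climb_m
    return True
```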


The camera 123 of the robotic cleaning device 100 is controlled by a controller such as a microprocessor to capture and record images from which the controller creates a representation or layout of the surroundings that the robotic cleaning device 100 is operating in, by extracting feature points from the images representing detected objects and by measuring the distance from the robotic cleaning device 100 to these objects, as well as the heights of the objects using the method of the invention as has been discussed, while the robotic cleaning device 100 is moving across the surface to be cleaned. Thus, the controller derives positional data of the robotic cleaning device 100 with respect to the surface to be cleaned from the detected objects of the recorded images, generates a 3D representation of the surroundings from the derived positional data and controls driving motors to move the robotic cleaning device 100 across the surface to be cleaned in accordance with the generated 3D representation and navigation information supplied to the robotic cleaning device 100, such that the surface to be cleaned can be autonomously navigated by taking into account the generated 3D representation. Since the derived positional data will serve as a foundation for the navigation of the robotic cleaning device, it is important that the positioning is correct; the robotic device will otherwise navigate according to a “map” of its surroundings that is misleading.


The 3D representation generated from the images recorded by the camera 123 thus facilitates detection of obstacles in the form of walls, floor lamps, table legs, around which the robotic cleaning device must navigate as well as rugs, carpets, doorsteps, etc., that the robotic cleaning device 100 must traverse. The robotic cleaning device 100 is hence configured to learn about its environment or surroundings by operating/cleaning.


The robotic cleaning device 100 may advantageously add indications of detected ledges and elevations to the created representation of its surroundings. Thus, by adding to the created representation an indication of e.g. the ledge 34, it is possible for the robotic cleaning device to plan a cleaning path to be traversed well in advance and further to move very close to, and along, the ledge 34 since it is included in the representation of the surroundings of the robotic cleaning device 100.


Even though it is envisaged that the invention may be performed by a variety of appropriate robotic cleaning devices being equipped with sufficient processing intelligence, FIG. 7 shows a robotic cleaning device 100 according to an embodiment of the present invention in a bottom view, i.e. the bottom side of the robotic cleaning device is shown. The arrow indicates the forward direction of the robotic cleaning device 100 being illustrated in the form of a robotic vacuum cleaner.


The robotic cleaning device 100 comprises a main body 111 housing components such as a propulsion system comprising driving means in the form of two electric wheel motors 115a, 115b for enabling movement of the driving wheels 112, 113 such that the cleaning device can be moved over a surface to be cleaned. Each wheel motor 115a, 115b is capable of controlling the respective driving wheel 112, 113 to rotate independently of each other in order to move the robotic cleaning device 100 across the surface to be cleaned. A number of different driving wheel arrangements, as well as various wheel motor arrangements, can be envisaged. It should be noted that the robotic cleaning device may have any appropriate shape, such as a device having a more traditional circular-shaped main body, or a triangular-shaped main body. As an alternative, a track propulsion system may be used or even a hovercraft propulsion system. The propulsion system may further be arranged to cause the robotic cleaning device 100 to perform any one or more of a yaw, pitch, translation or roll movement.


A controller 116 such as a microprocessor controls the wheel motors 115a, 115b to rotate the driving wheels 112, 113 as required in view of information received from an obstacle detecting device (not shown in FIG. 7) for detecting obstacles in the form of walls, floor lamps, table legs, around which the robotic cleaning device must navigate. The obstacle detecting device may be embodied in the form of a 3D sensor system registering its surroundings, implemented by means of e.g. a 3D camera, a camera in combination with lasers, a laser scanner, etc. for detecting obstacles and communicating information about any detected obstacle to the microprocessor 116. The microprocessor 116 communicates with the wheel motors 115a, 115b to control movement of the wheels 112, 113 in accordance with information provided by the obstacle detecting device such that the robotic cleaning device 100 can move as desired across the surface to be cleaned.


Further, the main body 111 may optionally be arranged with a cleaning member 117 for removing debris and dust from the surface to be cleaned in the form of a rotatable brush roll arranged in an opening 118 at the bottom of the robotic cleaner 100. Thus, the rotatable brush roll 117 is arranged along a horizontal axis in the opening 118 to enhance the dust and debris collecting properties of the robotic cleaning device 100. In order to rotate the brush roll 117, a brush roll motor 119 is operatively coupled to the brush roll to control its rotation in line with instructions received from the controller 116.


Moreover, the main body 111 of the robotic cleaner 100 comprises a suction fan 120 creating an air flow for transporting debris to a dust bag or cyclone arrangement (not shown) housed in the main body via the opening 118 in the bottom side of the main body 111. The suction fan 120 is driven by a fan motor 121 communicatively connected to the controller 116 from which the fan motor 121 receives instructions for controlling the suction fan 120. It should be noted that a robotic cleaning device having either one of the rotatable brush roll 117 and the suction fan 120 for transporting debris to the dust bag can be envisaged. A combination of the two will however enhance the debris-removing capabilities of the robotic cleaning device 100.


The main body 111 of the robotic cleaning device 100 may further be equipped with an inertia measurement unit (IMU) 124, such as e.g. a gyroscope and/or an accelerometer and/or a magnetometer or any other appropriate device for measuring displacement of the robotic cleaning device 100 with respect to a reference position, in the form of e.g. orientation, rotational velocity, gravitational forces, etc. A three-axis gyroscope is capable of measuring rotational velocity in a roll, pitch and yaw movement of the robotic cleaning device 100. A three-axis accelerometer is capable of measuring acceleration in all directions, which is mainly used to determine whether the robotic cleaning device is bumped or lifted or if it is stuck (i.e. not moving even though the wheels are turning). The robotic cleaning device 100 further comprises encoders (not shown in FIG. 7) on each drive wheel 112, 113 which generate pulses when the wheels turn. The encoders may for instance be magnetic or optical. By counting the pulses at the controller 116, the speed of each wheel 112, 113 can be determined. By combining wheel speed readings with gyroscope information, the controller 116 can perform so-called dead reckoning to determine the position and heading of the cleaning device 100.
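
As a rough illustration of the dead reckoning mentioned above (the metres-per-pulse calibration value and the simple integration scheme are assumptions made for the sake of the example):

```python
import math

def dead_reckon(x, y, heading, left_pulses, right_pulses,
                gyro_yaw_rate, dt, metres_per_pulse=0.0005):
    """Update the estimated position and heading: the encoder pulses of the
    two drive wheels give the distance travelled since the last update,
    while the gyroscope yaw rate integrated over dt gives the change in
    heading."""
    distance = 0.5 * (left_pulses + right_pulses) * metres_per_pulse
    heading += gyro_yaw_rate * dt
    x += distance * math.cos(heading)
    y += distance * math.sin(heading)
    return x, y, heading
```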


The main body 111 may further be arranged with a rotating side brush 114 adjacent to the opening 118, the rotation of which could be controlled by the drive motors 115a, 115b, the brush roll motor 119, or alternatively a separate side brush motor (not shown). Advantageously, the rotating side brush 114 sweeps debris and dust from the surface to be cleaned such that the debris ends up under the main body 111 at the opening 118 and thus can be transported to a dust chamber of the robotic cleaning device. Further advantageous is that the reach of the robotic cleaning device 100 will be improved, and e.g. corners and areas where a floor meets a wall are much more effectively cleaned. As is illustrated in FIG. 7, the rotating side brush 114 rotates in a direction such that it sweeps debris towards the opening 118 such that the suction fan 120 can transport the debris to a dust chamber. The robotic cleaning device 100 may comprise two rotating side brushes arranged laterally on each side of, and adjacent to, the opening 118.


With further reference to FIG. 7, the controller/processing unit 116 embodied in the form of one or more microprocessors is arranged to execute a computer program 125 downloaded to a suitable storage medium 126 associated with the microprocessor, such as a Random Access Memory (RAM), a Flash memory or a hard disk drive. The controller 116 is arranged to carry out a method according to embodiments of the present invention when the appropriate computer program 125 comprising computer-executable instructions is downloaded to the storage medium 126 and executed by the controller 116. The storage medium 126 may also be a computer program product comprising the computer program 125. Alternatively, the computer program 125 may be transferred to the storage medium 126 by means of a suitable computer program product, such as a digital versatile disc (DVD), compact disc (CD) or a memory stick. As a further alternative, the computer program 125 may be downloaded to the storage medium 126 over a wired or wireless network. The controller 116 may alternatively be embodied in the form of a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a complex programmable logic device (CPLD), etc.


As has been mentioned, by measuring a difference in level of the surface in front of the robotic device 100, it is advantageously possible to timely plan how the robotic device 100 should move for performing a cleaning programme.


With reference to FIG. 7, by detecting the height of objects or surfaces in front of the robotic cleaning device 100, for instance a thick rug, the robot 100 may be controlled to move along the periphery of the rug, thereby making efficient use of the side brush 114. After having cleaned along the periphery of the rug, the robotic cleaning device 100 may move up onto the rug, where it typically would refrain from using the side brush 114. Instead, the robot 100 could alternately rotate the rotatable brush roll 117 in a forward and backward direction to avoid having the fibers of the rug being entangled in the brush roll 117. Hence, the movement of the brush roll 117 and any side brush(es) 114 can advantageously be controlled by taking into account the determined height of the surfaces and objects in front of the robotic cleaning device 100.


FIG. 8 shows a front view of the robotic cleaning device 100 of FIG. 7 in an embodiment of the present invention illustrating the previously mentioned obstacle detecting device in the form of a 3D sensor system comprising at least a camera 123 and a first and a second line laser 127, 128, which may be horizontally or vertically oriented line lasers. Further shown is the controller 116, the main body 111, the driving wheels 112, 113, and the rotatable brush roll 117 previously discussed with reference to FIG. 7. The controller 116 is operatively coupled to the camera 123 for recording images of a vicinity of the robotic cleaning device 100. The first and second line lasers 127, 128 may preferably be vertical line lasers and are arranged lateral of the camera 123 and configured to illuminate a height and a width that is greater than the height and width of the robotic cleaning device 100. Further, the angle of the field of view of the camera 123 is preferably smaller than the space illuminated by the first and second line lasers 127, 128. The camera 123 is controlled by the controller 116 to capture and record a plurality of images per second. Data from the images is extracted by the controller 116 and the data is typically saved in the memory 126 along with the computer program 125.


The first and second line lasers 127, 128 are typically arranged on a respective side of the camera 123 along an axis being perpendicular to an optical axis of the camera. Further, the line lasers 127, 128 are directed such that their respective laser beams intersect within the field of view of the camera 123. Typically, the intersection coincides with the optical axis of the camera 123.


The first and second line laser 127, 128 are configured to scan, preferably in a vertical orientation, the vicinity of the robotic cleaning device 100, normally in the direction of movement of the robotic cleaning device 100. The first and second line lasers 127, 128 are configured to send out laser beams, which illuminate furniture, walls and other objects of e.g. a room to be cleaned. The camera 123 is controlled by the controller 116 to capture and record images from which the controller 116 creates a representation or layout of the surroundings that the robotic cleaning device 100 is operating in, by extracting features from the images and by measuring the distance covered by the robotic cleaning device 100, while the robotic cleaning device 100 is moving across the surface to be cleaned. Thus, the controller 116 derives positional data of the robotic cleaning device 100 with respect to the surface to be cleaned from the recorded images, generates a 3D representation of the surroundings from the derived positional data and controls the driving motors 115a, 115b to move the robotic cleaning device across the surface to be cleaned in accordance with the generated 3D representation and navigation information supplied to the robotic cleaning device 100 such that the surface to be cleaned can be navigated by taking into account the generated 3D representation. Since the derived positional data will serve as a foundation for the navigation of the robotic cleaning device, it is important that the positioning is correct; the robotic device will otherwise navigate according to a “map” of its surroundings that is misleading.


The 3D representation generated from the images recorded by the 3D sensor system thus facilitates detection of obstacles in the form of walls, floor lamps, table legs, around which the robotic cleaning device must navigate as well as rugs, carpets, doorsteps, etc., that the robotic cleaning device 100 must traverse. The robotic cleaning device 100 is hence configured to learn about its environment or surroundings by operating/cleaning.


Hence, the 3D sensor system comprising the camera 123 and the first and second vertical line lasers 127, 128 is arranged to record images of a vicinity of the robotic cleaning device 100, from which objects/obstacles may be detected. The controller 116 is capable of positioning the robotic cleaning device 100 with respect to the detected obstacles, and hence a surface to be cleaned, by deriving positional data from the recorded images. From the positioning, the controller 116 controls movement of the robotic cleaning device 100, by means of controlling the wheels 112, 113 via the wheel drive motors 115a, 115b, across the surface to be cleaned.


The derived positional data facilitates control of the movement of the robotic cleaning device 100 such that the cleaning device can be navigated to move very close to an object, and to move closely around the object to remove debris from the surface on which the object is located. Hence, the derived positional data is utilized to move flush against the object, being e.g. a chair, a table, a sofa, a thick rug or a wall. Typically, the controller 116 continuously generates and transfers control signals to the drive wheels 112, 113 via the drive motors 115a, 115b such that the robotic cleaning device 100 is navigated close to the object.


It should further be noted that while the embodiments of the invention have been discussed in the context of using a camera and one or two line lasers for illuminating a surface over which the robotic cleaning device 100 moves, it would further be possible to use known 3D sensors utilizing time-of-flight measurements of a completely illuminated scene. With such a time-of-flight 3D sensor, the distance would be determined for each pixel in a captured image, and distances to detected objects may be determined in line with the above.


The invention has mainly been described above with reference to a few embodiments. However, as is readily appreciated by a person skilled in the art, other embodiments than the ones disclosed above are equally possible within the scope of the invention, as defined by the appended patent claims.

Claims
  • 1. A method for a robotic cleaning device of detecting a difference in level of a surface in front of the robotic cleaning device, the method comprising: illuminating the surface with a line laser; capturing an image of the surface; detecting a luminous section in the captured image caused by the line laser; identifying at least a first segment of the line laser and a second segment of the line laser representing the detected luminous section, the first segment of the line laser and the second segment of the line laser separated by a discontinuity due to the difference in level of the surface; and determining, from coordinates of the captured image, a vertical distance between the identified first segment of the line laser and second segment of the line laser, the vertical distance constituting a height of the detected difference in level of the surface.
  • 2. The method of claim 1, wherein the difference in level is detected to comprise a ledge.
  • 3. The method of claim 1, wherein the detecting of the difference in level of the surface from a positional relationship between the identified first and second segment of the line laser comprises: detecting a third segment linking the identified first and second segment of the line laser in the captured image, wherein the difference in level is detected to comprise an elevation.
  • 4. The method of claim 2, wherein the detecting of the difference in level of the surface from a positional relationship between the identified first and second segment of the line laser comprises: detecting that the identified second segment of the line laser is located vertically below the identified first segment of the line laser in the captured image, wherein the difference in level is detected to comprise a ledge.
  • 5. The method of claim 3, wherein the detecting of the difference in level of the surface from a positional relationship between the identified first and second segment of the line laser comprises: detecting that the identified second segment of the line laser is located vertically above the identified first segment of the line laser in the captured image, wherein the difference in level is detected to comprise an elevation.
  • 6. The method of claim 1, wherein the detecting of the difference in level of the surface from a positional relationship between the identified first and second segment of the line laser comprises: determining, from coordinates of the captured image, a distance to a proximal end of the identified first segment of the line laser.
  • 7. A robotic cleaning device configured to detect a difference in level of a surface in front of the robotic cleaning device, comprising: a propulsion system configured to move the robotic cleaning device; a camera device configured to record images of a vicinity of the robotic cleaning device; at least one light source including a line laser configured to illuminate a surface in front of the robotic cleaning device; and a controller configured to: control the line laser to illuminate a surface in front of the robotic cleaning device; control the camera device to capture an image of the illuminated surface; detect a luminous section in the captured image caused by the line laser illuminating the surface; identify at least a first segment of the line laser and a second segment of the line laser representing the detected luminous section, the first segment of the line laser and the second segment of the line laser separated by a discontinuity due to the difference in level of the surface; and determine from coordinates of the captured image, a vertical distance between the identified first segment of the line laser and second segment of the line laser, the vertical distance constituting a height of the detected difference in level of the surface.
  • 8. The robotic cleaning device of claim 7, wherein the difference in level is detected to comprise a ledge.
  • 9. The robotic cleaning device of claim 7, the controller further being configured to, when detecting the difference in level of the surface from a positional relationship between the identified first and second segment of the line laser: detect a third segment linking the identified first and second segment of the line laser in the captured image, wherein the difference in level is detected to comprise an elevation.
  • 10. The robotic cleaning device of claim 8, the controller further being configured to, when detecting the difference in level of the surface from a positional relationship between the identified first and second segment of the line laser: detect that the identified second segment of the line laser is located vertically below the identified first segment of the line laser in the captured image, wherein the difference in level is detected to comprise a ledge.
  • 11. The robotic cleaning device of claim 9, the controller further being configured to, when detecting the difference in level of the surface from a positional relationship between the identified first and second segment of the line laser: detect that the identified second segment of the line laser is located vertically above the identified first segment of the line laser in the captured image, wherein the difference in level is detected to comprise an elevation.
  • 12. The robotic cleaning device of claim 7, the controller further being configured to, when detecting the difference in level of the surface from a positional relationship between the identified first and second segment of the line laser: determine, from coordinates of the captured image, a distance to a proximal end of the identified first segment of the line laser.
  • 13. A non-transitory computer readable medium comprising computer-executable instructions stored thereon that when executed by a controller of a robotic cleaning device controls the robotic cleaning device to: illuminate the surface with a line laser; capture an image of the surface; detect a luminous section in the captured image caused by the line laser; identify at least a first segment of the line laser and a second segment of the line laser representing the detected luminous section, the first segment of the line laser and the second segment of the line laser separated by a discontinuity due to the difference in level of the surface; and determine, from coordinates of the captured image, a vertical distance between the identified first segment of the line laser and second segment of the line laser, the vertical distance constituting a height of the detected difference in level of the surface.
PCT Information
Filing Document Filing Date Country Kind
PCT/EP2017/063468 6/2/2017 WO
Publishing Document Publishing Date Country Kind
WO2018/219473 12/6/2018 WO A
US Referenced Citations (1055)
Number Name Date Kind
1286321 Hoover Dec 1918 A
1401007 Staples Dec 1921 A
1823128 Scott Sep 1931 A
3010129 Moore Nov 1961 A
3233274 Kroll Feb 1966 A
3550714 Bellinger Dec 1970 A
3570227 Bellinger Mar 1971 A
3713505 Muller Jan 1973 A
3837028 Bridge Sep 1974 A
4028765 Liebscher Jun 1977 A
4036147 Westling Jul 1977 A
4114711 Wilkins Sep 1978 A
4119900 Kremnitz Oct 1978 A
4306174 Mourier Dec 1981 A
4306329 Yokoi Dec 1981 A
4369543 Chen Jan 1983 A
4502173 Patzold Mar 1985 A
4627511 Yajima Dec 1986 A
4647209 Neukomm Mar 1987 A
4777416 George, II Oct 1988 A
4800978 Wasa Jan 1989 A
4822450 Davis Apr 1989 A
4825091 Breyer Apr 1989 A
4836905 Davis Jun 1989 A
4838990 Jucha Jun 1989 A
4842686 Davis Jun 1989 A
4849067 Jucha Jul 1989 A
4854000 Takimoto Aug 1989 A
4864511 Moy Sep 1989 A
4872938 Davis Oct 1989 A
4878003 Knepper Oct 1989 A
4886570 Davis Dec 1989 A
4918607 Wible Apr 1990 A
4919224 Shyu Apr 1990 A
4922559 Wall May 1990 A
4959192 Trundle Sep 1990 A
4962453 Pong Oct 1990 A
4989818 Trundle Feb 1991 A
5001635 Yasutomi Mar 1991 A
5006302 Trundle Apr 1991 A
5023444 Ohman Jun 1991 A
5032775 Mizuno Jul 1991 A
5034673 Shoji Jul 1991 A
5042861 Trundle Aug 1991 A
5045118 Mason Sep 1991 A
5086535 Grossmeyer Feb 1992 A
5095577 Jonas Mar 1992 A
5107946 Kamimura Apr 1992 A
5155683 Rahim Oct 1992 A
5243732 Koharagi Sep 1993 A
5245177 Schiller Sep 1993 A
5276933 Hennessey Jan 1994 A
5279672 Betker Jan 1994 A
5293955 Lee Mar 1994 A
5307273 Oh Apr 1994 A
5309592 Hiratsuka May 1994 A
5341540 Soupert Aug 1994 A
5345639 Tanoue Sep 1994 A
5349378 FereydounMaali Sep 1994 A
5353224 Lee Oct 1994 A
5367458 Roberts Nov 1994 A
5369347 Yoo Nov 1994 A
5377106 Drunk Dec 1994 A
5390627 van der Berg Feb 1995 A
5398632 Goldbach Mar 1995 A
5402051 Fujiwara Mar 1995 A
5440216 Kim Aug 1995 A
5444893 Hwang Aug 1995 A
5444965 Colens Aug 1995 A
5446356 Kim Aug 1995 A
5454129 Kell Oct 1995 A
5464494 Gerd Nov 1995 A
5518552 Tanoue May 1996 A
5534762 Kim Jul 1996 A
5548511 Bancroft Aug 1996 A
5560077 Crotchett Oct 1996 A
5568589 Hwang Oct 1996 A
5621291 Lee Apr 1997 A
5634237 Paranjpe Jun 1997 A
5646494 Han Jul 1997 A
5666689 Andersen Sep 1997 A
5682313 Edlund Oct 1997 A
5682640 Han Nov 1997 A
5687294 Jeong Nov 1997 A
5698957 Sowada Dec 1997 A
5745946 Thrasher May 1998 A
5758298 Guldner May 1998 A
5778554 Jones Jul 1998 A
5781960 Kilstrom Jul 1998 A
5787545 Colens Aug 1998 A
5815880 Nakanishi Oct 1998 A
5825981 Matsuda Oct 1998 A
5841259 Kim Nov 1998 A
5852984 Matsuyama Dec 1998 A
5867800 Leif Feb 1999 A
5890250 Lange Apr 1999 A
5896488 Jeong Apr 1999 A
5903124 YuichiKawakami May 1999 A
5926909 McGee Jul 1999 A
5933902 Frey Aug 1999 A
5935179 Kleiner Aug 1999 A
5940927 Haegermarck Aug 1999 A
5942869 Katou Aug 1999 A
5947051 Geiger Sep 1999 A
5959423 Nakanishi Sep 1999 A
5959424 Elkmann Sep 1999 A
5966765 Hamada Oct 1999 A
RE36391 vandenBerg Nov 1999 E
5983833 van der Lely Nov 1999 A
5987696 Wang Nov 1999 A
5991951 Kubo Nov 1999 A
5995884 Allen Nov 1999 A
5997670 Walter Dec 1999 A
5999865 Bloomquist Dec 1999 A
6012470 Jones Jan 2000 A
6024107 Jones Feb 2000 A
6064926 Sarangapani May 2000 A
6076662 Bahten Jun 2000 A
6082377 Frey Jul 2000 A
6124694 Bancroft Sep 2000 A
6142252 Kinto Nov 2000 A
6176067 Bahten Jan 2001 B1
6213136 Jones Apr 2001 B1
6226830 Hendriks May 2001 B1
6230360 Singleton May 2001 B1
6240342 Fiegert May 2001 B1
6251551 Kunze-Concewitz Jun 2001 B1
6255793 Peless Jul 2001 B1
6263989 Won Jul 2001 B1
6300737 Bergvall Oct 2001 B1
6311366 Sepke Nov 2001 B1
6327741 Reed Dec 2001 B1
6339735 Peless Jan 2002 B1
6358325 Andreas Mar 2002 B1
6360801 Walter Mar 2002 B1
6370452 Pfister Apr 2002 B1
6370453 Sommer Apr 2002 B2
6374157 Takamura Apr 2002 B1
6381801 Clemons, Sr. May 2002 B1
6389329 Colens May 2002 B1
6413149 Wada Jul 2002 B1
6417641 Peless Jul 2002 B2
6431296 Won Aug 2002 B1
6438456 Feddema Aug 2002 B1
6443509 Levin Sep 2002 B1
6457199 Frost Oct 2002 B1
6457206 Judson Oct 2002 B1
6459955 Bartsch Oct 2002 B1
6465982 Bergvall Oct 2002 B1
6481515 Kirkpatrick Nov 2002 B1
6482678 Frost Nov 2002 B1
6493612 Bisset Dec 2002 B1
6493613 Peless Dec 2002 B2
6496754 Song Dec 2002 B2
6504610 Bauer Jan 2003 B1
6519804 Vujik Feb 2003 B1
6525509 Petersson Feb 2003 B1
D471243 Cioffi Mar 2003 S
6532404 Colens Mar 2003 B2
6535793 Allard Mar 2003 B2
6571415 Gerber Jun 2003 B2
6580246 Jacobs Jun 2003 B2
6581239 Dyson Jun 2003 B1
6594844 Jones Jul 2003 B2
6597143 Song Jul 2003 B2
6601265 Burlington Aug 2003 B1
6605156 Clark Aug 2003 B1
6609962 Wakabayashi Aug 2003 B1
6611120 Song Aug 2003 B2
6611318 LaPolice Aug 2003 B2
6615108 Peless Sep 2003 B1
6615885 Ohm Sep 2003 B1
6633150 Wallach Oct 2003 B1
6637446 Frost Oct 2003 B2
6658325 Zweig Dec 2003 B2
6661239 Ozick Dec 2003 B1
6662889 De Fazio Dec 2003 B2
6667592 Jacobs Dec 2003 B2
6668951 Won Dec 2003 B2
6671592 Bisset Dec 2003 B1
6690134 Jones Feb 2004 B1
6726823 Wang Apr 2004 B1
6732826 Song May 2004 B2
6745431 Dijksman Jun 2004 B2
6748297 Song Jun 2004 B2
6769004 Barrett Jul 2004 B2
6774596 Bisset Aug 2004 B1
6775871 Finch Aug 2004 B1
6781338 Jones Aug 2004 B2
6809490 Jones Oct 2004 B2
6810305 Kirkpatrick, Jr. Oct 2004 B2
6820801 Kaneko Nov 2004 B2
6841963 Song Jan 2005 B2
6845297 Allard Jan 2005 B2
6850024 Peless Feb 2005 B2
6859010 Jeon Feb 2005 B2
6859976 Plankenhorn Mar 2005 B2
6860206 Rudakevych Mar 2005 B1
6868307 Song Mar 2005 B2
6869633 Sus Mar 2005 B2
6870792 Chiappetta Mar 2005 B2
6882334 Meyer Apr 2005 B1
6883201 Jones Apr 2005 B2
6885912 Peless Apr 2005 B2
6901624 Mori Jun 2005 B2
6925679 Wallach Aug 2005 B2
D510066 Hickey Sep 2005 S
6938298 Aasen Sep 2005 B2
6939208 Kamimura Sep 2005 B2
6940291 Ozick Sep 2005 B1
6941199 Bottomley Sep 2005 B1
6942548 Wada Sep 2005 B2
6956348 Landry Oct 2005 B2
6957712 Song Oct 2005 B2
6964312 Maggio Nov 2005 B2
6965209 Jones Nov 2005 B2
6967275 Ozick Nov 2005 B2
6971140 Kim Dec 2005 B2
6971141 Tak Dec 2005 B1
6984952 Peless Jan 2006 B2
7000623 Welsh Feb 2006 B2
7004269 Song Feb 2006 B2
7013200 Wakui Mar 2006 B2
7013527 Thomas, Sr. Mar 2006 B2
7015831 Karlsson Mar 2006 B2
7024278 Chiappetta Apr 2006 B2
7031805 Lee Apr 2006 B2
7040968 Kamimura May 2006 B2
7042342 Luo May 2006 B2
7043794 Conner May 2006 B2
7050926 Theurer May 2006 B2
7053578 Diehl May 2006 B2
7053580 Aldred May 2006 B2
7054716 McKee May 2006 B2
7059012 Song Jun 2006 B2
7079923 Abramson Jul 2006 B2
7082350 Skoog Jul 2006 B2
D526753 Tani Aug 2006 S
7085624 Aldred Aug 2006 B2
7103449 Woo Sep 2006 B2
7113847 Chmura Sep 2006 B2
7117067 McLurkin Oct 2006 B2
7133745 Wang Nov 2006 B2
7134164 Alton Nov 2006 B2
7135992 Karlsson Nov 2006 B2
7143696 Rudakevych Dec 2006 B2
7145478 Goncalves Dec 2006 B2
7150068 Ragner Dec 2006 B1
7155308 Jones Dec 2006 B2
7155309 Peless Dec 2006 B2
7162338 Goncalves Jan 2007 B2
7167775 Abramson Jan 2007 B2
7173391 Jones Feb 2007 B2
7174238 Zweig Feb 2007 B1
7177737 Karlsson Feb 2007 B2
7184586 Jeon Feb 2007 B2
7185396 Im Mar 2007 B2
7185397 Stuchlik Mar 2007 B2
7188000 Chiappetta Mar 2007 B2
7196487 Jones Mar 2007 B2
7199711 Field Apr 2007 B2
7200892 Kim Apr 2007 B2
7202630 Dan Apr 2007 B2
7206677 Hulden Apr 2007 B2
7207081 Gerber Apr 2007 B2
7208892 Tondra Apr 2007 B2
7213298 Cipolla May 2007 B2
7213663 Kim May 2007 B2
7222390 Cipolla May 2007 B2
7225500 Diehl Jun 2007 B2
7237298 Reindie Jul 2007 B2
7246405 Yan Jul 2007 B2
7248951 Hulden Jul 2007 B2
7251853 Park Aug 2007 B2
7254464 McLurkin Aug 2007 B1
7254859 Gerber Aug 2007 B2
7269877 Tondra Sep 2007 B2
7272467 Goncalves Sep 2007 B2
7272868 Im Sep 2007 B2
7274167 Kim Sep 2007 B2
7275280 Haegermarck Oct 2007 B2
7288912 Landry Oct 2007 B2
D556961 Swyst Dec 2007 S
7303776 Sus Dec 2007 B2
7324870 Lee Jan 2008 B2
7331436 Pack Feb 2008 B1
7332890 Cohen Feb 2008 B2
7343221 Ann Mar 2008 B2
7343719 Sus Mar 2008 B2
7346428 Huffman Mar 2008 B1
7349759 Peless Mar 2008 B2
7359766 Jeon Apr 2008 B2
7363994 DeFazio Apr 2008 B1
7369460 Chiappetta May 2008 B2
7372004 Buchner May 2008 B2
7388343 Jones Jun 2008 B2
7389156 Ziegler Jun 2008 B2
7389166 Harwig Jun 2008 B2
7402974 Jeon Jul 2008 B2
7403360 Cunningham Jul 2008 B2
7412748 Lee Aug 2008 B2
7417404 Lee Aug 2008 B2
7418762 Arai Sep 2008 B2
7424766 Reindie Sep 2008 B2
7429843 Jones Sep 2008 B2
7430455 Casey Sep 2008 B2
7438766 Song Oct 2008 B2
7441298 Svendsen Oct 2008 B2
7444206 Abramson Oct 2008 B2
7448113 Jones Nov 2008 B2
7459871 Landry Dec 2008 B2
7464157 Okude Dec 2008 B2
7474941 Kim Jan 2009 B2
7480958 Song Jan 2009 B2
7480960 Kim Jan 2009 B2
D586959 Geringer Feb 2009 S
7489277 Sung Feb 2009 B2
7489985 Ko Feb 2009 B2
7499774 Barrett Mar 2009 B2
7499775 Filippov Mar 2009 B2
7499776 Allard Mar 2009 B2
7499804 Svendsen Mar 2009 B2
7503096 Lin Mar 2009 B2
7515991 Egawa Apr 2009 B2
D593265 Carr May 2009 S
7539557 Yamauchi May 2009 B2
7546891 Won Jun 2009 B2
7546912 Pack Jun 2009 B1
7555363 Augenbraun Jun 2009 B2
7556108 Won Jul 2009 B2
7559269 Rudakevych Jul 2009 B2
7564571 Karabassi Jul 2009 B2
7566839 Hukuba Jul 2009 B2
7567052 Jones Jul 2009 B2
7568259 Yan Aug 2009 B2
7568536 Yu Aug 2009 B2
7571511 Jones Aug 2009 B2
7573403 Goncalves Aug 2009 B2
7574282 Petersson Aug 2009 B2
7578020 Jaworski Aug 2009 B2
7579803 Jones Aug 2009 B2
7581282 Woo Sep 2009 B2
7597162 Won Oct 2009 B2
7600521 Woo Oct 2009 B2
7600593 Filippov Oct 2009 B2
7603744 Reindle Oct 2009 B2
7604675 Makarov Oct 2009 B2
7610651 Baek Nov 2009 B2
7613543 Petersson Nov 2009 B2
7620476 Morse Nov 2009 B2
7636982 Jones Dec 2009 B2
7647144 Haegermarck Jan 2010 B2
7650666 Jang Jan 2010 B2
7654348 Ohm Feb 2010 B2
7660650 Kawagoe Feb 2010 B2
7663333 Jones Feb 2010 B2
7673367 Kim Mar 2010 B2
7679532 Karlsson Mar 2010 B2
7688676 Chiappetta Mar 2010 B2
7693654 Dietsch Apr 2010 B1
7697141 Jones Apr 2010 B2
7706917 Chiappetta Apr 2010 B1
7706921 Jung Apr 2010 B2
7709497 Christensen, IV May 2010 B2
7711450 Im May 2010 B2
7720572 Ziegler May 2010 B2
7721829 Lee May 2010 B2
7729801 Abramson Jun 2010 B2
7749294 Oh Jul 2010 B2
7751940 Lee Jul 2010 B2
7761954 Ziegler Jul 2010 B2
7765635 Park Aug 2010 B2
7765638 Pineschi Aug 2010 B2
7769490 Abramson Aug 2010 B2
7774158 Domingues Goncalves Aug 2010 B2
7779504 Lee Aug 2010 B2
7780796 Shim Aug 2010 B2
7784139 Sawalski Aug 2010 B2
7784570 Couture Aug 2010 B2
7785544 Alward Aug 2010 B2
7787991 Jeung Aug 2010 B2
7793614 Ericsson Sep 2010 B2
7801645 Taylor Sep 2010 B2
7805220 Taylor Sep 2010 B2
7827653 Liu Nov 2010 B1
7832048 Harwig Nov 2010 B2
7835529 Hernandez Nov 2010 B2
7843431 Robbins Nov 2010 B2
7844364 McLurkin Nov 2010 B2
7849555 Hahm Dec 2010 B2
7856291 Jung Dec 2010 B2
7860608 Lee Dec 2010 B2
7861365 Sun Jan 2011 B2
7861366 Hahm Jan 2011 B2
7873437 Aldred Jan 2011 B2
7877166 Harwig Jan 2011 B2
7886399 Dayton Feb 2011 B2
7890210 Choi Feb 2011 B2
7891045 Kim Feb 2011 B2
7891289 Day Feb 2011 B2
7891446 Couture Feb 2011 B2
7894951 Norris Feb 2011 B2
7916931 Lee Mar 2011 B2
7920941 Park Apr 2011 B2
7921506 Baek Apr 2011 B2
7926598 Rudakevych Apr 2011 B2
7930797 Yoo Apr 2011 B2
7934571 Chiu May 2011 B2
7937800 Yan May 2011 B2
7942107 Vosburgh May 2011 B2
7957837 Ziegler Jun 2011 B2
7962997 Chung Jun 2011 B2
7966339 Kim Jun 2011 B2
7975790 Kim Jul 2011 B2
7979175 Allard Jul 2011 B2
7979945 Dayton Jul 2011 B2
7981455 Sus Jul 2011 B2
7997118 Mecca Aug 2011 B2
8001651 Chang Aug 2011 B2
8007221 More Aug 2011 B1
8010229 Kim Aug 2011 B2
8019223 Hudson Sep 2011 B2
8020657 Allard Sep 2011 B2
8032978 Haegermarck Oct 2011 B2
8034390 Sus Oct 2011 B2
8042663 Pack Oct 2011 B1
8046103 Abramson Oct 2011 B2
8061461 Couture Nov 2011 B2
8065778 Kim Nov 2011 B2
8073439 Stromberg Dec 2011 B2
8074752 Rudakevych Dec 2011 B2
8078338 Pack Dec 2011 B2
8079432 Ohm Dec 2011 B2
8082836 More Dec 2011 B2
8086419 Goncalves Dec 2011 B2
8087117 Kapoor Jan 2012 B2
8095238 Jones Jan 2012 B2
8095336 Goncalves Jan 2012 B2
8107318 Chiappetta Jan 2012 B2
8108092 Phillips Jan 2012 B2
8109191 Rudakevych Feb 2012 B1
8112942 Bohm Feb 2012 B2
8113304 Won Feb 2012 B2
8122982 Morey Feb 2012 B2
8127396 Mangiardi Mar 2012 B2
8127399 Dilger Mar 2012 B2
8127704 Vosburgh Mar 2012 B2
8136200 Splinter Mar 2012 B2
8150650 Goncalves Apr 2012 B2
D659311 Geringer May 2012 S
8166904 Israel May 2012 B2
8195333 Ziegler Jun 2012 B2
8196251 Lynch Jun 2012 B2
8199109 Robbins Jun 2012 B2
8200600 Rosenstein Jun 2012 B2
8200700 Moore Jun 2012 B2
8237389 Fitch Aug 2012 B2
8237920 Jones Aug 2012 B2
8239992 Schnittman Aug 2012 B2
8244469 Cheung Aug 2012 B2
8253368 Landry Aug 2012 B2
8255092 Phillips Aug 2012 B2
8256542 Couture Sep 2012 B2
8265793 Cross Sep 2012 B2
8274406 Karlsson Sep 2012 B2
8281703 Moore Oct 2012 B2
8281731 Vosburgh Oct 2012 B2
8290619 McLurkin Oct 2012 B2
8292007 DeFazio Oct 2012 B2
8295125 Chiappetta Oct 2012 B2
D670877 Geringer Nov 2012 S
8308529 DAmbra Nov 2012 B2
8311674 Abramson Nov 2012 B2
8316971 Couture Nov 2012 B2
8318499 Fritchie Nov 2012 B2
D672928 Swett Dec 2012 S
8322470 Ohm Dec 2012 B2
8326469 Phillips Dec 2012 B2
8327960 Couture Dec 2012 B2
8336479 Vosburgh Dec 2012 B2
8342271 Filippov Jan 2013 B2
8347088 Moore Jan 2013 B2
8347444 Schnittman Jan 2013 B2
8350810 Robbins Jan 2013 B2
8353373 Rudakevych Jan 2013 B2
8364309 Bailey Jan 2013 B1
8364310 Jones Jan 2013 B2
8365848 Won Feb 2013 B2
8368339 Jones Feb 2013 B2
8370985 Schnittman Feb 2013 B2
8374721 Halloran Feb 2013 B2
8375838 Rudakevych Feb 2013 B2
8378613 Landry Feb 2013 B2
8380350 Ozick Feb 2013 B2
8382906 Konandreas Feb 2013 B2
8386081 Landry Feb 2013 B2
8387193 Ziegler Mar 2013 B2
8390251 Cohen Mar 2013 B2
8392021 Konandreas Mar 2013 B2
8396592 Jones Mar 2013 B2
8396611 Phillips Mar 2013 B2
8402586 Lavabre Mar 2013 B2
8408956 Vosburgh Apr 2013 B1
8412377 Casey Apr 2013 B2
8413752 Page Apr 2013 B2
8417188 Vosburgh Apr 2013 B1
8417383 Ozick Apr 2013 B2
8418303 Kapoor Apr 2013 B2
8418642 Vosburgh Apr 2013 B2
8428778 Landry Apr 2013 B2
8433442 Friedman Apr 2013 B2
D682362 Mozeika May 2013 S
8438694 Kim May 2013 B2
8438695 Gilbert, Jr. May 2013 B2
8438698 Kim May 2013 B2
8447440 Phillips May 2013 B2
8447613 Hussey May 2013 B2
8452448 Pack May 2013 B2
8453289 Lynch Jun 2013 B2
8456125 Landry Jun 2013 B2
8461803 Cohen Jun 2013 B2
8463438 Jones Jun 2013 B2
8473140 Norris Jun 2013 B2
8474090 Jones Jul 2013 B2
8478442 Casey Jul 2013 B2
8485330 Pack Jul 2013 B2
8505158 Han Aug 2013 B2
8508388 Karlsson Aug 2013 B2
8515578 Chiappetta Aug 2013 B2
8516651 Jones Aug 2013 B2
8525995 Jones Sep 2013 B2
8527113 Yamauchi Sep 2013 B2
8528157 Schnittman Sep 2013 B2
8528162 Tang Sep 2013 B2
8528673 More Sep 2013 B2
8532822 Abramson Sep 2013 B2
8533144 Reeser Sep 2013 B1
8534983 Schoenfeld Sep 2013 B2
8543562 Mule Sep 2013 B2
8548626 Steltz Oct 2013 B2
8551254 Dayton Oct 2013 B2
8551421 Luchinger Oct 2013 B2
8565920 Casey Oct 2013 B2
8572799 Won Nov 2013 B2
8584305 Won Nov 2013 B2
8584306 Chung Nov 2013 B2
8584307 Won Nov 2013 B2
8594840 Chiappetta Nov 2013 B1
8598829 Landry Dec 2013 B2
8599645 Chiappetta Dec 2013 B2
8600553 Svendsen Dec 2013 B2
8606401 Ozick Dec 2013 B2
8634956 Chiappetta Jan 2014 B1
8634958 Chiappetta Jan 2014 B1
8666523 Kim Mar 2014 B2
8671513 Yoo Mar 2014 B2
8732895 Cunningham May 2014 B2
8741013 Swett Jun 2014 B2
8743286 Hasegawa Jun 2014 B2
8745194 Uribe-Etxebarria Jimenez Jun 2014 B2
8755936 Friedman Jun 2014 B2
8761931 Halloran Jun 2014 B2
8763200 Kim Jul 2014 B2
8774970 Knopow Jul 2014 B2
8780342 Dibernardo Jul 2014 B2
8798791 Li Aug 2014 B2
8798792 Park Aug 2014 B2
8799258 Mule Aug 2014 B2
8838274 Jones Sep 2014 B2
8839477 Schnittman Sep 2014 B2
8843245 Choe Sep 2014 B2
8855914 Alexander Oct 2014 B1
8874264 Chiappetta Oct 2014 B1
8880342 Ando Nov 2014 B2
8881339 Gilbert Nov 2014 B2
8924042 Kim Dec 2014 B2
8961695 Romanov Feb 2015 B2
8985127 Konandreas Mar 2015 B2
8996172 Shah Mar 2015 B2
9033079 Shin May 2015 B2
9037396 Pack May 2015 B2
9052721 Dowdall Jun 2015 B1
9104206 Biber Aug 2015 B2
9144361 Landry Sep 2015 B2
9215957 Cohen Dec 2015 B2
9223312 Goel et al. Dec 2015 B2
9259129 Jang Feb 2016 B2
9360300 DiBernardo Jun 2016 B2
9392920 Halloran Jul 2016 B2
9436318 Omura Sep 2016 B2
9550294 Cohen Jan 2017 B2
9596971 Yoon Mar 2017 B2
9629514 Hillen Apr 2017 B2
9687132 Schlischka Jun 2017 B2
9775476 Jang et al. Oct 2017 B2
9939529 Haegermarck Apr 2018 B2
9993129 Santini Jun 2018 B2
9999328 Vanderstegen-Drake Jun 2018 B2
10045675 Haegermarck Aug 2018 B2
10247669 Windorfer Apr 2019 B2
10296007 Vicenti May 2019 B2
10518416 Haegermarck Dec 2019 B2
10766132 Romanov Sep 2020 B2
20010004719 Sommer Jun 2001 A1
20010037163 Allard Nov 2001 A1
20020016649 Jones Feb 2002 A1
20020091466 Song Jul 2002 A1
20020108635 Marrero Aug 2002 A1
20020121288 Marrero Sep 2002 A1
20020121561 Marrero Sep 2002 A1
20020153185 Song Oct 2002 A1
20020164932 Kamimura Nov 2002 A1
20020174506 Wallach Nov 2002 A1
20020185071 Guo Dec 2002 A1
20020189871 Won Dec 2002 A1
20030000034 Welsh Jan 2003 A1
20030025472 Jones Feb 2003 A1
20030030398 Jacobs Feb 2003 A1
20030120972 Matsushima Jun 2003 A1
20030140449 Alton Jul 2003 A1
20030159223 Plankenhorn Aug 2003 A1
20030167000 Mullick Sep 2003 A1
20030229421 Chmura Dec 2003 A1
20040020000 Jones Feb 2004 A1
20040031111 Porchia Feb 2004 A1
20040031121 Martin Feb 2004 A1
20040034952 Ho Feb 2004 A1
20040049877 Jones Mar 2004 A1
20040049878 Thomas Mar 2004 A1
20040074038 Im Apr 2004 A1
20040074039 Kim Apr 2004 A1
20040098167 Yi May 2004 A1
20040111184 Chiappetta Jun 2004 A1
20040111827 Im Jun 2004 A1
20040158357 Lee Aug 2004 A1
20040167667 Goncalves Aug 2004 A1
20040181896 Egawa Sep 2004 A1
20040182839 Denney Sep 2004 A1
20040182840 Denney Sep 2004 A1
20040185011 Alexander Sep 2004 A1
20040187249 Jones Sep 2004 A1
20040207355 Jones Oct 2004 A1
20040208212 Denney Oct 2004 A1
20040210343 Kim Oct 2004 A1
20040220707 Pallister Nov 2004 A1
20050000543 Taylor Jan 2005 A1
20050010331 Taylor Jan 2005 A1
20050015912 Kim Jan 2005 A1
20050015915 Thomas Jan 2005 A1
20050021181 Kim et al. Jan 2005 A1
20050028315 Thomas Feb 2005 A1
20050028316 Thomas Feb 2005 A1
20050042151 Alward Feb 2005 A1
20050065662 Reindle Mar 2005 A1
20050085947 Aldred Apr 2005 A1
20050088643 Anderson Apr 2005 A1
20050156562 Cohen Jul 2005 A1
20050166354 Uehigashi Aug 2005 A1
20050166355 Tani Aug 2005 A1
20050171638 Uehigashi Aug 2005 A1
20050171644 Tani Aug 2005 A1
20050172435 Bernini Aug 2005 A1
20050191949 Kamimura Sep 2005 A1
20050217061 Reindle Oct 2005 A1
20050223514 Stuchlik Oct 2005 A1
20050229340 Sawalski Oct 2005 A1
20050230166 Petersson Oct 2005 A1
20050234611 Uehigashi Oct 2005 A1
20050251292 Casey Nov 2005 A1
20050251457 Kashiwagi Nov 2005 A1
20050251947 Lee Nov 2005 A1
20050267629 Petersson Dec 2005 A1
20050278888 Reindle Dec 2005 A1
20050287038 Dubrovsky Dec 2005 A1
20060006316 Takenaka Jan 2006 A1
20060009879 Lynch Jan 2006 A1
20060010799 Bohm Jan 2006 A1
20060020369 Taylor Jan 2006 A1
20060020370 Abramson Jan 2006 A1
20060028306 Hukuba Feb 2006 A1
20060032013 Kim Feb 2006 A1
20060045981 Tsushi Mar 2006 A1
20060047364 Tani Mar 2006 A1
20060076039 Song Apr 2006 A1
20060095158 Lee May 2006 A1
20060136096 Chiappetta Jun 2006 A1
20060144834 Denney Jul 2006 A1
20060178777 Park Aug 2006 A1
20060190133 Konandreas Aug 2006 A1
20060190134 Ziegler Aug 2006 A1
20060190146 Morse Aug 2006 A1
20060195015 Mullick Aug 2006 A1
20060200281 Ziegler Sep 2006 A1
20060213025 Sawalski Sep 2006 A1
20060235570 Jung Oct 2006 A1
20060235585 Tanaka Oct 2006 A1
20060236492 Sudo Oct 2006 A1
20060288519 Jaworski Dec 2006 A1
20060293788 Pogodin Dec 2006 A1
20070016328 Ziegler Jan 2007 A1
20070021867 Woo Jan 2007 A1
20070059441 Greer Mar 2007 A1
20070061040 Augenbraun Mar 2007 A1
20070114975 Cohen May 2007 A1
20070118248 Lee May 2007 A1
20070124890 Erko Jun 2007 A1
20070136981 Dilger et al. Jun 2007 A1
20070143950 Lin Jun 2007 A1
20070156286 Yamauchi Jul 2007 A1
20070179670 Chiappetta Aug 2007 A1
20070189347 Denney Aug 2007 A1
20070204426 Nakagawa Sep 2007 A1
20070213892 Jones Sep 2007 A1
20070214601 Chung Sep 2007 A1
20070234492 Svendsen Oct 2007 A1
20070244610 Ozick Oct 2007 A1
20070250212 Halloran Oct 2007 A1
20070266508 Jones Nov 2007 A1
20070267230 Won Nov 2007 A1
20070267570 Park Nov 2007 A1
20070267998 Cohen Nov 2007 A1
20070273864 Cho Nov 2007 A1
20070276541 Sawasaki Nov 2007 A1
20070285041 Jones Dec 2007 A1
20070289267 Makarov Dec 2007 A1
20070290649 Jones Dec 2007 A1
20080000041 Jones Jan 2008 A1
20080000042 Jones Jan 2008 A1
20080001566 Jones Jan 2008 A1
20080007203 Cohen Jan 2008 A1
20080009964 Bruemmer Jan 2008 A1
20080015738 Casey Jan 2008 A1
20080016631 Casey Jan 2008 A1
20080037170 Saliba Feb 2008 A1
20080039974 Sandin Feb 2008 A1
20080047092 Schnittman Feb 2008 A1
20080051953 Jones Feb 2008 A1
20080007193 Bow Mar 2008 A1
20080052846 Kapoor Mar 2008 A1
20080058987 Ozick Mar 2008 A1
20080063400 Hudson Mar 2008 A1
20080065265 Ozick Mar 2008 A1
20080077278 Park Mar 2008 A1
20080079383 Nakamoto Apr 2008 A1
20080084174 Jones Apr 2008 A1
20080086241 Phillips Apr 2008 A1
20080091304 Ozick Apr 2008 A1
20080091305 Svendsen Apr 2008 A1
20080093131 Couture Apr 2008 A1
20080098553 Dayton May 2008 A1
20080105445 Dayton May 2008 A1
20080109126 Sandin May 2008 A1
20080121097 Rudakevych May 2008 A1
20080127445 Konandreas Jun 2008 A1
20080127446 Ziegler Jun 2008 A1
20080133052 Jones Jun 2008 A1
20080134457 Morse Jun 2008 A1
20080134458 Ziegler Jun 2008 A1
20080140255 Ziegler Jun 2008 A1
20080143063 Won Jun 2008 A1
20080143064 Won Jun 2008 A1
20080143065 DeFazio Jun 2008 A1
20080152871 Greer Jun 2008 A1
20080155768 Ziegler Jul 2008 A1
20080179115 Ohm Jul 2008 A1
20080183332 Ohm Jul 2008 A1
20080184518 Taylor Aug 2008 A1
20080196946 Filippov Aug 2008 A1
20080205194 Chiappetta Aug 2008 A1
20080209665 Mangiardi Sep 2008 A1
20080221729 Lavarec Sep 2008 A1
20080223630 Couture Sep 2008 A1
20080235897 Kim Oct 2008 A1
20080236907 Won Oct 2008 A1
20080264456 Lynch Oct 2008 A1
20080266254 Robbins Oct 2008 A1
20080276407 Schnittman Nov 2008 A1
20080276408 Gilbert Nov 2008 A1
20080281470 Gilbert Nov 2008 A1
20080282494 Won Nov 2008 A1
20080294288 Yamauchi Nov 2008 A1
20080307590 Jones Dec 2008 A1
20090007366 Svendsen Jan 2009 A1
20090025155 Nishiyama Jan 2009 A1
20090030551 Hein Jan 2009 A1
20090037024 Jamieson Feb 2009 A1
20090038089 Landry Feb 2009 A1
20090044370 Won Feb 2009 A1
20090045766 Casey Feb 2009 A1
20090055022 Casey Feb 2009 A1
20090065271 Won Mar 2009 A1
20090070946 Tamada Mar 2009 A1
20090078035 Mecca Mar 2009 A1
20090107738 Won Apr 2009 A1
20090125175 Park May 2009 A1
20090126143 Haegermarck May 2009 A1
20090133720 Vandenbogert May 2009 A1
20090145671 Filippov Jun 2009 A1
20090173553 Won Jul 2009 A1
20090180668 Jones Jul 2009 A1
20090226113 Matsumoto Sep 2009 A1
20090232506 Hudson Sep 2009 A1
20090241826 Vosburgh Oct 2009 A1
20090254217 Pack Oct 2009 A1
20090254218 Sandin Oct 2009 A1
20090265036 Jamieson Oct 2009 A1
20090270015 DAmbra Oct 2009 A1
20090274602 Alward Nov 2009 A1
20090281661 Dooley Nov 2009 A1
20090292393 Casey Nov 2009 A1
20090292884 Wang Nov 2009 A1
20090314318 Chang Dec 2009 A1
20090314554 Couture Dec 2009 A1
20090319083 Jones Dec 2009 A1
20100001478 DeFazio Jan 2010 A1
20100011529 Won Jan 2010 A1
20100037418 Hussey Feb 2010 A1
20100049364 Landry Feb 2010 A1
20100049365 Jones Feb 2010 A1
20100049391 Nakano Feb 2010 A1
20100054129 Kuik Mar 2010 A1
20100063628 Landry Mar 2010 A1
20100075054 Kaneyama Mar 2010 A1
20100076600 Cross Mar 2010 A1
20100078415 Denney Apr 2010 A1
20100082193 Chiappetta Apr 2010 A1
20100107355 Won May 2010 A1
20100108098 Splinter May 2010 A1
20100115716 Landry May 2010 A1
20100116566 Ohm May 2010 A1
20100125968 Ho May 2010 A1
20100139029 Kim Jun 2010 A1
20100139995 Rudakevych Jun 2010 A1
20100161225 Hyung Jun 2010 A1
20100173070 Niu Jul 2010 A1
20100206336 Souid Aug 2010 A1
20100217436 Jones Aug 2010 A1
20100256812 Tsusaka et al. Oct 2010 A1
20100257690 Jones Oct 2010 A1
20100257691 Jones Oct 2010 A1
20100263142 Jones Oct 2010 A1
20100263158 Jones Oct 2010 A1
20100268384 Jones Oct 2010 A1
20100268385 Rew Oct 2010 A1
20100275405 Morse Nov 2010 A1
20100286791 Goldsmith Nov 2010 A1
20100305752 Abramson Dec 2010 A1
20100312429 Jones Dec 2010 A1
20100313910 Lee Dec 2010 A1
20100313912 Han Dec 2010 A1
20110000363 More Jan 2011 A1
20110004339 Ozick Jan 2011 A1
20110010873 Kim Jan 2011 A1
20110077802 Halloran Mar 2011 A1
20110082668 Escrig Apr 2011 A1
20110088609 Vosburgh Apr 2011 A1
20110109549 Robbins May 2011 A1
20110125323 Gutmann May 2011 A1
20110131741 Jones Jun 2011 A1
20110154589 Reindle Jun 2011 A1
20110202175 Romanov Aug 2011 A1
20110209726 Dayton Sep 2011 A1
20110252594 Blouin Oct 2011 A1
20110258789 Lavabre Oct 2011 A1
20110271469 Ziegler Nov 2011 A1
20110277269 Kim Nov 2011 A1
20110286886 Luchinger Nov 2011 A1
20110288684 Farlow Nov 2011 A1
20120011668 Schnittman Jan 2012 A1
20120011669 Schnittman Jan 2012 A1
20120011676 Jung Jan 2012 A1
20120011677 Jung Jan 2012 A1
20120011992 Rudakevych Jan 2012 A1
20120036659 Ziegler Feb 2012 A1
20120047676 Jung Mar 2012 A1
20120049798 Cohen Mar 2012 A1
20120079670 Yoon Apr 2012 A1
20120083924 Jones Apr 2012 A1
20120084934 Li Apr 2012 A1
20120084937 Won Apr 2012 A1
20120084938 Fu Apr 2012 A1
20120085368 Landry Apr 2012 A1
20120090133 Kim Apr 2012 A1
20120095619 Pack Apr 2012 A1
20120096656 Jung Apr 2012 A1
20120097783 Pack Apr 2012 A1
20120101661 Phillips Apr 2012 A1
20120102670 Jang May 2012 A1
20120106829 Lee May 2012 A1
20120109423 Pack May 2012 A1
20120110755 Liu May 2012 A1
20120118216 Vosburgh May 2012 A1
20120125363 Kim May 2012 A1
20120137464 Thatcher Jun 2012 A1
20120137949 Vosburgh Jun 2012 A1
20120151709 Tang Jun 2012 A1
20120152280 Bosses Jun 2012 A1
20120152877 Tadayon Jun 2012 A1
20120159725 Kapoor Jun 2012 A1
20120166024 Phillips Jun 2012 A1
20120167917 Gilbert Jul 2012 A1
20120169497 Schnittman Jul 2012 A1
20120173018 Allen Jul 2012 A1
20120173070 Schnittman Jul 2012 A1
20120180254 Morse Jul 2012 A1
20120180712 Vosburgh Jul 2012 A1
20120181099 Moon Jul 2012 A1
20120182392 Kearns Jul 2012 A1
20120183382 Couture Jul 2012 A1
20120185091 Field Jul 2012 A1
20120185094 Rosenstein Jul 2012 A1
20120185095 Rosenstein Jul 2012 A1
20120185096 Rosenstein Jul 2012 A1
20120192898 Lynch Aug 2012 A1
20120194395 Williams Aug 2012 A1
20120194427 Lee et al. Aug 2012 A1
20120197439 Wang Aug 2012 A1
20120197464 Wang Aug 2012 A1
20120199006 Swett Aug 2012 A1
20120199407 Morey Aug 2012 A1
20120200149 Rudakevych Aug 2012 A1
20120219207 Shin et al. Aug 2012 A1
20120222224 Yoon Sep 2012 A1
20120246862 Landry Oct 2012 A1
20120260443 Lindgren Oct 2012 A1
20120260861 Lindgren Oct 2012 A1
20120261204 Won Oct 2012 A1
20120265346 Gilbert Oct 2012 A1
20120265391 Letsky Oct 2012 A1
20120268587 Robbins Oct 2012 A1
20120281829 Rudakevych Nov 2012 A1
20120298029 Vosburgh Nov 2012 A1
20120303160 Ziegler Nov 2012 A1
20120311810 Gilbert Dec 2012 A1
20120312221 Vosburgh Dec 2012 A1
20120317745 Jung Dec 2012 A1
20120322349 Josi Dec 2012 A1
20130015596 Mozeika Jan 2013 A1
20130025085 Kim Jan 2013 A1
20130031734 Porat Feb 2013 A1
20130032078 Yahnker Feb 2013 A1
20130035793 Neumann Feb 2013 A1
20130047368 Tran Feb 2013 A1
20130054029 Huang Feb 2013 A1
20130054129 Wong Feb 2013 A1
20130060357 Li Mar 2013 A1
20130060379 Choe Mar 2013 A1
20130070563 Chiappetta Mar 2013 A1
20130081218 Kim Apr 2013 A1
20130085603 Chiappetta Apr 2013 A1
20130086760 Han Apr 2013 A1
20130092190 Yoon Apr 2013 A1
20130098402 Yoon Apr 2013 A1
20130103194 Jones Apr 2013 A1
20130105233 Couture May 2013 A1
20130117952 Schnittman May 2013 A1
20130118524 Konandreas May 2013 A1
20130138246 Gutmann May 2013 A1
20130138337 Pack May 2013 A1
20130145572 Schregardus Jun 2013 A1
20130152724 Mozeika Jun 2013 A1
20130152970 Porat Jun 2013 A1
20130160226 Lee Jun 2013 A1
20130166107 Robbins Jun 2013 A1
20130174371 Jones Jul 2013 A1
20130198481 Gander Aug 2013 A1
20130200993 Wu Aug 2013 A1
20130204463 Chiappetta Aug 2013 A1
20130204465 Phillips Aug 2013 A1
20130204483 Sung Aug 2013 A1
20130205520 Kapoor Aug 2013 A1
20130206170 Svendsen Aug 2013 A1
20130206177 Burlutskiy Aug 2013 A1
20130211589 Landry Aug 2013 A1
20130214498 DeFazio Aug 2013 A1
20130226344 Wong Aug 2013 A1
20130227801 Kim Sep 2013 A1
20130227812 Kim Sep 2013 A1
20130228198 Hung Sep 2013 A1
20130228199 Hung Sep 2013 A1
20130231779 Purkayastha Sep 2013 A1
20130231819 Hung Sep 2013 A1
20130232702 Baek Sep 2013 A1
20130239870 Hudson Sep 2013 A1
20130241217 Hickey Sep 2013 A1
20130253701 Halloran Sep 2013 A1
20130256042 Rudakevych Oct 2013 A1
20130268118 Grinstead Oct 2013 A1
20130269148 Chiu Oct 2013 A1
20130273252 Miyamoto Oct 2013 A1
20130298350 Schnittman Nov 2013 A1
20130310978 Ozick Nov 2013 A1
20130317944 Huang Nov 2013 A1
20130325178 Jones Dec 2013 A1
20130331987 Karlsson Dec 2013 A1
20130331988 Goel Dec 2013 A1
20130331990 Jeong Dec 2013 A1
20130338525 Allen Dec 2013 A1
20130338828 Chiappetta Dec 2013 A1
20130338831 Noh Dec 2013 A1
20130340201 Jang Dec 2013 A1
20140016469 Ho Jan 2014 A1
20140026338 Kim Jan 2014 A1
20140026339 Konandreas Jan 2014 A1
20140036062 Yoon Feb 2014 A1
20140053351 Kapoor Feb 2014 A1
20140109339 Won Apr 2014 A1
20140123325 Jung May 2014 A1
20140130272 Won May 2014 A1
20140142757 Ziegler May 2014 A1
20140166047 Hillen Jun 2014 A1
20140167931 Lee Jun 2014 A1
20140180968 Song Jun 2014 A1
20140184144 Henricksen Jul 2014 A1
20140207280 Duffley Jul 2014 A1
20140207281 Angle Jul 2014 A1
20140207282 Angle Jul 2014 A1
20140214205 Kwon Jul 2014 A1
20140238440 Dayton Aug 2014 A1
20140249671 Halloran Sep 2014 A1
20140283326 Song Sep 2014 A1
20150000068 Tsuboi Jan 2015 A1
20150005937 Ponulak Jan 2015 A1
20150032259 Kim Jan 2015 A1
20150033488 Varila Feb 2015 A1
20150039127 Matsumoto Feb 2015 A1
20150057800 Cohen Feb 2015 A1
20150120056 Noh Apr 2015 A1
20150158174 Romanov Jun 2015 A1
20150185322 Haegermarck Jul 2015 A1
20150197012 Schnittman Jul 2015 A1
20150206015 Ramalingam Jul 2015 A1
20150265122 Han Sep 2015 A1
20150362921 Hanaoka Dec 2015 A1
20150367512 Hong Dec 2015 A1
20160007817 Schlischka Jan 2016 A1
20160008982 Artes Jan 2016 A1
20160103451 Vicenti Apr 2016 A1
20160144511 Romanov et al. May 2016 A1
20160167226 Schnittman Jun 2016 A1
20160202703 Matsubara Jul 2016 A1
20160235270 Santini Aug 2016 A1
20160298970 Lindhe Oct 2016 A1
20160306359 Lindhe Oct 2016 A1
20160316982 Kim Nov 2016 A1
20170197315 Haegermarck Jul 2017 A1
20170273521 Klintemyr Sep 2017 A1
20170273524 Klintemyr Sep 2017 A1
20170296021 Li Oct 2017 A1
20170344013 Haegermarck Nov 2017 A1
20170344019 Haegermarck et al. Nov 2017 A1
20180088583 Wang et al. Mar 2018 A1
20180103812 Lee Apr 2018 A1
20190086933 Munich et al. Mar 2019 A1
20210053207 Romanov Feb 2021 A1
Foreign Referenced Citations (217)
Number Date Country
2154758 Jun 1995 CA
1116818 Feb 1996 CN
1668238 Sep 2005 CN
1883889 Dec 2006 CN
1985738 Jun 2007 CN
101161174 Apr 2008 CN
101297267 Oct 2008 CN
102046059 May 2011 CN
102083352 Jun 2011 CN
201861561 Jun 2011 CN
102183959 Sep 2011 CN
102506748 Jun 2012 CN
102949144 Mar 2013 CN
103027634 Apr 2013 CN
103054516 Apr 2013 CN
103308050 Sep 2013 CN
103376801 Oct 2013 CN
103491838 Jan 2014 CN
103505155 Jan 2014 CN
103534659 Jan 2014 CN
103565373 Feb 2014 CN
103948354 Jul 2014 CN
104302453 Jan 2015 CN
105326442 Feb 2016 CN
205091616 Mar 2016 CN
105982611 Oct 2016 CN
3536907 Apr 1986 DE
9307500 Jul 1993 DE
4211789 Oct 1993 DE
4340367 Jun 1995 DE
4439427 May 1996 DE
19849978 May 2000 DE
10311299 Apr 2004 DE
202008017137 Mar 2009 DE
102010000174 Jul 2011 DE
102010000573 Sep 2011 DE
102010037672 Mar 2012 DE
202017000833 Mar 2017 DE
0142594 May 1985 EP
0358628 Mar 1990 EP
0474542 Mar 1992 EP
0569984 Nov 1993 EP
0606173 Jul 1994 EP
1099143 Nov 2003 EP
1360922 Nov 2003 EP
1441271 Jul 2004 EP
1331537 Aug 2005 EP
2050380 Apr 2009 EP
1969438 Sep 2009 EP
1395888 May 2011 EP
2316322 May 2011 EP
2322071 May 2011 EP
2296005 Jun 2011 EP
2251757 Nov 2011 EP
2417894 Feb 2012 EP
2438843 Apr 2012 EP
2466411 Jun 2012 EP
2502540 Sep 2012 EP
2561787 Feb 2013 EP
2578125 Apr 2013 EP
2583609 Apr 2013 EP
2604163 Jun 2013 EP
2624177 Aug 2013 EP
2679130 Jan 2014 EP
2447800 Apr 2014 EP
2741483 Jun 2014 EP
2772815 Sep 2014 EP
2992803 Mar 2016 EP
3047782 Jul 2016 EP
3199083 Aug 2017 EP
2999410 Jun 2014 FR
1447943 Sep 1976 GB
2355523 Apr 2001 GB
2382251 May 2003 GB
2494446 Mar 2013 GB
2884364 Jun 2015 GB
5540959 Mar 1980 JP
6286414 Apr 1987 JP
62109528 May 1987 JP
62120510 Jun 1987 JP
62152421 Jul 1987 JP
62152424 Jul 1987 JP
63127310 May 1988 JP
63181727 Jul 1988 JP
63241610 Oct 1988 JP
03162814 Jul 1991 JP
03166074 Jul 1991 JP
04260905 Sep 1992 JP
0584200 Apr 1993 JP
0584210 Apr 1993 JP
05084200 Apr 1993 JP
05184489 Jul 1993 JP
05189041 Jul 1993 JP
05224745 Sep 1993 JP
05228090 Sep 1993 JP
064133 Jan 1994 JP
0643935 Feb 1994 JP
0683442 Mar 1994 JP
06125861 May 1994 JP
06144215 May 1994 JP
06179145 Jun 1994 JP
075922 Jan 1995 JP
0759695 Mar 1995 JP
0732752 Apr 1995 JP
07129239 May 1995 JP
07281742 Oct 1995 JP
07287617 Oct 1995 JP
08089455 Apr 1996 JP
08286746 Nov 1996 JP
08326025 Dec 1996 JP
0944240 Feb 1997 JP
09150741 Jun 1997 JP
09185410 Jul 1997 JP
11267074 Oct 1999 JP
2001022443 Jan 2001 JP
2001187009 Jul 2001 JP
2002287824 Oct 2002 JP
2002533797 Oct 2002 JP
2002355204 Dec 2002 JP
2002366228 Dec 2002 JP
2003505127 Feb 2003 JP
2003116758 Apr 2003 JP
2003180587 Jul 2003 JP
2003225184 Aug 2003 JP
2003280740 Oct 2003 JP
2004096253 Mar 2004 JP
2004136144 May 2004 JP
2004166968 Jun 2004 JP
2004198212 Jul 2004 JP
2004303134 Oct 2004 JP
200540597 Feb 2005 JP
2005050105 Feb 2005 JP
2005124753 May 2005 JP
2005141636 Jun 2005 JP
2005192609 Jul 2005 JP
2005314116 Nov 2005 JP
2006015113 Jan 2006 JP
2006087507 Apr 2006 JP
2006231477 Sep 2006 JP
2006314669 Nov 2006 JP
2007014369 Jan 2007 JP
2007070658 Mar 2007 JP
2007143645 Jun 2007 JP
2006185438 Jul 2007 JP
2007213236 Aug 2007 JP
2007226322 Sep 2007 JP
2007272665 Oct 2007 JP
2008132299 Jun 2008 JP
2008146617 Jun 2008 JP
2008290184 Dec 2008 JP
2008543394 Dec 2008 JP
2009500741 Jan 2009 JP
2009509220 Mar 2009 JP
2009193240 Aug 2009 JP
2010507169 Mar 2010 JP
201079869 Apr 2010 JP
2010094802 Apr 2010 JP
2010526594 Aug 2010 JP
2010534825 Nov 2010 JP
2011045694 Mar 2011 JP
2011133405 Jul 2011 JP
2011253361 Dec 2011 JP
2012216051 Nov 2012 JP
2013041506 Feb 2013 JP
2013059625 Apr 2013 JP
201389256 May 2013 JP
2013089256 May 2013 JP
2013247986 Dec 2013 JP
2014023930 Feb 2014 JP
2014048842 Mar 2014 JP
2014193383 Oct 2014 JP
2015521760 Jul 2015 JP
2015534048 Nov 2015 JP
950002044 Mar 1995 KR
20040096253 Nov 2004 KR
20050003112 Jan 2005 KR
100544479 Jan 2006 KR
20070070658 Jul 2007 KR
20090028359 Mar 2009 KR
20090076738 Jul 2009 KR
20120047137 May 2012 KR
20130002218 Jan 2013 KR
101231932 Mar 2013 KR
101338143 Dec 2013 KR
20150124011 Nov 2015 KR
101613467 Apr 2016 KR
101650128 Aug 2016 KR
7408667 Jan 1975 NL
8804081 Jun 1988 WO
9303399 Feb 1993 WO
9638770 Dec 1996 WO
0036961 Jun 2000 WO
0036970 Jun 2000 WO
0038025 Jun 2000 WO
0182766 Nov 2001 WO
03022120 Mar 2003 WO
03024292 Mar 2003 WO
2003026474 Apr 2003 WO
2004006034 Jan 2004 WO
2004082899 Sep 2004 WO
2007008148 Jan 2007 WO
2007028049 Mar 2007 WO
2007051972 May 2007 WO
2007065034 Jun 2007 WO
2008048260 Apr 2008 WO
2009132317 Oct 2009 WO
2011003667 Jan 2011 WO
2012008702 Jan 2012 WO
2013105431 Jul 2013 WO
2013157324 Oct 2013 WO
2013162094 Oct 2013 WO
2014033055 Mar 2014 WO
2014151501 Sep 2014 WO
2015016580 Feb 2015 WO
2015090402 Jun 2015 WO
2016005011 Jan 2016 WO
2016130188 Aug 2016 WO
Non-Patent Literature Citations (200)
Notice of Reasons for Refusal for Japanese Application No. 2018-537617, dated Feb. 28, 2020, 3 pages.
Korean Office Action issued in Korean Application No. 10-2019-7034477, dated Aug. 23, 2021, 6 pages.
USPTO Notice of Allowance issued in U.S. Appl. No. 15/534,327, dated Aug. 13, 2020, 5 pages.
Chinese Office Action issued in Chinese Patent Application No. 201780090566X, dated Feb. 1, 2021, 11 pages.
Chinese Office Action issued in Chinese Application No. 2014800837122, dated Jul. 17, 2020 with translation, 18 pages.
Japanese Notice of Reasons for Refusal issued in Japanese Application No. 2019-542478, dated Mar. 16, 2021 with translation, 5 pages.
Japanese Notice of Reasons for Refusal issued in Japanese Application No. 2019-557633, dated Mar. 18, 2021 with translation, 9 pages.
USPTO Final Office Action issued in U.S. Appl. No. 16/083,161, dated Mar. 31, 2021, 13 pages.
IEEE International Conference on Robotics and Automation Searches for robot vacuum, 2015, 3 pages.
Chinese Office Action issued in Chinese Application No. 2016800855242, dated Jul. 3, 2020, 15 pages.
Chinese Office Action issued in Chinese Application No. 2016800855242, dated May 25, 2021 with translation, 23 pages.
“SM51 Series Opposed Mode Sensors, DC sensors with metal housings: SM51EB/RB, SM51EB6/RB6”, Banner Engineering Corporation, pp. 1-24.
Andersson, et al., “ISR: An Intelligent Service Robot”, Centre for Autonomous Systems, Royal Institute of Technology, S-100 44 Stockholm, Sweden, pp. 1-24.
Berlin, et al. “Development of a Multipurpose Mobile Robot for Concrete Surface Processing”, A Status Report, Feb. 1992, Sweden, pp. 1-10.
Borenstein, et al. “Real-Time Obstacle Avoidance for Fast Mobile Robots”, IEEE, Jan. 6, 1996, pp. 1-18.
Braunstingl, et al., “Fuzzy Logic Wall Following of a Mobile Robot Based on the Concept of General Perception”, ICAR '95, 7th International Conference on Advanced Robotics, Sant Feliu De Guixols, Spain pp. 367-376., Sep. 1995, pp. 1-9.
Caselli, et al. “Mobile Robot Navigation in Enclosed Large-Scale Space”, Italy and U.S.A., pp. 1-5.
Cassens, et al. “Finishing and Maintaining Wood Floors”, Wood Finishing, North Central Regional Extension Publication #136, pp. 1-8.
Chinese Office Action for Chinese Application No. 2014800837122, dated May 5, 2019 with translation, 20 pages.
Chinese Office Action for Application No. 201380081331.6, dated Mar. 26, 2018 with translation, 27 pages.
Chinese Office Action for Chinese Application No. 201380081537.9, dated Jun. 4, 2018 with translation, 15 pages.
Chinese Office Action for Chinese Application No. 201380075510.9, dated Feb. 6, 2017 with translation, 14 pages.
Chinese Office Action for Chinese Application No. 201380075503.9, dated Feb. 13, 2017 with translation, 18 pages.
Chinese Office Action for Chinese Application No. 201380075503.9, dated Nov. 8, 2017 with translation, 16 pages.
Chinese Office Action for Chinese Application No. 201380075510.9, dated Oct. 27, 2017 with translation, 13 pages.
Chinese Office Action for Chinese Application No. 201380081103.9, dated Feb. 27, 2018 with translation, 19 pages.
Chinese Office Action for Chinese Application No. 201380081103.9, dated Jun. 6, 2019 with translation, 10 pages.
Chinese Office Action for Chinese Application No. 201380081535.X, dated Jun. 12, 2019, 25 pages.
Chinese Office Action for Chinese Application No. 201380081535.X, dated Mar. 26, 2018, 18 pages.
Chinese Office Action for Chinese Application No. 201380081537.9, dated Jan. 30, 2019 with translation, 12 pages.
Chinese Office Action for Chinese Application No. 201480079515.3, dated Jun. 5, 2019, 9 pages.
Chinese Office Action for Chinese Application No. 201480083392.0, dated Jan. 3, 2020, 8 pages.
Chinese Office Action for Chinese Application No. 2014800837122, dated Jan. 7, 2020, 10 pages.
Chinese Office Action for Chinese Application No. 201480084065.7, dated Sep. 16, 2019 with translation, 16 pages.
Chinese Office Action for Chinese Application No. 2015800781846, dated Sep. 18, 2019 with translation, 15 pages.
Chung et al., “Path Planning For A Mobile Robot With Grid Type World Model”, Proceedings of the 1992 IEEE/RSJ International Conference on Intelligent Robots and Systems, Jul. 7-10, 1992, pp. 439-444.
Collins, et al. “Cerebellar Control of a Line Following Robot”, Computer Science and Electrical Engineering Department, University of Queensland, St. Lucia, Queensland, 4072 A, pp. 1-6.
Decision for Refusal for Japanese Application No. 2016-526875, dated May 15, 2018 with translation, 6 pages.
Decision of Refusal for Japanese Application No. 2016-526945, dated May 7, 2017 with translation, 5 pages.
Doty, et al. “Sweep Strategies for a Sensory-Driven, Behavior-Based Vacuum Cleaning Agent”, 1993, Machine Intelligence Laboratory-Gainesville Florida, AAAI 1993 Fall Symposium Series—Research Triangle Park—Raleigh, NC, Oct. 22-24, 1993, pp. 1-6.
European Communication Pursuant to Article 94(3) for European Application No. 16176479.0, dated Nov. 27, 2017, 6 pages.
European Communication Pursuant to Article 94(3) for EP Application No. 13817911.4, dated Apr. 2, 2019, 8 pages.
European Communication Pursuant to Article 94(3) for European Application No. 15759442.5, dated Apr. 17, 2019, 6 pages.
Everett, Sensors For Mobile Robots Theory and Application, A.K. Peters, 1995, Chapters 1 and 3, 70 pages.
Everett, Sensors For Mobile Robots Theory and Application, A.K. Peters, Ltd., 1995, Chapters 15 and 16, 59 pages.
Everett, Sensors for Mobile Robots Theory and Application, A.K. Peters, Ltd., 1995, Chapters 6, 7 and 10, 79 pages.
Everett, Sensors For Mobile Robots Theory and Application, A.K. Peters, Ltd., 1995, Chapters 4 and 5, 68 pages.
Everett, et al. “Survey of Collision Avoidance and Ranging Sensors for Mobile Robots”, Revision 1, Technical Report 1194, Dec. 1992, pp. 1-154.
Extended European Search Report for European Application No. 16176479.0, dated Nov. 11, 2016, 9 pages.
Extended European Search Report for European Application No. 18157403.9, dated Nov. 14, 2018, 12 pages.
Final Office Action for U.S. Appl. No. 14/409,291, dated Jun. 6, 2017, 21 pages.
Final Office Action for U.S. Appl. No. 14/784,106, dated Mar. 28, 2018, 8 pages.
Final Office Action for U.S. Appl. No. 15/100,667, dated Apr. 21, 2017, 26 pages.
Final Office Action for U.S. Appl. No. 15/100,667, dated Mar. 27, 2018, 23 pages.
Final Office Action for U.S. Appl. No. 15/101,212, dated Oct. 11, 2017, 7 pages.
Final Office Action for U.S. Appl. No. 15/101,510, dated Feb. 8, 2019, 16 pages.
Final Office Action for U.S. Appl. No. 15/102,017, dated Jun. 14, 2018, 12 pages.
Final Office Action for U.S. Appl. No. 15/321,333, dated Apr. 18, 2019, 14 pages.
Final Office Action for U.S. Appl. No. 15/504,066, dated Mar. 21, 2019, 18 pages.
Final Office Action for U.S. Appl. No. 15/504,071, dated Dec. 4, 2019, 20 pages.
Final Office Action for U.S. Appl. No. 15/504,071, dated Mar. 5, 2019, 20 pages.
Final Office Action for U.S. Appl. No. 15/534,327, dated Jul. 26, 2019, 16 pages.
Final Office Action for U.S. Appl. No. 15/534,591, dated Dec. 2, 2019, 12 pages.
Final Office Action for U.S. Appl. No. 15/535,506, dated Sep. 17, 2019, 11 pages.
Final Office Action for U.S. Appl. No. 15/101,235, dated Jan. 11, 2018, 12 pages.
Gavrilut, et al., “Wall-Following Method for an Autonomous Mobile Robot using Two IR Sensors”, 12th WSEAS International Conference on Systems, Heraklion, Greece, Jul. 22-24, 2008, ISBN: 978-960-6766-83-1, ISSN: 1790-2769, pp. 205-209.
Gutmann et al., AMOS: Comparison of Scan Matching Approaches for Self-Localization in Indoor Environments, 1996, IEEE, pp. 61-67.
Herbst, et al., “Micromouse Design Specifications”, Jun. 2, 1998, pp. 1-22.
International Preliminary Report on Patentability for International Application No. PCT/EP2013/077377, dated Jun. 21, 2016, 12 pages.
International Preliminary Report on Patentability for International Application No. PCT/EP2013/077378, dated Jun. 21, 2016, 7 pages.
International Preliminary Report on Patentability for International Application No. PCT/EP2013/077384, dated Jun. 21, 2016, 6 pages.
International Preliminary Report on Patentability for International Application No. PCT/EP2013/077385, dated Jun. 21, 2016, 7 pages.
International Preliminary Report on Patentability for International Application No. PCT/EP2013/077386, dated Jun. 21, 2016, 6 pages.
International Preliminary Report on Patentability for International Application No. PCT/EP2013/077387, dated Jun. 21, 2016, 9 pages.
International Preliminary Report on Patentability for International Application No. PCT/EP2013/077657, dated Jun. 21, 2016, 7 pages.
International Preliminary Report on Patentability for International Application No. PCT/EP2013/077661, dated Jun. 21, 2016, 11 pages.
International Preliminary Report on Patentability for International Application No. PCT/EP2017/056100, dated Sep. 17, 2019, 6 pages.
International Preliminary Report on Patentability for International Application No. PCT/EP2013/077380, dated Jun. 21, 2016, 6 pages.
International Search Report and Written Opinion of the International Searching Authority for International Application No. PCT/EP2016/055547, dated Jan. 2, 2017, 10 pages.
International Search Report and Written Opinion for International Application No. PCT/EP2017/056100, dated Dec. 18, 2017, 9 pages.
International Search Report and Written Opinion for International Application No. PCT/EP2017/063468, dated Mar. 1, 2018, 10 pages.
International Search Report and Written Opinion for International Application No. PCT/EP2017/072267, dated Jun. 6, 2018, 9 pages.
International Search Report and Written Opinion for International Application No. PCT/EP2017/074406, dated May 2, 2018, 9 pages.
International Search Report and Written Opinion of the International Searching Authority for International Application No. PCT/EP2015/040140, dated May 27, 2016, 11 pages.
International Search Report and Written Opinion of the International Searching Authority for International Application No. PCT/EP2015/058377, dated Aug. 10, 2016, 15 pages.
International Search Report and Written Opinion of the International Searching Authority for International Application No. PCT/EP2014/069073, dated May 12, 2015, 10 pages.
International Search Report and Written Opinion of the International Searching Authority for International Application No. PCT/EP2013/077377, dated Nov. 6, 2014, 18 pages.
International Search Report and Written Opinion of the International Searching Authority for International Application No. PCT/EP2013/077378, dated Apr. 9, 2014, 9 pages.
International Search Report and Written Opinion of the International Searching Authority for International Application No. PCT/EP2013/077380, dated Jul. 28, 2014, 8 pages.
International Search Report and Written Opinion of the International Searching Authority for International Application No. PCT/EP2013/077384, dated Aug. 14, 2016, 9 pages.
International Search Report and Written Opinion of the International Searching Authority for International Application No. PCT/EP2013/077385, dated May 27, 2015, 9 pages.
International Search Report and Written Opinion of the International Searching Authority for International Application No. PCT/EP2013/077386, dated Sep. 17, 2014, 9 pages.
International Search Report and Written Opinion of the International Searching Authority for International Application No. PCT/EP2013/077387, dated Sep. 30, 2014, 12 pages.
International Search Report and Written Opinion of the International Searching Authority for International Application No. PCT/EP2013/077661, dated Jun. 10, 2014, 15 pages.
International Search Report and Written Opinion of the International Searching Authority for International Application No. PCT/EP2014/069074, dated May 11, 2015, 9 pages.
International Search Report and Written Opinion of the International Searching Authority for International Application No. PCT/EP2014/077549, dated Jul. 27, 2015, 9 pages.
International Search Report and Written Opinion of the International Searching Authority for International Application No. PCT/EP2014/077947, dated Jul. 11, 2016, 14 pages.
International Search Report and Written Opinion of the International Searching Authority for International Application No. PCT/EP2014/077954, dated Oct. 12, 2015, 19 pages.
International Search Report and Written Opinion of the International Searching Authority for International Application No. PCT/EP2014/078144, dated Apr. 15, 2015, 7 pages.
International Search Report and Written Opinion of the International Searching Authority for International Application No. PCT/EP2016/060565, dated Feb. 15, 2017, 12 pages.
International Search Report and Written Opinion of the International Searching Authority for International Application No. PCT/EP2016/060571, dated Feb. 7, 2017, 8 pages.
International Search Report and Written Opinion of the International Searching Authority for International Application No. PCT/EP2016/072291, dated Jun. 6, 2017, 11 pages.
International Search Report and Written Opinion of the International Searching Authority for International Application No. PCT/EP2013/077657, dated Aug. 18, 2014, 10 pages.
International Search Report and Written Opinion of the International Searching Authority for International Application No. PCT/EP2014/077142, dated Sep. 11, 2015, 8 pages.
International Search Report for International Application No. PCT/EP2013/057814 dated Dec. 20, 2013, 5 pages.
International Search Report for International Application No. PCT/EP2013/057815 dated Apr. 12, 2014, 4 pages.
International Search Report for International Application No. PCT/EP2013/067500 dated Dec. 10, 2013, 4 pages.
Japanese Notice of Reasons for Refusal for Japanese Application No. 2016-526947, dated Nov. 28, 2019 with translation, 4 pages.
Japanese Notice of Reasons for Refusal for Japanese Application No. 2017-522557, dated Jun. 18, 2019 with translation, 6 pages.
Japanese Office Action for Japanese Application No. 2015-528969, dated Apr. 7, 2017 with translation, 4 pages.
Japanese Office Action for Japanese Application No. 2016-506794, dated Feb. 7, 2017 with translation, 10 pages.
Japanese Office Action for Japanese Application No. 2016-506795, dated Feb. 7, 2017 with translation, 6 pages.
Japanese Report of Reconsideration by Examiner Before Appeal for Japanese Application No. 2016-526947, dated Apr. 10, 2019 with translation, 3 pages.
Jenkins, “Practical Requirements for a Domestic Vacuum-Cleaning Robot”, From: AAAI Technical Report FS-93-03., JRL Consulting, Menlo Park, California, pp. 85-90.
Jones et al., Mobile Robots Inspiration to Implementation, Second Edition, A.K. Peters, Ltd., 1999, Chapters 1 and 5, 72 pages.
Jones et al., Mobile Robots Inspiration to Implementation, Second Edition, A.K. Peters, Ltd., 1999, Chapters 6 and 9, 56 pages.
Jones et al., Mobile Robots Inspiration to Implementation, Second Edition, A.K. Peters, Ltd., 1999, Chapters 10 and 11, 45 pages.
Jung, et al. “Whisker Based Mobile Robot Navigation”, Wollongong, NSW 2500, Australia, pp. 1-8.
Korean Office Action for Korean Application No. 10-2015-7030949, dated Mar. 29, 2019, 5 pages.
Korean Office Action for Korean Application No. 10-2016-7015470, dated Sep. 30, 2019 with translation, 8 pages.
Korean Office Action for Korean Application No. 10-2016-7016792, dated Aug. 21, 2019, with translation, 10 pages.
Krishna, et al., “Solving the Local Minima Problem for a Mobile Robot by Classification of Spatio-Temporal Sensory Sequences”, Journal of Robotic Systems 17 (10), 2000, pp. 549-564.
Kube, “A Minimal Infrared Obstacle Detection Scheme”, Department of Computing Science, University of Alberta, Edmonton, Alberta, Canada, The Robotics Practitioner, 2(2): 15-20, 1996, Oct. 23, 1998, pp. 1-8.
Larson, “RoboKent—a case study in man-machine interfaces” Industrial Robot, vol. 25 No. 2, 1998, pp. 95-100.
LeBouthillier, “W. Grey Walter and his Turtle Robots”, The Robot Builder, vol. Eleven No. Five, May 1999, RSSC POB 26044, Santa Ana,CA, pp. 1-8.
Maaref, et al., “Sensor-based navigation of a mobile robot in an indoor environment”, Robotics and Autonomous Systems, 2002, Elsevier, 18 pages.
Michael Carsten Bosse, “Atlas: A Framework for Large Scale Automated Mapping and Localization”, Massachusetts Institute of Technology, Feb. 2004, Part 2, 67 pages.
Michael Carsten Bosse, “Atlas: A Framework for Large Scale Automated Mapping and Localization”, Massachusetts Institute of Technology, Feb. 2004, Part 1, 140 pages.
Non Final Office Action for U.S. Appl. No. 14/409,291, dated Dec. 28, 2016, 61 pages.
Non Final Office Action for U.S. Appl. No. 15/101,235, dated Nov. 1, 2017, 11 pages.
Non Final Office Action for U.S. Appl. No. 15/535,506, dated May 1, 2019, 16 pages.
Non Final Office Action for U.S. Appl. No. 14/784,106, dated Oct. 19, 2017, 11 pages.
Non Final Office Action for U.S. Appl. No. 14/784,110, dated Aug. 16, 2018, 13 pages.
Non Final Office Action for U.S. Appl. No. 15/100,667, dated Nov. 29, 2017, 22 pages.
Non Final Office Action for U.S. Appl. No. 15/100,667, dated Sep. 12, 2016, 24 pages.
Non Final Office Action for U.S. Appl. No. 15/101,212, dated May 17, 2017, 8 pages.
Non Final Office Action for U.S. Appl. No. 15/101,235 dated Apr. 21, 2017, 10 pages.
Non Final Office Action for U.S. Appl. No. 15/101,235, dated Jun. 14, 2018, 11 pages.
Non Final Office Action for U.S. Appl. No. 15/101,235, dated Sep. 6, 2019, 10 pages.
Non Final Office Action for U.S. Appl. No. 15/101,257, dated Feb. 10, 2017, 10 pages.
Non Final Office Action for U.S. Appl. No. 15/101,510, dated Jul. 27, 2018, 17 pages.
Non Final Office Action for U.S. Appl. No. 15/101,515, dated Apr. 18, 2018, 14 pages.
Non Final Office Action for U.S. Appl. No. 15/102,015, dated Aug. 17, 2017, 13 pages.
Non Final Office Action for U.S. Appl. No. 15/102,017, dated Feb. 16, 2018, 12 pages.
Non Final Office Action for U.S. Appl. No. 15/102,017, dated Jan. 22, 2019, 15 pages.
Non Final Office Action for U.S. Appl. No. 15/321,333, dated Oct. 24, 2018, 10 pages.
Non Final Office Action for U.S. Appl. No. 15/504,066, dated Nov. 5, 2018, 18 pages.
Non Final Office Action for U.S. Appl. No. 15/504,071, dated Aug. 8, 2019, 23 pages.
Non Final Office Action for U.S. Appl. No. 15/504,071, dated Nov. 2, 2018, 17 pages.
Non Final Office Action for U.S. Appl. No. 15/534,327, dated Mar. 7, 2019, 11 pages.
Non Final Office Action for U.S. Appl. No. 15/534,591, dated Aug. 9, 2019, 11 pages.
Non Final Office Action for U.S. Appl. No. 15/535,244, dated May 17, 2019, 6 pages.
Non Final Office Action for U.S. Appl. No. 15/565,467, dated Jan. 29, 2020, 12 pages.
Notice of Allowance for U.S. Appl. No. 15/100,667, dated Aug. 6, 2018, 22 pages.
Notice of Allowance for U.S. Appl. No. 14/409,291, dated Jun. 16, 2016, 13 pages.
Notice of Allowance for U.S. Appl. No. 14/409,291, dated Sep. 18, 2017, 8 pages.
Notice of Allowance for U.S. Appl. No. 14/784,106, dated Oct. 11, 2018, 7 pages.
Notice of Allowance for U.S. Appl. No. 15/101,212, dated Apr. 11, 2018, 9 pages.
Notice of Allowance for U.S. Appl. No. 15/101,235, dated Dec. 12, 2019, 8 pages.
Notice of Allowance for U.S. Appl. No. 15/101,257, dated Jul. 6, 2017, 9 pages.
Notice of Allowance for U.S. Appl. No. 15/101,510, dated May 30, 2019, 12 pages.
Notice of Allowance for U.S. Appl. No. 15/101,515, dated Aug. 28, 2018, 11 pages.
Notice of Allowance for U.S. Appl. No. 15/102,015, dated Dec. 11, 2017, 8 pages.
Notice of Allowance for U.S. Appl. No. 15/102,295, dated Sep. 24, 2018, 9 pages.
Notice of Allowance for U.S. Appl. No. 15/504,066, dated Aug. 9, 2019.
Notice of Allowance for U.S. Appl. No. 15/535,244, dated Sep. 10, 2019, 5 pages.
Notice of Reasons for Rejection for Japanese Application No. 2016-526756, dated Aug. 10, 2017, with translation, 6 pages.
Notice of Reasons for Rejection for Japanese Application No. 2016-526759, dated Aug. 24, 2017 with translation, 9 pages.
Notice of Reasons for Rejection for Japanese Application No. 2016-526765, dated Aug. 25, 2017 with translation, 7 pages.
Notice of Reasons for Rejection of Japanese Application No. 2016-526764, dated Aug. 25, 2017 with translation, 6 pages.
Notification of Reasons for Refusal for Japanese Application No. 2016-526875, dated Oct. 31, 2017 with translation, 10 pages.
Notification of Reasons for Refusal for Japanese Application No. 2016-526765, dated May 15, 2018 with translation, 6 pages.
Notification of Reasons for Refusal for Japanese Application No. 2016-526945, dated Oct. 31, 2017 with translation, 8 pages.
Notification of Reasons for Refusal for Japanese Application No. 2016-568949, dated Oct. 9, 2018 with translation, 6 pages.
Notification of Reasons for Refusal for Japanese Application No. 2017-501374, dated Mar. 6, 2016 with translation, 8 pages.
Notification of Reasons for Refusal for Japanese Application No. 2017-544589, dated Apr. 2, 2019 with translation, 6 pages.
Notification of Reasons for Rejection for Japanese Application No. 2016-526947, dated Sep. 21, 2017 with translation, 8 pages.
Oren, Reply to Office Action dated Jun. 23, 2014, U.S. Appl. No. 13/757,985, pp. 1-10.
Pack, et al., “Constructing a Wall-Follower Robot for a Senior Design Project”, 1996 ASEE Annual Conference Proceedings, Session 1532, pp. 1-7.
Position_Definition of Position by Merriam-Webster.pdf (Position | Definition of Position by Merriam-Webster, Oct. 16, 2016, Merriam-Webster, https://www.merriam-webster.com/dictinary/position, pp. 1-15).
Report of Reconsideration for Japanese Application No. 2016-011556, dated Oct. 24, 2018, 2 pages.
Saffiotti, “Fuzzy logic in Autonomous Robot Navigation”, a case study, Nov. 1995, Revised: Aug. 1997, IRIDIA, Universite Libre de Bruxelles, Belgium, Technical Report TR/IRIDIA/95-25, Cover page + pp. 1-14.
Written Opinion for International Application No. PCT/EP2013/067500 dated Dec. 10, 2013, 7 pages.
Yamamoto, “SOZZY: A Hormone-Driven Autonomous Vacuum Cleaner”, From: AAAI Technical Report FS-93-03, Matsushita Research Institute, Tokyo, and MIT Artificial Intelligence Laboratory, Massachusetts, pp. 116-124 + Figure 9 and Figure 11.
Yoshida et al., “Online Motion Planning Using Path Deformation and Replanning”, 28th Annual Robot Society, with partial translation, 2011, vol. 29, No. 8, Chapter 3, pp. 716-725.
Chinese Office Action issued in Chinese Patent Application No. 2017800875876, dated Jan. 6, 2021, 12 pages.
USPTO Non Final Office Action issued in U.S. Appl. No. 16/491,355, dated Apr. 16, 2021, 14 pages.
Korean Office Action issued in Korean Patent Application No. 10-2019-7025893, dated Apr. 28, 2021, 10 pages.
USPTO Notice of Allowance issued in U.S. Appl. No. 15/565,467, dated Sep. 22, 2020, 6 pages.
Korean Office Action for Korean Application No. 10-2019-7034477, dated Aug. 23, 2021 with translation, 14 pages.
USPTO Non Final Office Action issued in U.S. Appl. No. 16/083,161, dated Oct. 27, 2020, 13 pages.
Bagnell et al., “Learning for Autonomous Navigation”, IEEE Robotics & Automation Magazine, 2010, abstract only, 1 page.
Ellis et al., “Autonomous Navigation and Sign Detector Learning”, IEEE Workshop on Robot Vision (WORV), 2013, abstract only, 1 page.
Pontil et al., “Support Vector Machines for 3D Object Recognition”, IEEE Transactions on Pattern Analysis and Machine Intelligence, 1998, vol. 20, No. 6, abstract only, 1 page.
Zuo et al., “A Reinforcement Learning Based Robotic Navigation System”, 2014 IEEE International Conference on Systems, Man, and Cybernetics, 2014, abstract only, 1 page.
USPTO Notice of Allowance issued in U.S. Appl. No. 16/083,161, dated Jul. 9, 2021, 9 pages.
USPTO Final Office Action issued in U.S. Appl. No. 15/565,467, dated Jun. 2, 2020, 8 pages.
Chinese Office Action for Chinese Application No. 2015800781846, dated Mar. 24, 2020, 9 pages.
Chinese Office Action for Chinese Application No. 201680085524.2, dated Nov. 29, 2021 with translation, 14 pages.
USPTO Non Final Office Action for U.S. Appl. No. 16/099,780, dated Jun. 27, 2022, 19 pages.
Related Publications (1)
Number Date Country
20200081451 A1 Mar 2020 US