The present teachings relate to robotic systems and, more specifically, to navigation systems for autonomous vehicles.
Autonomous vehicles, including robotic devices, are becoming more prevalent today and are used to perform tasks traditionally considered mundane, time-consuming, or dangerous. As programming technology advances, so does the demand for robotic devices that can navigate around a complex environment or working space with little or no assistance from a human operator.
Autonomous vehicles and associated controls, navigation systems, and other related systems are being developed. For example, U.S. Pat. No. 6,594,844 discloses a Robot Obstacle Detection System, the disclosure of which is hereby incorporated by reference in its entirety. Additional robot control and navigation systems, and other related systems, are disclosed in PCT Published Patent Application No. WO 2004/025947 and in U.S. Pat. Nos. 6,809,490, 6,690,134, 6,781,338, 7,024,278, 6,883,201, and 7,332,890, the disclosures of which are hereby incorporated by reference in their entireties.
Many autonomous vehicles navigate a working space by moving randomly until an obstacle is encountered. Generally, these types of vehicles have on-board obstacle detectors, such as bump sensors or similar devices, which register contact with an obstacle. Once contact is made, command routines can direct the autonomous vehicle to move in a direction away from the obstacle. These types of systems, which are useful for obstacle avoidance, are limited in their ability to allow an autonomous vehicle to track its location within a room or other working environment. Other systems, often used in conjunction with bump sensors as described above, use an infrared or other detector to sense the presence of nearby walls, obstacles, or other objects, and either follow the obstacle or direct the vehicle away from it. These systems, however, are also limited in their ability to allow an autonomous vehicle to navigate effectively in a complex environment, as they only allow the vehicle to recognize when objects are in its immediate vicinity.
In more advanced navigation systems, an autonomous vehicle comprises an infrared or other type of transmitter, which directs a series of infrared patterns in horizontal directions around the autonomous vehicle. These patterns can be detected by a stationary receiver placed at or near a boundary of the working space, for example on a wall. A microprocessor can use the information from signals generated by the receiver to calculate where in the working space the autonomous vehicle is located at all times. Using such systems, the vehicle can navigate around an entire area. These systems, however, are best employed in working spaces where few objects are present that may interfere with the dispersed patterns of infrared signals.
Limitations of the above types of navigation systems are, at present, a hurdle to creating a highly independent autonomous vehicle that can navigate in a complex environment.
The present teachings provide a navigation control system for an autonomous vehicle. The system comprises a transmitter and an autonomous vehicle. The transmitter comprises an emitter for emitting at least one signal, a power source for powering the emitter, a device for capturing wireless energy to charge the power source, and a printed circuit board for converting the captured wireless energy to a form for charging the power source. The autonomous vehicle operates within a working area and comprises a receiver for detecting the at least one signal emitted by the emitter, and a processor for determining a relative location of the autonomous vehicle within the working area based on the signal emitted by the emitter.
The present teachings also provide a transmitter for use in a navigation control system for an autonomous vehicle. The transmitter comprises an emitter for emitting at least one signal, a power source for powering the emitter, a device for capturing wireless energy to charge the power source, and a printed circuit board for converting the captured wireless energy to a form for charging the power source.
The present teachings further provide a method for controlling navigation of an autonomous vehicle within one or more work areas. The method comprises emitting one or more signals from a transmitter, receiving the one or more signals on the autonomous vehicle, powering the transmitter with a power source, charging the power source wirelessly, localizing the autonomous vehicle with respect to the transmitter, and navigating the autonomous vehicle within the one or more work areas.
Additional objects and advantages of the present teachings will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the teachings. The objects and advantages of the present teachings will be realized and attained by means of the elements and combinations particularly pointed out in the appended claims.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the present teachings, as claimed.
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate several embodiments of the present teachings and together with the description, serve to explain the principles of the present teachings.
Reference will now be made in detail to embodiments of the present teachings, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts.
In accordance with an exemplary implementation of the present teachings,
For simplicity, this disclosure will describe vacuuming as a demonstrative task of the depicted robotic cleaning device 12. It will be apparent, though, that the navigation system disclosed herein has wide applications across a variety of autonomous systems. For example, an autonomous vehicle may be used for floor waxing and polishing, floor scrubbing, ice resurfacing, sweeping and vacuuming, unfinished floor sanding, stain/paint application, ice melting and snow removal, grass cutting, etc. Any number of task-specific components may be required for such duties, and may each be incorporated into the autonomous vehicle, as necessary.
The transmitter 20 directs at least two infrared signals 22a, 24a from emitters 22 and 24 to a surface remote from the working area 14 upon which the autonomous vehicle 12 operates. The depicted embodiment directs the infrared signals 22a, 24a to the ceiling 18, but it may also direct the signals 22a, 24a to a portion of a wall 16 or to both the walls 16 and ceiling 18. The signals 22a, 24a can be directed to a variety of points on the remote surface, but directing the signals as high as possible above the working area 14 can allow the signals 22a, 24a to be more easily detected by the autonomous vehicle 12, because the field of view of the autonomous vehicle's receiver 28 is less likely to be blocked by an obstacle (such as, for example, a high-backed chair or tall plant). In this disclosure, the regions of contact 22b, 24b of the signals 22a, 24a on the remote surface will be referred to as “points,” regardless of the size of the intersection. For example, by using a collimator in conjunction with the emitters (described below), each point of intersection 22b, 24b of the signals 22a, 24a can be a finite area within which the signal is strongest at approximately the central point.
In certain embodiments of the transmitter 20, the signals 22a, 24a are directed toward a ceiling 18, at two points 22c, 24c, forming a line proximate and parallel to the wall 16 upon which the transmitter 20 is located. Alternatively, and as depicted in
As the autonomous vehicle 12 moves within a working area 14, it detects the signals 22a, 24a emitted by the transmitter 20 as energy bouncing or reflecting off of the diffuse ceiling surface 18. In an alternative embodiment, visible points can be used in place of infrared points. A camera onboard the autonomous vehicle can replace the infrared receiver in detecting either infrared or visible points. The autonomous vehicle's microprocessor can convert the signals 22a, 24a sensed by the receiver 28 into bearings from the robot 12 to the signals 22a, 24a. The microprocessor can then calculate representative elevation angles ε1, ε2 and azimuths α1, α2 of the signals to determine the location of the autonomous vehicle 12 within the working area 14. In this embodiment, the azimuths α1, α2 are measured using a “forward” direction of movement M of the autonomous vehicle 12 as a datum, but any suitable datum can be used. By calculating the elevation angle and azimuth from the autonomous vehicle 12 to the two signals 22a, 24a, the autonomous vehicle 12 can locate itself within a working area with improved accuracy.
The transmitter 120 emits two signals 122a, 124a (depicted graphically by a plurality of arrows) into the two rooms 136, 138, respectively. The signals 122a, 124a can be configured to not overlap each other, thus providing a distinct signal on each side of the door centerline 130. In other embodiments, an overlap of the signals 122a, 124a can be desirable. The autonomous vehicle 112 includes a receiver 128 having a field of vision 134. The emitted signals 122a, 124a can be detected by the receiver 128 when the autonomous vehicle's field of vision 134 intersects the signals 122a, 124a. Similar to the embodiment of
In accordance with various embodiments of the present teachings, the transmitter can include a visible signal option (not shown), aligned with the emitted signals, allowing a user to direct the signals to particular locations. In accordance with the present teachings, more than one transmitter may be used. Such a system could include communication capability between the various transmitters, for example to ensure that only one signal or a subset of signals is emitted at any given time.
A battery-powered transmitter located above a window or door frame can not only permit the autonomous vehicle to localize within a map, coordinate system, or cell grid relative to the transmitter, but can also localize the transmitter within the same map, coordinate system, or cell grid, thereby localizing the window or door frame. Localization of an autonomous vehicle within a working environment is described in detail in U.S. Patent Publication No. 2008/0294288, published Nov. 27, 2008, the entire disclosure of which is incorporated herein by reference. In the case of a door frame, the door is ordinarily the passage by which the autonomous vehicle navigates from room to room. The transmitter illustrated in
The exemplary embodiment of a transmitter 20 illustrated in
In accordance with various embodiments of the present teachings, more than two emitters can be utilized with collimators 22e, 24e, 122e, 124e, to distinguish different areas within a room. Such a configuration allows the autonomous vehicle to sense its relative location within a room and adjust its cleaning behavior accordingly. For example, a signal could mark an area of the room that an autonomous vehicle would likely get stuck in. The signal could allow an autonomous vehicle to recognize the area and accordingly not enter it, even though it might otherwise be able to do so unimpeded. Alternatively, or in addition, different signals could mark areas that require different cleaning behaviors (e.g., due to carpeting or wood floors, high traffic areas, etc.).
Turning back to
In various embodiments of the present teachings, each signal (regardless of the emitter's location or the number of signals) can be modulated at 10 kHz and coded with an 8-bit code serving as a unique signal identifier, preventing the autonomous vehicle from confusing one signal or point with another. Accordingly, more than two uniquely encoded signals can be employed to increase the accuracy of the autonomous vehicle's calculations regarding its location within a working area. As noted above, using only one emitter allows an autonomous vehicle to take a heading based on that signal. Using two or more signals can allow the autonomous vehicle to continue navigating if fewer than all of the signals are detected (either due to a failure of a signal transmission or the autonomous vehicle moving to a location where fewer than all of the signals are visible).
In certain embodiments, the transmitter can pulse the coded signals as follows. After an initial synchronization pulse, a first signal at 10 kHz is emitted for 100 ms. This can provide sufficient time for the autonomous vehicle's receiver and processor to calculate azimuth and elevation angles, as discussed in detail below. So that the autonomous vehicle can determine which signal is being received, the transmitter can pulse a series of five bits, each for 10 ms. The five bits include two start bits, for example a zero and a one, followed by a unique three-bit identifier to identify that particular signal or point. After a 100 ms delay, the transmitter repeats the sequence for the second signal or point. By changing the modulation frequency and/or the identifier, the second signal or point can be uniquely distinguished from the first. Any number of unique signals can be transmitted and identified in this manner. After the series of signals has been transmitted, the transmitter can wait a substantially longer period of time, for example on the order of one to two seconds, before repeating the transmitting sequence, starting again with the first signal. The length of time for each transmission is merely exemplary, and may be varied based on a particular application, device, etc. As stated above, the signals can be modulated at the same or different frequencies.
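The exemplary schedule above can be summarized in a minimal Python sketch, for illustration only. The 100 ms carrier burst, the five 10 ms bits (two start bits plus a three-bit identifier), the 100 ms inter-signal delay, and the one-to-two-second cycle pause are taken from the text; the synchronization-pulse duration and the specific identifier values are assumptions.

```python
# Illustrative sketch of the exemplary transmit schedule described
# above. SYNC_MS and the identifier values are assumptions; the other
# timings follow the text.
SYNC_MS, CARRIER_MS, BIT_MS, GAP_MS, CYCLE_GAP_MS = 10, 100, 10, 100, 1500

def frame(point_id):
    """One signal frame: sync pulse, 100 ms burst at the modulation
    frequency (time for azimuth/elevation measurement), then five
    10 ms bits: start bits 0 and 1 plus a unique 3-bit identifier."""
    bits = [0, 1] + [(point_id >> i) & 1 for i in (2, 1, 0)]
    return ([("sync", SYNC_MS), ("carrier_10kHz", CARRIER_MS)]
            + [(f"bit_{b}", BIT_MS) for b in bits])

def cycle(point_ids=(0b001, 0b010)):
    """One full cycle: each point's frame, a 100 ms delay between
    frames, then a longer pause before the cycle repeats."""
    events = []
    for pid in point_ids:
        events += frame(pid) + [("inter_frame_gap", GAP_MS)]
    events.append(("cycle_pause", CYCLE_GAP_MS))
    return events

for name, ms in cycle():
    print(f"{name:>15}: {ms} ms")
```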
As illustrated in
The construction of the receiver 328 can be similar to that of
In operation, a receiver (e.g., an infrared receiver) can first measure the “noise floor” of the autonomous vehicle's environment, comprising the amount of energy (e.g., infrared energy) present in the autonomous vehicle's environment, which it sets as the threshold value. This value can represent an average of the values for each photodiode. Any subsequent measurement above this threshold value can trigger an event (e.g., a calculation of point azimuth and elevation). The receiver can then measure the modulation frequency again, searching for an expected increase at 10 kHz (i.e., the frequency of the initial synchronization signal transmitted by the transmitter). If a 10 kHz frequency increase is detected, the autonomous vehicle recognizes the increase as an emitted navigation signal. The autonomous vehicle can then measure the amplitude of the reflected point on all five photodiodes to determine an average value. The measured modulation frequency can then be compared to a list of signal frequencies to determine which of the signals has been detected. Alternatively, any detected identity sequence associated with the signal can be compared to a list of transmitter codes or signal IDs stored in a lookup table in the autonomous vehicle's processor memory.
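As a rough illustration of this receive logic, the sketch below assumes five photodiode readings, an externally measured dominant modulation frequency, and an already-decoded identifier; the function names and the identifier table are hypothetical, not part of the specification.

```python
import statistics

MOD_FREQ_HZ = 10_000   # modulation frequency named in the text

# Hypothetical identifier table; real codes would come from the
# transmitter configuration.
KNOWN_IDS = {0b001: "point_A", 0b010: "point_B"}

def noise_floor(ambient_samples):
    """Threshold: average ambient energy across the five photodiodes."""
    return statistics.mean(ambient_samples)

def handle_reading(samples, dominant_freq_hz, decoded_id, threshold):
    """Threshold test, modulation check, then identifier lookup."""
    if max(samples) <= threshold:
        return None                     # only ambient energy present
    if dominant_freq_hz != MOD_FREQ_HZ:
        return None                     # off-band source, e.g., a remote
    avg_amplitude = statistics.mean(samples)  # used later for angles
    label = KNOWN_IDS.get(decoded_id)   # compare with stored signal IDs
    return (label, avg_amplitude) if label else None
```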
The on-board microprocessor can use the amplitude value to determine the azimuth and elevation of the received signals, which it can then use to determine its location within a working area. To determine the azimuth, the microprocessor enters the values of the two strongest readings from the four side photodiodes into an algorithm. The algorithm takes the ratio of these two readings to determine the azimuth angle. For example, if the two strongest readings from two photodiodes are equal, the algorithm recognizes that the point is located at an azimuth angle that is directly between the two photodiodes (i.e., at 45°). In a similar algorithm, the amplitude value measured from the strongest side photodiode and the amplitude value measured from the top-facing photodiode are used to determine the elevation of the signal. These values can be stored in the autonomous vehicle's memory for future reference.
After the receiver has detected at least two points and determined the azimuth and elevation of each point, it can determine its location within the working space. A triangulation algorithm based on the known ceiling height and the azimuth and elevation of the two detected points allows the processor to determine where in the working space the autonomous vehicle is located. Over time, the values of elevation and azimuth between each coded point and specific locations of the autonomous vehicle within the workspace can be stored in the autonomous vehicle's memory, creating a map of the environment in which the autonomous vehicle operates.
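One way such a triangulation could be implemented is sketched below: the elevation of each point fixes the horizontal range to that point's plumb line (range = ceiling height / tan(elevation)), and the two resulting range circles intersect at the vehicle's position, with the measured azimuth difference selecting between the two intersections. The point coordinates, angles, and function names are illustrative assumptions, not values from the specification.

```python
import math

def locate(p1, p2, h, elev1, elev2, az1, az2):
    """Estimate the vehicle's (x, y) from two ceiling points with
    known map coordinates p1, p2, ceiling height h above the
    receiver, and measured elevations/azimuths (radians)."""
    # Elevation fixes the horizontal range to each point's plumb line.
    r1 = h / math.tan(elev1)
    r2 = h / math.tan(elev2)
    # The vehicle lies on both range circles; intersect them.
    (x1, y1), (x2, y2) = p1, p2
    d = math.hypot(x2 - x1, y2 - y1)
    a = (r1 ** 2 - r2 ** 2 + d ** 2) / (2 * d)
    half = math.sqrt(max(r1 ** 2 - a ** 2, 0.0))
    xm, ym = x1 + a * (x2 - x1) / d, y1 + a * (y2 - y1) / d
    cands = [(xm + half * (y2 - y1) / d, ym - half * (x2 - x1) / d),
             (xm - half * (y2 - y1) / d, ym + half * (x2 - x1) / d)]
    # The azimuth difference is heading-independent; use it to pick
    # the mirror-image candidate consistent with the measurements.
    def mismatch(c):
        b1 = math.atan2(y1 - c[1], x1 - c[0])
        b2 = math.atan2(y2 - c[1], x2 - c[0])
        err = (b2 - b1) - (az2 - az1)
        return abs(math.atan2(math.sin(err), math.cos(err)))
    return min(cands, key=mismatch)

# Example with assumed values: two points 2 m apart on a 2.5 m ceiling.
print(locate((0.0, 0.0), (2.0, 0.0), 2.5,
             math.radians(50), math.radians(40),
             math.radians(10), math.radians(35)))
```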
In various embodiments, a navigation system 200 as depicted in
In the embodiment depicted in
Of the four detectors that reside in a single plane, the values of the two strongest signals detected are used to form a ratio to determine the azimuth angle (Step 735). The ratio of second-strongest signal over the strongest signal is either compared to a look-up table or inserted into a mathematical equation to determine an azimuth angle output. Both the look-up table and the equation represent the overlap of the received sensitivity patterns of two orthogonal detectors with known sensor responses. In this embodiment, the photo detector output is modeled as a fourth-order Gaussian response to angle off of “boresight,” a term that generally refers to a vector that is orthogonal to the semiconductor die in the detector package.
To calculate elevation, the strongest signal from the azimuth calculation (i.e., the denominator of the ratio) must first be normalized, as if it were on boresight of the respective detector (Step 740). For example, if the azimuth has been determined to be 10° off of boresight from a given detector, that 10° angle is entered into a look-up table or equation that describes the sensor response of any single photo detector. At zero degrees, the output of this look-up table/equation would be 1.00000. As the angle deviates from zero degrees, the output drops to some fraction of 1.00000 (the normalized value at boresight). For example, if a value of 10° is entered into the equation, the output of this operation can be, for example, 0.99000. The denominator of the azimuth ratio can then be divided by this fractional value in order to scale up, or “normalize,” that value to what it would be if the azimuth were actually zero degrees. This normalized value can then be stored in memory, and elevation can be determined therefrom.
To calculate elevation, the normalized output from the previous step is used to produce a new ratio with the output from the upward-looking (fifth) detector, so that the numerator is the second-strongest of the two values and the denominator is the strongest of the two values (Step 745). This ratio is then entered into the same look-up table or equation from the step above (used to calculate azimuth), thus outputting an elevation angle.
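Under the fourth-order Gaussian response model described above, this ratio-and-normalize pipeline could be sketched as follows. The beam-width constant, the 90° detector spacing, and the numeric search that stands in for the look-up table are assumptions made for illustration.

```python
import math

SIGMA = math.radians(60.0)   # assumed beam-width constant

def response(angle):
    """Fourth-order Gaussian detector response, 1.0 on boresight."""
    return math.exp(-(angle / SIGMA) ** 4)

def angle_from_ratio(ratio, spacing=math.radians(90.0)):
    """Invert the overlap of two detectors 'spacing' apart: find the
    off-boresight angle whose predicted response ratio best matches
    the measured ratio (a stand-in for the look-up table)."""
    best, best_err = 0.0, float("inf")
    for i in range(4501):                   # 0.00 to 45.00 degrees
        a = math.radians(i / 100.0)
        predicted = response(spacing - a) / response(a)
        if abs(predicted - ratio) < best_err:
            best, best_err = a, abs(predicted - ratio)
    return best

def azimuth_elevation(side_readings, top_reading):
    # Step 735: ratio of the two strongest side detectors -> azimuth
    # angle off the strongest detector's boresight.
    order = sorted(range(4), key=lambda i: side_readings[i], reverse=True)
    strongest, second = side_readings[order[0]], side_readings[order[1]]
    az_off = angle_from_ratio(second / strongest)
    # Step 740: normalize the strongest reading to its boresight value.
    normalized = strongest / response(az_off)
    # Step 745: ratio with the upward-looking detector -> elevation.
    lo, hi = sorted((normalized, top_reading))
    return az_off, angle_from_ratio(lo / hi)
```

Consistent with the example given above, equal readings from two adjacent side detectors yield a ratio of 1 and an azimuth of 45°; mapping az_off to a full 0-360° azimuth would additionally use which detector pair produced the strongest readings.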
The benefits of this type of navigation system can be numerous. As the autonomous vehicle moves about a working area, measuring the azimuth and elevation of the various points detected, it can create a map of the area, thus determining its location within a given space. With this information, it can fuse data from all of its on-board sensors and improve cleaning or other task efficiency. One way it can do this is to create a map showing where the high-traffic areas in a house or other building are located (as indicated by readings from the dirt sensor, for example). The autonomous vehicle can then clean the areas it has identified as high-traffic (and therefore often dirty) each time it passes over them, whether directed to or not. The autonomous vehicle may also improve its cleaning function by merging the output from the wheel drop, stasis, bumper, and wall-following sensors to roughly mark areas of entrapment, or where large obstacles exist, so that those areas can potentially be avoided in future runs.
In accordance with various embodiments of the present teachings, another method of improving cleaning efficiency involves selectively programming the autonomous vehicle to clean particular areas, as detailed below. For example, a personal computer or remote control may be used to control the autonomous vehicle. Although the autonomous vehicle can operate without operator intervention, an operator can initially set up the autonomous vehicle, or can direct the autonomous vehicle to operate in particular areas or at particular times. For example, by using more than one transmitter in various rooms on one floor of a house, an operator may be able to direct the autonomous vehicle to clean specific rooms in a particular order and/or at a specific time. The operator could select, in a control program field of a computer program for example, the living room, family room, bathroom, and kitchen areas for cleaning. A remote control for use in accordance with the present teachings is described in more detail with respect to
Once commanded (either immediately or on a predefined schedule), the autonomous vehicle can be signaled to begin its cleaning cycle. The autonomous vehicle undocks from its base/charging station and begins cleaning the closest or first room on the programmed list. It can recognize this room and can differentiate it by the coded group of infrared points (e.g., on a ceiling of the room) or the coded signal emitted in the room. After the first room is cleaned, the autonomous vehicle can, for example, check its level of power and return to its charger for additional charging if needed. In accordance with certain embodiments, in order to return to the charger, the autonomous vehicle can follow the point or points on the ceiling back to the base station. Alternatively, the autonomous vehicle can use a known docking behavior. After charging is complete, the autonomous vehicle can traverse roughly back to the place it left off and resume cleaning. This sequence of events continues until all of the programmed rooms have been cleaned, as sketched below. Alternatively, the selection of particular areas to clean could be made, for example, by remote control or by pressing buttons on a control panel located on the base station. By using a personal computer, however, multiple transmitters could communicate with each other and with the base station via power lines using a known communication technology.
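The sequence of events just described could be organized as a simple control loop like the sketch below, where every method on the vehicle object (undock, goto_room, battery_level, and so on) is a hypothetical placeholder for the behaviors discussed in this disclosure, and the recharge threshold is an assumed value.

```python
LOW_BATTERY = 0.2   # illustrative recharge threshold

def run_schedule(vehicle, programmed_rooms):
    """Clean programmed rooms in order, recharging as needed and
    resuming roughly where the vehicle left off."""
    vehicle.undock()
    for room_id in programmed_rooms:      # rooms keyed by coded signal
        vehicle.goto_room(room_id)
        while not vehicle.room_clean(room_id):
            vehicle.clean_step()
            if vehicle.battery_level() < LOW_BATTERY:
                resume_at = vehicle.position()
                vehicle.return_to_base()  # e.g., follow ceiling points
                vehicle.dock_and_charge()
                vehicle.goto_room(room_id)
                vehicle.goto_position(resume_at)
    vehicle.return_to_base()
```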
An alternative embodiment of the present teachings is depicted in
Alternatively, the autonomous vehicle 612 can emit its own coded pulse to determine if any transmitters are in the area. This coded pulse could “awaken” sleeping or otherwise dormant transmitters, which would then begin their own emission cycle. Alternatively, the pulse can be an audible or visual signal such as a distinct beep, buzz, or visual strobe. Such pulses need not be within the field of view of the transmitter.
The robot 612 will continue to move toward signal 622a until one of several events happens at or near doorway 632a. In a first event, the autonomous vehicle may determine, based on readings from its photodiodes, that it is directly under the transmitter 620a. In a second event, the autonomous vehicle 612 may sense a second signal 624a, which may overlap the first detected signal 622a. This could occur if the configuration of the emitters, collimators, etc., as described in more detail above, provides overlapping signal patterns between signals 622a and 624a. In a third event, the autonomous vehicle 612 can sense a signal from an entirely different transmitter, in this case signal 622b from transmitter 620b. Other events are also contemplated, as suitable for a particular application. The occurrence of an event presents the autonomous vehicle 612 with any number of behavioral, functional, or other options. For example, each coded signal may serve as a unique marker for a different working space. Upon detecting the unique marker associated with a particular working space, the autonomous vehicle may alter its cleaning function. Thus, if room A is carpeted but room B is uncarpeted, the autonomous vehicle can adjust its cleaning as it moves from room A to room B. Upon detecting a second signal (in this case, signal 622b), the autonomous vehicle can, in certain embodiments, completely disregard the first signal 622a received when its return to the base station 622 began. Using the new signal 622b as a heading, it begins moving toward that signal. The autonomous vehicle 612 can, in certain embodiments, check its battery level at each event, storing that value in its microprocessor. Over time, the autonomous vehicle can thereby create a table of battery levels at each event (and battery level change from event to event), and be able to accurately determine precise battery power remaining at each transmitter location.
Once the autonomous vehicle is traversing room B (shown in phantom as 612′), it will eventually determine, based on battery level, time, or other factors, to follow the heading provided by signal 622b, and continue its return to its base station 622. The autonomous vehicle 612 can follow the heading until an event occurs at or near doorway 632b. Again, the event can be detecting a strength of signal 622b, indicating that the autonomous vehicle is directly below the transmitter, detecting an overlap signal from 624b, or detecting a new signal 622c. The autonomous vehicle 612 can again perform any of the behaviors described above: check and store its battery level; change cleaning characteristics; etc.
Once in room C, the autonomous vehicle can begin following the heading provided by signal 622c. At or near the doorway 632c to room D, an event may direct the autonomous vehicle to perform any number of behaviors. Alternatively, the autonomous vehicle can move directly to charging station 622, guided by emitted signal 626 or another signal or program.
During its return to the base station, as the autonomous vehicle 612 moves from room A to room B to room C and so on, it detects and stores information about each coded signal that it detects along its route. By storing this information, the autonomous vehicle can create a map, using the coded signals as guideposts, allowing it to return to its starting location in the future. After charging, the autonomous vehicle can return to the room it was working in prior to returning to its base by comparing the detected signals and their strengths to the stored information.
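A minimal sketch of such a guidepost map is shown below; the record structure, the strength-matching tolerance, and the stored battery level are illustrative assumptions consistent with the event table described above.

```python
from dataclasses import dataclass, field

@dataclass
class Guidepost:
    signal_id: int      # decoded signal identifier
    strength: float     # measured signal strength at the event
    battery_pct: float  # battery level recorded at the event

@dataclass
class RouteMap:
    guideposts: list = field(default_factory=list)

    def record(self, signal_id, strength, battery_pct):
        """Store one event encountered on the way to the base station."""
        self.guideposts.append(Guidepost(signal_id, strength, battery_pct))

    def match(self, signal_id, strength, tol=0.2):
        """Find the stored guidepost most consistent with a new
        detection, so the vehicle can retrace its route after charging."""
        cands = [g for g in self.guideposts
                 if g.signal_id == signal_id
                 and abs(g.strength - strength) <= tol * g.strength]
        return min(cands, key=lambda g: abs(g.strength - strength),
                   default=None)
```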
Lastly,
Accordingly, the navigation system can be operationally robust and adapted to compensate for variances in infrared energy. For example, if the autonomous vehicle is operating in an environment with high base infrared readings (e.g., a space with a large number of fluorescent lighting fixtures or windows that allow entry of sunlight), the autonomous vehicle can distinguish the infrared signals generated by the transmitter from the infrared noise present in the space. Similarly, the receiver can distinguish the transmitter's signals from other off-band signals, such as those from infrared remote controls. In such cases, establishing an initial threshold value of infrared energy and emitting a predefined, known, modulated infrared signal from the transmitter overcomes these environmental disturbances. Additionally, the transmitter can be tuned to emit a sufficiently strong infrared signal to accommodate surfaces with varied reflectivity.
Wireless charging in accordance with the present teachings can comprise, for example, RF scavenging or magnetoresonance. Wireless charging via RF scavenging can be accomplished as disclosed in U.S. Patent Publication No. 2009/0102296, the entire disclosure of which is incorporated herein by reference. The antenna 250 (e.g., an RF wireless communication antenna) can facilitate both energy harvesting and wireless communication for the transmitter 200 and, to facilitate energy harvesting, can harvest RF energy from a variety of sources including, for instance, medium frequency AM radio broadcast, very high frequency (VHF) FM radio broadcast, cellular base stations, wireless data access points, etc. The energy can be harvested from that naturally available in the environment (work area) or can be broadcast by a source such as an RF signal emitter on the autonomous vehicle or on another device such as a base station or a dedicated emitter.
Certain embodiments of the present teachings contemplate wireless charging via strongly coupled magnetic resonances, or magnetoresonance. Such wireless charging is described in detail in Kurs et al., Wireless Power Transfer via Strongly Coupled Magnetic Resonances, Science, Vol. 317, pp. 83-86 (Jul. 6, 2007), the entire disclosure of which is incorporated herein by reference. For wireless charging via magnetoresonance, the antenna 250 can comprise, for example, a capture coil that can capture and convert magnetic energy to AC voltage or DC voltage. The magnetic energy captured by the capture coil can be supplied by a power source such as a highly resonant magnetic source. The power source can be located, for example, on the autonomous vehicle (in a scenario such as that illustrated in
One skilled in the art will appreciate that the transmitter 200 can derive its power from a source other than a battery, for example from a wall plug or by direct connection to a building's power supply. Also, the emitters can have differing locations on the transmitter, and need not be combined with a lens as illustrated. The size of the transmitter can vary in accordance with functional considerations (e.g., being large enough to house its components) as well as aesthetic considerations (e.g., minimizing size to be less obtrusive).
One skilled in the art will appreciate that the transmitter 500 can derive its power from a source other than a battery, for example from a wall plug or by direct connection to a building's power supply. Also, the emitters can have differing locations on the transmitter, and need not be combined with a collimator and/or a lens as illustrated. The size of the transmitter can vary in accordance with functional considerations (e.g., being large enough to house its components) as well as aesthetic considerations (e.g., minimizing size to be less obtrusive).
In embodiments of the present teachings employing more than two emitters, the signals can be utilized, e.g., with collimators or lenses, to distinguish different areas within a room. Such a configuration allows the autonomous vehicle 12 to sense its relative location within a room and adjust its cleaning behavior accordingly. For example, a signal could mark an area of the room that an autonomous vehicle would likely get stuck in. The signal could allow an autonomous vehicle to recognize the area and accordingly not enter it, even though it might otherwise be able to do so unimpeded. Alternatively or additionally, different signals could mark areas that require different cleaning behaviors (e.g., due to carpeting or wood floors, high traffic areas, etc.).
The transmitters 200, 300 as illustrated in
As illustrated in
Another input device 330, shown in the illustrated embodiment as a toggle pad or toggle button, can allow the user to direct the autonomous vehicle to perform a number of functions. For example, the user can press a center “CLEAN” portion of the toggle button to direct the autonomous vehicle to begin cleaning immediately, or can select the right “DOCK NOW” button to direct the autonomous vehicle to begin a homing behavior and dock. A top “SCHEDULE” button can be pressed to allow the user to select a schedule of rooms and/or times for cleaning, an exemplary process for which is illustrated in
In accordance with certain embodiments of the present teachings, the remote control 370 can also display a status screen such as that illustrated in
Other embodiments of the present teachings will be apparent to those skilled in the art from consideration of the specification and practice of the teachings disclosed herein. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the present teachings being indicated by the following claims.
This application is a continuation application of and claims priority to U.S. patent application Ser. No. 16/827,447, filed on Mar. 23, 2020, which is a continuation application of and claims priority to U.S. patent application Ser. No. 15/491,603, now U.S. Pat. No. 10,599,159, filed on Apr. 19, 2017, which is a continuation of U.S. patent application Ser. No. 15/366,367, now U.S. Pat. No. 9,921,586, filed on Dec. 1, 2016, which is a continuation of U.S. patent application Ser. No. 14/966,621, now U.S. Pat. No. 9,529,363, filed on Dec. 11, 2015, which is a continuation of U.S. patent application Ser. No. 13/731,393, now U.S. Pat. No. 9,223,749, filed on Dec. 31, 2012, which is a continuation of U.S. patent application Ser. No. 12/611,814, now U.S. Pat. No. 8,972,052, filed on Nov. 3, 2009. All of the foregoing applications are incorporated by reference in their entireties in the present application.
Number | Name | Date | Kind |
---|---|---|---|
4328545 | Halsall et al. | May 1982 | A |
4482960 | Pryor | Nov 1984 | A |
4638445 | Mattaboni | Jan 1987 | A |
4638446 | Palmberg | Jan 1987 | A |
4679152 | Perdue | Jul 1987 | A |
4691101 | Leonard | Sep 1987 | A |
4790402 | Field | Dec 1988 | A |
4817000 | Eberhardt | Mar 1989 | A |
4933864 | Evans, Jr. | Jun 1990 | A |
4947094 | Dyer | Aug 1990 | A |
4954962 | Evans, Jr. | Sep 1990 | A |
4962453 | Pong, Jr. | Oct 1990 | A |
5001635 | Yasutomi et al. | Mar 1991 | A |
5023788 | Kitazume | Jun 1991 | A |
5032775 | Mizuno | Jul 1991 | A |
5051906 | Evans et al. | Sep 1991 | A |
5086535 | Grossmeyer | Feb 1992 | A |
5165064 | Mattaboni | Nov 1992 | A |
5170352 | McTamaney | Dec 1992 | A |
5204814 | Noonan | Apr 1993 | A |
5220263 | Onishi | Jun 1993 | A |
5307271 | Everett et al. | Apr 1994 | A |
5321614 | Ashworth | Jun 1994 | A |
5341540 | Soupert | Aug 1994 | A |
5410479 | Coker | Apr 1995 | A |
5453931 | Watts, Jr. | Sep 1995 | A |
5525883 | Avitzour | Jun 1996 | A |
5528888 | Miyamoto et al. | Jun 1996 | A |
5634237 | Paranjpe | Jun 1997 | A |
5659779 | Laird | Aug 1997 | A |
5677836 | Bauer | Oct 1997 | A |
5696675 | Nakamura | Dec 1997 | A |
5770936 | Hirai et al. | Jun 1998 | A |
5841259 | Kim et al. | Nov 1998 | A |
5926909 | McGee | Jul 1999 | A |
5940930 | Oh | Aug 1999 | A |
5942869 | Katou et al. | Aug 1999 | A |
5995884 | Allen | Nov 1999 | A |
5998953 | Nakamura | Dec 1999 | A |
6009359 | El-Hakim et al. | Dec 1999 | A |
6076025 | Ueno et al. | Jun 2000 | A |
6076226 | Reed | Jun 2000 | A |
6142252 | Kinto | Nov 2000 | A |
6292712 | Bullen | Sep 2001 | B1 |
6336051 | Pangels et al. | Jan 2002 | B1 |
6339735 | Peless et al. | Jan 2002 | B1 |
6370452 | Pfister | Apr 2002 | B1 |
6370453 | Sommer | Apr 2002 | B2 |
6374155 | Wallach et al. | Apr 2002 | B1 |
6389329 | Colens | May 2002 | B1 |
6457206 | Judson | Oct 2002 | B1 |
6459955 | Bartsch et al. | Oct 2002 | B1 |
6493612 | Bisset et al. | Dec 2002 | B1 |
6496754 | Song et al. | Dec 2002 | B2 |
6496755 | Wallach et al. | Dec 2002 | B2 |
6532404 | Colens | Mar 2003 | B2 |
6574536 | Kawagoe et al. | Jun 2003 | B1 |
6580246 | Jacobs | Jun 2003 | B2 |
6594844 | Jones | Jul 2003 | B2 |
6597076 | Scheible et al. | Jul 2003 | B2 |
6615108 | Peless et al. | Sep 2003 | B1 |
6658325 | Zweig | Dec 2003 | B2 |
6667592 | Jacobs | Dec 2003 | B2 |
6690134 | Jones et al. | Feb 2004 | B1 |
6732826 | Song et al. | May 2004 | B2 |
6781338 | Jones et al. | Aug 2004 | B2 |
6809490 | Jones et al. | Oct 2004 | B2 |
6830120 | Yashima et al. | Dec 2004 | B1 |
6836701 | McKee | Dec 2004 | B2 |
6845297 | Allard | Jan 2005 | B2 |
6868307 | Song | Mar 2005 | B2 |
6898516 | Pechatnikov | May 2005 | B2 |
6965209 | Jones et al. | Nov 2005 | B2 |
7041029 | Fulghum et al. | May 2006 | B2 |
7042342 | Luo | May 2006 | B2 |
7054716 | McKee | May 2006 | B2 |
7155308 | Jones | Dec 2006 | B2 |
7173391 | Jones et al. | Feb 2007 | B2 |
7196487 | Jones et al. | Mar 2007 | B2 |
7332890 | Cohen et al. | Feb 2008 | B2 |
7349091 | Wada et al. | Mar 2008 | B2 |
7388343 | Jones et al. | Jun 2008 | B2 |
7388879 | Sabe et al. | Jun 2008 | B2 |
7389156 | Ziegler et al. | Jun 2008 | B2 |
7448113 | Jones et al. | Nov 2008 | B2 |
7456596 | Goodall | Nov 2008 | B2 |
7480958 | Song | Jan 2009 | B2 |
7567052 | Jones et al. | Jul 2009 | B2 |
7571511 | Jones et al. | Aug 2009 | B2 |
7579803 | Jones et al. | Aug 2009 | B2 |
7636982 | Jones et al. | Dec 2009 | B2 |
7706917 | Chiappetta et al. | Apr 2010 | B1 |
7739036 | Grimm et al. | Jun 2010 | B2 |
7761954 | Ziegler et al. | Jul 2010 | B2 |
7805220 | Taylor et al. | Sep 2010 | B2 |
7860680 | Arms et al. | Dec 2010 | B2 |
8368339 | Jones et al. | Feb 2013 | B2 |
8380350 | Ozick et al. | Feb 2013 | B2 |
8396599 | Matsuo et al. | Mar 2013 | B2 |
8493344 | Fleizach | Jul 2013 | B2 |
8659255 | Jones et al. | Feb 2014 | B2 |
8659256 | Jones et al. | Feb 2014 | B2 |
8681106 | Fleizach | Mar 2014 | B2 |
8686679 | Jones et al. | Apr 2014 | B2 |
8688272 | Hong et al. | Apr 2014 | B2 |
8761931 | Halloran | Jun 2014 | B2 |
8818706 | Ogale | Aug 2014 | B1 |
8843245 | Choe | Sep 2014 | B2 |
8954192 | Ozick et al. | Feb 2015 | B2 |
8972052 | Chiappetta | Mar 2015 | B2 |
8988424 | Tsuchida | Mar 2015 | B2 |
9008835 | Dubrovsky | Apr 2015 | B2 |
9009612 | Fleizach | Apr 2015 | B2 |
9144360 | Ozick et al. | Sep 2015 | B2 |
9229454 | Chiappetta et al. | Jan 2016 | B1 |
9233472 | Angle | Jan 2016 | B2 |
9265396 | Lu | Feb 2016 | B1 |
9314925 | Hong et al. | Apr 2016 | B2 |
9349058 | Schamp | May 2016 | B2 |
9375847 | Angle | Jun 2016 | B2 |
9392920 | Halloran et al. | Jul 2016 | B2 |
9446510 | Vu | Sep 2016 | B2 |
9486924 | Dubrovsky | Nov 2016 | B2 |
9529363 | Chiappetta | Dec 2016 | B2 |
9582005 | Jones et al. | Feb 2017 | B2 |
9629514 | Hillen et al. | Apr 2017 | B2 |
9675226 | Kim et al. | Jun 2017 | B2 |
9717387 | Szatmary | Aug 2017 | B1 |
9802322 | Angle | Oct 2017 | B2 |
9828094 | Mcmillion | Nov 2017 | B2 |
9874873 | Angle et al. | Jan 2018 | B2 |
9921586 | Chiappetta | Mar 2018 | B2 |
9958871 | Jones et al. | May 2018 | B2 |
10168709 | Kleiner et al. | Jan 2019 | B2 |
10310507 | Kleiner et al. | Jun 2019 | B2 |
10365659 | Park et al. | Jul 2019 | B2 |
10591921 | Wang | Mar 2020 | B2 |
10599159 | Chiappetta | Mar 2020 | B2 |
10824165 | Jones et al. | Nov 2020 | B2 |
10990110 | Chiappetta et al. | Apr 2021 | B2 |
11006802 | Watanabe et al. | May 2021 | B2 |
11209833 | Chiappetta | Dec 2021 | B2 |
20010004719 | Sommer | Jun 2001 | A1 |
20020016649 | Jones | Feb 2002 | A1 |
20020060542 | Song et al. | May 2002 | A1 |
20020120364 | Colens | Aug 2002 | A1 |
20020153184 | Song | Oct 2002 | A1 |
20030023356 | Keable | Jan 2003 | A1 |
20030025472 | Jones et al. | Feb 2003 | A1 |
20030030398 | Jacobs | Feb 2003 | A1 |
20030030399 | Jacobs | Feb 2003 | A1 |
20030090522 | Verhaar | May 2003 | A1 |
20030120389 | Abramson | Jun 2003 | A1 |
20030212472 | McKee | Nov 2003 | A1 |
20040020000 | Jones | Feb 2004 | A1 |
20040049877 | Jones et al. | Mar 2004 | A1 |
20040073337 | McKee | Apr 2004 | A1 |
20040083570 | Song et al. | May 2004 | A1 |
20040085037 | Jones et al. | May 2004 | A1 |
20040113777 | Matsuhira | Jun 2004 | A1 |
20040187457 | Colens | Sep 2004 | A1 |
20040201361 | Koh et al. | Oct 2004 | A1 |
20040204792 | Taylor et al. | Oct 2004 | A1 |
20040207355 | Jones et al. | Oct 2004 | A1 |
20040211444 | Taylor et al. | Oct 2004 | A1 |
20040220707 | Pallister | Nov 2004 | A1 |
20040236468 | Taylor et al. | Nov 2004 | A1 |
20050000543 | Taylor et al. | Jan 2005 | A1 |
20050067994 | Jones et al. | Mar 2005 | A1 |
20050171636 | Tani | Aug 2005 | A1 |
20050204505 | Kashiwagi | Sep 2005 | A1 |
20050204717 | Colens | Sep 2005 | A1 |
20050270530 | Wada et al. | Dec 2005 | A1 |
20050287038 | Dubrovsky | Dec 2005 | A1 |
20060038521 | Jones et al. | Feb 2006 | A1 |
20060178777 | Park | Aug 2006 | A1 |
20060220900 | Ceskutti et al. | Oct 2006 | A1 |
20060288519 | Jaworski et al. | Dec 2006 | A1 |
20060293788 | Pogodin | Dec 2006 | A1 |
20070100548 | Small | May 2007 | A1 |
20070250212 | Halloran et al. | Oct 2007 | A1 |
20070266508 | Jones et al. | Nov 2007 | A1 |
20070271011 | Lee | Nov 2007 | A1 |
20070290649 | Jones et al. | Dec 2007 | A1 |
20080007203 | Cohen et al. | Jan 2008 | A1 |
20080039974 | Sandin et al. | Feb 2008 | A1 |
20080051953 | Jones et al. | Feb 2008 | A1 |
20080058987 | Ozick et al. | Mar 2008 | A1 |
20080084174 | Jones et al. | Apr 2008 | A1 |
20080091304 | Ozick et al. | Apr 2008 | A1 |
20080109126 | Sandin et al. | May 2008 | A1 |
20080140255 | Ziegler et al. | Jun 2008 | A1 |
20080155768 | Ziegler et al. | Jul 2008 | A1 |
20080263477 | Ying | Oct 2008 | A1 |
20080266748 | Lee | Oct 2008 | A1 |
20080307590 | Jones et al. | Dec 2008 | A1 |
20090102296 | Greene et al. | Apr 2009 | A1 |
20090182464 | Myeong et al. | Jul 2009 | A1 |
20090319083 | Jones et al. | Dec 2009 | A1 |
20100049365 | Jones et al. | Feb 2010 | A1 |
20100082193 | Chiappetta | Apr 2010 | A1 |
20100257690 | Jones et al. | Oct 2010 | A1 |
20100257691 | Jones et al. | Oct 2010 | A1 |
20100263158 | Jones et al. | Oct 2010 | A1 |
20100268384 | Jones et al. | Oct 2010 | A1 |
20100292839 | Hong et al. | Nov 2010 | A1 |
20100309147 | Fleizach | Dec 2010 | A1 |
20100309148 | Fleizach | Dec 2010 | A1 |
20100312429 | Jones et al. | Dec 2010 | A1 |
20100313125 | Fleizach | Dec 2010 | A1 |
20110004339 | Ozick et al. | Jan 2011 | A1 |
20110077802 | Halloran | Mar 2011 | A1 |
20110202175 | Romanov | Aug 2011 | A1 |
20110264305 | Choe et al. | Oct 2011 | A1 |
20120168240 | Wilson | Jul 2012 | A1 |
20120259481 | Kim | Oct 2012 | A1 |
20120262284 | Irrgang et al. | Oct 2012 | A1 |
20120265391 | Letsky | Oct 2012 | A1 |
20130103194 | Jones et al. | Apr 2013 | A1 |
20140116469 | Kim et al. | May 2014 | A1 |
20140156071 | Hong et al. | Jun 2014 | A1 |
20140166047 | Hillen et al. | Jun 2014 | A1 |
20140207281 | Angle et al. | Jul 2014 | A1 |
20140207282 | Angle et al. | Jul 2014 | A1 |
20140222251 | Jones et al. | Aug 2014 | A1 |
20140316636 | Hong | Oct 2014 | A1 |
20150223659 | Han | Aug 2015 | A1 |
20160010574 | Kumar et al. | Jan 2016 | A1 |
20160282863 | Angle et al. | Sep 2016 | A1 |
20170097641 | Jones et al. | Apr 2017 | A1 |
20170123433 | Chiappetta | May 2017 | A1 |
20170265703 | Park et al. | Sep 2017 | A1 |
20180074508 | Kleiner et al. | Mar 2018 | A1 |
20180257688 | Carter et al. | Sep 2018 | A1 |
20180284786 | Moshkina-Martinson | Oct 2018 | A1 |
20180284792 | Kleiner et al. | Oct 2018 | A1 |
20180292828 | Jones et al. | Oct 2018 | A1 |
20190150686 | Horne et al. | May 2019 | A1 |
20200218282 | Chiappetta | Jul 2020 | A1 |
20200233434 | Chiappetta | Jul 2020 | A1 |
20210011485 | Chiappetta | Jan 2021 | A1 |
20210089040 | Afrouzi et al. | Mar 2021 | A1 |
20210294348 | Lan et al. | Sep 2021 | A1 |
20210333799 | Chiappetta et al. | Oct 2021 | A1 |
20210333800 | Chiappetta et al. | Oct 2021 | A1 |
Entry |
---|
“Facts on the Trilobite,” Electrolux, accessed online <http://trilobite.electrolux.se/presskit_en/node1335.asp?print=yes&pressID=> Dec. 12, 2003, 2 pages. |
[No Author Listed], “Patent Owner's Oral Hearing Demonstratives,” SharkNinja Operating LLC, SharkNinja Management LLC, and SharkNinja Sales Co. (Petitioners) v. iRobot Corp. (Patent Owner), Case No. IPR2020-00734, U.S. Pat. No. 9,921,586, Before Hon. Terrence W. McMillin, Amanda F. Wieker, and Jason W. Melvin, Filed Jul. 7, 2021, 31 pages. |
[No Author Listed], “Petitioner's Demonstratives,” SharkNinja Operating LLC et al. v. iRobot Corp., IPR2020-00734, U.S. Pat. No. 9,921,586, Jul. 12, 2021, 32 pages. |
Certified English Translation of DE Patent No. 10113105, issued on Oct. 4, 2001, Koechel et al. (Exhibit No. 1008 in IPR2020-00734). |
Certified English Translation of JP H07281752, issued on Oct. 27, 1995, Hamaguchi et al. (Exhibit No. 1010 in IPR2020-00734). |
Certified English Translation of JP Patent No. 2002-85305, issued on Mar. 26, 2002, Murakami (Exhibit No. 1005 in IPR2020-00734). |
Curriculum Vitae of Alonzo J. Kelly, (undated), 44 pages (Exhibit No. 1003 in IPR2020-00734). |
Curriculum Vitae of Noah Cowan, Ph.D., dated Apr. 25, 2021, 21 pages (Exhibit 1004, PGR2021-00079), 21 pages. |
DE Patent No. 10113105, published on Oct. 4, 2001, Koechel et al. (Exhibit No. 1007 in IPR2020-00734)(with English Abstract). |
Doty et al., “Sweep Strategies for a Sensory-Driven, Behavior-Based Vacuum Cleaning Agent,” AAAI 1993 Fall Symposium Series, Instantiating Real-World Agents, pp. 1-6, Oct. 22-24, 1993. |
Electrolux “Welcome to the Electrolux trilobite” www.electroluxusa.com/node57.asp?currentURL=node142.asp%3F, 2 pages, Mar. 18, 2005. |
Everett, H.R. (1995). Sensors for Mobile Robots. AK Peters, Ltd., Wellesley, MA., 543 pages. |
File History for U.S. Appl. No. 15/366,367, filed Dec. 1, 2016, 216 pages (Exhibit No. 1013 in IPR2020-00734). |
Hitachi: News release: “The home cleaning robot of the autonomous movement type (experimental machine),” Retrieved from the Internet: URL<www.i4u.com./japanreleases/hitachirobot.htm>, 5 pages, Mar. 2005. |
Honda Motor Co., Ltd., English Translation of JP11212642, Aug. 9, 1999, 31 pages. |
IRobot Corporation v. SharkNinja Operating LLC, SharkNinja Management LLC, and SharkNinja Sales Company, “Complaint for Patent Infringement,” Civil Action No. DMA-1-19-cv-12125-1, dated Oct. 15, 2019, 20 pages (IPR2020-00734, Exhibit No. 2012). |
IRobot Corporation v. SharkNinja Operating LLC, SharkNinja Management LLC, and SharkNinja Sales Company, “Patent Owner's Preliminary Sur-Reply,” Case No. IPR2020-00734, U.S. Pat. No. 9,921,586, filed Aug. 14, 2020, 9 pages. |
IRobot Corporation v. SharkNinja Operating LLC, SharkNinja Management LLC, and SharkNinja Sales Company, “Transcript of Hearing Before the Honorable Allison D. Burroughs,” Civil Action No. 19-cv-12125-ADB, dated Oct. 25, 2019, 33 pages (IPR2020-00734, Exhibit No. 2008). |
Jones, J., Roth, D. (Jan. 2, 2004). Robot Programming: A Practical Guide to Behavior-Based Robotics. McGraw-Hill Education TAB; 288 pages. |
JP Patent No. 2002-85305, issued on Mar. 26, 2002, Murakami (Exhibit No. 1004 in IPR2020-00734) (with English Abstract). |
JP Patent No. H07281752, issued on Oct. 27, 1995, Hamaguchi et al. (Exhibit No. 1009 in IPR2020-00734) (with English Abstract). |
Karcher “Karcher RoboCleaner RC 3000,” Retrieved from the Internet: URL<www.robocleaner.de/english/screen3.html>. 4 pages, Dec. 2003. |
Karcher USA, RC3000 Robotic Cleaner, website: http://www.karcher-USA.com/showproducts.php?op=viewprod&param1=143&param2=&param3=, 3 pages, accessed Mar. 2005. |
Karcher, “Product Manual Download Karch”, available at www.karcher.com, 16 pages, 2004. |
Kurs et al., “Wireless Power Transfer via Strongly Coupled Magnetic Resonances,” Science, Jul. 6, 2007, 317:83-86 (Exhibit 1024, PGR2021-00079), 5 pages. |
Nam et al., “Real-Time Dynamic Visual Tracking Using PSD Sensors and extended Trapezoidal Motion Planning”, Applied Intelligence 10, pp. 53-70, 1999. |
Popco.net, “Make your digital life,” http://www.popco.net/zboard/view.php?id=tr_review&no=40, accessed Nov. 1, 2011. |
Prassler et al., “A Short History of Cleaning Robots,” Autonomous Robots 9, 211-226, 2000, 16 pages. |
U.S. Appl. No. 60/586,046, “Celestial Navigation System for an Autonomous Robot,” filed on Jul. 7, 2004, Chiappetta, (Exhibit 1025, PGR2021-00079), 26 pages. |
SharkNinja Operating LLC, SharkNinja Management LLC, and SharkNinja Sales Company, Petitioners, v. iRobot Corporation, Patent Owner, Case PGR2021-00079, U.S. Pat. No. 10,990,110, Petitioners' Power of Attorney, Filed Apr. 28, 2021, 3 pages. |
SharkNinja Operating LLC, SharkNinja Management LLC, and SharkNinja Sales Company v. iRobot Corporation, “Declaration of Timothy Johnston,” Case No. IPR2020-00734, U.S. Pat. No. 9,921,586, filed Aug. 7, 2020, 4 pages (Exhibit No. 1025). |
SharkNinja Operating LLC, SharkNinja Management LLC, and SharkNinja Sales Company v. iRobot Corporation, “Patent Owner's Preliminary Response,” Case No. IPR2020-00734, U.S. Pat. No. 9,921,586, filed Jul. 7, 2020, 45 pages. |
SharkNinja Operating LLC, SharkNinja Management LLC, and SharkNinja Sales Company v. iRobot Corporation, “Petitioners' Reply to Patent Owner Preliminary Response,” Case No. IPR2020-00734, U.S. Pat. No. 9,921,586, filed Aug. 7, 2020, 9 pages. |
SharkNinja Operating LLC, SharkNinja Management LLC, and SharkNinja Sales Company v. iRobot Corporation, “Petitioners' Updated Exhibit List,” Case No. IPR2020-00734, U.S. Pat. No. 9,921,586, filed Aug. 7, 2020, 3 pages. |
SharkNinja Operating LLC, SharkNinja Management LLC, and SharkNinja Sales Company v. iRobot Corporation, Declaration of Dr. Alonzo Kelly of U.S. Pat. No. 9,921,586, dated Mar. 23, 2020, 107 pages (Exhibit No. 1002 in IPR2020-00734). |
SharkNinja Operating LLC, SharkNinja Management LLC, and SharkNinja Sales Company v. iRobot Corporation, Petitioners' Power of Attorney in Case No. IPR2020-00734 for U.S. Pat. No. 9,921,586, dated Mar. 18, 2020, 2 pages. |
SharkNinja Operating LLC, SharkNinja Management LLC, and SharkNinja Sales Company v. iRobot Corporation, Petition for Inter Partes Review of U.S. Pat. No. 9,921,586, Case No. IPR2020-00734, dated Mar. 23, 2020, 83 pages. |
SharkNinja Operating LLC, SharkNinja Management LLC, and SharkNinja Sales Company, Petitioners, v. iRobot Corporation, Patent Owner, Case IPR2020-00734, U.S. Pat. No. 9,921,586, Patent Owner's Exhibit List, filed Jul. 7, 2021, 4 pages. |
SharkNinja Operating LLC, SharkNinja Management LLC, and SharkNinja Sales Company, Petitioners, v. iRobot Corporation, Patent Owner, Case IPR2020-00734, U.S. Pat. No. 9,921,586, Patent Owner's Sur-Reply, dated Jun. 4, 2021, 33 pages. |
SharkNinja Operating LLC, SharkNinja Management LLC, and SharkNinja Sales Company, Petitioners, v. iRobot Corporation, Patent Owner, Case IPR2020-00734, U.S. Pat. No. 9,921,586, Petitioners' Reply in Support of Petition for Inter Partes Review of U.S. Pat. No. 9,921,586, filed Apr. 23, 2021, 28 pages (IPR2020-00734). |
SharkNinja Operating LLC, SharkNinja Management LLC, and SharkNinja Sales Company, Petitioners, v. iRobot Corporation, Patent Owner, Case IPR2020-00734, U.S. Pat. No. 9,921,586, Petitioners' Updated Exhibit List, filed Apr. 23, 2021, 4 pages (IPR2020-00734). |
SharkNinja Operating LLC, SharkNinja Management LLC, and SharkNinja Sales Company, Petitioners, v. iRobot Corporation, Patent Owner, Case IPR2020-00734, U.S. Pat. No. 9,921,586, Remote Zoom Deposition of Dr. Richard Hooper, Conducted Apr. 14, 2021, 84 pages (Exhibit 1028 in IPR2020-00734). |
SharkNinja Operating LLC, SharkNinja Management LLC, and SharkNinja Sales Company, Petitioners, v. iRobot Corporation, Patent Owner, Case No. PGR2021-00079, U.S. Pat. No. 10,990,110, Declaration of Dr. Noah Cowan Regarding U.S. Pat. No. 10,990,110, (Exhibit 1003, PGR2021-00079), 55 pages. |
SharkNinja Operating LLC, SharkNinja Management LLC, and SharkNinja Sales Company, Petitioners, v. iRobot Corporation, Patent Owner, Case No. PGR2021-00079, U.S. Pat. No. 10,990,110, Petition for Post-Grant Review of U.S. Pat. No. 10,990,110, Filed Apr. 28, 2021, 60 pages. |
SharkNinja Operating LLC, SharkNinja Management LLC, and SharkNinja Sales Company, Petitioners, v. iRobot Corporation, Patent Owner. Case No. IPR2020-00734, U.S. Pat. No. 9,921,586, Petitioners' Updated Exhibit List, Filed Jul. 7, 2021, 4 pages. |
Statutory Disclaimer in Patent under 37 CFR § 1.321(a), SharkNinja v. iRobot, Case No. IPR2020-00734, dated Jul. 7, 2020, 5 pages (IPR2020-00734, Exhibit No. 2001). |
U.S. Pat. No. 10,990,110, issued on Apr. 27, 2021, Chiappetta, (Exhibit 1001, PGR2021-00079), 38 pages. |
U.S. Pat. No. 6,781,338, issued on Aug. 24, 2004, Jones et al. (Exhibit 1018, PGR2021-00079), 18 pages. |
U.S. Pat. No. 8,972,052, issued on Mar. 3, 2015, Chiappetta, (Exhibit 1010, PGR2021-00079), 51 pages. |
U.S. Publication No. 2020/0218282, published on Jul. 9, 2020, Chiappetta, (Exhibit 1005, PGR2021-00079), 36 pages. |
U.S. Appl. No. 17/037,057, Prosecution History of the '110 patent, Chiappetta et al., issued on Apr. 27, 2021, (Exhibit 1002, PGR2021-00079), 178 pages. |
U.S. Pat. No. 10,599,159, issued on Mar. 24, 2020, Chiappetta, (Exhibit 1006, PGR2021-00079), 38 pages. |
U.S. Pat. No. 6,594,844, issued on Jul. 22, 2003, Jones, (Exhibit 1014, PGR2021-00079), 26 pages. |
U.S. Pat. No. 6,690,134, issued on Feb. 10, 2004, Jones et al., (Exhibit 1017, PGR2021-00079), 19 pages. |
U.S. Pat. No. 6,809,490, issued on Oct. 26, 2004, Jones et al., (Exhibit 1016, PGR2021-00079), 28 pages. |
U.S. Pat. No. 6,883,201, issued on Apr. 26, 2005, Jones et al., (Exhibit 1020, PGR2021-00079), 26 pages. |
U.S. Pat. No. 7,024,278, issued on Apr. 4, 2006, Chiappetta et al., (Exhibit 1019, PGR2021-00079), 26 pages. |
U.S. Pat. No. 7,332,890, issued Feb. 19, 2008, Cohen et al., (Exhibit 1021, PGR2021-00079), 26 pages. |
U.S. Pat. No. 7,706,917, issued on Apr. 27, 2010, Chiappetta et al., (Exhibit 1013, PGR2021-00079), 36 pages. |
U.S. Pat. No. 8,594,840, dated Nov. 26, 2013, Chiappetta et al., (Exhibit 1011, PGR2021-00079), 45 pages. |
U.S. Pat. No. 8,634,956, issued on Jan. 21, 2014, Chiappetta et al., (Exhibit No. 1012, PGR2021-00079), 40 pages. |
U.S. Pat. No. 9,223,749, issued on Dec. 29, 2015, Chiappetta, (Exhibit 1009, PGR2021-00079), 49 pages. |
U.S. Pat. No. 9,529,363, issued on Dec. 27, 2016, Chiappetta, (Exhibit 1008, PGR2021-00079), 37 pages. |
U.S. Pat. No. 9,921,586, dated Mar. 20, 2018, Chiappetta et al., (Exhibit 1007, PGR2021-00079), 38 pages. |
U.S. Publication No. 2008/0294288, published on Nov. 27, 2008, Yamauchi, (Exhibit 1022, PGR2021-00079), 61 pages. |
U.S. Publication No. 2009/0102296, published Apr. 23, 2009, Greene et al., (Exhibit 1023, PGR2021-00079), 113 pages. |
U.S. Pat. No. 5,440,216, issued on Aug. 8, 1995, Kim (Exhibit No. 1012 in IPR2020-00734). |
U.S. Pat. No. 9,921,586, issued on Mar. 20, 2018, Chiappetta (Exhibit No. 1001 in IPR2020-00734). |
US Publication No. 2002/0156556, published on Oct. 24, 2002, Ruffner (Exhibit No. 1006 in IPR2020-00734). |
US Publication No. 2004/0083570, published on May 6, 2004, Song et al. (Exhibit No. 1011 in IPR2020-00734). |
US Publication No. 2004/0167667, published on Aug. 26, 2004, Goncalves et al. (Exhibit No. 1014 in IPR2020-00734). |
US Publication No. 2005/0000543, published on Jan. 6, 2005, Taylor et al. (Exhibit No. 1015 in IPR2020-00734). |
US Publication No. 2005/0000543, published on Jan. 6, 2005, Taylor et al., 32 pages (Exhibit No. 1026 in IPR2020-00734). |
US Publication No. 2005/0288079, published on Dec. 29, 2005, Tani, 31 pages (Exhibit No. 1027 in IPR2020-00734). |
WO Publication No. 2004/025947, published on Mar. 25, 2004, Chiappetta, (Exhibit 1015, PGR2021-00079), 57 pages. |
SharkNinja Operating LLC, SharkNinja Management LLC, and SharkNinja Sales Company, Petitioners, v. iRobot Corporation, Patent Owner, Case PGR2021-00079, U.S. Pat. No. 10,990,110, Patent Owner's Preliminary Response, dated Aug. 16, 2021, 24 pages. |
SharkNinja Operating LLC, SharkNinja Management LLC, and SharkNinja Sales Company, Petitioners, v. iRobot Corporation, Patent Owner, IPR2020-00734, U.S. Pat. No. 9,921,586, Record of Oral Hearing held on Jul. 12, 2021, entered Jul. 28, 2021, 42 pages. |
SharkNinja Operating LLC, SharkNinja Management LLC, and SharkNinja Sales Company, Petitioners, v. iRobot Corporation, Patent Owner. IPR2020-00734, U.S. Pat. No. 9,921,586 B2, Final Written Decision Determining Some Challenged Claims Unpatentable, Entered Oct. 4, 2021, 29 pages. |
U.S. Appl. No. 16/827,447, filed Mar. 23, 2020. |
U.S. Appl. No. 15/491,603, filed Apr. 19, 2017. |
U.S. Appl. No. 15/366,367, filed Dec. 1, 2016. |
U.S. Appl. No. 14/966,621, filed Dec. 11, 2015. |
U.S. Appl. No. 13/731,393, filed Dec. 31, 2012. |
U.S. Appl. No. 12/611,814, filed Nov. 3, 2009. |
SharkNinja Operating LLC, SharkNinja Management LLC, and SharkNinja Sales Company, Petitioners, v. iRobot Corporation, Patent Owner. Case No. IPR2020-00734, U.S. Pat. No. 9,921,586, Patent Owner's Notice of Appeal, Dec. 6, 2021, 36 pages. |
SharkNinja Operating LLC, SharkNinja Management LLC, and SharkNinja Sales Company, Petitioners, v. iRobot Corporation, Patent Owner. Case No. IPR2020-00734, U.S. Pat. No. 9,921,586, Petitioners' Notice of Cross-Appeal, Filed Dec. 20, 2021, 34 pages. |
SharkNinja Operating LLC, SharkNinja Management LLC, and SharkNinja Sales Company, Petitioners, v. iRobot Corporation, Patent Owner, Case PGR2021-00079, U.S. Pat. No. 10,990,110, Patent Owner's Response, dated Feb. 1, 2022, 29 pages. |
SharkNinja Operating LLC, SharkNinja Management LLC, and SharkNinja Sales Company, Petitioners, v. iRobot Corporation, Patent Owner. Case IPR2020-00735, U.S. Pat. No. 10,045,676, Petitioner's Reply to Patent Owner's Response, (iRobot Exhibit 2002, SharkNinja v. iRobot PGR2021-00079), dated Feb. 1, 2022, 28 pages. |
Number | Date | Country | |
---|---|---|---|
20210341943 A1 | Nov 2021 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 16827447 | Mar 2020 | US |
Child | 17371033 | US | |
Parent | 15491603 | Apr 2017 | US |
Child | 16827447 | US | |
Parent | 15366367 | Dec 2016 | US |
Child | 15491603 | US | |
Parent | 14966621 | Dec 2015 | US |
Child | 15366367 | US | |
Parent | 13731393 | Dec 2012 | US |
Child | 14966621 | US | |
Parent | 12611814 | Nov 2009 | US |
Child | 13731393 | US |