The subject invention relates to nursery and greenhouse operations and to an adaptable container handling system including a robot which is able to pick up and transport containers such as plant containers to a specified location.
Nurseries and greenhouses regularly employ workers to reposition plants such as shrubs and trees in containers on plots of land as large as thirty acres or more. Hundreds or even thousands of containers may be brought to a field and then manually placed in rows at a designated spacing. Periodically, the containers are respaced, typically as the plants grow. Other operations include jamming (e.g., for plant retrieval in the fall), consolidation, and collection. See U.S. Patent Application No. 2005/0135912, incorporated herein by this reference.
The use of manual labor to accomplish these tasks is both costly and time-consuming. Attempts at automating such container handling tasks, however, have met with limited success.
Large-scale or hard automation is practiced at some greenhouses especially in Europe. The “Walking Plant System” (WPS Horti System, BV) is one prominent greenhouse automation scheme. In this method, conveyor belts, tables on tracks, and other centrally controlled mechanisms maneuver containers throughout the entire production cycle—plants are never placed on an outdoor floor or on un-actuated surfaces. Thus, from seed to sale, virtually no direct human labor is needed.
Such hard automation methods can be cost-effective where land and labor prices are high and in operations where only one variety of plant is produced. But in other circumstances (more typical of the U.S.), this form of automation loses some of its advantage. Because the automation components are fully integrated into the greenhouse, hard automation installations cannot easily be scaled up or down in response to demand. Further, the initial cost of a hard automation system is very high; growers essentially purchase a plant factory.
Visser International Trade and Engineering, BV has automated one container manipulation task known as “spacing.” Spacing involves positioning containers on an outdoor field or greenhouse floor with a surrounding buffer space so that plants have room to grow. A forklift-sized device called the “Space-O-Mat” uses multiple actuated forks to position plants in a regular pattern as the device backs up under autonomous control. The disadvantages of the Space-O-Mat become apparent when it is used on a real-world field: if the surface is somewhat soft or uneven, the Space-O-Mat may fail to deposit containers properly. Also, many greenhouses are too tightly configured to accommodate the Space-O-Mat.
A research project begun in 2000 and sponsored by NASA, Carnegie Mellon University, and the American Nursery and Landscape Association developed a semi-autonomous device called Junior, also intended to perform container spacing. Junior, a tractor-sized mechanism, automated part of the spacing task. Two workers move containers from a wagon to a conveyor on Junior that reaches across the wagon. The conveyor transports the plants to a mechanism with about a dozen grippers mounted on an arm. When each of the grippers has been loaded with a container, the arm activates, maneuvering the plants onto the ground and spacing them relative to plants already on the ground. Junior then backs up, positioning itself to accept the next group of plants from the wagon.
Junior has not been developed into a commercial product. Were Junior to be offered for sale, it would likely carry a high price. Junior is also a very complex piece of equipment and might require special expertise to operate. Because of its large size, Junior cannot be used inside many greenhouses.
It is therefore an object of the subject invention to provide a new robot for container handling.
It is a further object of the subject invention to provide a robot which is low in cost and simple and intuitive to operate.
It is a further object of the subject invention to provide such a robot which is simple in design and reliable.
It is a further object of the subject invention to provide such a robot which is small enough to maneuver indoors such as inside a typical greenhouse.
It is a further object of the subject invention to provide such a robot which eliminates much of the manual labor associated with nursery, greenhouse, and other growing operations.
It is a further object of the subject invention to provide such a robot which reliably operates even in adverse conditions.
It is a further object of the subject invention to provide such a robot which can operate for many hours and even work throughout the night.
It is a further object of the subject invention to provide such a robot which is able to place containers in their designated configuration with better accuracy than human workers.
It is a further object of the subject invention to provide a complete container handling system.
It is a further object of the subject invention to provide such a system which is scalable.
It is a further object of the subject invention to provide such a system which is safe.
The subject invention, however, in other embodiments, need not achieve all these objectives and the claims hereof should not be limited to structures or methods capable of achieving these objectives.
The subject invention features an adaptable container handling system including a boundary subsystem and one or more robots. Each robot typically includes a chassis, a container lift mechanism moveable with respect to the robot chassis for transporting at least one container, a drive subsystem for maneuvering the chassis, a boundary sensing subsystem, and a container detection subsystem. A robot controller is responsive to the boundary sensing subsystem and the container detection subsystem and is configured to control the drive subsystem to follow a boundary once intercepted until a container is detected and turn until another container is detected. The controller then controls the container lift mechanism to place a transported container proximate the second detected container.
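By way of illustration only, the placement behavior just described can be modeled as a small state machine. The following Python sketch is a hypothetical rendering of that loop under an assumed robot interface; method names such as boundary_detected() and place_carried_container() are inventions for this example and do not come from the specification.

```python
from enum import Enum, auto

class State(Enum):
    SEEK_BOUNDARY = auto()    # drive until the boundary is intercepted
    FOLLOW_BOUNDARY = auto()  # track the boundary until a container is seen
    TURN = auto()             # turn until a second container is seen
    PLACE = auto()            # deposit the carried container

def step(state, robot):
    """One control-loop iteration; `robot` is an assumed interface."""
    if state is State.SEEK_BOUNDARY:
        robot.drive_forward()
        return State.FOLLOW_BOUNDARY if robot.boundary_detected() else state
    if state is State.FOLLOW_BOUNDARY:
        robot.follow_boundary()
        return State.TURN if robot.container_detected() else state
    if state is State.TURN:
        robot.turn_in_place()
        return State.PLACE if robot.container_detected() else state
    # PLACE: put the transported container proximate the second
    # detected container, then go find the boundary again.
    robot.place_carried_container()
    return State.SEEK_BOUNDARY
```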
In one preferred embodiment, the controller controls the drive subsystem to return the robot to a prescribed container source location and controls the container lift mechanism to retrieve another container at the source location. In one example, a transmitting beacon is located at the source transmitting a signal and each robot further includes a receiver for receiving the transmitted beacon signal. The boundary subsystem typically includes reflective tape which may include non-reflective portions denoting distance.
In one design, the boundary sensing subsystem includes at least one infrared emitter and at least one infrared detector, and the container detection subsystem includes a linear array of infrared emitters and infrared detectors. In another design, the container detection subsystem includes a camera-based subsystem. One subsystem includes a laser source which emits a beam intersecting the field of view of the camera between xmin and xmax. The detection subsystem determines the distance between the robot and a container as a function of the angle of the beam. The camera is preferably mounted such that pixels are read out as columns rather than rows. The beam may be turned off between read-outs of adjacent columns, and the laser-off columns are then subtracted from the laser-on columns.
The container detection subsystem may also include a system configured to detect whether a container is present in the container lift mechanism. One such system includes an infrared source and an infrared detector.
The preferred drive subsystem includes a pair of driven wheels one on each side of the chassis. One container lift mechanism includes a rotatable yoke, a pair of spaced forks extending from the yoke for grasping a container, and a drive train driven in one direction to rotate the yoke to simultaneously extend and lower the forks and driven in the opposite direction to simultaneously retract and raise the forks. One preferred drive train includes a first sprocket rotatable with respect to the yoke, a second sprocket rotatable with respect to the chassis, a moving link associated with the yoke interconnecting the first and second sprockets, and a means for driving the second sprocket. The system also includes a motor driving a gearbox, a driver sprocket driven by the gearbox, a third sprocket driven by the driver sprocket and fixed to the second sprocket, and a fork extending from the moving link. A forward skid plate may be mounted to the robot chassis.
A typical user interface for the controller includes an input for setting the width of a bed, an input for setting a desired container spacing, an input for setting the desired spacing pattern, and an input for setting the desired container diameter.
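These four inputs might be gathered into a single settings record; the sketch below is one plausible Python representation (the field names and example values are assumptions, not taken from the specification).

```python
from dataclasses import dataclass

@dataclass
class SpacingSettings:
    # The four operator inputs described above; names are illustrative.
    bed_width_m: float           # width of the bed
    container_spacing_m: float   # desired center-to-center spacing
    spacing_pattern: str         # e.g. "square" or "staggered" (assumed)
    container_diameter_m: float  # diameter of the containers handled

# Example: a 6 m bed of 30 cm pots on 45 cm staggered centers.
settings = SpacingSettings(
    bed_width_m=6.0,
    container_spacing_m=0.45,
    spacing_pattern="staggered",
    container_diameter_m=0.30,
)
```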
The controller logic is typically configured to detect a container/boundary condition and to operate the drive subsystem and the container lift mechanism in response, based on the condition. A first condition is that no containers are detected before three boundaries are detected; the response is to locate the third boundary or obstruction and place a transported container proximate that boundary or obstruction, placing the first container in the first row. A second condition is that no container is detected before two boundaries are detected; the response is to follow the second boundary, detect a container, and place a transported container proximate the detected container, filling the first row with containers. A third condition is that a first boundary is followed, a container is detected, and the robot turns, but no additional containers are detected before a second boundary or obstacle is detected; the response is to place a transported container proximate the second boundary or obstruction, giving a new row its first container.
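A compact way to picture these three condition/response pairs is a dispatch on what has been detected so far. The Python sketch below is schematic only; the counts and response names are assumptions used to illustrate the mapping, not the patent's control logic.

```python
def choose_response(containers_seen: int, boundaries_seen: int) -> str:
    """Map detection history to one of the three responses above."""
    if containers_seen == 0 and boundaries_seen >= 3:
        # Condition 1: place the first container of the first row
        # proximate the third boundary or obstruction.
        return "place_at_third_boundary"
    if containers_seen == 0 and boundaries_seen == 2:
        # Condition 2: follow the second boundary until a container
        # is detected, then place next to it (filling the first row).
        return "follow_second_boundary_then_place"
    if containers_seen == 1 and boundaries_seen >= 2:
        # Condition 3: a container was found along the first boundary
        # but none after turning; start a new row at the boundary.
        return "place_at_boundary_to_start_row"
    return "keep_searching"
```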
One container handling robot in accordance with the subject invention may include a chassis; a drive subsystem for maneuvering the chassis; a container detection subsystem; and a container lift mechanism. There is a rotatable yoke, a pair of spaced forks extending from the yoke for grasping a container, and a drive train driven in one direction to rotate the yoke to simultaneously extend and lower the forks and drivable in an opposite direction to simultaneously retract and raise the forks.
A preferred version of the robot further includes a controller responsive to the container detection subsystem and configured to control the drive subsystem until a container is detected and to control the container lift mechanism to place a container carried by the forks proximate the detected container.
One adaptable plant container handling robot in accordance with the subject invention features a chassis; a container lift mechanism moveable with respect to the robot chassis for retrieving at least one container in a field at a first location, a drive subsystem for maneuvering the chassis to transport the container to a second location, a boundary sensing subsystem, a container detection subsystem, and a controller. The controller is responsive to the boundary sensing subsystem and the container detection subsystem and is configured to control the drive subsystem to follow a boundary once intercepted until a container at the second location is detected and turn until another container is detected, and control the container lift mechanism to place a transported container proximate the second detected container.
The subject invention also features a plant container handling robot including a chassis, a drive subsystem for maneuvering the chassis, and a container detection subsystem including a subsystem which detects containers on the ground, and a subsystem which detects a container carried by the robot. A container lift mechanism includes a rotatable yoke, a pair of spaced forks extending from the yoke for grasping a container, and a drive train which simultaneously extends and lowers the forks and simultaneously retracts and raises the forks. A controller is responsive to the container detection subsystem and configured to control the drive subsystem to maneuver the robot until a container is detected, control the container lift mechanism drive train to place a container carried by the forks proximate the detected container, control the drive subsystem to return the robot to a prescribed source location, and control the container lift mechanism drive train to retrieve another container at the source location.
Other objects, features and advantages will occur to those skilled in the art from the following description of a preferred embodiment and the accompanying drawings, in which:
Aside from the preferred embodiment or embodiments disclosed below, this invention is capable of other embodiments and of being practiced or being carried out in various ways. Thus, it is to be understood that the invention is not limited in its application to the details of construction and the arrangements of components set forth in the following description or illustrated in the drawings. If only one embodiment is described herein, the claims hereof are not to be limited to that embodiment. Moreover, the claims hereof are not to be read restrictively unless there is clear and convincing evidence manifesting a certain exclusion, restriction, or disclaimer.
In accordance with the subject invention, autonomous robots 20,
Each robot 20,
Electronic controller 34 is responsive to the outputs of both boundary sensing subsystem 30 and container detection subsystem 32 and is configured to control robot drive subsystem 36 and container lift mechanism 38 based on certain robot behaviors as explained below. Controller 34 is also responsive to user interface 100. The controller typically includes one or more microprocessors or equivalent programmed as discussed below. The power supply 31 for all the subsystems typically includes one or more rechargeable batteries located in the rear of the robot.
In one particular example, robot 20,
A drive train is employed to rotate yoke 46,
In one preferred embodiment, controller 34,
Controller 34,
Once positioned at the container source location, controller 34 controls drive subsystem 36 and lift mechanism 38 to retrieve another container as shown in
Thereafter, the remaining rows are filled with properly spaced containers as shown in
In this way, distributed containers at source A,
Using multiple fairly inexpensive and simple robots which operate reliably and continuously, large and even moderately sized growing operations can save money in labor costs.
The determination of the position of a container relative to the robot may be accomplished in several ways. In one example, a camera-based container detection system includes a camera and a source of structured light mounted on the robot R,
The structured light container measurement system may be a camera-based system designed to measure the x-y position of containers within the field of view of a camera at relatively short range. The system is largely unaffected by ambient light, is unaffected by robot speeds of up to two meters per second or more, and is very low in cost compared to more conventional systems.
In this system, the field of view of a camera intersects a flat wedge of laser light, as shown in
From
h′/f=h/x, and (1)
tan(θ)=h/(xmax−x) (2)
from which the following can be derived:
h′=f tan(θ)(xmax−x)/x, and (3)
x=(f tan(θ)xmax)/(h′+f tan(θ)) (4)
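Equations (3) and (4) translate directly into code. The sketch below simply evaluates the two formulas with illustrative values (only the ratio h′/f matters, so the focal length is normalized to 1):

```python
import math

def image_height(x, theta, f, x_max):
    """Equation (3): image height h' of the laser stripe for an
    object at range x."""
    return f * math.tan(theta) * (x_max - x) / x

def object_range(h_prime, theta, f, x_max):
    """Equation (4): recover the range x from the measured h'."""
    return (f * math.tan(theta) * x_max) / (h_prime + f * math.tan(theta))

# Round-trip check with the typical values given below
# (x_max about 2 m, theta about 5 degrees).
theta, f, x_max = math.radians(5.0), 1.0, 2.0
x = 0.75
assert abs(object_range(image_height(x, theta, f, x_max),
                        theta, f, x_max) - x) < 1e-9
```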
This novel system has the poorest resolution at the greatest distance and improves as the object moves toward the camera. For a typical system, xmax may be about 2 m, θ about 5 degrees, and ψ about 40 degrees. θ is determined by noting that h at xmin must approximately match the height of the smallest container the system needs to see, e.g., 6″. At xmax, h′=0; assume that an object is located 1 cm closer than this. Then,
h′/f=4.34×10⁻⁵. (5)
If the object were at xmin (which can be computed to be 18.9 cm), then
h′/f=0.008. (6)
The ratio of these two values of h′/f is about 181. Thus, the system can detect a difference of 1 cm at its maximum range if the camera field of view is 40 degrees and the camera has at least 181 pixels in the vertical dimension. At distances less than xmax, changes of 1 cm in range produce a more than one-pixel difference in the image.
The container measurement system also employs a novel method to identify the structured light signal in the image. This system is low in cost because it requires minimal computation and storage and can use a slow (30 Hz), low-resolution camera. Yet it is able to deliver rapid, accurate container positioning at all light levels.
The container measurement system uses triangulation and structured light to compute the range to a container. The key challenge in such a system is to identify pixels in the image that are illuminated by the structured laser light and reject pixels that are not. The traditional method for accomplishing this is frame differencing. Here, the structured light source is turned on and an image is recorded; the light source is turned off and a second image is recorded. Next, the second image is subtracted from the first on a pixel-by-pixel basis. If the only change between the two images is the presence of the structured light in the first, then the only non-zero pixels in the difference correspond to structured light. Two drawbacks to the use of this method in a robot in accordance with the subject invention are as follows. First, the robots move rapidly, and thus the constraint that nothing changes between images except the structured light is not satisfied. Second, the fact that two complete images must be stored and differenced adds storage and computation requirements, thus increasing cost.
The inventive container measuring system works as follows. The camera is rotated 90 degrees from its normal mounting position so that pixels normally exposed and read out in rows are read out as columns. The structured light (a plane of laser light) is synchronized to the column exposure. That is, the laser is turned on and a column is exposed and read out; then the laser is turned off and the adjacent column is exposed and read out; and so on. The detail in a typical image can be expected not to change very much from one column to the next. Thus, when the laser-off column is subtracted from the adjacent laser-on column, the largest value in the difference will correspond to pixels illuminated by the laser. Further, since cameras typically have more than 100 columns per frame, the robot will move two orders of magnitude less between the processing of two adjacent columns than between the processing of two frames.
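Under the assumption that laser-on and laser-off columns simply alternate across the (rotated) frame, the column-differencing idea might look like the following NumPy sketch; the interleaving convention and array layout are assumptions for illustration.

```python
import numpy as np

def laser_stripe_positions(frame: np.ndarray) -> np.ndarray:
    """Locate the laser stripe in a frame whose columns were exposed
    with the laser alternately on (even columns) and off (odd columns).
    `frame` is an (H, W) grayscale image from the rotated camera."""
    on = frame[:, 0::2].astype(np.int32)   # laser-on columns
    off = frame[:, 1::2].astype(np.int32)  # adjacent laser-off columns
    n = min(on.shape[1], off.shape[1])
    diff = on[:, :n] - off[:, :n]          # stripe dominates the difference
    # Row index of the strongest response in each column pair
    # approximates the stripe position (h' in the geometry above).
    return np.argmax(diff, axis=0)
```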
Depending on its velocity, the robot may move significantly during the time required to process a frame one column at a time. However, because the speed of the robot is known, it is possible to remove the resulting motion distortion from the data. Finally, beyond the ability to expose and read out image data one column at a time, the structured light system places no unusual demands on the camera. The camera's built-in facilities for automatic gain adjustment should enable it to operate under all light conditions. If, rather than the column-differencing scheme described above, a conventional frame-differencing scheme were used to measure the containers, the camera would have to meet a demanding standard to ensure that the difference between two images was almost entirely due to the modulation of the structured light source rather than the motion of other image features. From the geometry above, the most rapid change in image position occurs when an object is near xmin. If a CIF camera chip with 288 lines is used, it turns out that a change of one line in the image corresponds to a motion of 1.15 mm in x when the object is near xmin. At 2.0 m/s it takes the robot just 0.00115/2.0=575 μs to move that far. Thus, to cancel the effects of motion, the camera would have to grab an image 1739 times per second. Cameras that operate at this high frame rate are considerably more expensive than lower-speed cameras. However, for objects near xmax, a standard 30 Hz camera would be sufficient. Thus, an alternate strategy could attempt to detect objects when they are far away using a 30 Hz camera and slow the robot down before the object reaches xmin. This requires that the velocity near xmin be about 58 times slower than 2.0 m/s, or about 3.4 cm/s. This speed constraint would limit the utility of the system.
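The frame-rate arithmetic in this paragraph can be reproduced in a few lines (values copied from the text above):

```python
line_motion_m = 0.00115  # x-motion equal to one image line near xmin
robot_speed = 2.0        # m/s

dwell = line_motion_m / robot_speed       # 575e-6 s per line
required_fps = 1.0 / dwell                # ~1739 frames per second
slowdown = required_fps / 30.0            # ~58x for a 30 Hz camera
speed_near_xmin = robot_speed / slowdown  # ~0.034 m/s (3.4 cm/s)

print(round(required_fps), round(slowdown), round(speed_near_xmin, 3))
```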
Focused light or an infrared source could be used. The container detection system can also be implemented using discrete components rather than a camera. In
A flowchart of the container centering/pickup method is shown in
The preferred system of the subject invention minimizes cost by avoiding high-performance but expensive solutions in favor of lower-cost systems that deliver only as much performance as required and only in the places that performance is necessary. Thus, navigation and container placement are not typically enabled using, for example, a carrier-phase differential global positioning system. Instead, a combination of boundary following, beacon following, and dead-reckoning techniques is used. The boundary subsystem of the subject invention provides an indication for the robot regarding where to place containers, greatly simplifying the user interface.
The boundary provides a fixed reference and the robot can position itself with high accuracy with respect to the boundary. The robot places containers within a few feet of the boundary. This arrangement affords little opportunity for dead-reckoning errors to build up when the robot turns away from the boundary on the way to placing a container.
After the container is deposited, the robot must return to collect the next container. Containers are typically delivered to the field by the wagonload. By the time one wagonload has been spaced, the next will have been delivered further down the field. In order to indicate the next load, the user may position a beacon near that load. The robot follows this procedure: when no beacon is visible, the robot uses dead-reckoning to travel as nearly as possible to the place it last picked up a container. If it finds a container there, it collects and places the container in the usual way. If the robot can see the beacon, it moves toward the beacon until it encounters a nearby container. In this way, the robot is able to achieve the global goal of spacing all the containers in the field, using only local knowledge and sensing. Relying only on local sensing makes the system more robust and lower in cost.
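The pickup rule described here reduces to a short procedure. The sketch below is a hypothetical Python rendering; the robot interface (beacon_visible(), last_pickup_pose(), and so on) is assumed for illustration.

```python
def go_collect_next_container(robot):
    """Return to the container source using only local sensing."""
    if robot.beacon_visible():
        # A beacon marks the newest wagonload: move toward it until
        # a nearby container is encountered.
        robot.drive_toward_beacon()
    else:
        # No beacon in view: dead-reckon back to the last pickup spot
        # and look for another container there.
        robot.drive_to(robot.last_pickup_pose())
    if robot.container_detected():
        robot.pick_up_container()
```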
Users direct the robot by setting up one or two boundary markers, positioning a beacon, and dialing in several values. No programming is needed. The boundary markers show the robots where containers are to be placed. The beacon shows the robots where to pick up the containers.
Thus, although specific features of the invention are shown in some drawings and not in others, this is for convenience only as each feature may be combined with any or all of the other features in accordance with the invention. The words “including”, “comprising”, “having”, and “with” as used herein are to be interpreted broadly and comprehensively and are not limited to any physical interconnection. Moreover, any embodiments disclosed in the subject application are not to be taken as the only possible embodiments. For example, the robot system disclosed herein may be used for tasks other than those associated with growing operations.
In addition, any amendment presented during the prosecution of the patent application for this patent is not a disclaimer of any claim element presented in the application as filed: those skilled in the art cannot reasonably be expected to draft a claim that would literally encompass all possible equivalents, many equivalents will be unforeseeable at the time of the amendment and are beyond a fair interpretation of what is to be surrendered (if anything), the rationale underlying the amendment may bear no more than a tangential relation to many equivalents, and/or there are many other reasons the applicant can not be expected to describe certain insubstantial substitutes for any claim element amended.
Other embodiments will occur to those skilled in the art and are within the following claims.
This application hereby claims the benefit of and priority to U.S. Provisional Application Ser. No. 61/066,768, filed on Feb. 21, 2008 under 35 U.S.C. §§119, 120, 363, 365, and 37 C.F.R. §1.55 and §1.78.
Number | Name | Date | Kind |
---|---|---|---|
3913758 | Faircloth et al. | Oct 1975 | A |
4155198 | Kelley | May 1979 | A |
4217073 | Propst | Aug 1980 | A |
4401236 | Germaine | Aug 1983 | A |
4476651 | Drury | Oct 1984 | A |
4522546 | Ringer | Jun 1985 | A |
4700301 | Dyke | Oct 1987 | A |
4749327 | Roda | Jun 1988 | A |
4793096 | Todd, Sr. | Dec 1988 | A |
4854802 | deGroot | Aug 1989 | A |
4869637 | deGroot | Sep 1989 | A |
4994970 | Noji et al. | Feb 1991 | A |
5016541 | Feaster, Jr. | May 1991 | A |
5020964 | Hyatt et al. | Jun 1991 | A |
5020965 | Tanaka et al. | Jun 1991 | A |
5046914 | Holland et al. | Sep 1991 | A |
5051906 | Evans et al. | Sep 1991 | A |
5081941 | Weeks | Jan 1992 | A |
5085553 | Bouwens et al. | Feb 1992 | A |
5160235 | Bikow | Nov 1992 | A |
5181818 | Tanaka et al. | Jan 1993 | A |
5211523 | Andrada Galan et al. | May 1993 | A |
5315517 | Kawase et al. | May 1994 | A |
5332363 | Tanaka et al. | Jul 1994 | A |
5348063 | Handleman | Sep 1994 | A |
5348361 | Ilchuk | Sep 1994 | A |
5403142 | Stewart | Apr 1995 | A |
5427492 | Tanaka et al. | Jun 1995 | A |
5496143 | Breyer | Mar 1996 | A |
5688102 | Vieselmeyer | Nov 1997 | A |
5769589 | Lubbers | Jun 1998 | A |
5819863 | Zollinger et al. | Oct 1998 | A |
5842306 | Onosaka et al. | Dec 1998 | A |
5959423 | Nakanishi et al. | Sep 1999 | A |
5974348 | Rocks | Oct 1999 | A |
5988971 | Fossey et al. | Nov 1999 | A |
6164537 | Mariani et al. | Dec 2000 | A |
6186730 | Place | Feb 2001 | B1 |
6212821 | Adam et al. | Apr 2001 | B1 |
6216631 | Wissner-Gross | Apr 2001 | B1 |
6243987 | Hessel | Jun 2001 | B1 |
6255793 | Peless et al. | Jul 2001 | B1 |
6336051 | Pangels et al. | Jan 2002 | B1 |
6347920 | Place | Feb 2002 | B1 |
6389329 | Colens | May 2002 | B1 |
6417641 | Peless et al. | Jul 2002 | B2 |
6431818 | Place | Aug 2002 | B1 |
6481948 | Spears | Nov 2002 | B2 |
6496755 | Wallach et al. | Dec 2002 | B2 |
6508033 | Hessel et al. | Jan 2003 | B2 |
6532404 | Colens | Mar 2003 | B2 |
6543983 | Felder et al. | Apr 2003 | B1 |
6611738 | Ruffner | Aug 2003 | B2 |
6638004 | Berger et al. | Oct 2003 | B2 |
6658324 | Bancroft et al. | Dec 2003 | B2 |
6667592 | Jacobs et al. | Dec 2003 | B2 |
6729836 | Stingel, III et al. | May 2004 | B2 |
6850024 | Peless et al. | Feb 2005 | B2 |
6854209 | Van Horssen et al. | Feb 2005 | B2 |
6857493 | Shupp et al. | Feb 2005 | B2 |
6915607 | Tagawa et al. | Jul 2005 | B2 |
6950722 | Mountz | Sep 2005 | B2 |
6984952 | Peless et al. | Jan 2006 | B2 |
6988518 | Rackers | Jan 2006 | B2 |
6997663 | Siebenga | Feb 2006 | B2 |
7069111 | Glenn et al. | Jun 2006 | B2 |
7086820 | Blake | Aug 2006 | B1 |
7137770 | Ueda | Nov 2006 | B2 |
7184855 | Stingel, III et al. | Feb 2007 | B2 |
7198312 | Blaho | Apr 2007 | B2 |
7200465 | Stingel, III et al. | Apr 2007 | B2 |
7261511 | Felder et al. | Aug 2007 | B2 |
7274167 | Kim | Sep 2007 | B2 |
7343222 | Solomon | Mar 2008 | B2 |
7400108 | Minor et al. | Jul 2008 | B2 |
7506472 | Leyns et al. | Mar 2009 | B2 |
7559736 | Mohan | Jul 2009 | B1 |
7579803 | Jones et al. | Aug 2009 | B2 |
7610122 | Anderson | Oct 2009 | B2 |
7613544 | Park et al. | Nov 2009 | B2 |
20010008112 | Opitz | Jul 2001 | A1 |
20020120364 | Colens | Aug 2002 | A1 |
20020146306 | Morrell | Oct 2002 | A1 |
20020182046 | Schempf et al. | Dec 2002 | A1 |
20030030398 | Jacobs et al. | Feb 2003 | A1 |
20030118487 | Pressman et al. | Jun 2003 | A1 |
20030165373 | Felder et al. | Sep 2003 | A1 |
20030199944 | Chapin et al. | Oct 2003 | A1 |
20040139692 | Jacobsen et al. | Jul 2004 | A1 |
20050090961 | Bonk et al. | Apr 2005 | A1 |
20050126144 | Koselka et al. | Jun 2005 | A1 |
20050135912 | Schempf et al. | Jun 2005 | A1 |
20050135913 | Visser | Jun 2005 | A1 |
20050238465 | Razumov | Oct 2005 | A1 |
20050246056 | Marks et al. | Nov 2005 | A1 |
20050254924 | Swetman et al. | Nov 2005 | A1 |
20050254927 | Swetman et al. | Nov 2005 | A1 |
20050268987 | Rackers | Dec 2005 | A1 |
20060045679 | Ostendorff | Mar 2006 | A1 |
20060072988 | Hariki et al. | Apr 2006 | A1 |
20060095169 | Minor et al. | May 2006 | A1 |
20060120834 | Pressman et al. | Jun 2006 | A1 |
20060213167 | Koselka et al. | Sep 2006 | A1 |
20060221769 | Van Loenen et al. | Oct 2006 | A1 |
20060257236 | Stingel, III et al. | Nov 2006 | A1 |
20060293810 | Nakamoto | Dec 2006 | A1 |
20070017181 | Jacobsen et al. | Jan 2007 | A1 |
20070042803 | Anderson | Feb 2007 | A1 |
20070129849 | Zini et al. | Jun 2007 | A1 |
20070140821 | Garon et al. | Jun 2007 | A1 |
20070152619 | Sugiyama et al. | Jul 2007 | A1 |
20070219720 | Trepagnier et al. | Sep 2007 | A1 |
20080046130 | Faivre et al. | Feb 2008 | A1 |
20080131254 | Cope et al. | Jun 2008 | A1 |
20080279663 | Alexander | Nov 2008 | A1 |
20090012667 | Matsumoto et al. | Jan 2009 | A1 |
20090021351 | Beniyama et al. | Jan 2009 | A1 |
20090054222 | Zhang et al. | Feb 2009 | A1 |
20090148034 | Higaki et al. | Jun 2009 | A1 |
20090175709 | Okabe et al. | Jul 2009 | A1 |
20090254217 | Pack et al. | Oct 2009 | A1 |
20110025454 | Pomerantz et al. | Feb 2011 | A1 |
20120114187 | Duarte | May 2012 | A1 |
20130325159 | Kilibarda et al. | Dec 2013 | A1 |
Number | Date | Country |
---|---|---|
3828447 | Mar 1990 | DE |
0195191 | Sep 1986 | EP |
0774702 | May 1997 | EP |
S61204714 | Sep 1986 | JP |
03-285602 | Dec 1991 | JP |
07-065908 | Mar 1995 | JP |
11-077579 | Mar 1999 | JP |
2006-346767 | Dec 2006 | JP |
2007-508667 | Apr 2007 | JP |
2009-511288 | Mar 2009 | JP |
9422094 | Sep 1994 | WO
9959042 | Nov 1999 | WO |
2007004551 | Jun 2006 | WO
2009024246 | Feb 2009 | WO
Entry |
---|
European Search Report for EP09711619, dated Apr. 4, 2013. |
International Search Report and Written Opinion for PCT/US2012/035480, dated Nov. 16, 2012. |
International Searching Authority, Written Opinion of the International Searching Authority, International Application No. PCT/US2009/001031, mailed Apr. 20, 2009, 7 pages. (unnumbered). |
European Examination Report for EP09711619, dated Nov. 4, 2013. |
Number | Date | Country | |
---|---|---|---|
20090214324 A1 | Aug 2009 | US |
Number | Date | Country | |
---|---|---|---|
61066768 | Feb 2008 | US |