The invention generally relates to automated programmable motion control systems, e.g., robotic, sortation and other processing systems, and relates in particular to programmable motion control systems intended for use in environments requiring that a variety of objects (e.g., articles, packages, consumer products, etc.) be processed and moved to a number of processing destinations.
Many object distribution systems, for example, receive objects in a disorganized stream or bulk transfer that may be provided as individual objects or objects aggregated in groups such as in bags, arriving on any of several different conveyances, commonly a conveyor, a truck, a pallet, a Gaylord, or a bin. Each object must then be distributed to the correct destination location (e.g., a container) as determined by identification information associated with the object, which is commonly determined by a label printed on the object. The destination location may take many forms, such as a bag, a shelf, a container, or a bin.
The processing (e.g., sortation or distribution) of such objects has traditionally been done, at least in part, by human workers that scan the objects, for example with a hand-held barcode scanner, and then place the objects at assigned locations. Many order fulfillment operations, for example, achieve high efficiency by employing a process called wave picking. In wave picking, orders are picked from warehouse shelves and placed at locations (e.g., into bins) containing multiple orders that are sorted downstream. At the sorting stage, individual articles are identified, and multi-article orders are consolidated, for example, into a single bin or shelf location, so that they may be packed and then shipped to customers. The process of sorting these articles has traditionally been done by hand. A human sorter picks an article, and then places the article in the so-determined bin or shelf location where all articles for that order or manifest have been defined to belong. Automated systems for order fulfillment have also been proposed. See, for example, U.S. Patent Application Publication No. 2014/0244026, which discloses the use of a robotic arm together with an arcuate structure that is movable to within reach of the robotic arm.
The identification of objects by code scanning generally either requires manual processing, or requires that the code location be controlled or constrained so that a fixed or robot-held code scanner (e.g., a barcode scanner) can reliably detect the code. Manually operated barcode scanners are therefore generally either fixed or handheld systems. With fixed systems, such as those at point-of-sale systems, the operator holds the article and places it in front of the scanner, which scans continuously, and decodes any barcodes that it can detect. If the article's code is not immediately detected, the person holding the article typically needs to vary the position or orientation of the article with respect to the fixed scanner, so as to render the barcode more visible to the scanner. For handheld systems, the person operating the scanner may look at the barcode on the article, then hold the article such that the barcode is within the viewing range of the scanner, and then press a button on the handheld scanner to initiate a scan of the barcode.
Further, many distribution center sorting systems generally assume an inflexible sequence of operation whereby a disorganized stream of input objects is provided (by a human) as a singulated stream of objects that are oriented with respect to a scanner that identifies the objects. An induction element or elements (e.g., a conveyor, a tilt tray, or manually movable bins) transport the objects to desired destination locations or further processing stations, which may be a bin, a chute, a bag or a conveyor, etc.
In conventional object sortation or distribution systems, human workers or automated systems typically retrieve objects in an arrival order, and sort each object into a collection bin based on a set of given heuristics. For example, all objects of a like type, all objects in a single customer order, or all objects destined for the same shipping destination may be directed to a common destination location. Generally, the human workers, with the possible limited assistance of automated systems, are required to receive objects and to move each to its assigned collection bin. If the number of different types of input (received) objects is large, then a large number of collection bins is required.
Such a system, however, has inherent inefficiencies as well as inflexibilities, since the desired goal is to match incoming objects to assigned collection bins. Such systems may require a large number of collection bins (and therefore a large amount of physical space, large investment costs, and large operating costs), in part because sorting all objects to all destinations at once is not always the most efficient approach. Additionally, such break-pack systems must also monitor the volume of each like object in a bin, requiring that a human worker continuously count the items in a bin.
Further, current state-of-the-art sortation systems also rely on human labor to some extent. Most solutions rely on a worker who performs the sortation by scanning each object from an induction area (chute, table, etc.) and placing each object at a staging location, conveyor, or collection bin. When a bin is full, another worker empties the bin into a bag, box, or other container, and sends that container on to the next processing step. Such a system has limits on throughput (i.e., how quickly human workers can sort into or empty bins in this fashion) and on the number of diverts (i.e., for a given bin size, only so many bins may be arranged to be within efficient reach of human workers).
Unfortunately, these systems do not address the limitations of the total number of system bins. The system is simply diverting an equal share of the total objects to each parallel manual cell. Thus, each parallel sortation cell must have all the same collection bin designations; otherwise, an object may be delivered to a cell that does not have a bin to which the object is mapped. There remains a need, therefore, for a more efficient and more cost-effective object processing system that processes objects of a variety of sizes and weights into appropriate collection bins or trays of fixed sizes.
In accordance with an embodiment, the invention provides a processing system for processing objects using a programmable motion device. The processing system includes a perception unit for perceiving identifying indicia representative of an identity of a plurality of objects received from an input conveyance system, an acquisition system for acquiring an object from the plurality of objects at an input area using an end effector of the programmable motion device, wherein the programmable motion device is adapted for assisting in the delivery of the object to an identified processing bin, and the identified processing bin is associated with the identifying indicia and is provided as one of a plurality of processing bins, and a delivery system for bringing the identified processing bin toward the object, where the delivery system includes a carrier for carrying the identified processing bin toward the object.
In accordance with another embodiment, the invention provides a processing system for processing objects using a programmable motion device, where the processing system includes a perception unit for perceiving identifying indicia representative of an identity of a plurality of objects associated with an input conveyance system, an acquisition system for acquiring an object from the plurality of objects at an input area using an end effector of the programmable motion device, wherein the programmable motion device is adapted for assisting in the delivery of the object to an identified processing bin, and the identified processing bin is associated with the identifying indicia and is provided as one of a plurality of processing bins, and a delivery system for bringing the identified processing bin toward the programmable motion device by moving the identified processing bin in at least two dimensions.
In accordance with a further embodiment, the invention provides a method of processing objects using a programmable motion device. The method includes the steps of perceiving identifying indicia representative of an identity of a plurality of objects received from an input conveyance system, acquiring an object from the plurality of objects at an input area using an end effector of the programmable motion device, wherein the programmable motion device is adapted for assisting in the delivery of the object to an identified processing bin, said identified processing bin being associated with the identifying indicia and being provided as one of a plurality of processing bins, and bringing the identified processing bin toward the object, including providing a carrier for carrying the identified processing bin toward the object.
The following description may be further understood with reference to the accompanying drawings in which:
The drawings are shown for illustrative purposes only.
In accordance with an embodiment, the invention provides a processing system for processing objects using a programmable motion device. The processing system includes a perception unit, an acquisition system and a delivery system. The perception unit is for perceiving identifying indicia representative of an identity of a plurality of objects received from an input conveyance system. The acquisition system is for acquiring an object from the plurality of objects at an input area using an end effector of the programmable motion device, wherein the programmable motion device is adapted for assisting in the delivery of the object to an identified processing bin. The identified processing bin is associated with the identifying indicia and is provided as one of a plurality of processing bins. The delivery system is for bringing the identified processing bin toward the object, and the delivery system includes a carrier for carrying the identified processing bin toward the object. The processing bins may, for example, be totes, boxes or any of a variety of items for containing objects.
Generally, objects need to be identified and conveyed to desired object-specific locations. The systems reliably automate the identification and conveyance of such objects, employing, in certain embodiments, a set of conveyors, a perception system, and a plurality of destination bins. In short, applicants have discovered that when automating sortation of objects, there are a few main things to consider: 1) the overall system throughput (objects sorted per hour), 2) the number of diverts (i.e., the number of discrete locations to which an object can be routed), 3) the total area of the sortation system (square feet), and 4) the capital and annual costs to purchase and run the system.
Processing objects in a break-pack distribution center is one application for automatically identifying and processing objects. As noted above, in a break-pack distribution center, objects commonly arrive in trucks, are conveyed to sortation stations where they are sorted according to desired destinations into bins (e.g., boxes or packages) that are then loaded in trucks for transport to, for example, shipping or distribution centers or retail stores. In a shipping or distribution center, the desired destination is commonly obtained by reading identifying information printed on the box or package. In this scenario, the destination corresponding to identifying information is commonly obtained by querying the customer's information system. In other scenarios, the destination may be written directly on the box, or may be known through other means such as by assignment to a vendor bin.
The system also requests specific bins of objects from a storage system, which helps optimize the process of having desired objects be delivered to specific singulator cells in an efficient way without simply letting all bins of objects appear at each singulator cell in a purely random order.
The programmable motion device is programmed to access each of the vendor bins 32 and to move any of the objects in bins 32 at input areas 38 to one of a plurality of bins (break-pack packages) 44 at one or more processing locations near the device 40 (as further shown in
With further reference to
Each automated mobile carrier 46 is able to move about the X-Y track 60 with freedom of movement (subject to the control system coordinating its path with those of the other mobile carriers). As shown in
In accordance with a further embodiment,
It is assumed that the bins of objects are marked in one or more places on their exterior with a visually distinctive mark such as a barcode (e.g., providing a UPC code) or radio-frequency identification (RFID) tag or mailing label so that they may be sufficiently identified with a scanner for processing. The type of marking depends on the type of scanning system used, but may include 1D or 2D code symbologies. Multiple symbologies or labeling approaches may be employed. The types of scanners employed are assumed to be compatible with the marking approach. The marking, e.g., by barcode, RFID tag, mailing label or other means, encodes identifying indicia (e.g., a symbol string), which is typically a string of letters and/or numbers. The symbol string uniquely associates the vendor bin with a specific set of homogeneous objects.
The operations of the system described above are coordinated with a central control system 70 as shown in
As discussed above, the system of an embodiment includes a perception system (e.g., 50) that is mounted above a bin of objects to be processed, next to the base of the articulated arm 40, looking down into a bin 32. The system 50, for example and as shown in
If an object cannot be fully perceived by the detection system, the perception system may consider the object to be two different objects, and may propose more than one candidate grasp across the two perceived objects. If the system executes a grasp at either of these bad grasp locations, it will either fail to acquire the object due to a bad grasp point where a vacuum seal will not occur (e.g., on the right), or will acquire the object at a grasp location that is very far from the center of mass of the object (e.g., on the left) and thereby induce a great deal of instability during any attempted transport. Each of these results is undesirable.
If a bad grasp location is experienced, the system may remember that location for the associated object. By identifying good and bad grasp locations, a correlation is established between features in the 2D/3D images and the idea of good or bad grasp locations. Using this data and these correlations as input to machine learning algorithms, the system may eventually learn, for each image presented to it, where to best grasp an object, and where to avoid grasping an object.
As shown in
The invention provides therefore in certain embodiments that grasp optimization may be based on determination of surface normal, i.e., moving the end effector to be normal to the perceived surface of the object (as opposed to vertical or gantry picks), and that such grasp points may be chosen using fiducial features as grasp points, such as picking on a barcode, given that barcodes are almost always applied to a flat spot on the object.
In accordance with various embodiments therefore, the invention further provides a processing system that may learn object grasp locations from experience (and optionally human guidance). Systems designed to work in the same environments as human workers will face an enormous variety of objects, poses, etc. This enormous variety almost ensures that the robotic system will encounter some configuration of object(s) that it cannot handle optimally; at such times, it is desirable to enable a human operator to assist the system and have the system learn from non-optimal grasps.
The system optimizes grasp points based on a wide range of features, either extracted offline or online, tailored to the gripper's characteristics. The properties of the suction cup influence its adaptability to the underlying surface, hence an optimal grasp is more likely to be achieved when picking on the estimated surface normal of an object rather than performing vertical gantry picks common to current industrial applications.
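The surface-normal pick described above can be sketched in a few lines. The following is a minimal illustration under stated assumptions, not the system's actual implementation: it estimates a local normal from three nearby depth points via a cross product, so that a suction-cup approach vector can be aligned with the surface rather than being a vertical gantry pick.

```python
import math

def surface_normal(p, q, r):
    """Estimate the local surface normal of a perceived object patch
    from three nearby 3-D points (e.g., sampled from a depth image)."""
    u = tuple(qi - pi for qi, pi in zip(q, p))
    v = tuple(ri - pi for ri, pi in zip(r, p))
    # Cross product of the two in-plane vectors gives the normal.
    n = (u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0])
    mag = math.sqrt(sum(c * c for c in n)) or 1.0
    n = tuple(c / mag for c in n)
    # Orient the normal upward, toward the camera and end effector.
    return n if n[2] >= 0 else tuple(-c for c in n)

# A flat horizontal patch yields a vertical normal, so the suction-cup
# approach direction is straight down onto the surface.
n = surface_normal((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0))
```

In practice the three points would be sampled from the depth image around a candidate grasp location, and the end effector would be commanded to approach along the negated normal.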
In addition to geometric information, the system uses appearance-based features, since depth sensors may not always be accurate enough to provide sufficient information about graspability. For example, the system can learn the location of fiducials such as barcodes on the object, which can be used as an indicator of a surface patch that is flat and impermeable, and hence suitable for a suction cup. One such example is the use of barcodes on consumer products. Another example is shipping boxes and bags, which tend to have the shipping label at the object's center of mass and provide an impermeable surface, as opposed to the raw bag material, which might be slightly porous and hence not present a good grasp.
By identifying bad or good grasp points on the image, a correlation is established between features in the 2D/3D imagery and the idea of good or bad grasp points; using this data and these correlations as input to machine learning algorithms, the system can eventually learn, for each image presented to it, where to grasp and where to avoid.
This information is added to the experience-based data the system collects with every pick attempt, successful or not. Over time, the robot learns to avoid features that result in unsuccessful grasps, either specific to an object type or to a surface/material type. For example, the robot may prefer to avoid picks on shrink wrap, no matter which object it is applied to, but may only prefer to place the grasp near fiducials on certain object types such as shipping bags.
This learning can be accelerated by off-line generation of human-corrected images. For instance, a human could be presented with thousands of images from previous system operation and manually annotate good and bad grasp points on each one. This would generate a large amount of data that could also be input into the machine learning algorithms to enhance the speed and efficacy of the system learning.
In addition to experience-based or human-expert-based training data, a large set of labeled training data can be generated based on a detailed object model in a physics simulation, making use of known gripper and object characteristics. This allows fast and dense generation of graspability data over a large set of objects, as this process is not limited by the speed of the physical robotic system or of human input.
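The learning loop described above can be illustrated with a toy grasp-quality classifier. This is a minimal sketch under assumed features (local flatness, porosity, proximity to a fiducial such as a barcode), not the system's actual learning algorithm; a production system would use much richer 2D/3D image features and a stronger model.

```python
import math

def train_grasp_classifier(samples, epochs=200, lr=0.5):
    """Fit a logistic-regression model mapping patch features to grasp quality.

    samples: list of (features, label), where features is a tuple of
    hypothetical descriptors (flatness, porosity, fiducial proximity)
    and label is 1 for a successful grasp, 0 for a failed one.
    """
    n = len(samples[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, y in samples:
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))   # predicted success probability
            err = p - y                       # gradient of log-loss
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def grasp_score(w, b, features):
    """Score a candidate grasp location; higher means more promising."""
    z = sum(wi * xi for wi, xi in zip(w, features)) + b
    return 1.0 / (1.0 + math.exp(-z))

# Toy experience data: flat, impermeable patches near a barcode grasped
# well; porous shrink-wrap patches did not.
history = [
    ((0.9, 0.1, 0.9), 1), ((0.8, 0.2, 0.8), 1),
    ((0.2, 0.9, 0.1), 0), ((0.3, 0.8, 0.2), 0),
]
w, b = train_grasp_classifier(history)
```

Candidate grasp locations in a new image would then be ranked by `grasp_score`, with the history growing after every pick attempt, whether from real picks, human annotation, or simulation.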
In accordance with a further embodiment, the system may include one or more mobile carrier units 130 that carry a bin 44 as shown in
Each mobile carrier unit 130 includes a pair of guide rails 142, 144 that contain the bin 44, as well as a raised region 146 that raises the bin sufficiently for there to be room on either side of the raised region for shelf forks to engage the bin, as will be further discussed below. Each carrier unit 130 also includes four wheel assemblies 132, 134, 136, 138 that each include guides 140 for following the tracks 120. Each of the wheel assemblies is pivotally mounted such that each wheel assembly may pivot 90 degrees as discussed below. Each carrier unit 130 also includes a pair of paddles 148, 150 on either end of the unit 130. Each paddle may be turned either upward to contain a bin on the unit, or turned downward to permit a bin to be loaded onto or removed from the unit, as will also be discussed in more detail below.
In accordance with certain embodiments, therefore, the invention provides a plurality of mobile carriers that may include swivel-mounted wheels that rotate ninety degrees to cause each mobile carrier to move forward and backward, or to move side to side. When placed on a grid, such mobile carriers may be actuated to move to all points on the grid.
Each carrier 130 also includes a pair of opposing rails 142, 144 for retaining a bin, as well as a raised center portion 146 and stands 143, 145 on which a bin may rest. A pair of independently actuated paddles 148, 150 are also provided. Each paddle 148, 150 may be rotated upward (as shown at P in
Note that the orientation of the carrier 130 (and of a bin on the carrier) does not change when the carrier changes direction. Again, a bin may be provided on the top side of the carrier, and may be contained by bin rails 142, 144 on the sides, as well as by actuatable paddles 148, 150. As will be discussed in further detail below, each paddle 148, 150 may be rotated 180 degrees to either urge a bin onto or off of a shelf, or (if both are actuated) to retain a bin on the carrier during transport. Each paddle may therefore be used in concert with movement of the carrier to control movement of the bin with respect to the carrier 130. For example, when one paddle is flipped into an upward position, it may be used to urge the bin onto a shelf or rack while the carrier is moving toward the shelf or rack. Each carrier may also include one or more emergency stop switches 152 for a person to use to stop the movement of a carrier in an emergency, as well as handles 154 to enable a person to lift the carrier if needed.
The movement of the carrier 130 about an array of tracks is further discussed below with regard to
Systems of the invention therefore provide for binary steering of the automated carrier, allowing only bidirectional column and row travel in a grid. One pivot motor may be used for each pair of wheels, with a linkage to pivot the wheel modules. In other embodiments, one pivot motor and linkage could be used for all four wheels, or each wheel may have an independent pivot actuator. The system allows the wheels to follow square track sections by pivoting around rounded corners of the square track sections. The system does not require differential-drive line/trajectory following, and keeps the orientation of the carrier fixed throughout all operations.
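The binary row/column travel scheme can be sketched as simple Manhattan-style planning: the carrier travels along its row, pivots all four wheel modules ninety degrees, then travels along the column. This is an illustrative sketch only; the function and move names are hypothetical, and a real controller would also coordinate with the other carriers on the grid to avoid conflicts.

```python
def plan_grid_moves(start, goal):
    """Plan bidirectional row/column travel for a carrier with
    90-degree pivoting wheels. Cells are (row, col) grid coordinates;
    the carrier's orientation never changes, only the wheels pivot."""
    moves = []
    r, c = start
    gr, gc = goal
    # Travel along the current row until the goal column is reached.
    step = 1 if gc > c else -1
    while c != gc:
        c += step
        moves.append(("row-travel", (r, c)))
    # Pivot all four wheel modules to switch to column travel.
    if moves and r != gr:
        moves.append(("pivot-wheels", (r, c)))
    step = 1 if gr > r else -1
    while r != gr:
        r += step
        moves.append(("col-travel", (r, c)))
    return moves

path = plan_grid_moves((0, 0), (2, 3))
```

Because steering is binary, the plan contains at most one pivot per row-to-column transition, and every intermediate cell lies on the grid of square track sections.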
The system of an embodiment may also employ motion planning using a trajectory database that is dynamically updated over time, and is indexed by customer metrics. The problem domains contain a mix of changing and unchanging components in the environment. For example, the objects that are presented to the system are often presented in random configurations, but the target locations into which the objects are to be placed are often fixed and do not change over the entire operation.
One use of the trajectory database is to exploit the unchanging parts of the environment by pre-computing and saving into a database trajectories that efficiently and robustly move the system through these spaces. Another use of the trajectory database is to constantly improve the performance of the system over the lifetime of its operation. The database communicates with a planning server that is continuously planning trajectories from the various starts to the various goals, to have a large and varied set of trajectories for achieving any particular task. In various embodiments, a trajectory path may include any number of changing and unchanging portions that, when combined, provide an optimal trajectory path in an efficient amount of time.
In certain embodiments, the system may include a plurality of base locations, as well as a plurality of predetermined path portions associated with the plurality of base locations. The trajectories taken by the articulated arm of the robot system from the input bin to the base location are constantly changing based in part, on the location of each object in the input bin, the orientation of the object in the input bin, and the shape, weight and other physical properties of the object to be acquired.
Once the articulated arm has acquired an object and is positioned at the base location, the paths to each of the plurality of destination bins 44 are not changing. In particular, each destination bin is associated with a unique destination bin location, and the trajectories from the base location to each of the destination bin locations individually are not changing. A trajectory, for example, may be a specification for the motion of a programmable motion device over time. In accordance with various embodiments, such trajectories may be generated by experience, by a person training the system, and/or by automated algorithms. For a trajectory that is not changing, the shortest distance is a direct path to the target destination bin, but the articulated arm comprises articulated sections, joints, motors, etc. that provide specific ranges of motion, speeds, accelerations, and decelerations. Because of this, the robotic system may take any of a variety of trajectories between, for example, base locations and destination bin locations.
The risk factor may be determined in a number of ways, including whether the trajectory includes a high (as pre-defined) acceleration or deceleration (linear or angular) at any point during the trajectory. The risk factor may also include any likelihood that the articulated arm may encounter (crash into) anything in the robotic environment. Further, the risk factor may also be defined based on learned knowledge information from experience of the same type of robotic arms in other robotic systems moving the same object from a base location to the same destination location.
As shown in the table at 96 in
The choice of fast time vs. low risk factor may be determined in a variety of ways, for example, by choosing the fastest time having a risk factor below an upper risk factor limit (e.g., 12 or 14), or by choosing a lowest risk factor having a maximum time below an upper limit (e.g., 1.0 or 1.2). Again, if the risk factor is too high, valuable time may be lost by failure of the robotic system to maintain acquisition of the object. An advantage of the varied set is robustness to small changes in the environment and to different-sized objects the system might be handling: instead of re-planning in these situations, the system iterates through the database until it finds a trajectory that is collision-free, safe and robust for the new situation. The system may therefore generalize across a variety of environments without having to re-plan the motions.
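The fastest-time-under-a-risk-limit selection rule described above might be sketched as follows. The record fields, identifiers, and numeric values are illustrative only, not taken from the actual system; the sketch simply iterates the stored candidates for a given start/goal pair, as the text describes, rather than re-planning.

```python
def select_trajectory(candidates, max_risk=12.0):
    """From stored trajectory records for one start/goal pair, choose
    the fastest whose risk factor is at or below the limit; fall back
    to the lowest-risk record if none qualifies."""
    safe = [t for t in candidates if t["risk"] <= max_risk]
    if safe:
        return min(safe, key=lambda t: t["time"])
    return min(candidates, key=lambda t: t["risk"])

# Hypothetical database entries: (time in seconds, pre-computed risk factor).
db = [
    {"id": "traj-a", "time": 0.8, "risk": 16.0},  # fastest, but too risky
    {"id": "traj-b", "time": 1.0, "risk": 11.0},  # fastest acceptable choice
    {"id": "traj-c", "time": 1.4, "risk": 6.0},   # safest, but slow
]
best = select_trajectory(db)
```

The dual rule mentioned in the text (lowest risk subject to a time ceiling) would simply swap the roles of the two fields in the filter and the `min` key.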
Overall trajectories therefore, may include any number of changing and unchanging sections. For example, networks of unchanging trajectory portions may be employed as commonly used paths (roads), while changing portions may be directed to moving objects to a close-by unchanging portion (close road) to facilitate moving the object without requiring the entire route to be planned. For example, the programmable motion device (e.g., a robot) may be tasked with orienting the grasped object in front of an automatic labeler before moving towards the destination. The trajectory to sort the object therefore, would be made up of the following trajectory portions. First, a grasp pose to a home position (motion planned). Then, from home position to an auto-labeler home (pulled from a trajectory database). Then, from the auto-labeler home to a labelling pose (motion planned). Then, from the labelling pose to an auto-labeler home (either motion planned or just reverse the previous motion plan step). Then, from the auto-labeler home to the intended destination (pulled from the trajectory database). A wide variety of changing and unchanging (planned and pulled from a database) portions may be employed in overall trajectories. In accordance with further embodiments, the object may be grasped from a specific pose (planned), and when the object reaches a destination bin (from the trajectory database), the last step may be to again place the object in the desired pose (planned) within the destination bin.
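The chaining of motion-planned and database portions in the labeling example above can be sketched as follows. The portion kinds and pose names are hypothetical labels for illustration; the check simply enforces that consecutive portions share endpoints so the stitched route is continuous.

```python
def compose_route(portions):
    """Stitch an overall trajectory from changing (motion-planned) and
    unchanging (database) portions. Each portion is a tuple of
    (kind, start_pose, goal_pose); consecutive portions must chain
    end-to-start to form one continuous route."""
    for a, b in zip(portions, portions[1:]):
        assert a[2] == b[1], "portions must chain end-to-start"
    return [p[0] for p in portions]

# The labeling example: planned segments bridge to commonly used,
# pre-computed "road" segments pulled from the trajectory database.
route = compose_route([
    ("planned",  "grasp-pose",    "home"),
    ("database", "home",          "labeler-home"),
    ("planned",  "labeler-home",  "labeling-pose"),
    ("planned",  "labeling-pose", "labeler-home"),
    ("database", "labeler-home",  "destination"),
])
```

Only the short "planned" segments need to be computed at run time; the "database" segments are reused across picks, which is what lets the unchanging parts of the environment be exploited.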
In accordance with further embodiments, the motion planning may also provide that relatively heavy items (as may be determined by knowing information about the grasped object or by sensing weight—or both—at the end effector) may be processed (e.g., moved in trajectories) and placed in boxes in very different ways than the processing and placement of relatively light objects. Again, the risk versus speed calculations may be employed for optimization of moving known objects of a variety of weights and sizes as may occur, for example, in the processing of a wide variety of consumer products.
The output stations 48 may include a platform 200 and lift 202 that receive mobile carriers and bins from the track 60 as shown in
The system, therefore, provides means that interface with the customer's outgoing object conveyance systems. When a bin (or package) is full, as determined by the system (in monitoring system operation), a human operator may pull the bin from the processing area and place the bin on an appropriate conveyor. When a bin is full, it is removed to the closed/labelled area; another empty bin is immediately placed in the location freed up by the removed full bin, and the system continues processing as discussed above.
In accordance with a specific embodiment, the invention provides a user interface that conveys all relevant information to operators, management, and maintenance personnel. In a specific embodiment, this may include lights indicating bins that are about to be ejected (as full), bins that are not completely properly positioned, the in-feed hopper content level, and the overall operating mode of the entire system. Additional information might include the rate of object processing and additional statistics. In a specific embodiment, the system may automatically print labels and scan labels before the operator places the packages on an output conveyor. In accordance with a further embodiment, the system may incorporate software systems that interface with the customer's databases and other information systems, to provide operational information to the customer's system, and to query the customer's system for object information.
Those skilled in the art will appreciate that numerous modifications and variations may be made to the above disclosed embodiments without departing from the spirit and scope of the present invention.
The present application is a continuation of U.S. patent application Ser. No. 16/739,670, filed Jan. 10, 2020, now patented as U.S. Pat. No. 11,402,831, issued on Aug. 2, 2022, which is a continuation of U.S. patent application Ser. No. 15/928,977, filed Mar. 22, 2018, now patented as U.S. Pat. No. 10,576,621, issued on Mar. 3, 2020, which claims priority to U.S. Provisional Patent Application Ser. No. 62/475,483, filed Mar. 23, 2017, the disclosures of which are hereby incorporated by reference in their entireties.
Number | Name | Date | Kind |
---|---|---|---|
1030320 | Morgan | Jun 1912 | A |
2294945 | Zink | Sep 1942 | A |
3223259 | Nicholson | Dec 1965 | A |
4114762 | Beal et al. | Sep 1978 | A |
4508484 | Heiz | Apr 1985 | A |
4678390 | Bonneton et al. | Jul 1987 | A |
4722653 | Williams et al. | Feb 1988 | A |
4759439 | Hartlepp | Jul 1988 | A |
4846335 | Hartlepp | Jul 1989 | A |
4846619 | Crabtree et al. | Jul 1989 | A |
4895242 | Michel | Jan 1990 | A |
5190162 | Hartlepp | Mar 1993 | A |
5393074 | Bear et al. | Feb 1995 | A |
5525884 | Sugiura et al. | Jun 1996 | A |
5595263 | Pignataro | Jan 1997 | A |
5839566 | Bonnet | Nov 1998 | A |
6011998 | Lichti et al. | Jan 2000 | A |
6036812 | Williams et al. | Mar 2000 | A |
6059092 | Jerue et al. | May 2000 | A |
6079570 | Oppliger et al. | Jun 2000 | A |
6208908 | Boyd et al. | Mar 2001 | B1 |
6246023 | Kugle | Jun 2001 | B1 |
6323452 | Bonnet | Nov 2001 | B1 |
6377867 | Bradley et al. | Apr 2002 | B1 |
6390756 | Isaacs et al. | May 2002 | B1 |
6505093 | Thatcher et al. | Jan 2003 | B1 |
6543983 | Felder et al. | Apr 2003 | B1 |
6579053 | Grams et al. | Jun 2003 | B1 |
6685031 | Takizawa | Feb 2004 | B2 |
6688459 | Bonham et al. | Feb 2004 | B1 |
6762382 | Danelski | Jul 2004 | B1 |
6897395 | Shiibashi et al. | May 2005 | B2 |
6946612 | Morikawa | Sep 2005 | B2 |
6997666 | Rodgers et al. | Feb 2006 | B1 |
7728244 | De Leo et al. | Jun 2010 | B2 |
7861844 | Hayduchok et al. | Jan 2011 | B2 |
8718814 | Clark et al. | May 2014 | B1 |
8776694 | Rosenwinkel et al. | Jul 2014 | B2 |
8798784 | Clark et al. | Aug 2014 | B1 |
8831984 | Hoffman et al. | Sep 2014 | B2 |
8911199 | Hermann et al. | Dec 2014 | B2 |
8952284 | Wong et al. | Feb 2015 | B1 |
8972045 | Mountz et al. | Mar 2015 | B1 |
8989918 | Sturm | Mar 2015 | B2 |
8997438 | Fallas | Apr 2015 | B1 |
9020632 | Naylor | Apr 2015 | B2 |
9102336 | Rosenwinkel | Aug 2015 | B2 |
9111251 | Brazeau | Aug 2015 | B1 |
9120622 | Elazary et al. | Sep 2015 | B1 |
9216857 | Kalyan et al. | Dec 2015 | B1 |
9227323 | Konolige et al. | Jan 2016 | B1 |
9346083 | Stone | May 2016 | B2 |
9364865 | Kim | Jun 2016 | B2 |
9481518 | Neiser | Nov 2016 | B2 |
9492923 | Wellman et al. | Nov 2016 | B2 |
9517899 | Watt et al. | Dec 2016 | B2 |
9688471 | Hellenbrand | Jun 2017 | B2 |
9694977 | Aprea et al. | Jul 2017 | B2 |
9751693 | Battles | Sep 2017 | B1 |
9821464 | Stiernagle et al. | Nov 2017 | B2 |
9878349 | Crest et al. | Jan 2018 | B2 |
9926138 | Brazeau et al. | Mar 2018 | B1 |
9975148 | Zhu et al. | May 2018 | B2 |
10029865 | McCalib, Jr. et al. | Jul 2018 | B1 |
10576621 | Wagner et al. | Mar 2020 | B2 |
10611021 | Wagner et al. | Apr 2020 | B2 |
10625934 | Mallady | Apr 2020 | B2 |
10857925 | Sahota | Dec 2020 | B1 |
10913612 | Wagner et al. | Feb 2021 | B2 |
10988323 | Wagner et al. | Apr 2021 | B2 |
11084660 | Wagner et al. | Aug 2021 | B2 |
11117760 | Wagner et al. | Sep 2021 | B2 |
11390459 | Wagner et al. | Jul 2022 | B2 |
11402831 | Wagner | Aug 2022 | B2 |
11661275 | Wagner et al. | May 2023 | B2 |
11814245 | Wagner et al. | Nov 2023 | B2 |
11814246 | Wagner et al. | Nov 2023 | B2 |
20020056297 | Sadler | May 2002 | A1 |
20020092801 | Dominguez | Jul 2002 | A1 |
20020157919 | Sherwin | Oct 2002 | A1 |
20030123970 | Grams et al. | Jul 2003 | A1 |
20040193554 | Hillerich, Jr. et al. | Sep 2004 | A1 |
20050002772 | Stone | Jan 2005 | A1 |
20050137933 | Holsen | Jun 2005 | A1 |
20050220600 | Baker et al. | Oct 2005 | A1 |
20060045672 | Maynard et al. | Mar 2006 | A1 |
20060096131 | Hall | May 2006 | A1 |
20070065258 | Benedict et al. | Mar 2007 | A1 |
20070209976 | Worth et al. | Sep 2007 | A1 |
20080040945 | Buckner | Feb 2008 | A1 |
20080181753 | Bastian | Jul 2008 | A1 |
20080269960 | Kostmann | Oct 2008 | A1 |
20090026017 | Freudelsperger | Jan 2009 | A1 |
20090074545 | Lert, Jr. et al. | Mar 2009 | A1 |
20100122942 | Harres et al. | May 2010 | A1 |
20100247275 | Karlen et al. | Sep 2010 | A1 |
20100316469 | Lert et al. | Dec 2010 | A1 |
20110014021 | Reid et al. | Jan 2011 | A1 |
20110144798 | Freudelsperger | Jun 2011 | A1 |
20110238207 | Bastian, II et al. | Sep 2011 | A1 |
20110243707 | Dumas et al. | Oct 2011 | A1 |
20120128454 | Hayduchok et al. | May 2012 | A1 |
20120185082 | Toebes et al. | Jul 2012 | A1 |
20120185122 | Sullivan et al. | Jul 2012 | A1 |
20120189410 | Toebes et al. | Jul 2012 | A1 |
20120189416 | Toebes et al. | Jul 2012 | A1 |
20120195724 | Toebes et al. | Aug 2012 | A1 |
20120259482 | Jeschke | Oct 2012 | A1 |
20130110280 | Folk | May 2013 | A1 |
20130334158 | Koch | Dec 2013 | A1 |
20140058556 | Kawano | Feb 2014 | A1 |
20140086709 | Kasai | Mar 2014 | A1 |
20140086714 | Malik | Mar 2014 | A1 |
20140100999 | Mountz et al. | Apr 2014 | A1 |
20140244026 | Neiser | Aug 2014 | A1 |
20140277693 | Naylor | Sep 2014 | A1 |
20140308098 | Lert et al. | Oct 2014 | A1 |
20150032252 | Galluzzo et al. | Jan 2015 | A1 |
20150073589 | Kodl et al. | Mar 2015 | A1 |
20150081090 | Dong | Mar 2015 | A1 |
20150098775 | Razumov | Apr 2015 | A1 |
20150114799 | Hansl et al. | Apr 2015 | A1 |
20150259077 | Wiskus | Sep 2015 | A1 |
20150375398 | Penn et al. | Dec 2015 | A1 |
20150375938 | Lert et al. | Dec 2015 | A9 |
20160075521 | Puchwein | Mar 2016 | A1 |
20160129592 | Saboo | May 2016 | A1 |
20160167227 | Wellman et al. | Jun 2016 | A1 |
20160176638 | Toebes | Jun 2016 | A1 |
20160221187 | Bradski et al. | Aug 2016 | A1 |
20160236867 | Brazeau et al. | Aug 2016 | A1 |
20160244262 | O'Brien | Aug 2016 | A1 |
20160274586 | Stubbs et al. | Sep 2016 | A1 |
20160304278 | Hognaland | Oct 2016 | A1 |
20160325934 | Stiernagle et al. | Nov 2016 | A1 |
20160332554 | Ambrosio et al. | Nov 2016 | A1 |
20160347545 | Lindbo et al. | Dec 2016 | A1 |
20160355337 | Lert et al. | Dec 2016 | A1 |
20170021499 | Wellman et al. | Jan 2017 | A1 |
20170043953 | Battles et al. | Feb 2017 | A1 |
20170080566 | Stubbs et al. | Mar 2017 | A1 |
20170106532 | Wellman et al. | Apr 2017 | A1 |
20170107055 | Magens et al. | Apr 2017 | A1 |
20170121114 | Einav | May 2017 | A1 |
20170157648 | Wagner et al. | Jun 2017 | A1 |
20170166400 | Hofmann | Jun 2017 | A1 |
20170225330 | Wagner et al. | Aug 2017 | A1 |
20170305668 | Bestic et al. | Oct 2017 | A1 |
20170305694 | McMurrough et al. | Oct 2017 | A1 |
20170322561 | Stiernagle | Nov 2017 | A1 |
20170349385 | Moroni et al. | Dec 2017 | A1 |
20180085788 | Engel et al. | Mar 2018 | A1 |
20180137454 | Kulkami et al. | May 2018 | A1 |
20180186572 | Issing | Jul 2018 | A1 |
20180194571 | Fryer et al. | Jul 2018 | A1 |
20180244473 | Mathi et al. | Aug 2018 | A1 |
20180265298 | Wagner et al. | Sep 2018 | A1 |
20180273297 | Wagner et al. | Sep 2018 | A1 |
20180282066 | Wagner et al. | Oct 2018 | A1 |
20180305122 | Moulin et al. | Oct 2018 | A1 |
20180319594 | Blevins et al. | Nov 2018 | A1 |
20180346022 | Payeur | Dec 2018 | A1 |
20180354717 | Lindbo et al. | Dec 2018 | A1 |
20190047786 | Suzuki | Feb 2019 | A1 |
20190185267 | Mattern | Jun 2019 | A1 |
20200122924 | Otto et al. | Apr 2020 | A1 |
20200143127 | Wagner et al. | May 2020 | A1 |
Number | Date | Country |
---|---|---|
2006204622 | Mar 2007 | AU |
2015233498 | Oct 2016 | AU |
2795022 | Oct 2011 | CA |
2985166 | Dec 2016 | CA |
1033604 | Jul 1989 | CN |
1927673 | Mar 2007 | CN |
101553416 | Oct 2009 | CN |
102356367 | Feb 2012 | CN |
102390701 | Mar 2012 | CN |
102673964 | Sep 2012 | CN |
103381713 | Nov 2013 | CN |
103998358 | Aug 2014 | CN |
104105641 | Oct 2014 | CN |
104504358 | Apr 2015 | CN |
204250465 | Apr 2015 | CN |
104724430 | Jun 2015 | CN |
105059811 | Nov 2015 | CN |
105263832 | Jan 2016 | CN |
105270800 | Jan 2016 | CN |
105383906 | Mar 2016 | CN |
105417043 | Mar 2016 | CN |
105593143 | May 2016 | CN |
105593842 | May 2016 | CN |
105668255 | Jun 2016 | CN |
105730311 | Jul 2016 | CN |
105899398 | Aug 2016 | CN |
106232503 | Dec 2016 | CN |
205771308 | Dec 2016 | CN |
106276105 | Jan 2017 | CN |
106395225 | Feb 2017 | CN |
106575391 | Apr 2017 | CN |
106662874 | May 2017 | CN |
206186873 | May 2017 | CN |
206358714 | Jul 2017 | CN |
107054960 | Aug 2017 | CN |
107108122 | Aug 2017 | CN |
107161215 | Sep 2017 | CN |
107250004 | Oct 2017 | CN |
957200 | Jan 1957 | DE |
3124537 | Feb 1983 | DE |
19510392 | Sep 1996 | DE |
19633238 | Feb 1998 | DE |
102008039764 | May 2010 | DE |
102010002317 | Aug 2011 | DE |
10009087 | Sep 2013 | DE |
102012102333 | Sep 2013 | DE |
102013100048 | May 2014 | DE |
102014111396 | Feb 2016 | DE |
0235488 | Jan 1990 | EP |
1695927 | Aug 2006 | EP |
2062837 | May 2009 | EP |
2308777 | Apr 2011 | EP |
2477914 | Apr 2013 | EP |
2607292 | Jun 2013 | EP |
2650237 | Oct 2013 | EP |
2745982 | Jun 2014 | EP |
2937299 | Oct 2015 | EP |
3112295 | Jan 2017 | EP |
2036682 | Dec 1970 | FR |
2174163 | Oct 1973 | FR |
2832654 | May 2003 | FR |
2085389 | Apr 1982 | GB |
2525309 | Oct 2015 | GB |
2539562 | Dec 2016 | GB |
2546583 | Jul 2017 | GB |
S54131278 | Oct 1979 | JP |
S63310406 | Dec 1988 | JP |
H0395001 | Apr 1991 | JP |
H08157016 | Jun 1996 | JP |
2003067053 | Mar 2003 | JP |
2007182286 | Jul 2007 | JP |
2008037567 | Feb 2008 | JP |
2010202291 | Sep 2010 | JP |
2014141313 | Aug 2014 | JP |
2016047744 | Apr 2016 | JP |
WO2017150006 | Dec 2018 | JP |
20160136795 | Nov 2016 | KR |
2650237 | Oct 2013 | NL |
20150758 | Dec 2016 | NO |
3095339 | Nov 2003 | WO |
2005118436 | Dec 2005 | WO |
2007007354 | Jan 2007 | WO |
2007009136 | Jan 2007 | WO |
2008091733 | Jul 2008 | WO |
2009143335 | Nov 2009 | WO |
2010017872 | Feb 2010 | WO |
2011038442 | Apr 2011 | WO |
2011128384 | Oct 2011 | WO |
2012024714 | Jan 2012 | WO |
2012127102 | Sep 2012 | WO |
2014130937 | Aug 2014 | WO |
2014166650 | Oct 2014 | WO |
2015035300 | Mar 2015 | WO |
2015118171 | Aug 2015 | WO |
2015140216 | Sep 2015 | WO |
2016172253 | Oct 2016 | WO |
2016198565 | Dec 2016 | WO |
2017036780 | Mar 2017 | WO |
2017064401 | Apr 2017 | WO |
2017081281 | May 2017 | WO |
2017148939 | Sep 2017 | WO |
2017148963 | Sep 2017 | WO |
Entry |
---|
Communication pursuant to Article 94(3) EPC issued by the European Patent Office in related European Patent Application No. 18807152.6 dated Aug. 24, 2022, 4 pages. |
Communication pursuant to Article 94(3) EPC issued by the European Patent Office in related European Patent Application No. 18716070.0 dated Sep. 2, 2022, 4 pages. |
Communication pursuant to Article 94(3) EPC issued by the European Patent Office in related European Patent Application No. 18807761.4 dated Aug. 25, 2022, 6 pages. |
Communication pursuant to Article 94(3) EPC issued by the European Patent Office in related European Patent Application No. 18807476.9 dated Aug. 24, 2022, 4 pages. |
Communication pursuant to Article 94(3) EPC issued by the European Patent Office in related European Patent Application No. 18807477.7 dated Aug. 24, 2022, 4 pages. |
Non-Final Office Action issued by the United States Patent and Trademark Office in related U.S. Appl. No. 17/210,627 dated Sep. 14, 2022, 11 pages. |
Non-Final Office Action issued by the United States Patent and Trademark Office in related U.S. Appl. No. 17/387,764 dated Sep. 26, 2022, 6 pages. |
Form PTO-892, Notices of References Cited, issued by the United States Patent and Trademark Office in related U.S. Appl. No. 16/952,428 dated Oct. 20, 2022, 1 page. |
Communication pursuant to Article 94(3) EPC issued by the European Patent Office in related European Patent Application No. 18717184.8 dated Nov. 18, 2022, 3 pages. |
Examiner's Report issued by the Canadian Intellectual Property Office in related Canadian Patent Application No. 3,080,514 dated Oct. 27, 2022, 4 pages. |
Examiner's Report issued by the Canadian Intellectual Property Office in related Canadian Patent Application No. 3,080,616 dated Nov. 1, 2022, 5 pages. |
Non-Final Office Action issued by the United States Patent and Trademark Office in related U.S. Appl. No. 17/313,549 dated Mar. 1, 2023, 7 pages. |
Non-Final Office Action issued by the United States Patent and Trademark Office in related U.S. Appl. No. 17/832,484 dated Mar. 7, 2023, 7 pages. |
Notice on the First Office Action and the First Office Action, along with its English translation, issued by the China National Intellectual Property Administration in related Chinese Patent Application No. 202210531402.9 dated Mar. 30, 2023, 39 pages. |
Notice on the First Office Action and the First Office Action, along with its English translation, issued by the China National Intellectual Property Administration in related Chinese Patent Application No. 202210440495.4 dated Mar. 30, 2023, 29 pages. |
Notice on the First Office Action issued by the China National Intellectual Property Administration in related Chinese Patent Application No. 201880069726.7 dated Feb. 26, 2023, 26 pages. |
Notice on the First Office Action issued by the China National Intellectual Property Administration in related Chinese Patent Application No. 202111514667.X dated Mar. 30, 2023, 35 pages. |
Notice on the First Office Action, along with its English translation, issued by the China National Intellectual Property Administration in related Chinese Patent Application No. 202210324018.1 dated Mar. 30, 2023, 18 pages. |
Non-Final Office Action issued by the United States Patent and Trademark Office in related U.S. Appl. No. 18/099,573 dated May 30, 2023, 8 pages. |
Communication pursuant to Article 94(3) EPC issued by the European Patent Office in related European Patent Application No. 18804772.4 dated Jul. 26, 2022, 5 pages. |
Communication pursuant to Article 94(3) EPC issued by the European Patent Office in related European Patent Application No. 18804759.1 dated Jul. 26, 2022, 5 pages. |
Notice on the First Office Action, and its English translation, issued by the China National Intellectual Property Administration in related Chinese Patent Application No. 202110762442.X dated Jul. 4, 2022, 27 pages. |
Examiner's Report issued by the Innovation, Science and Economic Development Canada (Canadian Intellectual Property Office) in related Canadian Patent Application No. 3,080,615 dated May 27, 2022, 3 pages. |
Communication pursuant to Article 94(3) EPC issued by the European Patent Office in related European Patent Application No. 18717184.8 dated Nov. 11, 2021, 4 pages. |
Communication pursuant to Article 94(3) EPC issued by the European Patent Office in related European Patent Application No. 18718031.0 dated Nov. 11, 2021, 4 pages. |
Communication pursuant to Article 94(3) EPC issued by the European Patent Office in related European Patent Application No. 18716070.0 dated Nov. 11, 2021, 5 pages. |
Communication pursuant to Rules 161(1) and 162 EPC issued by the European Patent Office in related European Patent Application No. 18716070.0 dated Oct. 29, 2019, 3 pages. |
Communication pursuant to Rules 161(1) and 162 EPC issued by the European Patent Office in related European Patent Application No. 18807152.6 dated Jun. 4, 2020, 3 pages. |
Communication pursuant to Rules 161(1) and 162 EPC issued by the European Patent Office in related European Patent Application No. 18804759.1 dated Jun. 4, 2020, 3 pages. |
Communication pursuant to Rules 161(1) and 162 EPC issued by the European Patent Office in related European Patent Application No. 18807476.9 dated Jun. 4, 2020, 3 pages. |
Communication pursuant to Rules 161(1) and 162 EPC issued by the European Patent Office in related European Patent Application No. 18807477.7 dated Jun. 4, 2020, 3 pages. |
Communication pursuant to Rules 161(1) and 162 EPC issued by the European Patent Office in related European Patent Application No. 18807761.4 dated Jun. 5, 2020, 3 pages. |
Communication pursuant to Rules 161(1) and 162 EPC issued by the European Patent Office in related European Patent Application No. 18804772.4 dated Jun. 4, 2020, 3 pages. |
Communication pursuant to Rules 161(1) and 162 EPC issued by the European Patent Office in related European Patent Application No. 18717184.8 dated Oct. 30, 2019, 3 pages. |
Communication pursuant to Rules 161(1) and 162 EPC issued by the European Patent Office in related European Patent Application No. 18718031.0 dated Oct. 30, 2019, 3 pages. |
Examiner's Report issued by the Innovation, Science and Economic Development Canada in related Canadian Patent Application No. 3,057,313 dated Nov. 30, 2020, 3 pages. |
Examiner's Report issued by the Innovation, Science and Economic Development Canada in related Canadian Patent Application No. 3,057,367 dated Dec. 1, 2020, 3 pages. |
Examiner's Report issued by the Innovation, Science and Economic Development Canada in related Canadian Patent Application No. 3,056,892 dated Dec. 14, 2020, 3 pages. |
Examiner's Report issued by the Innovation, Science and Economic Development Canada (Canadian Intellectual Property Office) in related Canadian Patent Application No. 3,076,511 dated May 5, 2021, 4 pages. |
Examiner's Report issued by the Innovation, Science and Economic Development Canada (Canadian Intellectual Property Office) in related Canadian Patent Application No. 3,076,514 dated May 28, 2021, 7 pages. |
Examiner's Report issued by the Innovation, Science and Economic Development Canada (Canadian Intellectual Property Office) in related Canadian Patent Application No. 3,080,514 dated Jun. 2, 2021, 4 pages. |
Examiner's Report issued by the Innovation, Science and Economic Development Canada (Canadian Intellectual Property Office) in related Canadian Patent Application No. 3,080,616 dated Jun. 28, 2021, 5 pages. |
Examiner's Report issued by the Innovation, Science and Economic Development Canada (Canadian Intellectual Property Office) in related Canadian Patent Application No. 3,080,615 dated Jun. 21, 2021, 4 pages. |
Examiner's Report issued by the Innovation, Science and Economic Development Canada (Canadian Intellectual Property Office) in related Canadian Patent Application No. 3,056,892 dated Oct. 1, 2021, 4 pages. |
Examiner's Report issued by the Innovation, Science and Economic Development Canada (Canadian Intellectual Property Office) in related Canadian Patent Application No. 3,076,511 dated Jan. 4, 2022, 4 pages. |
Examiner's Report issued by the Innovation, Science and Economic Development Canada (Canadian Intellectual Property Office) in related Canadian Patent Application No. 3,078,778 dated Jan. 26, 2022, 4 pages. |
Final Office Action issued by the United States Patent and Trademark Office in related U.S. Appl. No. 16/171,310 dated Sep. 30, 2020, 8 pages. |
Final Office Action issued by the United States Patent and Trademark Office in related U.S. Appl. No. 16/171,303 dated Feb. 11, 2021, 14 pages. |
Final Office Action issued by the United States Patent and Trademark Office in related U.S. Appl. No. 16/172,231 dated Jul. 29, 2021, 15 pages. |
First Office Action, and its English translation, issued by the China National Intellectual Property Administration in related Chinese Patent Application No. 201880019478.5 dated Aug. 25, 2020, 18 pages. |
International Preliminary Report on Patentability issued by the International Bureau of WIPO dated Sep. 24, 2019 in related international application No. PCT/US2018/023836, 7 pages. |
International Preliminary Report on Patentability issued by the International Bureau of WIPO dated Sep. 24, 2019 in related international application No. PCT/US2018/024065, 8 pages. |
International Preliminary Report on Patentability issued by the International Bureau of WIPO in related International Application No. PCT/US2018/023339 dated Sep. 24, 2019, 7 pages. |
International Preliminary Report on Patentability issued by the International Bureau of WIPO in related International Application No. PCT/US2018/057600 dated Apr. 28, 2020, 7 pages. |
International Preliminary Report on Patentability issued by the International Bureau of WIPO in related International Application No. PCT/US2018/057770 dated Apr. 28, 2020, 8 pages. |
International Preliminary Report on Patentability issued by the International Bureau of WIPO in related International Application No. PCT/US2018/057607 dated Apr. 28, 2020, 11 pages. |
International Preliminary Report on Patentability issued by the International Bureau of WIPO in related International Application No. PCT/US2018/057795 dated Apr. 28, 2020, 11 pages. |
International Preliminary Report on Patentability issued by the International Bureau of WIPO in related International Application No. PCT/US2018/057807 dated Apr. 28, 2020, 11 pages. |
International Preliminary Report on Patentability issued by the International Bureau of WIPO in related International Application No. PCT/US2018/057788 dated Apr. 28, 2020, 6 pages. |
International Search Report and Written Opinion issued by the International Searching Authority, the European Patent Office, in related International Patent Application PCT/US2018/023836 dated Jun. 27, 2018, 12 pages. |
International Search Report and Written Opinion issued by the International Searching Authority, the European Patent Office, dated Jun. 29, 2018 in related international application No. PCT/US2018/024065, 11 pages. |
International Search Report and Written Opinion of the International Searching Authority, the European Patent Office, in related International Application No. PCT/US2018/023339 dated Jun. 18, 2018, 10 pages. |
International Search Report and Written Opinion of the International Searching Authority, the European Patent Office, in related International Application No. PCT/US2018/057600 dated Feb. 18, 2019, 10 pages. |
International Search Report and Written Opinion of the International Searching Authority, the European Patent Office, in related International Application No. PCT/US2018/057788 dated Feb. 18, 2018, 12 pages. |
International Search Report and Written Opinion of the International Searching Authority, the European Patent Office, in related International Application No. PCT/US2018/057607 dated Apr. 5, 2019, 16 pages. |
International Search Report and Written Opinion of the International Searching Authority, the European Patent Office, in related International Application No. PCT/US2018/057770 dated Feb. 18, 2019, 13 pages. |
International Search Report and Written Opinion of the International Searching Authority, the European Patent Office, in related International Application No. PCT/US2018/057807 dated Apr. 25, 2019, 16 pages. |
International Search Report and Written Opinion of the International Searching Authority, the European Patent Office, in related International Application No. PCT/US2018/057795 dated Apr. 12, 2019, 17 pages. |
Non-Final Office Action issued by the U.S. Patent and Trademark Office in related U.S. Appl. No. 15/926,497 dated Jun. 10, 2019, 8 pages. |
Non-Final Office Action issued by the U.S. Patent and Trademark Office in related U.S. Appl. No. 15/926,497 dated Dec. 11, 2019, 8 pages. |
Non-Final Office Action issued by the U.S. Patent and Trademark Office in related U.S. Appl. No. 16/172,255 dated Apr. 1, 2020, 6 pages. |
Non-Final Office Action issued by the U.S. Patent and Trademark Office in related U.S. Appl. No. 16/171,310 dated Jan. 9, 2020, 8 pages. |
Non-Final Office Action issued by the U.S. Patent and Trademark Office in related U.S. Appl. No. 16/172,339 dated Mar. 30, 2020, 16 pages. |
Non-Final Office Action issued by the United States Patent and Trademark Office in related U.S. Appl. No. 16/172,353 dated Jan. 4, 2021, 12 pages. |
Non-Final Office Action issued by the United States Patent and Trademark Office in related U.S. Appl. No. 16/172,231 dated Apr. 22, 2021, 11 pages. |
Non-Final Office Action issued by the United States Patent and Trademark Office in related U.S. Appl. No. 16/739,670 dated Oct. 28, 2021, 28 pages. |
Non-Final Office Action issued by the United States Patent and Trademark Office in related U.S. Appl. No. 16/789,917 dated Jan. 11, 2022, 34 pages. |
Notice on the First Office Action and First Office Action, along with its English translation, issued by the China National Intellectual Property Administration in related Chinese Patent Application No. 20880032530.0 dated Sep. 29, 2020, 23 pages. |
Notice on the First Office Action and First Office Action, along with its English translation, issued by the China National Intellectual Property Administration in related Chinese Patent Application No. 201880032609.3 dated Nov. 20, 2020, 20 pages. |
Notice on the First Office Action and First Office Action, along with its English translation, issued by the China National Intellectual Property Administration in related Chinese Patent Application No. 201880069642.3 dated Mar. 3, 2021, 19 pages. |
Notice on the First Office Action and First Office Action, along with its English translation, issued by the China National Intellectual Property Administration in related Chinese Patent Application No. 201880069627.9 dated Jan. 19, 2021, 12 pages. |
Notice on the First Office Action and First Office Action, along with its English translation, issued by the China National Intellectual Property Administration in related Chinese Patent Application No. 201880069729.0 dated Mar. 3, 2021, 16 pages. |
Notice on the First Office Action and First Office Action, along with its English translation, issued by the China National Intellectual Property Administration in related Chinese Patent Application No. 201880069684.7 dated Mar. 3, 2021, 14 pages. |
Notice on the First Office Action and First Office Action, along with its English translation, issued by the China National Intellectual Property Administration in related Chinese Patent Application No. 201880069649.5 dated Mar. 23, 2021, 23 pages. |
Notice on the Second Office Action and Second Office Action, along with its English translation, issued by the China National Intellectual Property Administration in related Chinese Patent Application No. 201880019478.5 dated Apr. 20, 2021, 9 pages. |
Notice on the Second Office Action and Second Office Action, along with its English translation, issued by the China National Intellectual Property Administration in related Chinese Patent Application No. 201880069627.9 dated Aug. 18, 2021, 16 pages. |
Notice on the Second Office Action and Second Office Action, along with its English translation, issued by the China National Intellectual Property Administration in related Chinese Patent Application No. 201880069642.3 dated Oct. 20, 2021, 13 pages. |
Notice on the Second Office Action and Second Office Action, along with its English translation, issued by the China National Intellectual Property Administration in related Chinese Patent Application No. 201880069649.5 dated Nov. 18, 2021, 7 pages. |
Notice on the Second Office Action and the Second Office Action, along with its English translation, issued by the China National Intellectual Property Administration in related Chinese Patent Application No. 201880032609.3 dated Jun. 25, 2021, 7 pages. |
Notice on the Third Office Action and Third Office Action, along with its English translation, issued by the China National Intellectual Property Administration in related Chinese Patent Application No. 201880019478.5 dated Sep. 7, 2021, 10 pages. |
Partial International Search Report and Written Opinion of the International Searching Authority, the European Patent Office, in related International Application No. PCT/US2018/057607 dated Feb. 11, 2018, 12 pages. |
Partial International Search Report and Written Opinion of the International Searching Authority, the European Patent Office, in related International Application No. PCT/US2018/057807 dated Feb. 18, 2019, 13 pages. |
Partial International Search Report and Written Opinion of the International Searching Authority, the European Patent Office, in related International Application No. PCT/US2018/057795 dated Feb. 18, 2019, 14 pages. |
U.S. Non-Final Office Action issued by the U.S. Patent and Trademark Office dated May 21, 2019 in related U.S. Appl. No. 15/934,462, 26 pages. |
U.S. Non-Final Office Action issued by the U.S. Patent and Trademark Office dated Jul. 27, 2020 in related U.S. Appl. No. 16/171,303, 13 pages. |
U.S. Non-Final Office Action issued in related U.S. Appl. No. 15/928,977 dated Apr. 23, 2019, 20 pages. |
U.S. Notice of Allowance issued in related U.S. Appl. No. 15/928,977 dated Oct. 28, 2019, 9 pages. |
First Office Action, and its English translation, issued by the China National Intellectual Property Administration in related Chinese Patent Application No. 202210535569.2 dated Mar. 27, 2023, 31 pages. |
Communication pursuant to Article 94(3) EPC issued by the European Patent Office in related European Patent Application No. 18804759.1 dated Jan. 12, 2024, 4 pages. |
Communication pursuant to Article 94(3) EPC issued by the European Patent Office in related European Patent Application No. 18716070.0 dated Jan. 12, 2024, 3 pages. |
Communication pursuant to Article 94(3) EPC issued by the European Patent Office in related European Patent Application No. 18807152.6 dated Jan. 12, 2024, 5 pages. |
Communication pursuant to Article 94(3) EPC issued by the European Patent Office in related European Patent Application No. 18807761.4 dated Jan. 12, 2024, 6 pages. |
Communication pursuant to Article 94(3) EPC issued by the European Patent Office in related European Patent Application No. 18807476.9 dated Jan. 12, 2024, 4 pages. |
Communication pursuant to Article 94(3) EPC issued by the European Patent Office in related European Patent Application No. 18807477.7 dated Jan. 12, 2024, 8 pages. |
Communication pursuant to Article 94(3) EPC issued by the European Patent Office in related European Patent Application No. 18804772.4 dated Jan. 12, 2024, 4 pages. |
Non-Final Office Action issued by the United States Patent and Trademark Office in related U.S. Appl. No. 18/134,456 dated Dec. 5, 2023, 6 pages. |
Non-Final Office Action issued by the United States Patent and Trademark Office in related U.S. Appl. No. 17/478,402 dated Jan. 30, 2024, 13 pages. |
Notice on Grant of Patent Right for Invention, along with its English translation, issued by the China National Intellectual Property Administration in related Chinese Patent Application No. 202111514667.X dated Oct. 16, 2023, 8 pages. |
Notice on the Second Office Action issued by the China National Intellectual Property Administration in related Chinese Patent Application No. 202210535569.2 dated Dec. 11, 2023, 8 pages. |
Notice on the Second Office Action, along with its English translation, issued by the China National Intellectual Property Administration in related Chinese Patent Application No. 202210440495.4 dated Sep. 21, 2023, 10 pages. |
Non-Final Office Action issued by the United States Patent and Trademark Office in related U.S. Appl. No. 17/950,786 dated Mar. 6, 2024, 30 pages. |
Non-Final Office Action issued by the United States Patent and Trademark Office in related U.S. Appl. No. 18/367,014 dated Mar. 25, 2024, 8 pages. |
Non-Final Office Action issued by the United States Patent and Trademark Office in related U.S. Appl. No. 18/134,456 dated Mar. 12, 2024, 6 pages. |
Non-Final Office Action issued by the United States Patent and Trademark Office in related U.S. Appl. No. 18/368,585 dated Apr. 5, 2024, 10 pages. |
Number | Date | Country | |
---|---|---|---|
20220311480 A1 | Sep 2022 | US |
Number | Date | Country | |
---|---|---|---|
62475483 | Mar 2017 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 16739670 | Jan 2020 | US |
Child | 17840123 | US | |
Parent | 15928977 | Mar 2018 | US |
Child | 16739670 | US |