Systems and methods for surgical robotic cart placement

Information

  • Patent Grant
  • Patent Number
    11,986,261
  • Date Filed
    Monday, April 1, 2019
  • Date Issued
    Tuesday, May 21, 2024
Abstract
A method of placing a surgical robotic cart assembly includes determining a first position of a first surgical robotic cart assembly relative to a surgical table, calculating a path for the first surgical robotic cart assembly towards a second position of the first surgical robotic cart assembly relative to the surgical table, wherein in the second position, the first surgical robotic cart assembly is spaced-apart a first safe distance from the surgical table, moving the first surgical robotic cart assembly autonomously towards the second position thereof, and detecting a potential collision along the path of the first surgical robotic cart assembly as the first surgical robotic cart assembly moves towards the second position thereof.
Description
BACKGROUND
Technical Field

The present disclosure relates to movable surgical robotic systems and, more particularly, to systems and methods facilitating placement of one or more surgical robotic cart assemblies relative to a surgical table.


Background of Related Art

Surgical robotic systems are used in minimally invasive medical procedures because of their increased accuracy and expediency. In surgical robotic systems, a robot arm supports a surgical instrument having an end effector mounted thereto by a wrist assembly. In operation, the robot arm inserts or holds a surgical instrument in a small incision, via a surgical portal, or in a natural orifice of a patient to position the end effector at a work site within the patient's body.


Most surgical robotic systems on the market are heavy and stationary, requiring a motor-driven pallet jack for relocation. In some of the more modern surgical robotic systems, the robot arm is supported on a movable surgical robotic cart having a base portion with a set of casters. This is beneficial because such surgical robotic systems can be moved between rooms and between positions relative to the surgical table as needed, without a pallet jack.


However, minimally invasive medical procedures require a high degree of accuracy, precision, and speed, and, therefore, movable surgical robotic systems used for minimally invasive medical procedures need to be precisely placed relative to the surgical table to achieve optimal positioning for specific surgical procedures.


Accordingly, there is a need to locate and position a surgical robotic cart relative to the surgical table with a high degree of accuracy, precision, and mobility.


SUMMARY

Provided in accordance with aspects of the present disclosure is a method of placing a surgical robotic cart assembly. The method includes determining a first position of a first surgical robotic cart assembly relative to a surgical table, calculating a path for the first surgical robotic cart assembly towards a second position of the first surgical robotic cart assembly relative to the surgical table, wherein in the second position, the first surgical robotic cart assembly is spaced-apart a first safe distance from the surgical table, moving the first surgical robotic cart assembly autonomously towards the second position thereof, and detecting a potential collision along the path of the first surgical robotic cart assembly as the first surgical robotic cart assembly moves towards the second position thereof.


In one aspect of the present disclosure, the method further includes determining a first position of a second surgical robotic cart assembly relative to the first surgical robotic cart assembly and the surgical table, calculating a path for the second surgical robotic cart assembly towards a second position of the second surgical robotic cart assembly relative to the first surgical robotic cart assembly and the surgical table, wherein in the second position, the second surgical robotic cart assembly is spaced-apart a second safe distance from the first surgical robotic cart assembly and a third safe distance from the surgical table, moving the second surgical robotic cart assembly autonomously towards the second position thereof, and detecting a potential collision along the path of the second surgical robotic cart assembly as the second surgical robotic cart assembly moves towards the second position thereof.


In another aspect of the present disclosure, the method may include obtaining a first sensor data from a visual sensor to determine the first position of the first surgical robotic cart assembly and to determine the first position of the second surgical robotic cart assembly.


In yet another aspect of the present disclosure, the method may include obtaining a second sensor data from a floor sensor to determine the first position of the first surgical robotic cart assembly and to determine the first position of the second surgical robotic cart assembly.


In still another aspect of the present disclosure, the method may include obtaining a third sensor data from the first surgical robotic cart assembly to determine the first position of the second surgical robotic cart assembly.


In aspects of the present disclosure, the method may include obtaining a fourth sensor data from the surgical table to determine the first position of the first surgical robotic cart assembly.


In one aspect of the present disclosure, the method may include updating an environmental map to incorporate the first position of the first surgical robotic cart assembly and the first position of the second surgical robotic cart assembly.


In another aspect of the present disclosure, the method may include determining a third position for the second surgical robotic cart assembly upon detecting the potential collision between the second surgical robotic cart assembly and the first surgical robotic cart assembly.


In yet another aspect of the present disclosure, the method may include determining whether troubleshooting of the second surgical robotic cart assembly is required upon detecting the potential collision between the second surgical robotic cart assembly and the first surgical robotic cart assembly.


In still another aspect of the present disclosure, the method may include moving the first surgical robotic cart assembly and the second surgical robotic cart assembly simultaneously towards the respective second positions thereof.


Provided in accordance with another aspect of the present disclosure is a method of positioning a plurality of surgical robotic cart assemblies within an operating room. The method includes, obtaining a first sensor data from an operating room sensor, determining a first position of a first surgical robotic cart assembly and determining a first position of a second surgical robotic cart assembly, the first surgical robotic cart assembly including a first base portion having a first sensor and a first transmitter, and the second surgical robotic cart assembly including a second base portion having a second sensor and a second transmitter, calculating a first path for the first surgical robotic cart assembly towards a second position of the first surgical robotic cart assembly and calculating a second path for the second surgical robotic cart assembly towards a second position of the second surgical robotic cart assembly, moving the first surgical robotic cart assembly and the second surgical robotic cart assembly autonomously towards the second positions, respectively, thereof, detecting a potential collision along the first path and the second path as the first surgical robotic cart assembly moves towards the second position thereof and as the second surgical robotic cart assembly moves towards the second position thereof, and updating an environmental map with the second position of the first surgical robotic cart assembly and with the second position of the second surgical robotic cart assembly upon moving the first and second surgical robotic cart assemblies to the second positions, respectively, thereof.


In one aspect of the present disclosure, the method may include determining the first position of the first surgical robotic cart assembly and determining the first position of the second surgical robotic cart assembly from the first sensor data obtained from the operating room sensor.


In another aspect of the present disclosure, the method may include obtaining a second sensor data from the first sensor of the first surgical robotic cart assembly to determine the first position of the second surgical robotic cart assembly.


In yet another aspect of the present disclosure, the method may include obtaining a third sensor data from the second sensor of the second surgical robotic cart assembly to determine the first position of the first surgical robotic cart assembly.


In still another aspect of the present disclosure, the method may include calculating the second position of the first surgical robotic cart assembly and calculating the second position of the second surgical robotic cart assembly to maintain a first safe distance between the first and second surgical robotic cart assemblies and to maintain a second safe distance between the first and second surgical robotic cart assemblies and a surgical table.


In one aspect of the present disclosure, the method may include moving the second surgical robotic cart assembly autonomously to a third position thereof when a distance between the first and second surgical robotic cart assemblies is less than the first safe distance.


In another aspect of the present disclosure, the method may include updating the environmental map to register the third position of the second surgical robotic cart assembly as a current position of the second surgical robotic cart assembly when the second surgical robotic cart assembly is moved to the third position thereof.


Provided in accordance with yet another aspect of the present disclosure is a surgical robotic cart assembly. The surgical robotic cart assembly includes a robotic arm and a base portion configured to operatively support the robotic arm thereon. The base portion includes a visual guidance system having a projector mounted on the base portion, a display mounted on the base portion, and a plurality of lights mounted on the base portion and spaced apart thereon. The projector is configured to project a pattern corresponding to a movement direction towards a target location. The display is configured to represent a visual indication corresponding to the movement direction towards the target location. At least one of the plurality of lights is configured to selectively illuminate corresponding to the movement direction towards the target location.


In one aspect of the present disclosure, the pattern projected by the projector may be configured to change as the base portion is moved along the pattern towards the target location.


Further details and aspects of exemplary embodiments of the present disclosure are described in more detail below with reference to the appended figures.





BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and, together with a general description of the disclosure given above, and the detailed description of exemplary embodiment(s) given below, serve to explain the principles of the disclosure, wherein:



FIG. 1 is a schematic illustration of a robotic surgical system including a robotic surgical assembly in accordance with the present disclosure;



FIG. 2 is a perspective view of a surgical robotic cart assembly of the robotic surgical system of FIG. 1, illustrating a robotic arm supported on a base portion in accordance with an embodiment of the present disclosure;



FIG. 3 is a flow chart illustrating a method of positioning the surgical robotic cart assembly of FIG. 2 in accordance with the present disclosure;



FIG. 4 is a top, plan view of an operating room of the robotic surgical system of FIG. 1, illustrating a plurality of surgical robotic cart assemblies in a first position;



FIG. 5 is a top, plan view of the operating room of FIG. 4, illustrating the plurality of surgical robotic cart assemblies in a second position;



FIG. 6 is a top, plan view of the operating room of FIG. 4, illustrating the plurality of surgical robotic cart assemblies in a third position;



FIG. 7 is a perspective view of a surgical robotic cart assembly of the robotic surgical system of FIG. 1, illustrating a robotic arm supported on a base portion in accordance with another embodiment of the present disclosure; and



FIG. 8 is a top, plan view of the operating room of the robotic surgical system of FIG. 1, illustrating a plurality of surgical robotic cart assemblies in accordance with another embodiment of the present disclosure.





DETAILED DESCRIPTION

The present disclosure provides systems and methods facilitating automated and manual means for locating and moving one or more surgical robotic cart assemblies towards a target location to optimally position one or more robotic arm(s) relative to a surgical table. Embodiments of the present disclosure are described in detail with reference to the drawings, in which like reference numerals designate identical or corresponding elements in each of the several views.


Referring initially to FIG. 1, a surgical system, such as, for example, a robotic surgical system 1 is shown. In embodiments, robotic surgical system 1 is configured for use in an operating room “OR,” and specifically configured for use on a patient “P” lying on a surgical table “ST” to be treated in a minimally invasive manner by means detailed below. Robotic surgical system 1 generally includes a plurality of robotic arms 2, 3; a control device 4; and an operating console 5 coupled with control device 4.


In embodiments, surgical table “ST” includes one or more sensor(s) 16 and a transmitter 18 disposed about the periphery thereof. Sensor(s) 16 may be configured to receive, for example, radio frequency (RF) signals (e.g., ultra wide band RF signals); ultrasound waves; and infrared (IR) signals, and transmitter 18 may be configured to emit the same.


Robotic arms, such as for example, robotic arm 2 may be coupled to the surgical table “ST.” Alternatively, robotic arms, such as for example, robotic arm 3, may be supported on a surgical robotic cart assembly 100.


Operating console 5 includes a display device 6, which is set up in particular to display three-dimensional images; and manual input devices 7, 8, by means of which a person (not shown), e.g., a surgeon, is able to telemanipulate robotic arms 2, 3 in a first operating mode, as known in principle to a person skilled in the art. Each of the robotic arms 2, 3 may be composed of a plurality of members, which are connected through joints, and may include a surgical instrument, such as, for example, an electromechanical instrument 10 removably attached thereto for treating patient “P” in a minimally invasive manner.


Robotic arms 2, 3 may be driven by electric drives (not shown) that are connected to control device 4. Control device 4 (e.g., a computer) is set up to activate the drives, in particular by means of a computer program, in such a way that robotic arms 2, 3 and thus electromechanical instrument 10 (including the electromechanical end effector (not shown)) execute a desired movement according to a movement defined by means of manual input devices 7, 8. Control device 4 may also be set up in such a way that it regulates the movement of robotic arms 2, 3 and/or of the drives. In embodiments, surgical robotic cart assembly 100 may be controlled via manual input devices 7, 8. Additionally/alternatively, control device 4 may be configured to regulate the movement of surgical robotic cart assembly 100.


Robotic surgical system 1 may also include more than two robotic arms 2, 3, the additional robotic arms likewise being connected to control device 4 and being telemanipulatable by means of operating console 5. A surgical instrument, for example, electromechanical instrument 10 (including the electromechanical end effector), may also be attached to the additional robotic arm. Robotic surgical system 1 may also include a plurality of surgical robotic cart assemblies 100 for supporting robotic arms 3, such as, for example, a first surgical robotic cart assembly 100a and a second surgical robotic cart assembly 100b, as shown in FIGS. 4-6. First and second surgical robotic cart assemblies 100a, 100b are likewise configured to be connected to control device 4 and/or manual input devices 7, 8.


In embodiments, robotic surgical system 1 further includes a database 12 in communication with one or more operating room sensors 14. Database 12 is provided to store one or more environmental maps 12a representing the locations of entities (e.g., surgical table “ST” and first and second surgical robotic cart assemblies 100a, 100b) disposed within operating room “OR.” Environmental maps 12a may be generated from pre-programmed input and/or generated from data collected from operating room sensors 14 and/or data gathered from surgical table “ST” and first and second surgical robotic cart assemblies 100a, 100b, as will be detailed below. Environmental maps 12a include a static map portion 12a1 and a dynamic map portion 12a2. Static map portion 12a1 represents dimensions or boundaries of operating room “OR” and the locations of any landmarks such as, for example, surgical table “ST.” Dynamic map portion 12a2 represents a current working environment generated by iteratively incorporating data informing of the current positions of movable entities such as, for example, first and second surgical robotic cart assemblies 100a, 100b.
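

By way of a non-limiting illustration, the following Python sketch shows one possible representation of environmental maps 12a, with a static map portion (room boundaries and fixed landmarks) and a dynamic map portion (current poses of movable entities) that can be merged into a single working view. The names used (e.g., EnvironmentalMap, Pose2D) and the example dimensions are illustrative assumptions and do not appear in the disclosure.

    # A minimal sketch of an environmental map with static and dynamic portions,
    # assuming each entity is tracked as a 2D pose (x, y, heading) in room coordinates.
    # All names and values are illustrative only.
    from dataclasses import dataclass, field
    from typing import Dict

    @dataclass
    class Pose2D:
        x: float        # meters, room frame
        y: float        # meters, room frame
        heading: float  # radians

    @dataclass
    class EnvironmentalMap:
        # Static portion: room boundary and fixed landmarks (e.g., the surgical table).
        room_bounds: tuple                                  # (width, length) in meters
        landmarks: Dict[str, Pose2D] = field(default_factory=dict)
        # Dynamic portion: current poses of movable entities (carts, clinician).
        dynamic: Dict[str, Pose2D] = field(default_factory=dict)

        def update_dynamic(self, entity_id: str, pose: Pose2D) -> None:
            """Iteratively incorporate the latest measured pose of a movable entity."""
            self.dynamic[entity_id] = pose

        def current_view(self) -> Dict[str, Pose2D]:
            """Align the dynamic portion with the static portion into one snapshot."""
            return {**self.landmarks, **self.dynamic}

    # Example: register the table as a landmark and two carts as dynamic entities.
    env = EnvironmentalMap(room_bounds=(6.0, 8.0),
                           landmarks={"ST": Pose2D(3.0, 4.0, 0.0)})
    env.update_dynamic("cart_100a", Pose2D(1.0, 1.5, 0.0))
    env.update_dynamic("cart_100b", Pose2D(5.0, 1.5, 3.14))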


Database 12 may also include patient data 12b such as, for example, pre-operative data and/or anatomical atlases assigned to patient “P.” Database 12 may also be coupled with operating console 5 such that environmental maps 12a and/or patient data 12b may be displayed on display device 6.


Operating room sensors 14 may include a visual sensor 14a mounted to the ceiling of operating room “OR” and a floor sensor 14b disposed about surgical table “ST.” Visual sensor 14a may include one or more cameras, video cameras, and/or imagers configured to detect the three-dimensional geometry of a base portion 130 of surgical robotic cart assembly 100 (FIG. 2) and surgical table “ST.” In embodiments, visual sensor 14a may be configured to identify and track a unique marker “M” such as, for example, an indicia and/or geometric marking disposed on base portion 130 of surgical robotic cart assembly 100. Floor sensor 14b may be incorporated within a sensorized floor covering and configured to detect a pose or orientation of base portion 130 of surgical robotic cart assembly 100. Database 12 and operating room sensors 14 are coupled with control device 4 such that control device 4 may incorporate the information from environmental maps 12a, patient data 12b, and operating room sensors 14 in regulating the movement of surgical robotic cart assembly 100.
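

By way of a non-limiting illustration, the following sketch shows one simple way a pose (position and orientation) of base portion 130 could be recovered from two detected reference points of the unique marker “M” reported by visual sensor 14a or floor sensor 14b. The two-point convention, coordinate frame, and example values are assumptions made purely for illustration.

    # Illustrative sketch only: recovering a cart base pose from two detected
    # reference points of its marker (e.g., a "front" and a "rear" point) expressed
    # in room coordinates. The two-point convention is an assumption.
    import math

    def pose_from_marker_points(front_pt, rear_pt):
        """Return (x, y, heading) of the base, with heading taken as the direction
        from the rear point to the front point (radians)."""
        fx, fy = front_pt
        rx, ry = rear_pt
        x = (fx + rx) / 2.0
        y = (fy + ry) / 2.0
        heading = math.atan2(fy - ry, fx - rx)
        return x, y, heading

    # Example: marker detected with its front point at (2.0, 1.0) and rear at (1.6, 1.0).
    print(pose_from_marker_points((2.0, 1.0), (1.6, 1.0)))  # -> (1.8, 1.0, 0.0)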


As shown in FIG. 1, the various components of robotic surgical system 1 detailed above may be coupled to one another via a wired and/or wireless means to send and receive data therebetween.


For a detailed discussion of the construction and operation of a robotic surgical system, reference may be made to U.S. Pat. No. 8,828,023, filed on Nov. 3, 2011, entitled “Medical Workstation,” the entire content of which is incorporated herein by reference.


With reference to FIG. 2, one exemplary embodiment of surgical robotic cart assembly 100 configured for use in accordance with the present disclosure is generally identified, although it is also envisioned that the aspects and features of the present disclosure be similarly incorporated into any suitable surgical robotic cart assembly. Surgical robotic cart assembly 100 generally includes robotic arm 3, a vertical column 120, and a base portion 130. Base portion 130 includes a plurality of casters 140, 150, and 160 coupled thereto. Each of the casters 140, 150, and 160 is configured to swivel about a respective pivot axis, and configured to allow surgical robotic cart assembly 100 to move, or to inhibit movement of surgical robotic cart assembly 100.


For a detailed discussion of the construction and operation of a surgical robotic cart assembly, reference may be made to U.S. patent application Ser. No. 15/765,544 filed on Apr. 3, 2018, now U.S. Pat. No. 10,595,944, entitled “SURGICAL ROBOTIC CART WITH SELECTIVE WHEEL ALIGNMENT,” and/or International Patent Application Serial No. PCT/US2019/024509 filed on Mar. 28, 2019, published as WO 2019/203999, entitled “ROBOTIC SURGICAL SYSTEMS AND ROBOTIC ARM CARTS THEREOF,” the entire contents of each of which are incorporated herein by reference.


Continuing with FIG. 2, surgical robotic cart assembly 100 includes a camera 132, one or more sensor(s) 134, a transmitter 136, and a unique marker “M” disposed on base portion 130. In embodiments, sensor(s) 134 may be spaced apart and disposed along the periphery of base portion 130, such that, the relative pose or orientation of base portion 130 may be obtained by determining which of the sensor(s) 134 is first to receive a signal from a source, such as, for example, transmitter 136 of a second surgical robotic cart assembly 100b (FIG. 4) and/or transmitter 18 of surgical table “ST.” Alternatively, camera 132, sensor(s) 134, transmitter 136, and unique marker “M” may be disposed about other components (e.g., vertical column 120) of surgical robotic cart assembly 100.
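

By way of a non-limiting illustration, the following sketch shows one simple way the relative bearing of a signal source (e.g., transmitter 18 of surgical table “ST” or transmitter 136 of another cart) could be estimated from which peripheral sensor reports the earliest arrival time, as described above. The sensor layout, timing values, and function names are assumptions for illustration only.

    # Illustrative sketch only: coarse bearing estimation from "first sensor to
    # receive" ordering. Sensor bearings and arrival times are assumed values.

    # Bearing (degrees, in the base frame) of each sensor spaced about the periphery.
    SENSOR_BEARINGS = {"s0": 0.0, "s1": 90.0, "s2": 180.0, "s3": 270.0}

    def bearing_to_source(arrival_times):
        """Return the bearing of the sensor with the earliest time of arrival,
        taken as a coarse estimate of the direction toward the signal source."""
        first = min(arrival_times, key=arrival_times.get)
        return SENSOR_BEARINGS[first]

    # Example: sensor "s1" (90 degrees) receives the table's transmission first.
    times_ns = {"s0": 1520.0, "s1": 1480.0, "s2": 1555.0, "s3": 1530.0}
    print(bearing_to_source(times_ns))  # -> 90.0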


Similar to sensor(s) 16 of surgical table “ST,” sensor(s) 134 may be configured to receive, for example, RF signals (e.g., ultra wide band RF signals); ultrasound waves; and IR signals, and similar to transmitter 18 of surgical table “ST,” transmitter 136 may be configured to emit the same. As will be detailed below with reference to FIGS. 3-6, it is contemplated that sensor(s) 16 and transmitter 18 of surgical table “ST” and camera 132, sensor(s) 134, and transmitter 136 of surgical robotic cart assembly 100 are configured to cooperate to provide an accurate and robust localization of, for example, first surgical robotic cart assembly 100a relative to second surgical robotic cart assembly 100b and surgical table “ST” disposed within operating room “OR.” Further, sensor(s) 16 and transmitter 18 of surgical table “ST” and camera 132, sensor(s) 134, and transmitter 136 are configured to cooperate to provide an accurate measurement of the relative orientation of each of the plurality of surgical robotic cart assemblies 100 relative to surgical table “ST.”


With reference to FIGS. 3-6, in conjunction with FIGS. 1 and 2, an exemplary method of using robotic surgical system 1 for automatically positioning first surgical robotic cart assembly 100a and second surgical robotic cart assembly 100b around surgical table “ST,” such that robotic arms 3 of first and second surgical robotic cart assemblies 100a, 100b are optimally positioned relative to each other and surgical table “ST” to complete a specified task, is described. Although only two surgical robotic cart assemblies are described, it is contemplated that any number of surgical robotic cart assemblies may be incorporated into robotic surgical system 1. The process of positioning first and second surgical robotic cart assemblies 100a, 100b around surgical table “ST” generally includes a localization phase (e.g., step S200), a path planning phase (e.g., step S208), a movement phase (e.g., step S214), and a confirmation of placement phase (e.g., step S224), as will be detailed below. It is contemplated that robotic surgical system 1 provides a safety system incorporated into the various phases detailed below to prevent inadvertent collisions between first and second surgical robotic cart assemblies 100a, 100b and any entities located within operating room “OR.” This safety system would incorporate sensors, such as, for example, LIDAR, which generates a cloud of 3D surface points of objects, that discern the physical presence of the edges or periphery/boundary of the arms and table relative to one another. The safety system would combine this boundary information with the current movement, as well as the predicted future movement, of the carts relative to one another and the surgical table to decide whether a collision is possible and, if so, adjust the path(s) of the cart(s) relative to one another and/or the surgical table. This safety system could also combine the known geometric configuration of the arms and/or surgical table with the known pose of the same to compute potential collisions and take corrective action to prevent them. In this way, sensors for measuring surface boundaries would not be necessary. Note that, in the case of assessing potential collisions with objects of unknown prior geometry, such as operating room staff and/or equipment moving about the room, the use of surface geometry measurement sensors is desired.
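

By way of a non-limiting illustration, the following sketch captures the prediction step of such a safety system: roll each cart forward along its planned path and flag a potential collision whenever the predicted footprints come closer than a clearance margin. Representing each cart by a circular footprint, and the particular radii, clearance, and waypoints shown, are simplifying assumptions and not part of the disclosure.

    # Minimal sketch, under simplifying assumptions, of the collision-prediction idea:
    # each cart is approximated by a circular footprint and stepped along its planned
    # path; a potential collision is flagged when predicted footprints violate a margin.
    import math

    def predicted_collision(path_a, path_b, radius_a=0.5, radius_b=0.5, clearance=0.2):
        """path_a, path_b: lists of (x, y) waypoints sampled at the same time steps.
        Returns the index of the first step at which the clearance margin would be
        violated, or None if the paths stay clear."""
        for i, ((ax, ay), (bx, by)) in enumerate(zip(path_a, path_b)):
            separation = math.hypot(ax - bx, ay - by)
            if separation < radius_a + radius_b + clearance:
                return i
        return None

    # Example: two carts converging on nearby target positions.
    path_a = [(0.0, 0.0), (0.5, 0.5), (1.0, 1.0), (1.5, 1.5)]
    path_b = [(3.0, 3.0), (2.5, 2.5), (2.0, 2.0), (1.8, 1.8)]
    print(predicted_collision(path_a, path_b))  # -> 3 (margin violated at the last step)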


Turning first to FIGS. 3 and 4, in step S200, the localization phase is initiated. Following the initiation of the localization phase, sensor data is obtained in step S202 and, in step S204, a current position “PA1” of first surgical robotic cart assembly 100a relative to second surgical robotic cart assembly 100b and surgical table “ST” and a current position “PB1” of second surgical robotic cart assembly 100b relative to first surgical robotic cart assembly 100a and surgical table “ST” are determined. In embodiments, sensor data is obtained from visual sensor 14a and floor sensor 14b of operating room “OR.” Alternatively/additionally, current positions “PA1” and “PB1” of first and second surgical robotic cart assemblies 100a, 100b, respectively, may be determined by sending and receiving signals between first and second surgical robotic cart assemblies 100a, 100b and surgical table “ST.”


In embodiments, if an operator or clinician “C” is located within operating room “OR,” the clinician “C” may be provided with a tag 138 configured to transmit a signal corresponding to a current position “PC1” of clinician “C” relative to first and second surgical robotic cart assemblies 100a, 100b, and surgical table “ST.” The current position “PC1” of clinician “C” may be determined by one or more of visual sensor 14a, floor sensor 14b, sensor(s) 16 of surgical table “ST,” and/or camera 132 and sensor(s) 134 of each of first and second surgical robotic cart assemblies 100a, 100b.


Following the localization phase, in step S206, environmental maps 12a are updated by incorporating the current positions “PA1,” “PB1,” and “PC1” of first and second surgical robotic cart assemblies 100a, 100b and clinician “C,” respectively, into the dynamic map portion 12a2, and incorporating or aligning the dynamic map portion 12a2 with the static map portion 12a1 to provide a representation of the current positions “PA1,” “PB1,” and “PC1” within the boundaries of operating room “OR.”


Turning next to FIG. 5, in conjunction with FIG. 3, in step S208, the path planning phase is initiated by control device 4. In step S210, using at least the updated environmental maps 12a gathered during the localization phase and the patient data 12b, control device 4 calculates a second, target position “PA2” for first surgical robotic cart assembly 100a and a second, target position “PB2” for second surgical robotic cart assembly 100b to optimally position each robotic arm 3 (FIG. 2) relative to surgical table “ST.” Step S210 includes plotting a path “X1” for first surgical robotic cart assembly 100a and a path “X2” for second surgical robotic cart assembly 100b which avoids any obstructions en route to second positions “PA2” and “PB2” of first and second surgical robotic cart assemblies 100a, 100b, respectively.
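

By way of a non-limiting illustration, the following sketch plots an obstruction-avoiding route on a simple occupancy grid. The disclosure does not specify a particular planning algorithm; a breadth-first grid search is used here purely as a stand-in, and the grid, start, and goal cells are assumed values.

    # Illustrative stand-in for the path plotting step: breadth-first search over an
    # occupancy grid (0 = free, 1 = occupied by an obstruction such as the table).
    from collections import deque

    def plan_path(grid, start, goal):
        """Return a list of grid cells from start to goal avoiding occupied cells,
        or None if no route exists. start and goal are (row, col) tuples."""
        rows, cols = len(grid), len(grid[0])
        queue = deque([start])
        came_from = {start: None}
        while queue:
            cell = queue.popleft()
            if cell == goal:
                path = []
                while cell is not None:
                    path.append(cell)
                    cell = came_from[cell]
                return path[::-1]
            r, c = cell
            for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 \
                        and (nr, nc) not in came_from:
                    came_from[(nr, nc)] = cell
                    queue.append((nr, nc))
        return None

    # Example: a 4x4 room with two occupied cells between a cart and its target.
    grid = [[0, 0, 0, 0],
            [0, 1, 1, 0],
            [0, 0, 0, 0],
            [0, 0, 0, 0]]
    print(plan_path(grid, (0, 0), (2, 3)))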


In embodiments, as illustrated in FIG. 5, control device 4 calculates path “X1” of first surgical robotic cart assembly 100a to avoid a collision between first surgical robotic cart assembly 100a and surgical table “ST,” and calculates path “X2” of second surgical robotic cart assembly 100b to avoid a collision between second surgical robotic cart assembly 100b and clinician “C.”


Next in step S212, if control device 4 determines that there will be a safe distance “D1” between first and second surgical robotic cart assemblies 100a, 100b as well as a safe distance “D2” between each of first and second surgical robotic cart assemblies 100a, 100b and surgical table “ST” when first and second surgical robotic cart assemblies 100a, 100b are in second positions “PA2” and “PB2,” respectively, control device 4 instructs first and second surgical robotic cart assemblies 100a, 100b to move towards the respective second positions “PA2” and “PB2.”


However, if control device 4 determines that the distance between first and second surgical robotic cart assemblies 100a, 100b will be less than safe distance “D1” and/or the distance between each of first and second surgical robotic cart assemblies 100a, 100b and surgical table “ST” will be less than safe distance “D2,” when first and second surgical robotic cart assemblies 100a, 100b are in second positions “PA2” and “PB2,” respectively, robotic surgical system 1 is returned to step S202 to obtain further sensor data as described above.
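

By way of a non-limiting illustration, the following sketch expresses the step S212 decision described above, with the safe distances “D1” and “D2” treated as configurable thresholds. The function names, positions, and threshold values are assumptions for illustration only.

    # Small sketch of the step S212 gate: proceed only if both planned second
    # positions respect the safe distances; otherwise re-acquire sensor data.
    import math

    def distance(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    def placement_is_safe(pos_a2, pos_b2, table_pos, d1, d2):
        """True if the planned second positions keep the carts at least d1 apart
        from each other and each at least d2 from the surgical table."""
        return (distance(pos_a2, pos_b2) >= d1
                and distance(pos_a2, table_pos) >= d2
                and distance(pos_b2, table_pos) >= d2)

    # Example: if the check fails, the system returns to step S202 for fresh sensor data.
    if placement_is_safe((1.0, 2.0), (3.0, 2.0), (2.0, 3.5), d1=1.5, d2=1.0):
        print("instruct carts to move towards PA2 and PB2")
    else:
        print("return to step S202 and obtain further sensor data")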


With reference to FIGS. 3, 5, and 6, once the path planning phase is completed, the movement phase is initiated in step S214 during which first and second surgical robotic cart assemblies 100a, 100b move autonomously towards second positions “PA2” and “PB2,” respectively, based on the instructions received from control device 4. It is contemplated that first and second surgical robotic cart assemblies 100a, 100b may be moved consecutively or simultaneously towards second positions “PA2” and “PB2,” respectively. It is contemplated that first and second surgical robotic cart assemblies 100a, 100b may be provided with powered wheels and steering assemblies for automated movements thereof.


In step S216, as first and second surgical robotic cart assemblies 100a, 100b move towards second positions “PA2” and “PB2,” respectively, sensor(s) 134 of each of first and second surgical robotic cart assemblies 100a, 100b are configured to continuously monitor for signs of close-in contact potential, beyond the prior detection of obstructions during the path planning phase as described above. Close-in contact potential can be computed using well-established techniques from computer graphics and robotic navigation (see, e.g., “FCL: A general purpose library for collision and proximity queries,” Robotics and Automation (ICRA), 2012 IEEE International Conference on, DOI: 10.1109/ICRA.2012.6225337).
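

A general-purpose collision library such as FCL, cited above, may be used for such distance and proximity queries. The following self-contained sketch instead approximates each base portion by a circular footprint so that no external library is required; the radii and threshold shown are assumed values.

    # Sketch only: a continuous close-in proximity check during the movement phase,
    # approximating each base portion by a circle rather than querying a full
    # collision library.
    import math

    def contact_potential(pose_a, pose_b, radius_a=0.5, radius_b=0.5, threshold=0.3):
        """Return (close_in, gap) where gap is the free space between the two circular
        footprints and close_in is True when the gap drops below the threshold."""
        gap = math.hypot(pose_a[0] - pose_b[0], pose_a[1] - pose_b[1]) - radius_a - radius_b
        return gap < threshold, gap

    # Example: carts 100a and 100b drifting towards the potential contact distance "D3".
    close_in, gap = contact_potential((1.0, 1.0), (2.1, 1.0))
    print(close_in, round(gap, 2))  # -> True 0.1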


In step S216, with first and second surgical robotic cart assemblies 100a, 100b located in second positions “PA2” and “PB2,” respectively, as shown in FIG. 6, if a potential contact distance “D3” between first and second surgical robotic cart assemblies 100a, 100b is detected by sensor(s) 134 of second surgical robotic cart assembly 100b, in step S218, second surgical robotic cart assembly 100b is configured to determine whether troubleshooting is required.


In step S220, if troubleshooting is required, second surgical robotic cart assembly 100b is configured to stop movement and broadcast a signal via transmitter 136 to clinician “C” in operating room “OR” and/or observers (not shown) outside of operating room “OR” indicating the potential for collision with first surgical robotic cart assembly 100a or surgical table “ST.”


In step S222, if troubleshooting is not required, second surgical robotic cart assembly 100b is configured to signal via transmitter 136 to control device 4 indicating the potential for collision with first surgical robotic cart assembly 100a or surgical table “ST.” Following the indication from second surgical robotic cart assembly 100b, control device 4 determines a third position “PB3” for second surgical robotic cart assembly 100b such that, second surgical robotic cart assembly 100b avoids collision with first surgical robotic cart assembly 100a or surgical table “ST.” It is contemplated that third position “PB3” of second surgical robotic cart assembly 100b is calculated to enable second surgical robotic cart assembly 100b to continue performing its specified task while avoiding a collision with first surgical robotic cart assembly 100a and surgical table “ST.”
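

By way of a non-limiting illustration, the following sketch selects a replacement position such as “PB3” from a set of candidate positions, choosing the candidate nearest the original target that restores both safe distances. The disclosure does not specify how candidate positions are generated; the candidate list, positions, and thresholds below are assumptions for illustration only.

    # Illustrative sketch of choosing a replacement position: among candidate
    # positions, pick the one nearest the original target that satisfies both the
    # cart-to-cart and cart-to-table safe distances.
    import math

    def distance(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    def choose_third_position(candidates, target, other_cart, table, d_cart, d_table):
        safe = [c for c in candidates
                if distance(c, other_cart) >= d_cart and distance(c, table) >= d_table]
        return min(safe, key=lambda c: distance(c, target)) if safe else None

    # Example: cart 100b backs off from its target just enough to clear cart 100a
    # and the surgical table.
    candidates = [(2.6, 1.0), (2.8, 0.8), (3.0, 0.5)]
    print(choose_third_position(candidates, target=(2.4, 1.2),
                                other_cart=(1.5, 1.2), table=(2.5, 2.5),
                                d_cart=1.0, d_table=1.2))  # -> (2.6, 1.0)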


Once third position “PB3” of second surgical robotic cart assembly 100b is determined, robotic surgical system 1 is returned to step S214 wherein the movement phase is initiated to move second surgical robotic cart assembly 100b to third position “PB3.” Though not specifically illustrated in FIG. 6, it is contemplated that first surgical robotic cart assembly 100a is configured to operate similar to second surgical robotic cart assembly 100b in response to detecting the potential contact distance “D3” between first and second surgical robotic cart assemblies 100a, 100b.


Continuing with FIG. 3, alternatively, if the potential contact distance “D3” between first and second surgical robotic cart assemblies 100a, 100b is not detected by sensor(s) 134 of second surgical robotic cart assembly 100b following step S216, in step S224, the confirmation of placement phase is initiated. Once control device 4 determines that first and second surgical robotic cart assemblies 100a, 100b are in the respective specified positions (e.g., positions “PA2” and “PB2” or “PB3”), in step S226, environmental maps 12a are updated similar to the process in step S206. Specifically, the dynamic map portion 12a2 of environmental maps 12a is updated to register positions “PA2” and “PB2”/“PB3” of first and second surgical robotic cart assemblies 100a, 100b, respectively, as the respective current positions thereof. Next, the updated dynamic map portion 12a2 of environmental maps 12a is again incorporated or aligned with the static map portion 12a1 of environmental maps 12a to provide a representation of the current positions “PA2” and “PB2”/“PB3” within the boundaries of operating room “OR.”


Following the confirmation of placement phase, in step S228, first and second surgical robotic cart assemblies 100a, 100b are configured to perform the respective specified tasks (e.g., medical procedure). It is contemplated that further adjustments may be made to the positions of first and second surgical robotic cart assemblies 100a, 100b as described in the method steps above with reference to FIGS. 3-6.


Turning now to FIG. 7, a surgical robotic cart assembly 300 provided in accordance with another aspect of the present disclosure is shown. Similar to surgical robotic cart assembly 100, surgical robotic cart assembly 300 generally includes a robotic arm 30, a vertical column 320, and a base portion 330. Base portion 330 includes a plurality of casters 340, 350, and 360 coupled thereto.


Base portion 330 also includes a visual guidance system 331 and an on-board safety system 370. Visual guidance system 331 is configured to direct clinician “C” as to where surgical robotic cart assembly 300 needs to be manually moved. In embodiments, visual guidance system 331 includes a projector 332 mounted on a gimbal 334 supported on a bottom or floor-facing surface 330a of base portion 330. In embodiments, visual guidance system 331 also includes a display 336 and one or more light(s) 338 mounted on a top surface 330b thereof. Light(s) 338 may be spaced apart and disposed about the periphery of base portion 330.


With additional reference to FIG. 8, in conjunction with FIG. 7, a manual means of positioning a first surgical robotic cart assembly 300a and a second surgical robotic cart assembly 300b around surgical table “ST” is described. Similar to the automated positioning means described above with reference to FIGS. 2-6, the manual positioning means is configured to facilitate optimal positioning of robotic arm 30 of each of first and second surgical robotic cart assemblies 300a, 300b relative to each other and surgical table “ST” to complete a specified task.


In embodiments, projector 332 of each of first and second surgical robotic cart assemblies 300a, 300b is configured to emit or project a pattern “A1” and “B1,” respectively, onto the floor of the operating room “OR,” which directs clinician “C” to move first and second surgical robotic cart assemblies 300a, 300b towards a target location “A2” and “B2,” respectively. In embodiments, visual sensor 14a mounted to the ceiling of operating room “OR” may include a projector 14c configured to simultaneously project patterns “A1” and “B1,” and target locations “A2” and “B2,” for first and second surgical robotic cart assemblies 300a, 300b, respectively. Alternatively, projector 14c may instead be disposed within the operating room lights (not shown) above surgical table “ST.” It is contemplated that as clinician “C” moves first and second surgical robotic cart assemblies 300a, 300b towards target locations “A2” and “B2,” respectively, patterns “A1” and “B1” are configured to change to provide updated directions until exact placement of first and second surgical robotic cart assemblies 300a, 300b is achieved. Upon reaching respective target locations “A2” and “B2,” first and second surgical robotic cart assemblies 300a, 300b are configured to provide audible and/or visible indication that target locations “A2” and “B2” have been reached.


In embodiments, display 336 on base portion 330 is configured to display a visual indication 337 of the direction to move first and second surgical robotic cart assemblies 300a, 300b. Visual indication 337 of display 336 may include geometric indicia such as, for example, an arrow scaled according to the distance first and second surgical robotic cart assemblies 300a, 300b need to move in order to reach target locations “A2” and “B2,” respectively. In embodiments, visual indication 337 of display 336 may include numeric indicia of the remaining distance to target locations “A2” and “B2,” respectively. Display 336 is configured to continuously update the visual indication 337 until target locations “A2” and “B2” have been reached. In embodiments, light(s) 338 disposed about the periphery of base portion 330, may be configured to selectively illuminate to provide indication to move first and second surgical robotic cart assemblies 300a, 300b in a certain direction.
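

By way of a non-limiting illustration, the following sketch computes the kind of guidance cue display 336 could present: a relative bearing for the displayed arrow (in the base frame) and the remaining distance to the target location. The pose convention and example values are assumptions for illustration only.

    # Sketch only: computing a guidance cue -- a bearing for the displayed arrow and
    # the remaining distance to the target location.
    import math

    def guidance_cue(current_pose, target_xy):
        """current_pose: (x, y, heading in radians); target_xy: (x, y).
        Returns (relative_bearing_deg, remaining_distance_m)."""
        x, y, heading = current_pose
        tx, ty = target_xy
        remaining = math.hypot(tx - x, ty - y)
        bearing = math.degrees(math.atan2(ty - y, tx - x) - heading)
        bearing = (bearing + 180.0) % 360.0 - 180.0   # normalize to [-180, 180)
        return bearing, remaining

    # Example: cart facing +x, target location ahead and to the left.
    print(guidance_cue((0.0, 0.0, 0.0), (2.0, 1.0)))  # -> (about 26.6 deg, about 2.24 m)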


Further, as shown in FIG. 8, clinician “C” may be provided with an augmented reality display device “AR” configured to provide an enhanced view of the operating room “OR” including visual indication of where first and second surgical robotic cart assemblies 300a, 300b need to be moved as well as dynamically updating paths to reach target locations “A2” and “B2,” respectively. Augmented reality display device “AR” may also be configured to provide indication of speed and the need to slow movement of first and second surgical robotic cart assemblies 300a, 300b in order to avoid collisions.


Safety system 370 is configured to operate in conjunction with visual guidance system 331 to prevent collisions between one or more of first and second surgical robotic cart assemblies 300a, 300b and surgical table “ST” as first and second surgical robotic cart assemblies 300a, 300b are moved by clinician “C.” In embodiments, safety system 370 is configured to selectively trigger a locking mechanism to apply brakes to one or more of the plurality of casters 340, 350, and 360 of base portion 330. Safety system 370 is also configured to communicate with control device 4 (FIG. 1) in instances where first and second surgical robotic cart assemblies 300a, 300b require troubleshooting.
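

By way of a non-limiting illustration, the braking decision of safety system 370 could be reduced to a simple clearance test, as sketched below; the function name, clearance value, and example gaps are assumptions for illustration only.

    # Minimal sketch, with assumed names and values, of the braking decision: command
    # the caster locking mechanism whenever the measured gap to the nearest tracked
    # entity (other cart, surgical table, clinician) falls below a clearance threshold.
    def should_apply_brakes(gaps_to_entities, clearance=0.25):
        """gaps_to_entities: iterable of measured free-space gaps in meters."""
        return any(gap < clearance for gap in gaps_to_entities)

    # Example: the gap to the surgical table has closed to 0.15 m, so the brakes engage.
    print(should_apply_brakes([1.2, 0.9, 0.15]))  # -> True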


While several embodiments of the disclosure have been shown in the drawings, it is not intended that the disclosure be limited thereto, as it is intended that the disclosure be as broad in scope as the art will allow and that the specification be read likewise. Any combination of the above embodiments is also envisioned and is within the scope of the claimed invention. Therefore, the above description should not be construed as limiting, but merely as exemplifications of particular embodiments. Those skilled in the art will envision other modifications within the scope and spirit of the claims appended hereto.

Claims
  • 1. A method of placing a surgical robotic cart assembly, comprising: determining a first position of a first surgical robotic cart assembly relative to a surgical table;determining a first position of a second surgical robotic cart assembly relative to the first surgical robotic cart assembly and the surgical table;retrieving an environmental map including a static map portion including boundaries of an operating room and at least one landmark and a dynamic map portion including positions of the first and second surgical robotic carts;updating the dynamic map portion to incorporate the first position of the first surgical robotic cart assembly and the first position of the second surgical robotic cart assembly;calculating a path for the first surgical robotic cart assembly towards a second position of the first surgical robotic cart assembly relative to the surgical table, wherein in the second position, the first surgical robotic cart assembly is spaced-apart a first safe distance from the surgical table;calculating a path for the second surgical robotic cart assembly towards a second position of the second surgical robotic cart assembly relative to the first surgical robotic cart assembly and the surgical table, wherein in the second position, the second surgical robotic cart assembly is spaced-apart a second safe distance from the first surgical robotic cart assembly and a third safe distance from the surgical table;moving the first surgical robotic cart assembly autonomously towards the second position thereof;moving the second surgical robotic cart assembly autonomously towards the second position thereof;updating the dynamic map portion during movement of the first surgical robotic cart assembly and the second surgical robotic cart assembly;predicting a potential collision along the path of the first surgical robotic cart assembly as the first surgical robotic cart assembly moves towards the second position thereof; andpredicting a potential collision along the path of the second surgical robotic cart assembly as the second surgical robotic cart assembly moves towards the second position thereof.
  • 2. The method according to claim 1, further comprising: obtaining a first sensor data from a visual sensor to determine the first position of the first surgical robotic cart assembly and to determine the first position of the second surgical robotic cart assembly.
  • 3. The method according to claim 1, further comprising: obtaining a second sensor data from a floor sensor to determine the first position of the first surgical robotic cart assembly and to determine the first position of the second surgical robotic cart assembly.
  • 4. The method according to claim 1, further comprising: obtaining a third sensor data from the first surgical robotic cart assembly to determine the first position of the second surgical robotic cart assembly.
  • 5. The method according to claim 1, further comprising: obtaining a fourth sensor data from the surgical table to determine the first position of the first surgical robotic cart assembly.
  • 6. The method according to claim 1, further comprising: updating an environmental map to incorporate the first position of the first surgical robotic cart assembly and the first position of the second surgical robotic cart assembly.
  • 7. The method according to claim 1, further comprising: determining a third position for the second surgical robotic cart assembly upon detecting the potential collision between the second surgical robotic cart assembly and the first surgical robotic cart assembly.
  • 8. The method according to claim 1, further comprising: determining whether the second surgical robotic cart assembly needs troubleshooting upon detecting the potential collision between the second surgical robotic cart assembly and the first surgical robotic cart assembly.
  • 9. The method according to claim 1, further comprising: moving the first surgical robotic cart assembly and the second surgical robotic cart assembly simultaneously towards the respective second positions thereof.
  • 10. The method according to claim 1, further comprising: determining a first position of a clinician relative to the first surgical robotic cart assembly, the second surgical robotic cart assembly, and the surgical table.
  • 11. A method of positioning a plurality of surgical robotic cart assemblies within an operating room, comprising: obtaining a first sensor data from an operating room sensor;determining a first position of a first surgical robotic cart assembly and determining a first position of a second surgical robotic cart assembly, the first surgical robotic cart assembly including a first base portion having a first sensor and a first transmitter, and the second surgical robotic cart assembly including a second base portion having a second sensor and a second transmitter;calculating a first path for the first surgical robotic cart assembly towards a second position of the first surgical robotic cart assembly and calculating a second path for the second surgical robotic cart assembly towards a second position of the second surgical robotic cart assembly;moving the first surgical robotic cart assembly and the second surgical robotic cart assembly autonomously towards the second positions, respectively, thereof;predicting a potential collision along the first path and the second path as the first surgical robotic cart assembly moves towards the second position thereof and as the second surgical robotic cart assembly moves towards the second position thereof; andupdating an environmental map with the second position of the first surgical robotic cart assembly and with the second position of the second surgical robotic cart assembly upon moving the first and second surgical robotic cart assemblies to the second positions, respectively, thereof.
  • 12. The method according to claim 11, further comprising: determining the first position of the first surgical robotic cart assembly and determining the first position of the second surgical robotic cart assembly from the first sensor data obtained from the operating room sensor.
  • 13. The method according to claim 11, further comprising: obtaining a second sensor data from the first sensor of the first surgical robotic cart assembly to determine the first position of the second surgical robotic cart assembly.
  • 14. The method according to claim 11, further comprising: obtaining a third sensor data from the second sensor of the second surgical robotic cart assembly to determine the first position of the first surgical robotic cart assembly.
  • 15. The method according to claim 11, further comprising: calculating the second position of the first surgical robotic cart assembly and calculating the second position of the second surgical robotic cart assembly to maintain a first safe distance between the first and second surgical robotic cart assemblies and to maintain a second safe distance between the first and second surgical robotic cart assemblies and a surgical table.
  • 16. The method according to claim 15, further comprising: moving the second surgical robotic cart assembly autonomously to a third position thereof when a distance between the first and second surgical robotic cart assemblies is less than the first safe distance.
  • 17. The method according to claim 16, further comprising: updating the environmental map to register the third position of the second surgical robotic cart assembly as a current position of the second surgical robotic cart assembly when the second surgical robotic cart assembly is moved to the third position thereof.
  • 18. A surgical robotic cart assembly, comprising: a robotic arm; anda base portion configured to operatively support the robotic arm thereon, the base portion including a visual guidance system having:a projector mounted on the base portion, the projector configured to project a pattern corresponding to a movement direction towards a target location, wherein the pattern projected by the projector is configured to change as the base portion is moved along the pattern towards the target location;a display mounted on the base portion, the display configured to represent a visual indication corresponding to the movement direction towards the target location; anda plurality of lights mounted on the base portion and spaced apart thereon, at least one of the plurality of lights configured to selectively illuminate corresponding to the movement direction towards the target location.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a U.S. National Stage Application filed under 35 U.S.C. § 371(a) of International Patent Application Serial No. PCT/US2019/025108, filed Apr. 1, 2019, which claims the benefit of and priority to U.S. Provisional Patent Application Ser. No. 62/660,476, filed Apr. 20, 2018, the entire disclosure of each of which are incorporated by reference herein.

PCT Information
Filing Document Filing Date Country Kind
PCT/US2019/025108 4/1/2019 WO
Publishing Document Publishing Date Country Kind
WO2019/204013 10/24/2019 WO A
US Referenced Citations (392)
Number Name Date Kind
6132368 Cooper Oct 2000 A
6154139 Heller Nov 2000 A
6205396 Teicher et al. Mar 2001 B1
6206903 Ramans Mar 2001 B1
6246200 Blumenkranz et al. Jun 2001 B1
6312435 Wallace et al. Nov 2001 B1
6331181 Tierney et al. Dec 2001 B1
6374155 Wallach et al. Apr 2002 B1
6394998 Wallace et al. May 2002 B1
6424885 Niemeyer et al. Jul 2002 B1
6438456 Feddema et al. Aug 2002 B1
6441577 Blumenkranz et al. Aug 2002 B2
6459926 Nowlin et al. Oct 2002 B1
6491691 Morley et al. Dec 2002 B1
6491701 Tierney et al. Dec 2002 B2
6493608 Niemeyer Dec 2002 B1
6507771 Payton et al. Jan 2003 B2
6565554 Niemeyer May 2003 B1
6645196 Nixon et al. Nov 2003 B1
6659939 Moll et al. Dec 2003 B2
6671581 Niemeyer et al. Dec 2003 B2
6676684 Morley et al. Jan 2004 B1
6685698 Morley et al. Feb 2004 B2
6699235 Wallace et al. Mar 2004 B2
6714839 Salisbury, Jr. et al. Mar 2004 B2
6716233 Whitman Apr 2004 B1
6728599 Wang et al. Apr 2004 B2
6746443 Morley et al. Jun 2004 B1
6766204 Niemeyer et al. Jul 2004 B2
6770081 Cooper et al. Aug 2004 B1
6772053 Niemeyer Aug 2004 B2
6783524 Anderson et al. Aug 2004 B2
6793652 Whitman et al. Sep 2004 B1
6793653 Sanchez et al. Sep 2004 B2
6799065 Niemeyer Sep 2004 B1
6837883 Moll et al. Jan 2005 B2
6839612 Sanchez et al. Jan 2005 B2
6840938 Morley et al. Jan 2005 B1
6843403 Whitman Jan 2005 B2
6845297 Allard Jan 2005 B2
6846309 Whitman et al. Jan 2005 B2
6866671 Tierney et al. Mar 2005 B2
6871117 Wang et al. Mar 2005 B2
6879880 Nowlin et al. Apr 2005 B2
6899705 Niemeyer May 2005 B2
6902560 Morley et al. Jun 2005 B1
6936042 Wallace et al. Aug 2005 B2
6941191 Jaeger Sep 2005 B2
6951535 Ghodoussi et al. Oct 2005 B2
6974449 Niemeyer Dec 2005 B2
6991627 Madhani et al. Jan 2006 B2
6994708 Manzo Feb 2006 B2
7048745 Tierney et al. May 2006 B2
7066926 Wallace et al. Jun 2006 B2
7118582 Wang et al. Oct 2006 B1
7125403 Julian et al. Oct 2006 B2
7155315 Niemeyer et al. Dec 2006 B2
7239940 Wang et al. Jul 2007 B2
7306597 Manzo Dec 2007 B2
7357774 Cooper Apr 2008 B2
7373219 Nowlin et al. May 2008 B2
7379790 Toth et al. May 2008 B2
7386365 Nixon Jun 2008 B2
7391173 Schena Jun 2008 B2
7398707 Morley et al. Jul 2008 B2
7413565 Wang et al. Aug 2008 B2
7453227 Prisco et al. Nov 2008 B2
7524320 Tierney et al. Apr 2009 B2
7574250 Niemeyer Aug 2009 B2
7594912 Cooper et al. Sep 2009 B2
7607440 Coste-Maniere et al. Oct 2009 B2
7666191 Orban, III et al. Feb 2010 B2
7667606 Packert et al. Feb 2010 B2
7682357 Ghodoussi et al. Mar 2010 B2
7689320 Prisco et al. Mar 2010 B2
7695481 Wang et al. Apr 2010 B2
7695485 Whitman et al. Apr 2010 B2
7699855 Anderson et al. Apr 2010 B2
7713263 Niemeyer May 2010 B2
7725214 Diolaiti May 2010 B2
7727244 Orban, III et al. Jun 2010 B2
7741802 Prisco et al. Jun 2010 B2
7756036 Druke et al. Jul 2010 B2
7757028 Druke et al. Jul 2010 B2
7762825 Burbank et al. Jul 2010 B2
7778733 Nowlin et al. Aug 2010 B2
7803151 Whitman Sep 2010 B2
7806891 Nowlin et al. Oct 2010 B2
7819859 Prisco et al. Oct 2010 B2
7819885 Cooper Oct 2010 B2
7824401 Manzo et al. Nov 2010 B2
7835823 Sillman et al. Nov 2010 B2
7843158 Prisco Nov 2010 B2
7865266 Moll et al. Jan 2011 B2
7865269 Prisco et al. Jan 2011 B2
7886743 Cooper et al. Feb 2011 B2
7899578 Prisco et al. Mar 2011 B2
7907166 Lamprecht et al. Mar 2011 B2
7935130 Williams May 2011 B2
7963913 Devengenzo et al. Jun 2011 B2
7983793 Toth et al. Jul 2011 B2
8002767 Sanchez et al. Aug 2011 B2
8004229 Nowlin et al. Aug 2011 B2
8012170 Whitman et al. Sep 2011 B2
8054752 Druke et al. Nov 2011 B2
8062288 Cooper et al. Nov 2011 B2
8079950 Stern et al. Dec 2011 B2
8100133 Mintz et al. Jan 2012 B2
8108072 Zhao et al. Jan 2012 B2
8112172 Chen et al. Feb 2012 B2
8120301 Goldberg et al. Feb 2012 B2
8142447 Cooper et al. Mar 2012 B2
8147503 Zhao et al. Apr 2012 B2
8151661 Schena et al. Apr 2012 B2
8155479 Hoffman et al. Apr 2012 B2
8182469 Anderson et al. May 2012 B2
8202278 Orban, III et al. Jun 2012 B2
8205396 Atiyeh, Sr. et al. Jun 2012 B2
8206406 Orban, III Jun 2012 B2
8210413 Whitman et al. Jul 2012 B2
8216250 Orban, III et al. Jul 2012 B2
8220468 Cooper et al. Jul 2012 B2
8244403 Lin et al. Aug 2012 B2
8256319 Cooper et al. Sep 2012 B2
8285517 Sillman et al. Oct 2012 B2
8315720 Mohr et al. Nov 2012 B2
8335590 Costa et al. Dec 2012 B2
8347757 Duval Jan 2013 B2
8374723 Zhao et al. Feb 2013 B2
8418073 Mohr et al. Apr 2013 B2
8419717 Diolaiti et al. Apr 2013 B2
8423182 Robinson et al. Apr 2013 B2
8446288 Mizushima et al. May 2013 B2
8452447 Nixon May 2013 B2
8454585 Whitman Jun 2013 B2
8478442 Casey et al. Jul 2013 B2
8499992 Whitman et al. Aug 2013 B2
8508173 Goldberg et al. Aug 2013 B2
8528440 Morley et al. Sep 2013 B2
8529582 Devengenzo et al. Sep 2013 B2
8540748 Murphy et al. Sep 2013 B2
8551116 Julian et al. Oct 2013 B2
8562594 Cooper et al. Oct 2013 B2
8594841 Zhao et al. Nov 2013 B2
8597182 Stein et al. Dec 2013 B2
8597280 Cooper et al. Dec 2013 B2
8600551 Itkowitz et al. Dec 2013 B2
8608773 Tierney et al. Dec 2013 B2
8620473 Diolaiti et al. Dec 2013 B2
8624537 Nowlin et al. Jan 2014 B2
8634956 Chiappetta et al. Jan 2014 B1
8634957 Toth et al. Jan 2014 B2
8638056 Goldberg et al. Jan 2014 B2
8638057 Goldberg et al. Jan 2014 B2
8644988 Prisco et al. Feb 2014 B2
8666544 Moll et al. Mar 2014 B2
8668638 Donhowe et al. Mar 2014 B2
8746252 McGrogan et al. Jun 2014 B2
8749189 Nowlin et al. Jun 2014 B2
8749190 Nowlin et al. Jun 2014 B2
8758352 Cooper et al. Jun 2014 B2
8761930 Nixon Jun 2014 B2
8768516 Diolaiti et al. Jul 2014 B2
8786241 Nowlin et al. Jul 2014 B2
8790243 Cooper et al. Jul 2014 B2
8798840 Fong et al. Aug 2014 B2
8808164 Hoffman et al. Aug 2014 B2
8816628 Nowlin et al. Aug 2014 B2
8821480 Burbank Sep 2014 B2
8823308 Nowlin et al. Sep 2014 B2
8827989 Niemeyer Sep 2014 B2
8828023 Neff et al. Sep 2014 B2
8838270 Druke et al. Sep 2014 B2
8852174 Burbank Oct 2014 B2
8858547 Brogna Oct 2014 B2
8862268 Robinson et al. Oct 2014 B2
8864751 Prisco et al. Oct 2014 B2
8864752 Diolaiti et al. Oct 2014 B2
8903546 Diolaiti et al. Dec 2014 B2
8903549 Itkowitz et al. Dec 2014 B2
8911428 Cooper et al. Dec 2014 B2
8912746 Reid et al. Dec 2014 B2
8944070 Guthart et al. Feb 2015 B2
8983662 Moore Mar 2015 B2
8989903 Weir et al. Mar 2015 B2
9002518 Manzo et al. Apr 2015 B2
9014856 Manzo et al. Apr 2015 B2
9016540 Whitman et al. Apr 2015 B2
9019345 Patrick Apr 2015 B2
9043027 Durant et al. May 2015 B2
9050120 Swarup et al. Jun 2015 B2
9055961 Manzo et al. Jun 2015 B2
9068628 Solomon et al. Jun 2015 B2
9078684 Williams Jul 2015 B2
9084623 Gomez et al. Jul 2015 B2
9095362 Dachs, II et al. Aug 2015 B2
9096033 Holop et al. Aug 2015 B2
9101381 Burbank et al. Aug 2015 B2
9113877 Whitman et al. Aug 2015 B1
9138284 Krom et al. Sep 2015 B2
9144456 Rosa et al. Sep 2015 B2
9198730 Prisco et al. Dec 2015 B2
9204923 Manzo et al. Dec 2015 B2
9226648 Saadat et al. Jan 2016 B2
9226750 Weir et al. Jan 2016 B2
9226761 Burbank Jan 2016 B2
9232984 Guthart et al. Jan 2016 B2
9241766 Duque et al. Jan 2016 B2
9241767 Prisco et al. Jan 2016 B2
9241769 Larkin et al. Jan 2016 B2
9259275 Burbank Feb 2016 B2
9259277 Rogers et al. Feb 2016 B2
9259281 Griffiths et al. Feb 2016 B2
9259282 Azizian et al. Feb 2016 B2
9261172 Solomon et al. Feb 2016 B2
9265567 Orban, III et al. Feb 2016 B2
9265584 Itkowitz et al. Feb 2016 B2
9283048 Kostrzewski et al. Mar 2016 B2
9283049 Diolaiti et al. Mar 2016 B2
9301811 Goldberg et al. Apr 2016 B2
9314307 Richmond et al. Apr 2016 B2
9317651 Nixon Apr 2016 B2
9345546 Toth et al. May 2016 B2
9393017 Flanagan et al. Jul 2016 B2
9402689 Prisco et al. Aug 2016 B2
9417621 Diolaiti et al. Aug 2016 B2
9424303 Hoffman et al. Aug 2016 B2
9433418 Whitman et al. Sep 2016 B2
9446517 Burns et al. Sep 2016 B2
9452020 Griffiths et al. Sep 2016 B2
9474569 Manzo et al. Oct 2016 B2
9480533 Devengenzo et al. Nov 2016 B2
9503713 Zhao et al. Nov 2016 B2
9550300 Danitz et al. Jan 2017 B2
9554859 Nowlin et al. Jan 2017 B2
9566124 Prisco et al. Feb 2017 B2
9579164 Itkowitz et al. Feb 2017 B2
9585641 Cooper et al. Mar 2017 B2
9615883 Schena et al. Apr 2017 B2
9623563 Nixon Apr 2017 B2
9623902 Griffiths et al. Apr 2017 B2
9629520 Diolaiti Apr 2017 B2
9662177 Weir et al. May 2017 B2
9664262 Donlon et al. May 2017 B2
9687312 Dachs, II et al. Jun 2017 B2
9700334 Hinman et al. Jul 2017 B2
9718190 Larkin et al. Aug 2017 B2
9730719 Brisson et al. Aug 2017 B2
9737199 Pistor et al. Aug 2017 B2
9795446 DiMaio et al. Oct 2017 B2
9797484 Solomon et al. Oct 2017 B2
9801690 Larkin et al. Oct 2017 B2
9814530 Weir et al. Nov 2017 B2
9814536 Goldberg et al. Nov 2017 B2
9814537 Itkowitz et al. Nov 2017 B2
9820823 Richmond et al. Nov 2017 B2
9827059 Robinson et al. Nov 2017 B2
9830371 Hoffman et al. Nov 2017 B2
9839481 Blumenkranz et al. Dec 2017 B2
9839487 Dachs, II Dec 2017 B2
9850994 Schena Dec 2017 B2
9855102 Blumenkranz Jan 2018 B2
9855107 Labonville et al. Jan 2018 B2
9872737 Nixon Jan 2018 B2
9877718 Weir et al. Jan 2018 B2
9883920 Blumenkranz Feb 2018 B2
9888974 Niemeyer Feb 2018 B2
9895813 Blumenkranz et al. Feb 2018 B2
9901408 Larkin Feb 2018 B2
9902061 Kuffner Feb 2018 B1
9918800 Itkowitz et al. Mar 2018 B2
9943375 Blumenkranz et al. Apr 2018 B2
9943964 Hares Apr 2018 B2
9948852 Lilagan et al. Apr 2018 B2
9949798 Weir Apr 2018 B2
9949802 Cooper Apr 2018 B2
9952107 Blumenkranz et al. Apr 2018 B2
9956044 Gomez et al. May 2018 B2
9980778 Ohline et al. May 2018 B2
10008017 Itkowitz et al. Jun 2018 B2
10028793 Griffiths et al. Jul 2018 B2
10033308 Chaghajerdi et al. Jul 2018 B2
10034719 Richmond et al. Jul 2018 B2
10052167 Au et al. Aug 2018 B2
10085811 Weir et al. Oct 2018 B2
10092344 Mohr et al. Oct 2018 B2
10123844 Nowlin et al. Nov 2018 B2
10188471 Brisson Jan 2019 B2
10201390 Swarup et al. Feb 2019 B2
10213202 Flanagan et al. Feb 2019 B2
10258416 Mintz et al. Apr 2019 B2
10278782 Jarc et al. May 2019 B2
10278783 Itkowitz et al. May 2019 B2
10282881 Itkowitz et al. May 2019 B2
10335242 Devengenzo et al. Jul 2019 B2
10405934 Prisco et al. Sep 2019 B2
10433922 Itkowitz et al. Oct 2019 B2
10464219 Robinson et al. Nov 2019 B2
10485621 Morrissette et al. Nov 2019 B2
10500004 Hanuschik et al. Dec 2019 B2
10500005 Weir et al. Dec 2019 B2
10500007 Richmond et al. Dec 2019 B2
10507066 DiMaio et al. Dec 2019 B2
10510267 Jarc et al. Dec 2019 B2
10524871 Liao Jan 2020 B2
10548459 Itkowitz et al. Feb 2020 B2
10575909 Robinson et al. Mar 2020 B2
10592529 Hoffman et al. Mar 2020 B2
10595944 Lattimore Mar 2020 B2
10595946 Nixon Mar 2020 B2
10881469 Robinson Jan 2021 B2
10881473 Itkowitz et al. Jan 2021 B2
10898188 Burbank Jan 2021 B2
10898189 McDonald, II Jan 2021 B2
10905506 Itkowitz et al. Feb 2021 B2
10912544 Brisson et al. Feb 2021 B2
10912619 Jarc et al. Feb 2021 B2
10918387 Duque et al. Feb 2021 B2
10918449 Solomon et al. Feb 2021 B2
10932873 Griffiths et al. Mar 2021 B2
10932877 Devengenzo et al. Mar 2021 B2
10939969 Swarup et al. Mar 2021 B2
10939973 DiMaio et al. Mar 2021 B2
10952801 Miller et al. Mar 2021 B2
10965933 Jarc Mar 2021 B2
10966742 Rosa et al. Apr 2021 B2
10973517 Wixey Apr 2021 B2
10973519 Weir et al. Apr 2021 B2
10984567 Itkowitz et al. Apr 2021 B2
10993773 Cooper et al. May 2021 B2
10993775 Cooper et al. May 2021 B2
11000331 Krom et al. May 2021 B2
11013567 Wu et al. May 2021 B2
11020138 Ragosta Jun 2021 B2
11020191 Diolaiti et al. Jun 2021 B2
11020193 Wixey et al. Jun 2021 B2
11026755 Weir et al. Jun 2021 B2
11026759 Donlon et al. Jun 2021 B2
11040189 Vaders et al. Jun 2021 B2
11045077 Stern et al. Jun 2021 B2
11045274 Dachs, II et al. Jun 2021 B2
11058501 Tokarchuk et al. Jul 2021 B2
11076925 DiMaio et al. Aug 2021 B2
11090119 Burbank Aug 2021 B2
11096687 Flanagan et al. Aug 2021 B2
11098803 Duque et al. Aug 2021 B2
11109925 Cooper et al. Sep 2021 B2
11116578 Hoffman et al. Sep 2021 B2
11129683 Steger et al. Sep 2021 B2
11135029 Suresh et al. Oct 2021 B2
11147552 Burbank et al. Oct 2021 B2
11147640 Jarc et al. Oct 2021 B2
11154373 Abbott et al. Oct 2021 B2
11154374 Hanuschik et al. Oct 2021 B2
11160622 Goldberg et al. Nov 2021 B2
11160625 Wixey et al. Nov 2021 B2
11161243 Rabindran et al. Nov 2021 B2
11166758 Mohr et al. Nov 2021 B2
11166770 DiMaio et al. Nov 2021 B2
11166773 Ragosta et al. Nov 2021 B2
11173597 Rabindran et al. Nov 2021 B2
11185378 Weir et al. Nov 2021 B2
11191596 Thompson et al. Dec 2021 B2
11197729 Thompson et al. Dec 2021 B2
11213360 Hourtash et al. Jan 2022 B2
11221863 Azizian et al. Jan 2022 B2
11234700 Ragosta et al. Feb 2022 B2
11241274 Vaders et al. Feb 2022 B2
11241290 Waterbury et al. Feb 2022 B2
11259870 DiMaio et al. Mar 2022 B2
11259884 Burbank Mar 2022 B2
11272993 Gomez et al. Mar 2022 B2
11272994 Saraliev et al. Mar 2022 B2
11291442 Wixey et al. Apr 2022 B2
11291513 Manzo et al. Apr 2022 B2
20060195226 Matsukawa et al. Aug 2006 A1
20070112461 Zini May 2007 A1
20070118248 Lee et al. May 2007 A1
20080275630 Regienczuk Nov 2008 A1
20080287924 Mangiardi Nov 2008 A1
20090117312 Hanelt et al. May 2009 A1
20120101508 Wook Choi et al. Apr 2012 A1
20140316570 Sun et al. Oct 2014 A1
20150257837 Itkowitz et al. Sep 2015 A1
20160346930 Hares Dec 2016 A1
20170071693 Taylor et al. Mar 2017 A1
20170123421 Kentley et al. May 2017 A1
20170251990 Kheradpir et al. Sep 2017 A1
20180042682 Iceman et al. Feb 2018 A1
20180344421 Cagle Dec 2018 A1
20190041854 Millhouse Feb 2019 A1
20190069962 Tabandeh et al. Mar 2019 A1
Foreign Referenced Citations (9)
Number Date Country
107072723 Aug 2017 CN
107072726 Aug 2017 CN
107787266 Mar 2018 CN
3212109 Sep 2017 EP
101613376 Apr 2016 KR
2016193686 Dec 2016 WO
2017147596 Aug 2017 WO
2017220822 Dec 2017 WO
2019203999 Oct 2019 WO
Non-Patent Literature Citations (8)
Entry
International Search Report and Written Opinion dated Jul. 18, 2019, issued in corresponding Int'l Appln. No. PCT/US2019/025108.
Indian Office Action dated Jan. 17, 2022, issued in corresponding IN Appln. No. 202017040919.
Partial Supplementary European Search Report dated Dec. 15, 2021, issued in corresponding EP Appln. No. 19788524.7.
Japanese Office Action dated Sep. 7, 2021, issued in corresponding JP Appln. No. 2020-555060.
Japanese Notice of Allowance dated May 2, 2022, issued in corresponding JP Appln. No. 2020-555060.
Extended European Search Report dated Mar. 22, 2022, issued in corresponding EP Appln. No. 19788524.7.
Chinese Office Action dated Feb. 12, 2023, issued in corresponding CN Appln. No. 201980025810.3, together with English language translation (16 pages).
European Examination Report dated Oct. 19, 2023, issued in corresponding EP Appln. No. 19788524.7.
Related Publications (1)
Number Date Country
20210153958 A1 May 2021 US
Provisional Applications (1)
Number Date Country
62660476 Apr 2018 US