Robotic surgical system with an embedded imager

Information

  • Patent Grant
  • Patent Number
    11,553,984
  • Date Filed
    Friday, June 2, 2017
  • Date Issued
    Tuesday, January 17, 2023
Abstract
The present disclosure is directed to a robotic surgical system and a corresponding method. The system includes at least one robot arm and a radiation source coupled to the robot arm. The system also includes a surgical table having a digital imaging receiver configured to output an electrical signal based on radiation received from the radiation source. A controller having a processor and a memory is configured to receive the electrical signal and generate an initial image of a patient on the surgical table based on the electrical signal. The controller transforms the initial image to a transformed image based on an orientation of the radiation source.
Description
BACKGROUND

Robotic surgical systems such as teleoperative systems are used to perform minimally invasive surgical procedures that offer many benefits over traditional open surgery techniques, including less pain, shorter hospital stays, quicker return to normal activities, minimal scarring, reduced recovery time, and less injury to tissue.


Robotic surgical systems can have a number of robotic arms that move attached instruments or tools, such as an image capturing device, a grasper, a stapler, an electrosurgical instrument, etc., in response to movement of input devices by a surgeon viewing images captured by the image capturing device of a surgical site. During a surgical procedure, each of the tools may be inserted through an opening, e.g., a laparoscopic port, into the patient and positioned to manipulate tissue at a surgical site. The openings are placed about the patient's body so that the surgical instruments may be used to cooperatively perform the surgical procedure and the image capturing device may view the surgical site.


During the surgical procedure, radiographic imaging may be required to assess the status of a patient's internal anatomical structures as well as the location of any surgical tools located therein. Radiographic imaging is typically performed either by a C-arm style fluoroscope that is brought into the operating room or by a dedicated fluoroscope installed within the operating room. The robotic surgical system may have to be disconnected from the patient and moved out of the way in order to position the fluoroscope around the patient to obtain the radiographic images. Disconnecting, moving, and reconnecting the robotic surgical system delays the surgical procedure. As a result of this delay, radiographic images may not be fully utilized as a safety monitoring step or to monitor progress of the surgical procedure.


Accordingly, there is a need for obtaining radiographic images without moving the robotic surgical system.


SUMMARY

In an aspect of the present disclosure, a robotic surgical system is provided. The robotic surgical system includes at least one robot arm and a radiation source removably coupled to the robot arm. The system also includes a surgical table having a digital imaging receiver configured to output an electrical signal based on radiation received from the radiation source. A controller having a processor and a memory is configured to receive the electrical signal and generate an initial image of a patient on the surgical table based on the electrical signal. The controller transforms the initial image to a transformed image based on an orientation of the radiation source relative to the digital imaging receiver.


In embodiments, the controller determines a pose of the radiation source relative to the digital imaging receiver. The pose may include an angle between an imaging axis defined by the radiation source and an axis extending perpendicular to a plane defined by the digital imaging receiver. The pose may include a position of the radiation source relative to the digital imaging receiver.


The controller may transform the initial image to the transformed image based on the angle.


In some embodiments, the initial image may be an angled view (e.g., non-perpendicular) of the patient along an imaging axis of the radiation source.


In some embodiments, the controller may execute a movement plan to generate a 3D reconstruction of a patient. The movement plan may cause the controller to move the radiation source a plurality of times. The controller may generate a plurality of initial images, wherein each initial image corresponds to each time the radiation source is moved. The plurality of initial images may be transformed into a plurality of slices that are used to generate the 3D reconstruction.


In another aspect of the present disclosure, a method for imaging a patient using a robotic surgical system is provided. The method includes emitting radiation from a radiation source, receiving radiation from the radiation source using a digital imaging receiver included in a surgical table, and converting the received radiation into an electrical signal. The electrical signal is converted into an initial image, which is transformed into a transformed image based on a pose of the radiation source relative to the digital imaging receiver.


In some embodiments, the method also includes determining the pose of the radiation source relative to the digital imaging receiver from a position of the radiation source and an angle between an imaging axis defined by the radiation source and a line perpendicular to a plane defined by the digital imaging receiver. The initial image is transformed into the transformed image based on the pose.


In some embodiments, the initial image may be an angled view of the patient.


In some embodiments, a movement plan is executed to generate a 3D reconstruction of the patient. The movement plan may cause the radiation source to move a plurality of times. A plurality of initial images may be generated, wherein each initial image corresponds to each time the radiation source is moved. The plurality of initial images may be transformed into a plurality of slices that are used to generate the 3D reconstruction.





BRIEF DESCRIPTION OF THE DRAWINGS

Various aspects of the present disclosure are described hereinbelow with reference to the drawings, which are incorporated in and constitute a part of this specification, wherein:



FIG. 1 is a schematic illustration of a user interface and a robotic system of a robotic surgical system in accordance with the present disclosure;



FIG. 2 is a schematic illustration of an imaging system in accordance with the present disclosure;



FIG. 3 is a flowchart depicting operation of the imaging system of the present disclosure; and



FIG. 4 is a flowchart depicting another operation of the imaging system of the present disclosure.





DETAILED DESCRIPTION

Fluoroscopic images may be obtained without the use of traditional C-arm style fluoroscopes. The present disclosure is directed to systems and methods for obtaining fluoroscopic images using a robotic surgical system. In the systems described herein, a radiation source is incorporated into an end effector attached to an arm of the robotic surgical system. A receiver can be incorporated into the surgical table or placed on the surgical table. The receiver is configured to receive x-rays emitted by the radiation source. The received x-rays are converted into an image with an angled perspective, which is thereafter transformed into an image with a perpendicular perspective relative to the patient.


By placing the radiation source as an end effector on the robotic arm and the receiver on the operating table, a radiograph may be produced without having to move the surgical robot out of the way of a separate imaging device, e.g., a C-arm style fluoroscope or dedicated fluoroscope. By knowing the angle of the orientation of the end effector relative to a plane defined by the surface of the operating table, the obtained images may be corrected to produce an appropriate image, i.e., a perpendicular perspective of the patient.


Turning to FIG. 1, a robotic surgical system 100 may be employed with one or more consoles 102 that are located next to the operating theater or in a remote location. In this instance, one team of clinicians or nurses may prep the patient for surgery and configure the robotic surgical system 100 with one or more end effectors 104 while another clinician (or group of clinicians) remotely controls the instruments via the robotic surgical system 100. As can be appreciated, a highly skilled clinician may perform multiple operations in multiple locations without leaving his/her remote console, which can be both economically advantageous and a benefit to the patient or a series of patients.


The robotic arms 106 of the surgical system 100 are typically coupled to a pair of master handles 108 by a controller 110. Controller 110 may be integrated with the console 102 or provided as a standalone device within the operating theater. The master handles 108 can be moved by the clinician to produce a corresponding movement of the working ends of any type of end effector 104 (e.g., probes, mechanical or electrosurgical end effectors, graspers, knives, scissors, staplers, etc.) attached to the robotic arms 106. For example, end effector 104 may be a probe that includes an image capture device.


The console 102 includes a display device 112 which is configured to display two-dimensional or three-dimensional images. The display device 112 displays images of the surgical site which may include data captured by end effector 104 positioned on the ends 114 of the arms 106 and/or data captured by imaging devices positioned about the surgical theater (e.g., an imaging device positioned within the surgical site, an imaging device positioned adjacent the patient, or an imaging device positioned at a distal end of an imaging arm). The imaging devices may capture visual images, infrared images, ultrasound images, X-ray images, thermal images, and/or any other known real-time images of the surgical site. The imaging devices transmit captured imaging data to the controller 110, which creates the images of the surgical site in real-time from the imaging data and transmits the images to the display device 112 for display.


The movement of the master handles 108 may be scaled so that the working ends have a corresponding movement that is different, smaller or larger, than the movement performed by the operating hands of the clinician. The scale factor or gearing ratio may be adjustable so that the operator can control the resolution of the working ends of the end effector 104.


During operation of the surgical system 100, the master handles 108 are operated by a clinician to produce a corresponding movement of the robotic arms 106 and/or end effector 104. The master handles 108 provide a signal to the controller 110 which then provides a corresponding signal to one or more drive motors 114. The one or more drive motors 114 are coupled to the robotic arms 106 in order to move the robotic arms 106 and/or end effector 104.


The master handles 108 may include various haptics 116 to provide feedback to the clinician relating to various tissue parameters or conditions, e.g., tissue resistance due to manipulation, cutting or otherwise treating, pressure by the instrument onto the tissue, tissue temperature, tissue impedance, etc. As can be appreciated, such haptics 116 provide the clinician with enhanced tactile feedback simulating actual operating conditions. The haptics 116 may include vibratory motors, electroactive polymers, piezoelectric devices, electrostatic devices, subsonic audio wave surface actuation devices, reverse-electrovibration, or any other device capable of providing a tactile feedback to a user. The master handles 108 may also include a variety of different actuators (not shown) for delicate tissue manipulation or treatment further enhancing the clinician's ability to mimic actual operating conditions.


The controller 110 includes a transceiver 118 and a processor 120. Transceiver 118 receives a signal from a radiation source 122 disposed on end effector 104 indicating a position and orientation of the radiation source 122 and a signal from a digital imaging receiver (DIR) 124 disposed on or in the operating table 126, which will be described in more detail below. In some embodiments, the processor 120 may determine the position and orientation of the radiation source 122. The signals from radiation source 122 and/or DIR 124 may be transmitted to transceiver 118 via any conventional wired or wireless methods. Transceiver 118 provides the signal to an image generating unit 128 in processor 120 which generates an image based on the position and orientation of the radiation source 122 and the DIR 124. A memory 130 may store an algorithm used to perform the image generation.
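

The data flow just described lends itself to a short illustration. The following Python sketch shows one way the image generating unit 128 might pair a pose report from the radiation source 122 with the next readout from the DIR 124; the class and method names are illustrative assumptions, not taken from the disclosure.

    from dataclasses import dataclass
    from typing import Optional

    import numpy as np

    @dataclass
    class SourcePose:
        # Pose report for radiation source 122 (illustrative field names).
        position: np.ndarray      # focal-spot location (x, y, z) in the table frame
        imaging_axis: np.ndarray  # unit vector along the imaging axis "R"

    class ImageGeneratingUnit:
        # Minimal stand-in for unit 128: tags each detector readout with the
        # most recent pose so the downstream transform knows the geometry.
        def __init__(self) -> None:
            self.latest_pose: Optional[SourcePose] = None

        def on_pose(self, pose: SourcePose) -> None:
            self.latest_pose = pose  # delivered via transceiver 118

        def on_detector_signal(self, signal: np.ndarray):
            if self.latest_pose is None:
                raise RuntimeError("exposure received before any pose report")
            return signal, self.latest_pose  # consumed by the image transform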


Turning to FIG. 2, while making reference to FIG. 1, a clinician may attach an end effector 104 including a radiation source 122 to one of the robotic arms 106. The radiation source 122 emits x-rays toward the patient. As the x-rays pass through the patient, they are attenuated by different amounts, with some energy absorbed by, or deflected off of, the various tissues in the patient. The attenuated x-rays are received by the DIR 124, which converts the received x-rays into electrical signals that are provided to the image generating unit 128. The image generating unit 128 converts the electrical signals into an initial image which represents an angled view of the patient.
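

For illustration only: the conversion from detector signal to radiograph is commonly modeled with Beer-Lambert attenuation, I = I0 exp(-integral of mu dl). The Python sketch below assumes a flat-field (unattenuated) reference exposure is available for normalization; neither the function name nor the normalization step comes from the disclosure.

    import numpy as np

    def detector_to_image(signal: np.ndarray, flat_field: np.ndarray) -> np.ndarray:
        # Divide by the unattenuated exposure and take -log to recover the
        # attenuation line integrals that form the initial radiograph.
        eps = 1e-6
        transmission = np.clip(signal / (flat_field + eps), eps, 1.0)
        return -np.log(transmission)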


Once the initial image is generated, the image generating unit 128 uses a pose, including the position and the orientation of the radiation source 122, to perform an affine transformation in which the initial image is transformed into a transformed image that is displayed on display 112. It will be appreciated that the initial image may be a skewed image that is at least keystone-corrected into the transformed image, which presents a view perpendicular to the imaging axis “R”.
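

As a rough illustration of such a keystone correction, the sketch below assumes a simple one-axis tilt model in which the image edge farther from the source is foreshortened by cos(a); a full implementation would derive the homography from the complete pose. OpenCV is used only as a convenient stand-in and is not mandated by the disclosure.

    import numpy as np
    import cv2

    def keystone_correct(initial_image: np.ndarray, angle_deg: float) -> np.ndarray:
        # Map the trapezoid occupied by the skewed content back to a rectangle.
        h, w = initial_image.shape[:2]
        shrink = np.cos(np.deg2rad(angle_deg))  # foreshortening of the far edge
        inset = w * (1.0 - shrink) / 2.0
        src = np.float32([[inset, 0], [w - inset, 0], [w, h], [0, h]])
        dst = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
        homography = cv2.getPerspectiveTransform(src, dst)
        return cv2.warpPerspective(initial_image, homography, (w, h))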


It is contemplated that the transformed image could be a view perpendicular to the patient such that the displayed view is a top plan view perpendicular to a longitudinal axis of the patient. For example, as shown in FIG. 2, the orientation of the radiation source is represented by the angle “a”. Angle “a” is the angle between an imaging axis “R” extending through the radiation source 122 and a line “P” perpendicular to the plane “X” of the operating table 126. Based on the angle “a”, the image generating unit 128 could transform the initial image into the displayed top plan image.
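

Given the imaging axis “R” and the table normal “P” expressed in a common coordinate frame, angle “a” follows directly from the dot product. A minimal sketch:

    import numpy as np

    def tilt_angle_deg(imaging_axis: np.ndarray, table_normal: np.ndarray) -> float:
        # Angle "a" between imaging axis "R" and the line "P" normal to plane "X".
        r = imaging_axis / np.linalg.norm(imaging_axis)
        p = table_normal / np.linalg.norm(table_normal)
        cos_a = np.clip(abs(np.dot(r, p)), 0.0, 1.0)  # sign-insensitive
        return float(np.degrees(np.arccos(cos_a)))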



FIG. 3 is a flow chart depicting operation of the imaging system in accordance with embodiments of the present disclosure. FIG. 3 will be described below in conjunction with FIGS. 1 and 2. As shown in FIG. 3, the radiation source 122 is moved into position in step s302. Once in position, the processor 120 determines or receives the pose of the radiation source 122, including the position and the orientation of the radiation source 122, to determine the angle “a” in step s304. In step s306, the processor 120 determines whether angle “a” is less than a predetermined angle “p”. If angle “a” is not less than predetermined angle “p”, the transformation may produce a poor image or no usable image at all; for example, if the angle between the imaging axis “R” and the perpendicular line “P” is large, the keystone transformation may create a crude image as a result of large pixilation at one end of the image. In that case, the process returns to step s302, where the radiation source is repositioned. If the angle “a” is less than the angle “p”, the process proceeds to step s308, where the initial image is generated based on the x-rays received by the DIR 124. The initial image is then transformed in step s310 by the image generating unit 128 to generate a transformed image. The transformed image is then displayed on display 112 in step s312. In step s314, a user may decide to end the procedure or continue the procedure. If the procedure is continued, the user may then determine whether the radiation source 122 needs to be moved in step s316. If the radiation source does not need to be moved, the process returns to step s308. If the radiation source needs to be moved, the process returns to step s302.
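

The FIG. 3 flow can be restated as a control loop. In the Python sketch below, robot, source, dir_panel, display, and ui are hypothetical stand-ins for the corresponding system components, and the helpers tilt_angle_deg and keystone_correct are the sketches given above.

    def imaging_loop(robot, source, dir_panel, display, ui, max_angle_deg: float) -> None:
        while True:
            robot.position_source()                                        # s302
            a = tilt_angle_deg(source.imaging_axis(), dir_panel.normal())  # s304
            if a >= max_angle_deg:     # s306: too oblique for a clean transform,
                continue               # so reposition the source (back to s302)
            while True:
                raw = dir_panel.acquire()                                  # s308
                display.show(keystone_correct(raw, a))                     # s310, s312
                if ui.end_requested():                                     # s314
                    return
                if ui.move_requested():                                    # s316
                    break                                                  # back to s302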


In some embodiments, the radiation source 122 may be moved according to a movement plan stored in the memory 130. FIG. 4, which will be discussed in conjunction with FIGS. 1 and 2, depicts a method for generating a 3D reconstruction of a patient. As shown in FIG. 4, a movement plan is executed by controller 110 (i.e., the movement plan is loaded from memory 130 to be executed by processor 120) in step s402. In step s404, the controller 110 generates an initial image by controlling the radiation source 122 to emit x-rays and receiving an electrical signal from DIR 124 based on x-rays received from the radiation source 122. The controller then transforms the initial image to an image slice based on the orientation of the radiation source 122 in step s406. For example, if the imaging axis “R” is perpendicular to the centerline of the patient, the image slice is an image of the patient parallel to a transverse plane of the patient. However, it is contemplated that the imaging axis “R” can be disposed at any angle relative to the patient. In step s408, the controller 110 determines whether the movement plan is completed. If the movement plan is not completed, the controller 110 moves the radiation source 122 in step s410 and then proceeds to step s404. If the movement plan is completed, the controller 110 uses the plurality of generated slices to generate a 3D reconstruction of the patient in step s412.
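

The FIG. 4 flow reduces to a loop over the stored movement plan. In this sketch, controller and movement_plan are hypothetical stand-ins, and the slices are assumed to share a common shape so they can be stacked into a volume:

    import numpy as np

    def reconstruct_3d(controller, movement_plan) -> np.ndarray:
        slices = []
        for pose in movement_plan:                 # s402, s408, s410: step through plan
            controller.move_source(pose)
            raw = controller.acquire()             # s404: emit x-rays, read DIR 124
            slices.append(controller.to_slice(raw, pose))  # s406: transform to slice
        return np.stack(slices, axis=0)            # s412: assemble 3D reconstruction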


The embodiments disclosed herein are examples of the disclosure and may be embodied in various forms. Specific structural and functional details disclosed herein are not to be interpreted as limiting, but as a basis for the claims and as a representative basis for teaching one skilled in the art to variously employ the present disclosure in virtually any appropriately detailed structure. Like reference numerals may refer to similar or identical elements throughout the description of the figures.


The phrases “in an embodiment,” “in embodiments,” “in some embodiments,” or “in other embodiments” may each refer to one or more of the same or different embodiments in accordance with the present disclosure. A phrase in the form “A or B” means “(A), (B), or (A and B)”. A phrase in the form “at least one of A, B, or C” means “(A), (B), (C), (A and B), (A and C), (B and C), or (A, B and C)”. A user may refer to a surgeon or any medical professional, such as a doctor, nurse, technician, medical assistant, or the like, performing a medical procedure.


The systems described herein may also utilize one or more controllers to receive various information and transform the received information to generate an output. The controller may include any type of computing device, computational circuit, or any type of processor or processing circuit capable of executing a series of instructions that are stored in a memory. The controller may include multiple processors and/or multicore central processing units (CPUs) and may include any type of processor, such as a microprocessor, digital signal processor, microcontroller, or the like. The controller may also include a memory to store data and/or algorithms to perform a series of instructions.


Any of the herein described methods, programs, algorithms, or codes may be converted to, or expressed in, a programming language or computer program. The terms “programming language” and “computer program” include any language used to specify instructions to a computer, and include (but are not limited to) these languages and their derivatives: Assembler, Basic, Batch files, BCPL, C, C+, C++, Delphi, Fortran, Java, JavaScript, machine code, operating system command languages, Pascal, Perl, PL1, scripting languages, Visual Basic, metalanguages which themselves specify programs, and all first, second, third, fourth, and fifth generation computer languages. Also included are database and other data schemas, and any other metalanguages. No distinction is made between languages that are interpreted, compiled, or use both compiled and interpreted approaches, nor between compiled and source versions of a program. Thus, reference to a program, where the programming language could exist in more than one state (such as source, compiled, object, or linked), is a reference to any and all such states. Reference to a program may encompass the actual instructions and/or the intent of those instructions.


Any of the herein described methods, programs, algorithms, or codes may be contained on one or more machine-readable media or memory. The term “memory” may include a mechanism that provides (e.g., stores and/or transmits) information in a form readable by a machine such as a processor, computer, or other digital processing device. For example, a memory may include a read-only memory (ROM), random access memory (RAM), magnetic disk storage media, optical storage media, flash memory devices, or any other volatile or non-volatile memory storage device. Code or instructions contained thereon can be represented by carrier wave signals, infrared signals, digital signals, and other like signals.


It should be understood that the foregoing description is only illustrative of the present disclosure. Various alternatives and modifications can be devised by those skilled in the art without departing from the disclosure. For instance, any of the augmented images described herein can be combined into a single augmented image to be displayed to a clinician. Accordingly, the present disclosure is intended to embrace all such alternatives, modifications, and variances. The embodiments described with reference to the attached drawing figures are presented only to demonstrate certain examples of the disclosure. Other elements, steps, methods, and techniques that are insubstantially different from those described above and/or in the appended claims are also intended to be within the scope of the disclosure.

Claims
  • 1. A robotic surgical system comprising: at least one robot arm; a radiation source removably coupled to the robot arm; a surgical table having a digital imaging receiver configured to output an electrical signal based on radiation received from the radiation source; and a controller having a processor and a memory, the controller is configured to receive the electrical signal and generate an initial image of a patient on the surgical table based on the electrical signal, wherein the controller is configured to: transform the initial image to a transformed image based on an orientation of the radiation source relative to the digital imaging receiver; execute a movement plan to generate a 3D reconstruction of the patient, wherein the movement plan causes the controller to move the radiation source a plurality of times; generate a plurality of initial images, wherein each initial image corresponds to each time the radiation source is moved; and transform the plurality of initial images to a plurality of slices.
  • 2. The robotic surgical system of claim 1, wherein the controller determines a pose of the radiation source relative to the digital imaging receiver.
  • 3. The robotic surgical system of claim 2, wherein the pose includes an angle between an imaging axis defined by the radiation source and an axis extending perpendicular to a plane defined by the digital imaging receiver.
  • 4. The robotic surgical system of claim 3, wherein the pose includes a position of the radiation source relative to the digital imaging receiver.
  • 5. The robotic surgical system of claim 2, wherein the controller transforms the initial image to the transformed image based on the determined pose.
  • 6. The robotic surgical system of claim 1, wherein the initial image is an angled view of the patient along an imaging axis of the radiation source.
  • 7. The robotic surgical system of claim 1, wherein the 3D reconstruction is based on the plurality of slices.
  • 8. A robotic surgical system comprising: a radiation source supported on a robot arm; a surgical table having a digital imaging receiver configured to output an electrical signal based on radiation received from the radiation source; and a controller having a processor and a memory, the controller is configured to receive the electrical signal and generate an initial image of a patient on the surgical table based on the electrical signal, wherein the controller is configured to: transform the initial image to a transformed image based on an orientation of the radiation source relative to the digital imaging receiver; execute a movement plan to generate a 3D reconstruction of the patient, wherein the movement plan causes the controller to move the robot arm to reposition the radiation source a plurality of times; generate a plurality of initial images, wherein each initial image corresponds to each time the radiation source is moved; and transform the plurality of initial images to a plurality of slices.
  • 9. The robotic surgical system of claim 8, wherein the controller determines a pose of the radiation source relative to the digital imaging receiver.
  • 10. The robotic surgical system of claim 9, wherein the pose includes an angle between an imaging axis defined by the radiation source and an axis extending perpendicular to a plane defined by the digital imaging receiver.
  • 11. The robotic surgical system of claim 10, wherein the pose includes a position of the radiation source relative to the digital imaging receiver.
  • 12. The robotic surgical system of claim 9, wherein the controller transforms the initial image to the transformed image based on the determined pose.
  • 13. The robotic surgical system of claim 8, wherein the initial image is an angled view of the patient along an imaging axis of the radiation source.
  • 14. The robotic surgical system of claim 8, wherein the 3D reconstruction is based on the plurality of slices.
  • 15. The robotic surgical system of claim 8, further comprising a repositionable robot arm supporting the radiation source.
  • 16. A robotic surgical system comprising: a robot arm configured for selective repositioning; a radiation source coupled to the robot arm; a surgical table having a digital imaging receiver configured to output an electrical signal based on radiation received from the radiation source; and a controller having a processor and a memory, the controller is configured to receive the electrical signal and generate an initial image of a patient disposed on the surgical table based on the electrical signal, wherein the controller is configured to: transform the initial image to a transformed image based on an orientation of the radiation source relative to the digital imaging receiver; execute a movement plan to generate a 3D reconstruction of the patient, wherein the movement plan causes the controller to reposition the robot arm to move the radiation source a plurality of times; generate a plurality of initial images, wherein each initial image corresponds to each time the robot arm is repositioned to move the radiation source; and transform the plurality of initial images to a plurality of slices.
  • 17. The robotic surgical system of claim 16, wherein the controller determines a pose of the radiation source relative to the digital imaging receiver.
  • 18. The robotic surgical system of claim 17, wherein the pose includes an angle between an imaging axis defined by the radiation source and an axis extending perpendicular to a plane defined by the digital imaging receiver.
  • 19. The robotic surgical system of claim 18, wherein the pose includes a position of the radiation source relative to the digital imaging receiver.
  • 20. The robotic surgical system of claim 17, wherein the controller transforms the initial image to the transformed image based on the determined pose.
  • 21. The robotic surgical system of claim 16, wherein the initial image is an angled view of the patient along an imaging axis of the radiation source.
  • 22. The robotic surgical system of claim 16, wherein the 3D reconstruction is based on the plurality of slices.
  • 23. The robotic surgical system of claim 16, wherein the radiation source is selectively connected to the robot arm.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a U.S. National Stage Application filed under 35 U.S.C. § 371(a) of International Patent Application Serial No. PCT/US2017/035582, filed Jun. 2, 2017, which claims the benefit of and priority to U.S. Provisional Patent Application Ser. No. 62/345,168, filed Jun. 3, 2016, the entire disclosure of which is incorporated by reference herein.

PCT Information
Filing Document: PCT/US2017/035582, filed 6/2/2017 (WO)
Publishing Document: WO2017/210500, published 12/7/2017 (WO, Kind A)
Related Publications (1)
US 2020/0323608 A1, Oct 2020, US
Provisional Applications (1)
62/345,168, Jun 2016, US