Surgical robotic systems and methods of tracking usage of surgical instruments thereof

Information

  • Patent Grant
  • Patent Number
    12,178,528
  • Date Filed
    Monday, September 9, 2019
  • Date Issued
    Tuesday, December 31, 2024
Abstract
A method of tracking usage of a robotic surgical instrument includes capturing an image of a surgical instrument with an imager during a robotic surgical procedure, identifying a type of the surgical instrument based on the image of the surgical instrument, determining a degree of usage of the surgical instrument based on data acquired by at least one sensor, and determining a stage in a life cycle of the surgical instrument based on the type of surgical instrument identified and the degree of usage determined.
Description
BACKGROUND

Robotic surgical systems have been used in minimally invasive medical procedures. Some robotic surgical systems include a console supporting a surgical robotic arm and a surgical instrument or at least one end effector (for example, forceps or a grasping tool) mounted to the robotic arm. The robotic arm provides mechanical power to the surgical instrument for its operation and movement. Each robotic arm may include an instrument drive unit that operatively supports the surgical instrument.


Typically, the surgical instruments operated by a robotic surgical system have a limited number of uses. Determining when the useful life of a surgical instrument has expired is important for safety and surgical effectiveness. Accordingly, a need exists for a way to accurately determine when a surgical instrument should be decommissioned.


SUMMARY

In accordance with an aspect of the present disclosure, a method of tracking usage of a robotic surgical instrument includes capturing an image of a surgical instrument with an imager during a robotic surgical procedure, identifying a type of the surgical instrument based on the image of the surgical instrument, determining a degree of usage of the surgical instrument based on data acquired by at least one sensor, and determining a stage in a life cycle of the surgical instrument based on the type of surgical instrument identified and the degree of usage determined.


Some methods may further include determining if the surgical instrument is performing a surgical task based on the image of the surgical instrument.


In some aspects, determining if the surgical instrument is performing a surgical task includes correlating the image of the surgical instrument with the data acquired by the at least one sensor of the surgical instrument.


In aspects, the surgical task may include the surgical instrument acting on tissue.


In other aspects, the degree of usage of the surgical instrument may be determined only when the surgical instrument is acting on tissue.


Some methods may further include assigning a value to the surgical task performed by the surgical instrument corresponding to the degree of usage of the surgical instrument.


In some aspects, the value assigned to the surgical task performed by the surgical instrument may be selected based on an amount of force applied to tissue by the surgical instrument during the surgical task.


Some methods may further include determining a duration of time the surgical task is performed by the surgical instrument at the assigned value.


Other methods may further include displaying on a display the stage in the life cycle of the surgical instrument.


In another aspect of the present disclosure, a robotic surgical system is provided and includes a robotic arm, a surgical instrument configured to be coupled to and operated by the robotic arm, an imager configured to capture an image of the surgical instrument during a surgical procedure, and a control device in communication with the imager. The control device is configured to identify a type of the surgical instrument based on the image of the surgical instrument captured by the imager, determine a degree of usage of the surgical instrument based on data acquired by at least one sensor associated with the surgical instrument, and determine a stage in a life cycle of the surgical instrument based on the type of the surgical instrument identified and the degree of usage determined.


In aspects, the control device may be further configured to determine when the surgical instrument is performing a surgical task based on the image of the surgical instrument.


In other aspects, the surgical task may include the surgical instrument acting on tissue.


In further aspects, the control device may be configured to determine the degree of usage of the surgical instrument only when the surgical instrument is acting on tissue.


In some aspects, the control device may be further configured to assign a value to the surgical task performed by the surgical instrument corresponding to the degree of usage of the surgical instrument.


In aspects, the value assigned to the surgical task performed by the surgical instrument may be selected by the control device based on an amount of force applied to tissue by the surgical instrument during the surgical task.


In other aspects, the control device may be further configured to determine a duration of time the surgical task is performed by the surgical instrument at the assigned value.


In further aspects, the control device may be further configured to display on a display the stage in the life cycle of the surgical instrument.


In some aspects, the imager may be a camera or an imaging modality.


In aspects, the surgical instrument may be a surgical stapler.


Further details and aspects of exemplary embodiments of the present disclosure are described in more detail below with reference to the appended figures.


As used herein, the terms "parallel" and "perpendicular" are understood to include relative configurations that are substantially parallel and substantially perpendicular up to about ±10 degrees from true parallel and true perpendicular.





BRIEF DESCRIPTION OF THE DRAWINGS

Embodiments of the present disclosure are described herein with reference to the accompanying drawings, wherein:



FIG. 1 is a schematic illustration of a robotic surgical system in accordance with the present disclosure;



FIG. 2A is a perspective view of the robotic surgical system of FIG. 1, including a robotic arm, an instrument drive unit coupled to an end of the robotic arm, and a surgical instrument coupled to the instrument drive unit;



FIG. 2B is an enlarged view of the surgical instrument of FIG. 2A and an endoscope of the surgical robotic system of FIG. 1; and



FIG. 3 is a flow chart illustrating a method of tracking usage of the surgical instrument of FIG. 2A.





DETAILED DESCRIPTION

Embodiments of the presently disclosed robotic surgical systems and methods of using such robotic surgical systems are described in detail with reference to the drawings, in which like reference numerals designate identical or corresponding elements in each of the several views. As used herein, the term "distal" refers to that portion of the robotic surgical system that is closer to the patient, while the term "proximal" refers to that portion of the robotic surgical system that is farther from the patient.


As will be described in detail below, a method of tracking usage of a surgical instrument of a surgical robotic system is provided. The method utilizes a camera of the robotic surgical system or a camera within an operating room to capture images of the surgical instrument in real-time during the surgical procedure. Based on the images captured by the camera, a control device of the surgical robotic system determines when the surgical instrument is actually being used to complete a surgical task (e.g., acting on tissue of a patient) as opposed to merely being moved in space without making contact with tissue of a patient. Sensors in the surgical instrument determine the forces applied by the surgical instrument on the tissue and send the determined forces to the control device. The control device then determines a degree of cumulative usage of the surgical instrument based on the amount of time the surgical instrument is experiencing the forces during each surgical task. If it is determined that the cumulative usage of the surgical instrument is beyond a predetermined usage limit, the control device may prevent further actuation of the surgical instrument. In some aspects, a display may provide a clinician with a visual indication of the degree of usage of the surgical instrument to allow a clinician to make the determination of whether to cease use of the surgical instrument.
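
By way of illustration only, the following Python sketch outlines how such a force- and time-weighted usage loop might be organized; the callables (capture_frame, read_force_sensors, is_acting_on_tissue) and the usage limit are hypothetical stand-ins for the imager 48, the sensors 45, and the predetermined limit described above, not part of this disclosure.

```python
# Illustrative sketch only, not the patented implementation. The callables
# and USAGE_LIMIT are hypothetical stand-ins for the imager 48, the
# sensors 45, and the predetermined usage limit of the instrument type.
import time

USAGE_LIMIT = 1000.0  # assumed predetermined limit for this instrument type


def track_usage(capture_frame, read_force_sensors, is_acting_on_tissue):
    """Accumulate usage only while the instrument acts on tissue."""
    cumulative_usage = 0.0
    last_t = time.monotonic()
    while cumulative_usage < USAGE_LIMIT:
        frame = capture_frame()        # image from the imager 48
        force = read_force_sensors()   # force data from the sensors 45
        now = time.monotonic()
        dt, last_t = now - last_t, now
        # Usage accrues only when the image and force data together
        # indicate a surgical task (acting on tissue), not free motion.
        if is_acting_on_tissue(frame, force):
            cumulative_usage += force * dt  # force-weighted time in use
    # Limit reached: the control device 4 may now prevent further actuation.
    return cumulative_usage
```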


Referring to FIGS. 1, 2A, and 2B, a surgical system, such as, for example, a surgical robotic system 1, generally includes a plurality of surgical robotic arms 2, 3; an instrument drive unit 20 and an electromechanical instrument 10 attached to an end of the robotic arm 2; a control device 4; and an operating console 5 coupled with the control device 4. The operating console 5 includes a display device 6, which is set up in particular to display three-dimensional images; and manual input devices 7, 8, by means of which a person (not shown), for example a surgeon, is able to telemanipulate robotic arms 2, 3 in a first operating mode, as known in principle to a person skilled in the art.


Each of the robotic arms 2, 3 may be composed of a plurality of members, which are connected through joints. Robotic arms 2, 3 may be driven by electric drives (not shown) that are connected to control device 4. Control device 4 (e.g., a computer) is set up to activate the drives, in particular by means of a computer program, in such a way that robotic arms 2, 3, the attached instrument drive units 20, and thus electromechanical instrument 10 execute a desired movement according to a movement defined by means of manual input devices 7, 8. Control device 4 may also be set up in such a way that it regulates the movement of the robotic arms 2, 3 and/or movement of the drives.


Surgical robotic system 1 is configured for use on a patient “P” lying on a surgical table “ST” to be treated in a minimally invasive manner by means of a surgical instrument, e.g., electromechanical instrument 10. Surgical robotic system 1 may also include more than two robotic arms 2, 3, the additional robotic arms likewise being connected to control device 4 and being telemanipulatable by means of operating console 5. A surgical instrument, for example, an electromechanical surgical instrument 10 including an electromechanical end effector 42, may also be attached to the additional robotic arm.


Control device 4 may control a plurality of motors, e.g., motors (Motor 1 . . . n), with each motor configured to drive movement of robotic arms 2, 3 in a plurality of directions. Further, control device 4 may control an imager 48 of the instrument drive unit 20 to drive movement and operation of the imager 48. The instrument drive unit 20 transfers power and actuation forces from its motors to driven members (not shown) of the electromechanical instrument 10 to ultimately drive movement of components of the end effector 42 of the electromechanical instrument 10, for example, a movement of a knife blade (not shown) and/or a closing and opening of jaw members of the end effector 42.


For a detailed description of the construction and operation of a robotic surgical system, reference may be made to U.S. Pat. No. 8,828,023, entitled “Medical Workstation,” the entire contents of which are incorporated by reference herein.


With specific reference to FIGS. 2A and 2B, the surgical instrument 10 generally has a proximal end portion 42a configured to be engaged with the instrument drive unit 20 and a distal end portion 42b having the end effector 42 extending therefrom. The surgical instrument 10 further includes an elongate body or shaft 44. The end effector 42 extends distally from the distal end portion 42b of the elongate body 44 and is configured for performing a plurality of surgical functions. The surgical instrument 10 further includes a machine-readable representation of data, such as, for example, a barcode 43, disposed thereon, and a plurality of sensors 45 for determining a plurality of conditions of the surgical instrument 10, such as, for example, a clamping force between jaws of the end effector 42, a force required to articulate the end effector 42, a force required to rotate the end effector 42, and/or a force required to actuate a function of the end effector 42 (e.g., a stapling function). The sensors 45 may be force sensors and/or position sensors; however, other types of sensors are also contemplated.
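
Purely as a sketch, the sensed conditions enumerated above might be grouped into a single record as follows; the field names and units are illustrative assumptions, not part of the disclosure.

```python
# Hypothetical grouping of the conditions reported by the sensors 45;
# field names and units are illustrative assumptions only.
from dataclasses import dataclass


@dataclass
class InstrumentSensorData:
    jaw_clamping_force_n: float   # clamping force between jaws of end effector 42
    articulation_force_n: float   # force required to articulate the end effector
    rotation_force_n: float       # force required to rotate the end effector
    actuation_force_n: float      # force to actuate a function (e.g., stapling)
```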


It is contemplated that the surgical instrument 10 may be any suitable surgical instrument for performing a surgical task, such as, for example, a surgical stapler, a surgical cutter, a surgical stapler-cutter, a linear surgical stapler, a linear surgical stapler-cutter, a circular surgical stapler, a circular surgical stapler-cutter, a surgical clip applier, a surgical clip ligator, a surgical clamping device, a vessel sealing device, a vessel expanding device, a lumen expanding device, a scalpel, a fluid delivery device, a monopolar or bipolar energy delivery device (e.g., an energized device that can apply energy (heat, RF, etc.) to cut or coagulate tissue), or any other suitable type of surgical instrument, each of which is configured for actuation and manipulation by the surgical robotic system 1.


The surgical robotic system 1 may further include an imager 48, such as, for example, a camera or an imaging modality, configured to capture an image of the surgical instrument 10. The imager 48 may be positioned at any suitable location of the surgical robotic system 1, such as an endoscope 50 (FIGS. 1 and 2B), the instrument drive unit 20, or any suitable location within an operating room. The imager 48 may be any suitable imaging apparatus configured for still or moving imaging including, but not limited to, digital devices, such as a charge-coupled device (CCD) camera, a complementary metal-oxide-semiconductor (CMOS) sensor, or an active-pixel sensor (APS), and analog devices, such as a vidicon tube. In embodiments, the imager 48 may also include any suitable lens or optical apparatus (e.g., an optical fiber) for transmitting light to the control device 4 (FIG. 1). The imager 48 may be in communication with the display device 6 (FIG. 1) for displaying the images captured thereby.


With reference to FIG. 3, a method of tracking usage of the surgical instrument 10 using the surgical robotic system 1 will now be described. Each surgical instrument 10 may have a predetermined or pre-set life cycle. To determine the stage in the pre-set life cycle of the surgical instrument 10, the following method may be employed. In step 200, the control device 4 (FIG. 1) is configured to direct the imager 48 (FIGS. 2A and 2B) to capture an image or images of the surgical instrument 10 during a robotic surgical procedure. For example, the imager 48 may capture an image of the barcode 43, such that in step 202, the control device 4 may determine the type and identity of the surgical instrument 10 based on the image of the barcode 43 of the surgical instrument 10 captured by the imager 48. In some methods, the processor may have image data of a variety of types of surgical instruments stored in a memory thereof and may match the image taken of the surgical instrument 10 with an image stored in the memory to identify the type of the surgical instrument 10. It is contemplated that the control device 4 has a processor (not shown) capable of executing a series of instructions, algorithms, or protocols that are stored in a memory (e.g., a storage device and/or external device (not shown)) of the control device 4 for identifying the surgical instrument 10 based on the captured images thereof.
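
As a hedged illustration of steps 200 and 202, the sketch below decodes the barcode 43 from a captured frame and looks the payload up in a catalog of instrument types; the open-source pyzbar library and the catalog contents are stand-ins for illustration, not the disclosed implementation.

```python
# Sketch of steps 200-202. pyzbar is a generic open-source decoder used
# here as a stand-in; the catalog mapping payloads to types is hypothetical.
from PIL import Image
from pyzbar.pyzbar import decode

INSTRUMENT_CATALOG = {            # assumed payload -> instrument type mapping
    b"VS-001": "vessel sealing device",
    b"LS-045": "linear surgical stapler-cutter",
}


def identify_instrument(frame_path):
    """Return the instrument type encoded in the first recognized barcode."""
    for symbol in decode(Image.open(frame_path)):
        instrument_type = INSTRUMENT_CATALOG.get(symbol.data)
        if instrument_type is not None:
            return instrument_type
    # Fall back to matching the frame against stored instrument images.
    return None
```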


In step 204, the processor determines if and when the surgical instrument 10 is performing a surgical task based on the image of the surgical instrument 10 and/or based on forces sensed by the sensors 45 of the surgical instrument 10 and/or other sensors of the surgical robotic system 1. For example, if the image/video shows the surgical instrument 10 acting on tissue, such as closing the end effector 42 about the tissue, the processor determines that the surgical instrument 10 is performing a surgical task. In some methods, the processor may correlate the image/video with the forces sensed by the sensors 45 of the surgical instrument 10 or other sensors of the surgical robotic system 1 to determine/confirm that the surgical instrument 10 is performing a surgical task. In step 206, if it is determined that the surgical instrument 10 is performing a surgical task, the processor determines a degree of usage of the surgical instrument 10 based on data acquired by one or more of the sensors 45 of the surgical instrument 10 and/or other sensors of the surgical robotic system 1. In other embodiments, the degree of usage may be determined based on data acquired by the imager 48.
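
A minimal sketch of step 204, assuming a vision cue and a force threshold (both hypothetical), might correlate the two signals as follows.

```python
# Sketch of step 204: a surgical task is declared only when the vision
# cue and the sensed force agree. The threshold is an assumed value.
FORCE_THRESHOLD_N = 0.5  # assumed minimum clamping force indicating tissue contact


def is_performing_task(jaws_closed_on_tissue: bool, clamping_force_n: float) -> bool:
    """Correlate the image cue with the force data before counting usage."""
    return jaws_closed_on_tissue and clamping_force_n >= FORCE_THRESHOLD_N
```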


In step 208, the processor assigns a value to the surgical task performed by the surgical instrument 10 based on the amount of force applied to the tissue by the surgical instrument 10 during the surgical task. For example, the higher the force applied to the tissue by the surgical instrument 10, the higher the value assigned. In step 210, the processor determines a duration of time the surgical task is performed by the surgical instrument 10 at the value assigned to that surgical task. As such, the processor determines the degree of usage of the surgical instrument 10 based on how much time the surgical instrument 10 is being used at each discrete usage level. For example, in the case of an energy delivering surgical instrument 10 used to cut or coagulate tissue, the degree (amplitude) and duration of the energy delivered to tissue may be factors in calculating the degree of usage and/or life remaining for the energy delivering surgical instrument 10.
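
One way to sketch steps 208 and 210 is to band the applied force into discrete values and weight each band by duration; the bands below are illustrative assumptions, not values taken from the disclosure.

```python
# Sketch of steps 208-210; the force bands and values are illustrative
# assumptions, not values taken from the disclosure.
def task_value(force_n: float) -> int:
    """Step 208: higher force applied to tissue -> higher assigned value."""
    if force_n < 2.0:
        return 1   # light grasping
    if force_n < 10.0:
        return 2   # firm clamping
    return 3       # heavy clamping / stapling


def usage_increment(force_n: float, duration_s: float) -> float:
    """Step 210: degree of usage contributed by one surgical task."""
    return task_value(force_n) * duration_s
```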


Accordingly, if during a first surgical task the surgical instrument 10 applies a relatively low force to tissue for a time (x), and during a second surgical task the surgical instrument 10 applies a relatively high force to tissue for the same time (x), the degree of usage assigned to the second surgical task will be higher, notwithstanding that the durations of the two surgical tasks are the same.
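
Using the hypothetical bands from the sketch above, this comparison works out as follows.

```python
# Equal durations, different forces, using the hypothetical bands above.
low = usage_increment(force_n=1.0, duration_s=30.0)    # 1 * 30 = 30.0
high = usage_increment(force_n=12.0, duration_s=30.0)  # 3 * 30 = 90.0
assert high > low  # the higher-force task consumes more of the life cycle
```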


In step 212, the stage in a pre-set or predetermined life cycle of the surgical instrument 10 is determined based on the type of surgical instrument 10 identified and the degree of usage determined. The robotic surgical system 1 may display on a display device 6 (FIG. 1) the stage in the life cycle of the surgical instrument 10. It is contemplated that the stage in the life cycle may be displayed as a number, a percentage, a word, a color, a bar indicator, or using any other suitable indicia. Based on the stage in the life cycle of the surgical instrument 10, the clinician may choose to cease using the surgical instrument 10. In other embodiments, if the surgical instrument 10 is determined to be beyond its useful life (e.g., exceeded its predetermined life cycle), the surgical robotic system 1 may be configured to prevent further activation of the surgical instrument 10.
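
A minimal sketch of step 212, assuming illustrative (not manufacturer) life-cycle limits, might map the cumulative usage of the identified instrument type to a displayable stage and a lockout condition.

```python
# Sketch of step 212; the per-type limits are illustrative assumptions.
LIFE_CYCLE_LIMITS = {
    "vessel sealing device": 500.0,
    "linear surgical stapler-cutter": 300.0,
}


def life_cycle_stage(instrument_type: str, cumulative_usage: float) -> str:
    """Map cumulative usage to a stage for display on display device 6."""
    limit = LIFE_CYCLE_LIMITS[instrument_type]
    pct = min(100.0, 100.0 * cumulative_usage / limit)
    if pct >= 100.0:
        # Beyond its useful life: the system may prevent further activation.
        return "EXPIRED - further activation prevented"
    return f"{pct:.0f}% of life cycle used"
```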


It will be understood that various modifications may be made to the embodiments disclosed herein. Therefore, the above description should not be construed as limiting, but merely as exemplifications of various embodiments. Those skilled in the art will envision other modifications within the scope and spirit of the claims appended thereto.

Claims
  • 1. A method of tracking usage of a robotic surgical instrument, the method comprising: capturing an image of a surgical instrument with an imager during a robotic surgical procedure; identifying a type of the surgical instrument based on the image of the surgical instrument; during the robotic surgical procedure, determining a degree of usage of the surgical instrument based on data acquired by at least one sensor supported on jaws of the surgical instrument, wherein the degree of usage of the surgical instrument is a function of an amount of time of use of the surgical instrument and a function of forces measured by the at least one sensor during the time of use of the surgical instrument; and determining a stage in a life cycle of the surgical instrument based on the type of surgical instrument identified and the degree of usage determined during the robotic surgical procedure.
  • 2. The method according to claim 1, further comprising determining if the surgical instrument is performing a surgical task based on the image of the surgical instrument.
  • 3. The method according to claim 2, wherein determining if the surgical instrument is performing a surgical task includes correlating the image of the surgical instrument with the data acquired by the at least one sensor of the surgical instrument.
  • 4. The method according to claim 2, wherein the surgical task includes the surgical instrument acting on tissue.
  • 5. The method according to claim 4, wherein the degree of usage of the surgical instrument is determined only when the surgical instrument is acting on tissue.
  • 6. The method according to claim 2, further comprising assigning a value to the surgical task performed by the surgical instrument corresponding to the degree of usage of the surgical instrument.
  • 7. The method according to claim 6, wherein the value assigned to the surgical task performed by the surgical instrument is selected based on an amount of force applied to tissue by the surgical instrument during the surgical task.
  • 8. The method according to claim 6, further comprising determining a duration of time the surgical task is performed by the surgical instrument at the assigned value.
  • 9. The method according to claim 1, further comprising displaying on a display the stage in the life cycle of the surgical instrument.
  • 10. A robotic surgical system, comprising: a robotic arm; a surgical instrument configured to be coupled to and operated by the robotic arm; an imager configured to capture an image of the surgical instrument during a surgical procedure; and a control device in communication with the imager and configured to: identify a type of the surgical instrument based on the image of the surgical instrument captured by the imager; during the robotic surgical procedure, determine a degree of usage of the surgical instrument based on data acquired by at least one sensor supported on jaws of the surgical instrument, wherein the degree of usage of the surgical instrument is a function of an amount of time of use of the surgical instrument and a function of forces measured by the at least one sensor during the time of use of the surgical instrument; and determine a stage in a life cycle of the surgical instrument based on the type of the surgical instrument identified and the degree of usage determined during the robotic surgical procedure.
  • 11. The surgical robotic system according to claim 10, wherein the control device is further configured to determine if the surgical instrument is performing a surgical task based on the image of the surgical instrument.
  • 12. The surgical robotic system according to claim 11, wherein the surgical task includes the surgical instrument acting on tissue.
  • 13. The surgical robotic system according to claim 12, wherein the control device is configured to determine the degree of usage of the surgical instrument only when the surgical instrument is acting on tissue.
  • 14. The surgical robotic system according to claim 12, wherein the control device is further configured to assign a value to the surgical task performed by the surgical instrument corresponding to the degree of usage of the surgical instrument.
  • 15. The surgical robotic system according to claim 14, wherein the value assigned to the surgical task performed by the surgical instrument is selected by the control device based on an amount of force applied to tissue by the surgical instrument during the surgical task.
  • 16. The surgical robotic system according to claim 14, wherein the control device is further configured to determine a duration of time the surgical task is performed by the surgical instrument at the assigned value.
  • 17. The surgical robotic system according to claim 10, wherein the control device is further configured to display on a display the stage in the life cycle of the surgical instrument.
  • 18. The surgical robotic system according to claim 10, wherein the imager includes an imaging modality.
  • 19. The surgical robotic system according to claim 10, wherein the surgical instrument is a surgical stapler.
PCT Information
  • Filing Document: PCT/US2019/050129, filed 9/9/2019 (WO)
  • Publishing Document: WO2020/055707, published 3/19/2020 (WO, Kind A)
Related Publications (1)
  • US 20210212784 A1, Jul 2021
Provisional Applications (1)
  • US 62/731,423, Sep 2018