System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices

Information

  • Patent Grant
  • Patent Number
    11,944,325
  • Date Filed
    Monday, May 9, 2022
  • Date Issued
    Tuesday, April 2, 2024
Abstract
A system for robotic surgery makes use of an end-effector which has been configured so that a drill connected thereto is guided in its trajectory and limited in its advancement into an associated anatomical feature by a drill guide. The drill guide may be adjusted manually to engage a corresponding surface of the drill after its advancement by a pre-selected amount. The drill guide likewise includes features to guide the drill during trajectories having oblique angles relative to the surface of the anatomical feature associated with the medical procedure.
Description
FIELD

The present disclosure relates to medical devices and systems, and more particularly, systems for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices.


BACKGROUND

Position recognition systems for robot assisted surgeries are used to determine the position of and track a particular object in 3-dimensions (3D). In robot assisted surgeries, for example, certain objects, such as surgical instruments, need to be tracked with a high degree of precision as the instrument is being positioned and moved by a robot or by a physician, for example.


Position recognition systems may use passive and/or active sensors or markers for registering and tracking the positions of the objects. Using these sensors, the system may geometrically resolve the 3-dimensional position of the sensors based on information from or with respect to one or more cameras, signals, or sensors, etc. These surgical systems can therefore utilize position feedback to precisely guide movement of robotic arms and tools relative to a patient's surgical site. Thus, there is a need for a system that efficiently and accurately provides neuronavigation registration and robotic trajectory guidance in a surgical environment.


End-effectors used in robotic surgery may be limited to use in only certain procedures, or may suffer from other drawbacks or disadvantages.


SUMMARY

According to some implementations, a surgical robot system is configured for surgery on an anatomical feature of a patient, and includes a surgical robot, a robot arm connected to such surgical robot, and an end-effector connected to the robot arm. The end-effector has a surgical tool selectively connected to it, and the robot system includes a memory accessible by a suitable processor circuit which processes machine-readable instructions. Among the instructions which are executable by the system are ones which, in response to user input, determine a drill target and an associated drill trajectory. The instructions also may be executed to cause the end-effector to move to a position corresponding to such determined target and determined drill trajectory. Once positioned in this manner, the system and corresponding instructions will permit advancement of the aforementioned drill toward the determined target.


In certain implementations, the system includes a drill guide which is connectible between the end-effector and the drill. The drill guide is configured to stop advancement of the drill at a pre-selected drill depth associated with the determined target, and is further configured to guide the drill along the determined drill trajectory during advancement of the drill. As such, the drill guide provides further assurance against over-penetration or over-engagement of the anatomical feature being operated upon and likewise assists in guiding the trajectory of the drill, including when such trajectory is oblique to the anatomical feature.


The drill guide includes a guide shaft with a bore extending longitudinally therethrough and having proximal and distal openings to the bore. The guide shaft and the corresponding bore are sized to slideably receive a drill bit of the drill therethrough, such drill bit having an associated drill tip and being operatively connected to advancement and rotating mechanisms of the drill at a proximal location by a suitable chuck. The drill guide includes a depth stop which is mounted to the proximal end of the guide shaft and selectively slideable longitudinally to vary the length of the guide shaft relative to the predetermined length of the drill bit received in the drill. The depth stop has a surface located and otherwise disposed to engage the distal surface of the chuck of the drill. As such, engagement between the chuck and the depth stop limits the depth to which the drill tip is advanceable by an amount corresponding to the position of the depth stop.
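The depth-limit relationship described above can be expressed numerically. The following is a minimal illustrative sketch (not part of the claimed system; the function name and dimensions are hypothetical) of how the permissible drill-tip protrusion follows from the bit length and the selected depth-stop position:

```python
def max_drill_depth(bit_length_mm, guide_length_mm, stop_offset_mm):
    """Depth (mm) the drill tip can protrude past the distal guide opening.

    The chuck bottoms out on the depth-stop surface, so the tip can
    advance only until the bit length equals the guide-shaft length plus
    the selected depth-stop offset; the remainder is the allowed depth.
    """
    depth = bit_length_mm - (guide_length_mm + stop_offset_mm)
    return max(depth, 0.0)
```

Under this sketch, sliding the depth stop proximally (reducing the offset) increases the allowed penetration, and a stop offset equal to or exceeding the exposed bit length yields zero penetration.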


In certain implementations, the drill guide makes use of a depth indicator having indicia corresponding to graduated depths of advancement of the drill bit associated with the drill guide during the operative procedures contemplated by the surgical robot system. The depth stop may be formed so as to include a longitudinal stem, this stem being slideably received within the bore of the guide shaft. The guide shaft, in turn, has a window formed therein which is sized and located to reveal the longitudinal stem of the depth stop when it is received in the guide shaft. With such an arrangement, the aforementioned indicia associated with the graduated depth may be located either on the longitudinal stem visible through the window or on portions adjacent to the window, and such indicia are movable relative to a pointer or other indicator. In this manner, the desired depth limit may be manually selected by visually perceiving the numerical value of the graduated depths aligned with the associated pointer or indicator.


In still other aspects of the disclosed implementations, the drill guide has an engagement mechanism to set the depth stop at one of a plurality of selectable, longitudinal positions corresponding to the desired drill depth. One suitable engagement mechanism may include a ratchet assembly engageable at one of a plurality of longitudinally spaced locations between the proximal and distal ends of the depth stop. In one suitable implementation, the ratchet assembly includes a ratchet that may be disengaged and engaged by actuation or release, respectively, of a spring-biased trigger. Such trigger may have a trigger lock operatively associated therewith so that once the ratchet of the ratchet assembly has engaged the depth stop at a selected longitudinal position, such trigger lock inhibits inadvertent actuation of the trigger which would cause disengagement of the ratchet and potential longitudinal movement of the depth stop. The drill guide may include a handle operatively connected to the engagement mechanism so as to manually set the depth stop at the selected one of the longitudinal positions.


In still other implementations, the guide shaft has at least one bushing, and such bushing may be sized to engage the longitudinal surface at the distal end of the drill bit in such a manner so as to exert a force on such distal drill bit end to oppose lateral displacement of the drill bit within the guide shaft. Such opposing force may be sufficient to reduce deviation of the drill bit from the determined drill trajectory, especially when such trajectory is at an oblique angle to the anatomical feature being engaged.


The above-described system and its various features may be associated with a variety of related procedures or processes, collectively referred to herein as methods. One such method involves guiding a drill during robot-assisted surgery on an anatomical feature of a patient, such method including the determination by suitable computer means of a drill target and an associated drill trajectory. Thereafter, the end-effector is caused to move, such as by means of a computer, to a position corresponding to the determined target and the determined drill trajectory. At any point prior to drill advancement during operations, the depth stop connected to the end-effector may be manually set at one of a plurality of selectable positions corresponding to a desired depth of advancement of the drill relative to the anatomical feature. In this manner, the drill is limited from penetrating the anatomical feature beyond the desired depth set manually by the depth stop.


In another possible method hereunder, the depth stop is displaced relative to visually perceptible indicia corresponding to graduated depths of advancement of the drill associated with the system. Once a suitable location of the depth stop is determined by visual perception, the depth stop may be secured at a position corresponding to one of the graduated depths indicated on the indicia. The steps of displacing and securing the depth stop may involve actuating a spring-biased trigger to disengage the depth stop from a first position, longitudinally sliding the depth stop relative to the visually perceptible indicia to a second position corresponding to the desired position of the depth stop, and then releasing the trigger to re-engage the depth stop at the desired position.


Other related devices and systems, and corresponding methods and computer program products according to embodiments will be or become apparent to one with skill in the art upon review of the following drawings and detailed description. It is intended that all such devices and systems, and corresponding methods and computer program products be included within this description, be within the scope of the present disclosure, and be protected by the accompanying claims. Moreover, it is intended that all embodiments disclosed herein can be implemented separately or combined in any way and/or combination.





BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying drawings, which are included to provide a further understanding of the disclosure and are incorporated in and constitute a part of this application, illustrate certain non-limiting embodiments of inventive concepts. In the drawings:



FIG. 1A is an overhead view of an arrangement for locations of a robotic system, patient, surgeon, and other medical personnel during a surgical procedure, according to some embodiments;



FIG. 1B is an overhead view of an alternate arrangement for locations of a robotic system, patient, surgeon, and other medical personnel during a cranial surgical procedure, according to some embodiments;



FIG. 2 illustrates a robotic system including positioning of the surgical robot and a camera relative to the patient according to some embodiments;



FIG. 3 is a flowchart diagram illustrating computer-implemented operations for determining a position and orientation of an anatomical feature of a patient with respect to a robot arm of a surgical robot, according to some embodiments;



FIG. 4 is a diagram illustrating processing of data for determining a position and orientation of an anatomical feature of a patient with respect to a robot arm of a surgical robot, according to some embodiments;



FIGS. 5A-5C illustrate a system for registering an anatomical feature of a patient using a computerized tomography (CT) localizer, a frame reference array (FRA), and a dynamic reference base (DRB), according to some embodiments;



FIGS. 6A and 6B illustrate a system for registering an anatomical feature of a patient using fluoroscopy (fluoro) imaging, according to some embodiments;



FIG. 7 illustrates a system for registering an anatomical feature of a patient using an intraoperative CT fixture (ICT) and a DRB, according to some embodiments;



FIGS. 8A and 8B illustrate systems for registering an anatomical feature of a patient using a DRB and an X-ray cone beam imaging device, according to some embodiments;



FIG. 9 illustrates a system for registering an anatomical feature of a patient using a navigated probe and fiducials for point-to-point mapping of the anatomical feature, according to some embodiments;



FIG. 10 illustrates a two-dimensional visualization of an adjustment range for a centerpoint-arc mechanism, according to some embodiments;



FIG. 11 illustrates a two-dimensional visualization of a virtual point rotation mechanism, according to some embodiments;



FIG. 12 is an isometric view of one possible implementation of an end-effector according to the present disclosure;



FIG. 13 is an isometric view of another possible implementation of an end-effector of the present disclosure;



FIG. 14 is a partial cutaway, isometric view of still another possible implementation of an end-effector according to the present disclosure;



FIG. 15 is a bottom angle isometric view of yet another possible implementation of an end-effector according to the present disclosure;



FIG. 16 is an isometric view of one possible tool stop for use with an end-effector according to the present disclosure;



FIGS. 17 and 18 are top plan views of one possible implementation of a tool insert locking mechanism of an end-effector according to the present disclosure;



FIGS. 19 and 20 are top plan views of the tool stop of FIG. 16, showing open and closed positions, respectively;



FIG. 21 is a side, elevational view of another implementation of the robot system disclosed herein, including an end-effector and associated drill guide;



FIG. 22 is a close-up, side-elevational view of the drill guide of FIG. 21;



FIG. 23 is a front elevational view of the drill guide of FIGS. 21-22;



FIG. 24 is a cross sectional view of the drill guide of FIGS. 21-23 taken along line A-A of FIG. 23; and



FIG. 25 is an exploded, side-elevational view of the drill guide of FIGS. 21-24.





DETAILED DESCRIPTION

It is to be understood that the present disclosure is not limited in its application to the details of construction and the arrangement of components set forth in the description herein or illustrated in the drawings. The teachings of the present disclosure may be used and practiced in other embodiments and practiced or carried out in various ways. Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. The use of “including,” “comprising,” or “having” and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. Unless specified or limited otherwise, the terms “mounted,” “connected,” “supported,” and “coupled” and variations thereof are used broadly and encompass both direct and indirect mountings, connections, supports, and couplings. Further, “connected” and “coupled” are not restricted to physical or mechanical connections or couplings.


The following discussion is presented to enable a person skilled in the art to make and use embodiments of the present disclosure. Various modifications to the illustrated embodiments will be readily apparent to those skilled in the art, and the principles herein can be applied to other embodiments and applications without departing from embodiments of the present disclosure. Thus, the embodiments are not intended to be limited to embodiments shown, but are to be accorded the widest scope consistent with the principles and features disclosed herein. The following detailed description is to be read with reference to the figures, in which like elements in different figures have like reference numerals. The figures, which are not necessarily to scale, depict selected embodiments and are not intended to limit the scope of the embodiments. Skilled artisans will recognize that the examples provided herein have many useful alternatives that fall within the scope of the embodiments.


According to some other embodiments, systems for neuronavigation registration and robotic trajectory guidance, and related methods and devices are disclosed. In some embodiments, a first image having an anatomical feature of a patient, a registration fixture that is fixed with respect to the anatomical feature of the patient, and a first plurality of fiducial markers that are fixed with respect to the registration fixture is analyzed, and a position is determined for each fiducial marker of the first plurality of fiducial markers. Next, based on the determined positions of the first plurality of fiducial markers, a position and orientation of the registration fixture with respect to the anatomical feature is determined. A data frame comprising a second plurality of tracking markers that are fixed with respect to the registration fixture is also analyzed, and a position is determined for each tracking marker of the second plurality of tracking markers. Based on the determined positions of the second plurality of tracking markers, a position and orientation of the registration fixture with respect to a robot arm of a surgical robot is determined. Based on the determined position and orientation of the registration fixture with respect to the anatomical feature and the determined position and orientation of the registration fixture with respect to the robot arm, a position and orientation of the anatomical feature with respect to the robot arm is determined, which allows the robot arm to be controlled based on the determined position and orientation of the anatomical feature with respect to the robot arm.
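The determination of the registration fixture's position and orientation from matched fiducial positions amounts to estimating a rigid transform between two point sets. The disclosure does not specify an algorithm; as one illustrative sketch, the well-known Kabsch/Umeyama least-squares method can recover a rotation and translation mapping fiducial coordinates known in the fixture's own frame onto their positions determined from the image (function and variable names are hypothetical):

```python
import numpy as np

def rigid_register(fixture_pts, image_pts):
    """Least-squares rigid transform (R, t) mapping fixture-frame fiducial
    coordinates onto their imaged positions, so image ~= R @ fixture + t.

    Uses the Kabsch/Umeyama SVD method on centered point sets; the
    reflection correction keeps R a proper rotation (det R = +1).
    """
    A = np.asarray(fixture_pts, dtype=float)  # N x 3, fixture frame
    B = np.asarray(image_pts, dtype=float)    # N x 3, image frame
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)                 # 3 x 3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cb - R @ ca
    return R, t
```

With at least three non-collinear fiducials (four or more in practice, for redundancy), the resulting (R, t) gives the fixture's pose with respect to the image volume.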


Advantages of this and other embodiments include the ability to combine neuronavigation and robotic trajectory alignment into one system, with support for a wide variety of different registration hardware and methods. For example, as will be described in detail below, embodiments may support both computerized tomography (CT) and fluoroscopy (fluoro) registration techniques, and may utilize frame-based and/or frameless surgical arrangements. Moreover, in many embodiments, if an initial (e.g. preoperative) registration is compromised due to movement of a registration fixture, registration of the registration fixture (and of the anatomical feature by extension) can be re-established intraoperatively without suspending surgery and re-capturing preoperative images.


Referring now to the drawings, FIG. 1A illustrates a surgical robot system 100 in accordance with an embodiment. Surgical robot system 100 may include, for example, a surgical robot 102, one or more robot arms 104, a base 106, a display 110, an end-effector 112, for example, including a guide tube 114, and one or more tracking markers 118. The robot arm 104 may be movable along and/or about an axis relative to the base 106, responsive to input from a user, commands received from a processing device, or other methods. The surgical robot system 100 may include a patient tracking device 116 also including one or more tracking markers 118, which is adapted to be secured directly to the patient 210 (e.g., to a bone of the patient 210). As will be discussed in greater detail below, the tracking markers 118 may be secured to or may be part of a stereotactic frame that is fixed with respect to an anatomical feature of the patient 210. The stereotactic frame may also be secured to a fixture to prevent movement of the patient 210 during surgery.


According to an alternative embodiment, FIG. 1B is an overhead view of an alternate arrangement for locations of a robotic system 100, patient 210, surgeon 120, and other medical personnel during a cranial surgical procedure. During a cranial procedure, for example, the robot 102 may be positioned behind the head 128 of the patient 210. The robot arm 104 of the robot 102 has an end-effector 112 that may hold a surgical instrument 108 during the procedure. In this example, a stereotactic frame 134 is fixed with respect to the patient's head 128, and the patient 210 and/or stereotactic frame 134 may also be secured to a patient base 211 to prevent movement of the patient's head 128 with respect to the patient base 211. In addition, the patient 210, the stereotactic frame 134 and/or the patient base 211 may be secured to the robot base 106, such as via an auxiliary arm 107, to prevent relative movement of the patient 210 with respect to components of the robot 102 during surgery. Different devices may be positioned with respect to the patient's head 128 and/or patient base 211 as desired to facilitate the procedure, such as an intra-operative CT device 130, an anesthesiology station 132, a scrub station 136, a neuro-modulation station 138, and/or one or more remote pendants 140 for controlling the robot 102 and/or other devices or systems during the procedure.


The surgical robot system 100 in the examples of FIGS. 1A and/or 1B may also use a sensor, such as a camera 200, for example, positioned on a camera stand 202. The camera stand 202 can have any suitable configuration to move, orient, and support the camera 200 in a desired position. The camera 200 may include any suitable camera or cameras, such as one or more cameras (e.g., bifocal or stereophotogrammetric cameras), able to identify, for example, active or passive tracking markers 118 (shown as part of patient tracking device 116 in FIG. 2) in a given measurement volume viewable from the perspective of the camera 200. In this example, the camera 200 may scan the given measurement volume and detect the light that comes from the tracking markers 118 in order to identify and determine the position of the tracking markers 118 in three-dimensions. For example, active tracking markers 118 may include infrared-emitting markers that are activated by an electrical signal (e.g., infrared light emitting diodes (LEDs)), and/or passive tracking markers 118 may include retro-reflective markers that reflect infrared or other light (e.g., they reflect incoming IR radiation into the direction of the incoming light), for example, emitted by illuminators on the camera 200 or other suitable sensor or other device.
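One common way a stereophotogrammetric camera resolves a marker's three-dimensional position from two views is linear (DLT) triangulation. The sketch below is illustrative only (the disclosure does not specify the camera's internal method, and the projection matrices here are hypothetical):

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one tracking marker from two views.

    P1, P2: 3x4 camera projection matrices; uv1, uv2: the marker's
    image coordinates in each view. Returns the 3-D point whose
    projections best satisfy both views in the least-squares sense.
    """
    u1, v1 = uv1
    u2, v2 = uv2
    A = np.stack([
        u1 * P1[2] - P1[0],   # each row: one linear constraint on X
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                # null-space vector, homogeneous coords
    return X[:3] / X[3]
```

Given calibrated projection matrices for the two lenses of the camera 200, each detected marker blob pair yields one 3-D marker position per frame.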


In many surgical procedures, one or more targets of surgical interest, such as targets within the brain for example, are localized to an external reference frame. For example, stereotactic neurosurgery may use an externally mounted stereotactic frame that facilitates patient localization and implant insertion via a frame mounted arc. Neuronavigation is used to register, e.g., map, targets within the brain based on pre-operative or intraoperative imaging. Using this pre-operative or intraoperative imaging, links and associations can be made between the imaging and the actual anatomical structures in a surgical environment, and these links and associations can be utilized by robotic trajectory systems during surgery.


According to some embodiments, various software and hardware elements may be combined to create a system that can be used to plan, register, place and verify the location of an instrument or implant in the brain. These systems may integrate a surgical robot, such as the surgical robot 102 of FIGS. 1A and/or 1B, and may employ a surgical navigation system and planning software to program and control the surgical robot. In addition or alternatively, the surgical robot 102 may be remotely controlled, such as by nonsterile personnel.


The robot 102 may be positioned near or next to patient 210, and it will be appreciated that the robot 102 can be positioned at any suitable location near the patient 210 depending on the area of the patient 210 undergoing the operation. The camera 200 may be separated from the surgical robot system 100 and positioned near or next to patient 210 as well, in any suitable position that allows the camera 200 to have a direct visual line of sight to the surgical field 208. In the configuration shown, the surgeon 120 may be positioned across from the robot 102, but is still able to manipulate the end-effector 112 and the display 110. A surgical assistant 126 may be positioned across from the surgeon 120, again with access to both the end-effector 112 and the display 110. If desired, the locations of the surgeon 120 and the assistant 126 may be reversed. The traditional areas for the anesthesiologist 122 and the nurse or scrub tech 124 may remain unimpeded by the locations of the robot 102 and camera 200.


With respect to the other components of the robot 102, the display 110 can be attached to the surgical robot 102; in other embodiments, the display 110 can be detached from the surgical robot 102, either within a surgical room with the surgical robot 102, or in a remote location. The end-effector 112 may be coupled to the robot arm 104 and controlled by at least one motor. In some embodiments, end-effector 112 can comprise a guide tube 114, which is able to receive and orient a surgical instrument 108 used to perform surgery on the patient 210. As used herein, the term “end-effector” is used interchangeably with the terms “end-effectuator” and “effectuator element.” Although generally shown with a guide tube 114, it will be appreciated that the end-effector 112 may be replaced with any suitable instrumentation suitable for use in surgery. In some embodiments, end-effector 112 can comprise any known structure for effecting the movement of the surgical instrument 108 in a desired manner.


The surgical robot 102 is able to control the translation and orientation of the end-effector 112. The robot 102 is able to move end-effector 112 along x-, y-, and z-axes, for example. The end-effector 112 can be configured for selective rotation about one or more of the x-, y-, and z-axes such that one or more of the Euler Angles (e.g., roll, pitch, and/or yaw) associated with end-effector 112 can be selectively controlled. In some embodiments, selective control of the translation and orientation of end-effector 112 can permit performance of medical procedures with significantly improved accuracy compared to conventional robots that use, for example, a six degree of freedom robot arm comprising only rotational axes. For example, the surgical robot system 100 may be used to operate on patient 210, and robot arm 104 can be positioned above the body of patient 210, with end-effector 112 selectively angled relative to the z-axis toward the body of patient 210.
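A pose combining the translation along the x-, y-, and z-axes with the roll, pitch, and yaw angles mentioned above is conventionally packaged as a single 4x4 homogeneous transform. The sketch below is a generic illustration of that bookkeeping (the Z-Y-X Euler convention shown is an assumption; the disclosure does not fix a convention):

```python
import numpy as np

def pose(x, y, z, roll, pitch, yaw):
    """4x4 homogeneous end-effector pose from a translation (x, y, z)
    and Z-Y-X Euler angles in radians (yaw about z, pitch about y,
    roll about x, applied in that order)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx   # combined orientation
    T[:3, 3] = [x, y, z]       # translation component
    return T
```

Selectively controlling one Euler angle then corresponds to varying one argument while holding the others fixed.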


In some embodiments, the position of the surgical instrument 108 can be dynamically updated so that surgical robot 102 can be aware of the location of the surgical instrument 108 at all times during the procedure. Consequently, in some embodiments, surgical robot 102 can move the surgical instrument 108 to the desired position quickly without any further assistance from a physician (unless the physician so desires). In some further embodiments, surgical robot 102 can be configured to correct the path of the surgical instrument 108 if the surgical instrument 108 strays from the selected, preplanned trajectory. In some embodiments, surgical robot 102 can be configured to permit stoppage, modification, and/or manual control of the movement of end-effector 112 and/or the surgical instrument 108. Thus, in use, in some embodiments, a physician or other user can operate the system 100, and has the option to stop, modify, or manually control the autonomous movement of end-effector 112 and/or the surgical instrument 108. Further details of surgical robot system 100 including the control and movement of a surgical instrument 108 by surgical robot 102 can be found in co-pending U.S. Patent Publication No. 2013/0345718, which is incorporated herein by reference in its entirety.


As will be described in greater detail below, the surgical robot system 100 can comprise one or more tracking markers configured to track the movement of robot arm 104, end-effector 112, patient 210, and/or the surgical instrument 108 in three dimensions. In some embodiments, a plurality of tracking markers can be mounted (or otherwise secured) thereon to an outer surface of the robot 102, such as, for example and without limitation, on base 106 of robot 102, on robot arm 104, and/or on the end-effector 112. In some embodiments, such as the embodiment of FIG. 3 below, for example, one or more tracking markers can be mounted or otherwise secured to the end-effector 112. One or more tracking markers can further be mounted (or otherwise secured) to the patient 210. In some embodiments, the plurality of tracking markers can be positioned on the patient 210 spaced apart from the surgical field 208 to reduce the likelihood of being obscured by the surgeon, surgical tools, or other parts of the robot 102. Further, one or more tracking markers can be further mounted (or otherwise secured) to the surgical instruments 108 (e.g., a screw driver, dilator, implant inserter, or the like). Thus, the tracking markers enable each of the marked objects (e.g., the end-effector 112, the patient 210, and the surgical instruments 108) to be tracked by the surgical robot system 100. In some embodiments, system 100 can use tracking information collected from each of the marked objects to calculate the orientation and location, for example, of the end-effector 112, the surgical instrument 108 (e.g., positioned in the tube 114 of the end-effector 112), and the relative position of the patient 210. Further details of surgical robot system 100 including the control, movement and tracking of surgical robot 102 and of a surgical instrument 108 can be found in U.S. Patent Publication No. 2016/0242849, which is incorporated herein by reference in its entirety.


In some embodiments, pre-operative imaging may be used to identify the anatomy to be targeted in the procedure. If desired by the surgeon, the planning package will allow for the definition of a reformatted coordinate system. This reformatted coordinate system will have coordinate axes anchored to specific anatomical landmarks, such as the anterior commissure (AC) and posterior commissure (PC) for neurosurgery procedures. In some embodiments, multiple pre-operative exam images (e.g., CT or magnetic resonance (MR) images) may be co-registered such that it is possible to transform coordinates of any given point on the anatomy to the corresponding point on all other pre-operative exam images.
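As an illustration of anchoring axes to the AC and PC landmarks, one common construction (a sketch only; the disclosure does not prescribe this recipe, and the third midsagittal point used here is an assumption) places the origin at the midcommissural point, the anterior axis along PC-to-AC, and the superior axis in the midsagittal plane:

```python
import numpy as np

def acpc_frame(ac, pc, midsagittal_pt):
    """Orthonormal AC-PC frame and origin from three landmark points.

    Returns (axes, origin): axes is a 3x3 array whose rows are the
    lateral, anterior, and superior unit vectors; the origin is the
    midcommissural point. midsagittal_pt is any additional point in
    the midsagittal plane, used to fix the superior direction.
    """
    ac, pc, m = (np.asarray(p, dtype=float) for p in (ac, pc, midsagittal_pt))
    y = ac - pc
    y = y / np.linalg.norm(y)          # anterior axis (PC toward AC)
    v = m - pc
    z = v - (v @ y) * y                # remove the anterior component
    z = z / np.linalg.norm(z)          # superior axis
    x = np.cross(y, z)                 # lateral axis, completes the frame
    origin = (ac + pc) / 2.0
    return np.stack([x, y, z]), origin
```

Exam-image coordinates can then be re-expressed in this reformatted frame by subtracting the origin and projecting onto the three axis rows.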


As used herein, registration is the process of determining the coordinate transformations from one coordinate system to another. For example, in the co-registration of preoperative images, co-registering a CT scan to an MR scan means that it is possible to transform the coordinates of an anatomical point from the CT scan to the corresponding anatomical location in the MR scan. It may also be advantageous to register at least one exam image coordinate system to the coordinate system of a common registration fixture, such as a dynamic reference base (DRB), which may allow the camera 200 to keep track of the position of the patient in the camera space in real-time so that any intraoperative movement of an anatomical point on the patient in the room can be detected by the robot system 100 and accounted for by compensatory movement of the surgical robot 102.
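Because registration is defined above as a coordinate transformation, chaining registrations (e.g., CT to MR, or exam image to DRB) reduces to composing and inverting rigid transforms. A minimal generic sketch (function names are hypothetical):

```python
import numpy as np

def compose(*Ts):
    """Chain 4x4 homogeneous transforms left to right: compose(T_ab, T_bc)
    maps frame-c coordinates into frame a."""
    out = np.eye(4)
    for T in Ts:
        out = out @ T
    return out

def invert(T):
    """Closed-form inverse of a rigid 4x4 transform (R, t) -> (R^T, -R^T t),
    which reverses the direction of a registration."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti
```

For instance, a CT-to-MR registration combined with an MR-to-DRB registration yields a CT-to-DRB mapping as compose(T_drb_mr, T_mr_ct), which is what lets intraoperative DRB tracking compensate for patient movement of points planned on either scan.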



FIG. 3 is a flowchart diagram illustrating computer-implemented operations 300 for determining a position and orientation of an anatomical feature of a patient with respect to a robot arm of a surgical robot, according to some embodiments. The operations 300 may include receiving a first image volume, such as a CT scan, from a preoperative image capture device at a first time (Block 302). The first image volume includes an anatomical feature of a patient and at least a portion of a registration fixture that is fixed with respect to the anatomical feature of the patient. The registration fixture includes a first plurality of fiducial markers that are fixed with respect to the registration fixture. The operations 300 further include determining, for each fiducial marker of the first plurality of fiducial markers, a position of the fiducial marker relative to the first image volume (Block 304). The operations 300 further include determining, based on the determined positions of the first plurality of fiducial markers, positions of an array of tracking markers on the registration fixture (frame reference array or FRA) with respect to the anatomical feature (Block 306).


The operations 300 may further include receiving a tracking data frame from an intraoperative tracking device comprising a plurality of tracking cameras at a second time that is later than the first time (Block 308). The tracking data frame includes positions of a plurality of tracking markers that are fixed with respect to the registration fixture (FRA) and a plurality of tracking markers that are fixed with respect to the robot. The operations 300 further include determining, based on the positions of the tracking markers of the registration fixture, a position and orientation of the anatomical feature with respect to the tracking cameras (Block 310). The operations 300 further include determining, based on the determined positions of the plurality of tracking markers on the robot, a position and orientation of the robot arm of the surgical robot with respect to the tracking cameras (Block 312).


The operations 300 further include determining, based on the determined position and orientation of the anatomical feature with respect to the tracking cameras and the determined position and orientation of the robot arm with respect to the tracking cameras, a position and orientation of the anatomical feature with respect to the robot arm (Block 314). The operations 300 further include controlling movement of the robot arm with respect to the anatomical feature, e.g., along and/or rotationally about one or more defined axes, based on the determined position and orientation of the anatomical feature with respect to the robot arm (Block 316).
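The composition in Block 314 can be sketched as follows: given the anatomy pose and the robot pose, both expressed in tracking-camera coordinates, the anatomy-to-robot transform is the inverse of the robot pose composed with the anatomy pose. The matrix values here are assumed for demonstration and are not from the patent.

```python
# Illustrative sketch of Block 314: composing the two camera-space poses into
# an anatomy-to-robot transform. Pose values are assumptions for demonstration.

def matmul4(A, B):
    return [[sum(A[i][k]*B[k][j] for k in range(4)) for j in range(4)] for i in range(4)]

def invert_rigid(T):
    """Invert a rigid 4x4 transform: R -> R^T, t -> -R^T t."""
    R = [[T[j][i] for j in range(3)] for i in range(3)]  # transpose of rotation
    t = [-sum(R[i][j]*T[j][3] for j in range(3)) for i in range(3)]
    return [R[0]+[t[0]], R[1]+[t[1]], R[2]+[t[2]], [0.0, 0.0, 0.0, 1.0]]

# Poses determined from the tracking cameras (assumed values, mm):
T_anatomy_to_camera = [[1, 0, 0, 100.0], [0, 1, 0, 50.0], [0, 0, 1, 0.0], [0, 0, 0, 1]]
T_robot_to_camera   = [[1, 0, 0, 120.0], [0, 1, 0, 40.0], [0, 0, 1, 0.0], [0, 0, 0, 1]]

# Anatomy pose expressed in robot-arm coordinates (Block 314):
T_anatomy_to_robot = matmul4(invert_rigid(T_robot_to_camera), T_anatomy_to_camera)
# Translation component: (100-120, 50-40, 0) = (-20, 10, 0)
```

Once this transform is known, Block 316 can command the robot arm in its own coordinate frame.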



FIG. 4 is a diagram illustrating a data flow 400 for a multiple coordinate transformation system, to enable determining a position and orientation of an anatomical feature of a patient with respect to a robot arm of a surgical robot, according to some embodiments. In this example, data from a plurality of exam image spaces 402, based on a plurality of exam images, may be transformed and combined into a common exam image space 404. The data from the common exam image space 404 and data from a verification image space 406, based on a verification image, may be transformed and combined into a registration image space 408. Data from the registration image space 408 may be transformed into patient fiducial coordinates 410, which is transformed into coordinates for a DRB 412. A tracking camera 414 may detect movement of the DRB 412 (represented by DRB 412′) and may also detect a location of a probe tracker 416 to track coordinates of the DRB 412 over time. A robotic arm tracker 418 determines coordinates for the robot arm based on transformation data from a Robotics Planning System (RPS) space 420 or similar modeling system, and/or transformation data from the tracking camera 414.


It should be understood that these and other features may be used and combined in different ways to achieve registration of image space, i.e., coordinates from image volume, into tracking space, i.e., coordinates for use by the surgical robot in real-time. As will be discussed in detail below, these features may include fiducial-based registration such as stereotactic frames with CT localizer, preoperative CT or MRI registered using intraoperative fluoroscopy, calibrated scanner registration where any acquired scan's coordinates are pre-calibrated relative to the tracking space, and/or surface registration using a tracked probe, for example.


In one example, FIGS. 5A-5C illustrate a system 500 for registering an anatomical feature of a patient. In this example, the stereotactic frame base 530 is fixed to an anatomical feature 528 of the patient, e.g., the patient's head. As shown by FIG. 5A, the stereotactic frame base 530 may be affixed to the patient's head 528 prior to registration using pins that clamp the skull or another method. The stereotactic frame base 530 may act as both a fixation platform, for holding the patient's head 528 in a fixed position, and a registration and tracking platform, for alternatingly holding the CT localizer 536 or the FRA fixture 534. The CT localizer 536 includes a plurality of fiducial markers 532 (e.g., N-pattern radio-opaque rods or other fiducials), which are automatically detected in the image space using image processing. Due to the precise attachment mechanism of the CT localizer 536 to the base 530, these fiducial markers 532 are in known space relative to the stereotactic frame base 530. A 3D CT scan of the patient with the CT localizer 536 attached is taken, with an image volume that includes both the patient's head 528 and the fiducial markers 532 of the CT localizer 536. This registration image can be taken intraoperatively or preoperatively, either in the operating room or in radiology, for example. The captured 3D image dataset is stored to computer memory.


As shown by FIG. 5B, after the registration image is captured, the CT localizer 536 is removed from the stereotactic frame base 530 and the frame reference array fixture 534 is attached to the stereotactic frame base 530. The stereotactic frame base 530 remains fixed to the patient's head 528, is used to secure the patient during surgery, and serves as the attachment point for the frame reference array fixture 534. The frame reference array fixture 534 includes a frame reference array (FRA), which is a rigid array of three or more tracked markers 539 and which may serve as the primary reference for optical tracking. By positioning the tracked markers 539 of the FRA in a fixed, known location and orientation relative to the stereotactic frame base 530, the position and orientation of the patient's head 528 may be tracked in real time. Mount points on the FRA fixture 534 and stereotactic frame base 530 may be designed such that the FRA fixture 534 attaches reproducibly to the stereotactic frame base 530 with minimal (i.e., submillimetric) variability. These mount points on the stereotactic frame base 530 can be the same mount points used by the CT localizer 536, which is removed after the scan has been taken. An auxiliary arm (such as auxiliary arm 107 of FIG. 1B, for example) or other attachment mechanism can also be used to securely affix the patient to the robot base to ensure that the robot base does not move relative to the patient.


As shown by FIG. 5C, a dynamic reference base (DRB) 540 may also be attached to the stereotactic frame base 530. The DRB 540 in this example includes a rigid array of three or more tracked markers 542. In this example, the DRB 540 and/or other tracked markers may be attached to the stereotactic frame base 530 and/or directly to the patient's head 528 using auxiliary mounting arms 541, pins, or other attachment mechanisms. Unlike the FRA fixture 534, which mounts in only one way for unambiguous localization of the stereotactic frame base 530, the DRB 540 in general may be attached as needed to allow unhindered surgical and equipment access. Once the DRB 540 and FRA fixture 534 are attached, the registration, which was initially related to the tracking markers 539 of the FRA, can optionally be transferred or related to the tracking markers 542 of the DRB 540. For example, if any part of the FRA fixture 534 blocks surgical access, the surgeon may remove the FRA fixture 534 and navigate using only the DRB 540. However, if the FRA fixture 534 is not in the way of the surgery, the surgeon could opt to navigate from the FRA markers 539, without using a DRB 540, or may navigate using both the FRA markers 539 and the DRB 540. In this example, the FRA fixture 534 and/or DRB 540 uses optical markers, the tracked positions of which are in known locations relative to the stereotactic frame base 530, similar to the CT localizer 536, but it should be understood that many other additional and/or alternative techniques may be used.
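The transfer of registration from the FRA markers to the DRB markers can be illustrated with a deliberately simplified one-dimensional sketch. Real systems compose full 4x4 rigid transforms; the scalar offsets below are assumed numbers used only to show why the stored image-to-DRB relation survives removal of the FRA and later patient movement.

```python
# Simplified 1-D sketch (assumed numbers) of transferring registration from
# the FRA tracking markers to the DRB tracking markers. Offsets along one
# axis stand in for full rigid transforms.

# Image-to-tracking registration established via the FRA (image origin sits
# at tracking coordinate 200 mm):
image_to_tracking = 200.0

# DRB pose observed in tracking space at the moment of transfer:
drb_in_tracking = 150.0

# Stored, tracking-independent relation: image origin relative to the DRB.
image_to_drb = image_to_tracking - drb_in_tracking   # 50.0

# Later the patient (and DRB) moves; the FRA may have been removed.
drb_in_tracking_later = 170.0

# Registration is restored from the DRB alone:
image_to_tracking_later = drb_in_tracking_later + image_to_drb  # 220.0
```

The same composition holds in 3D with matrix inverses and products in place of subtraction and addition.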



FIGS. 6A and 6B illustrate a system 600 for registering an anatomical feature of a patient using fluoroscopy (fluoro) imaging, according to some embodiments. In this embodiment, image space is registered to tracking space using multiple intraoperative fluoro images taken using a tracked registration fixture 644. The anatomical feature of the patient (e.g., the patient's head 628) is positioned and rigidly affixed in a clamping apparatus 643 in a static position for the remainder of the procedure. The clamping apparatus 643 for rigid patient fixation can be a three-pin fixation system such as a Mayfield clamp, a stereotactic frame base attached to the surgical table, or another fixation method, as desired. The clamping apparatus 643 may also function as a support structure for a patient tracking array or DRB 640. The DRB 640 may be attached to the clamping apparatus using auxiliary mounting arms 641 or other means.


Once the patient is positioned, the fluoro fixture 644 is attached to the fluoro unit's x-ray collecting image intensifier (not shown) and secured by tightening clamping feet 632. The fluoro fixture 644 contains fiducial markers (e.g., metal spheres laid out across two planes in this example, not shown) that are visible on 2D fluoro images captured by the fluoro image capture device and can be used to calculate the location of the x-ray source relative to the image intensifier, which is typically about 1 meter away on the opposite side of the patient, using a standard pinhole camera model. Detection of the metal spheres in the fluoro image captured by the fluoro image capture device also enables the software to de-warp the fluoro image (i.e., to remove pincushion and s-distortion). Additionally, the fluoro fixture 644 contains three or more tracking markers 646 for determining the location and orientation of the fluoro fixture 644 in tracking space. In some embodiments, software can project vectors through a CT image volume, based on a previously captured CT image, to generate synthetic images based on contrast levels in the CT image that appear similar to the actual fluoro images (i.e., digitally reconstructed radiographs (DRRs)). By iterating through theoretical positions of the fluoro beam until the DRRs match the actual fluoro shots, a match can be found between fluoro image and DRR in two or more perspectives, and based on this match, the location of the patient's head 628 relative to the x-ray source and detector is calculated. Because the tracking markers 646 on the fluoro fixture 644 track the position of the image intensifier, and the position of the x-ray source relative to the image intensifier is calculated from the metal fiducials on the fluoro fixture 644 projected on the 2D images, the position of the x-ray source and detector in tracking space are known and the system is able to achieve image-to-tracking registration.
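The pinhole camera model mentioned above can be illustrated with a minimal forward projection. The geometry here (source roughly 1 m above a detector at z = 0, a fiducial sphere halfway between) is an assumption for demonstration, not the fluoro fixture's actual calibration.

```python
# Minimal pinhole-projection sketch (assumed geometry, coordinates in mm):
# projecting a fiducial sphere through the x-ray source onto the
# image-intensifier plane at z = 0.

def project_to_detector(source, point):
    """Intersect the ray from the x-ray source through a 3D point with z = 0."""
    sx, sy, sz = source
    px, py, pz = point
    t = sz / (sz - pz)             # parameter where the ray reaches z = 0
    return (sx + t * (px - sx), sy + t * (py - sy))

source = (0.0, 0.0, 1000.0)        # x-ray emitter ~1 m above the detector
fiducial = (10.0, 0.0, 500.0)      # sphere halfway between source and detector

shadow = project_to_detector(source, fiducial)
# t = 1000/(1000-500) = 2 -> shadow at (20.0, 0.0): 2x magnification
```

Working backwards from measured shadows of fiducials at two known plane heights is what allows the software to solve for the unknown source position.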


As shown by FIGS. 6A and 6B, two or more shots are taken of the head 628 of the patient by the fluoro image capture device from two different perspectives while tracking the array markers 642 of the DRB 640, which is fixed to the registration fixture 630 via a mounting arm 641, and tracking markers 646 on the fluoro fixture 644. Based on the tracking data and fluoro data, an algorithm computes the location of the head 628 or other anatomical feature relative to the tracking space for the procedure. Through image-to-tracking registration, the location of any tracked tool in the image volume space can be calculated.


For example, in one embodiment, a first fluoro image taken from a first fluoro perspective can be compared to a first DRR constructed from a first perspective through a CT image volume, and a second fluoro image taken from a second fluoro perspective can be compared to a second DRR constructed from a second perspective through the same CT image volume. Based on the comparisons, it may be determined that the first DRR is substantially equivalent to the first fluoro image with respect to the projected view of the anatomical feature, and that the second DRR is substantially equivalent to the second fluoro image with respect to the projected view of the anatomical feature. Equivalency confirms that the position and orientation of the x-ray path from emitter to collector on the actual fluoro machine as tracked in camera space matches the position and orientation of the x-ray path from emitter to collector as specified when generating the DRRs in CT space, and therefore registration of tracking space to CT space is achieved.
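The patent does not specify how DRR-to-fluoro equivalence is scored; one common choice, shown here purely as an assumed illustration, is normalized cross-correlation, which is insensitive to overall exposure differences. The pixel values below are made up.

```python
# Hedged sketch: scoring DRR-to-fluoro similarity with normalized
# cross-correlation (an assumption; the patent names no specific metric).
# Images are represented as flattened pixel lists.
import math

def ncc(a, b):
    ma, mb = sum(a)/len(a), sum(b)/len(b)
    num = sum((x - ma)*(y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma)**2 for x in a) * sum((y - mb)**2 for y in b))
    return num / den

fluoro = [10, 40, 80, 40, 10]
drr_match    = [12, 42, 82, 42, 12]   # same structure, different exposure offset
drr_mismatch = [80, 40, 10, 40, 80]   # inverted structure

assert ncc(fluoro, drr_match) > 0.99      # candidate pose accepted
assert ncc(fluoro, drr_mismatch) < 0      # candidate pose rejected
```

Iterating candidate beam poses to maximize such a score in two or more views is what yields the matched DRRs described above.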



FIG. 7 illustrates a system 700 for registering an anatomical feature of a patient using an intraoperative CT fixture (ICT) and a DRB, according to some embodiments. As shown in FIG. 7, in one application, a fiducial-based image-to-tracking registration can be utilized that uses an intraoperative CT fixture (ICT) 750 having a plurality of tracking markers 751 and radio-opaque fiducial reference markers 732 to register the CT space to the tracking space. After stabilizing the anatomical feature 728 (e.g., the patient's head) using clamping apparatus 730 such as a three-pin Mayfield frame and/or stereotactic frame, the surgeon will affix the ICT 750 to the anatomical feature 728, DRB 740, or clamping apparatus 730, so that it is in a static position relative to the tracking markers 742 of the DRB 740, which may be held in place by mounting arm 741 or other rigid means. A CT scan is captured that encompasses the fiducial reference markers 732 of the ICT 750 while also capturing relevant anatomy of the anatomical feature 728. Once the CT scan is loaded in the software, the system auto-identifies (through image processing) locations of the fiducial reference markers 732 of the ICT within the CT volume, which are in a fixed position relative to the tracking markers of the ICT 750, providing image-to-tracking registration. This registration, which was initially based on the tracking markers 751 of the ICT 750, is then related to or transferred to the tracking markers 742 of the DRB 740, and the ICT 750 may then be removed.



FIG. 8A illustrates a system 800 for registering an anatomical feature of a patient using a DRB and an X-ray cone beam imaging device, according to some embodiments. An intraoperative scanner 852, such as an X-ray machine or other scanning device, may have a tracking array 854 with tracking markers 855 mounted thereon for registration. Based on the fixed, known position of the tracking array 854 on the scanning device, the system may be calibrated to directly map (register) the tracking space to the image space of any scan acquired by the system. Once registration is achieved, the registration, which is initially based on the tracking markers 855 (e.g., gantry markers) of the scanner's array 854, is related or transferred to the tracking markers 842 of a DRB 840, which may be fixed to a clamping fixture 830 holding the patient's head 828 by a mounting arm 841 or other rigid means. After transferring registration, the markers on the scanner are no longer used and can be removed, deactivated, or covered if desired. Registering the tracking space to any image acquired by a scanner in this way may avoid the need for fiducials or other reference markers in the image space in some embodiments.



FIG. 8B illustrates an alternative system 800′ that uses a portable intraoperative scanner, referred to herein as a C-arm scanner 853. In this example, the C-arm scanner 853 includes a c-shaped arm 856 coupled to a movable base 858 to allow the C-arm scanner 853 to be moved into place and removed as needed, without interfering with other aspects of the surgery. The arm 856 is positioned around the patient's head 828 intraoperatively, and the arm 856 is rotated and/or translated with respect to the patient's head 828 to capture the X-ray or other type of scan needed to achieve registration, at which point the C-arm scanner 853 may be removed from the patient.


Another registration method for an anatomical feature of a patient, e.g., a patient's head, may be to use a surface contour map of the anatomical feature, according to some embodiments. A surface contour map may be constructed using a navigated or tracked probe, or other measuring or sensing device, such as a laser pointer, 3D camera, etc. For example, a surgeon may drag or sequentially touch points on the surface of the head with the navigated probe to capture the surface across unique protrusions, such as zygomatic bones, superciliary arches, the bridge of the nose, eyebrows, etc. The system then compares the resulting surface contours to contours detected from the CT and/or MR images, seeking the location and orientation of the contour that provides the closest match. To account for movement of the patient and to ensure that all contour points are taken relative to the same anatomical feature, each contour point is related to tracking markers on a DRB on the patient at the time it is recorded. Since the location of the contour map is known in tracking space from the tracked probe and tracked DRB, tracking-to-image registration is obtained once the corresponding contour is found in image space.
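The "closest match" search over contours can be sketched with a toy fit score. The contour points and the candidate offsets below are assumptions; a real system searches over full 3D rigid poses rather than the 2D translations used here for brevity.

```python
# Illustrative sketch (assumed data): scoring candidate surface matches by the
# mean distance from probed contour points to the nearest point of the image
# contour, keeping the candidate offset with the closest fit.
import math

def mean_nearest_distance(probed, contour):
    return sum(min(math.dist(p, q) for q in contour) for p in probed) / len(probed)

image_contour = [(0, 0), (1, 0.5), (2, 1.2), (3, 0.7)]   # contour from CT/MR
probed_points = [(1, 0.5), (2, 1.2)]                      # touched with probe

# Try candidate translations of the probed points and keep the best fit.
candidates = [(dx, 0.0) for dx in (-1.0, 0.0, 1.0)]
best = min(candidates, key=lambda c: mean_nearest_distance(
    [(x + c[0], y + c[1]) for x, y in probed_points], image_contour))
# best == (0.0, 0.0): the probed points already lie on the image contour
```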



FIG. 9 illustrates a system 900 for registering an anatomical feature of a patient using a navigated or tracked probe and fiducials for point-to-point mapping of the anatomical feature 928 (e.g., a patient's head), according to some embodiments. The software instructs the user to point with a tracked probe to a series of anatomical landmark points that can be found in the CT or MR image. When the user points to the landmark indicated by the software, the system captures a frame of tracking data with the tracked locations of tracking markers on the probe and on the DRB. From the tracked locations of markers on the probe, the coordinates of the tip of the probe are calculated and related to the locations of markers on the DRB. Once three or more points are found in both spaces, tracking-to-image registration is achieved. As an alternative to pointing to natural anatomical landmarks, fiducials 954 (i.e., fiducial markers), such as sticker fiducials or metal fiducials, may be used. The surgeon attaches the fiducials 954, which are constructed of material that is opaque on imaging (for example, containing metal if used with CT or vitamin E if used with MR), to the patient. Imaging (CT or MR) occurs after the fiducials 954 are placed. The surgeon or user then manually finds the coordinates of the fiducials in the image volume, or the software finds them automatically with image processing. After attaching a DRB 940 with tracking markers 942 to the patient through a mounting arm 941 connected to a clamping apparatus 930 or other rigid means, the surgeon or user may also locate the fiducials 954 in physical space relative to the DRB 940 by touching the fiducials 954 with a tracked probe while simultaneously recording tracking markers on the probe (not shown) and on the DRB 940. Registration is achieved because the coordinates of the same points are known in both the image space and the tracking space.
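The three-point paired-point registration described above can be sketched by building an orthonormal frame from the same three fiducials in each space and comparing the frames. The fiducial coordinates below are assumed (tracking space is a pure 100 mm shift of image space), and noise-free points are assumed; practical systems use a least-squares fit over all points.

```python
# Sketch of 3-point paired-point registration (assumed fiducial coordinates):
# an orthonormal frame is built from the three fiducials in image space and
# in tracking space; composing the frames gives the rigid registration.
import math

def frame_from_points(p0, p1, p2):
    """Orthonormal frame (origin + 3 axes) from three non-collinear points."""
    def sub(a, b): return tuple(x - y for x, y in zip(a, b))
    def cross(a, b): return (a[1]*b[2]-a[2]*b[1], a[2]*b[0]-a[0]*b[2], a[0]*b[1]-a[1]*b[0])
    def norm(a):
        n = math.sqrt(sum(x*x for x in a))
        return tuple(x/n for x in a)
    x = norm(sub(p1, p0))
    z = norm(cross(x, sub(p2, p0)))
    y = cross(z, x)
    return p0, x, y, z

# The same three fiducials touched in image space and in tracking space:
img = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
trk = [(100.0, 0.0, 0.0), (101.0, 0.0, 0.0), (100.0, 1.0, 0.0)]

o_i, *axes_i = frame_from_points(*img)
o_t, *axes_t = frame_from_points(*trk)
assert axes_i == axes_t                          # identical orientation
offset = tuple(t - i for i, t in zip(o_i, o_t))  # (100.0, 0.0, 0.0)
```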


One use for the embodiments described herein is to plan trajectories and to control a robot to move into a desired trajectory, after which the surgeon will place implants such as electrodes through a guide tube held by the robot. Additional functionalities include exporting coordinates used with existing stereotactic frames, such as a Leksell frame, which uses five coordinates: X, Y, Z, Ring Angle and Arc Angle. These five coordinates are established using the target and trajectory identified in the planning stage relative to the image space and knowing the position and orientation of the ring and arc relative to the stereotactic frame base or other registration fixture.
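Converting a planned trajectory direction into two frame angles can be sketched as below. This does not reproduce the Leksell coordinate convention; it assumes a simple spherical decomposition (a horizontal rotation angle and a tilt from vertical) purely to illustrate the kind of export calculation involved.

```python
# Hedged sketch: decomposing a planned trajectory direction into two frame
# angles. The decomposition convention here is an assumption, not the Leksell
# standard: "ring" = rotation in the horizontal plane, "arc" = tilt from vertical.
import math

def frame_angles(direction):
    dx, dy, dz = direction
    n = math.sqrt(dx*dx + dy*dy + dz*dz)
    dx, dy, dz = dx/n, dy/n, dz/n
    ring = math.degrees(math.atan2(dy, dx))   # rotation in the horizontal plane
    arc = math.degrees(math.acos(dz))         # tilt away from vertical
    return ring, arc

# A trajectory tilted 45 degrees from vertical in the x-z plane:
ring, arc = frame_angles((1.0, 0.0, 1.0))
```

The X, Y, Z offsets are obtained separately from the planned target position relative to the frame base.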


As shown in FIG. 10, stereotactic frames allow a target location 1058 of an anatomical feature 1028 (e.g., a patient's head) to be treated as the center of a sphere and the trajectory can pivot about the target location 1058. The trajectory to the target location 1058 is adjusted by the ring and arc angles of the stereotactic frame (e.g., a Leksell frame). These coordinates may be set manually, and the stereotactic frame may be used as a backup or as a redundant system in case the robot fails or cannot be tracked or registered successfully. The linear x,y,z offsets to the center point (i.e., target location 1058) are adjusted via the mechanisms of the frame. A cone 1060 is centered around the target location 1058, and shows the adjustment zone that can be achieved by modifying the ring and arc angles of the Leksell or other type of frame. This figure illustrates that a stereotactic frame with ring and arc adjustments is well suited for reaching a fixed target location from a range of angles while changing the entry point into the skull.



FIG. 11 illustrates a two-dimensional visualization of a virtual point rotation mechanism, according to some embodiments. In this embodiment, the robotic arm is able to create a different type of point-rotation functionality that enables a new movement mode that is not easily achievable with a 5-axis mechanical frame, but that may be achieved using the embodiments described herein. Through coordinated control of the robot's axes using the registration techniques described herein, this mode allows the user to pivot the robot's guide tube about any fixed point in space. For example, the robot may pivot about the entry point 1162 into the anatomical feature 1128 (e.g., a patient's head). This entry point pivoting is advantageous as it allows the user to make a smaller burr hole without limiting their ability to adjust the target location 1164 intraoperatively. The cone 1160 represents the range of trajectories that may be reachable through a single entry hole. Additionally, entry point pivoting is advantageous as it allows the user to reach two different target locations 1164 and 1166 through the same small entry burr hole. Alternatively, the robot may pivot about a target point (e.g., location 1058 shown in FIG. 10) within the skull to reach the target location from different angles or trajectories, as illustrated in FIG. 10. Pivoting about an interior point robotically has the same advantages as a stereotactic frame, as it allows the user to approach the same target location 1058 from multiple approaches, such as when irradiating a tumor or when adjusting a path so that critical structures such as blood vessels or nerves will not be crossed when reaching targets beyond them. Unlike a stereotactic frame, which relies on fixed ring and arc articulations to keep a target/pivot point fixed, the robot adjusts the pivot point through controlled activation of its axes and can therefore dynamically adjust its pivot point and switch as needed between the modes illustrated in FIGS. 10 and 11.
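The virtual pivot described above reduces to rotating points about a fixed center, p' = R(p - c) + c. The two-dimensional coordinates below are assumed for illustration; the robot performs the equivalent computation in 3D through coordinated joint motion.

```python
# 2-D sketch of the virtual pivot: rotating a guide-tube point about a fixed
# entry point c by angle theta, i.e. p' = R(p - c) + c. Coordinates assumed.
import math

def pivot(p, c, theta_deg):
    th = math.radians(theta_deg)
    dx, dy = p[0] - c[0], p[1] - c[1]
    return (c[0] + dx*math.cos(th) - dy*math.sin(th),
            c[1] + dx*math.sin(th) + dy*math.cos(th))

entry = (0.0, 0.0)          # burr-hole entry point (the pivot)
tube_tip = (0.0, -10.0)     # a point on the guide tube, 10 mm from the entry

rotated = pivot(tube_tip, entry, 90.0)
# Rotating about the entry keeps the point's distance to the pivot fixed.
```

Choosing c as the entry point gives the mode of FIG. 11; choosing c as an interior target point gives the frame-like mode of FIG. 10.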


Following the insertion of implants or instrumentation using the robot or ring and arc fixture, these and other embodiments may allow implant locations to be verified using intraoperative imaging. Placement accuracy of the instrument or implant relative to the planned trajectory can be qualitatively and/or quantitatively shown to the user. One option for comparing the planned position to the placed position is to merge a postoperative verification CT image to any of the preoperative images. Once the pre- and post-operative images are merged and the plan is shown overlaid, the shadow of the implant on the postoperative CT can be compared to the plan to assess accuracy of placement. Detection of the shadow artifact on the post-op CT can be performed automatically through image processing, and the offset displayed numerically in terms of millimeters of offset at the tip and entry and angular offset along the path. This option does not require any fiducials to be present in the verification image since image-to-image registration is performed based on bony anatomical contours.
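The numeric accuracy report described above (millimeters of offset at tip and entry, plus an angular offset) can be sketched as a comparison of two line segments. The planned and detected coordinates below are assumed values in a merged image space.

```python
# Sketch of the accuracy report (assumed coordinates, mm): offsets at tip and
# entry between the planned path and the implant shadow detected on post-op
# CT, plus the angular offset between the two paths.
import math

def path_offsets(planned_entry, planned_tip, placed_entry, placed_tip):
    def sub(a, b): return tuple(x - y for x, y in zip(a, b))
    def length(v): return math.sqrt(sum(x*x for x in v))
    tip_off = length(sub(placed_tip, planned_tip))
    entry_off = length(sub(placed_entry, planned_entry))
    u, v = sub(planned_tip, planned_entry), sub(placed_tip, placed_entry)
    cosang = sum(a*b for a, b in zip(u, v)) / (length(u) * length(v))
    return tip_off, entry_off, math.degrees(math.acos(max(-1.0, min(1.0, cosang))))

tip_mm, entry_mm, ang_deg = path_offsets(
    planned_entry=(0, 0, 0), planned_tip=(0, 0, 80),
    placed_entry=(1, 0, 0),  placed_tip=(1, 0, 80))
# Parallel paths shifted 1 mm laterally: 1 mm at tip and entry, 0 deg angular.
```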


A second option for comparing planned position to the final placement would utilize intraoperative fluoro with or without an attached fluoro fixture. Two out-of-plane fluoro images will be taken and these fluoro images will be matched to DRRs generated from pre-operative CT or MR as described above for registration. Unlike some of the registration methods described above, however, it may be less important for the fluoro images to be tracked because the key information is where the electrode is located relative to the anatomy in the fluoro image. The linear or slightly curved shadow of the electrode would be found on a fluoro image, and once the DRR corresponding to that fluoro shot is found, this shadow can be replicated in the CT image volume as a plane or sheet that is oriented in and out of the ray direction of the fluoro image and DRR. That is, the system may not know how deep in or out of the fluoro image plane the electrode lies on a given shot, but can calculate the plane or sheet of possible locations and represent this plane or sheet on the 3D volume. In a second fluoro view, a different plane or sheet can be determined and overlaid on the 3D image. Where these two planes or sheets intersect on the 3D image is the detected path of the electrode. The system can represent this detected path as a graphic on the 3D image volume and allow the user to reslice the image volume to display this path and the planned path from whatever perspective is desired, also allowing automatic or manual calculation of the deviation from planned to placed position of the electrode. Tracking the fluoro fixture is unnecessary but may be done to help de-warp the fluoro images and calculate the location of the x-ray emitter to improve accuracy of DRR calculation, the rate of convergence when iterating to find matching DRR and fluoro shots, and placement of sheets/planes representing the electrode on the 3D scan.
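The step of intersecting the two back-projected sheets can be sketched, for the flat-plane case, with a standard plane-plane intersection. The plane equations below are assumed toy values; curved sheets from de-warped fluoro would require a numerical intersection instead.

```python
# Sketch of recovering the electrode path as the intersection of the two
# planes back-projected from the two fluoro views (assumed plane equations).

def cross(a, b):
    return (a[1]*b[2]-a[2]*b[1], a[2]*b[0]-a[0]*b[2], a[0]*b[1]-a[1]*b[0])

def plane_intersection(n1, d1, n2, d2):
    """Line of intersection of the planes n1.x = d1 and n2.x = d2."""
    direction = cross(n1, n2)
    # A point on both planes: x = (d1*(n2 x dir) + d2*(dir x n1)) / |dir|^2
    denom = sum(c*c for c in direction)
    p = tuple((d1*a + d2*b)/denom
              for a, b in zip(cross(n2, direction), cross(direction, n1)))
    return p, direction

# View 1 constrains the electrode to the plane x = 5; view 2 to y = 3:
point, direction = plane_intersection((1, 0, 0), 5.0, (0, 1, 0), 3.0)
# Detected path: the line through (5, 3, 0) along (0, 0, 1)
```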


In this and other examples, it is desirable to maintain navigation integrity, i.e., to ensure that the registration and tracking remain accurate throughout the procedure. Two primary methods to establish and maintain navigation integrity include tracking the position of a surveillance marker relative to the markers on the DRB, and checking landmarks within the images. In the first method, should the position of the surveillance marker relative to the DRB markers change due to, for example, the DRB being bumped, the system may alert the user of a possible loss of navigation integrity. In the second method, if a landmark check shows that the anatomy represented in the displayed slices on screen does not match the anatomy at which the tip of the probe points, then the surgeon will also become aware that there is a loss of navigation integrity. In either method, if using the registration method of CT localizer and frame reference array (FRA), the surgeon has the option to re-attach the FRA, which mounts in only one possible way to the frame base, and to restore tracking-to-image registration based on the FRA tracking markers and the stored fiducials from the CT localizer 536. This registration can then be transferred or related to tracking markers on a repositioned DRB. Once the registration is transferred, the FRA can be removed if desired.
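The surveillance-marker check in the first method can be sketched as a tolerance test on stored distances. The marker coordinates and the 1 mm tolerance below are assumptions for illustration.

```python
# Sketch of the surveillance-marker integrity check (assumed coordinates and
# an assumed 1 mm tolerance): distances from the surveillance marker to each
# DRB marker are compared against the values stored at registration time.
import math

def integrity_ok(surveillance, drb_markers, baseline, tol_mm=1.0):
    current = [math.dist(surveillance, m) for m in drb_markers]
    return all(abs(c - b) <= tol_mm for c, b in zip(current, baseline))

drb = [(0, 0, 0), (50, 0, 0), (0, 50, 0)]
surv = (100.0, 0.0, 0.0)
baseline = [math.dist(surv, m) for m in drb]          # stored at registration

assert integrity_ok(surv, drb, baseline)              # nothing has moved
bumped_drb = [(0, 0, 0), (50, 0, 0), (0, 55, 0)]      # one DRB marker moved 5 mm
assert not integrity_ok(surv, bumped_drb, baseline)   # alert the user
```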


Referring now to FIGS. 12-18 generally, with reference to the surgical robot system 100 shown in FIG. 1A, end-effector 112 may be equipped with components, configured, or otherwise include features so that one end-effector may remain attached to a given one of robot arms 104 without changing to another end-effector for multiple different surgical procedures, such as, by way of example only, Deep Brain Stimulation (DBS), Stereoelectroencephalography (SEEG), or Endoscopic Navigation and Tumor Biopsy. As discussed previously, end-effector 112 may be orientable to oppose an anatomical feature of a patient so as to be in operative proximity thereto and to receive one or more surgical tools for operations contemplated on the anatomical feature proximate to the end-effector 112. Motion and orientation of end-effector 112 may be accomplished through any of the navigation, trajectory guidance, or other methodologies discussed herein or as may be otherwise suitable for the particular operation.


End-effector 112 is suitably configured to permit a plurality of surgical tools 129 to be selectively connectable to end-effector 112. Thus, for example, a stylet 113 (FIG. 13) may be selectively attached in order to localize an incision point on an anatomical feature of a patient, or an electrode driver 115 (FIG. 14) may be selectively attached to the same end-effector 112.


With reference to the previous discussion of robot surgical system 100, a processor circuit, as well as memory accessible by such processor circuit, includes various subroutines and other machine-readable instructions configured to cause, when executed, end-effector 112 to move, such as by GPS movement, relative to the anatomical feature, at predetermined stages of associated surgical operations, whether pre-operative, intra-operative or post-operative.


End-effector 112 includes various components and features to either prevent or permit end-effector movement depending on whether and which tools 129, if any, are connected to end-effector 112. Referring more particularly to FIG. 12, end-effector 112 includes a tool-insert locking mechanism 117 located on and connected to proximal surface 119. Tool-insert locking mechanism 117 is configured so as to secure any selected one of a plurality of surgical tools, such as the aforesaid stylet 113, electrode driver 115, or any other tools for different surgeries mentioned previously or as may be contemplated by other applications of this disclosure. The securement of the tool by tool-insert locking mechanism 117 is such that, for any of multiple tools capable of being secured to locking mechanism 117, each such tool is operatively and suitably secured at a predetermined height, angle of orientation, and rotational position relative to the anatomical feature of the patient, such that multiple tools may be secured to the same end-effector 112 in respective positions appropriate for the contemplated procedure.


Another feature of the end-effector 112 is a tool stop 121 located on distal surface 123 of end-effector 112, that is, the surface generally opposing the patient. Tool stop 121 has a stop mechanism 125 and a sensor 127 operatively associated therewith, as seen with reference to FIGS. 16, 19, and 20. Stop mechanism 125 is mounted to end-effector 112 so as to be selectively movable relative thereto between an engaged position, which prevents any of the tools from being connected to end-effector 112, and a disengaged position, which permits any of the tools 129 to be selectively connected to end-effector 112. Sensor 127 may be located on or within the housing of end-effector 112 at any suitable location (FIGS. 12, 14, 16) so that sensor 127 detects whether stop mechanism 125 is in the engaged or disengaged position. Sensor 127 may assume any form suitable for such detection, such as any type of mechanical switch or any type of magnetic sensor, including Reed switches, Hall Effect sensors, or other magnetic field detecting devices. In one possible implementation, sensor 127 has two portions, a Hall Effect sensor portion (not shown) and a magnetic portion 131, the two portions moving relative to each other so as to generate and detect two magnetic fields corresponding to the respective engaged and disengaged positions. In the illustrated implementation, the magnetic portion comprises two rare earth magnets 131 which move relative to the complementary sensing portion (not shown) mounted in the housing of end-effector 112 in operative proximity to magnets 131 to detect changes in the associated magnetic field from movement of stop mechanism 125 between the engaged and disengaged positions. In this implementation the Hall Effect sensor is bipolar and can detect whether a North pole or South pole of a magnet opposes the sensor. Magnets 131 are configured so that the North pole of one magnet faces the path of the sensor and the South pole of the other magnet faces the path of the sensor.
In this configuration, the sensor senses an increased signal when it is near one magnet (for example, in the disengaged position), a decreased signal when it is near the other magnet (for example, in the engaged position), and an unchanged signal when it is not in proximity to either magnet. In this implementation, in response to detection of stop mechanism 125 being in the disengaged position shown in FIGS. 13 and 19, sensor 127 causes the processor of surgical robot system 100 to execute suitable instructions to prevent movement of end-effector 112 relative to the anatomical feature. Such movement prevention may be appropriate for any number of reasons, such as when a tool is connected to end-effector 112, such tool potentially interacting with the anatomical feature of the patient.


Another implementation of sensor 127 for detecting whether stop mechanism 125 is engaged or disengaged could comprise a single magnet behind the housing (not shown) and two Hall effect sensors located where magnets 131 are shown in the preferred embodiment. In such a configuration, monopolar Hall effect sensors are suitable and would be configured so that Sensor 1 detects a signal when the magnet is in proximity due to the locking mechanism being disengaged, while Sensor 2 detects a signal when the same magnet is in proximity due to the locking mechanism being engaged. Neither sensor would detect a signal when the magnet is between positions or out of proximity to either sensor. Although a configuration could be conceived in which a sensor is active for the engaged position and inactive for the disengaged position, a configuration with three signals indicating engaged, disengaged, or transitional is preferred to ensure correct behavior in case of power failure.
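The alternative single-magnet, dual-sensor arrangement yields the same three signals from two boolean readings. A minimal sketch, with hypothetical names not taken from the disclosure:

```python
# Illustrative model of the single-magnet, two-sensor arrangement.
# All names are hypothetical; the mapping follows the description above.

def classify_stop_position(sensor1_active: bool, sensor2_active: bool) -> str:
    """Sensor 1 sees the magnet only when the stop is disengaged;
    Sensor 2 sees it only when the stop is engaged. Neither sensor
    active (magnet between positions, out of range, or power lost) is
    reported as a distinct transitional state rather than defaulting
    to a valid position, matching the preferred three-signal behavior.
    Both sensors active is physically unexpected and is likewise
    treated as transitional as a fail-safe.
    """
    if sensor1_active and not sensor2_active:
        return "disengaged"
    if sensor2_active and not sensor1_active:
        return "engaged"
    return "transitional"
```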


End-effector 112, tool stop 121, and tool-insert locking mechanism 117 each have co-axially aligned bores or apertures such that any selected one of the plurality of surgical tools 129 may be received through such bores and apertures. In this implementation, end-effector 112 has a bore 133, and tool stop 121 and tool-insert locking mechanism 117 have respective apertures 135 and 137. Stop mechanism 125 includes a ring 139 axially aligned with bore 133 and aperture 135 of tool stop 121. Ring 139 is selectively, manually rotatable in the directions indicated by arrow A (FIG. 16) so as to move stop mechanism 125 between the engaged position and the disengaged position.


In one possible implementation, the selective rotation of ring 139 includes features which enable ring 139 to be locked in either the disengaged or engaged position. So, for example, as illustrated, a detent mechanism 141 is located on and mounted to ring 139 in any suitable way to lock ring 139 against certain rotational movement out of a predetermined position, in this case, such position being when stop mechanism 125 is in the engaged position. Although various forms of detent mechanism are contemplated herein, one suitable arrangement has a manually accessible head extending circumferentially outwardly from ring 139 and having a male protrusion (not shown) spring-loaded axially inwardly to engage a corresponding female detent portion (not shown). Detent mechanism 141, as such, is manually actuatable to unlock ring 139 from its engaged position to permit ring 139 to be manually rotated to cause stop mechanism 125 to move from the engaged position (FIG. 20) to the disengaged position (FIG. 19).


Tool stop 121 includes a lever arm 143 pivotally mounted adjacent aperture 135 of tool stop 121 so that an end of lever arm 143 selectively pivots in the directions indicated by arrow B (FIGS. 16, 19 and 20). Lever arm 143 is operatively connected to stop mechanism 125 such that it closes aperture 135 of tool stop 121 in response to stop mechanism 125 being in the engaged position, as shown in FIG. 20. Lever arm 143 is also operatively connected so as to pivot back in the direction of arrow B to open aperture 135 in response to stop mechanism 125 being in the disengaged position. As such, movement of stop mechanism 125 between the engaged and disengaged positions results in closure or opening of aperture 135, respectively, by lever arm 143.


Lever arm 143, in this implementation, is not only pivotally mounted adjacent aperture 135, but also pivots in parallel with a distal plane defined at a distal-most point of distal surface 123 of end-effector 112. In this manner, any one of the surgical tools 129 that a user attempts to insert through bore 133 and aperture 135 is stopped from being inserted past the distal plane in which lever arm 143 rotates to close aperture 135.


Turning now to tool-insert locking mechanism 117 (FIGS. 13, 17, 18), a connector 145 is configured to mate with and secure any one of the surgical tools 129 at its appropriate height, angle of orientation, and rotational position relative to the anatomical feature of the patient. In the illustrated implementation, connector 145 comprises a rotatable flange 147 which has at least one slot 149 formed therein to receive therethrough a corresponding tongue 151 associated with a selected one of the plurality of tools 129. So, for example, in FIG. 14, the particular electrode driver 115 has multiple tongues, one of which tongue 151 is shown. Rotatable flange 147, in some implementations, may comprise a collar 153, which collar, in turn, has multiple ones of slots 149 radially spaced on a proximally oriented surface 155, as best seen in FIG. 12. Multiple slots 149 arranged around collar 153 are sized or otherwise configured so as to receive therethrough corresponding ones of multiple tongues 151 associated with a selected one of the plurality of tools 129. Therefore, as seen in FIG. 13, multiple slots 149 and corresponding tongues 151 may be arranged to permit securing of a selected one of the plurality of tools 129 only when the selected tool is in the correct, predetermined angle of orientation and rotational position relative to the anatomical feature of the patient. Similarly, with regard to the electrode driver shown in FIG. 14, tongues 151 (one of which is shown in a cutaway of FIG. 14) have been received in radially spaced slots 149 arrayed so that electrode driver 115 is received at the appropriate angle of orientation and rotational position.


Rotatable flange 147 has, in this implementation, a grip 173 to facilitate manual rotation between an open and closed position as shown in FIGS. 17 and 18, respectively. As seen in FIG. 17, multiple sets of mating slots 149 and tongues 151 are arranged at different angular locations, in this case, locations which may be symmetric about a single diametric chord of a circle but otherwise radially asymmetric, and at least one of the slots has a different dimension or extends through a different arc length than other slots. In this slot-tongue arrangement, and any number of variations contemplated by this disclosure, there is only one rotational position at which the tool 129 (or adapter 155 discussed later) may be received in tool-insert locking mechanism 117 when rotatable flange 147 is in the open position shown in FIG. 17. In other words, when the user of system 100 moves a selected tool 129 (or tool adapter 155) to the single appropriate rotational position, corresponding tongues 151 may be received through slots 149. Upon placement of tongues 151 into slots 149, tongues 151 confront a base surface 175 within connector 145 of rotatable flange 147. Upon receiving tongues 151 into slots 149 and having them rest on underlying base surface 175, the dimensions of tongues 151 and slots 149, especially with regard to height relative to rotatable flange 147, are selected so that when rotatable flange 147 is rotated to the closed position, flange portions 157 are radially translated to overlie or engage portions of tongues 151. Such engagement is shown in FIG. 18 and affixes tool 129 (or adapter 155) received in connector 145 at the desired, predetermined height, angle of orientation, and rotational position relative to the anatomical feature of the patient.


Tongues 151 described as being associated with tools 129 may be directly connected to such tools 129, and/or tongues 151 may be located on and mounted to the above-mentioned adapter 155, such as that shown in FIGS. 12, 17 and 18, such adapter 155 configured to interconnect at least one of the plurality of surgical tools 129 with end-effector 112. In the described implementation, adapter 155 includes two operative portions: a tool receiver 157 adapted to connect the selected one or more surgical tools 129, and one or more tongues 151 which may, in this implementation, be mounted and connected to the distal end of adapter 155.


Adapter 155 has an outer perimeter 159 which, in this implementation, is sized to oppose an inner perimeter 161 of rotatable flange 147. Adapter 155 extends between proximal and distal ends 163, 165, respectively, and has an adapter bore 167 extending between ends 163, 165. Adapter bore 167 is sized to receive at least one of the plurality of surgical tools 129, and similarly, the distance between proximal and distal ends 163, 165 is selected so that at least one of tools 129 is secured to end-effector 112 at the predetermined, appropriate height for the surgical procedure associated with such tool received in adapter bore 167.


In one possible implementation, system 100 includes multiple ones of adapter 155, configured to be interchangeable inserts 169 having substantially the same, predetermined outer perimeters 159 to be received within inner perimeter 161 of rotatable flange 147. Still further in such implementation, the interchangeable inserts 169 have bores of different, respective diameters, which bores may be selected to receive corresponding ones of the tools 129 therein. Bores 167 may comprise cylindrical bushings having inner diameters common to multiple surgical tools 129. One possible set of diameters for bores 167 may be 12, 15, and 17 millimeters, suitable for multiple robotic surgery operations, such as those identified in this disclosure.
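Selection among the interchangeable inserts reduces to picking the smallest bore that still clears the tool's diameter. A minimal sketch, assuming the example diameter set given above; the helper name is hypothetical:

```python
# Hypothetical helper for choosing among interchangeable inserts 169:
# pick the smallest bushing bore that still clears the tool's diameter.
# The available diameters follow the example set named above.

AVAILABLE_BORE_DIAMETERS_MM = (12, 15, 17)

def select_insert_bore(tool_diameter_mm: float) -> int:
    """Return the smallest available bore diameter that fits the tool."""
    for bore in sorted(AVAILABLE_BORE_DIAMETERS_MM):
        if bore >= tool_diameter_mm:
            return bore
    raise ValueError(f"no insert accommodates a {tool_diameter_mm} mm tool")
```

For instance, a 14.5 mm tool would be matched to the 15 mm insert, while a tool wider than 17 mm has no matching insert in this example set.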


In the illustrated implementation, inner perimeter 161 of rotatable flange 147 and outer perimeter 159 of adapter 155 are circular, having central, aligned axes and corresponding radii. Slots 149 of rotatable flange 147 extend radially outwardly from the central axis of rotatable flange 147 in the illustrated implementation, whereas tongues 151 of adapter 155 extend radially outwardly from adapter 155.


In still other implementations, end-effector 112 may be equipped with at least one illumination element 171 (FIGS. 14 and 15) orientable toward the anatomical feature to be operated upon. Illumination element 171 may be in the form of a ring of LEDs 177 (FIG. 14) located within adapter 155, which adapter is in the form of a bushing secured to tool-insert locking mechanism 117. Illumination element 171 may also be a single LED 179 mounted on the distal surface 123 of end-effector 112. Whether in the form of LED ring 177 or single LED 179 mounted on the distal surface of end-effector 112, or any other variation, the spacing and location of illumination element or elements 171 may be selected so that tools 129 received through bore 133 of end-effector 112 do not cast shadows or otherwise interfere with illumination from element 171 of the anatomical feature being operated upon.


The operation and associated features of end-effector 112 are readily apparent from the foregoing description. Tool stop 121 is rotatable, selectively lockable, and movable between engaged and disengaged positions, and sensor 127 causes movement of end-effector 112 to be prevented when stop mechanism 125 is in the disengaged position, due to the potential presence of a tool which should not be moved while the stop is disengaged. Tool-insert locking mechanism 117 is likewise rotatable between open and closed positions to receive one of a plurality of interchangeable inserts 169 and tongues 151 of such inserts, wherein selected tools 129 may be received in such inserts 169; alternatively, tongues 151 may be otherwise associated with tools 129, such as by having tongues 151 directly connected to such tools 129, which tongue-equipped tools likewise may be received in corresponding slots 149 of tool-insert locking mechanism 117. Tool-insert locking mechanism 117 may be rotated from its open position, in which tongues 151 have been received in slots 149, to secure associated adapters 155 and/or tools 129 so that they are at appropriate, respective heights, angles of orientation, and rotational positions relative to the anatomical feature of the patient.


For those implementations with multiple adapters 155, the dimensions of such adapters 155, including bore diameters, height, and other suitable dimensions, are selected so that a single or a minimized number of end-effectors 112 can be used for a multiplicity of surgical tools 129. Adapters 155, such as those in the form of interchangeable inserts 169 or cylindrical bushings, may facilitate connecting an expanded set of surgical tools 129 to the end-effector 112, and thus likewise facilitate a corresponding expanded set of associated surgical features using the same end-effector 112.


Another possible embodiment of surgical robot system 100 shown in FIG. 1A is described below with reference to FIGS. 21-25. In this implementation, end-effector 212 is suitably connected to a robot arm, such as that described previously with reference to robot arm 104, and is orientable to oppose an anatomical feature a so as to be in operative proximity thereto. End-effector 212 may include features similar to those described with reference to end-effector 112 of FIGS. 12-18, such as a tool-insert locking mechanism 217 and tool stop 221, which correspond to tool-insert locking mechanism 117 and tool stop 121 described previously. However, it will be appreciated that such features are not required in end-effector 212 and various other features or configurations of end-effector 212 are contemplated by the disclosure with reference to FIGS. 21-25.


End-effector 212 has a surgical tool comprising a drill 223 selectively connectible thereto. Processing circuitry, memory, and suitable machine-readable instructions are associated with this implementation of surgical robot system 100 so as to determine, for drill 223, a drill target and an associated drill trajectory relative to anatomical feature a. Suitable instructions are likewise provided such that, when executed, the end-effector may be automatically moved to a position corresponding to the determined target and determined drill trajectory. Such instructions also permit advancement of the drill toward the target. Manual manipulation of end-effector 212 to drill locations and trajectories is likewise contemplated.


A drill guide 225 is releasably connected between drill 223 and end-effector 212. As explained below, drill guide 225 may be configured and otherwise includes features to stop advancement of drill 223 at a preselected drill depth associated with the determined target. Drill guide 225 may be further configured with features to guide the drill along the determined drill trajectory during the advancement of the drill.


Drill guide 225 includes a guide shaft 227 having a bore 229 extending longitudinally therethrough between proximal and distal ends of guide shaft 227. Bore 229 is sized to slideably receive a drill bit 231 which extends distally from a chuck 233 of drill 223. Such drill bit 231 terminates in a drill tip 235 capable of engaging the anatomical feature a upon a suitable amount of advancement of drill bit 231 along the determined drill trajectory.


Referring more particularly to FIG. 22, drill bit 231 has a known, that is, predetermined, length A corresponding to the distance between the distal surface 237 of chuck 233 and the end of drill tip 235. Guide shaft 227, in turn, has a known, that is, predetermined, second length B. Drill guide 225 includes a depth stop 239 having respective proximal and distal depth stop ends. Depth stop 239 is mounted so that its distal depth stop end is selectively and slideably received at the proximal end of guide shaft 227. As such, depth stop 239, when slid relative to guide shaft 227, varies the second length B of guide shaft 227. In particular, movement of depth stop 239 varies the predetermined length B of guide shaft 227 among a plurality or set of length values corresponding to amounts by which drill tip 235 extends beyond the distal end of guide shaft 227, such amounts thus corresponding to available drill depths. Such selected lengths B are less than the known, predetermined length A of drill bit 231, thereby permitting the distal end of drill bit 231 and its drill tip 235 to extend distally from the distal end of guide shaft 227 by selected lengths corresponding to desired depths of engagement of the anatomical feature. In the disclosed implementation, depth stop 239 has a surface 241 oriented and located to engage distal surface 237 of chuck 233 of drill 223. Accordingly, the depth to which drill tip 235 is advanceable by drill 223 is limited by an amount corresponding to the position of depth stop 239.
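The depth relationship described above reduces to simple arithmetic: the drill tip can protrude, and thus penetrate, by the difference A − B. A minimal sketch, with hypothetical function names; the values in the comments are illustrative only:

```python
# Depth arithmetic implied by the description: drill bit length A is
# fixed, while sliding depth stop 239 varies the effective guide
# length B. The drill tip can protrude, and thus penetrate, by A - B.
# Variable names mirror labels A and B in FIG. 22.

def max_drill_depth(bit_length_a_mm: float, guide_length_b_mm: float) -> float:
    """Amount by which drill tip 235 can extend past the guide's distal end."""
    if guide_length_b_mm >= bit_length_a_mm:
        return 0.0  # guide fully shrouds the bit; no penetration possible
    return bit_length_a_mm - guide_length_b_mm

def guide_length_for_depth(bit_length_a_mm: float, desired_depth_mm: float) -> float:
    """Effective guide length B to which the depth stop should be set
    for a desired drill depth."""
    return bit_length_a_mm - desired_depth_mm
```

For example, with an illustrative 150 mm bit, setting the effective guide length to 130 mm exposes 20 mm of drill tip, and lengthening B toward A reduces the available depth to zero.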


In one possible implementation, drill guide 225 includes a visually perceptible depth indicator 251 operatively associated with depth stop 239. By way of example, depth stop 239 has a longitudinal stem 245 having indicia 243 disposed thereon, so that slideable movement of depth stop 239 slides longitudinal stem 245 and the indicia 243. Guide shaft 227, in turn, has portions forming a window 247 sized and located to reveal at least a portion of longitudinal stem 245 bearing indicia 243. The portions adjacent to window 247 have one or more structures or indicia thereon to form a pointer 249. Indicia 243, in this case, includes a graduated, numerical value scale associated with available depths of penetration of anatomical feature a, the pointer and numerical value scale being moveable relative to each other, in this case by sliding longitudinal stem 245 of depth stop 239 relative to window 247 of guide shaft 227.


It will be appreciated that the above-described relative arrangement of pointer 249 and graduated depths of advancement appearing as numerical values in indicia 243 may be disposed in alternative configurations, such as having the pointer on the slideable longitudinal stem 245 and the graduated depths of advancement disposed longitudinally along portions of window 247. Still further variations are possible.


In a still further possible implementation, drill guide 225 makes use of an engagement mechanism 253 which sets depth stop 239 at one of the available, selectable, longitudinal positions corresponding to the desired or predetermined drill depth. Engagement mechanism 253 may include a ratchet assembly 255 which has features, such as mating teeth 257 as shown, to engage and set depth stop 239 at one of the longitudinally spaced locations between the proximal and distal ends of depth stop 239. In the ratchet assembly 255 shown, teeth 257 are longitudinally disposed along a suitable outer surface of longitudinal stem 245, and a ratchet 259 with a confronting surface feature, such as one or more mating teeth 257, is located to oppose the teeth 257 disposed on stem 245 and is spring-biased to selectively engage stem 245.


Ratchet 259 and corresponding ratchet assembly 255 may be operated by a spring-biased trigger 261. In the illustrated embodiment, trigger 261 is manually pulled against a spring-biasing force to disengage engagement mechanism 253 from a first one of the longitudinal positions to which it had been previously set. During such disengagement, depth stop 239 is operated to slide longitudinal stem 245 relative to guide shaft 227 to a selected or desired drill depth as indicated by pointer 249 relative to the scale of indicia 243. After movement of depth stop 239 to the desired longitudinal position, spring-biased trigger 261 may be released, allowing the spring-biasing force to set engagement mechanism 253 at a second one of the available longitudinal positions, the second longitudinal position corresponding to the desired drill depth.


Once a desired drill depth has been set by the ratchet assembly or other features of engagement mechanism 253, such setting may be locked or secured against inadvertent movement by a trigger lock 263 which is operatively connected to trigger 261, such that, when engaged, trigger lock 263 holds ratchet 259 engaged with corresponding teeth 257 of depth stop 239. In the illustrated implementation, trigger lock 263 may be in the form of a locking ring 265, which may be threadably or otherwise engaged to act as a stop against disengagement of mating teeth 257 of ratchet 259, or more generally, to prevent radially outward movement of ratchet 259 relative to the longitudinal axis of depth stop 239.


Actuation of engagement mechanism 253, including, for example, engagement and disengagement of depth stop 239, may be facilitated by providing drill guide 225 with a handle 267. Handle 267 may be pulled radially outwardly from the longitudinal axis of drill guide 225 and, by virtue of its connection to engagement mechanism 253, such outward pulling of handle 267 overcomes the spring-biasing force and disengages engagement mechanism 253 from depth stop 239. Conversely, release of handle 267 after the desired sliding of depth stop 239 relative to guide shaft 227 operates to set drill guide 225 at the desired drill depth.


In addition to limiting penetration of drill 223 to a desired drill depth, drill guide 225 may include features to reduce deviation of drill bit 231 from the determined drill trajectory β (FIG. 21). In one suitable implementation, such trajectory guiding features comprise at least one bushing 269 disposed within bore 229. Bushing or bushings 269 are sized to receive outer longitudinal surface 273 of drill bit 231 slideably and rotatably therethrough. Accordingly, bushing 269 has an internal diameter sized not only to moveably engage longitudinal surface 273, but thereby to exert a force, shown in the direction of arrow F, opposing lateral displacement of drill bit 231 within guide shaft 227 (FIG. 22). In one possible implementation, one bushing 269 is located within guide shaft 227 toward the distal end thereof and thereby engages drill bit 231 closer to where drill tip 235 engages anatomical feature a. Such distal locations of drill bit 231 often experience greater lateral displacement forces by virtue of their proximity to anatomical feature a to be engaged, especially in the event the drill trajectory is at an oblique angle.


In the illustrated embodiment, a second bushing 269 is disposed at a proximal location along guide shaft 227 and thereby may generate a countervailing force opposing lateral displacement at the proximal end of drill bit 231, such as would result from a cantilevering of drill bit 231 upon oblique engagement of anatomical feature a.


From the foregoing description, operations and related methods of surgical robot system 100 and its drill guide 225 will be readily appreciated. For example, in one method of operation, drill 223 may be guided during robot-assisted surgery, including any of the surgeries described herein on anatomical feature a of a patient. A drill target and an associated drill trajectory are determined, such as by computer processing. End-effector 212 is manually or automatically moved to a position corresponding to the determined target and the determined drill trajectory. Before or after such movement of end-effector 212, depth stop 239 may be mechanically or manually set to one of a plurality of selectable positions corresponding to the desired depth of advancement of the drill associated with the contemplated surgery on the anatomical feature. In this way, drill 223 is limited from penetrating the anatomical feature beyond the selected, desired depth.
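The method of operation just described can be summarized as an ordered control flow. The sketch below is purely illustrative; the class and function names are hypothetical stand-ins, not an actual interface of surgical robot system 100:

```python
# Illustrative control-flow sketch of the guided-drilling method above.
# DrillGuide and drill_to_target are hypothetical placeholders.

class DrillGuide:
    """Minimal stand-in for drill guide 225's set/lock depth stop behavior."""

    def __init__(self) -> None:
        self.depth_mm = None
        self.locked = False

    def set_depth_stop(self, depth_mm: float) -> None:
        # Mirrors trigger 261: the stop may only slide while unlocked.
        if self.locked:
            raise RuntimeError("release trigger lock before adjusting depth")
        self.depth_mm = depth_mm

    def lock_depth_stop(self) -> None:
        # Mirrors trigger lock 263 securing the selected depth.
        self.locked = True

def drill_to_target(guide: DrillGuide, requested_advance_mm: float) -> float:
    """Advance the drill; the locked depth stop caps actual penetration."""
    if not guide.locked or guide.depth_mm is None:
        raise RuntimeError("depth stop must be set and locked before drilling")
    return min(requested_advance_mm, guide.depth_mm)
```

For example, setting and locking a 20 mm stop limits even a 35 mm requested advance to 20 mm of penetration, and any attempt to re-adjust the stop while locked is refused.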


In one possible method, the displacement of the depth stop is done with the aid of visually perceptible indicia corresponding to graduated depths of advancement of the drill and, upon visual perception of the desired depth, the depth stop is secured at the desired position. To set the depth stop, spring-biased trigger 261 is actuated to disengage depth stop 239 from a first position; depth stop 239 is then longitudinally slid relative to visually perceptible indicia 243 to a second position, such second position corresponding to the desired position of the depth stop and the desired depth of drilling on anatomical feature a. Once the desired depth has been selected by movement of depth stop 239, trigger 261 is released to re-engage depth stop 239 at the desired position. Thereafter, trigger lock 263 may further be engaged to avoid inadvertent disengagement and potential movement of depth stop 239 and thereby avoid over-drilling.


In the above-description of various embodiments of present inventive concepts, it is to be understood that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of present inventive concepts. Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which present inventive concepts belong. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of this specification and the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.


When an element is referred to as being “connected”, “coupled”, “responsive”, or variants thereof to another element, it can be directly connected, coupled, or responsive to the other element or intervening elements may be present. In contrast, when an element is referred to as being “directly connected”, “directly coupled”, “directly responsive”, or variants thereof to another element, there are no intervening elements present. Like numbers refer to like elements throughout. Furthermore, “coupled”, “connected”, “responsive”, or variants thereof as used herein may include wirelessly coupled, connected, or responsive. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. Well-known functions or constructions may not be described in detail for brevity and/or clarity. The term “and/or” includes any and all combinations of one or more of the associated listed items.


It will be understood that although the terms first, second, third, etc. may be used herein to describe various elements/operations, these elements/operations should not be limited by these terms. These terms are only used to distinguish one element/operation from another element/operation. Thus a first element/operation in some embodiments could be termed a second element/operation in other embodiments without departing from the teachings of present inventive concepts. The same reference numerals or the same reference designators denote the same or similar elements throughout the specification.


As used herein, the terms “comprise”, “comprising”, “comprises”, “include”, “including”, “includes”, “have”, “has”, “having”, or variants thereof are open-ended, and include one or more stated features, integers, elements, steps, components or functions but do not preclude the presence or addition of one or more other features, integers, elements, steps, components, functions or groups thereof. Furthermore, as used herein, the common abbreviation “e.g.”, which derives from the Latin phrase “exempli gratia,” may be used to introduce or specify a general example or examples of a previously mentioned item, and is not intended to be limiting of such item. The common abbreviation “i.e.”, which derives from the Latin phrase “id est,” may be used to specify a particular item from a more general recitation.


Example embodiments are described herein with reference to block diagrams and/or flowchart illustrations of computer-implemented methods, apparatus (systems and/or devices) and/or computer program products. It is understood that a block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by computer program instructions that are performed by one or more computer circuits. These computer program instructions may be provided to a processor circuit of a general purpose computer circuit, special purpose computer circuit, and/or other programmable data processing circuit to produce a machine, such that the instructions, which execute via the processor of the computer and/or other programmable data processing apparatus, transform and control transistors, values stored in memory locations, and other hardware components within such circuitry to implement the functions/acts specified in the block diagrams and/or flowchart block or blocks, and thereby create means (functionality) and/or structure for implementing the functions/acts specified in the block diagrams and/or flowchart block(s).


These computer program instructions may also be stored in a tangible computer-readable medium that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including instructions which implement the functions/acts specified in the block diagrams and/or flowchart block or blocks. Accordingly, embodiments of present inventive concepts may be embodied in hardware and/or in software (including firmware, resident software, micro-code, etc.) that runs on a processor such as a digital signal processor, which may collectively be referred to as “circuitry,” “a module” or variants thereof.


It should also be noted that in some alternate implementations, the functions/acts noted in the blocks may occur out of the order noted in the flowcharts. For example, two blocks shown in succession may in fact be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved. Moreover, the functionality of a given block of the flowcharts and/or block diagrams may be separated into multiple blocks and/or the functionality of two or more blocks of the flowcharts and/or block diagrams may be at least partially integrated. Finally, other blocks may be added/inserted between the blocks that are illustrated, and/or blocks/operations may be omitted without departing from the scope of inventive concepts. Moreover, although some of the diagrams include arrows on communication paths to show a primary direction of communication, it is to be understood that communication may occur in the opposite direction to the depicted arrows.


Although several embodiments of inventive concepts have been disclosed in the foregoing specification, it is understood that many modifications and other embodiments of inventive concepts will come to mind to which inventive concepts pertain, having the benefit of teachings presented in the foregoing description and associated drawings. It is thus understood that inventive concepts are not limited to the specific embodiments disclosed hereinabove, and that many modifications and other embodiments are intended to be included within the scope of the appended claims. It is further envisioned that features from one embodiment may be combined or used with the features from a different embodiment(s) described herein. Moreover, although specific terms are employed herein, as well as in the claims which follow, they are used only in a generic and descriptive sense, and not for the purposes of limiting the described inventive concepts, nor the claims which follow. The entire disclosure of each patent and patent publication cited herein is incorporated by reference herein in its entirety, as if each such patent or publication were individually incorporated by reference herein. Various features and/or potential advantages of inventive concepts are set forth in the following claims.

Claims
  • 1. A method of guiding a drill for drilling an anatomical feature of a patient during a robotically assisted surgery, the method comprising: determining a drill target and an associated drill trajectory; moving an end effector coupled to a robotic arm to a position corresponding to the determined target and the determined drill trajectory; coupling a drill guide to the end effector, the drill guide having a guide sleeve adapted to receive a drill bit attached to a drill, and a depth stop sleeve slidably coupled to the guide sleeve and longitudinally lockable to the guide sleeve, the depth stop sleeve having a through hole to allow the drill bit to be freely slidable therein; setting a depth stop of the depth stop sleeve; locking the set depth stop; inserting the drill bit through the locked depth stop sleeve and the guide sleeve of the drill guide such that a distal tip of the drill bit extends from a distal end of the drill guide; and drilling the anatomical feature with the inserted drill bit until the locked depth stop of the depth stop sleeve stops the drill from advancing further.
  • 2. The method of claim 1, wherein the step of setting a depth stop includes setting the depth stop to one of a plurality of ratchet positions corresponding to the desired depth of advancement of the drill bit relative to the anatomical feature.
  • 3. The method of claim 1, further comprising locking the drill guide to the end effector with a locking mechanism disposed in the end effector.
  • 4. The method of claim 1, wherein the step of setting a depth stop includes raising the depth stop sleeve proximally and away from the anatomical feature until the desired depth has been reached.
  • 5. The method of claim 4, further comprising: prior to raising the depth stop sleeve, moving a trigger to a depth controlling position; and after raising the depth stop sleeve, releasing the trigger to thereby place the trigger into a depth lock position.
  • 6. The method of claim 5, further comprising actuating a trigger lock disposed on the drill guide to further ensure locking of the depth stop.
  • 7. The method of claim 1, wherein the step of setting a depth stop includes: displacing the depth stop relative to visually perceptible indicia, the indicia corresponding to graduated depths of advancement of the drill bit; and securing the depth stop at a desired position corresponding to a selected one of the graduated depths.
  • 8. The method of claim 7, wherein displacing and securing the depth stop include: actuating a spring-biased trigger to disengage the depth stop from a first position; longitudinally sliding the depth stop relative to the visually perceptible indicia to a second position, the second position corresponding to the desired position of the depth stop; and releasing the trigger to re-engage the depth stop at the desired position.
  • 9. A method of guiding a drill during robot-assisted surgery on an anatomical feature of a patient, the method comprising the steps of: determining, by computer processing, a drill target and an associated drill trajectory and effectuating movement of an end-effector to a position corresponding to the determined target and the determined drill trajectory; coupling a drill guide to the end effector, the drill guide having a guide sleeve adapted to receive a drill bit attached to a drill, and a depth stop sleeve slidably received in the guide sleeve and longitudinally lockable to the guide sleeve, the depth stop sleeve having a through hole to allow the drill bit to be freely slidable therein; setting a depth stop of the depth stop sleeve by longitudinally sliding the depth stop sleeve relative to the guide sleeve; locking the set depth stop; inserting the drill bit through the locked depth stop sleeve and the guide sleeve of the drill guide such that a distal tip of the drill bit extends from a distal end of the drill guide; and drilling the anatomical feature with the inserted drill bit until the locked depth stop of the depth stop sleeve stops the drill from advancing further.
  • 10. The method of claim 9, wherein the step of setting the depth stop comprises: displacing the depth stop relative to visually perceptible indicia, the indicia corresponding to graduated depths of advancement of the drill; and securing the depth stop at a desired position corresponding to a selected one of the graduated depths.
  • 11. The method of claim 10, wherein the steps of displacing and securing the depth stop comprise: actuating a spring-biased trigger to disengage the depth stop from a first position; longitudinally sliding the depth stop relative to the visually perceptible indicia to a second position, the second position corresponding to the desired position of the depth stop; and releasing the trigger to re-engage the depth stop at the desired position.
  • 12. The method of claim 9, wherein the step of setting a depth stop includes setting the depth stop to one of a plurality of ratchet positions corresponding to the desired depth of advancement of the drill bit relative to the anatomical feature.
  • 13. The method of claim 9, further comprising locking the drill guide having the depth stop to the end effector with a locking mechanism disposed in the end effector.
  • 14. The method of claim 9, wherein the step of setting a depth stop includes raising the depth stop proximally and away from the anatomical feature until the desired depth has been reached.
  • 15. The method of claim 14, further comprising: prior to raising the depth stop, moving a trigger in the drill guide to a depth controlling position; and after raising the depth stop, releasing the trigger to thereby place the trigger into a depth lock position.
  • 16. The method of claim 15, further comprising actuating a trigger lock disposed on the drill guide to further ensure locking of the depth stop.
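The claimed method is a manual, mechanical workflow, but its sequence of steps (set a depth stop at a graduated ratchet position, lock it, then drill until the stop halts advancement) can be modeled in a short sketch. This is purely illustrative and not part of the patented system; every name here (`DrillGuide`, `set_depth_stop`, the ratchet graduations) is hypothetical.

```python
# Illustrative sketch only: models the claimed depth-stop workflow.
# All class/function names and the graduation values are hypothetical,
# invented to mirror the sequence of steps recited in claims 1, 2, and 5.
from dataclasses import dataclass


@dataclass
class DrillGuide:
    """Models the guide sleeve with a slidable, lockable depth stop sleeve."""
    ratchet_positions_mm: tuple = (10, 20, 30, 40, 50)  # hypothetical graduations
    depth_stop_mm: float = 0.0
    locked: bool = False

    def set_depth_stop(self, depth_mm: float) -> None:
        # Claim 2: the stop engages one of a plurality of ratchet positions.
        if self.locked:
            raise RuntimeError("release the trigger before sliding the depth stop")
        if depth_mm not in self.ratchet_positions_mm:
            raise ValueError("depth must match a graduated ratchet position")
        self.depth_stop_mm = depth_mm

    def lock(self) -> None:
        # Claim 5: releasing the trigger places it in the depth lock position.
        self.locked = True


def drill(guide: DrillGuide, advance_mm: float) -> float:
    """Advance the bit; the locked depth stop limits travel (claim 1)."""
    if not guide.locked:
        raise RuntimeError("depth stop must be locked before drilling")
    return min(advance_mm, guide.depth_stop_mm)


guide = DrillGuide()
guide.set_depth_stop(30)
guide.lock()
reached = drill(guide, advance_mm=45)  # the stop halts advancement at 30 mm
```

The point of the model is the ordering constraint the claims enforce: the stop must be set while unlocked, locked before drilling, and thereafter bounds the bit's travel regardless of how far the surgeon advances the drill.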
CROSS-REFERENCE TO RELATED APPLICATION

This application is a division of U.S. patent application Ser. No. 16/695,310, filed Nov. 26, 2019, which is a continuation-in-part of U.S. patent application Ser. No. 16/452,737, filed Jun. 26, 2019, which is a continuation-in-part of U.S. patent application Ser. No. 16/361,863, filed on Mar. 22, 2019, all of which are hereby incorporated herein by reference for all purposes.

US Referenced Citations (729)
Number Name Date Kind
4150293 Franke Apr 1979 A
5246010 Gazzara et al. Sep 1993 A
5354314 Hardy et al. Oct 1994 A
5397323 Taylor et al. Mar 1995 A
5423832 Gildenberg Jun 1995 A
5598453 Baba et al. Jan 1997 A
5647373 Paltieli Jul 1997 A
5772594 Barrick Jun 1998 A
5791908 Gillio Aug 1998 A
5820559 Ng et al. Oct 1998 A
5825982 Wright et al. Oct 1998 A
5887121 Funda et al. Mar 1999 A
5911449 Daniele et al. Jun 1999 A
5951475 Gueziec et al. Sep 1999 A
5987960 Messner et al. Nov 1999 A
6012216 Esteves et al. Jan 2000 A
6031888 Ivan et al. Feb 2000 A
6033415 Mittelstadt et al. Mar 2000 A
6073512 McCormick et al. Jun 2000 A
6080181 Jensen et al. Jun 2000 A
6106511 Jensen Aug 2000 A
6122541 Cosman et al. Sep 2000 A
6144875 Schweikard et al. Nov 2000 A
6157853 Blume et al. Dec 2000 A
6167145 Foley et al. Dec 2000 A
6167292 Badano et al. Dec 2000 A
6200274 McNeirney Mar 2001 B1
6201984 Funda et al. Mar 2001 B1
6203196 Meyer et al. Mar 2001 B1
6205411 DiGioia, III et al. Mar 2001 B1
6212419 Blume et al. Apr 2001 B1
6231565 Tovey et al. May 2001 B1
6236875 Bucholz et al. May 2001 B1
6246900 Cosman et al. Jun 2001 B1
6298262 Franck et al. Oct 2001 B1
6301495 Gueziec et al. Oct 2001 B1
6306126 Montezuma Oct 2001 B1
6312435 Wallace et al. Nov 2001 B1
6314311 Williams et al. Nov 2001 B1
6320929 Von Der Haar Nov 2001 B1
6322567 Mittelstadt et al. Nov 2001 B1
6325808 Bernard et al. Dec 2001 B1
6340363 Bolger et al. Jan 2002 B1
6377011 Ben-Ur Apr 2002 B1
6379302 Kessman et al. Apr 2002 B1
6402762 Hunter et al. Jun 2002 B2
6424885 Niemeyer et al. Jul 2002 B1
6447503 Wynne et al. Sep 2002 B1
6451027 Cooper et al. Sep 2002 B1
6477400 Barrick Nov 2002 B1
6484049 Seeley et al. Nov 2002 B1
6487267 Wolter Nov 2002 B1
6490467 Bucholz et al. Dec 2002 B1
6490475 Seeley et al. Dec 2002 B1
6499488 Hunter et al. Dec 2002 B1
6501981 Schweikard et al. Dec 2002 B1
6507751 Blume et al. Jan 2003 B2
6535756 Simon et al. Mar 2003 B1
6560354 Maurer, Jr. et al. May 2003 B1
6565554 Niemeyer May 2003 B1
6587750 Gerbi et al. Jul 2003 B2
6614453 Suri et al. Sep 2003 B1
6614871 Kobiki et al. Sep 2003 B1
6619840 Rasche et al. Sep 2003 B2
6636757 Jascob et al. Oct 2003 B1
6645196 Nixon et al. Nov 2003 B1
6666579 Jensen Dec 2003 B2
6669635 Kessman et al. Dec 2003 B2
6701173 Nowinski et al. Mar 2004 B2
6757068 Foxlin Jun 2004 B2
6782287 Grzeszczuk et al. Aug 2004 B2
6783524 Anderson et al. Aug 2004 B2
6786896 Madhani et al. Sep 2004 B1
6788018 Blumenkranz Sep 2004 B1
6804581 Wang et al. Oct 2004 B2
6823207 Jensen et al. Nov 2004 B1
6827351 Graziani et al. Dec 2004 B2
6837892 Shoham Jan 2005 B2
6839612 Sanchez et al. Jan 2005 B2
6840895 Perry et al. Jan 2005 B2
6856826 Seeley et al. Feb 2005 B2
6856827 Seeley et al. Feb 2005 B2
6879880 Nowlin et al. Apr 2005 B2
6892090 Verard et al. May 2005 B2
6920347 Simon et al. Jul 2005 B2
6922632 Foxlin Jul 2005 B2
6968224 Kessman et al. Nov 2005 B2
6978166 Foley et al. Dec 2005 B2
6988009 Grimm et al. Jan 2006 B2
6991627 Madhani et al. Jan 2006 B2
6996487 Jutras et al. Feb 2006 B2
6999852 Green Feb 2006 B2
7007699 Martinelli et al. Mar 2006 B2
7008362 Fitzgibbon Mar 2006 B2
7016457 Senzig et al. Mar 2006 B1
7043961 Pandey et al. May 2006 B2
7062006 Pelc et al. Jun 2006 B1
7063705 Young et al. Jun 2006 B2
7072707 Galloway, Jr. et al. Jul 2006 B2
7083615 Peterson et al. Aug 2006 B2
7097640 Wang et al. Aug 2006 B2
7099428 Clinthorne et al. Aug 2006 B2
7108421 Gregerson et al. Sep 2006 B2
7130676 Barrick Oct 2006 B2
7139418 Abovitz et al. Nov 2006 B2
7139601 Bucholz et al. Nov 2006 B2
7155316 Sutherland et al. Dec 2006 B2
7164968 Treat et al. Jan 2007 B2
7167738 Schweikard et al. Jan 2007 B2
7169141 Brock et al. Jan 2007 B2
7172627 Fiere et al. Feb 2007 B2
7194120 Wicker et al. Mar 2007 B2
7197107 Arai et al. Mar 2007 B2
7231014 Levy Jun 2007 B2
7231063 Naimark et al. Jun 2007 B2
7239940 Wang et al. Jul 2007 B2
7248914 Hastings et al. Jul 2007 B2
7301648 Foxlin Nov 2007 B2
7302288 Schellenberg Nov 2007 B1
7313430 Urquhart et al. Dec 2007 B2
7318805 Schweikard et al. Jan 2008 B2
7318827 Leitner et al. Jan 2008 B2
7319897 Leitner et al. Jan 2008 B2
7324623 Heuscher et al. Jan 2008 B2
7327865 Fu et al. Feb 2008 B2
7331967 Lee et al. Feb 2008 B2
7333642 Green Feb 2008 B2
7339341 Oleynikov et al. Mar 2008 B2
7366562 Dukesherer et al. Apr 2008 B2
7379790 Toth et al. May 2008 B2
7386365 Nixon Jun 2008 B2
7422592 Morley et al. Sep 2008 B2
7435216 Kwon et al. Oct 2008 B2
7440793 Chauhan et al. Oct 2008 B2
7460637 Clinthorne et al. Dec 2008 B2
7466303 Yi et al. Dec 2008 B2
7493153 Ahmed et al. Feb 2009 B2
7505617 Fu et al. Mar 2009 B2
7533892 Schena et al. May 2009 B2
7542791 Mire et al. Jun 2009 B2
7555331 Viswanathan Jun 2009 B2
7567834 Clayton et al. Jul 2009 B2
7594912 Cooper et al. Sep 2009 B2
7606613 Simon et al. Oct 2009 B2
7607440 Coste-Maniere et al. Oct 2009 B2
7623902 Pacheco Nov 2009 B2
7630752 Viswanathan Dec 2009 B2
7630753 Simon et al. Dec 2009 B2
7643862 Schoenefeld Jan 2010 B2
7660623 Hunter et al. Feb 2010 B2
7661881 Gregerson et al. Feb 2010 B2
7683331 Chang Mar 2010 B2
7683332 Chang Mar 2010 B2
7689320 Prisco et al. Mar 2010 B2
7691098 Wallace et al. Apr 2010 B2
7702379 Avinash et al. Apr 2010 B2
7702477 Tuemmler et al. Apr 2010 B2
7711083 Heigl et al. May 2010 B2
7711406 Kuhn et al. May 2010 B2
7720523 Omernick et al. May 2010 B2
7725253 Foxlin May 2010 B2
7726171 Langlotz et al. Jun 2010 B2
7742801 Neubauer et al. Jun 2010 B2
7751865 Jascob et al. Jul 2010 B2
7760849 Zhang Jul 2010 B2
7762825 Burbank et al. Jul 2010 B2
7763015 Cooper et al. Jul 2010 B2
7787699 Mahesh et al. Aug 2010 B2
7796728 Bergfjord Sep 2010 B2
7813838 Sommer Oct 2010 B2
7818044 Dukesherer et al. Oct 2010 B2
7819859 Prisco et al. Oct 2010 B2
7824401 Manzo et al. Nov 2010 B2
7831294 Viswanathan Nov 2010 B2
7834484 Sartor Nov 2010 B2
7835557 Kendrick et al. Nov 2010 B2
7835778 Foley et al. Nov 2010 B2
7835784 Mire et al. Nov 2010 B2
7840253 Tremblay et al. Nov 2010 B2
7840256 Lakin et al. Nov 2010 B2
7843158 Prisco Nov 2010 B2
7844320 Shahidi Nov 2010 B2
7853305 Simon et al. Dec 2010 B2
7853313 Thompson Dec 2010 B2
7865269 Prisco et al. Jan 2011 B2
D631966 Perloff et al. Feb 2011 S
7879045 Gielen et al. Feb 2011 B2
7881767 Strommer et al. Feb 2011 B2
7881770 Melkent et al. Feb 2011 B2
7886743 Cooper et al. Feb 2011 B2
RE42194 Foley et al. Mar 2011 E
RE42226 Foley et al. Mar 2011 E
7900524 Calloway et al. Mar 2011 B2
7907166 Lamprecht et al. Mar 2011 B2
7909122 Schena et al. Mar 2011 B2
7925653 Saptharishi Apr 2011 B2
7930065 Larkin et al. Apr 2011 B2
7935130 Williams May 2011 B2
7940999 Liao et al. May 2011 B2
7945012 Ye et al. May 2011 B2
7945021 Shapiro et al. May 2011 B2
7953470 Vetter et al. May 2011 B2
7954397 Choi et al. Jun 2011 B2
7963913 Devengenzo et al. Jun 2011 B2
7971341 Dukesherer et al. Jul 2011 B2
7974674 Hauck et al. Jul 2011 B2
7974677 Mire et al. Jul 2011 B2
7974681 Wallace et al. Jul 2011 B2
7979157 Anvari Jul 2011 B2
7983733 Viswanathan Jul 2011 B2
7988215 Seibold Aug 2011 B2
7996110 Lipow et al. Aug 2011 B2
8004121 Sartor Aug 2011 B2
8004229 Nowlin et al. Aug 2011 B2
8010177 Csavoy et al. Aug 2011 B2
8019045 Kato Sep 2011 B2
8021310 Sanborn et al. Sep 2011 B2
8035685 Jensen Oct 2011 B2
8046054 Kim et al. Oct 2011 B2
8046057 Clarke Oct 2011 B2
8052688 Wolf, II Nov 2011 B2
8054184 Cline et al. Nov 2011 B2
8054752 Druke et al. Nov 2011 B2
8057397 Li et al. Nov 2011 B2
8057407 Martinelli et al. Nov 2011 B2
8062288 Cooper et al. Nov 2011 B2
8062375 Glerum et al. Nov 2011 B2
8066524 Burbank et al. Nov 2011 B2
8073335 Labonville et al. Dec 2011 B2
8079950 Stern et al. Dec 2011 B2
8086299 Adler et al. Dec 2011 B2
8092370 Roberts et al. Jan 2012 B2
8098914 Liao et al. Jan 2012 B2
8100950 St. Clair et al. Jan 2012 B2
8105320 Manzo Jan 2012 B2
8108025 Csavoy et al. Jan 2012 B2
8109877 Moctezuma de la Barrera et al. Feb 2012 B2
8112292 Simon Feb 2012 B2
8116430 Shapiro et al. Feb 2012 B1
8120301 Goldberg et al. Feb 2012 B2
8121249 Wang et al. Feb 2012 B2
8123675 Funda et al. Feb 2012 B2
8133229 Bonutti Mar 2012 B1
8142420 Schena Mar 2012 B2
8147494 Leitner et al. Apr 2012 B2
8150494 Simon et al. Apr 2012 B2
8150497 Gielen et al. Apr 2012 B2
8150498 Gielen et al. Apr 2012 B2
8165658 Waynik et al. Apr 2012 B2
8170313 Kendrick et al. May 2012 B2
8179073 Farritor et al. May 2012 B2
8182476 Julian et al. May 2012 B2
8184880 Zhao et al. May 2012 B2
8202278 Orban, III et al. Jun 2012 B2
8208708 Homan et al. Jun 2012 B2
8208988 Jenser Jun 2012 B2
8219177 Smith et al. Jul 2012 B2
8219178 Smith et al. Jul 2012 B2
8220468 Cooper et al. Jul 2012 B2
8224024 Foxlin et al. Jul 2012 B2
8224484 Swarup et al. Jul 2012 B2
8225798 Baldwin et al. Jul 2012 B2
8228368 Zhao et al. Jul 2012 B2
8231610 Jo et al. Jul 2012 B2
8263933 Hartmann et al. Jul 2012 B2
8239001 Verard et al. Aug 2012 B2
8241271 Millman et al. Aug 2012 B2
8248413 Gattani et al. Aug 2012 B2
8256319 Cooper et al. Sep 2012 B2
8271069 Jascob et al. Sep 2012 B2
8271130 Hourtash Sep 2012 B2
8281670 Larkin et al. Oct 2012 B2
8282653 Nelson et al. Oct 2012 B2
8301226 Csavoy et al. Oct 2012 B2
8311611 Csavoy et al. Nov 2012 B2
8320991 Jascob et al. Nov 2012 B2
8332012 Kienzle, III Dec 2012 B2
8333755 Cooper et al. Dec 2012 B2
8335552 Stiles Dec 2012 B2
8335557 Maschke Dec 2012 B2
8348931 Cooper et al. Jan 2013 B2
8353963 Glerum Jan 2013 B2
8358818 Miga et al. Jan 2013 B2
8359730 Burg et al. Jan 2013 B2
8374673 Adcox et al. Feb 2013 B2
8374723 Zhao et al. Feb 2013 B2
8379791 Forthmann et al. Feb 2013 B2
8386019 Camus et al. Feb 2013 B2
8392022 Ortmaier et al. Mar 2013 B2
8394099 Patwardhan Mar 2013 B2
8395342 Prisco Mar 2013 B2
8398634 Manzo et al. Mar 2013 B2
8400094 Schena Mar 2013 B2
8414957 Enzerink et al. Apr 2013 B2
8418073 Mohr et al. Apr 2013 B2
8450694 Baviera et al. May 2013 B2
8452447 Nixon May 2013 B2
RE44305 Foley et al. Jun 2013 E
8462911 Vesel et al. Jun 2013 B2
8465476 Rogers et al. Jun 2013 B2
8465771 Wan et al. Jun 2013 B2
8467851 Mire et al. Jun 2013 B2
8467852 Csavoy et al. Jun 2013 B2
8469947 Devengenzo et al. Jun 2013 B2
RE44392 Hynes Jul 2013 E
8480566 Farr Jul 2013 B2
8483434 Buehner et al. Jul 2013 B2
8483800 Jensen et al. Jul 2013 B2
8486532 Enzerink et al. Jul 2013 B2
8489235 Moll et al. Jul 2013 B2
8500132 Norton Aug 2013 B2
8500722 Cooper Aug 2013 B2
8500728 Newton et al. Aug 2013 B2
8504201 Moll et al. Aug 2013 B2
8506555 Ruiz Morales Aug 2013 B2
8506556 Schena Aug 2013 B2
8508173 Goldberg et al. Aug 2013 B2
8512318 Tovey et al. Aug 2013 B2
8515576 Lipow et al. Aug 2013 B2
8518120 Glerum et al. Aug 2013 B2
8521331 Itkowitz Aug 2013 B2
8526688 Groszmann et al. Sep 2013 B2
8526700 Isaacs Sep 2013 B2
8527094 Kumar et al. Sep 2013 B2
8528440 Morley et al. Sep 2013 B2
8529582 Devengenzo et al. Sep 2013 B2
8532741 Heruth et al. Sep 2013 B2
8541970 Nowlin et al. Sep 2013 B2
8548563 Simon et al. Oct 2013 B2
8549732 Burg et al. Oct 2013 B2
8551114 Ramos de la Pena Oct 2013 B2
8551116 Julian et al. Oct 2013 B2
8556807 Scott et al. Oct 2013 B2
8556979 Glerum et al. Oct 2013 B2
8560118 Green et al. Oct 2013 B2
8561473 Blumenkranz Oct 2013 B2
8562594 Cooper et al. Oct 2013 B2
8571638 Shoham Oct 2013 B2
8571710 Coste-Maniere et al. Oct 2013 B2
8573465 Shelton, IV Nov 2013 B2
8574303 Sharkey et al. Nov 2013 B2
8585420 Burbank et al. Nov 2013 B2
8594841 Zhao et al. Nov 2013 B2
8597198 Sanborn et al. Dec 2013 B2
8600478 Verard et al. Dec 2013 B2
8601667 Norton Dec 2013 B2
8602971 Farr Dec 2013 B2
8603077 Cooper et al. Dec 2013 B2
8611985 Lavallee et al. Dec 2013 B2
8613230 Blumenkranz et al. Dec 2013 B2
8621939 Blumenkranz et al. Jan 2014 B2
8624537 Nowlin et al. Jan 2014 B2
8630389 Kato Jan 2014 B2
8634897 Simon et al. Jan 2014 B2
8634957 Toth et al. Jan 2014 B2
8638056 Goldberg et al. Jan 2014 B2
8638057 Goldberg et al. Jan 2014 B2
8639000 Zhao et al. Jan 2014 B2
8641726 Bonutti Feb 2014 B2
8644907 Hartmann et al. Feb 2014 B2
8657809 Schoepp Feb 2014 B2
8660635 Simon et al. Feb 2014 B2
8666544 Moll et al. Mar 2014 B2
8675939 Moctezuma de la Barrera Mar 2014 B2
8678647 Gregerson et al. Mar 2014 B2
8679125 Smith et al. Mar 2014 B2
8679183 Glerum et al. Mar 2014 B2
8682413 Lloyd Mar 2014 B2
8684253 Giordano et al. Apr 2014 B2
8685098 Glerum et al. Apr 2014 B2
8693730 Umasuthan et al. Apr 2014 B2
8694075 Groszmann et al. Apr 2014 B2
8696458 Foxlin et al. Apr 2014 B2
8700123 Okamura et al. Apr 2014 B2
8706086 Glerum Apr 2014 B2
8706185 Foley et al. Apr 2014 B2
8706301 Zhao et al. Apr 2014 B2
8717430 Simon et al. May 2014 B2
8727618 Maschke et al. May 2014 B2
8734432 Tuma et al. May 2014 B2
8738115 Amberg et al. May 2014 B2
8738181 Greer et al. May 2014 B2
8740882 Jun et al. Jun 2014 B2
8746252 McGrogan et al. Jun 2014 B2
8749189 Nowlin et al. Jun 2014 B2
8749190 Nowlin et al. Jun 2014 B2
8761930 Nixon Jun 2014 B2
8764448 Yang et al. Jul 2014 B2
8771170 Mesallum et al. Jul 2014 B2
8781186 Clements et al. Jul 2014 B2
8781630 Banks et al. Jul 2014 B2
8784385 Boyden et al. Jul 2014 B2
8786241 Nowlin et al. Jul 2014 B2
8787520 Baba Jul 2014 B2
8792704 Isaacs Jul 2014 B2
8798231 Notohara et al. Aug 2014 B2
8800838 Shelton, IV Aug 2014 B2
8808164 Hoffman et al. Aug 2014 B2
8812077 Dempsey Aug 2014 B2
8814793 Brabrand Aug 2014 B2
8816628 Nowlin et al. Aug 2014 B2
8818105 Myronenko et al. Aug 2014 B2
8820605 Shelton, IV Sep 2014 B2
8821511 Von Jako et al. Sep 2014 B2
8823308 Nowlin et al. Sep 2014 B2
8827996 Scott et al. Sep 2014 B2
8828024 Farritor et al. Sep 2014 B2
8830224 Zhao et al. Sep 2014 B2
8834489 Cooper et al. Sep 2014 B2
8834490 Bonutti Sep 2014 B2
8838270 Druke et al. Sep 2014 B2
8844789 Shelton, IV et al. Sep 2014 B2
8855822 Bartol et al. Oct 2014 B2
8857821 Norton et al. Oct 2014 B2
8858598 Seifert et al. Oct 2014 B2
8860753 Bhandarkar et al. Oct 2014 B2
8864751 Prisco et al. Oct 2014 B2
8864798 Weiman et al. Oct 2014 B2
8864833 Glerum et al. Oct 2014 B2
8867703 Shapiro et al. Oct 2014 B2
8870880 Himmelberger et al. Oct 2014 B2
8876866 Zappacosta et al. Nov 2014 B2
8880223 Raj et al. Nov 2014 B2
8882803 Iott et al. Nov 2014 B2
8883210 Truncale et al. Nov 2014 B1
8888821 Rezach et al. Nov 2014 B2
8888853 Glerum et al. Nov 2014 B2
8888854 Glerum et al. Nov 2014 B2
8894652 Seifert et al. Nov 2014 B2
8894688 Suh Nov 2014 B2
8894691 Iott et al. Nov 2014 B2
8906069 Hansell et al. Dec 2014 B2
8964934 Ein-Gal Feb 2015 B2
8992580 Bar et al. Mar 2015 B2
8996169 Lightcap et al. Mar 2015 B2
9001963 Sowards-Emmerd et al. Apr 2015 B2
9002076 Khadem et al. Apr 2015 B2
9044190 Rubner et al. Jun 2015 B2
9107683 Hourtash et al. Aug 2015 B2
9125556 Zehavi et al. Sep 2015 B2
9131986 Greer et al. Sep 2015 B2
9215968 Schostek et al. Dec 2015 B2
9308050 Kostrzewski et al. Apr 2016 B2
9380984 Li et al. Jul 2016 B2
9393039 Lechner et al. Jul 2016 B2
9398886 Gregerson et al. Jul 2016 B2
9398890 Dong et al. Jul 2016 B2
9414859 Ballard et al. Aug 2016 B2
9420975 Gutfleisch et al. Aug 2016 B2
9492235 Hourtash et al. Nov 2016 B2
9554864 Taylor et al. Jan 2017 B2
9592096 Maillet et al. Mar 2017 B2
9600138 Thomas et al. Mar 2017 B2
9734632 Thomas et al. Aug 2017 B2
9750465 Engel et al. Sep 2017 B2
9757203 Hourtash et al. Sep 2017 B2
9795354 Menegaz et al. Oct 2017 B2
9814535 Bar et al. Nov 2017 B2
9820783 Donner et al. Nov 2017 B2
9833265 Donner et al. Nov 2017 B2
9848922 Tohmeh et al. Dec 2017 B2
9925011 Gombert et al. Mar 2018 B2
9931025 Graetzel et al. Apr 2018 B1
10034717 Miller et al. Jul 2018 B2
10076844 Rizk Sep 2018 B2
10499974 Heim et al. Dec 2019 B2
10639111 Kopp May 2020 B2
20010036302 Miller Nov 2001 A1
20020035321 Bucholz et al. Mar 2002 A1
20040068172 Nowinski et al. Apr 2004 A1
20040076259 Jensen et al. Apr 2004 A1
20050096502 Khalili May 2005 A1
20050119663 Keyer et al. Jun 2005 A1
20050143651 Verard et al. Jun 2005 A1
20050171558 Abovitz et al. Aug 2005 A1
20050281385 Johnson et al. Dec 2005 A1
20060100610 Wallace et al. May 2006 A1
20060173329 Marquart et al. Aug 2006 A1
20060184396 Dennis et al. Aug 2006 A1
20060241416 Marquart et al. Oct 2006 A1
20060291612 Nishide et al. Dec 2006 A1
20070015987 Benlloch Baviera et al. Jan 2007 A1
20070021738 Hasser et al. Jan 2007 A1
20070038059 Sheffer et al. Feb 2007 A1
20070073133 Schoenefeld Mar 2007 A1
20070156121 Millman et al. Jul 2007 A1
20070156157 Nahum et al. Jul 2007 A1
20070167712 Keglovich et al. Jul 2007 A1
20070233238 Huynh et al. Oct 2007 A1
20080004523 Jensen Jan 2008 A1
20080013809 Zhu et al. Jan 2008 A1
20080033283 Dellaca et al. Feb 2008 A1
20080046122 Manzo et al. Feb 2008 A1
20080082109 Moll et al. Apr 2008 A1
20080108912 Node-Langlois May 2008 A1
20080108991 Von Jako May 2008 A1
20080109012 Falco et al. May 2008 A1
20080144906 Allred et al. Jun 2008 A1
20080161680 Von Jako et al. Jul 2008 A1
20080161682 Kendrick et al. Jul 2008 A1
20080177203 von Jako Jul 2008 A1
20080214922 Hartmann et al. Sep 2008 A1
20080228068 Viswanathan et al. Sep 2008 A1
20080228196 Wang et al. Sep 2008 A1
20080235052 Node-Langlois et al. Sep 2008 A1
20080269596 Revie et al. Oct 2008 A1
20080287771 Anderson Nov 2008 A1
20080287781 Revie et al. Nov 2008 A1
20080300477 Lloyd et al. Dec 2008 A1
20080300478 Zuhars et al. Dec 2008 A1
20080302950 Park et al. Dec 2008 A1
20080306490 Lakin et al. Dec 2008 A1
20080319311 Hamadeh Dec 2008 A1
20090012509 Csavoy et al. Jan 2009 A1
20090030428 Omori et al. Jan 2009 A1
20090080737 Battle et al. Mar 2009 A1
20090118742 Hartmann et al. May 2009 A1
20090185655 Koken et al. Jul 2009 A1
20090198121 Hoheisel Aug 2009 A1
20090216113 Meier et al. Aug 2009 A1
20090228019 Gross et al. Sep 2009 A1
20090259123 Navab et al. Oct 2009 A1
20090259230 Khadem et al. Oct 2009 A1
20090264899 Appenrodt et al. Oct 2009 A1
20090281417 Hartmann et al. Nov 2009 A1
20100022874 Wang et al. Jan 2010 A1
20100039506 Sarvestani et al. Feb 2010 A1
20100125286 Wang et al. May 2010 A1
20100130986 Mailloux et al. May 2010 A1
20100228117 Hartmann Sep 2010 A1
20100228265 Prisco Sep 2010 A1
20100249571 Jensen et al. Sep 2010 A1
20100274120 Heuscher Oct 2010 A1
20100280363 Skarda et al. Nov 2010 A1
20100294828 Bindra et al. Nov 2010 A1
20100298704 Pelissier et al. Nov 2010 A1
20100331858 Simaan et al. Dec 2010 A1
20110022229 Jang et al. Jan 2011 A1
20110077504 Fischer et al. Mar 2011 A1
20110098553 Robbins et al. Apr 2011 A1
20110137152 Li Jun 2011 A1
20110190832 Taylor et al. Aug 2011 A1
20110213384 Jeong Sep 2011 A1
20110224684 Larkin et al. Sep 2011 A1
20110224685 Larkin et al. Sep 2011 A1
20110224686 Larkin et al. Sep 2011 A1
20110224687 Larkin et al. Sep 2011 A1
20110224688 Larkin et al. Sep 2011 A1
20110224689 Larkin et al. Sep 2011 A1
20110224825 Larkin et al. Sep 2011 A1
20110230967 O'Halloran et al. Sep 2011 A1
20110238080 Ranjit et al. Sep 2011 A1
20110276058 Choi et al. Nov 2011 A1
20110282189 Graumann Nov 2011 A1
20110286573 Schretter et al. Nov 2011 A1
20110295062 Solsona et al. Dec 2011 A1
20110295370 Suh et al. Dec 2011 A1
20110306986 Lee et al. Dec 2011 A1
20120035507 George et al. Feb 2012 A1
20120046668 Gantes Feb 2012 A1
20120051498 Koishi Mar 2012 A1
20120053597 Anvari et al. Mar 2012 A1
20120059248 Holsing et al. Mar 2012 A1
20120071753 Hunter et al. Mar 2012 A1
20120108954 Schulhauser et al. May 2012 A1
20120123417 Smith May 2012 A1
20120136372 Amat Girbau et al. May 2012 A1
20120143084 Shoham Jun 2012 A1
20120184839 Woerlein Jul 2012 A1
20120197182 Millman et al. Aug 2012 A1
20120201421 Hartmann et al. Aug 2012 A1
20120226145 Chang et al. Sep 2012 A1
20120235909 Birkenbach et al. Sep 2012 A1
20120245596 Meenink Sep 2012 A1
20120253332 Moll Oct 2012 A1
20120253360 White et al. Oct 2012 A1
20120256092 Zingerman Oct 2012 A1
20120294498 Popovic Nov 2012 A1
20120296203 Hartmann et al. Nov 2012 A1
20130006267 Odermatt et al. Jan 2013 A1
20130016889 Myronenko et al. Jan 2013 A1
20130030571 Ruiz Morales et al. Jan 2013 A1
20130035583 Park et al. Feb 2013 A1
20130060146 Yang et al. Mar 2013 A1
20130060337 Petersheim et al. Mar 2013 A1
20130094742 Feilkas Apr 2013 A1
20130096574 Kang et al. Apr 2013 A1
20130113791 Isaacs et al. May 2013 A1
20130116706 Lee et al. May 2013 A1
20130131695 Scarfogliero et al. May 2013 A1
20130144307 Jeong et al. Jun 2013 A1
20130158542 Manzo et al. Jun 2013 A1
20130165937 Patwardhan Jun 2013 A1
20130178867 Farritor et al. Jul 2013 A1
20130178868 Roh Jul 2013 A1
20130178870 Schena Jul 2013 A1
20130204271 Brisson et al. Aug 2013 A1
20130211419 Jensen Aug 2013 A1
20130211420 Jensen Aug 2013 A1
20130218142 Tuma et al. Aug 2013 A1
20130223702 Holsing et al. Aug 2013 A1
20130225942 Holsing et al. Aug 2013 A1
20130225943 Holsing et al. Aug 2013 A1
20130231556 Holsing et al. Sep 2013 A1
20130237995 Lee et al. Sep 2013 A1
20130245375 DiMaio et al. Sep 2013 A1
20130261640 Kim et al. Oct 2013 A1
20130272488 Bailey et al. Oct 2013 A1
20130272489 Dickman et al. Oct 2013 A1
20130274761 Devengenzo et al. Oct 2013 A1
20130281821 Liu et al. Oct 2013 A1
20130296884 Taylor et al. Nov 2013 A1
20130303887 Holsing et al. Nov 2013 A1
20130307955 Deitz et al. Nov 2013 A1
20130317521 Choi et al. Nov 2013 A1
20130325033 Schena et al. Dec 2013 A1
20130325035 Hauck et al. Dec 2013 A1
20130331686 Freysinger et al. Dec 2013 A1
20130331858 Devengenzo et al. Dec 2013 A1
20130331861 Yoon Dec 2013 A1
20130342578 Isaacs Dec 2013 A1
20130345717 Markvicka et al. Dec 2013 A1
20130345757 Stad Dec 2013 A1
20140001235 Shelton, IV Jan 2014 A1
20140012131 Heruth et al. Jan 2014 A1
20140031664 Kang et al. Jan 2014 A1
20140046128 Lee et al. Feb 2014 A1
20140046132 Hoeg et al. Feb 2014 A1
20140046340 Wilson et al. Feb 2014 A1
20140049629 Siewerdsen et al. Feb 2014 A1
20140058406 Tsekos Feb 2014 A1
20140073914 Lavallee et al. Mar 2014 A1
20140080086 Chen Mar 2014 A1
20140081128 Verard et al. Mar 2014 A1
20140088612 Bartol et al. Mar 2014 A1
20140094694 Moctezuma de la Barrera Apr 2014 A1
20140094851 Gordon Apr 2014 A1
20140096369 Matsumoto et al. Apr 2014 A1
20140100587 Farritor et al. Apr 2014 A1
20140121676 Kostrzewski et al. May 2014 A1
20140128882 Kwak et al. May 2014 A1
20140135796 Simon et al. May 2014 A1
20140142591 Alvarez et al. May 2014 A1
20140142592 Moon et al. May 2014 A1
20140148692 Hartmann et al. May 2014 A1
20140163581 Devengenzo et al. Jun 2014 A1
20140171781 Stiles Jun 2014 A1
20140171900 Stiles Jun 2014 A1
20140171965 Loh et al. Jun 2014 A1
20140180308 von Grunberg Jun 2014 A1
20140180309 Seeber et al. Jun 2014 A1
20140187915 Yaroshenko et al. Jul 2014 A1
20140188132 Kang Jul 2014 A1
20140194699 Roh et al. Jul 2014 A1
20140130810 Azizian et al. Aug 2014 A1
20140221819 Sarment Aug 2014 A1
20140222023 Kim et al. Aug 2014 A1
20140228631 Kwak et al. Aug 2014 A1
20140234804 Huang et al. Aug 2014 A1
20140257328 Kim et al. Sep 2014 A1
20140257329 Jang et al. Sep 2014 A1
20140257330 Choi et al. Sep 2014 A1
20140275760 Lee et al. Sep 2014 A1
20140275985 Walker et al. Sep 2014 A1
20140276931 Parihar et al. Sep 2014 A1
20140276940 Seo Sep 2014 A1
20140276944 Farritor et al. Sep 2014 A1
20140288413 Hwang et al. Sep 2014 A1
20140299648 Shelton, IV et al. Oct 2014 A1
20140303434 Farritor et al. Oct 2014 A1
20140303643 Ha et al. Oct 2014 A1
20140305995 Shelton, IV et al. Oct 2014 A1
20140309659 Roh et al. Oct 2014 A1
20140316436 Bar et al. Oct 2014 A1
20140323803 Hoffman et al. Oct 2014 A1
20140324070 Min et al. Oct 2014 A1
20140330288 Date et al. Nov 2014 A1
20140364720 Darrow et al. Dec 2014 A1
20140371577 Maillet et al. Dec 2014 A1
20150039034 Frankel et al. Feb 2015 A1
20150085970 Bouhnik et al. Mar 2015 A1
20150146847 Liu May 2015 A1
20150150524 Yorkston et al. Jun 2015 A1
20150196261 Funk Jul 2015 A1
20150213633 Chang et al. Jul 2015 A1
20150252940 Goodwin et al. Sep 2015 A1
20150335480 Alvarez et al. Nov 2015 A1
20150342647 Frankel et al. Dec 2015 A1
20150374217 Sinofsky Dec 2015 A1
20160005194 Schretter et al. Jan 2016 A1
20160058513 Giorgi Mar 2016 A1
20160166329 Langan et al. Jun 2016 A1
20160220320 Crawford et al. Aug 2016 A1
20160235480 Scholl et al. Aug 2016 A1
20160249990 Glozman et al. Sep 2016 A1
20160302871 Gregerson et al. Oct 2016 A1
20160320322 Suzuki Nov 2016 A1
20160331335 Gregerson et al. Nov 2016 A1
20170055819 Hansen et al. Mar 2017 A1
20170135770 Scholl et al. May 2017 A1
20170143284 Sehnert et al. May 2017 A1
20170143426 Isaacs et al. May 2017 A1
20170156805 Taylor et al. Jun 2017 A1
20170156816 Ibrahim Jun 2017 A1
20170202629 Maillet et al. Jul 2017 A1
20170212723 Atarot et al. Jul 2017 A1
20170215825 Johnson et al. Aug 2017 A1
20170215826 Johnson et al. Aug 2017 A1
20170215827 Johnson et al. Aug 2017 A1
20170224358 Kostrzewski Aug 2017 A1
20170231710 Scholl et al. Aug 2017 A1
20170258426 Risher-Kelly et al. Sep 2017 A1
20170273748 Hourtash et al. Sep 2017 A1
20170296277 Hourtash et al. Oct 2017 A1
20170309069 Thomas et al. Oct 2017 A1
20170319289 Neff Nov 2017 A1
20170333056 Ponzer et al. Nov 2017 A1
20170360493 Zucher et al. Dec 2017 A1
20170360517 Crawford Dec 2017 A1
20180049825 Kwon et al. Feb 2018 A1
20180056527 Farritor et al. Mar 2018 A1
20180153408 Yao et al. Jun 2018 A1
20180228559 Brierton et al. Aug 2018 A1
20180271511 Stanton Sep 2018 A1
20180325610 Cameron et al. Nov 2018 A1
20190000567 Allen et al. Jan 2019 A1
20190029765 Crawford et al. Jan 2019 A1
20190167362 Crawford et al. Jun 2019 A1
20190274765 Crawford et al. Sep 2019 A1
Foreign Referenced Citations (19)
Number Date Country
107753106 Mar 2018 CN
202008009571 Oct 2008 DE
102014226240 Jun 2016 DE
3241518 Nov 2017 EP
3375399 Sep 2018 EP
2007-508117 Apr 2007 JP
2010-269142 Dec 2010 JP
2013-517101 May 2013 JP
2013-154138 Aug 2013 JP
2016-514562 May 2016 JP
2016-539681 Dec 2016 JP
2017-514581 Jun 2017 JP
2017-524483 Aug 2017 JP
2017-205495 Nov 2017 JP
2018-011938 Jan 2018 JP
2018-079304 May 2018 JP
2018-110841 Jul 2018 JP
2018-532465 Nov 2018 JP
2017203531 Nov 2017 WO
Non-Patent Literature Citations (1)
Entry
US 8,231,638 B2, 07/2012, Swarup et al. (withdrawn)
Related Publications (1)
Number Date Country
20220330954 A1 Oct 2022 US
Divisions (1)
Number Date Country
Parent 16695310 Nov 2019 US
Child 17662437 US
Continuation in Parts (2)
Number Date Country
Parent 16452737 Jun 2019 US
Child 16695310 US
Parent 16361863 Mar 2019 US
Child 16452737 US