Robotic system and method for spinal and other surgeries

Information

  • Patent Grant
  • Patent Number
    11,744,648
  • Date Filed
    Sunday, April 12, 2020
  • Date Issued
    Tuesday, September 5, 2023
Abstract
The present invention relates to a method, such as a surgical method, for assisting a surgeon in placing screws in the spine using a robot attached to a passive structure. The present invention also relates to a method, such as a surgical method, for assisting a surgeon in removing volumes in the body of a patient using a robot attached to a passive structure, and to a device to carry out said methods. The present invention further concerns a device suitable for carrying out the methods according to the present invention.
Description
FIELD OF THE INVENTION

The present invention concerns a robotic system and methods for surgical procedures. More specifically, the present invention concerns methods for assisting the surgeon to carry out a surgical procedure using a robotic system and computer means.


BACKGROUND OF THE INVENTION

Spine Surgeries


BACKGROUND

Spine surgeries often use fixations and implants attached to vertebrae using screws. It is important to place the screws properly so that they violate neither the spinal cord nor the arteries. This can be a difficult task because of the required precision, the density of critical structures and the constrained access to the vertebrae. For these reasons surgeons use support systems that can enhance the accuracy of the screw placement.


In spine surgeries there are the following methods used for placing the screws:


1. Purely manual


2. Manual using navigation systems


3. Using robotic systems


Manual Methods


In the traditional manual technique, a surgeon visually judges the screw trajectory on the basis of pre-operative CT scans. During drilling, fluoroscopic images are taken to verify that the trajectory is correct. An advantage of this technique is that, apart from standard reconstruction systems, no additional tools are needed, and it can always be used in case of an emergency. On the other hand, it relies strongly on the surgeon's experience and can be affected by his varying condition. Safety is also questionable, as the fluoroscopic images are taken only after the drilling is done, and the accuracy and information shown on those images can vary. Drilling is technically difficult because the tools are held by hand; the surgeon needs very good coordination and must be able to perform many tasks simultaneously. Owing to these disadvantages, screw misplacement rates of 30-50% in the cervical spine have been reported.


Manual Methods Using Navigation Systems


Navigation systems can measure the positions of the surgical tools and of the patient in the operating room. Currently, optical tracking is most often used for these measurements, but other methods such as electromagnetic tracking can also be used. Procedures involving those systems will be referred to as image-guided surgeries. Because of their improved accuracy, image-guided procedures have made screw placement in the cervical spine possible for certain patients. Image-guided surgeries in the spinal domain are, however, still performed manually. For this reason the surgical tools, though tracked, can be wrongly positioned because of human limitations, and precision remains subject to a variable human factor. These techniques demand increased attention from the surgeon, as he needs to coordinate his operations with virtual indications on the screen. In case of a procedural error large inaccuracies can appear, and for this reason staff training is important. Problems with the verification of the registration accuracy are common.


Methods Using Robotic Systems


A few attempts have been made to introduce robotic systems for spinal surgeries. One of them is the Miro/KineMedic robotic system developed at the German Aerospace Center (DLR). It is designed for surgical telemanipulation. The robotic part of the system consists of three lightweight robotic arms. Each joint is equipped with a force sensor and uses a sophisticated control system with force feedback and gravity compensation. The robot's redundancy is used for workspace optimization and allows additional criteria to be fulfilled in the operating room. A proposed setup for pedicle screw placement with the Miro/KineMedic system would consist of the DLR lightweight robotic arm, an optical tracking system and the software. The surgeon plans the surgery in advance. In the operating room several robot control modes are available. Initially the robotic arm is moved to the planned position by the surgeon using hands-on impedance control. When it is in place, the surgeon can start drilling using a driller held by a passive tool holder attached to the robot's end effector. The robot compensates for position errors while the surgeon performs the axial movement. The authors do not specify in which parts of the spine the robot could work. The proposed registration method, using surface matching only, could be insufficient in a general situation, as those algorithms need a good starting point and converge to the closest local minimum. It is not specified whether standard surgical reconstruction tools could be used in this system, which can be crucial for acceptance in the medical domain. A relatively big robotic arm can have disadvantages in the dense environment of an operating room, and it is not said how it would be interfaced with the equipment of the operating room. Sophisticated impedance-control algorithms can be difficult to certify in the medical domain, and to date no such arm has been certified. The expected accuracy of the system is not mentioned. 
To the authors' knowledge, no further publications concerning this proposal are available.


Another robotic system for spinal surgery is Mazor's SpineAssist. It consists of a miniature robot attached to the spine with a base platform, and a workstation for planning and navigation. Registration is based on matching between pre-operative CT scans and intra-operative fluoroscopic images acquired with a calibrated device. In the next step the robot moves to the planned spatial position and the surgeon performs the surgery via the tool guide. The robot does not move during the intervention, acting as a tool holder (passive guidance). The system was tested with good results. However, the SpineAssist can be used only in the thoracic and lumbar parts of the spine and cannot be used in the cervical spine, where high accuracy is most important. Fluoroscopic registration has certain disadvantages and needs a calibrated C-Arm; possible hard-to-detect errors were reported. The robotic arm does not compensate for random vertebral movements while drilling, and drill slippage on the surface of the vertebrae causing large inaccuracies was reported.


Another robotic system for spinal surgery is the Cooperative Robotic Assistant. It consists of a 6-degree-of-freedom robot with a kinematically closed structure. It uses a new drill-by-wire mechanism for placing the screws and a 1-degree-of-freedom haptic device to provide force feedback to the surgeon. An achieved accuracy below 1 μm for the robotic part was reported. The authors state that the closed construction was chosen for rigidity reasons. The robot takes up a lot of space in the operating room, and the equipment of the operating room would have to be strongly adapted to be used with this system. The drill-by-wire mechanism needs its own tools, which can be a limit for acceptance in the medical field. The system does not perform any external measurements, so nothing about registration methods is known; the precision of the registration will strongly influence the overall accuracy, which was measured for the robotic arm alone.


Another robotic system is the Spinebot system for lumbar spine surgery. It consists of a 3-degree-of-freedom positioner, and gimbals and a drilling tool having 2 degrees of freedom each. It uses an optical tracking system for registration and measurements. A big advantage of the system is that during the surgery holes in the spine can be drilled percutaneously (through the skin). However, the system can work only in the lumbar part of the spine, where the needed accuracy is much lower than in the cervical part and access is easier.


SUMMARY OF THE INVENTION

An aim of the present invention is to improve the known systems and methods.





The invention will be described in more detail in the following specification and with reference to the drawings which show:



FIG. 1 illustrates the different elements of a proposed robotic system for spinal surgeries;



FIGS. 2(a) and 2(b) illustrate an example of patient registration;



FIGS. 3(a) and 3(b) illustrate the indicators helping the surgeon to position the robot;



FIG. 4 illustrates a screenshot of an ENT surgical procedure;



FIG. 5 illustrates a block diagram of the method in one embodiment;



FIG. 6 illustrates a block diagram of the method in another embodiment;



FIG. 7 illustrates a block diagram of the system according to the invention.





In an embodiment the invention concerns a method for assisting a user in placing screws in the spine of a patient using a robot attached to a passive structure and holding a tool, wherein said method comprises the following steps:


after a marker of a tracking system is attached to a vertebra, the patient's position is registered in that the transformation between the position of the vertebra and that of the attached marker and/or the planning is found;


the robot is positioned such that the planned screw trajectory is inside the robot's workspace by moving the passive structure;


a navigation software assists the user in doing this task, whereby the user unblocks the passive structure and manually moves the robot to a position indicated by the navigation software;


a target robot position, or at least a suitable robot position, is determined;


in this case the user may block the passive structure such that the robot will be rigidly held in place;


when the screw trajectory is inside the robot's workspace the robot starts to automatically follow it in real time, i.e. the vertebra and robot positions are measured and, if one of them moves, the robot will change the position of the tool to compensate;


the user can proceed with the desired surgical procedure.


In an embodiment, the invention concerns a method for assisting a user in removing volumes in the body of a patient using a robot attached to a passive structure and holding a tool, wherein said method comprises the following steps:


after a marker of the tracking system is attached to the patient, the patient's position is registered in that the transformation between the position of the volumes and that of the attached marker is found;


the robot is positioned such that the planned volume(s) to be removed is (are) inside the robot's workspace by moving the passive structure;


a navigation software assists the user in doing this task, whereby the user unblocks the passive structure and manually moves the robot to the position indicated by the navigation software;


a target robot position, or at least a suitable robot position, is determined;


in this case the user may block the passive structure such that the robot will be rigidly held in place;


when the volume(s) to be removed is (are) in the robot's workspace the robot starts to automatically compensate for the patient's movements in real time, i.e. the marker and robot positions are measured and, if one of them moves, the robot will change the position of the tool to compensate;


the user can proceed with the standard surgical procedure whereby the navigation software controls the robot's position so that the tool held by the robot (driller or shaver) does not violate the “no-go” zones defined during planning.


In an embodiment, the methods comprise a haptic interaction of the surgeon with the device.


In an embodiment the user feels repulsive/wall-like forces on the haptic device when the tool approaches the “no-go” zone.


In an embodiment the volumes to be removed (stay-in zones) and volumes that must be protected (no-go zones) are defined preoperatively or intra-operatively.


In an embodiment, if the user wants to remove a certain volume he enters it with the tool, and inside said volume the tool remains blocked until he explicitly wants to leave it (“stay-in” volume).


In an embodiment when the tool stays inside the stay-in volume the user feels repulsive/wall-like forces that prevent him from leaving the volume.


In an embodiment margins of interaction around the “no-go” and “stay-in” zones can be defined.


In an embodiment the coupling between the haptic device movements and the robot movements is definable to allow the user to have small movements/high precision or big movements/high speed.


In an embodiment automatic compensation of the patient's movement is switched off and is done manually by the user.


In an embodiment the target position of the robot, or at least a suitable robot position, is shown as a semi-transparent phantom image (indicator) on a screen; the phantom is in a first color at the beginning and changes to another color when the robot's workspace contains the screw trajectory or the volume to be removed. Other indicators may be used.


In an embodiment the invention concerns a device comprising at least


a surgery planning software,


a robotic system, comprising an active robot, a passive structure for positioning the active robot, and a controller,


a measurement system for real-time patient and robot position measurements and position tracking, and


a workstation with a navigation software controlling the device and for providing feedback to the user.


In an embodiment the workstation is a computer, such as a personal computer.


In an embodiment a computer contains the surgery planning software and monitors the measurement system.


In an embodiment the active robot covers a small volume and the passive structure covers a large volume.


DETAILED DESCRIPTION OF THE INVENTION
Spine Surgery

The robotic system described in this part is used, as a practical example, to assist the surgeon while placing screws into a vertebra. The system comprises the following elements (see also FIG. 7):


1. A surgery planning software (known in principle in the art)


a) the planning is based on medical images obtained pre-operatively (CT, MRI or other methods)


b) the planning software allows the surgeon to define the data needed for the surgery, which can be: screw trajectories and data for the registration. The planning software can suggest to the surgeon the best trajectories for the screws


c) if point-to-point registration followed by surface matching is used, the surgeon defines landmarks (natural or artificial) and generates a 3D model of the vertebra


Alternatively, it is possible to use the following system without explicit pre-operative planning. In such a case, the user/surgeon decides intra-operatively about the trajectory based on his experience and/or medical images.


2. Compact robot with sufficient accuracy and rigidity. The corresponding robotic system is disclosed in parallel applications EP N° 11160893.1 filed on Apr. 1, 2011 and PCT application N° PCT/IB2012/051607, filed on Apr. 2, 2012, both in the name of the same Applicant as the present application and the content of which is incorporated by reference in its entirety in the present application.


a) the robot positions or helps to position surgical tools


b) the robot has sufficient number of degrees of freedom to define the screw trajectories in space,


c) the robot's absolute accuracy should be the same as or better than the accuracy required by the optical tracking, the medical imaging and the application. For example, this accuracy could be around 0.1 mm.


d) the robot's rigidity should be sufficient to ensure the robot's accuracy while the surgeon operates the tools; the robot's workspace should be big enough so that manual positioning of the robot (using the passive structure) is simple,


3. Robot's controller (see the robotic system disclosed in applications EP N° 11160893.1 filed on Apr. 1, 2011 and PCT application N° PCT/IB2012/051607 filed on Apr. 2, 2012 mentioned above)


a) controls the robot's end effector position and/or velocity and/or force,


b) can have different control modes: position, velocity, torque.


4. Passive structure positioning the robot in space (see the robotic system disclosed in applications EP N° 11160893.1 filed on Apr. 1, 2011 and PCT application N° PCT/IB2012/051607 filed on Apr. 2, 2012 mentioned above),


a) the passive structure can be in a blocked state, holding the robot rigidly in space, or in an unblocked state, allowing the surgeon to freely position the robot in space manually,


b) the passive structure extends the robot's workspace and should be designed so that all required tool positions can be achieved,


c) the passive structure's rigidity should be sufficient so that the system composed of the passive structure and the robot has the required accuracy while the surgeon operates the tools,


d) it should be possible to integrate the passive structure with the equipment in the operating room


e) to simplify the usage of the passive structure it can have additional features like: gravity compensation, manipulation adapted to one person, a convenient blocking/unblocking interface (e.g. pedals)


5. Measurement system for real-time patient and robot position measurements (see the robotic system disclosed in applications EP N° 11160893.1 filed on Apr. 1, 2011 and PCT application N° PCT/IB2012/051607 filed on Apr. 2, 2012 mentioned above)


a) different measurement systems, known in principle in the art, can be used: electromagnetic, fixed (when the target bone/tissue position is fixed and the robot arm is used to register it), template-based and others. The most popular is optical tracking with appropriate markers.


b) the optical tracking system comprises for example a camera, markers (attached to the robot and the patient) and a pointer (which can measure a single point in space),


c) precision of the optical tracking system should be sufficient to fulfill the system requirements. For example it should be around 0.2 mm.


d) if the robot's position real-time update (explained later) is to be used the frequency of the measurements (for the whole scene, not one marker) should be sufficient to avoid delays, for example around 20 Hz.


e) the position of the tool (held by the robot or the surgeon) can also be measured. In this case measuring the robot's position might not be necessary


6. Workstation with navigation software controlling all devices and providing feedback for the surgeon (see FIGS. 2(a)-2(b), 3(a)-3(b)).


a) the navigation software knows about the patient and robot positions. It can measure the tool position (if relevant),


b) the navigation software can help the surgeon to find the offset between the patient's marker and the vertebra in the registration process,


c) the navigation software can command the robot's position,


d) the navigation software controls the robot's position so that the surgeon with the robotic assistance places the screw along the planned trajectory,


e) the robot's controller can be external or integrated in the navigation software,


f) the navigation software can assist the surgeon in going through phases of the surgery,


g) the navigation software can present to the surgeon a graphical feedback: real-time 3D rendering of the measured objects (robot, patient, pointer) and medical images


h) the navigation software can integrate interfaces to the equipment of the operating room, such as a C-Arm or O-Arm. Especially in the case of integration with intra-operative medical imaging, these devices can provide automatic registration processes and support surgical planning.


i) the navigation software can use different input devices: touchscreen, touchpad, mouse, keyboard, pedals and specialized input devices.


The navigation software may be used to allow the robot to follow any movement of the patient whereby the patient's position is changed. This function may be automatic or on demand.


Example Surgery Workflow (see FIG. 5)


FIG. 1 illustrates the basic elements of the proposed robotic system for spinal surgeries. R corresponds to an active robot, PS corresponds to a passive holding structure, T corresponds to a camera of an optical tracking system, and M corresponds to a skull clamp for fixing the patient's head. This robotic system corresponds to the one disclosed in applications EP N° 11160893.1 filed on Apr. 1, 2011 and PCT application N° PCT/IB2012/051607 filed on Apr. 2, 2012 mentioned above and incorporated herein.


Planning for the surgery is based on CT images obtained pre-operatively, as is usual in the present art. Planning can also be done using medical images obtained from different devices (MRI, fluoroscopy, scanners, ultrasound). The CT images must have a proper resolution, which can be achieved using standard scanners. Using standard surgical views (axial, sagittal, coronal) and a 3D view, the surgeon defines screw trajectories and natural landmarks (for a point-to-point registration) and generates a 3D model of the vertebra (for surface matching and visualization). The data is saved to a file which can be read by the navigation software.


Alternatively, the planning can be done intra-operatively when the user/surgeon defines the trajectories using elements of the system (like pointer or trocar) and saves them for future execution.



FIG. 2 illustrates the dialogs (screenshots) used during the patient registration, typically as presented on the screen of a workstation: specifically, FIG. 2(a) illustrates point-to-point registration (coarse registration), and FIG. 2(b) illustrates surface matching (fine registration).


During the surgery the patient lies prone with his head fixed in the Mayfield structure M (see FIG. 1). When access to the vertebra is open, an optical marker of the optical tracking system is attached to it.


Alternatively, patient registration can be done automatically using an intra-operative imaging device.


In the next step the patient's position is registered (the transformation between the vertebrae and attached marker and/or planning is found). Such procedures are known in principle in the art.


Firstly (as a coarse registration) the user/surgeon measures natural landmarks on the vertebra (using a pointer, for example), the same as defined during the planning on the images obtained pre-operatively. The navigation software assists him in doing that by showing the natural landmarks on the 3D model (see FIG. 2(a)). Specifically, FIG. 2(a) shows a 3D model of the vertebra (medical images can be used too) with a landmark to be measured shown as a sphere 1. A list of points to be measured is available. The user/surgeon is informed if markers of the optical tracking system are occluded and/or if the precision of the measurement is decreased. A specialized algorithm may be used to find the best matching between the measured and planned points. The error is shown to the user and, if it is sufficiently small, the user can proceed to the next step.


The software finds the best correspondence between the sets of planned and measured points and shows an estimated error to the surgeon. If the error is acceptable the surgeon can start measuring random points on the surface of the vertebra (fine registration). When a sufficient number of points is collected (for example 30 points) the navigation software will look for the best match between them and the 3D model of the vertebra generated during the planning. When the best match is found, the results with an estimated error are shown (see FIG. 2(b)). If the error is acceptable the surgery can progress to the next stage; otherwise the registration should be restarted. Specifically, FIG. 2(b) shows the situation where multiple points (illustrated as spheres 2) on the surface of the vertebra were measured and added to the 3D model of the vertebra (medical images can be used too). A list of already measured points is available. The user/surgeon is informed if markers of the optical tracking system are occluded and/or if the precision of the measurement is decreased. Registration is started when a sufficient number of random points has been measured, and the calculated error is shown to the user.
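The coarse matching step above can be sketched in code. The following is a translation-only illustration under simplifying assumptions (a real implementation would also estimate the rotation, for example with a least-squares rigid-body fit, and the fine registration would refine the result with an iterative surface-matching scheme); the function name is hypothetical.

```python
import math

def register_points(planned, measured):
    """Coarse point-to-point registration (translation-only sketch).

    planned, measured: lists of (x, y, z) landmark coordinates in
    corresponding order.  Returns the translation mapping the planned
    landmarks onto the measured ones and the RMS residual error that
    would be shown to the surgeon.
    """
    n = len(planned)
    # Best translation in the least-squares sense aligns the centroids.
    cp = [sum(p[i] for p in planned) / n for i in range(3)]
    cm = [sum(m[i] for m in measured) / n for i in range(3)]
    t = [cm[i] - cp[i] for i in range(3)]
    # Residual error after applying the translation.
    sq = sum(sum((p[i] + t[i] - m[i]) ** 2 for i in range(3))
             for p, m in zip(planned, measured))
    return t, math.sqrt(sq / n)

planned = [(0, 0, 0), (10, 0, 0), (0, 10, 0), (10, 10, 0)]
measured = [(1, 2, 0), (11, 2, 0), (1, 12, 0), (11, 12, 0)]
t, rms = register_points(planned, measured)
# A pure translation of (1, 2, 0): t == [1.0, 2.0, 0.0], rms == 0.0
```

If the residual error is small the user proceeds to the fine registration; otherwise the landmarks are re-measured, as in the workflow described above.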


In the next step the robot R should be positioned using the passive structure PS so that the planned screw trajectory is inside the robot's workspace. The navigation software assists the user/surgeon in doing this task. The user/surgeon unblocks the passive structure and manually moves the robot to the position indicated by the navigation software. The ideal robot position can be shown for example as a semi-transparent phantom (indicator). The phantom is in one color (for example red) at the beginning and changes to another color (for example green) if and when the screw trajectory is inside the robot's workspace. In this case the surgeon can block the passive structure, which will rigidly hold the robot in place for the procedure. Of course, other means and procedures can be used to position the robot, for example using haptic principles to indicate to the user when the trajectory is within the working volume. Other equivalent indicators may also be used to position the robot in the proper working volume for the intended procedure.



FIG. 3 illustrates exemplary indicators helping the user/surgeon to manually position the robot R after unblocking the passive structure PS. The current robot position 10 is shown for example in grey; the ideal robot position is shown for example as a semi-transparent color indicator, reference 11. If the planned screw trajectory is outside the robot's workspace the indicator is in one color, for example red (FIG. 3(a)); otherwise it takes another color, for example green (FIG. 3(b)), when the screw trajectory is in the working volume.
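A minimal sketch of this indicator logic, assuming (purely for illustration) that the robot's workspace can be approximated by a sphere; a real check would query the reachable set given by the robot's kinematics, and the function names are hypothetical.

```python
def trajectory_in_workspace(entry, target, center, radius):
    """True when both endpoints of the planned screw trajectory lie in
    the (here spherical, hence convex) robot workspace; a convex
    workspace containing both endpoints contains the whole segment."""
    def inside(p):
        return sum((p[i] - center[i]) ** 2 for i in range(3)) <= radius ** 2
    return inside(entry) and inside(target)

def indicator_color(entry, target, center, radius):
    """Color of the semi-transparent phantom shown to the user/surgeon."""
    if trajectory_in_workspace(entry, target, center, radius):
        return "green"  # trajectory reachable: passive structure can be blocked
    return "red"        # keep moving the robot with the passive structure

# Workspace centered near the trajectory -> green; robot too far away -> red.
print(indicator_color((0, 0, 0), (0, 0, 50), (0, 0, 25), 40))   # green
print(indicator_color((0, 0, 0), (0, 0, 50), (200, 0, 0), 40))  # red
```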


When the screw trajectory is inside the robot's workspace the robot can start to automatically follow it in real time, i.e. the vertebra and robot positions are measured and, if one of them moves, the robot will change the position of the tool to compensate. This is an important feature that increases precision, decreases the forces exerted on the vertebra and is not possible to achieve manually. It is done by tracking a change of the position of the vertebra and imposing the same change on the robot. Alternatively, this function may not be automatic but only activated upon request by the user.
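This compensation can be sketched with 4x4 homogeneous transforms: at lock-in, the tool pose relative to the vertebra is stored, and on every measurement cycle the same relative pose is re-imposed on the newly measured vertebra pose. A hypothetical illustration only, not the certified controller.

```python
def mat_mul(a, b):
    """Product of two 4x4 matrices (homogeneous transforms)."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def rigid_inv(t):
    """Inverse of a rigid 4x4 transform: R -> R^T, p -> -R^T p."""
    r = [[t[j][i] for j in range(3)] for i in range(3)]
    p = [-sum(r[i][j] * t[j][3] for j in range(3)) for i in range(3)]
    return [r[0] + [p[0]], r[1] + [p[1]], r[2] + [p[2]], [0, 0, 0, 1]]

def compensate(vertebra_now, vertebra_ref, tool_ref):
    """Re-target the tool so its pose relative to the vertebra is the
    same as at lock-in: T_tool = T_vert_now * inv(T_vert_ref) * T_tool_ref."""
    return mat_mul(mat_mul(vertebra_now, rigid_inv(vertebra_ref)), tool_ref)

def translation(x, y, z):
    """Pure-translation homogeneous transform (identity rotation)."""
    return [[1, 0, 0, x], [0, 1, 0, y], [0, 0, 1, z], [0, 0, 0, 1]]

# Vertebra drifts 1 mm along x: the tool target follows by the same offset,
# moving from (0, 0, 5) to (1, 0, 5).
target = compensate(translation(1, 0, 0), translation(0, 0, 0),
                    translation(0, 0, 5))
```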


Now the user/surgeon can proceed with the standard surgical procedure, which comprises drilling (using a sharpened wire called a K-wire), optionally drilling using a cannulated drill, and screw placement.


Different procedures can of course be used with other systems (e.g. non-cannulated systems).



FIG. 5 illustrates an example of an embodiment of the method with a block diagram comprising the general steps.


All operations are done through a tube held by the robot (called a trocar), which ensures that the screw is placed in the position defined by the robot. After the screw is placed the surgeon can place another screw in the same vertebra or move to another vertebra and redo the process.


Different procedures can be applied for percutaneous surgeries.


ENT Surgeries
BACKGROUND

Some of the ENT (Ear Nose Throat) surgeries comprise the step of removing volumes like tumors, polyps etc. Users/surgeons use drillers (for bones) and shavers (for soft tissues), which they operate manually. Different tools for tissue removal can also be used, like lasers, coagulators etc. In many cases they use an endoscopic camera, which is not convenient because bleeding drastically decreases the field of view. When bleeding begins the user/surgeon has to stop the operations, apply a dressing and wait until the bleeding stops. For this reason ENT surgeries take a lot of time. They can also be dangerous because, when the visibility is constrained, important tissues like nerves, orbits, brain etc. can be damaged by accident.


System Elements


System elements are similar to the ones used in the spinal surgeries (see above and FIG. 7) with the following changes:


1. Planning:


b) instead of the screw trajectories, the user/surgeon defines the volumes that he wants to remove (called “stay-in” zones, like tumors and polyps) and the volumes that must be protected (called “no-go” zones, like nerves, orbits and other important tissues)


2. Compact robot (see the robotic system disclosed in applications EP N° 11160893.1 filed on Apr. 1, 2011 and PCT application N° PCT/IB2012/051607 filed on Apr. 2, 2012 mentioned above)


b) the robot has sufficient number of degrees of freedom to guide the driller or shaver or another surgical tool in space, for example 5 or 6 DOFs.


Additional Points:


the robot may have force sensor(s) integrated,


the force sensor(s) may be mounted on the tool tip (for measuring forces on the tool tip) and/or in the tool fixation (for measuring forces on the tool)


1. Robot's controller:


b) should have a control mode suitable for teleoperation


2. Workstation with navigation software ( . . . ):


d) the navigation software controls the robot's position so that the tool held by the robot (driller or shaver) does not violate the “no-go” zones defined during planning. If the user/surgeon wants to remove a certain volume he should enter it with the tool; inside such a volume the tool remains blocked until he explicitly wants to leave it (“stay-in” zone). There are other ways of realizing the concept of “stay-in” and “no-go” zones, the idea being to make such procedures safer.


Additional Points:


the user/surgeon commands the robot positions using a haptic device, the principle of such devices being known in the art


when the tool approaches the “no-go” zone the user/surgeon feels repulsive/wall-like forces on the haptic device to inform him of the position of the tool


when the tool is supposed to stay inside the stay-in volume the user/surgeon feels repulsive/wall-like forces that prevent him from leaving the volume as long as it is required


the margin of interaction around the “no-go” and “stay-in” zones may be defined,


the coupling between the haptic device movements and the robot movements may be defined to allow the surgeon to have small movements/high precision or big movements/high speed and additional features like tumor removal.
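One purely illustrative way to realize the “no-go”/“stay-in” behavior on the control side is to clamp each commanded tool point: project it out of any violated “no-go” zone and back inside an active “stay-in” zone. Spherical zones and the function names are assumptions made here for brevity; the real system works with volumes segmented from the medical images.

```python
import math

def clamp_to_zones(cmd, no_go, stay_in=None):
    """Clamp a commanded tool point (x, y, z) against spherical zones.

    no_go, stay_in: (center, radius) pairs.  The point is projected
    onto the zone surface whenever it would violate the constraint.
    """
    def project(p, center, radius, keep_inside):
        d = math.dist(p, center)
        ok = (d <= radius) if keep_inside else (d >= radius)
        if ok or d == 0.0:  # d == 0: direction undefined, leave as-is
            return list(p)
        s = radius / d  # scale along the center-to-point direction
        return [center[i] + (p[i] - center[i]) * s for i in range(3)]

    p = project(cmd, *no_go, keep_inside=False)
    if stay_in is not None:
        p = project(p, *stay_in, keep_inside=True)
    return p

# A command inside the no-go sphere is pushed out to its surface.
print(clamp_to_zones((0, 0, 1), ((0, 0, 0), 2)))          # [0.0, 0.0, 2.0]
# A command leaving the stay-in sphere is pulled back to its surface.
print(clamp_to_zones((0, 0, 5), ((100, 0, 0), 1),
                     stay_in=((0, 0, 0), 3)))             # [0.0, 0.0, 3.0]
```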


Surgery Workflow (see FIG. 6)

Planning is similar to that in the system used for the spinal surgery. Instead of the screw trajectories, the surgeon should generate models of the “no-go” and “stay-in” zones in the pre-operative images. Alternatively, such zones may be defined during the procedure if this is possible or suitable.


Registration and manual robot positioning using the passive structure are the same as for the spinal surgery disclosed above.


The tool used in the surgery (for example a driller or shaver) should be fixed to the end effector of the robot R. When the desired volumes are inside the robot's workspace the user/surgeon can control the robot position using the haptic device with the assistance of the navigation software.


When approaching a “no-go” zone the user/surgeon feels a repulsive force on the haptic device which prevents him from touching important tissues.


When he enters a “stay-in” zone the tool remains blocked inside said zone until he explicitly wants to leave. He can move the tool inside the volume and follow, for example, virtual tumor walls felt on the haptic device until he is sure that all needed tissue has been removed. The margins of interaction with the walls can be defined so that, for example, it is possible to remove 80% of the tumor or 120% (the tumor and the tissue around it). The coupling between the haptic device and the robot movements can be defined so that the surgeon can have small movements/high precision or big movements/high speed. Other algorithms for controlling the haptic device can be implemented.
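The wall forces and the haptic-robot coupling described above can be sketched as follows; this is a one-dimensional, hypothetical illustration (real haptic rendering runs in 3D at high rates, and the stiffness value here is arbitrary).

```python
def wall_force(distance_to_wall, margin, stiffness):
    """Repulsive "wall" force along one axis: zero outside the
    interaction margin, spring-like once the tool tip penetrates it."""
    if distance_to_wall >= margin:
        return 0.0
    return stiffness * (margin - distance_to_wall)

def haptic_to_robot(haptic_delta, scale):
    """Map a haptic-device displacement to a robot displacement:
    scale < 1 gives small movements / high precision,
    scale > 1 gives big movements / high speed."""
    return [scale * d for d in haptic_delta]

print(wall_force(5.0, 2.0, 100.0))            # 0.0 (outside the margin)
print(wall_force(1.0, 2.0, 100.0))            # 100.0 (1 unit inside the margin)
print(haptic_to_robot([2.0, 0.0, 4.0], 0.5))  # [1.0, 0.0, 2.0]
```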


Heavy bleeding can be accepted as it does not disturb the robot operation (the robot and patient positions are measured by the optical tracking system, so there is no need for an endoscope except for control and verification). As the tumor can be removed quickly (in a few minutes), heavy bleeding during a short time can be accepted for the patient.



FIG. 4 illustrates screenshots of the navigation software used in the ENT surgery. The surgeon controls the tool position using a haptic device. He feels repulsive forces when he approaches the “no-go” zones, and he remains inside the “stay-in” zone until he is sure that all needed tissue has been removed.



FIG. 6 illustrates an example of an embodiment of the method with a block diagram comprising the general steps.



FIG. 7 illustrates, in a block diagram, an example of a system of the invention with the different elements forming a system suitable for carrying out the method. As defined hereabove, the system comprises at least a surgery and planning system, a robotic system, a measurement system and a workstation, such as a computer station.


The examples and values (sizes, DOF, etc.) given in the above description are for illustrative purposes only and should not be construed as limiting the scope of the invention. Equivalent means may be envisaged by a skilled person, and the embodiments described herein may also be combined as desired.

Claims
  • 1. A method of performing surgery comprising: providing a robotic surgical system having: a robot comprising an end-effector for holding a surgical tool for use in a surgical procedure; a controller for controlling the position of the end-effector, wherein the end-effector is adapted to be manually and freely positioned by the surgeon during the operation; and a measurement system with a computer processor for: measuring, by the processor, a position of the surgical tool held by the end effector and a position of a bone of a patient; determining, by the processor, a change in the position of the bone; and automatically adjusting, via the robot, the position of the end effector based at least in part on the determined change in the position of the bone such that a spatial relationship between the end effector and the bone remains substantially unaltered as at least a portion of the operation is performed, thereby ensuring the surgical tool remains aligned with a predetermined path; performing a surgical procedure with the surgical tool.
  • 2. The method of claim 1, wherein the robot is configured to allow positioning of the surgical tool by the surgeon with at least four degrees of freedom.
  • 3. The method of claim 1, wherein the position of the bone is a position of a marker placed in spatial relation to the bone.
  • 4. The method of claim 1, wherein the measurement system is an optical tracking system comprising a camera, fixed measurement system, or template-based tracking system.
  • 5. The method of claim 4, wherein the measurement system comprises a first marker attached to the robotic surgical system and a second marker attached to the patient.
  • 6. The method of claim 1, comprising a force sensor for measuring forces on the surgical tool.
  • 7. The method of claim 6, wherein the force sensor is mounted on the end-effector.
  • 8. The method of claim 1, wherein the robotic surgical system comprises a display for providing graphical feedback to the surgeon regarding the predetermined path in relation to the bone.
  • 9. The method of claim 1, wherein the surgical tool includes a tube configured to be held by the end-effector.
  • 10. The method of claim 1, wherein the robotic surgical system comprises a passive structure that rigidly holds the robot in place.
  • 11. The method of claim 1, wherein the measurement system provides real-time patient and robot position measurements and position tracking.
  • 12. The method of claim 1, wherein the controller is arranged such that the surgeon can manually and freely position the robot in space using hands-on control.
  • 13. The method of claim 1, wherein the controller permits gross manual positioning of the end-effector.
  • 14. The method of claim 1, wherein the robot comprises a passive structure.
  • 15. The method of claim 14, wherein the passive structure is adapted to be manually and freely positioned by the surgeon during the operation.
  • 16. The method of claim 15, wherein the passive structure is adapted to be blocked by the surgeon such that the robot is rigidly held in place.
  • 17. The method of claim 15, wherein the optical tracking system comprises a pointer that can be used to measure a single point in space.
  • 18. A method of performing a surgical procedure comprising the steps of: providing a robotic surgical system having: a robot comprising an end-effector for holding a surgical tool for use in a surgical procedure, wherein the end-effector can be manually and freely positioned by the surgeon during the operation in a first mode and controlled by the robot in a second mode; and a measurement system with a computer processor for: measuring, by the processor, a position of the surgical tool held by the end effector and a position of a bone of a patient; determining, by the processor, a change in the position of the bone or the position of the end effector based on optical tracking markers positioned on the bone or the end effector; wherein the robot automatically adjusts the position of the end effector based on a pre-planned path; performing a surgical procedure with the surgical tool.
  • 19. The robotic surgical system of claim 18, wherein the robot automatically provides a force on a haptic device and prevents the surgeon from contacting an unsafe portion of the patient, when in the first mode.
  • 20. The robotic surgical system of claim 18, wherein the robot is controlled to remain within a stay-in zone and cannot be moved outside the stay-in zone during an operation in the first mode or the second mode.
CROSS REFERENCE TO RELATED APPLICATIONS

The present application is a continuation of U.S. patent application Ser. No. 15/705,578, which is a continuation of U.S. patent application Ser. No. 14/824,602 filed on Aug. 12, 2015, which is a continuation of U.S. Pat. No. 9,125,680 filed on Oct. 23, 2014 which is a continuation of U.S. Pat. No. 9,308,050 filed on Jan. 10, 2014, which claims the priority of U.S. application 61/470,545 filed on Apr. 1, 2011, the content of which is incorporated by reference in its entirety in the present application.

US Referenced Citations (690)
Number Name Date Kind
4150293 Franke Apr 1979 A
5246010 Gazzara et al. Sep 1993 A
5354314 Hardy et al. Oct 1994 A
5397323 Taylor et al. Mar 1995 A
5408409 Glassman Apr 1995 A
5598453 Baba et al. Jan 1997 A
5772594 Barrick Jun 1998 A
5791908 Gillio Aug 1998 A
5820559 Ng et al. Oct 1998 A
5825982 Wright et al. Oct 1998 A
5887121 Funda et al. Mar 1999 A
5911449 Daniele et al. Jun 1999 A
5951475 Gueziec et al. Sep 1999 A
5987960 Messner et al. Nov 1999 A
6012216 Esteves et al. Jan 2000 A
6031888 Ivan et al. Feb 2000 A
6033415 Mittelstadt et al. Mar 2000 A
6080181 Jensen et al. Jun 2000 A
6106511 Jensen Aug 2000 A
6122541 Cosman et al. Sep 2000 A
6144875 Schweikard et al. Nov 2000 A
6157853 Blume et al. Dec 2000 A
6167145 Foley et al. Dec 2000 A
6167292 Badano et al. Dec 2000 A
6201984 Funda et al. Mar 2001 B1
6203196 Meyer et al. Mar 2001 B1
6205411 DiGioia, III et al. Mar 2001 B1
6212419 Blume et al. Apr 2001 B1
6231565 Tovey et al. May 2001 B1
6236875 Bucholz et al. May 2001 B1
6246900 Cosman et al. Jun 2001 B1
6301495 Gueziec et al. Oct 2001 B1
6306126 Montezuma Oct 2001 B1
6312435 Wallace et al. Nov 2001 B1
6314311 Williams et al. Nov 2001 B1
6320929 Von Der Haar Nov 2001 B1
6322567 Mittelstadt et al. Nov 2001 B1
6325808 Bernard et al. Dec 2001 B1
6340363 Bolger et al. Jan 2002 B1
6377011 Ben-Ur Apr 2002 B1
6379302 Kessman et al. Apr 2002 B1
6402762 Hunter et al. Jun 2002 B2
6424885 Niemeyer et al. Jul 2002 B1
6447503 Wynne et al. Sep 2002 B1
6451027 Cooper et al. Sep 2002 B1
6477400 Barrick Nov 2002 B1
6484049 Seeley et al. Nov 2002 B1
6487267 Wolter Nov 2002 B1
6490467 Bucholz et al. Dec 2002 B1
6490475 Seeley et al. Dec 2002 B1
6499488 Hunter et al. Dec 2002 B1
6501981 Schweikard et al. Dec 2002 B1
6507751 Blume et al. Jan 2003 B2
6535756 Simon et al. Mar 2003 B1
6560354 Maurer, Jr. et al. May 2003 B1
6565554 Niemeyer May 2003 B1
6587750 Gerbi et al. Jul 2003 B2
6614453 Suri et al. Sep 2003 B1
6614871 Kobiki et al. Sep 2003 B1
6619840 Rasche et al. Sep 2003 B2
6636757 Jascob et al. Oct 2003 B1
6645196 Nixon et al. Nov 2003 B1
6666579 Jensen Dec 2003 B2
6669635 Kessman et al. Dec 2003 B2
6701173 Nowinski et al. Mar 2004 B2
6757068 Foxlin Jun 2004 B2
6782287 Grzeszczuk Aug 2004 B2
6783524 Anderson et al. Aug 2004 B2
6786896 Madhani et al. Sep 2004 B1
6788018 Blumenkranz Sep 2004 B1
6804581 Wang et al. Oct 2004 B2
6823207 Jensen et al. Nov 2004 B1
6827351 Graziani et al. Dec 2004 B2
6837892 Shoham Jan 2005 B2
6839612 Sanchez et al. Jan 2005 B2
6856826 Seeley et al. Feb 2005 B2
6856827 Seeley et al. Feb 2005 B2
6879880 Nowlin et al. Apr 2005 B2
6892090 Verard et al. May 2005 B2
6920347 Simon et al. Jul 2005 B2
6922632 Foxlin Jul 2005 B2
6968224 Kessman et al. Nov 2005 B2
6978166 Foley et al. Dec 2005 B2
6988009 Grimm et al. Jan 2006 B2
6991627 Madhani et al. Jan 2006 B2
6996487 Jutras et al. Feb 2006 B2
6999852 Green Feb 2006 B2
7007699 Martinelli et al. Mar 2006 B2
7016457 Senzig et al. Mar 2006 B1
7043961 Pandey et al. May 2006 B2
7062006 Pelc et al. Jun 2006 B1
7063705 Young et al. Jun 2006 B2
7072707 Galloway, Jr. et al. Jul 2006 B2
7083615 Peterson et al. Aug 2006 B2
7097640 Wang et al. Aug 2006 B2
7099428 Clinthorne et al. Aug 2006 B2
7108421 Gregerson et al. Sep 2006 B2
7130676 Barrick Oct 2006 B2
7139418 Abovitz et al. Nov 2006 B2
7139601 Bucholz et al. Nov 2006 B2
7155316 Sutherland Dec 2006 B2
7164968 Treat et al. Jan 2007 B2
7167738 Schweikard et al. Jan 2007 B2
7169141 Brock et al. Jan 2007 B2
7172627 Fiere et al. Feb 2007 B2
7194120 Wicker et al. Mar 2007 B2
7196454 Baur Mar 2007 B2
7197107 Arai et al. Mar 2007 B2
7231014 Levy Jun 2007 B2
7231063 Naimark et al. Jun 2007 B2
7239940 Wang et al. Jul 2007 B2
7248914 Hastings et al. Jul 2007 B2
7301648 Foxlin Nov 2007 B2
7302288 Schellenberg Nov 2007 B1
7313430 Urquhart et al. Dec 2007 B2
7318805 Schweikard et al. Jan 2008 B2
7318827 Leitner et al. Jan 2008 B2
7319897 Leitner et al. Jan 2008 B2
7324623 Heuscher et al. Jan 2008 B2
7327865 Fu et al. Feb 2008 B2
7331967 Lee et al. Feb 2008 B2
7333642 Green Feb 2008 B2
7339341 Oleynikov et al. Mar 2008 B2
7366562 Dukesherer et al. Apr 2008 B2
7379790 Toth et al. May 2008 B2
7386365 Nixon Jun 2008 B2
7422592 Morley et al. Sep 2008 B2
7435216 Kwon et al. Oct 2008 B2
7440793 Chauhan et al. Oct 2008 B2
7460637 Clinthorne et al. Dec 2008 B2
7466303 Yi et al. Dec 2008 B2
7493153 Ahmed et al. Feb 2009 B2
7505617 Fu et al. Mar 2009 B2
7533892 Schena et al. May 2009 B2
7542791 Mire et al. Jun 2009 B2
7555331 Viswanathan Jun 2009 B2
7567834 Clayton et al. Jul 2009 B2
7594912 Cooper et al. Sep 2009 B2
7606613 Simon et al. Oct 2009 B2
7607440 Coste-Maniere et al. Oct 2009 B2
7623902 Pacheco Nov 2009 B2
7630752 Viswanathan Dec 2009 B2
7630753 Simon et al. Dec 2009 B2
7643862 Schoenefeld Jan 2010 B2
7660623 Hunter et al. Feb 2010 B2
7661881 Gregerson et al. Feb 2010 B2
7683331 Chang Mar 2010 B2
7683332 Chang Mar 2010 B2
7689320 Prisco et al. Mar 2010 B2
7691098 Wallace et al. Apr 2010 B2
7702379 Avinash et al. Apr 2010 B2
7702477 Tuemmler et al. Apr 2010 B2
7711083 Heigl et al. May 2010 B2
7711406 Kuhn et al. May 2010 B2
7720523 Omernick et al. May 2010 B2
7725253 Foxlin May 2010 B2
7726171 Langlotz et al. Jun 2010 B2
7742801 Neubauer et al. Jun 2010 B2
7751865 Jascob et al. Jul 2010 B2
7760849 Zhang Jul 2010 B2
7762825 Burbank et al. Jul 2010 B2
7763015 Cooper et al. Jul 2010 B2
7787699 Mahesh et al. Aug 2010 B2
7796728 Bergfjord Sep 2010 B2
7813838 Sommer Oct 2010 B2
7818044 Dukesherer et al. Oct 2010 B2
7819859 Prisco et al. Oct 2010 B2
7824401 Manzo et al. Nov 2010 B2
7831294 Viswanathan Nov 2010 B2
7834484 Sartor Nov 2010 B2
7835557 Kendrick et al. Nov 2010 B2
7835778 Foley et al. Nov 2010 B2
7835784 Mire et al. Nov 2010 B2
7840253 Tremblay et al. Nov 2010 B2
7840256 Lakin et al. Nov 2010 B2
7843158 Prisco Nov 2010 B2
7844320 Shahidi Nov 2010 B2
7853305 Simon et al. Dec 2010 B2
7853313 Thompson Dec 2010 B2
7865269 Prisco et al. Jan 2011 B2
D631966 Perloff et al. Feb 2011 S
7879045 Gielen et al. Feb 2011 B2
7881767 Strommer et al. Feb 2011 B2
7881770 Melkent et al. Feb 2011 B2
7886743 Cooper et al. Feb 2011 B2
RE42194 Foley et al. Mar 2011 E
RE42226 Foley et al. Mar 2011 E
7900524 Calloway et al. Mar 2011 B2
7907166 Lamprecht et al. Mar 2011 B2
7909122 Schena et al. Mar 2011 B2
7925653 Saptharishi Apr 2011 B2
7930065 Larkin et al. Apr 2011 B2
7935130 Willliams May 2011 B2
7940999 Liao et al. May 2011 B2
7945012 Ye et al. May 2011 B2
7945021 Shapiro et al. May 2011 B2
7953470 Vetter et al. May 2011 B2
7954397 Choi et al. Jun 2011 B2
7971341 Dukesherer et al. Jul 2011 B2
7974674 Hauck et al. Jul 2011 B2
7974677 Mire et al. Jul 2011 B2
7974681 Wallace et al. Jul 2011 B2
7979157 Anvari Jul 2011 B2
7983733 Viswanathan Jul 2011 B2
3004121 Sartor Aug 2011 A1
7988215 Seibold Aug 2011 B2
7996110 Lipow et al. Aug 2011 B2
8004229 Nowlin et al. Aug 2011 B2
8010177 Csavoy et al. Aug 2011 B2
8019045 Kato Sep 2011 B2
8021310 Sanborn et al. Sep 2011 B2
3046057 Clarke Oct 2011 A1
8035685 Jensen Oct 2011 B2
8046054 Kim et al. Oct 2011 B2
3054184 Cline et al. Nov 2011 A1
3054752 Druke et al. Nov 2011 A1
3062288 Cooper et al. Nov 2011 A1
3066524 Burbank et al. Nov 2011 A1
8052688 Wolf, II Nov 2011 B2
8057397 Li et al. Nov 2011 B2
8057407 Martinelli et al. Nov 2011 B2
8062375 Glerum et al. Nov 2011 B2
8073335 Labonville et al. Dec 2011 B2
8079950 Stern et al. Dec 2011 B2
8086299 Adler et al. Dec 2011 B2
8092370 Roberts et al. Jan 2012 B2
8098914 Liao et al. Jan 2012 B2
8100950 St. Clair et al. Jan 2012 B2
8105320 Manzo Jan 2012 B2
8108025 Csavoy et al. Jan 2012 B2
8109877 Moctezuma de la Barrera et al. Feb 2012 B2
8112292 Simon Feb 2012 B2
8116430 Shapiro et al. Feb 2012 B1
8120301 Goldberg et al. Feb 2012 B2
8121249 Wang et al. Feb 2012 B2
8123675 Funda et al. Feb 2012 B2
8133229 Bonutti Mar 2012 B1
8142420 Schena Mar 2012 B2
8147494 Leitner et al. Apr 2012 B2
8150494 Simon et al. Apr 2012 B2
8150497 Gielen et al. Apr 2012 B2
8150498 Gielen et al. Apr 2012 B2
8165658 Waynik et al. Apr 2012 B2
8170313 Kendrick et al. May 2012 B2
8179073 Farritor et al. May 2012 B2
8182476 Julian et al. May 2012 B2
8184880 Zhao et al. May 2012 B2
8202278 Orban, III et al. Jun 2012 B2
8208708 Homan et al. Jun 2012 B2
8208988 Jensen Jun 2012 B2
8219177 Smith et al. Jul 2012 B2
8219178 Smith et al. Jul 2012 B2
8220468 Cooper et al. Jul 2012 B2
8224024 Foxlin et al. Jul 2012 B2
8224484 Swarup et al. Jul 2012 B2
8225798 Baldwin et al. Jul 2012 B2
8228368 Zhao et al. Jul 2012 B2
8231610 Jo et al. Jul 2012 B2
8263933 Hartmann et al. Jul 2012 B2
8239001 Verard et al. Aug 2012 B2
8241271 Millman et al. Aug 2012 B2
8248413 Gattani et al. Aug 2012 B2
8256319 Cooper et al. Sep 2012 B2
8271069 Jascob et al. Sep 2012 B2
8271130 Hourtash Sep 2012 B2
8281670 Larkin et al. Oct 2012 B2
8282653 Nelson et al. Oct 2012 B2
8301226 Csavoy et al. Oct 2012 B2
8311611 Csavoy et al. Nov 2012 B2
8320991 Jascob et al. Nov 2012 B2
8332012 Kienzle, III Dec 2012 B2
8333755 Cooper et al. Dec 2012 B2
8335552 Stiles Dec 2012 B2
8335557 Maschke Dec 2012 B2
8348931 Cooper et al. Jan 2013 B2
8353963 Glerum Jan 2013 B2
8358818 Miga et al. Jan 2013 B2
8359730 Burg et al. Jan 2013 B2
8374673 Adcox et al. Feb 2013 B2
8374723 Zhao et al. Feb 2013 B2
8379791 Forthmann et al. Feb 2013 B2
8386019 Camus et al. Feb 2013 B2
8392022 Ortmaier et al. Mar 2013 B2
8394099 Patwardhan Mar 2013 B2
8395342 Prisco Mar 2013 B2
8398634 Manzo et al. Mar 2013 B2
8400094 Schena Mar 2013 B2
8414957 Enzerink et al. Apr 2013 B2
8418073 Mohr et al. Apr 2013 B2
8450694 Baviera et al. May 2013 B2
8452447 Nixon May 2013 B2
RE44305 Foley et al. Jun 2013 E
8462911 Vesel et al. Jun 2013 B2
8465476 Rogers et al. Jun 2013 B2
8465771 Wan et al. Jun 2013 B2
8467851 Mire et al. Jun 2013 B2
8467852 Csavoy et al. Jun 2013 B2
8469947 Devengenzo et al. Jun 2013 B2
RE44392 Hynes Jul 2013 E
8483434 Buehner et al. Jul 2013 B2
8483800 Jensen et al. Jul 2013 B2
8486532 Enzerink et al. Jul 2013 B2
8489235 Moll et al. Jul 2013 B2
8500722 Cooper Aug 2013 B2
8500728 Newton et al. Aug 2013 B2
8504201 Moll et al. Aug 2013 B2
8506555 Ruiz Morales Aug 2013 B2
8506556 Schena Aug 2013 B2
8508173 Goldberg et al. Aug 2013 B2
8512318 Tovey et al. Aug 2013 B2
8515576 Lipow et al. Aug 2013 B2
8518120 Glerum et al. Aug 2013 B2
8521331 Itkowitz Aug 2013 B2
8526688 Groszmann et al. Sep 2013 B2
8526700 Isaacs Sep 2013 B2
8527094 Kumar et al. Sep 2013 B2
8528440 Morley et al. Sep 2013 B2
8532741 Heruth et al. Sep 2013 B2
8541970 Nowlin et al. Sep 2013 B2
8548563 Simon et al. Oct 2013 B2
8549732 Burg et al. Oct 2013 B2
8551114 Ramos de la Pena Oct 2013 B2
8551116 Julian et al. Oct 2013 B2
8556807 Scott et al. Oct 2013 B2
8556979 Glerum et al. Oct 2013 B2
8560118 Greer et al. Oct 2013 B2
8561473 Blumenkranz Oct 2013 B2
8562594 Cooper et al. Oct 2013 B2
8571638 Shoham Oct 2013 B2
8571710 Coste-Maniere et al. Oct 2013 B2
8573465 Shelton, IV Nov 2013 B2
8574303 Sharkey Nov 2013 B2
8585420 Burbank et al. Nov 2013 B2
8594841 Zhao et al. Nov 2013 B2
8597198 Sanborn et al. Dec 2013 B2
8600478 Verard et al. Dec 2013 B2
8603077 Cooper et al. Dec 2013 B2
8611985 Lavallee et al. Dec 2013 B2
8613230 Blumenkranz et al. Dec 2013 B2
8621939 Blumenkranz et al. Jan 2014 B2
8624537 Nowlin et al. Jan 2014 B2
8630389 Kato Jan 2014 B2
8634897 Simon et al. Jan 2014 B2
8634957 Toth et al. Jan 2014 B2
8638056 Goldberg et al. Jan 2014 B2
8638057 Goldberg et al. Jan 2014 B2
8639000 Zhao et al. Jan 2014 B2
8641726 Bonutti Feb 2014 B2
8644907 Hartmann et al. Feb 2014 B2
8657809 Schoepp Feb 2014 B2
8660635 Simon et al. Feb 2014 B2
8666544 Moll et al. Mar 2014 B2
8675939 Moctezuma de la Barrera Mar 2014 B2
8678647 Gregerson et al. Mar 2014 B2
8679125 Smith et al. Mar 2014 B2
8679183 Glerum et al. Mar 2014 B2
8682413 Lloyd Mar 2014 B2
8684253 Giordano et al. Apr 2014 B2
8685098 Glerum et al. Apr 2014 B2
8693730 Umasuthan et al. Apr 2014 B2
8694075 Groszmann et al. Apr 2014 B2
8696458 Foxlin et al. Apr 2014 B2
8700123 Okamura et al. Apr 2014 B2
8706086 Glerum Apr 2014 B2
8706185 Foley et al. Apr 2014 B2
8706301 Zhao et al. Apr 2014 B2
8717430 Simon et al. May 2014 B2
8727618 Maschke et al. May 2014 B2
8734432 Tuma et al. May 2014 B2
8738115 Amberg et al. May 2014 B2
8738181 Greer et al. May 2014 B2
8740882 Jun et al. Jun 2014 B2
8746252 McGrogan et al. Jun 2014 B2
8749189 Nowlin et al. Jun 2014 B2
8749190 Nowlin et al. Jun 2014 B2
8761930 Nixon Jun 2014 B2
8764448 Yang et al. Jul 2014 B2
8771170 Mesallum et al. Jul 2014 B2
8781186 Clements et al. Jul 2014 B2
8781630 Banks et al. Jul 2014 B2
8784385 Boyden et al. Jul 2014 B2
8786241 Nowlin et al. Jul 2014 B2
8787520 Baba Jul 2014 B2
8792704 Isaacs Jul 2014 B2
8798231 Notohara et al. Aug 2014 B2
8800838 Shelton, IV Aug 2014 B2
8808164 Hoffman et al. Aug 2014 B2
8812077 Dempsey Aug 2014 B2
8814793 Brabrand Aug 2014 B2
8816628 Nowlin et al. Aug 2014 B2
8818105 Myronenko et al. Aug 2014 B2
8820605 Shelton, IV Sep 2014 B2
8821511 Von Jako et al. Sep 2014 B2
8823308 Nowlin et al. Sep 2014 B2
8827996 Scott et al. Sep 2014 B2
8828024 Farritor et al. Sep 2014 B2
8830224 Zhao et al. Sep 2014 B2
8834489 Cooper et al. Sep 2014 B2
8834490 Bonutti Sep 2014 B2
8838270 Druke et al. Sep 2014 B2
8844789 Shelton, IV et al. Sep 2014 B2
8855822 Bartol et al. Oct 2014 B2
8858598 Seifert et al. Oct 2014 B2
8860753 Bhandarkar et al. Oct 2014 B2
8864751 Prisco et al. Oct 2014 B2
8864798 Weiman et al. Oct 2014 B2
8864833 Glerum et al. Oct 2014 B2
8867703 Shapiro et al. Oct 2014 B2
8870880 Himmelberger et al. Oct 2014 B2
8876866 Zappacosta et al. Nov 2014 B2
8880223 Raj et al. Nov 2014 B2
8882803 Iott et al. Nov 2014 B2
8883210 Truncale et al. Nov 2014 B1
8888821 Rezach et al. Nov 2014 B2
8888853 Glerum et al. Nov 2014 B2
8888854 Glerum et al. Nov 2014 B2
8894652 Seifert et al. Nov 2014 B2
8894688 Suh Nov 2014 B2
8894691 Iott et al. Nov 2014 B2
8906069 Hansell et al. Dec 2014 B2
8964934 Ein-Gal Feb 2015 B2
8992580 Bar et al. Mar 2015 B2
8996169 Lightcap et al. Mar 2015 B2
9001963 Sowards-Emmerd et al. Apr 2015 B2
9002076 Khadem et al. Apr 2015 B2
9044190 Rubner et al. Jun 2015 B2
9107683 Hourtash et al. Aug 2015 B2
9125556 Zehavi et al. Sep 2015 B2
9131986 Greer et al. Sep 2015 B2
9215968 Schostek et al. Dec 2015 B2
9308050 Kostrzewski et al. Apr 2016 B2
9380984 Li et al. Jul 2016 B2
9393039 Lechner et al. Jul 2016 B2
9398886 Gregerson et al. Jul 2016 B2
9398890 Dong et al. Jul 2016 B2
9414859 Ballard et al. Aug 2016 B2
9420975 Gutfleisch et al. Aug 2016 B2
9492235 Hourtash et al. Nov 2016 B2
9592096 Maillet et al. Mar 2017 B2
9750465 Engel et al. Sep 2017 B2
9757203 Hourtash et al. Sep 2017 B2
9795354 Menegaz et al. Oct 2017 B2
9814535 Bar et al. Nov 2017 B2
9820783 Donner et al. Nov 2017 B2
9833265 Donner et al. Nov 2017 B2
9848922 Tohmeh et al. Dec 2017 B2
9925011 Gombert et al. Mar 2018 B2
9931025 Graetzel et al. Apr 2018 B1
10034717 Miller et al. Jul 2018 B2
20010036302 Miller Nov 2001 A1
20020035321 Bucholz et al. Mar 2002 A1
20040068172 Nowinski et al. Apr 2004 A1
20040076259 Jensen et al. Apr 2004 A1
20050096502 Khalili May 2005 A1
20050143651 Verard et al. Jun 2005 A1
20050171558 Abovitz et al. Aug 2005 A1
20060100610 Wallace et al. May 2006 A1
20060142657 Quaid Jun 2006 A1
20060161136 Anderson Jul 2006 A1
20060173329 Marquart et al. Aug 2006 A1
20060184396 Dennis et al. Aug 2006 A1
20060241416 Marquart et al. Oct 2006 A1
20060291612 Nishide et al. Dec 2006 A1
20070015987 Benlloch Baviera et al. Jan 2007 A1
20070021738 Hasser et al. Jan 2007 A1
20070032906 Sutherland Feb 2007 A1
20070038059 Sheffer et al. Feb 2007 A1
20070073133 Schoenefeld Mar 2007 A1
20070156121 Millman et al. Jul 2007 A1
20070156157 Nahum Jul 2007 A1
20070167712 Keglovich et al. Jul 2007 A1
20070233238 Huynh et al. Oct 2007 A1
20070270685 Kang Nov 2007 A1
20080004523 Jensen Jan 2008 A1
20080013809 Zhu et al. Jan 2008 A1
20080033283 Dellaca et al. Feb 2008 A1
20080046122 Manzo et al. Feb 2008 A1
20080082109 Moll et al. Apr 2008 A1
20080108912 Node-Langlois May 2008 A1
20080108991 von Jako May 2008 A1
20080109012 Falco et al. May 2008 A1
20080144906 Allred et al. Jun 2008 A1
20080161680 von Jako et al. Jul 2008 A1
20080161682 Kendrick et al. Jul 2008 A1
20080177203 von Jako Jul 2008 A1
20080214922 Hartmann et al. Sep 2008 A1
20080228068 Viswanathan et al. Sep 2008 A1
20080228196 Wang et al. Sep 2008 A1
20080235052 Node-Langlois et al. Sep 2008 A1
20080269596 Revie et al. Oct 2008 A1
20080287771 Anderson Nov 2008 A1
20080287781 Revie et al. Nov 2008 A1
20080300477 Lloyd et al. Dec 2008 A1
20080300478 Zuhars et al. Dec 2008 A1
20080302950 Park et al. Dec 2008 A1
20080306490 Lakin et al. Dec 2008 A1
20080319311 Hamadeh Dec 2008 A1
20090012509 Csavoy et al. Jan 2009 A1
20090030428 Omori et al. Jan 2009 A1
20090080737 Battle et al. Mar 2009 A1
20090185655 Koken et al. Jul 2009 A1
20090198121 Hoheisel Aug 2009 A1
20090216113 Meier et al. Aug 2009 A1
20090228019 Gross et al. Sep 2009 A1
20090259123 Navab et al. Oct 2009 A1
20090259230 Khadem et al. Oct 2009 A1
20090264899 Appenrodt et al. Oct 2009 A1
20090281417 Hartmann et al. Nov 2009 A1
20090326318 Tognaccini Dec 2009 A1
20100022874 Wang et al. Jan 2010 A1
20100039506 Sarvestani et al. Feb 2010 A1
20100125286 Wang et al. May 2010 A1
20100130986 Mailloux et al. May 2010 A1
20100192720 Helmer Aug 2010 A1
20100228117 Hartmann Sep 2010 A1
20100228265 Prisco Sep 2010 A1
20100249571 Jensen et al. Sep 2010 A1
20100274120 Heuscher Oct 2010 A1
20100280363 Skarda et al. Nov 2010 A1
20100331858 Simaan et al. Dec 2010 A1
20110022229 Jang et al. Jan 2011 A1
20110077504 Fischer et al. Mar 2011 A1
20110098553 Robbins et al. Apr 2011 A1
20110137152 Li Jun 2011 A1
20110190789 Thiran Aug 2011 A1
20110213384 Jeong Sep 2011 A1
20110224684 Larkin et al. Sep 2011 A1
20110224685 Larkin et al. Sep 2011 A1
20110224686 Larkin et al. Sep 2011 A1
20110224687 Larkin et al. Sep 2011 A1
20110224688 Larkin et al. Sep 2011 A1
20110224689 Larkin et al. Sep 2011 A1
20110224825 Larkin et al. Sep 2011 A1
20110230967 O'Halloran et al. Sep 2011 A1
20110238080 Ranjit et al. Sep 2011 A1
20110276058 Choi et al. Nov 2011 A1
20110282189 Graumann Nov 2011 A1
20110286573 Schretter et al. Nov 2011 A1
20110295062 Gratacos Solsona et al. Dec 2011 A1
20110295370 Suh et al. Dec 2011 A1
20110306986 Lee et al. Dec 2011 A1
20120035507 George et al. Feb 2012 A1
20120046668 Gantes Feb 2012 A1
20120051498 Koishi Mar 2012 A1
20120053597 Anvari et al. Mar 2012 A1
20120059248 Holsing et al. Mar 2012 A1
20120059378 Farrell Mar 2012 A1
20120071753 Hunter et al. Mar 2012 A1
20120108954 Schulhauser et al. May 2012 A1
20120136372 Amat Girbau et al. May 2012 A1
20120143084 Shoham Jun 2012 A1
20120184839 Woerlein Jul 2012 A1
20120197182 Millman et al. Aug 2012 A1
20120226145 Chang et al. Sep 2012 A1
20120235909 Birkenbach et al. Sep 2012 A1
20120245596 Meenink Sep 2012 A1
20120253332 Moll Oct 2012 A1
20120253360 White et al. Oct 2012 A1
20120256092 Zingerman Oct 2012 A1
20120294498 Popovic Nov 2012 A1
20120296203 Hartmann et al. Nov 2012 A1
20130006267 Odermatt et al. Jan 2013 A1
20130016889 Myronenko et al. Jan 2013 A1
20130030571 Ruiz Morales et al. Jan 2013 A1
20130035583 Park et al. Feb 2013 A1
20130060146 Yang et al. Mar 2013 A1
20130060337 Petersheim et al. Mar 2013 A1
20130094742 Feilkas Apr 2013 A1
20130096574 Kang et al. Apr 2013 A1
20130113791 Isaacs et al. May 2013 A1
20130116706 Lee et al. May 2013 A1
20130131695 Scarfogliero et al. May 2013 A1
20130144307 Jeong et al. Jun 2013 A1
20130158542 Manzo et al. Jun 2013 A1
20130165937 Patwardhan Jun 2013 A1
20130178867 Farritor et al. Jul 2013 A1
20130178868 Roh Jul 2013 A1
20130178870 Schena Jul 2013 A1
20130204271 Brisson et al. Aug 2013 A1
20130211419 Jensen Aug 2013 A1
20130211420 Jensen Aug 2013 A1
20130218142 Tuma et al. Aug 2013 A1
20130223702 Holsing et al. Aug 2013 A1
20130225942 Holsing et al. Aug 2013 A1
20130225943 Holsing et al. Aug 2013 A1
20130231556 Holsing et al. Sep 2013 A1
20130237995 Lee et al. Sep 2013 A1
20130245375 DiMaio et al. Sep 2013 A1
20130261640 Kim et al. Oct 2013 A1
20130272488 Bailey et al. Oct 2013 A1
20130272489 Dickman et al. Oct 2013 A1
20130274761 Devengenzo et al. Oct 2013 A1
20130281821 Liu et al. Oct 2013 A1
20130296884 Taylor et al. Nov 2013 A1
20130303887 Holsing et al. Nov 2013 A1
20130307955 Deitz et al. Nov 2013 A1
20130317521 Choi et al. Nov 2013 A1
20130325033 Schena et al. Dec 2013 A1
20130325035 Hauck et al. Dec 2013 A1
20130331686 Freysinger et al. Dec 2013 A1
20130331858 Devengenzo et al. Dec 2013 A1
20130331861 Yoon Dec 2013 A1
20130342578 Isaacs Dec 2013 A1
20130345717 Markvicka et al. Dec 2013 A1
20130345757 Stad Dec 2013 A1
20140001235 Shelton, IV Jan 2014 A1
20140012131 Heruth et al. Jan 2014 A1
20140031664 Kang et al. Jan 2014 A1
20140046128 Lee et al. Feb 2014 A1
20140046132 Hoeg et al. Feb 2014 A1
20140046340 Wilson et al. Feb 2014 A1
20140049629 Siewerdsen et al. Feb 2014 A1
20140058406 Tsekos Feb 2014 A1
20140073914 Lavallee et al. Mar 2014 A1
20140080086 Chen Mar 2014 A1
20140081128 Verard et al. Mar 2014 A1
20140088612 Bartol et al. Mar 2014 A1
20140094694 Moctezuma de la Barrera Apr 2014 A1
20140094851 Gordon Apr 2014 A1
20140096369 Matsumoto et al. Apr 2014 A1
20140100587 Farritor et al. Apr 2014 A1
20140121676 Kostrzewski et al. May 2014 A1
20140128882 Kwak et al. May 2014 A1
20140135796 Simon et al. May 2014 A1
20140142591 Alvarez et al. May 2014 A1
20140142592 Moon et al. May 2014 A1
20140148692 Hartmann et al. May 2014 A1
20140163581 Devengenzo et al. Jun 2014 A1
20140171781 Stiles Jun 2014 A1
20140171900 Stiles Jun 2014 A1
20140171965 Loh et al. Jun 2014 A1
20140180308 Grunberg Jun 2014 A1
20140180309 Seeber et al. Jun 2014 A1
20140187915 Yaroshenko et al. Jul 2014 A1
20140188132 Kang Jul 2014 A1
20140194699 Roh et al. Jul 2014 A1
20140130810 Azizian et al. Aug 2014 A1
20140221819 Sarment Aug 2014 A1
20140222023 Kim et al. Aug 2014 A1
20140228631 Kwak et al. Aug 2014 A1
20140234804 Huang et al. Aug 2014 A1
20140257328 Kim et al. Sep 2014 A1
20140257329 Jang et al. Sep 2014 A1
20140257330 Choi et al. Sep 2014 A1
20140275760 Lee et al. Sep 2014 A1
20140275985 Walker et al. Sep 2014 A1
20140276931 Parihar et al. Sep 2014 A1
20140276940 Seo Sep 2014 A1
20140276944 Farritor et al. Sep 2014 A1
20140288413 Hwang et al. Sep 2014 A1
20140299648 Shelton, IV et al. Oct 2014 A1
20140303434 Farritor et al. Oct 2014 A1
20140303643 Ha et al. Oct 2014 A1
20140305995 Shelton, IV et al. Oct 2014 A1
20140309659 Roh et al. Oct 2014 A1
20140316436 Bar et al. Oct 2014 A1
20140323803 Hoffman et al. Oct 2014 A1
20140324070 Min et al. Oct 2014 A1
20140330288 Date et al. Nov 2014 A1
20140364720 Darrow et al. Dec 2014 A1
20140371577 Maillet et al. Dec 2014 A1
20150039034 Frankel et al. Feb 2015 A1
20150085970 Bouhnik et al. Mar 2015 A1
20150146847 Liu May 2015 A1
20150150524 Yorkston et al. Jun 2015 A1
20150196261 Funk Jul 2015 A1
20150213633 Chang et al. Jul 2015 A1
20150335480 Alvarez et al. Nov 2015 A1
20150342647 Frankel et al. Dec 2015 A1
20160005194 Schretter et al. Jan 2016 A1
20160166329 Langan et al. Jun 2016 A1
20160235480 Scholl et al. Aug 2016 A1
20160249990 Glozman et al. Sep 2016 A1
20160302871 Gregerson et al. Oct 2016 A1
20160320322 Suzuki Nov 2016 A1
20160331335 Gregerson et al. Nov 2016 A1
20170135770 Scholl et al. May 2017 A1
20170143284 Sehnert et al. May 2017 A1
20170143426 Isaacs et al. May 2017 A1
20170156816 Ibrahim Jun 2017 A1
20170202629 Maillet et al. Jul 2017 A1
20170212723 Atarot et al. Jul 2017 A1
20170215825 Johnson et al. Aug 2017 A1
20170215826 Johnson et al. Aug 2017 A1
20170215827 Johnson et al. Aug 2017 A1
20170231710 Scholl et al. Aug 2017 A1
20170258426 Risher-Kelly et al. Sep 2017 A1
20170273748 Hourtash et al. Sep 2017 A1
20170296277 Hourtash et al. Oct 2017 A1
20170360493 Zucher et al. Dec 2017 A1
Foreign Referenced Citations (3)
Number Date Country
WO-2005122916 Dec 2005 WO
WO-2007136768 Nov 2007 WO
WO-2008097540 Aug 2008 WO
Non-Patent Literature Citations (1)
Entry
US 8,231,638 B2, 07/2012, Swarup et al. (withdrawn)
Related Publications (1)
Number Date Country
20200237448 A1 Jul 2020 US
Provisional Applications (1)
Number Date Country
61470545 Apr 2011 US
Continuations (4)
Number Date Country
Parent 15705578 Sep 2017 US
Child 16846360 US
Parent 14824602 Aug 2015 US
Child 15705578 US
Parent 14522509 Oct 2014 US
Child 14824602 US
Parent 14009050 US
Child 14522509 US