Optical and non-optical sensor tracking of a robotically controlled instrument

Information

  • Patent Grant
  • Patent Number
    12,144,565
  • Date Filed
    Tuesday, November 8, 2022
  • Date Issued
    Tuesday, November 19, 2024
Abstract
Surgical systems, navigation systems, and methods involving a robotic manipulator configured to control movement of an instrument to facilitate a surgical procedure. The navigation system includes a camera unit configured to optically track a pose of the instrument and a non-optical sensor coupled to the instrument. The navigation system includes a computing system coupled to the camera unit and being configured to obtain readings from the non-optical sensor. The computing system detects a condition whereby the camera unit is blocked from optically tracking the pose of the instrument. In response to detection of the condition, the computing system tracks the pose of the instrument with the readings from the non-optical sensor.
Description
TECHNICAL FIELD

The present disclosure relates generally to navigation systems and methods that track objects in space by determining changes in the position and/or orientation of such objects over time. More specifically, the present disclosure relates to navigation systems and methods that utilize optical sensors and non-optical sensors to determine the position and/or orientation of objects.


BACKGROUND OF THE INVENTION

Navigation systems assist users in precisely locating objects. For instance, navigation systems are used in industrial, aerospace, defense, and medical applications. In the medical field, navigation systems assist surgeons in precisely placing surgical instruments relative to a patient's anatomy.


Surgeries in which navigation systems are used include neurosurgery and orthopedic surgery. Often the instrument and the anatomy are tracked together with their relative movement shown on a display. The navigation system may display the instrument moving in conjunction with a preoperative image or an intraoperative image of the anatomy. Preoperative images are typically prepared by MRI or CT scans, while intraoperative images may be prepared using a fluoroscope, low-level x-ray or any similar device. Alternatively, some systems are image-less; in these systems the patient's anatomy is "painted" by a navigation probe and mathematically fitted to an anatomical model for display.


Navigation systems may employ light signals, sound waves, magnetic fields, RF signals, etc. in order to track the position and/or orientation of the instrument and anatomy. Optical navigation systems are widely used due to the accuracy of such systems.


Prior art optical navigation systems typically include one or more camera units that house one or more optical sensors (such as charge coupled devices or CCDs). The optical sensors detect light emitted from trackers attached to the instrument and the anatomy. Each tracker has a plurality of optical emitters such as light emitting diodes (LEDs) that periodically transmit light to the sensors to determine the position of the LEDs.


The positions of the LEDs on the instrument tracker correlate to the coordinates of a working end of the instrument relative to a camera coordinate system. The positions of the LEDs on the anatomy tracker(s) correlate to the coordinates of a target area of the anatomy in three-dimensional space relative to the camera coordinate system. Thus, the position and/or orientation of the working end of the instrument relative to the target area of the anatomy can be tracked and displayed.


Navigation systems can be used in a closed loop manner to control movement of surgical instruments. In these navigation systems both the instrument and the anatomy being treated are outfitted with trackers such that the navigation system can track their position and orientation. Information from the navigation system is then fed to a control system to control or guide movement of the instrument. In some cases, the instrument is held by a robot and the information is sent from the navigation system to a control system of the robot.


In order for the control system to quickly account for relative motion between the instrument and the anatomy being treated, the accuracy and speed of the navigation system must meet the desired tolerances of the procedure. For instance, tolerances associated with cementless knee implants may be very small to ensure adequate fit and function of the implant. Accordingly, the accuracy and speed of the navigation system may need to be greater than in rougher cutting procedures.


One of the limitations on accuracy and speed of optical navigation systems is that the system relies on the line-of-sight between the LEDs and the optical sensors of the camera unit. When the line-of-sight is broken, the system may not accurately determine the position and/or orientation of the instrument and anatomy being tracked. As a result, surgeries can encounter many starts and stops. For instance, during control of robotically assisted cutting, when the line-of-sight is broken, the cutting tool must be disabled until the line-of-sight is regained. This can cause significant delays and added cost to the procedure.


Another limitation on accuracy occurs when using active LEDs on the trackers. In such systems, the LEDs are often fired in sequence. In this case only the position of the actively fired LED is measured and known by the system, while the positions of the remaining, unmeasured LEDs are unknown. In these systems, the positions of the remaining, unmeasured LEDs are approximated. Approximations are usually based on linear velocity data extrapolated from the last known measured positions of the currently unmeasured LEDs. However, because the LEDs are fired in sequence, there can be a considerable lag between measurements of any one LED. This lag is increased with each additional tracker used in the system. Furthermore, this approximation does not take into account rotations of the trackers, resulting in further possible errors in position data for the trackers.


As a result, there is a need in the art for an optical navigation system that utilizes additional non-optically based data to improve tracking and provide a level of accuracy and speed with which to determine position and/or orientations of objects for precise surgical procedures such as robotically assisted surgical cutting.


SUMMARY

In one example, a surgical system is provided comprising: a robotic manipulator; an instrument coupled to the robotic manipulator, wherein the robotic manipulator is configured to control movement of the instrument to facilitate a surgical procedure; a non-optical sensor coupled to the instrument; and a navigation system configured to obtain readings from the non-optical sensor, and the navigation system comprising a camera unit configured to optically track a pose of the instrument, wherein the navigation system is configured to: detect a condition whereby the camera unit is blocked from optically tracking the pose of the instrument; and in response to detection of the condition, track the pose of the instrument with the readings from the non-optical sensor.


In one example, a navigation system is provided that is configured to be utilized with a robotic manipulator configured to control movement of an instrument to facilitate a surgical procedure, the navigation system comprising: a camera unit configured to optically track a pose of the instrument; a non-optical sensor coupled to the instrument; and a computing system coupled to the camera unit and being configured to obtain readings from the non-optical sensor, and the computing system being configured to: detect a condition whereby the camera unit is blocked from optically tracking the pose of the instrument; and in response to detection of the condition, track the pose of the instrument with the readings from the non-optical sensor.


In one example, a method is provided of operating a navigation system including a camera unit, a non-optical sensor, and a computing system, the non-optical sensor being coupled to an instrument that is controlled by a robotic manipulator, the method comprising: optically tracking a pose of the instrument with the camera unit; detecting, with the computing system, a condition whereby the camera unit is blocked from optically tracking the pose of the instrument; obtaining, with the computing system, readings from the non-optical sensor; and in response to detecting the condition, the computing system tracking the pose of the instrument with the readings from the non-optical sensor.





BRIEF DESCRIPTION OF THE DRAWINGS

Advantages of the present invention will be readily appreciated as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings wherein:



FIG. 1 is a perspective view of a navigation system of the present invention being used in conjunction with a robotic manipulator;



FIG. 2 is a schematic view of the navigation system;



FIG. 3 is a schematic view of coordinate systems used with the navigation system;



FIG. 4 is a flow diagram of steps carried out by a localization engine of the navigation system;



FIG. 4A is a schematic illustration of matching measured LEDs with a tracker model to obtain a transformation matrix;



FIG. 5 is a flow diagram of steps carried out by the localization engine in a first alternative embodiment;



FIG. 5A is an illustration of a tracker model including real and virtual LEDs;



FIG. 6 is a flow diagram of steps carried out by the localization engine in a second alternative embodiment; and



FIG. 7 is a flow diagram of steps carried out by the localization engine when one or more LEDs are blocked from measurement.





DETAILED DESCRIPTION
I. Overview

Referring to FIG. 1, a surgical navigation system 20 is illustrated. System 20 is shown in a surgical setting such as an operating room of a medical facility. The navigation system 20 is set up to track movement of various objects in the operating room. Such objects include, for example, a surgical instrument 22, a femur F of a patient, and a tibia T of the patient. The navigation system 20 tracks these objects for purposes of displaying their relative positions and orientations to the surgeon and, in some cases, for purposes of controlling or constraining movement of the surgical instrument 22 relative to a predefined path or anatomical boundary.


The surgical navigation system 20 includes a computer cart assembly 24 that houses a navigation computer 26. A navigation interface is in operative communication with the navigation computer 26. The navigation interface includes a display 28 adapted to be situated outside of the sterile field and a display 29 adapted to be situated inside the sterile field. The displays 28, 29 are adjustably mounted to the computer cart assembly 24. Input devices 30, 32 such as a mouse and keyboard can be used to input information into the navigation computer 26 or otherwise select/control certain aspects of the navigation computer 26. Other input devices are contemplated including a touch screen (not shown) on displays 28, 29 or voice-activation.


A localizer 34 communicates with the navigation computer 26. In the embodiment shown, the localizer 34 is an optical localizer and includes a camera unit 36. The camera unit 36 has an outer casing 38 that houses one or more optical sensors 40. In some embodiments, at least two optical sensors 40 are employed, preferably three. The optical sensors 40 may be three separate high-resolution charge-coupled devices (CCDs). In one embodiment, three one-dimensional CCDs are employed. It should be appreciated that in other embodiments, separate camera units, each with a separate CCD, or two or more CCDs, could also be arranged around the operating room. The CCDs detect infrared (IR) signals.


Camera unit 36 is mounted on an adjustable arm to position the optical sensors 40 above the zone in which the procedure is to take place to provide the camera unit 36 with a field of view of the below discussed trackers that, ideally, is free from obstructions.


The camera unit 36 includes a camera controller 42 in communication with the optical sensors 40 to receive signals from the optical sensors 40. The camera controller 42 communicates with the navigation computer 26 through either a wired or wireless connection (not shown). One such connection may be an IEEE 1394 interface, which is a serial bus interface standard for high-speed communications and isochronous real-time data transfer. Connection could also use a company specific protocol. In other embodiments, the optical sensors 40 communicate directly with the navigation computer 26.


Position and orientation signals and/or data are transmitted to the navigation computer 26 for purposes of tracking the objects. The computer cart assembly 24, display 28, and camera unit 36 may be like those described in U.S. Pat. No. 7,725,162 to Malackowski, et al. issued on May 25, 2010, entitled “Surgery System”, hereby incorporated by reference.


The navigation computer 26 can be a personal computer or laptop computer. Navigation computer 26 has the display 28, central processing unit (CPU) and/or other processors, memory (not shown), and storage (not shown). The navigation computer 26 is loaded with software as described below. The software converts the signals received from the camera unit 36 into data representative of the position and orientation of the objects being tracked.


Navigation system 20 includes a plurality of tracking devices 44, 46, 48, also referred to herein as trackers. In the illustrated embodiment, one tracker 44 is firmly affixed to the femur F of the patient and another tracker 46 is firmly affixed to the tibia T of the patient. Trackers 44, 46 are firmly affixed to sections of bone. Trackers 44, 46 may be attached to the femur F in the manner shown in U.S. Pat. No. 7,725,162, hereby incorporated by reference. In further embodiments, an additional tracker (not shown) is attached to the patella to track a position and orientation of the patella. In further embodiments, the trackers 44, 46 could be mounted to other tissue types or parts of the anatomy.


An instrument tracker 48 is firmly attached to the surgical instrument 22. The instrument tracker 48 may be integrated into the surgical instrument 22 during manufacture or may be separately mounted to the surgical instrument 22 in preparation for the surgical procedures. The working end of the surgical instrument 22, which is being tracked, may be a rotating bur, electrical ablation device, or the like. In the embodiment shown, the surgical instrument 22 is an end effector of a surgical manipulator. Such an arrangement is shown in U.S. Provisional Patent Application No. 61/679,258, entitled, “Surgical Manipulator Capable of Controlling a Surgical Instrument in either a Semi-Autonomous Mode or a Manual, Boundary Constrained Mode”, the disclosure of which is hereby incorporated by reference, and also in U.S. patent application Ser. No. 13/958,834, entitled, “Navigation System for use with a Surgical Manipulator Operable in Manual or Semi-Autonomous Mode”, the disclosure of which is hereby incorporated by reference.


The trackers 44, 46, 48 can be battery powered with an internal battery or may have leads to receive power through the navigation computer 26, which, like the camera unit 36, preferably receives external power.


In other embodiments, the surgical instrument 22 may be manually positioned by only the hand of the user, without the aid of any cutting guide, jib, or other constraining mechanism such as a manipulator or robot. Such a surgical instrument is described in U.S. Provisional Patent Application No. 61/662,070, entitled, “Surgical Instrument Including Housing, a Cutting Accessory that Extends from the Housing and Actuators that Establish the Position of the Cutting Accessory Relative to the Housing”, hereby incorporated by reference, and also in U.S. patent application Ser. No. 13/600,888, entitled “Surgical Instrument Including Housing, a Cutting Accessory that Extends from the Housing and Actuators that Establish the Position of the Cutting Accessory Relative to the Housing”, hereby incorporated by reference.


The optical sensors 40 of the localizer 34 receive light signals from the trackers 44, 46, 48. In the illustrated embodiment, the trackers 44, 46, 48 are active trackers. In this embodiment, each tracker 44, 46, 48 has at least three active markers 50 for transmitting light signals to the optical sensors 40. The active markers 50 can be light emitting diodes or LEDs 50. The optical sensors 40 preferably have sampling rates of 100 Hz or more, more preferably 300 Hz or more, and most preferably 500 Hz or more. In some embodiments, the optical sensors 40 have sampling rates of 1000 Hz. The sampling rate is the rate at which the optical sensors 40 receive light signals from sequentially fired LEDs 50. In some embodiments, the light signals from the LEDs 50 are fired at different rates for each tracker 44, 46, 48.


Referring to FIG. 2, each of the LEDs 50 is connected to a tracker controller 62 located in a housing (not shown) of the associated tracker 44, 46, 48 that transmits/receives data to/from the navigation computer 26. In one embodiment, the tracker controllers 62 transmit data on the order of several Megabytes/second through wired connections with the navigation computer 26. In other embodiments, a wireless connection may be used. In these embodiments, the navigation computer 26 has a transceiver (not shown) to receive the data from the tracker controller 62.


In other embodiments, the trackers 44, 46, 48 may have passive markers (not shown), such as reflectors that reflect light emitted from the camera unit 36. The reflected light is then received by the optical sensors 40. Active and passive arrangements are well known in the art.


Each of the trackers 44, 46, 48 also includes a 3-dimensional gyroscope sensor 60 that measures angular velocities of the trackers 44, 46, 48. As is well known to those skilled in the art, the gyroscope sensors 60 output readings indicative of the angular velocities relative to x-, y-, and z-axes of a gyroscope coordinate system. These readings are multiplied by a conversion constant defined by the manufacturer to obtain measurements in degrees/second with respect to each of the x-, y-, and z-axes of the gyroscope coordinate system. These measurements can then be converted to an angular velocity vector {right arrow over (ω)} defined in radians/second.
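The conversion just described can be sketched as follows; the scale factor and function name are illustrative assumptions, not values from the patent.

    import math

    # Hypothetical manufacturer scale factor: raw gyroscope counts -> degrees/second.
    GYRO_SCALE_DEG_PER_S = 0.0175  # illustrative value only

    def angular_velocity_rad_per_s(raw_xyz):
        """Convert raw 3-axis gyroscope readings to an angular velocity vector
        (rad/s) expressed in the gyroscope coordinate system."""
        return [r * GYRO_SCALE_DEG_PER_S * math.pi / 180.0 for r in raw_xyz]

    # Example: omega = angular_velocity_rad_per_s([120, -40, 5])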


The angular velocities measured by the gyroscope sensors 60 provide additional non-optically based kinematic data for the navigation system 20 with which to track the trackers 44, 46, 48. The gyroscope sensors 60 may be oriented along the axes of each coordinate system of the trackers 44, 46, 48. In other embodiments, each gyroscope coordinate system is transformed to its tracker coordinate system such that the gyroscope data reflects the angular velocities with respect to the x-, y-, and z-axes of the coordinate systems of the trackers 44, 46, 48.


Each of the gyroscope sensors 60 communicates with the tracker controller 62 located within the housing of the associated tracker that transmits/receives data to/from the navigation computer 26. The navigation computer 26 has one or more transceivers (not shown) to receive the data from the gyroscope sensors 60. The data can be received either through a wired or wireless connection.


The gyroscope sensors 60 preferably have sampling rates of 100 Hz or more, more preferably 300 Hz or more, and most preferably 500 Hz or more. In some embodiments, the gyroscope sensors 60 have sampling rates of 1000 Hz. The sampling rate of the gyroscope sensors 60 is the rate at which signals are sent out from the gyroscope sensors 60 to be converted into angular velocity data.


The sampling rates of the gyroscope sensors 60 and the optical sensors 40 are established or timed so that for each optical measurement of position there is a corresponding non-optical measurement of angular velocity.


Each of the trackers 44, 46, 48 also includes a 3-axis accelerometer 70 that measures acceleration along each of the x-, y-, and z-axes of an accelerometer coordinate system. The accelerometers 70 provide additional non-optically based data for the navigation system 20 with which to track the trackers 44, 46, 48.


Each of the accelerometers 70 communicates with the tracker controller 62 located in the housing of the associated tracker that transmits/receives data to/from the navigation computer 26. One or more of the transceivers (not shown) of the navigation computer 26 receives the data from the accelerometers 70.


The accelerometers 70 may be oriented along the axes of each coordinate system of the trackers 44, 46, 48. In other embodiments, each accelerometer coordinate system is transformed to its tracker coordinate system such that the accelerometer data reflects the accelerations with respect to the x-, y-, and z-axes of the coordinate systems of the trackers 44, 46, 48.


The navigation computer 26 includes a navigation processor 52. The camera unit 36 receives optical signals from the LEDs 50 of the trackers 44, 46, 48 and outputs to the processor 52 signals relating to the position of the LEDs 50 of the trackers 44, 46, 48 relative to the localizer 34. The gyroscope sensors 60 transmit non-optical signals to the processor 52 relating to the 3-dimensional angular velocities measured by the gyroscope sensors 60. Based on the received optical and non-optical signals, navigation processor 52 generates data indicating the relative positions and orientations of the trackers 44, 46, 48 relative to the localizer 34.


It should be understood that the navigation processor 52 could include one or more processors to control operation of the navigation computer 26. The processors can be any type of microprocessor or multi-processor system. The term processor is not intended to limit the scope of the invention to a single processor.


Prior to the start of the surgical procedure, additional data are loaded into the navigation processor 52. Based on the position and orientation of the trackers 44, 46, 48 and the previously loaded data, navigation processor 52 determines the position of the working end of the surgical instrument 22 and the orientation of the surgical instrument 22 relative to the tissue against which the working end is to be applied. In some embodiments, navigation processor 52 forwards these data to a manipulator controller 54. The manipulator controller 54 can then use the data to control a robotic manipulator 56 as described in U.S. Provisional Patent Application No. 61/679,258, entitled, "Surgical Manipulator Capable of Controlling a Surgical Instrument in either a Semi-Autonomous Mode or a Manual, Boundary Constrained Mode", the disclosure of which is hereby incorporated by reference, and also in U.S. patent application Ser. No. 13/958,834, entitled, "Navigation System for use with a Surgical Manipulator Operable in Manual or Semi-Autonomous Mode", the disclosure of which is hereby incorporated by reference.


The navigation processor 52 also generates image signals that indicate the relative position of the surgical instrument working end to the surgical site. These image signals are applied to the displays 28, 29. Displays 28, 29, based on these signals, generate images that allow the surgeon and staff to view the relative position of the surgical instrument working end to the surgical site. The displays 28, 29, as discussed above, may include a touch screen or other input/output device that allows entry of commands.


II. Coordinate Systems and Transformation

Referring to FIG. 3, tracking of objects is generally conducted with reference to a localizer coordinate system LCLZ. The localizer coordinate system has an origin and an orientation (a set of x-, y-, and z-axes). During the procedure one goal is to keep the localizer coordinate system LCLZ stationary. As will be described further below, an accelerometer mounted to the camera unit 36 may be used to track sudden or unexpected movement of the localizer coordinate system LCLZ, as may occur when the camera unit 36 is inadvertently bumped by surgical personnel.


Each tracker 44, 46, 48 and object being tracked also has its own coordinate system separate from localizer coordinate system LCLZ. Components of the navigation system 20 that have their own coordinate systems are the bone trackers 44, 46 and the instrument tracker 48. These coordinate systems are represented as, respectively, bone tracker coordinate systems BTRK1, BTRK2, and instrument tracker coordinate system TLTR.


Navigation system 20 monitors the positions of the femur F and tibia T of the patient by monitoring the position of bone trackers 44, 46 firmly attached to bone. The femur coordinate system is FBONE and the tibia coordinate system is TBONE; these are the coordinate systems of the bones to which the bone trackers 44, 46 are firmly attached.


Prior to the start of the procedure, pre-operative images of the femur F and tibia T are generated (or of other tissues in other embodiments). These images may be based on MRI scans, radiological scans or computed tomography (CT) scans of the patient's anatomy. These images are mapped to the femur coordinate system FBONE and tibia coordinate system TBONE using well known methods in the art. In one embodiment, a pointer instrument P, such as disclosed in U.S. Pat. No. 7,725,162 to Malackowski, et al., hereby incorporated by reference, having its own tracker PT (see FIG. 2), may be used to map the femur coordinate system FBONE and tibia coordinate system TBONE to the pre-operative images. These images are fixed in the femur coordinate system FBONE and tibia coordinate system TBONE.


During the initial phase of the procedure, the bone trackers 44, 46 are firmly affixed to the bones of the patient. The pose (position and orientation) of coordinate systems FBONE and TBONE are mapped to coordinate systems BTRK1 and BTRK2, respectively. Given the fixed relationship between the bones and their bone trackers 44, 46, the pose of coordinate systems FBONE and TBONE remain fixed relative to coordinate systems BTRK1 and BTRK2, respectively, throughout the procedure. The pose-describing data are stored in memory integral with both manipulator controller 54 and navigation processor 52.


The working end of the surgical instrument 22 (also referred to as energy applicator distal end) has its own coordinate system EAPP. The origin of the coordinate system EAPP may represent a centroid of a surgical cutting bur, for example. The pose of coordinate system EAPP is fixed to the pose of instrument tracker coordinate system TLTR before the procedure begins. Accordingly, the poses of these coordinate systems EAPP, TLTR relative to each other are determined. The pose-describing data are stored in memory integral with both manipulator controller 54 and navigation processor 52.


III. Software

Referring to FIG. 2, a localization engine 100 is a software module that can be considered part of the navigation system 20. Components of the localization engine 100 run on navigation processor 52. In some versions of the invention, the localization engine 100 may run on the manipulator controller 54.


Localization engine 100 receives as inputs the optically-based signals from the camera controller 42 and the non-optically based signals from the tracker controller 62. Based on these signals, localization engine 100 determines the pose (position and orientation) of the bone tracker coordinate systems BTRK1 and BTRK2 in the localizer coordinate system LCLZ. Based on the same signals received for the instrument tracker 48, the localization engine 100 determines the pose of the instrument tracker coordinate system TLTR in the localizer coordinate system LCLZ.


The localization engine 100 forwards the signals representative of the poses of trackers 44, 46, 48 to a coordinate transformer 102. Coordinate transformer 102 is a navigation system software module that runs on navigation processor 52. Coordinate transformer 102 references the data that defines the relationship between the pre-operative images of the patient and the patient trackers 44, 46. Coordinate transformer 102 also stores the data indicating the pose of the working end of the surgical instrument relative to the instrument tracker 48.


During the procedure, the coordinate transformer 102 receives the data indicating the relative poses of the trackers 44, 46, 48 to the localizer 34. Based on these data and the previously loaded data, the coordinate transformer 102 generates data indicating the relative position and orientation of both the coordinate system EAPP, and the bone coordinate systems, FBONE and TBONE to the localizer coordinate system LCLZ.


As a result, coordinate transformer 102 generates data indicating the position and orientation of the working end of the surgical instrument 22 relative to the tissue (e.g., bone) against which the instrument working end is applied. Image signals representative of these data are forwarded to displays 28, 29 enabling the surgeon and staff to view this information. In certain embodiments, other signals representative of these data can be forwarded to the manipulator controller 54 to control the manipulator 56 and corresponding movement of the surgical instrument 22.


Steps for determining the pose of each of the tracker coordinate systems BTRK1, BTRK2, TLTR in the localizer coordinate system LCLZ are the same, so only one will be described in detail. The steps shown in FIG. 4 are based on only one tracker being active, tracker 44. In the following description, the LEDs of tracker 44 shall be represented by numerals 50a, 50b, 50c which identify first 50a, second 50b, and third 50c LEDs.


The steps set forth in FIG. 4 illustrate the use of optically-based sensor data and non-optically based sensor data to determine the positions of the LEDs 50a, 50b, 50c of tracker 44. From these positions, the navigation processor 52 can determine the position and orientation of the tracker 44, and thus, the position and orientation of the femur F to which it is attached. Optically-based sensor data derived from the signals received by the optical sensors 40 provide line-of-sight based data that relies on the line-of-sight between the LEDs 50a, 50b, 50c and the optical sensors 40. However, the gyroscope sensor 60, which provides non-optically based signals for generating non-optically based sensor data, does not rely on line-of-sight and thus can be integrated into the navigation system 20 to better approximate positions of the LEDs 50a, 50b, 50c when two of the LEDs 50a, 50b, 50c are not being measured (since only one LED is measured at a time), or when one or more of the LEDs 50a, 50b, 50c are not visible to the optical sensors 40 during a procedure.


In a first initialization step 200, the system 20 measures the position of the LEDs 50a, 50b, 50c for the tracker 44 in the localizer coordinate system LCLZ to establish initial position data. These measurements are taken by sequentially firing the LEDs 50a, 50b, 50c, which transmit light signals to the optical sensors 40. Once the light signals are received by the optical sensors 40, corresponding signals are generated by the optical sensors 40 and transmitted to the camera controller 42. The firing frequency of the LEDs 50a, 50b, 50c is 100 Hz or greater, preferably 300 Hz or greater, and more preferably 500 Hz or greater. In some cases, the firing frequency is 1000 Hz, i.e., 1 millisecond between firings.


In some embodiments, only one LED can be read by the optical sensors 40 at a time. The camera controller 42, through one or more infrared or RF transceivers (on camera unit 36 and tracker 44) may control the firing of the LEDs 50a, 50b, 50c, as described in U.S. Pat. No. 7,725,162 to Malackowski, et al., hereby incorporated by reference. Alternatively, the tracker 44 may be activated locally (such as by a switch on tracker 44) which then fires its LEDs 50a, 50b, 50c sequentially once activated, without instruction from the camera controller 42.


Based on the inputs from the optical sensors 40, the camera controller 42 generates raw position signals that are then sent to the localization engine 100 to determine the position of each of the corresponding three LEDs 50a, 50b, 50c in the localizer coordinate system LCLZ.


During the initialization step 200, in order to establish the initial position data, movement of the tracker 44 must be less than a predetermined threshold. A value of the predetermined threshold is stored in the navigation computer 26. The initial position data established in step 200 essentially provides a static snapshot of the positions of the three LEDs 50a, 50b, 50c at an initial time t0, on which the remaining steps of the process are based. During initialization, velocities of the LEDs 50a, 50b, 50c are calculated by the localization engine 100 between cycles (i.e., each set of three LED measurements); once the velocities are low enough, i.e., less than the predetermined threshold, indicating that little movement occurred, the initial position data or static snapshot is established. In some embodiments, the predetermined threshold (also referred to as the static velocity limit) is 200 mm/s or less, preferably 100 mm/s or less, and more preferably 10 mm/s or less along any axis. When the predetermined threshold is 100 mm/s, then the calculated velocities must be less than 100 mm/s to establish the static snapshot.
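As a minimal sketch of this initialization gate, assuming the three LED positions from two successive cycles are available, the check below applies the static velocity limit along each axis; the helper name is hypothetical and 100 mm/s is one of the example thresholds above.

    STATIC_VELOCITY_LIMIT_MM_S = 100.0  # one of the example thresholds above

    def is_static(prev_positions, curr_positions, dt):
        """Return True when every LED moved slower than the static velocity
        limit along every axis between two measurement cycles."""
        for (xp, yp, zp), (xn, yn, zn) in zip(prev_positions, curr_positions):
            vx, vy, vz = (xn - xp) / dt, (yn - yp) / dt, (zn - zp) / dt
            if max(abs(vx), abs(vy), abs(vz)) >= STATIC_VELOCITY_LIMIT_MM_S:
                return False
        return True

    # The static snapshot at t0 is taken once is_static(...) holds for a cycle.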


Referring to FIGS. 4 and 4A, once the static snapshot is taken, the positions of the measured LEDs 50a, 50b, 50c are compared to a model of the tracker 44 in step 202. The model is data stored in the navigation computer 26. The model data indicates the positions of the LEDs on the tracker 44 in the tracker coordinate system BTRK1. The system 20 has stored the number and position of the LEDs 50 of each tracker 44, 46, 48 in each tracker's coordinate system. For each of the trackers 44, 46, 48, the origin of its coordinate system is set at the centroid of all LED positions of that tracker.


The localization engine 100 utilizes a rigid body matching algorithm or point matching algorithm to match the measured LEDs 50a, 50b, 50c in the localizer coordinate system LCLZ to the LEDs in the stored model. Once the best-fit is determined, the localization engine 100 evaluates the deviation of the fit to determine if the measured LEDs 50a, 50b, 50c fit within a stored predefined tolerance of the model. The tolerance may be based on a distance between the corresponding LEDs such that if the fit results in too great a distance, the initialization step has to be repeated. In some embodiments, the positions of the LEDs must not deviate from the model by more than 2.0 mm, preferably not more than 0.5 mm, and more preferably not more than 0.1 mm.
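One common way to realize such a rigid body (point) matching step is a least-squares best fit via singular value decomposition (the Kabsch method); the sketch below is an assumed implementation rather than the patent's specific algorithm, and 0.5 mm is one of the example tolerances above.

    import numpy as np

    def fit_rigid_transform(model_pts, measured_pts, tol_mm=0.5):
        """Best-fit rotation R and translation t mapping model LED positions
        (tracker coordinates) onto measured positions (localizer coordinates),
        plus a check that every per-LED residual is within tolerance."""
        A = np.asarray(model_pts, float)      # N x 3, tracker coordinate system
        B = np.asarray(measured_pts, float)   # N x 3, localizer coordinate system
        ca, cb = A.mean(axis=0), B.mean(axis=0)
        H = (A - ca).T @ (B - cb)
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = cb - R @ ca
        residuals = np.linalg.norm((A @ R.T + t) - B, axis=1)
        return R, t, bool(np.all(residuals <= tol_mm))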


If the fit is within the predefined tolerance, a transformation matrix is generated to transform any other unmeasured LEDs in the model from the bone tracker coordinate system BTRK1 into the localizer coordinate system LCLZ in step 204. This step is utilized if more than three LEDs are used or if virtual LEDs are used as explained further below. In some embodiments, trackers 44, 46, 48 may have four or more LEDs. Once all positions in the localizer coordinate system LCLZ are established, an LED cloud is created. The LED cloud is an arrangement of all LEDs 50a, 50b, 50c on the tracker 44 in the localizer coordinate system LCLZ based on the x-, y-, and z-axis positions of all the LEDs 50a, 50b, 50c in the localizer coordinate system LCLZ.
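Continuing the sketch above, the resulting transform can carry any additional model points, measured or not, into the localizer coordinate system to form the LED cloud; the point names and coordinates are placeholders.

    import numpy as np

    def build_led_cloud(model_points_btrk1, R, t):
        """Transform every model point from the tracker coordinate system into
        the localizer coordinate system LCLZ using the best-fit R, t."""
        return {name: R @ np.asarray(p, float) + t
                for name, p in model_points_btrk1.items()}

    # e.g. cloud = build_led_cloud({"LED50a": (0, 30, 0), "LED50b": (-25, -15, 0),
    #                               "LED50c": (25, -15, 0)}, R, t)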


Once the LED cloud is initially established, the navigation system 20 can proceed with tracking the tracker 44 during a surgical procedure. As previously discussed, this includes firing the next LED in the sequence. For illustration, LED 50a is now fired. Thus, LED 50a transmits light signals to the optical sensors 40. Once the light signals are received by the optical sensors 40, corresponding signals are generated by the optical sensors 40 and transmitted to the camera controller 42.


Based on the inputs from the optical sensors 40, the camera controller 42 generates a raw position signal that is then sent to the localization engine 100 to determine at time t1 the new position of LED 50a relative to the x-, y-, and z-axes of the localizer coordinate system LCLZ. This is shown in step 206 as a new LED measurement.


It should be appreciated that the designation of time such as t0, t1 . . . tn is used for illustrative purposes to indicate different times or different ranges of time or time periods and does not limit this invention to specific or definitive times.


With the new position of LED 50a determined, a linear velocity vector of LED 50a can be calculated by the localization engine 100 in step 208.


The tracker 44 is treated as a rigid body. Accordingly, the linear velocity vector of LED 50a is a vector quantity, equal to the time rate of change of its linear position. The velocity, and even the acceleration, of each LED in the localizer coordinate system LCLZ can be calculated from the previously and currently measured positions and times of that LED in the localizer coordinate system LCLZ. The previously and currently measured positions and times of an LED define the position history of that LED. The velocity calculation of LED 50a can take the simplest form of:








{right arrow over (v)}(LED50a)=({right arrow over (x)}n-{right arrow over (x)}p)/(tn-tp)
Where {right arrow over (x)}p=(x, y, z)p is the previously measured position of LED 50a at time tp, and {right arrow over (x)}n=(x, y, z)n is the currently measured position of LED 50a at time tn. One can also obtain the velocity and/or acceleration of each LED by data fitting the position history of that LED, as is well known to those skilled in the art.
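A sketch of this finite-difference velocity estimate follows; fitting a curve to a longer position history, the alternative mentioned above, would replace the two-point difference. Names and the example values are illustrative.

    def led_velocity(x_prev, x_curr, t_prev, t_curr):
        """Simple finite-difference estimate of an LED's linear velocity (mm/s)
        from its two most recent measured positions."""
        dt = t_curr - t_prev
        return tuple((c - p) / dt for p, c in zip(x_prev, x_curr))

    # e.g. v_led50a = led_velocity((10.0, 5.0, 0.0), (10.2, 5.1, 0.0), 0.000, 0.001)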


At time t1, in step 210, the gyroscope sensor 60 is also measuring an angular velocity of the tracker 44. Gyroscope sensor 60 transmits signals to the tracker controller 62 related to this angular velocity.


The tracker controller 62 then transmits a corresponding signal to the localization engine 100 so that the localization engine 100 can calculate an angular velocity vector {right arrow over (ω)} from these signals. In step 210, the gyroscope coordinate system is also transformed to the bone tracker coordinate system BTRK1 so that the angular velocity vector {right arrow over (ω)} calculated by the localization engine 100 is expressed in the bone tracker coordinate system BTRK1.


In step 212, a relative velocity vector {right arrow over (v)}R is calculated for the origin of the bone tracker coordinate system BTRK1 with respect to position vector {right arrow over (x)} (LED50a to ORIGIN). This position vector {right arrow over (x)} (LED50a to ORIGIN) is also stored in memory in the navigation computer 26 for access by the localization engine 100 for the following calculation. This calculation determines the relative velocity of the origin {right arrow over (v)}R (ORIGIN) of the bone tracker coordinate system BTRK1 by calculating the cross product of the angular velocity vector {right arrow over (ω)} derived from the gyroscope signal and the position vector from LED 50a to the origin.

{right arrow over (v)}R(ORIGIN)={right arrow over (ω)}×{right arrow over (x)}(LED50a to ORIGIN)


The localization engine 100 then calculates relative velocity vectors {right arrow over (v)}R for the remaining, unmeasured LEDs 50b, 50c (unmeasured because these LEDs have not been fired and thus their positions are not being measured). These velocity vectors can be calculated with respect to the origin of bone tracker coordinate system BTRK1.


The calculation performed by the localization engine 100 to determine the relative velocity vector {right arrow over (v)}R for each unmeasured LED 50b, 50c at time t1 is based on the cross product of the angular velocity vector {right arrow over (ω)} at time t1 and the position vectors {right arrow over (x)} (ORIGIN to LED50b) and {right arrow over (x)} (ORIGIN to LED50c), which are taken from the origin of bone tracker coordinate system BTRK1 to each of the unmeasured LEDs 50b, 50c. These position vectors {right arrow over (x)} (ORIGIN to LED 50b) and {right arrow over (x)} (ORIGIN to LED50c) are stored in memory in the navigation computer 26 for access by the localization engine 100 for the following calculations:

{right arrow over (v)}R(LED50b)={right arrow over (ω)}×{right arrow over (x)}(ORIGIN to LED50b)
{right arrow over (v)}R(LED50c)={right arrow over (ω)}×{right arrow over (x)}(ORIGIN to LED50c)
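The step 212 cross products can be sketched as below, assuming the angular velocity vector has already been expressed in the bone tracker coordinate system; the numeric offsets in the usage comment are illustrative only.

    import numpy as np

    def relative_velocities(omega_btrk1, led_offsets_btrk1, measured_led="LED50a"):
        """Relative velocity of the tracker origin with respect to the measured
        LED, and of each unmeasured LED with respect to the origin (step 212)."""
        omega = np.asarray(omega_btrk1, float)
        offsets = {k: np.asarray(v, float) for k, v in led_offsets_btrk1.items()}
        # x(LED50a to ORIGIN) is the negative of the stored origin-to-LED offset.
        v_rel = {"ORIGIN": np.cross(omega, -offsets[measured_led])}
        for name, x in offsets.items():
            if name != measured_led:
                v_rel[name] = np.cross(omega, x)   # omega x x(ORIGIN to LED)
        return v_rel

    # e.g. v_rel = relative_velocities((0.0, 0.0, 0.1),
    #                                  {"LED50a": (0, 30, 0), "LED50b": (-25, -15, 0),
    #                                   "LED50c": (25, -15, 0)})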


Also in step 212, these relative velocities, which are calculated in the bone tracker coordinate system BTRK1, are transferred into the localizer coordinate system LCLZ using the transformation matrix determined in step 202. The relative velocities in the localizer coordinate system LCLZ are used in calculations in step 214.


In step 214, the velocity vector {right arrow over (v)} of the origin of the bone tracker coordinate system BTRK1 in the localizer coordinate system LCLZ at time t1 is first calculated by the localization engine 100 based on the measured velocity vector {right arrow over (v)} (LED50a) of LED 50a at time t1. The velocity vector {right arrow over (v)} (ORIGIN) is calculated by adding the velocity vector {right arrow over (v)} (LED50a) of LED 50a at time t1 and the relative velocity vector {right arrow over (v)}R (ORIGIN) of the origin at time t1 expressed relative to the position vector of LED 50a to the origin. Thus, the velocity vector of the origin at time t1 is calculated as follows:

{right arrow over (v)}(ORIGIN)={right arrow over (v)}(LED50a)+{right arrow over (v)}R(ORIGIN)


Velocity vectors of the remaining, unmeasured LEDs in the localizer coordinate system LCLZ at time t1 can now be calculated by the localization engine 100 based on the velocity vector {right arrow over (v)} (ORIGIN) of the origin of the bone tracker coordinate system BTRK1 in the localizer coordinate system LCLZ at time t1 and their respective relative velocity vectors at time t1 expressed relative to their position vectors with the origin of the bone tracker coordinate system BTRK1. These velocity vectors at time t1 are calculated as follows:

{right arrow over (v)}(LED50b)={right arrow over (v)}(ORIGIN)+{right arrow over (v)}R(LED50b)
{right arrow over (v)}(LED50c)={right arrow over (v)}(ORIGIN)+{right arrow over (v)}R(LED50c)


In step 216, the localization engine 100 calculates the movements, i.e., the change in position Δx (in Cartesian coordinates), of each of the unmeasured LEDs 50b, 50c from time t0 to time t1 based on the calculated velocity vectors of LEDs 50b, 50c and the change in time. In some embodiments the change in time Δt for each LED measurement is two milliseconds or less, and in some embodiments one millisecond or less.

Δx(LED50b)={right arrow over (v)}(LED50b)×Δt
Δx(LED50c)={right arrow over (v)}(LED50c)×Δt


These calculated changes in position (x, y, z) can then be added to the previously determined positions of each of LEDs 50b, 50c in the localizer coordinate system LCLZ. Thus, in step 218, changes in position can be added to the previous positions of the LEDs 50b, 50c at time t0, which were determined during the static snapshot. This is expressed as follows:

x(LED50b)t1=x(LED50b)t0+Δx(LED50b)
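Steps 214 through 218 might be sketched as follows, with the relative velocities assumed to already be rotated into the localizer coordinate system; the function and argument names are illustrative.

    import numpy as np

    def update_unmeasured_positions(v_measured_led_lclz, v_rel_lclz,
                                    prev_positions_lclz, dt, measured_led="LED50a"):
        """Propagate the origin velocity from the measured LED (step 214), derive
        velocities of the unmeasured LEDs, and integrate their positions over dt
        (steps 216 and 218)."""
        v_origin = (np.asarray(v_measured_led_lclz, float)
                    + np.asarray(v_rel_lclz["ORIGIN"], float))
        new_positions = {}
        for name, x_prev in prev_positions_lclz.items():
            if name == measured_led:
                continue   # the measured LED position comes directly from the camera
            v_led = v_origin + np.asarray(v_rel_lclz[name], float)        # step 214
            new_positions[name] = np.asarray(x_prev, float) + v_led * dt  # steps 216, 218
        return v_origin, new_positions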


In step 220, these calculated positions for each of LEDs 50b, 50c at time t1 are combined with the determined position of LED 50a at time t1. The newly determined positions of LEDs 50a, 50b, 50c are then matched to the model of tracker 44 to obtain a best fit using the point matching algorithm or rigid body matching algorithm. The result of this best fit calculation, if within the defined tolerance of the system 20, is that a new transformation matrix is created by the navigation processor 52 to link the bone tracker coordinate system BTRK1 to the localizer coordinate system LCLZ.


With the new transformation matrix the newly calculated positions of the unmeasured LEDs 50b, 50c are adjusted to the model in step 222 to provide adjusted positions. The measured position of LED 50a can also be adjusted due to the matching algorithm such that it is also recalculated. These adjustments are considered an update to the LED cloud. In some embodiments, the measured position of LED 50a is fixed to the model's position of LED 50a during the matching step.


With the best fit transformation complete, the measured (and possibly adjusted) position of LED 50a and the calculated (and adjusted) positions of LEDs 50b, 50c in the localizer coordinate system LCLZ enable the coordinate transformer 102 to determine a new position and orientation of the femur F based on the previously described relationships between the femur coordinate system FBONE, the bone tracker coordinate system BTRK1, and the localizer coordinate system LCLZ.


Steps 206 through 222 are then repeated at a next time t2, starting with the measurement in the localizer coordinate system LCLZ of LED 50b, with LEDs 50a, 50c being the unmeasured LEDs. As a result of this loop, at each time t1, t2 . . . tn the position of each LED 50a, 50b, 50c is either measured (one LED being fired at each time) or calculated, with the calculated positions being closely approximated based on measurements by the optical sensor 40 and the gyroscope sensor 60. This loop of steps 206 through 222 to determine the new positions of the LEDs 50 can be carried out by the localization engine 100 at a frequency of at least 100 Hz, more preferably at least 300 Hz, and most preferably at least 500 Hz.


Referring to FIG. 5, the LED cloud may also include virtual LEDs, which are predetermined points identified on the model, but that do not actually correspond to physical LEDs on the tracker 44. The positions of these points may also be calculated at times t1, t2 . . . tn. These virtual LEDs can be calculated in the same fashion as the unmeasured LEDs with reference to FIG. 4. The only difference is that the virtual LEDs are never fired or included in the sequence of optical measurements, since they do not correspond to any light source, but are merely virtual in nature. Steps 300-322 show the steps used for tracking the trackers 44, 46, 48 using real and virtual LEDs. Steps 300-322 generally correspond to steps 200-222 except for the addition of the virtual LEDs, which are treated like unmeasured LEDs using the same equations described above.


One purpose of using virtual LEDs in addition to the LEDs 50a, 50b, 50c, for example, is to reduce the effect of errors in the velocity calculations described above. These errors may have little consequence on the calculated positions of the LEDs 50a, 50b, 50c, but can be amplified the further away from the LEDs 50a, 50b, 50c a point of interest is located. For instance, when tracking the femur F with tracker 44, the LEDs 50a, 50b, 50c incorporated in the tracker 44 may experience slight errors in their calculated positions of about 0.2 millimeters. However, consider the surface of the femur F that may be located over 10 centimeters away from the LEDs 50a, 50b, 50c. The slight error of 0.2 millimeters at the LEDs 50a, 50b, 50c can result in 0.4 to 2 millimeters of error on the surface of the femur F. The further away the femur F is located from the LEDs 50a, 50b, 50c the more the error increases. The use of virtual LEDs in the steps of FIG. 5 can reduce the potential amplification of such errors as described below.


Referring to FIG. 5A, one virtual LED 50d can be positioned on the surface of the femur F. Other virtual LEDs 50e, 50f, 50g, 50h, 50i, 50j, 50k can be positioned at random locations in the bone tracker coordinate system BTRK1 such as along each of the x-, y-, and z-axes, and on both sides of the origin along these axes to yield 6 virtual LEDs. These virtual LEDs are included as part of the model of the tracker 44 shown in FIG. 5A and used in steps 302 and 320. In some embodiments, only the virtual LEDs 50e-50k are used. In other embodiments, virtual LEDs may be positioned at locations along each of the x-, y-, and z-axes, but at different distances from the origin of the bone tracker coordinate system BTRK1. In still further embodiments, some or all of the virtual LEDs may be located off of the x-, y-, and z-axes.


Now in the model are real LEDs 50a, 50b, 50c and virtual LEDs 50d-50k. At each time t1, t2 . . . tn this extended model is matched in step 320 with the measured/calculated positions of real LEDs 50a, 50b, 50c and with the calculated positions of virtual LEDs 50d-50k to obtain the transformation matrix that links the bone tracker coordinate system BTRK1 with the localizer coordinate system LCLZ. Now, with the virtual LEDs 50d-50k included in the model, which are located at positions outlying the real LEDs 50a, 50b, 50c, the error in the rotation matrix can be reduced. In essence, the rigid body matching algorithm or point matching algorithm has additional points used for matching and some of these additional points are located radially outwardly from the points defining the real LEDs 50a, 50b, 50c, thus rotationally stabilizing the match.


In another variation of the process of FIG. 5, the locations of the virtual LEDs 50e-50k can be changed dynamically during use depending on movement of the tracker 44. The calculated positions of the unmeasured real LEDs 50b, 50c and the virtual LEDs 50e-50k at time t1 are more accurate the slower the tracker 44 moves. Thus, the locations of virtual LEDs 50e-50k along the x-, y-, and z-axes relative to the origin of the bone tracker coordinate system BTRK1 can be adjusted based on the speed of the tracker 44. If, for example, the locations of the virtual LEDs 50e-50k are denoted (s,0,0), (-s,0,0), (0,s,0), (0,-s,0), (0,0,s), (0,0,-s), respectively, then s would increase when the tracker 44 moves slowly and decrease to a smaller value when the tracker 44 moves faster. This could be handled by an empirical formula for s, or s could be adjusted based on an estimate of the error in velocity and calculated positions.
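A sketch of the speed-dependent placement of the six axis-aligned virtual LEDs is given below; because the patent leaves the empirical formula open, the mapping from tracker speed to s and the constants are assumptions.

    def virtual_led_positions(tracker_speed_mm_s, s_max=200.0, s_min=20.0, v_ref=100.0):
        """Place six virtual LEDs at (+/-s, 0, 0), (0, +/-s, 0), (0, 0, +/-s) in the
        bone tracker coordinate system, shrinking s as the tracker moves faster."""
        s = max(s_min, s_max / (1.0 + tracker_speed_mm_s / v_ref))  # assumed empirical form
        return [( s, 0.0, 0.0), (-s, 0.0, 0.0),
                (0.0,  s, 0.0), (0.0, -s, 0.0),
                (0.0, 0.0,  s), (0.0, 0.0, -s)]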


Determining new positions of the LEDs 50 (real and virtual) can be carried out at a frequency of at least 100 Hz, more preferably at least 300 Hz, and most preferably at least 500 Hz.


Data from the accelerometers 70 can be used in situations where optical measurement of an LED 50 is impeded due to interference with the line-of-sight. When an LED to be measured is blocked, the localization engine 100 assumes a constant velocity of the origin to estimate positions. However, the constant velocity assumption in this situation may be inaccurate and result in errors. The accelerometers 70 essentially monitor whether the constant velocity assumption remains accurate over the time period. The steps shown in FIGS. 6 and 7 illustrate how this assumption is checked.


Continuing to use tracker 44 as an example, steps 400-422 of FIG. 6 generally correspond to steps 300-322 from FIG. 5. However, in step 424, the system 20 determines whether, in the last cycle of measurements, fewer than three LEDs were measured, meaning that one or more of the LEDs in the cycle could not be measured. This could be caused by line-of-sight issues, etc. A cycle for tracker 44 is the last three attempted measurements. If, during the last three measurements, each of the LEDs 50a, 50b, 50c was visible and could be measured, then the system 20 proceeds to step 408 and continues as previously described with respect to FIG. 5.


If the system 20 determines that one or more of the LEDs 50a, 50b, 50c could not be measured during the cycle, i.e., were blocked from measurement, the algorithm still moves to step 408. If, however, the new LED to be measured in step 406 was the one that could not be measured, the system makes certain velocity assumptions, as described below.


When an LED, such as LED 50a, is not seen by the optical sensor 40 at its measurement time tn in step 406, the previously calculated velocity vector {right arrow over (v)} (ORIGIN) of the origin of the tracker 44 in the localizer coordinate system LCLZ at the previous time t(n-1) is assumed to remain constant. Accordingly, velocity vectors of LEDs 50a, 50b, 50c in the localizer coordinate system LCLZ can be calculated based on the previously calculated velocity vector {right arrow over (v)} (ORIGIN) in the localizer coordinate system LCLZ and the relative velocity vectors of LEDs 50a, 50b, 50c, which are derived from a newly measured angular velocity vector from the gyroscope 60. The equations described in steps 316-322 can then be used to determine new positions of the LEDs 50a, 50b, 50c.


To start, when LED 50a is to be measured at step 406, but is obstructed, the velocity vector of the origin is assumed to be the same as the previous calculation. Accordingly, the velocity of the new LED is not calculated at step 408:

{right arrow over (v)}(ORIGIN)=previous calculation


Step 410 proceeds the same as step 310.


The relative velocity vectors {right arrow over (v)}R of LEDs 50a, 50b, 50c calculated in step 412 are then based on the previous velocity vector {right arrow over (v)} (ORIGIN) and the newly measured angular velocity vector from the gyroscope 60 in the bone tracker coordinate system BTRK1:

{right arrow over (v)}R(LED50a)={right arrow over (ω)}(current)×{right arrow over (x)}(ORIGIN to LED50a)
{right arrow over (v)}R(LED50b)={right arrow over (ω)}(current)×{right arrow over (x)}(ORIGIN to LED50b)
{right arrow over (v)}R(LED50c)={right arrow over (ω)}(current)×{right arrow over (x)}(ORIGIN to LED50c)


In step 414, velocity vectors in the localizer coordinate system LCLZ can then be calculated using the origin velocity vector {right arrow over (v)} (ORIGIN) and the relative velocity vectors {right arrow over (v)}R of LEDs 50a, 50b, 50c:
{right arrow over (v)}(LED50a)={right arrow over (v)}(ORIGIN)+{right arrow over (v)}R(LED50a)
{right arrow over (v)}(LED50b)={right arrow over (v)}(ORIGIN)+{right arrow over (v)}R(LED50b)
{right arrow over (v)}(LED50c)={right arrow over (v)}(ORIGIN)+{right arrow over (v)}R(LED50c)


Steps 416 through 422 proceed the same as steps 316-322.
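The blocked-measurement branch of steps 408 through 414 can be sketched as follows, assuming the previous origin velocity (in LCLZ), the current gyroscope reading (in BTRK1), and the rotation part of the current transformation matrix are available; all names are placeholders.

    import numpy as np

    def velocities_when_blocked(v_origin_prev_lclz, omega_current_btrk1,
                                led_offsets_btrk1, rotation_btrk1_to_lclz):
        """With the LED due to fire obstructed, keep the previous origin velocity
        and derive every LED velocity from the current gyroscope reading."""
        R = np.asarray(rotation_btrk1_to_lclz, float)
        omega = np.asarray(omega_current_btrk1, float)
        v_origin = np.asarray(v_origin_prev_lclz, float)   # assumed to stay constant
        v_leds = {}
        for name, offset in led_offsets_btrk1.items():
            v_rel_lclz = R @ np.cross(omega, np.asarray(offset, float))
            v_leds[name] = v_origin + v_rel_lclz           # v(LED) = v(ORIGIN) + vR(LED)
        return v_leds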


If the system 20 determines at step 424 that one or more of the LEDs 50a, 50b, 50c could not be measured during the cycle, i.e., were blocked from measurement, another algorithm is carried out simultaneously at steps 500-506 shown in FIG. 7 until a complete cycle of measurements is made where all of the LEDs 50a, 50b, 50c in the cycle were visible to the optical sensor 40. Thus, the system 20 is considered to be in a “blocked” condition until the complete cycle with all visible measurements is made.


Steps 500-506 are carried out continuously while the system 20 is in the blocked condition.


In step 500 the navigation processor 52 starts a clock that tracks how long the system 20 is in the blocked condition. The time in the blocked condition is referred to below as tblocked condition.


In step 502, the accelerometer 70 measures accelerations along the x-, y-, and z-axes of the bone tracker coordinate system BTRK1 to track errors in the constant velocity assumption. Accelerometer readings, like gyroscope readings, are transformed from the accelerometer coordinate system to the bone tracker coordinate system BTRK1.


If the accelerometer 70 detects acceleration(s) that exceed predefined acceleration tolerance(s), the navigation computer 26 will put the system 20 into an error condition. The acceleration tolerances could be defined differently along each x-, y-, and z-axis, or could be the same along each axis. If a measured acceleration exceeds a tolerance, then the constant velocity assumption is unreliable and cannot be used for that particular application of surgical navigation. Different tolerances may be employed for different applications. For instance, during robotic cutting, the tolerance may be very low, but for visual navigation only, i.e., when the data are not fed back to the cutting control loop, the tolerance may be set higher.


In step 504, velocity errors associated with the positions of the LEDs 50 relative to the optical sensor 40 are taken into account and monitored during the blocked condition. For each of the LEDs 50, the velocity error $v_{error}$ multiplied by the time in the blocked condition $t_{\text{blocked}}$ must be less than a position error tolerance $\gamma$, and thus must satisfy the following equation to prevent the system 20 from being put into an error condition:

$v_{error} \times t_{\text{blocked}} < \gamma$


In this equation, the velocity error $v_{error}$ is calculated for each of the LEDs 50a, 50b, 50c as follows:

$v_{error} = \frac{x_{error}(t) + x_{error}(t-1)}{\Delta t}$
Position errors $x_{error}(t)$ and $x_{error}(t-1)$ are predefined position errors in the system 20 that are based on location relative to the optical sensors 40 at times t and t-1. In essence, the further away the LEDs 50a, 50b, 50c are located from the optical sensors 40, the higher the potential position errors. These position errors are derived either experimentally or theoretically and placed in a look-up table or formula so that at each position of the LEDs 50a, 50b, 50c in Cartesian coordinates (x, y, z) an associated position error is provided.


In step 504, the localization engine 100 accesses this look-up table or evaluates this formula to determine the position errors for each of the LEDs 50a, 50b, 50c at the current time t and at the previous time t-1. The position errors are thus based on the positions in Cartesian coordinates in the localizer coordinate system LCLZ calculated by the system 20 in step 422 for the current time t and the previous time t-1. The time variable $\Delta t$ represents the time between successive position calculations, i.e., the difference between t and t-1, which for illustrative purposes may be 1 millisecond.


The position error tolerance $\gamma$ is predefined in the navigation computer 26 for access by the localization engine 100. The position error tolerance $\gamma$ could be expressed in millimeters. The position error tolerance $\gamma$ can range from 0.001 to 1 millimeter and in some embodiments is specifically set at 0.5 millimeters. Thus, if the position error tolerance $\gamma$ is set to 0.5 millimeters, the following equation must be satisfied:

$v_{error} \times t_{\text{blocked}} < 0.5 \text{ millimeters}$


As can be seen, the longer the system 20 is in the blocked condition, the larger the effect the time variable has in this equation and thus the smaller the velocity errors that will be tolerated. In some embodiments, this equation is evaluated by the localization engine 100 in step 504 separately for each of the LEDs 50a, 50b, 50c. In other embodiments, because the LEDs 50a, 50b, 50c are so closely arranged on the tracker 44, the velocity error of only one of the LEDs 50a, 50b, 50c is used in this calculation to determine compliance.


In step 506, when the error(s) exceed the position error tolerance $\gamma$, the system 20 is placed in an error condition. In such a condition, for example, any control or movement of cutting or ablation tools is ceased and the tools are shut down.


IV. Other Embodiments

In one embodiment, when each of the trackers 44, 46, 48 are being actively tracked, the firing of the LEDs occurs such that one LED from tracker 44 is fired, then one LED from tracker 46, then one LED from tracker 48, then a second LED from tracker 44, then a second LED from tracker 46, and so on until all LEDs have been fired and then the sequence repeats. This order of firing may occur through instruction signals sent from the transceivers (not shown) on the camera unit 36 to transceivers (not shown) on the trackers 44, 46, 48.


The navigation system 20 can be used in a closed loop manner to control surgical procedures carried out by surgical cutting instruments. Both the instrument 22 and the anatomy being cut are outfitted with trackers such that the navigation system 20 can track the position and orientation of the instrument 22 and the anatomy being cut, such as bone.


In one embodiment, the navigation system is part of a robotic surgical system for treating tissue. In some versions, the robotic surgical system is a robotic surgical cutting system for cutting away material from a patient's anatomy, such as bone or soft tissue. The cutting system could be used to prepare bone for surgical implants such as hip and knee implants, including unicompartmental, bicompartmental, or total knee implants. Some of these types of implants are shown in U.S. patent application Ser. No. 13/530,927, entitled, “Prosthetic Implant and Method of Implantation”, the disclosure of which is hereby incorporated by reference.


The robotic surgical cutting system includes a manipulator (see, for instance, FIG. 1). The manipulator has a plurality of arms and a cutting tool carried by at least one of said plurality of arms. A robotic control system controls or constrains movement of the cutting tool in at least 5 degrees of freedom. An example of such a manipulator and control system is shown in U.S. Provisional Patent Application No. 61/679,258, entitled, "Surgical Manipulator Capable of Controlling a Surgical Instrument in either a Semi-Autonomous Mode or a Manual, Boundary Constrained Mode", hereby incorporated by reference, and also in U.S. patent application Ser. No. 13/958,834, entitled, "Navigation System for use with a Surgical Manipulator Operable in Manual or Semi-Autonomous Mode", the disclosure of which is hereby incorporated by reference.


In this embodiment, the navigation system 20 communicates with the robotic control system (which can include the manipulator controller 54). The navigation system 20 communicates position and/or orientation data to said robotic control system. The position and/or orientation data is indicative of a position and/or orientation of the instrument 22 relative to the anatomy. This communication provides closed-loop control of the cutting of the anatomy such that the cutting occurs within a predefined boundary.


In this embodiment, manipulator movement may coincide with LED measurements such that for each LED measurement taken, there is a corresponding movement of the instrument 22 by the manipulator 56. However, this may not always be the case. For instance, there may be such a lag between the last LED measurement and movement by the manipulator 56 that the position and/or orientation data sent from the navigation computer 26 to the manipulator 56 for purposes of control loop movement becomes unreliable. In such a case, the navigation computer 26 can be configured to also transmit kinematic data to the manipulator controller 54. Such kinematic data includes the previously determined linear and angular velocities for the trackers 44, 46, 48. Since the velocities are already known, positions can be calculated based on the lag of time. The manipulator controller 54 could then calculate, for purposes of controlling movement of the manipulator 56, the positions and orientations of the trackers 44, 46, 48 and thus, the relative positions and orientations of the instrument 22 (or instrument tip) to the femur F and/or tibia T.


In this embodiment, the instrument 22 is held by the manipulator shown in FIG. 1 or another robot that provides some form of mechanical constraint to movement. This constraint limits the movement of the instrument 22 to within a predefined boundary. If the instrument 22 strays beyond the predefined boundary, a control signal is sent to the instrument 22 to stop cutting.


When tracking both the instrument 22 and the anatomy being cut in real time in these systems, the need to rigidly fix the anatomy in position can be eliminated. Since both the instrument 22 and the anatomy are tracked, control of the instrument 22 can be adjusted based on the relative position and/or orientation of the instrument 22 to the anatomy. Also, representations of the instrument 22 and the anatomy on the display can move relative to one another to emulate their real-world motion.


In one embodiment, each of the femur F and tibia T has a target volume of material that is to be removed by the working end of the surgical instrument 22. The target volumes are defined by one or more boundaries. The boundaries define the surfaces of the bone that should remain after the procedure. In some embodiments, the system 20 tracks and controls the surgical instrument 22 to ensure that the working end, e.g., a bur, removes only the target volume of material and does not extend beyond the boundary, as disclosed in Provisional Patent Application No. 61/679,258, entitled, "Surgical Manipulator Capable of Controlling a Surgical Instrument in either a Semi-Autonomous Mode or a Manual, Boundary Constrained Mode", hereby incorporated by reference.


In the described embodiment, control of the instrument 22 is accomplished by utilizing the data generated by the coordinate transformer 102 that indicates the position and orientation of the bur or other cutting tool relative to the target volume. By knowing these relative positions, the surgical instrument 22, or the manipulator to which it is mounted, can be controlled so that only the desired material is removed.


In other systems, the instrument 22 has a cutting tool that is movable in three degrees of freedom relative to a handheld housing and is manually positioned by the hand of the surgeon, without the aid of a cutting jig, guide arm, or other constraining mechanism. Such systems are shown in U.S. Provisional Patent Application No. 61/662,070, entitled, "Surgical Instrument Including Housing, a Cutting Accessory that Extends from the Housing and Actuators that Establish the Position of the Cutting Accessory Relative to the Housing", the disclosure of which is hereby incorporated by reference.


In these embodiments, the system includes a hand held surgical cutting instrument having a cutting tool. A control system controls movement of the cutting tool in at least 3 degrees of freedom using internal actuators/motors, as shown in U.S. Provisional Patent Application No. 61/662,070, entitled, “Surgical Instrument Including Housing, a Cutting Accessory that Extends from the Housing and Actuators that Establish the Position of the Cutting Accessory Relative to the Housing”, the disclosure of which is hereby incorporated by reference. The navigation system 20 communicates with the control system. One tracker (such as tracker 48) is mounted to the instrument. Other trackers (such as trackers 44, 46) are mounted to a patient's anatomy.


In this embodiment, the navigation system 20 communicates with the control system of the hand held surgical cutting instrument. The navigation system 20 communicates position and/or orientation data to the control system. The position and/or orientation data is indicative of a position and/or orientation of the instrument 22 relative to the anatomy. This communication provides closed-loop control of the cutting of the anatomy such that the cutting occurs within a predefined boundary (the term predefined boundary is understood to include predefined trajectory, volume, line, other shapes or geometric forms, and the like).


Features of the invention may be used to track sudden or unexpected movements of the localizer coordinate system LCLZ, as may occur when the camera unit 36 is bumped by surgical personnel. An accelerometer (not shown) mounted to the camera unit 36 monitors for bumps and stops the system 20 if a bump is detected. In this embodiment, the accelerometer communicates with the camera controller 42, and if the measured acceleration along any of the x-, y-, or z-axes exceeds a predetermined value, the camera controller 42 sends a corresponding signal to the navigation computer 26 to disable the system 20 and wait for the camera unit 36 to stabilize before resuming measurements. In some cases, the initialization step 200, 300, 400 would have to be repeated before resuming navigation.


In some embodiments, a virtual LED is positioned at the working tip of the instrument 22. In this embodiment, the virtual LED is placed at the location of the working tip in the model of the instrument tracker 48 so that the working tip location is continuously calculated.


It is an object of the intended claims to cover all such modifications and variations that come within the true spirit and scope of this invention. Furthermore, the embodiments described above are related to medical applications, but the inventions described herein are also applicable to other applications such as industrial, aerospace, defense, and the like.

Claims
  • 1. A surgical system, comprising: a robotic manipulator; an instrument coupled to the robotic manipulator, wherein the robotic manipulator is configured to control movement of the instrument to facilitate a surgical procedure; a non-optical sensor coupled to the instrument; and a navigation system configured to obtain readings from the non-optical sensor, and the navigation system comprising a camera unit configured to optically track a pose of the instrument, wherein the navigation system is configured to: detect a condition whereby the camera unit is blocked from optically tracking the pose of the instrument; and in response to detection of the condition, track the pose of the instrument with the readings from the non-optical sensor.
  • 2. The surgical system of claim 1, wherein the navigation system is configured to: detect that the camera unit is no longer blocked from optically tracking the pose of the instrument; and in response, optically track the pose of the instrument with the camera unit.
  • 3. The surgical system of claim 1, wherein, in response to detection of the condition, the navigation system is configured to utilize readings from the non-optical sensor to estimate the pose of the instrument.
  • 4. The surgical system of claim 1, wherein, in response to detection of the condition, the navigation system tracks the pose of the instrument by further being configured to: obtain a previous pose of the instrument that was measured based on optically tracking the pose of the instrument by the camera unit prior to detection of the condition; and utilize readings from the non-optical sensor to calculate a current pose of the instrument relative to the previous pose.
  • 5. The surgical system of claim 1, wherein the non-optical sensor comprises an accelerometer.
  • 6. The surgical system of claim 1, wherein the non-optical sensor comprises a gyroscope.
  • 7. The surgical system of claim 1, wherein the robotic manipulator is configured to utilize the instrument to prepare a patient anatomy to receive an implant.
  • 8. The surgical system of claim 1, wherein the robotic manipulator comprises a robotic arm that is supported by a moveable cart.
  • 9. The surgical system of claim 1, wherein, in response to detection of the condition, the navigation system is configured to communicate the tracked pose of the instrument to a control system coupled to the robotic manipulator to enable the robotic manipulator to control or constrain movement of the instrument relative to a predefined path or boundary.
  • 10. A navigation system configured to be utilized with a robotic manipulator configured to control movement of an instrument to facilitate a surgical procedure, the navigation system comprising: a camera unit configured to optically track a pose of the instrument; a non-optical sensor coupled to the instrument; and a computing system coupled to the camera unit and being configured to obtain readings from the non-optical sensor, and the computing system being configured to: detect a condition whereby the camera unit is blocked from optically tracking the pose of the instrument; and in response to detection of the condition, track the pose of the instrument with the readings from the non-optical sensor.
  • 11. The navigation system of claim 10, wherein the computing system is configured to: detect that the camera unit is no longer blocked from optically tracking the pose of the instrument; and in response, optically track the pose of the instrument with the camera unit.
  • 12. The navigation system of claim 10, wherein, in response to detection of the condition, the computing system is configured to utilize readings from the non-optical sensor to estimate the pose of the instrument.
  • 13. The navigation system of claim 10, wherein, in response to detection of the condition, the computing system tracks the pose of the instrument by further being configured to: obtain a previous pose of the instrument that was measured based on optically tracking the pose of the instrument by the camera unit prior to detection of the condition; and utilize readings from the non-optical sensor to calculate a current pose of the instrument relative to the previous pose.
  • 14. The navigation system of claim 10, wherein the non-optical sensor comprises an accelerometer.
  • 15. The navigation system of claim 10, wherein the non-optical sensor comprises a gyroscope.
  • 16. The navigation system of claim 10, wherein, in response to detection of the condition, the computing system is configured to communicate the tracked pose of the instrument to a control system coupled to the robotic manipulator to enable the robotic manipulator to control or constrain movement of the instrument relative to a predefined path or boundary.
  • 17. A method of operating a navigation system including a camera unit, a non-optical sensor, and a computing system, the non-optical sensor being coupled to an instrument that is controlled by a robotic manipulator, the method comprising: optically tracking a pose of the instrument with the camera unit; detecting, with the computing system, a condition whereby the camera unit is blocked from optically tracking the pose of the instrument; obtaining, with the computing system, readings from the non-optical sensor; and in response to detecting the condition, the computing system tracking the pose of the instrument with the readings from the non-optical sensor.
  • 18. The method of claim 17, comprising: detecting, with the computing system, that the camera unit is no longer blocked from optically tracking the pose of the instrument; and in response, optically tracking the pose of the instrument with the camera unit.
  • 19. The method of claim 17, comprising: in response to detecting the condition, the computing system utilizing readings from the non-optical sensor for estimating the pose of the instrument.
  • 20. The method of claim 17, comprising, in response to detecting the condition, the computing system tracking the pose of the instrument by further: obtaining a previous pose of the instrument that was measured based on optically tracking the pose of the instrument by the camera unit prior to detecting the condition; and utilizing readings from the non-optical sensor for calculating a current pose of the instrument relative to the previous pose.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 16/783,654, filed on Feb. 6, 2020, which is a continuation of U.S. patent application Ser. No. 15/599,935, filed on May 19, 2017, issued as U.S. Pat. No. 10,575,906, which is a continuation of U.S. patent application Ser. No. 14/994,236, filed on Jan. 13, 2016, issued as U.S. Pat. No. 9,687,307, which is a continuation of U.S. patent application Ser. No. 14/635,402, filed on Mar. 2, 2015, issued as U.S. Pat. No. 9,271,804, which is a continuation of U.S. patent application Ser. No. 14/035,207, filed on Sep. 24, 2013, issued as U.S. Pat. No. 9,008,757, which claims priority to and the benefit of U.S. Provisional Patent App. No. 61/705,804, filed on Sep. 26, 2012, the entire contents of all of which are hereby incorporated by reference.

US Referenced Citations (216)
Number Name Date Kind
5086401 Glassman et al. Feb 1992 A
5279309 Taylor et al. Jan 1994 A
5408409 Glassman et al. Apr 1995 A
5494034 Schlondorff et al. Feb 1996 A
5497061 Nonaka et al. Mar 1996 A
5506682 Pryor Apr 1996 A
5569578 Mushabac Oct 1996 A
5577502 Darrow et al. Nov 1996 A
5592401 Kramer Jan 1997 A
5630431 Taylor May 1997 A
5638819 Manwaring et al. Jun 1997 A
5662111 Cosman Sep 1997 A
5676673 Ferre et al. Oct 1997 A
5682886 Delp et al. Nov 1997 A
5704897 Truppe Jan 1998 A
5729475 Romanik, Jr. Mar 1998 A
5730129 Darrow et al. Mar 1998 A
5765561 Chen et al. Jun 1998 A
5800352 Ferre et al. Sep 1998 A
5803089 Ferre et al. Sep 1998 A
5828770 Leis et al. Oct 1998 A
5829444 Ferre et al. Nov 1998 A
5848967 Cosman Dec 1998 A
5923417 Leis Jul 1999 A
5930741 Kramer Jul 1999 A
5953683 Hansen et al. Sep 1999 A
5967980 Ferre et al. Oct 1999 A
5976156 Taylor et al. Nov 1999 A
6006126 Cosman Dec 1999 A
6026315 Lenz et al. Feb 2000 A
6061644 Leis May 2000 A
6106464 Bass et al. Aug 2000 A
6115128 Vann Sep 2000 A
6122538 Sliwa, Jr. et al. Sep 2000 A
6161033 Kuhn Dec 2000 A
6167295 Cosman Dec 2000 A
6175756 Ferre et al. Jan 2001 B1
6226548 Foley et al. May 2001 B1
6228089 Wahrburg May 2001 B1
6234983 Storey et al. May 2001 B1
6235038 Hunter et al. May 2001 B1
6236875 Bucholz et al. May 2001 B1
6246898 Vesely et al. Jun 2001 B1
6266142 Junkins et al. Jul 2001 B1
6273896 Franck et al. Aug 2001 B1
6275725 Cosman Aug 2001 B1
6285902 Kienzle, III et al. Sep 2001 B1
6288785 Frantz et al. Sep 2001 B1
6314312 Wessels et al. Nov 2001 B1
6322567 Mittelstadt et al. Nov 2001 B1
6341231 Ferre et al. Jan 2002 B1
6351659 Vilsmeier Feb 2002 B1
6351661 Cosman Feb 2002 B1
6400460 Chen Jun 2002 B1
6409687 Foxlin Jun 2002 B1
6442416 Schultz Aug 2002 B1
6442417 Shahidi et al. Aug 2002 B1
6445943 Ferre et al. Sep 2002 B1
6450978 Brosseau et al. Sep 2002 B1
6456868 Saito et al. Sep 2002 B2
6466815 Saito et al. Oct 2002 B1
6473635 Rasche Oct 2002 B1
6490467 Bucholz et al. Dec 2002 B1
6491702 Heilbrun et al. Dec 2002 B2
6527443 Vilsmeier et al. Mar 2003 B1
6581000 Hills et al. Jun 2003 B2
6584339 Galloway, Jr. et al. Jun 2003 B2
6584378 Anfindsen Jun 2003 B1
6585651 Nolte et al. Jul 2003 B2
6611141 Schulz et al. Aug 2003 B1
6665079 Tocci et al. Dec 2003 B1
6675032 Chen et al. Jan 2004 B2
6675040 Cosman Jan 2004 B1
6675122 Markendorf et al. Jan 2004 B1
6691074 Moriya et al. Feb 2004 B1
6694167 Ferre et al. Feb 2004 B1
6711431 Sarin et al. Mar 2004 B2
6757582 Brisson et al. Jun 2004 B2
6848304 Geen Feb 2005 B2
6876926 Kirkland et al. Apr 2005 B2
6926709 Bieger et al. Aug 2005 B2
6963409 Benner et al. Nov 2005 B2
7072704 Bucholz Jul 2006 B2
7104996 Bonutti Sep 2006 B2
7139601 Bucholz et al. Nov 2006 B2
7166114 Moctezuma De La Barrera et al. Jan 2007 B2
7206627 Abovitz et al. Apr 2007 B2
7209028 Boronkay et al. Apr 2007 B2
7217276 Henderson et al. May 2007 B2
7283892 Boillot et al. Oct 2007 B1
7289227 Smetak et al. Oct 2007 B2
7295891 Huttenhofer et al. Nov 2007 B2
7395181 Foxlin Jul 2008 B2
7421343 Hawkinson Sep 2008 B2
7535411 Falco May 2009 B2
7542791 Mire et al. Jun 2009 B2
7556428 Sukovic et al. Jul 2009 B2
7640106 Stokar et al. Dec 2009 B1
7689321 Karlsson Mar 2010 B2
7702379 Avinash et al. Apr 2010 B2
7725162 Malackowski et al. May 2010 B2
7725279 Luinge et al. May 2010 B2
7728868 Razzaque et al. Jun 2010 B2
7747312 Barrick et al. Jun 2010 B2
7862570 Russell et al. Jan 2011 B2
7868610 Velinsky et al. Jan 2011 B2
7895761 Pettersson Mar 2011 B2
8010180 Quaid et al. Aug 2011 B2
8028580 Millet Oct 2011 B2
8047075 Nasiri et al. Nov 2011 B2
8057407 Martinelli et al. Nov 2011 B2
8057479 Stone Nov 2011 B2
8057482 Stone et al. Nov 2011 B2
8072614 Deliwala Dec 2011 B2
8082064 Kay Dec 2011 B2
8096163 Goldbach Jan 2012 B2
8100769 Rabin Jan 2012 B2
8103472 Braun et al. Jan 2012 B2
8109890 Kamiar et al. Feb 2012 B2
8111407 Takimasa et al. Feb 2012 B2
8112292 Simon Feb 2012 B2
8126535 Maier et al. Feb 2012 B2
8131344 Strommer et al. Mar 2012 B2
8141424 Seeger et al. Mar 2012 B2
8165844 Luinge et al. Apr 2012 B2
8849374 Yamamoto et al. Sep 2014 B2
9008757 Wu Apr 2015 B2
9271804 Wu Mar 2016 B2
9381085 Axelson, Jr. et al. Jul 2016 B2
9480534 Bowling et al. Nov 2016 B2
9687307 Wu Jun 2017 B2
9707043 Bozung Jul 2017 B2
10575906 Wu Mar 2020 B2
11529198 Wu Dec 2022 B2
20010011175 Hunter et al. Aug 2001 A1
20010037064 Shahidi Nov 2001 A1
20020052546 Frantz et al. May 2002 A1
20020120192 Nolte et al. Aug 2002 A1
20020198448 Zuk et al. Dec 2002 A1
20030031349 Giorgi et al. Feb 2003 A1
20030163142 Paltieli et al. Aug 2003 A1
20040024309 Ferre et al. Feb 2004 A1
20040027587 Morimoto Feb 2004 A1
20040068178 Govari Apr 2004 A1
20040138556 Cosman Jul 2004 A1
20040149036 Foxlin et al. Aug 2004 A1
20040150560 Feng et al. Aug 2004 A1
20040167688 Karlsson et al. Aug 2004 A1
20040172164 Habibi et al. Sep 2004 A1
20040204646 Nagler et al. Oct 2004 A1
20040243148 Wasielewski Dec 2004 A1
20050049485 Harmon et al. Mar 2005 A1
20050085717 Shahidi Apr 2005 A1
20050085718 Shahidi Apr 2005 A1
20050154295 Quistgaard et al. Jul 2005 A1
20050245820 Sarin Nov 2005 A1
20050267353 Marquart et al. Dec 2005 A1
20060142657 Quaid et al. Jun 2006 A1
20060173356 Feilkas Aug 2006 A1
20060178775 Zhang et al. Aug 2006 A1
20060190012 Freitag Aug 2006 A1
20060258938 Hoffman et al. Nov 2006 A1
20060282873 Zalewski et al. Dec 2006 A1
20070225731 Couture et al. Sep 2007 A1
20070270686 Ritter et al. Nov 2007 A1
20070270690 Woerlein Nov 2007 A1
20070287911 Haid et al. Dec 2007 A1
20080018912 Schreiber Jan 2008 A1
20080039868 Tuemmler et al. Feb 2008 A1
20080119726 Immerz et al. May 2008 A1
20080154125 Maier et al. Jun 2008 A1
20080161682 Kendrick et al. Jul 2008 A1
20080192263 Wienand et al. Aug 2008 A1
20080319491 Schoenefeld Dec 2008 A1
20090030608 Soehren et al. Jan 2009 A1
20090048779 Zeng et al. Feb 2009 A1
20090088629 Groszmann et al. Apr 2009 A1
20090143670 Daigneault Jun 2009 A1
20090149740 Hoheisel Jun 2009 A1
20090248044 Amiot et al. Oct 2009 A1
20090281417 Hartmann et al. Nov 2009 A1
20090312629 Razzaque et al. Dec 2009 A1
20090318836 Stone et al. Dec 2009 A1
20100030063 Lee et al. Feb 2010 A1
20100039506 Sarvestani et al. Feb 2010 A1
20100080417 Qureshi et al. Apr 2010 A1
20100100081 Tuma et al. Apr 2010 A1
20100130853 Chandonnet et al. May 2010 A1
20100154540 Uemura Jun 2010 A1
20100160771 Gielen et al. Jun 2010 A1
20100234770 Colombet et al. Sep 2010 A1
20100261999 Soubelet et al. Oct 2010 A1
20100318223 Motoyoshi et al. Dec 2010 A1
20110045736 Wooten Feb 2011 A1
20110054304 Markowitz et al. Mar 2011 A1
20110087092 Kienzle, III Apr 2011 A1
20110105895 Komblau et al. May 2011 A1
20110125006 Yamamoto et al. May 2011 A1
20110160570 Kariv et al. Jun 2011 A1
20110178394 Fitzpatrick Jul 2011 A1
20110190790 Summerer et al. Aug 2011 A1
20110218458 Valin et al. Sep 2011 A1
20110257909 Allen et al. Oct 2011 A1
20110275957 Bhandari Nov 2011 A1
20110320153 Lightcap et al. Dec 2011 A1
20120029389 Amiot et al. Feb 2012 A1
20120046536 Cheung et al. Feb 2012 A1
20120089014 Sabczynski et al. Apr 2012 A1
20120095330 Shechter et al. Apr 2012 A1
20120108954 Schulhauser et al. May 2012 A1
20120108955 Razzaque et al. May 2012 A1
20120165655 Mucha Jun 2012 A1
20130064427 Picard et al. Mar 2013 A1
20130345718 Crawford et al. Dec 2013 A1
20180333214 Han et al. Nov 2018 A1
20200197105 Wu Jun 2020 A1
Foreign Referenced Citations (32)
Number Date Country
2071831 Jun 1991 CA
101267776 Sep 2008 CN
202146362 Feb 2012 CN
102449666 May 2012 CN
4225112 Dec 1993 DE
102004047905 Apr 2006 DE
102004061764 Jul 2006 DE
2008538184 Oct 2008 JP
4231034 Feb 2009 JP
2009254805 Nov 2009 JP
9611624 Apr 1996 WO
200200131 Jan 2002 WO
2006091494 Aug 2006 WO
2007084893 Jul 2007 WO
2008030264 Mar 2008 WO
2008100472 Aug 2008 WO
2010077008 Jul 2010 WO
2010086374 Aug 2010 WO
2010096419 Aug 2010 WO
2010111090 Sep 2010 WO
2010150148 Dec 2010 WO
2011025708 Mar 2011 WO
2011026077 Mar 2011 WO
2011059383 May 2011 WO
2011089606 Jul 2011 WO
2011106861 Sep 2011 WO
2011128766 Oct 2011 WO
2011133927 Oct 2011 WO
2011128766 Dec 2011 WO
2012024672 Feb 2012 WO
2012052070 Apr 2012 WO
2012080840 Jun 2012 WO
Non-Patent Literature Citations (79)
Entry
Inertial and Magnetic Sensing of Human Motion, Daniel Roetenberg, May 24, 2006, 128 pages.
International Search Report of the International Searching Authority for Application No. PCT/US2013/061642 dated Jan. 20, 2014; 7 pages.
Intra-operative Application of a Robotic Knee Surgery System, S. J. Harris, M. Jakopec, J. Cobb, B. L. Davies, Medical Image Computing and Computer-Assisted Intervention—MICCAI'99, Lecture Notes in Computer Science vol. 1679, 1999, pp. 1116-1124; 9 pages.
Intraoperative Navigation Techniques Accuracy Tests and Clinical Report, S. Hassfeld, C. Burghart, I. Bertovic, J. Raczkowsky, U. Rembod, H. Worn and J. Muling, 1998, pp. 670-675, 6 pages.
Machining and Accuracy Studies for a Tibial Knee Implant Using a Force-Controlled Robot, Computer Aided Surgery, 1998, pp. 123-133, 11 pages.
Minimally Invasive Total Knee Arthroplasty Surgery Through Navigated Freehand Bone Cutting, Hani Haider, PHD, O. Andres Barrera, MSC, and Kevin L. Garvin, MD, The Journal of Arthroplasty, vol. 22, No. 4, 2007, 8 pages.
Precision Freehand Sculpting of Bone, Gabriel Brisson, Takeo Kanade, Anthony Digioia, Branislav Jaramaz, MICCAI 2004, LNCS 3217, pp. 105-112, 2004, 8 pages.
PrecisioN Knee Navigation Operative Technique, Stryker Software Manual, Knute Buehler, M.D., Chief of Arthritis and Joint Reconstruction the Center—Orthopedic and Neurosurgery Care and Research Bend, Oregon 40 pages.
Premiers Pas Vers La Dissectomie et la Realisation de Protheses du Genou a L'Aide de Robots, M. Fadda, S. Martelli, P. Dario, M. Marcacci, S. Zaffagnini, A. Visani, Innov. Tech. Biol. Med., vol. 13, No. 4, 1992, pp. 394-409, 16 pages.
Primary and Revision Total Hip Replacement Using the Robodoc System, William L. Bargar, MD, Andre Bauer, MD and Martin Borner, MD, Clinical Orthopaedics and Related Research, 1998, No. 354, pp. 82-91, 10 pages.
Registration and immobilization in robot-assisted surgery, Jon T. Lea, Dane Watkins, Aaron Mills, Michael A. Peshkin, Thomas C. Kienzle III and S. David Stulberg, Journal of Image Guided Surgery, 1995, vol. 1, No. 2, pp. 80-87, 11 pages.
Registration Graphs A Language for Modeling and Analyzing Registration in Image-Guided Surgery, Jon Thomas Lea, Dec. 1998, 1-49, 49 pages.
Robot assisted craniofacial surgery: first clinical evaluation, C. Burghart, R. Krempien, T. Redlich, A. Pernozzoli, H. Grabowski, J. Muenchenberg, J. Albers, S. Hassfeld, C. Vahl, U. Rembold and H. Woern, 1999, 7 pages.
Robot Assisted Knee Surgery, 4527 IEEE, Engineering in Medicine & Biology 14, May/Jun. 1995, No. 3, pp. 292-300, 9 pages.
Robot Controlled Osteotomy in Craniofacial Surgery, Catherina Burghart, Jochen Keitel, Stefan Hassfeld, Ulrich Rembold and Heinz Woern, Haptic Devices in Medical Applications, Jun. 23, 1999; pp. 12-22, 13 pages.
Robotergestutzte Osteotomie in der craniofacialen Chirurgie, Catherina R. Burghart, GCA-Verlag, 2000, and partial English translation, 250 pages.
Robotic Assistance in Orthopaedic Surgery A Proof of Principle Using Distal Femoral Arthroplasty, Clinical Orthopaedics and Related Research, 1993, No. 296, pp. 178-186, 9 pages.
Robotic Execution of a Surgical Plan, Howard A. Paul, DVM, William L. Bargar, MD, Brent Mittlestadt, Peter Kazanzides, PH.D., Bela Musites, Joel Suhars, Phillip W. Cain, Bill Williamson, Fred G. Smith, 1992 IEEE, 99. 1621-1623, 3 pages.
Robotics in Minimally Invasive Surgery, Brian Davies, Mechatronics in Medicine Lab, IEE, 1996 The Institution of Electrical Engineers, pp. 1-2, 2 pages.
Robotics in surgery a new generation of surgical tools incorporate computer technology and mechanical actuation to give surgeons much finer control than previously possible during some operations, Rob Buckingham, IEE Review, Sep. 1994, pp. 193-196,4 pages.
Robust Multi Sensor Pose Estimation for Medical Applications, Andreas Tobergte, Mihai Pomarlan and Gerd Hirzinger, 8 pages.
Safe Active Robotic Devices for Surgery, Ro Buckingham, Oct. 1993, vol. 5, pp. 355-358, 4 pages.
Section 4 Robotic Systems and Task-Level Programming, A Model-Based Optimal Planning and Execution System with Active Sensing and Passive Manipulation for Augmentation of Human Precision in Computer-Integrated Surgery, Russell H. Taylor, Yong-Yil Kim, Alan D. Kalvin, David Larose, Betsy Haddad, Deljou Khoramabadi, Marilyn Noz, Robert Olyha, Nils Brunn and Dieter Grimm, Experimental Robotics II Lecture Notes in Control and Information Sciences, vol. 190, 1993, pp. 177-195, 19 pages.
Semi-Active Guiding Systems in Surgery. A Two-DOF Prototype of the Passive Arm with Dynamic Constraints (PADyC), Jocelyne Troccaz, Yves Delnondedieu, Mechatronics vol. 6, No. 4, pp. 399-421, 1996, 23 pages.
Surgical Robot for Total Hip Replacement Surgery, Proceedings of the 1992 IEEE, International Conference on Robotics and Automation, May 1992, pp. 606-611, 6 pages.
Surgical Robotics An Introduction, Ulrich Rembold and Catherina R. Burghart, Journal of Intelligent and Robotic Systems, vol. 30, Institute of Process Control and Robotics, 2001, pp. 1-28, 28 pages.
Technique and first clinical results of robot-assisted total knee replacement, Werner Siebert, Sabine Mai, Rudolf Kober and Peter F. Heeckt, The Knee, vol. 9, 2002, pp. 173-180, 8 pages.
The Robodoc Clinical Trial A Robotic Assistant for Total Hip Arthroplasty, Evelyn Harkins Spencer, Orthopaedic Nursing, Jan./Feb. 1996, vol. 15, No. 1, pp. 9-14, 6 pages.
Three-Dimensional Digitizer (Neuronavigator); New Equipment for Computed Tomography-Guided Stereotaxic Surgery, Watanabe E, Watanabe T, Manaka S, Mayanagi Y, Takakura K., Surg Neurol., Jun. 1987, Issue 6, pp. 543-547, 5 pages.
Three-Dimensional Ultrasound Imaging Using Multiple Magnetic Tracking Systems and Miniature Sensors, Daniel F. Leotta, Paul R. Detmer, Odd Helge Gilja, Jing-Ming Jong, Roy W. Martin, Jean F. Primozich, Kirk W. Beach and D. Eugene Strandness, Bioengineering, Surgery, Anesthesiology, University of Washington, Seattle, WA, 1995 IEEE Ultrasonics Symposium, 4 pages.
Total Knee Replacement Computer-assisted surgical system uses a calibrated robot, May/Jun. 1995, IEEE Engineering in Medicine and Biology; A Computer-Assisted Total Knee Replacement Surgical System Using a Calibrated Robot, Thomas C. Kienzle III, S.David Stulberg, Michael Peshkin, Arthur Quaid, Jon Lea, Ambarish Goswami, Chi-hau Engineering in Medicine and Biology vol. 14, Issue 3, May 1995, pp. 301-306, 35 pages.
U.S. Appl. No. 61/662,070, filed Jun. 20, 2012.
U.S. Appl. No. 61/679,258, filed Aug. 3, 2012.
Written Opinion of the International Searching Authority for Application No. PCT/US2013/061642 dated Jan. 20, 2014; 7 pages.
A Literature Review: Robots in Medicine, Hsia, T.C.; Mittelstadt, B., Jun. 1991, vol. 10, Issue 2, pp. 13-22, 10 pages.
A Novel Approach to Computer Assisted Spine Surgery, Lutz P. Nolte, Lucia J. Zamorano, Zhaowei Jiang, Qinghai Wang, Frank Langlotz, Erich Arm, Heiko Visarius, Proceedings of the First International Symposium on Medical Robotics and Computer Assisted Surgery, vol. 2, Workshop [Part I & II] Session VI, Sep. 1994, pp. 323-328, 7 pages.
A review of robotics in surgery, B. Davies, Proceedings of the Institution of Mechanical Engineers Part H: Journal of Engineering in Medicine, 2000, pp. 129-140, 13 pages.
A Robotized Surgeon Assistant, T. Wang, M. Fadda, M. Marcacci, S. Martelli, P. Dario and A. Visani, Intelligent Robots and Systems '94. ‘Advanced Robotic Systems and the Real World’, IROS '94. Proceedings of the IEEE/RSJ/GI International Conference on (vol. 2) pp. 852-868, 1994, 8 pages.
A Safe Robot System for Craniofacial Surgery, D. Engel, J. Raczkowsky, H. Worn, May 21-26, 2001, Proceedings of the 2001 IEEE, International Conference on Robotics and Automation, pp. 2020-2024, 5 pages.
A Steady-Hand Robotic System for Microsurgical Augmentation, Russell Taylor, Pat Jensen, Louis Whitcomb, Aaron Barnes, Rajest Kumar, Dan Stoianovici, Puneet Gupta, Zhengxian Wange, Eugen Dejuan and Louis Kavoussi, The International Journal ofRobotics Research, 1999, vol. 18, pp. 1201-1210, 11 pages.
A Stereotactic/Robotic System for Pedicle Screw Placement, Julio J. Santos-Munne, Michael A. Peshkin, Srdjan Mirkovic, S. David Stulberg, Thomas C. Kienzle III, 1995, Interactive Technology and the New Paradigm for Hardware, pp. 326-333, 8 pages.
Accuracy Study on the Registration of the Tibia by Means of an Intramedullary Rod in Robot-Assisted Total Knee Arthroplasty, G. Van Ham, J. Bellemans, K. Denis, L. Labey, J. Vander Sloten, R. Van Audekercke, G. Van Der Perre, J. De Schutter, 46th Annual Meeting, Orthopaedic Research Society, Mar. 2000, pp. 450, 1 page.
Accuracy Validation in Image-Guided Orthopaedic Surgery, David Simon, R.V. O'Toole, Mike Blackwell, F. Morgan, Anthony M. Digioia, and Takeo Kanade, Proceedings of the Second International Symposium on Medical Robotics and Computer Assisted Surgery, 1995, pp. 185-192, 8 pages.
Acrobot—Using Robots and Surgeons Synergistically in Knee Surgery, BL Davies, KL Fan, RD Hibberd, M. Jakopec and SJ Harris, Mechatronics in Medicine, Jul. 1997, pp. 173-178, 6 pages.
Active compliance in robotic surgery—the use of force control as a dynamic constraint, B.L. Davies, S. J. Harris, W. J. Lin, R. D. Hibberd, R. Middleton and J.C. Cobb, Proceedings of the Institution of Mechanical Engineers, Part H: Journal of Engineering in Medicine, 1997 vol. 211, pp. 285-292; 9 pages.
An Image-directed Robotic System for Hip Replacement Surgery, Russell H. Taylor, Howard A. Paul, Brent D. Mittelstadt, William Hanson, Peter Kazanzides, Joel Zuhars, Edward Glassman, Bela L. Musits, Bill Williamson, William L. Bargar, Oct. 1990, pp. 111-118, 7 pages.
An Integrated CAD-Robotics System for Total Knee Replacement Surgery, Kienzle, T.C., III, 1993, IEEE, pp. 889-894, 6 pages.
Architecture of a Surgical Robot, P. Kazanzides, J. Zuhars, B. Mittelstadt, B. Williams, P. Cain, F. Smith, L. Rose, B. Musits, 1992 IEEE, pp. 1624-1629, 6 pages.
Clinical Introduction of the Caspar System Problems and Initial Results, C.O.R. Grueneis, R.H. Richter, F.F. Hennig, Abstracts from CAOS, 1999, pp. 160, 1 page.
Comparative Tracking Error Analysis of Five Different Optical Tracking Systems, R Khadem, C C Yeh, M Sadeghi-Tehrani, M R Bax, J A Johnson, J N Welch, E P Wilkinson, R Shahidi, Computer Aided Surgery vol. 5, 2000, pp. 98-107.
Comparison of Relative Accuracy Between a Mechanical and an Optical Position Tracker for Image-Guided Neurosurgery, Rohling R, Munger P, Hollerbach JM, Peter T., J Image Guid Surg., 1995;1(1), pp. 30-34, 4 pages.
Computer Assisted Knee Replacement, Scott L. Delp, Ph.D., S. David Stulberg, MD, Brian Davies, Ph.D. Frederic Picard, MD and Francois Leitner, Ph.D., Clinical Orthopaedics, Sep. 1998, vol. 354, pp. 49-56; 8 pages.
Computer Assisted Orthopaedic Surgery Image Guided and Robotic Assistive Technologies, Anthony M. Digioia, Branislav Jaramaz, and B. Colgan, Clinical Orthopaedics and Related Research, No. 354, Sep. 1998, pp. 8-16, 9 pages.
Computer Assisted Planning for Total Knee Arthroplasty, CVRMed-MRCAS'97, M. Fadda, D. Bertelli, S. Martelli, M. Marcacci, P. Dario, C. Paggetti, D. Caramella, D. Tripp!, 1997, pp. 619-628, 10 pages.
Computer Assisted Spine Surgery: a technique for accurate transpedicular screw fixation, S Lavallee, Proceedings of the First International Symposium on Medical Robotics and Computer Assisted Surgery, vol. 2, Workshop [Part I & II] Session VI, Sep. 1994, pp. 315-322, 9 pages.
Computer-assisted and robotics surgery, Brian Davies, International Congress and Symposium Series 223, 1997, pp. 71-82; 12 pages.
Computer-Assisted Knee Arthroplasty at Rizzoli Institutes, M. Fadda, Proceedings of the First International Symposium on Medical Robotics and Computer Assisted Surgery, Sep. 1994, pp. 26-30, 6 pages.
Concepts and methods of registration for computer-integrated surgery, E. Bainville, I. Bricault, P. Cinquin, and S. Lavallee, Computer Assisted Orthopedic Surgery (“CAOS”), L.-P. Nolte and R. Ganz, Eds., pp. 15-34, Hogrefe & Huber, Seattle, Wash, USA, 1999; 22 pages.
Crigos A Compact Robot for Image-Guided Orthopedic Surgery, Guido Brandt, Andreas Zimolong, Lional Carrat, Philippe Merloz, Hans-Walter Staudte, Stephane Lavallee, Klaus Rademacher and Gunter Rau, Dec. 1999, IEEE Transactions on Information Technology in Biomedicine, vol. 3, No. 4, pp. 252-260, 9 pages.
Design Optically Tracked Instruments for Image Guided Surgery, Jay B. West and Calvin R. Maurer, Jr., Member, IEEE Transactions on Medical Imaging, vol. 23, No. 5, May 2004, 13 pages.
Digital surgery the future of medicine and human-robot symbiotic interaction, Rony A. Abovitz, Industrial Robot: An International Journal, vol. 28, No. 5, 2001, pp. 401-405, 5 pages.
EasyGuide Neuro, ein neues System zur bildgefuhrten Planung, Simulation und Navigation in der Neurochirurgie, TH. Schmidt und W. Hentschel, 1995, Biomedizinische Technik, Band 40, Erganzungsband 1, pp. 233-234; 2 pages.
Ein Robotersystem fur craniomaxillofaciale chirurgische Eingriffe, J. Raczkowsky, J. Munchenberg, I. Bertovic, C. Burghart, Informatik Forsch. Entw., 1999, pp. 24-35, 12 pages.
English language abstract and machine-assisted English translation for CN 101267776 extracted from espacenet.com database on Jan. 13, 2021, 45 pages.
English language abstract and machine-assisted English translation for CN 202146362 extracted from espacenet.com database on Sep. 28, 2017, 10 pages.
English language abstract and machine-assisted English translation for DE 4225112 extracted from espacenet.com database on Sep. 28, 2017, 12 pages.
English language abstract for CN 102449666 extracted from espacenet.com database on Sep. 28, 2017, 2 pages.
English language abstract for JP 2008-538184 extracted from espacenet.com database on Apr. 29, 2019, 2 pages.
English language abstract for JP 2009-254805 extracted from espacenet.com database on Mar. 20, 2019, 2 pages.
Experiences with Robotic Systems for Knee Surgery, S. J. Harris, W. J. Lin, K. L. Fan, R. D. Hibberd, J. Cobb, R. Middleton, B. L. Davies, CVRMed-MRCAS'97, Lecture Notes in Computer Science vol. 1205, 1997, pp. 757-766; 10 pages.
Force Control for Robotic Surgery, S.C. Ho, R.D. Hibberd, J. Cobb and B.L. Davies, ICAR, 1995, pp. 21-32, 12 pages.
Frameless Neuronavigation in Modern Neurosurgery, Spetzger U, Laborde G, Gilsbach JM, Minim. Invas, Neurosurg., 1995, vol. 38, pp. 163-166, 4 pages.
G.C. Claasen, P. Martin, F. Picard, Tracking and Control for Handheld Surgery Tools, Biomedical Circuits and Systems Conference (BioCAS), 2011 IEEE, pp. 428-431, IEEE, San Diego, CA, USA; 4 pages.
Haptic information displays for computer-assisted surgery, A.E. Quaid and Rony A. Abovitz, Robotics and Automation, 2002 Proceedings ICRA '02. IEEE International Conference on Robotics And Automation, 2092, vol. 2, pp. 2092-2097, 6 pages.
High-Bandwidth Low-Latency Tracking Using Optical and Inertial Sensors, Gontje C. Claasen and Philippe Martin, Centre Automatique et Systemes, and Frederic Picard, Department of Orthopaedics, Golden Jubilee National Hospital, Proceedings of the 5th International Conference on Automation, Robotics and Applications, Dec. 6-8, 2011, Wellington, New Zealand, 6 pages.
Human-Interactive Medical Robotics, Rony A. Abovitz, M.S., Computer Assisted Orthopedic Surgery (“CAOS”), 2000, pp. 71-72, 2 pages.
Image-Guided Manipulator Compliant Surgical Planning Methodology for Robotic Skull-Base Surgery, Wan Sing Ng; Ming Yeong Teo; Yong-Chong Loh; Tseng Tsai Yeo, Medical Imaging and Augmented Reality, 2001 IEEE, pp. 26-29, 4 pages.
English language abstract and machine-assisted English translation for JP 4231034 B2 extracted from espacenet.com database on Aug. 12, 2024, 7 pages.
Related Publications (1)
Number Date Country
20230056674 A1 Feb 2023 US
Provisional Applications (1)
Number Date Country
61705804 Sep 2012 US
Continuations (5)
Number Date Country
Parent 16783654 Feb 2020 US
Child 17982830 US
Parent 15599935 May 2017 US
Child 16783654 US
Parent 14994236 Jan 2016 US
Child 15599935 US
Parent 14635402 Mar 2015 US
Child 14994236 US
Parent 14035207 Sep 2013 US
Child 14635402 US