Various medical procedures require the precise localization of a three-dimensional position of a surgical instrument within the body in order to effect optimized treatment. For example, some surgical procedures to fuse vertebrae require that a surgeon drill multiple holes into the bone structure at specific locations. To achieve high levels of mechanical integrity in the fusing system, and to balance the forces created in the bone structure, it is necessary that the holes are drilled at the correct location. Vertebrae, like most bone structures, have complex shapes made up of non-planar curved surfaces making precise and perpendicular drilling difficult. Conventionally, a surgeon manually holds and positions a drill guide tube by using a guidance system to overlay the drill tube's position onto a three dimensional image of the bone structure. This manual process is both tedious and time consuming. The success of the surgery is largely dependent upon the dexterity of the surgeon who performs it.
Limited robotic assistance for surgical procedures is currently available. For example, the da Vinci® medical robot system (da Vinci® is a registered trademark of Intuitive Surgical) is a robot used in certain surgical applications. In the da Vinci® system, the user controls manipulators that control a robotic actuator. The system converts the surgeon's gross movements into micro-movements of the robotic actuator. Although the da Vinci® system eliminates hand tremor and provides the user with the ability to work through a small opening, like many of the robots commercially available today, it is expensive, obtrusive, and the setup is cumbersome. Further, for procedures such as thoracolumbar pedicle screw insertion, these conventional methods are known to be error-prone and tedious.
One characteristic of many current robots used in surgical applications that makes them error prone is the use of an articular arm based on a series of rotational joints. The use of an articular system can create difficulties in arriving at an accurately targeted location because any positional error is compounded across each joint in the articular system.
Some embodiments of the invention provide a surgical robot (and optionally an imaging system) that utilizes a Cartesian positioning system that allows movement of a surgical instrument to be individually controlled in an x-axis, y-axis and z-axis. In some embodiments, the surgical robot can include a base, a robot arm coupled to and configured for articulation relative to the base, as well as an end-effectuator coupled to a distal end of the robot arm. The effectuator element can include the surgical instrument or can be configured for operative coupling to the surgical instrument. Some embodiments of the invention allow the roll, pitch and yaw rotation of the end-effectuator and/or surgical instrument to be controlled without creating movement along the x-axis, y-axis, or z-axis.
In some embodiments, the end-effectuator can include a guide tube, a tool, and/or a penetrating shaft with a leading edge that is either beveled (shaft cross-cut at an angle) or non-beveled (shaft ending in a pointed tip). In some embodiments, a non-beveled end-effectuator element can be employed to ablate a pathway through tissue to reach the target position while avoiding the mechanical forces and deflection created by a typical bevel tissue cutting system.
Some embodiments of the surgical robot can include a motor assembly comprising three linear motors that separately control movement of the effectuator element and/or surgical instrument on the respective x-, y- and z-axes. These separate motors can provide a degree of accuracy that is not provided by conventional surgical robots, thereby giving the surgeon the capability of more exactly determining position and strike angles on a three dimensional image.
In some embodiments, at least one RF transmitter can be mounted on the effectuator element and/or the surgical instrument. Three or more RF receivers can be mounted in the vicinity of the surgical robot. The location of the RF transmitter and, therefore, the surgical instrument, can be accurately determined by analyzing the RF signals that are emitted from the RF transmitter. For example, by measuring the time of flight of the RF signal from the transmitter to the RF receivers that are positioned at known locations, the position of the end-effectuator element with respect to a patient can be determined. In some embodiments, a physician or surgeon can perform epidural injections of steroids into a patient to alleviate back pain without the use of x-rays as is currently required with x-ray fluoroscopic techniques.
Some embodiments of the invention use RF feedback to actively control the movement of the surgical robot. For example, RF signals can be sent by the RF transmitter on an iterative basis and then analyzed in an iterative process to allow the surgical robot to automatically move the effectuator element and/or surgical instrument to a desired location within a patient's body. The location of the effectuator element and/or surgical instrument can be dynamically updated and, optionally, can be displayed to a user in real-time.
In some embodiments, at least one RF transmitter can be disposed on other elements of the surgical robot, or anywhere within the room where an invasive procedure is taking place, in order to track other devices.
Some embodiments of the invention dispose one or more RF transmitters on the anatomical part of the patient that is the target of the invasive procedure. This system can be used to correct the movement of the surgical robot in the event the anatomical target moves during the procedure.
In some embodiments, the system can be configured to automatically position and rigidly hold the end-effectuator and/or the surgical instrument in accurate alignment with a required trajectory, such as, for example, a selected trajectory of a pedicle screw during pedicle screw insertion procedures. In case of movement of the patient, the system can be configured to automatically adjust the position of the robot to maintain desired alignment relative to an anatomical region of interest.
Before any embodiments of the invention are explained in detail, it is to be understood that the invention is not limited in its application to the details of construction and the arrangement of components set forth in the following description or illustrated in the following drawings. The invention is capable of other embodiments and of being practiced or of being carried out in various ways. Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. The use of “including,” “comprising,” or “having” and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. Unless specified or limited otherwise, the terms “mounted,” “connected,” “supported,” and “coupled” and variations thereof are used broadly and encompass both direct and indirect mountings, connections, supports, and couplings. Further, “connected” and “coupled” are not restricted to physical or mechanical connections or couplings.
The following discussion is presented to enable a person skilled in the art to make and use embodiments of the invention. Various modifications to the illustrated embodiments will be readily apparent to those skilled in the art, and the generic principles herein can be applied to other embodiments and applications without departing from embodiments of the invention. Thus, embodiments of the invention are not intended to be limited to embodiments shown, but are to be accorded the widest scope consistent with the principles and features disclosed herein. The following detailed description is to be read with reference to the figures, in which like elements in different figures have like reference numerals. The figures, which are not necessarily to scale, depict selected embodiments and are not intended to limit the scope of embodiments of the invention. Skilled artisans will recognize the examples provided herein have many useful alternatives and fall within the scope of embodiments of the invention.
The present invention can be understood more readily by reference to the following detailed description, examples, drawings, and claims, and their previous and following description. However, before the present devices, systems, and/or methods are disclosed and described, it is to be understood that this invention is not limited to the specific devices, systems, and/or methods disclosed unless otherwise specified, and, as such, can, of course, vary. It is also to be understood that the terminology used herein is for the purpose of describing particular aspects only and is not intended to be limiting.
The following description is provided as an enabling teaching of the invention in its best, currently known embodiment. To this end, those skilled in the relevant art will recognize and appreciate that many changes can be made to the various aspects of the invention described herein, while still obtaining the beneficial results of the present invention. It will also be apparent that some of the desired benefits of the present invention can be obtained by selecting some of the features of the present invention without utilizing other features. Accordingly, those who work in the art will recognize that many modifications and adaptations to the present invention are possible and can even be desirable in certain circumstances and are a part of the present invention. Thus, the following description is provided as illustrative of the principles of the present invention and not in limitation thereof.
As used throughout, the singular forms “a,” “an” and “the” include plural referents unless the context clearly dictates otherwise. Thus, for example, reference to “a delivery conduit” can include two or more such delivery conduits unless the context indicates otherwise.
As used herein, the terms “optional” or “optionally” mean that the subsequently described event or circumstance may or may not occur, and that the description includes instances where said event or circumstance occurs and instances where it does not.
In some embodiments, the disclosed devices and systems can comprise elements of the devices and systems described in U.S. Patent Publication Nos. 2007/0238985, 2008/0154389, and 2008/0215181, the disclosures of which are incorporated herein by reference in their entireties.
As employed in this specification and annexed drawings, the terms “unit,” “component,” “interface,” “system,” “platform,” and the like are intended to include a computer-related entity or an entity related to an operational apparatus with one or more specific functionalities, wherein the computer-related entity or the entity related to the operational apparatus can be either hardware, a combination of hardware and software, software, or software in execution. One or more of such entities are also referred to as “functional elements.” As an example, a unit may be, but is not limited to being, a process running on a processor, a processor, an object, an executable computer program, a thread of execution, a program, a memory (e.g., a hard disc drive), and/or a computer. As another example, a unit can be an apparatus with specific functionality provided by mechanical parts operated by electric or electronic circuitry which is operated by a software application or a firmware application executed by a processor, wherein the processor can be internal or external to the apparatus and executes at least a part of the software or firmware application. In addition or in the alternative, a unit can provide specific functionality based on physical structure or specific arrangement of hardware elements. As yet another example, a unit can be an apparatus that provides specific functionality through electronic functional elements without mechanical parts, the electronic functional elements can include a processor therein to execute software or firmware that provides at least in part the functionality of the electronic functional elements. An illustration of such apparatus can be control circuitry, such as a programmable logic controller. The foregoing example and related illustrations are but a few examples and are not intended to be limiting. Moreover, while such illustrations are presented for a unit, the foregoing examples also apply to a component, a system, a platform, and the like. It is noted that in certain embodiments, or in connection with certain aspects or features thereof, the terms “unit,” “component,” “system,” “interface,” “platform” can be utilized interchangeably.
Throughout the description and claims of this specification, the word “comprise” and variations of the word, such as “comprising” and “comprises,” mean “including but not limited to,” and are not intended to exclude, for example, other additives, components, integers or steps.
Referring now to
In some embodiments, prior to performance of an invasive procedure, a three-dimensional (“3D”) image scan can be taken of a desired surgical area of the patient 18 and sent to a computer platform in communication with surgical robot 15 as described herein (see for example the platform 3400 including the computing device 3401 shown in
In some embodiments, the surgical robot system 1 can comprise a local positioning system (“LPS”) subassembly to track the position of surgical instrument 35. The LPS subassembly can comprise at least one radio-frequency (RF) transmitter 120 that is coupled or affixed to the end-effectuator 30 or the surgical instrument 35 at a desired location. In some embodiments, the at least one RF transmitter 120 can comprise a plurality of transmitters 120, such as, for example, at least three RF transmitters 120. In another embodiment, the LPS subassembly can comprise at least one RF receiver 110 configured to receive one or more RF signals produced by the at least one RF transmitter 120. In some embodiments, the at least one RF receiver 110 can comprise a plurality of RF receivers 110, such as, for example, at least three RF receivers 110. In these embodiments, the RF receivers 110 can be positioned at known locations within the room 10 where the medical procedure is to take place. In some embodiments, the RF receivers 110 can be positioned at known locations within the room 10 such that the RF receivers 110 are not coplanar within a plane that is parallel to the floor of the room 10.
In some embodiments, during use, the time of flight of an RF signal from each RF transmitter 120 of the at least one RF transmitter 120 to each RF receiver 110 of the at least one RF receiver 110 (e.g., one RF receiver, two RF receivers, three RF receivers, etc.) can be measured to calculate the position of each RF transmitter 120. Because the velocity of the RF signal is known, the time of flight measurements result in at least three distance measurements for each RF transmitter 120 (one to each RF receiver 110).
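By way of a non-limiting illustration, the time-of-flight computation described above can be sketched in software. The following Python example is a sketch under stated assumptions, not the system's actual implementation: it assumes numpy is available, that one-way travel times have already been extracted, and it recovers the transmitter position by linearizing the sphere equations and solving in a least-squares sense (four or more receivers give a unique solution; three constrain the position to two points).

```python
import numpy as np

C = 299_792_458.0  # RF propagation velocity (speed of light), m/s

def locate_transmitter(receivers, times_of_flight):
    """Estimate a transmitter's 3D position from time-of-flight data.

    receivers: (N, 3) array of known receiver positions (N >= 4 for a
               unique solution).
    times_of_flight: length-N array of one-way travel times in seconds.
    """
    receivers = np.asarray(receivers, dtype=float)
    distances = C * np.asarray(times_of_flight, dtype=float)

    # Linearize by subtracting the first sphere equation from the rest:
    # |p - r_i|^2 - |p - r_0|^2 = d_i^2 - d_0^2  ->  linear in p.
    r0, d0 = receivers[0], distances[0]
    A = 2.0 * (receivers[1:] - r0)
    b = (d0**2 - distances[1:]**2
         + np.sum(receivers[1:]**2, axis=1) - np.sum(r0**2))
    # A least-squares solve tolerates small timing noise.
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position
```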
In some embodiments, the surgical robot system 1 can comprise a control device (for example a computer 100 having a processor and a memory coupled to the processor). In some embodiments, the processor of the control device 100 can be configured to perform time of flight calculations as described herein. Further, in some embodiments, the processor can be configured to provide a geometrical description of the location of the at least one RF transmitter 120 with respect to an operative end of the surgical instrument 35 or end-effectuator 30 that is utilized to perform or assist in performing an invasive procedure. In some further embodiments, the position of the RF transmitter 120, as well as the dimensional profile of the surgical instrument 35 or the effectuator element 30 can be displayed on a monitor (for example on a display means 29 such as the display 150 shown in
Another embodiment of the disclosed surgical robot system 1 involves the utilization of a robot 15 that is capable of moving the end-effectuator 30 along the x-, y-, and z-axes (see 66, 68, 70 in
In some further embodiments, the end-effectuator 30 can be configured for selective rotation about one or more of the x-axis 66, y-axis 68, and z-axis 70 (such that one or more of the Cardanic Euler Angles (e.g., roll, pitch, and/or yaw) associated with the end-effectuator 30 can be selectively controlled). In some embodiments, during operation, the end-effectuator 30 and/or surgical instrument 35 can be aligned with a selected orientation axis (labeled “Z Tube” in
In some embodiments, as shown in
Referring to
In some embodiments, the computer (not shown in
In some embodiments, the position of surgical instrument 35 can be dynamically updated so that surgical robot 15 is aware of the location of surgical instrument 35 at all times during the procedure. Consequently, in some embodiments, the surgical robot 15 can move the surgical instrument 35 to the desired position quickly, with minimal damage to patient 18, and without any further assistance from a physician (unless the physician so desires). In some further embodiments, the surgical robot 15 can be configured to correct the path of surgical instrument 35 if the surgical instrument 35 strays from the selected, preplanned trajectory.
In some embodiments, the surgical robot 15 can be configured to permit stoppage, modification, and/or manual control of the movement of the end-effectuator 30 and/or surgical instrument 35. Thus, in use, in some embodiments, an agent (e.g., a physician or other user) that can operate the system 1 has the option to stop, modify, or manually control the autonomous movement of end-effectuator 30 and/or surgical instrument 35. Further, in some embodiments, tolerance controls can be preprogrammed into the surgical robot 15 and/or processor of the computer platform 3400 (such that the movement of the end-effectuator 30 and/or surgical instrument 35 is adjusted in response to specified conditions being met). For example, in some embodiments, if the surgical robot 15 cannot detect the position of surgical instrument 35 because of a malfunction in the at least one RF transmitter 120, then the surgical robot 15 can be configured to stop movement of end-effectuator 30 and/or surgical instrument 35. In some embodiments, if surgical robot 15 detects a resistance, such as a force resistance or a torque resistance above a tolerance level, then the surgical robot 15 can be configured to stop movement of end-effectuator 30 and/or surgical instrument 35.
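As a minimal sketch of the tolerance-control logic just described (the specific thresholds and the function interface are illustrative assumptions, not values from this disclosure):

```python
# Illustrative tolerance limits; actual values would be configured
# clinically and per procedure.
FORCE_LIMIT_N = 20.0
TORQUE_LIMIT_NM = 2.0

def motion_permitted(transmitter_visible, force_n, torque_nm):
    """Mirror the two stop conditions described above: halt if the RF
    transmitter cannot be detected, or if force/torque resistance
    exceeds its tolerance level."""
    if not transmitter_visible:
        return False  # position unknown: stop end-effectuator motion
    if force_n > FORCE_LIMIT_N or torque_nm > TORQUE_LIMIT_NM:
        return False  # unexpected resistance: stop motion
    return True
```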
In some embodiments, the computer 100 for use in the system (for example represented by computing device 3401), as further described herein, can be located within surgical robot 15, or, alternatively, in another location within surgical room 10 or in a remote location. In some embodiments, the computer 100 can be positioned in operative communication with positioning sensors 12 and surgical robot 15.
In some further embodiments, the surgical robot 15 can also be used with existing conventional guidance systems. Thus, alternative conventional guidance systems beyond those specifically disclosed herein are within the scope and spirit of the invention. For instance, a conventional optical tracking system 3417 for tracking the location of the surgical device, or a commercially available infrared optical tracking system 3417, such as Optotrak® (Optotrak® is a registered trademark of Northern Digital Inc., Waterloo, Ontario, Canada), can be used to track movement of the patient 18 and the location of the robot's base 25 and/or intermediate axis, and can be used with the surgical robot system 1. In some embodiments in which the surgical robot system 1 comprises a conventional infrared optical tracking system 3417, the surgical robot system 1 can comprise conventional optical markers attached to selected locations on the end-effectuator 30 and/or the surgical instrument 35 that are configured to emit or reflect light. In some embodiments, the light emitted from and/or reflected by the markers can be read by cameras (for example with cameras 8200 shown in
Referring now to
As described earlier, in some embodiments the end-effectuator 30 can comprise a surgical instrument 35, whereas in other embodiments the end-effectuator 30 can be coupled to the surgical instrument 35. In some embodiments, the robot arm 23 can be connected to the end-effectuator 30, with the surgical instrument 35 being removably attached to the end-effectuator 30.
Referring now to
In some embodiments, the surgical robot 15 is moveable in a plurality of axes (for instance x-axis 66, y-axis 68, and z-axis 70) in order to improve the ability to accurately and precisely reach a target location. Some embodiments include a robot 15 that moves on a Cartesian positioning system; that is, movements in different axes can occur relatively independently of one another instead of at the end of a series of joints.
Referring now to
In a further embodiment, referring now to
Referring now to
Referring now to
Some embodiments can include a system diagram of surgical robot system 1 having a computer 100, a display means 29 comprising a display 150, user input 170, and motors 160, provided as illustrated in
In some embodiments, prior to performance of a medical procedure, such as, for example, an invasive surgical procedure, user input 170 can be used to plan the trajectory for a desired navigation. After the medical procedure has commenced, if changes in the trajectory and/or movement of the end-effectuator 30 and/or surgical instrument 35 are desired, a user can use the user input 170 to input the desired changes, and the computer 100 can be configured to transmit corresponding signals to the motors 160 in response to the user input 170.
In some embodiments, the motors 160 can be or can comprise conventional pulse motors. In this aspect, in some embodiments, the pulse motors can be in a conventional direct drive configuration or a belt drive and pulley combination attached to the surgical instrument 35. Alternatively, in other embodiments, the motors 160 can be conventional pulse motors that are attached to a conventional belt drive rack-and-pinion system or equivalent conventional power transmission component.
In some embodiments, the use of conventional linear pulse motors within the surgical robot 15 can permit establishment of a non-rigid position for the end-effectuator 30 and/or surgical instrument 35. Thus, in some embodiments, the end-effectuator 30 and/or surgical instrument 35 will not be fixed in a completely rigid position, but rather the end-effectuator 30 and/or the surgical instrument 35 can be configured such that an agent (e.g., a surgeon or other user) can overcome the x-axis 66 and y-axis 68, and force the end-effectuator 30 and/or surgical instrument 35 from its current position. For example, in some embodiments, the amount of force necessary to overcome such axes can be adjusted and configured automatically or by an agent. In some embodiments, the surgical robot 15 can comprise circuitry configured to monitor one or more of: (a) the position of the robot arm 23, the end-effectuator 30, and/or the surgical instrument 35 along the x-axis 66, y-axis 68, and z-axis 70; (b) the rotational position (e.g., roll 62 and pitch 60) of the robot arm 23, the end-effectuator 30, and/or the surgical instrument 35 relative to the x-(66), y-(68), and z-(70) axes; and (c) the position of the end-effectuator 30, and/or the surgical instrument 35 along the travel of the re-orientable axis that is parallel at all times to the end-effectuator 30 and surgical instrument 35 (the Z-tube axis 64).
In one embodiment, circuitry for monitoring the positions of the x-axis 66, y-axis 68, z-axis 70, Z-tube axis 64, roll 62, and/or pitch 60 can comprise relative or absolute conventional encoder units (also referred to as encoders) embedded within or functionally coupled to conventional actuators and/or bearings of at least one of the motors 160. Optionally, in some embodiments, the circuitry of the surgical robot 15 can be configured to provide auditory, visual, and/or tactile feedback to the surgeon or other user when the desired amount of positional tolerance (e.g., rotational tolerance, translational tolerance, a combination thereof, or the like) for the trajectory has been exceeded. In some embodiments, the positional tolerance can be configurable and defined, for example, in units of degrees and/or millimeters.
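The encoder-based position monitoring and tolerance check described above can be illustrated with a short sketch. The calibration constants and tolerance defaults below are hypothetical placeholders, not calibrated values:

```python
# Hypothetical calibration constants (encoder counts per unit), for
# illustration only; real values come from axis calibration.
COUNTS_PER_MM = {"x": 1000, "y": 1000, "z": 1000, "z_tube": 1000}
COUNTS_PER_DEG = {"roll": 500, "pitch": 500}

def axis_positions(encoder_counts):
    """Convert raw encoder counts into millimeters or degrees per axis."""
    positions = {}
    for axis, counts in encoder_counts.items():
        if axis in COUNTS_PER_MM:
            positions[axis] = counts / COUNTS_PER_MM[axis]   # mm
        else:
            positions[axis] = counts / COUNTS_PER_DEG[axis]  # degrees
    return positions

def axes_out_of_tolerance(current, planned, tol_mm=1.0, tol_deg=1.0):
    """Return the axes whose deviation from the planned trajectory
    exceeds the configured translational/rotational tolerance; a
    non-empty result would trigger the feedback described above."""
    exceeded = []
    for axis, target in planned.items():
        tol = tol_deg if axis in COUNTS_PER_DEG else tol_mm
        if abs(current[axis] - target) > tol:
            exceeded.append(axis)
    return exceeded
```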
In some embodiments, the robot 15 moves into a selected position, ready for the surgeon to deliver a selected surgical instrument 35, such as, for example and without limitation, a conventional screw, a biopsy needle 8110, and the like. In some embodiments, as the surgeon works, if the surgeon inadvertently forces the end-effectuator 30 and/or surgical instrument 35 off of the desired trajectory, then the system 1 can be configured to provide an audible warning and/or a visual warning. For example, in some embodiments, the system 1 can produce audible beeps and/or display a warning message on the display means 29, such as “Warning: Off Trajectory,” while also displaying the axes for which an acceptable tolerance has been exceeded.
In some embodiments, in addition to, or in place of the audible warning, a light illumination may be directed to the end-effectuator 30, the guide tube 50, the operation area (i.e. the surgical field 17) of the patient 18, or a combination of these regions. For example, some embodiments include at least one visual indication 900 capable of illuminating a surgical field 17 of a patient 18. Some embodiments include at least one visual indication 900 capable of indicating a target lock by projecting an illumination on a surgical field 17. In some embodiments, the system 1 can provide feedback to the user regarding whether the robot 15 is locked on target. In some other embodiments, the system 1 can provide an alert to the user regarding whether at least one marker 720 is blocked, or whether the system 1 is actively seeking one or more markers 720.
In some embodiments, the visual indication 900 can be projected by one or more conventional light emitting diodes mounted on or near the robot end-effectuator 30. In some embodiments, the visual indication can comprise lights projected on the surgical field 17 including a color indicative of the current situation (see for example,
In some embodiments, if the surgeon attempts to exceed the acceptable tolerances, the robot 15 can be configured to provide mechanical resistance (“push back” or haptic feedback) to the movement of the end-effectuator 30 and/or surgical instrument 35 in this manner, thereby promoting movement of the end-effectuator 30 and/or surgical instrument 35 back to the correct, selected orientation. In some embodiments, when the surgeon then begins to correct the improper position, the robot 15 can be configured to substantially immediately return the end-effectuator 30 and/or surgical instrument 35 back to the desired trajectory, at which time the audible and visual warnings and alerts can be configured to cease. For example, in some embodiments, the visual warning could include a visual indication 900 that may include a green light if no tolerances have been exceeded, or a red light if tolerances are about to, or have been exceeded.
As one will appreciate, a conventional worm-drive system would be absolutely rigid, and a robot 15 having such a worm-drive system would be unable to be passively moved (without breaking the robot 15) no matter how hard the surgeon pushed. Furthermore, a completely rigid articulation system can be inherently unsafe to a patient 18. For example, if such a robot 15 were moving toward the patient 18 and inadvertently collided with tissues, then these tissues could be damaged. Although conventional sensors can be placed on the surface of such a robot 15 to compensate for these risks, such sensors can add considerable complexity to the overall system 1 and would be difficult to operate in a fail-safe mode. In contrast, during use of the robot 15 described herein, if the end-effectuator 30 and/or surgical instrument 35 inadvertently collides with tissues of the patient 18, a collision would occur with a more tolerable force that would be unlikely to damage such tissues. Additionally, in some embodiments, auditory and/or visual feedback as described above can be provided to indicate an increase in the current required to overcome the obstacle. Furthermore, in some embodiments, the end-effectuator 30 of the robot 15 can be configured to displace itself (move away) from the inadvertently contacted tissue if a threshold required motor 160 current is encountered. In some embodiments, this threshold could be configured (by a control component, for example) for each axis such that the moderate forces associated with engagement between the tissue and the end-effectuator 30 can be recognized and/or avoided.
In some embodiments, the amount of rigidity associated with the positioning and orientation of the end-effectuator 30 and/or the surgical instrument 35 can be selectively varied. For example, in some embodiments, the robot 15 can be configured to shift between a high-rigidity mode and a low-rigidity mode. In some embodiments, the robot 15 can be programmed so that it automatically shifts to the low-rigidity mode as the end-effectuator 30 and surgical instrument 35 are shifted from one trajectory to another, or from a starting position, as they approach a target trajectory and/or target position. Moreover, in some embodiments, once the end-effectuator 30 and/or surgical instrument 35 is within a selected distance of the target trajectory and/or target position, such as, for example, within about 1° and about 1 mm of the target, the robot 15 can be configured to shift to the high-rigidity mode. In some embodiments, this mechanism may improve safety because the robot 15 would be unlikely to cause injury if it inadvertently collided with the patient 18 while in the low-rigidity mode.
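A minimal sketch of this rigidity-mode selection, assuming the roughly 1° / 1 mm thresholds given above as defaults:

```python
def select_rigidity_mode(angle_err_deg, dist_err_mm,
                         angle_thresh_deg=1.0, dist_thresh_mm=1.0):
    """Choose a stiffness mode from the remaining error to the target
    pose; thresholds mirror the ~1 degree / ~1 mm example above."""
    near_target = (angle_err_deg <= angle_thresh_deg
                   and dist_err_mm <= dist_thresh_mm)
    return "high-rigidity" if near_target else "low-rigidity"
```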
Some embodiments include a robot 15 that can be configured to effect movement of the end-effectuator 30 and/or surgical instrument 35 in a selected sequence of distinct movements. In some embodiments, if, during movement of the end-effectuator 30 and/or surgical instrument 35 from one trajectory to another, the x-axis 66, y-axis 68, roll 62, and pitch 60 orientations are all changed simultaneously, the speed of movement of the end-effectuator 30 can be increased. Consequently, because of the range of positions through which the end-effectuator 30 travels, the likelihood of a collision with the tissue of the patient 18 can also be increased. Hence, in some embodiments, the robot 15 can be configured to effect movement of the end-effectuator 30 and/or surgical instrument 35 such that the position of the end-effectuator 30 and/or surgical instrument 35 within the x-axis 66 and the y-axis 68 is adjusted before the roll 62 and pitch 60 of the end-effectuator 30 and/or surgical instrument 35 are adjusted. In some alternative embodiments, the robot 15 can be configured to effect movement of the end-effectuator 30 and/or surgical instrument 35 so that the roll 62 and pitch 60 are first shifted to 0°, the position of the end-effectuator 30 and/or surgical instrument 35 within the x-axis 66 and the y-axis 68 is then adjusted, and the roll 62 and pitch 60 of the end-effectuator 30 and/or surgical instrument 35 are adjusted last.
Some embodiments include a robot 15 that can be optionally configured to ensure that the end-effectuator 30 and/or surgical instrument 35 are moved vertically along the z-axis 70 (away from the patient 18) by a selected amount before a change in the position and/or trajectory of the end-effectuator 30 and/or surgical instrument 35 is effected. For example, in some embodiments, when an agent (for example, a surgeon or other user, or equipment) changes the trajectory of the end-effectuator 30 and/or surgical instrument 35 from a first trajectory to a second trajectory, the robot 15 can be configured to vertically displace the end-effectuator 30 and/or surgical instrument 35 from the body of the patient 18 along the z-axis 70 by the selected amount (while adjusting x-axis 66 and y-axis 68 configurations to remain on the first trajectory vector, for example), and then effecting the change in position and/or orientation of the end-effectuator 30 and/or surgical instrument 35. This ensures that the end-effectuator 30 and/or surgical instrument 35 do not move laterally while embedded within the tissue of the patient 18. Optionally, in some embodiments, the robot 15 can be configured to produce a warning message that seeks confirmation from the agent (for example, a surgeon or other user, or equipment) that it is safe to proceed with a change in the trajectory of the end-effectuator 30 and/or surgical instrument 35 without first displacing the end-effectuator 30 and/or surgical instrument 35 along the z-axis.
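The safe trajectory-change sequence described above can be summarized in a short Python sketch. The `robot` object and its methods (`move_z`, `move_xy`, `set_orientation`) are hypothetical names used for illustration only:

```python
def change_trajectory(robot, new_traj, retract_mm=50.0):
    """Hypothetical trajectory-change sequence mirroring the text:
    retract along z first, reposition x/y, then reorient roll/pitch.
    The retraction distance is an illustrative default."""
    # 1. Withdraw vertically along the z-axis while staying on the
    #    first trajectory vector, so the instrument never moves
    #    laterally while embedded in tissue.
    robot.move_z(robot.z + retract_mm, stay_on_trajectory=True)
    # 2. Adjust x and y to the new trajectory's entry point.
    robot.move_xy(new_traj.x, new_traj.y)
    # 3. Only then adjust roll and pitch to the new orientation.
    robot.set_orientation(roll=new_traj.roll, pitch=new_traj.pitch)
```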
In some embodiments, at least one conventional force sensor (not shown) can be coupled to the end-effectuator 30 and/or surgical instrument 35 such that the at least one force sensor receives forces applied along the orientation axis (Z-tube axis 64) to the surgical instrument 35. In some embodiments, the at least one force sensor can be configured to produce a digital signal. In some embodiments, for example, the digital signal can be indicative of the force that is applied in the direction of the Z-tube axis 64 to the surgical instrument 35 by the body of the patient 18 as the surgical instrument 35 advances into the tissue of the patient 18. In some embodiments, the at least one force sensor can be a small conventional uniaxial load cell based on a conventional strain gauge mechanism. In some embodiments, the uniaxial load cell can be coupled to, for example, analog-to-digital filtering to supply a continuous digital data stream to the system 1. Optionally, in some embodiments, the at least one force sensor can be configured to substantially continuously produce signals indicative of the force that is currently being applied to the surgical instrument 35. In some embodiments, the surgical instrument 35 can be advanced into the tissue of the patient 18 by lowering the z-axis 70 while the position of the end-effectuator 30 and/or surgical instrument 35 along the x-axis 66 and y-axis 68 is adjusted such that alignment with the selected trajectory vector is substantially maintained. Furthermore, in some embodiments, the roll 62 and pitch 60 orientations can remain constant or self-adjust during movement of the x-(66), y-(68), and z-(70) axes such that the surgical instrument 35 remains oriented along the selected trajectory vector. In some embodiments, the position of the end-effectuator 30 along the z-axis 70 can be locked at a selected mid-range position (spaced a selected distance from the patient 18) as the surgical instrument 35 advances into the tissue of the patient 18. In some embodiments, the stiffness of the end-effectuator 30 and/or the surgical instrument 35 can be set at a selected level as further described herein. For example, in some embodiments, the Z-tube axis 64 position of the end-effectuator 30 and/or the surgical instrument 35 can be coupled to a conventional mechanical lock (not shown) configured to impart desired longitudinal stiffness characteristics to the end-effectuator 30 and/or surgical instrument 35. In some embodiments, if the end-effectuator 30 and/or surgical instrument 35 lack sufficient longitudinal stiffness, then the counterforce applied by the tissue of the patient 18 during penetration of the surgical instrument 35 can oppose the direction of advancement of the surgical instrument 35 such that the surgical instrument 35 cannot advance along the selected trajectory vector. In other words, as the z-axis 70 advances downwards, the Z-tube axis 64 can be forced up and there can be no net advancement of the surgical instrument 35. In some embodiments, the at least one force sensor can permit an agent (for example, a surgeon or other user, or equipment) to determine, based on a sudden increase in the level of applied force monitored by the force sensor at the end-effectuator 30 and/or the surgical instrument 35, when the surgical instrument 35 has encountered a bone or other specific structure within the body of the patient 18.
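The bone-contact detection just described (a sudden increase in monitored force) could be sketched as follows; the window length and force-jump threshold are illustrative assumptions:

```python
def detect_bone_contact(force_stream, window=10, jump_threshold_n=5.0):
    """Flag the sample at which the applied force suddenly increases.

    force_stream: iterable of force readings (newtons) from the
    uniaxial load cell's continuous digital data stream. A jump of
    more than `jump_threshold_n` over the trailing-window average is
    treated as contact with bone.
    """
    history = []
    for i, f in enumerate(force_stream):
        if len(history) == window:
            baseline = sum(history) / window
            if f - baseline > jump_threshold_n:
                return i  # sample index where contact was detected
            history.pop(0)
        history.append(f)
    return None  # no contact detected
```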
In some alternative embodiments, the orientation angle of the end-effectuator 30 and/or surgical instrument 35 and the x-axis 66 and y-axis 68 can be configured to align the Z-tube axis 64 with the desired trajectory vector at a fully retracted Z-tube position, while a z-axis 70 position is set in which the distal tip of the surgical instrument 35 is poised to enter tissue. In this configuration, in some embodiments, the end-effectuator 30 can be positioned such that it would move exactly or substantially exactly down the trajectory vector if advanced along the guide tube 50 alone. In such a scenario, in some embodiments, advancing the Z-tube axis 64 can cause the guide tube 50 to enter into tissue, and an agent (a surgeon or other user, equipment, etc.) can monitor the change in force from the load sensor. Advancement can continue until a sudden increase in applied force is detected at the time the surgical instrument 35 contacts bone.
In some embodiments, the robot 15 can be configured to deactivate the one or more motors 160 that advance the Z-tube axis 64 such that the end-effectuator 30 and/or the surgical instrument 35 can move freely in the Z-tube axis 64 direction while the position of the end-effectuator 30 and/or the surgical instrument 35 continues to be monitored. In some embodiments, the surgeon can then push the end-effectuator 30 down along the Z-tube axis 64 (which coincides with the desired trajectory vector) by hand. In some embodiments, if the end-effectuator 30 position has been forced out of alignment with the trajectory vector, the position of the surgical instrument 35 can be corrected by adjustment along the x-(66) and/or y-(68) axes and/or in the roll 62 and/or pitch 60 directions. In some embodiments, when the motor 160 associated with the Z-tube 50 movement of the surgical instrument 35 is deactivated, the agent (for example, a surgeon or other user, or equipment) can manually force the surgical instrument 35 to advance until a tactile sense indicates that the surgical instrument 35 has contacted bone or another known region of the body.
In some further embodiments, the robotic surgical system 1 can comprise a plurality of conventional tracking markers 720 configured to track the movement of the robot arm 23, the end-effectuator 30, and/or the surgical instrument 35 in three dimensions. It should be appreciated that three-dimensional positional information from the tracking markers 720 can be used in conjunction with the one-dimensional linear positional information from absolute or relative conventional linear encoders on each axis of the robot 15 to maintain a high degree of accuracy. In some embodiments, the plurality of tracking markers 720 can be mounted (or otherwise secured) on an outer surface of the robot 15, such as, for example and without limitation, on the base 25 of the robot 15, or the robot arm 23. In some embodiments, the plurality of tracking markers 720 can be configured to track the movement of the robot arm 23, the end-effectuator 30, and/or the surgical instrument 35. In some embodiments, the computer 100 can utilize the tracking information to calculate the orientation and coordinates of the distal tip 30a of the surgical instrument 35 based on encoder counts along the x-axis 66, y-axis 68, z-axis 70, the Z-tube axis 64, and the roll 62 and pitch 60 axes. Further, in some embodiments, the plurality of tracking markers 720 can be positioned on the base 25 of the robot 15 spaced from the surgical field 17 to reduce the likelihood of being obscured by the surgeon, surgical tools, or other parts of the robot 15. In some embodiments, at least one tracking marker 720 of the plurality of tracking markers 720 can be mounted or otherwise secured to the end-effectuator 30. In some embodiments, the positioning of one or more tracking markers 720 on the end-effectuator 30 can maximize the accuracy of the positional measurements by serving to check or verify the end-effectuator 30 position (calculated from the positional information from the markers on the base 25 of the robot 15 and the encoder counts of the x-(66), y-(68), roll 62, pitch 60, and Z-tube axes 64).
In some further embodiments, at least one optical marker of the plurality of optical tracking markers 720 can be positioned on the robot 15 between the base 25 of the robot 15 and the end-effectuator 30 instead of, or in addition to, the markers 720 on the base 25 of the robot 15, (see
In some embodiments, when the surgical instrument 35 is advanced into the tissue of the patient 18 with the assistance of a guide tube 50, the surgical instrument 35 can comprise a stop mechanism 52 that is configured to prevent the surgical instrument 35 from advancing when it reaches a predetermined amount of protrusion (see for example,
In some embodiments, it can be desirable to monitor not just the maximum protrusion distance of the surgical instrument 35, but also the actual protrusion distance at any instant during the insertion process. Therefore, in some embodiments, the robot 15 can substantially continuously monitor the protrusion distance, and in some embodiments, the distance can be displayed on a display (such as display means 29). In some embodiments, protrusion distance can be substantially continuously monitored using a spring-loaded plunger 54 including a spring-loaded mechanism 55a and sensor pad 55b that has a coupled wiper 56 (see for example
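A sketch of how the wiper reading from the spring-loaded plunger 54 might be mapped to a protrusion distance, assuming (purely for illustration) a linear sensor pad and a known calibrated mechanical travel:

```python
def protrusion_distance_mm(wiper_reading, reading_min, reading_max,
                           travel_mm):
    """Map a spring-loaded plunger's wiper sensor reading to the
    instantaneous protrusion distance. `reading_min` and `reading_max`
    are assumed readings at zero and full plunger travel, and
    `travel_mm` is the calibrated mechanical travel."""
    fraction = (wiper_reading - reading_min) / (reading_max - reading_min)
    return max(0.0, min(1.0, fraction)) * travel_mm
```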
Some embodiments include instruments that enable the stop on a drill bit 42 to be manually adjusted with reference to markings 44 on the drill bit 42. For example,
Some embodiments include the ability to lock and hold the drill bit 42 in a set position relative to the tube 50 in which it is housed. For example, in some embodiments, the drill bit 42 can be locked by locking the drill stop 46 relative to the tube 50 using a locking mechanism.
In some further embodiments, the end-effectuator 30 can be configured not to block the tracking optical markers 720 or interfere with the surgeon. For example, in some embodiments, the end-effectuator 30 can comprise a clearance mechanism 33 including an actuator 33a that permits this configuration, as depicted in
In applications such as cervical or lumbar fusion surgery, it can be beneficial to apply distraction or compression across one or more levels of the spine (anteriorly or posteriorly) before locking hardware in place. In some embodiments, the end-effectuator 30 can comprise an attachment element 37 that is configured to apply such forces (see for example
In view of the embodiments described hereinbefore, some embodiments that can be implemented in accordance with the disclosed subject matter can be better appreciated with reference to the flowcharts in
It should be further appreciated that the methods disclosed in the various embodiments described throughout the subject specification can be stored on an article of manufacture, or computer-readable medium, to facilitate transporting and transferring such methods to a computing device (e.g., a desktop computer, a mobile computer, a mobile telephone, a blade computer, a programmable logic controller, and the like) for execution, and thus implementation, by a processor of the computing device or for storage in a memory thereof.
In some embodiments, the surgical robot 15 can adjust its position automatically continuously or substantially continuously in order to move the end-effectuator 30 to an intended (i.e., planned) position. For example, in some embodiments, the surgical robot 15 can adjust its position automatically continuously or substantially continuously based on the current position of the end-effectuator 30 and the surgical target as provided by a current snapshot of tracking markers, LPS, or other tracking data. It should further be appreciated that certain position adjustment strategies can be inefficient. For example, an inefficient strategy for the robot 15 to find a target location can be an iterative algorithm that estimates the necessary direction of movement, moves toward the target location, assesses the mismatch between the current location and the target location (the mismatch referred to as an error), and estimates a new direction, repeating the cycle of estimate-movement-assessment until the target location is reached within a satisfactory error. Conversely, the position adjustment strategies in accordance with some embodiments of the invention are substantially more efficient than iterative strategies. For example, in some embodiments, a surgical robot 15 can make movements and adjust its location by calibrating the relative directions of motion of each axis, permitting computation (via execution of software or firmware with the computer 100), at each frame of tracking data, of a unique set of motor encoder counts that can cause each of the individual axes to move to the correct location. In some embodiments, the Cartesian design of the disclosed robot 15 can permit such a calibration to be made by establishing a coordinate system for the robot 15 and determining key axes of rotation.
As described in greater detail below, in some embodiments, methods for calibrating the relative directions of the robot's 15 axes can utilize a sequence of carefully planned movements, each in a single axis. In some embodiments, during these moves, temporary tracking markers 720 are attached to the end-effectuator 30 to capture the motion of the end-effectuator 30. It should be appreciated that the disclosed methods do not require the axes of the robot 15 to be exactly or substantially perpendicular, nor do they require the vector along which a particular axis moves (such as the x-axis 66) to coincide with the vector about which rotation occurs (such as pitch 60, which occurs primarily about the x-axis 66). In certain embodiments, the disclosed methods include motion along a specific robot 15 axis that occurs in a straight line. In some embodiments, the disclosed methods for calibrating the relative directions of movement of the robot's 15 axes can utilize one or more frames of tracking data captured at the ends of individual moves made in x-(66), y-(68), roll (62), pitch (60), and Z-tube axes 64 from markers 720 temporarily attached to the end-effectuator's 30 guide tube 50. In some embodiments, when moving individual axes, all other axes can be configured at the zero position (for example, the position where the encoder for the axis reads 0 counts). Additionally or alternatively, one or more frames of tracking data with all robot 15 axes at 0 counts (neutral position) may be necessary, and one or more frames of data with the temporary markers 720 rotated to a different position about the longitudinal axis of the guide tube 50 may be necessary. In some embodiments, the marker 720 positions from these moves can be used to establish a Cartesian coordinate system for the robot 15 in which the origin (0,0,0) is through the center of the end-effectuator 30 and is at the location along the end-effectuator 30 closest to where pitch 60 occurs. Additionally or alternatively, in some embodiments, this coordinate system can be rotated to an alignment in which y-axis 68 movement of the robot 15 can occur exactly or substantially along the coordinate system's y-axis 68, while x-axis 66 movement of the robot 15 occurs substantially perpendicular to the y-axis 68, but by construction of the coordinate system, without resulting in any change in the z-axis 70 coordinate. In certain embodiments, the steps for establishing the robot's 15 coordinate system based at least on the foregoing individual moves can comprise the following: First, from the initial and final positions of the manual rotation of tracking markers 720 about the long axis of the end-effectuator 30, a finite helical axis of motion is calculated, which can be represented by a vector that is centered in and aligned with the end-effectuator 30. It should be appreciated that methods for calculating a finite helical axis of motion from two positions of three or more markers are described in the literature, for example, by Spoor and Veldpaus (Spoor, C. W. and F. E. Veldpaus, “Rigid body motion calculated from spatial co-ordinates of markers,” J Biomech 13(4): 391-393 (1980)). In some embodiments, rather than calculating the helical axis, the vector that is centered in and aligned with the end-effectuator 30 can be defined, or constructed, by interconnecting two points that are attached to two separate rigid bodies that can be temporarily affixed to the entry and exit of the guide tube 50 on the Z-tube axis 64. 
In this instance, each of the two rigid bodies can include at least one tracking marker 720 (e.g., one tracking marker 720, two tracking markers 720, three tracking markers 720, more than three tracking markers 720, etc.), and a calibration can be performed that provides information indicative of the locations on the rigid bodies that are adjacent to the entry and exit of the guide tube 50 relative to the tracking markers.
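One common way to compute the finite helical axis referenced above (cf. Spoor and Veldpaus, 1980) is via an SVD-based rigid-body fit; the following Python sketch shows that general construction and is not necessarily the exact method used here. It assumes matched marker positions before and after the move, numpy, and a non-degenerate rotation (angle not near 0° or 180°):

```python
import numpy as np

def rigid_transform(P, Q):
    """Least-squares rotation R and translation t with Q ~ R @ P + t,
    from matched (N, 3) marker positions (Kabsch/SVD method)."""
    Pc, Qc = P.mean(axis=0), Q.mean(axis=0)
    H = (P - Pc).T @ (Q - Qc)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])  # fix reflection
    R = Vt.T @ D @ U.T
    t = Qc - R @ Pc
    return R, t

def finite_helical_axis(P, Q):
    """Axis direction n and a point p on the finite helical axis for
    the motion taking marker set P to marker set Q."""
    R, t = rigid_transform(P, Q)
    angle = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    # Rotation axis from the skew-symmetric part of R (assumes a
    # non-degenerate rotation so sin(angle) != 0).
    n = np.array([R[2, 1] - R[1, 2],
                  R[0, 2] - R[2, 0],
                  R[1, 0] - R[0, 1]]) / (2.0 * np.sin(angle))
    # A point on the axis satisfies (I - R) p = t_perp, where t_perp
    # is the component of t perpendicular to the axis (translation
    # along the axis is the screw pitch and does not constrain p).
    t_perp = t - np.dot(t, n) * n
    p, *_ = np.linalg.lstsq(np.eye(3) - R, t_perp, rcond=None)
    return n, p
```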
A second helical axis can be calculated from the pitch 60 movements, providing a vector substantially parallel to the x-axis of the robot 15 but also close to perpendicular with the first helical axis calculated. In some embodiments, the closest point on the first helical axis to the second helical axis (or vector aligned with the end-effectuator 30) is calculated using simple geometry and used to define the origin of the robot's coordinate system (0,0,0). A third helical axis is calculated from the two positions of the roll 62 axis. In certain scenarios, it cannot be assumed that the vector about which roll occurs (third helical axis) and the vector along which the y-axis 68 moves are exactly or substantially parallel. Moreover, it cannot be assumed that the vector about which pitch 60 occurs and the vector along which x-axis 66 motion occurs are exactly or substantially parallel. Vectors for x-axis 66 and y-axis 68 motion can be determined from neutral and extended positions of x-axis 66 and y-axis 68 and stored separately. As described herein, in some embodiments, the coordinate system can be realigned to enable y-axis movement of the robot 15 to occur exactly or substantially in the y-axis 68 direction of the coordinate system, and x-axis 66 movement of the robot 15 without any change in the z-coordinate (70). In general, to perform such a transformation of coordinate systems, a series of rotations about a coordinate axis is performed and applied to every point of interest in the current coordinate system. Each point is then considered to be represented in the new coordinate system. In some embodiments, to apply a rotation of a point represented by a 3×1 vector about a particular axis, the vector can be pre-multiplied by a 3×3 rotation matrix. The 3×3 rotation matrix for a rotation of Rx degrees about the x-axis is:
$$\begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos R_x & -\sin R_x \\ 0 & \sin R_x & \cos R_x \end{bmatrix}$$

The 3×3 rotation matrix for a rotation of Ry degrees about the y-axis is:

$$\begin{bmatrix} \cos R_y & 0 & \sin R_y \\ 0 & 1 & 0 \\ -\sin R_y & 0 & \cos R_y \end{bmatrix}$$
The 3×3 rotation matrix for a rotation of Rz degrees about the z-axis is:

$$\begin{bmatrix} \cos R_z & -\sin R_z & 0 \\ \sin R_z & \cos R_z & 0 \\ 0 & 0 & 1 \end{bmatrix}$$
In some embodiments, to transform coordinate systems, a series of three rotations can be performed. For example, such rotations can be applied to all vectors and points of interest in the current coordinate system, including the x-movement vector, the y-movement vector, and each of the helical axes, to align the y-movement vector with the new coordinate system's y-axis, and to align the x-movement vector as closely as possible to the new coordinate system's x-axis at z=0. It should be appreciated that more than one possible sequence of three rotations can be performed to achieve substantially the same goal. For example, in some embodiments, a sequence of three rotations can comprise (1) a rotation about x using an Rx value appropriate to rotate the y-movement vector until its z coordinate equals 0, followed by (2) a rotation about z using an Rz value appropriate to rotate the y-movement vector until its x coordinate equals 0, followed by (3) a rotation about y using an Ry value appropriate to rotate the x-movement vector until its z coordinate equals 0. In some embodiments, to find the rotation angle appropriate to achieve a given rotation, the arctangent function can be utilized. For example, in some embodiments, the angle needed to rotate a point or vector (x1,y1,z1) about the z axis to y1=0 is −arctan(y1/x1).
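A sketch of this three-rotation alignment, using the rotation matrices defined above and numpy's `arctan2` in place of the plain arctangent to handle signs robustly (an implementation choice assumed here, not specified by the disclosure):

```python
import numpy as np

def rot_x(a):  # rotation matrices as defined above; angles in radians
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def realign(y_move, x_move, points):
    """Apply the three-rotation sequence described above so the
    y-movement vector lies along +y and the x-movement vector ends up
    with z=0. `points` holds every other vector of interest (helical
    axes, etc.) that must be carried into the new coordinate system."""
    # (1) rotate about x until the y-movement vector's z coordinate is 0
    R1 = rot_x(-np.arctan2(y_move[2], y_move[1]))
    y1, x1 = R1 @ y_move, R1 @ x_move
    # (2) rotate about z until the y-movement vector's x coordinate is 0
    R2 = rot_z(np.arctan2(y1[0], y1[1]))
    y2, x2 = R2 @ y1, R2 @ x1
    # (3) rotate about y until the x-movement vector's z coordinate is 0
    R3 = rot_y(np.arctan2(x2[2], x2[0]))
    R = R3 @ R2 @ R1
    return R, [R @ p for p in points]
```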
It should be appreciated that after transformation of the coordinate system, in some embodiments, although the new coordinate system is aligned such that the y-movement axis of the surgical robot 15 is exactly or substantially exactly aligned with the coordinate system's y-axis 68, the roll 62 rotation movement of the robot 15 should not be assumed to occur exactly or substantially exactly about a vector aligned with the coordinate system's y-axis 68. Similarly, in some embodiments, the pitch 60 movement of the surgical robot 15 should not be assumed to occur exactly or substantially exactly about a vector aligned with the coordinate system's x-axis. In some embodiments, in roll 62 and pitch 60 rotational movement there can be linear and orientational “offsets” from the helical axis of motion to the nearest coordinate axis. In some embodiments, from the helical axes determined above using tracked markers, such offsets can be calculated and retained (e.g., stored in a computing device's memory) so that for any rotation occurring during operation, the offsets can be applied, rotation can be performed, and then negative offsets can be applied so that positional change occurring with rotation motion accounts for the true center of rotation.
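A minimal sketch of applying the offsets, rotating, and applying the negative offsets so that rotation occurs about the true center of rotation rather than the coordinate origin; here the linear offset is modeled simply as a point on the helical axis, which is an assumption of the sketch:

```python
import numpy as np

def rotate_about_axis_point(points, R, axis_point):
    """Rotate each point by R about an axis through `axis_point` (a
    point on the helical axis) instead of through the origin: shift
    the axis to the origin, rotate, then shift back."""
    axis_point = np.asarray(axis_point, dtype=float)
    return [R @ (np.asarray(p, dtype=float) - axis_point) + axis_point
            for p in points]
```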
In some embodiments, during tracking, the desired trajectory can be first calculated in the medical image coordinate system, then transformed to the robot 15 coordinate system based at least on known relative locations of active markers. For example, in some embodiments, conventional light-emitting markers and/or conventional reflective markers associated with an optical tracking system 3417 can be used (see for example active markers 720 in
In some embodiments, the necessary counts for the end-effectuator 30 to reach the desired position in the robot's 15 coordinate system can be calculated based on the following example process. First, the necessary counts to reach the desired angular orientation can be calculated. In some embodiments, a series of three rotations can be applied to shift the coordinate system temporarily to a new coordinate system in which the y-axis 68 coincides or substantially coincides with the helical axis of motion for roll 62, the x-axis 66 is largely aligned with the helical axis of motion for pitch 60, and, by definition, the helical axis of motion for pitch 60 has constant z=0. Then, the number of counts necessary to achieve the desired pitch 60 can be determined, keeping track of how this pitch 60 can affect roll 62. In one implementation, to find the necessary counts to achieve the desired pitch, the change in pitch angle 60 can be multiplied by the previously calibrated motor counts per degree for pitch. The change in roll 62 caused by this change in pitch 60 can be calculated from the orientation of the helical axis and the rotation angle (pitch) about the helical axis. Then, the necessary roll 62 to get to the desired roll 62 to reach the planned trajectory alignment can be calculated, with the benefit that applying roll 62 does not, by definition of the coordinate system, result in any further change in pitch. The coordinate system is then shifted back to the previously described robot 15 coordinate system by the inverse of the three rotations applied above. Then the necessary counts to reach the desired x-axis 66 position can be calculated, also keeping track of how this x-axis 66 position change will affect the y-axis 68 position. Then the necessary y-axis 68 counts to reach the desired y-axis position can be readily calculated, with the benefit that changing the y-axis 68 coordinate can have no effect on any other axis since the y-axis motion vector is by definition aligned with the robot's y-axis 68. In a scenario in which the Z-tube 50 position is being actively controlled, the orientation of the Z-tube 50 movement vector is adjusted when adjusting roll 62 and pitch 60, and the counts necessary to move it to the desired position along the trajectory vector are calculated from the offset. In some embodiments, after the necessary counts to achieve the desired positions in all axes are calculated as described, these counts can be sent as computer-accessible instructions (e.g., computer-readable and/or computer-executable instructions) to respective controllers for each axis in order to move the axes to the computed positions.
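The cross-axis bookkeeping in this counts calculation can be illustrated for the pitch-then-roll step. The counts-per-degree constants and the linear roll-per-pitch coupling coefficient below are simplifying assumptions for the sketch; as described above, the induced roll would actually be computed from the orientation of the helical axis:

```python
# Assumed, illustrative calibration values (counts per degree).
COUNTS_PER_DEG_PITCH = 500.0
COUNTS_PER_DEG_ROLL = 500.0

def orientation_counts(current, desired, roll_per_deg_pitch):
    """Counts needed to reach a desired roll/pitch, applying pitch
    first and compensating roll for the cross-coupling that the pitch
    move induces. `roll_per_deg_pitch` is an assumed linearized
    coefficient: degrees of roll induced per degree of pitch."""
    d_pitch = desired["pitch"] - current["pitch"]
    pitch_counts = d_pitch * COUNTS_PER_DEG_PITCH
    # Pitch motion slightly changes roll; fold the induced roll into
    # the roll move so the final orientation lands on target.
    induced_roll = d_pitch * roll_per_deg_pitch
    d_roll = desired["roll"] - (current["roll"] + induced_roll)
    roll_counts = d_roll * COUNTS_PER_DEG_ROLL
    return {"pitch": pitch_counts, "roll": roll_counts}
```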
Some embodiments include method 2600 (shown as a flowchart in the accompanying drawings), a process for locating the radio-opaque markers 730 within the 3D medical image; blocks 2615 through 2665, described below, illustrate an example implementation.
In some embodiments, at block 2620, it is determined if a maximum (max) border coordinate is less than the maximum coordinate of the test area, a minimum (min) border coordinate is greater than the minimum coordinate of the test area, and the vertical span of features rendered in the image is equal or substantially equal to the horizontal span of such features.
In some embodiments, at block 2650, it is determined if the last row and column in the x-y grid are reached and the last Z plane is reached as a result of updating the first test area at block 2645. In some embodiments, in the negative case, flow is directed to block 2615, in which the first area is the updated instance of a prior first area, with the flow reiterating one or more of blocks 2620 through 2645. Conversely, in the affirmative case, flow is directed to block 2655, at which invalid marker(s) 730 can be excluded. In some embodiments, a paring process can be implemented to exclude one or more invalid markers 730. For this paring process, in some embodiments, the known spacings between each of the N radio-opaque markers 730 (with N a natural number) on the targeting fixture 690 and each other radio-opaque marker 730 on the targeting fixture 690 can be compared to the markers 730 that have been found on the medical image. In a scenario in which more than N markers 730 are found on the medical image, any sphere found on the medical image that does not have spacings, relative to N−1 other markers 730, within an acceptable tolerance of the known spacings (retained, for example, on a list) can be considered invalid. For example, if a targeting fixture 690 has four radio-opaque markers 730, there are six known spacings, with each marker 730 having a quantifiable spacing relative to three other markers 730: the inter-marker spacings for markers 1-2, 1-3, 1-4, 2-3, 2-4, and 3-4. On the 3D medical image of the targeting fixture 690, in some embodiments, if five potential markers 730 are found, their inter-marker spacings can be calculated. In this scenario, there are 10 inter-marker spacings: 1-2, 1-3, 1-4, 1-5, 2-3, 2-4, 2-5, 3-4, 3-5, and 4-5, with each sphere having a quantifiable spacing relative to four other markers 730. Considering each of the five potential markers 730 individually, if any one of such five markers 730 does not have three of its four inter-marker spacings within a very small distance of the spacings on the list of six previously quantified known spacings, it is considered invalid.
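The paring process lends itself to a short sketch. The following assumes NumPy, a list of candidate 3D marker centers, and the list of known inter-marker spacings; the tolerance value is an illustrative assumption.

```python
import numpy as np

def exclude_invalid_markers(candidates, known_spacings, n_markers, tol=1.0):
    """Keep only candidates whose inter-marker spacings match the fixture.

    candidates: (M, 3) candidate marker centers found on the medical image.
    known_spacings: the N*(N-1)/2 calibrated inter-marker distances.
    A candidate is kept only if at least N-1 of its spacings to other
    candidates fall within `tol` of some known spacing.
    """
    candidates = np.asarray(candidates, dtype=float)
    valid = []
    for i in range(len(candidates)):
        matches = 0
        for j in range(len(candidates)):
            if i == j:
                continue
            d = np.linalg.norm(candidates[i] - candidates[j])
            # Count this spacing if it is within tolerance of any known one.
            if any(abs(d - s) <= tol for s in known_spacings):
                matches += 1
        if matches >= n_markers - 1:
            valid.append(i)
    return valid
```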
In some embodiments, at block 2660, each centered radio-opaque marker 730, identified at block 2640, can be mapped to each radio-opaque marker 730 of a plurality of radio-opaque markers 730. In some embodiments, a sorting process in accordance with one or more aspects described herein can be implemented to map such markers 730 to radio-opaque markers 730. In some embodiments, at block 2665, coordinates of each centered sphere can be retained (e.g., in memory of a computer platform 3400). As described herein, in some embodiments, such coordinates can be utilized in a process for tracking movement of a robot 15. In some embodiments, during tracking, the established (e.g., calibrated) spatial relationship between active markers 720 and radio-opaque markers 730 can be utilized to transform the coordinate system from the coordinate system of the medical image to the coordinate system of the tracking system 3417, or vice versa. Some embodiments include a process for transforming coordinates from the medical image's coordinate system to the tracking system's coordinate system for a fixture 690 comprising four radio-opaque markers OP1, OP2, OP3, and OP4 (for example, radio-opaque markers 730) in a rigidly fixed position relative to four active markers AM1, AM2, AM3, and AM4 (for example, active markers 720). In some embodiments, at the time the calibration of the fixture 690 occurred, this positional relationship can be retained in a computer memory (e.g., system memory 3412) for later access, in real time or substantially in real time, in a set of four arbitrary reference Cartesian coordinate systems that can be readily reachable through transformations at any later frame of data. In some embodiments, each reference coordinate system can utilize an unambiguous positioning of three of the active markers 720. For example, the reference coordinate system for AM1, AM2, and AM3 can be a coordinate system in which AM1 is positioned at the origin (e.g., the three-dimensional vector (0,0,0)); AM2 is positioned on the x-axis (e.g., x-coordinate AM2x>0, y-coordinate AM2y=0, and z-coordinate AM2z=0); and AM3 is positioned on the x-y plane (e.g., x-coordinate AM3x unrestricted, y-coordinate AM3y>0, and z-coordinate AM3z=0). In some embodiments, a method to generate a transformation to such a coordinate system can comprise (1) translation of AM1, AM2, AM3, OP1, OP2, OP3, and OP4 such that the AM1 vector position is (0,0,0); (2) rotation about the x-axis by an angle suitable to position AM2 at z=0 (e.g., rotation applied to AM2, AM3, and OP1-OP4); (3) rotation about the z-axis by an angle suitable to position AM2 at y=0 and x>0 (e.g., rotation applied to AM2, AM3, and OP1-OP4); and (4) rotation about the x-axis by an angle suitable to position AM3 at z=0 and y>0 (e.g., rotation applied to AM3 and OP1-OP4). It should be appreciated that, in some embodiments, it is unnecessary to retain these transformations in computer memory; rather, the information retained for later access can be the coordinates of AM1-AM3 and OP1-OP4 in such a reference coordinate system. In some embodiments, another such reference coordinate system can transform OP1-OP4 by utilizing AM2, AM3, and AM4. In some embodiments, another such reference coordinate system can transform OP1-OP4 by utilizing AM1, AM3, and AM4. In some further embodiments, another such reference coordinate system can transform OP1-OP4 by utilizing AM1, AM2, and AM4.
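A minimal sketch of the translation-plus-three-rotations sequence (steps (1) through (4) above), assuming NumPy: it returns the coordinates of all supplied points, including OP1-OP4, in the reference coordinate system defined by AM1, AM2, and AM3.

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def to_reference_frame(am1, am2, am3, extra_points):
    """Transform all points so AM1 is at the origin, AM2 lies on the +x axis,
    and AM3 lies in the x-y plane with y > 0 (steps (1)-(4) above)."""
    pts = np.vstack([am1, am2, am3] + list(extra_points)).astype(float)
    pts -= pts[0].copy()                         # (1) AM1 -> origin
    a = np.arctan2(pts[1][2], pts[1][1])         # (2) x-rotation: AM2 z -> 0
    pts = pts @ rot_x(-a).T
    b = np.arctan2(pts[1][1], pts[1][0])         # (3) z-rotation: AM2 y -> 0, x > 0
    pts = pts @ rot_z(-b).T
    g = np.arctan2(pts[2][2], pts[2][1])         # (4) x-rotation: AM3 z -> 0, y > 0
    pts = pts @ rot_x(-g).T
    return pts
```

Consistent with the text, only the resulting coordinates need to be retained; the angles themselves can be discarded.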
In some embodiments, at the time of tracking, during any given frame of data, the coordinates of the active markers AM1-AM4 can be provided by the tracking system 3417. In some embodiments, by utilizing markers AM1, AM2, and AM3, transformations suitable to reach the conditions of the reference coordinate system can be applied. In some embodiments, such transformations can position AM1, AM2, and AM3 on the x-y plane in close proximity to the positions that were earlier stored in computer memory for this reference coordinate system. In some embodiments, for example, to achieve a best fit of the triad of active markers 720 on their stored locations, a least squares algorithm can be utilized to apply an offset and rotation to the triad of markers 720. In one implementation, the least squares algorithm can be implemented as described by Sneath (Sneath P. H. A., Trend-surface analysis of transformation grids, J. Zoology 151, 65-122 (1967)). In some embodiments, transformations suitable to reach the reference coordinate system, including the least squares adjustment, can be retained in memory (e.g., system memory 3412 and/or mass storage device 3404). In some embodiments, the retained coordinates of OP1-OP4 in such reference coordinate system can be retrieved and the inverse of the retained transformations applied to such coordinates. It should be appreciated that the new coordinates of OP1-OP4 (the coordinates resulting from application of the inverse transformations) are in the coordinate system of the tracking system 3417. Similarly, in some embodiments, by utilizing the remaining three triads of active markers 720, the coordinates of OP1-OP4 can be retrieved.
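The cited Sneath fit is one way to achieve the best-fit overlay; an SVD-based rigid fit (the Kabsch algorithm) is a common equivalent and is sketched below under that substitution, assuming NumPy. It returns the rotation and translation that map the current triad onto its stored reference positions in a least-squares sense.

```python
import numpy as np

def rigid_fit(current, stored):
    """Least-squares rotation R and translation t mapping current -> stored
    (Kabsch/SVD; used here as a stand-in for the Sneath fit)."""
    current = np.asarray(current, dtype=float)
    stored = np.asarray(stored, dtype=float)
    cc, cs = current.mean(axis=0), stored.mean(axis=0)
    H = (current - cc).T @ (stored - cs)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])          # guard against reflections
    R = Vt.T @ D @ U.T
    t = cs - R @ cc
    return R, t
```

Applying `R` and `t` to the current triad overlays it on the stored triad; applying the inverse transform to the stored OP1-OP4 coordinates yields their positions in the tracking system's frame.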
In some embodiments, the four sets of OP1-OP4 coordinates in the tracking system's coordinate system that can be calculated from different triads of active markers 720 are contemplated to have coordinates that are approximately equivalent. In some embodiments, when coordinates are not equivalent, the data set can be analyzed to determine which of the active markers 720 provides non-suitable (or poor) data by assessing how accurately each triad of active markers 720 at the current frame overlays onto the retained positions of active markers 720. In some other embodiments, when the coordinates are nearly equivalent, a mean value obtained from the four sets can be utilized for each radio-opaque marker 730. In some embodiments, to transform coordinates of other data (such as trajectories from the medical image coordinate system) to the tracking system's coordinate system, the same transformations can be applied to the data. For example, in some embodiments, the tip and tail of a trajectory vector can be transformed to the four reference coordinate systems and then retrieved with triads of active markers 720 at any frame of data and transformed to the tracking system's coordinate system.
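A small sketch of consolidating the four OP1-OP4 estimates: average them per radio-opaque marker, and flag any triad whose estimate deviates beyond a tolerance. NumPy is assumed, and the tolerance value is illustrative.

```python
import numpy as np

def consolidate_op_estimates(op_sets, residual_tol=0.5):
    """op_sets: four (4, 3) arrays of OP1-OP4 coordinates, one per triad.
    Returns the per-marker mean plus indices of triads whose estimates
    deviate from the mean by more than residual_tol (suspect markers)."""
    op_sets = np.asarray(op_sets, dtype=float)   # shape (4, 4, 3)
    mean = op_sets.mean(axis=0)                  # mean per radio-opaque marker
    residuals = np.linalg.norm(op_sets - mean, axis=2).max(axis=1)
    suspect = np.where(residuals > residual_tol)[0]
    return mean, suspect
```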
In another embodiment, a line (e.g., referred to as line t) that is fixed on the image in both angle and position represents the desired trajectory; the surgeon rotates and scrolls the images to align this trajectory to the desired location and orientation on the anatomy. At least one advantage of such an embodiment is that it can provide a more complete, holistic picture of the anatomy in relation to the desired trajectory, without requiring the operator to erase and start over or nudge the line after it is drawn; this process was therefore adopted. In some embodiments, a planned trajectory can be retained in a memory of a computing device (for example, computing device 3401) that controls the surgical robot 15 or is coupled thereto for use during a specific procedure. In some embodiments, each planned trajectory can be associated with a descriptor that can be retained in memory with the planned trajectory. As an example, the descriptor can be the level and side of the spine where screw insertion is planned.
In another embodiment, the line t (fixed on the image in both angle and position, representing the desired trajectory) is dictated by the current position of the robot's end-effectuator 30, or by an extrapolation of the end-effectuator guide tube 50 if an instrument 35 were to extend from it along the same vector. In some embodiments, as the robot 15 is driven manually out over the patient 18 by activating motors 160 controlling individual or combined axes 64, 66, 68, 70, the position of this extrapolated line (the robot's end-effectuator 30) is updated on the medical image, based on markers 720 attached to the robot, conventional encoders showing the current position of an axis, or a combination of these data sources. In some embodiments, when the desired trajectory is reached, that vector's position in the medical image coordinate system is stored into the computer memory (for example, in memory of a computer platform 3400) so that later, when recalled, the robot 15 will move automatically in the horizontal plane to intersect with this vector. In some embodiments, instead of manually driving the robot 15 by activating motors 160, the robot's axes can be put in a passive state. In some embodiments, in the passive state, the markers 720 continue to collect data on the robot arm 23 position and encoders on each axis 64, 66, 68, 70 continue to provide information regarding the position of the axis; therefore the position of an extrapolated line can be updated on the medical image as the passive robot 15 is dragged into any orientation and position in the horizontal plane. In some embodiments, when a desired trajectory is reached, the position can be stored into the computer memory. Some embodiments include conventional software control or a conventional switch activation capable of placing the robot 15 into an active state to immediately rigidly hold the position or trajectory, and to begin compensating for movement of the patient 18.
In some further embodiments, the computing device that implements the method 2700 or that is coupled to the surgical robot 15 can render one or more planned trajectories. Such information can permit confirming that the trajectories planned are within the range of the robot's 15 reach by calculating the necessary motor 160 encoder counts to reach each desired trajectory, and assessing if the counts are within the range of possible counts of each axis.
In some embodiments, information including whether each trajectory is in range, and how close each trajectory is to being out of range can be provided to an agent (such as a surgeon or other user, or equipment). For example, in some embodiments, a display means 29 (such as a display device 3411) can render (i.e. display) the limits of axis counts or linear or angular positions of one or more axes and the position on each axis where each targeted trajectory is currently located.
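The in-range assessment described above can be sketched as a per-axis bounds check that also reports how close each trajectory is to going out of range; the dictionary shapes and the margin convention below are illustrative assumptions.

```python
def trajectory_in_range(required_counts, axis_limits):
    """required_counts: {'x': counts, 'y': counts, ...} for one trajectory.
    axis_limits: {'x': (lo, hi), ...} possible counts per axis.
    Returns (in_range, margin): margin is the smallest count distance to a
    limit, useful for displaying how close a trajectory is to out-of-range."""
    margins = []
    for axis, counts in required_counts.items():
        lo, hi = axis_limits[axis]
        if not (lo <= counts <= hi):
            return False, 0
        margins.append(min(counts - lo, hi - counts))
    return True, min(margins)
```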
In another embodiment, the display device 3411 (for example, a display 150) can render a view of the horizontal work field as a rectangle with the robot's x-axis 66 movement and y-axis 68 movement ranges defining the horizontal and vertical dimensions of the rectangle, respectively. In some embodiments, marks (for example, circles) on the rectangle can represent the position of each planned trajectory at the current physical location of the robot 15 relative to the patient 18. In another embodiment, a 3D Cartesian volume can represent the x-axis 66 movement, y-axis 68 movement, and z-axis 70 movement ranges of the robot 15. In some embodiments, line segments or cylinders rendered in the volume can represent the position of each planned trajectory at the current location of the robot 15 relative to the patient 18. Repositioning of the robot 15 or the patient 18 can be performed at this time to a location that is within range of the desired trajectories. In other embodiments, the surgeon can adjust the Z Frame 72 position, which can affect the x-axis 66 range and the y-axis 68 range of trajectories that the robot 15 is capable of reaching (for example, converging trajectories require less x-axis 66 or y-axis 68 reach the lower the robot 15 is positioned on the z-axis 70). During this time, a screen simultaneously shows whether tracking markers on the patient 18 and robot 15 are in view of the detection device of the tracking system (for example, optical tracking system 3417).
In some embodiments, at block 2740, orientation of an end-effectuator 30 in a robot 15 coordinate system is calculated. In some embodiments, at block 2750, position of the end-effectuator 30 in the robot 15 coordinate system is calculated. In some embodiments, at block 2760, a line t defining the planned trajectory in the robot 15 coordinate system is determined. In some embodiments, at block 2770, robot 15 position is locked on the planned trajectory at a current Z level. In some embodiments, at block 2780, information indicative of quality of the trajectory lock can be supplied. In some embodiments, actual coordinate(s) of the surgical robot 15 can be rendered in conjunction with respective coordinate(s) of the planned trajectory. In some embodiments, aural indicia can be provided based on such quality. For instance, in some embodiments, a high-frequency and/or high-amplitude noise can embody aural indicia suitable to represent a low-quality lock. In some alternative embodiments, a brief melody may be repeatedly played, such as the sound associated with successful recognition of a USB memory device by a computer, to indicate successful lock on the planned trajectory. In other embodiments, a buzz or other warning noise may be played if the robot 15 is unable to reach its target due to the axis being mechanically overpowered, or if the tracking markers 720 are undetectable by cameras 8200 or other marker position sensors.
In some embodiments, at block 2790, it is determined if the surgical procedure is finished and, in the affirmative case, the flow terminates. In the negative case, the flow is directed to block 2710. In some embodiments, the method 2700 can be implemented (i.e., executed) as part of block 2440 in certain scenarios. It should be appreciated that, in some embodiments, the method 2700 also can be implemented for any robot 15 having at least one feature that enables movement of the robot 15.
In some embodiments, the tip of the line segment can be obtained as the point along the vector that is closest to the vector representing the helical axis of motion during pitch. In some embodiments, the tail of the line segment can be set an arbitrary distance (for example, about 100 mm) up the vector aligned with the guide tube 50 and/or first helical axis. In some embodiments, the Cartesian coordinates of such tip and tail positions can be transformed to a coordinate system described herein in which the y-axis 68 movement coincides with the y-axis 68 of the coordinate system, and the x-axis 66 is aligned such that x-axis 66 movement causes the greatest change in direction in the x-axis 66, moderate change in the y-axis 68, and no change in the z-axis 70. In some embodiments, these coordinates can be retained in a computer memory (for example, system memory 3412) for later retrieval. In some embodiments, at block 2815a, tip and tail coordinates for neutral are accessed (i.e., retrieved). In some embodiments, at block 2820a, tip and tail are translated along the Z-tube 50 neutral unit vector by the monitored Z-tube 50 counts. In some embodiments, at block 2825a, an instantaneous axis of rotation (“IAR”) is accessed. The IAR is the same as the helical axis of motion for pitch 60 in the neutral position, ignoring the element of translation along the helical axis. As described earlier, in some embodiments, the vectors for this IAR were previously stored in computer memory at the time the coordinate system of the robot 15 was calibrated. In some embodiments, at block 2830a, the tip coordinate, tail coordinate, and IAR vector direction and location coordinates are transformed (for example, iteratively transformed) to a new coordinate system in which the IAR is aligned with the x-axis. In some embodiments, data indicative of such transformations (T2) can be stored. In some embodiments, at block 2835a, the tip coordinate and tail coordinate are rotated about the x-axis by the pitch 60 angle. In some embodiments, at block 2840a, the tip coordinate and tail coordinate are transformed back to the previous coordinate system by the inverse of T2. In some embodiments, at block 2845a, previously stored vectors that represent the IAR for roll 62 are accessed. In some embodiments, at block 2850a, the tip coordinate, tail coordinate, and IAR coordinates are transformed (for example, iteratively transformed) to a new coordinate system in which the IAR is aligned with the y-axis 68. In some embodiments, data indicative of such transformation(s) (T3) can be retained in memory. In some embodiments, at block 2855a, the tip coordinate and tail coordinate are rotated about the y-axis 68 by the roll 62 angle. In some embodiments, at block 2860a, the tip coordinate and tail coordinate are transformed back to the previous coordinate system by the inverse of T3. In some embodiments, at block 2865a, the tip coordinate and tail coordinate are translated along a y-axis 68 unit vector (e.g., a vector aligned in this coordinate system with the y-axis 68) by the monitored counts. In some embodiments, at block 2870a, the tip coordinate and tail coordinate are transformed back to the current coordinate system monitored by the tracking system 3417 by the inverse of T1.
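Because transforming so that the IAR coincides with a coordinate axis, rotating, and transforming back (the T2 and T3 steps) is mathematically equivalent to rotating directly about the IAR line, the block 2820a through 2865a sequence can be sketched compactly as below. NumPy, the (point, direction) representation of each IAR, and the counts-to-millimeter scalings are assumptions; the final inverse-T1 step back to the tracking system's frame is omitted.

```python
import numpy as np

def rotate_about_line(points, line_point, line_dir, angle_rad):
    """Equivalent of transform-to-axis, rotate, transform back (T2/T3)."""
    u = np.asarray(line_dir, dtype=float)
    u = u / np.linalg.norm(u)
    K = np.array([[0, -u[2], u[1]], [u[2], 0, -u[0]], [-u[1], u[0], 0]])
    R = np.eye(3) + np.sin(angle_rad) * K + (1 - np.cos(angle_rad)) * (K @ K)
    return (np.asarray(points, dtype=float) - line_point) @ R.T + line_point

def update_tip_tail(tip, tail, ztube_unit, ztube_counts, ztube_mm_per_count,
                    pitch_iar_point, pitch_iar_dir, pitch_rad,
                    roll_iar_point, roll_iar_dir, roll_rad,
                    y_unit, y_counts, y_mm_per_count):
    """Apply the block 2815a-2865a sequence to the stored neutral tip/tail."""
    pts = np.vstack([tip, tail]).astype(float)
    pts += np.asarray(ztube_unit) * ztube_counts * ztube_mm_per_count   # 2820a
    pts = rotate_about_line(pts, pitch_iar_point, pitch_iar_dir, pitch_rad)  # 2825a-2840a
    pts = rotate_about_line(pts, roll_iar_point, roll_iar_dir, roll_rad)     # 2845a-2860a
    pts += np.asarray(y_unit) * y_counts * y_mm_per_count               # 2865a
    return pts[0], pts[1]
```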
In some embodiments, in order to establish a robot 15 coordinate system and calibrate the relative orientations of the axes of movement of the robot 15, a tip and tail of a line segment representing the vector in line with the end-effectuator 30, with temporarily attached tracking markers 720, are located. In some embodiments, the vector's position in space can be determined by finding the finite helical axis of motion of markers manually rotated to two positions around the guide tube 50. In other embodiments, the vector's position in space can be determined by connecting a point located at the entry of the guide tube 50 (identified by a temporarily mounted rigid body 690 with tracking markers 720) to a point located at the exit of the guide tube 50 (identified by a second temporarily mounted rigid body 690 with tracking markers 720).
In some embodiments, the tip of the line segment can be found as the point along the vector that is closest to the vector representing the helical axis of motion during pitch. In some embodiments, the tail of the line segment can be set an arbitrary distance (for example, approximately 100 mm) up the vector aligned with the guide tube/first helical axis. In some embodiments, the Cartesian coordinates of these tip and tail positions can be transformed to a coordinate system described herein in which the y-axis 68 movement substantially coincides with the y-axis 68 of the coordinate system, and the x-axis 66 is aligned such that x-axis 66 movement causes the greatest change in direction in the x-axis 66, slight change in the y-axis 68, and no change in the z-axis 70. It should be appreciated that such coordinates can be retained in memory (for example, system memory 3412) for later retrieval. In some embodiments, at block 2815b, tip and tail coordinates for the neutral position are accessed (i.e., retrieved or otherwise obtained). In some embodiments, at block 2820b, tip and tail are translated along the Z-tube 50 neutral unit vector by the monitored Z-tube 50 counts. In some embodiments, at block 2825b, the IAR is accessed. In one implementation, the vectors for this IAR may be available in a computer memory; for example, such vectors may be retained in the computer memory at the time the coordinate system of the robot 15 is calibrated in accordance with one or more embodiments described herein. In some embodiments, at block 2830b, the tip coordinate, tail coordinate, and IAR vector direction and location coordinates are transformed to a new coordinate system in which the IAR is aligned with the x-axis 66. In some embodiments, data indicative of the applied transformations (T2) can be retained in a computer memory. In some embodiments, at block 2835b, the tip coordinate and tail coordinate are rotated about the x-axis 66 by the pitch 60 angle. In some embodiments, at block 2840b, the tip coordinate and tail coordinate are transformed back to the previous coordinate system by applying the inverse of T2. In some embodiments, at block 2870b, the tip coordinate and tail coordinate are transformed back to the current coordinate system monitored by the tracking system 3417 by applying the inverse of T1.
The various embodiments of the invention can be operational with numerous other general purpose or special purpose computing system environments or configurations. Examples of well-known computing systems, environments, and/or configurations that can be suitable for use with the systems and methods of the invention comprise personal computers, server computers, laptop devices or handheld devices, and multiprocessor systems. Additional examples comprise mobile devices, programmable consumer electronics, network PCs, minicomputers, mainframe computers, distributed computing environments that comprise any of the above systems or devices, and the like.
In some embodiments, the processing effected in the disclosed systems and methods can be performed by software components. In some embodiments, the disclosed systems and methods can be described in the general context of computer-executable instructions, such as program modules, being executed by one or more computers, such as computing device 3401, or other computing devices. Generally, program modules comprise computer code, routines, programs, objects, components, data structures, etc., that perform particular tasks or implement particular abstract data types. The disclosed methods also can be practiced in grid-based and distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules can be located in both local and remote computer storage media including memory storage devices.
Further, one skilled in the art will appreciate that the systems and methods disclosed herein can be implemented via a general-purpose computing device in the form of the computing device 3401. In some embodiments, the components of the computing device 3401 can comprise, but are not limited to, one or more processors 3403, or processing units 3403, a system memory 3412, and a system bus 3413 that couples various system components including the processor 3403 to the system memory 3412. In some embodiments, in the case of multiple processing units 3403, the system can utilize parallel computing.
In general, a processor 3403 or a processing unit 3403 refers to any computing processing unit or processing device comprising, but not limited to, single-core processors; single-processors with software multithread execution capability; multi-core processors; multi-core processors with software multithread execution capability; multi-core processors with hardware multithread technology; parallel platforms; and parallel platforms with distributed shared memory. Additionally or alternatively, a processor 3403 or processing unit 3403 can refer to an integrated circuit, an application specific integrated circuit (ASIC), a digital signal processor (DSP), a field programmable gate array (FPGA), a programmable logic controller (PLC), a complex programmable logic device (CPLD), discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. Processors or processing units referred to herein can exploit nano-scale architectures, such as molecular and quantum-dot based transistors, switches, and gates, in order to optimize space usage or enhance performance of the computing devices that can implement the various aspects of the subject invention. In some embodiments, processor 3403 or processing unit 3403 also can be implemented as a combination of computing processing units.
The system bus 3413 represents one or more of several possible types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures can comprise an Industry Standard Architecture (ISA) bus, a Micro Channel Architecture (MCA) bus, an Enhanced ISA (EISA) bus, a Video Electronics Standards Association (VESA) local bus, an Accelerated Graphics Port (AGP) bus, a Peripheral Component Interconnect (PCI) bus, a PCI-Express bus, a Personal Computer Memory Card International Association (PCMCIA) bus, a Universal Serial Bus (USB), and the like. The bus 3413, and all buses specified in this specification and the annexed drawings, also can be implemented over a wired or wireless network connection, and each of the subsystems, including the processor 3403, a mass storage device 3404, an operating system 3405, robotic guidance software 3406, robotic guidance data storage 3407, a network adapter 3408, system memory 3412, an input/output interface 3410, a display adapter 3409, a display device 3411, and a human machine interface 3402, can be contained within one or more remote computing devices 3414a,b at physically separate locations, functionally coupled (e.g., communicatively coupled) through buses of this form, in effect implementing a fully distributed system.
In some embodiments, robotic guidance software 3406 can configure the computing device 3401, or a processor thereof, to perform the automated control of position of the local robot 3416 (for example, surgical robot 15) in accordance with aspects of the invention. Such control can be enabled, at least in part, by a tracking system 3417. In some embodiments, when the computing device 3401 embodies the computer 100 functionally coupled to surgical robot 15, robotic guidance software 3406 can configure such computer 100 to perform the functionality described in the subject invention. In some embodiments, robotic guidance software 3406 can be retained in a memory as a group of computer-accessible instructions (for instance, computer-readable instructions, computer-executable instructions, or computer-readable computer-executable instructions). In some embodiments, the group of computer-accessible instructions can encode the methods of the invention (such as the methods illustrated herein).
Some embodiments include robotic guidance data storage 3407 that can comprise various types of data that can permit implementation (e.g., compilation, linking, execution, and combinations thereof) of the robotic guidance software 3406. In some embodiments, robotic guidance data storage 3407 can comprise data associated with intraoperative imaging, automated adjustment of position of the local robot 3416 and/or remote robot 3422, or the like. In some embodiments, the data retained in the robotic guidance data storage 3407 can be formatted according to industry-standard image data formats. As illustrated, in some embodiments, a remote tracking system 3424 can enable, at least in part, control of the remote robot 3422. In some embodiments, such data can comprise tracking information, trajectory information, surgical procedure information, safety protocols, and so forth.
In some embodiments of the invention, the computing device 3401 typically comprises a variety of computer-readable media. Such readable media can be any available media that is accessible by the computer 3401 and comprises, for example and not meant to be limiting, both volatile and non-volatile media, removable and non-removable media. In some embodiments, the system memory 3412 comprises computer-readable media in the form of volatile memory, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM). In some embodiments, the system memory 3412 typically contains data (such as a group of tokens employed for code buffers) and/or program modules such as operating system 3405 and robotic guidance software 3406 that are immediately accessible to, and/or are presently operated on by, the processing unit 3403. In some embodiments, operating system 3405 can comprise operating systems such as the Windows operating system, Unix, Linux, Symbian, Android, the Apple iOS operating system, Chromium, and substantially any operating system for wireless computing devices or tethered computing devices. Apple® is a trademark of Apple Computer, Inc., registered in the United States and other countries. iOS® is a registered trademark of Cisco and used under license by Apple Inc. Microsoft® and Windows® are either registered trademarks or trademarks of Microsoft Corporation in the United States and/or other countries. Android® and Chrome® are registered trademarks of Google Inc. Symbian® is a registered trademark of Symbian Ltd. Linux® is a registered trademark of Linus Torvalds. UNIX® is a registered trademark of The Open Group.
In some embodiments, computing device 3401 can comprise other removable/non-removable, volatile/non-volatile computer storage media. As illustrated, in some embodiments, computing device 3401 comprises a mass storage device 3404 which can provide non-volatile storage of computer code (e.g., computer-executable instructions), computer-readable instructions, data structures, program modules, and other data for the computing device 3401. For instance, in some embodiments, a mass storage device 3404 can be a hard disk, a removable magnetic disk, a removable optical disk, magnetic cassettes or other magnetic storage devices, flash memory cards, CD-ROM, digital versatile disks (DVD) or other optical storage, random access memories (RAM), read only memories (ROM), electrically erasable programmable read-only memory (EEPROM), and the like.
In some embodiments, optionally, any number of program modules can be stored on the mass storage device 3404, including, by way of example, an operating system 3405 and tracking software 3406. In some embodiments, each of the operating system 3405 and tracking software 3406 (or some combination thereof) can comprise elements of the programming and the tracking software 3406. In some embodiments, data and code (for example, computer-executable instructions, patient-specific trajectories, and patient 18 anatomical data) can be retained as part of tracking software 3406 and stored on the mass storage device 3404. In some embodiments, tracking software 3406, and related data and code, can be stored in any of one or more databases known in the art. Examples of such databases comprise DB2®, Microsoft® Access, Microsoft® SQL Server, Oracle®, MySQL, PostgreSQL, and the like. Further examples include membase databases and flat file databases. The databases can be centralized or distributed across multiple systems.
DB2® is a registered trademark of IBM in the United States.
Microsoft®, Microsoft® Access®, and Microsoft® SQL Server™ are either registered trademarks or trademarks of Microsoft Corporation in the United States and/or other countries.
Oracle® is a registered trademark of Oracle Corporation and/or its affiliates.
MySQL® is a registered trademark of MySQL AB in the United States, the European Union and other countries.
PostgreSQL® and the PostgreSQL® logo are trademarks or registered trademarks of The PostgreSQL Global Development Group, in the U.S. and other countries.
In some embodiments, an agent (for example, a surgeon or other user, or equipment) can enter commands and information into the computing device 3401 via an input device (not shown). Examples of such input devices can comprise, but are not limited to, a camera (or other detection device for non-optical tracking markers), a keyboard, a pointing device (for example, a mouse), a microphone, a joystick, a scanner (for example, a barcode scanner), reader devices such as radio-frequency identification (RFID) readers or magnetic stripe readers, gesture-based input devices such as tactile input devices (for example, touch screens, gloves, and other body coverings or wearable devices), speech recognition devices, or natural interfaces, and the like. In some embodiments, these and other input devices can be connected to the processing unit 3403 via a human machine interface 3402 that is coupled to the system bus 3413. In some other embodiments, they can be connected by other interface and bus structures, such as a parallel port, a game port, an IEEE 1394 port (also known as a firewire port), a serial port, or a universal serial bus (USB).
In some further embodiments, a display device 3411 can also be functionally coupled to the system bus 3413 via an interface, such as a display adapter 3409. In some embodiments, the computer 3401 can have more than one display adapter 3409, and the computer 3401 can have more than one display device 3411. For example, in some embodiments, a display device 3411 can be a monitor, a liquid crystal display, or a projector. Further, in addition to the display device 3411, some embodiments can include other output peripheral devices that can comprise components such as speakers (not shown) and a printer (not shown) capable of being connected to the computer 3401 via the input/output interface 3410. In some embodiments, the input/output interface 3410 can be a pointing device, either tethered to, or wirelessly coupled to, the computing device 3401. In some embodiments, any step and/or result of the methods can be output in any form to an output device. In some embodiments, the output can be any form of visual representation, including, but not limited to, textual, graphical, animation, audio, tactile, and the like.
In certain embodiments, one or more cameras (for example, camera 8200) can serve as detection devices that track the positions of markers for the tracking system 3417.
Some embodiments include a computing device 3401 that can operate in a networked environment (for example, an industrial environment) using logical connections to one or more remote computing devices 3414a,b, a remote robot 3422, and a tracking system 3424. By way of example, in some embodiments, a remote computing device can be a personal computer, a portable computer, a mobile telephone, a server, a router, a network computer, a peer device or other common network node, and so on. In particular, in some embodiments, an agent (for example, a surgeon or other user, or equipment) can point to other tracked structures, including anatomy of a patient 18, using a remote computing device 3414 such as a hand-held probe that is capable of being tracked and sterilized. In some embodiments, logical connections between the computer 3401 and a remote computing device 3414a,b can be made via a local area network (LAN) and a general wide area network (WAN). In some embodiments, the network connections can be implemented through a network adapter 3408. In some embodiments, the network adapter 3408 can be implemented in both wired and wireless environments. Some embodiments include networking environments that can be conventional and commonplace in offices, enterprise-wide computer networks, and intranets. In some embodiments, the networking environments generally can be embodied in wire-line networks or wireless networks (for example, cellular networks, such as third generation ("3G") and fourth generation ("4G") cellular networks, or facility-based networks (for example, femtocell, picocell, or Wi-Fi networks)). In some embodiments, a group of one or more networks 3415 can provide such networking environments. In some embodiments of the invention, the one or more network(s) can comprise a LAN deployed in an industrial environment comprising the system 1 described herein.
As an illustration, in some embodiments, application programs and other executable program components such as the operating system 3405 are illustrated herein as discrete blocks, although it is recognized that such programs and components reside at various times in different storage components of the computing device 3401, and are executed by the data processor(s) of the computer 100. Some embodiments include an implementation of tracking software 3406 that can be stored on or transmitted across some form of computer readable media. Any of the disclosed methods can be performed by computer readable instructions embodied on computer readable media. Computer readable media can be any available media that can be accessed by a computer. By way of example and not meant to be limiting, computer-readable media can comprise “computer storage media,” or “computer-readable storage media,” and “communications media.” “Computer storage media” comprise volatile and non-volatile, removable and non-removable media implemented in any methods or technology for storage of information such as computer readable instructions, data structures, program modules, or other data. In some embodiments of the invention, computer storage media comprises, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a computer.
As described herein, some embodiments include the computing device 3401 that can control operation of local robots 3416 and/or remote robots 3422. Within embodiments in which the local robot 3416 or the remote robot 3422 are surgical robots 15, the computing device 3401 can execute robotic guidance software 3406 to control such robots 3416, 3422, 15. In some embodiments, the robotic guidance software 3406, in response to execution, can utilize trajectories (such as tip and tail coordinates) that can be planned and/or configured remotely or locally. In an additional or alternative aspect, in response to execution, the robotic guidance software 3406 can implement one or more of the methods described herein in a local robot's computer or a remote robot's computer to cause movement of the remote robot 15 or the local robot 15 according to one or more trajectories.
In some embodiments, the computing device 3401 can enable pre-operative planning of the surgical procedure.
In some embodiments, the computing device 3401 can permit spatial positioning and orientation of a surgical tool (for example, instrument 35) during intraoperative procedures. In some further embodiments, the computing device 3401 can enable open procedures. In some other embodiments, the computing device 3401 can enable percutaneous procedures.
In certain embodiments, the computing device 3401 and the robotic guidance software 3406 can embody a 3D tracking system 3417 to simultaneously monitor the positions of the device and the anatomy of the patient 18. In some embodiments, the 3D tracking system 3417 can be configured to cast the patient's anatomy and the end-effectuator 30 in a common coordinate system.
In some embodiments, the computing device 3401 can access (i.e., load) image data from a conventional static storage device. In some embodiments, the computing device 3401 can permit a 3D volumetric representation of patient 18 anatomy to be loaded into memory (for example, system memory 3412) and displayed (for example, via display device 3411).
In some embodiments, the computing device 3401, in response to execution of the robotic guidance software 3406, can enable navigation through the 3D volume representation of a patient's anatomy.
In some embodiments, the computing device 3401 can operate with a conventional power source as required to offer the device for sale in the specified country. A conventional power cable that supplies power can be of sufficient length to reach conventional hospital power outlets. In some embodiments, in the event of a power loss, the computing device 3401 can hold the end-effectuator 30 in its current position unless an agent (for example, a surgeon or other user, or equipment) manually moves the end-effectuator 30.
In some embodiments, the computing device 3401 can monitor system physical condition data. In some embodiments, the computing device 3401 can report to an operator (for example, a surgeon) each of the physical condition data and indicate an out-of-range value.
In some embodiments, the computing device 3401 can enable entry and storage of manufacturing calibration values for end-effectuator 30 positioning using, for example, the input/output interface 3410.
In some embodiments, the computing device 3401 can enable access to manufacturing calibration values by an agent (for example, a surgeon or other user, or equipment) authenticated to an appropriate access level. In some embodiments, the data can be retained in robotic guidance data storage 3407, or can be accessed via network(s) 3415 when the data is retained in a remote computing device 3414a.
In some embodiments, the computing device 3401 can render (using for example display device 3411) a technical screen with a subset of the end-effectuator 30 positioning calibration and system health data. The information is only accessible to an agent (for example, a surgeon or other user, or equipment) authenticated to an appropriate level.
In some embodiments, the computing device 3401 can enable field calibration of end-effectuator 30 positioning only by an agent (for example, a surgeon or other user, or equipment) authenticated to an appropriate access level.
In some embodiments, the computing device 3401 can convey the status of local robot 3416, remote robot 3422, and/or other device being locked in position using a visual or aural alert.
In some further embodiments, the computing device 3401 can include an emergency stop control that upon activation, disables power to the device's motors 160 but not to the processor 3403. In some embodiments, the emergency stop control can be accessible by the operator of computing device 3401. In some embodiments, the computing device 3401 can monitor the emergency stop status and indicate to the operator that the emergency stop has been activated.
In some other embodiments, the computing device 3401 can be operated in a mode that permits manual positioning of the end-effectuator 30.
In some embodiments, the computing device 3401 can boot directly to an application representing the robotic guidance software 3406. In some embodiments, computing device 3401 can perform a system check prior to each use. In scenarios in which the system check fails, the computing device 3401 can notify an operator.
In some embodiments, the computing device 3401 can generate an indicator for reporting system status.
Some embodiments include the computing device 3401 that can minimize or can mitigate delays in processing, and in the event of a delay in processing, notify an agent (for example, a surgeon or other user, or equipment). For example, in some embodiments, a delay may occur while a system scan is being performed to assess system status, and consequently the computing device 3401 can schedule (for example, generate a process queue) system scans to occur at low usage times. In some embodiments, a system clock of the computing device 3401 can be read before and after key processes to assess the length of time required to complete computation of a process. In some embodiments, the actual time to complete the process can be compared to the expected time. In some embodiments, if a discrepancy is found to be beyond an acceptable tolerance, the agent can be notified, and/or concurrently running non-essential computational tasks can be terminated. In one embodiment, a conventional system clock (not shown) can be part of processor 3403.
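The before-and-after clock read described above can be sketched as a thin wrapper around any key process; the expected duration, tolerance, and notification callback below are illustrative assumptions rather than values from the source.

```python
import time

def timed_call(fn, expected_seconds, tolerance_seconds, notify, *args, **kwargs):
    """Read the clock before and after a key process and compare the actual
    duration to the expected one, notifying the agent on a large discrepancy."""
    start = time.monotonic()
    result = fn(*args, **kwargs)
    elapsed = time.monotonic() - start
    if elapsed > expected_seconds + tolerance_seconds:
        notify(f"process took {elapsed:.3f}s, expected ~{expected_seconds:.3f}s")
    return result
```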
In some embodiments, the computing device 3401 can generate a display that follows a standardized workflow.
In some embodiments, the computing device 3401 can render or ensure that text is rendered in a font of sufficient size and contrast to be readable from an appropriate distance.
In some embodiments, the computing device 3401 can enable an operator to locate the intended position of a surgical implant or tool.
In some further embodiments, the computing device 3401 can determine the relative position of the end-effectuator 30 to the anatomy of the patient 18. For example, to at least such end, the computing device 3401 can collect data from the optical tracking system 3417, and can analyze the data to generate data indicative of such relative position.
In some embodiments, the computing device 3401 can indicate the end-effectuator 30 position and orientation.
In some embodiments, the computing device 3401 can enable continuous control of end-effectuator 30 position relative to the anatomy of a patient 18.
In some embodiments, the computing device 3401 can enable an agent (for example, a surgeon or other user, or equipment) to mark the intended position of a surgical implant or tool (for example, instrument 35).
In some embodiments, the computing device 3401 can allow the position and orientation of a conventional hand-held probe (or an instrument 35) to be displayed overlaid on images of the patient's anatomy.
In some embodiments, the computing device 3401 can enable an agent (for example, a surgeon or other user, or equipment) to position conventional surgical screws. In some embodiments, the computing device 3401 can enable selection of the length and diameter of surgical screws by the agent. In yet another aspect, the computing device can ensure that the relative position, size and scale of screws are maintained on the display 3411 when in graphical representation. In some embodiments, the computing device 3401 can verify screw path plans against an operation envelope and reject screw path plans outside this envelope. In still another aspect, the computing device 3401 can enable hiding of a graphical screw representation.
In some embodiments, the computing device 3401 can enable a function that allows the current view to be stored. In some embodiments, the computing device 3401 can enable a view reset function that sets the current view back to a previously stored view.
In some embodiments, the computing device 3401 can enable an authentication based tiered access system.
In some embodiments, the computing device 3401 can log and store system activity. In some embodiments, the computing device 3401 can enable access to the system activity log to an agent authorized to an appropriate level.
In some embodiments, the computing device 3401 can enable entry and storage of patient 18 data.
In some embodiments, the computing device 3401 can enable the appropriate disposition of patient 18 data and/or procedure data. For example, in a scenario in which such data are being collected for research, the computing device 3401 can implement de-identification of the data in order to meet patient 18 privacy requirements. In some embodiments, the de-identification can be implemented in response to execution of computer-executable instruction(s) retained in memory 3412 or any other memory accessible to the computing device 3401. In some embodiments, the de-identification can be performed automatically before the patient 18 data and/or procedure data are sent to a repository or any other data storage (including mass storage device 3404, for example). In some embodiments, indicia (e.g., a dialog box) can be rendered (for example, at display device 3411) to prompt an agent (e.g., machine or human) to permanently delete patient 18 data and/or procedure data at the end of a procedure.
In some embodiments, the distance between each of the calibrating transmitters 120 relative to each other is measured prior to calibration step 210. Each calibrating transmitter 120 transmits RF signals on a different frequency so that the positioning sensors 12 can determine which transmitter 120 emitted a particular RF signal. In some embodiments, the signal of each of these transmitters 120 is received by positioning sensors 12. In some embodiments, since the distance between each of the calibrating transmitters 120 is known, and the sensors 12 can identify the signals from each of the calibrating transmitters 120 based on the known frequency, using time of flight calculation, the positioning sensors 12 are able to calculate the spatial distance of each of the positioning sensors 12 relative to each other. The system 1 is now calibrated. As a result, in some embodiments, the positioning sensors 12 can now determine the spatial position of any new RF transmitter 120 introduced into the room 10 relative to the positioning sensors 12.
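The frequency-keyed, time-of-flight ranging described here reduces to a short computation. The sketch below assumes synchronized transmit/receive timestamps are available and uses illustrative data shapes; neither assumption is specified in the source.

```python
C_M_PER_S = 299_792_458.0  # speed of light in m/s

def sensor_ranges(signals, frequency_to_transmitter):
    """signals: [(frequency_hz, t_transmit_s, t_receive_s), ...] for one sensor.
    Each transmitter is identified by its unique frequency; the range to it
    follows from the signal's time of flight."""
    ranges = {}
    for freq, t_tx, t_rx in signals:
        tx_id = frequency_to_transmitter[freq]
        ranges[tx_id] = (t_rx - t_tx) * C_M_PER_S
    return ranges
```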
Some embodiments include a step 220a in which a 3D anatomical image scan, such as a CT scan, is taken of the anatomical target. Any 3D anatomical image scan may be used with the surgical robot 15 and is within the scope of the present invention.
In some embodiments, at step 230, the positions of the RF transmitters 120 tracking the anatomical target are read by positioning sensors 110. These transmitters 120 identify the initial position of the anatomical target and any changes in position during the procedure.
In some embodiments, if any RF transmitters 120 must transmit through a medium that changes the RF signal characteristics, then the system will compensate for these changes when determining the transmitter's 120 position.
In some embodiments, at step 240, the positions of the transmitters 120 on the anatomy are calibrated relative to the LPS coordinate system. In other words, the LPS provides a reference system, and the location of the anatomical target is calculated relative to the LPS coordinates. In some embodiments, to calibrate the anatomy relative to the LPS, the positions of transmitters 120 affixed to the anatomical target are recorded at the same time as positions of temporary transmitters 120 placed on precisely known anatomical landmarks also identified on the anatomical image. This calculation is performed by a computer 100.
In some embodiments, at step 250, the positions of the RF transmitters 120 that track the anatomical target are read. Since the locations of the transmitters 120 on the anatomical target have already been calibrated, the system can easily determine if there has been any change in position of the anatomical target.
Some embodiments include a step 260, where the positions of the transmitters 120 on the surgical instrument 35 are read. The transmitters 120 may be located on the surgical instrument 35 itself, and/or there may be transmitters 120 attached to various points of the surgical robot 15.
In some embodiments of the invention, the surgical robot 15 can also include a plurality of attached conventional position encoders that help determine the position of the surgical instrument 35. In some embodiments, the position encoders can be devices used to generate an electronic signal that indicates a position or movement relative to a reference position. In some other embodiments, a position signal can be generated using conventional magnetic sensors, conventional capacitive sensors, and conventional optical sensors.
In some embodiments, position data read from the position encoders may be used to determine the position of the surgical instrument 35 used in the procedure. In some embodiments, the data may be redundant of position data calculated from RF transmitters 120 located on the surgical instrument 35. Therefore, in some embodiments, position data from the position encoders may be used to double-check the position being read from the LPS.
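The encoder/RF double-check can be sketched as a simple discrepancy test; the tolerance value below is an illustrative assumption.

```python
import numpy as np

def positions_consistent(encoder_xyz, rf_xyz, tol_mm=2.0):
    """Double-check the encoder-derived instrument position against the
    position calculated from the RF transmitters 120 (LPS)."""
    delta = np.asarray(encoder_xyz, dtype=float) - np.asarray(rf_xyz, dtype=float)
    return np.linalg.norm(delta) <= tol_mm
```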
In some embodiments, at step 270, the coordinates of the positions of the transmitters 120 on the surgical instrument 35, and/or the positions read from the position encoders, are calibrated relative to the anatomical coordinate system. In other words, in some embodiments, the position data of the surgical instrument 35 is synchronized into the same coordinate system as the patient's anatomy. In some embodiments, this calculation is performed automatically by the computer 100 since the positions of the transmitters 120 on the anatomical target and the positions of the transmitters 120 on the surgical instrument 35 are in the same coordinate system, and the positions of the transmitters 120 on the anatomical target are already calibrated relative to the anatomy.
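Once a rigid transform between the LPS frame and the anatomical frame is known from the calibration at step 240, synchronizing instrument positions is a single affine application. The rotation/translation names below are illustrative; this is a sketch of the coordinate synchronization, not the source's exact computation.

```python
import numpy as np

def lps_to_anatomy(points_lps, R, t):
    """Express LPS-frame transmitter and/or encoder-derived positions in the
    anatomical coordinate system: p_anat = R @ p_lps + t, applied row-wise."""
    return np.asarray(points_lps, dtype=float) @ np.asarray(R).T + np.asarray(t)
```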
In some embodiments, at step 280, the computer 100 superimposes a representation of the location calculated in step 270 of the surgical device on the 3D anatomical image of the patient 18 taken in step 220. In some embodiments, the superimposed image can be displayed to an agent.
In some embodiments, at step 290, the computer 100 sends the appropriate signals to the motors 160 to drive the surgical robot 15. In some embodiments, if the agent preprogrammed a trajectory, then the robot 15 is driven so that the surgical instrument 35 follows the preprogrammed trajectory if there is no further input from the agent. In some embodiments, if there is agent input, then the computer 100 drives the robot 15 in response to the agent input.
In some embodiments, at step 295, the computer 100 determines whether the anatomy needs to be recalibrated. In some embodiments, the agent may choose to recalibrate the anatomy, in which case the computer 100 responds to agent input. Alternatively, in some embodiments, the computer 100 may be programmed to recalibrate the anatomy in response to certain events. For instance, in some embodiments, the computer 100 may be programmed to recalibrate the anatomy if the RF transmitters 120 on the anatomical target indicate that the location of the anatomical target has shifted relative to the RF transmitters 120 (i.e., this spatial relationship should be fixed). In some embodiments, one indicator that the anatomical target location has shifted relative to the transmitters 120 is a calculation by the computer 100 that the surgical instrument 35 appears to be inside bone when no drilling or penetration is actually occurring.
In some embodiments, if the anatomy needs to be recalibrated, then the process beginning at step 230 is repeated. In some embodiments, if the anatomy does not need to be recalibrated, then the process beginning at step 250 is repeated.
In some embodiments, at any time during the procedure, certain fault conditions may cause the computer 100 to interrupt the program and respond accordingly. For instance, in some embodiments, if the signal from the RF transmitters 120 cannot be read, then the computer 100 may be programmed to stop the movement of the robot 15, or remove the surgical instrument 35 from the patient 18. Another example of a fault condition is if the robot 15 encounters a resistance above a preprogrammed tolerance level.
In some embodiments, the distances between the calibrating transmitters 120 are measured prior to calibration step 300. In some embodiments, each calibrating transmitter 120 transmits RF signals on a different frequency so the positioning sensors 12 can determine which transmitter 120 emitted a particular RF signal. In some embodiments, the signal of each of these transmitters 120 is received by the positioning sensors 12. Since the distances between the calibrating transmitters 120 are known, and the sensors 12 can identify the signal from each calibrating transmitter 120 based on its known frequency, in some embodiments the positioning sensors 12 are able, using time-of-flight calculations, to calculate the spatial distance of each of the positioning sensors 12 relative to each other. The system 1 is now calibrated. As a result, in some embodiments, the positioning sensors 12 can now determine the spatial position of any new RF transmitter 120 introduced into the room 10 relative to the positioning sensors 12.
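The time-of-flight arithmetic underlying this calibration can be sketched briefly. The following is a minimal illustration, assuming idealized free-space RF propagation and ranges already derived from measured flight times; the function and variable names are illustrative and not part of the disclosed system.

```python
# A minimal sketch of RF time-of-flight ranging and multilateration,
# assuming idealized free-space propagation and known sensor positions.
import numpy as np

C = 299_792_458.0  # RF propagation speed (m/s)

def tof_to_range(tof_seconds: float) -> float:
    """Convert a measured one-way time of flight into a distance in meters."""
    return C * tof_seconds

def locate_transmitter(sensor_positions: np.ndarray, ranges: np.ndarray) -> np.ndarray:
    """Estimate a transmitter's position from its ranges to four or more
    sensors by linearizing the sphere equations (classic multilateration)."""
    p0, r0 = sensor_positions[0], ranges[0]
    A = 2.0 * (sensor_positions[1:] - p0)
    b = (r0**2 - ranges[1:]**2
         + np.sum(sensor_positions[1:]**2, axis=1) - np.sum(p0**2))
    x, *_ = np.linalg.lstsq(A, b, rcond=None)  # least-squares solve of A x = b
    return x
```

With at least four non-coplanar sensors the least-squares solution is unique, which parallels the use of multiple positioning sensors 12 described above.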
In some embodiments, at step 310, a 3D anatomical image scan, such as a CT scan, is taken of the anatomical target. Any 3D anatomical image scan may be used with the surgical robot 15 and is within the scope of the present invention.
In some embodiments, at step 320, the operator selects a desired trajectory and insertion point of the surgical instrument 35 on the anatomical image captured at step 310. In some embodiments, the desired trajectory and insertion point are programmed into the computer 100 so that the robot 15 can drive a guide tube 50 automatically to follow the trajectory.
In some embodiments, at step 330, the positions of the RF transmitters 120 tracking the anatomical target are read by positioning sensors 110. In some embodiments, these transmitters 120 identify the initial position of the anatomical target and any changes in position during the procedure.
In some embodiments, if any RF transmitters 120 must transmit through a medium that changes the RF signal characteristics, the system will compensate for these changes when determining the position of the transmitter 120.
In some embodiments, at step 340, the positions of the transmitters 120 on the anatomy are calibrated relative to the LPS coordinate system. In other words, the LPS provides a reference system, and the location of the anatomical target is calculated relative to the LPS coordinates. In some embodiments, to calibrate the anatomy relative to the LPS, the positions of transmitters 120 affixed to the anatomical target are recorded at the same time as positions of temporary transmitters 120 on precisely known anatomical landmarks also identified on the anatomical image. This calculation is performed by a computer.
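The calibration of the anatomy to the LPS described in step 340 amounts to a point-based rigid registration. The sketch below shows the standard SVD (Kabsch) solution for such a problem, under the assumption that the same landmarks are available both in LPS coordinates and in image coordinates; this is a common formulation, not necessarily the exact computation performed by the computer 100, and the names are illustrative.

```python
# A minimal sketch of paired-point rigid registration between two
# coordinate systems (e.g., LPS coordinates and image coordinates).
import numpy as np

def register_rigid(src_pts: np.ndarray, dst_pts: np.ndarray):
    """Return rotation R and translation t such that dst ≈ R @ src + t."""
    c_src, c_dst = src_pts.mean(axis=0), dst_pts.mean(axis=0)
    H = (src_pts - c_src).T @ (dst_pts - c_dst)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:      # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t
```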
In some embodiments, at step 350, the positions of the RF transmitters 120 that track the anatomical target are read. Since the locations of the transmitters 120 on the anatomical target have already been calibrated, in some embodiments, the system can easily determine if there has been any change in position of the anatomical target.
In some embodiments, at step 360, the positions of the transmitters 120 on the surgical instrument 35 are read. In some embodiments, the transmitters 120 may be located on the surgical instrument 35, and/or attached to various points of the surgical robot 15.
In some embodiments, at step 370, the coordinates of the positions of the transmitters 120 on the surgical instrument 35, and/or the positions read from the position encoders, are calibrated relative to the anatomical coordinate system. In other words, the position data of the surgical instrument 35 is synchronized into the same coordinate system as the anatomy. This calculation is performed automatically by the computer 100 since the positions of the transmitters 120 on the anatomical target and the positions of the transmitters 120 on the surgical instrument 35 are in the same coordinate system and the positions of the transmitters 120 on the anatomical target are already calibrated relative to the anatomy.
In some embodiments, at step 380, the computer 100 superimposes a representation of the location calculated in step 370 of the surgical device on the 3D anatomical image of the patient 18 taken in step 310. The superimposed image can be displayed to the user.
In some embodiments, at step 390, the computer 100 determines whether the guide tube 50 is in the correct orientation and position to follow the trajectory planned at step 320. If it is not, then step 393 is reached. If it is in the correct orientation and position to follow the trajectory, then step 395 is reached.
In some embodiments, at step 393, the computer 100 determines what adjustments it needs to make in order to make the guide tube 50 follow the preplanned trajectory. The computer 100 sends the appropriate signals to drive the motors 160 in order to correct the movement of the guide tube.
In some embodiments, at step 395, the computer 100 determines whether the procedure has been completed. If the procedure has not been completed, then the process beginning at step 350 is repeated.
In some embodiments, at any time during the procedure, certain fault conditions may cause the computer 100 to interrupt the program and respond accordingly. For instance, if the signal from the RF transmitters 120 cannot be read, then the computer 100 may be programmed to stop the movement of the robot 15 or lift the guide tube 50 away from the patient 18. Another example of a fault condition is if the robot 15 encounters a resistance above a preprogrammed tolerance level. Another example of a fault condition is if the RF transmitters 120 on the anatomical target shift so that actual and calculated positions of the anatomy no longer match. One indicator that the anatomical target location has shifted relative to the transmitters 120 is if the computer 100 calculates that the surgical instrument 35 appears to be inside bone when no drilling or penetration is actually occurring.
In some embodiments, the proper response to each condition may be programmed into the system, or a specific response may be user-initiated. For example, the computer 100 may determine that in response to an anatomy shift, the anatomy would have to be recalibrated, and the process beginning at step 330 should be repeated. Alternatively, a fault condition may require the flowchart to repeat from step 300. Another alternative is that the user may decide that recalibration from step 330 is desired, and initiate that step himself.
Referring now to
In some embodiments, the distances between the calibrating transmitters 120 are measured prior to calibration step 400. Each calibrating transmitter 120 transmits RF signals on a different frequency so the positioning sensors 12 can determine which transmitter 120 emitted a particular RF signal. The signal of each of these transmitters 120 is received by the positioning sensors 12. Since the distances between the calibrating transmitters 120 are known, and the sensors 12 can identify the signal from each calibrating transmitter 120 based on its known frequency, the positioning sensors 12 are able to calculate, using time-of-flight calculations, the spatial distance of each of the positioning sensors 12 relative to each other. The system 1 is now calibrated. As a result, the positioning sensors 12 can now determine the spatial position of any new RF transmitter 120 introduced into the room 10 relative to the positioning sensors 12.
In some embodiments, at step 410, a 3D anatomical image scan, such as a CT scan, is taken of the anatomical target. Any 3D anatomical image scan may be used with the surgical robot 15 and is within the scope of the present invention.
In some embodiments, at step 420, the operator inputs a desired safe zone on the anatomical image taken in step 410. In an embodiment of the invention, the operator uses an input to the computer 100 to draw a safe zone on a CT scan taken of the patient 18 in step 410.
In some embodiments, at step 430, the positions of the RF transmitters 120 tracking the anatomical target are read by positioning sensors. These transmitters 120 identify the initial position of the anatomical target and any changes in position during the procedure.
In some embodiments, if any RF transmitters 120 must transmit through a medium that changes the RF signal characteristics, then the system will compensate for these changes when determining the position of the transmitter 120.
In some embodiments, at step 440, the positions of the transmitters 120 on the anatomy are calibrated relative to the LPS coordinate system. In other words, the LPS provides a reference system, and the location of the anatomical target is calculated relative to the LPS coordinates. To calibrate the anatomy relative to the LPS, the positions of transmitters 120 affixed to the anatomical target are recorded at the same time as positions of temporary transmitters 120 on precisely known landmarks on the anatomy that can also be identified on the anatomical image. This calculation is performed by a computer 100.
In some embodiments, at step 450, the positions of the RF transmitters 120 that track the anatomical target are read. Since the locations of the transmitters 120 on the anatomical target have already been calibrated, the system can easily determine if there has been any change in position of the anatomical target.
In some embodiments, at step 460, the positions of the transmitters 120 on the surgical instrument 35 are read. The transmitters 120 may be located on the surgical instrument 35 itself, and/or there may be transmitters 120 attached to various points of the surgical robot 15.
In some embodiments, at step 470, the coordinates of the positions of the transmitters 120 on the surgical instrument 35, and/or the positions read from the position encoders, are calibrated relative to the anatomical coordinate system. In other words, the position data of the surgical instrument 35 is synchronized into the same coordinate system as the anatomy. This calculation is performed automatically by the computer 100 since the positions of the transmitters 120 on the anatomical target and the positions of the transmitters 120 on the surgical instrument 35 are in the same coordinate system and the positions of the transmitters 120 on the anatomical target are already calibrated relative to the anatomy.
In some embodiments, at step 480, the computer 100 superimposes a representation of the location calculated in step 470 of the surgical device on the 3D anatomical image of the patient 18 taken in step 410. In some embodiments, the superimposed image can be displayed to the user.
In some embodiments, at step 490, the computer 100 determines whether the surgical device attached to the end-effectuator 30 of the surgical robot 15 is within a specified range of the safe zone boundary (for example, within 1 millimeter of reaching the safe zone boundary). In some embodiments, if the end-effectuator 30 is almost to the boundary, then step 493 is reached. In some embodiments, if it is well within the safe zone boundary, then step 495 is reached.
In some embodiments, at step 493, the computer 100 stiffens the arm of the surgical robot 15 in any direction that would allow the user to move the surgical device closer to the safe zone boundary.
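Steps 490 and 493 can be illustrated with a simple proximity test. The sketch below assumes the safe-zone boundary is sampled as points in the image coordinate system and that the tool tip has already been transformed into that system; the 1 millimeter threshold mirrors the example above, and all names are illustrative.

```python
# A minimal sketch of the step-490 proximity test against a sampled
# safe-zone boundary; names and data layout are illustrative.
import numpy as np

def distance_to_boundary(tip: np.ndarray, boundary: np.ndarray) -> float:
    """Minimum distance from the tool tip to the sampled boundary points."""
    return float(np.min(np.linalg.norm(boundary - tip, axis=1)))

def near_boundary(tip: np.ndarray, boundary: np.ndarray,
                  threshold_mm: float = 1.0) -> bool:
    """True when the tip is within threshold_mm of the safe-zone boundary
    (step 493: stiffen the arm); False when well inside (step 495)."""
    return distance_to_boundary(tip, boundary) <= threshold_mm
```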
In some embodiments, at step 495, the computer 100 determines whether the anatomy needs to be recalibrated. In some embodiments, the user may choose to recalibrate the anatomy, in which case the computer 100 responds to user input. Alternatively, in some embodiments, the computer 100 may be programmed to recalibrate the anatomy in response to certain events. For instance, in some embodiments, the computer 100 may be programmed to recalibrate the anatomy if the RF transmitters 120 on the anatomical target indicate that the location of the anatomical target has shifted relative to the RF transmitters 120 (i.e., this spatial relationship should be fixed). In some embodiments, an indicator that the anatomical target location has shifted relative to the transmitters 120 is if the computer 100 calculates that the surgical instrument 35 appears to be inside bone when no drilling or penetration is actually occurring.
In some embodiments, if the anatomy needs to be recalibrated, then the process beginning at step 430 is repeated. In some embodiments, if the anatomy does not need to be recalibrated, then the process beginning at step 450 is repeated.
In some embodiments, at any time during the procedure, certain fault conditions may cause the computer 100 to interrupt the program and respond accordingly. For instance, in some embodiments, if the signal from the RF transmitters 120 cannot be read, then the computer 100 may be programmed to stop the movement of the robot 15 or remove the surgical instrument 35 from the patient 18. Another example of a fault condition is if the robot 15 encounters a resistance above a preprogrammed tolerance level.
Referring now to
In some embodiments, the distances between the calibrating transmitters 120 are measured prior to calibration step 500. In some embodiments, each calibrating transmitter 120 transmits RF signals on a different frequency so the positioning sensors 12, 110 can determine which transmitter 120 emitted a particular RF signal. In some embodiments, the signal from each of these transmitters 120 is received by the positioning sensors 12, 110. Since the distances between the calibrating transmitters 120 are known, and the sensors can identify the signal from each calibrating transmitter 120 based on its known frequency, in some embodiments the positioning sensors 12, 110 are able, using time-of-flight calculations, to calculate the spatial distance of each of the positioning sensors 12, 110 relative to each other. The system is now calibrated. As a result, in some embodiments, the positioning sensors 12, 110 can now determine the spatial position of any new RF transmitter 120 introduced into the room 10 relative to the positioning sensors 12, 110.
In some embodiments, at step 510, reference needles that contain the RF transmitters 120 are inserted into the body. The purpose of these needles is to track movement of key regions of soft tissue that will deform during the procedure or with movement of the patient 18.
In some embodiments, at step 520, a 3D anatomical image scan (such as a CT scan) is taken of the anatomical target. Any 3D anatomical image scan may be used with the surgical robot 15 and is within the scope of the present invention. In some embodiments, the anatomical image capture area includes the tips of the reference needles so that the positions of their transmitters 120 can be determined relative to the anatomy.
In some embodiments, at step 530, the RF signals from the catheter tip and reference needles are read.
In some embodiments, at step 540, the position of the catheter tip is calculated. Because the position of the catheter tip relative to the reference needles and the positions of the reference needles relative to the anatomy are known, the computer 100 can calculate the position of the catheter tip relative to the anatomy.
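The step-540 computation is a composition of two known relationships. The following is a minimal sketch, assuming the reference-needle frame is expressed as a rotation and translation relative to the anatomy (from the step-520 scan) and that the tracking system reports the catheter tip in that needle frame; all names are illustrative.

```python
# A minimal sketch of chaining the needle-to-anatomy relationship with
# the tip-in-needle-frame measurement; names are illustrative.
import numpy as np

def tip_in_anatomy(R_needles_to_anat: np.ndarray,
                   t_needles_to_anat: np.ndarray,
                   tip_in_needles: np.ndarray) -> np.ndarray:
    """Express the catheter tip in anatomical (image) coordinates."""
    return R_needles_to_anat @ tip_in_needles + t_needles_to_anat
```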
In some embodiments, at step 550, the superimposed representations of the catheter tip and shaft are displayed on the anatomical image taken in step 520.
In some embodiments, at step 560, the computer 100 determines whether the catheter tip is advancing toward the anatomical target. If it is not advancing toward the anatomical target, then step 563 is reached. If it is advancing correctly, then step 570 is reached.
In some embodiments, at step 563, the robot 15 arm is adjusted to guide the catheter tip in the desired direction. If the anatomy needs to be recalibrated, then in some embodiments the process beginning at step 520 is repeated. If the anatomy does not need to be recalibrated, then the process beginning at step 540 is repeated.
In some embodiments, at step 570, the computer 100 determines whether the procedure has been completed. If the procedure has not been completed, then the process beginning at step 540 is repeated.
In some embodiments, at any time during the procedure, certain fault conditions may cause the computer 100 to interrupt the program and respond accordingly. For instance, in some embodiments, if the signal from the RF transmitters 120 cannot be read, then the computer 100 may be programmed to stop the movement of the robot 15 or remove the flexible catheter from the patient 18. Another example of a fault condition is if the robot 15 encounters a resistance above a preprogrammed tolerance level. A further example of a fault condition is if the RF transmitters 120 on the anatomical target indicate that the location of the anatomical target has shifted so that actual and calculated positions of the anatomy no longer match. In some embodiments, one indicator that the anatomical target location has shifted relative to the transmitters 120 is if the computer 100 calculates that the surgical instrument 35 appears to be inside bone when no drilling or penetration is actually occurring.
In some embodiments, the proper response to each condition may be programmed into the system, or a specific response may be user-initiated. For example, in some embodiments, the computer 100 may determine that in response to an anatomy shift, the anatomy would have to be recalibrated, and the process beginning at step 520 should be repeated. Alternatively, in some embodiments, a fault condition may require the flowchart to repeat from step 500. In other embodiments, the user may decide that recalibration from step 520 is desired, and initiate that step himself.
Referring now to
In some embodiments, after selecting the desired 3D image of the surgical target 630, the user will plan the appropriate trajectory on the selected image. In some embodiments, an input control is used with the software in order to plan the trajectory of the surgical instrument 35. In one embodiment of the invention, the input control is in the shape of a biopsy needle 8110 for which the user can plan a trajectory.
As described earlier, in some embodiments, the surgical robot 15 can be used with alternate guidance systems other than an LPS. In some embodiments, the surgical robot system 1 can comprise a targeting fixture 690 for use with a guidance system. In some embodiments, one targeting fixture 690 comprises a calibration frame 700, as shown in
As shown in
In some embodiments, through factory calibration or other calibration method(s), such as pivoting calibration, the location of the probe tip relative to the rigid body of the probe can be established. In some embodiments, it can then be possible to calculate the location of the probe's tip from the probe's active markers 720. In some embodiments, for a probe with a concave tip that is calibrated as previously described, the point in space returned during operation of the probe can represent a point distal to the tip of the probe at the center of the tip's concavity. Therefore, in some embodiments, when a probe (configured with a concave tip and calibrated to a marker 730 of the same or nearly the same diameter as the targeting fixture's radio-opaque marker 730) is touched to the radio-opaque marker 730, the probe can register the center of the sphere. In some embodiments, active markers 720 can also be placed on the robot in order to monitor a position of the robot 15 and calibration frame 700 simultaneously or nearly simultaneously.
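The pivoting calibration mentioned above is commonly posed as a least-squares problem: while the probe tip rests in a fixed divot, the tracker reports the probe body pose over many frames, and the fixed tip offset is solved for. The sketch below shows this textbook formulation, assuming per-frame rotations and translations are available; it is illustrative and not necessarily the implementation used with any particular tracking system.

```python
# A minimal sketch of pivoting calibration. Each frame satisfies
# R_i @ p_tip + t_i = p_pivot, i.e. [R_i  -I] @ [p_tip; p_pivot] = -t_i,
# which stacks into one linear least-squares system.
import numpy as np

def pivot_calibrate(rotations, translations):
    """rotations: list of 3x3 arrays; translations: list of 3-vectors.
    Returns (tip_offset_in_probe_frame, pivot_point_in_world)."""
    A = np.vstack([np.hstack([R, -np.eye(3)]) for R in rotations])
    b = -np.concatenate([np.asarray(t, dtype=float) for t in translations])
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x[:3], x[3:]
```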
In some embodiments, the calibration frame 700 is mounted on the patient's skin before surgery/biopsy, and will stay mounted during the entire procedure. Surgery/biopsy takes place through the center of the frame 700.
In some embodiments, when the region of the plate with the radio-opaque markers 730 is scanned intra-operatively or prior to surgery (for example, using a CT scanner), the CT scan contains both the medical images of the patient's bony anatomy, and spherical representations of the radio-opaque markers 730. In some embodiments, software is used to determine the locations of the centers of the markers 730 relative to the trajectories defined by the surgeon on the medical images. Because the pixel spacing of the CT scan can be conveyed within encoded headers in DICOM images, or can be otherwise available to the tracking software (for example, the robotic guidance software 3406), it can, in some embodiments, be possible to register the locations of the centers of the markers 730 in Cartesian coordinates (in millimeters, for example, or other length units). In some embodiments, it can be possible to register the Cartesian coordinates of the tip and tail of each trajectory in the same length units.
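Converting marker centers from voxel indices to millimeters using the DICOM header values can be sketched as follows, assuming PixelSpacing and the slice spacing are available (for example, via the pydicom package); the surrounding names are illustrative.

```python
# A minimal sketch of voxel-to-millimeter conversion using DICOM header
# values. Field semantics follow the DICOM standard; names are illustrative.
import numpy as np

def voxel_to_mm(ijk, pixel_spacing, slice_spacing_mm: float) -> np.ndarray:
    """ijk: (column, row, slice) indices of a marker center.
    pixel_spacing: (row_mm, column_mm) as stored in the DICOM header."""
    row_mm, col_mm = pixel_spacing
    col, row, slc = ijk
    return np.array([col * col_mm, row * row_mm, slc * slice_spacing_mm])
```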
In some embodiments, because the system knows the positions of the trajectories relative to the radio-opaque markers 730, the positions of the radio-opaque markers 730 relative to the active markers 720, and the positions of the active markers 720 on the calibration frame 700 relative to the active markers on the robot 15 (not shown), the system has all information necessary to position the robot's end-effectuator 30 relative to the defined trajectories.
In some other embodiments of the invention, the calibration frame 700 can comprise at least three radio-opaque markers 730 embedded in the periphery of the calibration frame 700. In some embodiments, the at least three radio-opaque markers 730 can be positioned asymmetrically about the periphery of the calibration frame 700 such that the software, as described herein, can sort the at least three radio-opaque markers 730 based only on the geometric coordinates of each marker 730. In some embodiments, the calibration frame 700 can comprise at least one bank of active markers 720. In some embodiments, each bank of the at least one bank can comprise at least three active markers 720. In some embodiments, the at least one bank of active markers 720 can comprise four banks of active markers 720. In yet another aspect, the calibration frame 700 can comprise a plurality of leveling posts 77 coupled to respective corner regions of the calibration frame 700. In some embodiments, the corner regions of the calibration frame 700 can include leveling posts 77 that can comprise radiolucent materials. In some embodiments, the plurality of leveling posts 77 can be configured to promote uniform, rigid contact between the calibration frame 700 and the skin of the patient 18. In some embodiments, a surgical-grade adhesive film, such as, for example and without limitation, Ioban™ from 3M™, can be used to temporarily adhere the calibration frame 700 to the skin of the patient 18. 3M™ and Ioban™ are registered trademarks of 3M Company. In some further embodiments, the calibration frame 700 can comprise a plurality of upright posts 75 that are angled away from the frame 700 (see
As shown in
In some embodiments, the radio-opaque markers 730 are placed in an asymmetrical configuration (notice how OP1 and OP2 are separated from each other by more distance than OP3 and OP4, and OP1 and OP4 are aligned with each other across the gap, while OP3 is positioned more toward the center than OP2). The reason for this arrangement is so that a computer algorithm can automatically sort the markers to determine which is which, given only the raw coordinates of the four markers and not their identities.
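One way such an algorithm can sort unlabeled markers is by matching inter-marker distance patterns, which the asymmetrical layout makes unique. The following is a minimal brute-force sketch, assuming a stored reference layout for OP1 through OP4; the names are illustrative.

```python
# A minimal sketch of sorting four unlabeled markers by matching
# inter-marker distance patterns against a stored reference layout.
import numpy as np
from itertools import permutations

def pairwise_dists(pts: np.ndarray) -> np.ndarray:
    """The six pairwise distances among four points, in a fixed order."""
    return np.array([np.linalg.norm(pts[i] - pts[j])
                     for i in range(4) for j in range(i + 1, 4)])

def sort_markers(raw: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Reorder the raw (4, 3) marker array to correspond to OP1..OP4."""
    ref_d = pairwise_dists(reference)
    best = min(permutations(range(4)),
               key=lambda p: np.sum((pairwise_dists(raw[list(p)]) - ref_d) ** 2))
    return raw[list(best)]
```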
In some embodiments, there are four banks of active markers 720 (three markers 720 per bank). Only one bank of three markers 720 is needed; the redundancy provides added accuracy and allows the system to continue working if the surgeon, tools, or robot block some of the markers.
In some embodiments, despite the horizontal orientation of the patient 18, the angulation of the upright posts can permit the active markers 720 to face toward the cameras (for example cameras 8200 shown in
In some applications, to establish the spatial relationship between the active 720 and radio-opaque markers 730, a conventional digitizing probe, such as a 6-marker probe, embedded with active markers 720 in a known relationship to the probe's tip (see for example
Further embodiments of the invention are shown in
Moreover, in some embodiments, the front markers 720 can have less chance of obscuring the rear markers 720. For example, posts 75 that are farthest away from the camera or farthest from a detection device of the tracking system 3417 can be taller and spaced farther laterally than the posts 75 closest to the camera.
In some further embodiments of the invention, the calibration frame 700 can comprise markers 730 that are both radio-opaque for detection by a medical imaging scanner, and visible to the cameras or otherwise detectable by the real-time tracking system 3417. In some embodiments, the relationship between the radio-opaque 730 and active markers (730, 720) does not need to be measured or established because they are one and the same. Therefore, in some embodiments, as soon as the position is determined from the CT scan (or other imaging scan), the spatial relationship between the robot 15 and the anatomy of the patient 18 can be defined.
In other embodiments, the targeting fixture 690 can comprise a flexible roll configuration. In some embodiments, the targeting fixture 690 can comprise three or more radio-opaque markers 730 that define a rigid outer frame and nine or more active markers 720 embedded in a flexible roll of material (for example, the flexible roll 705 in
Flock Of Birds® is a registered trademark of Ascension Technology Corporation.
Axiem is a trademark of Medtronic, Inc., and its affiliated companies.
Medtronic® is a registered trademark used for Surgical and Medical Apparatus, Appliances and Instruments.
In some embodiments of the invention, at least a portion of the flexible roll 705 can comprise self-adhering film, such as, for example and without limitation, 3M™ Ioban™ adhesive film (iodine-impregnated transparent surgical drape) similar to the routinely used operating room product model 6651 EZ (3M, St. Paul, MN). Ioban™ is a trademark of 3M Company.
In some embodiments, within the flexible roll 705, the radio-opaque and active markers (730, 720) can be rigidly coupled to each other, with each radio-opaque marker 730 coupled to three or more active markers 720. Alternatively, in some embodiments, the markers can simultaneously serve as radio-opaque and active markers (for example, an active marker 720 whose position can be detected from cameras or other sensors), and the position determined from the 3D medical image can substantially exactly correspond to the center of the marker 720. In some embodiments, as few as three such markers 720 could be embedded in the flexible roll 705 and still permit determination of the spatial relationship between the robot 15 and the anatomy of the patient 18. If the radio-opaque markers 730 and active markers 720 are not one and the same, in some embodiments at least three active markers 720 must be rigidly connected to each radio-opaque marker 730, because three separate non-collinear points are needed to unambiguously define the relative positions of points on a rigid body. That is, if only one or two active markers 720 are viewed, there is more than one possible calculated position where a rigidly coupled radio-opaque marker could be.
In some embodiments of the invention, other considerations can be used to permit the use of two active markers 720 per radio-opaque marker 730. For example, in some embodiments, if two active markers 720 and one radio-opaque marker 730 are intentionally positioned collinearly, with the radio-opaque marker 730 exactly at the midpoint between the two active markers 720, the location of the radio-opaque marker 730 can be determined as the mean location of the two active markers 720. Alternatively, in some embodiments, if the two active markers 720 and the radio-opaque marker 730 are intentionally positioned collinearly but with the radio-opaque marker 730 closer to one active marker 720 than the other (see for example
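The collinear arrangement described above reduces marker recovery to linear interpolation or extrapolation along the marker axis. The following is a minimal sketch, assuming the fractional position f of the radio-opaque marker along the segment between the two active markers is fixed at manufacture (f = 0.5 reproduces the midpoint case; other values give the off-center case); names are illustrative.

```python
# A minimal sketch of recovering a radio-opaque marker from two collinear
# active markers, given its known fractional position along the segment.
import numpy as np

def radio_opaque_from_pair(m1: np.ndarray, m2: np.ndarray, f: float) -> np.ndarray:
    """Linear interpolation (0 <= f <= 1) or extrapolation along the axis."""
    return m1 + f * (m2 - m1)
```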
In some embodiments, the flexible roll 705 can be positioned across the patient's back or other area, and adhered to the skin of the patient 18 as it is unrolled. In some embodiments, knowing the spatial relationship between each triad of active markers 720 and the rigidly coupled radio-opaque marker 730, it is possible to establish the relationship between the robot 15 (position established by its own active markers 720) and the anatomy (visualized together with radio-opaque markers 730 on MRI, CT, or other 3D scan). In some embodiments, the flexible roll 705 can be completely disposable. Alternatively, in some other embodiments, the flexible roll 705 can comprise reusable marker groups integrated with a disposable roll with medical grade adhesive on each side to adhere to the patient 18 and the marker groups 720, 730. In some further embodiments, the flexible roll 705 can comprise a drape incorporated into the flexible roll 705 for covering the patient 18, with the drape configured to fold outwardly from the roll 705.
In some embodiments, after the roll 705 has been unrolled, the roll 705 can have a desired stiffness such that the roll 705 does not substantially change its position relative to the bony anatomy of the patient 18. In some embodiments of the invention, a conventional radiolucent wire can be embedded in the perimeter of the frame 700. In some embodiments, a chain of plastic beads, such as the commercially available tripods shown in
In some embodiments of the invention, the targeting fixture 690 can be an adherable fixture, configured for temporary attachment to the skin of a patient 18. For example, in some embodiments, the targeting fixture 690 can be temporarily adhered to the patient 18 during imaging, removed, and then subsequently reattached during a follow-up medical procedure, such as a surgery. In some embodiments, the targeting fixture 690 can be applied to the skull of a patient 18 for use in placement of electrodes for deep brain stimulation. In some embodiments, this method can use a single fixture 690, or two related fixtures. In this instance, the two related fixtures can share the same surface shape. However, one fixture 690 can be temporarily attached at the time of medical image scanning, and can include radio-opaque markers 730 (but not active markers 720), and the second fixture 690 can be attached at the time of surgery, and can include active markers 720 (but not radio-opaque markers 730).
In some embodiments, the first fixture (for scanning) can comprise a frame 690 with three or more embedded radio-opaque markers 730, and two or more openings 740 for application of markings (the markings shown as 750 in
In some embodiments of the invention, the targeting fixture 690 can comprise a conventional clamping mechanism for securely attaching the targeting fixture 690 to the patient 18. For example, in some embodiments, the targeting fixture 690 can be configured to clamp to the spinous process 6301 of a patient 18 after the surgeon has surgically exposed the spinous process.
StealthStation® is a trademark of Medtronic, Inc., and its affiliated companies.
In some embodiments, during use of a targeting fixture 690 having a conventional clamping mechanism with image guidance, the relationship between the markers 720, 730 and the bony anatomy of the patient 18 can be established using a registration process wherein known landmarks are touched with a digitizing probe at the same time that the markers on the tracker are visible. In some embodiments of the invention, the probe itself can have a shaft protruding from a group of markers 720, 730, thereby permitting the tracking system 3417 to calculate the coordinates of the probe tip relative to the markers 720, 730.
In some embodiments, the clamping mechanism of the targeting fixture 690 can be configured for clamping to the spinous process 2310, or can be configured for anchoring to bone of the patient 18 such that the fixture 690 is substantially stationary and not easily moved. In some further embodiments, the targeting fixture 690 can comprise at least three active markers 720 and distinct radio-opaque markers 730 that are detected on the CT or other 3D image, preferably near the clamp (to be close to bone). In some alternative embodiments, the active markers 720 themselves must be configured to be visualized accurately on CT or other 3D image. In certain embodiments, the portion of the fixture 690 containing a radio-opaque marker 730 can be made to be detachable to enable removal from the fixture after the 3D image is obtained. In some further embodiments, a combination of radio-opaque 730 and active markers 720 can allow tracking with the robot 15 in the same way that is possible with the frame-type targeting fixtures 690 described above.
In some embodiments, one aspect of the software and/or firmware disclosed herein is a unique process for locating the center of the above-described markers 730 that takes advantage of the fact that a CT scan can comprise slices, typically spaced 1.5 mm or more apart in the z direction, and sampled with about 0.3 mm resolution in the x-axis and y-axis directions. In some embodiments, since the diameter of the radio-opaque markers 730 is several times larger than this slice spacing, different z slices of the sphere will appear as circles of different diameters on each successive x-y planar slice. In some embodiments, since the diameter of the sphere is defined beforehand, the necessary z position of the center of the sphere relative to the slices can be calculated to provide the given set of circles of various diameters. Stated similarly, in some embodiments, a z slice substantially exactly through the center of the sphere can yield a circle with a radius R that is substantially the same as that of the sphere. In some embodiments, a z slice through a point at the top or bottom of the sphere can yield a circle with a radius R approximating zero. In some other embodiments, a z slice through a z-axis coordinate Z1 between the center and top or bottom of the sphere can yield a circle with a radius R1=R cos(arcsin(Z1/R)).
In some embodiments of the invention, the observed radii of circles on z slices of known inter-slice spacing can be analyzed using the equation defined by R1=R cos(arcsin(Z1/R)). This provides a unique mathematical solution permitting the determination of the distance of each slice away from the center of the sphere. In cases in which a sphere has a diameter small enough that only a few slices through the sphere appear on a medical image, this process can provide a more precise determination of the center of the sphere.
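This slice-based fit can be sketched as a one-dimensional search. The sketch below assumes a known marker radius R and measured circle radii on consecutive slices of spacing dz, and uses the equivalence R1 = R cos(arcsin(Z1/R)) = sqrt(R² − Z1²); the grid-search resolution and names are illustrative.

```python
# A minimal sketch of locating a sphere center from circle radii observed
# on successive CT slices, by grid search over candidate center offsets.
import numpy as np

def sphere_center_z(circle_radii, dz: float, R: float) -> float:
    """Return the sphere-center z relative to the first slice."""
    r = np.asarray(circle_radii, dtype=float)
    z_slices = dz * np.arange(len(r))

    def expected(zc):
        # r = R*cos(arcsin(z/R)) = sqrt(R^2 - z^2) inside the sphere, else 0
        return np.sqrt(np.clip(R**2 - (z_slices - zc)**2, 0.0, None))

    candidates = np.linspace(z_slices[0] - R, z_slices[-1] + R, 2001)
    errors = [np.sum((expected(zc) - r) ** 2) for zc in candidates]
    return float(candidates[int(np.argmin(errors))])
```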
Some embodiments of the use of the calibration frame 700 are described to further clarify the methods of use. For example, some embodiments include the steps of a conventional closed screw or conventional needle (for example, a biopsy needle 8110) insertion procedure utilizing a calibration frame 700 as follows. In some embodiments, a calibration frame 700 is attached to the patient's 18 skin, substantially within the region at which surgery/biopsy is to take place. In some embodiments, the patient 18 receives a CT scan either supine or prone, whichever positioning orients the calibration frame 700 upward. In some embodiments, the surgeon subsequently manipulates three planar views of the patient's 18 CT images with rotations and translations. In some embodiments, the surgeon then draws trajectories on the images that define the desired position and strike angle of the end-effectuator 30. In some embodiments, automatic calibration can be performed in order to obtain the centers of the radio-opaque markers 730 of the calibration frame 700, and to utilize the stored relationship between the active markers 720 and radio-opaque markers 730. This procedure permits the robot 15 to move in the coordinate system of the anatomy and/or drawn trajectories.
In some embodiments, the robot 15 then will move to the desired position. In some embodiments, if forceful resistance exceeding a pre-set tolerance is encountered, the robot 15 will halt. In some further embodiments, the robot 15 can hold the guide tube 50 at the desired position and strike angle to allow the surgeon to insert a conventional screw or needle (for example, needle 7405, 7410 or biopsy needle 8110). In some embodiments, if tissues move in response to applied force or due to breathing, the movement will be tracked by optical markers 720, and the robot's position will automatically be adjusted.
As a further illustration of a procedure using an alternate guidance system, in some embodiments, the steps of an open screw insertion procedure utilizing an optical guidance system are described. In some embodiments, after surgical exposure, a targeting fixture 690 comprising a small tree of optical markers, for example, can be attached to a bony prominence in the area of interest. In some embodiments, conventional calibration procedures for image guidance can be utilized to establish the anatomy relative to the optical tracking system 3417 and the medical images. As another example, the targeting fixture 690 can contain rigidly mounted, substantially permanent or detachable radio-opaque markers 730 that can be imaged with a CT scan. In some embodiments, calibration procedures consistent with those stated for the calibration frame 700 can be utilized to establish the anatomy relative to the robot 15 and the medical image.
In some embodiments, the surgeon manipulates three planar views of the patient's CT images with rotations and translations. In some embodiments, the surgeon then draws trajectories on the images that define the desired position and strike angle of the end-effectuator 30. In some embodiments, the robot 15 moves to the desired position. In some embodiments, if forceful resistance exceeding a pre-set tolerance is encountered, the robot 15 will halt. In some embodiments, the robot 15 holds the guide tube 50 at the desired position and strike angle to allow the surgeon to insert a conventional screw. In some embodiments, if tissues move in response to applied force or due to breathing, the movement will be tracked by optical markers 720, and the robot's position will automatically be adjusted.
In one embodiment, in response to placement of the surveillance marker 710, execution of a control software application (e.g., robotic guidance software 3406) can permit an agent (e.g., a surgeon, a nurse, a diagnostician) to select “set surveillance marker”. At this time, the vector (3D) distances between the surveillance marker 710, and each of the markers 3611, 3612, 3613, and 3614 on the primary tracker array 3610 can be acquired and retained in computer 100 memory (such as a memory of a computing device 3401 executing the control software application). In an embodiment in which a 4-marker tracker array 3610 is utilized (
In some embodiments, as illustrated in
It should be appreciated that other techniques (for example, methods, systems, and combinations thereof, or the like) can be implemented in order to respond to operational issues that may prevent tracking of the movement of a robot 15 in the surgical robot system 1. In one embodiment, marker reconstruction can be implemented for steadier tracking. In some embodiments, marker reconstruction can maintain the robot end-effectuator 30 steady even if an agent partially blocks markers during operation of the disclosed surgical robot system 1.
As described herein, in some embodiments, at least some features of tracking movement of the robot's end-effectuator 30 can comprise tracking a virtual point on a rigid body utilizing an array of one or more markers 720, such tracking comprising one or more sequences of translations and rotations. As an illustration, an example methodology for tracking a virtual point on a rigid body using an array of three attached markers is described in greater detail herein; such methodology can be utilized to implement the marker reconstruction technique in accordance with one or more aspects of the invention.
In some embodiments, to rotate the rigid body about the y-axis so that 4002 is in the x-y plane (z=0):
In some embodiments, to rotate the rigid body about the z-axis so that 4002 is at y=0,
In some embodiments, to rotate the rigid body about the x-axis so that 4003 is at z=0:
As described herein, the example method to transform markers 4001, 4002, 4003 as close as possible to the reference frame can comprise: (1) translate the rigid body so that 4001 is at the origin (0,0,0); (2) rotate about the y-axis so that 4002 is in the x-y plane (i.e., z=0); (3) rotate about the z-axis so that 4002 is at y=0 with its x coordinate positive; and (4) rotate about the x-axis so that 4003 is at z=0 with its y coordinate positive. In other embodiments, a method to reach the same reference can comprise: (1) translate the rigid body so that 4001 is at the origin (0,0,0); (2) rotate about the x-axis so that 4002 is in the x-y plane (i.e., z=0); (3) rotate about the z-axis so that 4002 is at y=0 with its x coordinate positive; and (4) rotate about the x-axis so that 4003 is at z=0 with its y coordinate positive. It should be appreciated that there are other possible methods and related actions, both in the reference frame chosen and in how the rigid body is manipulated to get it there. The described method is simple, but does not treat the markers equally. The reference frame restricts 4001 the most (forced to a point), 4002 less (forced to a line), and 4003 the least (forced to a plane). As a result, errors from noise in the markers are manifested asymmetrically. For example, consider a case where, in a certain frame of data, noise causes each of the three markers to appear farther outward than they actually are or were (represented by 4001a, 4002a, and 4003a) when the reference frame was stored (as depicted in
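The four-step sequence can be sketched directly, with the rotation angles obtained from atan2 of the relevant coordinates. The following is one workable realization, assuming m1, m2, m3 are 3-vectors for markers 4001, 4002, and 4003; the helper names are illustrative.

```python
# A minimal sketch of the four-step alignment described above.
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def to_reference(m1, m2, m3):
    """Step 1: put 4001 at the origin; step 2: rotate about y so 4002 has
    z=0; step 3: rotate about z so 4002 has y=0, x>0; step 4: rotate about
    x so 4003 has z=0, y>0."""
    pts = np.array([m1, m2, m3], dtype=float)
    pts -= pts[0]                                            # step 1
    pts = pts @ rot_y(np.arctan2(pts[1, 2], pts[1, 0])).T    # step 2
    pts = pts @ rot_z(-np.arctan2(pts[1, 1], pts[1, 0])).T   # step 3
    pts = pts @ rot_x(-np.arctan2(pts[2, 2], pts[2, 1])).T   # step 4
    return pts
```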
In some embodiments, when the transformations are done to align the apparent markers "as close as possible" to their stored reference position, they will be offset. For example, when the stored point of interest is added, it will be misplaced in a direction that depends on which marker was chosen as 4001 in the algorithm (see 4003, 4003a and 4002, 4002a for example in
Some embodiments provide additional or alternative methods for tracking points of interest that can involve more symmetrical ways of overlaying the actual marker positions with the stored reference positions. For example, in some embodiments, for three markers 4001, 4002, 4003, a two-dimensional fitting method typically utilized in zoology can be implemented. (See, e.g., Sneath P. H. A., “Trend-surface analysis of transformation grids,” J. Zoology 151, 65-122 (1967)). The method can include a least squares fitting algorithm for establishing a reference frame and transforming markers to lie as close as possible to the reference. In this case, the reference frame is the same as described earlier except that the common mean point (hereinafter referred to as “CMP”) is at the origin instead of marker 4001. In some embodiments, the CMP after forcing the markers into the x-y plane is defined in the following equation (and can be represented in
In some embodiments, for the markers to be centered around CMP, the markers can be translated by subtracting the CMP from 4001, 4002, and 4003. It should be noted that the point of interest being tracked is not included in determining CMPref.
In some embodiments, the method to transform markers as close as possible to this reference frame can comprise: (1) translating the rigid body so that 4001 is at the origin (0,0,0); (2) rotating about the y-axis so that 4002 is in the x-y plane (i.e., z=0); (3) rotating about the z-axis so that 4002 is at y=0 with its x coordinate positive; (4) rotating about the x-axis so that 4003 is at z=0 with its y coordinate positive; and finally (5) calculating the CMP for the markers 4001, 4002, 4003 and translating the rigid body so that the CMP is at the origin (i.e., subtracting the CMP from each transformed point). In some embodiments, steps 1-5 are done both for the original set of markers, for which the position of the point of interest was known, and for the new set, to which the point of interest is being added. A further step can be included for the new set, for example: (6) rotate about the z-axis to best overlay the stored reference markers. In some embodiments, the rotation angle θ is found using the formula from Sneath:
In some embodiments, if M1, M2, M3 denote the stored reference markers and M′1, M′2, M′3 denote the positions being tracked in this data-frame, the equation can be written:
θ = arctan[ Σ(x′i yi − y′i xi) / Σ(xi x′i + yi y′i) ]
where (xi, yi) and (x′i, y′i) are the planar coordinates of Mi and M′i, respectively.
It should be noted that this rotation angle is typically small (e.g., smaller than about 1°). In some embodiments, after the markers 4001, 4002, 4003 are overlaid, the point of interest can be added and then transformed back to its true present location in the current frame of data. In some embodiments, to transform back, negative values saved from the forward transformation steps 1-6 as discussed above can be utilized; that is, for instance, go from step 6 to step 5 by rotating by negative θ, go from step 5 to step 4 by adding the CMP, and so on.
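The CMP centering and least-squares overlay rotation can be sketched together. The sketch below assumes both triads have already been brought into the x-y plane by the steps above; the angle formula is the standard two-dimensional least-squares result attributed to Sneath (1967), the exact sign convention in the original may differ, and names are illustrative.

```python
# A minimal sketch of CMP centering plus the least-squares overlay
# rotation for two planar marker triads.
import numpy as np

def overlay_rotation(ref_xy: np.ndarray, cur_xy: np.ndarray) -> float:
    """Angle (radians) that rotates the current triad onto the stored
    reference about z, after centering each triad on its CMP."""
    ref = ref_xy - ref_xy.mean(axis=0)   # subtract the reference CMP
    cur = cur_xy - cur_xy.mean(axis=0)   # subtract the current CMP
    num = np.sum(cur[:, 0] * ref[:, 1] - cur[:, 1] * ref[:, 0])
    den = np.sum(cur[:, 0] * ref[:, 0] + cur[:, 1] * ref[:, 1])
    return np.arctan2(num, den)
```

Rotating the current, CMP-centered triad by the returned angle overlays it on the reference; applying the inverse transformations (negative angle, then adding back the CMP, and so on) returns a reconstructed point of interest to the current frame.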
In some embodiments, using this least-squares algorithm, noise is manifested more symmetrically and the point of interest will probably be calculated to be closer to its actual location. This can be illustrated in
In some embodiments, when tracking 3D movement of a rigid body (for example, a robot 15 end-effectuator 30 or a targeted bone) using an array of 3 tracking markers 4001, 4002, 4003 that are rigidly attached to the rigid body, one example method for quantifying motion can include determining the transformations (translation and rotations) for the movement from a first (neutral) position (defined here as "A") to a second (current frame) position (herein referred to as "B"). In some embodiments, it may be convenient to describe the rotations as a three by three orientation matrix (direction cosines) of the rigid body in position B, and to treat the three translation values as a 3×1 vector containing the x, y, z coordinates of the origin of the position A coordinate system transformed to position B. In some embodiments, the direction cosine matrix is a 3×3 matrix, the columns of which contain unit vectors that originally were aligned with the x, y, and z axes, respectively, of the neutral coordinate system. In some embodiments, to build a direction cosine matrix, a 3×3 matrix, A, can be defined in a manner that its columns are unit vectors, i, j, and k, aligned with the x, y, and z axes, respectively:
A = [i j k] =
[1 0 0]
[0 1 0]
[0 0 1]
Upon or after rotations of the coordinate system occur, in some embodiments, the new matrix (which is the direction cosine matrix, A′) is as follows, where the unit vectors i′, j′, and k′ represent the new orientations of the unit vectors that were initially aligned with the coordinate axes:
A′ = [i′ j′ k′]
In some embodiments, to determine the direction cosines and translation vector, the origin and unit vectors can be treated as aligned with the coordinate axes as four tracked points of interest in the manner described herein. For example, if the origin (o) and three unit vectors (i, j, k) are aligned with the coordinate axes, they are treated as virtual tracked points of interest with coordinates of:
o = (0, 0, 0), i = (1, 0, 0), j = (0, 1, 0), k = (0, 0, 1)
In some embodiments, these points of interest can provide the direction cosines and translation for the movement when moved along with the three markers from position A to position B. In some embodiments, it may be convenient, when implementing (for example, executing) the method for moving the virtual points, to place these four points into a 3×4 matrix, P.
In some embodiments, the matrix is as follows in position A:
P(A) = [o i j k] =
[0 1 0 0]
[0 0 1 0]
[0 0 0 1]
In some embodiments, the matrix is as follows in position B:
P(B) = [o′ i′ j′ k′]
In some embodiments, after movement, the direction cosine matrix is:
A′ = [i′ − o′ j′ − o′ k′ − o′]
In some embodiments, the vector o′ represents the new position of the origin. In some embodiments, after moving the three markers 4001, 4002, 4003 from position A to position B, and bringing the four points (as a 3×4 matrix) along with the three markers 4001, 4002, 4003, the translation of the origin is described by the first column. Further, in some embodiments, the new angular orientation of the axes can be obtained by subtracting the origin from the 2nd, 3rd, and 4th columns. These methods should be readily apparent from the following graphic representation in
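The virtual-point bookkeeping can be sketched as follows: the origin and unit vectors are carried as the columns of a 3×4 matrix P, moved along with the markers, and then read back as a translation and a direction cosine matrix. The transformation applied below is a stand-in for whatever marker-driven motion maps position A to position B; all names are illustrative.

```python
# A minimal sketch of carrying virtual points (origin + unit vectors)
# through a rigid-body motion and reading back translation and rotation.
import numpy as np

# Columns of P are the virtual points o, i, j, k in position A.
P_A = np.array([[0.0, 1.0, 0.0, 0.0],
                [0.0, 0.0, 1.0, 0.0],
                [0.0, 0.0, 0.0, 1.0]])

def decompose(P_B: np.ndarray):
    """Read the moved virtual points back out as translation + direction cosines."""
    o_prime = P_B[:, 0]                      # first column: new origin
    A_prime = P_B[:, 1:] - o_prime[:, None]  # subtract origin from columns 2-4
    return o_prime, A_prime

# Example: apply a rotation about z and a translation to the virtual points.
a = np.radians(30.0)
R = np.array([[np.cos(a), -np.sin(a), 0.0],
              [np.sin(a),  np.cos(a), 0.0],
              [0.0,        0.0,       1.0]])
t = np.array([1.0, 2.0, 3.0])
o_prime, A_prime = decompose(R @ P_A + t[:, None])  # o_prime == t, A_prime == R
```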
In some embodiments, if more than three markers 4001, 4002, 4003 are utilized for tracking the movement of a rigid body, the same method can be implemented repeatedly for as many triads of markers as are present. For example, in a scenario in which four markers, M1, M2, M3, and M4, are attached to the rigid body, there can be four triads: those formed by {M1, M2, M3}, {M1, M2, M4}, {M1, M3, M4}, and {M2, M3, M4}. In some embodiments, each of these triads can be used independently in the method described hereinbefore in order to calculate the rigid body motion. In some embodiments, the final values of the translations and rotations can then be the average of the values determined using the four triads. In some embodiments, in the alternative or in addition, other methods for achieving a best fit when using more than 3 markers may be used.
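Averaging over all triads can be sketched with a generic per-triad solver. The sketch below assumes a solve_triad callable (such as the virtual-point method above) that returns a translation vector and small rotation parameters for one triad; naive averaging of rotation parameters is only reasonable when the triads agree closely, and all names are illustrative.

```python
# A minimal sketch of triad averaging over all marker combinations.
import numpy as np
from itertools import combinations

def average_motion(markers_A: np.ndarray, markers_B: np.ndarray, solve_triad):
    """markers_A/B: (n, 3) arrays of corresponding marker positions.
    solve_triad(pA, pB) -> (translation, rotation_parameters)."""
    results = [solve_triad(markers_A[list(idx)], markers_B[list(idx)])
               for idx in combinations(range(len(markers_A)), 3)]
    translations, rotations = zip(*results)
    return np.mean(translations, axis=0), np.mean(rotations, axis=0)
```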
In some embodiments, when tracking with four markers, in a scenario in which one of the four markers becomes obscured, it can be desirable to switch to tracking the rigid body with the remaining three markers instead of four. However, this change in tracking modality can cause a sudden variation in the results of one or more calculations utilized for tracking. In some embodiments, the variation can occur because the solution from the one remaining triad may be substantially different from the average of four triads. In some embodiments, if the tracked position of the rigid body is used in a feedback loop to control the position of a robot 15 end-effectuator, the sudden variation in the results of the calculation can be manifested as a sudden physical shift in the position of the robot 15 end-effectuator 30. In some embodiments, this behavior is undesirable because the robot 15 is intended to hold a guide tube 50 steady with very high accuracy.
Some embodiments include an example method for addressing the sudden variation that occurs when one of the four markers M1, M2, M3, M4 is blocked, thereby causing the position to be calculated from a single triad instead of the average of four triads: the blocked marker can be reconstructed as a virtual marker. In some embodiments, to implement such a reconstruction step with high accuracy, the most recent frame of data in which all four markers M1, M2, M3, M4 are visible can be retained substantially continuously or nearly continuously (for example, in a memory of a computing device implementing the subject example method). In some embodiments, if all four markers M1, M2, M3, M4 are in view, the x-axis, y-axis, and z-axis coordinates of each of the four markers M1, M2, M3, and M4 are stored in computer 100 memory. It should be appreciated that in some embodiments, it may be unnecessary to log all or substantially all frames; it is sufficient to overwrite the same memory block with the most recent marker coordinates from a fully visible frame. Then, in some embodiments, at a frame of data in which one of the four markers M1, M2, M3, and M4 is lost, the lost marker's position can be calculated based on the remaining triad, using the example method described herein for the remaining three markers. That is, the triad (the three visible markers) is transformed to a reference. The stored set of markers is then transformed to the same reference using the corresponding triad, with the fourth marker now acting as a virtual landmark. The recovered position of the lost fourth marker can then be transformed back to the current position in space using the inverse of the transformations that took it to the reference position. In some embodiments, after the lost marker's position is reconstructed, calculation of the rigid body movement can be performed as before, based on the average of the four triads, or on another best-fit method for transforming the rigid body from position A to position B.
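The reconstruction step can be sketched by fitting a rigid transform from the stored, fully visible frame onto the currently visible triad and carrying the stored fourth marker through that transform. The sketch below reuses the register_rigid fit sketched earlier; the data layout and names are illustrative.

```python
# A minimal sketch of reconstructing a blocked marker as a virtual
# landmark, reusing the register_rigid SVD fit sketched earlier.
import numpy as np

def reconstruct_lost_marker(stored_all4: np.ndarray,
                            current_triad: np.ndarray,
                            lost_index: int) -> np.ndarray:
    """stored_all4: (4, 3) marker positions from the last frame with all
    four markers visible; current_triad: (3, 3) positions of the three
    still-visible markers, in matching order."""
    visible = [i for i in range(4) if i != lost_index]
    R, t = register_rigid(stored_all4[visible], current_triad)
    # The lost marker rides along rigidly with the visible triad.
    return R @ stored_all4[lost_index] + t
```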
In some embodiments, an extension to the methods for reconstructing markers 720 is to use multiple ambiguous synchronized lines of sight via multiple cameras 8200 tracking the same markers 720. For example, two or more cameras 8200 (such as Optotrak® or Polaris®) could be set up from different perspectives focused on the tracking markers 720 on the targeting fixture 690 or robot 15. In some embodiments, one camera unit could be placed at the foot of a patient's bed, and another could be attached to the robot 15. In some embodiments, another camera unit could be mounted to the ceiling. In some embodiments, when all cameras 8200 substantially simultaneously view the markers 720, coordinates could be transformed to a common coordinate system, and the position of any of the markers 720 would be considered to be the average (mean) of that marker's three dimensional position from all cameras used. In some embodiments, even with extremely accurate cameras, an average is needed because with system noise, the coordinates as perceived from different cameras would not be exactly equal. However, when one line of sight is obscured, the lines of sight from other cameras 8200 (where markers 720 can still be viewed) could be used to track the robot 15 and targeting fixture 690. In some embodiments, to mitigate twitching movements of the robot 15 when one line of sight is lost, it is possible that the marker 720 positions from the obscured line of sight could be reconstructed using methods as previously described based on an assumed fixed relationship between the last stored positions of the markers 720 relative to the unobstructed lines of sight. Further, in some embodiments, at every frame, the position of a marker 720 from camera 1 relative to its position from camera 2 would be stored; then if camera 1 is obstructed, and until the line of sight is restored, this relative position is recalled from computer memory (for example in memory of a computer platform 3400) and a reconstruction of the marker 720 from camera 1 would be inserted based on the recorded position of the marker from camera 2. In some embodiments, the method could compensate for temporary obstructions of line of sight such as a person standing or walking in front of one camera unit.
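The per-marker fusion across cameras can be sketched simply, assuming each camera's measurements have already been transformed into the common coordinate system and that blocked markers are reported as None; names are illustrative.

```python
# A minimal sketch of fusing synchronized lines of sight on one marker.
import numpy as np

def fuse_marker(views):
    """views: per-camera 3-vectors (or None when that line of sight is
    obscured). Returns the mean of available observations, else None."""
    seen = [np.asarray(v, dtype=float) for v in views if v is not None]
    return np.mean(seen, axis=0) if seen else None
```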
In certain embodiments, when a marker M1, M2, M3, M4 is lost but is successfully reconstructed in accordance with one or more aspects described herein, the marker that has been reconstructed can be rendered in a display device 3411. In one example implementation, circles representing each marker can be rendered graphically, coloring the circles for markers M1, M2, M3, M4 that are successfully tracked in green, markers M1, M2, M3, M4 that are successfully reconstructed in blue, and markers M1, M2, M3, M4 that cannot be tracked or reconstructed in red. It should be appreciated that such a warning can serve to indicate to the agent that conditions are not optimal for tracking, and that it is prudent to make all four tracking markers fully visible, for example, by repositioning the cameras or by standing in a different position so that the markers are not blocked. Other formats and/or indicia can be utilized to render a virtual marker and/or distinguish such a marker from successfully tracked markers. In some embodiments, it is possible to extend the method described herein to situations relying on more than four markers. For example, in embodiments in which five markers are utilized on one rigid body, and one of the five markers is blocked, it is possible to reconstruct the blocked marker from the average of the four remaining triads, or from another method for best fit of the 4 remaining markers on the stored last visible position of all 5 markers. In some embodiments, once reconstructed, the average position of the rigid body is calculated from the average of the 10 possible triads, {M1,M2,M3}, {M1,M2,M4}, {M1,M2,M5}, {M1,M3,M4}, {M1,M3,M5}, {M1,M4,M5}, {M2,M3,M4}, {M2,M3,M5}, {M2,M4,M5}, and {M3,M4,M5}, or from another method for best fit of 5 markers from position A to position B.
As discussed above, in some embodiments, the end-effectuator 30 can be operatively coupled to the surgical instrument 35. This operative coupling can be accomplished in a wide variety of manners using a wide variety of structures. In some embodiments, a bayonet mount 5000 is used to removably couple the surgical instrument 35 to the end-effectuator 30 as shown in
In some embodiments, the bayonet mount 5000 can include ramps 5010, which allow identification of the surgical instrument 35 and also ensure compatible connections. In some embodiments, the ramps 5010 can be sized consistently or differently around a circumference of the bayonet mount 5000 coupled to or integral with the surgical instrument 35. In some embodiments, the differently sized ramps 5010 can engage complementary slots 5020 coupled to or integral with the end-effectuator 30 as shown in
In some embodiments, different surgical instruments 35 can include different ramps 5010 and complementary slots 5020 to uniquely identify the particular surgical instrument 35 being installed. Additionally, in some embodiments, the different ramp 5010 and slot 5020 configurations can help ensure that only the correct surgical instruments 35 are installed for a particular procedure.
In some embodiments, conventional axial projections (such as those shown in U.S. Pat. No. 6,949,189 which is incorporated herein as needed to show details of the interface) can be mounted to or adjacent the ramps 5010 in order to provide automatic identification of the surgical instruments 35. In some embodiments, other additional structures can be mounted to or adjacent the ramps 5010 in order to provide automatic identification of the surgical instruments 35. In some embodiments, the axial projections can contact microswitches or a wide variety of other conventional proximity sensors in order to communicate the identity of the particular surgical instrument 35 to the computing device 3401 or other desired user interface. Alternatively, in some other embodiments of the invention, the identity of the particular surgical instrument 35 can be entered manually into the computing device 3401 or other desired user interface.
In some embodiments, instead of a targeting fixture 690 consisting of a combination of radio-opaque markers 730 and active markers 720, it is possible to register the targeting fixture 690 through an intermediate calibration. For example, in some embodiments, such a calibration method could include attaching a temporary rigid plate 780 that contains radio-opaque markers 730 and open mounts 785 (such as snaps, magnets, Velcro, or other features) to which active markers 720 can later be attached in a known position. For example, see
In some alternative embodiments, variants of the order of the above described steps may also be used. For instance, the active markers 720 could already be attached at the time of the scan. This method has the advantage that the radio-opaque markers 730 can be positioned close to the anatomy of interest without concern about how they are attached to the tracker 795 with active markers 720. However, it has the disadvantage that an extra step is required in the registration process. In some embodiments, a variant of this method can also be used for improved accuracy, in which two trackers of active markers 720 are attached above and below the region of interest. For example, a tracker rostral to the region of interest (shown as 795) could be a spinous process 2310 clamp in the upper lumbar spine, and a tracker caudal to the region of interest (shown as 800) could be a rigid array of active markers 720 screwed into the sacrum (see for example
Some embodiments can include methods for transferring registration. For example, a registration performed to establish the transformations needed to transpose from a medical image coordinate system (such as the CT-scanned spine) to the coordinate system of the cameras can later be transferred to a different reference. In the example described above related to
In some embodiments, after registration is transferred to both trackers 796, 797, the robot end-effectuator 30 may be perceived by both trackers 796, 797 to be positioned as shown. In some embodiments, it is possible that one of the bones to which a tracker is mounted moves relative to the other, as shown in exaggerated fashion in
In some embodiments, segmentation could involve identifying bordering walls on the 3D image volume or bordering curves on 2D slices comprising the medical image. In some embodiments, by segmenting simple six-sided volumes, enough separation of critical elements could be visualized for the task. In some embodiments, bones on the slice from a CT scan depicted in
An example of transferal of registration to multiple trackers includes conventional pedicle screw placement followed by compression or distraction of the vertebrae. For example, if pedicle screws are being placed at lumbar vertebrae L4 and L5, a tracker could be placed on L3 and registered. In some embodiments, conventional pedicle screws could then be placed at L4 and L5, with extensions coming off of each screw head remaining after placement. In some embodiments, two new trackers (for example, trackers substantially similar to 796, 797) could then be attached to the extensions on the screw heads, one at L4 and one at L5. Then, the registration could be transferred to both of these new trackers and a tracker at L3 could be removed or thereafter ignored. In some embodiments, if the medical image is segmented so that L4 and rostral anatomy is shown relative to the tracker on L4 (while L5 and caudal anatomy is shown relative to the tracker on L5), then it can be possible to see how the L4 and L5 vertebrae move relative to one another, as compressive or distractive forces are applied across that joint. In some embodiments, such compression or distraction might be applied by the surgeon when preparing the disc space for an inter-body spacer, or inserting the spacer, or when compressing the vertebrae together using a surgical tool after the inter-body spacer is in place, and before locking the pedicle screw interconnecting rod.
In some embodiments, if there is snaking of the spine, for example, when conventional screws are driven in place or the surgeon applies a focal force on one portion of the spine, the two marker trees will move (illustrated as 795a for tracker 795 and 800a for tracker 800) by different amounts and to different orientations (illustrated in
In some embodiments, it is possible to use the same surgical robot 15 already described for navigation with 3D imaging in a different setting where only 2 fluoroscopic views are obtained. In this instance, the surgical robot 15 will be able to accurately move to a desired position that is pre-planned on these two fluoroscopic views. Since the two fluoroscopic views can represent views to which the surgeon or radiologist is already accustomed, planning trajectories on these views should be straightforward. In obtaining the fluoroscopic views, a method is needed to establish the position of the coordinate system of the anatomy relative to the robot's 15 coordinate system. In some embodiments, a way to fulfill this registration is to obtain the fluoroscopic views while a targeting fixture 690 that includes features that are identifiable on the fluoroscopic images is attached to the patient 18. For example,
In some embodiments, after adjusting the position of the patient 18 and fluoroscopy unit, an overlay with good certainty may be obtained for images with radio-opaque markers 730.
In some embodiments, after obtaining two images, the two images can be used to construct a 3D Cartesian coordinate system because they represent images of the same thing (the fixture) from two orthogonal views. For example, the A-P image could be used to represent the X-Z plane, and the lateral image could be used to represent the Y-Z plane. Radio-opaque markers 730 on the A-P image have known x-axis and z-axis coordinates (as recorded from the manufacturing process or by calibration using a digitizing probe or other means), and the same radio-opaque markers 730 have known y-axis and z-axis coordinates on the lateral image. Therefore, in some embodiments, the x-axis, y-axis, and z-axis coordinates of the markers 730 can be found on the two images, and the positions of the anatomy and planned trajectories can be related to these reference points. In some embodiments, the mapping of a point from 3D space to the 2D images, and vice versa, can be performed by knowing the constant millimeters per pixel, C, on the coronal or sagittal images, and multiplying or dividing point coordinates by this constant, provided the center of the image and the center of the coordinate system have been shifted to overlap.
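As a simple illustration of this mapping (ignoring parallax for the moment, and using u and v as assumed pixel coordinates measured from the image center, symbols which are introduced here only for illustration), a point's 3D coordinates follow directly from the two views:

x = C·uAP, z = C·vAP (from the A-P image)

y = C·uLat, z = C·vLat (from the lateral image)

and, conversely, a known (x, y, z) position maps back onto each image by dividing by C.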
In some embodiments, assuming the A-P x-ray represents the X-Z plane and the lateral x-ray represents the Y-Z plane, the algorithm for planning a trajectory and relating this planned trajectory to the robot 15 coordinate system can include the following steps: (1) draw a line on the A-P and lateral x-ray views representing where the desired trajectory should be positioned (see for example
In some embodiments, while the robot 15 moves to position itself in the desired orientation and position, it is possible to overlay a graphical representation of the current location of the robot 15 on the fluoroscopic images by a method that can include: (1) retrieving the current location of the robot 15 guide tube 50 in the coordinate system of the cameras 8200 based on active markers 720 attached to the robot; (2) transforming the guide tip and tail to the coordinate system of the medical images based on the locations of active markers on the targeting fixture 690; and (3) representing the current positions of the tip/tail of the guide tube 50 on the A-P image by a line segment (or other suitable graphical representation) connecting the X,Z coordinates of the tip to the X,Z coordinates of the tail (see for example
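A minimal sketch of steps (2) and (3) above (Python; the homogeneous transform T_cam_to_img and all other names are assumptions for illustration, with the registration that produces the transform omitted):

```python
# Illustrative sketch: transform the guide tube tip/tail from camera
# coordinates into the medical-image coordinate system, then express each
# as a 2D segment on the A-P (X-Z) and lateral (Y-Z) views.
import numpy as np

def overlay_segments(tip_cam, tail_cam, T_cam_to_img):
    """Return (A-P segment, lateral segment) as pairs of 2D points.

    tip_cam, tail_cam: (3,) points in camera coordinates.
    T_cam_to_img: 4x4 rigid transform from camera to image coordinates,
    assumed to come from the targeting-fixture registration.
    """
    def to_img(p):
        return (T_cam_to_img[:3, :3] @ p) + T_cam_to_img[:3, 3]
    tip, tail = to_img(tip_cam), to_img(tail_cam)
    ap_segment = ((tip[0], tip[2]), (tail[0], tail[2]))    # X-Z plane
    lat_segment = ((tip[1], tip[2]), (tail[1], tail[2]))   # Y-Z plane
    return ap_segment, lat_segment
```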
In some embodiments, in constructing the Cartesian coordinate system based on the two images, it is important to consider directionality. That is, in some embodiments, an x-ray image of the X-Z plane could show positive X to the right and negative X to the left, or vice versa. In some embodiments, it could show positive Z upward and negative Z downward, or vice versa. In some embodiments, an x-ray image of the Y-Z plane could show positive Y to the right and negative Y to the left, or vice versa. In some embodiments, it could show positive Z upward and negative Z downward, or vice versa. In some embodiments, if an incorrect assumption is made about the directionality of one of the axes, the constructed 3D coordinate system would have one or more of its axes pointing in the wrong direction. In some embodiments, this may send the robot 15 to an incorrect position. In some embodiments, one way of ensuring the correct directionality is to query the user to verify directionality on the images and/or to allow the user to flip (mirror) the images on the display 29, 150, 3401. In some embodiments, another way of ensuring the correct directionality is to design the targeting fixture 690 so that the radio-opaque markers 730 are spaced asymmetrically. In some other embodiments, another way of ensuring the correct directionality is to design the targeting fixture 690 with additional radio-opaque features that unambiguously identify top, bottom, left, right, front and rear on images. For example,
In some embodiments, the algorithm described here provides the user with two perpendicular x-ray views from which to plan a trajectory, and provides visual feedback of the current location of a probe. Typically, these two views might be lateral and anteroposterior (A-P) views. In some embodiments, it might also be desirable for the user to see a third plane (for example, an axial plane). Based on knowledge of the anatomy and landmarks visible on the x-rays, in some embodiments, it is possible to create a rough “cartoon” showing an axial view. In some embodiments, the cartoon may help the user understand the approximate current location of the robot 15 or probe.
In some embodiments, in order to achieve well-aligned x-rays like those shown in
In some embodiments, this method for aligning the radio-opaque markers 730 has an advantage over trial-and-error methods, which are affected by parallax and which, as described below, can confound the ability to align the markers as needed. For example, with parallax, it may not be clear to the user when good alignment of the markers 730 is achieved, depending on how symmetrically the markers 730 are spaced about the center of the image.
With parallax error, the x-rays do not pass through the subject along parallel lines; instead, they travel from emitter to receiver in a diverging, conical pattern. This conical path can produce an image where the details of anatomy on the 2D x-ray that are closer to the emitter of the x-rays will appear farther apart laterally than details of the anatomy that are closer to the receiver plate. In the case of x-ray images in
Further, in the description, two terms used are “near plane” and “far plane”; these terms refer to markers in the 2D views that appear farther apart or closer together because of parallax. The reason markers appear farther apart or closer together is their proximity to the emitter or collector of the x-ray machine, with markers nearer the emitter appearing farther apart and markers nearer the collector appearing closer together. However, rather than referencing distance from the emitter and collector, “near plane” refers to markers that appear magnified (nearer to the eye) and “far plane” refers to markers that appear more distant.
Parallax will affect the image symmetrically about the center of the image. For example, in some embodiments, two markers 730 (one in near plane and one in far plane) that are in the same projected position, and are at the center of the image, may appear to be exactly on top of each other, whereas markers 730 in the near plane and far plane that are in the same projected position, but are close to the edge of the image may appear separated by a substantial distance.
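A brief worked example may make the magnitude of this effect concrete (the ratio value here is an assumed illustration, not a measured one). If the far-plane to near-plane spacing ratio is r = 0.9, then a near-plane marker and a far-plane marker occupying the same projected position at a lateral offset d = 50 mm from the image center would appear at 50 mm and 45 mm respectively, a separation of

d·(1 − r) = 50 mm × (1 − 0.9) = 5 mm,

whereas the same pair of markers at the image center (d = 0) would appear exactly on top of each other.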
In some embodiments, the algorithm requires information to be gathered on the near and far plane positions of the markers 730 on the image. That is, the user can indicate, using software or an automatic scan of the image, the spacing between markers 730, as shown in
In some embodiments, a method of implementing this system of two orthogonal fluoroscopy images to control a robot 15 can involve combining a robot 15 and fluoroscopy unit into a single interconnected device. This combination could have several advantages. For example, a conventional rotating turntable mechanism could be incorporated that could swing the fluoro arm into place while at the same time swinging the robot arm 23 out of place (since the robot 15 would typically not be in the surgical field 17 at the same time as the fluoro arm). Furthermore, in some embodiments, the size of the robot arm 23 could be reduced compared to the stand-alone robot 15 because the fluoro arm's mass would serve as a counter-balance weight to help stabilize the robot arm 23. Moreover, in some embodiments, with integration, the fluoroscopy unit can more quickly transfer the image to the computer 100 and overlay it with a graphical plot, for instance, as line segments starting at the center of the image and extending radially (similar to pie slices) around the image to facilitate appropriate marker 730 overlay. In some embodiments, overlaid near and far plane markers 730 should always fall on the same ray if the plates 690 with embedded markers 730 on the subject are aligned substantially parallel (see for example
Some embodiments can include mapping a 3D anatomical coordinate system on to two 2D orthogonal views (and vice versa) while considering parallax. For example, in some embodiments, a rigid frame is mounted to the patient and two perpendicular x-rays are taken to create a 3D coordinate system. To define this 3D coordinate system, a method is needed to map points from the 2D views (each with parallax) to the 3D volume and vice versa. The 3D coordinate system has coordinates x, y, z while the two 2D coordinate systems have coordinates xAP,zAP and xLat,zLat (“AP” for “anteroposterior” and “Lat” for “lateral” views).
In some embodiments, it can be assumed that the x-ray path from emitter to receiver is conical, and therefore linear interpolation/extrapolation can be used to adjust the positions of represented points. In some embodiments, software can calculate the distance of each landmark from the center of the image (indicated by dashed or dotted arrows). These distances, together with the known distance between near plane and far plane plates, can provide the necessary information to account for the parallax shift when mapping graphical objects whose positions are known in 3D back on to this 2D image.
Some embodiments can include solving to map x,y,z onto xAP,zAP and xLat,zLat. For example, consider two intermediate 2D AP and lateral views, which can be represented, for example, as linear scalings of the registered image coordinates:

xta = sAP·xAP + xoa, zta = sAP·zAP + zoa

ytl = sLat·xLat + yol, ztl = sLat·zLat + zol
Where xta and zta can be called temporary scaled values of x and z in the AP plane, ytl and ztl are temporary scaled values of y and z in the Lat plane, sAP is the scaling factor in the AP plane, determined from the known near-plane marker spacing, sLat is the scaling factor in the Lat plane, determined from the known near-plane marker spacing of the lateral markers, and xoa, zoa, yol, and zol are offsets in the AP and Lat planes that position the markers so that they appear centered about the image, determined from the registered positions of the markers on the images. In other words, (xta, zta)=(0,0) represents the center of the AP image and (ytl, ztl)=(0,0) represents the center of the lateral image. These planar values would be enough to display a 2D representation if no parallax were present or if only near-plane markers were being displayed.
In some embodiments, to find xoa, zoa, yol, and zol, consider pairs of points on the x-rays, because the ratio of distance from center on the x-ray is the same as the ratio of distance from center on the temporary scaled values. For example, for two points whose temporary scaled values xta1 and xta2 are known and whose registered image positions are xAP1 and xAP2, one consistent solution for the offset is:

xoa = (xta2·xAP1 − xta1·xAP2) / (xAP1 − xAP2)
In some embodiments, it can be seen from this equation that it is important to stay away from pairs of points where xAP1 ≈ xAP2, because this would result in a divide-by-zero error. Similar equations can be written for zoa, yol, and zol, for example:

zoa = (zta2·zAP1 − zta1·zAP2) / (zAP1 − zAP2)

yol = (ytl2·xLat1 − ytl1·xLat2) / (xLat1 − xLat2)

zol = (ztl2·zLat1 − ztl1·zLat2) / (zLat1 − zLat2)
This mapping to temporary scaled values gets the near-plane markers mapped correctly, but an adjustment is needed to account for any position other than the near plane, for example:

xta = ka(y)·x, zta = ka(y)·z

ytl = kl(x)·y, ztl = kl(x)·z
As specified, ka is a function of y and kl is a function of x. For ka, this function is a linear interpolation function, in which if y is the y position of the near plane (yn), then ka=1, and if y is the y position of the far plane (yf), then ka is the ratio of far plane spacing to near plane spacing, ra. For kl, this function is a linear interpolation function, in which if x is the x position of the near plane (xn), then kl=1, and if x is the x position of the far plane (xf), then kl is the ratio of far plane spacing to near plane spacing, rl. Note that yn, yf, xn, and xf are in a coordinate system with the origin at the center of the image.
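Written out explicitly, linear interpolation functions satisfying these endpoint conditions are:

ka(y) = 1 + (ra − 1)·(y − yn)/(yf − yn)

kl(x) = 1 + (rl − 1)·(x − xn)/(xf − xn)

so that ka(yn) = 1, ka(yf) = ra, kl(xn) = 1, and kl(xf) = rl.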
Combining these equations gives the forward mapping from a 3D point (x, y, z) to its perceived coordinates on the two views.
It should also be possible to map xAP, zAP, xLat, and zLat back onto x, y, z. This gives four equations in four unknowns. Because ka is a function of y while kl is a function of x, substituting one interpolation function into the other yields a quadratic equation in the first unknown (for example, xta), which can be solved with the quadratic formula; ytl, ztl, and zta then follow by back-substitution into the corresponding equations. The system can also be solved in the opposite order, substituting to obtain a quadratic in ytl and again applying the quadratic formula, which gives another option for computing z.
From these equations, it is possible to go from a known x,y,z coordinate to the perceived xAP,zAP and xLat,zLat coordinates on the two views, or to go from known xAP,zAP and xLat,zLat coordinates on the two views to an x,y,z coordinate in the 3D coordinate system. It is therefore possible to plan a trajectory on the xAP,zAP and xLat,zLat views and determine what the tip and tail of this trajectory are, and it is also possible to display on the xAP,zAP and xLat,zLat views the current location of the robot's end effectuator.
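The following minimal numerical sketch (Python) illustrates the same mapping. For clarity it works in the centered, temporary scaled coordinates (the pixel scaling and offsets described above are omitted) and replaces the closed-form quadratic solution with fixed-point iteration; all parameter names, such as the dictionary p holding yn, yf, ra, xn, xf, and rl, are illustrative assumptions:

```python
# Illustrative sketch of the parallax-aware mapping between the 3D
# coordinate system and the two 2D views, in centered scaled coordinates.
def k(coord, near, far, ratio):
    """Linear interpolation of parallax magnification between planes."""
    return 1.0 + (ratio - 1.0) * (coord - near) / (far - near)

def forward(x, y, z, p):
    """Map a 3D point to perceived (xAP, zAP) and (yLat, zLat)."""
    ka = k(y, p['yn'], p['yf'], p['ra'])     # AP factor depends on depth y
    kl = k(x, p['xn'], p['xf'], p['rl'])     # Lat factor depends on depth x
    return ka * x, ka * z, kl * y, kl * z

def inverse(xAP, zAP, yLat, zLat, p, iters=50):
    """Recover (x, y, z) from the two views by fixed-point iteration."""
    x, y = xAP, yLat                          # initial guess: no parallax
    for _ in range(iters):
        x = xAP / k(y, p['yn'], p['yf'], p['ra'])
        y = yLat / k(x, p['xn'], p['xf'], p['rl'])
    z_ap = zAP / k(y, p['yn'], p['yf'], p['ra'])
    z_lat = zLat / k(x, p['xn'], p['xf'], p['rl'])
    return x, y, 0.5 * (z_ap + z_lat)         # average the two z estimates
```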
In some embodiments, additional measurement hardware (for example, conventional ultrasound, laser, optical tracking, or a physical extension like a tape measure) can be attached to the fluoro unit to measure distance to the attached plates, or other points on the anatomy to ensure that plates are parallel when fluoro images are obtained.
In some embodiments, the identity of the surgical instrument 35 can be used by the control system for the computing device 3401 or other controller for the surgical robot system 1. In some embodiments, the control system 3401 can automatically adjust axial insertion forces and/or applied torques depending upon the identity of the surgical instrument 35.
In some embodiments, when performing a typical procedure for needle 7405, 7410 or probe insertion (for biopsy, facet injection, tumor ablation, deep brain stimulation, etc.) a targeting fixture 690 is first attached by the surgeon or technician to the patient 18. The targeting fixture 690 is either clamped to bone (open or percutaneously), adhered as a rigid object to the skin, or unrolled and adhered to the skin (for example using the flexible roll shown as 705 in
In some embodiments, once a targeting fixture 690 is attached, the patient 18 can receive an intraoperative 3D image (Iso-C, O-Arm, or intraoperative CT) with radio-opaque markers 730 included in the field of view along with the region of interest. In some embodiments, for best accuracy and resolution, a fine-slice image is preferred (CT slice spacing=1 mm or less). The 3D scan has to include both the radio-opaque markers 730 and the anatomy of interest; not including both would prevent calibration to the robot 15.
In some embodiments, the 3D image series is transferred to (or acquired directly to) the computer 100 of the robot 15. The 3D image has to be calibrated to the robot's position in space using the locations on the 3D image of the radio-opaque markers 730 that are embedded in the targeting fixture 690. In some embodiments, this calibration can be done by the technician scrolling through image slices and marking them using the software, or by an algorithm that automatically checks each slice of the medical image, finds the markers 730, and verifies that they are the markers 730 of interest based on their physical spacing (the algorithm is documented herein). In some embodiments, to ensure accuracy, limit subjectivity, and speed up the process, image thresholding is used to help define the edges of the radio-opaque marker 730, and then to find the center of the marker 730 (the program is documented herein). Some embodiments of the software can do the necessary spatial transformations to determine the location in the room of the robot's markers relative to anatomy through standard rigid body calculations: for example, by knowing the locations of the radio-opaque markers 730 in the coordinate system of the medical image, knowing the locations of the active markers 720 on the calibration frame 700 relative to these radio-opaque markers 730, and monitoring the locations of the active markers on the robot 15 and targeting fixture 690.
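A compact sketch of the kind of “standard rigid body calculation” involved (Python; this is the well-known Kabsch least-squares fit, offered as one reasonable choice rather than the specific method of the system): given corresponding marker coordinates P in one coordinate system and Q in another, it returns the rotation R and translation t relating them:

```python
# Illustrative least-squares rigid registration (Kabsch algorithm)
# between corresponding marker sets, e.g., radio-opaque marker centers
# in image coordinates and the same markers in camera coordinates.
import numpy as np

def rigid_transform(P, Q):
    """Find R, t minimizing sum ||R @ P_i + t - Q_i||^2 over rows."""
    Pc, Qc = P.mean(axis=0), Q.mean(axis=0)
    H = (P - Pc).T @ (Q - Qc)                 # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                        # proper rotation (det = +1)
    t = Qc - R @ Pc
    return R, t
```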
Some embodiments allow the surgeon to use the software to plan the trajectories for needles/probes 7405, 7410. In some embodiments, the software will allow any number of trajectories to be stored for use during the procedure, with each trajectory accompanied by a descriptor.
In some embodiments, the robot 15 is moved next to the procedure table and cameras 8200 for tracking robot 15 and patient 18 are activated. The cameras 8200 and robot 15 are positioned wherever is convenient for the surgeon to access the site of interest. The marker mounts on the robot 15 have adjustable positions to allow the markers 720 to face toward the cameras 8200 in each possible configuration. In some embodiments, a screen can be accessed to show where the robot 15 is located for the current Z-frame 72 position, relative to all the trajectories that are planned. In some embodiments, the use of this screen can confirm that the trajectories planned are within the range of the robot's reach. In some embodiments, repositioning of the robot 15 is performed at this time to a location that is within range of all trajectories. Alternately or additionally, in some embodiments, the surgeon can adjust the Z-frame 72 position, which will affect the range of trajectories that the robot 15 is capable of reaching (converging trajectories require less x-y reach the lower the robot 15 is in the z-axis 70). During this time, substantially simultaneously, a screen shows whether markers 720, 730 on the patient 18 and robot 15 are in view of the cameras 8200. Repositioning of the cameras 8200, if necessary, is also performed at this time for good visibility.
In some embodiments, the surgeon then selects the first planned trajectory and he/she (or assistant) presses “go”. The robot 15 moves in the x-y (horizontal) plane and angulates roll 62 and pitch 60 until the end-effectuator 30 tube intersects the trajectory vector (see
In some embodiments, the surgeon then drives Z-frame 72 down until the tip of the end-effectuator 30 reaches the desired distance from the probe's or needle's target (typically the skin surface). While moving, the projected laser beam point should remain at a fixed location since movement is occurring along the trajectory vector. Once at the desired Z-frame 72 location, in some embodiments, the surgeon or other user can select an option to lock the Z-tube 50 position to remain at the fixed distance from the skin during breathing or other movement. At this point, the surgeon is ready to insert the probe or needle 7405, 7410. If the length of the guide tube 50 has been specified, and a stop on the needle 7405, 7410 or probe is present to limit its advancement after some length has been passed through the guide tube 50, the ultimate location of the tip of the probe/needle 7405, 7410 can be calculated and displayed on the medical image in some embodiments. Additionally, as described earlier, in some embodiments it is possible to incorporate a mechanism at the entry of the guide tube 50 that comprises a spring-loaded plunger 54 with a through-hole, and that electronically measures the depth of depression of the plunger 54, corresponding to the amount by which the probe or needle 7405, 7410 currently protrudes from the tip of the guide tube 50.
In some embodiments, at any time during the procedure, if there is an emergency and the robot 15 is in the way of the surgeon, the “E-stop” button can be pressed on the robot 15, at which point all axes except the Z-frame axis 72 become free-floating and the robot's end-effectuator 30 can be manually removed from the field by pushing against the end-effectuator 30.
Some embodiments can include a bone screw or hardware procedure. For example, during a typical procedure for conventional screw or hardware insertion in the spine, the patient 18 is positioned prone (or other position) on the procedure table, and is supported. In some embodiments, a targeting fixture 690 is attached to the patient's spine by the surgeon or technician. In some embodiments, the targeting fixture 690 is either clamped to bone (open or percutaneously) or unrolled and adhered to the skin (for example using roll 705). The roll 705 could have a disposable drape incorporated. If a flexible roll 705 is used, reflective markers 720 will then be snapped into place in some embodiments.
In some embodiments, once a targeting fixture 690 is attached, the patient 18 can undergo intraoperative 3D imaging (Iso-C, O-Arm, or intraoperative CT) with radio-opaque markers 730 included in the field of view along with the bony region of interest. In some embodiments, for best accuracy and resolution, a fine-slice image is preferred (where the CT slice spacing=1 mm or less). The 3D scan in some embodiments has to include both the radio-opaque markers 730 and the bony anatomy; not including both would prevent calibration to the robot 15.
In some embodiments, the 3D image series is transferred to (or acquired directly to) the computer 100 of the robot 15, and the 3D image is calibrated in the same way as described above for needle 7405, 7410 or probe insertion. The surgeon then uses the software to plan the trajectories for hardware instrumentation (e.g., pedicle screw, facet screw). Some embodiments of the software will allow any number of trajectories to be stored for use during the procedure, with each trajectory accompanied by a descriptor that may just be the level and side of the spine where screw insertion is planned.
In some embodiments, the robot 15 is moved next to the table and cameras 8200 for tracking robot 15 and patient 18 are activated. The cameras 8200 are positioned near the patient's head. In some embodiments, the markers for the robot 15 are facing toward the cameras 8200, typically in the positive y-axis 68 direction of the robot's coordinate system. In some embodiments, a screen can be accessed to show where the robot 15 is located relative to all the trajectories that are planned for the current Z-frame 72 position. Using this screen it can be confirmed that the trajectories planned are within the range of the robot's reach. In some embodiments, repositioning of the robot 15 to a location that is within range of all trajectories is performed at this time. Alternately or additionally, in some embodiments, the surgeon can adjust the Z-frame 72 position, which will affect the range of trajectories that the robot 15 is capable of reaching (converging trajectories require less x-y reach the lower the robot 15 is in Z). During this time, simultaneously in some embodiments, a screen shows whether markers 720 on the patient 18 and robot 15 are in view of the cameras 8200. Repositioning of the cameras 8200, if necessary, is also performed at this time for good visibility.
In some embodiments, the surgeon then selects the first planned trajectory and he/she (or assistant) presses “go”. The robot 15 moves in the x-y (horizontal) plane and angulates roll 62 and pitch 60 until the end-effectuator 30 tube intersects the trajectory vector. During the process of driving to this location, in some embodiments, a small laser light will indicate end-effectuator 30 position by projecting a beam down the trajectory vector toward the patient 18. This laser simply snaps into the top of the end-effectuator guide tube 50. When the robot's end-effectuator guide tube 50 coincides with the trajectory vector to within the specified tolerance, auditory feedback is provided in some embodiments to indicate that the desired trajectory has been achieved and is being held. In some embodiments, movement of the patient 18 or robot 15 is detected by optical markers 720 and the necessary x-axis 66, y-axis 68, roll 62, and pitch 60 axes are adjusted to maintain alignment.
In some embodiments of the invention, the surgeon then drives Z-frame 72 down until the tip of the end-effectuator 30 reaches a reasonable starting distance from the site of operation, typically just proximal to the skin surface or the first tissues encountered within the surgical field 17. While moving, the projected laser beam point should remain at a fixed location since movement is occurring along the trajectory vector. Once at the desired location, the user may or may not select an option to lock the Z-tube 50 position to remain at the fixed distance from the anatomy during breathing or other movement.
One problem with inserting conventional guide-wires and screws into bone through any amount of soft tissue is that the screw or wire may sometimes deflect, wander, or “skive” off of the bone along an undesired trajectory if it does not meet the bone orthogonally to the bone surface. To overcome this difficulty, some embodiments can use a specially designed and coated screw specifically intended for percutaneous insertion. Some other embodiments can use an end-effectuator 30 tip fitted with a guide tube 50 or dilator, capable of being driven all the way down to the bone. In this instance, the guide tube 50 needs to have a sharp (beveled) leading edge 30b, and may need teeth or another feature to secure it well to the bone once in contact. This beveled tube 50 (i.e., a guide tube 50 that includes beveled leading edge 30b) is driven through soft tissue and next to bone through one of two different methods using the surgical robot system 1 as described.
In applications where conventional screws are to be driven into bone, the surgeon may want to move the end-effectuator tip 30, fitted with a guide tube 50 or a conventional dilator, all the way down to the bone. Referring to
In some embodiments, the Z-tube axis 64 is fitted with a conventional force sensor with continuous force readings being displayed on the screen (such as display means 29). In some embodiments, the Z-frame 72 is then driven down into tissue while continuously adjusting the x-axis 66 and y-axis 68 to keep the tube 50 aligned with the trajectory vector. In some embodiments, the steps of 6210, 6215, 6220, 6225, 6230, 6235, 6240, 6245, 6250 and 6255 can be used to drive the tube 50 toward the target. In this instance, roll 62 and pitch 60, which define orientation, should not change while the x-axis 66, y-axis 68, and Z-frame 72 (as the z-axis 70) move along this vector, and the Z-tube 50 is held rigidly locked at mid-range. For this procedure, in some embodiments, the Z-tube 50 stiffness must be set very high, and a conventional mechanical lock may be required. In some embodiments, if the Z-tube 50 is not stiff enough, a counter force from the tissues being penetrated may cause it to move back in the opposite direction of Z-frame 72, and the tube 50 will not have any net advancement. In some embodiments, based on the surgeon's previous experience and lab testing, Z-frame 72 is driven down until a force level from the monitored force on Z-tube 50 matches the force typical for collision with bone (step 6260).
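The control logic just described can be sketched as a simple loop (Python; the robot, sensor, and trajectory interfaces, and the threshold value, are hypothetical placeholders rather than the actual system API):

```python
# Illustrative sketch: advance the Z-frame until the monitored Z-tube
# force matches the level expected for contact with bone, while
# correcting x and y to stay on the planned trajectory vector.
BONE_CONTACT_FORCE_N = 15.0   # assumed threshold from prior lab testing

def advance_to_bone(robot, force_sensor, trajectory, step_mm=0.5):
    while force_sensor.read_newtons() < BONE_CONTACT_FORCE_N:
        robot.step_z_frame(-step_mm)            # drive down along z
        dx, dy = trajectory.xy_correction(robot.pose())
        robot.step_xy(dx, dy)                   # stay on the trajectory vector
    robot.lock_axes()                           # hold position at bone contact
```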
In some alternative embodiments, Z-tube 50 is positioned near the top of its range and Z-frame 72 is advanced (while adjusting x-axis 66 and y-axis 68 to stay on the trajectory vector) until the tube 50 tip is near the outermost border of dissected tissue (i.e. skin during percutaneous procedures). In some embodiments, the Z-tube's motor 160 is then deactivated to allow it to move freely while still monitoring its position (step 6270). In some embodiments, the surgeon then pushes the end-effectuator 30 down while x-axis 66, y-axis 68, roll 62, and pitch 60 adjustments can allow the tube 50 to be aligned with the trajectory vector (step 6275). Moreover, since the Z-tube 50 is passive, in some embodiments, the surgeon can manually force the tube 50 to advance until he/she experiences the tactile sense of the tube hitting bone, at which point the Z-tube 50 position is locked (motor 160 activated) by the surgeon or assistant (step 6280, 6285).
At this point, in some embodiments, the guide tube 50 is adjacent to bone and the surgeon may wish to drill into the bone with a conventional guide-wire or drill bit, or insert a screw. For screw prep and insertion, in some embodiments, the surgeon either uses a method that incorporates guide-wires, or a method that does not use guide-wires.
Some embodiments include a guide-wire method. For example, in some embodiments, a guide-wire is drilled into bone through the guide tube 50. After the guide-wire is in place, Z-frame 72 and tube 50 are driven upward along the trajectory vector until outside the body. In some embodiments, the tube is then released with a quick release from the robot's end-effectuator 30 so it can be positioned at the next trajectory. In some embodiments, a cannulated screw, already commonly used in spine surgery, can then be driven in place over the guide-wire.
Some embodiments include a non-guide-wire method. For example, a pilot hole may or may not be drilled first. In some embodiments, a screw is then driven into bone directly through the guide tube 50, which abuts bone. In some embodiments, the tip of the screw may have the special non-skiving design mentioned above.
In some embodiments, if hardware other than a screw is being inserted, the surgeon may wish to dilate soft tissue. In some embodiments, a dilated path would enable larger and/or more tools and implants to be inserted. In some embodiments, dilation is performed by sliding a series of larger and larger diameter tubes over the initial central shaft or tube. In some embodiments, a series of dilators, specially designed to integrate to the robot's end-effectuator 30, sequentially snap on to each other for this purpose.
In some embodiments, after the screw or hardware has been inserted in the first trajectory, the surgeon drives the robot 15 back up the trajectory vector away from the patient 18. In some embodiments, after the end-effectuator 30 is clear of the patient 18 in the Z direction, the next trajectory is selected and the robot 15 repeats the above steps.
In some embodiments, at any time during the procedure, if there is an emergency and the robot 15 is in the way of the surgeon, the “E-stop” button can be pressed on the robot 15, at which point all axes except Z-frame 72 become free-floating, and the robot's end-effectuator 30 can be manually removed from the field by pushing against the end-effectuator.
In some embodiments, for nerve avoidance during medical procedures, a special conventional dilator tube (not shown) can be used with the robot 15. In some embodiments, the dilator tube can include multiple electrodes at its tip that can be sequentially activated to determine not only whether a nerve is nearby, but also which radial direction is the nearest direction toward the nerve. Some embodiments incorporate this guide tube 50 and can identify the nerve, warn the user, or incorporate automatic algorithms to steer clear of the nerve.
In some embodiments, it is known that pairs of bone screws such as pedicle screws have better resistance to screw pullout if they are oriented so that they converge toward each other. In some embodiments, for the best potential biomechanical stability, a two-screw surgical construct can consist of specially designed conventional screws that would interconnect in the X Z plane (not shown). That is, one screw can have a socket to accept a threaded portion of the other screw so that the screws interconnect at their tips. A procedure such as this requires exceptional accuracy, otherwise the screw tips would not properly intersect, and is therefore especially well-suited for a surgical robot 15. This type of hardware is useful with certain embodiments of the invention.
In some embodiments, instead of only straight lines, the surgeon has several options for trajectory planning: straight, curved, or boundary for safe-zone surgery. For curved pathway planning, in some embodiments, the surgeon can draw a path on the medical image that has curvature of a user-selectable radius. In some embodiments, special conventional needles and housings can be used to execute these curved paths. In safe zone surgery (tumor or trauma), in some embodiments, the surgeon first plans a box or sphere around the region on the medical image within which the probe tip, incorporating a drill or ablation instrument, will be allowed to reside. In some embodiments, the robot 15 is driven down along a trajectory vector, either automatically or manually as described above, to position the tip of the probe in the center of the safe zone. In some embodiments, the surgeon would then be able to pick the tool's axis of rotation (orthogonal to the long axis) based on the desired impact he/she would like, for the purpose of preserving tissue and maximizing efficiency and effectiveness for the task at hand. For example, in some embodiments, an axis of rotation at the surface of the skin could be selected to minimize the amount by which the tool travels laterally and rips the skin.
In some embodiments, the robot 15 uses optical markers for tracking. Some embodiments are able to provide accurate localization of the robot 15 relative to the patient 18, and utilize the LPS because of the advantage of not being limited to line-of-sight. Additionally, in some embodiments, probes utilizing RF emitters on the tip (capable of being tracked by the LPS) can be used for steering flexible probes inside the body. In some embodiments, if the LPS is not yet functional for localization, then localization can be performed using an electromagnetic system such as the Aurora by Northern Digital. Aurora® is a registered trademark of Northern Digital Inc. For example, in this instance, an electromagnetic coil and RF emitters are both present in the probe tip. Some embodiments can offer the option of LPS or electromagnetic localization with steerable needles 7600. In this embodiment of the invention, the surgeon can monitor, in real time on the medical image, the location where the probe tip is currently positioned, and activate RF electrodes to advance and steer the probe tip in the desired direction using a joystick.
As discussed earlier, in some embodiments, the end-effectuator 30 can include a bayonet mount 5000 used to removably couple the surgical instrument 35 to the end-effectuator 30 as shown in
In some embodiments, the surgeon would make a stab incision in the midline and then slide the clamps 6302 of the clamping piece 6300 down along the sides of the spinous process 6301, pushing tissue away as the tip of the clamping piece is advanced. In some embodiments, the leading edge of the clamping mechanism 6300 would be beveled (see the leading edges 6305 of each clamp 6302 of the clamping mechanism 6300), and have a shape similar to a periosteal elevator. This allows the clamping mechanism 6300 to separate the muscle tissue from the bony spinous process 6301 as it is advanced. In some embodiments, the leading edges 6305 of the clamping mechanism 6300 can be electrified to enable it to more easily slide through muscle and connective tissues, and to prevent excessive bleeding.
In some embodiments, a mechanism activated from farther back on the shaft (for example a turn screw 6320, or conventional spring, etc.) can be activated to deploy clamp teeth 6330 on the clamps 6302. The same mechanism or another mechanism would close and compress the clamps 6302 together to firmly secure the clamping mechanism 6300 to the spinous process 6301 (see
The embodiments as described above and shown in
As described above, the opaque markers 730 must be included in a CT scan of the anatomy. However, it is desirable to crop CT scans as close as possible to the spine to improve resolution. In some embodiments, instead of using markers 730 near where the active markers 720 are located, an alternative is to have a rigid extension containing opaque markers 730 that are temporarily attached near the spine when the scan is taken. In some embodiments, the clamping piece 6300 can be coupled with, or otherwise modified with a targeting fixture 690. For example,
In some embodiments, it may also be desirable to mount the targeting fixture 690 to another piece that is already rigidly attached to the patient 18. For example, for deep brain stimulation or other brain procedure where the patient 18 is positioned in a Mayfield head holder, the head holder could serve as an attachment point for the targeting fixture 690. Since the head holder 6700 and skull form a rigid body, it is possible to track the head holder 6700 under the assumption that the skull moves the same amount as the head holder 6700. Further, in some embodiments of the invention, a surveillance marker (such as surveillance marker 710 as illustrated in
One problem with some robotic procedures is that the guide tube 50 must be physically rigidly mounted to the robot's end-effectuator, and therefore mounting one or more dilator tubes can be challenging. To address this problem, in some embodiments, dilators can be placed over the central guide-tube 50 without removing the robot end-effectuator 30. For example, some embodiments can include an end-effectuator 30 that includes at least one dilator tube 6800, 6810. For example,
In some further embodiments, the system 1 can include an end-effectuator 30 that is coupled with at least one cylindrical dilator tube 6900. For example,
Some embodiments include tubes 6900 that comprise a polymeric material. In some embodiments, the tubes 6900 can include at least one radiolucent or radio-opaque material. In some embodiments, dilators 6900 may be radio-opaque so that their position may be easily confirmed by x-ray. Further, in some embodiments, the outermost dilator 6910 may be radiolucent so that the position of pathology drawn out through the tube, or implants or materials passed into the patient through the tube, may be visualized by x-ray.
As described earlier, in some embodiments, the use of conventional linear pulse motors 160 within the surgical robot 15 can permit establishment of a non-rigid position for the end-effectuator 30 and/or surgical instrument 35. In some embodiments, the use of linear pulse motors 160 instead of motors with worm gear drive enables the robot 15 to quickly switch between active and passive modes.
The ability to quickly switch between active and passive modes can be important in various embodiments, for example, when there is a need to position the robot 15 in the operative field or remove the robot 15 from the operative field. Instead of having to drive the robot 15 in or out of the operative field, in some embodiments, the user can simply deactivate the motors 160, making the robot 15 passive. The user can then manually drag it where it is needed, and then re-activate the motors 160.
The ability to be able to quickly switch between active and passive modes can be important for safe zone surgery. In some embodiments, the user can outline a region with pathology (for example a tumor 7300) on the medical images (see for example
In some further embodiments, the user can place restrictions (through software) on the range of orientations allowed by the tool within the safe zone (for example, boundary 7320, and displayed as boundary 7325 in
Some embodiments include curved and/or sheathed needles for a nonlinear trajectory to a target (for example, a tumor 7300 as described earlier). In some embodiments, with a curved trajectory, it is possible to approach targets inside the body of a patient 18 that might otherwise be impossible to reach via a straight-line trajectory. For example,
Some other embodiments may use a straight guide tube 50 with a wire or tool 7410 that may be curved or straight. For example,
In some further embodiments, a portion of the leading edge of the guide tube 7500 may be insulated (i.e., comprise a substantially non-electrically conductive area), and a portion of the leading edge may be uninsulated (i.e., comprise an inherently electrically conductive area). In this instance, it can be possible to determine the radial direction of the tube 7500 that is closest to the nerve by watching the response as the tube 7500 is rotated. That is, as the tube 7500 is rotated, the EMG nerve detection will have the most pronounced response when the uninsulated portion is nearest the nerve, and the least pronounced response when the uninsulated portion is farthest from the nerve. In some embodiments, it would then be possible for the user to manually steer the tube 7500, or for the robot 15 to automatically steer the tube 7500, farther away from the nerve. In addition, this modified tube 7500 could have a conventional fan-like retractor (not shown) that can be deployed to gently spread the underlying muscle fibers, thereby making an entry point for disk removal, or screw insertion. In some embodiments, the combination of EMG and gentle retraction can enhance the safety and outcomes of robotic assisted spinal surgery.
As described above, one way of taking advantage of the directional electromyographic response is for the user to manually rotate the tube 7500. In some other embodiments, the tube 7500 can be continuously oscillated back and forth, rotating about its axis while potentials are monitored. In some embodiments, to achieve the same function without rotating the tube 7500, the leading edge of the tube 7500 could have conductive sections that could be automatically sequentially activated while monitoring potentials. For example, in some embodiments, an array of two, three, four, or more electrodes 7510 (shown in
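The sequential-activation idea can be sketched as follows (Python; the electrode and EMG interfaces are hypothetical placeholders, and taking the strongest single response as the nerve direction is a simplifying assumption):

```python
# Illustrative sketch: sequentially stimulate tip electrodes while
# monitoring EMG, and estimate the radial direction of the nearest nerve
# as the angle of the electrode producing the strongest response.
def nearest_nerve_direction(electrodes, emg):
    responses = []
    for e in electrodes:            # e.g., 4 electrodes at 0/90/180/270 deg
        e.stimulate()
        responses.append(emg.peak_amplitude())
    strongest = max(range(len(electrodes)), key=lambda i: responses[i])
    return electrodes[strongest].angle_deg
```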
Some embodiments can include a steerable needle capable of being tracked inside the body. For example, U.S. Pat. No. 8,010,181, “System utilizing radio frequency signals for tracking and improving navigation of slender instruments during insertion in the body”, herein incorporated by reference, describes a steerable flexible catheter with two or more RF electrodes on the tip, which are used for steering. According to the method described in U.S. Pat. No. 8,010,181, the side or sides of the tip where the electrodes emit RF have less friction and therefore the probe will steer away from these sides.
In some embodiments of the invention, a steerable needle 7600 can be coupled with the system 1. In some embodiments, the system 1 can include a steerable needle 7600 coupled with the robot 15 through a coupled end-effectuator 30, the steerable needle 7600 capable of being tracked inside the body of a patient 18. For example,
During surgical procedures, pedicle screws or anterior body screws are inserted in two locations. However, there is a chance of failure due to screw pullout. To enhance resistance to pullout, screws are angled toward each other. For example, some embodiments can include intersecting and interlocking bone screws 7700 such as those illustrated in
Some embodiments of the system 1 can include conventional tracking cameras with dual regions of focus. For example, camera units such as Optotrak® or Polaris® from Northern Digital, Inc., can be mounted in a bar so that their calibration volume and area of focus are set. Optotrak® or Polaris® are registered trademarks of Northern Digital, Inc (see for example
In some embodiments, one solution to this issue is to set up two pairs of cameras 8200 with one camera shared; that is, cameras 1 and 2 form one pair, and cameras 2 and 3 form another pair. This configuration is the same as the Optotrak® system (i.e., three cameras in a single bar); however, the Optotrak® has only one volume and one common focal point. Conversely, some embodiments of the invention would be tuned to have two focal points and two volumes that would allow both the targeting fixture 690 and the robot 15 to be centered at the same time. In some embodiments, the orientations of the lateral cameras can be adjusted by known amounts with predictable impact on the focal point and volume.
In a further embodiment of the invention, two separate camera units (for example, two Polaris® units) can be mounted to a customized conventional bracket fixture including adjustment features (not shown). In some embodiments, this fixture would be calibrated so that the vectors defining the directions of the volumes and the distances to the focal points can be adjusted by known amounts. In some embodiments, the user could then point one Polaris® unit at the robot's markers, and the other Polaris® unit at the targeting fixture's 690 markers 720. The position of the adjustment features on the bracket would tell the computer what transformation is required to go from one camera's coordinate system to the other.
In some further embodiments, the cameras 8200 (such as Optotrak® or Polaris®) focused on a particular region could be further improved by a conventional automated mechanism to direct the cameras 8200 at the center of the target. Such a method would improve accuracy because in general, image quality is better toward the center of focus than toward the fringes. In some embodiments, conventional motorized turrets could be utilized to adjust azimuth and elevation of a conventional bracket assembly for aiming the cameras 8200 (and/or in conjunction with movement of cameras 8200 on camera arm 8210 as shown in
Some embodiments can include a snap-in end-effectuator 30 with attached tracking fixtures 690 (including active markers 720). For example, some embodiments include snap-in posts 7800 attached to the end-effectuator 30 and tracking fixtures 690. In some embodiments, the snap-in posts 7800 can facilitate orienting tracking markers 720 to face cameras 8200 in different setups by allowing markers 720 to be mounted to each end-effectuator 30.
The robot system 1 contains several unique software algorithms to enable precise movement to a target location without requiring an iterative process. In some embodiments, an initial step includes a calibration of each coordinate axis of the end-effectuator 30. During the calibration, the robot 15 goes through a sequence of individual moves while recording the movement of active markers 720 that are temporarily attached to the end-effectuator (see
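One step of such a calibration can be illustrated as follows (Python; the fitting approach and all names are assumptions offered for illustration, not the system's documented procedure): step a single axis through known encoder increments while recording a marker attached to the end-effectuator, then fit that axis's direction by least squares.

```python
# Illustrative sketch of calibrating one coordinate axis: fit the axis
# direction (and mm-per-count gain) from marker positions recorded while
# only that axis moves.
import numpy as np

def fit_axis_direction(marker_positions, encoder_counts):
    """marker_positions: (N,3) recorded points; encoder_counts: (N,) moves."""
    t = np.asarray(encoder_counts, dtype=float)
    P = np.asarray(marker_positions, dtype=float)
    t_c = t - t.mean()
    d = (P - P.mean(axis=0)).T @ t_c / (t_c @ t_c)   # least-squares slope
    gain = np.linalg.norm(d)                         # mm of travel per count
    return d / gain, gain                            # unit direction, gain
```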
In some embodiments, it is possible to mount optical markers 720 for tracking the movement of the robot 15 on the base of the robot 15, then to calculate the orientation and coordinates of the guide tube 50 based on the movement of sequential axes (see earlier description related to
In some embodiments, it is possible to mount markers 720 at either extreme or at an intermediate axis. For example, in some embodiments, the markers 720 can be mounted on the x-axis 66. Thus, when the x-axis 66 moves, so do the optical markers 720. In this location, there is less chance that the surgeon will block them from the cameras 8200 or that they would become an obstruction to surgery. Because of the high accuracy in calculating the orientation and position of the end-effectuator 30 based on the encoder outputs from each axis, it is possible to very accurately determine the position of the end-effectuator 30 knowing only the position of the markers on the x-axis 66.
Some embodiments include an algorithm for automatically detecting the centers of the radio-opaque markers 730 on the medical image. This algorithm scans the medical image in its entirety, looking for regions bounded on all sides by a border of sufficient gradient. If further markers 730 are found, they are checked against the stored locations and discarded if outside tolerance.
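A minimal sketch of this kind of detection on a single slice (Python; thresholding plus connected-component centroids is offered as one plausible realization consistent with the thresholding step described earlier, and the area limits are assumed values):

```python
# Illustrative sketch: threshold a CT slice, label bright connected
# regions, take intensity-weighted centroids as candidate marker centers,
# and leave spacing-based rejection to a later step.
import numpy as np
from scipy import ndimage

def find_marker_centers(slice_img, threshold, area_px=(5, 500)):
    mask = slice_img > threshold                  # radio-opaque = bright on CT
    labels, n = ndimage.label(mask)
    centers = []
    for i in range(1, n + 1):
        region = labels == i
        if area_px[0] <= region.sum() <= area_px[1]:
            centers.append(ndimage.center_of_mass(slice_img * region))
    return centers  # later filtered by known inter-marker spacing
```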
Some biopsy procedures can be affected by the breathing process of a patient, for example, when performing a lung biopsy. In some procedures, it is difficult for the clinician to obtain a sample during the correct breathing phase. The use of tracking markers 720 coupled to a bone of the patient cannot alone compensate for the breathing-induced movement of the target biopsy region. Some embodiments include a method of performing a lung biopsy with breathing correction using the system 1. Currently, for radiation treatment of lung tumors, breathing is monitored during CT scan acquisition using a “bellows” belt (see for example CT scanner 8000 in
Some embodiments include a method of performing a lung biopsy with breathing correction using the system 1. In some embodiments, a tracking fixture 690 is attached to the patient 18 near the biopsy site, and a bellows belt is attached at the patient's 18 waist. In some embodiments, a CT scan of the patient 18 is performed with the patient holding their breath, while monitoring the breathing phase. In some embodiments, a clinician locates the target (for example, a tumor) on the CT volume, and configures the robot 15 to the target using at least one of the embodiments as described earlier. In some embodiments, the robot 15 calibrates according to at least one embodiment described earlier. In some embodiments, the robot 15 moves into position above the biopsy site based on the location of at least one tracking marker 720, 730. In some embodiments, the bellows belt remains in place, whereas in other embodiments, the markers 720, 730 on the patient 18 can track the breathing phase. In some embodiments, based on the bellows or tracking markers 720, 730, the computer 100 of the computing device 3401 within platform 3400 can use robotic guidance software 3406 to send a trigger during the calibrated breathing phase to deploy a biopsy gun to rapidly extract a biopsy of the target (such as a tumor). In some embodiments, a conventional biopsy gun (or tool, such as biopsy gun tip 8100 in
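The phase-gated trigger can be sketched as a simple loop (Python; the bellows and biopsy-gun interfaces, the phase normalization, and the tolerance are hypothetical assumptions, not the system's actual API):

```python
# Illustrative sketch: fire the biopsy gun only when the bellows (or
# marker-derived) breathing signal re-enters the phase window recorded
# at CT acquisition.
def gated_fire(bellows, biopsy_gun, target_phase, tolerance=0.05):
    while True:
        phase = bellows.current_phase()      # 0..1 within breathing cycle
        if abs(phase - target_phase) < tolerance:
            biopsy_gun.deploy()              # rapid extraction at matched phase
            break
```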
Deep brain stimulation (“DBS”) requires electrodes to be placed precisely at targets in the brain. Current technology allows CT and MRI scans to be merged for visualizing the brain anatomy relative to the bony anatomy (skull). It is therefore possible to plan trajectories for electrodes using a 3D combined CT/MRI volume, or from CT or MRI alone. Some embodiments include robot 15 electrode placement for asleep deep brain stimulation using the system 1 where the acquired volume can then be used to calibrate the robot 15 and move the robot 15 into position to hold a guide 50 for electrode implantation.
In some embodiments, a Mayfield frame 6700 can be modified to include one possible configuration of active and radio-opaque markers (shown in
In some embodiments, the system 1 can perform the method steps 7910-7990 as outlined in
In some embodiments, the robot system 1 includes at least one mounted camera. For example,
Some embodiments include an arm 8210 and camera arm 8200 that can fold into a compact configuration for transportation of the robot system 1. For example,
Some embodiments can include methods for prostate 8330 immobilization with tracking for image-guided therapy. In some embodiments, to enable the insertion of a needle (7405, 7410, 7600, 8110 for example) into the prostate 8330 utilizing 3D image guidance, a 3D scan of the prostate 8330 relative to reference markers 720, 730 or another tracking system 3417 is needed. However, the prostate 8330 is relatively mobile and can shift with movement of the patient 18. In some embodiments, it may be possible to immobilize the prostate 8330 while also positioning and securing tracking markers 720 in close proximity to improve tracking and image guidance in the prostate 8330.
The prostate 8330 is anatomically positioned adjacent to the bladder 8320, the pubic bone 8310, and the rectum 8340 (see for example
In some embodiments, the balloon 8410 has the advantage that it can be inserted into the rectum 8340 un-inflated and then inflated in place. When inflated, it will displace the wall of the rectum 8340 and the prostate 8330 laterally toward the pubic bone 8310. In some embodiments, a paddle 8420 can cause lateral displacement of the rectal wall and prostate 8330 if a pivot point near the anus is used.
In some embodiments, it is possible to configure a device consisting of a balloon 8410 and paddle 8420 such that fiducials are embedded in the device, with these fiducials being detectable on the 3D medical image (for instance, MRI). For example,
In some embodiments, in addition to applying lateral force from the side of the rectum 8340, it is also possible to apply lateral force from the side of the abdomen of the patient 18. In some embodiments, this secondary lateral force, used in conjunction with the force from the rectal wall, may assist in keeping the prostate 8330 immobilized. Additionally, it can serve as a support to which the tracking markers 720 are attached, and as a support to which the rectal paddle/balloon 8420, 8410 can be attached for better stabilization. In some embodiments, the abdominal support can consist of a piece that presses from anterior toward posterior/inferior against the top of the bladder 8320 region. For example, conventional straps or pieces that encircle the legs can provide additional support. Since abdominal shape and leg shape vary among patients, some customization would be beneficial. In some embodiments, adjustable straps and supports made of thermoplastic material could be utilized for customization. In some embodiments, commercially available thermoplastic supports (for example, from Aquaplast Inc.) can be used. In some embodiments, the supports are formed by first dipping the support material in hot water to soften it, then applying the support to the patient's skin and molding it. After removal from the hot water, the temperature of the support material is low enough that it does not burn the skin, but warm enough that the material remains soft for 1-5 minutes. In some embodiments, when the support cools, it maintains the skin contours against which it has been formed. In some embodiments, this type of support, shaped like moldable briefs, could be made for immobilizing the prostate 8330. In this instance, the support would be dipped in hot water, and then external straps and/or manual pressure would be applied to force the support device to press down toward the prostate 8330. Further, in some embodiments, the support could be manufactured in two halves, molded while the two halves are tied together, and then removed (untied) when cool, so that it can later be reattached in the same configuration during the procedure.
In some embodiments, the combination of the elements described above (including the balloon 8410 and/or paddle 8420) enables real-time tracking of the prostate 8330, and manual or robotically assisted insertion of needles (for example, 7405, 7410, 7600, 8110) into the prostate 8330 based on targeting under image guidance. In some embodiments, the procedure can include the conventional abdominal support device as described above. The device would be prepared by dipping it in hot water until soft, then applying it to the patient such that gentle pressure is maintained from anterior to posterior/inferior against the bladder 8320 region and prostate 8330. In some embodiments, under palpation, the tracking device (paddle 8420 with coupled fixture 690 including markers 720 illustrated in
Some embodiments can use dual-mode prostate 8330 tracking for image-guided therapy. For example, in some embodiments, it is possible to accurately track the prostate 8330 using a combination of two tracking modalities, including fiber optic tracking. For this alternate method to be used, an optical tracker (fiber optic probe 8700) would first be applied externally. This probe 8700 would be registered to the 3D medical image (for example, using an MRI scan) in substantially the same way as previously described, such as for spine tracking using CT imaging. In some embodiments, after registering and calibrating so that the coordinate systems of the medical image and cameras 8200 are synchronized, a means of updating and correcting for movement of the prostate 8330 can be used. In some embodiments, the probe 8700 can comprise a fiber optic sensor with a Bragg grating. For example,
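As an illustrative aside, the conventional relation between a Bragg grating's wavelength shift and local fiber strain can be sketched as follows; the photo-elastic coefficient value is a textbook figure for silica fiber, temperature effects are neglected, and the wavelength values are invented for the demonstration:

    def fbg_strain(lambda_measured_nm, lambda_rest_nm, p_e=0.22):
        """Strain at one grating from its measured vs. unstrained Bragg
        wavelength, using delta_lambda / lambda = (1 - p_e) * strain."""
        return (lambda_measured_nm - lambda_rest_nm) / (lambda_rest_nm * (1.0 - p_e))

    # Gratings spaced along the fiber report independent wavelengths, so
    # strain (and, from paired gratings, bend) can be sampled along the probe.
    rest = [1545.0, 1550.0, 1555.0]       # nm, unstrained Bragg wavelengths
    meas = [1545.12, 1550.00, 1554.95]    # nm, measured during tracking
    print([fbg_strain(m, r) for m, r in zip(meas, rest)])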
In some embodiments, markings 8910 (gradations) capable of being visualized on MRI can be placed on the outer shaft of the probe 8700 (see for example,
In some embodiments, image-guided therapy can be performed using one or more of the embodiments as described. For example, in some embodiments, the fiber optic probe as depicted in
In some embodiments, the patient 18 is positioned outside or in the gantry of the MRI scanner before scanning. In some embodiments, the fiber optic tracking system 9100 is briefly activated to record the position of the fiber optic probe 8700 along its entire length for later reference (see
In some embodiments, an MRI scan is obtained. The scan must visualize the prostate 8330, the radio-opaque fiducials 730 on the targeting fixture 690, and the markings 8910 present along the urethral tube that will be tracked with the fiber optic probe 8700. In some embodiments, the position of the prostate 8330 along the fiber optic probe 8700 at the time of the scan is recorded from the radio-opaque markings 8910 on its surface.
In some embodiments, the patient is positioned on the procedure table, and optical tracking markers 720 are snapped into the targeting fixture (see
In some embodiments, the offset of the prostate 8330 from the position recorded on the MRI scan is determined from the optically sensed position of the probe 8700 relative to its position at the time of the MRI scan. In some embodiments, the surgeon plans trajectories for insertion of the needle 7405 into the prostate 8330 (from the medical image), and the robot 15 moves the guide tube 50 to the desired 3D location for a needle 7405 to be inserted to the desired depth (see
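A minimal sketch of this correction, assuming all coordinates are already expressed in the common registered frame and using illustrative values, is:

    import numpy as np

    def corrected_target(planned_target, prostate_at_scan, prostate_now):
        """Shift the planned needle target by the prostate's displacement
        between MRI acquisition and its currently sensed position."""
        offset = np.asarray(prostate_now) - np.asarray(prostate_at_scan)
        return np.asarray(planned_target) + offset

    target = corrected_target(planned_target=[10.0, 42.0, -5.0],
                              prostate_at_scan=[12.0, 40.0, -4.0],
                              prostate_now=[12.5, 39.2, -4.1])
    print(target)  # the guide tube 50 would be aimed at this updated point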
In some other embodiments, the probe 8700 could be inserted down the esophagus to track movement of the stomach, intestines, or any portion of the digestive system. In some embodiments, it could be inserted into a blood vessel to track the position of major vessels inside the body. In some embodiments, it could be inserted through the urethra into the bladder, ureters, or kidney. In all cases, it would help localize internal points for better targeting for therapy.
In some further embodiments, the probe 8700 could be combined with a conventional catheter for other uses. For example, fluid could be injected or withdrawn through a hollow conventional catheter that is attached along its length to the probe 8700. Further, in some embodiments, a conventional balloon catheter could also be utilized. The balloon could be temporarily inflated to secure a portion of the probe 8700 within the urethra, or other position inside the body, ensuring that the probe 8700 does not move forward or backward once positioned where desired.
A number of technologies for real-time 3D visualization of deforming soft tissue and bony anatomy without radiation are available and/or in development. In some embodiments, the surgical robot 15 can use these technologies during surgery or other image-guided therapy. In some embodiments, the use of real-time 3D visualization, automated non-linear path planning, and automated steering and advancement of flexible catheters or wires (for example, wires 7405, 7410, 8600, or 8110) along a non-linear path becomes increasingly important.
In some embodiments, it may be possible to visualize soft tissues in real time by combining MRI (magnetic resonance imaging) and ultrasound or contrast enhanced ultrasound (“CEUS”). For example, in some embodiments, an MRI scan and a baseline ultrasound scan would be obtained of the anatomy of interest. In some embodiments, landmarks visualized on the ultrasound would be correlated to the MRI (for example, borders of organs, blood vessels, bone, etc.). In some embodiments, a discrete set of key landmarks could be correlated such that the movement of other points of interest between these landmarks could be interpolated. In some embodiments, a computerized geometric model (with its unmoved baseline position corresponding to the anatomy seen on the MRI) would be created. Then, when movements of the landmark points are detected on ultrasound, the positions of the corresponding tissues visualized on the model can be adjusted. In some embodiments, the ultrasound would be allowed to run continuously, providing real-time data on the positions of the landmarks. In some embodiments, changes in landmark position would be used to update the model in real time, providing an accurate 3D representation of the soft tissues without exposure to radiation. In some embodiments, optical tracking markers 720 attached to the conventional ultrasound probes could provide data on the movement of the probes relative to the anatomy, which would affect the model calibration. In some embodiments, for accurate 3D positions of the points on the soft tissues, it may be necessary to utilize several conventional ultrasound probes locked in a rigid orientation relative to each other. In other embodiments, the ultrasound probes can be synchronized so that their relative positions are known or can be extracted. In some embodiments, optical markers 720 on multiple conventional ultrasound probes would allow registration of the multiple ultrasound probe orientations in the same coordinate system.
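The interpolation scheme is not specified above; as one simple possibility, the following Python sketch moves each non-tracked model point by an inverse-distance-weighted average of the tracked landmark displacements (the weighting exponent and the toy coordinates are assumptions):

    import numpy as np

    def deform_model(points, landmarks_base, landmarks_now, power=2.0, eps=1e-9):
        """Displace each model point by a weighted average of landmark
        displacements, with weights falling off with distance to each
        landmark in the baseline (MRI) pose."""
        points = np.asarray(points, float)
        base = np.asarray(landmarks_base, float)
        disp = np.asarray(landmarks_now, float) - base
        out = points.copy()
        for i, p in enumerate(points):
            d = np.linalg.norm(base - p, axis=1)
            w = 1.0 / (d ** power + eps)
            out[i] += (w[:, None] * disp).sum(axis=0) / w.sum()
        return out

    base = [[0, 0, 0], [10, 0, 0]]        # tracked landmarks at baseline
    now = [[0, 1, 0], [10, 0, 0]]         # ultrasound sees the first one move
    print(deform_model([[2.0, 0.0, 0.0]], base, now))  # nearby point follows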
In some further embodiments of the invention, other methods for assessing the distance to tissues of interest can be used, such as measuring the electrical conductivity, capacitance, or inductance of the tissues as a mild electrical current is applied.
In the modeling approach described above for visualizing soft tissues, it should be recognized that tracking a large number of landmarks helps ensure that the model is accurate. However, there is a trade-off: tracking a large number of landmarks may slow down the process, disallowing real-time updating or requiring a lengthy registration process. In some embodiments, as fewer landmarks are tracked, tissue modeling to predict deformation of the non-tracked parts of the model becomes increasingly important. In some embodiments, for tissue modeling, the elasticity and other mechanical qualities of the tissues are needed. It may be possible to assess the status of the tissues through a mechanism such as spectroscopy (where the absorbance of light passed through tissue might provide information on the composition of the tissues), electrical conductivity, DEXA scan, MRI scan, CT scan, or other means. This information could be provided to the computer model to allow better estimation of soft tissue deformation.
Another possible mechanism for visualizing soft tissues can include injecting a conventional liquid tracer into the patient 18 that causes different tissues to become temporarily detectable by an external scan. For example, the tracer could comprise a radioactive isotope that is attracted more to certain types of cells than others. Then, when the patient is placed near an array of conventional radiation sensors, the sensors could detect the concentrations of the isotope in different spatial locations.
Some embodiments include a mechanism to allow the user to control the advancement and direction of a flexible catheter or wire (for example wire 7405, 7410, 7600, or 8110) through an interface with the robot 15. In some embodiments, this mechanism can snap or lock into the robot's end-effectuator 30. In some embodiments, the guide tube 50 on the robot's end-effectuator 30 provides accurately controlled orientation and position of the catheter or wire at the point where it enters the patient. In some embodiments, the mechanism would then allow the user to control the rate and amount of advancement of the tube 50, the rate and amount of rotation of the tube 50, and activation of steering RF energy (for example, as described earlier with regard to steerable needle 7600 in
In some embodiments, a mechanism similar to the one described above can also be used for automatic hole preparation and insertion of screws. For example, in some embodiments, the end-effectuator 30 could have a conventional mechanism that would allow a tool to be retrieved from a conventional tool repository located somewhere outside the surgical field 17. In some embodiments, features on the tool holder would allow easy automated engagement and disengagement of the tool. In some embodiments, after retrieving the tool, the end-effectuator 30 would move to the planned screw location and drill a pilot hole by rotating the assembly at an optimal drilling speed while advancing. In some embodiments, the system 1 would then guide the robot 15 to replace the drill in the repository, and retrieve a driver with an appropriately sized screw. In some embodiments, the screw would then be automatically positioned and inserted. In some embodiments, during insertion of the screw, thrust and torque should be coordinated to provide good bite of the screw into bone. That is, the appropriate amount of forward thrust should be applied during rotation so the screw will not strip the hole.
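As an illustrative sketch of that coordination (not the disclosed control law), the axial feed can be slaved to the rotation speed so the screw advances exactly one thread pitch per revolution; the speed, pitch, and time step below are assumed values:

    def screw_feed_rate_mm_s(rpm, thread_pitch_mm):
        """Axial feed that matches the screw's own advance per revolution,
        so the threads cut rather than strip."""
        return (rpm / 60.0) * thread_pitch_mm

    def insertion_profile(depth_mm, rpm=60.0, pitch_mm=1.75, dt=0.1):
        """Yield (time_s, depth_mm) setpoints until the target depth."""
        feed = screw_feed_rate_mm_s(rpm, pitch_mm)
        t, z = 0.0, 0.0
        while z < depth_mm:
            t += dt
            z = min(depth_mm, z + feed * dt)
            yield t, z

    for t, z in insertion_profile(5.0):
        pass
    print(f"reached 5 mm at t = {t:.1f} s")  # 60 rpm x 1.75 mm/rev = 1.75 mm/s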
Some embodiments of the method also include algorithms for automatically positioning conventional screws. For example, in some embodiments, different considerations may dictate the decision of where the screw should be placed. In some embodiments, it may be desirable to place the screw into the bone such that the screw is surrounded by the thickest, strongest bone. In some embodiments, algorithms can be used to locate the best quality bone from CT or DEXA scans, and to find an optimized trajectory such that the width of bone around the screw is thickest, or remains within cortical instead of cancellous bone for the greatest proportion. In some embodiments, it may be desirable to place the screw into the bone at an entry point that is most perpendicular to the screw, or is at a “valley” instead of a peak or slope on the bony articulations. In some embodiments, by placing the screw in this way, it is less likely to skive or rotate during insertion and therefore likely to end up in a more accurate inserted location. In some embodiments, algorithms can be used to assess the surface and find the best entry point to guide the screw to the target, while penetrating the bone perpendicular to the bone surface. In other embodiments, it may be desirable to place screws in a multi-level case such that all the screw heads line up in a straight line or along a predictable curve. In some embodiments, by aligning screw heads in this way, the amount by which the surgeon must bend the interconnecting rod is minimized, reducing the time of the procedure, and reducing weakening of the metal rod due to repeated bending. In some embodiments, algorithms can be used that keep track of anticipated head locations as they are planned, and suggest adjustments to trajectories that provide comparable bony purchase, but better rod alignment.
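By way of illustration only, a candidate trajectory could be scored by combining integrated bone density along the path with a bonus for entering perpendicular to the bone surface; the density_at() sampler, the weights, and the toy field below are assumptions for this sketch:

    import numpy as np

    def score_trajectory(entry, direction, surface_normal, density_at,
                         depth_mm=40.0, step_mm=1.0, w_perp=10.0):
        """Higher is better: summed bone density sampled along the path plus
        a bonus for perpendicular entry (less tendency to skive)."""
        d = np.asarray(direction, float)
        d /= np.linalg.norm(d)
        samples = [np.asarray(entry, float) + d * s
                   for s in np.arange(0.0, depth_mm, step_mm)]
        density = sum(density_at(p) for p in samples)
        perpendicularity = abs(float(np.dot(d, surface_normal)))  # 1 = normal entry
        return density + w_perp * perpendicularity

    # Toy density field: denser bone near the origin.
    density_at = lambda p: 1.0 / (1.0 + np.linalg.norm(p))
    print(score_trajectory([0, 0, 0], [0, 0, 1], [0, 0, 1], density_at))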
Some embodiments of the invention can use an LPS system that uses the time-of-flight of RF signals from an emitter to an array of receivers to localize the position of the emitter. In some embodiments, it may be possible to improve the accuracy of the LPS system by combining it with other modalities. For example, in some embodiments, it may be possible to use a magnetic field, ultrasound scan, laser scan, CT, MRI, or other means to assess the density and position of tissues and other media in the region where the RF will travel. Since RF travels at different rates through different media (air, tissue, metal, etc.), knowledge of the spatial orientation of the media through which the RF will travel will improve the accuracy of the time-of-flight calculations.
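A minimal numeric sketch of that correction, with the path segmented into media of assumed propagation speeds (the speeds and timings below are placeholders, not measured values), is:

    def distance_from_tof(tof_s, segments):
        """segments: list of (fraction_of_path, speed_m_per_s) pairs.
        Solves tof = d * sum(frac_i / v_i) for the total distance d."""
        inv_eff_speed = sum(frac / v for frac, v in segments)  # seconds per meter
        return tof_s / inv_eff_speed

    c_air, c_tissue = 3.0e8, 2.0e8          # assumed RF propagation speeds
    tof = 5.0e-9                            # 5 ns emitter-to-receiver
    naive = tof * c_air                     # single-speed (all air) estimate
    corrected = distance_from_tof(tof, [(0.7, c_air), (0.3, c_tissue)])
    print(naive, corrected)                 # corrected distance is shorter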
In some embodiments, an enhancement to the robot 15 could include inserting a conventional ultrasound probe into the guide tube 50. In some embodiments, the ultrasound probe could be used as the guide tube 50 penetrates through soft tissue to help visualize what is ahead. As the guide tube 50 advances, penetrating soft tissue and approaching bone, the ultrasound probe would be able to detect contours of the bone being approached. In some embodiments, this information could be used as a visual reference to verify that the actual anatomy being approached is the same as the anatomy currently being shown on the 3D re-sliced medical image over which the robot is navigating. For example, in some embodiments, if a small protrusion of bone is being approached dead center on the probe/guide tube 50 as it is pushed forward, the region in the center of the ultrasound field representing the raised bone should show a short distance to bone, while the regions toward the perimeter should show a longer distance to bone. In some embodiments, if the position of the bony articulation on the re-sliced medical image does not appear to be lined up with the 2D ultrasound view of where the probe is approaching, this misalignment could be used to adjust the registration of the robot 15 relative to the medical image. Similarly, in some embodiments, if the distance of the probe tip to bone does not match the distance perceived on the medical image, the registration could also be adjusted. In some embodiments, where the guide tube 50 is approaching something other than bone, this method may also be useful for indicating when relative movement of internal soft tissues, organs, blood vessels, and nerves occurs.
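As a minimal sketch of that adjustment (assuming the discrepancy is corrected along the tube axis only, which is an illustrative simplification):

    import numpy as np

    def axial_registration_correction(measured_mm, predicted_mm, tube_axis):
        """Translation (mm) to add to the image-to-robot registration so the
        image-predicted distance to bone matches the ultrasound measurement."""
        axis = np.asarray(tube_axis, float)
        axis /= np.linalg.norm(axis)
        return (measured_mm - predicted_mm) * axis

    print(axial_registration_correction(12.5, 10.0, [0, 0, 1]))  # shift 2.5 mm in z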
Some embodiments can include a nerve sensing probe. For example, in some embodiments, to sense whether a penetrating probe is near a nerve, an electromyography ("EMG") response to an applied current could be used, enabling the robot 15 to steer around nerves. For example, as shown in
In some embodiments, the probe 9400 could be advanced manually or automatically and stopped, then the stimulating wire 9410 could be extended and current applied. In some embodiments, the EMG could be checked to verify whether a nerve is in proximity. In some embodiments, the stimulating wire 9410 could be retracted, and the probe 9400 rotated so that the portal for the stimulating wire 9410 is positioned at a different azimuth index. In some embodiments, the stimulating wire 9410 could again be extended to check for the presence of nerves in a different region ahead. In some embodiments, if a nerve is encountered, it would be known in which direction the nerve is located, and in which direction the probe 9400 would need to be steered to avoid it. In some embodiments, instead of a single wire 9410 extending and checking for a nerve, multiple wires 9410 could simultaneously be extended from several portals around the probe 9400. In some embodiments, the wires 9410 could be activated in sequence, checking for EMG signals and identifying which wire 9410 caused a response, to identify the direction to avoid or steer. In some embodiments, it could be necessary to fully retract the stimulating wires 9410 before attempting to further advance the probe 9400, to avoid blocking the progress of the probe 9400. In some embodiments, the stimulating wires 9410 would have a small enough diameter so as to be able to penetrate a nerve without causing nerve damage.
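The search loop described above can be sketched as follows; the probe interface (rotate, extend, retract, stimulate-and-read-EMG) is entirely hypothetical and is mocked here only so the sketch runs:

    def scan_for_nerve(probe, n_azimuths=8, current_mA=5.0):
        """Return azimuth indices (0..n-1) at which a nerve responded, so the
        steering direction can avoid them; [] means the path ahead is clear."""
        hits = []
        for k in range(n_azimuths):
            probe.rotate_to_azimuth(k)       # position the wire portal
            probe.extend_wire()
            if probe.stimulate_and_read_emg(current_mA):
                hits.append(k)
            probe.retract_wire()             # always retract before advancing
        return hits

    class MockProbe:
        """Stand-in for a probe 9400 hardware interface (hypothetical)."""
        def __init__(self, nerve_azimuths):
            self.nerves = set(nerve_azimuths)
            self.az = 0
        def rotate_to_azimuth(self, k):
            self.az = k
        def extend_wire(self):
            pass
        def retract_wire(self):
            pass
        def stimulate_and_read_emg(self, mA):
            return self.az in self.nerves

    print(scan_for_nerve(MockProbe({2})))    # -> [2]: steer away from azimuth 2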
As noted elsewhere in this application, the trajectories executed by the robot 15 for paths into a patient 18 are planned using software (for example, at least one module of the software 3406 running on the computing device 3401 including computer 100), where the desired vectors are defined relative to the radio-opaque markers 730 on the image and therefore relative to the active markers 720 on the targeting fixture 690. In some embodiments, these trajectories can be planned at any time after the image is acquired, before or after registration is performed. In some embodiments, it is possible that this trajectory planning can be done on another computerized device. For example, in some embodiments, a conventional portable device (such as a tablet computer, a laptop computer, or a smartphone) could be used. In some embodiments, the 3D image volume would be transferred to the portable device, and the user would then plan and save the desired trajectories. In some embodiments, when robotic control is needed, this same image volume could be loaded on the console that controls the robot 15, and the trajectory plan could be transferred from the portable device. In some embodiments, using this algorithm, it would therefore be possible for a series of patients 18 each to have a targeting fixture 690 applied and an imaging scan, such as a CT scan. In some embodiments, the 3D volume for each patient 18 could be exported to different portable devices, and the same or different surgeons could plan trajectories for each patient 18. In some embodiments, the same or a different robot 15 could then move from room to room. In some embodiments, in each room, the robot 15 would be sterilized (or have sterile draping applied), and would receive the scan and trajectory plan. The robot 15 would then execute the plan, and then move to the next room to repeat the process. Similarly, the portion of the registration process in which the 3D image volume is searched for radio-opaque markers 730 could be performed on the portable device. Then, in some embodiments, when the robot 15 arrives, the registration information and the trajectories are both transferred to the robot 15 console. In some embodiments, by following this procedure, the time of computation of the image search algorithm on the robot 15 console is eliminated, increasing the efficiency of the overall process when the robot 15 is required in multiple rooms.
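As an illustrative sketch of such a transfer (the JSON layout, file name, and field names are assumptions; the disclosure does not specify an interchange format), a plan defined relative to the markers 730 could be exported on the portable device and re-loaded on the robot 15 console:

    import json

    plan = {
        "patient_id": "case-001",                     # illustrative identifier
        "image_volume": "ct_volume_001.nii",          # assumed file name
        "marker_positions_image_mm": [[0, 0, 0], [50, 0, 0],
                                      [0, 50, 0], [0, 0, 50]],
        "trajectories": [
            {"name": "L4 left pedicle",               # invented example
             "entry_mm": [12.0, 33.5, -20.0],         # relative to marker frame
             "target_mm": [18.4, 41.0, -55.0]},
        ],
    }

    with open("plan.json", "w") as f:                 # on the portable device
        json.dump(plan, f, indent=2)

    with open("plan.json") as f:                      # later, on the console
        loaded = json.load(f)
    print(loaded["trajectories"][0]["name"])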
Although several embodiments of the invention have been disclosed in the foregoing specification, it is understood that many modifications and other embodiments of the invention will come to mind to one skilled in the art to which the invention pertains, having the benefit of the teaching presented in the foregoing description and associated drawings. It is thus understood that the invention is not limited to the specific embodiments disclosed hereinabove, and that many modifications and other embodiments are intended to be included within the scope of the appended claims. Moreover, although specific terms are employed herein, as well as in the claims which follow, they are used only in a generic and descriptive sense, and not for the purposes of limiting the described invention, nor the claims which follow.
It will be appreciated by those skilled in the art that while the invention has been described above in connection with particular embodiments and examples, the invention is not necessarily so limited, and that numerous other embodiments, examples, uses, modifications and departures from the embodiments, examples and uses are intended to be encompassed by the claims attached hereto. The entire disclosure of each patent and publication cited herein is incorporated by reference, as if each such patent or publication were individually incorporated by reference herein. Various features and advantages of the invention are set forth in the following claims.
This application is a continuation of U.S. patent application Ser. No. 15/449,260, filed on Mar. 3, 2017, which is a continuation of U.S. patent application Ser. No. 13/924,505, filed on Jun. 21, 2013, which claims priority under 35 U.S.C. § 119 to U.S. Provisional Patent Application No. 61/662,702, filed on Jun. 21, 2012, and U.S. Provisional Patent Application No. 61/800,527, filed on Mar. 15, 2013, all of which are incorporated herein by reference in their entirety.