Robotic spine surgery system and methods

Information

  • Patent Grant
  • Patent Number
    12,035,985
  • Date Filed
    Thursday, May 11, 2023
  • Date Issued
    Tuesday, July 16, 2024
Abstract
Robotic spinal surgery systems and methods include a robotic manipulator with a tool guide and a skin incision tool to be inserted into the tool guide. A navigation system includes a localizer for tracking the patient and a base tracker of the robotic manipulator. A control system registers, with the navigation system, a line haptic object to a vertebra of the patient, the line haptic object being associated with a desired trajectory for the vertebra. In response to a user input, the control system autonomously moves the robotic manipulator to align the tool guide to the desired trajectory. The tool guide is constrained to the desired trajectory with the line haptic object to enable insertion of the skin incision tool within the tool guide to facilitate creation of an incision in the skin at the desired trajectory.
Description
BACKGROUND

Robotic systems for performing surgical procedures in a patient's spine are well known. For instance, robotic systems are currently utilized to place pedicle screws in a patient's spine.


When a patient requires surgery that involves placing pedicle screws, pre-operative imaging and/or intra-operative imaging is often employed to visualize the patient's anatomy that requires treatment—in this case the patient's spine. A surgeon then plans where to place the pedicle screws with respect to the images and/or with respect to a 3-D model created from the images. Planning includes determining a position and orientation (i.e., pose) of each pedicle screw with respect to the particular vertebra in which it is to be placed, e.g., by identifying the desired pose in the images and/or the 3-D model. Once the plan is set, the plan is transferred to the robotic system for execution.


Typically, the robotic system comprises a robotic manipulator that positions a tool guide above the patient and along a desired trajectory that is aligned with the desired orientation of the pedicle screw to be placed. The robotic system also comprises a navigation system to determine a location of the tool guide with respect to the patient's anatomy so that the robotic manipulator can place the tool guide along the desired trajectory according to the surgeon's plan. In some cases, the navigation system includes tracking devices attached to the manipulator and to the patient so that the robotic system can monitor and respond to movement of the patient during the surgical procedure by moving the tool guide as needed to maintain the desired trajectory.


Once the tool guide has been positioned in alignment with the desired trajectory, the robotic manipulator is controlled to maintain the alignment. Thereafter, a surgeon positions a cannula through the tool guide and adjacent to the vertebra. The surgeon inserts a conventional drilling tool into the cannula to drill a pilot hole for the pedicle screw. The surgeon then removes the drilling tool and drives the pedicle screw into position in the pilot hole with a pedicle screw driver. In this methodology, the robotic manipulator is somewhat underutilized, as it plays little to no role in drilling the pilot hole or inserting the pedicle screw.


SUMMARY

In one aspect, a robotic spinal surgery system is provided that comprises a manipulator comprising a base, a robotic arm coupled to the base and including a plurality of links and joints, and a surgical tool coupled to the robotic arm, wherein the surgical tool is a tool guide comprising an opening; a skin incision tool configured to be inserted into the opening of the tool guide and configured to create an incision in a skin of a patient; a navigation system comprising a localizer configured to track the patient and to track a base tracker coupled to the base of the manipulator; and a control system coupled to the manipulator and the navigation system and configured to: register, with the navigation system, a line haptic object to a vertebra of the patient, the line haptic object being associated with a desired trajectory for the vertebra; receive a user input; in response to receipt of the user input, autonomously move the robotic arm to align the tool guide to the desired trajectory; and constrain the tool guide to the desired trajectory with the line haptic object to enable insertion of the skin incision tool within the opening of the tool guide to facilitate creation of the incision in the skin at the desired trajectory.


In another aspect, a method is provided for operating a robotic spinal surgery system comprising a manipulator comprising a base, a robotic arm coupled to the base and including a plurality of links and joints, and a surgical tool coupled to the robotic arm, wherein the surgical tool is a tool guide comprising an opening, a skin incision tool configured to be inserted into the opening of the tool guide and configured to create an incision in a skin of a patient, a navigation system comprising a localizer configured to track the patient and to track a base tracker coupled to the base of the manipulator, and a control system coupled to the manipulator and the navigation system, the method comprising the control system performing the following: registering, with the navigation system, a line haptic object to a vertebra of the patient, the line haptic object being associated with a desired trajectory for the vertebra; receiving a user input; in response to receiving the user input, autonomously moving the robotic arm for aligning the tool guide to the desired trajectory; and constraining the tool guide to the desired trajectory with the line haptic object for enabling insertion of the skin incision tool within the opening of the tool guide for creating the incision in the skin at the desired trajectory.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a perspective view of a robotic surgical system.



FIG. 2 is a perspective view of a surgical robotic arm used with the system of FIG. 1.



FIG. 3 is a perspective view of the robotic surgical system being used in combination with an imaging device to perform a spine procedure.



FIG. 4 is a partial perspective view of a robotic arm coupled to a surgical tool that includes a housing coupled to a drill.



FIG. 5 is a partial perspective view of the robotic arm coupled to the surgical tool coupled to a driver and screw.



FIG. 6 is an elevational view of an alternative surgical tool.



FIG. 7 is an illustration of drilling a pilot hole in a pedicle.



FIG. 8 is an illustration of driving a pedicle screw into the pilot hole.



FIGS. 9A and 9B are illustrations showing electrical current output vs. depth, which can be used to verify that drilling and pedicle screw insertion are proceeding according to a user's plan.



FIG. 10A is an illustration of a skin incision tool attached to the robotic arm.



FIG. 10B is an illustration of an alternative skin incision tool attached to the robotic arm.



FIG. 11 is an illustration of a Jamshidi needle attached to the robotic arm.



FIG. 12 is a flow chart of sample steps carried out in one procedure to place an implant at a desired location.



FIG. 13 is a flow chart of sample steps carried out in one procedure to make an incision.





DETAILED DESCRIPTION

Referring to FIGS. 1 and 2, a surgical robotic system 10 is shown which can be used for various surgical procedures, including, but not limited to, spine procedures, such as spine procedures in which pedicle screws, other screws, or other types of implants are placed in the spine. The robotic system 10 comprises a navigation system 12 including a localizer 14 and a tracking device 16, one or more displays 18, and a robotic manipulator (e.g., a robotic arm 20 mounted to a base 22, a table, or the like). The robotic arm 20 includes a base link 24 rotatably coupled to the base 22 and a plurality of arm links 26 serially extending from the base link 24 to a distal end 28. The arm links 26 pivot/rotate about a plurality of joints in the robotic arm 20. A surgical tool for use in performing the spine procedure, for example, is shown generally at 30. The surgical tool 30 may be pivotally connected to the distal end 28 of the robotic arm 20.


A robotic controller 32 is configured to provide control of the robotic arm 20 or guidance to the surgeon during manipulation of the surgical tool 30. In one embodiment, the robotic controller 32 is configured to control the robotic arm 20 (e.g., by controlling joint motors thereof) to provide haptic feedback to the user via the robotic arm 20. This haptic feedback helps to constrain or inhibit the surgeon from manually moving the surgical tool 30 beyond predefined virtual boundaries associated with the surgical procedure. Such a haptic feedback system and associated haptic objects that define the virtual boundaries are described, for example, in U.S. Pat. No. 8,010,180 to Quaid et al., filed on Feb. 21, 2006, entitled “Haptic Guidance System And Method,” and/or U.S. Patent Application Publication No. 2014/0180290 to Otto et al., filed on Dec. 21, 2012, entitled “Systems And Methods For Haptic Control Of A Surgical Tool,” each of which is hereby incorporated by reference herein in its entirety. In one embodiment, the robotic system 10 is the RIO™ Robotic Arm Interactive Orthopedic System manufactured by MAKO Surgical Corp. of Fort Lauderdale, FL, USA.


In some embodiments, the robotic arm 20 acts autonomously based on predefined tool paths and/or other predefined movements to perform the surgical procedure. Such movements may be defined during the surgical procedure and/or before the procedure. In further embodiments, a combination of manual and autonomous control is utilized. For example, a robotic system that employs both a manual mode in which a user applies force to the surgical tool 30 to cause movement of the robotic arm 20 and a semi-autonomous mode in which the user holds a pendant to control the robotic arm 20 to autonomously follow a tool path is described in U.S. Pat. No. 9,566,122 to Bowling et al., filed on Jun. 4, 2015, and entitled “Robotic System And Method For Transitioning Between Operating Modes,” hereby incorporated by reference herein in its entirety.


The navigation system 12 is set up to track movement of various objects in the operating room with respect to a target coordinate system. Such objects include, for example, the surgical tool 30, the patient's anatomy of interest, e.g., one or more vertebra, and/or other objects. The navigation system 12 tracks these objects for purposes of displaying their relative positions and orientations in the target coordinate system to the surgeon and, in some cases, for purposes of controlling or constraining movement of the surgical tool 30 relative to virtual boundaries associated with the patient's anatomy and defined with respect to the target coordinate system (e.g., via coordinate system transformations well known in surgical navigation).


The surgical navigation system 12 includes a computer cart assembly 34 that houses a navigation controller 36. The navigation controller 36 and the robotic controller 32 collectively form a control system of the robotic system 10. A navigation interface is in operative communication with the navigation controller 36. The navigation interface includes the displays 18 that are adjustably mounted to the computer cart assembly 34. Input devices such as a keyboard and mouse can be used to input information into the navigation controller 36 or otherwise select/control certain aspects of the navigation controller 36. Other input devices are contemplated, including a touch screen (not shown) or voice activation.


The localizer 14 communicates with the navigation controller 36. In the embodiment shown, the localizer 14 is an optical localizer and includes a camera unit (one example of a sensing device). The camera unit has an outer casing that houses one or more optical position sensors. In some embodiments at least two optical sensors are employed, sometimes three or more. The optical sensors may be separate charge-coupled devices (CCD). The camera unit is mounted on an adjustable arm to position the optical sensors with a field of view of the tracking devices 16, discussed below, that is ideally free from obstructions. In some embodiments the camera unit is adjustable in at least one degree of freedom by rotating about a rotational joint. In other embodiments, the camera unit is adjustable about two or more degrees of freedom.


The localizer 14 includes a localizer controller (not shown) in communication with the optical sensors to receive signals from the optical sensors. The localizer controller communicates with the navigation controller 36 through either a wired or wireless connection (not shown). One such connection may be an IEEE 1394 interface, which is a serial bus interface standard for high-speed communications and isochronous real-time data transfer. The connection could also use a company-specific protocol. In other embodiments, the optical sensors communicate directly with the navigation controller 36.


Position and orientation signals and/or data are transmitted to the navigation controller 36 for purposes of tracking the objects. The computer cart assembly 34, the displays 18, and the localizer 14 may be like those described in U.S. Pat. No. 7,725,162 to Malackowski, et al. issued on May 25, 2010, entitled “Surgery System,” hereby incorporated by reference.


The robotic controller 32 and the navigation controller 36 may each, or collectively, comprise one or more personal computers or laptop computers, memory suitable for storage of data and computer-readable instructions, such as local memory, external memory, cloud-based memory, random access memory (RAM), non-volatile RAM (NVRAM), flash memory, or any other suitable form of memory. The robotic controller 32 and the navigation controller 36 may each, or collectively, comprise one or more processors, such as microprocessors, for processing instructions or for processing algorithms stored in memory to carry out the functions described herein. The processors can be any type of processor, microprocessor or multi-processor system. Additionally or alternatively, the robotic controller 32 and the navigation controller 36 may each, or collectively, comprise one or more microcontrollers, field programmable gate arrays, systems on a chip, discrete circuitry, and/or other suitable hardware, software, or firmware that is capable of carrying out the functions described herein. The robotic controller 32 and the navigation controller 36 may be carried by the robotic manipulator, the computer cart assembly 34, and/or may be mounted to any other suitable location. The robotic controller 32 and/or the navigation controller 36 is loaded with software as described below. The software converts the signals received from the localizer 14 into data representative of the position and orientation of the objects being tracked.


Referring to FIG. 3, navigation system 12 includes a plurality of tracking devices 16, also referred to herein as trackers. In the illustrated embodiment, trackers 16 are coupled to separate vertebrae of the patient. In some cases, the trackers 16 are firmly affixed to sections of bone via bone screws, bone pins, or the like. In other cases, clamps on the spinous process or other portion of the spine may be used to attach the trackers 16. In further embodiments, the trackers 16 could be mounted to other tissue types or parts of the anatomy. The position of the trackers 16 relative to the anatomy to which they are attached can be determined by registration techniques, such as point-based registration in which a digitizing probe 73 (e.g., a navigation pointer with its own markers) is used to touch off on bony landmarks on the bone or to touch on several points on the bone for surface-based registration. Conventional registration techniques can be employed to correlate the pose of the trackers 16 to the patient's anatomy, e.g., the vertebra V being treated.


Other types of registration are also possible such as using trackers 16 with mechanical clamps that attach to the spinous process of the vertebra V and that have tactile sensors (not shown) to determine a shape of the spinous process to which the clamp is attached. The shape of the spinous process can then be matched to the 3-D model of the spinous process for registration. A known relationship between the tactile sensors and the three or more markers on the tracking device 16 is pre-loaded into the navigation controller 36. Based on this known relationship, the positions of the markers relative to the patient's anatomy can be determined.


A base tracker 16 is also coupled to the base 22 to track the pose of the surgical tool 30. In other embodiments, a separate tracker 16 could be fixed to the surgical tool 30, e.g., integrated into the surgical tool 30 during manufacture or may be separately mounted to the surgical tool 30 in preparation for the surgical procedures. In any case, a working end of the surgical tool 30 is being tracked by virtue of the base tracker 16 or other tracker. The working end may be a distal end of an accessory of the surgical tool 30. Such accessories may comprise a drill, a bur, a saw, an electrical ablation device, a screw driver, a tap, a surgical knife, a Jamshidi needle, or the like.


In the illustrated embodiment, the trackers 16 are passive trackers. In this embodiment, each tracker 16 has at least three passive tracking elements or markers M for reflecting light from the localizer 14 back to the optical sensors. In other embodiments, the trackers 16 are active trackers and may have light emitting diodes (LEDs) transmitting light, such as infrared light, to the optical sensors. Based on the received optical signals, navigation controller 36 generates data indicating the relative positions and orientations of the trackers 16 relative to the localizer 14 using conventional triangulation techniques. In some cases, more or fewer markers may be employed. For instance, in cases in which the object being tracked is rotatable about a line, two markers can be used to determine an orientation of the line by measuring positions of the markers at various locations about the line. It should be appreciated that the localizer 14 and trackers 16, although described above as utilizing optical tracking techniques, could alternatively, or additionally, utilize other tracking modalities to track the objects, such as electromagnetic tracking, radio frequency tracking, inertial tracking, combinations thereof, and the like.
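
By way of illustration only, the pose computation that such triangulation enables can be sketched in a few lines of Python. This is not the navigation controller's actual software: it estimates a tracker's pose from three or more triangulated marker positions using the well-known Kabsch (SVD) point-set registration method, and the array shapes and frame names are assumptions for illustration.

```python
import numpy as np

def estimate_tracker_pose(markers_local: np.ndarray,
                          markers_measured: np.ndarray) -> np.ndarray:
    """Return a 4x4 transform mapping tracker coordinates to localizer
    coordinates, given (N, 3) marker positions known in the tracker
    frame and the same N markers triangulated in the localizer frame."""
    c_local = markers_local.mean(axis=0)
    c_meas = markers_measured.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (markers_local - c_local).T @ (markers_measured - c_meas)
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard keeps the result a proper rotation.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = c_meas - R @ c_local
    return T
```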


It may also be desired to track the patient's skin surface to ensure that the surgical tool 30 does not inadvertently contact or penetrate the patient's skin outside of any desired incision boundaries. For this purpose, skin-attached markers M, such as active or passive markers with adhesive backing, may be attached to the patient's skin to define a boundary associated with the patient's skin. An array of such markers M could be provided in a peripheral ring 74 (circular, rectangular, etc.), such that the surgical procedure continues inside the ring 74 without substantially disturbing the ring 74 (i.e., the ring is placed on the patient's skin about the incision and vertebrae of interest). One suitable skin marker array is the SpineMask® tracker manufactured by Stryker Leibinger GmbH & Co. KG, Bötzinger Straße 41, D-79111 Freiburg, Germany. See also U.S. Patent Application Publication No. 2015/0327948 to Schoepp et al., entitled “Navigation System For And Method Of Tracking The Position Of A Work Target,” filed on May 13, 2015, hereby incorporated herein by reference in its entirety. Other suitable skin trackers are also contemplated. The digitizing probe could also be used to map the skin surface and/or incision. However, once mapped, any movement of the skin would not be detected without further digitizing, whereas the attached tracker array can detect movement of the patient's skin.


Prior to the start of the surgical procedure, additional data are loaded into the navigation controller 36. Based on the position and orientation of the trackers 16 and the previously loaded data, navigation controller 36 determines the position of the working end of the surgical tool 30 and the orientation of the surgical tool 30 relative to the tissue against which the working end is to be applied. The additional data may comprise calibration data, such as geometric data relating positions and/or orientations of the trackers 16 or markers M thereof to the working end of the surgical tool 30. This calibration data may also be determined pre-operatively or intra-operatively, such as by using a calibration probe or calibration divot on a tracker 16 of known geometry to determine a position of the working end of the surgical tool 30, e.g., relative to its own tracker or to the base tracker 16. The additional data may comprise registration data, such as transformation data associating the trackers 16 to the patient's anatomy or 3-D models thereof. In some embodiments, navigation controller 36 forwards these data to the robotic controller 32. The robotic controller 32 can then use the data to control the robotic arm 20 as described in U.S. Pat. Nos. 8,010,180 or 9,566,122, both of which were previously incorporated by reference herein.


The navigation controller 36 also generates image signals that indicate the relative position of the working end of the surgical tool 30 to the tissue of interest. These image signals are applied to the displays 18. Displays 18, based on these signals, generate images that allow the surgeon and staff to view the relative position of the surgical tool 30 to the surgical site. The displays 18, as discussed above, may include a touch screen or other input/output device that allows entry of commands.


In the embodiment shown, using the navigation system 12, the pose of the surgical tool 30 can be determined by tracking the location of the base 22 via the base tracker 16 and calculating the pose of the surgical tool 30 based on joint encoder data from the joints of the robotic arm 20 and a known geometric relationship between the surgical tool 30 and the robotic arm 20. Ultimately, the localizer 14 and the tracking devices 16 enable the determination of the pose of the surgical tool 30 and the patient's anatomy so the navigation system 12 knows the relative relationship between the surgical tool 30 and the patient's anatomy. One such navigation system is shown in U.S. Pat. No. 9,008,757 to Wu, filed on Sep. 24, 2013, entitled “Navigation System Including Optical And Non-Optical Sensors,” hereby incorporated herein by reference.
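
The transform composition just described lends itself to a short sketch. The formulation below is an assumption for illustration, not the patented implementation: the pose of the tool in the vertebra tracker's frame is obtained by chaining the localizer's measurements with the arm's encoder-based forward kinematics.

```python
import numpy as np

def tool_in_vertebra(T_loc_base: np.ndarray,
                     T_loc_vertebra: np.ndarray,
                     T_base_tool: np.ndarray) -> np.ndarray:
    """All arguments are 4x4 homogeneous transforms. T_loc_base and
    T_loc_vertebra come from the localizer via the base tracker and a
    vertebra tracker; T_base_tool comes from the joint encoders and the
    known geometry between the robotic arm and the surgical tool."""
    T_loc_tool = T_loc_base @ T_base_tool                 # tool in localizer frame
    return np.linalg.inv(T_loc_vertebra) @ T_loc_tool     # tool in vertebra frame
```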


In operation, for certain surgical tasks, the user manually manipulates (e.g., moves or causes the movement of) the robotic arm 20 to manipulate the surgical tool 30 to perform the surgical procedure on the patient, such as drilling, cutting, sawing, reaming, implant installation, and the like. As the user manipulates the surgical tool 30, the navigation system 12 tracks the location of the surgical tool 30 and/or the robotic arm 20 and provides haptic feedback (e.g., force feedback) to the user to limit the user's ability to move (or cause movement of) the surgical tool 30 beyond one or more predefined virtual boundaries that are registered (or mapped) to the patient's anatomy, which results in highly accurate and repeatable drilling, cutting, sawing, reaming, and/or implant placement.


In one embodiment, the robotic arm 20 operates in a passive manner and provides haptic feedback when the surgeon attempts to move the surgical tool 30 beyond the virtual boundary. The haptic feedback is generated by one or more actuators (e.g., joint motors) in the robotic arm 20 and transmitted to the user via a flexible transmission, such as a cable drive transmission. When the robotic arm 20 is not providing haptic feedback, the robotic arm 20 is freely moveable by the user. In other embodiments, like that shown in U.S. Pat. No. 9,566,122, previously incorporated herein by reference, the robotic arm 20 is manipulated by the user in a similar manner, but the robotic arm 20 operates in an active manner. For instance, the user applies force to the surgical tool 30, which is measured by a force/torque sensor, and the robotic arm 20 emulates the user's desired movement based on measurements from the force/torque sensor. For other surgical tasks, the robotic arm 20 operates autonomously.


Turning to FIGS. 4 and 5, the surgical tool 30 is shown coupled to the distal end 28 of the robotic arm 20. More specifically, a coupling 40 is provided between the surgical tool 30 and the distal end 28 of the robotic arm 20 to allow rotation of the surgical tool 30 relative to the distal end 28 about axis A. In FIG. 4, the surgical tool 30 comprises a drill 42 for drilling a pilot hole for a pedicle screw, other screw, or other type of implant. The drill 42 is arranged to rotate about a rotational axis R. In FIG. 5, the surgical tool 30 comprises a driver 44 (e.g., a screw driver) arranged along the rotational axis R to rotate about the rotational axis R for driving in a pedicle screw PS or other implant.


The surgical tool 30 comprises a housing 45. A drive system (e.g., motor) is located in the housing 45 to drive the drill 42, driver 44, or other accessory. The drive system may be variable speed. A handle 46 depends from the housing 45 and includes a grip for being grasped by the user to manipulate the surgical tool 30 and/or the robotic arm 20 during the surgical procedure.


The housing 45 further comprises a collet 47 or other type of coupler for releasably attaching the drill 42, driver 44, or other accessory to the drive system. In some cases, a speed reducer 48 (see FIG. 5) may be releasably attached to the collet 47 to be used for certain accessories. The speed reducer 48 comprises a transmission or gear arrangement that causes the rotational speed of the accessory to be reduced as compared to being connected directly to the drive system. This is useful in cases where slower rotational speeds are desired. A trigger 49 may also be present to control a speed of the drill 42 and/or driver 44, to initiate movement of the robotic arm 20, to align the rotational axis R with a desired trajectory, or the like. The trigger 49 may communicate signals to the robotic controller 32 (which may include a tool controller) to control the robotic arm 20 and/or the surgical tool 30.


In another embodiment shown in FIG. 6, one end of the coupling 40 supports the surgical tool 30 for rotation about the axis A. Another end of the coupling 40 supports the housing 45. The housing 45 may be fixed to the coupling 40 or may be supported for rotation within the coupling 40 about the rotational axis R. In other words, the housing 45 may be able to passively rotate within the coupling 40. At the same time, however, the coupling 40 limits axial movement of the housing 45 along the rotational axis R relative to the coupling 40 so that positioning of the housing 45 can be precisely controlled. A tracker (not shown) could be mounted to the housing 45 to track the position and/or orientation of the housing 45 and thereby track the rotational axis R and/or a distal end of the accessory attached to the housing 45. A rotating shaft 60 is rotatably supported in the housing 45. The rotating shaft 60 has a distal interface/collet 62 that couples to the accessory (e.g., driver 44 as shown) and a proximal interface/collet 64 that couples to a power source, such as a source of torque, e.g., a motor, rotatable handle for manual rotation, and the like. For example, the driver 44 is shown coupled to the distal interface 62/rotating shaft 60 and a handpiece 66 with internal motor is shown coupled to the proximal interface 64 so that the user is able to grip the handpiece 66, trigger operation of the motor, and cause the motor to transmit torque through the rotating shaft 60 to the driver 44 and ultimately to the pedicle screw PS. By virtue of this configuration, the user is able to feel direct torque feedback when inserting the pedicle screws PS.


Pre-operative imaging and/or intra-operative imaging may be employed to visualize the patient's anatomy that requires treatment—such as the patient's spine. The surgeon plans where to place the pedicle screws PS with respect to the images and/or with respect to a 3-D model created from the images. Planning includes determining a pose of each pedicle screw PS with respect to the particular vertebra V in which it is to be placed, e.g., by identifying the desired pose in the images and/or the 3-D model. This may include creating or positioning a separate 3-D model of the pedicle screw PS with respect to the 3-D model of the patient's anatomy. Once the plan is set, the plan is transferred to the robotic system 10 for execution.


The robotic system 10 may be used in concert with an imaging device 50 (e.g., a C-arm as shown in FIG. 3) to take the intra-operative images of the patient's anatomy in addition to, or alternatively to, any pre-operative images, e.g., X-rays, CT scans, or MRI images taken before surgery. The intra-operative images from the imaging device 50 can help to determine the actual position of the drill 42 or driver 44 relative to the desired orientation of the pedicle screws PS being placed in the patient's spine. Separate tracking devices 16 can be employed on each vertebra V to separately track each vertebra V and the corresponding pose of the drill 42 and/or driver 44 relative to the separate vertebra V when placing the pedicle screws PS or other implants into the vertebra V.


The robotic system 10 evaluates the desired pose of the pedicle screws PS and creates virtual boundaries (e.g., haptic objects), pre-defined tool paths, and/or other autonomous movement instructions that correspond to the desired pose of the pedicle screws PS to control movement of the robotic arm 20 so that the drill 42 and driver 44 of the surgical tool 30 are controlled in a manner that ultimately places the pedicle screws PS according to the user's plan. This may comprise, for example, ensuring during the surgical procedure that a trajectory of the surgical tool 30 is aligned with the desired pose of the pedicle screws PS, e.g., aligning the rotational axis R with the desired pose of the pedicle screw PS.


In other embodiments, the user may intra-operatively plan the desired trajectory and/or screw placement. For example, the user can position the drill 42 at a desired entry point relative to the anatomy of interest, e.g., a vertebra V, and orient the drill 42 until the display 18 shows that the trajectory of the rotational axis R is in a desired orientation. Once the user is satisfied with the trajectory, the user can provide input (e.g., touchscreen, button, foot pedal, etc.) to the control system to set this trajectory as the desired trajectory to be maintained during the procedure. The haptic object created to constrain movement of the surgical tool 30 so that the rotational axis R stays along the desired trajectory may be a line haptic object LH, such as that shown in FIG. 4. The line haptic object LH may have a starting point SP, as described further below, a target point TP, which defines a desired depth of the drill 42, pedicle screw PS, etc., and an exit point EP. Other haptic object shapes, sizes, etc. are also contemplated.
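
The line haptic object LH can be illustrated with a simplified sketch, which is not MAKO's haptic control code: it stores the starting point SP, target point TP, and exit point EP, and computes a spring-like restoring force that pulls the tool center point back onto the constrained segment. The stiffness value is assumed for illustration.

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class LineHaptic:
    start: np.ndarray          # SP: where guided motion begins
    target: np.ndarray         # TP: defines the desired final depth
    exit_point: np.ndarray     # EP: pulling back past this frees the arm
    stiffness: float = 2000.0  # N/m; illustrative value

    def restoring_force(self, tcp: np.ndarray) -> np.ndarray:
        """Spring force pulling the TCP back onto the SP-TP segment."""
        axis = self.target - self.start
        s = np.dot(tcp - self.start, axis) / np.dot(axis, axis)
        s = float(np.clip(s, 0.0, 1.0))    # clamp to the segment
        closest = self.start + s * axis
        return self.stiffness * (closest - tcp)
```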


Referring to FIGS. 7 and 8, one of the vertebra V is shown. During the surgical procedure, such as a spinal fusion surgery, a surgeon may insert one or more pedicle screws PS through pedicle regions into a vertebral body 100 of vertebra V. Prior to inserting the pedicle screws PS, the surgeon may employ the drill 42 to cut pilot holes 102 in the vertebral body 100.


In one embodiment, before drilling commences, the robotic system 10 controls movement of the surgical tool 30 to place the rotational axis R along the desired trajectory by autonomously aligning the rotational axis R of the surgical tool 30 with the desired trajectory, which coincides with the desired orientation of the pilot holes 102. In this case, the robotic arm 20 may autonomously position the drill 42 along the desired trajectory, but spaced above the vertebral body 100 (as shown in FIG. 4) so that the drill 42 has not yet contacted the vertebral body 100. Such autonomous positioning may be initiated by the user pulling the trigger on the surgical tool 30, or otherwise providing input to the control system to start the movement. In some cases, a tool center point (TCP) of the surgical tool 30 is first brought to within a predefined distance of the starting point SP of the line haptic object LH that provides the desired trajectory (such as within a predefined starting sphere). Once the TCP (e.g., bur centroid, drill tip center, etc.) is within the predefined distance of the starting point SP, then pulling the trigger (or alternatively pressing a foot pedal or actuating some other input) causes the robotic arm 20 to autonomously align and position the surgical tool 30 on the desired trajectory. See, for example, the teachings in U.S. Patent Application Publication No. 2014/0180290 to Otto et al., filed on Dec. 21, 2012, entitled “Systems And Methods For Haptic Control Of A Surgical Tool,” which is hereby incorporated by reference herein in its entirety. The robotic arm 20 may be programmed to move the surgical tool 30 to a distance from the patient based on pre-surgical planning or may move the TCP to the closest point on the trajectory. Once the surgical tool 30 is in the desired pose, the robotic system 10 may effectively hold the rotational axis R of the surgical tool 30 on the desired trajectory by tracking movement of the patient and autonomously adjusting the robotic arm 20 as needed to keep the rotational axis R on the desired trajectory, i.e., aligned with the line haptic object LH.
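
The gating behavior described above, in which the TCP must first be within a starting sphere before a trigger pull initiates autonomous alignment, can be sketched as follows. The sphere radius is assumed (the patent leaves the predefined distance unspecified), and align_to_trajectory is a hypothetical caller-supplied command rather than an actual API.

```python
import numpy as np

START_SPHERE_RADIUS_MM = 50.0  # assumed value for the predefined distance

def may_start_alignment(tcp: np.ndarray, start_point: np.ndarray) -> bool:
    """True once the TCP is within the starting sphere around SP."""
    return float(np.linalg.norm(tcp - start_point)) <= START_SPHERE_RADIUS_MM

def on_trigger_pull(tcp: np.ndarray, start_point: np.ndarray,
                    align_to_trajectory) -> bool:
    """align_to_trajectory is a hypothetical command that autonomously
    moves the arm until the rotational axis R lies on the trajectory."""
    if may_start_alignment(tcp, start_point):
        align_to_trajectory()
        return True
    return False
```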


While the robotic system 10 holds the surgical tool 30 on the desired trajectory, the user may then manually manipulate the surgical tool 30 to move (or cause movement of) the drill 42 along the line haptic object LH toward the vertebral body 100 to drill the pilot holes 102. In some cases, such as when using a passive robotic arm 20, the robotic system 10 constrains the user's movement of the surgical tool 30 to stay along the desired trajectory by providing haptic feedback to the user should the user attempt to move the surgical tool 30 in a manner that deviates from the line haptic object LH and the desired trajectory. If the user desires to return the robotic arm 20 to a free mode, for unconstrained movement of the surgical tool 30, the user can pull the surgical tool 30 back along the line haptic object LH, away from the patient, until the exit point EP is reached.


The user then drills the pilot holes 102 to desired depths. Drilling speed can be controlled by the user via the trigger, or can be controlled automatically based on the particular location of the drill 42 relative to the patient's anatomy. For instance, a rotational speed of the drill 42 may be set high during initial drilling into the vertebral body 100, but may be slowed during further drilling into the vertebral body 100, and set even slower during final drilling to the final depth. The control system can also monitor contact/contact force during line haptic guiding via one or more sensors S (e.g., one or more force sensors, force/torque sensors, torque sensors, pressure sensors, optical sensors, or the like) that communicate with the robotic controller 32. If no significant contact/contact force is detected, which means the surgical tool 30 is passing through soft tissue, the control system avoids activating the motor of the surgical tool 30 or other power source (e.g., RF energy, ultrasonic motor, etc.). When contact with bone is detected (e.g., optically, sensed force is above a predefined threshold, etc.), the control system can activate the motor or other power source. Users can also passively feel the contact/contact force and trigger a switch to activate the power source.
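
These two behaviors, sensor-gated activation of the power source and depth-dependent speed reduction, can be sketched as follows; the force threshold, speeds, and depth breakpoints are assumed values, as the patent does not specify any.

```python
CONTACT_FORCE_THRESHOLD_N = 5.0   # assumed bone-contact threshold

def drill_speed_rpm(depth_mm: float, final_depth_mm: float) -> float:
    """Step the commanded speed down as the final depth approaches."""
    remaining = final_depth_mm - depth_mm
    if remaining > 10.0:
        return 3000.0   # fast initial drilling
    if remaining > 2.0:
        return 1000.0   # slower during further drilling
    return 250.0        # slowest during final drilling to depth

def commanded_speed(sensed_force_n: float, depth_mm: float,
                    final_depth_mm: float) -> float:
    # No significant contact force means the tool is passing through
    # soft tissue, so the motor or other power source stays off.
    if sensed_force_n < CONTACT_FORCE_THRESHOLD_N:
        return 0.0
    return drill_speed_rpm(depth_mm, final_depth_mm)
```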


The virtual boundaries (e.g., haptic objects) used to constrain the user's movement along the desired trajectory may also indicate, via haptic feedback, when the user has reached the desired depth of the pilot holes 102, e.g., reached the target point TP. Separate virtual boundaries could also be used to set the desired depths. In other cases, the robotic system 10 may autonomously drill the pilot holes 102 to the desired depths. In further cases, the robotic system 10 may initially drill autonomously, but then final drilling may be done manually, or vice versa. Once the pilot holes 102 are created, the pedicle screws PS can then be placed using the driver 44. In some embodiments, pilot holes 102 may be unnecessary and the pedicle screws PS can be placed over guide wires placed by the robotic system 10 or without any guidance.


Continuously tracking each vertebra V separately and tracking movement of the drill 42 with the navigation system 12 is particularly valuable because the pedicle screws PS may be inserted in close proximity to the spinal cord 103, and thus the placement of the pedicle screws PS and their corresponding pilot holes 102 must be precisely aligned so as to avoid interacting with or damaging the spinal cord 103. If a surgeon drills the pilot holes 102 at an improper angle and/or too deeply, the pedicle screws PS or the drill 42 used to drill the pilot holes 102 may damage the spinal cord 103. As a result, by using the navigation system 12 to track a pose of the drill 42 and/or the driver 44 relative to the patient's anatomy, and specifically the anatomy as outlined in the preoperative images and/or the intraoperative images, the spinal cord 103 can be avoided.


Once drilling is complete, referring specifically to FIG. 7, the drill 42 is removed from the vertebral body 100, the drill 42 is disconnected from the drive system via the collet 47, and the driver 44 is coupled to the drive system (with or without the speed reducer 48). A pedicle screw PS is attached to a distal end of the driver 44 for placement in one of the pilot holes 102. The original line haptic object could be used for driving the pedicle screw PS, or a new line haptic object, with a new starting point, target point, and exit point, could be created upon attaching the driver 44 and/or pedicle screw PS. In this case, the drill 42 and/or driver 44 could have RFID tags or other identification devices so that the robotic controller 32 is able to identify which accessory is connected to the housing 45. The housing 45 may have a corresponding RFID reader, etc. in communication with the robotic controller 32 to read the tag and determine which accessory is attached. Based on this information, the controller may then create, access, or otherwise determine a new line haptic object. Similarly, the pedicle screws PS could also be outfitted with RFID tags and the driver 44 may have a similar reader so that the robotic controller 32 can also determine which size/type of pedicle screw PS is attached. Accordingly, the line haptic object can be based on the driver 44 and/or the pedicle screw PS so that the robotic arm 20 is controlled precisely to place that particular pedicle screw PS at a desired location, e.g., a desired orientation and depth with respect to the patient's anatomy.
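
The accessory-dependent selection of a line haptic object reduces to a lookup keyed by the identified tag. The tag identifiers and parameters below are invented for illustration; the patent specifies only that the controller reads an RFID tag and creates, accesses, or otherwise determines a haptic object from it.

```python
# Hypothetical accessory catalog: tag ID -> line haptic parameters.
ACCESSORY_HAPTICS = {
    "drill_3.2mm":  {"target_depth_mm": 35.0, "start_offset_mm": 20.0},
    "driver_6.5mm": {"target_depth_mm": 40.0, "start_offset_mm": 20.0},
}

def haptic_params_for(tag_id: str) -> dict:
    """Return the line haptic parameters for the attached accessory."""
    try:
        return ACCESSORY_HAPTICS[tag_id]
    except KeyError:
        raise ValueError(f"Unrecognized accessory tag: {tag_id}")
```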


Additionally, with automatic detection of the accessory, either via the RFID tags or other detection devices, such as a vision camera, the control system is able to advance any surgical procedure software utilized with the robotic system 10 to the next screen associated with the driver 44, which may have different prompts, instructions, etc. for the user now that the driver 44 is connected. Voice recognition, gesture sensing, or other input devices may be used to advance the software and/or to change to the next vertebra V to be treated and/or to change a side of the vertebral body 100 in which the operation is being carried out. This could also be based on the location of the surgical tool 30. For example, if the TCP of the attached accessory is manually placed by the user closer to one side of a particular vertebra V than another, the software may automatically advance to correspond to that side of the vertebra V. The selected vertebra V and side of operation can be confirmed visually with the displays 18 or through audio input/output.


Again, in much the same manner as the drill 42 is controlled, while the robotic system 10 holds the surgical tool 30 on the desired trajectory, the user may then manually manipulate the surgical tool 30 to move (or cause movement of) the driver 44 and pedicle screw PS along the line haptic object LH toward the vertebral body 100 to insert the pedicle screw PS in the pilot hole 102. In some cases, such as when using a passive robotic arm 20, the robotic system 10 controls movement of the surgical tool 30 by constraining the user's movement of the surgical tool 30 so that the surgical tool 30 remains aligned with and stays along the desired trajectory. This can be accomplished by providing haptic feedback to the user should the user attempt to move the surgical tool 30 in a manner that deviates from the desired trajectory—thus the robotic arm 20 is still able to control installation of the implant in the spine of the patient so that the implant is placed at a desired location. The user then drives the pedicle screw PS into the pilot hole 102 to the desired location, e.g., to the desired depth at the desired orientation. Drive speed can be controlled by the user via the trigger, or can be controlled automatically based on the particular location of the driver 44 and/or pedicle screw PS relative to the patient's anatomy. For instance, a rotational speed of the driver 44 may be set high during initial installation into the vertebral body 100, but may be slowed during further installation into the vertebral body 100, and set even slower during final implanting to the final depth.


The virtual boundaries (e.g., line haptic objects) used to constrain the user's movement along the desired trajectory may also indicate, via haptic feedback, when the user has reached the desired depth of the pedicle screw PS. Separate virtual boundaries could also be used to set the desired depth. In other cases, the robotic system 10 may autonomously insert the pedicle screws PS to the desired depths. In further cases, the robotic system 10 may initially drive the pedicle screws PS autonomously to an initial depth, but then final implanting to a final depth may be done manually, or vice versa. In one example, the pedicle screws PS are placed autonomously until within a predefined distance of the final depth (as determined by the navigation system 12). At this point, the user either finishes implanting the pedicle screw PS manually with the surgical tool 30 so that the user is able to feel tightening of the pedicle screws PS, or a separate tool (powered or manual) is used to complete placement of the pedicle screw PS. The user may be instructed by the control system, via the displays 18, how many turns remain before the pedicle screw PS has reached full depth, and/or the displays 18 may graphically represent the pedicle screws PS, anatomy, and/or the target point so that the user is able to easily visualize how much further driving of the pedicle screw PS is required.
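
The "turns remaining" prompt reduces to a small worked computation, assuming the navigation system 12 reports the screw's remaining distance to the target point and the thread pitch of the identified pedicle screw PS is known:

```python
def turns_remaining(remaining_depth_mm: float, thread_pitch_mm: float) -> float:
    """One full turn advances the screw by one thread pitch, so, e.g.,
    6.0 mm remaining at a 2.5 mm pitch leaves 2.4 turns."""
    return remaining_depth_mm / thread_pitch_mm
```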


In some procedures, the rotational axis R may be moved off the desired trajectory between drilling the pilot holes and driving the implants, such as when all the pilot holes are drilled first, and then later, all the pedicle screws PS are driven into their desired location. In such a case, before placing each of the pedicle screws PS, the robotic system 10 may first control movement of the surgical tool 30 to place the rotational axis R along the desired trajectory by autonomously aligning the rotational axis R of the surgical tool 30 with the desired trajectory for each of the pedicle screws PS in the manner previously described.


A partial facetectomy may be carried out with the surgical tool 30 to provide a smooth bony surface for final receipt of a head of the pedicle screw PS. The resection volume can be defined based on the user's plan, i.e., by determining a location of the head in the 3-D model. A bur or pre-shaped reamer 70 that corresponds to the head shape can be used to remove the material. In some cases, the drill 42 may incorporate the reamer therein, as shown in hidden lines in FIG. 7, to avoid separate tools: the drill 42 has a smaller-profile drilling shaft to create the pilot hole, and the reamer 70 is located more proximally to create the seat 72 for the head of the pedicle screw PS—thus at least part of the pilot hole 102 and the seat 72 can be formed simultaneously. In the embodiment shown, the drill 42 has a drilling shaft with proximal and distal ends and a drill tip at the distal end. The reamer 70 is spaced proximally from the drill tip so that the reamer 70 is located near a facet once the drill 42 has been inserted to the desired depth in the target vertebral body. Any suitable drill and/or reamer cutting features may be employed to form the hole, e.g., to form the pilot hole and the seat in the spine of the patient to receive the implant.


The robotic controller 32 can be used to control insertion of the pedicle screws PS by measuring torque associated with driving of the pedicle screws PS with the driver 44. More specifically, the torque required to insert the pedicle screws PS into the vertebral body 100 increases the deeper the pedicle screw PS is placed in the vertebral body 100, and further increases once an end of the pilot hole 102 is reached. As a result, torque output of the motor in the surgical tool 30 can indicate whether the pedicle screw PS has reached the desired depth and/or the end of the pilot hole 102. The robotic controller 32 monitors this torque (e.g., via a torque sensor, such as by monitoring current draw of the motor, or the like) and controls rotation of the driver 44 accordingly. For instance, once a threshold torque is reached, the driver 44 may be stopped.
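
A sketch of this stop condition follows, inferring torque from motor current draw; the torque constant and seating threshold are assumed values, as the patent specifies neither.

```python
TORQUE_CONSTANT_NM_PER_A = 0.06   # assumed motor torque constant
STOP_TORQUE_NM = 1.8              # assumed seating-torque threshold

def should_stop_driver(motor_current_a: float) -> bool:
    """Stop rotation of the driver once the threshold torque is reached."""
    torque_nm = TORQUE_CONSTANT_NM_PER_A * motor_current_a
    return torque_nm >= STOP_TORQUE_NM
```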


Referring to FIGS. 9A and 9B, the control system may also be able to use the torque output, e.g., current, or other measured force parameter to verify the location of the drill 42 or pedicle screw PS during insertion. This may be particularly useful in cases where the tracking device 16 inadvertently moves relative to the vertebra V, which may otherwise go undetected and result in errors in drilling or screw driving. For example, pre-operative and/or intra-operative images taken of the vertebra V may be used to generate a volumetric map of bone mineral density (BMD) for the vertebra V. Generating and utilizing such BMD maps for robotic surgery is shown and described in U.S. Patent Application Publication No. 2017/0000572 to Moctezuma de la Barrera et al., filed on Jun. 28, 2016, entitled “Robotic Systems And Methods For Controlling A Tool Removing Material From A Workpiece,” which is hereby incorporated by reference herein. During the drilling or screw driving, the control system can evaluate the BMD map to predict the BMD at the contact point of the drill 42/pedicle screw PS with the bone according to the 3-D model and the user's plan (i.e., the current contact point if the drill 42/pedicle screw PS is following the plan). The control system can then predict the corresponding value of current or torque of the surgical tool 30 or interaction force (e.g., using a force/torque sensor) and compare the predicted value to measured actual values to determine if a discrepancy above a threshold is found. If a discrepancy is found, it can be used to stop the procedure or update the plan. FIG. 9B illustrates a profile of insertion current, torque, and force of pedicle screws PS. In effect, during screw driving, the robotic system 10 can monitor the profile of insertion current, torque, and force of the screws to verify that the pedicle screw PS follows the planned trajectory. The profile of insertion torque can also be used to indicate a degree of osteoporosis of the bone.
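
The verification step amounts to a predicted-versus-measured comparison. In the sketch below, the linear torque-versus-density model, its coefficient, and the discrepancy threshold are all assumptions for illustration:

```python
TORQUE_PER_DENSITY = 0.002       # N*m per (mg/cm^3); illustrative model
DISCREPANCY_THRESHOLD_NM = 0.5   # illustrative threshold

def predicted_torque_nm(bmd_mg_cm3: float) -> float:
    """Assumed model: insertion torque grows linearly with the BMD at
    the planned contact point, read from the volumetric BMD map."""
    return TORQUE_PER_DENSITY * bmd_mg_cm3

def insertion_consistent(bmd_at_planned_contact: float,
                         measured_torque_nm: float) -> bool:
    """False flags a discrepancy, e.g., a tracker that has moved, so the
    control system can stop the procedure or update the plan."""
    discrepancy = abs(measured_torque_nm
                      - predicted_torque_nm(bmd_at_planned_contact))
    return discrepancy <= DISCREPANCY_THRESHOLD_NM
```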


An ultrasound transducer (not shown) could also be mounted on the back of the patient's skin to generate real-time images of the patient's anatomy and the progress of the surgical procedure. The intra-operative images could be used to determine that the pedicle screw PS follows the planned desired trajectory or to determine if the drill 42 or pedicle screw PS is getting close to any critical structures, including a nerve or a medial or lateral cortical boundary.


Referring to FIG. 10A, one of the accessories of the surgical tool 30 may comprise a skin incision tool 80, such as a scalpel, electrosurgical knife, other tools with sharp tips, and the like. The skin incision tool 80 can be mounted much like the drill 42 and/or driver 44, or may be part of a separate end effector connected to a mount 81 that attaches to the coupling 40. A skin incision I can then be made with haptic guidance in a similar manner as previously described, i.e., virtual boundaries (e.g., haptic objects) can be used when creating the incision I to constrain the user's movement with respect to a desired incision in the patient's skin. In one example, the digitizing probe 73 can be used to touch the desired incision location and create the associated boundary/haptic object. In another example, a 3-D skin model can be determined based on the pose of the ring 74, through digitizing, and/or through pre-operative methods, and the desired plan of pedicle screw placement can be used by the control system to determine the incision I location based on the skin model.


Referring to FIG. 10B, other types of pointers, similar to the digitizing probe 73, can also be used to identify the desired location of the incision, such as a laser pointer LP that could be mounted to the skin incision tool 80, end effector, or other component to project visible light LT onto the skin of the patient to indicate the location of the incision. Such a laser pointer can be used by first aligning the rotational axis R of the skin incision tool 80 with the desired trajectory and thereafter activating the laser pointer LP to project the light along the desired trajectory. An alternative form of skin incision tool 80 is shown in FIG. 10B, which is placed through a tool guide TG held in place by the robotic arm. Owing to the tracking of the patient's skin accomplished via the skin tracker (e.g., the ring 74), the navigation system 12 is also able to approximately determine the desired location of the incision I based on the skin model (e.g., a surface model, point cloud, etc.) and the intersection of the desired trajectory with the skin model, so that the user is able to cut the desired incision in the patient's skin at the desired location by virtue of haptic feedback.
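
Determining the incision location as the intersection of the desired trajectory with the skin model can be approximated as below. This sketch assumes the skin model is a point cloud in millimeter units and simply takes the skin point nearest the trajectory line; intersecting a surface mesh would be the more precise alternative.

```python
import numpy as np

def incision_point(skin_points: np.ndarray, origin: np.ndarray,
                   direction: np.ndarray, tol_mm: float = 2.0):
    """Return the skin point nearest the trajectory, or None if the
    trajectory passes no closer than tol_mm to the point cloud."""
    d = direction / np.linalg.norm(direction)
    v = skin_points - origin
    along = v @ d
    # Perpendicular distance of each skin point from the trajectory line.
    perp = np.linalg.norm(v - np.outer(along, d), axis=1)
    i = int(np.argmin(perp))
    return skin_points[i] if perp[i] <= tol_mm else None
```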


Haptic objects can be defined in various ways to establish the haptic feedback that guides making of the incision (see, e.g., the V-shaped haptic object VH shown in FIG. 10A). The haptic objects can be defined based on a width of the skin incision tool, a desired length of the skin incision, and/or a desired depth of the incision. A desired incision depth can also be controlled by the user within a maximum incision depth, which can be established either by programming the maximum incision depth as part of the haptic object or by a mechanical stop that prevents the skin incision tool 80 from sliding through a guide opening (not shown) in the tool guide TG of the end effector beyond a predetermined point.
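
A simplified sketch of a depth-limited incision haptic follows. It clamps the commanded tool-tip position to a box-shaped volume rather than the V-shaped haptic object VH of FIG. 10A, and all dimensions are assumed for illustration.

```python
import numpy as np

def clamp_to_incision_haptic(p: np.ndarray,
                             length_mm: float = 20.0,   # desired incision length
                             width_mm: float = 2.0,     # skin incision tool width
                             max_depth_mm: float = 8.0  # maximum incision depth
                             ) -> np.ndarray:
    """p is the tool tip in the incision frame: x along the incision,
    y across it, z positive into the skin."""
    lo = np.array([-length_mm / 2.0, -width_mm / 2.0, 0.0])
    hi = np.array([length_mm / 2.0, width_mm / 2.0, max_depth_mm])
    return np.minimum(np.maximum(p, lo), hi)
```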


Referring to FIG. 11, one of the accessories of the surgical tool 30 may comprise a wire insertion tool 90, such as a Jamshidi needle, another access cannula with stylet, or the like. The wire insertion tool 90 can be mounted much like the skin incision tool 80, or may be part of a separate end effector and fixedly connected to a mount 91 that attaches to the coupling 40. If no relative motion is allowed between the wire insertion tool 90 and the mount 91, i.e., they are fixed to one another, then the wire insertion tool 90 can be guided with a line haptic object LH to enter the skin incision I and reach a target point TP on the bone, e.g., the vertebra. If relative axial sliding motion between the wire insertion tool 90 and the mount 91 is allowed, such as when the mount 91 comprises a tool guide TG with opening 93, then the tool guide TG can be positioned at the desired orientation and the wire insertion tool 90 can be inserted along opening 93 in the tool guide TG. Depending on relative distance to the target point TP, length of the wire insertion tool 90, and the tool guide position, the wire insertion tool 90 can be guided via the line haptic object LH in the same manner previously described for the drill 42 and/or driver 44.



FIG. 12 illustrates a flowchart of sample steps that could be carried out in a surgical procedure for placing an implant at a desired location, such as placing a screw into bone. In step 200, the anatomy is first prepared to receive the implant. Such preparation may comprise several steps, such as: (1) forming an incision in the patient (see also FIG. 13); (2) retracting tissue with a tissue retractor; (3) placing a cannula in the retracted tissue; (4) drilling a pilot hole in the anatomy; (5) tapping threads into the anatomy; and the like.


If the rotational axis R is not yet aligned with the desired trajectory, or if the rotational axis R has been moved away from the desired trajectory for other reasons, the rotational axis R is aligned in step 202. Specifically, in step 202, the robotic system 10 controls movement of the surgical tool 30 to place the rotational axis R along the desired trajectory. This may comprise the robotic system 10 causing autonomous movement of the surgical tool 30 to place the rotational axis R along the desired trajectory.


Once the rotational axis R has been placed on the desired trajectory, then the robotic system 10 operates to maintain the rotational axis R along the desired trajectory in step 204. This may comprise controlling manual manipulation of the surgical tool 30 by constraining movement of the surgical tool 30 so that the surgical tool 30 remains aligned with the desired trajectory while a user manually moves or manually causes movement of the surgical tool 30 toward the spine.


Installation of the implant in the spine of the patient occurs in steps 206 and 208 such that the implant is placed at a desired location. In step 206, the robotic system 10 causes autonomous movement of the surgical tool 30 to place the implant in the spine of the patient until the implant is within a predefined distance of the desired location. Thereafter, in step 208 the user manually manipulates the surgical tool 30 and the robotic system 10 controls such manual manipulation of the surgical tool 30 until the implant is placed at the desired location. The robotic system 10 can control such manual manipulation, for instance, by generating haptic feedback to the user with the robotic controller 32 to indicate that the implant has reached the desired location. Once the implant is placed at the desired location, the surgical tool 30 is withdrawn away from the anatomy in step 210 and the procedure proceeds until all implants are placed.



FIG. 13 illustrates a flowchart of sample steps carried out to form the incision I in the skin of the patient. In step 300, a desired location of the incision is first identified with the pointer, while the skin tracker (e.g., ring 74) is attached to the patient. In one example, the pointer comprises the digitizing probe 73 which can be used to touch the desired incision location to identify the desired location of the incision I and create the associated boundary/haptic object. In another example, the laser pointer LP can be used to identify the desired location of the incision.


In step 302, once the desired location of the incision I is identified, then the skin (and the desired location on the skin for the incision I) can be tracked with the navigation system 12 in the manner previously described.


Owing to the skin and the desired location for the incision I being tracked, the robotic system 10 can control movement of the skin incision tool 80 with respect to a haptic object created for the incision in step 304. The haptic object is defined in the target coordinate system so that the incision is made at the desired location in the skin of the patient. In one example, the robotic system 10 can control movement of the skin incision tool 80 with respect to the haptic object by controlling manual manipulation of the skin incision tool 80. This can be done by constraining movement of the skin incision tool 80 with respect to a virtual boundary defined by the haptic object so that the skin incision tool 80 makes the incision I at the desired location while a user manually moves or manually causes movement of the skin incision tool 80. The robotic system 10 can constrain movement of the skin incision tool 80 with respect to the haptic object by generating haptic feedback to the user to indicate that the skin incision tool 80 has reached a desired depth of the incision I or otherwise has reached a desired limit for the incision I. Once the incision I is made at the desired location, the skin incision tool 80 is withdrawn away from the anatomy in step 306 and the procedure proceeds until all incisions are made.


It should be appreciated that the systems and methods described herein can be employed to place pedicle screws PS, other screws, fasteners, or other implants into a patient. Thus, even though pedicle screws PS are referenced throughout as one example, the same systems and methods could be utilized for treating any anatomy of the patient and/or for placing any implants into the patient, e.g., in the hip, knee, femur, tibia, face, shoulder, spine, etc. For instance, the robotic arm 20 may also be used to place a cage for a spine implant, to place rods, or to place other components, and could be used for discectomy or other procedures. Different end effectors could also be attached to the robotic arm 20 for other procedures. In some cases, the end effector may also have an articulating arm to facilitate implant insertion, i.e., placing the implant in a desired pose. The articulating arm of the end effector could simply be a miniature version of the robotic arm 20 controlled in the same manner to place the implant, or could be another mechanism controlled to position the implant. The navigation system 12 may comprise an optical navigation system with optical-based trackers, but could additionally or alternatively employ other modalities, such as ultrasound navigation systems that track objects via ultrasound, radio frequency navigation systems that track objects via RF energy, and/or electromagnetic navigation systems that track objects via electromagnetic signals. Other types of navigation systems are also contemplated. It should also be appreciated that the models described herein may comprise triangulated meshes, volumetric models using voxels, or other types of 3-D and/or 2-D models in some cases.


Several embodiments have been discussed in the foregoing description. However, the embodiments discussed herein are not intended to be exhaustive or limit the invention to any particular form. The terminology which has been used is intended to be in the nature of words of description rather than of limitation. Many modifications and variations are possible in light of the above teachings and the invention may be practiced otherwise than as specifically described.

Claims
  • 1. A robotic spinal surgery system comprising: a manipulator comprising a base, a robotic arm coupled to the base and including a plurality of links and joints, and a surgical tool coupled to the robotic arm, wherein the surgical tool is a tool guide comprising an opening; a skin incision tool configured to be inserted into the opening of the tool guide and configured to create an incision in a skin of a patient; a navigation system comprising a localizer configured to track the patient and to track a base tracker coupled to the base of the manipulator; and a control system coupled to the manipulator and the navigation system and configured to: register, with the navigation system, a line haptic object to a vertebra of the patient, the line haptic object being associated with a desired trajectory for the vertebra; receive a user input; in response to receipt of the user input, autonomously move the robotic arm to align the tool guide to the desired trajectory; and constrain the tool guide to the desired trajectory with the line haptic object to enable insertion of the skin incision tool within the opening of the tool guide to facilitate creation of the incision in the skin at the desired trajectory.
  • 2. The robotic spinal surgery system of claim 1, wherein, after alignment of the tool guide to the desired trajectory, the control system is configured to enable a user to apply force to the tool guide to manually move, or manually cause movement of, the tool guide along the line haptic object while constraining the tool guide to the line haptic object.
  • 3. The robotic spinal surgery system of claim 1, wherein the line haptic object comprises an exit, and wherein the control system is configured to enable the tool guide to exit the line haptic object in response to the tool guide being moved along the line haptic object until the exit is reached.
  • 4. The robotic spinal surgery system of claim 1, wherein the control system is configured to control the manipulator to generate haptic feedback in response to an attempt by a user to move the tool guide in a manner that deviates from the line haptic object.
  • 5. The robotic spinal surgery system of claim 1, wherein the navigation system further comprises a skin tracker configured to attach to the skin of the patient and wherein the localizer is configured to track the patient with the skin tracker.
  • 6. The robotic spinal surgery system of claim 1, wherein the navigation system further comprises a vertebra tracker configured to attach to the vertebra and wherein the localizer is configured to track a pose of the vertebra with the vertebra tracker.
  • 7. The robotic spinal surgery system of claim 6, wherein the control system is configured to register the line haptic object to the pose of the vertebra tracked by the localizer.
  • 8. The robotic spinal surgery system of claim 1, wherein the navigation system further comprises a digitizing probe configured to be tracked by the localizer and configured to register points on the vertebra, and wherein the control system is configured to register the line haptic object to the vertebra based on the registered points.
  • 9. The robotic spinal surgery system of claim 1, wherein the line haptic object is defined based on a desired depth of the incision.
  • 10. The robotic spinal surgery system of claim 1, wherein the line haptic object is defined based on a size of the skin incision tool.
  • 11. The robotic spinal surgery system of claim 1, further comprising a foot pedal configured to be pressed to provide the user input to autonomously move the robotic arm to align the tool guide to the desired trajectory.
  • 12. A method of operating a robotic spinal surgery system, the robotic spinal surgery system comprising a manipulator comprising a base, a robotic arm coupled to the base and including a plurality of links and joints, and a surgical tool coupled to the robotic arm, wherein the surgical tool is a tool guide comprising an opening, a skin incision tool configured to be inserted into the opening of the tool guide and configured to create an incision in a skin of a patient, a navigation system comprising a localizer configured to track the patient and to track a base tracker coupled to the base of the manipulator, and a control system coupled to the manipulator and the navigation system, the method comprising the control system performing the following: registering, with the navigation system, a line haptic object to a vertebra of the patient, the line haptic object being associated with a desired trajectory for the vertebra; receiving a user input; in response to receiving the user input, autonomously moving the robotic arm for aligning the tool guide to the desired trajectory; and constraining the tool guide to the desired trajectory with the line haptic object for enabling insertion of the skin incision tool within the opening of the tool guide for creating the incision in the skin at the desired trajectory.
  • 13. The method of claim 12, comprising, after aligning the tool guide to the desired trajectory, the control system controlling the manipulator for enabling a user to apply force to the tool guide to manually move, or manually cause movement of, the tool guide along the line haptic object while constraining the tool guide to the line haptic object.
  • 14. The method of claim 12, wherein the line haptic object comprises an exit, and comprising the control system enabling the tool guide to exit the line haptic object in response to the tool guide being moved along the line haptic object until the exit is reached.
  • 15. The method of claim 12, comprising the control system controlling the manipulator for generating haptic feedback in response to an attempt by a user to move the tool guide in a manner that deviates from the line haptic object.
  • 16. The method of claim 12, wherein the navigation system further comprises a skin tracker configured to attach to the skin of the patient and wherein the localizer is configured to track the patient with the skin tracker, and comprising: receiving, with the control system, tracked states of the patient based on the localizer tracking the skin tracker.
  • 17. The method of claim 12, wherein the navigation system further comprises a vertebra tracker configured to attach to the vertebra and wherein the localizer is configured to track a pose of the vertebra with the vertebra tracker, and comprising the control system: receiving tracked states of the vertebra based on the localizer tracking the vertebra tracker; and registering the line haptic object to the pose of the vertebra tracked by the localizer.
  • 18. The method of claim 12, wherein the navigation system further comprises a digitizing probe configured to be tracked by the localizer and configured to register points on the vertebra, and comprising the control system: receiving the registered points on the vertebra; and registering the line haptic object to the vertebra based on the registered points.
  • 19. The method of claim 12, comprising the control system defining the line haptic object based on one or both of: a desired depth of the incision, and a size of the skin incision tool.
  • 20. The method of claim 12, further comprising a foot pedal configured to be pressed to provide the user input, and comprising the control system: receiving the user input from the foot pedal; and in response to receiving the user input from the foot pedal, autonomously moving the robotic arm for aligning the tool guide to the desired trajectory.
RELATED APPLICATIONS

This application is a continuation application of U.S. patent application Ser. No. 17/353,889, filed Jun. 22, 2021, which is a continuation of U.S. patent application Ser. No. 15/976,376, filed on May 10, 2018, now U.S. Pat. No. 11,065,069, which claims priority to and the benefit of U.S. Provisional Patent App. No. 62/504,019, filed on May 10, 2017, the entire contents and disclosure of each of the aforementioned applications hereby being incorporated by reference.

Related Publications (1)
  • US 2023/0277259 A1, Sep. 2023, US

Provisional Applications (1)
  • U.S. Provisional App. No. 62/504,019, filed May 2017, US

Continuations (2)
  • Parent: U.S. App. No. 17/353,889, filed Jun. 2021, US; Child: U.S. App. No. 18/195,966
  • Parent: U.S. App. No. 15/976,376, filed May 2018, US; Child: U.S. App. No. 17/353,889