The present application relates to trajectory guidance, and more specifically to providing surgical trajectory guidance for surgical instruments.
Traditionally, accurate operation of surgical instruments, including their movement during surgical procedures, is aided by navigation systems. These navigation systems use cameras, sensors, and other medical imaging devices, such as X-ray and C-arm imaging systems, to measure and track the actual and relative positions of surgical instruments and of a patient's anatomy during an operation. A number of different tracking modalities can be utilized, including the above-mentioned optical tracking systems, as well as electromagnetic tracking systems (that utilize, e.g., coils and field generators) and others known in the art. A display device of the navigation system that is provided in the surgical room is used to output or display images and other data based on the measured instrument and patient anatomy information, such as the position of the instrument relative to the patient's anatomy.
Such navigation systems are often used to determine whether surgical instruments are being operated in accordance with predetermined or planned trajectories. For example, if a patient is undergoing orthopedic surgery such as spine surgery, a planned trajectory into the patient's body and to the area on the patient's spine where a bone anchor is to be implanted is calculated preoperatively based on three-dimensional images of the patient's surgical area. The planned trajectory refers to a path through which the instrument (and other objects such as a bone anchor) can be advanced and retracted in and out of the patient's body safely or most effectively. During surgery, the position of the instrument and patient's anatomy visualized by the navigation system can be shown, via the display device, relative to the pre-operatively determined trajectory. The surgeon (or other instrument operator) can therefore observe the display device during surgery and operation of the instrument to determine whether the instrument is positioned in alignment with the planned trajectory. To maintain the instrument in proper alignment, or to move it into an aligned position in accordance with the predetermined trajectory, the surgeon must, in real time, interpret and transform the information shown on the display device into the three-dimensional, real-world surgical area. As a result, the surgeon's visual focus and attention on the surgical area and the procedure in progress must be redistributed to repeatedly look at the display device and process the displayed data to monitor and ensure the instrument's alignment with the planned trajectory.
Other systems and devices have been used to guide or ensure instrument alignment in a manner that reduces the impact on the surgeon's attention. These can include the use of surgical robots with navigation systems to mechanically guide the surgeon by placing a guiding tube in alignment with the predetermined trajectory. The surgeon then uses the guiding tube to advance and retract the instrument in and out of the patient's anatomy. These and other existing alternative techniques are bulky, expensive, and/or time-consuming to configure.
Accordingly, there is a need for systems, methods, and devices that guide the positioning of an instrument into alignment with a planned trajectory in a manner that is readily available, intuitive, and minimizes the impact on the surgeon's attention. There is also a need for such systems, methods, and devices to be less expensive than traditional means while providing accurate trajectory guidance for surgical instruments.
Systems and methods are provided for surgical trajectory guidance. In some example embodiments, a laser system can be used to emit one or more lasers toward the patient's anatomy, indicating how the proximal and distal ends of a surgical instrument should be positioned in order to be aligned with a planned trajectory. The planned trajectory is a path into or through a patient's body, through which the instrument and/or objects are distally and proximally moved. The planned trajectory can be calculated pre-operatively or intra-operatively based on patient data such as medical images of the relevant areas of the patient's anatomy. The patient data and pre- or intra-operatively calculated planned trajectory can be used to calibrate the laser system with the patient in the surgical environment and instruments to be used during the operation. Calibration enables the laser system to translate the calculated planned trajectory into the real-world, three-dimensional space in which the patient is being operated on.
With the planned trajectory being applicable to the patient's anatomy in surgery, positions of laser target points on or proximate to the patient's body can be calculated. Emitting lasers onto the laser target points can visually aid a user in positioning an instrument for insertion into, or removal from, a patient's body. In some embodiments, multiple lasers can be emitted (e.g., two lasers in some embodiments). In an embodiment employing multiple lasers, a distal laser can indicate an entry area or entry point on the patient's body where, for example, the distal end of the instrument is to be inserted. A proximal laser can indicate an axis with which the proximal end of the instrument is to intersect when the distal end is aligned with the distal laser to ensure the instrument is aligned with a planned trajectory for its use. In some embodiments, the proximal end of the instrument can include a guide having a target point thereon. The target point indicates where the proximal laser should intersect with the proximal end of the instrument in order to be properly aligned with the planned trajectory. In some embodiments, the laser system can emit a single laser toward a single target point. The emitted laser can be accurately emitted to indicate a longitudinal axis equal to or corresponding to the central longitudinal axis of the planned trajectory. That is, to be in alignment with the planned trajectory, the instrument can be moved such that its distal end and the target point on the guide of the proximal end can be aligned or intersect with the emitted laser. The position of the instrument and/or patient can be tracked in real-time relative to the planned trajectory. The laser target points and direction of emitted lasers can be adjusted in real-time to ensure continued guidance as the instrument is advanced into or retracted from the patient's body.
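The dual-laser geometry described above can be illustrated with a minimal, non-limiting sketch. It assumes the planned trajectory is represented as an entry point and a distally pointing direction vector in the navigation frame; the function name, coordinate convention, and `guide_offset` parameter are illustrative assumptions, not part of the disclosed system:

```python
import math

def normalize(v):
    """Return v scaled to unit length."""
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def laser_target_points(entry, direction, guide_offset):
    """Compute distal and proximal laser target points for a planned
    trajectory given as an entry point and a distal-pointing direction.

    entry        -- (x, y, z) where the trajectory meets the patient's body
    direction    -- vector pointing distally along the trajectory
    guide_offset -- distance from the entry point back to the proximal guide

    The distal laser is aimed at the entry point itself; the proximal laser
    is aimed at the point on the trajectory axis where the guide's target
    point should sit when the instrument is aligned.
    """
    d = normalize(direction)
    distal_target = entry
    proximal_target = tuple(e - guide_offset * c for e, c in zip(entry, d))
    return distal_target, proximal_target
```

For example, a trajectory entering at the origin and pointing straight down, with the guide 100 mm proximal of the entry point, yields a proximal target point 100 mm above the entry along the trajectory axis.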
In other example embodiments, alignment of the instrument with the planned trajectory can be provided using a guide map. The guide map can be a component provided at or proximate to the proximal end of the instrument, and can include embedded LEDs or other light sources that can output visual cues to the operator of the instrument, for example, concerning the position of the instrument. The position of the instrument can be compared with the planned trajectory to determine whether the proximal and distal ends are aligned with a longitudinal axis equal to or corresponding to the planned trajectory. The alignment or misalignment of the distal or proximal ends of the instrument relative to the planned trajectory can cause portions or areas of the guide map to be activated, for example, such that LEDs are illuminated. If the distal or proximal ends of the instrument are misaligned with the planned trajectory, specific LEDs of the guide map can be activated to indicate a direction the distal or proximal ends must be moved to align with the planned trajectory.
In one aspect, a system for providing trajectory guidance is provided that includes one or more laser output devices operable to emit lasers, and at least one processor communicatively coupled to the one or more laser output devices. The at least one processor is operable to calculate a planned trajectory within an object based on first object data corresponding to the object, the planned trajectory indicating a target path for distally and proximally moving one or more instruments in and out of the object. The at least one processor is further operable to obtain actual data including at least second object data corresponding to the object and first instrument data corresponding to an instrument from among the one or more instruments. The at least one processor is further operable to perform calibration on the object and the instrument based on the actual data and the planned trajectory. The at least one processor is further operable to calculate one or more target points based on the actual data and the planned trajectory, each of the one or more target points indicating a position on the object toward which to direct one or more lasers respectively emitted from the one or more laser output devices, and cause the one or more lasers to be emitted toward the one or more target points. Further, the one or more target points are calculated such that the emitted one or more lasers guide the instrument to a planned alignment corresponding to the planned trajectory.
The systems and methods described herein can include any of a variety of additional or alternative features or steps, all of which are considered part of the present disclosure. For example, in some embodiments the planned alignment of the instrument can be a position in which a central longitudinal axis of the instrument is the same as a central longitudinal axis of the planned trajectory. And in some embodiments, the planned alignment can be a position in which a distal portion and a proximal portion of the instrument intersect the one or more lasers.
In certain embodiments, the one or more target points can include a proximal target point and a distal target point indicating a position on the object towards which to direct a proximal laser and a distal laser, respectively. Further, the planned alignment can be a position in which: (1) a distal end of the instrument intersects with the distal laser, and (2) the proximal end of the instrument intersects with the proximal laser. In some embodiments, the proximal end can include a guide having a target point thereon, and the proximal end of the instrument can intersect with the proximal laser at the target point of the guide when the instrument is in the planned alignment. In certain embodiments, the distal end of the instrument can intersect with the distal laser at an entry point into the object, and the entry point can correspond to a proximal end of the planned trajectory. In some embodiments, if the instrument is distally advanced when in the planned alignment, the instrument can move along the planned trajectory.
In certain embodiments, the one or more target points can include a single target point indicating a position on the object toward which to direct a single laser emitted from a single laser output device, the single laser output device being positioned such that a path of the emitted single laser toward the single target point indicates the planned alignment of the instrument. In some embodiments, the path of the emitted single laser can have a central longitudinal axis equal to the central longitudinal axis of the planned trajectory.
In some embodiments, the processor can be further operable to obtain updated actual data including at least updated first instrument data and recalculate the one or more target points based on the updated actual data and the planned trajectory, where the one or more lasers can be caused to be emitted toward the one or more updated laser target points. In certain embodiments, the updated actual data can be based on operating of the instrument. In some embodiments, the obtaining of the updated actual data, recalculating the one or more target points, and causing the one or more lasers to be emitted can be performed in real-time during the operating of the instrument.
In another aspect, a method for providing trajectory guidance is provided that includes calculating a planned trajectory within an object based on first object data corresponding to the object, the planned trajectory indicating a target path for distally and proximally moving one or more instruments in and out of the object. The method further includes obtaining actual data including at least second object data corresponding to the object and first instrument data corresponding to an instrument from among the one or more instruments. The method also includes performing calibration on the object and the instrument based on the actual data and the planned trajectory and calculating one or more target points based on the actual data and the planned trajectory, each of the one or more target points indicating a position on the object toward which to direct one or more lasers respectively emitted from one or more laser output devices. The method further includes emitting, from the one or more laser output devices, the one or more lasers toward the one or more target points. Moreover, the one or more target points are calculated such that the emitted one or more lasers guide the instrument to a planned alignment corresponding to the planned trajectory.
As with the system described above, any of a variety of alternative or additional features can be included. For example, in some embodiments, the planned alignment of the instrument can be a position in which a central longitudinal axis of the instrument is the same as a central longitudinal axis of the planned trajectory. And in some embodiments, the planned alignment can be a position in which a distal portion and a proximal portion of the instrument intersect the one or more lasers.
In certain embodiments, the one or more target points can include a proximal target point and a distal target point indicating a position on the object towards which to direct a proximal laser and a distal laser, respectively, and the planned alignment can be a position in which: (1) a distal end of the instrument intersects with the distal laser, and (2) the proximal end of the instrument intersects with the proximal laser. In some embodiments, the proximal end can include a guide having a target point thereon, and the proximal end of the instrument can intersect with the proximal laser at the target point of the guide when the instrument is in the planned alignment. In some embodiments, the distal end of the instrument can intersect with the distal laser at an entry point into the object, the entry point corresponding to a proximal end of the planned trajectory. In certain embodiments, if the instrument is distally advanced in the planned alignment, the instrument moves along the planned trajectory.
In some embodiments, the one or more target points can include a single target point indicating a position on the object toward which to direct a single laser emitted from a single laser output device, the single laser output device being positioned such that a path of the emitted single laser toward the single target point indicates the planned alignment of the instrument. In certain embodiments, the path of the emitted single laser can have a central longitudinal axis equal to the central longitudinal axis of the planned trajectory.
In certain embodiments, the method can further include obtaining updated actual data including at least updated first instrument data and recalculating the one or more target points based on the updated actual data and the planned trajectory. Moreover, the one or more lasers can be caused to be emitted toward the one or more updated laser target points. In some embodiments, the updated actual data can be based on operating of the instrument. And, in some embodiments, the obtaining of the updated actual data, recalculating the one or more target points, and causing the one or more lasers to be emitted can be performed in real-time during the operating of the instrument.
In another aspect, a system for providing trajectory guidance is provided that includes an instrument including a guide map configured to output visual cues associated with a position of the instrument, the guide map including one or more proximal end areas associated with a position of a proximal end of the instrument and one or more distal end areas associated with a position of a distal end of the instrument. The system further includes at least one processor communicatively coupled to the instrument, the at least one processor being operable to obtain first guidance data associated with a first position of the instrument at a first time instance, the first guidance data including first distal data corresponding to the distal end of the instrument and first proximal data corresponding to the proximal end of the instrument. The at least one processor is further operable to identify (1) at least one of the proximal end areas of the guide map to activate based on the first distal data and a planned trajectory, and (2) at least one of the distal end areas of the guide map to activate based on the first proximal data and the planned trajectory. The at least one processor is further operable to activate a first portion of the at least one of the proximal end areas of the guide map and a first portion of the at least one of the distal end areas of the guide map identified for activation. Further, the first portion of the at least one of the proximal end areas of the guide map and the first portion of the at least one of the distal end areas that are activated function as visual cues indicating the alignment or misalignment of the proximal and distal ends of the instrument relative to the planned trajectory.
In some embodiments, the one or more proximal end areas of the guide map can include a proximal end alignment area and a proximal end misalignment area, and the one or more distal end areas of the guide map can include a distal end alignment area and a distal end misalignment area. In certain embodiments, the proximal end misalignment area and the distal end misalignment area can be divided into sections. In some embodiments, the first portion of the at least one of the proximal end areas of the guide map and the first portion of the at least one of the distal end areas that are activated can correspond to one or more of the sections of the proximal end misalignment area and the distal end misalignment area, respectively. Moreover, in some embodiments each of the one or more of the sections of the proximal end misalignment area and the distal end misalignment area that correspond to the first portions that are activated can indicate a manner in which to move the proximal end and the distal end of the instrument, respectively, to be aligned with the longitudinal axis of the planned trajectory. And in some embodiments, the first portion of the at least one of the proximal end areas of the guide map and the first portion of the at least one of the distal end areas that are activated can correspond to the proximal end alignment area and the distal end alignment area. Further, in certain embodiments the activation of the proximal end alignment area and the distal end alignment area can indicate alignment of the proximal end and the distal end of the instrument with the longitudinal axis of the planned trajectory.
In certain embodiments, the at least one processor can be further operable to obtain second guidance data associated with a second position of the instrument at a second time instance subsequent to the first time instance, the second guidance data including second distal data corresponding to the distal end of the instrument and second proximal data corresponding to the proximal end of the instrument. The at least one processor can be further operable to identify (1) at least one of the proximal end areas of the guide map to activate based on the second distal data and the planned trajectory, and (2) at least one of the distal end areas of the guide map to activate based on the second proximal data and the planned trajectory, and activate a second portion of the at least one of the proximal end areas of the guide map and a second portion of the at least one of the distal end areas of the guide map identified for activation. Moreover, the proximal end areas of the first portion and the second portion are not identical, the distal end areas of the first portion and the second portion are not identical, and the first position is different from the second position.
In some embodiments, the system can further include a guidance system communicatively coupled to the instrument and the at least one processor, the guidance system being operable to measure, in real-time, a position of the instrument during its operation, including the first position, and transmit to the instrument the first guidance data including the first position at the first time instance.
In another aspect, a method for providing trajectory guidance is provided that includes obtaining first guidance data associated with a first position of an instrument at a first time instance, the first guidance data including first distal data corresponding to a distal end of the instrument and first proximal data corresponding to a proximal end of the instrument. The method further includes identifying (1) at least one of proximal end areas of a guide map of the instrument to activate based on the first distal data and a planned trajectory, and (2) at least one of distal end areas of the guide map to activate based on the first proximal data and the planned trajectory. The method further includes activating a first portion of the at least one of the proximal end areas of the guide map and a first portion of the at least one of the distal end areas of the guide map identified for activation. Moreover, the first portion of the at least one of the proximal end areas of the guide map and the first portion of the at least one of the distal end areas that are activated function as visual cues indicating the alignment or misalignment of the proximal and distal ends of the instrument relative to the planned trajectory.
Any of the features or variations described above can be applied to any particular aspect or embodiment of the present disclosure in a number of different combinations. The absence of explicit recitation of any particular combination is due solely to the avoidance of repetition in this summary.
This disclosure will be more fully understood from the following detailed description taken in conjunction with the accompanying drawings, in which:
Certain exemplary embodiments will now be described to provide an overall understanding of the principles of the structure, function, manufacture, and use of the systems and methods disclosed herein. One or more examples of these embodiments are illustrated in the accompanying drawings. Those skilled in the art will understand that the systems, devices and methods specifically described herein and illustrated in the accompanying drawings are non-limiting exemplary embodiments. The features illustrated or described in connection with one exemplary embodiment may be combined with the features of other embodiments. Such modifications and variations are intended to be included within the scope of the present disclosure. Further, to the extent features or steps are described as being, for example, “first” or “second,” such numerical ordering is generally arbitrary, and thus such numbering can be interchangeable.
The present disclosure includes some illustrations and descriptions that include prototypes or bench models. A person skilled in the art will recognize how to rely upon the present disclosure to integrate the techniques, systems, devices, and methods provided for into a product, such as a consumer-ready, warehouse-ready, or operating-room-ready surgical system. A person skilled in the art will appreciate that the present disclosure has application in conventional endoscopic, minimally invasive, and open surgical procedures, and beyond.
Exemplary embodiments of the present disclosure provide surgical trajectory guidance using lasers. A planned trajectory is a path into or through an object or patient's body, into or via which an instrument or other object is to be inserted and removed. The planned trajectory can be determined pre-operatively or intra-operatively based on patient information including medical imaging, to indicate the safest or most effective path into the patient's body. In surgery, a laser system can be used to emit lasers indicating how an instrument to be inserted into the patient's body should be aligned to conform with the planned trajectory. The lasers are emitted toward target points on or in the direction of the patient's body. The position of the target points is calculated and adjusted accordingly such that the emitted lasers accurately indicate how the distal and proximal ends of the instrument should be aligned. A distal laser emitted toward an entry point on the patient's body indicates or corresponds to where the distal end of the instrument should be positioned. The proximal laser indicates a path or axis with which the proximal end of the instrument should be aligned. When the distal end of the instrument aligns with the distal laser and the proximal end aligns with the proximal laser, the instrument is in alignment with the planned trajectory such that, if it is longitudinally moved, its movement will adhere to the planned trajectory. A guide component including a target point thereon can be provided on or proximate to the proximal end of the instrument. The target point of the guide of the instrument indicates the area of the proximal end of the instrument with which the proximal laser should be aligned or intersected.
In some example embodiments, a single laser is emitted toward a single target point on or proximate to the patient's body. The path of the single laser is equal to, or corresponds to, a longitudinal axis of the planned trajectory. The single laser thus indicates how the distal end of the instrument and the target point on the guide at the proximal end of the instrument should be positioned in order to align with the planned trajectory. When the instrument is properly aligned, its longitudinal movement in the proximal and distal directions occurs within or in accordance with the planned trajectory.
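In the single-laser arrangement, checking alignment reduces to verifying that both the distal tip and the guide's target point lie on the laser's central axis. The following is a minimal, non-limiting sketch; the function names, the millimeter tolerance, and the point-plus-direction representation of the axis are illustrative assumptions:

```python
import math

def _unit(v):
    """Return v scaled to unit length."""
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def point_to_axis_distance(point, axis_origin, axis_dir):
    """Perpendicular distance from a point to the laser's central axis."""
    d = _unit(axis_dir)
    v = tuple(p - o for p, o in zip(point, axis_origin))
    t = sum(a * b for a, b in zip(v, d))          # projection onto the axis
    closest = tuple(o + t * c for o, c in zip(axis_origin, d))
    return math.dist(point, closest)

def instrument_aligned(tip, guide_target, axis_origin, axis_dir, tol=1.0):
    """The instrument is aligned when both the distal tip and the guide's
    target point lie on the laser axis, within a tolerance (e.g., in mm)."""
    return (point_to_axis_distance(tip, axis_origin, axis_dir) <= tol
            and point_to_axis_distance(guide_target, axis_origin, axis_dir) <= tol)
```

Requiring two distinct points of the instrument to sit on the axis constrains both position and orientation, which is why the single beam suffices to convey the full planned alignment.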
In some example embodiments, a guide map can be provided on the instrument, at or near its proximal end. The guide map is a component with portions or areas that can be activated to provide visual cues to the operator of the instrument as to how to position the instrument into alignment with the planned trajectory. The portions or areas of the guide map can be or include LEDs or other light sources indicating alignment and misalignment of the proximal or distal ends of the instrument. If the proximal or distal ends of the instrument are not in alignment with the path or longitudinal axis of the planned trajectory, specific LEDs can be activated or illuminated to indicate how (e.g., the direction in which) the proximal or distal ends should be moved. When the proximal or distal ends are moved in accordance with the activated misalignment LEDs into proper alignment, corresponding alignment LEDs are in turn activated. The instrument can also or alternatively provide haptic and/or audio cues or outputs to indicate position or other relevant surgical data.
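Selecting which misalignment LED to illuminate can be viewed as mapping an end's lateral offset from the trajectory axis to a correction direction. A minimal, non-limiting sketch follows; the eight-sector compass layout, the tolerance, and the y-is-north convention are illustrative assumptions rather than features of the disclosed guide map:

```python
import math

def led_cue(offset_x, offset_y, tol=1.0):
    """Map an instrument end's lateral offset from the planned-trajectory
    axis to a guide-map cue: 'aligned' inside the tolerance, otherwise the
    compass direction in which that end must be moved to re-align
    (assuming +x = east and +y = north in the guide-map plane)."""
    if math.hypot(offset_x, offset_y) <= tol:
        return "aligned"
    # the correction points back toward the axis, i.e., opposite the offset
    angle = math.degrees(math.atan2(-offset_y, -offset_x)) % 360
    sectors = ["E", "NE", "N", "NW", "W", "SW", "S", "SE"]
    return sectors[int((angle + 22.5) // 45) % 8]
```

For instance, an end offset to the east of the axis produces a "move west" cue, so the corresponding westward misalignment LED would be activated; once the offset falls inside the tolerance, the alignment LED takes over.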
Surgical Navigation
Surgical navigation systems, such as navigation system 101, can track the location and position of instruments during a surgical procedure, for example, by identifying the movement of the instrument 101i or instrument array 101i-2 using its cameras 101c. The tracked information concerning the location and position of instruments can be used to calculate and output information, for example, to assist the surgeon 105 (and the like) in adhering to planned surgical steps and processes.
For instance, the navigation system 101 can calculate or receive information regarding one or more target trajectories for the surgeon 105 to execute during a surgery of the patient 103. The target trajectories can refer to a desired or planned path through which a surgical instrument is to be driven, moved or operated, for example, relative to the anatomy of the patient 103. The target trajectories can be calculated in a number of manners either pre-operatively or intra-operatively, including, for example, using three-dimensional (3D) imaging (e.g., CT, MRI) of the patient's anatomy to identify the planned paths therethrough. During the surgery, the navigation system 101 can identify the actual position of the instrument 101i relative to the patient 103. The actual position of the instrument 101i relative to the patient 103 that is identified during the surgery can, in turn, be output on or via the display device 101d, with reference to the previously calculated target trajectories. This allows the surgeon 105 to navigate instruments during surgery by referencing the display 101d of the system 101, and thereby attempt to adhere to the target trajectories.
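The comparison between a tracked instrument pose and a target trajectory can be reduced to scalar deviations for display. The following is a minimal, non-limiting sketch of the angular component; the representation of both axes as 3-vectors and the function name are illustrative assumptions:

```python
import math

def angular_deviation_deg(instr_dir, traj_dir):
    """Angle, in degrees, between the instrument's longitudinal axis and
    the target trajectory's axis, each given as a 3-vector."""
    def unit(v):
        n = math.sqrt(sum(c * c for c in v))
        return tuple(c / n for c in v)
    a, b = unit(instr_dir), unit(traj_dir)
    # clamp to guard against floating-point drift outside acos's domain
    dot = max(-1.0, min(1.0, sum(x * y for x, y in zip(a, b))))
    return math.degrees(math.acos(dot))
```

A navigation display might render this value numerically or as a crosshair offset, which is the information the surgeon must otherwise translate back into the real-world surgical field.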
Although not illustrated in
Laser-Based Trajectory Guidance
Laser systems are devices that are configured to generate and emit light in the form of one or more laser beams. In the exemplary embodiments of
The laser outputs 207o-1 and 207o-2 can be configured to maximize the directions and areas in or to which the laser beams Ld and Lp can be emitted. For example, the laser outputs 207o-1 and 207o-2 can form part of respective laser emission components that can be driven or adjusted by the laser system 207 as needed. For example, the laser emission components can be designed such that they can be angled as needed to emit the laser to desired areas on or proximate to the patient 203. Moreover, the laser emission components that include the laser outputs 207o-1 and 207o-2 can be multi-jointed (e.g., a multi-jointed cylinder) to provide further flexibility. In some embodiments, the laser emission components can be mounted or formed on a base of the laser system 207 that can be rotated or moved along the X and Y axes. It should be understood that the design, configuration, and/or mounting of the laser outputs 207o-1 and 207o-2 (and/or their laser emission components) can be different for each, such that, for example, one laser output is static relative to the laser system 207, while another is movable and/or adjustable.
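Driving such an emission component toward a desired area amounts to computing a pan and tilt for its output. A minimal, non-limiting sketch under an assumed convention (pan about the vertical Z axis, tilt as elevation from the horizontal XY plane; the function name is likewise an assumption):

```python
import math

def aim_angles(output_pos, target):
    """Pan and tilt, in degrees, needed to point a laser output located at
    output_pos toward a target point, under the stated axis convention."""
    dx, dy, dz = (t - o for t, o in zip(target, output_pos))
    pan = math.degrees(math.atan2(dy, dx))                 # rotation about Z
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))  # elevation
    return pan, tilt
```

A multi-jointed emission component would add per-joint kinematics on top of this, but the aiming problem for a simple pan/tilt mount reduces to these two angles.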
Moreover, while the laser system 207 is illustrated in
In order to accurately align the laser outputs 207o-1 and 207o-2 with the positioning of the instrument 209, the laser system 207 can be integrated with a surgical navigation system 101, as described in connection with
In another embodiment shown in
In any of the above-described embodiments, additional navigation arrays (e.g., navigation spheres for use in optical navigation systems, coils in electromagnetic navigation systems, or other positioning sensors such as inertial motion sensors, etc.) can be utilized in connection with a patient, an operating table, or other objects in the operating environment to allow accurate detection of each object's position and accurate coordination of guidance for a user of a given instrument or object with respect to the patient. Adjustments in alignment between objects and trajectory guidance can be provided in real time intra-operatively to account for deviations from planned trajectories, e.g., due to patient movement, movement of a surgical table due to a bump from a user, etc.
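The real-time adjustment described above can be viewed as a polling loop over the tracking data. In the schematic sketch below, `get_pose`, `recompute_targets`, and `aim_lasers` are hypothetical callables standing in for the navigation interface, the target-point calculation, and the emission-component drive, respectively; a fixed cycle count stands in for running until the surgical step completes:

```python
def guidance_loop(get_pose, recompute_targets, aim_lasers, cycles):
    """Repeatedly read the tracked patient/instrument pose, recompute the
    laser target points against the planned trajectory, and re-aim the
    lasers, so that guidance tracks intra-operative movement (patient
    shift, a bumped table, instrument advance or retraction)."""
    for _ in range(cycles):
        pose = get_pose()                   # navigation-frame positions
        targets = recompute_targets(pose)   # updated laser target points
        aim_lasers(targets)                 # drive the emission components
```

In practice the loop period would be tied to the tracker's update rate so that the emitted lasers continuously reflect the current planned alignment.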
Although not illustrated in
Still with reference to
The surgical instrument 209 can be or include guidewires, needles, taps, drivers, drills, cutters, blades, bone skids, retractors, access devices, and forceps, as well as implants such as bone anchors, spacers, cages, rods, plates, connectors, and the like. The instrument 209 includes a distal end 209d, initially positioned closest to the patient 203, and a proximal end 209p, initially positioned away from the patient 203. As noted above, the instrument can include a navigation array or sensor 209i to allow its position to be tracked by a navigation system integrated with the laser system. As explained in further detail below with reference to
More specifically, the instrument 209 can include a guide 209g on or proximate to its proximal end 209p. In some embodiments, the guide 209g is a circular plate, though the guide 209g can be of different shapes and have different characteristics. Moreover, in some embodiments, the guide 209g includes a target point 209t corresponding to the center of the guide 209g and thus the planned point with which the laser beam Lp is to be aligned or is to intersect, in order to replicate the planned trajectory To. The target point 209t can be marked by a shape such as a circle or X on the center of the guide 209g, for example, using a different color than other adjacent portions of the guide 209g. In some embodiments, the guide 209g is modular, such that it can be added to and removed from the instrument 209 and is thus interchangeable between instruments. The guide 209g can also be configured to be movable when disposed on the instrument 209. For instance, the guide 209g can be pivotable relative to the shaft of the instrument 209 to optimize the ability to align the guide 209g and/or its target point 209t with the laser beam Lp.
Operation of the laser system for providing laser-based trajectory guidance is now described with reference to
At step 350, patient data of the patient 203 is obtained by the laser system 207. The patient data can include anatomical data and/or images of the patient 203 (e.g., CT, MRI, X-Ray) as well as other information that is relevant to the surgical procedure of the patient (e.g., age, weight, height, procedure, etc.). The patient data obtained at step 350 can be generated pre-operatively or intra-operatively. As described in further detail below, the patient data obtained at step 350 includes at least sufficient information to calculate the planned or target trajectory To for the instrument 209.
In some embodiments, the patient data can be received from one or more systems or devices that are communicatively coupled to the laser system 207. For example, images of the relevant areas of the patient's anatomy (e.g., the surgical site) can be received directly from an imaging system or an imaging database that stores patient scans. Patient data can be received by the laser system 207, for example, from a healthcare provider system that stores information related to patients (e.g., patient 203), such as patient symptoms, diagnosis, planned procedure, and the like, as known to those of skill in the art. In some embodiments, part or all of the patient data can be obtained from one or more devices or components that are part of the laser system 207. For instance, patient data can be stored on one or more memories of or associated with the laser system 207 and retrieved therefrom at step 350.
In turn, at step 352, a target or planned trajectory To is calculated. As described herein, the planned trajectory To refers to a path into and/or through the patient's anatomy, which indicates the planned angle, direction and/or depth to which the instrument 209 should be advanced and/or retracted. The target or planned trajectory can be calculated by the laser system 207 based on the patient data obtained at step 350. For example, patient data such as the patient images, diagnosis, and planned procedure can be analyzed to determine the required instruments and their planned manipulation during the surgery, including the trajectory to and through which they should be moved. Alternatively or additionally, in some embodiments, the planned trajectory To can be calculated at step 352 based on information input by medical professionals, such as the surgical opinion and plan of the surgeon 205, and/or on expert medical data (e.g., research, manuals, journals, etc.).
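Purely as an illustrative sketch of the planned-trajectory data described above (the field names, units, and coordinate conventions here are assumptions for illustration and are not prescribed by this disclosure), a planned trajectory To can be captured as an entry point, a unit direction into the anatomy, and a planned depth:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass(frozen=True)
class PlannedTrajectory:
    """Hypothetical representation of a planned trajectory To."""
    entry_point: Tuple[float, float, float]  # planned entry point, patient coordinates
    direction: Tuple[float, float, float]    # unit vector pointing into the anatomy
    depth_mm: float                          # planned insertion depth

    def point_at(self, d: float) -> Tuple[float, float, float]:
        """Point a distance d (mm) along the planned axis from the entry point."""
        return tuple(e + d * u for e, u in zip(self.entry_point, self.direction))
```

For example, `PlannedTrajectory((10.0, 4.0, 0.0), (0.0, 0.0, 1.0), 40.0).point_at(40.0)` gives the planned tip position at full insertion depth.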
For example, in the context of a spinal surgery, the instrument 209 can be a bone anchor driver loaded with a bone anchor to be inserted or implanted into a pedicle of the patient 203. Accordingly, the laser system 207 can obtain images such as an MRI of the spine of the patient 203 at step 350 and, in turn, calculate the planned trajectory To of the driver 209 into the patient's body and pedicle, at step 352, based on the obtained images and, for example, information input by or from the surgeon 205.
It should be understood that the calculated trajectory To can be in the form of an image (e.g., a path illustrated on an image of the relevant portion of the body of the patient 203) and/or information defining the planned trajectory To. The information defining the planned trajectory To can include angles, depths, entry points, and other information that is relevant to the operation of the instrument. This information can be measured and/or provided relative to points of reference including markers and/or anatomical landmarks of the patient 203, such as a pedicle or pedicle center, spinous process edge, midline axis, or intervertebral disc. By defining the planned trajectory information To relative to points of reference, the planned trajectory To can be translated from a theoretical calculation by the system 207 into a real-world surgical environment where it can be applied or replicated on the patient 203. The relevant markers and/or landmarks that are used in some embodiments to define the planned trajectory To can be accounted for or included in the patient data obtained at step 350. That is, for example, the markers and/or landmarks can be included, highlighted or otherwise illustrated in images of the patient's anatomy. In other embodiments, the laser system 207 can be integrated with a surgical navigation system 101, as described above in connection with
Further, note that the planned trajectory information To can be determined intra-operatively in a variety of manners. In some embodiments, for example, medical imaging or other data gathering techniques can be utilized intra-operatively in a manner analogous to how they can be used pre-operatively. In some embodiments, however, the planned trajectory information To can be determined based on a user positioning an instrument in a desired orientation and using sensors, such as those comprising a surgical navigation system, to detect the position and orientation of the instrument, which are then used to set the planned trajectory information To. For example, a user can hold an instrument, such as a pointer device, at a desired entry point and aligned with a desired trajectory, and its position can be captured by a surgical navigation system or any of a variety of sensors of various modalities (e.g., optical, electromagnetic, etc.). In such a case, any emitted lasers can be moved to match the positioning of the instrument at the time the planned trajectory information To is set, and can subsequently guide the instrument as it is utilized during a procedure.
Returning to
The in-surgery information with which the pre-operative or intra-operative data (e.g., patient data, patient images) is correlated for the calibration of step 354 can be obtained in various ways, including use of the surgical navigation systems described above. By way of further example, in some embodiments IMUs and other sensors configured to measure specific force, angular rate, magnetic field, rotation (e.g., pitch, yaw, roll), acceleration, position, location, and angular reference, among other data, can be placed at relevant areas in the surgical environment, such as on the patient's body, surgical table, surgical instruments, and/or other systems and devices (e.g., laser system, navigation system, imaging device, etc.). Position sensors can measure absolute and/or relative positions along multiple (e.g., two, three) axes. The position information sensed or obtained by the sensors (e.g., in the operative environment) can be transmitted to communicatively coupled systems and devices including, for example, other sensors and/or the laser system 207. In some embodiments utilizing pre-operatively generated data, the sensors can be placed at areas corresponding to markers or anatomical landmarks identified or identifiable in the patient data, for example, to correlate the in-surgery anatomy of the patient 203 with the patient data (e.g., patient imaging) obtained pre-operatively at step 350 and used to calculate the planned trajectory To. In this way, in-surgery data about the patient 203, such as the actual or real-world distance from one anatomical landmark to another, can be detected or calculated and, in turn, compared or correlated with the pre-operative patient data obtained at step 350. In other words, the calibration of step 354 enables the system 207 to translate a coordinate (e.g., an instrument insertion point) identified in a pre-operative patient image into a real-world coordinate on the patient's body in surgery.
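The landmark-based correlation described above is, in essence, a point-set registration problem. As a hedged sketch only (the disclosure does not prescribe a particular algorithm; the Kabsch method is one well-known choice), a rigid rotation-plus-translation mapping pre-operative landmark coordinates onto their in-surgery counterparts can be recovered as follows:

```python
import numpy as np

def register_landmarks(pre_op, in_surgery):
    """Recover rigid transform (R, t) with in_surgery ~= R @ pre_op + t,
    given paired landmark positions, via the Kabsch algorithm."""
    P = np.asarray(pre_op, dtype=float)
    Q = np.asarray(in_surgery, dtype=float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)          # 3x3 cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T  # proper rotation (no reflection)
    t = cq - R @ cp
    return R, t

def to_surgical(point, R, t):
    """Map a pre-operative image coordinate into real-world coordinates."""
    return R @ np.asarray(point, dtype=float) + t
```

With at least three non-collinear landmarks, the recovered transform lets any planned coordinate (e.g., an insertion point) be expressed in the in-surgery frame.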
The calibration of step 354 can also or alternatively include calibration of the laser system 207 and other instruments, systems, and/or devices. For example, at step 354, the instrument 209 can be calibrated to the laser system 207 and/or to the patient 203 using sensors placed thereon, as discussed above. Similar to the calibration of the in-surgery patient anatomy described above, sensors included or placed on instruments, systems, and/or devices such as the instrument 209 can be used to detect or measure their actual and/or relative positions, and transmit the sensed information to other communicatively coupled sensors (e.g., sensor-equipped instruments), systems, and/or devices. This enables the laser system 207 to identify or calculate how the instrument 209 is manipulated during surgery, including, for example, how the instrument 209 is being moved and/or operated relative to the patient's body and the patient's surgical area. In some embodiments, the calibration of the laser system 207 and/or other instruments, systems and devices enables the laser system 207 to determine whether the instrument 209 is being moved in accordance with the planned trajectory calculated at step 352, and to make or trigger adjustments as needed, as described above.
As also noted above, in addition or alternative to calibrating using IMUs and other sensors that sense and transmit information such as position data, calibration can be performed using surgical navigation software and hardware, which can include one or more cameras (or imaging devices) and reference arrays that are configured to be identified by the cameras. Reference arrays can be provided on or attached to objects, systems, devices, and patients, including, for example, the instrument 209 (as shown in
Surgical navigation can in some cases be provided using navigation hardware and software included in an independent navigation system, as shown in the embodiment of
Still with reference to
At step 356, the laser system 207 calculates one or more target points. The target points are the points at which, or the directions in which, the lasers emitted by the laser system 207 are to be aimed or directed. As described above, in some embodiments, trajectory guidance is provided by emitting two lasers Ld and Lp from the laser system 207 in the direction of the patient, to indicate or guide (1) where the distal end 209d of the instrument 209 is to be inserted into the patient 203, and (2) how the proximal end 209p of the instrument 209 is to be aligned to replicate the planned trajectory To calculated at step 352. Accordingly, in such embodiments, the laser system 207 calculates the position of two target points, a proximal end target point Pp and a distal end target point Pd, located on or proximate to the patient 203, toward which the lasers Lp and Ld are to be emitted. It should be understood that the point Pp on or near the patient's body corresponds to the area toward which the laser Lp is to be emitted. However, because the objective of the trajectory guidance described herein is in some embodiments to align the guide 209g of the instrument 209 with the laser Lp, the laser Lp may in some cases not be visible at or near the point Pp, despite being emitted in its direction, because the guide 209g is aligned in a manner that interrupts the path of the light of the laser Lp.
The target points Pp and Pd can be calculated based on (1) data defining the planned trajectory To calculated at step 352, and/or (2) data obtained or obtainable by the laser system 207. As described above, the planned trajectory To calculated at step 352 is a path in the body of the patient 203, through which the instrument 209 is advanced (and/or retracted) to perform a surgical step or process. The planned trajectory To can be defined by angles, depths, entry points, and the like. As also described above, the data obtained or obtainable by the laser system 207 can include information about the patient 203, instrument 209, laser system 207, and other systems, devices, and instruments. Such information can include actual or relative positions, angles, and the like. The pre-operative measurements and data defining the planned trajectory To can be converted into real-world data applicable to the in-surgery patient 203.
In some embodiments, the target distal point Pd can be calculated, for instance, by determining the point on the patient's body at which the instrument 209 should be inserted to adhere to the planned trajectory To. The target proximal point Pp indicates where the laser Lp should be directed in order to create a laser beam that, when intersected by the instrument 209 at the target point 209t of its guide 209g, indicates the proper alignment of the proximal end in order to replicate the planned trajectory To. Thus, the target proximal point Pp can be calculated, for example, based on information about the planned trajectory To (e.g., angle), the length of the instrument 209, the positions of the patient 203 and the instrument 209 (e.g., relative to one another or to the laser system 207), and other data as known to those of skill in the art.
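As a concrete illustration of the geometry just described, under the simplifying assumption of a straight, rigid instrument (a sketch only, not a prescribed implementation), the point the guide's target point 209t must occupy can be found by stepping the instrument's length back along the planned axis from the entry point:

```python
import math

def proximal_target(entry_point, into_patient_dir, instrument_length):
    """Position the guide target point 209t must occupy: the entry point Pd
    offset by the instrument length along the planned axis, away from the
    patient (i.e., opposite the into-patient direction)."""
    norm = math.sqrt(sum(c * c for c in into_patient_dir))
    u = tuple(c / norm for c in into_patient_dir)  # normalize the planned axis
    return tuple(e - instrument_length * c for e, c in zip(entry_point, u))
```

For instance, with an entry point at the origin, a planned axis pointing straight down into the patient, and a 200 mm instrument, the guide center must sit 200 mm directly above the entry point.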
In turn, at step 358, the laser system 207 emits lasers (e.g., lasers Lp and Ld) corresponding and directed to the target points (e.g., points Pp and Pd) calculated at step 356. As described above, in some embodiments, the emitted lasers can have different characteristics to make them distinguishable from one another. For example, the lasers can be of different colors or have beams of different diameters.
As shown in exemplary
It should be understood that, in some embodiments in which the laser system 207 is communicatively coupled to a surgical navigation system and physically movable relative thereto (e.g., as shown in
Once the lasers have been emitted at step 358, the surgical instrument 209 can be placed and/or manipulated to engage in the desired surgical step or procedure. That is, at step 360, the instrument 209 is operated by a surgeon, medical professional or the like, for example, by placing the distal end of the instrument 209 at the surgical entry point on the patient 203. The instrument 209 can be manipulated and adjusted as needed to achieve a desired alignment corresponding to the planned trajectory To. Moreover, as described in further detail below, the operation of the instrument 209 can be tracked in real-time by the laser system 207 (and/or interconnected surgical navigation system), to make adjustments as needed in order to achieve desired instrument guidance. The operation of the surgical instrument 209 performed at step 360 (as well as the laser emission of step 358) is described with reference to
As shown in
As a result of the manipulation of the instrument 209 from time t0 to time t1, the target point 209t of the guide 209g is caused to be aligned with the laser Lp directed at the point Pp. Alignment between the target point 209t and the laser Lp means that the laser Lp intersects with the point 209t. Moreover, the alignment of the target point 209t and the laser Lp attained at time t1, together with the alignment of the distal end 209d of the instrument 209 with the distal target point Pd results in the trajectory Te at time t1 overlapping with, being equal to, and/or having the same (or substantially the same) central longitudinal axis as the planned trajectory To (as shown, for example, in
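The alignment condition attained at time t1 can also be stated quantitatively. As an assumed, illustrative check (the angular tolerance is a placeholder, not a value from this disclosure), the instrument trajectory Te replicates To when the angle between the instrument axis and the planned axis falls within tolerance:

```python
import math

def angular_deviation_deg(distal_end, guide_target, planned_axis):
    """Angle (degrees) between the instrument axis, running from the distal
    end 209d to the guide target point 209t, and the planned axis of To."""
    v = tuple(g - d for g, d in zip(guide_target, distal_end))
    dot = sum(a * b for a, b in zip(v, planned_axis))
    nv = math.sqrt(sum(a * a for a in v))
    na = math.sqrt(sum(a * a for a in planned_axis))
    cos_theta = max(-1.0, min(1.0, dot / (nv * na)))  # clamp for rounding
    return math.degrees(math.acos(cos_theta))

def is_aligned(distal_end, guide_target, planned_axis, tol_deg=1.0):
    """True when Te and To share (substantially) the same axis."""
    return angular_deviation_deg(distal_end, guide_target, planned_axis) <= tol_deg
```

A deviation of zero corresponds to Te and To sharing the same central longitudinal axis.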
Returning to the trajectory guidance method illustrated in
That is, at step 360 of
It should be understood that although
The tracking of the instrument 209 performed at step 360 can be performed by the laser system 207, for example, using embedded or interconnected hardware and/or software such as cameras, sensors and the like, as described above in detail with reference to
If none of the process completion criteria are deemed to be met at step 362, the laser system 207 recalculates the target points Pd and Pp at step 364 in a process similar to that described above in connection with the laser target point calculation of step 356 but using updated information obtained during the tracking performed at step 360. For example, recalculating the target points can include determining an updated proximal point Pp in the direction in which the laser Lp is to be directed using the updated information relating to the instrument 209, such as its position and angle. In some embodiments, for instance, if the tool has been distally advanced a distance d from time t1 (
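The real-time recalculation described above (steps 358 through 364) can be sketched at a high level as a loop. This is a schematic outline only; the callback names are hypothetical stand-ins for the tracking, target-point, and laser-steering functionality described in this disclosure:

```python
def trajectory_guidance_loop(track_instrument, compute_targets, steer_lasers, is_complete):
    """One pass per tracking update: re-derive the target points from the
    latest instrument pose and re-aim the lasers."""
    while True:
        pose = track_instrument()       # step 360: updated position/angle
        if is_complete(pose):           # step 362: completion criteria met?
            break
        pd, pp = compute_targets(pose)  # step 364: recalculate Pd and Pp
        steer_lasers(pd, pp)            # step 358: re-emit toward new points
```

Each iteration replays the calculation of step 356 with fresh tracking data, so the emitted lasers follow the instrument as it advances.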
In some embodiments, the real-time feedback loop of steps 358 to 364 illustrated in
In some embodiments, laser-based trajectory guidance similar to the process illustrated in
More specifically,
As shown in
Notably, the single and multi-laser trajectory guidance described above allows the surgeon 205 (and/or other operator of the instrument 209) to identify how the instrument 209 should be operated to conform with a calculated trajectory To using visual aids (e.g., lasers) within the line of sight of the surgeon 205 and at or proximate to the surgical area (e.g., surgical entry point).
On-Instrument Trajectory Guidance
The surgical environment 400 also includes a guidance system 411. The guidance system 411 can include hardware (e.g., memory, processors, communication means) and software configured to activate or trigger the guide 409g as needed to direct the instrument 409 to be aligned in accordance with the planned trajectory To based, for example, on detection of a position of the instrument 409 using a navigation array 409i or other sensor or element coupled to the instrument (e.g., as described above with respect to surgical navigation systems employing optical sensors, electromagnetic sensors, inertial motion units, etc.). It should be understood that although the guidance system 411 is illustrated as an overhead system in the environment 400, the guidance system 411 can be made up of one or more communicatively coupled devices that can be provided at various locations within and outside of the surgical environment 400. In some embodiments, the guidance system 411 can calculate the planned trajectory To as described herein based on information relating to the patient 403, the procedure and/or steps being performed on the patient, the instrument 409, and/or other communicatively coupled systems, devices and instruments. Such data can be captured by the system 411 and/or can be received from other systems. It should be understood that, in some embodiments, the planned trajectory To and/or information defining the planned trajectory To can be calculated by a system other than the guidance system 411, and transmitted thereto. In some embodiments, the guidance system 411 includes navigation hardware and/or software, and/or is communicatively coupled to a navigation system. The navigation hardware and/or software can include cameras and/or other components configured to measure or detect information about the patient 403, instrument 409 and/or other systems and devices, such as position data. 
The information that is measured or detected by the navigation hardware and software of the guidance system 411 and/or the navigation system can be used, for example, to calculate the planned trajectory To and/or track (e.g., in real-time) the instrument 409 during its operation. Based on the planned trajectory To and the information relating to the instrument 409 (e.g., its actual or relative position), the guidance system 411 can transmit instructions to cause the guide 409g to be triggered to guide the position of the instrument 409 to be aligned with a planned trajectory.
More specifically,
In the exemplary embodiments illustrated in
As shown in
As described above, the LEDs of the LED map 409m can be illuminated by the instrument 409 and/or caused to be illuminated by the guidance system 411 as a result of its movement, which can be tracked in real-time or periodically (e.g., every 1 ms, 1 s, etc.) by the guidance system 411 (e.g., using its navigation hardware and/or software, or that of an independent navigation system). Thus, as the instrument 409 is moved and its adjusted position is identified by the guidance system 411, the guidance system 411 can determine which segments of the LED map 409m to illuminate based on the new relative position of the proximal or distal ends 409p and 409d, respectively, to the central longitudinal axis AL. If the distal end 409d is moved between time t0 and t1 from a misaligned position to an aligned position as described above in connection with
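The mapping from a detected offset to an illuminated segment can be illustrated with a simple sketch (the segment count and alignment tolerance here are assumptions for illustration; the disclosure does not fix these values):

```python
import math

def led_segment(offset_x, offset_y, num_segments=8, tolerance=1.0):
    """Choose which radial segment of the LED map 409m to light for an
    instrument-end offset from the central longitudinal axis AL, measured
    in the plane perpendicular to AL. Returns None when aligned."""
    if math.hypot(offset_x, offset_y) <= tolerance:
        return None  # within tolerance of the axis: no correction needed
    angle = math.atan2(offset_y, offset_x) % (2 * math.pi)
    return int(angle // (2 * math.pi / num_segments))
```

The returned index identifies the segment on the side toward which the end has drifted, cueing the operator to correct in the opposite direction.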
Similar to the illumination of the LEDs of the LED map 409m described above in connection with
As shown in
It should be understood that the map 409m can have shapes other than those illustrated in
The guide 409g of the instrument 409 described above provides trajectory guidance to the operator of the instrument using visual cues, by indicating whether the proximal and distal ends of the instrument are properly aligned to replicate and/or adhere to a planned trajectory To. It should be understood that the guide 409g can be used not only to indicate whether the instrument 409 is aligned with the planned trajectory To, but also to indicate or alert when the instrument (e.g., the distal end of the instrument) is approaching or within a predetermined distance of a planned or non-planned structure. For example, visual cues such as those described above can indicate when the instrument is proximate to or has reached a desired structure (e.g., a bone or bone depth to be operated on) by illuminating a respective area on the map 409m using green lights, or by illuminating a respective area on the map 409m when the instrument is proximate to or has reached a non-planned structure (e.g., nerve). Such guidance can be provided based on information obtained by the guidance system, including, for example, pre-operative patient data (e.g., patient images). Such guidance can also be provided based on information obtained using other sensors disposed in an operating theater and/or coupled to the instrument 409. For example, information regarding proximity to nerves or other neural tissues or structures can be provided using one or more sensors integrated into the instrument 409. Such sensors can include one or more electrodes to detect the presence of, and proximity to, nerves. Exemplary sensors include those used in connection with electromyography (EMG) and mechanomyography (MMG).
Moreover, while the guide 409g of the instrument 409 described above provides trajectory guidance to the operator of the instrument using visual cues, the instrument 409 can be configured to, additionally or alternatively, provide non-visual cues to the operator of the instrument 409, such as haptic and/or audio cues, prompts, or feedback. For example, the instrument 409 (e.g., at a handle or shaft portion) can be configured to vibrate (or vibrate in various ways) to provide guidance to the instrument operator. For example, vibration and other haptic cues can be used to indicate when the instrument is aligned or misaligned with the axis of the planned trajectory, or when the instrument is proximate or adjacent to a desired or planned structure. Moreover, such guidance can be provided through audio cues, ranging from indirect feedback (e.g., rings, dings, beeps, and the like) to more direct feedback (e.g., spoken cues such as “distal end left,” “proximal end right,” “danger”).
In the embodiment shown in
The example embodiments described above, including the systems and procedures depicted in or discussed in connection with
In addition, it should be understood that the figures are presented for example purposes only. The architecture of the example embodiments presented herein is sufficiently flexible and configurable, such that it may be utilized and navigated in ways other than that shown in the accompanying figures. It is also to be understood that the procedures recited in the claims need not be performed in the order presented. Although specific embodiments are described above, it should be understood that numerous changes may be made within the spirit and scope of the concepts described. Accordingly, the disclosure is not to be limited by what has been particularly shown and described. All publications and references cited herein are expressly incorporated herein by reference in their entirety.
20130103067 | Fabro et al. | Apr 2013 | A1 |
20130103103 | Mire et al. | Apr 2013 | A1 |
20130150670 | O'Prey et al. | Jun 2013 | A1 |
20130150674 | Haig et al. | Jun 2013 | A1 |
20130172676 | Levy et al. | Jul 2013 | A1 |
20130282022 | Yousef | Oct 2013 | A1 |
20130289399 | Choi et al. | Oct 2013 | A1 |
20130303846 | Cybulski et al. | Nov 2013 | A1 |
20140066940 | Fang et al. | Mar 2014 | A1 |
20140074170 | Mertens et al. | Mar 2014 | A1 |
20140107473 | Dumoulin et al. | Apr 2014 | A1 |
20140142584 | Sweeney | May 2014 | A1 |
20140148647 | Okazaki | May 2014 | A1 |
20140180321 | Dias et al. | Jun 2014 | A1 |
20140194697 | Seex | Jul 2014 | A1 |
20140215736 | Gomez et al. | Aug 2014 | A1 |
20140257489 | Warren et al. | Sep 2014 | A1 |
20140275799 | Schuele | Sep 2014 | A1 |
20140276840 | Richter et al. | Sep 2014 | A1 |
20140277204 | Sandhu | Sep 2014 | A1 |
20140318582 | Mowlai-Ashtiani | Oct 2014 | A1 |
20140357945 | Duckworth | Dec 2014 | A1 |
20150018623 | Friedrich et al. | Jan 2015 | A1 |
20150065795 | Titus | Mar 2015 | A1 |
20150073218 | Ito | Mar 2015 | A1 |
20150112398 | Morgenstern Lopez et al. | Apr 2015 | A1 |
20150164496 | Karpowicz et al. | Jun 2015 | A1 |
20150216593 | Biyani | Aug 2015 | A1 |
20150223676 | Bayer et al. | Aug 2015 | A1 |
20150230697 | Phee et al. | Aug 2015 | A1 |
20150342621 | Jackson, III | Dec 2015 | A1 |
20150374213 | Maurice, Jr. | Dec 2015 | A1 |
20160015467 | Vayser et al. | Jan 2016 | A1 |
20160030061 | Thommen et al. | Feb 2016 | A1 |
20160066965 | Chegini et al. | Mar 2016 | A1 |
20160067003 | Chegini et al. | Mar 2016 | A1 |
20160074029 | O'Connell et al. | Mar 2016 | A1 |
20160095505 | Johnson et al. | Apr 2016 | A1 |
20160106408 | Ponmudi et al. | Apr 2016 | A1 |
20160166135 | Fiset | Jun 2016 | A1 |
20160174814 | Igov | Jun 2016 | A1 |
20160213500 | Beger et al. | Jul 2016 | A1 |
20160228280 | Schuele et al. | Aug 2016 | A1 |
20160235284 | Yoshida et al. | Aug 2016 | A1 |
20160287264 | Chegini et al. | Oct 2016 | A1 |
20160296220 | Mast et al. | Oct 2016 | A1 |
20160353978 | Miller et al. | Dec 2016 | A1 |
20170003493 | Zhao | Jan 2017 | A1 |
20170007226 | Fehling | Jan 2017 | A1 |
20170027606 | Cappelleri et al. | Feb 2017 | A1 |
20170042408 | Washburn et al. | Feb 2017 | A1 |
20170042411 | Kang et al. | Feb 2017 | A1 |
20170065269 | Thommen et al. | Mar 2017 | A1 |
20170065287 | Silva et al. | Mar 2017 | A1 |
20170086939 | Vayser et al. | Mar 2017 | A1 |
20170135699 | Wolf | May 2017 | A1 |
20170156755 | Poll et al. | Jun 2017 | A1 |
20170156814 | Thommen et al. | Jun 2017 | A1 |
20170196549 | Piskun et al. | Jul 2017 | A1 |
20170224391 | Biester et al. | Aug 2017 | A1 |
20180333208 | Kotian et al. | Nov 2018 | A1 |
Number | Date | Country |
---|---|---|
102727309 | Nov 2014 | CN |
9415039 | Nov 1994 | DE |
29916026 | Nov 1999 | DE |
0 537 116 | Apr 1993 | EP |
0 807 415 | Nov 1997 | EP |
3254627 | Dec 2017 | EP |
2481727 | Jan 2012 | GB |
2017015480 | Jan 2017 | NO |
9629014 | Sep 1996 | WO |
2001056490 | Aug 2001 | WO |
2001089371 | Nov 2001 | WO |
2002002016 | Jan 2002 | WO |
2004103430 | Dec 2004 | WO |
2008121162 | Oct 2008 | WO |
2009033207 | Mar 2009 | WO |
2013033426 | Mar 2013 | WO |
2013059640 | Apr 2013 | WO |
2014050236 | Apr 2014 | WO |
2014100761 | Jun 2014 | WO |
2014185334 | Nov 2014 | WO |
2015142762 | Sep 2015 | WO |
2016111373 | Jul 2016 | WO |
2016131077 | Aug 2016 | WO |
2016168673 | Oct 2016 | WO |
2017006684 | Jan 2017 | WO |
2017083648 | May 2017 | WO |
Entry |
---|
International Search Report and Written Opinion for Application No. PCT/US2015/043554, dated Nov. 19, 2015 (8 pages). |
International Search Report and Written Opinion for Application No. PCT/US2015/048485, dated Feb. 9, 2016 (16 pages). |
International Search Report and Written Opinion for Application No. PCT/US2015/060978, dated Feb. 15, 2016 (8 pages). |
International Search Report and Written Opinion for Application No. PCT/US2016/050022, dated Feb. 1, 2017 (19 pages). |
Invitation to Pay Additional Fees for Application No. PCT/US2016/050022, dated Nov. 3, 2016 (2 pages). |
Iprenburg, M., "Percutaneous Transforaminal Endoscopic Discectomy: The Thessys Method," in Lewandrowski, K., et al., Minimally Invasive Spinal Fusion Techniques, Summit Communications, 2008, pp. 65-81. |
Jung, K., et al., “A hands-free region-of-interest selection interface for solo surgery with a wide-angle endoscope: preclinical proof of concept,” Surg Endosc, 2017, v. 31, pp. 974-980. |
Invitation to Pay Additional Fees and Partial Search Report for Application No. PCT/IB2020/052932, dated Jun. 9, 2020 (15 pages). |
International Search Report and Written Opinion for Application No. PCT/IB2020/052932, dated Jul. 31, 2020 (21 pages). |
Number | Date | Country |
---|---|---|
20200315711 A1 | Oct 2020 | US |