Robotic surgical systems and methods

Information

  • Patent Grant
  • Patent Number
    11,872,000
  • Date Filed
    Monday, June 1, 2020
  • Date Issued
    Tuesday, January 16, 2024
Abstract
The disclosed technology relates to robotic surgical systems for improving surgical procedures. In certain embodiments, the disclosed technology relates to robotic surgical systems for use in osteotomy procedures in which bone is cut to shorten, lengthen, or change alignment of a bone structure. The osteotome, an instrument for removing parts of the vertebra, is guided by a surgical instrument guide held by the robot. In certain embodiments, the robot moves only in the “locked” plane (one of the two planes that create the wedge, i.e., the portion of the bone resected during the osteotomy). In certain embodiments, the robot prevents the osteotome (or other surgical instrument) from penetrating too deep, i.e., beyond the tip of the wedge. In certain embodiments, the robotic surgical system is integrated with neuromonitoring to prevent damage to the nervous system.
Description
BACKGROUND OF THE INVENTION

Robotic-assisted surgical systems have been developed to improve surgical precision and enable the implementation of new surgical procedures. For example, robotic systems have been developed to sense a surgeon's hand movements and translate them to scaled-down micro-movements and filter out unintentional tremors for precise microsurgical techniques in organ transplants, reconstructions, and minimally invasive surgeries. Other robotic systems are directed to telemanipulation of surgical tools such that the surgeon does not have to be present in the operating room, thereby facilitating remote surgery. Feedback-controlled robotic systems have also been developed to provide smoother manipulation of a surgical tool during a procedure than could be achieved by an unaided surgeon.


However, widespread acceptance of robotic systems by surgeons and hospitals is limited for a variety of reasons. Current systems are expensive to own and maintain. They often require extensive preoperative surgical planning prior to use, and they extend the required preparation time in the operating room. They are physically intrusive, possibly obscuring portions of a surgeon's field of view and blocking certain areas around the operating table, such that a surgeon and/or surgical assistants are relegated to one side of the operating table. Current systems may also be non-intuitive or otherwise cumbersome to use, particularly for surgeons who have developed a special skill or “feel” for performing certain maneuvers during surgery and who find that such skill cannot be implemented using the robotic system. Finally, robotic surgical systems may be vulnerable to malfunction or operator error, despite safety interlocks and power backups.


Certain surgical procedures, such as neurosurgery, orthopedic surgery, and spinal surgery require precise movement of surgical instruments and placement of devices. For example, spinal surgeries often require precision drilling and placement of screws or other implements in relation to the spine, and there may be constrained access to the vertebrae during surgery that makes such maneuvers difficult. Catastrophic damage or death may result from improper drilling or maneuvering of the body during spinal surgery, due to the proximity of the spinal cord and arteries. Common spinal surgical procedures include a discectomy for removal of all or part of a disk, a foraminotomy for widening of the opening where nerve roots leave the spinal column, a laminectomy for removal of the lamina or bone spurs in the back, and spinal fusion for fusing of two vertebrae or vertebral segments together to eliminate pain caused by movement of the vertebrae.


Surgeries that involve screw placement require preparation of holes in bone (e.g., vertebral segments) prior to placement of the screws. Where such procedures are performed manually, in some implementations, a surgeon judges a drill trajectory for subsequent screw placement on the basis of pre-operative CT scans. Other manual methods which do not involve usage of the pre-operative CT scans, such as fluoroscopy, 3D fluoroscopy or natural landmark-based, may be used to determine the trajectory for preparing holes in bone prior to placement of the screws. In some implementations, the surgeon holds the drill in his hand while drilling, and fluoroscopic images are obtained to verify if the trajectory is correct. Some surgical techniques involve usage of different tools, such as a pedicle finder or K-wires. Such procedures rely strongly on the expertise of the surgeon, and there is significant variation in success rate among different surgeons. Screw misplacement is a common problem in such surgical procedures.


In some procedures, such as osteotomy, a portion of the vertebra is removed (e.g., a wedge is created) such that the alignment of the spine can be changed. However, correcting the shape of the spine manually is difficult, prone to error, and cumbersome. For example, FIGS. 2A through 2D illustrate the principles of osteotomy, which is to correct the shape of the spine. A part of the vertebra is removed in order to obtain the right curvature of the spine. After part of the vertebra is removed, the vertebra(e) is fixed with the screws as shown in FIG. 2D to prevent spinal instability. An example osteotomy instrument is shown in FIG. 3. A surgeon manipulates this instrument, sometimes by hitting it with a hammer, to remove part of the vertebra(e). Similar procedures can be performed on other portions of a patient's skeletal structure.


Inaccurate or incorrectly performed surgical procedures, such as osteotomies, are frequent and typically result from inadequate instruments and the difficulty of accurately removing portions of bone with manual tools. Thus, there is a need for a robotic surgical system to assist with surgical procedures.


SUMMARY OF THE INVENTION

The disclosed technology relates to robotic surgical systems for improving surgical procedures. In certain embodiments, the disclosed technology relates to robotic surgical systems for use in osteotomy procedures in which bone is cut to shorten, lengthen, or change alignment of a bone structure. The disclosed technology can be used for many surgical procedures including, but not limited to, spinal surgery; neurosurgery (surgery performed on the nervous system), such as brain surgery; and orthopedic surgery, such as hip, knee, or leg surgery.


The instrument, such as an osteotome for removing parts of bone, is guided by a surgical instrument guide held by the robot. In certain embodiments, the robot moves only in the “locked” plane (one of the two planes that create the wedge, i.e., the portion of the bone resected during the osteotomy). The guide allows for translational movement of the instrument, such as an osteotome, which is necessary to cut the bone (e.g., vertebra). A surgeon can, for example, advance the instrument using a hammer or by hand alone. In certain embodiments, a navigation marker measures the position of the instrument, which is necessary for the system to determine the locked planes (e.g., the planes along which the cuts are made to form the wedge). In other embodiments, the marker is on the robot, and the robot's actual position (measured by the robot's encoders and calculated using the robot model) is used to determine the position of the instrument in space.
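The locked-plane behavior described above can be illustrated with a minimal sketch. This is not the patent's implementation; it assumes the locked plane is represented by a unit normal vector, and the function names are hypothetical. The idea is simply that any commanded displacement has its out-of-plane component removed, so the instrument can only move within the cutting plane.

```python
# Hypothetical sketch: constraining commanded motion to a "locked" plane.
# The plane is given by a unit normal vector; the component of the
# commanded displacement along that normal is removed, leaving only
# in-plane motion.

def project_onto_locked_plane(displacement, plane_normal):
    """Remove the component of `displacement` along `plane_normal`."""
    dot = sum(d * n for d, n in zip(displacement, plane_normal))
    return [d - dot * n for d, n in zip(displacement, plane_normal)]

# Commanded motion with an out-of-plane component (normal = z-axis):
constrained = project_onto_locked_plane([1.0, 2.0, 0.5], [0.0, 0.0, 1.0])
print(constrained)  # [1.0, 2.0, 0.0] -- motion stays in the x-y plane
```

A real controller would apply such a projection continuously in the robot's control loop, but the geometric constraint is the same.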


In certain embodiments, the robot prevents the instrument from penetrating too deep, i.e., beyond the tip of the wedge. This can be achieved by having a notch at the correct distance above the patient, thereby preventing the instrument from getting deeper than the notch end.
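The depth block acts as a hard stop, which can be sketched as a simple clamp. This is an illustrative sketch only; the depth values and function name are hypothetical, not from the patent.

```python
# Illustrative sketch of the depth-block behavior: the notch acts as a
# hard stop, so any requested insertion beyond the configured maximum
# depth (the tip of the wedge) is clamped. Values are hypothetical.

def clamp_to_depth_block(requested_depth_mm, wedge_tip_depth_mm):
    """Return the deepest insertion the guide's notch permits."""
    return min(requested_depth_mm, wedge_tip_depth_mm)

print(clamp_to_depth_block(35.0, 30.0))  # 30.0 -- stopped at the notch
print(clamp_to_depth_block(20.0, 30.0))  # 20.0 -- unaffected
```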


In certain embodiments, the robotic surgical system is integrated with neuromonitoring to prevent damage to the nervous system. For example, the electrical potential applied to the patient via the surgical instrument can be measured to ensure that the amount remains below an acceptable level. This can be measured by a neuromonitor (e.g., a neuromonitoring system with a sensor cable). When a threshold level is reached or a nerve has been touched, a signal is sent to the appropriate system to stop insertion of the surgical instrument and/or move the surgical instrument away to reduce the depth of penetration.
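The neuromonitoring interlock described above can be sketched as follows. This is a hypothetical illustration, not the patent's implementation: the threshold, retraction distance, units, and function names are all assumed for the example. On each monitoring cycle, a reading at or above the threshold stops insertion and backs the instrument away.

```python
# Hypothetical sketch of the neuromonitoring safety interlock: when the
# measured potential reaches a threshold, insertion is stopped and the
# instrument is retracted. Names, units, and values are illustrative.

RETRACT_STEP_MM = 2.0  # assumed retraction distance per trigger

def handle_neuromonitor_reading(potential_mv, threshold_mv, depth_mm):
    """Return (allow_insertion, new_depth) for one monitoring cycle."""
    if potential_mv >= threshold_mv:
        # Trigger received: stop insertion and reduce penetration depth.
        return False, max(0.0, depth_mm - RETRACT_STEP_MM)
    return True, depth_mm

print(handle_neuromonitor_reading(4.8, 5.0, 12.0))  # (True, 12.0)
print(handle_neuromonitor_reading(5.2, 5.0, 12.0))  # (False, 10.0)
```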


In one aspect, the disclosed technology includes a robotic surgical system for use in a surgical procedure performed on a patient, the system including: a robotic arm including an end-effector; an actuator for controlled movement of the robotic arm and positioning of the end effector, thereby controlling the trajectory and/or insertion depth of a surgical instrument in a guide affixed to the end effector; a neuromonitoring module for implementing real-time neuromonitoring during a surgical procedure; and a processor and a memory storing instructions thereon, wherein the instructions, when executed, cause the processor to: receive, by the neuromonitoring module, a trigger based on a neurological response of a portion of a nerve structure of the patient that is measured by a neuromonitoring system; and prevent, by the neuromonitoring module, deeper insertion into the patient of a surgical instrument guided by the robotic surgical system upon receipt of the trigger.


In certain embodiments, preventing deeper insertion into the patient of a surgical instrument guided by the robotic surgical system upon receipt of the trigger includes moving, by the robotic surgical system, a position of the end-effector away from the patient (e.g., along an axis).


In certain embodiments, the system includes a surgical instrument guide arranged to pass a neuromonitoring cable therethrough.


In certain embodiments, the surgical instrument guide is integrated with the neuromonitoring system such that a neuromonitoring cable can pass through a sterile zone.


In certain embodiments, the neuromonitoring system is separate from the robotic surgical system.


In certain embodiments, the neuromonitoring system includes a cable that extends through a surgical instrument guide connected to the end-effector.


In certain embodiments, the surgical instrument guide includes a user interface thereon.


In certain embodiments, the user interface includes one or more buttons thereon.


In certain embodiments, the surgical instrument guide includes a block and/or notch (e.g., at a correct distance above the patient) for preventing further insertion of the surgical instrument.


In certain embodiments, the system includes a navigation module for maintaining the position of the end-effector upon detection, by a navigation system, of movement of a navigation marker.


In certain embodiments, the system includes a user interface on a robotic arm of the robotic surgical system.


In certain embodiments, the user interface includes a touch screen.


In certain embodiments, the instructions, when executed by the processor, cause the processor to: provide for display on the user interface a list of one or more trajectories for selection by a user.


In certain embodiments, the instructions, when executed by the processor, cause the processor to: limit movement of the end effector such that movement of the surgical instrument is limited to a locked plane (e.g., wherein the locked plane is the plane along which one of the cuts to create a wedge in the vertebra(e) is made).


In certain embodiments, the instructions, when executed by the processor, cause the processor to: limit movement of the end effector such that movement of the surgical instrument is limited to translational movement (e.g., which is necessary to cut the vertebrae).


In certain embodiments, the instructions, when executed by the processor, cause the processor to: determine the position of the surgical instrument (e.g., osteotome).


In certain embodiments, the position of the surgical instrument is determined (e.g., for depth/insertion monitoring; e.g., to determine locked planes for the surgical instrument) by a navigation system based at least in part on the position of a marker on the osteotome.


In certain embodiments, the position of the surgical instrument is determined by a navigation system based at least in part on the position of a marker on the robotic surgical system and the robotic arm's actual position (e.g., as measured by the robotic surgical system's encoders and calculated using the robotic surgical system's movement model).


In certain embodiments, the end effector is a force and/or torque control end-effector.


In certain embodiments, the end effector is configured to hold a first surgical tool.


In certain embodiments, the end-effector includes a tool holder attached to the robotic arm via a force sensor, wherein the tool holder is sized and shaped to hold a first surgical tool.


In certain embodiments, the system includes a manipulator configured to allow robotically-assisted or unassisted positioning and/or movement of the end-effector by a user with at least four degrees of freedom.


In certain embodiments, the system includes a handle extending from the end effector that may be grasped by a hand of a user to move and/or position the end effector.


In certain embodiments, the system includes a force sensor located between the robotic arm and the tool holder for measuring forces and/or torques applied by a user to the first surgical tool held by the tool holder.


In certain embodiments, the system includes a sensor that detects the presence of the hand of the user on the handle.


In certain embodiments, the robotic surgical system is configured to permit a surgeon to manually move the end-effector to a position for an operation.


In certain embodiments, the surgery is spinal surgery, neurosurgery, or orthopedic surgery.


In certain embodiments, the end-effector is configured to releasably hold the first surgical tool, allowing the first surgical tool to be removed and replaced with a second surgical tool.


In certain embodiments, the manipulator is configured to allow robotically assisted or unassisted positioning and/or movement of the end-effector by a user with at least six degrees of freedom, wherein the six degrees of freedom are three degrees of translations and three degrees of rotations.


In certain embodiments, the patient position is a position of one or more markers placed in spatial relation to one or more vertebrae.


In certain embodiments, controlling the actuator to move the end-effector includes controlling the actuator to move the end-effector in a direction corresponding to a direction of application of the force and/or torque.


In certain embodiments, the end-effector is configured to move at a predetermined measured pace upon application and detection of user force and/or torque applied to the end-effector in excess of the predetermined minimum force and/or torque, and the predetermined measured pace is a steady, slow velocity.


In certain embodiments, the system includes the neuromonitoring system for providing depth control and/or protection.


In certain embodiments, the surgical instrument is an osteotome.


In another aspect, the disclosed technology includes a method of controlling the position of an end-effector of a robotic surgical system, the method including: receiving, by a neuromonitoring module of the robotic surgical system, a trigger from a neuromonitoring system, wherein the robotic surgical system includes: a robotic arm including the end-effector, an actuator for controlled movement of the robotic arm and positioning of the end effector, thereby controlling the trajectory and/or insertion depth of a surgical instrument in a guide affixed to the end effector, and the neuromonitoring module for implementing real-time neuromonitoring during a surgical procedure; and controlling, by a processor of a computing device in the robotic surgical system, a position of an end-effector of the robotic surgical system to prevent deeper insertion into a patient of a surgical instrument guided by the robotic surgical system upon receipt of the trigger.


In certain embodiments, preventing deeper insertion into the patient of a surgical instrument guided by the robotic surgical system upon receipt of the trigger includes moving, by the robotic surgical system, a position of the end-effector away from the patient (e.g., along an axis).


In certain embodiments, the robotic surgical system includes a surgical instrument guide arranged to pass a neuromonitoring cable therethrough.


In certain embodiments, the surgical instrument guide is integrated with the neuromonitoring system such that a neuromonitoring cable can pass through a sterile zone.


In certain embodiments, the neuromonitoring system is separate from the robotic surgical system.


In certain embodiments, the neuromonitoring system includes a cable that extends through a surgical instrument guide connected to the end-effector.


In certain embodiments, the surgical instrument guide includes a user interface thereon.


In certain embodiments, the user interface includes one or more buttons thereon.


In certain embodiments, the surgical instrument guide includes a block and/or notch (e.g., at a correct distance above the patient) for preventing further insertion of the surgical instrument.


In certain embodiments, the method includes receiving, by a navigation module in the robotic surgical system, a navigation signal indicating movement of a navigation marker; and moving, by the robotic surgical system, a position of the end-effector based on the navigation signal.


In certain embodiments, the robotic surgical system includes a user interface on a robotic arm of the robotic surgical system.


In certain embodiments, the user interface includes a touch screen.


In certain embodiments, the instructions, when executed by the processor, cause the processor to: provide for display on the user interface a list of one or more trajectories for selection by a user.


In certain embodiments, the instructions, when executed by the processor, cause the processor to: limit movement of the end effector such that movement of the surgical instrument is limited to a locked plane (e.g., wherein the locked plane is the plane along which one of the cuts to create a wedge in the vertebra(e) is made).


In certain embodiments, the instructions, when executed by the processor, cause the processor to: limit movement of the end effector such that movement of the surgical instrument is limited to translational movement (e.g., which is necessary to cut the vertebrae).


In certain embodiments, the instructions, when executed by the processor, cause the processor to: determine the position of the surgical instrument (e.g., osteotome).


In certain embodiments, the position of the surgical instrument is determined (e.g., for depth/insertion monitoring; e.g., to determine locked planes for the surgical instrument) by a navigation system based at least in part on the position of a marker on the osteotome.


In certain embodiments, the position of the surgical instrument is determined by a navigation system based at least in part on the position of a marker on the robotic surgical system and the robotic arm's actual position (e.g., as measured by the robotic surgical system's encoders and calculated using the robotic surgical system's movement model).


In certain embodiments, the end effector is a force and/or torque control end-effector.


In certain embodiments, the end effector is configured to hold a first surgical tool.


In certain embodiments, the end-effector includes a tool holder attached to the robotic arm via a force sensor, wherein the tool holder is sized and shaped to hold a first surgical tool.


In certain embodiments, the robotic surgical system includes a manipulator configured to allow robotically-assisted or unassisted positioning and/or movement of the end-effector by a user with at least four degrees of freedom.


In certain embodiments, the robotic surgical system includes a handle extending from the end effector that may be grasped by a hand of a user to move and/or position the end effector.


In certain embodiments, the robotic surgical system includes a force sensor located between the robotic arm and the tool holder for measuring forces and/or torques applied by a user to the first surgical tool held by the tool holder.


In certain embodiments, the robotic surgical system includes a sensor that detects the presence of the hand of the user on the handle.


In certain embodiments, the robotic surgical system is configured to permit a surgeon to manually move the end-effector to a position for an operation.


In certain embodiments, the surgery is spinal surgery.


In certain embodiments, the end-effector is configured to releasably hold the first surgical tool, allowing the first surgical tool to be removed and replaced with a second surgical tool.


In certain embodiments, the manipulator is configured to allow robotically assisted or unassisted positioning and/or movement of the end-effector by a user with at least six degrees of freedom, wherein the six degrees of freedom are three degrees of translations and three degrees of rotations.


In certain embodiments, the patient position is a position of one or more markers placed in spatial relation to one or more vertebrae.


In certain embodiments, controlling the actuator to move the end-effector includes controlling the actuator to move the end-effector in a direction corresponding to a direction of application of the force and/or torque.


In certain embodiments, the end-effector is configured to move at a predetermined measured pace upon application and detection of user force and/or torque applied to the end-effector in excess of the predetermined minimum force and/or torque, and the predetermined measured pace is a steady, slow velocity.


In certain embodiments, the robotic surgical system includes the neuromonitoring system for providing depth control and/or protection.


In certain embodiments, the surgical instrument is an osteotome.





BRIEF DESCRIPTION OF THE FIGURES

The foregoing and other objects, aspects, features, and advantages of the present disclosure will become more apparent and better understood by referring to the following description taken in conjunction with the accompanying drawings, in which:



FIG. 1 is an illustration of an example robotic surgical system in an operating room;



FIGS. 2A through 2D illustrate the principles of osteotomy;



FIG. 3 is an illustration of an osteotome;



FIG. 4A is an illustration of an example robotic surgical system;



FIG. 4B is an illustration of an example integration of an osteotome instrument with a robotic surgical system;



FIG. 5A is an illustration of an example surgical instrument guide for use with a robotic surgical system;



FIG. 5B is an illustration of an example surgical instrument guide with an intermediate lock for use with a robotic surgical system;



FIG. 5C is an illustration of an example surgical instrument guide with an end lock for use with a robotic surgical system;



FIG. 6 is a diagram of a robotic surgical system for use in a surgical procedure performed on a patient;



FIG. 7 shows a block diagram of an exemplary cloud computing environment; and



FIG. 8 is a block diagram of a computing device and a mobile computing device.





The features and advantages of the present disclosure will become more apparent from the detailed description set forth below when taken in conjunction with the drawings, in which like reference characters identify corresponding elements throughout. In the drawings, like reference numbers generally indicate identical, functionally similar, and/or structurally similar elements.


DETAILED DESCRIPTION OF THE INVENTION


FIG. 1 illustrates an example robotic surgical system in an operating room 100. In some implementations, one or more surgeons, surgical assistants, surgical technologists and/or other technicians (e.g., 106a-c) perform an operation on a patient 104 using a robotic-assisted surgical system. In the operating room 100, the surgeon may be guided by the robotic system to accurately execute an operation. This may be achieved by robotic guidance of the surgical tools, including ensuring the proper trajectory of the tool (e.g., drill or screw). In some implementations, the surgeon defines the trajectory intra-operatively with little or no pre-operative planning. The system allows a surgeon to physically manipulate the tool holder to safely achieve proper alignment of the tool for performing crucial steps of the surgical procedure. Operation of the robot arm by the surgeon (or other operator) in force control mode permits movement of the tool in a measured, even manner that disregards accidental, minor movements of the surgeon. The surgeon moves the tool holder to achieve proper trajectory of the tool (e.g., a drill or screw) prior to operation or insertion of the tool into the patient 104. Once the robotic arm is in the desired position, the arm is fixed to maintain the desired trajectory. The tool holder serves as a stable, secure guide through which a tool may be moved or slid at an accurate angle. Thus, the disclosed technology provides the surgeon with reliable instruments and techniques to successfully perform his or her surgery.
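The force-control behavior described above, in which accidental, minor movements are disregarded while deliberate forces move the arm at a measured pace, can be sketched as a simple deadband. This is a hypothetical one-axis illustration; the threshold, pace, and function name are assumptions, not values from the patent.

```python
# Hypothetical sketch of force-control mode along one axis: forces below
# a minimum threshold (e.g., tremor or accidental contact) produce no
# motion; forces above it move the arm at a steady, slow pace in the
# direction of the applied force. Values are illustrative.

def force_to_velocity(force_n, min_force_n, pace_mm_s):
    """Map a measured user force to a commanded arm velocity."""
    if abs(force_n) < min_force_n:
        return 0.0  # deadband: ignore small, unintentional forces
    return pace_mm_s if force_n > 0 else -pace_mm_s

print(force_to_velocity(0.5, 2.0, 1.5))   # 0.0 -- tremor ignored
print(force_to_velocity(3.0, 2.0, 1.5))   # 1.5 -- steady forward pace
print(force_to_velocity(-3.0, 2.0, 1.5))  # -1.5 -- steady reverse pace
```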


In some embodiments, the operation may be spinal surgery, such as a discectomy, a foraminotomy, a laminectomy, or a spinal fusion, neurosurgery, or orthopedic surgery. In some implementations, the surgical robotic system includes a surgical robot 102 on a mobile cart 114. The surgical robot 102 in the example shown in FIG. 1 is positioned in proximity to an operating table 112 without being attached to the operating table 112, thereby providing maximum operating area and mobility to surgeons around the operating table 112 and reducing clutter on the operating table 112. In alternative embodiments, the surgical robot 102 (or cart) is securable to the operating table 112. In certain embodiments, both the operating table 112 and the cart 114 are secured to a common base to prevent any movement of the cart or table 112 in relation to each other, even in the event of an earth tremor.


The mobile cart 114 may permit a user (operator) 106a, such as a technician, nurse, surgeon, or any other medical personnel in the operating room 100, to move the surgical robot 102 to different locations before, during, and/or after a surgical procedure. The mobile cart 114 enables the surgical robot 102 to be easily transported into and out of the operating room 100. For example, a user 106a may move the surgical robot 102 into the operating room 100 from a storage location. In some implementations, the mobile cart 114 may include wheels, a track system, such as a continuous track propulsion system, or other similar mobility systems for translocation of the cart. The mobile cart 114 may include an attached or embedded handle for locomotion of the mobile cart 114 by an operator (e.g., user 106a).


For safety reasons, the mobile cart 114 may be provided with a stabilization system that may be used during a surgical procedure performed with a surgical robot 102. The stabilization mechanism increases the global stiffness of the mobile cart 114 relative to the floor in order to ensure the accuracy of the surgical procedure. In some implementations, the wheels include a locking mechanism that prevents the cart 114 from moving. The stabilizing, braking, and/or locking mechanism may be activated when the machine is turned on. In some implementations, the mobile cart 114 includes multiple stabilizing, braking, and/or locking mechanisms. In some implementations, the stabilizing mechanism is electro-mechanical with electronic activation. The stabilizing, braking, and/or locking mechanism(s) may be entirely mechanical. The stabilizing, braking, and/or locking mechanism(s) may be electronically activated and deactivated.


In some implementations, the surgical robot 102 includes a robotic arm mounted on a mobile cart 114. An actuator may move the robotic arm. The robotic arm may include a force control end-effector configured to hold a surgical tool. The robot 102 may be configured to control and/or allow positioning and/or movement of the end-effector with at least four degrees of freedom (e.g., six degrees of freedom, three translations and three rotations). The robotic surgical system can limit movement of a surgical instrument in a surgical instrument guide affixed to the end effector to movement along a trajectory, along a plane (or a portion of a plane) and/or to a particular depth.


In some implementations, the robotic arm is configured to releasably hold a surgical tool, allowing the surgical tool to be removed and replaced with a second surgical tool. The system may allow the surgical tools to be swapped without re-registration, or with automatic or semi-automatic re-registration of the position of the end-effector.


In some implementations, the surgical system includes a surgical robot 102, a tracking detector 108 that captures the position of the patient and different components of the surgical robot 102, and a display screen 110 that displays, for example, real time patient data and/or real time surgical robot trajectories.


In some implementations, a tracking detector 108 monitors the location of patient 104 and the surgical robot 102. The tracking detector 108 may be a camera, a video camera, an infrared detector, a field generator and sensors for electro-magnetic tracking, or any other motion detecting apparatus. In some implementations, based on the patient and robot position, the display screen 110 displays a projected trajectory and/or a proposed trajectory for the robotic arm of robot 102 from its current location to a patient operation site. By continuously monitoring the patient 104 and robotic arm positions using the tracking detector 108, the surgical system can calculate updated trajectories and visually display these trajectories on display screen 110 to inform and guide surgeons and/or technicians in the operating room 100 using the surgical robot. In addition, in certain embodiments, the surgical robot 102 may also change its position and automatically position itself based on trajectories calculated from the real time patient and robotic arm positions captured using the tracking detector 108. For instance, the trajectory of the end-effector can be automatically adjusted in real time to account for movement of the vertebrae and/or other part of the patient 104 during the surgical procedure. An example robotic surgical system that may be used with the disclosed technology or modified for use with the disclosed technology is described in U.S. patent application Ser. No. 14/266,769, filed Apr. 30, 2014 and entitled Apparatus, Systems, and Methods for Precise Guidance of Surgical Tools, the contents of which are hereby incorporated by reference in their entirety.
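The continuous trajectory update described above can be sketched as a simple geometric recomputation: when the navigation marker moves with the patient, the planned entry point moves with it, and the approach vector from the tool to the entry point is recalculated. This is an illustrative sketch; the representation (marker position plus a fixed entry-point offset) and all names are assumptions, not the patent's method.

```python
# Hypothetical sketch of real-time trajectory updating: the planned entry
# point is stored as a fixed offset from the patient-mounted navigation
# marker, so when the marker moves, the entry point -- and hence the
# approach vector from the tool -- is recomputed.

def updated_trajectory(marker_pos, entry_offset, tool_pos):
    """Recompute the tool-to-entry approach vector after patient movement."""
    entry = [m + o for m, o in zip(marker_pos, entry_offset)]
    return [e - t for e, t in zip(entry, tool_pos)]

# Marker at origin, entry point offset (1, 1, 0), tool 5 mm above:
print(updated_trajectory([0.0, 0.0, 0.0], [1.0, 1.0, 0.0],
                         [0.0, 0.0, 5.0]))  # [1.0, 1.0, -5.0]
```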



FIG. 4A is an illustration of an example robotic surgical system 400. Starting from the end effector 402, the robot holds an instrument guide 404. In certain embodiments, the instrument guide 404 is integrated with a depth block 410 that stops movement of the inserted instrument in a particular direction (e.g., max depth of penetration by the instrument can be set). Examples of surgical instrument guides that may be used herein or modified for use herein are disclosed in U.S. patent application Ser. No. 14/597,883, filed January 2015 and entitled “Notched Apparatus for Guidance of an Insertable Instrument Along an Axis During Surgery,” the contents of which are hereby incorporated by reference in their entirety.


In certain embodiments, the guide 404 has a sterilizable, reusable user interface 406. In certain embodiments, the interface 406 is an electrical assembly with one or more input devices for commanding the robotic surgical system 400. The one or more input devices may include two or more buttons configured to enable a user to place the robotic surgical system 400 in one of a rotation mode, a translation mode, or a combined translation and rotation mode. In some implementations, upon selection of a first button of the two or more buttons, the robotic surgical system 400 is in the rotation mode; upon selection of a second button of the two or more buttons, the robotic surgical system 400 is in the translation mode; and upon selection of both the first and second buttons, the robotic surgical system 400 is in the combined translation and rotation mode. In certain embodiments, this electrical assembly is provided on or built into the surgical instrument guide. In some implementations, the electrical assembly is fabricated separately (e.g., using overmolding on the buttons and cable, or epoxy resin, to form an assembly that is integrated into the guide using a rapid locking device).


In some implementations, the surgical instrument guide 404 and input device(s) thereon (e.g., buttons) can be used for instructing the robotic system to translate along a line when the translation button is pressed, rotate around the line if the rotation button is pressed, and/or translate and rotate around the line if both buttons are pressed. The electrical assembly may be directly integrated into the surgical instrument guide 404.
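The two-button behavior described above reduces to a small mode-selection function. This is a minimal sketch of that mapping; the mode names and the "hold" default when no button is pressed are illustrative assumptions.

```python
def control_mode(translation_pressed, rotation_pressed):
    """Map the two guide-mounted buttons to a motion mode: translation along
    a line, rotation around it, both, or hold when neither is pressed."""
    if translation_pressed and rotation_pressed:
        return "translation+rotation"
    if translation_pressed:
        return "translation"
    if rotation_pressed:
        return "rotation"
    return "hold"  # no buttons pressed: robot holds its position
```

The robot controller would poll these inputs and constrain the arm's admissible motion accordingly.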


The guide 404, in certain embodiments, is configured to be attached directly or indirectly to an end-effector 402 of the robotic surgical system 400. In some implementations, the robotic surgical system 400 is configured to allow robotically-assisted or unassisted positioning and/or movement of the end effector 402 by a user with at least six degrees of freedom. The six degrees of freedom may be three degrees of translation and three degrees of rotation.


In certain embodiments, a user interface 408 (e.g., for use by a surgeon) is on the robotic arm (e.g., the forearm). An example of such a user interface 408 is described in U.S. patent application Ser. No. 14/858,325, filed Sep. 18, 2015, entitled "Robot-Mounted User Interface for Interacting with Operation Room Equipment," the contents of which are hereby incorporated by reference in their entirety. It can be based on touch-screen technology and implemented using a tablet computer. This user interface 408 can be used to present the trajectory list to the user and allow him or her to select one.


In certain embodiments, the robot 400 includes a neuromonitoring cable 412. The neuromonitoring cable 412 can pass through a hole (e.g., sealed) in the surgical instrument guide 404. A neuromonitoring probe can be incorporated with the guide 404 and/or surgical instrument, thereby allowing the robotic surgical system 400 to monitor a patient's neurological response to the procedure. In certain embodiments, a neuromonitoring interface 414 allows the robot 400 to communicate with an external neuromonitoring system. In other embodiments, the entire neuromonitoring system is external to the robotic surgical system 400 or the entire neuromonitoring system is integrated with the robotic surgical system 400.



FIG. 4B is an illustration of an example integration of an osteotome instrument 452 with a robotic surgical system. Other instruments (e.g., instruments for removing the cancellous bone, clean-up and closure, etc.) used in surgical procedures may similarly be integrated and/or used with the robotic surgical system 400. For example, the system may be used with Navlock™ Instruments by Medtronic of Minneapolis, Minnesota.


An osteotome 452 is rigid and sharp such that it can be used to remove hard, external parts of the bone 458, shown as a vertebra in FIG. 4B. FIG. 4B illustrates a set-up for the use of the osteotome 452 with the robotic surgical system 400. The osteotome 452 is guided by the guide 404, which is held by the robot 400. In certain embodiments, the robot 400 moves only in the "locked" plane 460 (one of the two planes which create the wedge in the bone). In certain embodiments, the guide 404 allows (e.g., at the appropriate time) for the translational movement of the osteotome 452 that is necessary to cut the bone (e.g., vertebra). In certain embodiments, a user might use a hammer to advance the osteotome 452. In other embodiments, a user might advance the osteotome 452 by hand.


A navigation marker 454 measures the position of the osteotome 452, which is necessary for the system to determine the locked planes (e.g., the planes along which the cuts to form the wedge in the bone are made). In an alternative set-up, the marker 454 can be on the robot 400, and the robot's actual position (measured by the robot's encoders and calculated using the robot model) can be used to determine the position of the osteotome 452 in space.
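In the alternative set-up, the tip position follows from a standard rigid-body transform: the robot's measured flange pose plus the known tool offset. The helper below is an illustrative sketch of that computation (names and the row-major matrix convention are assumptions, not from the source).

```python
def mat_vec(m, v):
    """Multiply a 3x3 rotation matrix (row-major nested tuples) by a 3-vector."""
    return tuple(sum(m[i][j] * v[j] for j in range(3)) for i in range(3))

def osteotome_tip(flange_position, flange_rotation, tip_offset):
    """Tip position in space from the robot's flange pose (from encoders and
    the kinematic model) and the tool's fixed offset in the flange frame."""
    rotated = mat_vec(flange_rotation, tip_offset)
    return tuple(p + r for p, r in zip(flange_position, rotated))

# A 90-degree rotation about z maps the tool's x-offset onto the y-axis.
Rz = ((0.0, -1.0, 0.0),
      (1.0,  0.0, 0.0),
      (0.0,  0.0, 1.0))
tip = osteotome_tip((10.0, 0.0, 5.0), Rz, (2.0, 0.0, 0.0))
```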


In certain embodiments, the robot 400 prevents the osteotome 452 from getting too deep, i.e., beyond the tip of the desired wedge. This can be achieved by having the notch 456 in the guide 404 the correct distance above the patient: the navigation marker rod 454 would then prevent the osteotome 452 from getting deeper than the notch 456 permits.
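The mechanical depth limit above can be reasoned about as one-dimensional geometry along the insertion axis: the rod's travel is bounded by the notch, so the deepest point the tip can reach depends on how high the robot holds the guide. This is an idealized sketch with assumed variable names, not the patent's actual dimensioning.

```python
def required_guide_height(notch_travel_mm, wedge_depth_mm):
    """Height at which the robot should hold the guide above the bone surface
    so that the notch-limited travel cannot carry the tip past the wedge tip."""
    return max(0.0, notch_travel_mm - wedge_depth_mm)

def max_tip_depth(notch_travel_mm, guide_height_mm):
    """Deepest point below the bone surface the tip can reach in this setup:
    full rod travel minus the standoff between guide exit and bone."""
    return notch_travel_mm - guide_height_mm

# With 100 mm of notch travel and a 40 mm wedge, hold the guide 60 mm up.
h = required_guide_height(100.0, 40.0)
```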


During an osteotomy procedure, in certain embodiments, the resection measurement is based on preoperative measurements. Determining the degree of the resection to accomplish the desired correction can be performed by the surgeon, by the computer system, or a combination thereof. For example, the system can determine the ideal shape of the spine, compare the ideal shape to a patient's spine, and determine the location of the resection and/or the amount that must be resected.
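One way a system could translate a desired angular correction into a resection amount is simple planar geometry: for a closing wedge whose apex angle equals the correction angle, the wedge's base height across the bone follows from the tangent. This is a textbook geometric approximation offered as an illustration, not a clinical formula or the patent's planning method.

```python
import math

def wedge_base_height(correction_deg, body_depth_mm):
    """Approximate base height of a closing wedge whose apex angle equals the
    desired correction angle, across a vertebral body of the given depth
    (simple planar geometry; illustrative only)."""
    return body_depth_mm * math.tan(math.radians(correction_deg))

# e.g., a 30-degree correction across a 35 mm vertebral body:
h = wedge_base_height(30.0, 35.0)   # roughly 20 mm of bone at the base
```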


In certain embodiments, the tool holder 404 is integrated with neuromonitoring. In certain embodiments, depth control and protection is provided such that depth/insertion movement is stopped upon receipt of a trigger (e.g., external or internal). For example, in certain embodiments, neuromonitoring causes the robotic surgical system 400 to stop depth movement (e.g., in response to an external signal). The neuromonitoring system, in certain embodiments, includes the ability to react in response to a signal and/or generate a signal as well as the capability to stop the instrument (e.g., 452) and/or prevent the instrument (e.g., 452) from going beyond a certain threshold. In certain embodiments, the system 400 also moves the surgical instrument and/or surgical instrument guide 404 back (e.g., less depth of penetration in instances, for example, where a threshold has been exceeded) in response to a trigger. Neuromonitoring may be used in many surgical procedures, including osteotomy.


In certain embodiments, a neuromonitoring cable can pass through the sterile zone.


An example of how to pass a cable or electrical connection through the sterile zone is described in U.S. patent application Ser. No. 14/602,627, filed Jul. 27, 2015 and entitled “Sterile Drape and Adapter for Covering a Robotic Surgical Arm and Preventing Contamination of a Sterile Field,” the contents of which are hereby incorporated by reference in their entirety. In certain embodiments, the neuromonitoring cable passes through the tool holder 404.


In certain embodiments, the robotic surgical system 400 integrates with a navigation system, such as StealthStation and StealthLink (e.g., to obtain trajectories from StealthStation and to track real-time data) by Medtronic of Minneapolis, Minnesota.


As shown in FIG. 5A, a guide 500, in some implementations, includes a tubular structure 506 (e.g., body), with a first longitudinal notch 522a along its length and a second longitudinal notch 522b along its length. In some implementations, the first notch 522a and second notch 522b are located on opposite sides/portions of the body 506 of the guide 500 as shown in FIG. 5A. In some implementations, the guide 500 includes two or more notches that are spaced evenly (as shown in FIG. 5A) or unevenly around the body of the guide.


In some implementations, the longitudinal notches 522a and 522b are slots. The longitudinal notches 522a-b, in some implementations, are sized in relation to one or more pegs that couple a navigation marker to a tool support. As the tool support slides through the guide 500, one of the notches 522a-b permits the tool support to slide along the axis defined by the guide while the guide is held in a fixed position by the robotic surgical system. The peg extends through one of the notches 522a-b and outside of the guide 500 and permits the navigation marker attached to the tool support via the peg to be viewed by a navigation camera along an entire range of movement of the tool support through the guide. In some implementations, the peg is utilized without the navigation marker to maintain the orientation of the surgical instrument. In some implementations, the navigation marker is used by a navigation camera to track the surgical instrument. The notches 522a-b may constrain movement of the marker in a fixed orientation along the axis defined by the guide. In some implementations, the longitudinal notches 522a-b are sized in relation to a peg to permit the surgical instrument to slide along the axis of insertion in reference to the tool support.


Among other things, incorporation of two or more notches, such as notches 522a and 522b, permits ambidextrous manipulation of the end effector and/or tool. It also permits positioning of the robotic surgical system on either side of the operating room table, including in reference to a navigation system (e.g., a tracking camera).


In some implementations, the guide 500 includes one or more input devices, such as electro-mechanical buttons. For example, the guide 500 may include two electromechanical buttons 508a and 508b. In some implementations, the guide 500 includes an activation switch 560. The activation switch 560 may be separate from the buttons 508a and 508b. The activation switch 560 may be a presence-detection device that can be used for enabling movements of the surgical robot; the types of movements may be defined by the buttons 508a and/or 508b. The presence detection may include a long button that is pressed when a user grabs the handle (e.g., to thereby move the handle). In some implementations, the activation switch detects the presence of a hand on the handle.


In some implementations, a user may use the one or more input devices to select to enter a translation mode, positioning mode, axis rotation mode, axis insertion mode, and/or axis position mode. In some implementations, the guide 500 includes an enabling button, a rotation button, and/or a translation button. In some implementations, the enabling button must be selected with one or more other buttons to enable movement of the end effector. For example, to rotate the end effector, the user may need to select the enabling button and the rotation button. Similarly, to enable translations of the end effector, the user may need to select the enabling button and the translation button. In some implementations, the end effector may enter a coarse positioning mode when a user selects the enabling button, translation button, or rotation button. In some implementations, selection of the enabling button causes the robotic arm to enter the positioning mode, in which the user is able to position the tool appropriately and the operator can freely move the robotic arm (e.g., via coarse movements).


Selection of the translation mode allows, in some implementations, the end effector to be moved along a plane (e.g., a plane in line with the end of a tool such as a drill guide). An operator may use the translation mode to make fine movements with the end effector and to find an entry point. Selection of the rotation mode locks movement of the end effector except rotations (e.g., the manipulator may only be rotated). In some implementations, activation of the rotation mode permits an operator to make fine rotations around an entry point. In axis rotation mode, an operator may rotate the end effector around a specific axis (e.g., the axis formed by a drill guide). In axis position mode, an operator may move the end effector without changing an axis (e.g., the axis formed by a drill guide). In axis insertion mode, an operator may move the end effector along a trajectory.
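The modes above can be read as constraint masks over the six degrees of freedom in the tool frame. The table below is one plausible reading of those constraints, sketched with assumed mode names and with axis 2 standing for the guide/drill axis.

```python
# Allowed degrees of freedom per mode, in the tool frame:
# (translations x, y, z), (rotations x, y, z); axis 2 is the guide axis.
MODE_DOF = {
    "positioning":    ((True,  True,  True),  (True,  True,  True)),
    "translation":    ((True,  True,  False), (False, False, False)),  # in-plane
    "rotation":       ((False, False, False), (True,  True,  True)),
    "axis_rotation":  ((False, False, False), (False, False, True)),   # about tool axis
    "axis_insertion": ((False, False, True),  (False, False, False)),  # along tool axis
    "axis_position":  ((True,  True,  True),  (False, False, False)),  # axis kept fixed
}

def motion_allowed(mode, kind, axis):
    """kind is 'translate' or 'rotate'; axis is 0..2 in the tool frame."""
    trans, rot = MODE_DOF[mode]
    return trans[axis] if kind == "translate" else rot[axis]
```

A controller would consult such a mask each cycle and zero out any commanded motion component the current mode forbids.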


The various positioning modes allow an operator to quickly and accurately move the end effector to a desired position (e.g., on or along a determined trajectory). When all of the buttons are released, in some implementations, the robot actively holds the position of the end effector. For example, if a drill guide is coupled to the end effector, an operator may insert a drill into the drill guide without moving the position of the end effector or drill guide. Thus, after carefully positioning the drill guide along a desired trajectory, an operator may accurately drill along the desired trajectory.



FIG. 5B is an illustration of an example surgical instrument guide 530 with an intermediate lock 532 to lock the position of the surgical instrument in the guiding tube 506. Instead of having a long guiding tube, the robot may move the guiding tube 506 along a trajectory (e.g., in a straight line) thus creating a very long “virtual” guidance without compromising haptic feedback for the surgeon. Additionally, the intermediate lock 532 enables the surgical instrument to be placed in the guiding tube prior to determining the correct trajectory. After the correct trajectory is determined, the robotic arm may be moved away from the patient such that, for example, the vertebrae may be accessed by a surgeon.


After the vertebra is prepared, the robot can assist the surgeon in finding the right trajectory again, significantly decreasing the time necessary for screw placement in comparison to manual spinal surgeries.


An intermediate lock 532 may be placed at an initial distance 534, such as 80 mm, from an entry of the guiding tube 506. In some implementations, the initial distance is 80 mm. In some implementations, the initial distance is between 70-90 mm, 60-80 mm, or 80-100 mm. In some implementations, the initial distance corresponds to the length of the longest pedicle screws used, with a small amount of margin (e.g., 5, 10, 15, or 20 mm of margin). In some implementations, the intermediate lock 532 is a unidirectional lock that only blocks insertion movement. In some implementations, the initial distance 534 is long enough to allow guidance of the inserted instrument when the intermediate lock 532 is in the locked position. For example, the initial distance, in some implementations, is 30 mm. In some implementations, the initial distance is between 25-35 mm, 20-40 mm, or 35-50 mm. In some implementations, the intermediate lock 532 is a bidirectional lock that blocks insertion and removal of the surgical instrument.


When the intermediate lock 532 is released (e.g., unlocked), the surgical instrument may be slid further into the guide. In some implementations, the insertion distance 536 (e.g., the distance the surgical instrument can move forward after the intermediate lock 532 is released) is selected to allow sufficient guidance of the surgical instrument inside the vertebrae. In some implementations, the insertion distance is 80 mm. In some implementations, the insertion distance is between 70-90 mm, 60-80 mm, or 80-100 mm. This may be defined by the type of surgery and may be, for example, the length of a pedicle screw with some margin (e.g., 40-80 mm of total travel; e.g., 55, 60, 65, 70, or 75 mm total). The intermediate lock 532 may be implemented using a variety of mechanisms. The intermediate lock 532 may be a spring lock (e.g., a button that is pressed through a hole on the guide by a spring when the instrument is slid into a particular position). The intermediate lock 532 may be a small device that blocks the movement of the tool inside the guide 506. For example, the intermediate lock 532 may block the peg that holds a marker to a tool support. The intermediate lock 532 may be one or two bars that prevent movement of the instrument unilaterally or bilaterally, respectively. For example, two bars may be used to prevent the peg from moving. In some implementations, a lock is provided to lock the surgical instrument in place when it is fully inserted in the guide 506. The lock may be designed and/or function similarly to the intermediate lock.
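The insertion limits described above amount to a small two-stage state machine: insertion stops at the intermediate lock until it is released, then a further insertion distance becomes available. This sketch uses the example 80 mm figures from the text; the class and method names are illustrative.

```python
class GuideWithIntermediateLock:
    """Two-stage insertion limit: a unidirectional intermediate lock blocks
    insertion past the initial distance until released, after which the
    remaining insertion distance becomes available."""

    def __init__(self, initial_mm=80.0, insertion_mm=80.0):
        self.initial_mm = initial_mm      # travel available while locked
        self.insertion_mm = insertion_mm  # extra travel after release
        self.locked = True
        self.depth = 0.0

    def limit(self):
        return self.initial_mm if self.locked else self.initial_mm + self.insertion_mm

    def insert(self, amount_mm):
        """Advance the instrument; motion stops at the active limit."""
        self.depth = min(self.depth + amount_mm, self.limit())
        return self.depth

    def release(self):
        self.locked = False

g = GuideWithIntermediateLock()
g.insert(100.0)   # stops at the 80 mm intermediate lock
g.release()
g.insert(100.0)   # may now advance to the 160 mm full travel
```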



FIG. 5C is an illustration of an example surgical instrument guide 1150 with an end lock 552 to lock the position of the surgical instrument in the guiding tube 506. The end lock may be used to prevent the surgical instrument from accidentally being removed from the guiding tube 506. In some implementations, an instrument position sensor 556 (e.g., position detector) is integrated in the guiding tube 506 (e.g., any guiding tube described herein). The instrument position sensor 556 may be an inductive sensor, a capacitive sensor, a resistive sensor, a mechanical end switch, an optical measuring device, a force sensing device, or another similar position sensor. When the surgical instrument is inside the tube 506, the relative position of the instrument may be measured by the instrument position sensor 556. In some implementations, the sensor 556 detects discrete positions of the instrument inside the guiding tube 506. For example, the sensor 556 may detect when the surgical instrument is at a top, bottom, or middle position within the guide.


In some implementations, the robot generates movement of the tube 506 in response to the position of the instrument (e.g., to achieve movement along a desired trajectory). The movement may be generated only when the surgical instrument is at the extremities of the tube 506 (e.g., at either end of the notch 522). The combination of these features, and the ability to combine movement of the instrument inside the guiding tube 506 with guidance of the tube 506 by the robot, provides the ability to obtain long and complicated trajectories using simple and short surgical instrument guide tubes (e.g., 506) held by the robot.
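The "virtual" long guide behavior reduces to a simple rule fed by the position sensor: advance the tube along the trajectory only when the instrument has bottomed out, retract when it is at the top, and hold otherwise. This is an illustrative sketch of that decision logic with assumed names and units.

```python
def tube_step(instrument_pos_in_tube_mm, tube_length_mm, step_mm):
    """Decide how far the robot should move the short guiding tube along the
    planned trajectory, given where the sensor reports the instrument inside
    the tube. Moving the tube only at the extremities yields a long 'virtual'
    guide from a short physical one."""
    if instrument_pos_in_tube_mm >= tube_length_mm:  # bottomed out
        return step_mm       # advance the tube along the trajectory
    if instrument_pos_in_tube_mm <= 0.0:             # at the top of the tube
        return -step_mm      # retract the tube
    return 0.0               # mid-tube: robot holds the tube in place
```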


The end lock 552 may be a spring lock (e.g., a button that is pressed through a hole on the guide by a spring when the instrument is slid into a particular position). The end lock 552 may be a small device that blocks the movement of the tool inside the guide 506. For example, the end lock 552 may block the peg that holds a marker to a tool support. The end lock 552 may be one or two bars that prevent movement of the instrument unilaterally or bilaterally, respectively. For example, two bars may be used to prevent the peg from moving.



FIG. 6 is a diagram of a robotic surgical system 600 for use in a surgical procedure performed on a patient. In this example, the system 600 includes a robotic arm having an end-effector thereon and an actuator for controlled movement of the robotic arm and positioning of the end effector. A processor 604 and memory 602 are used to control movement of the robotic arm and coordinate behavior of the system 600 with various modules. As described above, this allows the system 600 to control the trajectory and/or insertion depth of a surgical instrument in a guide affixed to the end effector. In certain embodiments, the system 600 includes a neuromonitoring module 606 for implementing real-time neuromonitoring during the surgical procedure. In certain embodiments, the neuromonitoring module 606 receives a trigger based on a neurological response of a portion of a nerve structure of the patient that is measured by a neuromonitoring system 608. The neuromonitoring module 606, upon receipt of the trigger, prevents deeper insertion into the patient of a surgical instrument guided by the robotic surgical system 600. Preventing deeper insertion into the patient of a surgical instrument can be accomplished by moving, by the robotic surgical system 600, a position of the end-effector away from the patient (e.g., along an axis, such as the trajectory of an instrument held by the end-effector). A neuromonitoring cable can be used by the neuromonitoring system 608 to detect a neurological response that results in the neuromonitoring system 608 sending the trigger to the neuromonitoring module 606. In certain embodiments, the surgical instrument guide is arranged to pass a neuromonitoring cable therethrough. In certain embodiments, the surgical instrument guide is integrated with the neuromonitoring system 608 such that a neuromonitoring cable can pass through the guide and thus through a sterile zone.
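The trigger response described above, i.e., moving the end-effector away from the patient along the instrument axis, can be sketched as a single vector update. The function, the unit-axis convention pointing toward the patient, and the retraction amount are all illustrative assumptions.

```python
def retract_on_trigger(end_effector_pos, tool_axis_unit, retract_mm, triggered):
    """On a neuromonitoring trigger, move the end-effector back along the
    instrument axis (away from the patient); otherwise hold position.
    tool_axis_unit is a unit vector pointing from the guide toward the patient."""
    if not triggered:
        return end_effector_pos
    return tuple(p - retract_mm * a
                 for p, a in zip(end_effector_pos, tool_axis_unit))

# Tool axis points down toward the patient; a trigger backs the tool off 5 mm.
pos = retract_on_trigger((0.0, 0.0, 100.0), (0.0, 0.0, -1.0), 5.0, True)
```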


In certain embodiments, the neuromonitoring system 608 is separate from the robotic surgical system. In other embodiments, the neuromonitoring system 608 is part of the robot 600.


In certain embodiments, the robot 600 includes a navigation module 610 that communicates with a navigation system 612 that can monitor the position of the patient (e.g., the patient's skeletal structure, such as a specific piece or area of a bone), the robot, and/or the surgical instrument. For example, the position of the surgical instrument can be determined by the navigation system 612 based at least in part on the position of a marker on the surgical instrument. In another example, the position of the surgical instrument is determined by the navigation system 612 based at least in part on the position of a marker on the robotic surgical system 600 and the robotic arm's actual position (e.g., as measured by the encoders of the robotic surgical system 600 and calculated using the movement model of the robotic surgical system 600).


FIG. 7 shows an implementation of a network environment 700 for use in the robotic surgical system. In brief overview, FIG. 7 is a block diagram of an exemplary cloud computing environment 700. The cloud computing environment 700 may include one or more resource providers 702a, 702b, 702c (collectively, 702). Each resource provider 702 may include computing resources. In some implementations, computing resources may include any hardware and/or software used to process data. For example, computing resources may include hardware and/or software capable of executing algorithms, computer programs, and/or computer applications. In some implementations, exemplary computing resources may include application servers and/or databases with storage and retrieval capabilities. Each resource provider 702 may be connected to any other resource provider 702 in the cloud computing environment 700. In some implementations, the resource providers 702 may be connected over a computer network 708. Each resource provider 702 may be connected to one or more computing devices 704a, 704b, 704c (collectively, 704), over the computer network 708.


The cloud computing environment 700 may include a resource manager 706. The resource manager 706 may be connected to the resource providers 702 and the computing devices 704 over the computer network 708. In some implementations, the resource manager 706 may facilitate the provision of computing resources by one or more resource providers 702 to one or more computing devices 704. The resource manager 706 may receive a request for a computing resource from a particular computing device 704. The resource manager 706 may identify one or more resource providers 702 capable of providing the computing resource requested by the computing device 704. The resource manager 706 may select a resource provider 702 to provide the computing resource. The resource manager 706 may facilitate a connection between the resource provider 702 and a particular computing device 704. In some implementations, the resource manager 706 may establish a connection between a particular resource provider 702 and a particular computing device 704. In some implementations, the resource manager 706 may redirect a particular computing device 704 to a particular resource provider 702 with the requested computing resource.
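The brokering flow above can be sketched as a simple matching step: the manager finds a provider advertising the requested resource and pairs it with the requesting device. The data shapes and field names below are illustrative assumptions.

```python
def match_request(request, providers):
    """Sketch of the resource-manager flow: return the name of the first
    provider advertising the requested computing resource, or None if no
    provider can satisfy the request."""
    for provider in providers:
        if request["resource"] in provider["resources"]:
            return provider["name"]
    return None

providers = [
    {"name": "702a", "resources": {"db"}},
    {"name": "702b", "resources": {"app-server", "db"}},
]
chosen = match_request({"resource": "app-server", "device": "704a"}, providers)
```

A fuller manager would also establish or redirect the connection between the chosen provider and the device, as described above.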



FIG. 8 shows an example of a computing device 800 and a mobile computing device 850 that can be used to implement the techniques described in this disclosure. The computing device 800 is intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The mobile computing device 850 is intended to represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smart-phones, and other similar computing devices. The components shown here, their connections and relationships, and their functions, are meant to be examples only, and are not meant to be limiting.


The computing device 800 includes a processor 802, a memory 804, a storage device 806, a high-speed interface 808 connecting to the memory 804 and multiple high-speed expansion ports 810, and a low-speed interface 812 connecting to a low-speed expansion port 814 and the storage device 806. Each of the processor 802, the memory 804, the storage device 806, the high-speed interface 808, the high-speed expansion ports 810, and the low-speed interface 812, are interconnected using various busses, and may be mounted on a common motherboard or in other manners as appropriate. The processor 802 can process instructions for execution within the computing device 800, including instructions stored in the memory 804 or on the storage device 806 to display graphical information for a GUI on an external input/output device, such as a display 816 coupled to the high-speed interface 808. In other implementations, multiple processors and/or multiple buses may be used, as appropriate, along with multiple memories and types of memory. Also, multiple computing devices may be connected, with each device providing portions of the necessary operations (e.g., as a server bank, a group of blade servers, or a multi-processor system).


The memory 804 stores information within the computing device 800. In some implementations, the memory 804 is a volatile memory unit or units. In some implementations, the memory 804 is a non-volatile memory unit or units. The memory 804 may also be another form of computer-readable medium, such as a magnetic or optical disk.


The storage device 806 is capable of providing mass storage for the computing device 800. In some implementations, the storage device 806 may be or contain a computer-readable medium, such as a floppy disk device, a hard disk device, an optical disk device, or a tape device, a flash memory or other similar solid state memory device, or an array of devices, including devices in a storage area network or other configurations. Instructions can be stored in an information carrier. The instructions, when executed by one or more processing devices (for example, processor 802), perform one or more methods, such as those described above. The instructions can also be stored by one or more storage devices such as computer- or machine-readable mediums (for example, the memory 804, the storage device 806, or memory on the processor 802).


The high-speed interface 808 manages bandwidth-intensive operations for the computing device 800, while the low-speed interface 812 manages lower bandwidth-intensive operations. Such allocation of functions is an example only. In some implementations, the high-speed interface 808 is coupled to the memory 804, the display 816 (e.g., through a graphics processor or accelerator), and to the high-speed expansion ports 810, which may accept various expansion cards (not shown). In some implementations, the low-speed interface 812 is coupled to the storage device 806 and the low-speed expansion port 814. The low-speed expansion port 814, which may include various communication ports (e.g., USB, Bluetooth®, Ethernet, wireless Ethernet), may be coupled to one or more input/output devices, such as a keyboard, a pointing device, a scanner, or a networking device such as a switch or router, e.g., through a network adapter.


The computing device 800 may be implemented in a number of different forms, as shown in the figure. For example, it may be implemented as a standard server 820, or multiple times in a group of such servers. In addition, it may be implemented in a personal computer such as a laptop computer 822. It may also be implemented as part of a rack server system 824. Alternatively, components from the computing device 800 may be combined with other components in a mobile device (not shown), such as a mobile computing device 850. Each of such devices may contain one or more of the computing device 800 and the mobile computing device 850, and an entire system may be made up of multiple computing devices communicating with each other.


The mobile computing device 850 includes a processor 852, a memory 864, an input/output device such as a display 854, a communication interface 866, and a transceiver 868, among other components. The mobile computing device 850 may also be provided with a storage device, such as a micro-drive or other device, to provide additional storage. Each of the processor 852, the memory 864, the display 854, the communication interface 866, and the transceiver 868, are interconnected using various buses, and several of the components may be mounted on a common motherboard or in other manners as appropriate.


The processor 852 can execute instructions within the mobile computing device 850, including instructions stored in the memory 864. The processor 852 may be implemented as a chipset of chips that include separate and multiple analog and digital processors. The processor 852 may provide, for example, for coordination of the other components of the mobile computing device 850, such as control of user interfaces, applications run by the mobile computing device 850, and wireless communication by the mobile computing device 850.


The processor 852 may communicate with a user through a control interface 858 and a display interface 856 coupled to the display 854. The display 854 may be, for example, a TFT (Thin-Film-Transistor Liquid Crystal Display) display or an OLED (Organic Light Emitting Diode) display, or other appropriate display technology. The display interface 856 may comprise appropriate circuitry for driving the display 854 to present graphical and other information to a user. The control interface 858 may receive commands from a user and convert them for submission to the processor 852. In addition, an external interface 862 may provide communication with the processor 852, so as to enable near area communication of the mobile computing device 850 with other devices. The external interface 862 may provide, for example, for wired communication in some implementations, or for wireless communication in other implementations, and multiple interfaces may also be used.


The memory 864 stores information within the mobile computing device 850. The memory 864 can be implemented as one or more of a computer-readable medium or media, a volatile memory unit or units, or a non-volatile memory unit or units. An expansion memory 874 may also be provided and connected to the mobile computing device 850 through an expansion interface 872, which may include, for example, a SIMM (Single In Line Memory Module) card interface. The expansion memory 874 may provide extra storage space for the mobile computing device 850, or may also store applications or other information for the mobile computing device 850. Specifically, the expansion memory 874 may include instructions to carry out or supplement the processes described above, and may include secure information also. Thus, for example, the expansion memory 874 may be provided as a security module for the mobile computing device 850, and may be programmed with instructions that permit secure use of the mobile computing device 850. In addition, secure applications may be provided via the SIMM cards, along with additional information, such as placing identifying information on the SIMM card in a non-hackable manner.


The memory may include, for example, flash memory and/or NVRAM memory (non-volatile random access memory), as discussed below. In some implementations, instructions are stored in an information carrier and, when executed by one or more processing devices (for example, processor 852), perform one or more methods, such as those described above. The instructions can also be stored by one or more storage devices, such as one or more computer- or machine-readable mediums (for example, the memory 864, the expansion memory 874, or memory on the processor 852). In some implementations, the instructions can be received in a propagated signal, for example, over the transceiver 868 or the external interface 862.


The mobile computing device 850 may communicate wirelessly through the communication interface 866, which may include digital signal processing circuitry where necessary. The communication interface 866 may provide for communications under various modes or protocols, such as GSM voice calls (Global System for Mobile communications), SMS (Short Message Service), EMS (Enhanced Messaging Service), or MMS messaging (Multimedia Messaging Service), CDMA (code division multiple access), TDMA (time division multiple access), PDC (Personal Digital Cellular), WCDMA (Wideband Code Division Multiple Access), CDMA2000, or GPRS (General Packet Radio Service), among others. Such communication may occur, for example, through the transceiver 868 using a radio frequency. In addition, short-range communication may occur, such as using a Bluetooth®, Wi-Fi™, or other such transceiver (not shown). In addition, a GPS (Global Positioning System) receiver module 870 may provide additional navigation- and location-related wireless data to the mobile computing device 850, which may be used as appropriate by applications running on the mobile computing device 850.


The mobile computing device 850 may also communicate audibly using an audio codec 860, which may receive spoken information from a user and convert it to usable digital information. The audio codec 860 may likewise generate audible sound for a user, such as through a speaker, e.g., in a handset of the mobile computing device 850. Such sound may include sound from voice telephone calls, may include recorded sound (e.g., voice messages, music files, etc.) and may also include sound generated by applications operating on the mobile computing device 850.


The mobile computing device 850 may be implemented in a number of different forms, as shown in the figure. For example, it may be implemented as a cellular telephone 880. It may also be implemented as part of a smart-phone 882, personal digital assistant, or other similar mobile device.


Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, specially designed ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various implementations can include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device.


These computer programs (also known as programs, software, software applications or code) include machine instructions for a programmable processor, and can be implemented in a high-level procedural and/or object-oriented programming language, and/or in assembly/machine language. As used herein, the terms machine-readable medium and computer-readable medium refer to any computer program product, apparatus and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term machine-readable signal refers to any signal used to provide machine instructions and/or data to a programmable processor.


To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to the user and a keyboard and a pointing device (e.g., a mouse or a trackball) by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user can be received in any form, including acoustic, speech, or tactile input.


The systems and techniques described here can be implemented in a computing system that includes a back end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front end component (e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back end, middleware, or front end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network (LAN), a wide area network (WAN), and the Internet.


The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
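As a minimal illustration of the client-server relationship described above (a hypothetical sketch for explanatory purposes only; the function names run_server and run_client are illustrative and are not part of the disclosed system), two programs interacting through a communication network might be structured as follows:

```python
import socket
import threading

def run_server(port, ready):
    """Accept one connection on the given port and echo a greeting back."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("127.0.0.1", port))
    srv.listen(1)
    ready.set()  # signal that the server is ready to accept a client
    conn, _ = srv.accept()
    request = conn.recv(1024).decode()
    conn.sendall(("hello, " + request).encode())
    conn.close()
    srv.close()

def run_client(port, name):
    """Connect to the server over the network, send a request, return the reply."""
    with socket.create_connection(("127.0.0.1", port)) as cli:
        cli.sendall(name.encode())
        return cli.recv(1024).decode()
```

In practice the client and server would generally run on separate, remote computers connected by a network such as a LAN, WAN, or the Internet; both run on the local host here purely for illustration.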


In view of the structure, functions and apparatus of the systems and methods described here, in some implementations, a system and method for use in performing a surgical procedure with a robotic surgical system are provided. Having described certain implementations of methods and apparatus for supporting a robotic surgical system, it will now become apparent to one of skill in the art that other implementations incorporating the concepts of the disclosure may be used. Therefore, the disclosure should not be limited to certain implementations, but rather should be limited only by the spirit and scope of the following claims.


Throughout the description, where apparatus and systems are described as having, including, or comprising specific components, or where processes and methods are described as having, including, or comprising specific steps, it is contemplated that, additionally, there are apparatus and systems of the disclosed technology that consist essentially of, or consist of, the recited components, and that there are processes and methods according to the disclosed technology that consist essentially of, or consist of, the recited processing steps.


It should be understood that the order of steps or order for performing certain action is immaterial so long as the disclosed technology remains operable. Moreover, two or more steps or actions may be conducted simultaneously.

Claims
  • 1. A method of controlling the position of an end-effector of a robotic surgical system, the method comprising: receiving a surgical instrument in a surgical instrument guide affixed to the end effector; receiving, by a neuromonitoring module of the robotic surgical system, a trigger from a neuromonitoring system, wherein the robotic surgical system includes: a robotic arm having the end-effector connected thereto, an actuator that controls movement of the robotic arm and positioning of the end effector, thereby controlling the trajectory and insertion depth of the surgical instrument received in the surgical instrument guide affixed to the end effector, and the neuromonitoring module for real-time neuromonitoring during a surgical procedure while the surgical instrument is positioned in the surgical instrument guide; and controlling, by a processor of a computing device in the robotic surgical system, a position of the end-effector to prevent deeper insertion into a patient of the surgical instrument guided by the robotic surgical system upon receipt of the trigger, wherein the surgical instrument guide is integrated with the neuromonitoring module.
  • 2. The method of claim 1, wherein preventing deeper insertion into the patient of the surgical instrument includes moving, by the robotic surgical system, a position of the end-effector away from the patient.
  • 3. The method of claim 1, wherein the surgical instrument guide of the end effector is arranged to pass a neuromonitoring cable therethrough.
  • 4. The method of claim 1, wherein the surgical instrument guide is integrated with the neuromonitoring system such that a neuromonitoring cable can pass through a sterile zone.
  • 5. The method of claim 1, wherein the neuromonitoring system is a separate system which is not integrated with the robotic surgical system.
  • 6. The method of claim 1, wherein the neuromonitoring system comprises a cable that extends through the surgical instrument guide connected to the end-effector.
  • 7. The method of claim 1, wherein the surgical instrument guide comprises a user interface thereon.
  • 8. The method of claim 1, wherein the surgical instrument guide comprises a user interface thereon and includes one or more buttons.
  • 9. The method of claim 1, wherein the surgical instrument guide comprises a block or notch for preventing further insertion of the surgical instrument.
  • 10. The method of claim 1, further comprising: receiving, by a navigation module in the robotic surgical system, a navigation signal indicating movement of a navigation marker; and moving, by the robotic surgical system, the end-effector based on the navigation signal.
  • 11. The method of claim 1, wherein the robotic surgical system comprises a touch screen user interface on the robotic arm of the robotic surgical system.
  • 12. A method of cutting a bone of a patient along a plane, the method comprising: moving, by the robotic surgical system, an end effector to a trajectory, wherein the robotic surgical system includes: a robotic arm having the end-effector connected thereto; and an actuator that controls movement of the robotic arm and positioning of the end effector, thereby controlling the trajectory of a surgical instrument received in a surgical instrument guide connected to the end effector; receiving the surgical instrument in the surgical instrument guide connected to the end effector; maintaining the position of the end-effector in a locked plane in which the surgical instrument received in the surgical instrument guide is only allowed to move within a single plane defined by the locked plane, wherein the surgical instrument is a bone cutting tool; receiving, by a neuromonitoring module of the robotic surgical system, a trigger from a neuromonitoring system; preventing, by a processor of a computing device in the robotic surgical system, a deeper insertion into a patient of the surgical instrument guided by the robotic surgical system upon receipt of the trigger, wherein the surgical instrument guide is integrated with the neuromonitoring module.
  • 13. The method of claim 12, wherein: the surgical instrument guide supports an osteotome as the bone cutting tool; and the locked plane is a single plane that defines one of the two which create a wedge in the bone to be cut.
  • 14. The method of claim 12, further comprising determining the locked plane based at least in part on the position of a navigation marker on the surgical instrument, the marker being tracked by a navigation system in communication with the robotic surgical system.
  • 15. The method of claim 12, further comprising permitting, by the robotic surgical system, manual movement of the end-effector to a position for an operation.
  • 16. The method of claim 12, wherein the robotic surgical system is adapted to allow robotically assisted or unassisted positioning and movement of the end-effector by a user with at least six degrees of freedom, wherein the six degrees of freedom include three degrees of translation and three degrees of rotation.
  • 17. The method of claim 12, further comprising moving, by the robotic surgical system, the end-effector at a predetermined measured pace upon application and detection of user force applied to the end-effector in excess of a predetermined minimum force, wherein the predetermined measured pace is a steady velocity.
  • 18. The method of claim 12, wherein preventing deeper insertion into the patient of the surgical instrument includes moving, by the robotic surgical system, a position of the end-effector away from the patient.
  • 19. The method of claim 12, wherein the surgical instrument guide of the end effector is arranged to pass a neuromonitoring cable therethrough.
RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 15/253,206, filed Aug. 31, 2016, which claims priority to U.S. Provisional Patent Application No. 62/212,551, filed Aug. 31, 2015, entitled “ROBOTIC SURGICAL SYSTEMS AND METHODS FOR SPINAL AND OTHER SURGERIES,” the entire contents of which are hereby incorporated by reference.

20130345717 Markvicka et al. Dec 2013 A1
20130345757 Stad Dec 2013 A1
20140001235 Shelton, IV Jan 2014 A1
20140012131 Heruth et al. Jan 2014 A1
20140031664 Kang et al. Jan 2014 A1
20140046128 Lee et al. Feb 2014 A1
20140046132 Hoeg et al. Feb 2014 A1
20140046340 Wilson et al. Feb 2014 A1
20140049629 Siewerdsen et al. Feb 2014 A1
20140058406 Tsekos Feb 2014 A1
20140073914 Lavallee et al. Mar 2014 A1
20140080086 Chen Mar 2014 A1
20140081128 Verard et al. Mar 2014 A1
20140088612 Bartol et al. Mar 2014 A1
20140094694 Moctezuma de la Barrera Apr 2014 A1
20140094851 Gordon Apr 2014 A1
20140096369 Matsumoto et al. Apr 2014 A1
20140100587 Farritor et al. Apr 2014 A1
20140121676 Kostrzewski May 2014 A1
20140128882 Kwak et al. May 2014 A1
20140135796 Simon et al. May 2014 A1
20140142591 Alvarez et al. May 2014 A1
20140142592 Moon et al. May 2014 A1
20140148692 Hartmann et al. May 2014 A1
20140163581 Devengenzo et al. Jun 2014 A1
20140171781 Stiles Jun 2014 A1
20140171900 Stiles Jun 2014 A1
20140171965 Loh et al. Jun 2014 A1
20140180308 von Grunberg Jun 2014 A1
20140180309 Seeber et al. Jun 2014 A1
20140187915 Yaroshenko et al. Jul 2014 A1
20140188132 Kang Jul 2014 A1
20140194699 Roh et al. Jul 2014 A1
20140130810 Azizian et al. Aug 2014 A1
20140221819 Sarment Aug 2014 A1
20140222023 Kim et al. Aug 2014 A1
20140228631 Kwak et al. Aug 2014 A1
20140234804 Huang et al. Aug 2014 A1
20140257328 Kim et al. Sep 2014 A1
20140257329 Jang et al. Sep 2014 A1
20140257330 Choi et al. Sep 2014 A1
20140275760 Lee et al. Sep 2014 A1
20140275985 Walker et al. Sep 2014 A1
20140276931 Parihar et al. Sep 2014 A1
20140276940 Seo Sep 2014 A1
20140276944 Farritor et al. Sep 2014 A1
20140288413 Hwang et al. Sep 2014 A1
20140299648 Shelton, IV et al. Oct 2014 A1
20140303434 Farritor et al. Oct 2014 A1
20140303643 Ha et al. Oct 2014 A1
20140305995 Shelton, IV et al. Oct 2014 A1
20140309659 Roh et al. Oct 2014 A1
20140316436 Bar et al. Oct 2014 A1
20140323803 Hoffman et al. Oct 2014 A1
20140324070 Min et al. Oct 2014 A1
20140330288 Date et al. Nov 2014 A1
20140364720 Darrow et al. Dec 2014 A1
20140371577 Maillet et al. Dec 2014 A1
20150039034 Frankel et al. Feb 2015 A1
20150085970 Bouhnik et al. Mar 2015 A1
20150146847 Liu May 2015 A1
20150150524 Yorkston et al. Jun 2015 A1
20150196261 Funk Jul 2015 A1
20150196365 Kostrzewski Jul 2015 A1
20150213633 Chang et al. Jul 2015 A1
20150335480 Alvarez et al. Nov 2015 A1
20150342647 Frankel et al. Dec 2015 A1
20160005194 Schretter et al. Jan 2016 A1
20160166329 Langan et al. Jun 2016 A1
20160235480 Scholl et al. Aug 2016 A1
20160249990 Glozman et al. Sep 2016 A1
20160302871 Gregerson et al. Oct 2016 A1
20160320322 Suzuki Nov 2016 A1
20160331335 Gregerson et al. Nov 2016 A1
20170135770 Scholl et al. May 2017 A1
20170143284 Sehnert et al. May 2017 A1
20170143426 Isaacs et al. May 2017 A1
20170156816 Ibrahim Jun 2017 A1
20170202629 Maillet et al. Jul 2017 A1
20170212723 Atarot et al. Jul 2017 A1
20170215825 Johnson et al. Aug 2017 A1
20170215826 Johnson et al. Aug 2017 A1
20170215827 Johnson et al. Aug 2017 A1
20170231710 Scholl et al. Aug 2017 A1
20170258426 Risher-Kelly et al. Sep 2017 A1
20170273748 Hourtash et al. Sep 2017 A1
20170296277 Hourtash et al. Oct 2017 A1
20170360493 Zucher et al. Dec 2017 A1
Foreign Referenced Citations (2)
Number Date Country
201532227 Oct 2010 JP
WO-2013192598 Dec 2013 WO
Non-Patent Literature Citations (1)
Entry
US 8,231,638 B2, 07/2012, Swarup et al. (withdrawn)
Related Publications (1)
Number Date Country
20210038326 A1 Feb 2021 US
Provisional Applications (1)
Number Date Country
62212551 Aug 2015 US
Continuations (1)
Number Date Country
Parent 15253206 Aug 2016 US
Child 16888917 US