Robot surgical platform

Information

  • Patent Grant
  • 11135015
  • Patent Number
    11,135,015
  • Date Filed
    Tuesday, July 17, 2018
  • Date Issued
    Tuesday, October 5, 2021
Abstract
A surgical implant planning computer for intra-operative CT workflow, pre-operative CT imaging workflow, and fluoroscopic imaging workflow. A network interface is connectable to a CT image scanner and a robot surgical platform having a robot base coupled to a robot arm that is movable by motors. A CT image of a bone is received from the CT image scanner and displayed. A user's selection is received of a surgical screw from among a set of defined surgical screws. A graphical screw representing the selected surgical screw is displayed as an overlay on the CT image of the bone. Angular orientation and location of the displayed graphical screw relative to the bone in the CT image is controlled responsive to receipt of user inputs. An indication of the selected surgical screw and an angular orientation and a location of the displayed graphical screw are stored in a surgical plan data structure.
Description
TECHNICAL FIELD

The present disclosure relates to medical devices, and more particularly, robotic surgical systems and related methods and devices.


BACKGROUND

Various medical procedures require the precise localization of a three-dimensional position of a surgical instrument within the body of a patient in order to effect optimized treatment. For example, some surgical procedures to fuse vertebrae require that a surgeon drill multiple holes into the bone structure at specific locations. To achieve high levels of mechanical integrity in the fusing system, and to balance the forces created in the bone structure, it is necessary that the holes are drilled precisely at desired locations. Vertebrae, like most bone structures, have complex shapes made up of non-planar curved surfaces making precise and perpendicular drilling difficult. Conventionally, a surgeon manually holds and positions a drill guide tube by using a guidance system to overlay the drill tube's position onto a three dimensional image of the bone structure. This manual process is both tedious and time consuming. The success of the surgery is largely dependent upon the dexterity of the surgeon who performs it.


Robot surgical platforms are being introduced that can assist surgeons with positioning surgical tools and performing surgical procedures within a patient's body. A robot surgical platform can include a robot coupled to an end-effector element, where the robot is configured to control movement and positioning of the end-effector relative to the body. The end-effector may be a surgical tool guide tube, such as a drill guide tube, or may be the surgical tool itself.


There is a need for a robot surgical platform that provides accurate localization of a three-dimensional position of a surgical tool relative to the body in order to effect optimized treatment. Improved localization accuracy can minimize human and robotic error while allowing a fast and efficient surgical process. The ability to perform operations on a patient with a robot surgical platform and computer software can enhance the overall surgical procedure and the results achieved for the patient.


SUMMARY

Some embodiments of the present disclosure are directed to a surgical implant planning computer that can be used for intra-operative computed tomography (CT) imaging workflow. The surgical implant planning computer includes at least one network interface, a display device, at least one processor, and at least one memory. The at least one network interface is connectable to a CT image scanner and to a robot having a robot base coupled to a robot arm that is movable by motors relative to the robot base. The at least one memory stores program code that is executed by the at least one processor to perform operations that include displaying on the display device a CT image of a bone that is received from the CT image scanner through the at least one network interface and receiving a user's selection of a surgical screw from among a set of defined surgical screws. The operations further include displaying a graphical screw representing the selected surgical screw as an overlay on the CT image of the bone and controlling angular orientation and location of the displayed graphical screw relative to the bone in the CT image responsive to receipt of user inputs. An indication of the selected surgical screw and an angular orientation and a location of the displayed graphical screw are stored in a surgical plan data structure responsive to receipt of a defined user input.
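
As a rough illustration of the planning operations described above, the following Python sketch models a surgical plan data structure that records the selected screw together with the angular orientation and location of its graphical overlay. It is a minimal sketch only; every class, field, and value name here is hypothetical and is not taken from the disclosure.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class SurgicalScrew:
    """Hypothetical catalog entry for one of the defined surgical screws."""
    model: str
    diameter_mm: float
    length_mm: float

@dataclass
class PlannedScrew:
    """Hypothetical record of one planned screw: selection plus overlay pose."""
    screw: SurgicalScrew
    location_mm: Tuple[float, float, float]      # position relative to the bone in the CT image
    orientation_deg: Tuple[float, float, float]  # angular orientation of the overlay

@dataclass
class SurgicalPlan:
    """Hypothetical surgical plan data structure that accumulates planned screws."""
    planned_screws: List[PlannedScrew] = field(default_factory=list)

    def store(self, screw: SurgicalScrew, location, orientation) -> None:
        # Store the selection and pose in response to the defined user input.
        self.planned_screws.append(PlannedScrew(screw, tuple(location), tuple(orientation)))

# Example: the user selects a screw and adjusts the overlay before saving the plan.
plan = SurgicalPlan()
selected = SurgicalScrew(model="pedicle-6.5x45", diameter_mm=6.5, length_mm=45.0)
plan.store(selected, location=(12.3, -4.0, 87.5), orientation=(0.0, 15.0, 5.0))
```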


Some other embodiments of the present disclosure are directed to a surgical implant planning computer that can be used for pre-operative CT imaging workflow. The surgical implant planning computer includes at least one network interface, a display device, at least one processor, and at least one memory. The at least one network interface is connectable to an image database. The at least one memory stores program code that is executed by the at least one processor to perform operations that include loading a CT image of a bone, which is received from the image database through the at least one network interface, into the at least one memory. The operations display the CT image on the display device. The operations receive a user's selection of a surgical screw from among a set of defined surgical screws, and display a graphical screw representing the selected surgical screw as an overlay on the CT image of the bone. The operations control angular orientation and location of the displayed graphical screw relative to the bone in the CT image responsive to receipt of user inputs, and store an indication of the selected surgical screw and an angular orientation and a location of the displayed graphical screw in a surgical plan data structure responsive to user input, the surgical plan data structure being configured for use by a robot with a robot base coupled to a robot arm that is movable by motors relative to the robot base.


Some other embodiments of the present disclosure are directed to a surgical implant planning computer that can be used for fluoroscopic imaging workflow. The surgical implant planning computer includes at least one network interface, a display device, at least one processor, and at least one memory. The at least one network interface is connectable to a fluoroscopy imager, a marker tracking camera, and a robot having a robot base that is coupled to a robot arm which is movable by motors relative to the robot base. The at least one memory stores program code that is executed by the at least one processor to perform operations that include performing a registration setup mode that includes determining occurrence of a first condition indicating the marker tracking camera can observe and track reflective markers that are attached to a fluoroscopy registration fixture of a fluoroscopy imager, and determining occurrence of a second condition indicating the marker tracking camera can observe and track dynamic reference base markers attached to the robot arm and/or an end-effector connected to the robot arm. While both of the first and second conditions are determined to continue to occur, the at least one processor allows operations to be performed to obtain a first intra-operative fluoroscopic image of a patient along a first plane and to obtain a second intra-operative fluoroscopic image of the patient along a second plane that is orthogonal to the first plane.
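
The registration setup mode described above can be pictured as a gate that only permits the two orthogonal fluoroscopic shots while both marker-visibility conditions continue to hold. The sketch below is an assumption-laden illustration rather than the patented implementation; the camera and fluoroscope objects and their methods are placeholders.

```python
def registration_setup_ready(camera) -> bool:
    """Hypothetical check of the two conditions of the registration setup mode."""
    # First condition: the marker tracking camera can observe and track the
    # reflective markers attached to the fluoroscopy registration fixture.
    fixture_visible = camera.can_track("fluoroscopy_registration_fixture")
    # Second condition: the camera can observe and track the dynamic reference
    # base markers attached to the robot arm and/or end-effector.
    drb_visible = camera.can_track("dynamic_reference_base")
    return fixture_visible and drb_visible

def acquire_orthogonal_fluoro_images(camera, fluoroscope):
    """Allow the two intra-operative images only while both conditions persist."""
    images = []
    for plane in ("first_plane", "second_plane"):  # second plane orthogonal to the first
        if not registration_setup_ready(camera):
            raise RuntimeError("Marker visibility lost; image acquisition is not allowed")
        images.append(fluoroscope.capture(plane))
    return images
```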


Corresponding methods and computer program products are disclosed.


Still other surgical implant planning computers, methods, and computer program products according to embodiments of the inventive subject matter will be or become apparent to one with skill in the art upon review of the following drawings and detailed description. It is intended that all such surgical implant planning computers, methods, and computer program products be included within this description, be within the scope of the present inventive subject matter, and be protected by the accompanying claims. Moreover, it is intended that all embodiments disclosed herein can be implemented separately or combined in any way and/or combination.





DESCRIPTION OF THE DRAWINGS

The accompanying drawings, which are included to provide a further understanding of the disclosure and are incorporated in and constitute a part of this application, illustrate certain non-limiting embodiments of inventive concepts. In the drawings:



FIG. 1 illustrates a robotic system that includes a robotic base station and a camera stand.



FIG. 2 illustrates components of a robotic base station.



FIG. 3 illustrates the monitor of the robotic base station.



FIG. 4 illustrates the control panel on the rear of the robotic base station and the control panel functions.



FIG. 5 illustrates the connector panel located at the rear of the robotic base station.



FIG. 6 illustrates the 5-axis robotic arm.



FIG. 7 illustrates the lower arm.



FIG. 8 illustrates the upper part of the vertical column.



FIG. 9 illustrates the camera stand.



FIG. 10 illustrates the rear view of the camera stand showing alignment buttons.



FIG. 11 illustrates isometric and top views of the end-effector.



FIG. 12 illustrates the detent mechanism on the instrument sensing ring.



FIG. 13 illustrates a scalpel used through the guide tube.



FIG. 14 illustrates the trajectory of the outer cannula.



FIGS. 15(a)-15(f) illustrate one technique for dilating tissue with the devices. FIG. 15(a) illustrates how the outer cannula is positioned above the incision. FIG. 15(b) illustrates how the inner cannulas are placed into the guide tube such that they rest on the skin. FIG. 15(c) illustrates how the first inner cannula is inserted into the incision. FIG. 15(d) illustrates how the second inner cannula is then inserted into the incision. FIG. 15(e) illustrates how the outer cannula is then inserted into the incision. FIG. 15(f) illustrates both inner cannulas then being removed and the guide tube being lowered until it sits within the outer cannula.



FIG. 16 illustrates some embodiments of the navigated surgical instruments.



FIG. 17 illustrates the array.



FIG. 18 illustrates the verification probe.



FIG. 19 illustrates the patient attachment instruments.



FIG. 20 illustrates tightening bone clamp using clamp driver.



FIG. 21 illustrates the guide post and the quattro spike.



FIGS. 22(a)-22(d) illustrate one method for inserting a low profile quattro spike into rigid bony anatomy. FIG. 22(a) illustrates positioning a quattro spike over a guide post. FIG. 22(b) illustrates attaching an impaction cap. FIG. 22(c) illustrates inserting an assembly into a rigid anatomy. FIG. 22(d) illustrates removing a cap and guide post.



FIG. 23 illustrates attaching a rod attachment instrument, including a set screw, to an existing spinal rod.



FIG. 24 illustrates a surveillance marker.



FIG. 25 illustrates a use of a surveillance marker with a bone clamp.



FIG. 26 illustrates a dynamic reference base.



FIG. 27 illustrates an intra-op registration fixture and pivoting arm.



FIG. 28 illustrates a Fluoroscopy Registration Fixture.



FIG. 29 illustrates an end effector motion when moving from one trajectory to the next, wherein 1, 2, and 3 are automatic movements; 4 is manual and optional.



FIG. 30 illustrates a power button, line power indicator and battery indicator.



FIGS. 31(a) and 31(b) illustrate a camera stand undocking. FIG. 31(a) illustrates pulling up on the release handle located on a camera stand. FIG. 31(b) illustrates the camera stand legs automatically releasing and moving outward as the camera stand is pulled clear.



FIG. 32 illustrates the connection of a camera to a connector panel on a base station.



FIG. 33 illustrates camera positioning.



FIG. 34 illustrates pressing a laser button to align the camera.



FIG. 35 illustrates a system with a sterile drape.



FIG. 36 illustrates a foot pedal cable connection.



FIG. 37 illustrates buttons which are illuminated when stabilizers engage and stabilizers disengage.



FIG. 38 illustrates the robotic arm interface plate for connection to the end effector.



FIG. 39 illustrates opening the brackets on an end effector and placing the end effector on the interface plate by aligning the V-grooves and alignment spheres.



FIG. 40 illustrates squeezing the brackets on both sides of an end effector and pressing the handle down to lock it into place.



FIG. 41 illustrates correct and incorrect positioning of the handle when pressing it down to lock into place.



FIG. 42 illustrates a removal of the end effector.



FIG. 43 illustrates inserting an instrument shaft into an array sleeve.



FIG. 44 illustrates a surgical instrument assembly.



FIG. 45 illustrates attaching a quick connect handle on the proximal end of a shaft of the surgical instrument assembly.



FIGS. 46(a) and 46(b) illustrate attaching a reflective marker to one of a plurality of marker posts of the instrument assembly. FIG. 46(a) illustrates lowering the reflective marker onto a marker post. FIG. 46(b) illustrates a marker fully seated on the post.



FIG. 47 illustrates a login screen displayed on a monitor.



FIG. 48 illustrates a case management screen displayed on a monitor.



FIG. 49 illustrates a CONFIGURE tab used to display procedure types.



FIG. 50 illustrates a PREPLAN tab displayed on the monitor to select the implant system, desired vertebral level and orientation.



FIG. 51 illustrates a VERIFY tab displaying navigation details including visibility, location and verification status of the instruments selected on the PREPLAN tab.



FIG. 52 illustrates a pop-up screen appearing on the VERIFY tab to indicate the verification progress.



FIG. 53 illustrates verification divots located on the end effector.



FIG. 54 illustrates a green circle indicating a successful verification.



FIG. 55 illustrates a red crossed circle indicating a failed verification.



FIG. 56 illustrates securing a Dynamic Reference Base to a patient attachment instrument.



FIG. 57 illustrates using a clamp driver to secure a Dynamic Reference Base.



FIG. 58 illustrates the placement of a Dynamic Reference Base and a surveillance marker.



FIG. 59 illustrates a quattro spike.



FIG. 60 illustrates a quattro spike removal tool.



FIG. 61 illustrates removing a quattro spike with a removal tool.



FIG. 62 illustrates attaching a registration fixture to a pivoting arm.



FIG. 63 illustrates a registration fixture connecting to a patient attachment instrument.



FIG. 64 illustrates a registered fiducial.



FIG. 65 illustrates a PLAN tab allowing a user to plan all screw trajectories on a patient image.



FIG. 66 illustrates a NAVIGATE tab allowing a user to visualize a navigated instrument trajectory and a planned trajectory with respect to patient anatomy.



FIG. 67 illustrates a PLAN tab allowing a user to plan all screw trajectories on a patient image.



FIG. 68 illustrates the first screen highlighting the three steps to complete before the fluoroscopy images can be taken to register the pre-operative CT image.



FIG. 69 illustrates a Fluoroscopy Registration Fixture attached to image intensifier.



FIG. 70 illustrates a lateral image within the NAVIGATE tab.



FIG. 71 illustrates selecting the desired level.



FIG. 72 illustrates a successful registration with a check mark being shown next to the active level.



FIG. 73 illustrates how the real-time instrument/implant trajectory is displayed on the patient images along with the planned screw, allowing the user to confirm the desired trajectory.



FIG. 74 illustrates a lateral image within the NAVIGATE tab.



FIG. 75 illustrates the PLAN tab allowing the user to plan all screw trajectories on the patient image.



FIG. 76 illustrates the NAVIGATE tab allowing the user to visualize the navigated instrument trajectory and the planned trajectory with respect to patient anatomy.



FIG. 77 illustrates how the robotic computer system may be used for navigation without the robotic arm and end effector.



FIG. 78 illustrates how the robotic computer system may be used for trajectory guidance using the robotic arm without navigated instruments.



FIG. 79 illustrates a block diagram of electronic components of a robot portion of a robot surgical platform which is configured according to embodiments.



FIG. 80 illustrates a block diagram of a surgical system that includes a surgical implant planning computer which may be separate from and operationally connected to the robot or incorporated therein.



FIGS. 81-87 are flowcharts of operations that may be performed by a surgical implant planning computer which is configured according to embodiments.





DETAILED DESCRIPTION

The following discussion is presented to enable a person skilled in the art to make and use embodiments of the present disclosure. Various modifications to the illustrated embodiments will be readily apparent to those skilled in the art, and the principles herein can be applied to other embodiments and applications without departing from embodiments of the present disclosure. Thus, the embodiments are not intended to be limited to embodiments shown, but are to be accorded the widest scope consistent with the principles and features disclosed herein. The following detailed description is to be read with reference to the figures, in which like elements in different figures have like reference numerals. The figures, which are not necessarily to scale, depict selected embodiments and are not intended to limit the scope of the embodiments. Skilled artisans will recognize the examples provided herein have many useful alternatives and fall within the scope of the embodiments.


System Overview


The robotic computer system enables real-time surgical navigation using radiological patient images and guides the trajectory of specialized surgical instruments along a surgeon-specified path using a robotic arm. The system software reformats patient-specific CT images acquired before surgery, or fluoroscopic images acquired during surgery, and displays them on screen from a variety of views. Prior to operating, the surgeon may then create, store, access, and simulate trajectories. During surgery, the system guides the instruments to follow the trajectory specified by the user, and tracks the position of surgical instruments in or on the patient anatomy and continuously updates the instrument position on these images. The surgery is performed by the surgeon, using the specialized surgical instruments.


The software can also show how the actual position and path during surgery relate to the pre-surgical plan, and can help guide the surgeon along the planned trajectory. While the surgeon's judgment remains the ultimate authority, real-time positional and trajectory information obtained through the robotic computer system can serve to validate this judgment. An example robotic computer system that could be used with embodiments herein is the ExcelsiusGPS™ by Globus Medical.


Device Description


The robotic computer system is a Robotic Positioning System that includes a computer controlled robotic arm, hardware, and software that enables real time surgical navigation and robotic guidance using radiological patient images (pre-operative CT, intra-operative CT and fluoroscopy), using a dynamic reference base and positioning camera. The navigation and guidance system determines the registration or mapping between the virtual patient (points on the patient images) and the physical patient (corresponding points on the patient's anatomy). Once this registration is created, the software displays the relative position of a tracked instrument, including the end-effector of the robotic arm, on the patient images. This visualization can help guide the surgeon's planning and approach. As an aid to visualization, the surgeon can plan implant placement on the patient images prior to surgery. The information of the plan coupled with the registration provides the necessary information to provide visual assistance to the surgeon during free hand navigation or during automatic robotic alignment of the end-effector.
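
In essence, the registration described here is a rigid transform that maps points between the physical patient and the patient images, so that a tracked instrument (or the end-effector) can be drawn at the correct place on those images. The NumPy sketch below shows how such a transform might be applied to a tracked point; the 4x4 matrix and the point are illustrative values only, not data from the system.

```python
import numpy as np

def apply_registration(T_image_from_patient: np.ndarray, p_patient: np.ndarray) -> np.ndarray:
    """Map a tracked 3D point from physical patient space into image space.

    T_image_from_patient is a 4x4 homogeneous rigid transform produced by
    registration (for example, from fiducials detected in an intra-operative scan).
    """
    p_h = np.append(p_patient, 1.0)            # homogeneous coordinates
    return (T_image_from_patient @ p_h)[:3]

# Illustrative transform: a small rotation about z plus a translation (values made up).
theta = np.deg2rad(5.0)
T = np.array([
    [np.cos(theta), -np.sin(theta), 0.0, 10.0],
    [np.sin(theta),  np.cos(theta), 0.0, -2.5],
    [0.0,            0.0,           1.0, 30.0],
    [0.0,            0.0,           0.0,  1.0],
])

tracked_tip = np.array([100.0, 50.0, 20.0])    # instrument tip in patient space (mm)
print(apply_registration(T, tracked_tip))      # where to draw the tip on the patient image
```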


During surgery, the system tracks the position of GPS compatible instruments, including the end-effector of the robotic arm, in or on the patient anatomy and continuously updates the instrument position on patient images utilizing optical tracking. Standard non-navigated metallic instruments that fit through the guide tube at the selected trajectory may be used without navigation while the guide tube is stationary, for uses such as bone preparation (e.g. rongeurs, reamers etc.) or placing MIS implants (e.g. rod inserters, locking cap drivers) that are not related to screw placement. Navigation can also be performed without guidance. System software is responsible for all motion control functions, navigation functions, data storage, network connectivity, user management, case management, and safety functions. Robotic computer system surgical instruments are non-sterile, re-usable instruments that can be operated manually or with the use of the positioning system.


Robotic computer system instruments include registration instruments, patient reference instruments, surgical instruments, and end-effectors. Registration instruments incorporate arrays of reflective markers, and are used to track patient anatomy and surgical instruments and implants; components include the verification probe, surveillance marker, surgical instrument arrays, intra-op CT registration fixture, fluoroscopy registration fixture, and dynamic reference base (DRB). Patient reference instruments are either clamped or driven into any appropriate rigid anatomy that is considered safe and provides a point of rigid fixation for the DRB. Surgical instruments are used to prepare the implant site or implant the device, and include awls, drills, drivers, taps, and probes. End-effectors can be wirelessly powered guide tubes that attach to the distal end of the robotic arm and provide a rigid structure for insertion of surgical instruments.


Indications for Use


The robotic computer system is intended for use as an aid for precisely locating anatomical structures and for the spatial positioning and orientation of instrument holders or tool guides to be used by surgeons for navigating or guiding standard surgical instruments in open or percutaneous procedures. The system is indicated for any medical condition in which the use of stereotactic surgery may be appropriate, and where reference to a rigid anatomical structure, such as the skull, a long bone, or vertebra can be identified relative to a CT-based model, fluoroscopy images, or digitized landmarks of the anatomy.


Contraindications


Medical conditions which contraindicate the use of the robotic computer system and its associated applications include any medical conditions which may contraindicate the medical procedure itself.


Navigation Integrity


The robotic computer system has built-in precautions to support navigation integrity but additional steps should be taken to verify the accuracy of the system during navigation. Specific steps include:


Ensure the stabilizers have been engaged prior to using the robotic arm.


Do not move the dynamic reference base after successful registration.


Use a surveillance marker with every procedure to further confirm the accuracy of the images in relation to real-time patient anatomy.


If a surveillance marker alerts movement of patient relative to the dynamic reference base, perform a landmark check. If a landmark check fails, re-register the patient.


Use a verified navigation instrument to perform an anatomical landmark check prior to a procedure. If a landmark check fails, re-register the patient.


Compliance with Standards


This product conforms to the requirements of council directive 93/42/EEC concerning medical devices when it bears the CE Mark of Conformity.


This product conforms to the requirements of the standards listed below when it bears the NRTL Certification Compliance Mark.


Electric and electromagnetic testing have been performed in accordance with the following applicable standards: ANSI/AAMI ES60601-1, CSA C22.2 #60601-1, CISPR 11, IEC 60601-1 (including all national deviations), IEC 60601-1-2, IEC 60601-1-6, IEC 60601-1-9, IEC 60601-2-49 (only portions of this standard are used to demonstrate compliance and proper operation of the robotic computer system when used with high frequency surgical equipment such as a cauterizer), IEC 60825-1, IEC 62304, IEC 62366.


HF Surgical Equipment


Based on the robotic computer system floating applied part (type BF) and the safety testing performed, the system is compatible with the use of HF surgical equipment with no restrictions on the conditions of use.


EMC Compliance


In accordance with IEC 60601-1-2:2014 Edition 3 and 4, Medical Electrical Equipment needs special precautions regarding Electro Magnetic Compatibility (EMC) and needs to be installed and put into service according to the EMC information provided in the tables below. Portable and mobile RF communications equipment can adversely affect electrical medical equipment. The tables supply details about the level of compliance and provide information about potential interactions between devices. EMC Compliance tables from 3rd Edition are shown on the next page with values adjusted for 4th Edition where appropriate.


The robotic computer system has an optional 802.11 g/b/n wireless router and tablet option. When installed, this transmits RF power at 2.4 GHz (2.412-2.484 GHz) using DSSS or OFDM with DQPSK or QAM modulation. Maximum RF transmit power is 100 mW.












Recommended separation distances

Separation distance according to frequency of transmitter (m):

Rated maximum output power of transmitter (W) | 150 kHz to 80 MHz (d = 1.2√P) | 80 MHz to 800 MHz (d = 1.2√P) | 800 MHz to 2.5 GHz (d = 2.3√P)
0.01 | 0.3* | 0.3* | 0.3*
0.1 | 0.37 | 0.37 | 0.74
1 | 1.17 | 1.17 | 2.33
10 | 3.69 | 3.69 | 7.38
100 | 11.67 | 11.67 | 23.33

*30 cm is the minimum recommended separation distance even though the calculation would yield a shorter distance.



For transmitters rated at a maximum output power not listed above, the recommended separation distance in meters (m) can be estimated using the equation applicable to the frequency of the transmitter, where P is the maximum output power rating of the transmitter in watts (W) according to the transmitter manufacturer.



NOTE 1: At 80 MHz and 800 MHz, the separation distance for the higher frequency range applies.


NOTE 2: These guidelines may not apply in all situations. Electromagnetic propagation is affected by absorption and reflection from structures, objects and people.
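
As a quick worked example of the table's equations (d = 1.2√P below 800 MHz and d = 2.3√P from 800 MHz to 2.5 GHz, with the 30 cm floor noted above), the short Python sketch below reproduces a few of the tabulated distances. The tabulated values appear to use slightly less rounded factors (about 1.17 and 2.33), so small differences from this sketch are expected.

```python
import math

def separation_distance_m(power_w: float, freq_mhz: float) -> float:
    """Estimate the recommended separation distance in meters, applying the 30 cm minimum."""
    k = 2.3 if freq_mhz >= 800 else 1.2
    return max(0.3, k * math.sqrt(power_w))

for power_w in (0.01, 0.1, 1, 10, 100):
    below_800 = round(separation_distance_m(power_w, 400), 2)
    above_800 = round(separation_distance_m(power_w, 2400), 2)
    print(power_w, below_800, above_800)   # e.g. 1 W -> 1.2 m and 2.3 m
```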







Cybersecurity


The robotic computer system adheres to industry best practices and FDA guidance on cybersecurity in medical devices. This includes firewall protection and additional protection against virus, malware, data corruption, and unauthorized system access.


System Overview


The robotic computer system consists of four main components: Robotic Base Station (shown below), Camera Stand (shown below), Instruments, and System Software. FIG. 1 illustrates a robotic system that includes a robotic base station and a camera stand.


Robotic Base Station


The Robotic Base Station is the main control center for the robotic computer system and includes the components shown below. FIG. 2 illustrates Components of the Robotic Base station. The robotic base station includes a vertical column 206 that supports an upper arm 200 connected to a lower arm 202, with a bracelet and end effector 204 connected to the lower arm 202. An information ring 220 on the vertical column 206 is illuminated to provide information as described below. A monitor 218 is connected to the vertical column 206. The robotic base station also includes a tablet compartment 216, a control panel 208, a connector panel 210, stabilizers 212, and rolling casters 214.


Monitor


The monitor allows the surgeon to plan the surgery and visualize anatomical structures, instruments, and implants in real time. It is a high resolution, flat panel touch screen liquid crystal display (LCD) located on the vertical column. The monitor can be adjusted to the desired location with two hands. An external mouse is available for optional use with the monitor. The mouse is not intended for use within the sterile field. FIG. 3 illustrates the monitor of the robotic base station.


Tablet


An optional wireless tablet is available for use as a second touchscreen monitor for operative planning and software control. The main monitor remains active at all times during use. The user can lockout tablet use if desired. The tablet compartment is used to store the tablet. The tablet is not intended for use within the sterile field.


Control Panel


The control panel is located at the rear of the Robotic Base Station. This panel is used to display and control system power and general positioning functions. FIG. 4 illustrates the control panel on the rear of the Robotic Base Station and the control panel functions. The control panel includes: emergency stop button 400, stabilizers disengage button 402, a left position button 404, a straight position button 406, a right position button 408, a vertical column up button 410, a vertical column down button 412, a dock position button 414, a stabilizers engage button 416, a battery status indicator 418, a power button 420, and a line power indicator 422.












Control panel functions

Button | Function | To Use
Emergency Stop | Removes power from motors and applies brake | Press down to activate. To deactivate and re-power, twist knob counterclockwise.
Line Power Indicator | Illuminates when system is plugged into AC power outlet | Press to turn ON/OFF
Power Button | Powers the Robotic Base Station ON/OFF. Illuminated when ON. | Press to turn ON/OFF
Battery Indicator | Indicates level and state of charge. All bars are illuminated when fully charged. When operating on battery, the number of illuminated bars indicates percent of charge. Bars progressively illuminate when charging.
Stabilizers Disengage | Illuminates when system is free to move | Press to disengage the stabilizers to allow movement of the system
Stabilizers Engage | Illuminates when system is secured to floor | Press to engage the stabilizers, to lock the system in place
Left Position | Moves upper arm forward and lower arm at a 90° angle to the left | Press and hold button. Operator may release button prior to final position and arm will stop in current position.
Right Position | Moves upper arm forward and lower arm at a 90° angle to the right | Press and hold button. Operator may release button prior to final position and arm will stop in current position.
Straight Position | Moves upper and lower arm forward | Press and hold button. Operator may release button prior to final position and arm will stop in current position.
Dock Position | Moves upper and lower arm to rest over the cabinet | Press and hold button. Operator may release button prior to final position and arm will stop in current position.
Vertical Column Up | Moves vertical column up | Press and hold button. Operator should release button once the desired height is reached.
Vertical Column Down | Moves vertical column down | Press and hold button. Operator should release button once the desired height is reached.

Connector Panel


The connector panel is located at the rear of the Robotic Base Station. This panel contains external connection ports for various devices. FIG. 5 illustrates the connector panel located at the rear of the Robotic Base Station. The connector panel includes: an equipotential terminal 562, a foot pedal connector 563, a camera connector port 564, an HDMI connector 565, an ethernet connector 566, and dual USB 3.0 ports 567.












Connector panel functions

Item | Function
Equipotential Terminal | Used to connect to other auxiliary equipment; used by service personnel
Foot Pedal Connector | Connects to the foot pedal cable
Camera Connector | Connects to the camera stand cable
HDMI Connector | Connects to an external monitor
Ethernet Connector | Connects to a network or intra-operative imaging system for image transfer
USB Port 3.0 | Connects to a USB device for image transfer; connects to C-Arm via video capture supplied with the Fluoroscopy Registration Fixture

Casters and Stabilizers


The system consists of four casters with integrated stabilizers. The stabilizers are used to immobilize the system to ensure that it does not move during use.


Upper Arm, Lower Arm, and Vertical Column


The robotic arm, which consists of an upper and lower arm, is attached to the vertical column of the robotic computer system Robotic Base Station. This configuration allows for a wide range of motion.


The robotic computer system employs a state of the art drive control system along with high performance servo drives to accurately position and control the 5-axis robotic arm in an operating room environment. FIG. 6 illustrates the 5-axis robotic arm. The 5 axes of motion are identified below.
















Axis | Travel Distance
Vertical 670 | ≥480 mm
Shoulder 672 | −150° to 180°
Elbow 674 | −150° to 150°
Roll 676 | −135° to 135°
Pitch 678 | −70° to 70°
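
A motion controller would normally reject or clamp commands that fall outside these travel ranges. The sketch below is a hypothetical range check built from the table above, not the system's actual control code; the zero lower bound for the vertical axis is an assumption, since the table only states the total travel.

```python
# Travel limits taken from the table above (vertical axis in mm, other axes in degrees).
AXIS_LIMITS = {
    "vertical_mm": (0.0, 480.0),      # assumed lower bound; the table specifies >= 480 mm of travel
    "shoulder_deg": (-150.0, 180.0),
    "elbow_deg": (-150.0, 150.0),
    "roll_deg": (-135.0, 135.0),
    "pitch_deg": (-70.0, 70.0),
}

def within_limits(axis: str, value: float) -> bool:
    """Return True if the commanded value lies inside the axis travel range."""
    low, high = AXIS_LIMITS[axis]
    return low <= value <= high

assert within_limits("elbow_deg", 120.0)
assert not within_limits("pitch_deg", 85.0)
```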











Bracelet


The bracelet is located at the distal end of the lower arm. It is a load sensing component that allows user guided positioning of the robotic arm.


To initiate motion, squeeze the bracelet ring with the thumb and forefinger on opposite sides. While squeezed, apply light force toward the desired direction of motion. The robotic arm will move in the desired direction. The arm moves manually in any direction or along a trajectory if a screw plan is active. FIG. 7 illustrates the lower arm which includes a bracelet 700 and a bracelet ring 722.


Information Ring


The information ring is located on the upper part of the vertical column. The information ring indicates the status of the robotic computer system. The information ring light blinks while the system is booting up; a solid green light is displayed when the system is ready. Individual colors are used to indicate status, as shown in the table below. FIG. 8 illustrates the upper part of the vertical column, which includes an information ring 800 that is illuminated to provide information indications to a user.












Information ring color indications

Color | Description
Red | System is in an error state. Stop all tasks and resolve the issue immediately as it is either a safety issue or a serious problem with the system.
Yellow | System is in a state in which user intervention is required before a planned trajectory can be activated.
Green | System is ready.
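
The color table above amounts to a small state-to-indicator mapping. A trivial sketch is shown below; the enum and function names are hypothetical.

```python
from enum import Enum

class RingStatus(Enum):
    ERROR = "red"            # stop all tasks and resolve the issue immediately
    INTERVENTION = "yellow"  # user intervention required before a planned trajectory can be activated
    READY = "green"          # system is ready

def ring_color(status: RingStatus) -> str:
    """Return the information ring color for a given system status."""
    return status.value

print(ring_color(RingStatus.READY))  # "green"
```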











Camera Stand


The camera stand is mobile and adjusts in order to position the camera to view the operating field and optical markers. FIG. 9 illustrates the camera stand. The camera stand includes: a camera 904; a camera laser alignment light 906; a positioning handle 908; a support arm 910; a height adjustment handle 912; a locking handle 914; a docking handle 916; a release handle 918; a cable holder 920; legs 922; and casters 924. FIG. 10 illustrates the rear view of the camera stand showing alignment buttons. The camera stand further includes a handle tilt button 1020 and a laser button 1022.












Camera stand functions

Item | Function
Camera | Used to detect the reflective markers and is attached to the top of the camera stand. For more information, please refer to the NDI Passive Polaris Spectra User Guide.
Positioning Handle | Used to adjust the camera position to ensure the surgical field is in view.
Handle Tilt Button | Used to adjust the angle of the positioning handle with respect to the camera in the field of view.
Laser Button | Turns the camera laser alignment light on and off. The laser light is used for assistance in aligning the camera in the field of view.
Arm | Provides a large range of positions for the camera.
Height Adjustment Handle | Allows for adjustment of camera height.
Locking Handle | Used to lock camera position.
Docking Handle | Used to collapse the legs for docking the camera stand into the Robotic Base Station.
Release Handle | Releases the camera from the Robotic Base Station.
Casters | The camera stand contains four casters. The rear casters are lockable to prevent the camera stand from moving.
Legs | The camera stand legs swing inward for docking and outward when deployed.
Cable Holder | Provides storage for the camera stand cable.

Cabling


The following cable characteristics are required for connecting to external devices:


HDMI—Connecting to an external HDMI Monitor requires a shielded HDMI-Male to HDMI-Male cable.


Network—Connecting to a Hospital network can be done with an unshielded CAT-5e Ethernet cable.


Electronic Components of Surgical Robot



FIG. 79 illustrates a block diagram of electronic components of a robot 500 portion of a robot surgical platform which is configured according to embodiments. The robot 500 can include platform subsystem 502, computer subsystem 520, motion control subsystem 540, and tracking subsystem 530. Platform subsystem 502 can include battery 506, power distribution module 504, platform network interface 512, and tablet charging station 510. Computer subsystem 520 can include computer 522, display 524, and speaker 526. Motion control subsystem 540 can include driver circuit 542, motors 550, 551, 552, 553, 554, stabilizers 555, 556, 557, 558, end-effector 544, and controller 546 (e.g., one or more processors and associated circuitry). Tracking subsystem 530 can include position sensor 532 and camera converter 534 which is connectable to a marker tracking camera 570, e.g., via the platform network interface 512. Robot 500 can include a foot pedal 580 and tablet computer 590.


Input power is supplied to robot 500 via a power source 560 which may be provided to power distribution module 504. Power distribution module 504 receives input power and is configured to generate different power supply voltages that are provided to other modules, components, and subsystems of robot 500. Power distribution module 504 may be configured to provide different voltage supplies to platform network interface 512 and to other components such as computer 520, display 524, speaker 526, driver 542 (to power, for example, motors 550, 551, 552, 553, 554 and end-effector 544), ring 514, camera converter 534, and other components of robot 500, for example, fans for cooling the various electrical components.


Power distribution module 504 may also provide power to other components such as tablet charging station 510 that may be located within a tablet drawer. Tablet charging station 510 may be configured to communicate through a wired and/or wireless interface with tablet 590. Tablet 590 may be used to display images and other information for use by surgeons and other users consistent with various embodiments disclosed herein.


Power distribution module 504 may also be connected to battery 506, which serves as a temporary power source in the event that power distribution module 504 does not receive power from input power 560. At other times, power distribution module 504 may serve to charge battery 506 when needed.


Other components of platform subsystem 502 can include connector panel 508, control panel 516, and ring 514. Connector panel 508 may serve to connect different devices and components to robot 500 and/or associated components and modules. Connector panel 508 may contain one or more ports that receive lines or connections from different components. For example, connector panel 508 may have a ground terminal port that may ground robot 500 to other equipment, a port to connect foot pedal 580 to robot 500, and/or a port to connect to tracking subsystem 530. The tracking subsystem 530 can include a position sensor 532, camera converter 534, and the marker tracking camera 570 which may be supported by a camera stand. Connector panel 508 can include other ports to allow USB, Ethernet, and HDMI communications to other components, such as computer 520.


Control panel 516 may provide various buttons or indicators that control operation of robot 500 and/or provide information regarding robot 500. For example, control panel 516 may include buttons to power on or off robot 500, lift or lower stabilizers 555-558 that may be designed to engage casters to lock robot 500 from physically moving and/or to raise and lower the robot base and/or a vertical support for the robot arm. Other buttons may control robot 500 to stop movement of a robot arm in the event of an emergency, which may remove all motor power and apply mechanical and/or electromechanical brakes to stop all motion from occurring. Control panel 516 may also have indicators notifying the user of certain system conditions such as a line power indicator or status of charge for battery 506.


Ring 514 may be a visual indicator to notify the user of robot 500 of different modes that robot 500 is operating under and certain warnings to the user.


Computer 522 of the computer subsystem 520 includes at least one processor circuit (also referred to as a processor for brevity) and at least one memory circuit (also referred to as a memory for brevity) containing computer readable program code. The processor may include one or more data processing circuits, such as a general purpose and/or special purpose processor, e.g., microprocessor and/or digital signal processor. The processor is configured to execute the computer readable program code in the memory circuit to perform operations, which may include some or all of the operations described herein as being performed by a surgical robot and may further perform some or all of the operations described herein as being performed by a surgical implant planning computer.


The program code includes an operating system and software to operate robot 500. Computer 522 may receive and process information from other components (for example, tracking subsystem 530, platform subsystem 502, and/or motion control subsystem 540) in order to display information to the user. Further, computer subsystem 520 may include speaker 526 to provide audio notifications from the computer 522 to the user.


Tracking subsystem 530 can include position sensor 532 and camera converter 534. The position sensor 532 may include the marker tracking camera 570. Tracking subsystem 530 may track the location of markers that are located on the different components of robot 500 and/or instruments used by a user during a surgical procedure. This tracking may be conducted in a manner consistent with the present disclosure which can include the use of infrared technology that illuminates and enables tracking by the camera 570 of the location of active or passive elements, such as LEDs or reflective markers, respectively. The location, orientation, and position of structures having these types of markers may be provided to computer 522 which may be shown to a user on display 524 and/or tablet 590. For example, a surgical instrument or other tool having these types of markers and tracked in this manner (which may be referred to as a navigational space) may be shown to a user in relation to a three dimensional image of a patient's anatomical structure, such as a CT image scan, fluoroscopic image, and/or other medical image.


The robot 500 can include a robot base that is coupled to a robot arm which is movable by the motors, e.g., one or more of motors 550-554, relative to the robot base. The robot arm can include an upper arm connected to a vertical support and a lower arm that is rotatably coupled to an end of the upper arm and extends to couple to the end-effector 544. Motion control subsystem 540 may be configured to physically move a vertical column of the robot 500, e.g., raise and lower the robot arm and/or the robot base in a vertical direction, move an upper arm of the robot 500, move a lower arm of the robot 500, and/or rotate the end-effector 544. The physical movement may be conducted through the use of one or more motors 550-554. For example, motor 550 may be configured to vertically lift or lower the robot base and/or the robot arm in a vertical direction. Motor 551 may be configured to laterally move an upper arm around a point of engagement. Motor 552 may be configured to laterally move a lower arm around a point of engagement with the upper arm. Motors 553 and 554 may be configured to move the end-effector 544 in a manner that controls the roll and/or tilt, thereby providing multiple angles at which end-effector 544 may be moved. These movements may be performed by controller 546 responsive to commands from the computer 522, and may also be controlled through load cells disposed on the end-effector 544 that are activated by a user engaging these load cells to move the end-effector 544 in a desired manner.


The robot 500 may augment manual input by a user, e.g., when a user applies force to one or more load cells on the end-effector 544, and/or provide automatic movement of the robot arm. The robot 500 may also augment manual movement by a user and/or provide automatic movement of a vertical column of the robot base. For automatic movement, the computer 522 may respond to receiving input from a user, such as by indicating on display 524 (which may be a touchscreen input device) the location of a surgical instrument or component on a three dimensional medical image of the patient's anatomy on display 524. The computer 522 can control one or more of the motors 550-554 to perform automatic movement of the robot arm along a trajectory that has been computed to move the end effector 544 based on location of the user's input relative to the medical image. The user may initiate automatic movement by stepping on foot pedal 580 and/or by manipulation of another user interface.
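
One way to picture the automatic mode just described is a loop that only advances the arm along the computed trajectory while the foot pedal (or another user interface) is held. The sketch below is illustrative only; the robot, pedal, and trajectory objects and their methods are assumed placeholders, not the platform's API.

```python
def run_automatic_move(robot, pedal, trajectory_points, tolerance_mm=0.5):
    """Step the end-effector along a precomputed trajectory while the pedal is pressed.

    trajectory_points: ordered 3D targets computed from the user's selection on the
    medical image (all names here are hypothetical).
    """
    for target in trajectory_points:
        while robot.distance_to(target) > tolerance_mm:
            if not pedal.is_pressed():
                robot.stop()           # motion is only permitted while the pedal is held
                return False           # movement paused; resume when the pedal is pressed again
            robot.step_towards(target)
    return True
```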


Instruments


End Effector


The end-effector is the interface between the robotic arm and the system specific surgical instruments. It allows for a rigid connection through the sterile drape to provide precise positioning of instruments placed within its guide tube. The end-effector is provided as a separate component and is sterilized by the user prior to use. FIG. 11 illustrates isometric and top views of the end-effector, including a guide tube 1122.


The end-effector is powered wirelessly from the robotic arm. This power is used to drive the active markers that are used by the camera to identify the location and orientation of the end-effector. The blue indicator LED illuminates when the end-effector is powered.


Two end-effectors are available to interface with various surgical instruments. They differ only in the diameter of the guide tube; the active markers have the same geometries. The end-effectors are etched with the guide tube diameter and are color-coded to help ensure that the corresponding size instruments are used.


The 15 mm end-effector is used with all navigated instruments except REVOLVE® instruments, and the 17 mm end-effector is used with REVOLVE® instruments. Non-navigated Globus instruments may be used with either end-effector; they are not sized to the guide tube, but must fit within its inner diameter.


Instrument Sensing Ring


Located within the guide tube of the end-effector is an instrument sensing ring. A detector circuit embedded within the sensing ring detects when a metal instrument is inserted through the guide tube, disables the active markers, and prevents movement of the robotic arm. The visible LED on the end-effector does not illuminate when a metallic instrument is inserted, indicating that an instrument is detected and the active IR emitters are disabled. Disabling the IR emitters prevents the robotic arm from moving. Non-metallic instruments are not identified by the sensing ring and may not be used in the guide tube.
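
The sensing-ring behavior reads as a simple interlock: metal detected in the guide tube means the active IR markers are disabled, the indicator LED goes dark, and arm motion is blocked. A hypothetical sketch of that logic follows; the object names are placeholders, not the device's firmware.

```python
class InstrumentSensingInterlock:
    """Hypothetical interlock mirroring the described sensing-ring behavior."""

    def __init__(self, ir_emitters, arm, indicator_led):
        self.ir_emitters = ir_emitters
        self.arm = arm
        self.indicator_led = indicator_led

    def on_detector_update(self, metal_instrument_present: bool) -> None:
        if metal_instrument_present:
            self.ir_emitters.disable()   # camera can no longer track the end-effector markers
            self.arm.inhibit_motion()    # disabling the emitters prevents robotic arm movement
            self.indicator_led.off()     # LED off indicates an instrument is detected
        else:
            self.ir_emitters.enable()
            self.arm.allow_motion()
            self.indicator_led.on()
```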


Detent Mechanism


Size 15 mm end-effectors have a detent mechanism on the inside of the tube which interfaces with grooves on the array sleeves to resist array rotation. This aids in holding the tracking array oriented toward the camera while the operator rotates the instrument. FIG. 12 illustrates the detent mechanism 120 on the instrument sensing ring.


Scalpel


A specialized scalpel can be used to create a skin mark at the planned trajectory. Attach a standard scalpel blade to the handle.


Position the guide tube on the end-effector to the planned trajectory. Adjust the end-effector up or down along the trajectory to allow the scalpel to be viewed. Ensure that scalpel tip can be viewed before making the skin mark.


Note: The scalpel has a metal core within the radiolucent PEEK material and is detected while in the guide tube. FIG. 13 illustrates a scalpel used through the guide tube.


Cannulas


Cannulas, or dilators, can be used for performing minimally invasive or other techniques that require sequential tissue dilation. The cannulas should only be used under trajectory guidance. Note: The terms “cannula” and “dilator” are used interchangeably.


Prior to performing sequential tissue dilation, a scalpel may be used through the guide tube to create a skin mark at the desired trajectory. Move the guide tube away from the trajectory using the bracelet, and create an incision with a scalpel. Refer to the Scalpel section of this manual for instructions.


Once the guide tube is at the desired trajectory, position the outer cannula under the guide tube and above the incision, along the same trajectory. Insert the two inner cannulas into the guide tube and through the outer cannula so that they rest on the skin. To sequentially dilate the tissue, slowly insert the first (smallest) cannula into the incision using a cannula pusher. Then advance the second cannula in the same manner. Complete tissue dilation by slowly advancing the outer cannula over the inner cannulas. Remove the inner cannulas. Lower the guide tube until it sits just within the outer cannula. Perform surgery through the guide tube and outer cannula. FIG. 14 illustrates the trajectory of the outer cannula. Referring to FIG. 14, a first inner cannula 1400 is slid into a second inner cannula 1402 along trajectory 1404 into the outer cannula 1406 which is placed within the incision 1408. FIGS. 15(a)-15(g) illustrate one technique for dilating tissue with the devices. FIG. 15(a) illustrates how the outer cannula is positioned above the incision. FIG. 15(b) illustrates how the inner cannulas are placed into the guide tube such that they rest on the skin. FIG. 15(c) illustrates how the first inner cannula is inserted into the incision. FIG. 15(d) illustrates how the second inner cannula is then inserted into the incision. FIG. 15(e) illustrates how the outer cannula is then inserted into the incision. FIG. 15(f) illustrates both inner cannulas then being removed. FIG. 15(g) illustrates lowering the guide tube until it sits within the outer cannula.


Navigated Instruments


The navigated surgical instruments for use with the robotic computer system include drills, awls, probes, taps, and drivers, which may be used to insert Globus screws. These instruments can be used with arrays if navigation is desired, or without arrays if navigation is not used. Each instrument and corresponding array must be assembled prior to use. Instruments are identified by a unique array pattern that is recognized by the camera.


Navigated instruments are available for each Globus implant system. Refer to the specific system instrument brochures for more information. FIG. 16 illustrates some embodiments of the navigated instruments. The instruments include an awl 1600, a probe 1602, a drill 1604, a tap 1606, and a driver 1608.


Arrays


Arrays have 4 posts for attaching reflective markers and are available for use with the surgical instruments. The navigated surgical instruments are assembled to a corresponding instrument array, designed with a unique marker pattern which identifies the instrument type. The array is etched with the specific instrument type, e.g. “AWL”, “PROBE”, “DRILL”, “TAP”, “DRIVER”. Each instrument array has a verification divot, used for instrument verification.


The verification probe has a built-in array with posts for the reflective markers and is used to verify each instrument before use.


Arrays used with instruments for the standard 15 mm end-effector are identified by a black sleeve. Arrays used with instruments for the 17 mm end-effector are identified by a tan sleeve. FIG. 17 illustrates the array 1700 with a release button 1702, a handgrip 1704, a marker post 1706, an array sleeve 1708, and array support 1710. FIG. 17 also illustrates a verification divot 1712 between the array 1700 and the handgrip 1704. FIG. 18 illustrates the verification probe.


Patient Attachment Instruments


Patient attachment instruments are secured to the patient's rigid anatomy, depending on the specific surgical procedure or preference, and are available in various configurations. These instruments may be secured to a variety of anatomical sites. The rod attachment instrument is designed to attach to an existing spinal rod.


Patient attachment instruments must be safely and rigidly secured to the patient to achieve navigation and guidance accuracy. Verify secure attachment by applying a light force to the distal end of the attachment instrument in all directions. If secure attachment is not maintained during the procedure, the surveillance marker will demonstrate excessive movement; if this occurs, reposition the patient attachment instrument and re-register the patient to the patient images.


Refer to the specific procedure in the Application section for recommended anatomical locations. FIG. 19 illustrates the patient attachment instruments, which include a bone clamp 1900 with surveillance marker, a quattro spike 1902, a low profile quattro spike 1904, and a rod attachment 1906.


Bone Clamps


Bone clamps are clamped onto anatomical structures such as the spinous process, iliac crest, long bone, or any rigid bony structure that can be safely clamped.


The bone clamp is placed onto rigid bony anatomy. The clamp driver is used to tighten the bone clamp. To remove, loosen the bone clamp with the clamp driver, attach the removal tool and lift up the bone clamp. FIG. 20 illustrates tightening the bone clamp using the clamp driver.


Quattro Spikes


Quattro spikes are inserted into rigid bone of the iliac crest or long bone. The quattro spike is inserted into rigid bony anatomy and gently impacted with a mallet.


The low profile quattro spike is inserted using a guide post and impaction cap. Find the desired anatomy using the guide post. Place the patient attachment instrument over the guide post. Attach the impaction cap (for low profile quattro spike). Gently impact the assembly with a mallet to insert into bony anatomy. Remove the impaction cap and guide post from the spike. FIG. 21 illustrates the guide post 2100 and the quattro spike 2102. FIGS. 22(a)-22(d) illustrate one method for inserting the quattro spike into rigid bony anatomy. FIG. 22(a) illustrates positioning the quattro spike over the guide post. FIG. 22(b) illustrates attaching the impaction cap. FIG. 22(c) illustrates inserting the assembly into a rigid anatomy. FIG. 22(d) illustrates removing the cap and guide post.


Rod Attachment Instrument


The rod attachment instrument is designed to attach to an existing spinal rod (4.5 mm to 6.35 mm diameter). Position the instrument on the existing spinal rod and tighten the set screw with a driver. Ensure a rigid connection. To remove, loosen the set screw and disengage from the rod. FIG. 23 illustrates the rod attachment instrument 2300, including a set screw 2302, attached to the existing spinal rod.


Surveillance Marker



FIG. 24 illustrates a surveillance marker. The surveillance marker is a single reflective marker used to monitor a shift in the Dynamic Reference Base (DRB). Surveillance markers may be used alone or in conjunction with a bone clamp.


Surveillance markers are directly inserted into the iliac crest or long bone, or may be attached to the spinous process using a bone clamp. FIG. 25 illustrates the use of a surveillance marker with a bone clamp. To use a bone clamp with the marker, attach a disposable surveillance marker 240 onto the tip of the bone clamp. Use the clamp driver to secure the bone clamp. Verify that the bone clamp is rigidly secured.


Registration Instruments


The Dynamic Reference Base (DRB) and patient attachment instruments are used in the patient registration process.


The DRB is an array with 4 posts for reflective markers and allows the camera to track the location of the patient. The DRB may be attached to any of the patient attachment instruments, using the knob and compression clamp. FIG. 26 illustrates the dynamic reference base, which includes marker posts 2600 connected to a compression clamp 2602 operated by a DRB knob 2604.


Registration Fixtures


Intra-Op CT Registration Fixture


The intra-op CT registration fixture, consisting of a registration fixture and pivoting arm, allows for any intra-operative CT image to be used with the robotic computer system software application. The pivoting arm and registration fixture are assembled prior to use by matching the starburst gears and snapping the two components together.


The intra-op registration fixture is placed onto a patient attachment instrument by clamping the compression clamp onto the shaft of the attachment instrument, allowing the fixture to hover over the surgical site. The fiducials are detected automatically in the intra-operative scan and are used to register the patient's anatomy during the scan to the DRB, which is tracked by the camera throughout the procedure. The reflective markers are detected by the camera. Once the registration is transferred to the DRB, the intra-op registration fixture is removed to provide access to the surgical site. FIG. 27 illustrates the intra-op registration fixture 2712 and pivoting arm 2708. FIG. 27 further illustrates the compression clamp 2602, the DRB knob 2604, a starburst connection 2406, a gear tooth joint 2710, and a set of seven fiducials 2714.


Fluoroscopy Registration Fixture



FIG. 28 illustrates the Fluoroscopy Registration Fixture. The Fluoroscopy Registration Fixture allows for any intra-operative fluoroscopic image to be used with the robotic computer system software application. The fluoroscopy fixture is attached to the image intensifier of the fluoroscope using the integrated clamps. The fluoroscope and Fluoroscopy Registration Fixture are draped and the reflective markers are placed on the fixture, outside of the drape. The fixture should be positioned such that the reflective markers are seen by the camera in all intended fluoroscope positions (AP, lateral, etc.).


Robotic Arm Motion


The robotic computer system robotic arm positions the end-effector to guide instruments for screw insertion at the desired trajectory. The surgeon manually performs surgery while the instruments are aligned in the desired trajectory for accurate screw placement. Note: The terms “screw plan”, “screw trajectory” and “trajectory” are used interchangeably in this manual.


Motion of the robotic arm is only allowed with continuous pressing of the bracelet or foot pedal. The arm is manually moved by the user in Wrist mode, or is automatically moved to the selected trajectory in Trajectory mode.


In Wrist mode, the arm may be moved manually to any position within reach of the arm.


In Trajectory mode, the arm is automatically moved from the current position to the next screw plan when ready, or may be moved manually along a selected trajectory.


When moving from one screw plan to the next, the arm moves outwards along the current trajectory to a safe distance (200 mm) from the surgical site, then moves to the new trajectory, and then moves downwards along the new trajectory to the anatomy.
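

This retract, re-orient, and approach sequence can be expressed as a short list of Cartesian waypoints. The following is a minimal illustrative sketch in Python; only the 200 mm safe distance comes from this description, while the function name, the vector convention, and the safe-start offset are assumptions.

```python
# Hypothetical sketch of the retract / re-orient / approach sequence described above.
# Names and the safe-start offset are illustrative; only the 200 mm retract distance
# comes from the text.

SAFE_DISTANCE_MM = 200.0  # arm retracts this far from the surgical site (per the text)

def _add(a, b):
    return tuple(ai + bi for ai, bi in zip(a, b))

def _scale(a, s):
    return tuple(ai * s for ai in a)

def waypoints_to_new_trajectory(current_tip, current_axis, new_entry, new_axis,
                                start_offset_mm=50.0):
    """Return the ordered Cartesian waypoints for moving between screw plans.

    current_tip     -- (x, y, z) of the guide tube tip on the current trajectory (mm)
    current_axis    -- unit vector pointing away from the patient along the current trajectory
    new_entry       -- planned entry point of the next screw trajectory (mm)
    new_axis        -- unit vector pointing away from the patient along the new trajectory
    start_offset_mm -- hypothetical standoff for the "safe starting position" (not from the text)
    """
    # 1. Move outwards along the current trajectory to the safe distance.
    retracted = _add(current_tip, _scale(current_axis, SAFE_DISTANCE_MM))
    # 2. Move laterally to a point on the new trajectory, still at the safe distance.
    above_new = _add(new_entry, _scale(new_axis, SAFE_DISTANCE_MM))
    # 3. Move downwards along the new trajectory to a safe starting position near the anatomy.
    safe_start = _add(new_entry, _scale(new_axis, start_offset_mm))
    return [retracted, above_new, safe_start]

if __name__ == "__main__":
    for waypoint in waypoints_to_new_trajectory((10.0, 0.0, 0.0), (0.0, 0.0, 1.0),
                                                (40.0, 5.0, 0.0), (0.0, 0.0, 1.0)):
        print(waypoint)
```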












Robotic arm motion modes

Mode             Software          User Action          Automatic Motion             Manual Motion
Wrist Mode       No Plan Selected  Press Foot Pedal     n/a                          User may move arm in
                                   or Squeeze Bracelet                               the desired direction
Trajectory Mode  Plan Selected     Press Foot Pedal     Arm moves automatically      After reaching the
                                   or Squeeze Bracelet  to new screw trajectory      trajectory, user may
                                                                                     move arm along
                                                                                     trajectory only


Automatic motion of the arm occurs when moving the guide tube from the current position (either initially or at a current trajectory) to a new screw plan. Once the end-effector and attached guide tube have moved to a new screw plan, the guide tube is locked onto the trajectory and can be moved up and down along the trajectory. FIG. 29 illustrates the end effector motion when moving from one trajectory to the next, wherein 1, 2, and 3 are automatic movements; 4 is manual and optional. The illustrated movements include movement up along path 2902 from a starting position 2900 to clear the screw and patient, movement along a new trajectory path 2904, movement downward to a safe starting position along path 2906, and an optional movement along a trajectory path 2908 that may involve manual movement.


Automatic motion of the robotic arm may be stopped by the user, stopped by the system, or prevented.


To stop motion at any time, press the Emergency Stop button located on the base station.


Motion is stopped if the end-effector detects a force greater than 50 N (11 lbs).


Motion is also stopped in Trajectory mode when the DRB or the end-effector is not in view of the camera.


Motion is prevented when the sensing ring in the guide tube detects a metallic instrument.


When a trajectory is selected, motion of the arm with guide tube is only allowed along the trajectory.
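

Taken together, the stop and prevention conditions above (summarized again in the table that follows) amount to a set of independent interlocks, any one of which halts or blocks arm motion. Below is a minimal illustrative sketch in Python with hypothetical state and function names; only the 50 N force limit and the Trajectory-mode visibility requirements come from this description.

```python
# Illustrative sketch of the stop/prevent interlocks described above.
# The 50 N force limit and the Trajectory-mode visibility requirements come from the
# text; the data structure and function names are assumptions.

from dataclasses import dataclass
from typing import Tuple

FORCE_LIMIT_N = 50.0  # end-effector force above which motion is stopped (approx. 11 lbs)

@dataclass
class SystemState:
    emergency_stop_pressed: bool
    end_effector_force_n: float
    drb_visible: bool
    end_effector_visible: bool
    metallic_instrument_in_guide_tube: bool
    trajectory_mode: bool

def motion_allowed(state: SystemState) -> Tuple[bool, str]:
    """Return (allowed, reason); any single violated interlock stops or prevents motion."""
    if state.emergency_stop_pressed:
        return False, "Emergency Stop button pressed"
    if state.end_effector_force_n > FORCE_LIMIT_N:
        return False, "End effector detects force greater than 50 N"
    if state.trajectory_mode and not state.drb_visible:
        return False, "Dynamic Reference Base not in view of camera"
    if state.trajectory_mode and not state.end_effector_visible:
        return False, "End effector not in view of camera"
    if state.metallic_instrument_in_guide_tube:
        return False, "Sensing ring detects a metallic instrument in the guide tube"
    return True, "ok"

if __name__ == "__main__":
    state = SystemState(False, 12.0, True, True, False, True)
    print(motion_allowed(state))
```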












Stopping or preventing robotic arm motion

Method
Emergency Stop button pressed
End effector detects force on arm greater than 50 N (11 lbs)
Dynamic Reference Base not in view of camera (Trajectory mode only)
End effector not in view of camera (Trajectory mode only)
Sensing ring detects a metallic instrument in the guide tube


If the robot arm is not able to reach a safe starting location due to its current position, an error message is shown. The message states "The arm cannot move back any further along the current end-effector trajectory. Acknowledging this message enables the arm to move to the selected plan trajectory from its current position". The user may choose to move forward with the planned trajectory because the shorter starting position is acceptable. If the shorter starting position is not acceptable, a new trajectory must be used or the base must be repositioned.


To select a new trajectory, the user clears the selected trajectory and positions the robotic arm using the bracelet to a clear position. The bracelet provides flexibility for the user to move the arm around an obstacle.


To reposition the base, the stabilizers on the casters are disengaged, the station is moved to the desired location and the stabilizers are reengaged. Registration is unaffected because the patient reference (attachment instruments and DRB) has not moved with respect to the patient.


System Software


The system software is responsible for all motion control functions, navigation functions, data storage, network connectivity, user management, case management, and safety functions.


The top navigation bar takes the user through individual screens for each step of the procedure.


The respective tab for each step is highlighted when selected and the corresponding screen displayed. The activities performed under each tab are shown in the table below.












System software tabs

Tab        Meaning
Configure  Surgeon, imaging workflow, and anatomy selection
Preplan    Implant system selection and desired anatomical location identification
Verify     Navigated instrument verification
Image      Loading of patient images used for planning and navigation
Plan       Estimation of desired implant location with respect to patient images
Navigate   Screw plan with real-time display of navigated instrument and implant (actual plan) with respect to patient images




System Setup


Power Up



FIG. 30 illustrates the power button 3000, line power indicator 3002 and battery indicator 3004. Press the Power Button 3000 on the control panel to turn the system on. The Power Button 3000 is illuminated when the system is on.


Undocking and Positioning Camera Stand


To release the camera stand from the Robotic Base Station, unwrap the cord holding the monitor arm and the camera arm together, and pull up on the release handle located on the camera stand. Once the legs of the camera stand have cleared the base station, they will automatically release and move outward. FIGS. 31(a) and 31(b) illustrate the camera stand undocking. FIG. 31(a) illustrates pulling up on the release handle located on the camera stand. FIG. 31(b) illustrates the camera stand legs automatically releasing and moving outward once they have cleared the base station.


Unwrap the camera cord from the cord holder and plug into the connector panel on the base station.


Move the camera to the operating room (O.R.) table and engage the wheel brakes by stepping on the lever located on the wheel.


Align the camera to view the surgical field.



FIG. 32 illustrates the connection of the camera to the connector panel on the base station. FIG. 33 illustrates the camera positioning.


Press and hold the laser button located on the positioning handle of the camera to activate the camera's alignment laser and adjust the position so the laser points to the center of the surgical field. FIG. 34 illustrates pressing the laser button 3400 to activate a laser which facilitates user alignment of the camera.


Draping


A special surgical drape is designed for the robotic computer system Robotic Base Station. Drape the robotic arm, monitor and front of the base station, by following the instructions detailed in the package insert provided with the sterile drape. FIG. 35 illustrates the system with a sterile drape.


Positioning the Robotic Base Station


Unwrap the foot pedal from the foot pedal basket and position it on level ground at a comfortable distance from the operator's feet. The foot pedal is IPX68 rated and is acceptable for use in areas where liquids are likely to be found. Plug the foot pedal cord into the connector panel. The foot pedal allows the arm to move to the active trajectory, similar to the action of the bracelet on the lower arm.


Position the Robotic Base Station next to the patient at a comfortable distance from the surgeon. Move the robotic arm, using the bracelet, around the planned trajectories to ensure the arm can reach all locations before engaging the stabilizers. FIG. 36 illustrates the foot pedal cable connection.


Press the Stabilizers Engage button on the control panel to lower the stabilizers on the casters. The button is illuminated when the stabilizers are engaged. FIG. 37 illustrates the buttons that are illuminated when the stabilizers engage (e.g., responsive to pressing the stabilizers engage button 3700) and disengage (e.g., responsive to pressing the stabilizers disengage button 3702).


Attaching End Effector to Robotic Arm


The end effector connects to the robotic arm through the interface plate over the custom drape. A magnetic assist helps to position and self-align the end effector.


The end effector is equipped with a drape-friendly clamp that allows it to be removed and reattached up to 3 times during a procedure without damaging the drape. FIG. 38 illustrates the robotic arm interface plate for connection to the end effector.



FIG. 39 illustrates opening the brackets on the end effector and placing the end effector on the interface plate by aligning the V grooves and alignment spheres.



FIG. 40 illustrates squeezing the brackets on both sides of the end effector and pressing the handle down to lock it into place.



FIG. 41 illustrates the correct and incorrect positioning of the handle when pressing it down to lock the end effector into place.


Removing the End Effector


To remove the end-effector from the robotic arm, pull up on the handle to release the spring and side brackets. FIG. 42 illustrates the removal of the end effector.


Surgical Instrument Assembly


To assemble the surgical instruments for navigation, press the release button on the array sleeve and insert the instrument shaft into the sleeve of the respective instrument array. Slide the shaft through the sleeve until it clicks into place. Gently pull up on the instrument shaft to confirm it is locked. FIG. 43 illustrates inserting the instrument shaft into the array sleeve, and further illustrates a release button 4300 which releases the array.


Attach a quick connect handle on the proximal end of the shaft when needed. To remove the instrument from the array, push the release button located on the middle of the array. FIG. 44 illustrates the surgical instrument assembly. FIG. 45 illustrates attaching the quick connect handle on the proximal end of the shaft of the surgical instrument assembly.


Attach the disposable reflective markers to each of the marker posts of each instrument assembly. Ensure that the markers are fully seated on the posts. FIGS. 46(a) and 46(b) illustrate attaching a reflective marker to one of a plurality of marker posts of the instrument assembly. FIG. 46(a) illustrates lowering the reflective marker onto a marker post. FIG. 46(b) illustrates the marker fully seated on the post.


Login


To log in, type the four-digit PIN on the touch screen of the monitor. The four-digit PIN is provided during system installation and can be changed by contacting Tech Support. FIG. 47 illustrates the login screen displayed on the monitor.


A case encompasses all of the data associated with performing a procedure, including surgeon preferences, medical images, and plans.


After logging in, the SELECT CASE page is displayed on the monitor.


To select an existing case, select the corresponding row from the case list. To start a new case, click the new case icon. Click the right arrows to advance to the next tab. FIG. 48 illustrates the case management screen displayed on the monitor.


Applications


Spine surgical procedures are supported by the robotic computer system. FIG. 49 illustrates the CONFIGURE tab used to display procedure types.


Spine Procedures


Spinal surgical applications supported by the robotic computer system are listed below.












Supported spine procedures

Procedures               Patient Position
Posterior Cervical       Prone
Posterior Thoracic       Prone
Anterolateral Thoracic   Lateral
Posterior Lumbar         Prone
Lateral Lumbar           Lateral


Globus spinal implant systems that are compatible with the robotic computer system are listed below.












Compatible spinal implant systems

CREO® Stabilization System
REVERE® Stabilization System
REVOLVE® Stabilization System
ELLIPSE® Occipito-Cervico-Thoracic Spinal System
QUARTEX® Occipito-Cervico-Thoracic Spinal System
SI-LOK® Sacroiliac Joint Fusion System


Procedure Setup


Configure Tab


After selecting a case, the CONFIGURE tab is displayed on the monitor.


Using the CONFIGURE tab, select the surgeon, the imaging modality and the procedure type. Click the right arrows to advance to the next tab.


Preplan Tab


Using the PREPLAN tab, select the implant system, desired vertebral level and orientation, and click the desired implant location on the anatomical model. Click the right arrows to advance to the next tab. FIG. 50 illustrates the PREPLAN tab displayed on the monitor to select the implant system, desired vertebral level and orientation.


Verify Tab



FIG. 51 illustrates the VERIFY tab displaying navigation details including visibility, location and verification status of the instruments selected on the PREPLAN tab. Verification is used to ensure all instruments are accurate and have not been damaged during handling and sterilization. The operator must assemble all instruments prior to verification (see Surgical Instrument Assembly).


The VERIFY tab shows CAMERA VIEW and INSTRUMENT STATUS.


CAMERA VIEW is a real-time view from the perspective of the camera with color circles indicating instrument location. A solid colored circle indicates that the instrument is visible by the camera, while a hollow circle indicates that it is not visible. The colored circle grows larger as the instrument is moved closer to the physical camera and smaller as it moves away from the camera. The ideal distance from the camera is approximately 2 meters or 6 feet.
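

One plausible way to realize this indicator behavior is to scale the circle radius inversely with the instrument's distance from the camera and to switch between solid and hollow rendering based on visibility. A hedged sketch follows; the roughly 2 m ideal distance and the solid/hollow convention come from the text, while the base radius and scaling rule are assumptions.

```python
# Hypothetical rendering rule for the CAMERA VIEW indicator circles described above.
# The solid-vs-hollow convention and the ~2 m ideal distance are from the text; the
# radius scaling is an illustrative assumption.

IDEAL_DISTANCE_M = 2.0   # instruments track best around 2 m (about 6 ft) from the camera
BASE_RADIUS_PX = 20.0    # on-screen radius at the ideal distance (assumed)

def instrument_circle(distance_m, visible):
    """Return (radius_px, filled) for one instrument's indicator circle.

    Closer instruments draw larger circles, farther instruments draw smaller ones;
    an instrument the camera cannot see is drawn as a hollow circle.
    """
    if not visible or distance_m <= 0:
        return BASE_RADIUS_PX, False           # hollow circle: not visible
    radius = BASE_RADIUS_PX * (IDEAL_DISTANCE_M / distance_m)
    return radius, True                        # solid circle: visible

if __name__ == "__main__":
    print(instrument_circle(1.0, True))    # closer -> larger
    print(instrument_circle(3.0, True))    # farther -> smaller
    print(instrument_circle(2.0, False))   # not visible -> hollow
```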


INSTRUMENT STATUS lists each instrument and its verification status, with corresponding color circles to identify each instrument. The verification status is symbolized by a checkmark if verification is successful and an X-mark if the verification failed. When no icon appears, the instrument is not verified.


Instrument Verification


Verify each instrument as follows: place the tip of the instrument to be verified into one of the verification divots located on the end-effector and, for convenience, on any other instrument array; ensure both instruments are visible and held steady; and monitor the pop-up screen appearing on the VERIFY tab, which indicates the verification progress. FIG. 52 illustrates the pop-up screen appearing on the VERIFY tab to indicate the verification progress. FIG. 53 illustrates the verification divot 1712, which is located between the hand grip 1704 and the array 1700.


Once verification is complete, verification status is indicated on the screen with the tip error displayed in mm. If verification has failed (red crossed circle), verification must be repeated until it is successful (green circle).


When all instruments are successfully verified, advance to the next tab. FIG. 54 illustrates the green circle indicating a successful verification. FIG. 55 illustrates the red crossed circle indicating a failed verification.


Patient Attachment Instruments


Patient attachment instruments are secured to rigid bony anatomy neighboring the surgical site. Select the desired instrument. Patient attachment instruments should be placed no more than 185 mm from the center of the surgical site to maintain accuracy.
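

The 185 mm placement guideline can be checked with a simple distance test. A minimal sketch follows, assuming coordinates are expressed in millimeters in a common patient frame; only the 185 mm limit comes from the text, and the function name is hypothetical.

```python
# Simple check for the 185 mm placement guideline stated above. The limit comes from
# the text; coordinates are assumed to be expressed in millimetres in a common frame.

import math

MAX_OFFSET_MM = 185.0

def placement_within_limit(attachment_xyz, surgical_site_center_xyz):
    """Return True if the patient attachment instrument is within 185 mm of the site center."""
    distance_mm = math.dist(attachment_xyz, surgical_site_center_xyz)
    return distance_mm <= MAX_OFFSET_MM

if __name__ == "__main__":
    print(placement_within_limit((120.0, 40.0, 0.0), (0.0, 0.0, 0.0)))  # ~126 mm -> True
    print(placement_within_limit((200.0, 0.0, 0.0), (0.0, 0.0, 0.0)))   # 200 mm  -> False
```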


Bone clamps are clamped onto anatomical structures such as the spinous process, iliac crest, long bone, or any rigid bony structure that can be safely clamped.


Quattro spikes are inserted into the iliac crest or a long bone.


Rod attachments are secured to an existing spinal rod, 4.5 mm to 6.35 mm in diameter.


Refer to the table below for recommended anatomic locations for the various patient attachment instruments.












Patient attachment instruments - recommended anatomic locations

Spine Procedures         Patient Position   Patient Attachment Instrument   Recommended Patient Attachment Instrument Location
Posterior Cervical       Prone              Bone Clamp                      Spinous Process C2-T3
                                            Rod Attachment                  Existing Rod
Posterior Thoracic       Prone              Bone Clamp                      Spinous Process T1-L1
                                            Rod Attachment                  Existing Rod
Anterolateral Thoracic   Lateral            Bone Clamp                      Spinous Process T1-L1
Posterior Lumbar         Prone              Quattro Spike                   Iliac Crest
                                            Low Profile Quattro Spike       Iliac Crest
                                            Bone Clamp                      Spinous Process T12-L5
                                            Rod Attachment                  Existing Rod
Lateral Lumbar           Lateral            Quattro Spike                   Iliac Crest
                                            Low Profile Quattro Spike       Iliac Crest
                                            Bone Clamp                      Spinous Process T12-L5
                                            Rod Attachment                  Existing Rod


Dynamic Reference Base Insertion


Position the compression clamp on the Dynamic Reference Base (DRB) over the patient attachment instrument and tighten the knob. If needed, the clamp driver can be used to further tighten the knob.


Position the reflective markers on the DRB in the direction of the camera. Care should be taken with initial placement of the patient reference instrument so as not to interfere with the surgical procedure.


Following navigation, the patient attachment instrument is removed. FIG. 56 illustrates securing a Dynamic Reference Base to a patient attachment instrument. FIG. 57 illustrates using a clamp driver to tighten the Dynamic Reference Base.


Surveillance Marker


The surveillance marker is inserted into rigid bony anatomy to track the relative distance to the DRB, to identify unwanted shifts in the DRB during the procedure.


Surveillance markers are inserted into the iliac crest or long bone, or may be attached to the spinous process using a bone clamp. Verify that the clamp is rigidly secured. The surveillance marker should be placed no more than 185 mm from the Dynamic Reference Base. Refer to the table below for recommended anatomic locations.












Surveillance marker - recommended anatomic locations

Spine Procedures         Patient Position   Patient Attachment Instrument   Recommended Patient Attachment Instrument Location
Posterior Cervical       Prone              Bone Clamp                      Spinous Process C2-T3
Posterior Thoracic       Prone              Single                          Iliac Crest
                                            Bone Clamp                      Spinous Process T1-L1
Anterolateral Thoracic   Lateral            Bone Clamp                      Spinous Process T1-L1
Posterior Lumbar         Prone              Single                          Iliac Crest
                                            Bone Clamp                      Spinous Process T12-L5
Lateral Lumbar           Lateral            Single                          Iliac Crest
                                            Bone Clamp                      Spinous Process T12-L5


Attach a disposable reflective marker to the marker post of the surveillance marker. Attach the impaction cap, designed to fit over the reflective marker sphere, onto the surveillance marker. Insert the surveillance marker into rigid bony anatomy near the surgical site, and gently impact with a mallet. Remove the impaction cap. Remove the reflective marker prior to using the removal tool. FIG. 58 illustrates the placement of the Dynamic Reference Base (DRB) 5800 and the surveillance marker 5804. The DRB 5800 includes reflective markers 5802.


To use a bone clamp with the marker, attach a disposable marker onto the tip of the bone clamp. Use the clamp driver to secure the bone clamp. Verify that the clamp is rigidly secured.


Removal


The quattro spikes and surveillance marker are removed from bony anatomy manually or using the removal tool. The bone clamp is removed by loosening the clamp with the clamp driver, attaching the removal tool and lifting up the bone clamp. FIG. 59 illustrates a quattro spike. FIG. 60 illustrates a quattro spike removal tool. FIG. 61 illustrates removing a quattro spike with a removal tool.


Intra-Operative CT Imaging Workflow


Image Tab


Intra-Op CT Registration Fixture Setup



FIG. 62 illustrates attaching a registration fixture 6200 to a pivoting arm 6202. Place the pivoting arm starburst 6206 over the starburst 6206 on the registration fixture 6200 and rotate 90° to secure. Referring to the enlarged view 6208 of the pivoting arm 6202 positioned over the starburst 6206, push the lock post 6204 from the bottom and rotate the arm 90° until the pin in the lock post 6204 is seated to secure the fixture. Enlarged view 6210 shows the pivoting arm 6202 attached and rotated to become secured to the registration fixture 6200.



FIG. 63 illustrates a registration fixture connecting to a patient attachment instrument. Position the fixture on the patient attachment instrument post and tighten the compression clamp knob. If needed, the clamp driver can be used to further tighten the knob.


To release the pivoting arm, push the lock post on the fixture, rotate the pivoting arm 90° and pull up.


The Intra-op CT Registration Fixture has six degrees of freedom and can be moved by adjusting one of the three joints so that it is stable and hovering over the surgical site. Only the metal fiducials embedded in the fixture need to be in the 3D scan (not the reflective markers). It is important that the Intra-op CT Registration Fixture does not move between the image acquisition and performing an anatomical landmark check.


Loading the Image


The IMAGE tab shows the steps needed to load a CT scan image. The image can be loaded from a USB drive or hard drive. If the image is transferred via the Ethernet, it automatically appears on the hard drive when the transfer is complete.


To view images on a USB drive, insert the USB drive into the USB port on the connector panel. To load an image, select the hard drive or USB drive icon and select the desired patient image. Click the right arrows to load the patient images and advance to the next tab.


Manual Registration


Automatic registration is performed when loading images. FIG. 64 illustrates a registered fiducial. If this step fails, the manual registration screen will be shown to allow manual registration as described below.


The image on the left panel of the registration screen is a full scan with a depiction of the intra-op CT.


The registration fixture and the seven fiducials should be visible below the image. Fiducials that are not registered need to be adjusted by the operator. On the screen, select a fiducial that is not registered; that image will appear on the right. Move the blue circle on the screen until it surrounds the white fiducial marker. The three small boxes at the bottom of the right panel show the x, y and z direction of the fiducial and all must be adjusted until the blue circle is centered. Ensure that all seven fiducials are properly identified by viewing the 3D model of the intra-op registration fixture. A fiducial may be deleted by selecting the delete icon on the right panel. Click the right arrows to confirm that the fiducials have been properly identified before proceeding to the next step.
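

The description does not state how the identified fiducials are turned into a registration. One common approach to this kind of paired-point problem is a rigid Kabsch/SVD fit between the fiducial positions found in the CT volume and the same fiducials as measured by the tracking camera; the sketch below illustrates only that generic technique, with hypothetical function names, and is not the patented method.

```python
# Generic paired-point rigid registration (Kabsch / SVD), offered only as an
# illustration of how matched fiducials can yield a CT-to-camera transform.

import numpy as np

def rigid_registration(ct_points, camera_points):
    """Return (R, t) such that camera_points ~= R @ ct_points + t.

    ct_points, camera_points -- (N, 3) arrays of matched fiducial coordinates.
    """
    ct = np.asarray(ct_points, float)
    cam = np.asarray(camera_points, float)
    ct_c, cam_c = ct - ct.mean(axis=0), cam - cam.mean(axis=0)
    U, _, Vt = np.linalg.svd(ct_c.T @ cam_c)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cam.mean(axis=0) - R @ ct.mean(axis=0)
    return R, t

def fiducial_registration_error(ct_points, camera_points, R, t):
    """RMS residual over the fiducials, a common sanity check before the landmark check."""
    ct = np.asarray(ct_points, float)
    cam = np.asarray(camera_points, float)
    mapped = (R @ ct.T).T + t
    return float(np.sqrt(np.mean(np.sum((mapped - cam) ** 2, axis=1))))

if __name__ == "__main__":
    ct = [[0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10], [10, 10, 0], [10, 0, 10], [0, 10, 10]]
    cam = [[p[0] + 5.0, p[1] - 2.0, p[2] + 1.0] for p in ct]   # pure translation example
    R, t = rigid_registration(ct, cam)
    print(t, fiducial_registration_error(ct, cam, R, t))
```

For the pure-translation example above, the fit returns an identity rotation, the translation vector, and a residual near zero.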


Landmark Check


After registration has been completed, a landmark check should be performed to ensure that the registration was calculated successfully. Using the verification probe, touch an anatomical landmark or a fiducial on the registration fixture and verify that the corresponding location is shown on the system monitor. Repeat this process using 2-3 landmarks.


Removing Registration Fixture


Carefully remove the Intra-op CT Registration Fixture. Ensure the patient attachment instrument does not move.


Intra-Operative CT Imaging Workflow


Plan Tab



FIG. 65 illustrates the PLAN tab allowing the user to plan all screw trajectories on the patient image. Screws are preloaded on the right hand side of the screen, based on selections made in the PREPLAN tab.


To add a screw onto the planning page, drag and drop the appropriate screw label on the image at the desired slice.


The active screw plan is shown in green. Details of the active screw plan are shown on the lower right of the screen, including screw family, diameter, and length. Click on the right arrows to advance to the next tab once plans are complete for all screws.












Adjusting screw trajectory

Screw Body    Press and move along screen to translate the screw along the current
              plane of the anatomy
Screw Head    Press and move to change the angle of the trajectory, pivoting along
              the tip of the screw
Screw Tip     Press and move to change the angle of the trajectory, pivoting along
              the head of the screw
Scroll Bar    The scroll bar is the dial control located above the head of the screw.
              Press the scroll bar and move to rotate the anatomy 360° about the screw.


Adjusting screw size

Screw Tip       Press and move longitudinally to automatically adjust the length of
                the screw to available screw sizes
Screw Diameter  Press the screw diameter button located on the right hand side of the
                screen to select other options available with the selected implant set
Screw Length    Press the screw length button located on the right hand side of the
                screen to select other options available with the selected implant set




Intra-Operative CT Imaging Workflow Planning Operations



FIG. 80 illustrates a block diagram of a surgical system 600 that includes a surgical implant planning computer 610 which may be separate from and operationally connected to the robot 500 or at least partially incorporated therein. Alternatively, at least a portion of operations disclosed herein for the surgical implant planning computer 610 may be performed by components of the robot 500 such as by the computer subsystem 520.


Referring to FIG. 80, the surgical implant planning computer 610 includes a display 612, at least one processor circuit 614 (also referred to as a processor for brevity), at least one memory circuit 616 (also referred to as a memory for brevity) containing computer readable program code 618, and at least one network interface 620 (also referred to as a network interface for brevity). The network interface 620 can be configured to connect to a CT image scanner 630, a fluoroscopy image scanner 640, an image database 650 of medical images, components of the surgical robot 500, the marker tracking camera 570, and/or other electronic equipment.


When the surgical implant planning computer 610 is at least partially integrated within the surgical robot 500, the display 612 may correspond to the display 524 and/or the tablet 590, the network interface 620 may correspond to the platform network interface 512, and the processor 614 may correspond to the computer 522.


The processor 614 may include one or more data processing circuits, such as a general purpose and/or special purpose processor, e.g., microprocessor and/or digital signal processor. The processor 614 is configured to execute the computer readable program code 618 in the memory 616 to perform operations, which may include some or all of the operations described herein as being performed by a surgical implant planning computer. FIGS. 81 through 87 illustrate various operations that can be performed by the processor 614 in accordance with some embodiments of the present disclosure.


Referring to FIGS. 80 and 81, the processor 614 displays 700 on the display device a CT image of a bone that is received from the CT image scanner 630 through the network interface 620. The processor 614 receives 702 a user's selection of a surgical screw from among a set of defined surgical screws, such as by a user touch selecting user-selectable indicia shown through a touch sensitive screen overlay on the display 612. The processor 614 displays 704 a graphical screw representing the selected surgical screw as an overlay on the CT image of the bone.


The processor 614 controls 706 angular orientation and location of the displayed graphical screw relative to the bone in the CT image responsive to receipt of user inputs, which may be provided by the user touch selecting and/or touch dragging a finger on the display 612 and/or via another user interface, such as a touchpad, joystick, dials, etc. The processor 614 stores 708 an indication of the selected surgical screw and an angular orientation and a location of the displayed graphical screw in a surgical plan data structure, e.g., within memory 616, responsive to receipt of a defined user input, such as the user selecting a displayed indicia or providing a keyboard input. As will be described in further detail below, the processor 614 may control 710 the robot 500 based on the surgical plan data structure to move the robot arm relative to a patient.
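

The fields of the surgical plan data structure are not enumerated in this description; a minimal sketch, assuming it records at least the selected screw and the stored angular orientation and location (plus illustrative screw metadata), might look like the following. All field and class names are hypothetical.

```python
# Minimal sketch of a surgical plan data structure holding what the text says is stored:
# the selected screw and the angular orientation and location of the graphical screw
# relative to the bone in the CT image. Field names and extra metadata are assumptions.

from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class PlannedScrew:
    screw_family: str                             # e.g. implant system name
    diameter_mm: float
    length_mm: float
    entry_point_mm: Tuple[float, float, float]    # location in CT image coordinates
    orientation_deg: Tuple[float, float, float]   # angular orientation relative to the bone

@dataclass
class SurgicalPlan:
    case_id: str
    screws: List[PlannedScrew] = field(default_factory=list)

    def add_screw(self, screw: PlannedScrew) -> None:
        """Called when the user confirms a plan (the 'store' step 708)."""
        self.screws.append(screw)

if __name__ == "__main__":
    plan = SurgicalPlan(case_id="demo-case")
    plan.add_screw(PlannedScrew("pedicle", 5.5, 45.0, (12.0, -3.5, 88.0), (10.0, 5.0, 0.0)))
    print(plan)
```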


The angular orientation and the location that is stored 708 in the surgical plan data structure may be configured to indicate the angular orientation and the location of the displayed graphical screw relative to an angular orientation and a location of the bone in the CT image. The operations to display 704 the graphical screw representing the selected surgical screw as an overlay on the CT image of the bone, can include determining a trajectory along an axis of the graphical screw, and displaying a trajectory line that extends from adjacent to a tip of the graphical screw and along the trajectory to facilitate a user visually orienting and positioning the graphical screw relative to a desired insertion location on the bone.


The operations to control 706 angular orientation and location of the displayed graphical screw relative to the bone in the CT image responsive to receipt of user inputs, can include translating a location of the displayed graphical screw responsive to determining that the user has pressed on a touch-sensitive screen of the display device 612 over a screw body of the graphical screw while moving location of the user's continued pressing along the touch-sensitive screen. The operations can further include angularly pivoting the displayed graphical screw responsive to determining that the user has pressed on the touch-sensitive screen over a screw head and/or tip of the graphical screw while moving location of the user's continued pressing along the touch-sensitive screen.
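

A hedged sketch of this press-and-drag behavior, reduced to 2D image coordinates for brevity: dragging near the head or tip pivots the screw about the opposite end, while dragging on the body translates it. The hit-test radius, data layout, and function name are assumptions.

```python
# Illustrative handler for the touch behaviour described above: dragging the screw body
# translates the screw, while dragging the head or tip pivots it. Geometry is kept to
# 2D image coordinates for brevity; thresholds and names are assumptions.

import math

HIT_RADIUS_PX = 15.0  # assumed touch tolerance around the head and tip

def _near(p, q, r=HIT_RADIUS_PX):
    return math.dist(p, q) <= r

def apply_drag(screw, press_xy, drag_dx_dy):
    """Mutate a screw dict {'head': (x, y), 'tip': (x, y)} according to the drag gesture."""
    dx, dy = drag_dx_dy
    if _near(press_xy, screw["head"]):
        # Pivot about the tip: only the head follows the finger.
        screw["head"] = (screw["head"][0] + dx, screw["head"][1] + dy)
    elif _near(press_xy, screw["tip"]):
        # Pivot about the head: only the tip follows the finger.
        screw["tip"] = (screw["tip"][0] + dx, screw["tip"][1] + dy)
    else:
        # Pressing on the body translates the whole screw in the current image plane.
        screw["head"] = (screw["head"][0] + dx, screw["head"][1] + dy)
        screw["tip"] = (screw["tip"][0] + dx, screw["tip"][1] + dy)
    return screw

if __name__ == "__main__":
    s = {"head": (100.0, 100.0), "tip": (100.0, 160.0)}
    print(apply_drag(s, (100.0, 130.0), (10.0, 0.0)))  # body drag -> translation
```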


Alternatively or additionally, the operations to control 706 angular orientation and location of the displayed graphical screw relative to the bone in the CT image responsive to receipt of user inputs, can include selecting a length of the displayed graphical screw from among a set of defined lengths for surgical screws responsive to determining that the user has pressed on a touch-sensitive screen of the display device over a screw tip or a screw head of the graphical screw while moving location of the user's continued pressing along the touch-sensitive screen a measured distance. The selected length of the surgical screw is then stored 708 in the surgical plan data structure.
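

Snapping a dragged length to the nearest size in the defined set can be sketched as follows; the example length catalogue and scaling factor are illustrative, not an actual implant set.

```python
# Sketch of snapping a dragged screw length to the nearest available size, as described
# above. The example length options are illustrative, not an actual implant catalogue.

AVAILABLE_LENGTHS_MM = [30.0, 35.0, 40.0, 45.0, 50.0, 55.0]  # hypothetical set

def snap_length(current_length_mm, drag_distance_mm, mm_per_screen_mm=1.0):
    """Return the catalogue length closest to the dragged length."""
    requested = current_length_mm + drag_distance_mm * mm_per_screen_mm
    return min(AVAILABLE_LENGTHS_MM, key=lambda size: abs(size - requested))

if __name__ == "__main__":
    print(snap_length(40.0, +7.0))   # -> 45.0
    print(snap_length(40.0, -12.0))  # -> 30.0
```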


The operations to control 706 orientation and location of the displayed graphical screw relative to the bone in the CT image responsive to receipt of user inputs, can include modifying a size and/or a rotational angle of the displayed graphical screw on the CT image responsive to tracking motion of a user's hand relative to an input device, such as by tracking motion of the user's finger on a touch sensitive screen overlay on the display 612, on a touchpad, etc.


Intra-Operative CT Imaging Workflow


Navigate Tab



FIG. 66 illustrates the NAVIGATE tab allowing the user to visualize the navigated instrument trajectory and the planned trajectory with respect to patient anatomy.


The robotic arm precisely aligns the end-effector to the planned trajectory. Select the desired screw label on the right of the screen. The screw plan is active when the screw label is highlighted and the robotic arm can be moved by the bracelet or pressing the foot pedal. The robotic arm first moves up in order to clear obstacles in the surgical field and then down along the trajectory. Once on the trajectory, the robotic arm can move up/down along the trajectory but does not move off of the trajectory unless the screw plan is deselected.


The real-time instrument/implant trajectory is displayed on the patient images along with the planned screw, allowing the user to confirm the desired trajectory. If the real-time trajectory is not acceptable, the user can return to the PLAN tab to select another trajectory. If the real-time trajectory is acceptable, the user inserts the screw according to the instrument's current trajectory to the desired depth.


GPS instruments are displayed as they are advanced through the end-effector. While navigating the instruments, periodically observe the monitor and surgical site to ensure consistency between tactile and navigation feedback. Non-navigated metallic Globus instruments may be used through the guide tube while it is stationary for surgical applications unrelated to screw placement.


Monitor the surveillance marker during the procedure. If the surveillance marker indicates significant movement of the DRB, perform an anatomical landmark check. If the landmark check is satisfactory, re-register the surveillance marker. If the landmark check fails, re-register the patient.


There are multiple navigation tab icons. Referring to FIG. 66, the force gauge 661 indicates the force exerted on the end-effector. The image of the instrument at the bottom of the force gauge shows the active instrument in the end-effector or the end-effector image if no instrument is inserted. The surveillance marker error gauge 662 indicates the distance that the patient reference has moved in relation to the surveillance marker. The full range of the scale is 2 mm. The DRB icon 663 indicates dynamic reference base visibility. If the DRB is visible by the camera, the background is green. If the DRB is not visible by the camera, the background is red.
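

A hedged sketch of these three indicators follows; the 2 mm full-scale range of the surveillance marker error gauge and the green/red DRB convention come from the text, while the force-gauge display range and function names are assumptions.

```python
# Sketch of the three NAVIGATE-tab indicators described above. The 2 mm full-scale
# range of the surveillance marker gauge and the green/red DRB icon convention come
# from the text; everything else is illustrative.

SURVEILLANCE_FULL_SCALE_MM = 2.0

def force_gauge_fraction(force_n, max_display_n=50.0):
    """Fraction of the force gauge to fill (max_display_n is an assumed display range)."""
    return max(0.0, min(force_n / max_display_n, 1.0))

def surveillance_gauge_fraction(drift_mm):
    """Fraction of the surveillance error gauge to fill; full scale is 2 mm."""
    return max(0.0, min(drift_mm / SURVEILLANCE_FULL_SCALE_MM, 1.0))

def drb_icon_color(drb_visible):
    """Green background when the DRB is visible to the camera, red otherwise."""
    return "green" if drb_visible else "red"

if __name__ == "__main__":
    print(force_gauge_fraction(10.0), surveillance_gauge_fraction(0.5), drb_icon_color(True))
```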


Intra-Operative CT Imaging Workflow Navigation Operations


As explained above, the surgical implant planning computer 610 can control 710 operations of the surgical robot 500. Referring to the operational embodiment of FIG. 82, the processor 614 of the surgical implant planning computer 610 can control 710 the robot 500 by providing 800 the surgical plan data structure to the robot 500 to control movement of the robot arm relative to the robot base.


Referring to the alternative or additional operations of FIG. 83, the processor 614 of the surgical implant planning computer 610 can control 710 the robot 500 by controlling 900 selected ones of the motors 550-554, either directly or indirectly via the computer 522 and/or controller 546, responsive to content of the surgical plan data structure to regulate movement of the robot arm while positioning an end-effector 544, which is connected to the robot arm, relative to a patient. The processor 614 can also control 902 angular orientation and location of the displayed graphical screw on the display 612 responsive to the movement of the robot arm while the end-effector 544 is positioned relative to the patient.


In a further embodiment, the processor 614 can directly or indirectly control 900 one or more of the motors 550-554 to move the end-effector 544 in a direction along a trajectory that is defined by the content of the surgical plan data structure, and can control 902 location of the displayed graphical screw responsive to the movement of the end-effector 544 along the trajectory.


In a further embodiment, while moving the end-effector 544 along the trajectory, the processor 614 can directly or indirectly control one or more of the motors 550-554 to resist movement of the end-effector 544 in a direction that is perpendicular to the trajectory until another operation is performed that cancels an end-effector trajectory constraint mode. In a further embodiment, prior to initiating the end-effector trajectory constraint mode, the processor 614 can directly or indirectly control one or more of the motors 550-554 to move the end-effector 544 in a direction upward away from the patient and then toward a location along the trajectory toward the patient, and prevent initiation of the end-effector trajectory constraint mode before reaching the location along the trajectory. The processor can control angular orientation and location of the displayed graphical screw responsive to the movement of the robot arm away from the patient and then toward the location along the trajectory.
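

One plausible realization of such a trajectory constraint is to project each commanded end-effector displacement onto the trajectory axis and reject the perpendicular component. The sketch below illustrates that idea only; it is not the patented control law, and the function names are assumptions.

```python
# Illustrative sketch of an end-effector trajectory constraint: once the guide tube is
# on the planned trajectory, commanded motion is projected onto the trajectory axis and
# the perpendicular component is rejected. One plausible realisation of the behaviour
# described above, not the patented control law.

def _dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def constrain_to_trajectory(commanded_delta, trajectory_axis, constraint_active=True):
    """Return the motion increment actually applied to the end-effector.

    commanded_delta  -- requested Cartesian displacement (mm)
    trajectory_axis  -- unit vector along the planned trajectory
    """
    if not constraint_active:
        return commanded_delta
    along = _dot(commanded_delta, trajectory_axis)
    # Keep only the component along the trajectory; the perpendicular part is resisted.
    return tuple(along * ai for ai in trajectory_axis)

if __name__ == "__main__":
    axis = (0.0, 0.0, 1.0)
    print(constrain_to_trajectory((3.0, 1.0, -5.0), axis))         # -> (0.0, 0.0, -5.0)
    print(constrain_to_trajectory((3.0, 1.0, -5.0), axis, False))  # unconstrained
```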


Pre-Operative CT Imaging Workflow


Image Tab


Loading the Image


The IMAGE tab shows the steps needed to load a CT scan image. The image can be loaded from a USB drive or hard drive. If the image is transferred through the Ethernet, it automatically appears on the hard drive when the transfer is complete.


To view images on a USB drive, insert the USB drive into the USB port on the connector panel. To load an image, select the hard drive or USB drive icon and select the desired patient image. Click the right arrows to load the patient images and advance to the next tab.


Pre-Operative CT Imaging Workflow


Plan Tab



FIG. 67 illustrates the PLAN tab allowing the user to plan all screw trajectories on the patient image. Screws are preloaded on the right-hand side of the screen, based on selections made in the PREPLAN tab.


To add a screw onto the planning page, drag and drop the appropriate screw label on the image at the desired slice. The active screw plan is shown in green. Details of the active screw plan are shown on the lower right of the screen, including screw family, diameter, and length. Click on the right arrows to advance to the next tab once plans are complete for all screws.












Adjusting screw trajectory

Screw Body    Press and move along screen to translate the screw along the current
              plane of the anatomy
Screw Head    Press and move to change the angle of the trajectory, pivoting along
              the tip of the screw
Screw Tip     Press and move to change the angle of the trajectory, pivoting along
              the head of the screw
Scroll Bar    The scroll bar is the dial control located above the head of the screw.
              Press the scroll bar and move to rotate the anatomy 360° about the screw.


Adjusting screw size

Screw Tip       Press and move longitudinally to automatically adjust the length of
                the screw to available screw sizes
Screw Diameter  Press the screw diameter button located on the right hand side of the
                screen to select other options available with the selected implant set
Screw Length    Press the screw length button located on the right hand side of the
                screen to select other options available with the selected implant set


Pre-Operative CT Imaging Workflow Planning Operations


Pre-operative CT imaging workflow planning operations that can be performed by the surgical implant planning computer 610 and, more particularly by the processor 614, are now described in the context of the embodiments shown in FIG. 84.


Referring to FIG. 84, the operations can include loading 1000 a CT image of a bone, which is received from the image database 650 through the network interface 620, into the memory 616. The operations include displaying 1002 the CT image on the display device 612, and receiving 1004 a user's selection of a surgical screw from among a set of defined surgical screws. The operations display 1006 a graphical screw representing the selected surgical screw as an overlay on the CT image of the bone. The operations control 1008 angular orientation and location of the displayed graphical screw relative to the bone in the CT image responsive to receipt of user inputs. The operations store 1012 an indication of the selected surgical screw and an angular orientation and a location of the displayed graphical screw in a surgical plan data structure responsive to user input. The surgical plan data structure is configured for use by the robot 500 to control movement of the robot arm in accordance with various embodiments disclosed herein.


The operations to display 1006 the graphical screw representing the selected surgical screw as an overlay on the CT image of the bone, can include determining a trajectory along an axis of the graphical screw, and displaying 1010 a trajectory line that extends from adjacent to a tip of the graphical screw and along the trajectory to facilitate a user visually orienting and positioning the graphical screw relative to a desired insertion location on the bone.
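

The trajectory line can be computed from the graphical screw's head and tip positions. Below is a minimal sketch in which the gap beyond the tip and the drawn line length are illustrative values not taken from the text.

```python
# Sketch of the trajectory-line computation described above: the line extends from just
# beyond the graphical screw's tip along the screw axis. The 5 mm gap and 100 mm line
# length are illustrative values.

def trajectory_line(tip_xyz, head_xyz, gap_mm=5.0, length_mm=100.0):
    """Return (start, end) points of the trajectory line in image coordinates (mm)."""
    axis = tuple(t - h for t, h in zip(tip_xyz, head_xyz))        # head -> tip direction
    norm = sum(c * c for c in axis) ** 0.5
    unit = tuple(c / norm for c in axis)
    start = tuple(t + gap_mm * u for t, u in zip(tip_xyz, unit))  # adjacent to the tip
    end = tuple(s + length_mm * u for s, u in zip(start, unit))
    return start, end

if __name__ == "__main__":
    print(trajectory_line((0.0, 0.0, 40.0), (0.0, 0.0, 0.0)))
```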


The operations to control 1008 angular orientation and location of the displayed graphical screw relative to the bone in the CT image responsive to receipt of user inputs, can include translating a location of the displayed graphical screw responsive to determining that the user has pressed on a touch-sensitive screen of the display device 612 over a screw body of the graphical screw while moving location of the user's continued pressing along the touch-sensitive screen. The operations can alternatively or additionally include angularly pivoting the displayed graphical screw responsive to determining that the user has pressed on the touch-sensitive screen over a screw head and/or tip of the graphical screw while moving location of the user's continued pressing along the touch-sensitive screen.


The operations to control 1008 angular orientation and location of the displayed graphical screw relative to the bone in the CT image responsive to receipt of user inputs, can include selecting a length of the displayed graphical screw from among a set of defined lengths for surgical screws responsive to determining that the user has pressed on a touch-sensitive screen of the display device 612 over a screw tip or a screw head of the graphical screw while moving location of the user's continued pressing along the touch-sensitive screen a measured distance.


The selected length of the surgical screw is stored 1012 in the surgical plan data structure.


The operations can include controlling 1014 angular orientation and location of the displayed graphical screw responsive to the movement of the robot arm while the end-effector 544 is being positioned relative to a patient.


Pre-Operative CT Imaging Workflow


Navigate Tab


The NAVIGATE tab allows the user to visualize the navigated instruments and trajectory alignment with respect to patient anatomy, according to the screw plan.


Registration Setup



FIG. 68 illustrates the first screen highlighting the three steps to complete before the fluoroscopy images can be taken to register the pre-operative CT image. Animation visually depicts the steps.



FIG. 69 illustrates a Fluoroscopy Registration Fixture attached to image intensifier. Attach the Fluoroscopy Registration Fixture to the image intensifier on the C-arm by turning the clamps clockwise until tight. Drape the fluoroscope and Fluoroscopy Registration Fixture and attach new reflective markers outside of the drape. Position the fixture such that the reflective markers are facing the camera. Attach the video capture cable (yellow jack) to the C-arm viewing station. Plug the video capture USB cable into either of the two USB ports on the robotic computer system connector panel.


Ensure that the Dynamic Reference Base is visible to the camera after the C-Arm is in place.


Register the surveillance marker by placing an instrument close to the reflective sphere on the surveillance marker but not touching. The box turns green when it is activated. Click the right arrows to advance to the next tab.


Pre-Operative CT Imaging Workflow Navigation Operations


Pre-operative CT imaging workflow navigation operations that can be performed by the surgical implant planning computer 610 and, more particularly by the processor 614, are now described in the context of the embodiments shown in FIG. 85.


Referring to FIG. 85, the operations can include performing 1100 a registration setup mode that includes determining occurrence of a first condition indicating that a marker tracking camera 570 can observe to track reflective markers that are on a fluoroscopy registration fixture (e.g., connected to the fluoroscopy imager 640), and further determining occurrence of a second condition indicating that the marker tracking camera 570 can observe to track dynamic reference base markers attached to the robot arm and/or an end-effector 544 connected to the robot arm. The operations display 1102 on the display device 612 an indication of when both of the first and second conditions occur, and determine that the registration setup mode is allowed to be marked satisfied when at least both of the first and second conditions are determined to occur.
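

A hedged sketch of this two-condition gate follows; the function names and the status message wording are assumptions, while the requirement that both marker sets be trackable before the setup mode is marked satisfied comes from the text.

```python
# Sketch of the registration setup check described above: the mode may be marked
# satisfied only once the camera can track both the fluoroscopy registration fixture
# markers and the dynamic reference base markers. Function names are assumptions.

def registration_setup_satisfied(fixture_markers_tracked: bool,
                                 drb_markers_tracked: bool) -> bool:
    """Both conditions must hold for the registration setup mode to be marked satisfied."""
    return fixture_markers_tracked and drb_markers_tracked

def setup_status_message(fixture_markers_tracked: bool, drb_markers_tracked: bool) -> str:
    """Indication displayed to the user while waiting for both conditions."""
    if registration_setup_satisfied(fixture_markers_tracked, drb_markers_tracked):
        return "Registration setup satisfied"
    missing = []
    if not fixture_markers_tracked:
        missing.append("fluoroscopy registration fixture markers")
    if not drb_markers_tracked:
        missing.append("dynamic reference base markers")
    return "Waiting for camera to track: " + ", ".join(missing)

if __name__ == "__main__":
    print(setup_status_message(True, False))
```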


Registration


Acquire the intra-operative fluoroscopic images, one AP and one lateral for each level planned. The same image may be used for multiple levels.


The following three conditions must be met prior to acquiring the images: (1) the DRB is visible by the camera; (2) the Fluoroscopy Registration Fixture is visible by the camera; and (3) a valid fluoroscopic image was taken.



FIG. 70 illustrates a lateral image within the NAVIGATE tab. Referring to FIG. 70, each of the three images on the left of the screen turns green when ready for image capture. When all three conditions are met, acquire the intra-operative fluoroscopic image and then select the CAPTURE button to transfer the image to the system. Once both images are successfully captured, the spinal level on the right side of the screen displays a check mark. Click the right arrows to advance to the next tab.



FIG. 71 illustrates selecting the desired level. To do so, the user drags and drops the planned screw onto the fluoroscopic images. Use the circle control points to roughly position the screw within the vertebral body. Ensure that the screw shank is positioned correctly, the head and tail of the screws are in the desired direction, and left/right are correctly oriented. Click the register button when complete to allow registration.



FIG. 72 illustrates a successful registration with a check mark being shown next to the active level. Click the right arrows when registration is complete.


Pre-Operative CT Imaging Workflow Navigation Operations


With further reference to FIG. 85, the operations by the surgical implant planning computer 610 can further include operating 1104, while both of the first and second conditions are determined to continue to occur, to allow operations to be performed to obtain a first intra-operative fluoroscopic image of the patient along a first plane and to obtain a second intra-operative fluoroscopic image of the patient along a second plane that is orthogonal to the first plane. The operations determine that a registration mode is allowed to be marked satisfied when the first and second intra-operative fluoroscopic images have been obtained.


With further reference to FIG. 85, the operations by the surgical implant planning computer 610 can further include displaying 1106 the first and second intra-operative fluoroscopic images on the display device 612. The operations display 1108 the graphical screw as an overlay on both of the first and second intra-operative fluoroscopic images. The operations control 1110 angular orientation and location of the displayed graphical screw relative to a bone in the first and second intra-operative fluoroscopic images responsive to receipt of user inputs.


Operations may alternatively or additionally include determining 1112 when the angular orientation and location of the displayed graphical screw relative to the bone in the first and second intra-operative fluoroscopic images satisfies a registration rule for corresponding to the angular orientation and the location of the displayed graphical screw in the surgical plan data structure, and then responsively displaying on the display device 612 an indication of when the registration rule is satisfied.


With further reference to FIG. 85, the operations by the surgical implant planning computer 610 can further include, based on determining that the registration rule is satisfied, controlling 1114 one or more of the motors 550-554 responsive to content of the surgical plan data structure to regulate movement of the robot arm while positioning the end-effector 544 relative to the patient. The operations can further control 1114 angular orientation and location of the graphical screw that is displayed, responsive to the movement of the robot arm while the end-effector 544 is being positioned relative to the patient.


Landmark Check


After registration has been completed, a landmark check, or verification, should be performed to ensure that the registration was calculated successfully. Using the verification probe, touch an anatomical landmark and verify that the corresponding location is shown on the system monitor. Repeat this process using 2-3 landmarks.


Removing Registration Fixture


Carefully remove the Fluoroscopy Registration Fixture if desired.


Navigation


The robotic arm precisely aligns the end-effector on the planned trajectory. Select the desired screw label on the right of the screen.


The screw plan is active when the screw label is highlighted and the robotic arm can be moved by the bracelet or pressing the foot pedal. The robotic arm first moves up in order to clear obstacles in the surgical field and then down along the trajectory. Once on the trajectory, the robotic arm can move up/down along the trajectory but does not move off of the trajectory unless the screw is deselected.



FIG. 73 illustrates how the real-time instrument/implant trajectory is displayed on the patient images along with the planned screw, allowing the user to confirm the desired trajectory. If the real-time trajectory is not acceptable, the user can return to the PLAN tab to select another trajectory. If the real-time trajectory is acceptable, the user inserts the screw according to the instrument's current trajectory to the desired depth.


GPS instruments are displayed as they are advanced through the end-effector. While navigating the instruments, periodically observe the monitor and surgical site to ensure consistency between tactile and navigation feedback.


Non-navigated metallic Globus instruments may be used through the guide tube while it is stationary for surgical applications unrelated to screw placement.


Monitor the surveillance marker during the procedure. If the surveillance marker indicates significant movement of the DRB, perform an anatomical landmark check. If the landmark check is satisfactory, re-register the surveillance marker. If the landmark check fails, re-register the patient.


There are multiple navigation tab icons. Referring to FIG. 73, the force gauge 731 indicates the force exerted on the end-effector. The image of the instrument at the bottom of the force gauge shows the active instrument in the end-effector or the end-effector image if no instrument is inserted. The surveillance marker error gauge 732 indicates the distance that the patient reference has moved in relation to the surveillance marker. The full range of the scale is 2 mm. The DRB icon 733 indicates dynamic reference base visibility. If the DRB is visible by the camera, the background is green. If the DRB is not visible by the camera, the background is red.


Fluoroscopic Imaging Workflow


Image Tab


Registration Setup


Referring to FIG. 68, the first screen highlights the three steps to complete before fluoroscopic images can be taken to register the patient. Animation visually depicts the steps.


Referring to FIG. 69, attach the Fluoroscopy Registration Fixture to the image intensifier on the C-arm by turning the clamps clockwise until tight. Drape the fluoroscope and Fluoroscopy Registration Fixture and attach new reflective markers outside of the drape. Position the fixture such that the reflective markers are facing the camera. Attach the video capture cable (yellow jack) to the C-arm viewing station. Plug the video capture USB cable into either of the two USB ports on the robotic computer system connector panel.


Ensure that the Dynamic Reference Base is visible to the camera after the C-Arm is in place.


Register the surveillance marker by placing an instrument close to the reflective sphere on the surveillance marker but not touching. The box turns green when it is activated. Click the right arrows to advance to the next tab.


Image Acquisition


Acquire intra-operative fluoroscopic images, one AP and one lateral.


The following three conditions must be met prior to acquiring the images: (1) the DRB is visible by the camera; (2) the Fluoroscopy Registration Fixture is visible by the camera; and (3) a valid fluoroscopic image was taken.



FIG. 74 illustrates a lateral image within the NAVIGATE tab. Referring to FIG. 74, each of the three images on the left of the screen turns green when ready for image capture. When all three conditions are met, acquire the intra-operative fluoroscopic image and then select the CAPTURE button to transfer the image to the system. Once both images are successfully captured, the level on the right side of the screen displays a check mark. Once the appropriate images have been loaded and selected, click on the right arrows to proceed.


Landmark Check


After registration has been completed, a landmark check, or verification, should be performed to ensure that the registration was calculated successfully. Using the navigated verification probe, touch an anatomical landmark and verify that the corresponding location is shown on the system monitor. Repeat this process using 2-3 landmarks.


Removing Registration Fixture


Carefully remove the fluoroscopy registration fixture if desired.


Fluoroscopic Imaging Workflow Operations


Fluoroscopic imaging workflow operations that can be performed by the surgical implant planning computer 610 and, more particularly by the processor 614, are now described in the context of the embodiments shown in FIG. 86.


Referring to FIG. 86, the operations can include performing 1200 operations for a registration setup mode that include determining occurrence of a first condition indicating that the marker tracking camera 570 can observe to track reflective markers that are on a fluoroscopy registration fixture of the fluoroscopy imager 640, and determining occurrence of a second condition indicating the marker tracking camera 570 can observe to track dynamic reference base markers attached to the robot arm and/or the end-effector 544 connected to the robot arm. While both of the first and second conditions are determined to continue to occur, the processor 614 allows 1204 operations to be performed to obtain a first intra-operative fluoroscopic image of a patient along a first plane and to obtain a second intra-operative fluoroscopic image of the patient along a second plane that is orthogonal to the first plane. The operations may display 1202 on the display device 612 an indication of when both of the conditions occur. If one or both conditions cease to be satisfied before the first and second intra-operative fluoroscopic images are obtained, the system may interrupt further obtaining of the uncompleted first and second intra-operative fluoroscopic imaging and generate a notification to the user.


The operations can further include displaying 1206 the first and second intra-operative fluoroscopic images on the display device 612. The operations can receive 1208 a user's selection of a surgical screw from among a set of defined surgical screws, and display 1210 a graphical screw representing the selected surgical screw as an overlay on both of the first and second intra-operative fluoroscopic images. The operations can control 1212 angular orientation and location of the displayed graphical screw relative to a bone shown in the first and second intra-operative fluoroscopic images responsive to receipt of user inputs, and store 1214 an indication of an angular orientation and a location of the displayed graphical screw in a surgical plan data structure responsive to receipt of a defined user input.
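
Purely as an illustration, a plan entry of the kind stored at block 1214 could be represented as a small data structure holding the selected screw and its pose; the field names below are assumptions, not the actual surgical plan data structure.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class ScrewPlanEntry:
    """One planned screw: which implant was chosen and where it sits."""
    screw_family: str                            # implant set the screw was chosen from
    diameter_mm: float
    length_mm: float
    location_mm: Tuple[float, float, float]      # position in image coordinates
    orientation_deg: Tuple[float, float, float]  # angular orientation of the screw axis

@dataclass
class SurgicalPlan:
    """Container persisted when the user confirms the plan."""
    entries: List[ScrewPlanEntry] = field(default_factory=list)

    def add(self, entry: ScrewPlanEntry) -> None:
        self.entries.append(entry)
```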


Fluoroscopic Imaging Workflow


Plan Tab



FIG. 75 illustrates the PLAN tab allowing the user to plan all screw trajectories on the patient image. Referring to FIG. 75, screws are preloaded on the right side of the screen, based on selections made in the PREPLAN tab.


To add a screw onto the planning page, drag and drop the appropriate screw label on the image at the desired slice.


The active screw plan is shown in green. Details of the active screw plan are shown on the lower right of the screen, including screw family, diameter, and length. Click on the right arrows to advance to the next tab once plans are complete for all screws.












Adjusting screw trajectory


Screw Head: Press and move along the screen to adjust the screw along the current plane of the anatomy.

Screw Tip: Press and move to change the angle of the trajectory, pivoting about the head of the screw.

Screw Trajectory: Press and move the screw along the 3D trajectory. This is useful to simulate actual advancement of the screw in 3D space. Both AP and lateral images will be updated to reflect the new screw position.
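
The three drag behaviors in the table reduce to translating the screw in the current plane (head), pivoting the screw about its head (tip), and advancing the screw along its own axis (trajectory). The sketch below expresses those behaviors with simple vector arithmetic; the class and method names are illustrative assumptions, not the system's implementation.

```python
import numpy as np

class GraphicalScrew:
    """Illustrative 3D screw model: a head point and a unit axis toward the tip."""
    def __init__(self, head, axis, length_mm):
        self.head = np.asarray(head, dtype=float)
        self.axis = np.asarray(axis, dtype=float) / np.linalg.norm(axis)
        self.length_mm = float(length_mm)

    @property
    def tip(self):
        return self.head + self.axis * self.length_mm

    def drag_head(self, delta_in_plane):
        """Screw Head drag: translate the whole screw within the current plane."""
        self.head = self.head + np.asarray(delta_in_plane, dtype=float)

    def drag_tip(self, new_tip_target):
        """Screw Tip drag: change the trajectory angle by pivoting about the head."""
        direction = np.asarray(new_tip_target, dtype=float) - self.head
        self.axis = direction / np.linalg.norm(direction)

    def drag_along_trajectory(self, advance_mm):
        """Screw Trajectory drag: simulate advancement along the 3D trajectory;
        both AP and lateral overlays would be redrawn from the new position."""
        self.head = self.head + self.axis * advance_mm
```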



















Adjusting screw size


Screw Diameter: Press the screw diameter button located on the right-hand side of the screen to select other options available with the selected implant set.

Screw Length: Press the screw length button located on the right-hand side of the screen to select other options available with the selected implant set.










Fluoroscopic Imaging Workflow Planning Operations


Fluoroscopic imaging workflow operations for planning that can be performed by the surgical implant planning computer 610 and, more particularly by the processor 614, are now described in the context of the embodiments shown in FIG. 87.


Referring to FIG. 87, operations to display the graphical screw representing the selected surgical screw as an overlay on both of the first and second intra-operative fluoroscopic images, can include determining 1300 a trajectory along an axis of the graphical screw and displaying a trajectory line that extends from adjacent to a tip of the graphical screw and along the trajectory to facilitate a user visually orienting and positioning the graphical screw relative to a desired insertion location on the bone.
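
For illustration only, the trajectory line of block 1300 can be computed directly from the screw tip and the unit vector along the screw axis; a hypothetical helper might look like the following, where the 50 mm extension is an arbitrary default rather than a value from the described system.

```python
import numpy as np

def trajectory_line(tip_mm: np.ndarray, axis_unit: np.ndarray,
                    extension_mm: float = 50.0):
    """Return the two endpoints of a line that starts adjacent to the screw tip
    and extends along the screw axis, to help visualize the insertion path."""
    start = np.asarray(tip_mm, dtype=float)
    end = start + np.asarray(axis_unit, dtype=float) * extension_mm
    return start, end
```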


Operations to control angular orientation and location of the displayed graphical screw relative to the bone shown in the first and second intra-operative fluoroscopic images responsive to receipt of user inputs, can include translating 1302 a location of the displayed graphical screw responsive to determining that the user has pressed on a touch-sensitive screen of the display device 612 over a screw body of the graphical screw while moving location of the user's continued pressing along the touch-sensitive screen. The operations can further include angularly pivoting 1304 the displayed graphical screw responsive to determining that the user has pressed on the touch-sensitive screen over a screw head and/or tip of the graphical screw while moving location of the user's continued pressing along the touch-sensitive screen.


Operations to control angular orientation and location of the displayed graphical screw relative to the bone shown in the first and second intra-operative fluoroscopic images responsive to receipt of user inputs, can include selecting 1306 a length of the displayed graphical screw from among a set of defined lengths for surgical screws responsive to determining that the user has pressed on a touch-sensitive screen of the display device 612 over a screw tip or a screw head of the graphical screw while moving location of the user's continued pressing along the touch-sensitive screen a measured distance. The selected length is stored 1308 in the surgical plan data structure.
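
A minimal sketch of the length selection at blocks 1306 and 1308, under the assumption that the dragged distance is simply snapped to the nearest length offered by the implant set; the candidate lengths shown are placeholders, not an actual implant catalogue.

```python
def select_screw_length(current_length_mm: float,
                        drag_distance_mm: float,
                        defined_lengths_mm=(30.0, 35.0, 40.0, 45.0, 50.0)) -> float:
    """Snap the dragged length to the nearest defined surgical screw length."""
    requested = current_length_mm + drag_distance_mm
    return min(defined_lengths_mm, key=lambda length: abs(length - requested))

# Example: dragging the tip 4 mm beyond a 35 mm screw selects the 40 mm option.
assert select_screw_length(35.0, 4.0) == 40.0
```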


Fluoroscopic Imaging Workflow


Navigate Tab



FIG. 76 illustrates the NAVIGATE tab allowing the user to visualize the navigated instrument trajectory and the planned trajectory with respect to patient anatomy.


The robotic arm precisely aligns the end-effector to the planned trajectory. Referring to FIG. 76, select the desired screw label on the right of the screen.


The screw plan is active when the screw label is highlighted and the robotic arm can be moved by the bracelet or by pressing the foot pedal. The robotic arm first moves up in order to clear obstacles in the surgical field and then down along the trajectory. Once on the trajectory, the robotic arm can move up/down along the trajectory but does not move off of the trajectory unless the screw plan is deselected.


The real-time instrument/implant trajectory is displayed on the patient images along with the planned screw, allowing the user to confirm the desired trajectory. If the real-time trajectory is not acceptable, the user can return to the PLAN tab to select another trajectory. If the real-time trajectory is acceptable, the user inserts the screw according to the instrument's current trajectory to the desired depth.


GPS instruments are displayed as they are advanced through the end-effector. While navigating the instruments, periodically observe the monitor and surgical site to ensure consistency between tactile and navigation feedback.


Non-navigated metallic Globus instruments may be used through the guide tube while it is stationary for surgical applications unrelated to screw placement.


Monitor the surveillance marker during the procedure. If the surveillance marker indicates significant movement of the DRB, perform an anatomical landmark check. If the landmark check is satisfactory, re-register the surveillance marker. If the landmark check fails, re-register the patient.


There are multiple navigation tab icons. Referring to FIG. 76, the force gauge 761 indicates the force exerted on the end-effector. The image of the instrument at the bottom of the force gauge shows the active instrument in the end-effector or the end-effector image if no instrument is inserted. The surveillance marker error gauge 762 indicates the distance that the patient reference has moved in relation to the surveillance marker. The full range of the scale is 2 mm. The DRB icon 763 indicates dynamic reference base visibility. If the DRB is visible by the camera, the background is green. If the DRB is not visible by the camera, the background is red.
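
As an illustration of this indicator logic (not the actual implementation), the surveillance marker error gauge can be rendered as a fraction of its 2 mm full scale and the DRB icon colored by visibility:

```python
def surveillance_gauge_fraction(drift_mm: float, full_scale_mm: float = 2.0) -> float:
    """Fraction of the gauge to fill; the full range of the scale is 2 mm."""
    return max(0.0, min(drift_mm / full_scale_mm, 1.0))

def drb_icon_background(drb_visible: bool) -> str:
    """Green background when the DRB is visible to the camera, red otherwise."""
    return "green" if drb_visible else "red"

# Example: 0.5 mm of drift fills a quarter of the gauge.
assert surveillance_gauge_fraction(0.5) == 0.25
assert drb_icon_background(False) == "red"
```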


Navigation-Only Procedures



FIG. 77 illustrates how the robotic computer system may be used for navigation without the robotic arm and end effector. Pre-surgical planning is optional. Referring to FIG. 77, all verified GPS instruments are visible on loaded patient images when moved within the view of the camera. The instruments are displayed with respect to the patient.


Refer to the corresponding application and imaging workflow for the imaging modality (pre-operative CT, intra-operative CT, or fluoroscopy).


Use the IMAGE tab to load the desired patient images.


After instrument registration has been completed, a landmark check, or verification, should be performed to ensure that the registration was calculated successfully. Using the navigated verification probe, touch an anatomical landmark and verify that the corresponding location is shown on the system monitor. Repeat this process using 2-3 landmarks.


Use the PLAN tab to plan screw placement if desired. Select the desired screw label on the right of the screen to choose the screw plan.


Use the NAVIGATE tab to display the screw and navigated instruments during the procedure.


Monitor the surveillance marker during the procedure. If the surveillance marker indicates significant movement of the DRB, perform an anatomical landmark check. If the landmark check is satisfactory, re-register the surveillance marker. If the landmark check fails, re-register the patient.


Trajectory-Only Procedures



FIG. 78 illustrates how the robotic computer system may be used for trajectory guidance using the robotic arm without navigated instruments. Referring to FIG. 78, the guide tube serves as a rigid retractor that can be moved within the surgical field or aligned to a trajectory automatically or manually.


Refer to the corresponding application and imaging workflow for the imaging modality (pre-operative CT, intra-operative CT, or fluoroscopy). Use the IMAGE tab to load the desired patient images.


A landmark check, or verification, should be performed to ensure that the registration was calculated successfully. Using the navigated verification probe, touch an anatomical landmark and verify that the corresponding location is shown on the system monitor. Repeat this process using 2-3 landmarks.


Use the PLAN tab to plan screw placement. Select the desired screw label on the right of the screen. The screw plan is active when the screw label is highlighted and the robotic arm can be moved by the bracelet or by pressing the foot pedal and moving the arm. The robotic arm first moves up to clear obstacles in the surgical field and then down along the specified trajectory. Once on the trajectory, the robotic arm can be moved up/down along the trajectory but does not move off of the trajectory unless the screw is deselected.


If using k-wires, use the cannulated awl to prepare the starting hole and place the k-wire into bone at the desired trajectory through the guide tube. The end effector should be moved away from the trajectory so the screw can be placed by k-wire guidance (deselect the screw plan).


Perform the surgical procedure using non-navigated metallic surgical instruments that fit through the guide tube diameter.


Monitor the surveillance marker during the procedure. If the surveillance marker indicates significant movement of the DRB, perform an anatomical landmark check. If the landmark check is satisfactory, re-register the surveillance marker. If the landmark check fails, re-register the patient.


Fluoroscopic Imaging Workflow Planning Operations


As explained above, the fluoroscopic imaging workflow operations for planning by the surgical implant planning computer 610 can include displaying the graphical screw representing the selected surgical screw as an overlay on both of the first and second intra-operative fluoroscopic images. The operations can determine 1300 a trajectory along an axis of the graphical screw and display a trajectory line that extends from adjacent to a tip of the graphical screw and along the trajectory to facilitate a user visually orienting and positioning the graphical screw relative to a desired insertion location on the bone.


The operations may further include directly or indirectly, e.g., via the computer 522 and/or controller 546, controlling one or more of the motors 550-554 responsive to content of the surgical plan data structure to regulate movement of the robot arm while positioning the end-effector 544 relative to a patient. The operations can control (e.g., 1212 in FIG. 86) angular orientation and location of the displayed graphical screw responsive to the movement of the robot arm while the end-effector 544 is being positioned relative to the patient.


The operations can further include directly or indirectly, e.g., via the computer 522 and/or controller 546, controlling the motors 550-554 to move the end-effector 544 in a direction along a trajectory defined by the content of the surgical plan data structure. The operations can further include controlling (e.g., 1212 in FIG. 86) location of the displayed graphical screw responsive to the movement of the end-effector 544 along the trajectory.


The operations can further include, while moving the end-effector 544 along the trajectory, directly or indirectly controlling the motors 550-554 to resist movement of the end-effector 544 in a direction perpendicular to the trajectory until another operation is performed that cancels an end-effector trajectory constraint mode.


The operations can further include, prior to initiating the end-effector trajectory constraint mode, directly or indirectly controlling the motors 550-554 to move the end-effector 544 in a direction upward away from the patient and then toward a location along the trajectory, and preventing initiation of the end-effector trajectory constraint mode before reaching the location along the trajectory. The operations can control angular orientation and location of the displayed graphical screw responsive to the movement of the robot arm away from the patient and then toward the location along the trajectory.
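
The constraint stage of this motion sequence can be illustrated with a simple projection of the commanded motion onto the trajectory direction, so that components perpendicular to the trajectory are resisted while movement along the trajectory passes through. This is a geometric sketch under my own assumptions, not the robot's control law.

```python
import numpy as np

def constrain_to_trajectory(commanded_delta: np.ndarray,
                            trajectory_axis: np.ndarray,
                            constraint_active: bool) -> np.ndarray:
    """While the end-effector trajectory constraint mode is active, keep only
    the component of the commanded motion that lies along the trajectory,
    resisting any motion perpendicular to it."""
    delta = np.asarray(commanded_delta, dtype=float)
    if not constraint_active:
        return delta
    axis = np.asarray(trajectory_axis, dtype=float)
    axis = axis / np.linalg.norm(axis)
    return axis * float(np.dot(delta, axis))

# Example with the constraint active:
axis = np.array([0.0, 0.0, -1.0])                                 # along the planned trajectory
print(constrain_to_trajectory(np.array([1.0, 0.0, 0.0]), axis, True))   # sideways command is resisted
print(constrain_to_trajectory(np.array([0.0, 0.0, -2.0]), axis, True))  # along-trajectory motion is preserved
```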


Software Error Messages


The system alerts the operator of errors through pop-up messages. The following list describes all possible errors and the actions to correct them.














End Effector Disconnected: The End Effector is not attached to the robot arm. Proposed remedy: Ensure that the End Effector is properly attached.

Stabilizer Not Down: Stabilizers have not been deployed. Proposed remedy: Engage stabilizer.

Registration Not Completed: The patient scan did not pass automatic registration or was unregistered via the registration view. Proposed remedy: Complete registration.

Registration Not Transferred: Registration has not yet been transferred from the intra-op CT registration fixture to the Dynamic Reference Base. Proposed remedy: Transfer registration.

Camera Disconnected: The connection to the camera was dropped, most likely as a result of a loose cable. Proposed remedy: Ensure the camera is properly connected.

Camera Frame Rate Dropped: The frame rate of the camera has dropped below the system's safe limit. This is usually due to too many tracked instruments/objects in the camera's view. Proposed remedy: Too many instruments in view of the camera. Removing instruments will increase the camera frame rate.

Camera CRC Mismatch: Data from camera is not valid, or there has been a camera communication problem. Proposed remedy: Disconnect camera from Robotic Base Station and reconnect.

End Effector Not Visible: The End Effector is not currently visible to the camera. (This will stop or prevent motion as the End Effector fiducials must be visible to move the robot arm.) Proposed remedy: Ensure the End Effector is in view of the camera.

DRB Not Visible: The Dynamic Reference Base is not currently visible to the camera (this will stop motion as the Dynamic Reference Base fiducials must be visible to move the robot arm). Proposed remedy: Ensure the Dynamic Reference Base is in view of the camera.

E-Stop Pressed: Someone has physically pressed the E-Stop or Emergency Stop button on the Robot Base Station. This stops motion. Proposed remedy: Rotate the E-Stop button to release.

PIB Communication Dropped: Communication to the PIB (Platform Interface Board) has been lost. This severs communication to the robotic arm, which stops or prevents motion. Proposed remedy: Restart the system.

Surveillance Marker Moved: The surveillance marker has moved beyond its safety-critical limit in relation to the Dynamic Reference Base. Proposed remedy: Perform an anatomical landmark check to ensure navigation is still accurate. If navigation is inaccurate, either re-register the patient or discontinue use for that procedure.

Surveillance Marker Not Visible: The surveillance marker has either shifted dramatically or moved a great distance, which causes the camera to no longer see it. Proposed remedy: Perform an anatomical landmark check to ensure navigation is still accurate. If navigation is inaccurate, either re-register the patient or discontinue use for that procedure.

Active Trajectory Not Reachable: The robotic arm cannot create a table of position points to move to a trajectory, based on the kinematics equations used. Proposed remedy: Move the Robotic Base Station to allow the arm to reach the trajectory.

Maximum Trajectory Error Exceeded: When the robot arm is locked onto a trajectory, if the actual position of the robot arm exceeds a certain distance from the perceived trajectory, this error will occur. Could be related to excessive force on the End Effector or kinematics issues. Proposed remedy: Restart the move.

Excessive Force on the End Effector: Excessive force has been applied to the load cell, over a certain limit (50 N or 11 lbs). Proposed remedy: Remove the force.

Excessive Dynamic Reference Base Movement: The Dynamic Reference Base position has shifted relatively quickly, without movement of other objects in the view of the camera. Proposed remedy: Perform an anatomical landmark check to ensure navigation is still accurate. If navigation is inaccurate, either re-register the patient or discontinue use for that procedure.

Move Enabled Press Error: Move enabled is pressed while activating a trajectory. This prevents the robot from instantly entering auto-move mode immediately after activating a trajectory. Proposed remedy: Release the foot pedal or bracelet, then activate the trajectory.

GMAS Communication Failure: Communication with the GMAS controller has been lost. This will stop or prevent motion as GMAS is no longer receiving updates from the client about trajectory and camera. Proposed remedy: The system should automatically connect. If not, restart the system.

Move Enabled Timeout: Move enable has been active for longer than the threshold, 90 seconds or more. This is a failsafe for accidentally leaving the arm engaged. Proposed remedy: Release the foot pedal or bracelet, then re-engage the foot pedal or bracelet.

Camera Bumped: Massive bump to the camera, in which the camera is likely to be permanently damaged. This is an error thrown internally by the NDI software. Proposed remedy: Call Tech Support.

Tool in End Effector: If an instrument is in the End Effector when attempting to move, motion will be disallowed and this error will be displayed. Proposed remedy: Remove the instrument from the End Effector.

Move Enabled Test Failure: The move enabled test has failed. Proposed remedy: Ensure no buttons are pressed on the system and the system will automatically retry.

Motion Homing Failure: The homing routine for the robot has failed. This causes the robotic arm to lose its relative positions. This test can be retried, but if it consistently fails, there are no user actions to fix. Proposed remedy: Call Tech Support.

Need to Home: The robot has not run its homing routine, thus the robot arm does not know its relative positions. Proposed remedy: Call Tech Support.
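
For illustration, the error list above could be backed by a simple lookup keyed by message, as sketched below with two representative entries; the structure and names are assumptions rather than the system's error-handling code.

```python
from typing import NamedTuple, Dict

class SoftwareError(NamedTuple):
    description: str
    remedy: str
    stops_motion: bool

# Two representative entries taken from the list above (illustrative structure only).
ERROR_TABLE: Dict[str, SoftwareError] = {
    "E-Stop Pressed": SoftwareError(
        description="The Emergency Stop button on the Robot Base Station was pressed.",
        remedy="Rotate the E-Stop button to release.",
        stops_motion=True),
    "DRB Not Visible": SoftwareError(
        description="The Dynamic Reference Base is not currently visible to the camera.",
        remedy="Ensure the Dynamic Reference Base is in view of the camera.",
        stops_motion=True),
}

def popup_text(message: str) -> str:
    """Compose the pop-up shown to the operator for a known error message."""
    err = ERROR_TABLE[message]
    return f"{message}: {err.description} Proposed remedy: {err.remedy}"

print(popup_text("DRB Not Visible"))
```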









Further Definitions and Embodiments

In the above description of various embodiments of the present disclosure, aspects of the present disclosure may be illustrated and described herein in any of a number of patentable classes or contexts including any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof. Accordingly, aspects of the present disclosure may be implemented entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.), or in an implementation combining software and hardware, all of which may generally be referred to herein as a “circuit,” “module,” “component,” or “system.” Furthermore, aspects of the present disclosure may take the form of a computer program product comprising one or more computer readable media having computer readable program code embodied thereon.


Any combination of one or more computer readable media may be used. The computer readable media may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an appropriate optical fiber with a repeater, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.


A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable signal medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.


Computer program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, Python or the like, conventional procedural programming languages, such as the “C” programming language, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, ABAP, dynamic programming languages such as Python, Ruby and Groovy, or other programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider) or in a cloud computing environment or offered as a service such as a Software as a Service (SaaS).


Aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable instruction execution apparatus, create a mechanism for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.


These computer program instructions may also be stored in a computer readable medium that when executed can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions when stored in the computer readable medium produce an article of manufacture including instructions which when executed, cause a computer to implement the function/act specified in the flowchart and/or block diagram block or blocks. The computer program instructions may also be loaded onto a computer, other programmable instruction execution apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatuses or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.


It is to be understood that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of this specification and the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.


The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various aspects of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.


The terminology used herein is for the purpose of describing particular aspects only and is not intended to be limiting of the disclosure. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items. Like reference numbers signify like elements throughout the description of the figures.


The corresponding structures, materials, acts, and equivalents of any means or step plus function elements in the claims below are intended to include any disclosed structure, material, or act for performing the function in combination with other claimed elements as specifically claimed. The description of the present disclosure has been presented for purposes of illustration and description, but is not intended to be exhaustive or limited to the disclosure in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the disclosure. The aspects of the disclosure herein were chosen and described in order to best explain the principles of the disclosure and the practical application, and to enable others of ordinary skill in the art to understand the disclosure with various modifications as are suited to the particular use contemplated.

Claims
  • 1. A surgical implant planning computer comprising: at least one network interface connectable to an image scanner and a robot having a robot base coupled to a robot arm that is movable by motors relative to the robot base;a display device;at least one processor; andat least one memory storing program code that is executed by the at least one processor to perform operations comprising:displaying on the display device first and second images at different orientations of a bone that are received from the image scanner through the at least one network interface;receiving a user's selection of a surgical screw from among a set of defined surgical screws;displaying a graphical screw representing the selected surgical screw as an overlay on the displayed first and second images of the bone;receiving as user inputs dragging of the displayed graphical screw to manipulate the angular orientation of the displayed graphical screw including pivoting the displayed graphical screw about the tip and about the head of the displayed graphical screw;updating on the display device angular orientation and location of the displayed graphical screw relative to the bone in both the first and second images responsive to receipt of the user inputs;receiving as a second user input selecting and dragging of the selected screw to simulate an advancement of the selected screw into the bone;updating on the display device the advancement of the selected screw only along a trajectory defined by the angular orientation of the selected screw in both the first and second images responsive to receipt of the second user input; andstoring an indication of the selected surgical screw and an angular orientation and a location of the displayed graphical screw in a surgical plan data structure responsive to receipt of a defined user input.
  • 2. The surgical implant planning computer of claim 1, wherein the angular orientation and the location stored in the surgical plan data structure indicates the angular orientation and the location of the displayed graphical screw relative to an angular orientation and a location of the bone in the first and second images.
  • 3. The surgical implant planning computer of claim 1, wherein the operations to display the graphical screw representing the selected surgical screw as an overlay on the first and second images of the bone, comprise: determining a trajectory along an axis of the graphical screw; anddisplaying a trajectory line that extends from adjacent to a tip of the graphical screw and along the trajectory to facilitate a user visually orienting and positioning the graphical screw relative to a desired insertion location on the bone.
  • 4. The surgical implant planning computer of claim 3, wherein the operations to control angular orientation and location of the displayed graphical screw relative to the bone in the first and second images responsive to receipt of user inputs, comprise: translating a location of the displayed graphical screw responsive to determining that the user has pressed on a touch-sensitive screen of the display device over a screw body of the graphical screw while moving location of the user's continued pressing along the touch-sensitive screen; andangularly pivoting the displayed graphical screw responsive to determining that the user has pressed on the touch-sensitive screen over a screw head and/or tip of the graphical screw while moving location of the user's continued pressing along the touch-sensitive screen.
  • 5. The surgical implant planning computer of claim 3, wherein the operations to control angular orientation and location of the displayed graphical screw relative to the bone in the first and second images responsive to receipt of user inputs, comprise: selecting a length of the displayed graphical screw from among a set of defined lengths for surgical screws responsive to determining that the user has pressed on a touch-sensitive screen of the display device over a screw tip or a screw head of the graphical screw while moving location of the user's continued pressing along the touch-sensitive screen a measured distance,wherein the selected length is stored in the surgical plan data structure.
  • 6. The surgical implant planning computer of claim 1, wherein the operations to control orientation and location of the displayed graphical screw relative to the bone in the first and second images responsive to receipt of user inputs, comprise: modifying a size and/or a rotational angle of the displayed graphical screw on the first and second images responsive to tracking motion of a user's hand relative to an input device.
  • 7. The surgical implant planning computer of claim 1, wherein the operations further comprise: providing the surgical plan data structure to the robot to control movement of the robot arm relative to the robot base.
  • 8. The surgical implant planning computer of claim 1, wherein the operations further comprise: controlling the motors responsive to content of the surgical plan data structure to regulate movement of the robot arm while positioning an end-effector, which is connected to the robot arm, relative to a patient; andcontrolling angular orientation and location of the displayed graphical screw responsive to the movement of the robot arm while the end-effector is positioned relative to the patient.
  • 9. The surgical implant planning computer of claim 8, wherein the operations further comprise: controlling the motors to move the end-effector in a direction along a trajectory defined by the content of the surgical plan data structure; andcontrolling location of the displayed graphical screw responsive to the movement of the end-effector along the trajectory.
  • 10. The surgical implant planning computer of claim 8, wherein the operations further comprise: while moving the end-effector along the trajectory, further controlling the motors to resist movement of the end-effector in a direction perpendicular to the trajectory until another operation is performed that cancels an end-effector trajectory constraint mode.
  • 11. The surgical implant planning computer of claim 10, wherein the operations further comprise: prior to initiating the end-effector trajectory constraint mode, controlling the motors to move the end-effector in a direction upward away from the patient and then toward a location along the trajectory toward the patient;preventing initiation of the end-effector trajectory constraint mode before reaching the location along the trajectory; andcontrolling angular orientation and location of the displayed graphical screw responsive to the movement of the robot arm away from the patient and then toward the location along the trajectory.
  • 12. A surgical implant planning computer comprising: at least one network interface connectable to an image database;a display device;at least one processor; andat least one memory storing program code that is executed by the at least one processor to perform operations comprising:loading first and second images at different orientations from a computed tomography (CT) scan of a bone, which are received from the image database through the at least one network interface, into the at least one memory;displaying the first and second images on the display device;receiving a user's selection of a surgical screw from among a set of defined surgical screws;displaying a graphical screw representing the selected surgical screw as an overlay on the displayed first and second images of the bone;receiving as user inputs dragging of the displayed graphical screw to manipulate the angular orientation of the displayed graphical screw including pivoting the displayed graphical screw about the tip and about the head of the displayed graphical screw;updating on the display device angular orientation and location of the displayed graphical screw relative to the bone in the first and second images responsive to receipt of the user inputs;receiving as a second user input selecting and dragging of the selected screw to simulate an advancement of the selected screw into the bone;updating on the display device the advancement of the selected screw only along a trajectory defined by the angular orientation of the selected screw in both the first and second images responsive to receipt of the second user input; andstoring an indication of the selected surgical screw and an angular orientation and a location of the displayed graphical screw in a surgical plan data structure responsive to user input, the surgical plan data structure being configured for use by a robot with a robot base coupled to a robot arm that is movable by motors relative to the robot base.
  • 13. The surgical implant planning computer of claim 12, wherein the operations to display the graphical screw representing the selected surgical screw as an overlay on the first and second images of the bone, comprise: determining a trajectory along an axis of the graphical screw; anddisplaying a trajectory line that extends from adjacent to a tip of the graphical screw and along the trajectory to facilitate a user visually orienting and positioning the graphical screw relative to a desired insertion location on the bone.
  • 14. The surgical implant planning computer of claim 13, wherein the operations to control angular orientation and location of the displayed graphical screw relative to the bone in the first and second images responsive to receipt of user inputs, comprise: translating a location of the displayed graphical screw responsive to determining that the user has pressed on a touch-sensitive screen of the display device over a screw body of the graphical screw while moving location of the user's continued pressing along the touch-sensitive screen; andangularly pivoting the displayed graphical screw responsive to determining that the user has pressed on the touch-sensitive screen over a screw head and/or tip of the graphical screw while moving location of the user's continued pressing along the touch-sensitive screen.
  • 15. The surgical implant planning computer of claim 13, wherein the operations to control angular orientation and location of the displayed graphical screw relative to the bone in the first and second images responsive to receipt of user inputs, comprise: selecting a length of the displayed graphical screw from among a set of defined lengths for surgical screws responsive to determining that the user has pressed on a touch-sensitive screen of the display device over a screw tip or a screw head of the graphical screw while moving location of the user's continued pressing along the touch-sensitive screen a measured distance,wherein the selected length is stored in the surgical plan data structure.
  • 16. The surgical implant planning computer of claim 12, wherein the operations further comprise: controlling angular orientation and location of the displayed graphical screw responsive to the movement of the robot arm while the end-effector is positioned relative to the patient.
  • 17. The surgical implant planning computer of claim 12, wherein the operations further comprise: performing a registration setup mode comprising determining occurrence of a first condition indicating that a marker tracking camera can observe to track reflective markers that are on a fluoroscopy registration fixture, and further determining occurrence of a second condition indicating that the marker tracking camera can observe to track dynamic reference base markers attached to the robot arm and/or an end-effector connected to the robot arm;displaying on the display device an indication of when both of the first and second conditions occur; anddetermining that the registration setup mode is allowed to be marked satisfied when at least both of the first and second conditions are determined to occur.
  • 18. The surgical implant planning computer of claim 17, wherein the operations further comprise: while both of the first and second conditions are determined to continue to occur, allowing operations to be performed to obtain a first intra-operative fluoroscopic image of the patient along a first plane and to obtain a second intra-operative fluoroscopic image of the patient along a second plane that is orthogonal to the first plane; anddetermining that a registration mode is allowed to be marked satisfied when the first and second intra-operative fluoroscopic images have been obtained.
  • 19. The surgical implant planning computer of claim 18, wherein the operations further comprise: displaying the first and second intra-operative fluoroscopic images on the display device;displaying the graphical screw as an overlay on both of the first and second intra-operative fluoroscopic images;controlling angular orientation and location of the displayed graphical screw relative to a bone in the first and second intra-operative fluoroscopic images responsive to receipt of user inputs.
  • 20. A method by a surgical implant planning computer, the method comprising: displaying on a display device first and second images at different orientations from a computed tomography (CT) scan of a bone with a CT image scanner;receiving a user's selection of a surgical screw from among a set of defined surgical screws;displaying a graphical screw representing the selected surgical screw as an overlay on the first and second images of the bone;receiving as user inputs dragging of the displayed graphical screw to manipulate the angular orientation of the displayed graphical screw including pivoting the displayed graphical screw about the tip and about the head of the displayed graphical screw;updating on the display device angular orientation and location of the displayed graphical screw relative to the bone in the first and second images responsive to receipt of the user inputs;receiving as a second user input selecting and dragging of the selected screw to simulate an advancement of the selected screw into the bone;updating on the display device the advancement of the selected screw only along a trajectory defined by the angular orientation of the selected screw in both the first and second images responsive to receipt of the second user input; andstoring an indication of the selected surgical screw and an angular orientation and a location of the displayed graphical screw in a surgical plan data structure within a memory responsive to receipt of a defined user input.
CROSS REFERENCE TO RELATED APPLICATIONS

This application claims priority under 35 U.S.C. 119(e) to U.S. Provisional Patent Application Ser. No. 62/535,591, filed Jul. 21, 2017, the content of which is incorporated by reference herein in its entirety for all purposes.

20100125286 Wang et al. May 2010 A1
20100130986 Mailloux et al. May 2010 A1
20100228117 Hartmann Sep 2010 A1
20100228265 Prisco Sep 2010 A1
20100249571 Jensen et al. Sep 2010 A1
20100274120 Heuscher Oct 2010 A1
20100280363 Skarda et al. Nov 2010 A1
20100331858 Simaan et al. Dec 2010 A1
20110022229 Jang et al. Jan 2011 A1
20110077504 Fischer et al. Mar 2011 A1
20110098553 Robbins et al. Apr 2011 A1
20110137152 Li Jun 2011 A1
20110152676 Groszmann Jun 2011 A1
20110213384 Jeong Sep 2011 A1
20110224684 Larkin et al. Sep 2011 A1
20110224685 Larkin et al. Sep 2011 A1
20110224686 Larkin et al. Sep 2011 A1
20110224687 Larkin et al. Sep 2011 A1
20110224688 Larkin et al. Sep 2011 A1
20110224689 Larkin et al. Sep 2011 A1
20110224825 Larkin et al. Sep 2011 A1
20110230967 O'Halloran et al. Sep 2011 A1
20110238080 Ranjit et al. Sep 2011 A1
20110276058 Choi et al. Nov 2011 A1
20110282189 Graumann Nov 2011 A1
20110286573 Schretter et al. Nov 2011 A1
20110295062 Gratacos Solsona et al. Dec 2011 A1
20110295370 Suh et al. Dec 2011 A1
20110306986 Lee et al. Dec 2011 A1
20120035507 George et al. Feb 2012 A1
20120046668 Gantes Feb 2012 A1
20120051498 Koishi Mar 2012 A1
20120053597 Anvari et al. Mar 2012 A1
20120059248 Holsing et al. Mar 2012 A1
20120071753 Hunter et al. Mar 2012 A1
20120108954 Schulhauser et al. May 2012 A1
20120136372 Amat Girbau et al. May 2012 A1
20120143084 Shoham Jun 2012 A1
20120184839 Woerlein Jul 2012 A1
20120197182 Millman et al. Aug 2012 A1
20120226145 Chang et al. Sep 2012 A1
20120235909 Birkenbach et al. Sep 2012 A1
20120245596 Meenink Sep 2012 A1
20120253332 Moll Oct 2012 A1
20120253360 White et al. Oct 2012 A1
20120256092 Zingerman Oct 2012 A1
20120294498 Popovic Nov 2012 A1
20120296203 Hartmann et al. Nov 2012 A1
20130006267 Odermatt et al. Jan 2013 A1
20130016889 Myronenko et al. Jan 2013 A1
20130027433 Hand Jan 2013 A1
20130030571 Ruiz Morales et al. Jan 2013 A1
20130035583 Park et al. Feb 2013 A1
20130060146 Yang et al. Mar 2013 A1
20130060337 Petersheim et al. Mar 2013 A1
20130094742 Feilkas Apr 2013 A1
20130096574 Kang et al. Apr 2013 A1
20130113791 Isaacs et al. May 2013 A1
20130116706 Lee et al. May 2013 A1
20130131695 Scarfogliero et al. May 2013 A1
20130135312 Yang et al. May 2013 A1
20130144307 Jeong et al. Jun 2013 A1
20130158542 Manzo et al. Jun 2013 A1
20130165937 Patwardhan Jun 2013 A1
20130178867 Farritor et al. Jul 2013 A1
20130178868 Roh Jul 2013 A1
20130178870 Schena Jul 2013 A1
20130204271 Brisson et al. Aug 2013 A1
20130211419 Jensen Aug 2013 A1
20130211420 Jensen Aug 2013 A1
20130218142 Tuma et al. Aug 2013 A1
20130223702 Holsing et al. Aug 2013 A1
20130225942 Holsing et al. Aug 2013 A1
20130225943 Holsing et al. Aug 2013 A1
20130231556 Holsing et al. Sep 2013 A1
20130237995 Lee et al. Sep 2013 A1
20130245375 DiMaio et al. Sep 2013 A1
20130261640 Kim et al. Oct 2013 A1
20130272488 Bailey et al. Oct 2013 A1
20130272489 Dickman et al. Oct 2013 A1
20130274761 Devengenzo et al. Oct 2013 A1
20130281821 Liu et al. Oct 2013 A1
20130296884 Taylor et al. Nov 2013 A1
20130303887 Holsing et al. Nov 2013 A1
20130307955 Deitz et al. Nov 2013 A1
20130317521 Choi et al. Nov 2013 A1
20130325033 Schena et al. Dec 2013 A1
20130325035 Hauck et al. Dec 2013 A1
20130331686 Freysinger et al. Dec 2013 A1
20130331858 Devengenzo et al. Dec 2013 A1
20130331861 Yoon Dec 2013 A1
20130342578 Isaacs Dec 2013 A1
20130345717 Markvicka et al. Dec 2013 A1
20130345718 Crawford Dec 2013 A1
20130345757 Stad Dec 2013 A1
20140001235 Shelton, IV Jan 2014 A1
20140012131 Heruth et al. Jan 2014 A1
20140031664 Kang et al. Jan 2014 A1
20140046128 Lee et al. Feb 2014 A1
20140046132 Hoeg et al. Feb 2014 A1
20140046340 Wilson et al. Feb 2014 A1
20140049629 Siewerdsen et al. Feb 2014 A1
20140058406 Tsekos Feb 2014 A1
20140073914 Lavallee et al. Mar 2014 A1
20140080086 Chen Mar 2014 A1
20140081128 Verard et al. Mar 2014 A1
20140088612 Bartol et al. Mar 2014 A1
20140094694 Moctezuma de la Barrera Apr 2014 A1
20140094851 Gordon Apr 2014 A1
20140096369 Matsumoto et al. Apr 2014 A1
20140100587 Farritor et al. Apr 2014 A1
20140121676 Kostrzewski et al. May 2014 A1
20140128882 Kwak et al. May 2014 A1
20140135796 Simon et al. May 2014 A1
20140142591 Alvarez et al. May 2014 A1
20140142592 Moon et al. May 2014 A1
20140148692 Hartmann et al. May 2014 A1
20140163581 Devengenzo et al. Jun 2014 A1
20140171781 Stiles Jun 2014 A1
20140171900 Stiles Jun 2014 A1
20140171965 Loh et al. Jun 2014 A1
20140180308 von Grunberg Jun 2014 A1
20140180309 Seeber et al. Jun 2014 A1
20140187915 Yaroshenko et al. Jul 2014 A1
20140188132 Kang Jul 2014 A1
20140194699 Roh et al. Jul 2014 A1
20140130810 Azizian et al. Aug 2014 A1
20140221819 Sarment Aug 2014 A1
20140222023 Kim et al. Aug 2014 A1
20140228631 Kwak et al. Aug 2014 A1
20140234804 Huang et al. Aug 2014 A1
20140257328 Kim et al. Sep 2014 A1
20140257329 Jang et al. Sep 2014 A1
20140257330 Choi et al. Sep 2014 A1
20140275760 Lee et al. Sep 2014 A1
20140275985 Walker et al. Sep 2014 A1
20140276931 Parihar et al. Sep 2014 A1
20140276940 Seo Sep 2014 A1
20140276944 Farritor et al. Sep 2014 A1
20140288413 Hwang et al. Sep 2014 A1
20140299648 Shelton, IV et al. Oct 2014 A1
20140303434 Farritor et al. Oct 2014 A1
20140303643 Ha et al. Oct 2014 A1
20140305995 Shelton, IV et al. Oct 2014 A1
20140309659 Roh et al. Oct 2014 A1
20140316436 Bar et al. Oct 2014 A1
20140323803 Hoffman et al. Oct 2014 A1
20140324070 Min et al. Oct 2014 A1
20140330288 Date et al. Nov 2014 A1
20140364720 Darrow et al. Dec 2014 A1
20140371577 Maillet et al. Dec 2014 A1
20150039034 Frankel et al. Feb 2015 A1
20150085970 Bouhnik et al. Mar 2015 A1
20150146847 Liu May 2015 A1
20150150524 Yorkston et al. Jun 2015 A1
20150196261 Funk Jul 2015 A1
20150213633 Chang et al. Jul 2015 A1
20150324114 Hurley et al. Nov 2015 A1
20150335480 Alvarez et al. Nov 2015 A1
20150342647 Frankel et al. Dec 2015 A1
20160005194 Schretter et al. Jan 2016 A1
20160166329 Langan et al. Jun 2016 A1
20160235480 Scholl et al. Aug 2016 A1
20160249990 Glozman et al. Sep 2016 A1
20160302871 Gregerson et al. Oct 2016 A1
20160320322 Suzuki Nov 2016 A1
20160331335 Gregerson et al. Nov 2016 A1
20170135770 Scholl et al. May 2017 A1
20170143284 Sehnert et al. May 2017 A1
20170143426 Isaacs et al. May 2017 A1
20170156816 Ibrahim Jun 2017 A1
20170202629 Maillet et al. Jul 2017 A1
20170212723 Atarot et al. Jul 2017 A1
20170215825 Johnson et al. Aug 2017 A1
20170215826 Johnson et al. Aug 2017 A1
20170215827 Johnson et al. Aug 2017 A1
20170231710 Scholl et al. Aug 2017 A1
20170258426 Risher-Kelly et al. Sep 2017 A1
20170273748 Hourtash et al. Sep 2017 A1
20170296277 Hourtash et al. Oct 2017 A1
20170360493 Zucker et al. Dec 2017 A1
Foreign Referenced Citations (2)
Number Date Country
2016102026 Jun 2016 WO
2017003916 Jan 2017 WO
Non-Patent Literature Citations (1)
Entry
US 8,231,638 B2, 07/2012, Swarup et al. (withdrawn)
Related Publications (1)
Number Date Country
20190021800 A1 Jan 2019 US
Provisional Applications (1)
Number Date Country
62535591 Jul 2017 US