The present disclosure relates generally to the field of medical systems and, in particular, to a robot-assisted sensorized surgical guide that may be used in combination with an imaging device or system.
With any type of surgery there is a health risk to the patient. During a surgical procedure, gaining access to a target area within the body of a patient may be difficult and may require precise navigation around vital arteries and organs. What is needed is a universal assistant that enables a surgeon to perform safer and more efficient surgeries.
Briefly, and in general terms, the present disclosure is directed to various embodiments of a system and method for performing robot-assisted surgery. The system may include a radiological imaging device or system, a bed to support a patient, a robotic arm disposed adjacent to the radiological imaging system, and a sensorized surgical guide that is attached to the robotic arm, holds a surgical instrument, and measures the translation of the surgical instrument along an axis of intervention and its rotation about the axis of intervention. In use, the axis of intervention point is determined, and the robotic arm positions the sensorized surgical guide in line with the axis of intervention point so that the surgical instrument may be guided to the target area of the patient. Several embodiments of the surgical guide are disclosed.
Other features and advantages will become apparent from the following detailed description, taken in conjunction with the accompanying drawings, which illustrate by way of example the features of the various embodiments.
Where considered appropriate, reference numerals may be repeated among the drawings to indicate corresponding or analogous elements. Moreover, some of the blocks depicted in the drawings may be combined into a single function.
In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of embodiments of the invention. However, it will be understood by those of ordinary skill in the art that the embodiments of the present invention may be practiced without these specific details. In other instances, well-known methods, procedures, components, and circuits have not been described in detail so as not to obscure the present invention.
Each of the features and teachings disclosed herein can be used separately or in conjunction with other features and teachings to provide a surgical system including a radiological imaging system with a bed. Representative examples using many of these additional features and teachings, both separately and in combination, are described in further detail with reference to the attached figures.
With reference to
In particular, radiological imaging system 12 is suitable for performing radiological imaging examinations including, but not limited to, X-rays, CT scans, and fluoroscopy. In one embodiment, the imaging system includes a control unit suitable to control the radiological imaging system. Bed 18 extends along main direction 20 (shown in
As shown in
As shown in the example of
In one embodiment, translating component 36 includes a linear guide 40 suitable to control the translational motion along the sliding direction that is substantially parallel to main direction 20. Translating component 36 may include a carriage 42 suitable to slide along linear guide 40. In one embodiment, the carriage moves along the linear guide with the assistance of a motor. Any suitable mechanism may be used to move the gantry 22, either manually or mechanically/automatically.
In one embodiment, the radiological imaging system includes a rotation device (not shown) suitable to rotate gantry 22 about an axis of rotation that is substantially perpendicular to main direction 20 and, specifically, substantially perpendicular to the floor. The rotation device may include a first plate that is integrally attached to carriage 42. The rotation device may also include a second plate integrally attached to gantry 22. In addition, the rotation device may include a rotation component (not shown) that has pins, bearings, or other known mechanical elements suitable to permit the second plate, and thereby gantry 22, to rotate about the axis of rotation, in relation to the first plate, and therefore to the rest of radiological imaging system 12. The rotation device may also have a control lever or other mechanism, suitable to be held by an operator, to control the rotation of gantry 22 about the axis. A handle or any other type of grip may be used to control the rotation of gantry 22 about the axis.
In one embodiment, the rotation device and the control lever permit gantry 22 to be disposed in at least two configurations. One possible configuration is a working configuration in which gantry 22 is substantially perpendicular to main direction 20. Another possible configuration is a rest configuration in which gantry 22 is substantially parallel to main direction 20. In the rest configuration the bed may no longer be attached to the imaging system. The rotation device and control lever may also permit the gantry to be in a variety of other positions and angles relative to main direction 20. The robotic arm may also be put into a rest configuration in which the robotic arm is not extended, allowing the entire system to be transported more easily.
One example of an imaging system is disclosed in U.S. Pat. No. 10,016,171, the entirety of which is incorporated herein by reference, as if set forth fully herein. In one embodiment, detector 30 detects radiation when performing at least one of tomography, fluoroscopy, radiography, and multimodality and generates data signals based on the radiation received. Furthermore, in one embodiment, at least one detector includes at least one flat panel sensor and/or at least one linear sensor. In an example embodiment in which the at least one detector is a flat panel sensor, the flat panel sensor is selectably operable in at least a flat panel mode and a linear sensor mode obtained, for example, by activating one or more pixel rows that are, preferably, substantially perpendicular to the axis of the bore. In a further example embodiment herein, in the flat panel mode, the sensor performs at least one of fluoroscopy and tomography and, in the linear sensor mode, performs at least one of radiography and tomography. Other examples of an imaging system are disclosed in U.S. Pat. Nos. 9,510,793, 10,136,867, 10,154,824, and 10,265,042, each of which is incorporated herein by reference in its entirety, as if set forth fully herein. The system disclosed in this application may incorporate any of the imaging systems disclosed in these referenced patents, as robotic arm 14 and sensorized surgical guide 16 may be attached to a portion of any of these referenced imaging systems.
Robotic arm 14 and sensorized surgical guide 16 may be used with various other imaging devices or systems such as MRI devices. Robotic arm 14 and sensorized surgical guide 16 may be designed such that the robotic arm attaches to bed 18 or the imaging system for examination, and then may be detached from the bed or imaging system after the examination.
As shown in
In one embodiment, robotic arm base 50 may be mounted on base 34 of the radiological imaging system. In this embodiment, the robotic arm base moves in the same translational direction as the gantry over the target and/or bed. For ease of use, the robotic arm may be a modular arm that may be mounted to and dismounted from the gantry track or any other mounting point of the radiological imaging system. According to one embodiment, the modular robotic arm may be mounted upon a mobile cart that is motorized (on treads, wheels, etc.) or manually driven by a user from one location to another location. According to another embodiment, the modular robotic arm is mounted directly to the gantry, at a fixed point or on an internal or external track that is separate from the gantry track, or to either end of the base platform of the radiological imaging system. In yet another embodiment, more than one modular robotic arm is mounted to the radiological imaging system. Each of the modular robotic arms may be equipped with a unique tool to perform a specific task.
In one embodiment, the robotic arm may incorporate six-degree-of-freedom force sensors, which may be used to obtain controlled compliance in case of contact (voluntary or not) with the surgeon. This compliance may be obtained along only the intervention axis or along any other axis, depending on a pre-set configuration in the control software. These compliances may be used to adapt the robotic arm position with respect to a cannula and the patient.
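By way of illustration only, and not by way of limitation, the following Python sketch shows how an admittance-style compliance along the intervention axis might be computed from a force reading; the function name, stiffness, deadband, and offset limit are hypothetical values chosen for the example and are not part of this disclosure.

```python
def compliant_offset(force_along_axis_n, stiffness_n_per_mm=2.0,
                     deadband_n=1.5, max_offset_mm=10.0):
    """Map a measured contact force along the intervention axis to a small
    positional offset of the guide (a simple admittance-style behavior)."""
    if abs(force_along_axis_n) < deadband_n:
        return 0.0  # ignore sensor noise and light touches
    offset_mm = force_along_axis_n / stiffness_n_per_mm  # lower stiffness = larger offset
    # Never let the guide drift more than max_offset_mm from the planned axis.
    return max(-max_offset_mm, min(max_offset_mm, offset_mm))

# Example: a 6 N push by the surgeon along the axis yields a 3 mm compliant offset.
print(compliant_offset(6.0))
```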
System 10 may also include a system computer or server that is in communication with radiological imaging system 12, robotic arm 14, and sensorized surgical guide 16. The system computer or server also may be in communication with a display and graphical user interface. In one embodiment, the system computer includes navigation software capable of referencing a position and orientation of an axis of intervention point and instructing robotic arm 14 to move sensorized surgical guide 16 to align with the axis of intervention point. Data from the sensorized surgical guide related to the translation and rotational movement of a surgical instrument positioned through the sensorized surgical guide may be sent to the system computer in order to monitor the movement of the surgical instrument within the body of the patient. Images from the radiological imaging system 12 may also be sent to the system computer and displayed, which may allow the surgeon to view the target area to be treated. Using the images from the radiological imaging system, the navigation software may allow the surgeon to find the best axis of intervention point. Any data from a patient monitoring or tracking device may also be sent to the system computer. The surgical instrument may also be marked or coated such that it is more visible using the radiological imaging system. In one embodiment, the surgical instrument is radio-opaque. In some embodiments, only certain instruments, with special markings or the like, may be recognized by sensorized surgical guide 16. In this way, the system may more precisely be calibrated to provide more accurate translation and rotation measurements, because the software that handles the measurements may recognize the particular instrument and load the related calibration table. This result may be obtained by using a coated texture (that the sensor on the surgical guide could recognize) or by an RFID or other embedded identification system.
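As a minimal, purely illustrative sketch of the calibration-table behavior described above (the instrument identifiers, calibration values, and function names below are hypothetical), recognition of an instrument might drive the count-to-motion conversion as follows:

```python
# Hypothetical per-instrument calibration: sensor counts per millimeter of
# translation and counts per degree of rotation about the intervention axis.
CALIBRATION_TABLES = {
    "TROCAR-A": {"counts_per_mm": 400.0, "counts_per_deg": 11.1},
    "PROBE-B":  {"counts_per_mm": 380.0, "counts_per_deg": 10.6},
}

def load_calibration(instrument_id):
    """Return the calibration for a recognized instrument (identified by its
    coated texture or RFID tag), or None if the instrument is unknown."""
    return CALIBRATION_TABLES.get(instrument_id)

def counts_to_motion(instrument_id, axial_counts, tangential_counts):
    cal = load_calibration(instrument_id)
    if cal is None:
        raise ValueError("unrecognized instrument; no calibration loaded")
    translation_mm = axial_counts / cal["counts_per_mm"]
    rotation_deg = tangential_counts / cal["counts_per_deg"]
    return translation_mm, rotation_deg

print(counts_to_motion("TROCAR-A", 800, 111))  # -> (2.0 mm, 10.0 degrees)
```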
Embodiments of a display 200 associated with the system computer and navigation software are shown in
One embodiment 16a of sensorized surgical guide 16 is shown in
As shown in
By way of example only, and not by way of limitation,
In another embodiment, sensorized surgical guide 16a may include at least one, and preferably two, distance sensors 90 on the distal end of distal section 62 as shown in
In another embodiment, the sensorized surgical guide 16a may include a bar display 92 as shown in
Another embodiment 16b of sensorized surgical guide 16 is shown in
Furthermore, proximal guide 108 may include a surface reader or a displacement sensor 109 capable of measuring the translation of a surgical instrument. As discussed above, the surface reader or displacement sensor also may be used to measure the rotation of the surgical instrument. The various types of surface readers described above with respect to surface reader 69 may also be used with sensorized surgical guide 16b.
In one embodiment, the connection between distal section 102 and end portion 106 of the robotic arm may be a releasable connection, such as an electro-magnetic connection. This connection may also be pneumatic, hydraulic, or the like. In this embodiment, the connection may be controlled by the system computer or may manually be released. In use, if there were an issue, the system computer or surgeon could release the connection between distal section 102 and end portion 106 of the robotic arm to abort the robot-assisted holding of the cannula or other surgical instrument. In other embodiments, there may be a releasable connection located anywhere under proximal guide 108 and a separate releasable connection located anywhere under distal guide 110. As above, the releasable connection may be an electro-magnetic connection, pneumatic, hydraulic, or the like. Similarly, the releasable connection(s) in this embodiment may be controlled by the system computer or manually released by the surgeon in case aborting the robot-assisted holding of the cannula or other surgical instrument is necessary.
In yet another embodiment, proximal guide 108 and/or distal guide 110 may include a locking mechanism to prevent the movement of a surgical instrument or cannula, respectively. The locking mechanism may be controlled by the system computer or may be manually activated by the surgeon. In certain embodiments, the locking mechanism may be a mechanical spring-based device or an electro-magnetic coil-based device. In use, if sensors determine that the patient body has moved or the sensorized surgical guide or robotic arm breaches a threshold distance to the patient body, the system computer may activate the locking mechanism and prevent further translation of the cannula or surgical instrument.
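By way of example only, the decision to engage the locking mechanism might be expressed as a simple check in the system computer's monitoring loop; the clearance threshold and function name below are hypothetical.

```python
def should_lock(patient_moved, guide_to_patient_mm, min_clearance_mm=5.0):
    """Engage the locking mechanism if the patient tracker reports motion or
    if the guide or arm comes closer to the patient than the clearance threshold."""
    return patient_moved or guide_to_patient_mm < min_clearance_mm

# Example: patient motion detected -> lock the guides and stop further translation.
print(should_lock(patient_moved=True, guide_to_patient_mm=12.0))  # -> True
```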
By way of example only, in use, proximal section 100 is rotated approximately 90 degrees in order to place a cannula through the hole of distal guide 110. As stated above, the inner diameter of the distal guide should be sized such that the cannula is able to translate along the distal guide in a straight line without allowing the cannula to wobble or move at an angle that is not parallel to the axis of intervention 112. Once the cannula is in position, proximal section 100 is rotated back into its original working configuration and locked such that guides 108 and 110 are aligned with one another. An instrument may be slid through proximal guide 108 and into cannula 80 held within distal guide 110. The working end of the surgical instrument may exit the distal end of the cannula.
By way of example only, and not by way of limitation, a method of using system 10 will be described using sensorized surgical guide 16b. The sensorized surgical guide 16a, however, could be used in place of sensorized surgical guide 16b. In one embodiment, a surgical team diagnoses a patient and identifies a defect or target area, such as a tumor, that needs to be removed from the body. The surgical team plans the needed surgery including the point and angle of intervention into the body of the patient. Radiological imaging system 12 may be used to visualize a portion of the patient's body in order to plan the surgery and determine the intervention point and angle. In this example, the surgical team plans to remove a tumor from the patient.
As one of the initial steps, the patient is placed on bed 18 and the position of the body is scanned or otherwise determined and stored in the system computer. The body of the patient may be strapped down or otherwise held in place to prevent movement on bed 18. In one embodiment, system 10 may constantly monitor the position of the body or at least the target area on the patient. This may be done by reading sensors attached to the body of the patient, such as infrared (IR) sensors or other sensors that may be monitored by the navigation software system. Other technologies that monitor distance and location, such as a stereo IR camera and the like may be used. In addition to knowing the position of the body or at least the target area, the surgical team enters the intervention point and angle through the user interface of the system computer. In one embodiment, the system computer and navigation software may determine the axis of intervention point. The navigation software running on the system computer supplies the intervention point, angle, and direction (axis) of intervention to robotic arm 14, which moves to place sensorized surgical guide 16b (or 16a) along the axis of intervention and at a certain distance from the patient body as shown in
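For illustration only, and assuming the entry point and target point are both known in the imaging system's coordinate frame (the function name and standoff distance below are hypothetical), the navigation software's computation of the axis of intervention and of the guide pose might resemble the following sketch:

```python
import numpy as np

def intervention_axis(entry_point_mm, target_point_mm, standoff_mm=50.0):
    """Return a unit direction for the axis of intervention and a position
    for the sensorized surgical guide held at a standoff above the skin."""
    entry = np.asarray(entry_point_mm, dtype=float)
    target = np.asarray(target_point_mm, dtype=float)
    direction = target - entry
    direction /= np.linalg.norm(direction)            # unit axis of intervention
    guide_position = entry - standoff_mm * direction  # back off along the axis
    return direction, guide_position

# Example: a target 80 mm below the entry point along the vertical axis.
axis, guide_pos = intervention_axis([100.0, 50.0, 0.0], [100.0, 50.0, -80.0])
print(axis, guide_pos)  # axis -> [0, 0, -1]; guide held 50 mm above the skin
```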
With sensorized surgical guide 16b in proper alignment with respect to the target area of the patient, the surgeon may open proximal section 100 of the sensorized surgical guide as shown in
The surgical procedure may be continually assessed to monitor the location of the surgical instrument with respect to the target area and other structures (veins, arteries, organs, etc.) in the body. In one embodiment, displacement sensor 109 of sensorized surgical guide 16b reads the surgical instrument's penetration depth and sends this information to the navigation software on the system computer. The navigation software may display the surgical instrument's penetration depth on display 200 using a color-coded bar 202 or other visual stimulus. In one embodiment, a 2-D fluoroscopy image(s) 204 may be shown on the display to visually monitor the location of the surgical instrument as shown in
Yet another embodiment 16c of sensorized surgical guide 16 is shown in
Furthermore, proximal guide 128 may include a surface reader capable of measuring the translation of a surgical instrument. As discussed above, the surface reader also may be used to measure the rotation of the surgical instrument. The various types of surface readers described above with respect to surface reader 69 may also be used with sensorized surgical guide 16c.
By way of example only, a method using system 10 including sensorized surgical guide 16c will now be described. As one of the initial steps, the patient is placed on bed 18 and the position of the body is scanned or otherwise determined and stored in the system computer. The body of the patient may be strapped down or otherwise held in place to prevent movement on bed 18. In one embodiment, system 10 may constantly monitor the position of the body or at least the target area on the patient. This may be done by reading sensors attached to the body of the patient. Other technologies that monitor distance and location, such as cameras, lidar, lasers, and the like, may be used. In addition to knowing the position of the body or at least the target area, the surgical team enters the intervention point and angle using the user interface of the system computer. The navigation software running on the system computer supplies the intervention point, angle, and direction (axis) of intervention to robotic arm 14, which moves to place sensorized surgical guide 16c along the axis of intervention 132.
With sensorized surgical guide 16c in proper alignment with respect to the target area of the patient, the surgeon may insert a trocar and cannula 140 through window 126 and into holder arm 120, where the cannula is fed into and through the canal of distal guide 130 as shown in
A surgical instrument 141 may then be slid through the canal of proximal guide 128 and into cannula 140 held within distal guide 130. The working end of the surgical instrument may exit the distal end of the cannula to perform the surgical procedure as shown in
The surgical procedure may be continually assessed to monitor the location of the surgical instrument with respect to the target area and other structures (veins, arteries, organs, etc.) in the body. In one embodiment, the surface reader in proximal guide 128 of sensorized surgical guide 16c reads the surgical instrument's penetration depth and sends this information to the navigation software on the system computer. The navigation software may display the surgical instrument's penetration depth on a display 160 using a color-coded bar 162 (whose different colors are indicated by different cross-hatch patterns) or other visual stimulus. In this embodiment, the color-coded bar shows the distance to the target area, using green, yellow, and red to tell the surgeon to advance (green), slow down (yellow), and stop (red) progression. A pointer 166 may be shown on the display to indicate the current position of the distal end of the surgical instrument. Audio and tactile stimuli may also be used to provide feedback to the surgeon when advancing the surgical instrument. Also, a 2-D fluoroscopy image(s) 164 may be shown on the display to visually monitor the location of the surgical instrument as shown in
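By way of illustration only, the mapping from remaining distance to the color of the bar might be computed as follows; the threshold values and function name are hypothetical and would in practice be set by the surgical plan.

```python
def depth_bar_color(distance_to_target_mm, slow_down_mm=20.0, stop_mm=2.0):
    """Map the remaining distance between the instrument tip and the target
    area to the color of the on-screen bar (advance / slow down / stop)."""
    if distance_to_target_mm <= stop_mm:
        return "red"     # stop progression
    if distance_to_target_mm <= slow_down_mm:
        return "yellow"  # slow down
    return "green"       # safe to advance

remaining_mm = 35.0 - 18.5  # planned depth minus measured penetration depth
print(depth_bar_color(remaining_mm))  # -> "green"
```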
The sensorized surgical guide may also control the use of surgical instruments during surgery. More specifically, the surface reader on the sensorized surgical guide that measures the translation and rotation of the inserted surgical instrument may also read markings or patterns on the surface of the surgical instrument itself. If the markings or patterns on the surgical instrument indicate that the instrument comes from an authorized source or is of an authorized type, a shutter assembly allows the surgical instrument to be used with the sensorized surgical guide.
Reference is now made to
The signal generated when the pattern is detected may be used to command a shutter assembly made up of piston 6, spring 7, and solenoid 8. The shutter assembly may be an electro-mechanical device that is placed so as to obstruct, when commanded, surgical instrument path 4 (shown in
The linear movement of plunger 9 may move piston 6, which, when solenoid 8 is de-energized, partially obstructs surgical instrument path 4. This position may be held using spring 7. When solenoid 8 is energized, plunger 9 is pulled within solenoid 8 and places piston 6 in a position such that surgical instrument path 4 is free. The field produced by solenoid 8 on plunger 9 should be capable of overcoming the resistance of spring 7.
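The shutter behavior described above might be expressed, purely as an illustrative sketch (the class name and driver callable below are hypothetical), as:

```python
class ShutterController:
    """De-energized, the spring holds the piston in the instrument path;
    energized, the plunger pulls the piston clear and the path is free."""

    def __init__(self, set_solenoid):
        self._set_solenoid = set_solenoid  # callable that drives the coil

    def on_pattern_result(self, authorized: bool):
        if authorized:
            self._set_solenoid(True)   # energize solenoid 8: path 4 is free
        else:
            self._set_solenoid(False)  # de-energize: spring 7 holds piston 6 in path 4

# Example with a stand-in driver that merely records the coil state.
coil_state = {}
controller = ShutterController(lambda on: coil_state.update(energized=on))
controller.on_pattern_result(authorized=True)
print(coil_state)  # -> {'energized': True}
```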
Other shutter assemblies may be used in addition to or instead of the shutter assembly described above. The shutter may have an optional coating such as silicone to avoid potential instrument damage. One embodiment of a shutter may comprise an iris mechanism, e.g., a camera shutter that collapses circumferentially. Another embodiment of a shutter may comprise an inflatable or deflatable donut that closes the opening of the surgical path.
Reference is now made to
Besides the operations shown in
The authorizing pattern may be cross-hatch marks, surface roughening, arrangements of letters or numbers, or a one-dimensional or two-dimensional bar code or some other pattern that identifies the manufacturer of the surgical instrument, the manufacturer of the surgical guide, the manufacturer of the surgical system, including the robotics associated therein, or an authorized licensee of such manufacturers, depending on which entity desires to control surgical access. The pattern may also identify instrument type, so as to be able to allow or block certain types of surgical instruments. The pattern may be laser or chemical etched on the surface of the surgical instrument or otherwise printed thereon. The printing may be performed by a surgical instrument manufacturer before selling the instrument or by an authorized user of the surgical system before performing the surgical procedure (using an authorized printer and/or authorized printing software). The pattern may also include calibration marks so that the translation and rotation sensor may be synchronized with those marks in order to improve precision. It is preferable that the identifying pattern cannot be copied by an unauthorized entity.
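As a purely illustrative sketch of such an authorization check (the pattern format, manufacturer codes, and instrument-type codes below are hypothetical), a decoded pattern might be validated as follows:

```python
AUTHORIZED_MANUFACTURERS = {"0417", "0923"}   # hypothetical manufacturer codes
BLOCKED_INSTRUMENT_TYPES = {"UNCALIBRATED"}   # hypothetical type codes

def is_authorized(decoded_pattern):
    """Validate a decoded surface pattern assumed to take the hypothetical
    form '<manufacturer>:<instrument type>:<serial>'."""
    try:
        manufacturer, instrument_type, _serial = decoded_pattern.split(":")
    except ValueError:
        return False  # malformed or unreadable pattern -> block the instrument
    if manufacturer not in AUTHORIZED_MANUFACTURERS:
        return False
    return instrument_type not in BLOCKED_INSTRUMENT_TYPES

print(is_authorized("0417:TROCAR:000123"))  # -> True
print(is_authorized("no-pattern"))          # -> False
```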
Instead of or in addition to using a pattern or marking on the surface of the surgical instrument, inductive and/or capacitive sensing could be used to identify an authorized instrument. In this embodiment, the reader may read the inductive and/or capacitive identification.
Instead of or in addition to using a pattern or marking on the surface of the surgical instrument, the system may use a radio-frequency identification (RFID) tag to identify authorized surgical instruments. In this embodiment, the surface reader may also include an RFID reader that may detect the RFID tag embedded in or on the surgical instrument.
The benefit of this function of the surface reader is to prevent the use of unauthorized surgical instruments with a surgical system or surgical guide. Unauthorized instruments may not have the same quality or precision as that of authorized instruments and thus may pose a danger to a patient during surgery, may not work as well as authorized instruments, or may not be able to take advantage of all the benefits of the surgical system. Unauthorized instruments may also not be calibrated, such that the surgical system cannot determine how far the instrument has traveled or whether the instrument has been rotated.
The various embodiments of system 10 described above using sensorized surgical guides 16, 16a, 16b, and 16c are not limited to any specific procedure and may be used in a variety of surgical procedures. For example, system 10 may be used for bone edema procedures. Depending on the procedure, sensorized surgical guides 16, 16a, 16b, and 16c may be used with a variety of surgical instruments including, but not limited to, trocars, cannulas, scissors, probes, lasers, monopolar RF, bipolar RF or multipolar RF devices, graspers, forceps, electro-surgical knives, ultrasonic transducers, cameras, and the like. Surgical system 10 may also be used with other medical imaging systems such as MRI.
Reference is now made to
Sensorized surgical guide 251 may use an optical sensor such as a high-speed camera to analyze the surface of surgical instrument 253. The optical sensor may be a low-cost surface reader in a two-dimensional optical position measurement system. The working principle is based on detecting and correlating microscopic, inherent features of the surface over which the optical sensor travels. For example, optical sensor chip PMW3360 from PixArt Imaging Inc. (TW) may be used as the optical device.
The PMW3360 sensor includes an LED and an infrared (monochrome) image camera of 36×36 pixels running at a frame rate of up to 12,000 frames/sec. Such a high acquisition and processing speed ensures accurate tracking of even high-speed movements. The inspection area can be concentrated within a 1×1 mm square. The internal digital signal processor (DSP) acquires an image and compares it with the next image, isolating artifacts due to surface imperfections. The movement of these artifacts across successive frames provides an estimate of the two-dimensional translation in counts.
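For illustration only, the frame-to-frame correlation principle can be sketched in a few lines of Python using phase correlation on two synthetic 36×36 frames; this is not the sensor's actual firmware, only a demonstration of the working principle.

```python
import numpy as np

def estimate_shift(frame_a, frame_b):
    """Estimate the integer (dy, dx) displacement of surface artifacts
    between two successive frames using phase correlation."""
    cross_power = np.conj(np.fft.fft2(frame_a)) * np.fft.fft2(frame_b)
    corr = np.fft.ifft2(cross_power / (np.abs(cross_power) + 1e-9)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap shifts larger than half the frame back to negative values.
    if dy > frame_a.shape[0] // 2:
        dy -= frame_a.shape[0]
    if dx > frame_a.shape[1] // 2:
        dx -= frame_a.shape[1]
    return int(dy), int(dx)

# Synthetic 36x36 "surface" whose features move by (2, -1) pixels per frame.
rng = np.random.default_rng(0)
frame_a = rng.random((36, 36))
frame_b = np.roll(frame_a, shift=(2, -1), axis=(0, 1))
print(estimate_shift(frame_a, frame_b))  # -> (2, -1)
```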
The optical sensor is used with one or more optical lenses, such as is shown in
Sensorized surgical guide 251 also includes a pairing slider, such as that shown in
Sensorized surgical guide 251 also includes a detector processing unit. This is a microcontroller unit that includes a communication link with a host (e.g., a PC) and an interface with the optical sensor. The microcontroller application software reads information from the optical sensor, processes such information, and generates displacement and rotation measurements. The application software transmits such information to the application host via a USB interface, for example. Two types of information may be read from the optical sensor: (1) relative movements in arbitrary units (counts of movements, not mm or degrees); or (2) raster raw image information, which permits a better analysis of the data but may require more intensive calculation algorithms. All of these components can be assembled in a mechanical housing that holds them in a minimal space (see
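A minimal sketch of the detector processing unit's reporting loop (the register-reading stub, report format, and update period below are hypothetical) might look like the following:

```python
import time

def read_motion_counts():
    """Stand-in for reading the optical sensor's relative-motion registers
    (dx, dy in counts); real firmware would read these over SPI."""
    return 0, 0

def run_reporting_loop(send_to_host, n_reports=100, period_s=0.01):
    """Accumulate relative-movement counts and periodically report the
    running totals to the host application (e.g., over a USB serial link)."""
    total_dx = total_dy = 0
    for _ in range(n_reports):
        dx, dy = read_motion_counts()
        total_dx += dx
        total_dy += dy
        send_to_host({"axial_counts": total_dy, "tangential_counts": total_dx})
        time.sleep(period_s)

# Example with a stand-in host link that simply prints each report.
run_reporting_loop(print, n_reports=3, period_s=0.0)
```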
The left side of
In that case, another optical sensor and/or illuminator may be used in sensorized surgical guide 251 instead of the PMW3360. A coherent laser emitter (e.g., a vertical cavity surface emitting laser (VCSEL)) can be added externally to the PMW3360 sensor or integrated with a different family of sensor devices, like the PAT9125SEL optical navigation chip from PixArt. The major advantages of this approach are less interference from backscattered light, no need for an optical lens (because the laser beam is focused within the package), less sensitivity to distance variations (no sensitivity at all for short vibrations < 0.5 mm), and lower power consumption. Because the dimensions of this improved sensor are small (e.g., 3.4×3.2 mm), multiple devices can be installed in one sensorized surgical guide. This option enables multiple-material crossing detection, multi-point detection for better accuracy, and sensing of a broader range of surgical instruments by the sensorized surgical guide.
In one embodiment, the sensorized surgical guide has one set of bushings 280, 281 (
In certain embodiments, some parts of the sensorized surgical guide may be detachable for a number of reasons. First, they may be detachable to increase sterility. These detachable parts can be manufactured sterile and be used as disposable parts 341.
Second, the detachable parts may be used in a threshold release mechanism that allows the surgical instrument to be disconnected from the robot when the patient moves. This mechanism can be built using a magnetic threshold device, which disconnects the disposable parts (in contact with the surgical instrument) when the force on the threshold device exceeds a predefined level. In this way, if the patient moves as the surgical instrument is inserted into the body, the robot can hold its position and the surgical instrument can follow the patient's body, increasing safety.
Reference is now made to
When the surgical instrument slides in the middle of the spheres, the spheres rotate, similar to the way a trackball within a mouse moves. Two sensing mechanisms are described below—optical and magnetic—although others (including mechanical and a combination of optical and magnetic) may be used. These mechanisms sense the movement of the spheres rather than the instrument directly, unlike the sensing mechanisms of the other disclosed embodiments, which read the instrument itself. The mechanisms sense both translation and rotation of the instrument.
Although the optical sensing of this embodiment differs from the other disclosed embodiments, the optical sensing mechanism operates similarly in that it is still based on the recognition of structured patterns or surface roughness. But instead of sensing the patterns or roughness on the instrument, this embodiment senses the patterns or roughness of the spheres. The spheres may be covered with a matte surface or printed with a structured pattern. Sensors 371-374 may be of the same family as that used in previous embodiments (e.g., PMW3360 or PAT9125SEL or similar sensors), placed in order to see the bottom of the sphere. Although three spheres are included for stability, if there are fewer than three sensors, then only the sphere or spheres having a sensor need to be patterned—the other sphere or spheres can be unpatterned. But adding sensors can be useful for redundancy.
The magnetic sensing mechanism for this embodiment is based on magnetic field detection. A 3D, linear Hall-effect sensor integrated circuit provides a digital value corresponding to the magnetic field measured in each of the x-, y-, and z-axes. This type of sensor is often used in 3D sensing applications for head-on linear motion, slide-by position sensing, and rotation angle measurements.
For the sensorized surgical guide of the present invention, each sphere of removable part 364 of the sensor mechanism may be magnetic, and when the instrument travels or rotates in the center of the spheres, it causes the spheres to move. If the spheres are magnetic, a rotating field is created and sensed by the Hall-effect sensors. Changes in the magnetic field can be processed to obtain linear and rotational movements as outputs.
As with the optical arrangement, if there are fewer than three sensors, then only the sphere or spheres having a sensor need to be magnetic—the other sphere or spheres can be non-magnetic. But having three sensors and magnetic spheres yields a redundancy that can improve sensitivity and accuracy and reduce the distortion effect resulting when a metallic tool is used or when the sensorized surgical guide is in proximity to large metal masses.
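By way of illustration only, and assuming each sensed sphere is diametrically magnetized and rolls against the instrument without slipping (the radius and function names below are hypothetical), the Hall-effect readings might be converted to instrument motion as follows:

```python
import math

SPHERE_RADIUS_MM = 5.0  # assumed radius of each magnetic sphere

def sphere_angle_deg(bx, by):
    """Estimate the rotation angle of a diametrically magnetized sphere from
    the in-plane field components reported by a 3D Hall-effect sensor."""
    return math.degrees(math.atan2(by, bx))

def instrument_travel_mm(prev_angle_deg, new_angle_deg):
    """A sphere rolling without slipping converts instrument translation into
    sphere rotation: arc length = radius x angle (in radians)."""
    delta_rad = math.radians(new_angle_deg - prev_angle_deg)
    return SPHERE_RADIUS_MM * delta_rad

# Example: the measured field vector swings 45 degrees between two samples.
a0 = sphere_angle_deg(1.0, 0.0)
a1 = sphere_angle_deg(1.0, 1.0)
print(round(instrument_travel_mm(a0, a1), 2))  # -> 3.93 mm of travel
```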
Other embodiments are contemplated. Instead of using an optical or magnetic sensor to determine translation and/or rotation of the surgical instrument, a mechanical sensor may be used. In the event a pattern is laser or chemical etched or embossed or engraved on the surface of a surgical instrument, a surface reader may comprise a profilometer to read the etching or embossing or engraving. A wheel, such as that used in an opisometer or curvimeter, may be used to track the movement of the surgical instrument. The wheel may be connected to an electronic encoder to calculate distance. Two wheels, at 90 degrees to one another, may be used to track both translation and rotation of the instrument. In another embodiment, three linear wheels and three rotational wheels may be used for greater accuracy. In another embodiment, a trackball, such as that found in a computer mouse, may be used to track translation and rotation.
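For illustration only, the encoder-wheel variant reduces to a counts-to-distance conversion; the wheel diameter and encoder resolution below are assumed values.

```python
import math

WHEEL_DIAMETER_MM = 6.0  # assumed diameter of the measuring wheel
COUNTS_PER_REV = 1024    # assumed encoder resolution

def wheel_distance_mm(encoder_counts):
    """Distance rolled by an encoder wheel pressed against the instrument,
    as in an opisometer: counts -> revolutions -> circumference travelled."""
    revolutions = encoder_counts / COUNTS_PER_REV
    return revolutions * math.pi * WHEEL_DIAMETER_MM

# One wheel aligned with the shaft measures translation; a second wheel at 90
# degrees measures the surface travel due to rotation about the shaft axis.
print(round(wheel_distance_mm(512), 2))  # half a revolution -> 9.42 mm
```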
In another embodiment, the sensor may be moved to the surgical instrument so that the instrument itself does not need to be read. In this embodiment, the instrument is surrounded by a clamping component made up of two pieces—a metal top and a magnetic bottom. This clamping component fits inside of the surgical guide tube. The top may contain a light source such as an LED or a laser, as well as an optical sensor; the bottom may contain a battery to power the light source. Alternatively, the top may contain the battery and connecting it to the bottom half completes a power circuit. Once the clamping component surrounds the instrument and the pair is inserted into the guide tube, the clamping component is preferably locked, for example by actuating solenoids to keep the clamping component together. As the instrument moves down the guide, the LED/laser on the clamping component illuminates the inside surface of the guide tube, and the optical sensor reads the surface of the guide tube. Alternatively, the guide tube may be lined with one or more magnetic strips that can read the relative location and rotation of the clamping component via magnetic sensing.
In another embodiment, the clamping component expands to an integral guide tube that clamps onto the surgical instrument. The surgical instrument is inserted into the guide tube, and the guide tube, using, for example, a rack-and-pinion mechanism, tightens an O-ring or a grommet clamp on the instrument. The guide tube may be marked with gradations or etchings or patterns as described above with respect to the instrument itself, so these markings may be read using an LED/laser/optical sensor combination also as described above. Thus, the surgical instrument would not need to be itself marked.
The above discussion is meant to illustrate the principles and various embodiments of the present invention. Numerous variations and modifications will become apparent to those skilled in the art once the above disclosure is fully appreciated. It is intended that the following claims be interpreted to embrace all such variations and modifications.
This application is a continuation in part of U.S. application Ser. No. 16/160,575, filed on Oct. 15, 2018, which claims priority from U.S. Provisional Application No. 62/572,986, filed on Oct. 16, 2017, and from U.S. Provisional Application No. 62/627,565, filed on Feb. 7, 2018, all of which are incorporated by reference in their entireties.