The present invention generally relates to image-guided surgery. More specifically, the invention relates to image-guided surgery that incorporates mixed reality with a flexible biopsy needle.
When surgical biopsy needles are placed into a human body, the physician placing them is no longer able to see the shape, orientation and position of the needle tip because the body is opaque. There is a need to show, in real-time, the three-dimensional shape, orientation and position of a thin needle, or other similar flexible device, throughout the entire medical procedure, particularly when any portion of the device is hidden from direct view within the human body.
To address the needs in the art, a method of real-time 3D flexible needle tracking for image-guided surgery is provided that includes uploading, using a controller, a virtual model of a flexible needle to a calibrated 3D mixed-reality headset; establishing an initial position of a base of the flexible needle relative to a target under test, using the controller and the flexible needle, where the flexible needle includes sensors spanning from the base along a length of the flexible needle, where the sensors communicate a position, a shape, and an orientation of the flexible needle to the controller, and where the controller communicates the sensor positions to the calibrated 3D mixed-reality headset; and using the calibrated 3D mixed-reality headset to display in real time a position and shape of the flexible needle relative to the target under test.
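By way of illustration only, the following minimal Python sketch shows the data flow recited above (sensors to controller to headset); all names here (NeedleState, read_sensors, send_to_headset) are hypothetical placeholders rather than an actual implementation or API.

```python
# Minimal sketch, assuming placeholder sensor and headset interfaces.
import time
from dataclasses import dataclass

@dataclass
class NeedleState:
    base_position: tuple      # (x, y, z) of the needle base, in metres
    base_orientation: tuple   # quaternion (w, x, y, z)
    shape_samples: list       # 3D points sampled along the needle shaft

def read_sensors():
    """Placeholder for the strain-gauge readout; returns a straight needle."""
    return NeedleState((0.0, 0.0, 0.0), (1.0, 0.0, 0.0, 0.0),
                       [(0.0, 0.0, 0.01 * i) for i in range(15)])

def send_to_headset(state: NeedleState):
    """Placeholder for the wireless link to the calibrated headset."""
    print(f"tip at {state.shape_samples[-1]}")

for _ in range(3):                 # the real loop runs at >100 Hz
    send_to_headset(read_sensors())
    time.sleep(0.01)
```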
In another aspect of the invention, the 3D mixed-reality headset is programmed to display an extension to the needle tip as a line such that it aids the user in perceiving the direction in which the needle tip is pointing.
According to one aspect of the invention, the calibrated 3D mixed reality headset includes a head-mounted optical see-through stereoscopic display configured to display 3D content in the surroundings of a user.
In another aspect of the invention, the calibrated 3D mixed-reality headset is configured to display 3D reconstructions of structures of the target under test when coupled to a medical imaging device selected from the group consisting of MRI, ultrasound and CT. The sensors communicate to the controller the reference frame of the patient, which the controller can use to display these medical 3D images inside the patient so that the user performing a biopsy can steer the needle insertion based on these images.
According to a further aspect of the invention, the sensors include optical strain gauges.
In yet another aspect of the invention, the controller is configured to measure the relative position and orientation of the calibrated 3D mixed reality headset to the flexible needle and the target under test.
In another aspect of the invention, calibrating the 3D mixed-reality headset includes using pupil-tracking cameras rigidly attached to the 3D mixed-reality headset and a computer vision algorithm to track eye positions, where a relative position between the eyes and the 3D mixed-reality headset is provided.
The present invention provides a new real-time visual guidance modality to assist physicians during a needle insertion procedure. The invention comprises four major components:
1) A sensorized needle 10 or device that is instrumented with strain gauges that enable real-time measurements of the precise shape of that device. This may include needles sensorized with optical fiber Bragg gratings (FBG).
2) A controller operating a position tracking system that measures, in real-time, the position and orientation of rigid parts of the needle, such as the handle. The position tracking system may be based on electromagnetic trackers, a stereoscopic optical system, a room-anchored multi-camera triangulation system, or a computer vision system. In one embodiment, this component uses standard tag detection techniques (see the tag-detection sketch after this list).
3) Preoperative scans of the patient anatomy registered to the patient using fiducial markers. The scans may be gathered through medical imaging methods such as MRI, CT, PET, Fluoroscopy or Ultrasound.
4) A head-mounted computer and transparent display system, such as a calibrated 3D mixed-reality headset, that overlays a holographic reconstruction of the biopsy needle based on the real-time information (position, orientation, and shape) gathered from (1) and (2).
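As an illustration of the tag detection mentioned in component (2), the following sketch uses OpenCV's ArUco module, one standard tag-detection technique; the camera intrinsics, image file name, and 20 mm marker size are assumed values, not parameters disclosed here.

```python
# Sketch: recover the 6-DOF pose of a fiducial tag on the needle handle.
import cv2
import numpy as np

camera_matrix = np.array([[800.0, 0.0, 320.0],     # assumed intrinsics
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

frame = cv2.imread("handle_view.png")    # placeholder headset-camera image
corners, ids, _ = detector.detectMarkers(frame)
if ids is not None:
    s = 0.01                             # half of the 20 mm marker side
    obj_pts = np.array([[-s, s, 0], [s, s, 0],
                        [s, -s, 0], [-s, -s, 0]], dtype=np.float32)
    ok, rvec, tvec = cv2.solvePnP(obj_pts, corners[0].reshape(4, 2),
                                  camera_matrix, dist_coeffs)
    print("handle position (m):", tvec.ravel())
```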
Referring to the drawings in detail, a physician using the visual guidance system during a needle biopsy procedure is shown in
This design includes, in addition to all the features of a traditional biopsy needle, two components:
(i) a plurality of optical strain gauges embedded along the needle stylet and
(ii) visual fiducial markers.
The signal from the optical strain gauges is measured using an optical interrogator 14, and the visual fiducial markers are measured with a digital camera 20 attached to the calibrated 3D mixed-reality headset 11. The measured data from the strain sensors are processed by a computer/controller 12 into needle shape data, which are then sent to the calibrated 3D mixed-reality headset 11 through a wireless connection fast enough that individual updates are imperceptible (e.g., greater than 100 Hz). The calibrated 3D mixed-reality headset 11 uses the shape information together with the position and orientation of the needle handle to display 22 a holographic representation of the needle 23 overlaid on the physical biopsy needle 10 (see
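For illustration, the sketch below shows one conventional way such strain readings can be reduced to curvature: a Bragg wavelength shift is converted to strain and then, via the fiber's offset from the needle's neutral axis, to local bending curvature. The photo-elastic constant (~0.22 for silica fiber) and the 0.5 mm offset are typical assumed values, not parameters disclosed here.

```python
# Sketch: FBG wavelength shifts -> strain -> local bending curvature.
import numpy as np

PHOTO_ELASTIC = 0.22     # assumed p_e for silica fiber
FIBER_OFFSET = 0.0005    # assumed offset from the neutral axis, metres

def curvature_from_fbg(lambda_measured, lambda_rest):
    """Return curvature (1/m) at each grating along the needle."""
    # Delta-lambda / lambda = (1 - p_e) * strain, so invert for strain.
    strain = (lambda_measured - lambda_rest) / lambda_rest / (1.0 - PHOTO_ELASTIC)
    # Bending strain at offset d from the neutral axis is kappa * d.
    return strain / FIBER_OFFSET

rest = np.array([1530e-9, 1540e-9, 1550e-9])        # Bragg wavelengths at rest
meas = rest * (1 + np.array([1e-5, 2e-5, 1.5e-5]))  # shifted under bending
print(curvature_from_fbg(meas, rest))               # curvature at each sensor
```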
As shown in
Another variation of the method to perform needle position and orientation tracking is to use a magnetic tracking system. In this implementation, an electromagnetic transmitter would be placed within 0.5 meters of both the biopsy needle 10 and the calibrated 3D mixed-reality headset 11. A 6 degrees-of-freedom magnetic tracking sensor would be fixed to each of the needle 10 and the calibrated 3D mixed-reality headset 11. Each magnetic tracking sensor is able to determine its own position and orientation given the presence of the electromagnetic field from the transmitter. The readings from the two tracking sensors can then be combined to obtain the position and orientation of the needle 10 with respect to the calibrated 3D mixed-reality headset 11 through a simple translation matrix and a rotation matrix, respectively, calculated from the two readings.
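A minimal sketch of this pose composition, assuming each tracker reading is expressed as a 4x4 homogeneous transform in the transmitter's frame:

```python
# Sketch: needle pose in the headset frame from two 6-DOF tracker readings.
import numpy as np

def relative_pose(T_tx_headset, T_tx_needle):
    """Pose of the needle in the headset frame: inv(T_headset) @ T_needle."""
    return np.linalg.inv(T_tx_headset) @ T_tx_needle

# Example: headset 0.3 m left of the transmitter, needle in front of it.
T_headset = np.eye(4); T_headset[:3, 3] = [-0.3, 0.0, 0.0]
T_needle = np.eye(4);  T_needle[:3, 3] = [0.2, 0.0, 0.1]
print(relative_pose(T_headset, T_needle)[:3, 3])  # needle seen by the headset
```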
On the computation side, the calibrated 3D mixed-reality headset 11 must convert the curvature of the needle 10 into its deflection and ultimately build the mesh representation of the needle 10 from the deflection data. To obtain the deflection of the needle, constant-length links connected through revolute joints were used.
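The following sketch illustrates the link-chain idea in 2D for brevity (the actual system operates in 3D): each constant-length link rotates at its revolute joint by the local curvature times the link length, and the joint positions trace out the deflected needle.

```python
# Sketch: integrate per-link curvature into deflected needle points (2D).
import numpy as np

def deflection_from_curvature(curvatures, link_len=0.005):
    """Chain constant-length links; each joint rotates by kappa * link_len."""
    pts, angle, pos = [np.zeros(2)], 0.0, np.zeros(2)
    for kappa in curvatures:
        angle += kappa * link_len                 # revolute-joint rotation
        pos = pos + link_len * np.array([np.sin(angle), np.cos(angle)])
        pts.append(pos)
    return np.array(pts)

# 30 links of 5 mm each under a constant, gentle 2 m^-1 curvature.
print(deflection_from_curvature([2.0] * 30)[-1])  # deflected tip position
```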
Given the real-time needle shape data and the needle base 24 position and orientation data, the calibrated 3D mixed-reality headset 11 can construct a virtual needle 23 following these steps:
1) Use the data obtained from the visual fiducial system to display a virtual needle handle at the measured position and orientation of the physical needle handle.
2) Use the polynomial representation of the shape of the needle to construct the shape of the needle shaft. This virtual needle can be constructed using procedural mesh generation to create a curved cylinder starting at the connection point on the needle handle and following the needle shape data. As a result, every part of the holographic needle should overlay the physical needle. The wearer of the calibrated 3D mixed-reality headset 11 will be able to see the entire needle even when it has been partially occluded, as shown in
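As an illustration of step 2, the sketch below generates a curved-cylinder mesh procedurally by sweeping rings of vertices along a reconstructed centerline and stitching them into triangles; the radius, ring resolution, and rendering-engine details are assumed for the example.

```python
# Sketch: procedural tube mesh along the reconstructed needle centerline.
import numpy as np

def tube_mesh(centerline, radius=0.0006, sides=8):
    """Return (vertices, triangles) for a curved cylinder along `centerline`."""
    verts, tris = [], []
    for i, p in enumerate(centerline):
        # Central-difference tangent along the needle at this ring.
        t = centerline[min(i + 1, len(centerline) - 1)] - centerline[max(i - 1, 0)]
        t = t / np.linalg.norm(t)
        n = np.cross(t, [0.0, 0.0, 1.0])
        if np.linalg.norm(n) < 1e-6:              # tangent parallel to z axis
            n = np.cross(t, [0.0, 1.0, 0.0])
        n = n / np.linalg.norm(n)
        b = np.cross(t, n)
        for k in range(sides):                    # ring of vertices around p
            ang = 2.0 * np.pi * k / sides
            verts.append(p + radius * (np.cos(ang) * n + np.sin(ang) * b))
    for i in range(len(centerline) - 1):          # stitch adjacent rings
        for k in range(sides):
            a = i * sides + k
            b = i * sides + (k + 1) % sides
            tris += [(a, b, a + sides), (b, b + sides, a + sides)]
    return np.array(verts), tris

shaft = np.array([[0.0, 0.0, 0.01 * i] for i in range(10)])  # test centerline
v, t = tube_mesh(shaft)
print(len(v), "vertices,", len(t), "triangles")
```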
Another aspect of this invention is the integration of preoperative images into the image-guidance system. These preoperative images, which may be taken through MRI, CT, PET, fluoroscopy or ultrasound, reveal abnormalities, structures, anatomy and/or targets that the physician may want to visualize in real time while inserting the needle 10. These images may be registered to fiducial markers 50 fixed on the patient (see
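For illustration, fiducial-based registration of this kind is commonly performed with rigid point-set alignment; the sketch below uses the standard Kabsch/Horn SVD method (a conventional technique, not necessarily the one employed here) to map fiducial positions from the scan frame to the patient frame.

```python
# Sketch: rigid registration of scan-space fiducials to tracked fiducials.
import numpy as np

def register(scan_pts, patient_pts):
    """Return rotation R and translation t mapping scan -> patient frame."""
    cs, cp = scan_pts.mean(0), patient_pts.mean(0)
    H = (scan_pts - cs).T @ (patient_pts - cp)    # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflection
    R = Vt.T @ D @ U.T
    return R, cp - R @ cs

scan = np.array([[0, 0, 0], [0.1, 0, 0], [0, 0.1, 0], [0, 0, 0.1]], float)
patient = scan @ np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]]).T + [0.5, 0.2, 0.0]
R, t = register(scan, patient)
print(np.allclose(scan @ R.T + t, patient))       # True: fiducials aligned
```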
Since this augmented reality system is being used for visualizing objects at high positional accuracy, it is important to carefully perform the registration of the headset to objects and the patient. As shown in
In addition to the system's capability of superimposing a virtual needle on a real needle, the system can be programmed to extend the virtual needle tip as a line. Since the deflection of the needle is already known to the display device, this can be easily done by taking the 3D position and slope of the tip and drawing a 3D line extending out from the tip. Displaying this needle tip extension may help the physician better perceive the insertion angle of the needle, which may result in more accurate needle placement.
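A minimal sketch of this tip extension, approximating the tip slope from the last two reconstructed shape points (the 5 cm extension length is an assumed value):

```python
# Sketch: endpoints of a straight line continuing from the needle tip.
import numpy as np

def tip_extension(shape_pts, length=0.05):
    """Return the two endpoints of the tip-extension line."""
    tip = shape_pts[-1]
    tangent = shape_pts[-1] - shape_pts[-2]       # slope at the tip
    tangent = tangent / np.linalg.norm(tangent)
    return tip, tip + length * tangent

shape = np.array([[0.0, 0.0, 0.01 * i] for i in range(12)])  # reconstructed shaft
start, end = tip_extension(shape)
print(start, "->", end)  # draw a 3D line between these points in the headset
```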
The present invention has now been described in accordance with several exemplary embodiments, which are intended to be illustrative in all aspects, rather than restrictive. Thus, the present invention is capable of many variations in detailed implementation, which may be derived from the description contained herein by a person of ordinary skill in the art.
All such variations are considered to be within the scope and spirit of the present invention as defined by the following claims and their legal equivalents.
This application claims priority from U.S. Provisional Patent Application 62/409,601 filed Oct. 18, 2016, which is incorporated herein by reference.
This invention was made with Government support under contract EB009055 awarded by the National Institutes of Health, under contract CA159992 awarded by the National Institutes of Health, and under contract CA009695 awarded by the National Institutes of Health. The Government has certain rights in the invention.