Apparatus and method for a global coordinate system for use in robotic surgery

Information

  • Patent Grant
  • Patent Number
    12,083,043
  • Date Filed
    Thursday, August 1, 2019
  • Date Issued
    Tuesday, September 10, 2024
Abstract
An apparatus and method for establishing a global coordinate system to facilitate robotic-assisted surgery. The coordinate system may be established using a combination of robotic data, i.e., kinematics, and optical coherence tomographic images generated by an overhead optical assembly and a tool-based sensor. Using these components, the system may generate a computer-registered three-dimensional model of the patient's eye. In some embodiments, the system may also generate a virtual boundary within the coordinate system to prevent inadvertent injury to the patient.
Description
BACKGROUND OF THE INVENTION
Field of the Invention

The field of the present application pertains to medical devices. More particularly, the field of the invention pertains to an apparatus and method for a global coordinate system for use in robotic-assisted surgery.


Embodiments described herein are directed to a new method, apparatus, and system for generating and operating a global coordinate system that may be useful for performing robotically-assisted surgery.


Description of the Related Art

Present microsurgical procedures are extremely technique-dependent. For example, several existing solutions for eye surgery involve various techniques with lasers and phacoemulsification.


Modern extracapsular cataract surgery is usually performed using a microsurgical technique called phacoemulsification, whereby the cataract is emulsified with an ultrasonic handpiece and then suctioned out of the eye. Before phacoemulsification can be performed, one or more incisions are made in the eye to allow the introduction of surgical instruments. The surgeon then removes the anterior face of the capsule that contains the lens inside the eye. A phacoemulsification probe is an ultrasonic handpiece with a titanium or steel needle. The tip of the needle vibrates at ultrasonic frequency to sculpt and emulsify the cataract while a pump aspirates particles through the tip. In some techniques, a second fine steel instrument called a chopper is used from a side port to help chop the nucleus into smaller pieces. The cataract is usually broken into numerous pieces, and each piece is emulsified and aspirated out with suction. Emulsifying the nucleus makes it easier to aspirate the particles. After the hard central lens nucleus has been removed with phacoemulsification, the softer outer lens cortex is removed with suction only. As with other cataract extraction procedures, an intraocular lens implant (IOL) is placed into the remaining lens capsule.


One possible improvement to phacoemulsification is cataract surgery performed with lasers. Femtosecond laser cataract surgery is rapidly emerging as a technology that may allow for improved precision of incision formation and emulsification of the cataract. Although phacoemulsification and laser-based cataract surgery work well for many patients, these technologies have several shortcomings. For example, phacoemulsification ultrasound probes must propagate ultrasound energy along the length of the probe, from a proximal transducer to a distal tip. This propagation may transmit ultrasound energy along the probe to tissues in and around the eye that do not benefit from it. Current lens-emulsifying probes generate cavitation energy that is initiated within the area of the lens nucleus and radiates outward toward the lens capsule, placing the lens capsule at risk of damage by this energy. Ultrasound probes also tend to generate more heat than would be desirable for a procedure in the eye.


Finally, it may be quite difficult to steer an ultrasound probe around corners or bends, due to the mechanical requirements of propagating the ultrasound wave along the entire instrument. In other words, the probe may have to be rigid or at least more rigid than would be desirable.


Probe based lasers have similar drawbacks. They may generate unwanted heat in the eye and are often difficult to control, thus risking damage to important nearby tissues. They also are easily damaged when attempting to navigate tight corners, as fibers in a laser probe may easily break.


Femtosecond laser systems have been devised to assist in the removal of cataracts. These devices are used to create the entry sites through the cornea and sclera into the eye, as well as to remove the anterior face of the capsule. In addition, the femtosecond laser energy can be focused within the lens nucleus itself, and used to “pre-chop” the lens nucleus into a number of pieces that can then be more easily removed with the phacoemulsification probe. However, these lasers can only fragment the center zone of the lens that is visible within the pupil (the iris blocks the peripheral lens from laser energy), so that fracture and removal of the peripheral lens by another method is still necessary. They are costly to own and operate and have the additional drawback of extending operative time.


The aforementioned techniques require extreme precision for successful completion with minimal complications. Therefore, it would be beneficial to have an apparatus and method for a global coordinate system to assist in performing surgery in various applications, including eye surgery, micro-surgery, and/or other emulsification applications.


SUMMARY OF THE INVENTION

In general, the present invention provides for a global coordinate system for use in robotically-assisted surgery. In one aspect, the present invention provides for a system for facilitating robotic surgery comprising: a first mechanical arm configured to facilitate operation of an optical assembly that comprises a first optical coherence tomographical imaging system; a second mechanical arm operatively coupled to operate a surgical tool, the surgical tool comprising a second optical coherence tomographical imaging system; and a coordinate system processing unit configured to generate a global coordinate system using optical coherence tomographic data generated by the first and second optical coherence tomographical imaging systems, and kinematic registration data based on the locations of the first mechanical arm, the second mechanical arm, and the tool.


In related systems, the first mechanical arm is robotically-driven. In some systems, the second mechanical arm is robotically-driven. In some systems, the coordinate system processing unit is further configured to define a virtual boundary for avoiding inadvertent injury to patient tissue and anatomy. In some systems, the virtual boundary restricts movement of the second mechanical arm. In some systems, movement of the second mechanical arm into the virtual boundary triggers auditory feedback. In some systems, the second optical coherence tomographical system is configured to generate images of tissue topology.


In related systems, the optical assembly further comprises a visual light microscope configured for positioning the optical assembly over the eye of a patient. In some systems, the optical assembly further comprises a shared lens stack configured to be used by both the visual light microscope and the first optical coherence tomographical system.


In another aspect, the present invention provides a method for facilitating robotic surgery using a global coordinate system comprising registering the location of an optical assembly with a predetermined focal length that is operatively coupled to the flange of a mechanical arm, orienting the optical assembly over a patient's eye, registering the location of the patient's eye using the predetermined focal length of the optical assembly, scanning the patient's eye using an optical tomographical system integrated into the optical assembly, retrieving optical tomographical data, processing the data to generate a three-dimensional model of the patient's eye, and integrating the three-dimensional model into a global coordinate system that has been registered to the location of the mechanical arm.





BRIEF DESCRIPTION OF THE DRAWINGS

These and other aspects and embodiments will be described in greater detail below, in reference to the attached drawing figures.



FIG. 1 illustrates a robotic surgical system, in accordance with an embodiment of the present invention;



FIG. 2 illustrates force drag through the use of a force-torque sensor in a flange joint, in accordance with an embodiment of the present invention;



FIG. 3 illustrates force drag in a system that does not use force sensors, in accordance with an embodiment of the invention;



FIG. 4 is a perspective view of an apparatus for a GOCT (Global Optical Coherence Tomography) system, according to one embodiment of the present invention;



FIGS. 5A and 5B illustrate a flow diagram that details the steps for initializing system 400 after the patient has been positioned on the operating table;



FIG. 6 illustrates an optical apparatus that uses shared optics for both a visual light microscope and an OCT system, in accordance with an embodiment of the present invention;



FIG. 7 illustrates an optical apparatus that uses shared optics for both a visual light microscope and an OCT system, in accordance with an embodiment of the present invention;



FIG. 8 illustrates a flow diagram that shows how the system may generate a three-dimensional model of the optical region based on OCT data received in image segments, in accordance with an embodiment of the present invention;



FIG. 9 illustrates a series of OCT image segments from a patient's eye that may be collected by an OCT system, in accordance with an embodiment of the present invention;



FIG. 10 illustrates the use of a surgical tool with reflective strips that are configured to be registered in an OCT system, in accordance with an embodiment of the present invention;



FIG. 11A illustrates a spectrograph showing the expected results of using the tool-based OCT system in tool 1002 from FIG. 10, in accordance with an embodiment of the present invention;



FIG. 11B illustrates how the spectral spikes may be analyzed to determine the relative depth of the anatomical features, in accordance with an embodiment of the present invention; and



FIGS. 12A and 12B illustrate the use of a global OCT-based coordinate system and related surgical tool for peeling the internal limiting membrane (ILM) during macular surgery, in accordance with an embodiment of the claimed invention.





DETAILED DESCRIPTION OF THE INVENTION

Although certain preferred embodiments and examples are disclosed below, inventive subject matter extends beyond the specifically disclosed embodiments to other alternative embodiments and/or uses, and to modifications and equivalents thereof. Thus, the scope of the claims appended hereto is not limited by any of the particular embodiments described below. For example, in any method or process disclosed herein, the acts or operations of the method or process may be performed in any suitable sequence and are not necessarily limited to any particular disclosed sequence. Various operations may be described as multiple discrete operations in turn, in a manner that may be helpful in understanding certain embodiments; however, the order of description should not be construed to imply that these operations are order dependent. Additionally, the structures, systems, and/or devices described herein may be embodied as integrated components or as separate components.


For purposes of comparing various embodiments, certain aspects and advantages of these embodiments are described. Not necessarily all such aspects or advantages are achieved by any particular embodiment. Thus, for example, various embodiments may be carried out in a manner that achieves or optimizes one advantage or group of advantages as taught herein without necessarily achieving other aspects or advantages as may also be taught or suggested herein.


A. Surgical Robotic Arm Platform.



FIG. 1 illustrates a robotic surgical system, in accordance with an embodiment of the present invention. Robotic system 100 may comprise several system carts 101, 102, and 103 with at least one mechanical arm, such as arms 104, 105, and 106 respectively. The system carts may be in communication with a remotely-located command console (not shown). In practice, the system carts may be arranged to provide access to a patient located on workspace 112, while a physician may control the system 100 from the comfort of the command console. In some embodiments, the system cart may be integrated into the operating table or bed for stability and access to the patient. Alternatively, robotic system 100 may be tethered to a single cart.


Mechanical arm 104 may be fixedly coupled to system cart 101, which contains a variety of support systems, including control electronics, power sources, and optical sources in some embodiments. The arm 104 may be formed from a plurality of linkages 107 and joints 108 to enable access to the patient. The system cart 101 may contain a source of power 109, pneumatic pressure 110, and control and sensor electronics 111, including components such as a central processing unit, data bus, control circuitry, and memory, as well as related actuators or motors that may drive arms such as arm 104. Power may be conveyed from the system cart 101 to the arm 104 using a variety of means known to one skilled in the art, such as electrical wiring, gear heads, or air chambers. The electronics 111 in system cart 101 may also process and transmit control signals communicated from a command console. System carts 102 and 103 and arms 105 and 106 may be configured similarly to system cart 101 and arm 104.


The system cart 101 may also be mobile. In some embodiments, the system cart may be capable of being wheeled to the desired location near the patient. System carts 101, 102, and 103 may be located in various locations in the operating room in order to accommodate space needs and facilitate appropriate placement and motion of modules and instruments with respect to a patient. This capability enables the arms to be positioned in locations where they do not interfere with the patient, doctor, anesthesiologist, or any supportive surgical equipment required for the selected procedure. During procedures, the arms with instruments work collaboratively via user control through separate control devices, which may include a command console with haptic devices, a joystick, or customized pendants.


If mechanical arm 104 is robotic, joints 108 may comprise one or more actuators in order to effect movement in at least one degree of freedom. The arm 104, as a whole, preferably has more than three degrees of freedom. Through a combination of wires and circuits, each arm may also convey both power and control signals from system cart 101 to the instruments located at its distal end.


In some embodiments, the arms may be fixedly coupled to the operating table with the patient. In some embodiments, the arms may be coupled to the base of the operating table and reach around to access the patient.


In some embodiments, the arms may be capable of external “force drag” in order to achieve precise positioning. Force drag, a type of cooperative force control, allows an operator to manually adjust and move an arm to a desired position and location. In the right situation, cooperative force control provides extremely precise, tremor-free motion while preserving much of the transparency and naturalness of conventional manual instrument manipulation.



FIG. 2 illustrates force drag through the use of a force-torque sensor in a flange joint, in accordance with an embodiment of the present invention. As shown in FIG. 2, system 200 may include an arm 201 with a force-torque sensor located at the flange joint 202 between the arm 201 and end effector 203. In robotics parlance, a flange joint is external to the robotic system, e.g., the arms in this particular embodiment. The force-torque sensor in joint 202 may detect and measure external operator interaction force 204 in an effort to move an attached OCT scanner and camera 205 over an operative site 206. A six-axis force-torque sensor may measure force along three axes (x, y, and z) and torque about the same three axes. In response, the system 200 may make precise, tremor-free motions to comply with the operator's commands.
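
For illustration, the sketch below shows how such a sensor reading might drive compliant motion. This is a minimal admittance-control sketch, not the disclosed implementation; `read_wrench` and `command_cartesian_velocity` are hypothetical placeholders for the robot's sensor and motion interfaces, and the gains are assumed values.

```python
# Minimal "force drag" admittance sketch (assumed interfaces and gains).
import numpy as np

ADMITTANCE_GAIN = np.diag([0.002] * 3 + [0.01] * 3)  # m/s per N, rad/s per N*m (assumed)
DEADBAND = 2.0  # N; ignore readings below this to reject sensor noise (assumed)

def force_drag_step(read_wrench, command_cartesian_velocity):
    """One control cycle: map the operator's wrench at the flange to motion."""
    wrench = read_wrench()  # 6-vector [Fx, Fy, Fz, Tx, Ty, Tz] from the flange sensor
    if np.linalg.norm(wrench[:3]) < DEADBAND:
        command_cartesian_velocity(np.zeros(6))  # hold position; no drift from noise
        return
    # Commanding velocity proportional to the applied wrench lets the arm
    # follow the operator's hand smoothly and precisely.
    command_cartesian_velocity(ADMITTANCE_GAIN @ wrench)
```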


In some embodiments, the force drag capability may be implemented without a complicated force-torque sensor. FIG. 3 illustrates force drag in a system that does not use force sensors, in accordance with an embodiment of the invention. In system 300, a robotic arm 301 with 5 joints (302, 303, 304, 305, and 306) may be constructed with 5 integrated torque sensors, one torque sensor per joint. While each sensor can only detect one degree of torque at its joint, sampling all the sensors in response to an external force 307 allows system 300 to assemble a vector of the torques felt across the entire arm 301. Applying the Jacobian of the arm 301 to this vector allows system 300 to estimate the external force 307. Using that information, system 300 may respond like system 200 from FIG. 2 and make precise, tremor-free motions that comply with the operator's force.
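
A minimal sketch of this estimation, assuming the joint torque readings and the arm's geometric Jacobian at the current configuration are available (the statics relation tau = J^T F underlies it; function names are illustrative):

```python
# Estimate the external wrench from per-joint torque sensors (cf. FIG. 3).
import numpy as np

def estimate_external_wrench(jacobian: np.ndarray, joint_torques: np.ndarray) -> np.ndarray:
    """Least-squares estimate of the wrench [Fx, Fy, Fz, Tx, Ty, Tz] applied to
    the arm, from one torque reading per joint, using tau = J^T @ F."""
    # For a 5-joint arm, J is 6x5, so J^T is not square: pinv returns the
    # minimum-norm wrench consistent with the five torque readings.
    return np.linalg.pinv(jacobian.T) @ joint_torques
```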


In one embodiment, the system provides six degrees of freedom (DOF) for a given end effector. Three DOFs may be accounted for along the Cartesian x-, y-, and z-axes by moving the arm 104 from FIG. 1 independently of the end effector. The remaining three DOFs (pitch, yaw, and roll) of the end effector may be achieved by manipulation through a pivot point. The high number of DOFs allows the end effector to be carefully positioned in close proximity to the operative site, such as the cornea of a patient.


In one embodiment, the end effector includes an instrument drive mechanism that has additional DOF control in the form of both mechanical insertion/retraction and roll, and pneumatic insertion/retraction. An instrument drive mechanism with DOF control allows for instrument motion to be localized to the distal end of the system, and reduces the amount of motion required of the arm.


B. Global Coordinate System Using Optical Coherence Tomography.


In conjunction with a robotic surgical platform, the present invention provides for a global coordinate system to map the three-dimensional space both outside and inside the operative region in the patient. In the context of ophthalmology, the three-dimensional mapping of the operative region, i.e., the eye, may be accomplished through a combination of visual optics and optical coherence tomography (OCT) scanning.



FIG. 4 is a perspective view of an apparatus for a global coordinate system, according to an embodiment of the present invention. System 400 principally comprises a mechanical arm 401, an optical assembly 402, a laser emitter/detector 403, and a processing unit 404. The proximal end of arm 401 may be fixedly mounted or coupled to the system base 405. Mechanical arm 401 comprises a plurality of linkages 406 connected by at least one joint per arm, such as joints 407. If mechanical arm 401 is robotic, joints 407 may comprise one or more actuators in order to effect movement in at least one degree of freedom. In some embodiments, the joints may also comprise a force or torque sensor. Through a combination of wires and circuits, arm 401 may also convey both power and control signals from system base 405 through the flange joint 408 to optical assembly 402 if needed.


As in other embodiments, system base 405 may contain a variety of support systems, including control electronics, power sources, and optical sources in some embodiments. The system base 405 may contain a source of power, pneumatic pressure, and control and sensor electronics, including components such as a central processing unit, data bus, control circuitry, and memory, as well as related actuators or motors that may drive arms such as arm 401. Power may be conveyed from the system base 405 to the arm 401 using a variety of means known to one skilled in the art, such as electrical wiring, gear heads, or air chambers. The electronics in system base 405 may also process and transmit control signals communicated from a remotely-located command console. In some embodiments, the system base 405 may be coupled to or integrated with the laser unit 403 and processing unit 404.


As in other embodiments, the system base 405 may also be mobile. In some embodiments, the system base 405 may be located in various locations in the operating room in order to accommodate space needs and facilitate appropriate placement and motion of modules and instruments with respect to a patient. As in other embodiments, arm 401 may be fixedly coupled to the operating table with the patient. In some embodiments, arm 401 may be coupled to the base of the operating table and reach around to access the patient.


Arm 401 may position optical assembly 402 over an operative site such as eye 409. In some embodiments, arm 401 may be referred to as a “vision arm” due to its use with optical assembly 402. In some embodiments, arm 401 may be configured to enable six degrees of freedom control of optical assembly 402 or stereo-optic video system to provide an ideal clinical perspective for the operator.


Optical assembly 402 may comprise a lens 410, a lens stack 411, and a mirror reflecting unit 412. Optical assembly 402 works in conjunction with laser emitter/detector unit 403 and laser fiber 413. As is understood in the art, an optical coherence tomography (OCT) system operates by emitting laser energy of a predetermined wavelength over an operative region and measuring the scattering of the laser energy that results from interaction with tissue and human anatomy. Scattering often occurs when the index of refraction changes, typically at the interface between different tissues.


Within system 400, the laser energy may be generated by laser unit 403 and conveyed to the optical assembly 402 through the laser fiber 413. In some embodiments, laser unit 403 may be capable of emitting laser light of fixed wavelengths ranging from 860 nm to 1300 nm, including wavelengths of 860 nm, 1030 nm, 1040 nm, and 1300 nm. Using shorter wavelengths allows for higher axial resolution in the OCT image while sacrificing tissue depth penetration. Conversely, using longer wavelengths generally results in lower axial resolution in the OCT image but provides for greater tissue depth penetration.
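
For intuition, the axial resolution of an OCT system is commonly approximated by the coherence length of its source, which grows with the square of the center wavelength. The standard relation is shown below; the bandwidth used in the worked numbers is an assumed, illustrative value, not one specified by this disclosure.

```latex
\delta z \;=\; \frac{2\ln 2}{\pi}\,\frac{\lambda_0^{2}}{\Delta\lambda}
```

With an assumed source bandwidth of Δλ = 50 nm, this gives δz ≈ 6.5 µm at λ₀ = 860 nm but δz ≈ 14.9 µm at λ₀ = 1300 nm, consistent with the resolution-versus-penetration trade-off described above.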


Laser fiber 413 may be sheathed by any conventional means such as wiring, castings, or coating. Having reached the optical assembly 402, the laser energy may be redirected through the lens stack 411 by the reflecting unit 412. Reflecting unit 412 may comprise mirrors and reflectors that are manipulated by a galvanometer to direct laser energy through the lens stack 411 and lens 410 into particular regions of the operative site 409. The galvanometer within reflecting unit 412 may direct the laser energy in a precise pattern to scatter laser energy in a measurable pattern.


Depending on the tissues encountered by the laser energy emerging from lens 410, the laser energy may be absorbed, reflected and refracted, scattering the laser energy. The reflected scattered laser energy may be captured through the lens 410 and lens stack 411, reflected back to the laser fiber 413 by the reflecting unit 412, and detected by laser unit 403. Based on the strength and scattering of the reflected laser energy, system 400 may be able to visualize tissue contours and depth in the operative site in three dimensions.



FIGS. 5A and 5B illustrate a flow diagram that details the steps for initializing system 400 after the patient has been positioned on the operating table. To start process 500, in step 501, the system 400 may first register the arm 401 to the system base 405. Specifically, step 501 involves determining the location of the linkages 406 of the arm 401, based on sensors in the joints 407. In some embodiments, this may involve receiving information from the sensors in the joints 407, such as torque sensors. Based on the readings from sensors mounted in joints 407, and the known sizes of the linkages 406, the system 400 may calculate the arm locations using Jacobian and kinematics calculations. As the joint sensors may continually transmit readings back to the system base 405, the arm location may be continually updated and verified to ensure that the location is accurate in real-time use.


In step 502, the system 400 registers the location of the tool tip, i.e., the location of lens 410 at the tip of the optical assembly 402. This location may be derived from the location of the tip of the robot arm (flange joint 408) and the size of the optical assembly 402. Thus, in combination with arm registration performed in step 501, the system 400 may be able to register the optical assembly to the system base 405, which acts as an anchor in the global coordinate system.
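
A minimal sketch of steps 501-502 follows, assuming a simple serial arm with known link lengths and a fixed optical-assembly offset; all names and dimensions are illustrative, and the joint model is simplified to planar revolute joints.

```python
# Register the optical-assembly tip (lens 410) to the base frame (steps 501-502).
import numpy as np

def rot_z(theta: float) -> np.ndarray:
    """Homogeneous rotation about the z-axis by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0, 0], [s, c, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]], dtype=float)

def trans(x: float, y: float, z: float) -> np.ndarray:
    """Homogeneous translation."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

def flange_pose(joint_angles, link_lengths) -> np.ndarray:
    """Step 501: base-to-flange transform computed from joint sensor readings
    and known link sizes (forward kinematics)."""
    T = np.eye(4)
    for theta, length in zip(joint_angles, link_lengths):
        T = T @ rot_z(theta) @ trans(length, 0, 0)
    return T

def register_tool_tip(joint_angles, link_lengths, assembly_length=0.25) -> np.ndarray:
    """Step 502: lens location = flange pose composed with the assembly's size."""
    T_tip = flange_pose(joint_angles, link_lengths) @ trans(0, 0, assembly_length)
    return T_tip[:3, 3]  # x, y, z of lens 410 in the global (base) frame
```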


In step 503, the operator may manually move the arm 401 and optical assembly 402 to a desired position over the patient's operative site, i.e., the eye. The desired position generally provides a clear line of sight between the optical assembly and the operative site in order for the OCT system to properly scan the site. In some embodiments, the optical assembly is positioned overhead, above the patient, in order to access the operative site. The optical assembly 402 includes means for aiming and positioning the optical assembly over the site. In some embodiments, this may involve using a camera that makes use of the same lens stack as the OCT system.



FIG. 6 illustrates an optical apparatus that uses shared optics for both a visual light microscope and an OCT system, in accordance with an embodiment of the present invention. As shown in system 600, which may be attached at the flange of a robotic arm, an operator may view the operative site through a microscope 601 with a stereoscopic foldable tube, such as the Carl Zeiss F170 Foldable Tube, which allows for magnification, flexibility, and operator comfort. The microscope may be integrated with a reflecting prism 602 that redirects the OCT IR signal 603 from a laterally mounted OCT scanner/receiver (not shown), which may be manipulated to redirect the OCT signal 603 off the prism 602 into different portions of the eye. In this embodiment, the visual light paths 604 and 605 are spaced to avoid interference with the OCT signal 603 when they pass through the shared lens stack 606. In this way, the visual light microscopy paths may be unaffected by the OCT signal paths, allowing the optical assembly to be integrated for a smaller footprint in the operating room. Using the integrated optical assembly, the operator may manually position and aim the optical assembly over the eye.



FIG. 7 illustrates an optical apparatus 700 that uses shared optics for both a visual light microscope and an OCT system, in accordance with an embodiment of the present invention. The optical apparatus 700 broadly comprises a visual light microscope 701, a combining prism assembly 702, a series of infrared mirrors 703, a series of infrared achromatic lenses 704, and an OCT emitter and detector unit 705. In the optical apparatus 700, the OCT unit 705 both emits and detects OCT signals that are subsequently focused and directed through a combination of the infrared achromatic lenses 704 and infrared mirrors 703 to the optical assembly 706, which houses the microscope 701, the combining prism assembly 702, and the output lens 707. Manipulation of the achromatic lenses 704 and infrared mirrors 703 allows the optical apparatus 700 to direct the OCT signal to different portions of the combining prism assembly 702 when attempting to scan different portions of the operative region. For example, infrared mirrors 703 may be rotated to direct the OCT signal off a different portion of combining prism assembly 702. The resulting reflected signal would be directed at a different part of the eye.


Within the optical assembly 706, the visual light optics in microscope 701 are aligned through the objective lens mount 706 into the combining prism assembly 702. The combining prism assembly 702 comprises two separate pieces, separated by a glued dielectric coating at the interface 708. The coating provides a highly reflective surface to infrared light, ensuring that the OCT signal is directed towards the output lens 707, which is situated above the patient's eye (not shown). Conversely, the coating also allows visual light to pass through the combining prism undisturbed, rendering the combining prism highly transmissible to visible light. Thus, using system 700, an operator may visually view the operative site through the microscope 701 to position the optical assembly 706 while also directing the OCT signals onto the operative site using the same assembly.


In some embodiments, one or more downward facing lasers positioned on the exterior of the optical assembly may be used to “aim” the optical assembly over the eye.


In step 504, the system 400 may focus and align the operative region on the eye. In some embodiments, alignment and focus may be performed manually, using the means of aiming and positioning discussed with respect to step 503. In some embodiments, this may be performed robotically. For example, in some embodiments, the system 400 may use an infrared sensor and aiming lasers to align the optical assembly 402. In other embodiments, system 400 may use a visual light camera sensor to autofocus on the operative region.


In step 505, the system 400 may then calculate the location of the operative region (e.g., eye cornea, sclera) using the global coordinate system and the focal length of the optical assembly 402. The system 400 may extrapolate the location of the operative region based on the location of the arm 401, the tip of the optical assembly 402, and the straight-line focal length of the OCT system. Interpolating the location data, the global coordinate system may register three-dimensional coordinates to each of the main components, e.g., arm 401 and optical assembly 402, as well as the operative site 409.
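
Continuing the sketch above, step 505 reduces to pushing one more point through the registered transform: the in-focus operative region lies one focal length along the lens's optical axis. The axis choice and names below are illustrative assumptions.

```python
# Locate the operative region in the global frame from the focal length (step 505).
import numpy as np

def operative_region_position(T_base_lens: np.ndarray, focal_length: float) -> np.ndarray:
    """T_base_lens: 4x4 pose of lens 410 from the registration in steps 501-502.
    Returns the x, y, z of the in-focus region, one focal length along the
    lens's local +z (optical) axis, expressed in the global coordinate system."""
    return (T_base_lens @ np.array([0.0, 0.0, focal_length, 1.0]))[:3]
```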


In step 506 of FIG. 5B, the OCT camera may scan the operative region by maneuvering the galvanometer-linked reflector. As discussed with respect to the embodiments in FIGS. 4, 6, and 7, as the galvanometer in reflecting unit 412 directs laser light across the operative region, scattered and reflected laser light will be simultaneously reflected back through the lens stack to the reflecting unit 412. In turn, the reflector will direct the scattered and reflected laser light towards the laser detector 403, often conveyed through an optical fiber 413. Because the OCT data is retrieved by scattering from focused laser light directed at a specific portion of the operative region, the returned OCT data may be represented as image slices or segments of the eye, each showing a specific cross-section. The contours in the image are generated by transitions in the indices of refraction caused by different anatomical features, thus presenting an anatomical cross-section.


In step 507, the processing unit 404 may construct a three-dimensional model of the operative region based on the OCT data received by the optical assembly. FIG. 8 illustrates a flow diagram that shows how the system may generate a three-dimensional model of the optical region based on OCT data. Having generated OCT-based cross-sections, in step 801 of process 800, the image slices and segments may be collected for analysis by processing unit 404. FIG. 9 illustrates a series of OCT data slices from an eye that may be collected by the system, in accordance with an embodiment of the present invention. An OCT scan typically generates a series of cross-sectional scans, which may be visualized as slices 900, 901, 902, and 903 in FIG. 9.


Having collected image segments/slices of the operative region, i.e., the eye, in step 802 of process 800, the system may then analyze the images for anatomical contours and boundaries. For example, in image 900 from FIG. 9, the processing unit 404 may resolve line 904 to represent the sclera of the patient's eye and resolve lines 905 as the outline of the lens capsule. Each contour may then be mapped in x- and y-coordinates of Cartesian space. The system may repeat this identification process for all of the OCT image segments.


Having identified recognizable anatomical contours and landmarks, in step 803 of process 800, the processing unit 404 may combine the mappings of the anatomical contours and landmarks in x- and y-space from the separate image segments to generate a three-dimensional model of the anatomical contours in three-dimensional space. Coordinates in the z-space may be derived using the angle of the galvanometer in reflecting unit 412 at the time each OCT image was captured, since that angle determines which portion of the eye was scanned. Through repeated iterations of analyzing the x- and y-coordinates generated in step 802 together with the angle of the galvanometer, extrapolations into the z-coordinate space may be derived, allowing the system to generate a three-dimensional model in the x-, y-, and z-coordinate planes.
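
A simplified sketch of step 803 is shown below, treating each OCT slice as a plane rotated about the scan axis by its recorded galvanometer angle. The geometry is deliberately idealized; a real system would use the scanner's calibrated scan pattern.

```python
# Lift per-slice (x, y) contour points into 3-D using the galvanometer angle (step 803).
import numpy as np

def slice_to_model(contour_xy: np.ndarray, galvo_angle: float) -> np.ndarray:
    """contour_xy: (N, 2) array of in-slice points (lateral position, depth).
    Returns (N, 3) points, spreading the slice into z by its rotation angle."""
    lateral, depth = contour_xy[:, 0], contour_xy[:, 1]
    x = lateral * np.cos(galvo_angle)
    z = lateral * np.sin(galvo_angle)
    return np.column_stack([x, depth, z])

def build_eye_model(slices) -> np.ndarray:
    """Stack every (contour_xy, galvo_angle) pair into one 3-D point cloud."""
    return np.vstack([slice_to_model(xy, angle) for xy, angle in slices])
```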


Returning to FIG. 5B, having generated a three-dimensional model of the eye, in step 508 of process 500, the system may register the model to the global coordinate system that includes the registration of the mechanical arm, optical assembly and arm base. Given the system's knowledge of the kinematics of the arm, the general location of the operative region, and the focal length of the lens stack in the optical assembly, the system may precisely register the model, and thus the interior of the eye, to the larger, global coordinate system.


In some embodiments, the three-dimensional model may be overlaid on the view of the eye through the visual light microscope to assist with aligning the optical assembly over the eye. In other embodiments, the view of the eye through the visual light microscope may include reticles and aiming targets to assist the user with identifying anatomical landmarks for alignment.


Having registered the eye to the global coordinate system, in step 509 of process 500, the system may establish surgical boundaries to restrict maneuvers by a robotic tool within the confined anatomical space of the eye. In practice, having generated the ocular surface data, which is registered to the global OCT-established world coordinate system, and further having the tool tip position in the same coordinate system, an artificial safety barrier may be robotically enforced to prevent the robot from breaching a specified minimum distance between the tool and the surface. This barrier may be used to prevent potential unwanted anatomical collisions. In other words, the system may prevent the tool tip from entering an established “keep out” zone. In other embodiments, an audible warning may be sounded to alert the operator (physician) if the instrument tip approaches within a certain distance of the retinal tissue.
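
A minimal sketch of enforcing such a keep-out zone follows, assuming the registered eye surface is available as a point cloud in the global frame; the clearance values and the alert callback are illustrative assumptions.

```python
# Enforce a "keep out" boundary around registered anatomy (step 509).
import numpy as np

MIN_CLEARANCE = 0.5e-3   # 0.5 mm minimum tool-to-surface distance (assumed)
WARN_CLEARANCE = 1.0e-3  # distance at which an audible warning sounds (assumed)

def motion_permitted(tool_tip: np.ndarray, surface_points: np.ndarray, sound_alert) -> bool:
    """Check a commanded tip position against the virtual boundary.
    tool_tip: (3,) position in the global frame; surface_points: (M, 3) model."""
    clearance = np.min(np.linalg.norm(surface_points - tool_tip, axis=1))
    if clearance < MIN_CLEARANCE:
        return False      # reject the motion: the tip would breach the boundary
    if clearance < WARN_CLEARANCE:
        sound_alert()     # audible warning as the tip nears retinal tissue
    return True
```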


Various challenges exist under this boundary establishment modality, namely: shadowing caused by tools, regions of the anatomy obstructed from view, and movement of the anatomical target. Tool shadows occur when OCT scanning signals cannot penetrate surfaces that are opaque to their wavelength. Analyzing tool-based shadows may be used to identify the location of the tool relative to the operative site. To compensate, in some embodiments, the system may interpolate the topology of the anatomy based on the topology of the surfaces near the obscured region. In similar fashion, the same technique can be used to overcome areas obscured from the global OCT system's view, such as the equatorial regions of the capsule.


To facilitate the use of surgical boundaries, certain embodiments make use of surgical tools that may be registered to the global coordinate system using OCT. FIG. 10 illustrates the use of a surgical tool with reflective strips that are configured to be registered in an OCT system, in accordance with an embodiment of the present invention. As shown in diagram 1000, an elongated surgical tool 1002 with reflective markers 1003 and 1004 located along its length may be operated within a patient's eye 1001 through an incision 1005 in the patient's sclera 1006. Real-time determination of tool position using the global OCT system may be achieved through monitoring OCT signal scattering off reflective markers 1003 and 1004 on tool 1002. Reflective markers may be formed using retro-reflective prisms (similar to those found on street signs), reflective coatings, or selective polishing.


Retro-reflective markers direct light back towards the OCT detector at the inverse of the incidence vector. This phenomenon results in a strong OCT signal spike, known as an “autocorrelation” signal. Thus, in embodiments where the marker is located at the distal end of the tool 1002, a spike in the OCT signal may be used to identify the tip of the tool 1002.


To prevent the autocorrelation signal from saturating the OCT detector, the tool may use a series of retro-reflective strips spaced at various distances from one another (similar to a bar code), with the configuration specific to the instrument type, so that tip location can be determined through kinematic calculations. Thus, using the known shape of the tool, such as the distance between the strips (1007) and the length of the striped area (1008), the system may kinematically determine the position and orientation of the tool 1002 within the eye 1001. Moreover, the measured autocorrelation signals may be compared to the expected locations of the autocorrelation signals, based on the kinematics of tool 1002, to further determine the exact position and vector of the tool 1002.
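
The sketch below illustrates the bar-code idea, matching measured autocorrelation-spike positions along the tool axis against known strip layouts; the strip patterns and offsets are hypothetical values for illustration.

```python
# Identify a tool and locate its tip from retro-reflective strip spacings (FIG. 10).
import numpy as np

# Hypothetical catalog: instrument type -> strip offsets from the tip, in meters.
TOOL_PATTERNS = {
    "pick":    np.array([0.004, 0.009, 0.017]),
    "chopper": np.array([0.005, 0.012, 0.016]),
}

def identify_tool(marker_positions: np.ndarray):
    """marker_positions: axial coordinates of detected strips along the tool,
    with the tip at the low-coordinate end. Returns (tool_type, tip_position)."""
    spacings = np.diff(np.sort(marker_positions))
    best_name, best_err = None, np.inf
    for name, offsets in TOOL_PATTERNS.items():  # assumes equal strip counts
        err = np.sum((spacings - np.diff(offsets)) ** 2)
        if err < best_err:
            best_name, best_err = name, err
    # The most distal strip sits a known offset from the tip of the matched tool.
    tip_position = np.min(marker_positions) - TOOL_PATTERNS[best_name][0]
    return best_name, tip_position
```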


C. Tool-Based OCT Scanning.


In addition to a global coordinate system, the present invention provides for the use of tool-based OCT scanning to measure tool position relative to anatomical structures, label imaged structures, and monitor for acute changes in anatomy.


In some embodiments, a surgical tool may comprise an embedded optical fiber to emit an OCT scanning signal from the tool tip and an OCT detector to detect the resulting scattering from that radiation. In doing so, the tool may be able to scan and detect structures directly in front of the tool tip. Using the tool registration techniques disclosed earlier, the information received by the OCT detector in the tool may also be incorporated into the global coordinate system.


Returning to tool 1002 in FIG. 10, tool 1002 may be configured with an OCT fiber and coupled to an OCT scanner and detector (not shown). Tool 1002 may scan tissues in eye 1001 by emitting an OCT signal directed to a portion of the lens capsule. Within the eye 1001, the OCT signal may penetrate and scatter IR radiation as it passes through the lens capsule anterior wall 1009, the lens capsule posterior wall 1010, and intersect with the sclera 1011. In some embodiments, the tool-based OCT system may be able to generate A-mode, B-mode, and C-mode OCT scans for analysis.



FIG. 11A illustrates a spectrograph showing the expected results of using the tool-based OCT system in tool 1002 from FIG. 10. Spectrograph 1100 represents the spectral intensity of the OCT signal received at the detector. As shown in spectrograph 1100, the signal 1101 scatters at three independent locations where the index of refraction changes, and each scattering event is detected by the OCT signal detector as a signal “peak” or “spike.” As would be recognized by one skilled in the art, the spikes represent signal scattering as the signal passes through different anatomical boundaries. For example, spike 1102 represents the crossing of the lens capsule anterior wall. Spike 1103 represents the crossing of the lens capsule posterior wall into the vitreous. Spike 1104 represents the signal intersection with the solid sclera tissue wall. Thus, the system can anticipate what surfaces it should see given the instrument position and correlate the signal spikes with the anatomy expected to be viewed.


The resulting spectral data, including the spectral spikes, may be analyzed to determine tissue topology and the relative depth of corresponding anatomical features. FIG. 11B illustrates how the spectral spikes may be analyzed to determine the relative depth of the anatomical features. As shown on the distance chart 1106, spikes 1102, 1103, and 1104 occur at specific distances from the detector 1105 corresponding to distances marked as 1107, 1108, and 1109. These distances may be measured independently between the spikes as shown by row 1110, or as a measure of cumulative distance from the detector as shown by row 1111.
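
A minimal sketch of this depth analysis follows, using a standard peak finder on the A-scan intensity profile; the axial calibration constant is an assumed placeholder.

```python
# Convert OCT spectral spikes to depths (cf. rows 1110 and 1111 in FIG. 11B).
import numpy as np
from scipy.signal import find_peaks

METERS_PER_SAMPLE = 5e-6  # assumed axial calibration of the A-scan (m per sample)

def spike_depths(a_scan: np.ndarray):
    """Returns (cumulative, independent) distances for each detected spike:
    cumulative = detector-to-boundary distances (row 1111);
    independent = boundary-to-boundary distances (row 1110)."""
    peaks, _ = find_peaks(a_scan, prominence=np.max(a_scan) * 0.2)
    cumulative = peaks * METERS_PER_SAMPLE
    independent = np.diff(cumulative, prepend=0.0)
    return cumulative, independent
```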


Utilizing the methods for determining tool registration described above, and extrapolating a vector position for the OCT signal, the OCT data may be incorporated into the global coordinate system. In doing so, the tool-based OCT data may be used to provide a more detailed three-dimensional anatomical model than using OCT data from the optical assembly alone. With the tool-based OCT data, the anatomical three-dimensional model may be labeled with names and thicknesses.


In certain embodiments, the tool-based OCT may monitor tool tip proximity to tissue at a rate of 50 Hz, which allows for monitoring acute changes in tissue position relative to robotic manipulation. Thus, if an anatomical change occurs that puts the tool at risk of colliding with either the safety boundary or actual anatomy, the system may withdraw the tool to a safe position.



FIGS. 12A and 12B illustrate the use of a global OCT-based coordinate system and related surgical tool for peeling the internal limiting membrane (ILM) during macular surgery, in accordance with an embodiment of the claimed invention. A common issue encountered during retinal surgery is the detection of an edge of the ILM for the surgeon to grasp in order to remove the ILM. The precision of real-time tool-based OCT images may ease that process. Similarly, macular holes and other retinal artifacts may easily be visualized and identified with this enhanced visualization capability.



FIG. 12A illustrates use of surgical tools in a cutaway view of a patient's eye, in accordance with an embodiment of the present invention. As shown in FIG. 12A, surgical tool 1201 may be inserted into the eye 1200 through insertion points 1202 for access to a location on the retina, i.e., the operating area 1203. In addition, in some embodiments, a second surgical tool 1204 may be inserted into the eye 1200. While the first surgical tool 1201 may include a surgical pick at its distal end, the second surgical tool 1204 may utilize imaging means or illumination means at its distal end in order to enhance the vision of the first surgical tool 1201. Consistent with earlier embodiments, tools 1201 and 1204 may be registered to a global coordinate system using the devices and methods discussed earlier.



FIG. 12B illustrates a schematic drawing with a close up view of surgical tool 1201 at the operating area 1203 from FIG. 12A. Tool 1201 may comprise an OCT fiber 1205 receiving an OCT signal from an OCT system 1206, which may comprise both an OCT emitter and OCT detector. The OCT fiber 1205 may be used to scan the tissue topology with infrared OCT radiation (represented by the area enclosed by dotted lines 1207). Using the results from the OCT scans, an operator (physician) may determine the distance of the tool 1201 from the surface (1208). With the help of the resulting OCT-based tissue topology, the OCT scans may also help identify a candidate edge of the ILM for grasping with a surgical pick 1209, often a very difficult process in particular ophthalmological surgeries.


Elements or components shown with any embodiment herein are exemplary for the specific embodiment and may be used on or in combination with other embodiments disclosed herein. While the invention is susceptible to various modifications and alternative forms, specific examples thereof have been shown in the drawings and are herein described in detail. The invention is not limited, however, to the particular forms or methods disclosed, but to the contrary, covers all modifications, equivalents and alternatives thereof.

Claims
  • 1. A surgical robotic system, comprising: a first robotic arm configured to control movement of an optical imaging system coupled to the first robotic arm, wherein the optical imaging system includes a prism and a network of optical elements, the prism and the network of optical elements configured to simultaneously provide: first light from an optical emitter to a surgical site, and reflected light from the surgical site to a microscope by way of the prism; a second robotic arm configured to control movement of a surgical tool coupled to the second robotic arm; and a processor configured to: receive imaging data from the optical imaging system; construct a three-dimensional model that includes an anatomy of a patient based on the imaging data; derive a location of the optical imaging system based at least partially from a size of the optical imaging system; register the location of the optical imaging system and positions of the first robotic arm and the second robotic arm on a common coordinate system; establish a surgical boundary based on the common coordinate system, the surgical boundary having a specified distance to a surface defined by the three-dimensional model to restrict movement of the surgical tool; and control movement of the second robotic arm while enforcing the surgical boundary.
  • 2. The system of claim 1, wherein: the processor is configured to interpolate at least a portion of a surface of the anatomy that is obscured by an object within a view of the optical imaging system, and the constructing of the three-dimensional model that includes the anatomy is further based on the interpolated surface of the anatomy.
  • 3. The system of claim 2, wherein the processor is configured to: control movement of the second robotic arm such that the surgical tool does not cross the surgical boundary.
  • 4. The system of claim 3, wherein the processor is configured to: determine a position of the surgical tool based at least in part on a signal received from the optical imaging system.
  • 5. The system of claim 1, wherein the second robotic arm comprises: at least two links; at least one joint connecting the at least two links; and a sensor located in the at least one joint and configured to detect external forces applied to the second robotic arm, wherein the processor is configured to: receive, from the sensor, a signal indicative of the detected external forces applied to the second robotic arm, and control movement of the second robotic arm based on the signal indicative of the detected external forces.
  • 6. The system of claim 1, wherein the system is configured to: detect a change in a position of the anatomy causing the surgical tool to be within a predetermined distance of the surgical boundary in the common coordinate system, and control the second robotic arm to move the surgical tool away from the surgical boundary.
  • 7. The system of claim 1, wherein the registering of the positions of the first robotic arm and the second robotic arm, and the location of the optical imaging system on the common coordinate system includes: determining locations of a plurality of first links of the first robotic arm in the common coordinate system from a plurality of first sensors coupled to the first robotic arm; determining locations of a plurality of second links of the second robotic arm in the common coordinate system from a plurality of second sensors coupled to the second robotic arm; and registering the locations of the plurality of first links and the locations of the plurality of second links to a base of the system.
  • 8. The system of claim 1, wherein: the surgical tool includes a reflective marker; and the processor is configured to: receive from the optical imaging system a signal scattering off the reflective marker; and
  • 9. The system of claim 1, wherein the prism includes a dielectric coating configured to reflect infrared light and to allow visible light to pass therethrough.
  • 10. The system of claim 1, wherein the surgical site is an eye, and the optical emitter is an optical coherence tomography (OCT) emitter.
  • 11. A method for performing a robotic medical procedure, the method comprising: training an optical imaging system on an anatomical region of a patient using visible light passing through a prism of the optical imaging system; scanning the anatomical region by simultaneously conveying: infrared light from an optical emitter to the anatomical region by way of the prism and a network of optical elements; and reflected light from the anatomical region to a microscope by way of the prism; generating a three-dimensional model of the anatomical region using a signal received from the optical imaging system based on the scanned anatomical region, the optical imaging system configured to be controlled by a first robotic arm; deriving a location of the optical imaging system based at least partially from a size of the optical imaging system; registering the location of the optical imaging system and a position of the first robotic arm on a common coordinate system; establishing a surgical boundary based on the common coordinate system, the surgical boundary having a specified distance to a surface defined by the three-dimensional model to restrict movement of a second robotic arm configured to control movement of a surgical tool; and
  • 12. The method of claim 11, further comprising: interpolating at least a portion of a surface of the anatomical region that is obscured by an object within a view of the optical imaging system, wherein the generating of the three-dimensional model of the anatomical region is further based on the interpolated surface of the anatomical region.
  • 13. The method of claim 12, further comprising: controlling movement of the second robotic arm such that the surgical tool does not cross the surgical boundary.
  • 14. The method of claim 13, further comprising: determining a position of the surgical tool based at least in part on the signal received from the optical imaging system.
  • 15. The method of claim 11, wherein: the second robotic arm comprises: at least two links; at least one joint connecting the at least two links; and a sensor located in the at least one joint and configured to detect external forces applied to the second robotic arm, and the method further comprises: receiving, from the sensor, a signal indicative of the detected external forces applied to the second robotic arm, and controlling movement of the second robotic arm based on the signal indicative of the detected external forces.
  • 16. The method of claim 15, wherein the sensor comprises a force-torque sensor or a torque sensor.
  • 17. The method of claim 11, further comprising: detecting a change in a position of the anatomical region causing the surgical tool to be within a predetermined distance of the surgical boundary, and controlling the second robotic arm to move the surgical tool away from the surgical boundary.
  • 18. The method of claim 11, wherein the anatomical region is an eye, and the optical emitter is an optical coherence tomography (OCT) emitter.
  • 19. A surgical robotic system, comprising: a first robotic arm coupled to a base and configured to control movement of a first tool; a second robotic arm coupled to the base and configured to control movement of a second tool; an optical imaging system including a prism and a network of optical elements aligned with an optical emitter and a microscope, the prism and the network of optical elements configured to simultaneously provide: first light from the optical emitter to a surgical site; and reflected light from the surgical site to the microscope by way of the prism; and a processor configured to: receive imaging data; derive a location of the first tool based at least partially from a size of the first tool; register the location of the first tool and positions of the first robotic arm, the second robotic arm, and the second tool on a common coordinate system; establish a surgical boundary based on the common coordinate system, the surgical boundary having a specified distance to a surface of an anatomy of a patient to restrict movement of the first robotic arm or the second robotic arm; and control movement of the first robotic arm or the second robotic arm while enforcing the surgical boundary.
  • 20. The surgical robotic system of claim 19, wherein the processor is configured to: control movement of the first robotic arm or the second robotic arm such that the first tool or the second tool does not cross the surgical boundary.
  • 21. The surgical robotic system of claim 19, wherein the second robotic arm comprises: at least two links; at least one joint connecting the at least two links; and a sensor located in the at least one joint and configured to detect external forces applied to the second robotic arm, wherein the processor is configured to: receive, from the sensor, a signal indicative of the detected external forces applied to the second robotic arm, and control movement of the second robotic arm based on the signal indicative of the detected external forces.
  • 22. The surgical robotic system of claim 21, wherein the sensor comprises at least one of the following: a force-torque sensor and a torque sensor.
  • 23. The surgical robotic system of claim 19, wherein the surgical site is a target eye, and the optical emitter is an optical coherence tomography (OCT) emitter.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. Pat. No. 10,383,754, issued on Aug. 20, 2019, which is a continuation-in-part of U.S. Publication No. 2014/0142591, published on May 22, 2014 and now abandoned, which claims priority from U.S. Provisional Application No. 61/637,426, filed Apr. 24, 2012, the entire contents of which are incorporated by reference.

Related Publications (1)
Number Date Country
20190374383 A1 Dec 2019 US
Provisional Applications (1)
Number Date Country
61637426 Apr 2012 US
Continuations (1)
Number Date Country
Parent 14578082 Dec 2014 US
Child 16529155 US
Continuation in Parts (1)
Number Date Country
Parent 13868769 Apr 2013 US
Child 14578082 US