The systems and methods disclosed herein are directed to branch prediction in a luminal network, and more particularly to techniques for predicting which branch an instrument will be advanced into based on location sensor data.
Medical procedures such as endoscopy (e.g., bronchoscopy) may involve the insertion of a medical tool into a patient's luminal network (e.g., airways) for diagnostic and/or therapeutic purposes. Surgical robotic systems may be used to control the insertion and/or manipulation of the medical tool during a medical procedure. The surgical robotic system may comprise at least one robotic arm including a manipulator assembly which may be used to control the positioning of the medical tool prior to and during the medical procedure. The surgical robotic system may further comprise location sensor(s) configured to generate location data indicative of a position of the distal end of the medical tool.
The surgical robotic system may further comprise one or more displays for providing an indication of the location of the distal end of the instrument to a user and thereby aid the user in navigating the instrument through the patient's luminal network. The system may be configured to perform various techniques in support of the navigation of the instrument, including predicting into which branch of the luminal network the instrument is most likely to be advanced from a current branch.
The systems, methods and devices of this disclosure each have several innovative aspects, no single one of which is solely responsible for the desirable attributes disclosed herein.
In one aspect, there is provided a system, comprising a processor and at least one computer-readable memory in communication with the processor and having stored thereon a model of a luminal network of a patient, the memory further having stored thereon computer-executable instructions to cause the processor to: determine a first orientation of an instrument based on first location data generated by a set of one or more location sensors for the instrument, the first location data being indicative of the location of the instrument in a location sensor coordinate system at a first time; determine a second orientation of the instrument at a second time based on second location data generated by the set of location sensors, a distal end of the instrument being located within a first segment of the model at the first time and the second time and the first segment branching into two or more child segments; determine data indicative of a difference between the first orientation and the second orientation; and determine a prediction that the instrument will advance into a first one of the child segments based on the data indicative of the difference.
In another aspect, there is provided a non-transitory computer readable storage medium having stored thereon instructions that, when executed, cause at least one computing device to: determine a first orientation of an instrument based on first location data generated by a set of one or more location sensors for the instrument, the first location data being indicative of the location of the instrument in a location sensor coordinate system at a first time; determine a second orientation of the instrument at a second time based on second location data generated by the set of location sensors, a distal end of the instrument being located within a first segment of a model at the first time and the second time and the first segment branching into two or more child segments, the model being stored in a memory and modelling a luminal network of a patient; determine data indicative of a difference between the first orientation and the second orientation; and determine a prediction that the instrument will advance into a first one of the child segments based on the data indicative of the difference.
In yet another aspect, there is provided a method of predicting movement of an instrument, comprising: determining a first orientation of an instrument based on first location data generated by a set of one or more location sensors for the instrument, the first location data being indicative of the location of the instrument in a location sensor coordinate system at a first time; determining a second orientation of the instrument at a second time based on second location data generated by the set of location sensors, a distal end of the instrument being located within a first segment of a model at the first time and the second time and the first segment branching into two or more child segments, the model being stored in a memory and modelling a luminal network of a patient; determining data indicative of a difference between the first orientation and the second orientation; and determining a prediction that the instrument will advance into a first one of the child segments based on the data indicative of the difference.
In still yet another aspect, there is provided a system, comprising a processor and at least one computer-readable memory in communication with the processor and having stored thereon a model of a luminal network of a patient, the memory further having stored thereon computer-executable instructions to cause the processor to: determine an orientation of an instrument with respect to the model based on location data generated by a set of one or more location sensors for the instrument, a distal end of the instrument being located within a first segment of the model and the first segment branching into two or more child segments; determine an orientation of a first one of the child segments; and determine a prediction that the instrument will advance into the first child segment based on the orientation of the instrument and the orientation of the first child segment.
In another aspect, there is provided a non-transitory computer readable storage medium having stored thereon instructions that, when executed, cause at least one computing device to: determine an orientation of an instrument with respect to a model based on location data generated by a set of one or more location sensors for the instrument, the model being stored in a memory and modelling a luminal network of a patient, a distal end of the instrument being located within a first segment of the model and the first segment branching into two or more child segments; determine an orientation of a first one of the child segments; and determine a prediction that the instrument will advance into the first child segment based on the orientation of the instrument and the orientation of the first child segment.
In yet another aspect, there is provided a method of predicting movement of an instrument, comprising: determining an orientation of an instrument with respect to a model based on location data generated by a set of one or more location sensors for the instrument, the model being stored in a memory and modelling a luminal network of a patient, a distal end of the instrument being located within a first segment of the model and the first segment branching into two or more child segments; determining an orientation of a first one of the child segments; and determining a prediction that the instrument will advance into the first child segment based on the orientation of the instrument and the orientation of the first child segment.
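The two groups of aspects summarized above can be pictured with a short, simplified Python sketch. The sketch is illustrative only and is not the claimed implementation; representing each orientation as a unit direction vector, scoring child segments by dot product, and all function and variable names are assumptions introduced purely for exposition.

import numpy as np

def unit(vector):
    # Normalize a 3D direction vector.
    vector = np.asarray(vector, dtype=float)
    return vector / np.linalg.norm(vector)

def orientation_change(first_orientation, second_orientation):
    # Angle (radians) between the instrument orientations at two times,
    # i.e., one possible "data indicative of a difference" between them.
    cosine = np.clip(np.dot(unit(first_orientation), unit(second_orientation)), -1.0, 1.0)
    return np.arccos(cosine)

def predict_child_segment(instrument_orientation, child_directions):
    # Score each child segment by how well its direction aligns with the
    # instrument's current orientation and return the best-aligned child.
    scores = {child_id: float(np.dot(unit(instrument_orientation), unit(direction)))
              for child_id, direction in child_directions.items()}
    return max(scores, key=scores.get), scores

# Hypothetical example: two child segments branching from the current segment.
children = {"child_A": [1.0, 0.2, 0.0], "child_B": [0.1, 1.0, 0.0]}
print(predict_child_segment([0.9, 0.1, 0.0], children))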
The disclosed aspects will hereinafter be described in conjunction with the appended drawings, provided to illustrate and not to limit the disclosed aspects, wherein like designations denote like elements.
1. Overview
Aspects of the present disclosure may be integrated into a robotically-enabled medical system capable of performing a variety of medical procedures, including both minimally invasive, such as laparoscopy, and non-invasive, such as endoscopy, procedures. Among endoscopy procedures, the system may be capable of performing bronchoscopy, ureteroscopy, gastroscopy, etc.
In addition to performing the breadth of procedures, the system may provide additional benefits, such as enhanced imaging and guidance to assist the physician. Additionally, the system may provide the physician with the ability to perform the procedure from an ergonomic position without the need for awkward arm motions and positions. Still further, the system may provide the physician with the ability to perform the procedure with improved ease of use such that one or more of the instruments of the system can be controlled by a single user.
Various embodiments will be described below in conjunction with the drawings for purposes of illustration. It should be appreciated that many other implementations of the disclosed concepts are possible, and various advantages can be achieved with the disclosed implementations. Headings are included herein for reference and to aid in locating various sections. These headings are not intended to limit the scope of the concepts described with respect thereto. Such concepts may have applicability throughout the entire specification.
A. Robotic System—Cart.
The robotically-enabled medical system may be configured in a variety of ways depending on the particular procedure.
With continued reference to
The endoscope 13 may be directed down the patient's trachea and lungs after insertion using precise commands from the robotic system until reaching the target destination or operative site. In order to enhance navigation through the patient's lung network and/or reach the desired target, the endoscope 13 may be manipulated to telescopically extend the inner leader portion from the outer sheath portion to obtain enhanced articulation and greater bend radius. The use of separate instrument drivers 28 also allows the leader portion and sheath portion to be driven independently of each other.
For example, the endoscope 13 may be directed to deliver a biopsy needle to a target, such as, for example, a lesion or nodule within the lungs of a patient. The needle may be deployed down a working channel that runs the length of the endoscope to obtain a tissue sample to be analyzed by a pathologist. Depending on the pathology results, additional tools may be deployed down the working channel of the endoscope for additional biopsies. If a nodule is identified as malignant, the endoscope 13 may endoscopically deliver tools to resect the potentially cancerous tissue. In some instances, diagnostic and therapeutic treatments may need to be delivered in separate procedures. In those circumstances, the endoscope 13 may also be used to deliver a fiducial to “mark” the location of the target nodule as well. In other instances, diagnostic and therapeutic treatments may be delivered during the same procedure.
The system 10 may also include a movable tower 30, which may be connected via support cables to the cart 11 to provide support for controls, electronics, fluidics, optics, sensors, and/or power to the cart 11. Placing such functionality in the tower 30 allows for a smaller form factor cart 11 that may be more easily adjusted and/or re-positioned by an operating physician and his/her staff. Additionally, the division of functionality between the cart/table and the support tower 30 reduces operating room clutter and facilitates improving clinical workflow. While the cart 11 may be positioned close to the patient, the tower 30 may be stowed in a remote location to stay out of the way during a procedure.
In support of the robotic systems described above, the tower 30 may include component(s) of a computer-based control system that stores computer program instructions, for example, within a non-transitory computer-readable storage medium such as a persistent magnetic storage drive, solid state drive, etc. The execution of those instructions, whether the execution occurs in the tower 30 or the cart 11, may control the entire system or sub-system(s) thereof. For example, when executed by a processor of the computer system, the instructions may cause the components of the robotics system to actuate the relevant carriages and arm mounts, actuate the robotics arms, and control the medical instruments. For example, in response to receiving the control signal, the motors in the joints of the robotics arms may position the arms into a certain posture.
The tower 30 may also include a pump, flow meter, valve control, and/or fluid access in order to provide controlled irrigation and aspiration capabilities to the system that may be deployed through the endoscope 13. These components may also be controlled using the computer system of tower 30. In some embodiments, irrigation and aspiration capabilities may be delivered directly to the endoscope 13 through separate cable(s).
The tower 30 may include a voltage and surge protector designed to provide filtered and protected electrical power to the cart 11, thereby avoiding placement of a power transformer and other auxiliary power components in the cart 11, resulting in a smaller, more moveable cart 11.
The tower 30 may also include support equipment for the sensors deployed throughout the robotic system 10. For example, the tower 30 may include opto-electronics equipment for detecting, receiving, and processing data received from the optical sensors or cameras throughout the robotic system 10. In combination with the control system, such opto-electronics equipment may be used to generate real-time images for display in any number of consoles deployed throughout the system, including in the tower 30. Similarly, the tower 30 may also include an electronic subsystem for receiving and processing signals received from deployed electromagnetic (EM) sensors. The tower 30 may also be used to house and position an EM field generator for detection by EM sensors in or on the medical instrument.
The tower 30 may also include a console 31 in addition to other consoles available in the rest of the system, e.g., a console mounted on top of the cart. The console 31 may include a user interface and a display screen, such as a touchscreen, for the physician operator. Consoles in system 10 are generally designed to provide both robotic controls as well as pre-operative and real-time information of the procedure, such as navigational and localization information of the endoscope 13. When the console 31 is not the only console available to the physician, it may be used by a second operator, such as a nurse, to monitor the health or vitals of the patient and the operation of the system, as well as to provide procedure-specific data, such as navigational and localization information.
The tower 30 may be coupled to the cart 11 and endoscope 13 through one or more cables or connections (not shown). In some embodiments, the support functionality from the tower 30 may be provided through a single cable to the cart 11, simplifying and de-cluttering the operating room. In other embodiments, specific functionality may be coupled in separate cabling and connections. For example, while power may be provided through a single power cable to the cart, the support for controls, optics, fluidics, and/or navigation may be provided through a separate cable.
The carriage interface 19 is connected to the column 14 through slots, such as slot 20, that are positioned on opposite sides of the column 14 to guide the vertical translation of the carriage 17. The slot 20 contains a vertical translation interface to position and hold the carriage at various vertical heights relative to the cart base 15. Vertical translation of the carriage 17 allows the cart 11 to adjust the reach of the robotic arms 12 to meet a variety of table heights, patient sizes, and physician preferences. Similarly, the individually configurable arm mounts on the carriage 17 allow the robotic arm base 21 of robotic arms 12 to be angled in a variety of configurations.
In some embodiments, the slot 20 may be supplemented with slot covers that are flush and parallel to the slot surface to prevent dirt and fluid ingress into the internal chambers of the column 14 and the vertical translation interface as the carriage 17 vertically translates. The slot covers may be deployed through pairs of spring spools positioned near the vertical top and bottom of the slot 20. The covers are coiled within the spools until deployed to extend and retract from their coiled state as the carriage 17 vertically translates up and down. The spring-loading of the spools provides force to retract the cover into a spool when carriage 17 translates towards the spool, while also maintaining a tight seal when the carriage 17 translates away from the spool. The covers may be connected to the carriage 17 using, for example, brackets in the carriage interface 19 to ensure proper extension and retraction of the cover as the carriage 17 translates.
The column 14 may internally comprise mechanisms, such as gears and motors, that are designed to use a vertically aligned lead screw to translate the carriage 17 in a mechanized fashion in response to control signals generated in response to user inputs, e.g., inputs from the console 16.
The robotic arms 12 may generally comprise robotic arm bases 21 and end effectors 22, separated by a series of linkages 23 that are connected by a series of joints 24, each joint comprising an independent actuator, each actuator comprising an independently controllable motor. Each independently controllable joint represents an independent degree of freedom available to the robotic arm. Each of the arms 12 has seven joints, and thus provides seven degrees of freedom. A multitude of joints results in a multitude of degrees of freedom, allowing for “redundant” degrees of freedom. Redundant degrees of freedom allow the robotic arms 12 to position their respective end effectors 22 at a specific position, orientation, and trajectory in space using different linkage positions and joint angles. This allows the system to position and direct a medical instrument from a desired point in space while allowing the physician to move the arm joints into a clinically advantageous position away from the patient to create greater access, while avoiding arm collisions.
The cart base 15 balances the weight of the column 14, carriage 17, and arms 12 over the floor. Accordingly, the cart base 15 houses heavier components, such as electronics, motors, power supply, as well as components that either enable movement and/or immobilize the cart. For example, the cart base 15 includes rollable wheel-shaped casters 25 that allow for the cart to easily move around the room prior to a procedure. After reaching the appropriate position, the casters 25 may be immobilized using wheel locks to hold the cart 11 in place during the procedure.
Positioned at the vertical end of column 14, the console 16 allows for both a user interface for receiving user input and a display screen (or a dual-purpose device such as, for example, a touchscreen 26) to provide the physician user with both pre-operative and intra-operative data. Potential pre-operative data on the touchscreen 26 may include pre-operative plans, navigation and mapping data derived from pre-operative computerized tomography (CT) scans, and/or notes from pre-operative patient interviews. Intra-operative data on display may include optical information provided from the tool, sensor and coordinate information from sensors, as well as vital patient statistics, such as respiration, heart rate, and/or pulse. The console 16 may be positioned and tilted to allow a physician to access the console from the side of the column 14 opposite carriage 17. From this position, the physician may view the console 16, robotic arms 12, and patient while operating the console 16 from behind the cart 11. As shown, the console 16 also includes a handle 27 to assist with maneuvering and stabilizing cart 11.
After insertion into the urethra, using similar control techniques as in bronchoscopy, the ureteroscope 32 may be navigated into the bladder, ureters, and/or kidneys for diagnostic and/or therapeutic applications. For example, the ureteroscope 32 may be directed into the ureter and kidneys to break up kidney stone build-up using a laser or ultrasonic lithotripsy device deployed down the working channel of the ureteroscope 32. After lithotripsy is complete, the resulting stone fragments may be removed using baskets deployed down the ureteroscope 32.
B. Robotic System—Table.
Embodiments of the robotically-enabled medical system may also incorporate the patient's table. Incorporation of the table reduces the amount of capital equipment within the operating room by removing the cart, which allows greater access to the patient.
The arms 39 may be mounted on the carriages through a set of arm mounts 45 comprising a series of joints that may individually rotate and/or telescopically extend to provide additional configurability to the robotic arms 39. Additionally, the arm mounts 45 may be positioned on the carriages 43 such that, when the carriages 43 are appropriately rotated, the arm mounts 45 may be positioned on either the same side of table 38 (as shown in
The column 37 structurally provides support for the table 38, and a path for vertical translation of the carriages. Internally, the column 37 may be equipped with lead screws for guiding vertical translation of the carriages, and motors to mechanize the translation of said carriages based on the lead screws. The column 37 may also convey power and control signals to the carriage 43 and robotic arms 39 mounted thereon.
The table base 46 serves a similar function as the cart base 15 in cart 11 shown in
Continuing with
In some embodiments, a table base may stow and store the robotic arms when not in use.
In a laparoscopic procedure, through small incision(s) in the patient's abdominal wall, minimally invasive instruments (elongated in shape to accommodate the size of the one or more incisions) may be inserted into the patient's anatomy. After inflation of the patient's abdominal cavity, the instruments, often referred to as laparoscopes, may be directed to perform surgical tasks, such as grasping, cutting, ablating, suturing, etc.
To accommodate laparoscopic procedures, the robotically-enabled table system may also tilt the platform to a desired angle.
For example, pitch adjustments are particularly useful when trying to position the table in a Trendelenburg position, i.e., position the patient's lower abdomen at a higher position from the floor than the patient's upper abdomen, for lower abdominal surgery. The Trendelenburg position causes the patient's internal organs to slide towards his/her upper abdomen through the force of gravity, clearing out the abdominal cavity for minimally invasive tools to enter and perform lower abdominal surgical procedures, such as laparoscopic prostatectomy.
C. Instrument Driver & Interface.
The end effectors of the system's robotic arms comprise (i) an instrument driver (alternatively referred to as “instrument drive mechanism” or “instrument device manipulator”) that incorporates electro-mechanical means for actuating the medical instrument and (ii) a removable or detachable medical instrument which may be devoid of any electro-mechanical components, such as motors. This dichotomy may be driven by the need to sterilize medical instruments used in medical procedures, and the inability to adequately sterilize expensive capital equipment due to their intricate mechanical assemblies and sensitive electronics. Accordingly, the medical instruments may be designed to be detached, removed, and interchanged from the instrument driver (and thus the system) for individual sterilization or disposal by the physician or the physician's staff. In contrast, the instrument drivers need not be changed or sterilized, and may be draped for protection.
For procedures that require a sterile environment, the robotic system may incorporate a drive interface, such as a sterile adapter connected to a sterile drape, that sits between the instrument driver and the medical instrument. The chief purpose of the sterile adapter is to transfer angular motion from the drive shafts of the instrument driver to the drive inputs of the instrument while maintaining physical separation, and thus sterility, between the drive shafts and drive inputs. Accordingly, an example sterile adapter may comprise a series of rotational inputs and outputs intended to be mated with the drive shafts of the instrument driver and drive inputs on the instrument. Connected to the sterile adapter, the sterile drape, comprised of a thin, flexible material such as transparent or translucent plastic, is designed to cover the capital equipment, such as the instrument driver, robotic arm, and cart (in a cart-based system) or table (in a table-based system). Use of the drape would allow the capital equipment to be positioned proximate to the patient while still being located in an area not requiring sterilization (i.e., non-sterile field). On the other side of the sterile drape, the medical instrument may interface with the patient in an area requiring sterilization (i.e., sterile field).
D. Medical Instrument.
The elongated shaft 71 is designed to be delivered through either an anatomical opening or lumen, e.g., as in endoscopy, or a minimally invasive incision, e.g., as in laparoscopy. The elongated shaft 71 may be either flexible (e.g., having properties similar to an endoscope) or rigid (e.g., having properties similar to a laparoscope) or contain a customized combination of both flexible and rigid portions. When designed for laparoscopy, the distal end of a rigid elongated shaft may be connected to an end effector comprising a jointed wrist formed from a clevis with an axis of rotation and a surgical tool, such as, for example, a grasper or scissors, that may be actuated based on force from the tendons as the drive inputs rotate in response to torque received from the drive outputs 74 of the instrument driver 75. When designed for endoscopy, the distal end of a flexible elongated shaft may include a steerable or controllable bending section that may be articulated and bent based on torque received from the drive outputs 74 of the instrument driver 75.
Torque from the instrument driver 75 is transmitted down the elongated shaft 71 using tendons within the shaft 71. These individual tendons, such as pull wires, may be individually anchored to individual drive inputs 73 within the instrument handle 72. From the handle 72, the tendons are directed down one or more pull lumens within the elongated shaft 71 and anchored at the distal portion of the elongated shaft 71. In laparoscopy, these tendons may be coupled to a distally mounted end effector, such as a wrist, grasper, or scissor. Under such an arrangement, torque exerted on drive inputs 73 would transfer tension to the tendon, thereby causing the end effector to actuate in some way. In laparoscopy, the tendon may cause a joint to rotate about an axis, thereby causing the end effector to move in one direction or another. Alternatively, the tendon may be connected to one or more jaws of a grasper at the distal end of the elongated shaft 71, where tension from the tendon causes the grasper to close.
In endoscopy, the tendons may be coupled to a bending or articulating section positioned along the elongated shaft 71 (e.g., at the distal end) via adhesive, control ring, or other mechanical fixation. When fixedly attached to the distal end of a bending section, torque exerted on drive inputs 73 would be transmitted down the tendons, causing the softer, bending section (sometimes referred to as the articulable section or region) to bend or articulate. Along the non-bending sections, it may be advantageous to spiral or helix the individual pull lumens that direct the individual tendons along (or inside) the walls of the endoscope shaft to balance the radial forces that result from tension in the pull wires. The angle of the spiraling and/or spacing therebetween may be altered or engineered for specific purposes, wherein tighter spiraling exhibits lesser shaft compression under load forces, while a lower amount of spiraling results in greater shaft compression under load forces, but also limits bending. On the other end of the spectrum, the pull lumens may be directed parallel to the longitudinal axis of the elongated shaft 71 to allow for controlled articulation in the desired bending or articulable sections.
In endoscopy, the elongated shaft 71 houses a number of components to assist with the robotic procedure. The shaft may comprise a working channel for deploying surgical tools, irrigation, and/or aspiration to the operative region at the distal end of the shaft 71. The shaft 71 may also accommodate wires and/or optical fibers to transfer signals to/from an optical assembly at the distal tip, which may include an optical camera. The shaft 71 may also accommodate optical fibers to carry light from proximally-located light sources, such as light emitting diodes, to the distal end of the shaft.
At the distal end of the instrument 70, the distal tip may also comprise the opening of a working channel for delivering tools for diagnostic and/or therapy, irrigation, and aspiration to an operative site. The distal tip may also include a port for a camera, such as a fiberscope or a digital camera, to capture images of an internal anatomical space. Relatedly, the distal tip may also include ports for light sources for illuminating the anatomical space when using the camera.
In the example of
Like earlier disclosed embodiments, an instrument 86 may comprise an elongated shaft portion 88 and an instrument base 87 (shown with a transparent external skin for discussion purposes) comprising a plurality of drive inputs 89 (such as receptacles, pulleys, and spools) that are configured to receive the drive outputs 81 in the instrument driver 80. Unlike prior disclosed embodiments, instrument shaft 88 extends from the center of instrument base 87 with an axis substantially parallel to the axes of the drive inputs 89, rather than orthogonal as in the design of
When coupled to the rotational assembly 83 of the instrument driver 80, the medical instrument 86, comprising instrument base 87 and instrument shaft 88, rotates in combination with the rotational assembly 83 about the instrument driver axis 85. Since the instrument shaft 88 is positioned at the center of instrument base 87, the instrument shaft 88 is coaxial with instrument driver axis 85 when attached. Thus, rotation of the rotational assembly 83 causes the instrument shaft 88 to rotate about its own longitudinal axis. Moreover, as the instrument base 87 rotates with the instrument shaft 88, any tendons connected to the drive inputs 89 in the instrument base 87 are not tangled during rotation. Accordingly, the parallelism of the axes of the drive outputs 81, drive inputs 89, and instrument shaft 88 allows for the shaft rotation without tangling any control tendons.
E. Navigation and Control.
Traditional endoscopy may involve the use of fluoroscopy (e.g., as may be delivered through a C-arm) and other forms of radiation-based imaging modalities to provide endoluminal guidance to an operator physician. In contrast, the robotic systems contemplated by this disclosure can provide for non-radiation-based navigational and localization means to reduce physician exposure to radiation and reduce the amount of equipment within the operating room. As used herein, the term “localization” may refer to determining and/or monitoring the position of objects in a reference coordinate system. Technologies such as pre-operative mapping, computer vision, real-time EM tracking, and robot command data may be used individually or in combination to achieve a radiation-free operating environment. In other cases, where radiation-based imaging modalities are still used, the pre-operative mapping, computer vision, real-time EM tracking, and robot command data may be used individually or in combination to improve upon the information obtained solely through radiation-based imaging modalities.
As shown in
The various input data 91-94 are now described in greater detail. Pre-operative mapping may be accomplished through the use of a collection of low dose CT scans. Pre-operative CT scans are reconstructed into three-dimensional (3D) images, which are visualized, e.g., as “slices” of a cutaway view of the patient's internal anatomy. When analyzed in the aggregate, image-based models for anatomical cavities, spaces and structures of the patient's anatomy, such as a patient lung network, may be generated. Techniques such as center-line geometry may be determined and approximated from the CT images to develop a 3D volume of the patient's anatomy, referred to as preoperative model data 91. The use of center-line geometry is discussed in U.S. patent application Ser. No. 14/523,760, the contents of which are herein incorporated in their entirety. Network topological models may also be derived from the CT-images, and are particularly appropriate for bronchoscopy.
In some embodiments, the instrument may be equipped with a camera to provide vision data 92. The localization module 95 may process the vision data to enable one or more vision-based location tracking techniques. For example, the preoperative model data may be used in conjunction with the vision data 92 to enable computer vision-based tracking of the medical instrument (e.g., an endoscope or an instrument advanced through a working channel of the endoscope). For example, using the preoperative model data 91, the robotic system may generate a library of expected endoscopic images from the model based on the expected path of travel of the endoscope, each image linked to a location within the model. Intra-operatively, this library may be referenced by the robotic system in order to compare real-time images captured at the camera (e.g., a camera at a distal end of the endoscope) to those in the image library to assist localization.
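As a rough illustration of the library-lookup idea described above, the following Python sketch scores each pre-rendered image against the live camera frame and returns the best-matching model location. The use of normalized cross-correlation as the similarity measure and all names are assumptions for exposition; a practical system would use a far more robust matching scheme.

import numpy as np

def normalized_correlation(image_a, image_b):
    # Normalized cross-correlation between two equal-sized grayscale images.
    a = np.asarray(image_a, dtype=float)
    b = np.asarray(image_b, dtype=float)
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float((a * b).mean())

def localize_from_image(live_frame, image_library):
    # image_library maps model location IDs to expected endoscopic images;
    # return the location whose expected view best matches the live frame.
    best_location, best_score = None, float("-inf")
    for location_id, expected_image in image_library.items():
        score = normalized_correlation(live_frame, expected_image)
        if score > best_score:
            best_location, best_score = location_id, score
    return best_location, best_score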
Other computer vision-based tracking techniques use feature tracking to determine motion of the camera, and thus the endoscope. Some features of the localization module 95 may identify circular geometries in the preoperative model data 91 that correspond to anatomical lumens and track the change of those geometries to determine which anatomical lumen was selected, as well as the relative rotational and/or translational motion of the camera. Use of a topological map may further enhance vision-based algorithms or techniques.
Optical flow, another computer vision-based technique, may analyze the displacement and translation of image pixels in a video sequence in the vision data 92 to infer camera movement. Examples of optical flow techniques may include motion detection, object segmentation calculations, luminance, motion compensated encoding, stereo disparity measurement, etc. Through the comparison of multiple frames over multiple iterations, movement and location of the camera (and thus the endoscope) may be determined.
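As an illustration of how optical flow can be used to infer camera movement between frames, the sketch below uses OpenCV's dense (Farneback) optical flow to estimate the average pixel displacement between consecutive frames. The use of OpenCV and the specific parameter values are assumptions made for this example, not the method used by the system.

import cv2  # OpenCV, assumed available for this illustration

def mean_pixel_motion(prev_frame, next_frame):
    # Dense Farneback optical flow between two consecutive grayscale frames.
    # Positional arguments after the frames are: output flow, pyramid scale,
    # levels, window size, iterations, poly_n, poly_sigma, and flags.
    flow = cv2.calcOpticalFlowFarneback(prev_frame, next_frame, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    # Mean displacement (in pixels) along x and y; large, consistent values
    # suggest the camera, and thus the endoscope, has moved.
    return flow[..., 0].mean(), flow[..., 1].mean()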
The localization module 95 may use real-time EM tracking to generate a real-time location of the endoscope in a global coordinate system that may be registered to the patient's anatomy, represented by the preoperative model. In EM tracking, an EM sensor (or tracker) comprising one or more sensor coils embedded in one or more locations and orientations in a medical instrument (e.g., an endoscopic tool) measures the variation in the EM field created by one or more static EM field generators positioned at a known location. The location information detected by the EM sensors is stored as EM data 93. The EM field generator (or transmitter), may be placed close to the patient to create a low intensity magnetic field that the embedded sensor may detect. The magnetic field induces small currents in the sensor coils of the EM sensor, which may be analyzed to determine the distance and angle between the EM sensor and the EM field generator. These distances and orientations may be intra-operatively “registered” to the patient anatomy (e.g., the preoperative model) in order to determine the geometric transformation that aligns a single location in the coordinate system with a position in the pre-operative model of the patient's anatomy. Once registered, an embedded EM tracker in one or more positions of the medical instrument (e.g., the distal tip of an endoscope) may provide real-time indications of the progression of the medical instrument through the patient's anatomy.
Robotic command and kinematics data 94 may also be used by the localization module 95 to provide localization data 96 for the robotic system. Device pitch and yaw resulting from articulation commands may be determined during pre-operative calibration. Intra-operatively, these calibration measurements may be used in combination with known insertion depth information to estimate the position of the instrument. Alternatively, these calculations may be analyzed in combination with EM, vision, and/or topological modeling to estimate the position of the medical instrument within the network.
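A crude version of this kinematics-based estimate, assuming only a fixed entry point, a commanded heading, and a known insertion depth, might look like the Python sketch below. All names and the straight-line approximation are assumptions for illustration; a real system would integrate the calibrated pitch and yaw along the inserted length.

import numpy as np

def estimate_tip_position(entry_point, heading, insertion_depth_mm):
    # Dead-reckoning estimate: advance from the entry point along the
    # commanded heading by the known insertion depth (straight-line
    # approximation; articulation is ignored in this sketch).
    heading = np.asarray(heading, dtype=float)
    heading = heading / np.linalg.norm(heading)
    return np.asarray(entry_point, dtype=float) + insertion_depth_mm * heading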
As
The localization module 95 may use the input data 91-94 in combination(s). In some cases, such a combination may use a probabilistic approach where the localization module 95 assigns a confidence weight to the location determined from each of the input data 91-94. Thus, where the EM data may not be reliable (as may be the case where there is EM interference), the confidence of the location determined by the EM data 93 can be decreased and the localization module 95 may rely more heavily on the vision data 92 and/or the robotic command and kinematics data 94.
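A minimal sketch of such a confidence-weighted combination is shown below, assuming each modality reports a probability per candidate branch and a scalar confidence weight. The dictionary-based representation and all names are illustrative assumptions rather than the system's actual data structures.

def fuse_branch_estimates(per_modality_probabilities, confidence_weights):
    # per_modality_probabilities: modality name -> {branch_id: probability}
    # confidence_weights: modality name -> confidence weight
    combined = {}
    for modality, probabilities in per_modality_probabilities.items():
        weight = confidence_weights.get(modality, 0.0)
        for branch_id, p in probabilities.items():
            combined[branch_id] = combined.get(branch_id, 0.0) + weight * p
    total = sum(combined.values()) or 1.0  # renormalize the weighted sum
    return {branch_id: value / total for branch_id, value in combined.items()}

# Example: EM interference detected, so the EM confidence weight is lowered.
estimates = {
    "em":     {"branch_1": 0.7, "branch_2": 0.3},
    "vision": {"branch_1": 0.4, "branch_2": 0.6},
}
print(fuse_branch_estimates(estimates, {"em": 0.2, "vision": 0.8}))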
As discussed above, the robotic systems discussed herein may be designed to incorporate a combination of one or more of the technologies above. The robotic system's computer-based control system, based in the tower, bed and/or cart, may store computer program instructions, for example, within a non-transitory computer-readable storage medium such as a persistent magnetic storage drive, solid state drive, or the like, that, upon execution, cause the system to receive and analyze sensor data and user commands, generate control signals throughout the system, and display the navigational and localization data, such as the position of the instrument within the global coordinate system, anatomical map, etc.
2. Introduction to Location Sensor-based Branch Prediction
Embodiments of the disclosure relate to systems and techniques for location sensor-based branch prediction. The system may employ location sensor(s) or location sensing device(s) to localize the distal end of an instrument, for example, during a medical procedure. The location sensor(s) may be positioned at or near the distal end of the instrument or may be positioned remote from the distal end of the instrument. Examples of location sensors or location sensing devices which may be positioned at or near the distal end of the instrument include EM sensors, vision-based location sensors (e.g., a camera), shape sensing fibers, etc. Examples of location sensors or location sensing devices which may be positioned remotely from the distal end of the instrument include fluoroscopic imaging devices, robotic system component(s) that generate or process robotic data for controlling the position of the instrument via one or more instrument manipulators, remote vision-based location sensors, etc.
The location sensors may be configured to generate location data indicative of the location of the distal end of the instrument, for example, with respect to a location sensor coordinate system. As used herein, the location sensor coordinate system may refer to any coordinate system which can be used to define or determine the positions of the location data (e.g., on a manifold such as Euclidean space) generated by the location sensors. When the location sensors are collocated with the distal end of the instrument, the location data may be representative of the location of the location sensors themselves, which the processor can then use to determine the location of the distal end of the instrument. In certain embodiments, the location sensor coordinate system may comprise a set of axes and an origin, which may be defined based on the particular technology used to implement the location sensors.
For example, EM sensors located in or on the instrument may be configured to measure an EM field generated by an EM field generator. The properties of the EM field, and thus the EM values measured by the EM sensors, may be defined with respect to the location and orientation of the EM field generator. Thus, the positioning of the EM field generator may affect the values measured by the EM sensors and may also define the location and orientation of the EM coordinate system.
As described above, a luminal network of a patient may be pre-operatively mapped using, for example, low dose CT scans to produce a model of the luminal network. Since the model may be produced via a different technique than that used to locate the distal end of the instrument, the model coordinate system may not be aligned with the location sensor coordinate system. Accordingly, in order to use the location sensor coordinate system to track the location of the instrument with respect to the model, one technique may involve registering (e.g., by one or more components of a robotic system or a separate system communicatively coupled to the robotic system, including but not limited to a processor, a localization system, a localization module, etc.) a coordinate system used by one or more location sensors with another coordinate system, such as a coordinate system used by an anatomical model. This registration may include, for example, a translation and/or rotation applied to the location sensor data in order to map the location sensor data from the location sensor coordinate system into the model coordinate system.
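A minimal sketch of applying such a registration, assuming it has already been estimated as a rigid transform (a rotation matrix R and a translation vector t), is shown below; the NumPy representation and function name are assumptions for illustration.

import numpy as np

def register_to_model(sensor_points, rotation, translation):
    # Map N x 3 location-sensor points from the sensor coordinate system into
    # the model coordinate system with a rigid transform: p_model = R p + t.
    sensor_points = np.asarray(sensor_points, dtype=float)
    return sensor_points @ np.asarray(rotation, dtype=float).T + np.asarray(translation, dtype=float)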
The system or processor may perform registration of a location sensor coordinate system to the model coordinate system, for example, during an initial phase of the procedure. Depending on the implementation, the processor may perform the registration process automatically in the background as the instrument is initially advanced through the luminal network. In another implementation, the processor may provide a set of instructions to the user to drive the instrument to specific locations within the luminal network or along a set registration path to facilitate the registration process. Accordingly, the processor may perform a portion of the procedure while the location data received from the location sensor(s) is not registered to the model.
In order to provide feedback to the user regarding the navigation of the instrument during the medical procedure, a “fusion” localization algorithm may be run (e.g., by the localization system 90 of
Since the location sensor(s) may not be registered for at least a portion of the medical procedure, certain aspects of this disclosure may relate to techniques for branch prediction which may be employed based on registered location sensor data or unregistered location sensor data (also generally referred to as “raw” location sensor data). Thus, the system may selectively apply different techniques or combinations thereof for location sensor-based branch prediction depending on whether the location sensor(s) have been registered.
A. EM Navigation-guided Bronchoscopy.
Hereinafter, an example system which may employ the techniques for location sensor-based branch prediction will be described. For example, the system may be configured for an EM navigation-guided bronchoscopic procedure. However, aspects of this disclosure may also apply to systems which use other location sensors which can produce location data as well as to other types of medical procedures.
The system 110 can include one or more robotic arms for positioning and guiding movement of the instrument 115 through the luminal network 140 of the patient 101. The command center 105 can be communicatively coupled to the robotic system 110 for receiving position data and/or providing control signals generated based on user commands received from a user. As used herein, “communicatively coupled” refers to any wired and/or wireless data transfer mediums, including but not limited to a wireless wide area network (WWAN) (e.g., one or more cellular networks), a wireless local area network (WLAN) (e.g., configured for one or more standards, such as the IEEE 802.11 (Wi-Fi)), Bluetooth, data transfer cables, and/or the like. The robotic system 110 can be any of the systems described above with respect to
The instrument 115 may be a tubular and flexible surgical instrument that is inserted into the anatomy of a patient to capture images of the anatomy (e.g., body tissue) and provide a working channel for insertion of other medical instruments to a target tissue site. As described above, the instrument 115 can be a procedure-specific endoscope, for example a bronchoscope, gastroscope, or ureteroscope, or may be a laparoscope or vascular steerable catheter. The instrument 115 can include one or more imaging devices (e.g., cameras or other types of optical sensors) at its distal end. The imaging devices may include one or more optical components such as an optical fiber, fiber array, photosensitive substrate, and/or lens(es). The optical components move along with the tip of the instrument 115 such that movement of the tip of the instrument 115 results in corresponding changes to the field of view of the images captured by the imaging devices. The distal end of the instrument 115 can be provided with one or more EM sensors 125 for tracking the position of the distal end within an EM field generated around the luminal network 140.
The EM controller 135 can control the EM field generator 120 to produce a varying EM field. The EM field can be time-varying and/or spatially varying, depending upon the embodiment. The EM field generator 120 can be an EM field generating board in some embodiments. Some embodiments of the disclosed systems can use an EM field generator board positioned between the patient and the platform 102 supporting the patient 101, and the EM field generator board can incorporate a thin barrier that minimizes tracking distortions caused by conductive or magnetic materials located below it. In other embodiments, an EM field generator board can be mounted on a robotic arm, for example, similar to those shown in the robotic system 110, which can offer flexible setup options around the patient.
In some embodiments, a two-dimensional (2D) display of a 3D luminal network model as described herein, or a cross-section of the 3D model, can resemble
The console base 161 may include a central processing unit, a memory unit, a data bus, and associated data communication ports that are responsible for interpreting and processing signals such as camera imagery and tracking sensor data, e.g., from the instrument 115 shown in
The displays 162 may include electronic monitors (e.g., LCD displays, LED displays, touch-sensitive displays), virtual reality viewing devices, e.g., goggles or glasses, and/or other display devices. In some embodiments, the display modules 162 are integrated with the control modules, for example, as a tablet device with a touchscreen. In some embodiments, one of the displays 162 can display a 3D model of the patient's luminal network and virtual navigation information (e.g., a virtual representation of the end of the endoscope within the model based on EM sensor position) while the other of the displays 162 can display image information received from the camera or another sensing device at the end of the instrument 115. In some implementations, the user 165 can both view data and input commands to the system 110 using the integrated displays 162 and control modules. The displays 162 can display 2D renderings of 3D images and/or 3D images using a stereoscopic device, e.g., a visor or goggles. The 3D images provide an “endo view” (i.e., endoscopic view), which is a computer 3D model illustrating the anatomy of a patient. The “endo view” provides a virtual environment of the patient's interior and an expected location of an instrument 115 inside the patient. A user 165 compares the “endo view” model to actual images captured by a camera to help mentally orient and confirm that the instrument 115 is in the correct—or approximately correct—location within the patient. The “endo view” provides information about anatomical structures, e.g., the shape of airways, circulatory vessels, or an intestine or colon of the patient, around the distal end of the instrument 115. The display modules 162 can simultaneously display the 3D model and CT scans of the anatomy around the distal end of the instrument 115. Further, the display modules 162 may overlay the already determined navigation paths of the instrument 115 on the 3D model and CT scans.
In some embodiments, a model of the instrument 115 is displayed with the 3D models to help indicate a status of a surgical procedure. For example, the CT scans identify a lesion in the anatomy where a biopsy may be necessary. During operation, the display modules 162 may show a reference image captured by the instrument 115 corresponding to the current location of the instrument 115. The display modules 162 may automatically display different views of the model of the instrument 115 depending on user settings and a particular surgical procedure. For example, the display modules 162 show an overhead fluoroscopic view of the instrument 115 during a navigation step as the instrument 115 approaches an operative region of a patient.
B. Location Sensor-based Branch Prediction Using Unregistered Location Data.
As discussed above, an initial phase of a medical procedure may be performed before the location sensor(s) are registered to a model of the luminal network. However, the location sensor(s) may still produce location data prior to registration. Although unregistered to the model, the raw location sensor data may be useful to provide certain localization and navigational functionality. For example, the processor can determine the relative orientations of the instrument at different times based on the raw, unregistered location data. Thus, in certain embodiments, based on the shape and structure of the luminal network and the orientation of the instrument determined based on the unregistered data, the processor may facilitate predicting the next branch of the luminal network into which the instrument is likely to be advanced.
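One reason the raw data remain useful is that a relative orientation change is unaffected by a fixed, unknown registration rotation, as the hypothetical sketch below illustrates; the rotation-matrix representation and all names are assumptions for exposition, not the system's implementation.

import numpy as np

def relative_rotation(r_first, r_second):
    # Rotation taking the first measured orientation to the second. If an
    # unknown but fixed registration rotation R_reg relates raw sensor
    # orientations to the model (measured = R_reg @ r), then
    # (R_reg r_first)^T (R_reg r_second) = r_first^T r_second, so the
    # relative rotation can be computed before registration is available.
    return np.asarray(r_first, dtype=float).T @ np.asarray(r_second, dtype=float)

def rotation_angle(rotation):
    # Magnitude (radians) of a rotation matrix, recovered from its trace.
    return float(np.arccos(np.clip((np.trace(rotation) - 1.0) / 2.0, -1.0, 1.0)))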
Branch prediction may be included as part of a navigation configuration system.
The input data, as used herein, refers to raw data gathered from and/or processed by input devices (e.g., command module(s), optical sensor(s), EM sensor(s), IDM(s)) for generating estimated state information for the endoscope as well as output navigation data. The multiple input data stores 210-240 include an image data store 210, an EM data store 220, a robot data store 230, and a 3D model data store 240. Each type of the input data stores 210-240 stores the name-indicated type of data for access and use by a navigation module 205. Image data may include one or more image frames captured by the imaging device at the instrument tip, as well as information such as frame rates or timestamps that allow a determination of the time elapsed between pairs of frames. Robot data may include data related to physical movement of the medical instrument or part of the medical instrument (e.g., the instrument tip or sheath) within the tubular network. Example robot data includes command data instructing the instrument tip to reach a specific anatomical site and/or change its orientation (e.g., with a specific pitch, roll, yaw, insertion, and retraction for one or both of a leader and a sheath) within the tubular network, insertion data representing insertion movement of the part of the medical instrument (e.g., the instrument tip or sheath), IDM data, and mechanical data representing mechanical movement of an elongate member of the medical instrument, for example motion of one or more pull wires, tendons or shafts of the endoscope that drive the actual movement of the medical instrument within the tubular network. EM data may be collected by EM sensors and/or the EM tracking system as described above. 3D model data may be derived from 2D CT scans as described above. Path data includes the planned navigation path which may be generated by a topological search of the tubular network to one or more targets.
The output navigation data store 290 receives and stores output navigation data provided by the navigation module 205. Output navigation data indicates information to assist in directing the medical instrument through the tubular network to arrive at a particular destination within the tubular network, and is based on estimated state information for the medical instrument at each instant in time, the estimated state information including the location and orientation of the medical instrument within the tubular network. In one embodiment, as the medical instrument moves inside the tubular network, the output navigation data indicating updates of movement and location/orientation information of the medical instrument is provided in real time, which better assists its navigation through the tubular network.
To determine the output navigation data, the navigation module 205 locates (or determines) the estimated state of the medical instrument within a tubular network. As shown in
The state estimator 280 included in the navigation module 205 receives various intermediate data and provides the estimated state of the instrument tip as a function of time, where the estimated state indicates the estimated location and orientation information of the instrument tip within the tubular network. The estimated state data are stored in the estimated data store 285 that is included in the state estimator 280.
The various stores introduced above represent estimated state data in a variety of ways. Specifically, bifurcation data refers to the location of the medical instrument with respect to the set of branches (e.g., bifurcation, trifurcation or a division into more than three branches) within the tubular network. For example, the bifurcation data can be set of branch choices elected by the instrument as it traverses through the tubular network, based on a larger set of available branches as provided, for example, by the 3D model which maps the entirety of the tubular network. The bifurcation data can further include information in front of the location of the instrument tip, such as branches (bifurcations) that the instrument tip is near but has not yet traversed through, but which may have been detected, for example, based on the tip's current position information relative to the 3D model, or based on images captured of the upcoming bifurcations.
Position data indicates the three-dimensional position of some part of the medical instrument within the tubular network or some part of the tubular network itself. Position data can be in the form of absolute locations or relative locations relative to, for example, the 3D model of the tubular network. As one example, position data can include an indication that the instrument is located within a specific branch. The identification of the specific branch may also be stored as a segment identification (ID) which uniquely identifies the specific segment of the model in which the instrument tip is located.
Depth data indicates depth information of the instrument tip within the tubular network. Example depth data includes the total insertion (absolute) depth of the medical instrument into the patient as well as the (relative) depth within an identified branch (e.g., the segment identified by the position data store 287). Depth data may be determined based on position data regarding both the tubular network and medical instrument.
Orientation data indicates orientation information of the instrument tip, and may include overall roll, pitch, and yaw in relation to the 3D model as well as pitch, roll, and yaw within an identified branch.
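The estimated-state fields described above (bifurcation, position, depth, and orientation data) can be pictured as a single record, as in the illustrative Python sketch below; the field names and types are assumptions and do not reflect the system's actual data stores.

from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class EstimatedState:
    # Illustrative container only; not the system's actual data stores.
    branch_history: List[str] = field(default_factory=list)          # bifurcation data: branches traversed so far
    position_xyz: Tuple[float, float, float] = (0.0, 0.0, 0.0)       # position data (model coordinates)
    segment_id: str = ""                                             # segment in which the tip is estimated to lie
    insertion_depth_mm: float = 0.0                                  # absolute insertion depth into the patient
    depth_in_segment_mm: float = 0.0                                 # relative depth within the identified segment
    roll_pitch_yaw_deg: Tuple[float, float, float] = (0.0, 0.0, 0.0) # orientation data relative to the 3D model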
Turning back to
As the state estimator 280 may use several different kinds of intermediate data to arrive at its estimates of the state of the medical instrument within the tubular network, the state estimator 280 is configured to account for the various different kinds of errors and uncertainty in both measurement and analysis that each type of underlying data (robotic, EM, image, path) and each type of algorithm module might create or carry through into the intermediate data used for consideration in determining the estimated state. To address these, two concepts are discussed: that of a probability distribution and that of a confidence value.
The “probability” of the “probability distribution”, as used herein, refers to the likelihood that an estimation of a possible location and/or orientation of the medical instrument is correct. For example, different probabilities may be calculated by one of the algorithm modules indicating the relative likelihood that the medical instrument is in one of several different possible branches within the tubular network. In one embodiment, the type of probability distribution (e.g., discrete distribution or continuous distribution) is chosen to match features of an estimated state (e.g., the type of the estimated state, for example continuous position information vs. a discrete branch choice). As one example, estimated states for identifying which segment the medical instrument is in at a trifurcation may be represented by a discrete probability distribution, and may include three discrete values of 20%, 30% and 50% representing the chance of the instrument being located inside each of the three branches as determined by one of the algorithm modules. As another example, the estimated state may include a roll angle of the medical instrument of 40±5 degrees and a segment depth of the instrument tip within a branch of 4±1 mm, each represented by a Gaussian distribution, which is a type of continuous probability distribution. Different methods or modalities can be used to generate the probabilities, which will vary by algorithm module as described more fully below with reference to later figures.
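For illustration only, the following minimal sketch (hypothetical data structures, not part of the disclosed system) shows one way such a mixed discrete/continuous estimated state could be represented:

```python
from dataclasses import dataclass, field
from typing import Dict, Tuple


@dataclass
class EstimatedState:
    """Hypothetical container for one algorithm module's state estimate."""
    # Discrete distribution: probability that the tip is in each candidate branch.
    branch_probabilities: Dict[str, float] = field(default_factory=dict)
    # Continuous quantities represented as Gaussian (mean, standard deviation) pairs.
    roll_deg: Tuple[float, float] = (0.0, 0.0)
    depth_mm: Tuple[float, float] = (0.0, 0.0)


# Example mirroring the trifurcation discussed above: 20%/30%/50% branch
# probabilities, a roll of 40 +/- 5 degrees, and a segment depth of 4 +/- 1 mm.
state = EstimatedState(
    branch_probabilities={"branch_1": 0.20, "branch_2": 0.30, "branch_3": 0.50},
    roll_deg=(40.0, 5.0),
    depth_mm=(4.0, 1.0),
)
assert abs(sum(state.branch_probabilities.values()) - 1.0) < 1e-9
```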
In contrast, the “confidence value,” as used herein, reflects a measure of confidence in the estimation of the state provided by one of the algorithms based on one or more factors. For the EM-based algorithms, factors such as distortion of the EM field, inaccuracy in EM registration, shift or movement of the patient, and respiration of the patient may affect the confidence in the estimation of the state. In particular, the confidence value in the estimation of the state provided by the EM-based algorithms may depend on the particular respiration cycle of the patient, movement of the patient or of the EM field generators, and the location within the anatomy where the instrument tip is located. For the image-based algorithms, example factors that may affect the confidence value in the estimation of the state include the illumination conditions at the location within the anatomy where the images are captured, the presence of fluid, tissue, or other obstructions against or in front of the optical sensor capturing the images, respiration of the patient, the condition of the tubular network of the patient itself (e.g., the lung), such as general fluid inside the tubular network and occlusion of the tubular network, and the specific operating techniques used in, e.g., navigating or image capturing.
For example, one factor may be that a particular algorithm has differing levels of accuracy at different depths in a patient's lungs, such that relatively close to the airway opening the algorithm may have high confidence in its estimations of medical instrument location and orientation, but that confidence value may drop as the medical instrument travels further toward the bottom of the lung. Generally, the confidence value is based on one or more systemic factors relating to the process by which a result is determined, whereas the probability is a relative measure that arises when trying to determine the correct result from multiple possibilities with a single algorithm based on underlying data.
As one example, a mathematical equation for calculating results of an estimated state represented by a discrete probability distribution (e.g., branch/segment identification for a trifurcation with three values of an estimated state involved) can be as follows:
$$S_1 = C_{EM} \cdot P_{1,EM} + C_{Image} \cdot P_{1,Image} + C_{Robot} \cdot P_{1,Robot}$$
$$S_2 = C_{EM} \cdot P_{2,EM} + C_{Image} \cdot P_{2,Image} + C_{Robot} \cdot P_{2,Robot}$$
$$S_3 = C_{EM} \cdot P_{3,EM} + C_{Image} \cdot P_{3,Image} + C_{Robot} \cdot P_{3,Robot}$$
In the example equations above, $S_i$ ($i = 1, 2, 3$) represents the possible values of the estimated state in a case where three possible segments are identified or present in the 3D model; $C_{EM}$, $C_{Image}$, and $C_{Robot}$ represent the confidence values corresponding to the EM-based, image-based, and robot-based algorithms, respectively; and $P_{i,EM}$, $P_{i,Image}$, and $P_{i,Robot}$ represent the probabilities assigned to segment $i$ by each of those algorithms.
To better illustrate the concepts of probability distributions and confidence values associated with estimated states, a detailed example is provided here. In this example, a user is trying to identify the segment in which an instrument tip is located at a certain trifurcation within a central airway (the predicted region) of the tubular network, and three algorithm modules are used: an EM-based algorithm, an image-based algorithm, and a robot-based algorithm. In this example, the probability distribution corresponding to the EM-based algorithm may be 20% in the first branch, 30% in the second branch, and 50% in the third (last) branch, and the confidence value applied to this EM-based algorithm in the central airway is 80%. For the same example, the probability distribution corresponding to the image-based algorithm may be 40%, 20%, 40% for the first, second, and third branch, and the confidence value applied to the image-based algorithm is 30%; while the probability distribution corresponding to the robot-based algorithm may be 10%, 60%, 30% for the first, second, and third branch, and the confidence value applied to the robot-based algorithm is 20%. The difference between the confidence values applied to the EM-based algorithm and the image-based algorithm indicates that the EM-based algorithm may be a better choice for segment identification in the central airway than the image-based algorithm. An example mathematical calculation of a final estimated state can be:
for the first branch: 20% * 80% + 40% * 30% + 10% * 20% = 30%;
for the second branch: 30% * 80% + 20% * 30% + 60% * 20% = 42%; and
for the third branch: 50% * 80% + 40% * 30% + 30% * 20% = 58%.
In this example, the output estimated state for the instrument tip can be the result values (e.g., the resulting 30%, 42%, and 58%), or a value derived from these result values, such as the determination that the instrument tip is in the third branch. Although this example describes the use of EM-based, image-based, and robot-based algorithm modules, the estimation of the state for the instrument tip can also be provided based on different combinations of the various algorithm modules, including the path-based algorithm.
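The confidence-weighted combination above can be reproduced in a few lines; the following sketch is illustrative only (variable names are assumptions) and simply recomputes the worked example:

```python
# Confidence values per algorithm for this region of the anatomy (central airway).
confidences = {"em": 0.80, "image": 0.30, "robot": 0.20}

# Per-algorithm probability distributions over the three candidate branches.
probabilities = {
    "em":    [0.20, 0.30, 0.50],
    "image": [0.40, 0.20, 0.40],
    "robot": [0.10, 0.60, 0.30],
}

# S_i = C_EM * P_{i,EM} + C_Image * P_{i,Image} + C_Robot * P_{i,Robot}
scores = [
    sum(confidences[algo] * probabilities[algo][i] for algo in confidences)
    for i in range(3)
]
print(scores)                            # approximately [0.30, 0.42, 0.58]
best_branch = scores.index(max(scores))  # index 2, i.e. the third branch
```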
As noted above, the estimated state may be represented in a number of different ways. For example, the estimated state may further include an absolute depth from the airway opening to the location of the tip of the instrument, as well as a set of data representing the set of branches traversed by the instrument within the tubular network, the set being a subset of the entire set of branches provided by, for example, the 3D model of the patient's lungs. The application of probability distributions and confidence values to estimated states allows improved accuracy of the estimation of the location and/or orientation of the instrument tip within the tubular network.
As shown in
B.1. Branch Prediction System
The EM-based algorithm module 250 further includes an EM registration module 252, an EM localization module 254, and an EM branch prediction module 256. The EM registration module 252 may perform registration of EM coordinates to 3D model coordinates. The EM localization module 254 may determine an estimate of the position and orientation of the instrument. The EM branch prediction module 256 may determine a prediction as to which segment of the model the instrument will advance into from the current segment in which the instrument is located.
The EM localization module 254 receives as inputs estimated state data (prior) (e.g., bifurcation data) from the estimated state data store 285, the EM data from the EM data store 220, registration transform data from the registration transform data store 253, as well as 3D model data from the 3D model data store 240. Based on the received data, the EM localization module 254 determines an estimate of the position and orientation of the instrument tip relative to the 3D model of the tubular network and provides EM-based estimated state data (current) to the state estimator 280. As an example, the EM-based estimated state data may be represented as a probability distribution (e.g., a discrete distribution of 20%, 30% and 50% for three segments of a trifurcation, as described above). Additionally, when at a bifurcation as indicated by the received bifurcation data, the EM localization module 254 may compare the pitch and yaw of the tip of the instrument to the angles of each branch in the 3D model to estimate which branch has been selected by the user for traversal. The EM localization module 254 outputs the EM-based estimated state data (current) to the estimated data store 285 and the EM branch prediction module 256.
The EM branch prediction module 256 may determine a prediction as to which segment of the model the instrument will advance into. For example, based on the determined current segment of the model in which the instrument is located and/or orientation data received from the orientation data store 289, the EM branch prediction module 256 may determine a prediction that the instrument will advance into each of the child segments of the current segment. A number of different techniques may be employed by the EM branch prediction module 256 for determining the prediction, which will be described in greater detail below. In some embodiments, the specific technique used by the EM branch prediction module 256 may depend on whether the location data has been registered to the model. The EM branch prediction module 256 may provide the determined prediction to the estimated state data store 285. In some embodiments, as discussed above, the EM branch prediction module 256 may determine the prediction based on orientation data received from the orientation data store 289. In other embodiments, the EM branch prediction module 256 may determine the prediction based on position data from the position data store 287, or based on both the orientation data and the position data.
B.2. Example Route Taken by Instrument
For illustrative purposes, aspects of this disclosure related to location sensor-based branch prediction will be explained in the context of bronchoscopy and navigating portions of the bronchial luminal network. However, the present disclosure can also be applied to other luminal networks and other medical procedures.
B.3. Example Location Data Generated During Procedure
During a procedure, the location sensors may generate a plurality of unregistered data points 355 which represent the tracked locations of the instrument as the instrument is navigated through the airways. The unregistered data points 355 may be defined with respect to a location sensor coordinate system 360. In the case of embodiments utilizing EM sensors, the location sensor coordinate system 360 may be defined by or correspond to the EM field from the EM generator. A navigation system (e.g., the localization system shown in
B.4. Example Branch Prediction Technique
At block 405, the location sensor-based branch prediction system determines a first orientation of an instrument based on first location data generated by a set of one or more location sensors for the instrument. The first location data may be indicative of the location of the instrument in a location sensor coordinate system at a first time. In certain embodiments, the location sensors may be located at or near a distal end of the instrument, and thus, the location data produced by the location sensors may be indicative of the location of the distal end of the instrument. In one embodiment, the first orientation of the instrument may correspond to the orientation of the instrument at an initial location, such as the initial location 330 of
At block 410, the location sensor-based branch prediction system determines a second orientation of the instrument at a second time based on second location data generated by the set of location sensors. The distal end of the instrument may be located within a first segment of the model at the first time and the second time. The first segment may branch into two or more child segments. In the example embodiment illustrated in
At block 415, the location sensor-based branch prediction system determines data indicative of a difference between the first orientation and the second orientation. This may include, for example, determining the orientation of the instrument, based on unregistered location sensor data, at two successive points in time. Based on the difference between the orientations of the instrument while positioned in the current segment, the system may be able to predict into which of the two or more child segments the instrument is most likely to be advanced. That is, since the system has access, from the model, to the orientation of the first generation airway 305, which branches into the two second generation airways 315 and 320, the system may be able to predict into which of the child segments the instrument is most likely to be advanced based on a change in the orientation of the instrument.
In one implementation, the location sensor-based branch prediction system may determine the difference between the initial orientation and a subsequent orientation by calculating the relative transformation matrix between the initial orientation and the subsequent orientation. The system may decompose the transformation matrix into the roll, pitch, and yaw angles, defining the orientation of the instrument in the location sensor coordinate system. Certain location sensor technologies (e.g., EM location sensors) which can be employed as the location sensor(s) may be substantially aligned with the patient in at least one angular degree of freedom when the patient lies supine for a given procedure (e.g., for a bronchoscopy procedure). In the EM implementation, an EM field generator may generate an EM field having an orientation that is defined with respect to the orientation of the EM field generator. Thus, by arranging the orientation of the EM field generator to be aligned with the patient, the system may be able to perform the branch prediction method 400 using only the yaw angle determined from the relative transformation matrix.
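As a rough illustration of this approach, the sketch below computes the relative rotation between two measured orientations and extracts only its yaw component; the rotation convention and the example values are assumptions for illustration and do not reflect a specific implementation of the disclosed system:

```python
import numpy as np


def yaw_from_rotation(R):
    """Yaw (heading) angle of a 3x3 rotation matrix, assuming a ZYX Euler convention."""
    return np.arctan2(R[1, 0], R[0, 0])


def relative_yaw(R_initial, R_current):
    """Yaw component of the relative rotation taking the initial orientation
    to the current orientation, expressed in the location sensor coordinate system."""
    R_rel = R_current @ R_initial.T
    return yaw_from_rotation(R_rel)


# Example: a pure 20-degree yaw change is recovered from the two orientations.
theta = np.deg2rad(20.0)
R0 = np.eye(3)
R1 = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0,            0.0,           1.0]])
assert np.isclose(relative_yaw(R0, R1), theta)
```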
Since the orientation of the patient and the EM field may be known for a bronchoscopy procedure, the system may be able to determine a yaw angle between the initial orientation and the subsequent orientation based on the data generated by the EM sensor. The bifurcation of the trachea into the primary bronchi can thus be defined with respect to the yaw axis in the EM sensor coordinate system. Accordingly, when the instrument is located within the trachea 305 (see
In some embodiments, block 415 may also include the system determining the angle formed between the orientations of the child segments. The amount of change in the orientation of the distal end of the instrument required to redirect the insertion direction of the instrument from one of the child segments to the other may correspond to the angle formed between the child segments. As described below, the system may use the angle formed between the child segments as a factor in determining whether to update the branch prediction and/or as a factor in assigning probabilities to the child segments during branch prediction. It is to be appreciated that, in embodiments that select thresholds based on the angles formed between the child segments, the threshold may be determined at design time (e.g., the threshold is a parameter selected by a designer of the system based on the angle) or at run time (e.g., the system includes logic and data to dynamically determine the threshold either pre-operatively or intra-operatively). It is also to be appreciated that the ability of run-time approaches to accurately distinguish between the angles using the threshold may be impacted by the hardware and software of the system and by the anatomy of the patient.
At block 420, the location sensor-based branch prediction system determines a prediction that the instrument will advance into a first one of the child segments based on the data indicative of the difference. Depending on the embodiment, the prediction may include: an identification of the child segment into which the instrument is most likely to be advanced, a probability assigned to each of the child segments that the instrument will be advanced into the corresponding child segment, an ordered list of the child segments from the highest probability to the lowest probability, etc. When the system has previously determined a prediction that the instrument will advance into one of the child segments, the system may update the prediction based on the difference in orientation determined in block 415.
In embodiments where the system determines the angle between the child segments, the system may use the angle formed between the orientations of the child segments in determining the prediction. In other embodiments, the system may determine the prediction based on the difference between the orientation of the distal end of the instrument and the orientation of each of the child segments. In one embodiment, the system may assign a higher probability to a child segment that has a smaller difference in orientation from the orientation of the instrument than the remaining child segments.
In certain embodiments, the system may refrain from adjusting the probabilities of the instrument advancing into each of the child segments unless the orientation of the instrument has changed by more than a threshold level. The system may, in certain embodiments, adjust the threshold level based on the angle between the child segments. For example, the system may increase the threshold level when the angle between the child segments is greater than a first threshold angle and decrease the threshold level when the angle between the child segments is less than a second threshold angle. Once the system has determined a prediction that the instrument will advance into a given one of the child segments (e.g., based on the probability of advancing into the given one being greater than the probabilities for the other child segments), the system may subsequently determine the orientation of the instrument at a third time based on subsequent location data generated by the set of location sensors. The system may be configured to: calculate an angle between (i) the orientation at the third time (e.g., a time after the second time where the instrument is located at the subsequent location) and (ii) the initial orientation; and compare the calculated angle to a threshold angle value. The system may be further configured to update the probabilities of the instrument advancing into each of the child segments in response to the calculated angle being greater than the threshold angle value. Thus, in certain implementations, the system may not update the probabilities of the instrument advancing into each of the child segments unless the orientation of the instrument forms an angle with the initial orientation that is greater than the threshold angle value. The method 400 ends at block 425.
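A minimal sketch of the threshold-gated update described above, assuming orientations are available as unit direction vectors; the specific thresholds and scale factors are illustrative assumptions rather than values used by the disclosed system:

```python
import numpy as np


def angle_between(u, v):
    """Angle in radians between two unit direction vectors."""
    return float(np.arccos(np.clip(np.dot(u, v), -1.0, 1.0)))


def maybe_update_prediction(initial_dir, current_dir, child_dirs,
                            child_angle, base_threshold=np.deg2rad(15)):
    """Hypothetical threshold-gated branch prediction update.

    Returns None when the orientation change from the initial orientation is
    below the (angle-dependent) threshold; otherwise returns the index of the
    child segment whose orientation is closest to the current orientation.
    """
    threshold = base_threshold
    if child_angle > np.deg2rad(60):       # widely separated children: raise threshold
        threshold *= 1.5
    elif child_angle < np.deg2rad(30):     # closely spaced children: lower threshold
        threshold *= 0.5

    if angle_between(initial_dir, current_dir) <= threshold:
        return None                        # change too small; keep existing prediction

    angles = [angle_between(current_dir, d) for d in child_dirs]
    return int(np.argmin(angles))          # child most aligned with the instrument
```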
The above-described method 400 may be particularly advantageous when performed using unregistered location data. That is, since unregistered location data may not provide an exact mapping of the location and/or orientation of the instrument to the model coordinate system, the system may use the relative orientation of the instrument at two or more successive times to determine whether a change in the measured orientation of the instrument is consistent with the instrument being articulated towards one of the child segments. The detected change in the measured orientation of the instrument (e.g., within the location sensor coordinate system) may be consistent with a change in the physical orientation of the instrument to be pointed in a direction that is closer to the physical orientation of one of the child segments. In other words, the relative change in the orientation of the instrument in the location sensor coordinate system from the initial orientation to the subsequent orientation may still be indicative of a change in the direction of advancement of the instrument towards one of the child segments when determined using unregistered location data. Thus, since the location sensor coordinate system may not be registered to the model coordinate system during an initial phase of certain procedures, the method 400 can provide branch prediction during the initial phase, which can be used by a sensor fusion technique (e.g., localization system 90 of
In one example, when the data indicative of a difference between the initial orientation and the subsequent orientation is consistent with the instrument being advanced or directed towards the patient's left primary bronchus (e.g., the second generation airway 320), the system may predict that the instrument will be advanced into the left primary bronchus rather than the right primary bronchus. Accordingly, the system may update the prediction for each child branch based on whether the change in the orientation is indicative of the instrument being advanced or directed towards or away from the corresponding child branch.
B.5. Selection of the Initial Location
As described above, the method 400 may include determining a first orientation of the instrument at block 405 when the instrument is located at an initial location. The location sensor-based branch prediction system may select the initial location of the instrument in response to initialization of the state estimation module 240. In certain embodiments, the location sensor-based branch prediction system may select the first indication of the location of the instrument produced by the state estimation module 240 after initialization as the initial location. However, in other embodiments, the system may select other locations (e.g., the second, third, etc. location of the instrument) produced by the state estimation module 240 as the initial location.
In some embodiments, the system may be configured to select the initial location of the instrument based on a determination that the orientation of the instrument at the initial location is substantially aligned with the orientation of the current segment (e.g., aligned with a longitudinal axis of the current segment). As used herein, the system may consider the orientation of the instrument to be substantially aligned with the orientation of the current segment when the difference between the orientations of the instrument and the current segment is less than a threshold difference. However, since the received location sensor data may be unregistered to the model coordinate system, the system may not be able to directly compare the orientation of the instrument to the orientation of the current segment.
In one implementation, the system may be configured to receive an indication from a user that the instrument is aligned with the current segment. The user may be able to confirm that the orientation of the instrument is currently aligned with a first generation segment based on other sensors of the system (e.g., a camera located at the distal end of the instrument). In one implementation, the system may provide instructions to the user to drive the instrument to a defined position within the luminal network (e.g., the carina 310 of
In other implementations, the system may be configured or programmed to automatically select the initial location during the driving of the instrument without receiving user input. In one embodiment, the system may track the orientation of the instrument as the instrument is advanced through the current segment over a period of time and, in response to the orientation of the instrument being substantially unchanged for a threshold time period, the system may determine that the orientation of the instrument during the identified period is aligned with the orientation of the current segment. In some embodiments, the system may determine that the orientation of the instrument is substantially unchanged when a maximum difference between the measured orientations of the instrument over the time period is less than a threshold difference.
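One possible (purely illustrative) way to detect such a stable-orientation window from sampled direction vectors is sketched below; the angular and duration thresholds are assumptions:

```python
import numpy as np


def find_stable_initial_index(directions, timestamps,
                              max_angle_rad=np.deg2rad(5), min_duration_s=2.0):
    """Hypothetical detection of a window of substantially unchanged orientation.

    `directions` is a sequence of unit direction vectors sampled over time and
    `timestamps` the corresponding times in seconds. Returns the index at the
    start of the first window whose samples all stay within `max_angle_rad` of
    the window's first sample for at least `min_duration_s`, or None.
    """
    start = 0
    for i in range(1, len(directions)):
        ref = directions[start]
        deviation = np.arccos(np.clip(np.dot(ref, directions[i]), -1.0, 1.0))
        if deviation > max_angle_rad:
            start = i                      # orientation moved; restart the window
        elif timestamps[i] - timestamps[start] >= min_duration_s:
            return start
    return None
```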
B.6. Confirmation of a Registration Process
In certain implementations, the system may be configured to perform a registration process in order to register the coordinate system of the location sensor(s) to the coordinate system of the model of the luminal network. The registration may be stored in a registration data store 225 as shown in
In certain procedures, the target may include a nodule (or lesion) to which the instrument may be driven to facilitate diagnosis and/or treatment. Thus, the memory may store the target path to the target within the model and the contra-lateral registration path. During the registration process, the system may be configured to confirm whether the user is currently driving the instrument into the correct branch of the luminal network as defined by the contra-lateral registration path, based on the predictions of whether the instrument will advance down each of the child segments. Thus, the system may determine whether the instrument is located along the contra-lateral registration path based on the predictions. In one implementation, when approaching a bifurcation in the luminal network (e.g., the bifurcation near the primary bronchi), the system may compare the probability that the instrument will advance into the branch along the contra-lateral registration path to a threshold probability. When the probability is less than the threshold probability, the system may display an indication to the user that the user may not be driving towards the correct branch.
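For illustration, a trivial sketch of such a check is shown below (the function, threshold, and message are hypothetical and not part of the disclosure):

```python
def check_contralateral_path(predicted_probs, expected_child_id, threshold=0.5):
    """Hypothetical check that the instrument is being driven toward the branch
    defined by the contra-lateral registration path.

    `predicted_probs` maps child-segment IDs to predicted probabilities of
    advancement; `expected_child_id` identifies the child on the registration
    path. Returns a warning string when the prediction disagrees, else None.
    """
    if predicted_probs.get(expected_child_id, 0.0) < threshold:
        return ("The instrument may not be directed toward the contra-lateral "
                "registration path; please verify the current branch.")
    return None


# Example: the registration path expects the left branch, but the prediction
# currently favors the right branch, so a warning would be displayed.
warning = check_contralateral_path({"left": 0.35, "right": 0.65}, "left")
```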
In another implementation, the system may determine that the instrument was advanced along the target path prior to being advanced down the contra-lateral registration path, which may be indicative of the user inadvertently driving the instrument along a path that does not correspond to the contra-lateral registration path used during the registration process. Since the contra-lateral registration path may require that the instrument be driven down the contra-lateral path before being driven down the target path, the system may provide an indication that the contra-lateral registration was unsuccessful in response to determining that the instrument was advanced along the target path prior to being advanced down the contra-lateral registration path.
The system may also be configured to display an indication of the location of the instrument with respect to the model during a given procedure to provide feedback to the user. Accordingly, the system may determine a position of the distal end of the instrument with respect to the model based on a plurality of sources of data indicative of the location of the instrument. In certain implementations, the system may determine the position of the distal end of the instrument based on one or more of: the location data received from the location sensors, a set of commands provided to control movement of the instrument, and prediction(s) that the instrument will advance into the child segments of the current segment. Thus, the prediction(s) determined via the method 400 of
C. Registered Location Sensor-based Branch Prediction.
After the system has performed a registration process registering the location sensor coordinate system to the model coordinate system, the system may be able to use the data generated by the location sensor(s) to determine an indication of the location of the distal end of the instrument with reference to the model associated with the model coordinate system. Using the registered location data, the system can produce a prediction as to the child segment of the luminal network into which the instrument is most likely to be advanced. Depending on the implementation, the use of unregistered location sensor data may not provide sufficient accuracy for branch prediction in the luminal network once the instrument has been advanced into the luminal network beyond a certain distance. For example, once the instrument has been advanced into the primary bronchi, the system may not be able to perform branch prediction without registered location sensor data. Accordingly, aspects of this disclosure also relate to the use of registered location data for branch prediction.
At block 605, the location sensor-based branch prediction system determines an orientation of an instrument with respect to the model based on location data generated by a set of one or more location sensors for the instrument. In certain implementations, the location data includes registered location data received from the set of location sensors. At the time the location sensor data is received by the system, the location sensors may be registered to a model of the luminal network. Using the registered location data, the system may be able to determine the location of the distal end of the instrument with respect to the model based on the location data. In certain implementations, the system may employ a sensor fusion technique (e.g., by using localization system 90 of
In certain implementations, a distal end of the instrument may be located within a first segment of the model when the system determines the orientation of the instrument at block 605. In one embodiment, the distal end of the instrument 525 is located within a current segment 525 which includes two child branches 515 and 520 as shown in
At block 610, the location sensor-based branch prediction system determines an orientation of a first one of the child segments. In the example of
At block 615, the location sensor-based branch prediction system determines a prediction that the instrument will advance into the first child segment based on the orientation of the instrument and the orientation of the first child segment. This may include, for example, the system determining the difference between the orientation of the distal end of the instrument 525 and the orientations of each of the child segments 515 and 520 in the
Depending on the embodiment, the prediction may include: an identification of the child segment into which the instrument is most likely to be advanced, a probability assigned to each of the child segments that the instrument will be advanced into the corresponding child segment, an ordered list of the child segments from the highest probability to the lowest probability, etc. Thus, in certain embodiments, the prediction may include data indicative of a probability that the instrument will advance into each of the child segments. In some implementations, the system may assign higher probabilities to child segments having a smaller difference in orientation from the orientation of the instrument. Thus, the system may determine data indicative of a difference between the orientation of the instrument and the orientation of each of the child segments to aid in determining the corresponding probabilities.
In certain implementations, the system may further determine an angle between the orientation of the instrument and the orientation of each of the child segments. The angle between the orientation of the instrument and a given child segment may be indicative of the difference between the orientations. Thus, the system may be configured to assign a probability to a given child segment that is inversely proportional to the angle between the given child segment and the instrument. In one implementation, the system may determine the angle based on the dot product of the orientation of the instrument and the orientation of a given child segment. Since a smaller angle may be indicative of greater alignment between the orientation of the instrument and a given child segment, the system may also use the inverse of the determined angle to calculate the probability for the given child segment.
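A brief sketch of this inverse-angle weighting, assuming registered orientations are available as unit vectors in the model coordinate system (the normalization scheme and example values are illustrative assumptions):

```python
import numpy as np


def child_probabilities(instrument_dir, child_dirs, eps=1e-6):
    """Hypothetical probabilities inversely proportional to the angle between the
    registered instrument orientation and each child segment orientation."""
    angles = np.array([
        np.arccos(np.clip(np.dot(instrument_dir, d), -1.0, 1.0))
        for d in child_dirs
    ])
    weights = 1.0 / (angles + eps)         # smaller angle -> larger weight
    return weights / weights.sum()         # normalize to a probability distribution


# Example: the instrument points roughly toward the first child segment.
instrument = np.array([0.0, 0.0, 1.0])
children = [np.array([0.0, 0.3, 0.95]), np.array([0.0, -0.8, 0.6])]
children = [c / np.linalg.norm(c) for c in children]
probs = child_probabilities(instrument, children)  # higher probability for children[0]
```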
The prediction may be used by the system as a source of data for a fusion technique (such as, e.g., the localization system 90 of
In certain implementations, the system may apply an auxiliary technique for branch prediction in addition to the above-described orientation-based prediction. In one implementation, the auxiliary technique may include a location-based prediction which compares the location of the distal end of the instrument to the location of the beginning of each of the child segments. The system may determine an auxiliary prediction that the instrument will advance into each of the child segments based on the location of the distal end of the instrument. The prediction may be based on the location of the distal end of the instrument with respect to each of the child segments. Further details and examples of location-based prediction techniques which can be used as the auxiliary technique are described in U.S. Patent Publication No. 2017/0084027, referenced above. In one implementation, when the location sensor data is indicative of the distal end of the instrument being located closer to one child segment than another child segment, the system may assign a higher probability to the closer child segment than to the farther child segment (a sketch of this distance weighting follows the next paragraph). For example, referring back to
One advantage associated with orientation-based location sensor branch prediction over location-based branch prediction is that the orientation-based prediction may be performed continuously during the driving of the instrument through the luminal network. For example, a location-based branch prediction technique may not provide an accurate prediction unless the distal end of the instrument is within a threshold distance of the bifurcation of the current segment into the child segments. Since the location-based prediction technique relies on location sensor data, it may be susceptible to errors in the location sensor registration as well as to jitter in the location sensor data. Thus, when the instrument is relatively far away from the child segments, the distance between the instrument and each of the child segments may not be indicative of the child segment towards which the user is driving the instrument. In contrast, the orientation of the instrument may be more strongly correlated with the child segment into which the user is driving the instrument even when the instrument is relatively far away from the bifurcation defined by the child segments. Accordingly, in some implementations, the orientation-based branch prediction techniques described herein may be applied as the distal end of the instrument is advanced from the beginning of a current segment to the end of the current segment. In other embodiments, the system may apply the orientation-based branch prediction technique independent of the position of the distal end of the instrument within the current segment.
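As an illustration of the auxiliary, location-based technique described above, the following sketch weights each child segment by its proximity to the instrument tip; the positions and the distance weighting are illustrative assumptions:

```python
import numpy as np


def location_based_probabilities(tip_position, child_start_positions, eps=1e-6):
    """Hypothetical auxiliary (location-based) prediction: child segments whose
    openings are closer to the instrument tip receive higher probabilities.

    Positions are 3-D points in the model coordinate system (registered data)."""
    distances = np.array([
        np.linalg.norm(tip_position - p) for p in child_start_positions
    ])
    weights = 1.0 / (distances + eps)      # closer opening -> higher weight
    return weights / weights.sum()


# Example: the tip is about 4 mm from one child opening and about 9 mm from the other.
tip = np.array([0.0, 0.0, 0.0])
openings = [np.array([0.0, 2.0, 3.5]), np.array([0.0, -6.0, 6.7])]
probs = location_based_probabilities(tip, openings)  # favors the first child
```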
3. Implementing Systems and Terminology
Implementations disclosed herein provide systems, methods and apparatuses for location sensor-based branch prediction.
It should be noted that the terms “couple,” “coupling,” “coupled” or other variations of the word couple as used herein may indicate either an indirect connection or a direct connection. For example, if a first component is “coupled” to a second component, the first component may be either indirectly connected to the second component via another component or directly connected to the second component.
The functions described herein may be stored as one or more instructions on a processor-readable or computer-readable medium. The term “computer-readable medium” refers to any available medium that can be accessed by a computer or processor. By way of example, and not limitation, such a medium may comprise random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory, compact disc read-only memory (CD-ROM) or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer. It should be noted that a computer-readable medium may be tangible and non-transitory. As used herein, the term “code” may refer to software, instructions, code or data that is/are executable by a computing device or processor.
The methods disclosed herein comprise one or more steps or actions for achieving the described method. The method steps and/or actions may be interchanged with one another without departing from the scope of the claims. In other words, unless a specific order of steps or actions is required for proper operation of the method that is being described, the order and/or use of specific steps and/or actions may be modified without departing from the scope of the claims.
As used herein, the term “plurality” denotes two or more. For example, a plurality of components indicates two or more components. The term “determining” encompasses a wide variety of actions and, therefore, “determining” can include calculating, computing, processing, deriving, investigating, looking up (e.g., looking up in a table, a database or another data structure), ascertaining and the like. Also, “determining” can include receiving (e.g., receiving information), accessing (e.g., accessing data in a memory) and the like. Also, “determining” can include resolving, selecting, choosing, establishing and the like.
The phrase “based on” does not mean “based only on,” unless expressly specified otherwise. In other words, the phrase “based on” describes both “based only on” and “based at least on.”
The previous description of the disclosed implementations is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these implementations will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other implementations without departing from the scope of the invention. For example, it will be appreciated that one of ordinary skill in the art will be able to employ a number of corresponding alternative and equivalent structural details, such as equivalent ways of fastening, mounting, coupling, or engaging tool components, equivalent mechanisms for producing particular actuation motions, and equivalent mechanisms for delivering electrical energy. Thus, the present invention is not intended to be limited to the implementations shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
This application claims the benefit of U.S. Provisional Application No. 62/678,160, filed May 30, 2018, and the benefit of U.S. Provisional Application No. 62/678,962, filed May 31, 2018, each of which is hereby incorporated by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
4745908 | Wardle | May 1988 | A |
5273025 | Sakiyam et al. | Dec 1993 | A |
5526812 | Dumoulin et al. | Jun 1996 | A |
5550953 | Seraji | Aug 1996 | A |
5831614 | Tognazzini et al. | Nov 1998 | A |
5935075 | Casscells | Aug 1999 | A |
6038467 | De Bliek et al. | Mar 2000 | A |
6047080 | Chen | Apr 2000 | A |
6059718 | Taniguchi et al. | May 2000 | A |
6063095 | Wang et al. | May 2000 | A |
6167292 | Badano | Dec 2000 | A |
6203493 | Ben-Haim | Mar 2001 | B1 |
6246784 | Summers | Jun 2001 | B1 |
6246898 | Vesely | Jun 2001 | B1 |
6332089 | Acker | Dec 2001 | B1 |
6425865 | Salcudean et al. | Jul 2002 | B1 |
6466198 | Feinstein | Oct 2002 | B1 |
6490467 | Bucholz | Dec 2002 | B1 |
6553251 | Lahdesmaki | Apr 2003 | B1 |
6665554 | Charles | Dec 2003 | B1 |
6690963 | Ben-Haim | Feb 2004 | B2 |
6690964 | Beiger et al. | Feb 2004 | B2 |
6812842 | Dimmer | Nov 2004 | B2 |
6899672 | Chin | May 2005 | B2 |
6926709 | Beiger et al. | Aug 2005 | B2 |
7180976 | Wink | Feb 2007 | B2 |
7206627 | Abovitz | Apr 2007 | B2 |
7233820 | Gilboa | Jun 2007 | B2 |
7386339 | Strommer et al. | Jun 2008 | B2 |
7756563 | Higgins | Jul 2010 | B2 |
7850642 | Moll et al. | Dec 2010 | B2 |
7901348 | Soper | Mar 2011 | B2 |
8155403 | Tschirren | Apr 2012 | B2 |
8190238 | Moll et al. | May 2012 | B2 |
8298135 | Ito et al. | Oct 2012 | B2 |
8317746 | Sewell et al. | Nov 2012 | B2 |
8394054 | Wallace et al. | Mar 2013 | B2 |
8460236 | Roelle et al. | Jun 2013 | B2 |
8821376 | Tolkowsky | Sep 2014 | B2 |
8858424 | Hasegawa | Oct 2014 | B2 |
8929631 | Pfister et al. | Jan 2015 | B2 |
9014851 | Wong et al. | Apr 2015 | B2 |
9125639 | Mathis | Sep 2015 | B2 |
9138129 | Diolaiti | Sep 2015 | B2 |
9183354 | Baker et al. | Nov 2015 | B2 |
9186046 | Ramamurthy et al. | Nov 2015 | B2 |
9272416 | Hourtash et al. | Mar 2016 | B2 |
9289578 | Walker et al. | Mar 2016 | B2 |
9459087 | Dunbar | Oct 2016 | B2 |
9504604 | Alvarez | Nov 2016 | B2 |
9561083 | Yu et al. | Feb 2017 | B2 |
9603668 | Weingarten et al. | Mar 2017 | B2 |
9622827 | Yu et al. | Apr 2017 | B2 |
9629682 | Wallace et al. | Apr 2017 | B2 |
9636184 | Lee et al. | May 2017 | B2 |
9710921 | Wong et al. | Jul 2017 | B2 |
9713509 | Schuh et al. | Jul 2017 | B2 |
9717563 | Tognaccini | Aug 2017 | B2 |
9727963 | Mintz et al. | Aug 2017 | B2 |
9737371 | Romo et al. | Aug 2017 | B2 |
9737373 | Schuh | Aug 2017 | B2 |
9744335 | Jiang | Aug 2017 | B2 |
9763741 | Alvarez et al. | Sep 2017 | B2 |
9788910 | Schuh | Oct 2017 | B2 |
9844412 | Bogusky et al. | Dec 2017 | B2 |
9867635 | Alvarez et al. | Jan 2018 | B2 |
9918681 | Wallace et al. | Mar 2018 | B2 |
9931025 | Graetzel et al. | Apr 2018 | B1 |
9949749 | Noonan et al. | Apr 2018 | B2 |
9955986 | Shah | May 2018 | B2 |
9962228 | Schuh et al. | May 2018 | B2 |
9980785 | Schuh | May 2018 | B2 |
9993313 | Schuh et al. | Jun 2018 | B2 |
10016900 | Meyer et al. | Jul 2018 | B1 |
10022192 | Ummalaneni | Jul 2018 | B1 |
10046140 | Kokish et al. | Aug 2018 | B2 |
10080576 | Romo et al. | Sep 2018 | B2 |
10123755 | Walker et al. | Nov 2018 | B2 |
10130345 | Wong et al. | Nov 2018 | B2 |
10136950 | Schoenefeld | Nov 2018 | B2 |
10136959 | Mintz et al. | Nov 2018 | B2 |
10143360 | Roelle et al. | Dec 2018 | B2 |
10143526 | Walker et al. | Dec 2018 | B2 |
10145747 | Lin et al. | Dec 2018 | B1 |
10149720 | Romo | Dec 2018 | B2 |
10159532 | Ummalaneni et al. | Dec 2018 | B1 |
10159533 | Moll et al. | Dec 2018 | B2 |
10169875 | Mintz et al. | Jan 2019 | B2 |
10219874 | Yu et al. | Mar 2019 | B2 |
10231793 | Romo | Mar 2019 | B2 |
10231867 | Alvarez et al. | Mar 2019 | B2 |
10244926 | Noonan et al. | Apr 2019 | B2 |
10278778 | State | May 2019 | B2 |
10285574 | Landey et al. | May 2019 | B2 |
10299870 | Connolly et al. | May 2019 | B2 |
10524866 | Srinivasan | Jan 2020 | B2 |
20010021843 | Bosselmann et al. | Sep 2001 | A1 |
20010039421 | Heilbrun | Nov 2001 | A1 |
20020065455 | Ben-Haim et al. | May 2002 | A1 |
20020077533 | Bieger et al. | Jun 2002 | A1 |
20020120188 | Brock et al. | Aug 2002 | A1 |
20030105603 | Hardesty | Jun 2003 | A1 |
20030125622 | Schweikard | Jul 2003 | A1 |
20030181809 | Hall et al. | Sep 2003 | A1 |
20030195664 | Nowlin et al. | Oct 2003 | A1 |
20040047044 | Dalton | Mar 2004 | A1 |
20040072066 | Cho et al. | Apr 2004 | A1 |
20040186349 | Ewers | Sep 2004 | A1 |
20040249267 | Gilboa | Dec 2004 | A1 |
20040263535 | Birkenbach et al. | Dec 2004 | A1 |
20050027397 | Niemeyer | Feb 2005 | A1 |
20050060006 | Pflueger | Mar 2005 | A1 |
20050085714 | Foley et al. | Apr 2005 | A1 |
20050107679 | Geiger | May 2005 | A1 |
20050143649 | Minai et al. | Jun 2005 | A1 |
20050143655 | Satoh | Jun 2005 | A1 |
20050182295 | Soper et al. | Aug 2005 | A1 |
20050193451 | Quistgaard et al. | Sep 2005 | A1 |
20050197557 | Strommer et al. | Sep 2005 | A1 |
20050256398 | Hastings | Nov 2005 | A1 |
20050272975 | McWeeney et al. | Dec 2005 | A1 |
20060004286 | Chang | Jan 2006 | A1 |
20060015096 | Hauck et al. | Jan 2006 | A1 |
20060025668 | Peterson | Feb 2006 | A1 |
20060058643 | Florent | Mar 2006 | A1 |
20060084860 | Geiger | Apr 2006 | A1 |
20060095066 | Chang | May 2006 | A1 |
20060098851 | Shoham | May 2006 | A1 |
20060149134 | Soper et al. | Jul 2006 | A1 |
20060173290 | Lavallee et al. | Aug 2006 | A1 |
20060184016 | Glossop | Aug 2006 | A1 |
20060209019 | Hu | Sep 2006 | A1 |
20060258935 | Pile-Spellman et al. | Nov 2006 | A1 |
20060258938 | Hoffman et al. | Nov 2006 | A1 |
20070032826 | Schwartz | Feb 2007 | A1 |
20070055128 | Glossop | Mar 2007 | A1 |
20070055144 | Neustadter | Mar 2007 | A1 |
20070073136 | Metzger | Mar 2007 | A1 |
20070083193 | Werneth | Apr 2007 | A1 |
20070123748 | Meglan | May 2007 | A1 |
20070135886 | Maschke | Jun 2007 | A1 |
20070156019 | Larkin et al. | Jul 2007 | A1 |
20070167743 | Honda | Jul 2007 | A1 |
20070167801 | Webler et al. | Jul 2007 | A1 |
20070208252 | Makower | Sep 2007 | A1 |
20070253599 | White et al. | Nov 2007 | A1 |
20070269001 | Maschke | Nov 2007 | A1 |
20070293721 | Gilboa | Dec 2007 | A1 |
20070299353 | Harlev et al. | Dec 2007 | A1 |
20080071140 | Gattani | Mar 2008 | A1 |
20080079421 | Jensen | Apr 2008 | A1 |
20080103389 | Begelman et al. | May 2008 | A1 |
20080118118 | Berger | May 2008 | A1 |
20080118135 | Averbach | May 2008 | A1 |
20080123921 | Gielen et al. | May 2008 | A1 |
20080147089 | Loh | Jun 2008 | A1 |
20080161681 | Hauck | Jul 2008 | A1 |
20080183064 | Chandonnet | Jul 2008 | A1 |
20080183068 | Carls et al. | Jul 2008 | A1 |
20080183073 | Higgins et al. | Jul 2008 | A1 |
20080183188 | Carls et al. | Jul 2008 | A1 |
20080201016 | Finlay | Aug 2008 | A1 |
20080207997 | Higgins et al. | Aug 2008 | A1 |
20080212082 | Froggatt et al. | Sep 2008 | A1 |
20080218770 | Moll et al. | Sep 2008 | A1 |
20080243142 | Gildenberg | Oct 2008 | A1 |
20080262297 | Gilboa | Oct 2008 | A1 |
20080275349 | Halperin | Nov 2008 | A1 |
20080287963 | Rogers et al. | Nov 2008 | A1 |
20080306490 | Lakin et al. | Dec 2008 | A1 |
20080312501 | Hasegawa et al. | Dec 2008 | A1 |
20090030307 | Govari | Jan 2009 | A1 |
20090054729 | Mori | Feb 2009 | A1 |
20090076476 | Barbagli et al. | Mar 2009 | A1 |
20090149867 | Glozman | Jun 2009 | A1 |
20090209817 | Averbuch | Aug 2009 | A1 |
20090227861 | Ganatra | Sep 2009 | A1 |
20090248036 | Hoffman et al. | Oct 2009 | A1 |
20090259230 | Khadem | Oct 2009 | A1 |
20090262109 | Markowitz et al. | Oct 2009 | A1 |
20090292166 | Ito | Nov 2009 | A1 |
20090295797 | Sakaguchi | Dec 2009 | A1 |
20100008555 | Trumer | Jan 2010 | A1 |
20100039506 | Sarvestani et al. | Feb 2010 | A1 |
20100041949 | Tolkowsky | Feb 2010 | A1 |
20100054536 | Huang | Mar 2010 | A1 |
20100113852 | Sydora | May 2010 | A1 |
20100121139 | OuYang | May 2010 | A1 |
20100160733 | Gilboa | Jun 2010 | A1 |
20100161022 | Tolkowsky | Jun 2010 | A1 |
20100161129 | Costa et al. | Jun 2010 | A1 |
20100225209 | Goldberg | Sep 2010 | A1 |
20100240989 | Stoianovici | Sep 2010 | A1 |
20100290530 | Huang et al. | Nov 2010 | A1 |
20100292565 | Meyer | Nov 2010 | A1 |
20100298641 | Tanaka | Nov 2010 | A1 |
20100328455 | Nam et al. | Dec 2010 | A1 |
20110054303 | Barrick | Mar 2011 | A1 |
20110092808 | Shachar | Apr 2011 | A1 |
20110184238 | Higgins | Jul 2011 | A1 |
20110234780 | Ito | Sep 2011 | A1 |
20110238082 | Wenderow | Sep 2011 | A1 |
20110245665 | Nentwick | Oct 2011 | A1 |
20110248987 | Mitchell | Oct 2011 | A1 |
20110249016 | Zhang | Oct 2011 | A1 |
20110276179 | Banks et al. | Nov 2011 | A1 |
20110319910 | Roelle et al. | Dec 2011 | A1 |
20120046521 | Hunter et al. | Feb 2012 | A1 |
20120056986 | Popovic | Mar 2012 | A1 |
20120065481 | Hunter | Mar 2012 | A1 |
20120069167 | Liu et al. | Mar 2012 | A1 |
20120071782 | Patil et al. | Mar 2012 | A1 |
20120082351 | Higgins | Apr 2012 | A1 |
20120120305 | Takahashi | May 2012 | A1 |
20120165656 | Montag | Jun 2012 | A1 |
20120191079 | Moll et al. | Jul 2012 | A1 |
20120209069 | Popovic | Aug 2012 | A1 |
20120215094 | Rahimian et al. | Aug 2012 | A1 |
20120219185 | Hu | Aug 2012 | A1 |
20120289777 | Chopra | Nov 2012 | A1 |
20120289783 | Duindam et al. | Nov 2012 | A1 |
20120302869 | Koyrakh | Nov 2012 | A1 |
20130060146 | Yang et al. | Mar 2013 | A1 |
20130144116 | Cooper et al. | Jun 2013 | A1 |
20130165945 | Roelle | Jun 2013 | A9 |
20130204124 | Duindam | Aug 2013 | A1 |
20130225942 | Holsing | Aug 2013 | A1 |
20130243153 | Sra | Sep 2013 | A1 |
20130246334 | Ahuja | Sep 2013 | A1 |
20130259315 | Angot et al. | Oct 2013 | A1 |
20130303892 | Zhao | Nov 2013 | A1 |
20130345718 | Crawford | Dec 2013 | A1 |
20140058406 | Tsekos | Feb 2014 | A1 |
20140107390 | Brown | Apr 2014 | A1 |
20140114180 | Jain | Apr 2014 | A1 |
20140148808 | Inkpen et al. | Apr 2014 | A1 |
20140142591 | Alvarez et al. | May 2014 | A1 |
20140180063 | Zhao | Jun 2014 | A1 |
20140235943 | Paris | Aug 2014 | A1 |
20140243849 | Saglam | Aug 2014 | A1 |
20140257746 | Dunbar et al. | Sep 2014 | A1 |
20140264081 | Walker et al. | Sep 2014 | A1 |
20140275988 | Walker et al. | Sep 2014 | A1 |
20140276033 | Brannan | Sep 2014 | A1 |
20140276937 | Wong et al. | Sep 2014 | A1 |
20140296655 | Akhbardeh et al. | Oct 2014 | A1 |
20140309527 | Namati et al. | Oct 2014 | A1 |
20140343416 | Panescu | Nov 2014 | A1 |
20140350391 | Prisco et al. | Nov 2014 | A1 |
20140357984 | Wallace et al. | Dec 2014 | A1 |
20140364739 | Liu | Dec 2014 | A1 |
20140364870 | Alvarez et al. | Dec 2014 | A1 |
20150051482 | Liu et al. | Feb 2015 | A1 |
20150051592 | Kintz | Feb 2015 | A1 |
20150054929 | Ito et al. | Feb 2015 | A1 |
20150057498 | Akimoto | Feb 2015 | A1 |
20150073266 | Brannan | Mar 2015 | A1 |
20150141808 | Elhawary | May 2015 | A1 |
20150141858 | Razavi | May 2015 | A1 |
20150142013 | Tanner et al. | May 2015 | A1 |
20150164594 | Romo et al. | Jun 2015 | A1 |
20150164596 | Romo | Jun 2015 | A1 |
20150223725 | Engel | Aug 2015 | A1 |
20150223897 | Kostrzewski et al. | Aug 2015 | A1 |
20150223902 | Walker et al. | Aug 2015 | A1 |
20150255782 | Kim et al. | Sep 2015 | A1 |
20150265087 | Park | Sep 2015 | A1 |
20150265368 | Chopra | Sep 2015 | A1 |
20150275986 | Cooper | Oct 2015 | A1 |
20150287192 | Sasaki | Oct 2015 | A1 |
20150297133 | Jouanique-Dubuis et al. | Oct 2015 | A1 |
20150305650 | Hunter | Oct 2015 | A1 |
20150313503 | Seibel et al. | Nov 2015 | A1 |
20150335480 | Alvarez et al. | Nov 2015 | A1 |
20160000302 | Brown | Jan 2016 | A1 |
20160000414 | Brown | Jan 2016 | A1 |
20160000520 | Lachmanovich | Jan 2016 | A1 |
20160001038 | Romo et al. | Jan 2016 | A1 |
20160008033 | Hawkins et al. | Jan 2016 | A1 |
20160111192 | Suzara | Apr 2016 | A1 |
20160128992 | Hudson | May 2016 | A1 |
20160183841 | Duindam | Jun 2016 | A1 |
20160199134 | Brown et al. | Jul 2016 | A1 |
20160206389 | Miller | Jul 2016 | A1 |
20160213432 | Flexman | Jul 2016 | A1 |
20160228032 | Walker et al. | Aug 2016 | A1 |
20160270865 | Landey et al. | Sep 2016 | A1 |
20160287279 | Bovay et al. | Oct 2016 | A1 |
20160287346 | Hyodo et al. | Oct 2016 | A1 |
20160314710 | Jarc | Oct 2016 | A1 |
20160331469 | Hall et al. | Nov 2016 | A1 |
20160360947 | Iida | Dec 2016 | A1 |
20160372743 | Cho et al. | Dec 2016 | A1 |
20160374541 | Agrawal et al. | Dec 2016 | A1 |
20170007337 | Dan | Jan 2017 | A1 |
20170055851 | Al-Ali | Mar 2017 | A1 |
20170079725 | Hoffman | Mar 2017 | A1 |
20170079726 | Hoffman | Mar 2017 | A1 |
20170084027 | Mintz | Mar 2017 | A1 |
20170100199 | Yu et al. | Apr 2017 | A1 |
20170119481 | Romo et al. | May 2017 | A1 |
20170165011 | Bovay et al. | Jun 2017 | A1 |
20170172673 | Yu et al. | Jun 2017 | A1 |
20170189118 | Chopra | Jul 2017 | A1 |
20170202627 | Sramek et al. | Jul 2017 | A1 |
20170209073 | Sramek et al. | Jul 2017 | A1 |
20170215808 | Shimol et al. | Aug 2017 | A1 |
20170215969 | Zhai et al. | Aug 2017 | A1 |
20170238807 | Vertikov et al. | Aug 2017 | A9 |
20170258366 | Tupin | Sep 2017 | A1 |
20170290631 | Lee et al. | Oct 2017 | A1 |
20170296032 | Li | Oct 2017 | A1 |
20170296202 | Brown | Oct 2017 | A1 |
20170325896 | Donhowe | Nov 2017 | A1 |
20170333679 | Jiang | Nov 2017 | A1 |
20170340241 | Yamada | Nov 2017 | A1 |
20170340396 | Romo et al. | Nov 2017 | A1 |
20170348067 | Krimsky | Dec 2017 | A1 |
20170360508 | Germain et al. | Dec 2017 | A1 |
20170367782 | Schuh et al. | Dec 2017 | A1 |
20180025666 | Ho et al. | Jan 2018 | A1 |
20180055582 | Krimsky | Mar 2018 | A1 |
20180098690 | Iwaki | Apr 2018 | A1 |
20180177556 | Noonan et al. | Jun 2018 | A1 |
20180214011 | Graetzel et al. | Aug 2018 | A1 |
20180217734 | Koenig et al. | Aug 2018 | A1 |
20180221038 | Noonan et al. | Aug 2018 | A1 |
20180221039 | Shah | Aug 2018 | A1 |
20180240237 | Donhowe et al. | Aug 2018 | A1 |
20180250083 | Schuh et al. | Sep 2018 | A1 |
20180271616 | Schuh et al. | Sep 2018 | A1 |
20180279852 | Rafii-Tari et al. | Oct 2018 | A1 |
20180280660 | Landey et al. | Oct 2018 | A1 |
20180286108 | Hirakawa | Oct 2018 | A1 |
20180289431 | Draper et al. | Oct 2018 | A1 |
20180308247 | Gupta | Oct 2018 | A1 |
20180325499 | Landey et al. | Nov 2018 | A1 |
20180333044 | Jenkins | Nov 2018 | A1 |
20180360435 | Romo | Dec 2018 | A1 |
20180368920 | Ummalaneni | Dec 2018 | A1 |
20190000559 | Berman et al. | Jan 2019 | A1 |
20190000560 | Berman et al. | Jan 2019 | A1 |
20190000566 | Graetzel et al. | Jan 2019 | A1 |
20190000576 | Mintz et al. | Jan 2019 | A1 |
20190046814 | Senden et al. | Feb 2019 | A1 |
20190066314 | Abhari | Feb 2019 | A1 |
20190083183 | Moll et al. | Mar 2019 | A1 |
20190086349 | Nelson | Mar 2019 | A1 |
20190105776 | Ho et al. | Apr 2019 | A1 |
20190105785 | Meyer | Apr 2019 | A1 |
20190107454 | Lin | Apr 2019 | A1 |
20190110839 | Rafii-Tari et al. | Apr 2019 | A1 |
20190110843 | Ummalaneni et al. | Apr 2019 | A1 |
20190117176 | Walker et al. | Apr 2019 | A1 |
20190117203 | Wong et al. | Apr 2019 | A1 |
20190151148 | Alvarez et al. | Apr 2019 | A1 |
20190125164 | Roelle et al. | May 2019 | A1 |
20190167366 | Ummalaneni | Jun 2019 | A1 |
20190175009 | Mintz | Jun 2019 | A1 |
20190175062 | Rafii-Tari et al. | Jun 2019 | A1 |
20190175287 | Hill | Jun 2019 | A1 |
20190175799 | Hsu | Jun 2019 | A1 |
20190183585 | Rafii-Tari et al. | Jun 2019 | A1 |
20190183587 | Rafii-Tari et al. | Jun 2019 | A1 |
20190216548 | Ummalaneni | Jul 2019 | A1 |
20190216550 | Eyre | Jul 2019 | A1 |
20190216576 | Eyre | Jul 2019 | A1 |
20190223974 | Romo | Jul 2019 | A1 |
20190228525 | Mintz et al. | Jul 2019 | A1 |
20190228528 | Mintz et al. | Jul 2019 | A1 |
20190246882 | Graetzel et al. | Aug 2019 | A1 |
20190262086 | Connolly et al. | Aug 2019 | A1 |
20190269468 | Hsu et al. | Sep 2019 | A1 |
20190274764 | Romo | Sep 2019 | A1 |
20190287673 | Michihata | Sep 2019 | A1 |
20190290109 | Agrawal et al. | Sep 2019 | A1 |
20190298160 | Ummalaneni et al. | Oct 2019 | A1 |
20190298460 | Al-Jadda | Oct 2019 | A1 |
20190298465 | Chin | Oct 2019 | A1 |
20190328213 | Landey et al. | Oct 2019 | A1 |
20190336238 | Yu | Nov 2019 | A1 |
20190365209 | Ye et al. | Dec 2019 | A1 |
20190365486 | Srinivasan et al. | Dec 2019 | A1 |
20190374297 | Wallace et al. | Dec 2019 | A1 |
20190375383 | Alvarez | Dec 2019 | A1 |
20190380787 | Ye | Dec 2019 | A1 |
20190380797 | Yu | Dec 2019 | A1 |
20200000530 | DeFonzo | Jan 2020 | A1 |
20200000533 | Schuh | Jan 2020 | A1 |
20200022767 | Hill | Jan 2020 | A1 |
20200039086 | Meyer | Feb 2020 | A1 |
20200046434 | Graetzel | Feb 2020 | A1 |
20200054405 | Schuh | Feb 2020 | A1 |
20200054408 | Schuh et al. | Feb 2020 | A1 |
20200060516 | Baez | Feb 2020 | A1 |
20200078103 | Duindam | Mar 2020 | A1 |
20200093549 | Chin | Mar 2020 | A1 |
20200093554 | Schuh | Mar 2020 | A1 |
20200100845 | Julian | Apr 2020 | A1 |
20200100853 | Ho | Apr 2020 | A1 |
20200100855 | Leparmentier | Apr 2020 | A1 |
20200101264 | Jiang | Apr 2020 | A1 |
20200107894 | Wallace | Apr 2020 | A1 |
20200121502 | Kintz | Apr 2020 | A1 |
20200146769 | Eyre | May 2020 | A1 |
20200155084 | Walker | May 2020 | A1 |
20200170630 | Wong | Jun 2020 | A1 |
20200170720 | Ummalaneni | Jun 2020 | A1 |
20200188043 | Yu | Jun 2020 | A1 |
20200206472 | Ma | Jul 2020 | A1 |
20200217733 | Lin | Jul 2020 | A1 |
20200222134 | Schuh | Jul 2020 | A1 |
20200237458 | DeFonzo | Jul 2020 | A1 |
Number | Date | Country |
---|---|---|
101147676 | Mar 2008 | CN |
101222882 | Jul 2008 | CN |
102316817 | Jan 2012 | CN |
102458295 | May 2012 | CN |
102973317 | Mar 2013 | CN |
103735313 | Apr 2014 | CN |
105559850 | May 2016 | CN |
105559886 | May 2016 | CN |
105611881 | May 2016 | CN |
106455908 | Feb 2017 | CN |
106821498 | Jun 2017 | CN |
104931059 | Sep 2018 | CN |
3 025 630 | Jun 2016 | EP |
10-2014-0009359 | Jan 2014 | KR |
2569699 | Nov 2015 | RU |
WO 05087128 | Sep 2005 | WO |
WO 09097461 | Jun 2007 | WO |
WO 15089013 | Jun 2015 | WO |
WO 17048194 | Mar 2017 | WO |
WO 17066108 | Apr 2017 | WO |
WO 17167754 | Oct 2017 | WO |
Entry |
---|
Vemuri et al., Dec. 2015, Inter-operative biopsy site relocations in endoluminal surgery, IEEE Transactions on Biomedical Engineering, Institute of Electrical and Electronics Engineers, <10.1109/TBME.2015.2503981>. <hal-01230752>. |
Ciuti et al., 2012, Intra-operative monocular 3D reconstruction for image-guided navigation in active locomotion capsule endoscopy. Biomedical Robotics and Biomechatronics (Biorob), 4th IEEE RAS & EMBS International Conference on IEEE. |
Fallavollita et al., 2010, Acquiring multiview C-arm images to assist cardiac ablation procedures, EURASIP Journal on Image and Video Processing, vol. 2010, Article ID 871408, pp. 1-10. |
Haigron et al., 2004, Depth-map-based scene analysis for active navigation in virtual angioscopy, IEEE Transactions on Medical Imaging, 23(11):1380-1390. |
Konen et al., 1998, The VN-project: endoscopic image processing for neurosurgery, Computer Aided Surgery, 3:1-6. |
Kumar et al., 2014, Stereoscopic visualization of laparoscope image using depth information from 3D model, Computer methods and programs in biomedicine 113(3):862-868. |
Livatino et al., 2015, Stereoscopic visualization and 3-D technologies in medical endoscopic teleoperation, IEEE. |
Luo et al., 2010, Modified hybrid bronchoscope tracking based on sequential Monte Carlo sampler: Dynamic phantom validation, Asian Conference on Computer Vision, Springer, Berlin, Heidelberg. |
Mayo Clinic, Robotic Surgery, https://www.mayoclinic.org/tests-procedures/robotic-surgery/about/pac-20394974?p=1, downloaded from the Internet on Jul. 12, 2018, 2 pp. |
Mourgues et al., 2002, Flexible calibration of actuated stereoscopic endoscope for overlay in robot-assisted surgery, International Conference on Medical Image Computing and Computer-Assisted Intervention. Springer, Berlin, Heidelberg. |
Nadeem et al., 2016, Depth Reconstruction and Computer-Aided Polyp Detection in Optical Colonoscopy Video Frames, arXiv preprint arXiv:1609.01329. |
Point Cloud, Sep. 10, 2010, Wikipedia, 2 pp. |
Racadio et al., Dec. 2007, Live 3D guidance in the interventional radiology suite, AJR, 189:W357-W364. |
Sato et al., 2016, Techniques of stapler-based navigational thoracoscopic segmentectomy using virtual assisted lung mapping (VAL-MAP), Journal of Thoracic Disease, 8(Suppl 9):S716. |
Shen et al., 2015, Robust camera localisation with depth reconstruction for bronchoscopic navigation. International Journal of Computer Assisted Radiology and Surgery, 10(6):801-813. |
Shi et al., Sep. 14-18, 2014, Simultaneous catheter and environment modeling for trans-catheter aortic valve implantation, IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2024-2029. |
Solheim et al., May 14, 2009, Navigated resection of giant intracranial meningiomas based on intraoperative 3D ultrasound, Acta Neurochir, 151:1143-1151. |
Song et al., 2012, Autonomous and stable tracking of endoscope instrument tools with monocular camera, Advanced Intelligent Mechatronics (AIM), 2012 IEEE-ASME International Conference on. IEEE. |
Verdaasdonk et al., Jan. 23, 2012, Effect of microsecond pulse length and tip shape on explosive bubble formation of 2.78 μm Er,Cr:YSGG and 2.94 μm Er:YAG laser, Proceedings of SPIE, vol. 8221, 12. |
Wilson et al., 2008, A buyer's guide to electromagnetic tracking systems for clinical applications, Proc. of SPIE, 6918:69182B-1 to 69182B-11. |
Yip et al., 2012, Tissue tracking and registration for image-guided surgery, IEEE transactions on medical imaging 31(11):2169-2182. |
Zhou et al., 2010, Synthesis of stereoscopic views from monocular endoscopic videos, Computer Vision and Pattern Recognition Workshops (CVPRW), 2010 IEEE Computer Society Conference on IEEE. |
International search report and written opinion dated Aug. 15, 2019 for PCT/US2019/034145. |
Kiraly et al., 2002, Three-dimensional Human Airway Segmentation Methods for Clinical Virtual Bronchoscopy, Acad Radiol, 9:1153-1168. |
Kiraly et al., Sep. 2004, Three-dimensional path planning for virtual bronchoscopy, IEEE Transactions on Medical Imaging, 23(9):1365-1379. |
Solomon et al., Dec. 2000, Three-dimensional CT-Guided Bronchoscopy With a Real-Time Electromagnetic Position Sensor: A Comparison of Two Image Registration Methods, Chest, 118(6):1783-1787. |
Al-Ahmad et al., dated 2005, Early experience with a computerized robotically controlled catheter system, Journal of Interventional Cardiac Electrophysiology, 12:199-202. |
Gutierrez et al., Mar. 2008, A practical global distortion correction method for an image intensifier based x-ray fluoroscopy system, Med. Phys, 35(3):997-1007. |
Hansen Medical, Inc. 2005, System Overview, product brochure, 2 pp., dated as available at http://hansenmedical.com/system.aspx on Jul. 14, 2006 (accessed Jun. 25, 2019 using the internet archive way back machine). |
Hansen Medical, Inc. Bibliography, product brochure, 1 p., dated as available at http://hansenmedical.com/bibliography.aspx on Jul. 14, 2006 (accessed Jun. 25, 2019 using the internet archive way back machine). |
Hansen Medical, Inc. dated 2007, Introducing the Sensei Robotic Catheter System, product brochure, 10 pp. |
Hansen Medical, Inc. dated 2009, Sensei X Robotic Catheter System, product brochure, 5 pp. |
Hansen Medical, Inc. Technology Advantages, product brochure, 1 p., dated as available at http://hansenmedical.com/advantages.aspx on Jul. 13, 2006 (accessed Jun. 25, 2019 using the internet archive way back machine). |
Marrouche et al., dated May 6, 2005, AB32-1, Preliminary human experience using a novel robotic catheter remote control, Heart Rhythm, 2(5):S63. |
Oh et al., dated May 2005, P5-75, Novel robotic catheter remote control system: safety and accuracy in delivering RF Lesions in all 4 cardiac chambers, Heart Rhythm, 2(5):S277-S278. |
Reddy et al., May 2005, P1-53. Porcine pulmonary vein ablation using a novel robotic catheter control system and real-time integration of CT imaging with electroanatomical mapping, Heart Rhythm, 2(5):S121. |
Slepian, dated 2010, Robotic Catheter Intervention: the Hansen Medical Sensei Robot Catheter System, PowerPoint presentation, 28 pp. |
Number | Date | Country | |
---|---|---|---|
20190365479 A1 | Dec 2019 | US |
Number | Date | Country | |
---|---|---|---|
62678160 | May 2018 | US | |
62678962 | May 2018 | US |