Autonomous endobronchial access with an EM guided catheter

Information

  • Patent Grant
  • Patent Number
    12,303,220
  • Date Filed
    Wednesday, January 26, 2022
  • Date Issued
    Tuesday, May 20, 2025
Abstract
A system for performing a surgical procedure includes a controller including a memory and a processor, the memory storing instructions, which when executed by the processor cause the processor to receive an image captured by a camera, generate a segmented image by applying a first threshold value to the image captured by the camera, identify a lumen within the segmented image, determine a centroid of the lumen within the segmented image, and align a portion of a surgical device operably coupled to the controller with the centroid of the lumen.
Description
BACKGROUND
Technical Field

The present disclosure relates to the field of visualizing the navigation of medical devices, such as biopsy or ablation tools, relative to targets, and in particular, navigating medical devices to a target.


Description of Related Art

There are several commonly applied medical methods, such as endoscopic procedures or minimally invasive procedures, for treating various maladies affecting organs including the liver, brain, heart, lungs, gall bladder, kidneys, and bones. Often, one or more imaging modalities, such as magnetic resonance imaging (MRI), ultrasound imaging, computed tomography (CT), or fluoroscopy are employed by clinicians to identify and navigate to areas of interest within a patient and ultimately a target for biopsy or treatment. In some procedures, pre-operative scans may be utilized for target identification and intraoperative guidance. However, real-time imaging may be required to obtain a more accurate and current image of the target area. Furthermore, real-time image data displaying the current location of a medical device with respect to the target and its surroundings may be needed to navigate the medical device to the target in a safe and accurate manner (e.g., without causing damage to other organs or tissue).


For example, an endoscopic approach has proven useful in navigating to areas of interest within a patient. To enable the endoscopic approach, endoscopic navigation systems have been developed that use previously acquired MRI data or CT image data to generate a three-dimensional (3D) rendering, model, or volume of the particular body part, such as the lungs.


The resulting volume generated from the MRI scan or CT scan is then utilized to create a navigation plan to facilitate the advancement of the endoscope (or other suitable medical device) within the patient anatomy to an area of interest. A locating or tracking system, such as an electromagnetic (EM) tracking system, may be utilized in conjunction with, for example, CT data, to facilitate guidance of the endoscope to the area of interest.


However, manually navigating the endoscope through the luminal network or utilizing robotic controls to remotely navigate the endoscope through the luminal network can be complex and time consuming. Further, navigating the endoscope through the luminal network takes considerable skill to ensure no damage is done to the surrounding tissue and that the endoscope is navigated to the correct location.


SUMMARY

In accordance with the present disclosure, a system for performing a surgical procedure includes a controller operably coupled to a camera, the controller including a memory and a processor, the memory storing instructions, which when executed by the processor, cause the processor to receive an image captured by the camera, generate a segmented image by applying a first threshold value to the image captured by the camera, identify a lumen within the segmented image, determine a centroid of the lumen within the segmented image, and align a portion of a surgical device operably coupled to the controller with the centroid of the lumen.


In aspects, the segmented image may be generated using dynamic binary thresholding.


In other aspects, the system may include a surgical device, wherein the camera is disposed on a distal portion of the surgical device, wherein the surgical device is navigable within a portion of the patient's anatomy.


In certain aspects, the system may include a robotic surgical system operably coupled to the surgical device.


In other aspects, the instructions, when executed by the processor, may cause the robotic surgical system to align the surgical device with the centroid of the lumen.


In aspects, the instructions, when executed by the processor, may cause the processor to generate a pathway to a target tissue.


In certain aspects, the instructions, when executed by the processor, may cause the processor to identify a lumen within the image corresponding to the pathway to the target tissue.


In other aspects, the instructions, when executed by the processor, may cause the robotic surgical system to advance the surgical device within the lumen within the image corresponding to the pathway to the target tissue.


In aspects, the instructions, when executed by the processor, may cause the processor to determine a centroid of the lumen within the segmented image in real-time.


In other aspects, the instructions, when executed by the processor, may cause the robotic surgical system to maintain alignment of the distal portion of the surgical device with the centroid of the lumen within the segmented image as the surgical device is advanced within the lumen.


In accordance with another aspect of the present disclosure, a method of performing a surgical procedure includes receiving an image of a patient's anatomy from a camera operably coupled to a surgical device, generating a segmented image by applying a first threshold value to the image, identifying a lumen within the segmented image, determining a centroid of the lumen within the segmented image, and aligning the surgical device with the centroid of the lumen.


In aspects, generating the segmented image may include applying dynamic binary thresholding utilizing the first threshold value to generate the segmented image.


In other aspects, the method may include applying a second threshold value to the segmented image.


In certain aspects, the method may include generating a binary image from the segmented image.


In other aspects, the method may include generating a pathway through a patient's anatomy to a target tissue.


In aspects, the method may include advancing the surgical device through the patient's anatomy following the centroid of the lumen and the pathway through the patient's anatomy.


In accordance with another aspect of the present disclosure, a system for performing a surgical procedure includes a robotic surgical system including an endoscope including a camera, the camera disposed on a portion of the endoscope, and a drive mechanism operably coupled to the endoscope, a controller operably coupled to the robotic surgical system, the controller including a memory and a processor, the memory storing instructions, which when executed by the processor cause the processor to receive an image captured by the camera, generate a segmented image by applying a first threshold value to the image captured by the camera, identify a lumen within the segmented image, determine a centroid of the lumen, and cause the drive mechanism to align a distal end portion of the endoscope with the centroid of the lumen.


In aspects, the instructions, when executed by the processor, may cause the processor to generate a pathway to a target tissue located within a patient's anatomy.


In other aspects, the instructions, when executed by the processor, may cause the processor to cause the drive mechanism to advance the distal end portion of the endoscope along the pathway.


In certain aspects, the instructions, when executed by the processor, may cause the processor to continuously update the centroid of the lumen as the endoscope is advanced along the pathway.





BRIEF DESCRIPTION OF THE DRAWINGS

Various aspects and embodiments of the disclosure are described hereinbelow with references to the drawings, wherein:



FIG. 1 is a schematic view of a surgical system provided in accordance with the present disclosure;



FIG. 2 is a schematic view of a controller of the surgical system of FIG. 1;



FIG. 3 is a perspective view of a distal end portion of an endoscope of the surgical system of FIG. 1;



FIG. 4 is an exploded view of a drive mechanism of the endoscope of FIG. 3;



FIG. 5 is a perspective view of a robotic surgical system of the surgical system of FIG. 1;



FIG. 6 is a depiction of a graphical user interface of the surgical system of FIG. 1 illustrating a color image;



FIG. 7 is a depiction of the graphical user interface of FIG. 6 illustrating a grayscale version of the color image of FIG. 6;



FIG. 8 is a depiction of the graphical user interface of FIG. 6 illustrating a binary version of the grayscale image of FIG. 7;



FIG. 9 is a depiction of the graphical user interface of FIG. 6 illustrating labels overlaid on the color image of FIG. 6;



FIG. 10 is a depiction of the graphical user interface of FIG. 6 illustrating a marker corresponding to a centroid of each lumen identified in the image of FIG. 9;



FIG. 11 is a depiction of the graphical user interface of FIG. 6 illustrating a pathway to target tissue overlaid on the image of FIG. 10;



FIG. 12A is a flow diagram of a method of navigating a surgical instrument to target tissue in accordance with the present disclosure;



FIG. 12B is a continuation of the flow diagram of FIG. 12A;



FIG. 13A is a flow diagram of a method of identifying a centroid of a lumen of a luminal network in accordance with the present disclosure; and



FIG. 13B is a continuation of the flow diagram of FIG. 13A.





DETAILED DESCRIPTION

The present disclosure is directed to a surgical system having a controller or workstation operably coupled to an endoscope, which in turn, is operably coupled to a robotic surgical system. The controller is configured to generate a 3D model of a patient's lungs and generate a pathway through the luminal network of the patient's lungs to reach target tissue. The controller is operably coupled to an electromagnetic navigation system, which is configured to identify a location of a distal end portion of the endoscope within the luminal network of the patient's lungs using an electromagnetic sensor disposed within the endoscope and an electromagnetic field generator disposed proximate the patient.


The endoscope includes a camera, which in embodiments can be a miniature CMOS camera, which captures images of the patient's anatomy as the endoscope is navigated through the luminal network of the patient's lungs. The software associated with the controller analyzes the images captured by the camera disposed on the endoscope, which, in embodiments, may occur continuously throughout navigation of the endoscope to the target tissue. Initially, the software converts white light images captured by the camera to grayscale images, which in turn, are segmented into two portions by comparing the pixels within the grayscale image to a first threshold value. In embodiments, the software application segments the image using dynamic binary thresholding, which determines a mean value of the pixels in each of the two segmented portions. After initial segmentation, the software application averages the mean values of the pixels of the two segmented portions to calculate an updated threshold value. If the difference between the updated threshold value and the previous threshold value is above a specified limit, the image is re-segmented using the updated threshold value and the average of the mean values of the pixels is once again calculated and compared to the specified limit. If the difference is below the specified limit, the software generates a binary image, illustrating tissue walls in one color and lumens within the image in another. In one non-limiting embodiment, the tissue walls are illustrated as black whereas the lumens are illustrated as white. Although generally described as updating a threshold value and re-segmenting the image, it is envisioned that the software application may utilize Otsu's method, wherein automatic image thresholding is performed by automatically identifying a single intensity threshold that separates the pixels identified in the image into two classes, foreground and background. It is envisioned that the software application may assign labels, such as a translucent color, to each lumen identified in the image, which can be overlaid on the live image captured by the camera disposed on the endoscope.
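For illustration only, the following is a minimal sketch of the iterative (dynamic binary) thresholding loop described above, written in Python with NumPy. The function name, the initial threshold of 127, the convergence limit, and the assumption that lumens appear darker than the surrounding tissue wall (so that the darker class becomes the white lumen class of the binary image) are illustrative choices, not values taken from the patent.

```python
import numpy as np

def dynamic_binary_threshold(gray: np.ndarray, limit: float = 0.5,
                             initial: float = 127.0) -> np.ndarray:
    """Iteratively split an 8-bit grayscale frame into two classes.

    Returns a binary image in which True marks the darker class, assumed here
    to correspond to lumens; the brighter class corresponds to tissue walls.
    """
    threshold = initial
    while True:
        lower = gray[gray <= threshold]            # first segmented portion
        upper = gray[gray > threshold]             # second segmented portion
        if lower.size == 0 or upper.size == 0:     # degenerate frame; stop iterating
            break
        updated = (lower.mean() + upper.mean()) / 2.0   # average of the two mean values
        converged = abs(updated - threshold) < limit    # change in threshold below the limit
        threshold = updated
        if converged:
            break                                   # ready to generate the binary image
    return gray <= threshold
```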


The software application compares the image captured by the endoscope to pre-procedure images, and in conjunction with the electromagnetic sensor, determines a location at which the image was captured within the luminal network of the patient's lungs. With the location known, the pathway to the target tissue may be overlaid on the live image and the software may select the appropriate lumen within the image to traverse to maintain course on the pathway and reach the target tissue.


The surgical system of the present disclosure enables automated navigation of the endoscope within the luminal network of the patient's lungs to the target tissue by calculating a centroid of the lumens from the segmented images. In this manner, each lumen identified within the images defines a shape. Using the binary image, the software application is configured to calculate a centroid of the particular shape of each lumen. With the centroid of the lumen known, the software application is configured to instruct the robotic surgical system to manipulate or otherwise articulate the distal end portion of the endoscope to be aligned with the centroid of the lumen associated with the pathway to the target tissue.
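As a sketch of the centroid computation from a binary lumen shape, the snippet below simply averages the row and column indices of the lumen's pixels, which is equivalent to the first-order image moments of the shape. The function name and the (row, column) convention are illustrative assumptions; the mask is assumed to contain the pixels of a single lumen.

```python
import numpy as np

def lumen_centroid(lumen_mask: np.ndarray) -> tuple[float, float] | None:
    """Centroid (row, col) of a single lumen given its binary mask.

    Returns None when the mask contains no lumen pixels (e.g., the camera is
    facing a tissue wall in the current frame).
    """
    rows, cols = np.nonzero(lumen_mask)
    if rows.size == 0:
        return None
    return float(rows.mean()), float(cols.mean())
```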


Using this information, the surgical system continually analyzes the images captured by the camera disposed on the endoscope to maintain alignment of the distal end portion of the endoscope with the centroid of the lumen and avoid impacting or otherwise touching the tissue wall of the lumen. Further, monitoring the location of the endoscope within the luminal network of the patient's lungs enables the robotic surgical system to automatically manipulate and navigate the endoscope proximate the target tissue for treatment. As can be appreciated, the surgical system may perform spot checks or otherwise check that the real-time location of the endoscope is registered with the 3D model and pathway to correct any discrepancies and maintain course to the target tissue. In embodiments, a clinician may manually override automatic navigation to correct errors and reposition the endoscope to the correct location.


Although generally described with reference to the lung, it is contemplated that the systems and methods described herein may be used with any structure within the patient's body, such as the liver, kidney, prostate, or gynecological anatomy, amongst others.


Turning now to the drawings, FIG. 1 illustrates a surgical system provided in accordance with the present disclosure and generally identified by reference numeral 10. As will be described in further detail hereinbelow, the surgical system 10 is generally configured to identify target tissue, register real-time images captured by a surgical instrument to a generated 3-Dimensional (3D) model, and navigate the surgical instrument to the target tissue.


The surgical system includes an endoscope 100, a controller or workstation 20 operably coupled to the endoscope 100, and a robotic surgical system 200 operably coupled to the controller 20 and operably coupled to the endoscope 100. The patient “P” is shown lying on an operating table 60 with the endoscope 100 inserted through the patient's mouth and into the patient's airways, although it is contemplated that the endoscope 100 may be inserted into any suitable body cavity of the patient, depending upon the procedure being performed.


Continuing with FIG. 1 and with additional reference to FIG. 2, the controller 20 includes a computer 22 and a display 24 that is configured to display one or more user interfaces 26 and/or 28. The controller 20 may be a desktop computer or a tower configuration with the display 24 or may be a laptop computer or other computing device. The controller 20 includes a processor 30 which executes software stored in a memory 32. The memory 32 may store video or other imaging data captured by the endoscope 100 or pre-procedure images from, for example, a computed tomography (CT) scan, positron emission tomography (PET), magnetic resonance imaging (MRI), cone-beam CT, amongst others. In addition, the memory 32 may store one or more applications 34 to be executed on the processor 30. Though not explicitly illustrated, the display 24 may be incorporated into a head mounted display such as an augmented reality (AR) headset such as the HoloLens offered by Microsoft Corp.


A network interface 36 enables the controller 20 to communicate with a variety of other devices and systems via the Internet. The network interface 36 may connect the controller 20 to the Internet via a wired or wireless connection. Additionally, or alternatively, the communication may be via an ad-hoc Bluetooth® or wireless network enabling communication with a wide-area network (WAN) and/or a local area network (LAN). The network interface 36 may connect to the Internet via one or more gateways, routers, and network address translation (NAT) devices. The network interface 36 may communicate with a cloud storage system 38, in which further image data and videos may be stored. The cloud storage system 38 may be remote from or on the premises of the hospital such as in a control or hospital information technology room. An input module 40 receives inputs from an input device such as a keyboard, a mouse, voice commands, amongst others. An output module 42 connects the processor 30 and the memory 32 to a variety of output devices such as the display 24. In embodiments, the controller 20 may include its own display 44, which may be a touchscreen display.


In embodiments, the endoscope 100 includes a location sensor, such as an electromagnetic (EM) sensor 102 (FIG. 3) which receives electromagnetic signals from an electromagnetic field generator 104 (FIG. 1) which generates one or more electromagnetic fields. In one non-limiting embodiment, the EM field generator 104 generates three or more electromagnetic fields. It is envisioned that the EM sensor 102 may be a single coil sensor that enables the system 10 to identify the position of the endoscope within the EM field generated by the EM field generator 104, although it is contemplated that the EM sensor 102 may be any suitable sensor and may be a sensor capable of enabling the system 10 to identify the position, orientation, and/or pose of the endoscope 100 within the EM field.


With continued reference to FIG. 2, one of the applications 34 stored in the memory 32 and executed by the processor 30 may determine the position of the EM sensor 102 in the EM field generated by the electromagnetic field generator 104. The determination of the position of the endoscope 100 and one or more cameras disposed thereon enables one method in which the images captured by the endoscope may be registered to a generated 3D model of the patient's anatomy, as will be described in further detail hereinbelow. Although generally described as being an EM sensor, it is contemplated that other position sensors may be utilized, such as an ultrasound sensor, flex sensors, fiber Bragg grating (FBG) sensors, robotic position detection sensors, amongst others.


In a planning or pre-procedure phase, the software stored in the memory 32 and executed by the processor 30 utilizes pre-procedure CT image data, either stored in the memory 32 or retrieved via the network interface 36, for generating and viewing a 3D model of the patient's anatomy, enabling the identification of target tissue on the 3D model (automatically, semi-automatically, or manually), and in embodiments, allowing for the selection of a pathway through the patient's anatomy to the target tissue. One example of such an application is the ILOGIC® planning and navigation suites currently marketed by Medtronic. The 3D model may be displayed on the display 24 or another suitable display (not shown) associated with the controller 20, or in any other suitable fashion. Using the controller 20, various views of the 3D model may be provided and/or the 3D model may be manipulated to facilitate identification of target tissue on the 3D model and/or selection of a suitable pathway to the target tissue.


In embodiments, the software stored in the memory 32 may identify and segment out a targeted critical structure within the 3D model. It is envisioned that the segmentation process may be performed automatically, manually, or a combination of both. The segmentation process isolates the targeted critical structure from the surrounding tissue in the 3D model and identifies its position within the 3D model. As can be appreciated, this position can be updated depending upon the view selected on the display 24 such that the view of the segmented targeted critical structure may approximate a view captured by the endoscope 100, as will be described in further detail hereinbelow.


Turning to FIGS. 3-4, the endoscope 100 is illustrated and includes a distal surface 106 having a camera 108 disposed thereon. Although generally illustrated as having one camera 108, it is contemplated that the endoscope 100 may include any number of cameras disposed on the distal surface 106 or any other suitable location thereon (e.g., sidewall, etc.). It is envisioned that the camera 108 is a complementary metal-oxide-semiconductor (CMOS) camera, and in embodiments, may be a mini-CMOS camera. As can be appreciated, the camera 108 captures images of the patient's anatomy from a perspective of looking out from the distal surface 106. Although generally identified as being a CMOS camera, it is envisioned that the camera 108 may be any suitable camera, such as charge-coupled device (CCD), complementary metal-oxide-semiconductor (CMOS), N-type metal-oxide-semiconductor (NMOS), and in embodiments, may be a white light camera, infrared (IR) camera, amongst others, depending upon the design needs of the system 10.


In embodiments, the endoscope 100 may include one or more light sources 110 disposed on the distal surface 106 or any other suitable location (e.g., side surface, protuberance, etc.). The light source 110 may be or may include a light emitting diode (LED), an optical fiber connected to a light source that is located external to the patient, amongst others, and may emit white, IR, or near infrared (NIR) light. In this manner, the camera 108 may be a white light camera, IR camera, or NIR camera, a camera that is capable of capturing white light and NIR light, amongst others. In one non-limiting embodiment, the camera 108 is a white light mini-CMOS camera.


The endoscope 100 includes one or more working channels 112 defined therethrough and extending through the distal surface 106. The working channel 112 is configured to receive a tool (not shown), locatable guide (LG), amongst others to enable a clinician to navigate to, or treat, target tissue. It is contemplated that the tool may be any suitable tool utilized during an endoscopic surgical procedure, and in embodiments, may be a fixed tool.


With reference to FIG. 4, the endoscope 100 includes a drive mechanism 120 disposed within an interior portion thereof that is operably coupled to a proximal portion of the endoscope 100. The drive mechanism 120 effectuates manipulation or articulation of a distal portion 100a of the endoscope 100 in four degrees of freedom (e.g., left, right, up, down), which is controlled by two push-pull wires, although it is contemplated that the drive mechanism 120 may include any suitable number of wires to effectuate movement and/or articulation of the distal portion 100a of the endoscope 100 in greater or fewer degrees of freedom without departing from the scope of the present disclosure. It is envisioned that the drive mechanism 120 may be cable actuated using artificial tendons or pull wires 122 (e.g., metallic, non-metallic, composite, etc.) or may be a nitinol wire mechanism. In embodiments, the drive mechanism 120 may include motors 124 or other suitable devices capable of effectuating movement of the pull wires 122. In this manner, the motors 124 are disposed within the endoscope 100 such that rotation of the motors 124 effectuates a corresponding articulation of the distal portion 100a of the endoscope 100.


Turning to FIG. 5, the system 10 includes a robotic surgical system 200 that is operably coupled to the endoscope 100. The robotic surgical system 200 includes a drive mechanism 202 including a robotic arm 204 operably coupled to a base or cart 206. The robotic arm 204 includes a cradle 208 that is configured to receive a portion of the endoscope 100 thereon. The endoscope 100 is coupled to the cradle 208 using any suitable means (e.g., straps, mechanical fasteners, couplings, amongst others).


It is envisioned that the robotic surgical system 200 may communicate with the endoscope 100 via electrical connection (e.g., contacts, plugs, etc.) or may be in wireless communication with the endoscope 100 to control or otherwise effectuate movement of the motors 124 and receive images captured by the camera 108. In this manner, it is contemplated that the robotic surgical system 200 may include a wireless communication system 210 operably coupled thereto such that the endoscope 100 may wirelessly communicate with the robotic surgical system 200 and/or the controller 20 via Wi-Fi, Bluetooth®, amongst others. As can be appreciated, the robotic surgical system 200 may omit the electrical contacts altogether and may communicate with the endoscope 100 wirelessly or may utilize both electrical contacts and wireless communication. The wireless communication system 210 is substantially similar to the network interface 36 of the controller 20, and therefore, the wireless communication system 210 will not be described in detail herein in the interest of brevity. In embodiments, the robotic surgical system 200 and the controller 20 may be one and the same or may be widely distributed over multiple locations within the operating room. It is contemplated that the controller 20 may be disposed in a separate location and the display 24 may be an overhead monitor disposed within the operating room.


Although generally described as having the motors 124 disposed within the endoscope 100, it is contemplated that the endoscope 100 may not include motors 124 disposed therein. In this manner, the drive mechanism 120 disposed within the endoscope 100 may interface with motors 124 disposed within the cradle 208 of the robotic surgical system 200. In embodiments, the endoscope 100 may include a motor or motors 124 for controlling articulation of the distal end portion 100a of the endoscope in one plane (e.g., left/null, right/null, etc.) and the drive mechanism 202 of the robotic surgical system 200 may include at least one motor 212 to effectuate rotation about the second axis and axial motion. In this manner, the motor 124 of the endoscope 100 and the motors 212 of the robotic surgical system 200 cooperate to effectuate four-way articulation of the distal end portion 100a of the endoscope 100 and effectuate rotation of the endoscope 100. As can be appreciated, by removing the motors 124 from the endoscope 100, the endoscope 100 becomes less expensive to manufacture and may be a disposable unit. In embodiments, the endoscope 100 may be integrated into the robotic surgical system 200 (e.g., one piece) and may not be a separate component.


With reference to FIGS. 6-11, the software stored in the memory 32 communicates with the camera 108 to capture images in real-time of the patient's anatomy as the endoscope 100 is navigated through the luminal network of the patient. The software includes two tiers or processes: the first being the image processing tier and the second being the navigation tier.


The first tier of the software application segments the images captured by the camera 108 using dynamic binary thresholding. In this manner, if the images captured by the camera 108 are captured in color (e.g., white light) (FIG. 6), the software converts the color image into a grayscale image (FIG. 7). At this point, the software segments the image using dynamic binary thresholding to divide the image into two portions, a background portion and a foreground portion. The background portion includes pixel values that are less than or equal to a selected threshold and the foreground portion includes pixel values that are greater than the selected threshold. This process is repeated by calculating the mean pixel value of each of the two portions of the image and computing a new threshold as the average of the two means, until the difference between the previous threshold value and the new threshold value is below a selected limit. Although generally described as updating a threshold value and re-segmenting the image, it is envisioned that the software application may utilize Otsu's method, wherein automatic image thresholding is performed by automatically identifying a single intensity threshold that separates pixels identified in the image into two classes, foreground and background. In this manner, the software application dynamically computes a threshold based upon the distribution of pixel intensities within the image to group the pixels into two classes, foreground and background. As such, a single threshold is utilized and the segmentation process is performed a single time for each image frame, reducing the time and processing power required to perform the segmentation process.
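The single-pass alternative mentioned above, Otsu's method, can be sketched as follows for 8-bit grayscale frames: the threshold that maximizes the between-class variance separates foreground from background in one pass over the histogram. The function name is an illustrative assumption.

```python
import numpy as np

def otsu_threshold(gray: np.ndarray) -> int:
    """Return the intensity (0-255) that maximizes between-class variance."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    omega = np.cumsum(prob)                      # cumulative class probability
    mu = np.cumsum(prob * np.arange(256))        # cumulative class mean
    mu_total = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_total * omega - mu) ** 2 / (omega * (1.0 - omega))
    return int(np.argmax(np.nan_to_num(sigma_b)))   # pixels <= threshold form one class
```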


The result of the dynamic binary thresholding process is an image in black and white, illustrating the foreground in white and the background in black, or vice versa (e.g., a binary image). In one non-limiting embodiment, the first tier of the software identifies the tissue walls “T” in black and the lumens “L” of the luminal network present in the image in white (FIG. 8). With the lumens “L” within the image identified, the software overlays the black and white image over the grayscale image with the black portion removed and the white portions illustrated as a translucent color, such as yellow or red, amongst others, to create a labeled image (FIG. 9). In one non-limiting embodiment, each lumen “L” identified by the software is illustrated as a different color (e.g., the first lumen “L” is identified by red, the second lumen “L” is identified by blue, etc.).
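One way to produce the labeled overlay described above is to label connected lumen regions in the binary image and alpha-blend a distinct translucent color over each region of the underlying frame. The sketch below assumes SciPy is available; the palette, blending factor, and function name are illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

# Illustrative palette (BGR): first lumen red, second blue, and so on.
PALETTE = [(0, 0, 255), (255, 0, 0), (0, 255, 255), (0, 255, 0)]

def label_and_overlay(gray: np.ndarray, lumen_mask: np.ndarray,
                      alpha: float = 0.4) -> np.ndarray:
    """Blend a translucent color over each connected lumen region.

    Tissue-wall pixels (the black portion of the binary image) are left
    untouched, matching the overlay described in the text.
    """
    labels, count = ndimage.label(lumen_mask)                  # one label per lumen
    overlay = np.repeat(gray[:, :, None], 3, axis=2).astype(float)
    for i in range(1, count + 1):
        color = np.asarray(PALETTE[(i - 1) % len(PALETTE)], dtype=float)
        region = labels == i
        overlay[region] = (1.0 - alpha) * overlay[region] + alpha * color
    return overlay.astype(np.uint8)
```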


With reference to FIG. 10, the second tier of the software application stored on the memory 32 utilizes the labeled image to identify a centroid or center portion of the identified lumens. As can be appreciated, the use of dynamic binary thresholding to identify each lumen within an image creates a shape from which the centroid “C” can be calculated. Using the calculated centroid, the software application controls or manipulates the distal end portion 100a of the endoscope 100 to aim or otherwise align the distal end portion 100a of the endoscope 100 with the centroid “C” of the lumen “L”. As can be appreciated, the software application continuously analyzes each image captured by the camera in real time to ensure the distal end portion 100a of the endoscope 100 is aligned with the centroid of the lumen to inhibit or otherwise minimize the chances of the distal end portion 100a of the endoscope 100 contacting a sidewall of the lumen.
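For illustration, keeping the distal end portion aligned with a centroid can be reduced to driving the pixel offset between the image center and the centroid toward zero. The sketch below is a hypothetical proportional mapping; the gain, dead-band, sign conventions, and the idea of returning unitless articulation increments are assumptions, as the patent does not specify a control law.

```python
def articulation_command(centroid_rc: tuple[float, float],
                         frame_shape: tuple[int, int],
                         gain: float = 0.1,
                         deadband_px: float = 5.0) -> tuple[float, float]:
    """Map the centroid's offset from the image center to (up/down, left/right)
    articulation increments for a hypothetical drive interface."""
    rows, cols = frame_shape
    d_row = centroid_rc[0] - rows / 2.0      # positive: centroid below image center
    d_col = centroid_rc[1] - cols / 2.0      # positive: centroid right of image center
    if abs(d_row) < deadband_px and abs(d_col) < deadband_px:
        return 0.0, 0.0                      # already substantially aligned
    return gain * d_row, gain * d_col
```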


In embodiments, the second tier of the software application overlays the pathway “P1” generated and selected by the clinician to the selected area of interest or target tissue (FIG. 11). In this manner, the software application utilizes the EM sensor 102 to identify the location of the distal end portion 100a of the endoscope within the airways of the patient's lungs and compares the images captured by the camera 108 to the pre-procedure images to register or otherwise ensure that the distal end portion 100a of the endoscope 100 is in the proper location within the airways of the patient's lungs. It is envisioned that the position of the endoscope 100 within the patient's lungs may be registered to the generated 3D model automatically via software or manually. In embodiments, the software application continuously checks the position of the endoscope within the patient's lungs as compared to the location indicated within the 3D model via the EM sensor 102 and/or the images captured by the camera 108. It is envisioned that the software application can automatically adjust the position of the endoscope 100 indicated within the 3D model due to tidal breathing or the respiration rate of the patient using any suitable means.


In operation, the endoscope is advanced, retracted, and manipulated within the patient's lungs automatically via the software application and the robotic surgical system 200. In this manner, the software application continuously communicates with the camera 108 and identifies the position of the EM sensor 102 within the electromagnetic field generated by the EM field generator 104. Utilizing the pathway selected by the clinician in conjunction with pre-procedure images stored on the memory 32, the software application identifies the position of the distal end portion 100a of the endoscope 100 within the patient's lungs. At this point, the first tier of the software application identifies each lumen of the lungs visible within the field of view of the camera 108 and determines a centroid of each lumen via the methods described hereinabove. Using the identified centroid of each lumen, the software application instructs the robotic surgical system 200 to advance the endoscope 100, retract the endoscope 100, manipulate the distal end portion 100a of the endoscope 100 up, down, left, or right, and/or rotate the endoscope to maintain the position of the distal end portion 100a of the endoscope substantially aligned with the centroid of the lumen.


When the distal end portion 100a of the endoscope 100 encounters a bifurcation within the patient's luminal network, the software application identifies the lumen through which the endoscope 100 must travel to remain on the selected pathway to the target tissue. At this point, the software application instructs the robotic surgical system 200 to manipulate the distal end portion 100a of the endoscope 100 to align the distal end portion 100a with the centroid of the appropriate lumen and to further advance the endoscope within the luminal network of the patient's lungs until the distal end portion 100a of the endoscope 100 is proximate the target tissue. It is envisioned that the software application may periodically or continuously confirm the location of the distal end portion 100a of the endoscope 100 within the patient's luminal network as compared to the location indicated on the 3D model of the patient's lungs and update the indicated location on the 3D model as necessary to ensure that the endoscope is following the correct path through the luminal network. Once the distal end portion 100a of the endoscope 100 has been navigated proximate the target tissue, one or more tools or other surgical devices may be advanced through the one or more working channels 112 to treat the target tissue.
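At a bifurcation, one plausible way to pick the lumen that keeps the endoscope on the pathway is to project the next pathway waypoint into the current frame (using the EM-registered tip pose) and choose the candidate lumen whose centroid lies closest to that projection. The sketch below assumes such a projected point is already available; the function name and the nearest-centroid rule are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def select_lumen_for_pathway(centroids_rc: list[tuple[float, float]],
                             pathway_point_rc: tuple[float, float]) -> int:
    """Index of the candidate lumen whose centroid is nearest to the planned
    pathway waypoint projected into image (row, col) coordinates."""
    pts = np.asarray(centroids_rc, dtype=float)
    target = np.asarray(pathway_point_rc, dtype=float)
    return int(np.argmin(np.linalg.norm(pts - target, axis=1)))
```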


Turning to FIGS. 12A and 12B, a method of navigating an endoscope through a luminal network of a patient's lungs to a target tissue is described and generally identified by reference numeral 300. Initially, in step 302, the patient's lungs are imaged using any suitable imaging modality (e.g., CT, MRI, amongst others) and the images are stored in the memory 32 associated with the controller 20. In step 304, the images stored on the memory 32 are utilized to generate and view a 3D model of the patient's anatomy, and thereafter, target tissue is identified in step 306. With the target tissue identified, a pathway to the target tissue through the luminal network of the patient's lungs is generated in step 308.


Once the desired pathway to the target tissue is selected, the surgical procedure is initiated in step 310 by advancing the distal end portion 100a of the endoscope 100 within the airways of the patient's lungs. With the distal end portion 100a of the endoscope 100 disposed within the airways of the patient's lungs, the location of the distal end portion 100a of the endoscope 100 is identified in step 312 using the camera 108 and the EM sensor 102 of the endoscope, and the identified position of the distal end portion 100a of the endoscope is registered to the 3D model and the selected pathway to the target tissue in step 314. After registration, the software application analyzes images captured by the camera 108, identifies lumens of the luminal network of the patient's lungs visible in the field of view of the camera 108, and determines a centroid of the lumens in step 316. In step 318, with the lumens identified within the images captured by the camera 108, the software application compares the images captured by the camera 108 to the pre-procedure images and identifies a lumen through which the endoscope 100 must traverse to maintain its course on the selected pathway to the target tissue. After selecting the appropriate lumen, the software application instructs the robotic surgical system 200 to manipulate or otherwise control the distal end portion 100a of the endoscope 100 to advance within the selected lumen in step 320.


In step 322, the software application checks the location of the distal end portion 100a of the endoscope 100 within the patient's luminal network and determines if the endoscope 100 is on the pathway or is off course. If the endoscope 100 is not on the desired pathway, the method returns to step 320 to cause the robotic surgical system to retract the endoscope and/or manipulate the endoscope to the appropriate location within the patient's luminal network to place the distal end portion 100a of the endoscope 100 back on the desired pathway. If the endoscope 100 is on the desired pathway, in step 324, the software application monitors the images captured by the camera 108 and determines if the distal end portion 100a of the endoscope 100 is substantially aligned with the centroid of the lumen through which the distal end portion 100a of the endoscope 100 is traversing. If the software application determines that the distal end portion 100a of the endoscope 100 is not substantially aligned with the centroid of the lumen, the software application instructs the robotic surgical system 200 to manipulate or otherwise control the distal end portion 100a of the endoscope to re-align with the centroid of the lumen in step 326. If the software application determines that the distal end portion 100a is substantially aligned with the centroid of the lumen, in step 328, the software application continues navigating the endoscope 100 through the luminal network along the selected pathway.


In step 330, the software application determines if the distal end portion 100a of the endoscope is located proximate the target tissue, and if so, terminates navigation of the endoscope in step 332. If the software application determines that the distal end portion 100a of the endoscope 100 is not located proximate the target tissue, the software application returns to step 320 to continue navigating the endoscope through the luminal network of the patient's lungs along the selected pathway until the software application determines that the distal end portion 100a of the endoscope 100 is located proximate the target tissue.


With reference to FIGS. 13A and 13B, a method of identifying a centroid of a lumen of a luminal network of a patient's lungs is illustrated and generally identified by reference numeral 400. Initially, in step 402, one or more images of the patient's anatomy proximate the distal end portion 100a are captured by the camera 108. If the images captured by the camera 108 are in color, the color images are converted to grayscale images in step 404. In step 406, with the images converted to grayscale, a grayscale image of the one or more captured images is segmented using dynamic binary thresholding to divide the image into two portions: a first portion including pixel values that are greater than a first threshold and a second portion including pixel values that are equal to or less than the first threshold. After the initial segmentation, in step 408, the mean value of the pixel values of each of the two portions of the image is calculated, and in step 410, an updated threshold value is calculated by averaging the calculated mean values of the first and second portions. In step 412, the difference between the updated threshold value and the previous threshold value is compared to a specified limit, and if the difference is above the specified limit, the image is segmented a second time by again dividing the image into two portions utilizing the updated threshold value in step 414, at which point the process returns to step 410. As can be appreciated, steps 406-414 are repeated as many times as necessary until the difference is determined to be below the specified limit, at which point, in step 416, a binary image (e.g., black and white image) is generated illustrating the two portions in contrasting colors. In embodiments, as described in further detail hereinabove, the process may segment the image a single time by dynamically identifying a threshold value based upon the distribution of pixel intensities identified in the image. In this manner, the process performs step 406 and skips to step 416, as the re-segmentation process is unnecessary.


In step 418, the two portions of the binary image are assigned labels corresponding to the lumens and the tissue walls within the image, and in step 420, the labels are overlaid on the color or live image captured by the camera 108. In step 422, the centroid of each identified lumen is calculated utilizing the shape of the lumen or lumens identified in the binary image. Although generally described as being calculated after assigning labels, it is envisioned that the centroid of the lumen may be calculated at any time after the binary image is generated. Once the centroid of the lumen has been calculated, the process returns to step 402 and is continuously repeated as the endoscope 100 is advanced within the luminal network of the patient's lungs and new images are captured until it is determined, in step 424, that the endoscope 100 is located proximate the target tissue. If it is determined that the endoscope 100 is located proximate the target tissue, navigation of the endoscope 100 within the patient's lungs terminates in step 426.
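Putting steps 402-422 together, a compact per-frame sketch using OpenCV built-ins might look like the following. It swaps the iterative update for Otsu thresholding (described above as an equivalent single-pass option), assumes lumens appear darker than the surrounding tissue, and uses an assumed minimum-area filter to discard small noise regions; none of these parameter values come from the patent.

```python
import cv2
import numpy as np

def lumen_centroids_per_frame(frame_bgr: np.ndarray,
                              min_area_px: int = 200) -> list[tuple[float, float]]:
    """Return (x, y) pixel centroids of lumen-sized dark regions in one frame."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)                    # step 404
    _, binary = cv2.threshold(gray, 0, 255,
                              cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)    # steps 406-416
    count, _, stats, centroids = cv2.connectedComponentsWithStats(binary, connectivity=8)  # step 418
    return [tuple(centroids[i])                                           # step 422
            for i in range(1, count)                                      # label 0 is background
            if stats[i, cv2.CC_STAT_AREA] >= min_area_px]
```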


Although described generally hereinabove, it is envisioned that the memory 32 may include any non-transitory computer-readable storage media for storing data and/or software including instructions that are executable by the processor 30 and which control the operation of the controller 20 and, in some embodiments, may also control the operation of the endoscope 100. In an embodiment, memory 32 may include one or more storage devices such as solid-state storage devices, e.g., flash memory chips. Alternatively, or in addition to the one or more solid-state storage devices, the memory 32 may include one or more mass storage devices connected to the processor 30 through a mass storage controller (not shown) and a communications bus (not shown).


Although the description of computer-readable media contained herein refers to solid-state storage, it should be appreciated by those skilled in the art that computer-readable storage media can be any available media that can be accessed by the processor 30. That is, computer readable storage media may include non-transitory, volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. For example, computer-readable storage media may include RAM, ROM, EPROM, EEPROM, flash memory or other solid-state memory technology, CD-ROM, DVD, Blu-Ray or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which may be used to store the desired information, and which may be accessed by the controller 20.


While several embodiments of the disclosure have been shown in the drawings, it is not intended that the disclosure be limited thereto, as it is intended that the disclosure be as broad in scope as the art will allow and that the specification be read likewise. Therefore, the above description should not be construed as limiting, but merely as exemplifications of embodiments. Those skilled in the art will envision other modifications within the scope and spirit of the claims appended hereto.

Claims
  • 1. A system for performing a surgical procedure, comprising: a controller operably coupled to a camera, the controller including a memory and a processor, the memory storing instructions, which when executed by the processor cause the processor to: receive a color image captured by the camera; convert the color image to a grayscale image; generate a segmented image by applying a first threshold value to the grayscale image captured by the camera; identify a lumen within the segmented image; determine a centroid of the lumen within the segmented image; overlay the segmented image on the grayscale image, wherein the identified lumen of the segmented image is depicted as a translucent color; and articulate a portion of a surgical device operably coupled to the controller to align the portion of the surgical device with the centroid of the lumen.
  • 2. The system according to claim 1, wherein the segmented image is generated using dynamic binary thresholding.
  • 3. The system according to claim 1, further including a surgical device, wherein the camera is disposed on a distal portion of the surgical device, wherein the surgical device is navigable within a portion of a patient's anatomy.
  • 4. The system according to claim 3, further including a robotic surgical system operably coupled to the surgical device.
  • 5. The system according to claim 4, wherein the instructions, when executed by the processor, cause the robotic surgical system to align the surgical device with the centroid of the identified lumen.
  • 6. The system according to claim 5, wherein the instructions, when executed by the processor, cause the processor to generate a pathway to a target tissue.
  • 7. The system according to claim 6, wherein the instructions, when executed by the processor, cause the processor to identify a lumen within the image corresponding to the pathway to the target tissue.
  • 8. The system according to claim 7, wherein the instructions, when executed by the processor, cause the robotic surgical system to advance the surgical device within the lumen within the image corresponding to the pathway to the target tissue.
  • 9. The system according to claim 8, wherein the instructions, when executed by the processor, cause the processor to determine a centroid of the identified lumen within the segmented image in real-time.
  • 10. The system according to claim 9, wherein the instructions, when executed by the processor, cause the robotic surgical system to maintain alignment of the distal portion of the surgical device with the centroid of the identified lumen within the segmented image as the surgical device is advanced within the identified lumen.
  • 11. A method of performing a surgical procedure, comprising: receiving a color image of a patient's anatomy from a camera operably coupled to a surgical device; converting the color image to a grayscale image; generating a segmented image by applying a first threshold value to the grayscale image; identifying a lumen within the segmented image; determining a centroid of the lumen within the segmented image; overlaying the segmented image on the grayscale image, wherein the identified lumen of the segmented image is depicted as a translucent color; and articulating the surgical device to align the surgical device with the centroid of the lumen.
  • 12. The method according to claim 11, wherein generating the segmented image includes applying dynamic binary thresholding utilizing the first threshold value to generate the segmented image.
  • 13. The method according to claim 12, further comprising applying a second threshold value to the segmented image.
  • 14. The method according to claim 12, further comprising generating a binary image from the segmented image.
  • 15. The method according to claim 11, further comprising generating a pathway through a patient's anatomy to a target tissue.
  • 16. The method according to claim 15, further comprising advancing the surgical device through the patient's anatomy following the centroid of the identified lumen and the pathway through the patient's anatomy.
  • 17. A system for performing a surgical procedure, comprising: a robotic surgical system, comprising: an endoscope including a camera, the camera disposed on a portion of the endoscope; and a drive mechanism operably coupled to the endoscope; and a controller operably coupled to the robotic surgical system, the controller including a memory and a processor, the memory storing instructions, which when executed by the processor cause the processor to: receive a color image captured by the camera; convert the color image to a grayscale image; generate a segmented image by applying a first threshold value to the grayscale image captured by the camera; identify a lumen within the segmented image; determine a centroid of the lumen; overlay the segmented image on the grayscale image, wherein the identified lumen of the segmented image is depicted as a translucent color; and cause the drive mechanism to articulate a distal end portion of the endoscope to align the distal end portion of the endoscope with the centroid of the lumen.
  • 18. The system according to claim 17, wherein the instructions, when executed by the processor, cause the processor to generate a pathway to a target tissue located within a patient's anatomy.
  • 19. The system according to claim 18, wherein the instructions, when executed by the processor, cause the processor to cause the drive mechanism to advance the distal end portion of the endoscope along the pathway.
  • 20. The system according to claim 19, wherein the instructions, when executed by the processor, cause the processor to continuously update the centroid of the identified lumen as the endoscope is advanced along the pathway.
Related Publications (1)
Number Date Country
20230233270 A1 Jul 2023 US