Systems and methods for guiding surgical procedures

Information

  • Patent Grant
  • Patent Number
    12,256,890
  • Date Filed
    Wednesday, November 18, 2020
  • Date Issued
    Tuesday, March 25, 2025
Abstract
Methods for guiding a surgical procedure include accessing information relating to a surgical procedure, accessing at least one image of a surgical site captured by an endoscope during the surgical procedure, identifying a tool captured in the at least one image by a machine learning system, determining whether the tool should be changed based on comparing the information relating to the surgical procedure and the tool identified by the machine learning system, and providing an indication when the determining indicates that the tool should be changed.
Description
FIELD

The present technology is generally related to assisted surgical procedures and, more particularly, to systems and methods for guiding surgical procedures, such as in robotic surgical procedures.


BACKGROUND

During laparoscopic surgical procedures, an endoscope is used to visualize a surgical site. Particularly, in minimally invasive surgery (MIS) involving robotic surgery, image sensors have been used to allow a surgeon to visualize a surgical site.


A surgeon performing a laparoscopic surgery has a limited scope of view of the surgical site through a display. When a non-robotic tool, a robotic tool, or a laparoscopic instrument is used, the proper or appropriate selection and positioning of such tools for the surgical operation is based on the clinician's judgment and experience. When an inappropriate tool is used, for example, or when a tool is not appropriately positioned, the treatment results may be sub-optimal. There is interest in developing systems for supplementing a clinician's experience and judgment during surgical operations.


SUMMARY

The techniques of this disclosure generally relate to systems and methods for guiding a surgical procedure by supplementing a clinician's judgment and experience.


In an aspect, a method for guiding a surgical procedure includes accessing information relating to a surgical procedure, accessing at least one image of a surgical site captured by an endoscope during the surgical procedure, identifying a tool captured in the at least one image by a machine learning system, determining whether the tool should be changed based on comparing the information relating to the surgical procedure and the tool identified by the machine learning system, and providing an indication when the determining indicates that the tool should be changed.


In various embodiments of the method, the information relating to the surgical procedure indicates tools which have been used for other surgical procedures of a same type as the surgical procedure, and determining whether the tool should be changed includes determining whether the tool is among the tools which have been used for the other surgical procedures.


In various embodiments of the method, determining whether the tool should be changed includes determining that the tool should be changed when the tool is not among the tools which have been used for the other surgical procedures.


In various embodiments of the method, the endoscope is a stereo-endoscope, the at least one image includes at least one stereoscopic image, and the at least one stereoscopic image includes depth information relating to the tool and to tissue of the surgical site.


In various embodiments of the method, the method includes determining orientation information for the tool by the machine learning system based on the at least one image.


In various embodiments of the method, the machine learning system was trained using tool orientation information for other surgical procedures of a same type as the surgical procedure.


In various embodiments of the method, the orientation information for the tool determined by the machine learning system indicates whether the tool should be re-oriented.


In various embodiments of the method, the method includes determining position information for the tool by the machine learning system based on the at least one image.


In various embodiments of the method, the machine learning system was trained using tool position information for other surgical procedures of a same type as the surgical procedure.


In various embodiments of the method, the position information for the tool determined by the machine learning system indicates whether the tool should be re-positioned.


In an aspect, a surgical guiding system for guiding a surgical procedure includes a memory configured to store instructions, and a processor coupled with the memory and configured to execute the instructions. The processor is configured to execute the instructions to cause the surgical guiding system to access information relating to a surgical procedure, access at least one image of a surgical site captured by an endoscope during the surgical procedure, identify a tool captured in the at least one image by a machine learning system, determine whether the tool should be changed based on comparing the information relating to the surgical procedure and the tool identified by the machine learning system, and provide an indication when the determining indicates that the tool should be changed.


In various embodiments of the surgical guiding system, the information relating to the surgical procedure indicates tools which have been used for other surgical procedures of a same type as the surgical procedure, and in determining whether the tool should be changed, the instructions, when executed, cause the surgical guiding system to determine whether the tool is among the tools which have been used for the other surgical procedures.


In various embodiments of the surgical guiding system, in determining whether the tool should be changed, the instructions, when executed, cause the surgical guiding system to determine that the tool should be changed when the tool is not among the tools which have been used for the other surgical procedures.


In various embodiments of the surgical guiding system, the endoscope is a stereo-endoscope, the at least one image includes at least one stereoscopic image, and the at least one stereoscopic image includes depth information relating to the tool and to tissue of the surgical site.


In various embodiments of the surgical guiding system, the instructions, when executed, further cause the surgical guiding system to determine orientation information for the tool by the machine learning system based on the at least one image.


In various embodiments of the surgical guiding system, the machine learning system was trained using tool orientation information for other surgical procedures of a same type as the surgical procedure.


In various embodiments of the surgical guiding system, the orientation information of the tool determined by the machine learning system indicates whether the tool should be re-oriented.


In various embodiments of the surgical guiding system, the instructions, when executed, further cause the surgical guiding system to determine position information for the tool by the machine learning system based on the at least one image.


In various embodiments of the surgical guiding system, the machine learning system was trained using tool position information for other surgical procedures of a same type as the surgical procedure.


In various embodiments of the surgical guiding system, the position information for the tool determined by the machine learning system indicates whether the tool should be re-positioned.


The details of one or more aspects of the disclosure are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the techniques described in this disclosure will be apparent from the description and drawings, and from the claims.





BRIEF DESCRIPTION OF DRAWINGS


FIG. 1A is a perspective view of a surgical system in accordance with embodiments of the disclosure;



FIG. 1B is a functional block diagram of the surgical system of FIG. 1A in accordance with embodiments of the disclosure;



FIG. 2A is a functional block diagram of a computing device in accordance with embodiments of the disclosure;



FIG. 2B is a block diagram of a machine learning system in accordance with embodiments of the disclosure;



FIG. 3 is a graphical illustration of a surgical image to be processed by a machine learning system in accordance with embodiments of the disclosure; and



FIG. 4 is a flowchart illustrating a method for checking whether a tool should be moved or changed during a surgery in accordance with the disclosure.





DETAILED DESCRIPTION

Laparoscopic surgeries use various sizes, types, shapes, and kinds of surgical tools during the surgical procedure. If an improper tool is used during the laparoscopic surgery, the results of the procedure may be incomplete or unsatisfactory. Further, if a tool is positioned too close to or too far from the target tissue, or oriented incorrectly, similar unsatisfactory results may occur. The present disclosure provides guidance to a clinician regarding a tool used in a surgical procedure and can provide guidance relating to whether a tool should be changed, whether a tool should be re-positioned, and/or whether a tool should be re-oriented. As described in more detail below, aspects of the present disclosure relate to a machine learning system which has been trained using data from other surgical procedures similar to the surgical procedure that is being performed. Such a machine learning system can supplement a clinician's experience and judgment based on data from other, similar surgical procedures.


Referring to FIGS. 1A and 1B, a surgical system or robotic surgical system 100, in accordance with aspects of the disclosure, is shown and includes a surgical robot 110, a processor 140, and a user console 150. The surgical system 100 may not be able to completely conduct a surgery by itself and may be supplemented by a non-robotic tool 170. The surgical robot 110 includes one or more robotic linkages or arms 120 and robot bases 130 which support the corresponding robotic linkages 120. Each robotic linkage 120 moveably supports an end effector or tool 126, which is configured to act on a target of interest. Each robotic linkage 120 has an end 122 that supports the end effector or tool 126. In addition, the ends 122 of the robotic linkages 120 may include an imaging device 124 for imaging a surgical site “S”.


The user console 150 is in communication with the robot bases 130 through the processor 140. In addition, each robot base 130 may include a controller 132, which is in communication with the processor 140, and an arm or robot arm motor 134, as shown in FIG. 1B. The robotic linkage 120 may include one or more arms, and joints between two adjoining arms. The arm motor 134 may be configured to actuate each joint of the robotic linkage 120 to move the end effector 126 to a proper position.


The non-robotic tool 170 or an end effector 126 may be inserted into the surgical site “S” to assist in or perform the surgery during the surgical operation. In accordance with aspects of the present disclosure, in order to reduce occurrences of inappropriate tools being used during the surgery, the processor 140 may determine whether the inserted non-robotic tool 170 or the end effector 126 is appropriate. When it is determined that the tool is inappropriate, the processor 140 may display a popup window on a display device 156 of the user console 150 to provide an indication that the tool may be inappropriate. The indication is presented in a manner that does not interfere with the surgery.


In accordance with aspects of the present disclosure, and as described in more detail below, the processor 140 can determine when the tool is not properly positioned, such as being too far from or too close to a target organ for the surgery, when the tool is not properly oriented, such as being oriented with respect to the target at an inappropriate angle, or when the tool is approaching the target too quickly. An indication may be presented to bring these determinations to the clinician's attention. In various embodiments, the indication may include, but is not limited to, an audible sound, a popup window on the screen of the display 156, and/or vibrations of the input handles 152 of the user console 150.


Now referring to FIG. 1B, the processor 140 may be a stand-alone computing device similar to the computing device 200 of FIG. 2A, or integrated into one or more components of the surgical system 100 (e.g., in the robot bases 130 or the user console 150). The processor 140 may also be distributed across multiple components of the surgical system 100 (e.g., in multiple robot bases 130). The processor 140 of the surgical system 100 generally includes a processing unit 142, a memory 149, the robot base interface 146, a console interface 144, and an image device interface 148. The robot base interface 146, the console interface 144, and the image device interface 148 communicate with the robot bases 130, the user console 150, and the imaging devices 162 via wireless configurations, e.g., Wi-Fi, Bluetooth, LTE, and/or wired configurations. Although depicted as separate modules, the console interface 144, the robot base interface 146, and the image device interface 148 may be a single component in various embodiments.


The user console 150 also includes input handles 152 which are supported on control arms 154 and which allow a clinician to manipulate the surgical robot 110 (e.g., move the robotic linkages 120, the ends 122 of the robotic linkages 120, and/or the tools 126). Each of the input handles 152 is in communication with the processor 140 to transmit control signals thereto and to receive feedback signals therefrom. Additionally or alternatively, each of the input handles 152 may include input devices (not explicitly shown) which allow the surgeon to manipulate (e.g., clamp, grasp, fire, open, close, rotate, thrust, slice, etc.) the tools 126 supported at the ends 122 of the robotic linkages 120.


Each of the input handles 152 is moveable through a predefined workspace to move the ends 122 of the robotic linkages 120, e.g., tools 126, within the surgical site “S”. As the input handles 152 are moved, the tools 126 are moved within the surgical site “S” as detailed below. Movement of the tools 126 may also include movement of the ends 122 of the robotic linkages 120 which support the tools 126.


The user console 150 further includes a computer 158 having a processing unit or processor and memory, which stores data, instructions, and/or information related to the various components, algorithms, and/or operations of the robot bases 130, similar in many respects to the computing device 200 of FIG. 2A. The user console 150 may operate using any suitable electronic service, database, platform, cloud, or the like. The user console 150 is in communication with the input handles 152 and a display 156. Each input handle 152 may, upon engagement by the clinician, provide input signals to the computer 158 corresponding to the movement of the input handles 152. Based on the received input signals, the computer 158 may process and transmit the signals to the processor 140, which in turn transmits control signals to the robot bases 130 and the devices of the robot bases 130 to effect motion based at least in part on the signals transmitted from the computer 158. In various embodiments, the input handles 152 may be implemented by another mechanism such as handles, pedals, or computer accessories (e.g., a keyboard, joystick, mouse, button, touch screen, switch, trackball, and the like).


The user console 150 includes the display device 156 configured to display two-dimensional and/or three-dimensional images of the surgical site “S,” which may include data captured by the imaging devices 124 positioned on the ends 122 of the robotic linkages 120. In various embodiments, the imaging devices 124 may capture stereoscopic images, visual images, infrared images, ultrasound images, X-ray images, thermal images, and/or other images of the surgical site “S”. The imaging devices 124 transmit captured imaging data to the processor 140, which creates display screens of the surgical site “S” from the imaging data and transmits the display screens to the display device 156 for display.


The display device 156 may be connected to an endoscope installed on the end 122 of the robotic arms 120 so that live view images from the endoscope may be displayed on the display device 156. Further, as described above, a notification, when warranted, may be displayed in an overlaying or overlapping manner over the live view images. The endoscope may capture images of the non-robotic tool 170 or the end effector 126. Such captured images of the tools 170 or 126 may be transmitted to and processed by the processor 140, which serves as or coordinates with a machine learning system to identify the tool 170 or 126. In accordance with aspects of the present disclosure, such information may be used to determine whether or not the surgery is appropriately performed.


The identification of the tool may be performed by a machine learning system based on one or more images. The identified tool can be compared with information relating to the surgical procedure to determine whether or not the tool is appropriate for the surgery, that is, whether the identified tool should be moved or changed. For example, the machine learning system can identify the tool and various aspects of the tool, such as its size and shape. If the tool is determined to be inappropriate in size, shape, and/or type for the surgery, an indication can be provided that the tool should be changed. Further, in aspects of the present disclosure, the machine learning system can predict whether the tool is properly positioned or oriented and, in these cases, an indication can be provided that the tool should be moved to a different position or orientation. The machine learning system may be trained on training data derived from previously performed surgeries related to the current surgery. For example, the training data can include frame images and tagged information, which are used to train the machine learning system to identify a tool and to determine whether the orientation and/or position of the tool is appropriate. In an aspect, the tags in the training data may be manually entered by doctors, experts, or other medical professionals involved in the previous surgeries.
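

As a minimal sketch of the comparison step described above, the following snippet checks an identified tool against tools recorded for similar procedures; the function name, the example procedure type, and the tool list are illustrative assumptions, not part of the disclosure.

```python
# Minimal sketch (not the disclosed implementation): compare an identified
# tool against tools recorded for similar procedures. The procedure type and
# tool names below are illustrative assumptions.

from typing import Optional

TOOLS_USED_IN_SIMILAR_PROCEDURES = {
    "cholecystectomy": {"grasper", "clip applier", "hook cautery", "scissors"},
}

def tool_change_indication(identified_tool: str, procedure_type: str) -> Optional[str]:
    """Return an indication string if the identified tool should be changed,
    or None if the tool is among those used in similar procedures."""
    expected = TOOLS_USED_IN_SIMILAR_PROCEDURES.get(procedure_type, set())
    if identified_tool not in expected:
        return (f"'{identified_tool}' has not been used in previous "
                f"{procedure_type} procedures; the tool may need to be changed.")
    return None

print(tool_change_indication("stapler", "cholecystectomy"))   # indication text
print(tool_change_indication("grasper", "cholecystectomy"))   # None
```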


The tagged information may identify the tool captured in the frame images and/or may indicate whether a tool captured in frame images of the previous surgeries is appropriately positioned or oriented, such as whether the tool is too far from or too close to target tissue and/or oriented at an appropriate or incorrect angle with respect to the target tissue, among other things. The machine learning system may be trained on the frame images and the tagged information so that determinations made based on the images match the tagged information. Further, the machine learning system may be trained to identify a progression stage during the current surgery based on the frame images and/or tagged information of the previous surgeries, which are related to the current surgery.


Regarding the machine learning system, previous surgery videos or frame images may be used to train the machine learning system. For example, doctors or medical professionals may tag tools, organs, and progression information of the surgery in each video or in frame images. Specifically, medical professionals may tag a tool and a target organ in each frame image or video. In addition, they may also tag a frame image with a label of “too close” meaning that the tool is too close to the target organ in the image, or “too far” meaning that the tool is too far from the target organ in the image. This tagged information may be used to train the machine learning system to determine whether the tool captured in the frame images should be moved to a different location or orientation.


Further, medical professionals may also tag the tool on the image frames as being “inappropriate,” meaning that the type, size, or shape of the tool is not appropriate for the respective surgical stage or progression. This tagged information may be used to train the machine learning system to predict whether a tool captured in the frame images should be changed.


Furthermore, medical professionals may also tag non-target but critical organs in the image frames so as to determine whether or not the non-target but critical organs are too close to the tool. As such, unintended contact with critical non-target organs may be reduced.


In an aspect, the machine learning system may process images or videos of previously performed surgeries and add tagged information of tools and progression stages. Such tagged information may be reviewed by experts, doctors, or medical professionals so that they confirm or revise the tagged information.


Referring now to FIG. 2A, a functional block diagram of a computing device is shown and designated generally as a computing device 200. Though not explicitly shown in the corresponding figures of the present application, the computing device 200, or one or more components thereof, may represent one or more components (e.g., the processor 140 or the computer 158) of the surgical system 100. The computing device 200 may include one or more processors 210, memories 220, input interfaces 230, output interfaces 240, network interfaces 250, or any desired subset of components thereof.


The memory 220 includes non-transitory computer-readable storage media for storing data and/or software which include instructions that may be executed by the one or more processors 210. When executed, the instructions may cause the processor 210 to control operation of the computing device 200 such as, without limitation, reception, analysis, and transmission of sensor signals received in response to movement and/or actuation of the one or more input handles 152. The memory 220 may include one or more solid-state storage devices such as flash memory chips. Additionally, or alternatively, the memory 220 may include one or more mass storage devices in communication with the processor 210 through a mass storage controller and a communications bus (not shown). Although the description of computer-readable media in this disclosure refers to a solid-state storage device, it will be appreciated by one of ordinary skill that computer-readable media may include any available media that can be accessed by the processor 210. More particularly, the computer-readable storage media may include, without limitation, non-transitory, volatile, non-volatile, removable, non-removable media, and the like, implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other suitable data access and management systems. Examples of computer-readable storage media may include RAM, ROM, EPROM, EEPROM, flash memory or other known solid-state memory technology, CD-ROM, DVD, Blu-Ray or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which may be used to store information and which can be accessed by the computing device 200.


In embodiments, the memory 220 stores data 222 and/or one or more applications 224. Such applications 224 may include instructions which are executed on the one or more processors 210 of the computing device 200. The data 222 may include standard of care for surgeries, where the standard of care may include progression stages of each surgery and appropriate tools at each progression stage. Further, the standard of care saved in the data 222 may be updated or refined by surgeries performed in the future. Furthermore, the standard of care may be updated by a group of expert clinicians for each surgery.


The applications 224 may include instructions which cause an input interface 230 and/or an output interface 240 to receive and transmit sensor signals, respectively, to and from the various components of the surgical system 100. Additionally or alternatively, the computing device 200 may transmit the signals for analysis and/or display via the output interface 240. For example, the memory 220 may include instructions which, when executed, generate a depth map or point cloud of the objects within the surgical environment based on the real-time image data received from the image devices of the surgical system 100. The depth map or point cloud may be stored in the memory 220 across multiple iterations for a later cumulative analysis of the depth maps or point clouds.
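

One common way to obtain such a depth map from stereo image data is block matching; the sketch below uses OpenCV's semi-global matcher on assumed rectified inputs with placeholder camera parameters, and is not the disclosed implementation.

```python
# Illustrative sketch only: one common way to estimate a depth map from a
# rectified stereo-endoscope image pair using OpenCV block matching. The
# focal length and baseline below are placeholders, not disclosed values.

import cv2
import numpy as np

def depth_from_stereo(left_gray: np.ndarray, right_gray: np.ndarray,
                      focal_length_px: float = 800.0,
                      baseline_m: float = 0.004) -> np.ndarray:
    """Return a per-pixel depth estimate in meters (NaN where matching fails)."""
    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=7)
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
    disparity[disparity <= 0] = np.nan               # mark invalid matches
    return focal_length_px * baseline_m / disparity  # depth = f * B / d
```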


Further, the applications 224 may include a machine learning algorithm and the computing device 200 may function as a machine learning system, which is trained with previous surgical videos/frame images with associated tagged information.


Now referring to FIG. 2B, provided is a block diagram of a machine learning system 260, which may be implemented by the computing device 200 of FIG. 2A. The machine learning system 260 may be trained by a plurality of data records 270a-270n. Videos and frame images of previous surgeries may form one set of the data records. For example, the data record 270a may include frame images/videos 272a, tagged information 274a associated with the frame images 272a, and, if relevant, control parameters 276a for a generator which provided surgical energy for a surgery. In an aspect, one surgery may be divided into several stages. In this case, the data record 270a may include a plurality of sets, and each set may include frame images, tagged information, and control parameters for the corresponding generator for one stage. In another aspect, each stage may be considered to be separate from the other stages. As such, one surgery may result in two or more sets of the data records.
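

A data record of this kind might be organized as in the following sketch; the class and field names are illustrative assumptions rather than the structure used by the disclosed system.

```python
# Sketch of one training data record (cf. data record 270); the class and
# field names are illustrative assumptions.

from dataclasses import dataclass, field
from typing import Any, Dict, List

@dataclass
class DataRecord:
    frame_images: List[Any]                  # frames for one surgery or one stage
    tagged_info: List[Dict[str, Any]]        # per-frame tags (tool, organ, labels)
    generator_params: Dict[str, float] = field(default_factory=dict)  # e.g., power, duration
    patient_params: Dict[str, Any] = field(default_factory=dict)      # e.g., age, hydration

record = DataRecord(
    frame_images=[],                         # frame data omitted in this sketch
    tagged_info=[{"tool": "grasper", "target_organ": "gallbladder", "label": "too close"}],
    generator_params={"power_w": 30.0, "duration_s": 2.0},
    patient_params={"age": 57},
)
```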


For simplicity, the letter (e.g., a-n) affixed to the end of a reference numeral may be omitted hereafter unless it is necessary. For example, the tagged information 274 may represent one or more of the tagged information 274a-274n. The tagged information 274 may be manually or automatically added to or embedded in the frame images 272. For example, medical professionals may manually tag information in the frame images 272, or a tagging algorithm may process the frame images 272 and automatically tag information in them.


In another aspect, the frame images 272, the tagged information 274, the control parameters 276 for the generator, and patient parameters 278 of a previously performed surgery together form one data record 270. One data record 270 may be separate from, and independent of, the other data records of the plurality of data records 270a-270n generated from other surgeries.


The machine learning system 260 may be trained by the plurality of the data records 270a-270n. In an aspect, the machine learning system 260 may be trained with data records which have been generated from surgeries similar to the current surgery. In this case, the machine learning system 260 may be trained by a supervised or reinforcement learning method. In a case when the plurality of data records 270a-270n are generated from various surgeries, the machine learning system 260 may be trained by unsupervised learning. In another aspect, the machine learning system 260 may include, but is not limited to, convolutional neural networks, recurrent neural networks (RNN), Bayesian regression, naive Bayes, nearest neighbors, least squares, means, and support vector regression, among other data science and artificial intelligence techniques.
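

For illustration only, the following sketch trains one of the simpler model families mentioned above (nearest neighbors) on crudely featurized frames with their tags; the feature extraction, labels, and random frames are placeholders and do not represent the disclosed machine learning system.

```python
# Illustration only: supervised training with one of the simpler model
# families named above (nearest neighbors) on crudely featurized frames.
# The feature extraction, labels, and random frames are placeholders.

import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def frame_to_features(frame: np.ndarray) -> np.ndarray:
    """Downsample and flatten a grayscale frame into a feature vector."""
    return frame[::8, ::8].astype(np.float32).ravel() / 255.0

# Stand-ins for frames from previous surgeries and their tagged information.
frames = [np.random.randint(0, 256, (240, 320), dtype=np.uint8) for _ in range(20)]
labels = ["appropriate"] * 10 + ["too close"] * 10

X = np.stack([frame_to_features(f) for f in frames])
model = KNeighborsClassifier(n_neighbors=3).fit(X, labels)

# Inference on a new real-time frame.
new_frame = np.random.randint(0, 256, (240, 320), dtype=np.uint8)
print(model.predict([frame_to_features(new_frame)])[0])
```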


The tagged information 274 may have one or more levels. The first level is global, meaning that the tagged information in the first level is effective throughout the entire video or image frames, and the second level is local, meaning that the tagged information in the second level is effective for a portion of the video or the frame images. The first level information may include a type of surgery, a target organ, a position of the target organ, and a surgery plan including a range of appropriate surgery angles. The second level information may include tool information and progression information. The progression information may indicate which stage of the surgery is shown in the corresponding frame images. The tool information may indicate whether or not a tool is appropriate in size, shape, and type during the surgery, whether or not the tool is too close to or too far from the target organ for the surgery, whether or not the tool is approaching the target organ at an appropriate angle, and whether or not the tool is approaching the target organ too fast. In an aspect, the tool information may indicate whether or not the tool is too close to a critical non-target organ during the surgery.
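

The two-level tagging described above might be represented as in the following sketch; all keys and values are illustrative assumptions.

```python
# Sketch of the two-level tagging described above: "global" tags apply to the
# whole video, "local" tags apply to a span of frames. Keys and values are
# illustrative assumptions.

tagged_information = {
    "global": {
        "surgery_type": "laparoscopic cholecystectomy",
        "target_organ": "gallbladder",
        "target_organ_position": "right upper quadrant",
        "approach_angle_range_deg": (30, 60),   # part of the surgery plan
    },
    "local": [
        {
            "frames": (120, 180),               # frame span covered by this tag
            "progression_stage": "dissection",
            "tool": {"type": "hook cautery", "appropriate": True},
            "labels": ["too close"],            # tool too close to target organ
        },
    ],
}
```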


Doctors, experts, or medical professionals may add the tagged information 274 to the frame images 272. For example, a tool may be tagged with the boundaries of the tool in the frame images 272. The target organ and a non-target critical organ may be tagged in the same way as the tool. Positional information and/or orientation information about the tool, such as “too far,” “too close,” “wrong angle,” “too fast,” etc., may be added to the frame images 272. The machine learning system 260 may process the images 272 with the associated or corresponding tagged information 274, adjust, update, and revise internal control parameters of the machine learning system 260, and store the internal control parameters in a configuration file.


In embodiments, the tagged information 274 may further include surgical procedural information related to the surgical operation. The surgical procedural information may indicate a relationship between the tool and the tissue. For example, the tool may include two jaw members, and the surgical procedural information may indicate how much tissue is grasped between the two jaw members. Further, the surgical procedural information may include how hard the two jaw members press on the tissue.


In a case when the tool is a cutter, the surgical procedural information may indicate how deep the cut was made by the cutter.


The surgical procedural information may further include hemodynamics during the surgery. During tissue dissection or tissue approximation, bleeding might occur. The surgical procedural information may indicate whether or not bleeding occurred or how much bleeding has occurred.


Furthermore, other information related to the surgical operation may be tagged so that the machine learning system 260 may be trained with these pieces of tagged information.


The generator control parameters 276 may be parameters for a generator which is to supply surgical energy for the surgery and may include, for example, duration, power, ramp rate, frequency, or other generator parameters for the surgery. The generator control parameters 276 may be saved in a database or memory because they are not likely to be obtainable by processing the frame images 272.


The data records 270 may further include patient parameters 278. The patient parameters 278 may include a patient's age, tissue moisture, hydration, and/or tissue location within the patient's body, among other patient characteristics. In various embodiments, the data relating to the patient parameters 278 may be entered into the data records 270 manually or automatically from the patient's medical records. Since the patient parameters 278 may not be acquired from image processing of the images 272, the patient parameters 278 may be saved in a database or memory, as are the generator control parameters 276.


After processing and learning from the data records 270 generated from the previous surgeries, the machine learning system 260 is then able to process real-time frame images/videos of a current surgery and provide notifications based on the results. For example, when one or more real-time frame images show that a tool is too close to the target organ, the machine learning system 260 may present an indication that the tool is too close to the target organ for the surgery. When one or more real-time frame images show that the tool is approaching the target organ at a wrong angle, the machine learning system 260 may present an indication that the tool is approaching the target from a wrong angle. Similarly, when one or more frame images show that the tool is approaching the target organ too fast, the machine learning system 260 may present such an indication.
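

The mapping from a model prediction to such a notification might look like the following sketch, in which the label strings and the notify() stand-in are assumptions for illustration.

```python
# Sketch of mapping a model prediction to the notifications described above;
# the label strings and notify() stand-in are illustrative assumptions.

INDICATIONS = {
    "too close": "The tool is too close to the target organ.",
    "too far": "The tool is too far from the target organ.",
    "wrong angle": "The tool is approaching the target from a wrong angle.",
    "too fast": "The tool is approaching the target organ too fast.",
}

def notify(message: str) -> None:
    print(f"[SURGICAL GUIDANCE] {message}")    # stand-in for popup/sound/haptics

def handle_prediction(label: str) -> None:
    message = INDICATIONS.get(label)
    if message is not None:
        notify(message)

handle_prediction("wrong angle")
```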


Now referring back to FIG. 2A, the output interface 240 may further transmit and/or receive data via a network interface 250 via one or more wireless configurations, e.g., radio frequency, optical, Wi-Fi®, Bluetooth® (an open wireless protocol for exchanging data over short distances, using short-length radio waves, from fixed and mobile devices, creating personal area networks (PANs)), and ZigBee® (a specification for a suite of high-level communication protocols using small, low-power digital radios based on the IEEE® 802.15.4-2003 standard for wireless personal area networks (WPANs)). Although depicted as a separate component, the network interface 250 may be integrated into the input interface 230 and/or the output interface 240.


With additional reference to FIG. 3, the surgical system 100 of FIG. 1A may include an endoscope 310, which is inserted through a body cavity of the patient “P” and is configured to provide optical views or frame images of the surgical site “S” and to transmit the frame images to the display device 156. The endoscope 310 includes a camera 320 to capture images of the surgical site “S” and a tool 340 during a surgery as detailed below. The camera 320 may be an ultrasonic imaging device, a laser imaging device, a fluorescent imaging device, or any other imaging device capable of producing real-time frame images. In various embodiments, the endoscope 310 may be a stereo-endoscope which captures stereo-images having depth information.


The endoscope 310 is inserted through an opening, either a natural opening or an incision, to position the camera 320 within the body cavity adjacent the surgical site “S” to allow the camera 320 to capture images of the surgical site “S” and the tool 340. The camera 320 transmits the captured images to the machine learning system 260 of FIG. 2B. The machine learning system 260 receives the captured images of the surgical site “S” from the camera 320 and displays the received images on a display device such that the clinician can visually see the surgical site “S” and the tool 340. The endoscope 310 may include a sensor 330 that captures the pose of the camera 320 when the images of the surgical site “S” are captured. The sensor 330 is in communication with the machine learning system 260 such that the machine learning system 260 receives the pose of the camera 320 from the sensor 330 and associates the pose of the camera 320 with the images captured by the camera 320.


In an aspect, the machine learning system 260 may process the images and may identify a type, shape, and size of the tool 340 in consideration of the pose of the camera.


The machine learning system 260, which may have been refined using a plurality of images of previous surgeries related to the current surgery, may also be trained to identify progression stages of the surgery from the real-time frame images.


The identified progression stages may include navigation to the target organ, identification of the target region, entry of preparatory tools for the surgery to the target organ, performance of the surgery, confirmation of completeness of the surgery, and retreat of all tools from the target organ. This list of progression stages is provided as an example and is not limiting.


Based on the identified progression stage, the machine learning system 260 may identify whether or not the tool 340 is appropriate in type, shape, and size according to the identified progression stage. In case the tool 340 does not have a proper type, shape, or size, the machine learning system 260 may provide an indication that the tool 340 is not appropriate in type, shape, or size based on the identified progression stage. In various embodiments, a tool identified by the machine learning system 260 may be compared to a database of tools used in other similar surgeries. If the tool identified by the machine learning system 260 is not among the tools used in other similar surgeries, a notification can be provided to indicate that the tool should be changed.
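

A stage-aware version of this check might look like the following sketch; the stage names, tool types, and size ranges are illustrative assumptions rather than data from the disclosure.

```python
# Sketch of a stage-aware check on tool type and size; stage names, tool
# types, and size ranges are illustrative assumptions.

from typing import Optional

STAGE_REQUIREMENTS = {
    "dissection": {"types": {"hook cautery", "grasper"}, "size_mm": (5.0, 10.0)},
    "closure": {"types": {"clip applier"}, "size_mm": (5.0, 12.0)},
}

def stage_tool_indication(stage: str, tool_type: str, size_mm: float) -> Optional[str]:
    req = STAGE_REQUIREMENTS.get(stage)
    if req is None:
        return None
    if tool_type not in req["types"]:
        return f"'{tool_type}' is not appropriate in type for the '{stage}' stage."
    low, high = req["size_mm"]
    if not low <= size_mm <= high:
        return f"'{tool_type}' ({size_mm} mm) is not appropriate in size for the '{stage}' stage."
    return None

print(stage_tool_indication("dissection", "grasper", 15.0))
```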


In an aspect, the notification may indicate that the type of the tool is appropriate for the surgery but the size thereof is too large or too small for proper operations in the identified progression stage, or that the tool is inappropriate for the identified progression stage. The notification may be generally made in a manner that does not interfere with the surgery. If the level of potential harm from using the tool 340 appears imminent or severe, the notification may be presented with heightened alerts, such as red flashes on the screen, haptic vibrations on the input handle 152 of FIG. 1, or audible alert sounds.


Further, based on the frame images, the machine learning system 260 may identify whether a pose or orientation of the tool 340 is appropriate or incorrect. In case the orientation of the tool 340 is determined to be incorrect or inappropriate by the machine learning system 260, the notification can indicate the tool should be re-oriented or re-positioned.


The machine learning system 260 may record the determinations and notifications during the surgery for further refinement of the internal control parameters of the machine learning system 260 in the future. For example, the surgeon who has performed the surgery or a group of experts may tag information over the frame images in consideration of the notifications. They may discuss the efficacy of the tool used in the surgery, and refine or update the tagged information of the tool from “inappropriate” to “appropriate” based on the positive efficacy in the surgery or vice versa. The machine learning system 260 may be trained with these updates and further refine internal control parameters saved in a configuration file.



FIG. 4 is a flowchart illustrating a method 400 for checking whether or not a tool captured in images is appropriate during a surgery in accordance with embodiments of the disclosure. When a surgical tool or instrument is inserted into an orifice of a patient, one or more cameras (e.g., a stereoscopic endoscope) may capture images of the tool or instrument and a surgical site. The method 400 starts by receiving information of the surgery in step 405. The information of the surgery includes a type of surgery, a target organ, and a position of the target organ. The information of the surgery, which is entered by a doctor or medical professional, may be retrieved manually or automatically from a database stored in a memory.


A machine learning system may be configured for the current surgery based on the information of the current surgery. The machine learning system may save a separate configuration file for different types of surgical procedures. In an aspect, the machine learning system may retrieve a configuration file, which is used to process images of the current surgery.


In step 410, images of the surgical site are received from the endoscope by the machine learning system. The images are processed to determine whether the tool is captured in the images in step 415. This determination may be performed by an image processing algorithm or by the machine learning system.


When it is determined in step 415 that the tool is not captured in the images, the method 400 returns to step 410 until the tool is captured in the images.


When the tool is determined to be captured in the images, the information of the tool is identified by the machine learning system from the images in step 420. The information of the tool may include a size, shape, position information, and orientation information of the tool. In an aspect, the pose of the endoscope or camera that captured the images may also be considered to adjust the images so that the machine learning system can identify the tool with accuracy. In step 420, the machine learning system may also identify a target organ.


In embodiments, hemodynamics may also be determined in step 420. For example, it is determined whether or not bleeding occurred and, if so, how much bleeding has occurred. Further, the relationship between the tool and the tissue may also be determined in step 420, such as the amount of tissue grasped by two jaw members of the tool and the amount of pressure applied by the two jaw members. When such a determination is found in step 425 to be out of a range suitable for the surgical operation, a warning about the surgical operation may be displayed in step 430.
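

The out-of-range check described for these determinations might be sketched as follows, with purely illustrative threshold values that are not taken from the disclosure.

```python
# Sketch of the out-of-range check described for step 425; the acceptable
# ranges below are illustrative assumptions only.

from typing import Optional

def grasp_warning(tissue_mm: float, pressure_kpa: float,
                  tissue_range=(2.0, 8.0), pressure_range=(10.0, 60.0)) -> Optional[str]:
    """Return a warning string (for display in step 430) if a measured value
    falls outside the range suitable for the surgical operation."""
    if not tissue_range[0] <= tissue_mm <= tissue_range[1]:
        return f"Grasped tissue ({tissue_mm} mm) is outside the suitable range {tissue_range}."
    if not pressure_range[0] <= pressure_kpa <= pressure_range[1]:
        return f"Jaw pressure ({pressure_kpa} kPa) is outside the suitable range {pressure_range}."
    return None

print(grasp_warning(tissue_mm=9.5, pressure_kpa=40.0))
```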


In step 425, the machine learning system determines whether or not the tool should be changed, re-positioned, or re-oriented based on the configuration file, which has been generated, revised, and updated by the machine learning system from images of previous surgeries. If the machine learning system determines that the tool should not be moved or changed, the method 400 proceeds to step 440.


When it is determined that the tool should be changed in step 425, meaning that the tool is inappropriate in shape, size, or type for the surgery, the corresponding indication is displayed for the surgeon in step 430. In particular, the notification brings to the surgeon's attention that the tool does not have a type, size, or shape suitable for the surgery and should be changed or replaced.


When it is determined that the tool should be re-oriented in step 425, meaning that the tool is positioned in a wrong orientation relative to the target organ for the surgery, an indication is displayed so that the tool can be re-oriented.


A determination in step 425 that the tool should be moved may also mean that the tool is not appropriately positioned relative to the target organ for performing the surgery. In this case, the indication is presented so that the surgeon re-positions the tool.


Further, a determination in step 425 that the tool should be moved may also mean that the tool is approaching the target organ too fast. In this case, the indication is presented so that the surgeon slows down the speed of the tool toward the target organ.


In an aspect, the warning may be relayed to the clinician via haptic vibrations or audible sound alerts. In a case when the level of inappropriateness is sufficiently high to weigh against continuing the surgery, the operation may be stopped immediately to mitigate potential harm.


In step 435, the detected tool, together with the images, may be recorded for future reference. For example, a board of experts may gather to enter, refine, or update tagged information for this surgery based on the warning records, and the machine learning system can train itself with this new data. In this way, the machine learning system may update and refine its internal control parameters and save them in the configuration file.


In step 440, it is determined whether or not the surgery is completed. When the surgery is not complete, the method 400 reiterates steps 410-440. Otherwise, the method 400 is ended after completion of the surgery.
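

The overall loop of FIG. 4 (steps 410-440) can be summarized in the following skeleton, in which the callback functions stand in for the image processing and machine learning components described above and are assumptions rather than the disclosed code.

```python
# Skeleton of the loop in FIG. 4 (steps 410-440). The callback parameters
# stand in for the image processing and machine learning components described
# above; they are assumptions, not the disclosed code.

def run_guidance_loop(get_frame, tool_in_frame, identify_tool,
                      needs_action, show_indication, record, surgery_complete):
    while not surgery_complete():            # step 440: stop when surgery is done
        frame = get_frame()                  # step 410: receive image from endoscope
        if not tool_in_frame(frame):         # step 415: is a tool captured?
            continue                         # keep polling frames
        tool_info = identify_tool(frame)     # step 420: identify tool, pose, target
        action = needs_action(tool_info)     # step 425: change / re-position / re-orient?
        if action is not None:
            show_indication(action)          # step 430: display warning to the surgeon
        record(frame, tool_info, action)     # step 435: record for future refinement
```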


It should be understood that various aspects disclosed herein may be combined in different combinations than the combinations specifically presented in the description and accompanying drawings. It should also be understood that, depending on the example, certain acts or events of any of the processes or methods described herein may be performed in a different sequence, may be added, merged, or left out altogether (e.g., all described acts or events may not be necessary to carry out the techniques). In addition, while certain aspects of this disclosure are described as being performed by a single module or unit for purposes of clarity, it should be understood that the techniques of this disclosure may be performed by a combination of units or modules associated with, for example, a medical device.


In one or more examples, the described techniques may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a computer-readable medium and executed by a hardware-based processing unit. Computer-readable media may include non-transitory computer-readable media, which corresponds to a tangible medium such as data storage media (e.g., RAM, ROM, EEPROM, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer).


Instructions may be executed by one or more processors, such as one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), field programmable logic arrays (FPGAs), or other equivalent integrated or discrete logic circuitry. Accordingly, the term “processor” as used herein may refer to any of the foregoing structure or any other physical structure suitable for implementation of the described techniques. Also, the techniques could be fully implemented in one or more circuits or logic elements.

Claims
  • 1. A method for guiding a surgical procedure, the method comprising: accessing information relating to a surgical procedure; accessing at least one image of a surgical site captured by an endoscope during the surgical procedure; identifying a tool captured in the at least one image by a machine learning system; determining whether the tool should be moved based on comparing the information relating to the surgical procedure and the tool identified by the machine learning system; and providing an indication: when the determining indicates that the tool should be moved; and when the tool is moving toward a target organ tissue at a speed that is greater than a predetermined speed.
  • 2. The method according to claim 1, wherein the information relating to the surgical procedure indicates tools which have been used for other surgical procedures of a same type as the surgical procedure, and wherein the method further comprises: determining whether the tool should be changed based on comparing the information relating to the surgical procedure and the tool identified by the machine learning system; wherein determining whether the tool should be changed includes determining whether the tool is among the tools which have been used for the other surgical procedures.
  • 3. The method according to claim 2, wherein determining whether the tool should be changed includes determining that the tool should be changed when the tool is not among the tools which have been used for the other surgical procedures.
  • 4. The method according to claim 1, wherein the endoscope is a stereo-endoscope, the at least one image includes at least one stereoscopic image, and the at least one stereoscopic image includes depth information relating to the tool and to tissue of the surgical site.
  • 5. The method according to claim 4, further comprising determining position information for the tool by the machine learning system based on the at least one image.
  • 6. The method according to claim 5, wherein the machine learning system was trained using tool position information for other surgical procedures of a same type as the surgical procedure.
  • 7. The method according to claim 6, wherein the position information for the tool determined by the machine learning system indicates whether the tool should be re-positioned.
  • 8. The method according to claim 4, further comprising determining orientation information for the tool by the machine learning system based on the at least one image.
  • 9. The method according to claim 8, wherein the machine learning system was trained using tool orientation information for other surgical procedures of a same type as the surgical procedure.
  • 10. The method according to claim 9, wherein the orientation information for the tool determined by the machine learning system indicates whether the tool should be re-oriented.
  • 11. A surgical guiding system for guiding a surgical procedure, the system comprising: a memory configured to store instructions; and a processor coupled with the memory and configured to execute the instructions to cause the surgical guiding system to: access information relating to a surgical procedure; access at least one image of a surgical site captured by an endoscope during the surgical procedure; identify a tool captured in the at least one image by a machine learning system; determine whether the tool should be moved based on comparing the information relating to the surgical procedure and the tool identified by the machine learning system; and provide an indication: when the determining indicates that the tool should be moved; and when the tool is moving toward a target organ tissue at a speed that is greater than a predetermined speed.
  • 12. The surgical guiding system according to claim 11, wherein the information relating to the surgical procedure indicates tools which have been used for other surgical procedures of a same type as the surgical procedure, and wherein the method comprises: determining whether the tool should be changed based on comparing the information relating to the surgical procedure and the tool identified by the machine learning system; wherein in determining whether the tool should be changed, the instructions, when executed, cause the surgical guiding system to determine whether the tool is among the tools which have been used for the other surgical procedures.
  • 13. The surgical guiding system according to claim 12, wherein in determining whether the tool should be changed, the instructions, when executed, cause the surgical guiding system to determine that the tool should be changed when the tool is not among the tools which have been used for the other surgical procedures.
  • 14. The surgical guiding system according to claim 11, wherein the endoscope is a stereo-endoscope, the at least one image includes at least one stereoscopic image, and the at least one stereoscopic image includes depth information relating to the tool and to tissue of the surgical site.
  • 15. The surgical guiding system according to claim 14, wherein the instructions, when executed, further cause the surgical guiding system to determine position information for the tool by the machine learning system based on the at least one image.
  • 16. The surgical guiding system according to claim 15, wherein the machine learning system was trained using tool position information for other surgical procedures of a same type as the surgical procedure.
  • 17. The surgical guiding system according to claim 16, wherein the position information for the tool determined by the machine learning system indicates whether the tool should be re-positioned.
  • 18. The surgical guiding system according to claim 14, wherein the instructions, when executed, further cause the surgical guiding system to determine orientation information for the tool by the machine learning system based on the at least one image.
  • 19. The surgical guiding system according to claim 18, wherein the machine learning system was trained using tool orientation information for other surgical procedures of a same type as the surgical procedure.
  • 20. The surgical guiding system according to claim 19, wherein the orientation information of the tool determined by the machine learning system indicates whether the tool should be re-oriented.
CROSS-REFERENCE TO RELATED APPLICATIONS

The present application is a U.S. National Stage Application filed under 35 U.S.C. § 371(a) of International Patent Application Serial No. PCT/US2020/060990, filed Nov. 18, 2020, which claims the benefit of and priority to U.S. Provisional Patent Application Ser. No. 62/952,628, filed Dec. 23, 2019, the entire disclosure of each of which being incorporated by reference herein.

PCT Information
Filing Document: PCT/US2020/060990, filed 11/18/2020 (WO)
Publishing Document: WO2021/133483, published 7/1/2021 (WO, Kind A)
9623563 Nixon Apr 2017 B2
9623902 Griffiths et al. Apr 2017 B2
9629520 Diolaiti Apr 2017 B2
9662177 Weir et al. May 2017 B2
9664262 Donlon et al. May 2017 B2
9675354 Weir et al. Jun 2017 B2
9687312 Dachs, II et al. Jun 2017 B2
9700334 Hinman et al. Jul 2017 B2
9718190 Larkin et al. Aug 2017 B2
9730719 Brisson et al. Aug 2017 B2
9737199 Pistor et al. Aug 2017 B2
9795446 DiMaio et al. Oct 2017 B2
9797484 Solomon et al. Oct 2017 B2
9801690 Larkin et al. Oct 2017 B2
9814530 Weir et al. Nov 2017 B2
9814536 Goldberg et al. Nov 2017 B2
9814537 Itkowitz et al. Nov 2017 B2
9820823 Richmond et al. Nov 2017 B2
9827059 Robinson et al. Nov 2017 B2
9830371 Hoffman et al. Nov 2017 B2
9839481 Blumenkranz et al. Dec 2017 B2
9839487 Dachs, II Dec 2017 B2
9850994 Schena Dec 2017 B2
9855102 Blumenkranz Jan 2018 B2
9855107 Labonville et al. Jan 2018 B2
9872737 Nixon Jan 2018 B2
9877718 Weir et al. Jan 2018 B2
9883920 Blumenkranz Feb 2018 B2
9888974 Niemeyer Feb 2018 B2
9895813 Blumenkranz et al. Feb 2018 B2
9901408 Larkin Feb 2018 B2
9918800 Itkowitz et al. Mar 2018 B2
9943375 Blumenkranz et al. Apr 2018 B2
9948852 Lilagan et al. Apr 2018 B2
9949798 Weir Apr 2018 B2
9949802 Cooper Apr 2018 B2
9952107 Blumenkranz et al. Apr 2018 B2
9956044 Gomez et al. May 2018 B2
9980778 Ohline et al. May 2018 B2
10008017 Itkowitz et al. Jun 2018 B2
10028793 Griffiths et al. Jul 2018 B2
10033308 Chaghajerdi et al. Jul 2018 B2
10034719 Richmond et al. Jul 2018 B2
10052167 Au et al. Aug 2018 B2
10085811 Weir et al. Oct 2018 B2
10092344 Mohr et al. Oct 2018 B2
10123844 Nowlin Nov 2018 B2
10188471 Brisson Jan 2019 B2
10201390 Swarup et al. Feb 2019 B2
10213202 Flanagan et al. Feb 2019 B2
10258416 Mintz et al. Apr 2019 B2
10278782 Jarc et al. May 2019 B2
10278783 Itkowitz et al. May 2019 B2
10282881 Itkowitz et al. May 2019 B2
10335242 Devengenzo et al. Jul 2019 B2
10383694 Venkataraman Aug 2019 B1
10405934 Prisco et al. Sep 2019 B2
10433922 Itkowitz et al. Oct 2019 B2
10464219 Robinson et al. Nov 2019 B2
10485621 Morrissette et al. Nov 2019 B2
10500004 Hanuschik et al. Dec 2019 B2
10500005 Weir et al. Dec 2019 B2
10500007 Richmond et al. Dec 2019 B2
10507066 DiMaio et al. Dec 2019 B2
10510267 Jarc et al. Dec 2019 B2
10524871 Liao Jan 2020 B2
10548459 Itkowitz et al. Feb 2020 B2
10575909 Robinson et al. Mar 2020 B2
10592529 Hoffman et al. Mar 2020 B2
10595946 Nixon Mar 2020 B2
10881469 Robinson Jan 2021 B2
10881473 Itkowitz et al. Jan 2021 B2
10898188 Burbank Jan 2021 B2
10898189 McDonald, II Jan 2021 B2
10905506 Itkowitz et al. Feb 2021 B2
10912544 Brisson et al. Feb 2021 B2
10912619 Jarc et al. Feb 2021 B2
10918387 Duque et al. Feb 2021 B2
10918449 Solomon et al. Feb 2021 B2
10932873 Griffiths et al. Mar 2021 B2
10932877 Devengenzo et al. Mar 2021 B2
10939969 Swarup et al. Mar 2021 B2
10939973 DiMaio et al. Mar 2021 B2
10952801 Miller et al. Mar 2021 B2
10965933 Jarc Mar 2021 B2
10966742 Rosa et al. Apr 2021 B2
10973517 Wixey Apr 2021 B2
10973519 Weir et al. Apr 2021 B2
10984567 Itkowitz et al. Apr 2021 B2
10993773 Cooper et al. May 2021 B2
10993775 Cooper et al. May 2021 B2
11000331 Krom et al. May 2021 B2
11013567 Wu et al. May 2021 B2
11020138 Ragosta Jun 2021 B2
11020191 Diolaiti et al. Jun 2021 B2
11020193 Wixey et al. Jun 2021 B2
11026755 Weir et al. Jun 2021 B2
11026759 Donlon et al. Jun 2021 B2
11040189 Vaders et al. Jun 2021 B2
11045077 Stern et al. Jun 2021 B2
11045274 Dachs, II et al. Jun 2021 B2
11058501 Tokarchuk et al. Jul 2021 B2
11076925 DiMaio et al. Aug 2021 B2
11090119 Burbank Aug 2021 B2
11096687 Flanagan et al. Aug 2021 B2
11098803 Duque et al. Aug 2021 B2
11109925 Cooper et al. Sep 2021 B2
11116578 Hoffman et al. Sep 2021 B2
11129683 Steger et al. Sep 2021 B2
11135029 Suresh et al. Oct 2021 B2
11147552 Burbank et al. Oct 2021 B2
11147640 Jarc et al. Oct 2021 B2
11154373 Abbott et al. Oct 2021 B2
11154374 Hanuschik et al. Oct 2021 B2
11160622 Goldberg et al. Nov 2021 B2
11160625 Wixey et al. Nov 2021 B2
11161243 Rabindran et al. Nov 2021 B2
11166758 Mohr et al. Nov 2021 B2
11166770 DiMaio et al. Nov 2021 B2
11166773 Ragosta et al. Nov 2021 B2
11173597 Rabindran et al. Nov 2021 B2
11185378 Weir et al. Nov 2021 B2
11191596 Thompson et al. Dec 2021 B2
11197729 Thompson et al. Dec 2021 B2
11213360 Hourtash et al. Jan 2022 B2
11221863 Azizian et al. Jan 2022 B2
11234700 Ragosta et al. Feb 2022 B2
11241274 Vaders et al. Feb 2022 B2
11241290 Waterbury et al. Feb 2022 B2
11259870 DiMaio et al. Mar 2022 B2
11259884 Burbank Mar 2022 B2
11272993 Gomez et al. Mar 2022 B2
11272994 Saraliev et al. Mar 2022 B2
11291442 Wixey et al. Apr 2022 B2
11291513 Manzo et al. Apr 2022 B2
11348257 Lang May 2022 B2
11647888 Meglan May 2023 B2
11801114 Lang Oct 2023 B2
11806085 Meglan Nov 2023 B2
11871901 Shelton, IV Jan 2024 B2
11925423 Meglan Mar 2024 B2
11986261 Meglan May 2024 B2
12029510 Peine Jul 2024 B2
12048413 Tada Jul 2024 B2
20090088634 Zhao Apr 2009 A1
20150287236 Winne Oct 2015 A1
20170251900 Hansen Sep 2017 A1
20190104919 Shelton, IV Apr 2019 A1
20190125361 Shelton, IV May 2019 A1
20190125454 Stokes May 2019 A1
20190125455 Shelton, IV May 2019 A1
20190125456 Shelton, IV May 2019 A1
20190125457 Parihar May 2019 A1
20190125458 Shelton, IV May 2019 A1
20190200844 Shelton, IV Jul 2019 A1
20190200977 Shelton, IV Jul 2019 A1
20190200981 Harris et al. Jul 2019 A1
20190201018 Shelton, IV et al. Jul 2019 A1
20190201019 Shelton, IV et al. Jul 2019 A1
20190201104 Shelton, IV Jul 2019 A1
20190201136 Shelton, IV Jul 2019 A1
20190201137 Shelton, IV Jul 2019 A1
20190206562 Shelton, IV Jul 2019 A1
20190206564 Shelton, IV Jul 2019 A1
20190380791 Fuerst Dec 2019 A1
20200022764 Flexman Jan 2020 A1
20200078106 Henderson Mar 2020 A1
20200081585 Petre Mar 2020 A1
20200135330 Sugie Apr 2020 A1
20200304753 Venkataraman Sep 2020 A1
20210030495 Savall Feb 2021 A1
20210290315 Lampert Sep 2021 A1
20210401508 Zhao Dec 2021 A1
20220020496 Saito Jan 2022 A1
Foreign Referenced Citations (2)
Number Date Country
2019130085 Jul 2019 WO
2019130086 Jul 2019 WO
Non-Patent Literature Citations (2)
International Search Report mailed Apr. 9, 2021, and Written Opinion completed Mar. 30, 2021, corresponding to counterpart International Patent Application No. PCT/US20/60990.
European Examination Report dated Nov. 13, 2023 for European Application No. 20824768.4-1216 (8 pages).
Related Publications (1)
Number Date Country
20220395334 A1 Dec 2022 US
Provisional Applications (1)
Number Date Country
62952628 Dec 2019 US