Advanced basket drive mode

Information

  • Patent Number
    11,439,419
  • Date Filed
    Monday, December 14, 2020
  • Date Issued
    Tuesday, September 13, 2022
Abstract
A robotic system includes a robotic manipulator configured to: manipulate a medical instrument having a basket; open the basket at a first opening speed and a second, faster opening speed; and close the basket at a first closing speed and a second, faster closing speed. The system includes an input device configured to receive one or more user interactions and initiate one or more actions by the robotic manipulator, including directly controlled movement and/or pre-programmed motions. Control circuitry of the robotic system is configured to: in response to receiving a first user interaction via the input device, trigger a first pre-programmed motion of the robotic manipulator to open the basket at the second, faster opening speed; and in response to receiving a second user interaction via the input device, trigger a second pre-programmed motion to close the basket at the second, faster closing speed.
Description
BACKGROUND
Field

The present disclosure relates to the field of medical devices and procedures and user interfaces.


Description of the Related Art

Various medical procedures involve the use of one or more devices configured to penetrate the human anatomy to reach a treatment site. Certain operational processes can involve inserting the one or more devices through the skin or an orifice of a patient to reach the treatment site and extract an object from the patient, such as a urinary stone.


SUMMARY

Described herein are one or more systems, devices, and/or methods to assist a physician or other user in controlling a medical instrument to access an object, such as a urinary stone, located within the human anatomy.


One general aspect includes a robotic system for performing a medical procedure, the robotic system including a robotic manipulator configured to: manipulate a medical instrument having a basket, the medical instrument configured to access a human anatomy; open the basket at a first opening speed and a second opening speed faster than the first opening speed; and close the basket at a first closing speed and a second closing speed faster than the first closing speed. The system can include an input device configured to receive one or more user interactions and initiate one or more actions by the robotic manipulator, the one or more actions including at least one of directly controlled movement and pre-programmed motions. The system also can include control circuitry communicatively coupled to the input device and the robotic manipulator and configured to: in response to receiving a first user interaction via the input device, trigger a first pre-programmed motion of the robotic manipulator, the first pre-programmed motion including opening the basket at the second opening speed; and in response to receiving a second user interaction via the input device, trigger a second pre-programmed motion of the robotic manipulator, the second pre-programmed motion including closing the basket at the second closing speed. Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.


Implementations of the robotic system may include one or more of the following features. The robotic system may include a ureteroscope. The medical procedure may include ureteroscopy. The input device may include a control pad having directional controls configured to direct movement of the robotic manipulator along a plurality of axes and a plurality of buttons including a first button and a second button. The first user interaction may include double tapping the first button. The second user interaction may include double tapping the second button. In the robotic system, the control circuitry can be further configured to: in response to tapping the first button and the second button concurrently, trigger a third pre-programmed motion of the robotic manipulator including a repeated, short range, forward and backward movement at an accelerated speed. The control circuitry can be further configured to: in response to receiving a third user interaction, trigger a third pre-programmed motion of the robotic manipulator including a repeated, short range, forward and backward movement at an accelerated speed. The second pre-programmed motion may further include: detecting a torque on a drive mechanism of the basket; and in response to the torque exceeding a threshold, stopping the closing of the basket. The first user interaction and/or the second user interaction may include a voice command. Implementations of the described techniques may include hardware, a method or process, or computer software on a computer-accessible medium.


One general aspect includes a method for controlling a medical instrument using a robotic manipulator. The method can include manipulating, using the robotic manipulator, a medical instrument including a basket to access a human anatomy, the robotic manipulator configured to open the basket at a first opening speed and a second opening speed, the robotic manipulator further configured to close the basket at a first closing speed and a second closing speed; receiving, via an input device, one or more user interactions for triggering pre-programmed actions by the robotic manipulator. The method can further include, in response to receiving a first user interaction via the input device, triggering a first pre-programmed motion of the robotic manipulator, the first pre-programmed motion including opening the basket at the second opening speed, the second opening speed faster than the first opening speed. The method can further include, in response to receiving a second user interaction via the input device, triggering a second pre-programmed motion of the robotic manipulator, the second pre-programmed motion including closing the basket at the second closing speed, the second closing speed faster than the first closing speed. Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.


Implementations of the method may include one or more of the following features. The first user interaction may include double tapping a first button of the input device and the second user interaction may include double tapping a second button of the input device. The method can further include, in response to tapping the first button and the second button concurrently, triggering a third pre-programmed motion of the robotic manipulator, the third pre-programmed motion including a repeated, short range, forward and backward movement at an accelerated speed. The method may further include, in response to receiving a movement input along a first axis on the input device, moving a central locus of the third pre-programmed motion of the robotic manipulator along the first axis; and repeating the short range, forward and backward movement at the central locus. The third pre-programmed motion may further include a repeated, rotational movement. The method may further include manipulating, using the robotic manipulator, an endoscope to access a human anatomy, the endoscope configured to capture images of the medical instrument within the human anatomy. The method may further include receiving, via an input device, a third user interaction for directly controlling movement of the medical instrument; and manipulating, using the robotic manipulator, the medical instrument along one or more axes of movement based on the received third user interaction. The second pre-programmed motion may further include: detecting a torque on a drive mechanism of the basket; and in response to the torque exceeding a threshold, stopping the closing of the basket. Implementations of the described techniques may include hardware, a method or process, or computer software on a computer-accessible medium.


One general aspect includes a control system for controlling a robotic device for performing a medical procedure. The control system can include an input device configured to receive one or more user interactions and initiate one or more actions by the robotic device, the one or more actions including at least one of directly-controlled movement and pre-programmed motions. The control system can further include a communication interface configured to send commands to the robotic device corresponding to the directly-controlled movement and the pre-programmed motions, the commands including: movement, by the robotic device, of a medical instrument having a basket, the medical instrument configured to access a human anatomy; opening the basket at a first opening speed and a second opening speed faster than the first opening speed; and closing the basket at a first closing speed and a second closing speed faster than the first closing speed. The control system can further include control circuitry communicatively coupled to the input device and the communication interface, the control circuitry configured to: in response to receiving a first user interaction, trigger a first pre-programmed motion of the robotic device, the first pre-programmed motion including opening the basket at the second opening speed; and in response to receiving a second user interaction, trigger a second pre-programmed motion of the robotic device, the second pre-programmed motion including closing the basket at the second closing speed. Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.


Implementations of the control system may include one or more of the following features. The input device may include directional controls configured to direct movement of the robotic device along a plurality of axes; and a plurality of buttons including a first button configured to trigger the first pre-programmed motion and a second button configured to trigger the second pre-programmed motion. Double-tapping the first button can trigger the first pre-programmed motion and double-tapping the second button can trigger the second pre-programmed motion. Single tapping the first button can trigger a third pre-programmed motion different from the first pre-programmed motion and single-tapping the second button can trigger a fourth pre-programmed motion different from the second pre-programmed motion. The control circuitry can be further configured to: in response to tapping the first button and the second button concurrently, trigger a third pre-programmed motion of the robotic device, the third pre-programmed motion including a repeated, short range, forward and backward movement at an accelerated speed. The control circuitry can be further configured to: in response to receiving, via the directional controls, a movement request along a first axis, move a central locus of the third pre-programmed motion of the robotic device along the first axis; and repeat the short range, forward and backward movement at the central locus. The input device may include a microphone configured to capture vocal user commands; and the control circuitry is further configured to identify a first vocal user command corresponding to the first user interaction, and a second vocal user command corresponding to the second user interaction. The robotic device may be located at a first geographic location different from a second geographic location of the control system; and the communication interface is further configured to send the commands over a wide area network. Implementations of the described techniques may include hardware, a method or process, or computer software on a computer-accessible medium.


One general aspect includes one or more non-transitory computer-readable media storing computer-executable instructions that, when executed by control circuitry, cause the control circuitry to perform operations including: manipulating, using a robotic device, a medical instrument having a basket to access a human anatomy, the robotic device configured to open the basket at a first opening speed and a second opening speed, the robotic device further configured to close the basket at a first closing speed and a second closing speed; receiving, via an input device, one or more inputs for triggering pre-programmed actions by the robotic device; in response to receiving a first input via the input device, triggering a first pre-programmed motion of the robotic device, the first pre-programmed motion including opening the basket at the second opening speed, the second opening speed faster than the first opening speed; and in response to receiving a second input via the input device, triggering a second pre-programmed motion of the robotic device, the second pre-programmed motion including closing the basket at the second closing speed, the second closing speed faster than the first closing speed. Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.


Implementations of the non-transitory computer-readable media may include one or more of the following features. The first input may include double tapping a first button of the input device and the second input may include double tapping a second button of the input device. The computer-executable instructions can be further configured to cause the control circuitry to perform operations including: in response to tapping the first button and the second button concurrently, triggering a third pre-programmed motion of the robotic device, the third pre-programmed motion including a repeated, short range, forward and backward movement at an accelerated speed. The computer-executable instructions can be further configured to cause the control circuitry to perform operations including: receiving, via the input device, a third input for controlling direct movement of the robotic device; and manipulating, using the robotic device, the medical instrument along one or more axes of movement based on the received third input. Implementations of the described techniques may include hardware, a method or process, or computer software on a computer-accessible medium.


For purposes of summarizing the disclosure, certain aspects, advantages and novel features have been described. It is to be understood that not necessarily all such advantages may be achieved in accordance with any particular embodiment. Thus, the disclosed embodiments may be carried out in a manner that achieves or optimizes one advantage or group of advantages as taught herein without necessarily achieving other advantages as may be taught or suggested herein.





BRIEF DESCRIPTION OF THE DRAWINGS

Various embodiments are depicted in the accompanying drawings for illustrative purposes and should in no way be interpreted as limiting the scope of the disclosure. In addition, various features of different disclosed embodiments can be combined to form additional embodiments, which are part of this disclosure. Throughout the drawings, reference numbers may be reused to indicate correspondence between reference elements.



FIG. 1 illustrates an example medical system to perform or assist in performing medical procedures, according to certain embodiments.



FIGS. 2A-2B illustrate a perspective view and a top profile view of a controller, respectively, for the medical system, according to certain embodiments.



FIGS. 3A-3C illustrate a urinary stone capture procedure, according to certain embodiments.



FIGS. 4A-4B illustrate a basket retrieval device and several basket configurations, respectively, according to certain embodiments.



FIG. 5 is a flow diagram of a pre-programmed rapid open process, according to certain embodiments.



FIG. 6 is a flow diagram of a pre-programmed rapid close process, according to certain embodiments.



FIG. 7 is a flow diagram of a pre-programmed jiggle process, according to certain embodiments.



FIG. 8 illustrates example details of the robotic system 110, according to certain embodiments.



FIG. 9 illustrates example details of the control system 140, according to certain embodiments.





DETAILED DESCRIPTION

The headings provided herein are for convenience only and do not necessarily affect the scope or meaning of the disclosure. Although certain preferred embodiments and examples are disclosed below, the subject matter extends beyond the specifically disclosed embodiments to other alternative embodiments and/or uses and to modifications and equivalents thereof. Thus, the scope of the claims that may arise herefrom is not limited by any of the particular embodiments described below. For example, in any method or process disclosed herein, the acts or operations of the method or process may be performed in any suitable sequence and are not necessarily limited to any particular disclosed sequence. Various operations may be described as multiple discrete operations in turn, in a manner that may be helpful in understanding certain embodiments; however, the order of description should not be construed to imply that these operations are order dependent. Additionally, the structures, systems, and/or devices described herein may be embodied as integrated components or as separate components. For purposes of comparing various embodiments, certain aspects and advantages of these embodiments are described. Not necessarily all such aspects or advantages are achieved by any particular embodiment. Thus, for example, various embodiments may be carried out in a manner that achieves or optimizes one advantage or group of advantages as taught herein without necessarily achieving other aspects or advantages as may also be taught or suggested herein.


Certain standard anatomical terms of location may be used herein to refer to the anatomy of animals, and namely humans, with respect to the preferred embodiments. Although certain spatially relative terms, such as “outer,” “inner,” “upper,” “lower,” “below,” “above,” “vertical,” “horizontal,” “top,” “bottom,” and similar terms, are used herein to describe a spatial relationship of one device/element or anatomical structure to another device/element or anatomical structure, it is understood that these terms are used herein for ease of description to describe the positional relationship between element(s)/structure(s), as illustrated in the drawings. It should be understood that spatially relative terms are intended to encompass different orientations of the element(s)/structure(s), in use or operation, in addition to the orientations depicted in the drawings. For example, an element/structure described as “above” another element/structure may represent a position that is below or beside such other element/structure with respect to alternate orientations of the subject patient or element/structure, and vice-versa.


Overview


The present disclosure relates to techniques and systems for controlling a medical device such as a basket retrieval device for retrieving urinary stones. The basket retrieval device can be used in different situations during a medical procedure, such as ureteroscopy. For example, the basket can be used to capture a urinary stone, release the urinary stone, reposition the urinary stone, shake off the tissue on the basket, and/or break up urinary stone congestion. Different scenarios utilize different techniques for operating the basket retrieval device. The basket can be controlled to open/close, insert/retract, and/or rotate, with varying velocities depending on the scenario. In some embodiments, movement of the basket retrieval device is coordinated with movement of a scope for better feedback and control. Typically, the basket retrieval device is operated by two people, a physician controlling the insertion/retraction of the basket retrieval device and an assistant controlling opening/closing the basket itself. As such, cooperation and coordination between the physician and assistant is needed for successful operation of the device.


Kidney stone disease, also known as urolithiasis, is a relatively common medical condition that involves the formation, in the urinary tract, of a solid piece of material, referred to as “kidney stones,” “urinary stones,” “renal calculi,” “renal lithiasis,” or “nephrolithiasis.” Urinary stones can be formed and/or found in the kidneys, the ureters, and the bladder (referred to as “bladder stones”). Such urinary stones form as a result of concentrated minerals and can cause significant abdominal pain once they reach a size sufficient to impede urine flow through the ureter or urethra. Urinary stones can be formed from calcium, magnesium, ammonia, uric acid, cystine, and/or other compounds.


To remove urinary stones from the bladder and ureter, surgeons can insert a ureteroscope into the urinary tract through the urethra. Typically, a ureteroscope includes an endoscope at its distal end configured to enable visualization of the urinary tract. The ureteroscope can also include a lithotomy mechanism, such as the basket retrieval device, to capture or break apart urinary stones. As described above, during a ureteroscopy procedure, one physician/technician can control the position of the ureteroscope, while another physician/technician can control the lithotomy mechanism.


In one example operation, the physician tries to capture the stone while the assistant controls opening/closing of the basket. This requires some amount of coordination as the physician inserts and articulates the basket (and possibly a scope at the same time), while the assistant needs to quickly close the basket around the urinary stone until the stone is fully captured. In another operation involving releasing the stone, the assistant needs to open the basket fully and at high speed to release the stone. In an operation for shaking off tissue, the assistant or physician needs to jiggle the basket back and forth at high frequency so that the tissue falls off the basket. In an operation for repositioning the stone inside the basket, the assistant may need to slightly open the basket to give the stone room to rotate, while the physician jiggles the basket retrieval device back and forth, and sometimes inserts or retracts the basket retrieval device at the same time to help adjust the basket position.


As described above, basket operation can have various levels of complexity, depending on the medical procedure. Conventional approaches, which employ a single, slow-speed basket drive mode and require two users to operate, do not provide the physician with sufficient flexibility or ease of use. Thus, there is a need for more advanced basket drive modes that allow physicians to unilaterally control the basket (e.g., adjusting basket velocity and/or opening/closing the basket) for more dynamic basket operation, as well as the ability to control multiple instruments at the same time.


In many embodiments, the techniques and systems are discussed in the context of a minimally invasive procedure. However, it should be understood that the techniques and systems can be implemented in the context of any medical procedure including, for example, percutaneous operations where access is gained to a target location by making a puncture and/or a minor incision into the body to insert a medical instrument, non-invasive procedures, therapeutic procedures, diagnostic procedures, non-percutaneous procedures, or other types of procedures. An endoscopic procedure can include a bronchoscopy, a ureteroscopy, a gastroscopy, nephroscopy, nephrolithotomy, and so on. Further, in many embodiments, the techniques and systems are discussed as being implemented as robotically-assisted procedures. However, it should also be appreciated that the techniques and systems can be implemented in other procedures, such as in fully-robotic medical procedures.


For ease of illustration and discussion, the techniques and systems are discussed in the context of removing urinary stones, such as kidney stones from the kidneys. However, as noted above, the techniques and systems can be used to perform other procedures.


Medical System



FIG. 1 illustrates an example medical system 100 to perform or assist in performing medical procedures in accordance with one or more embodiments. Embodiments of the medical system 100 can be used for surgical and/or diagnostic procedures. The medical system 100 includes a robotic system 110 configured to engage with and/or control a medical instrument 120 to perform a procedure on a patient 130. The medical system 100 also includes a control system 140 configured to interface with the robotic system 110, provide information regarding the procedure, and/or perform a variety of other operations. For example, the control system 140 can include a display 142 to present a user interface 144 to assist the physician 160 in using the medical instrument 120. Further, the medical system 100 can include a table 150 configured to hold the patient 130 and/or an imaging sensor 180, such as a camera, x-ray, computed tomography (CT), magnetic resonance imaging (MRI), positron emission tomography (PET) device, or the like.


In some embodiments, the physician performs a minimally-invasive medical procedure, such as ureteroscopy. The physician 160 can interact with the control system 140 to control the robotic system 110 to navigate the medical instrument 120 (e.g., a basket retrieval device and/or scope) from the urethra up to the kidney 170 where the stone 165 is located. The control system 140 can provide information via a display 142 regarding the medical instrument 120 to assist the physician 160 in navigation, such as real-time images from the medical instrument 120 or the imaging sensor 180. Once at the site of the kidney stone, the medical instrument 120 can be used to break-up and/or capture a urinary stone 165.


In some implementations of using the medical system 100, a physician 160 can perform a percutaneous procedure. To illustrate, if the patient 130 has a kidney stone 165 in a kidney 170 that is too large to be removed through a urinary tract, the physician 160 can perform a procedure to remove the kidney stone through a percutaneous access point on the patient 130. For example, the physician 160 can interact with the control system 140 to control the robotic system 110 to navigate the medical instrument 120 (e.g., a scope) from the urethra up to the kidney 170 where the stone 165 is located. The control system 140 can provide information via a display 142 regarding the medical instrument 120 to assist the physician 160 in navigating the medical instrument 120, such as real-time images from the medical instrument 120 or the imaging sensor 180. Once at the site of the kidney stone, the medical instrument 120 can be used to designate a target location for a second medical instrument (not shown) to access the kidney percutaneously (e.g., a desired point to access the kidney). To minimize damage to the kidney, the physician 160 can designate a particular papilla as the target location for entering into the kidney with the second medical instrument. However, other target locations can be designated or determined. Once the second medical instrument has reached the target location, the physician 160 can use the second medical instrument and/or another medical instrument to extract the kidney stone from the patient 130, such as through the percutaneous access point. Although the above percutaneous procedure is discussed in the context of using the medical instrument 120, in some implementations a percutaneous procedure can be performed without the assistance of the medical instrument 120. Further, the medical system 100 can be used to perform a variety of other procedures.


In the example of FIG. 1, the medical instrument 120 is implemented as a basket retrieval device. Thus, for ease of discussion, the medical instrument 120 is also referred to as “the basket retrieval device 120.” However, the medical instrument 120 can be implemented as various types of medical instruments including, for example, a scope (sometimes referred to as an “endoscope”), a needle, a catheter, a guidewire, a lithotripter, forceps, a vacuum, a scalpel, a combination of the above, or the like. In some embodiments, a medical instrument is a steerable device, while in other embodiments a medical instrument is a non-steerable device. In some embodiments, a surgical tool refers to a device that is configured to puncture or to be inserted through the human anatomy, such as a needle, a scalpel, a guidewire, and so on. However, a surgical tool can refer to other types of medical instruments. In some embodiments, multiple medical instruments may be used. For example, an endoscope can be used with a basket retrieval device 120. In some embodiments, the medical instrument 120 may be a compound device incorporating several instruments, such as a vacuum, a basket retrieval device, a scope, or various combinations of instruments.


The robotic system 110 can be configured to at least partly facilitate a medical procedure. The robotic system 110 can be arranged in a variety of ways depending on the particular procedure. The robotic system 110 can include one or more robotic arms 112 (robotic arms 112(a), 112(b), 112(c)) to engage with and/or control the medical instrument 120 to perform a procedure. As shown, each robotic arm 112 can include multiple arm segments coupled to joints, which can provide multiple degrees of movement. In the example of FIG. 1, the robotic system 110 is positioned proximate to the patient's 130 lower torso and the robotic arms 112 are actuated to engage with and position the medical instrument 120 for access into an access point, such as the urethra of the patient 130. With the robotic system 110 properly positioned, the medical instrument 120 can be inserted into the patient 130 robotically using the robotic arms 112, manually by the physician 160, or a combination thereof.


The robotic system 110 can also include a base 114 coupled to the one or more robotic arms 112. The base 114 can include a variety of subsystems, such as control electronics, a power source, pneumatics, an optical source, an actuator (e.g., motors to move the robotic arm), control circuitry, memory, and/or a communication interface. In some embodiments, the base 114 includes an input/output (I/O) device 116 configured to receive input, such as user input to control the robotic system 110, and provide output, such as patient status, medical instrument location, or the like. The I/O device 116 can include a controller, a mouse, a keyboard, a microphone, a touchpad, other input devices, or combinations of the above. The I/O device can include an output component, such as a speaker, a display, a haptic feedback device, other output devices, or combinations of the above. In some embodiments, the robotic system 110 is movable (e.g., the base 114 includes wheels) so that the robotic system 110 can be positioned in a location that is appropriate or desired for a procedure. In other embodiments, the robotic system 110 is a stationary system. Further, in some embodiments, the robotic system 110 is integrated into the table 150.


The robotic system 110 can be coupled to any component of the medical system 100, such as the control system 140, the table 150, the imaging sensor 180, and/or the medical instruments 120. In some embodiments, the robotic system is communicatively coupled to the control system 140. In one example, the robotic system 110 can receive a control signal from the control system 140 to perform an operation, such as to position a robotic arm 112 in a particular manner, manipulate a scope, and so on. In response, the robotic system 110 can control a component of the robotic system 110 to perform the operation. In another example, the robotic system 110 can receive an image from the scope depicting internal anatomy of the patient 130 and/or send the image to the control system 140 (which can then be displayed on the control system 140). Further, in some embodiments, the robotic system 110 is coupled to a component of the medical system 100, such as the control system 140, to receive data signals, power, and so on. Other devices, such as other medical instruments, intravenous bags, blood packs or the like can also be coupled to the robotic system 110 or other components of the medical system 100 depending on the medical procedure being performed.


The control system 140 can be configured to provide various functionality to assist in performing a medical procedure. In some embodiments, the control system 140 can be coupled to the robotic system 110 and operate in cooperation with the robotic system 110 to perform a medical procedure on the patient 130. For example, the control system 140 can communicate with the robotic system 110 via a wireless or wired connection (e.g., to control the robotic system 110, the basket retrieval device 120, receive an image(s) captured by a scope, etc.), control the flow of fluids through the robotic system 110 via one or more fluid channels, provide power to the robotic system 110 via one or more electrical connections, provide optical signals to the robotic system 110 via one or more optical fibers or other components, and so on. Further, in some embodiments, the control system 140 can communicate with a scope to receive sensor data. Moreover, in some embodiments, the control system 140 can communicate with the table 150 to position the table 150 in a particular orientation or otherwise control the table 150.


As shown in FIG. 1, the control system 140 includes various I/O devices configured to assist the physician 160 or others in performing a medical procedure. In some embodiments, the control system 140 includes an input device 146 that is employed by the physician 160 or another user to control the basket retrieval device 120. For example, the input device 146 can be used to navigate the basket retrieval device 120 within the patient 130. The physician 160 can provide input via the input device 146 and, in response, the control system 140 can send control signals to the robotic system 110 to manipulate the medical instrument 120.


Although the input device 146 is illustrated as a controller in the example of FIG. 1, the input device 146 can be implemented as a variety of types of I/O devices, such as a touchscreen/pad, a mouse, a keyboard, a microphone, a smart speaker, etc. As also shown in FIG. 1, the control system 140 can include the display 142 to provide various information regarding a procedure. For example, the control system 140 can receive real-time images that are captured by a scope and display the real-time images via the display 142. Additionally or alternatively, the control system 140 can receive signals (e.g., analog, digital, electrical, acoustic/sonic, pneumatic, tactile, hydraulic, etc.) from a medical monitor and/or a sensor associated with the patient 130, and the display 142 can present information regarding the health of the patient 130 and/or an environment of the patient 130. Such information can include information that is displayed via a medical monitor including, for example, a heart rate (e.g., electrocardiogram (ECG), heart rate variability (HRV), etc.), blood pressure/rate, muscle bio-signals (e.g., electromyography (EMG)), body temperature, oxygen saturation (e.g., SpO2), carbon dioxide (CO2), brainwave (e.g., electroencephalogram (EEG)), environmental temperature, and so on.


In some embodiments, the input device 146 is configured to directly control movement of the basket retrieval device 120, as well as trigger pre-programmed motions. In one embodiment, direct control involves movement that continues as long as an active input is provided by the user, for example, by pushing up or down on a joystick or actuating a button. Direct control can include movement along one or more axes, such as inserting/retracting, rotating clockwise/counterclockwise, moving left/right and/or moving up/down. Pre-programmed motions can include rapid open, rapid close, jiggle, or other pre-defined movements that are triggered by a command but do not require continuing input from the user. By using pre-programmed motions, operation of the basket retrieval device is simplified as complex movements can be initiated by simplified commands. For example, rather than requiring coordinated actions between the physician 160 and an assistant to close the basket over a stone, a rapid close action can be triggered by a single user using a simplified command (e.g., pressing or double-tapping a button).
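
For illustration only, the division between directly controlled movement and pre-programmed motions described above can be sketched as follows. The event and driver types, button names, and speed values are assumptions made for the sketch and are not part of the disclosed system.

```python
from dataclasses import dataclass

# Hypothetical input-event and driver types, used only for illustration.
@dataclass
class InputEvent:
    kind: str           # "joystick" or "double_tap"
    button: str = ""    # e.g., "R2", "L2"
    value: float = 0.0  # joystick deflection, -1.0 .. 1.0

class BasketDriver:
    NORMAL_SPEED = 1.0  # arbitrary units

    def move(self, velocity: float) -> None:
        # Direct control: motion continues only while input is active.
        print(f"direct move at velocity {velocity:+.2f}")

    def run_preprogrammed(self, name: str) -> None:
        # Pre-programmed motion: runs to completion once triggered.
        print(f"run pre-programmed motion: {name}")

def handle_input(event: InputEvent, driver: BasketDriver) -> None:
    """Route a controller event to direct control or a pre-programmed motion."""
    if event.kind == "joystick":
        driver.move(event.value * driver.NORMAL_SPEED)
    elif event.kind == "double_tap" and event.button == "R2":
        driver.run_preprogrammed("rapid_open")
    elif event.kind == "double_tap" and event.button == "L2":
        driver.run_preprogrammed("rapid_close")

# Example: a joystick push produces direct movement; a double tap triggers rapid close.
handle_input(InputEvent(kind="joystick", value=0.5), BasketDriver())
handle_input(InputEvent(kind="double_tap", button="L2"), BasketDriver())
```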


In contrast to a regular speed opening of the basket, rapid open opens the basket at a faster speed. In some scenarios, rapid open can be used by the user to quickly open the basket to prepare for stone capture and can also be used to release the stone. In an embodiment, regular open is directly controlled by the user. For example, the drive mechanism of the basket may open the basket as long as a button is being pressed, but stops when the button is released or a torque threshold level is reached, which usually indicates the basket is fully open. This provides finer control over the basket mechanism. Meanwhile, in certain embodiments, rapid open is pre-programmed to complete a series of actions when triggered, with the drive mechanism engaging the basket to open until the threshold torque level is reached or a new command (e.g., a button press) is received. In combination, regular open and rapid open can provide the user with greater control and flexibility during a medical procedure, with regular open being used when finer control is needed and rapid open being used when speed and/or timing is more important.


In contrast to a regular speed closing of the basket, rapid close closes the basket at a faster speed. In some scenarios, rapid close can be used by the user to grasp the stone quickly and also to close the basket quickly when it is not being used. In an embodiment, regular close is directly controlled by the user. For example, the drive mechanism of the basket may close the basket as long as a button is being pressed, but stops when the button is released or a torque threshold level is reached, which usually indicates the basket is fully closed or a stone is captured. This provides finer control over the basket mechanism. Meanwhile, in certain embodiments, rapid close is pre-programmed to complete a series of actions when triggered, with the drive mechanism engaging the basket to close until the threshold torque level is reached or a new command (e.g., a button press) is received. In combination, regular close and rapid close can provide the user with greater control and flexibility during a medical procedure, with regular close being used when finer control is needed and rapid close being used when speed and/or timing is more important.
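
As a rough illustration of the rapid open/close behavior described above, the following sketch runs the drive until a torque threshold is reached or a new command arrives. The simulated drive, speed multiple, and threshold value are assumptions that stand in for the actual drive mechanism; this is not the disclosed implementation.

```python
import time

RAPID_SPEED = 2.0    # assumed multiple of the regular drive speed
TORQUE_LIMIT = 0.8   # assumed threshold indicating a fully open/closed basket or a captured stone

class SimulatedDrive:
    """Stand-in for the basket drive mechanism, for illustration only."""
    def __init__(self):
        self.position = 0.0  # 0.0 = fully closed, 1.0 = fully open

    def read_torque(self) -> float:
        # Crude toy model: torque rises toward either end of the basket's travel.
        return abs(self.position)

    def pending_command(self) -> bool:
        return False  # no interrupting button press in this simulation

    def step(self, velocity: float, dt: float = 0.01) -> None:
        self.position += velocity * dt

    def stop(self) -> None:
        print(f"stopped at basket position {self.position:.2f}")

def rapid_drive(drive: SimulatedDrive, direction: int) -> None:
    """Pre-programmed motion: direction = +1 for rapid open, -1 for rapid close."""
    while drive.read_torque() < TORQUE_LIMIT and not drive.pending_command():
        drive.step(direction * RAPID_SPEED)
        time.sleep(0.001)  # control-loop period (assumed)
    drive.stop()

rapid_drive(SimulatedDrive(), direction=+1)  # rapid open until the torque limit is reached
```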


In one embodiment, a jiggle motion of the basket retrieval device can be triggered by holding two buttons on the input device 146 at the same time. Other embodiments can trigger the jiggle motion with other button presses, touch screen selections, voice commands, and/or other user inputs. In some embodiments, the jiggle motion is a pre-programmed motion where the basket retrieval device inserts forward and retracts backward for a small fixed amount of basket travel at a higher speed compared to a normal basket insertion speed. For example, during direct control, the basket retrieval device can move (e.g., insert/retract) at a normal speed (1× speed), while during a pre-programmed motion, the basket retrieval device can move at an accelerated speed (e.g., 1.5×, 2×, 3×, or the like). This high frequency dynamic movement can be used to shake off tissue that attaches to the basket, shake off the stone during stone release, and/or can be used to break up stone congestion.


In some embodiments, the jiggle motion includes a variable movement. For example, the user can use a direct-control movement to move the basket retrieval device from a first location where it is performing a jiggle motion to a second location to continue performing the jiggle motion. In this scenario, the pre-programmed motion is combined with a direct-control movement. The variable jiggle motion can be used to adjust the stone's position, for example, when the stone is stuck. In one embodiment, the variable jiggle motion is triggered by holding down a first button and a second button at the same time, while a joystick is moved to provide a direction of movement (e.g., insert/retract). While the user presses the first and second buttons, the basket jiggles back and forth for a fixed amount. If the user then uses the joystick to insert or retract while holding both buttons, the basket can move to a new position. When the user lets go of the insertion joystick, the basket will go back to the jiggle mode with the locus of its movement moved to the new position indicated by the user. With variable jiggle mode, the user can first jiggle to loosen or rotate the stone, then insert or retract to adjust the jiggle location based on, for example, visual feedback received from a scope or the imaging sensor 180, then continue the jiggle motion until the stone is repositioned to the desired location.
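
The variable jiggle behavior described above can be illustrated with a short sketch in which the basket oscillates a fixed, short distance around a central locus, and joystick input shifts that locus while the jiggle continues. The amplitude, speed, and step values are assumptions chosen only to make the example concrete.

```python
JIGGLE_AMPLITUDE = 0.5  # assumed short travel around the locus (arbitrary units)
LOCUS_STEP = 0.1        # assumed locus shift per unit of joystick input

def jiggle_step(locus: float, phase: int, joystick: float = 0.0):
    """One iteration of the jiggle loop.

    locus    -- current center of the back-and-forth movement
    phase    -- +1 moving forward, -1 moving backward
    joystick -- direct-control input (insert/retract) that shifts the locus
    Returns (commanded position, updated locus, next phase).
    """
    locus += joystick * LOCUS_STEP             # direct control moves the center point
    target = locus + phase * JIGGLE_AMPLITUDE  # oscillate around the (possibly new) center
    return target, locus, -phase

# Example: jiggle in place, insert slightly while jiggling, then keep jiggling at the new locus.
locus, phase = 10.0, +1
for joystick in (0.0, 0.0, 1.0, 1.0, 0.0, 0.0):
    target, locus, phase = jiggle_step(locus, phase, joystick)
    print(f"locus={locus:.1f}  commanded position={target:.1f}")
```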



FIG. 1 also shows various anatomy of the patient 130 relevant to certain aspects of the present disclosure. In particular, the patient 130 includes kidneys 170 fluidly connected to a bladder 171 via ureters 172, and a urethra 173 fluidly connected to the bladder 171. As shown in the enlarged depiction of the kidney 170, the kidney includes calyxes 174 (e.g., major and minor calyxes), renal papillae (including the renal papilla 176, also referred to as “the papilla 176”), and renal pyramids (including the renal pyramid 178). In these examples, a kidney stone 165 is located in proximity to the papilla 176. However, the kidney stone can be located at other locations within the kidney 170.


As shown in FIG. 1, to remove the kidney stone 165 in the example minimally-invasive procedure, the physician 160 can position the robotic system 110 at the foot of the table 150 to initiate delivery of the medical instrument 120 into the patient 130. In particular, the robotic system 110 can be positioned within proximity to a lower abdominal region of the patient 130 and aligned for direct linear access to the urethra 173 of the patient 130. From the foot of the table 150, the robotic arm 112(B) can be controlled to provide access to the urethra 173. In this example, the physician 160 inserts a medical instrument 120 at least partially into the urethra along this direct linear access path (sometimes referred to as “a virtual rail”). The medical instrument 120 can include a lumen configured to receive the scope and/or basket retrieval device, thereby assisting in insertion of those devices into the anatomy of the patient 130.


Once the robotic system 110 is properly positioned and/or the medical instrument 120 is inserted at least partially into the urethra 173, the scope can be inserted into the patient 130 robotically, manually, or a combination thereof. For example, the physician 160 can connect the medical instrument 120 to the robotic arm 112(C). The physician 160 can then interact with the control system 140, such as the input device 146, to navigate the medical instrument 120 within the patient 130. For example, the physician 160 can provide input via the input device 146 to control the robotic arm 112(C) to navigate the basket retrieval device 120 through the urethra 173, the bladder 171, the ureter 172, and up to the kidney 170.


The control system 140 can include various components (sometimes referred to as “subsystems”) to facilitate its functionality. For example, the control system 140 can include a variety of subsystems, such as control electronics, a power source, pneumatics, an optical source, an actuator, control circuitry, memory, and/or a communication interface. In some embodiments, the control system 140 includes a computer-based control system that stores executable instructions, that when executed, implement various operations. In some embodiments, the control system 140 is movable, such as that shown in FIG. 1, while in other embodiments, the control system 140 is a stationary system. Although various functionality and components are discussed as being implemented by the control system 140, any of this functionality and/or components can be integrated into and/or performed by other systems and/or devices, such as the robotic system 110 and/or the table 150.


The medical system 100 can provide a variety of benefits, such as providing guidance to assist a physician in performing a procedure (e.g., instrument tracking, patient status, etc.), enabling a physician to perform a procedure from an ergonomic position without the need for awkward arm motions and/or positions, enabling a single physician to perform a procedure with one or more medical instruments, avoiding radiation exposure (e.g., associated with fluoroscopy techniques), enabling a procedure to be performed in a single-operative setting, providing continuous suction to remove an object more efficiently (e.g., to remove a kidney stone), and so on. Further, the medical system 100 can provide non-radiation-based navigational and/or localization techniques to reduce physician exposure to radiation and/or reduce the amount of equipment in an operating room. Moreover, the medical system 100 can divide functionality into the control system 140 and the robotic system 110, each of which can be independently movable. Such division of functionality and/or movability can enable the control system 140 and/or the robotic system 110 to be placed at locations that are optimal for a particular medical procedure, which can maximize working area around the patient, and/or provide an optimized location for a physician to perform a procedure. For example, many aspects of the procedure can be performed by the robotic system 110 (which is positioned relatively close to the patient) while the physician manages the procedure from the comfort of the control system 140 (which can be positioned farther away).


In some embodiments, the control system 140 can function even if located in a different geographic location from the robotic system 110. For example, in a tele-health implementation, the control system 140 is configured to communicate over a wide area network with the robotic system 110. In one scenario, a physician 160 may be located in one hospital with the control system 140 while the robotic system 110 is located in a different hospital. The physician may then perform the medical procedure remotely. This can be beneficial where remote hospitals, such as those in rural areas, have limited expertise in particular procedures. Those hospitals can then rely on more experienced physicians in other locations. In some embodiments, a control system 140 is able to pair with a variety of robotic systems 110, for example, by selecting a specific robotic system and forming a secure network connection (e.g., using passwords, encryption, authentication tokens, etc.). Thus, a physician in one location may be able to perform medical procedures in a variety of different locations by setting up a connection with robotic systems 110 located at each of those different locations.
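
As a loose sketch of the tele-health pairing described above, the snippet below sends a command to a remote robotic system over an encrypted connection and signs it with a shared pairing token. The host name, port, message format, and token handling are assumptions; the disclosure does not specify a particular protocol.

```python
import hashlib
import hmac
import json
import socket
import ssl

PAIRING_TOKEN = b"example-shared-secret"  # assumed to be provisioned when the systems are paired

def send_command(host: str, port: int, command: dict) -> None:
    """Send a signed command to a remote robotic system over TLS."""
    payload = json.dumps(command).encode()
    signature = hmac.new(PAIRING_TOKEN, payload, hashlib.sha256).hexdigest()
    message = json.dumps({"payload": payload.decode(), "signature": signature}).encode()
    context = ssl.create_default_context()  # encrypt traffic over the wide area network
    with socket.create_connection((host, port)) as raw_sock:
        with context.wrap_socket(raw_sock, server_hostname=host) as tls_sock:
            tls_sock.sendall(message)

# Hypothetical usage against a remote robotic system endpoint:
# send_command("robotic-system.remote-hospital.example", 8443, {"action": "rapid_open"})
```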


In some embodiments, the robotic system 110, the table 150, the medical instrument 120, the needle and/or the imaging sensor 180 are communicatively coupled to each other over a network, which can include a wireless and/or wired network. Example networks include one or more personal area networks (PANs), one or more local area networks (LANs), one or more wide area networks (WANs), one or more Internet area networks (IANs), one or more cellular networks, the Internet, etc. Further, in some embodiments, the control system 140, the robotic system 110, the table 150, the medical instrument 120, and/or the imaging sensor 180 are connected for communication, fluid/gas exchange, power exchange, and so on via one or more support cables.


Although not illustrated in FIG. 1, in some embodiments the medical system 100 includes and/or is associated with a medical monitor configured to monitor health of the patient 130 and/or an environment in which the patient 130 is located. For example, a medical monitor can be located in the same environment where the medical system 100 is located, such as within an operating room. The medical monitor can be physically and/or electrically coupled to one or more sensors that are configured to detect or determine one or more physical, physiological, chemical, and/or biological signals, parameters, properties, states and/or conditions associated with the patient 130 and/or the environment. For example, the one or more sensors can be configured to determine/detect any type of physical properties, including temperature, pressure, vibration, haptic/tactile features, sound, optical levels or characteristics, load or weight, flow rate (e.g., of target gases and/or liquid), amplitude, phase, and/or orientation of magnetic and electronic fields, constituent concentrations relating to substances in gaseous, liquid, or solid form, and/or the like. The one or more sensors can provide the sensor data to the medical monitor and the medical monitor can present information regarding the health of the patient 130 and/or the environment of the patient 130. Such information can include information that is displayed via a medical monitor including, for example, a heart rate (e.g., ECG, HRV, etc.), blood pressure/rate, muscle bio-signals (e.g., EMG), body temperature, oxygen saturation (e.g., SpO2), CO2, brainwave (e.g., EEG), environmental temperature, and so on. In some embodiments, the medical monitor and/or the one or more sensors are coupled to the control system 140 and the control system 140 is configured to provide information regarding the health of the patient 130 and/or the environment of the patient 130.


Example Controller



FIG. 2A and FIG. 2B illustrate a perspective view and a top profile view of a controller 200 for the control system 140, respectively, according to certain embodiments. As described in FIG. 1, the input device 146, in some embodiments, is a controller 200 or includes a controller 200. The face of the controller can include axis movement inputs, such as one or more joysticks 205, 215 and one or more directional pads 210. In some embodiments, the joysticks 205, 215 provide analog input while the directional pad 210 provides digital input. The face of the controller can further include a plurality of buttons 220 to provide additional controls. In the example illustrated in FIG. 2B, the controller 200 includes four buttons on the top side of the controller: R1 225, R2 230, L1 235, and L2 240. Other embodiments can include a different number of buttons and/or a different layout. In some embodiments, the controller 200 may be a game console controller repurposed to work with the robotic system 110. For example, the controller game firmware may be overwritten with a medical device firmware and/or an input device manager may be installed in a component of the medical system 100 (e.g., the control system 140) to convert inputs from the controller into inputs understandable by the robotic system 110.


In one embodiment, rapid open can be triggered by double tapping the pendant top lower right button (R2 230) and rapid close can be triggered by double tapping the pendant top lower left button (L2 240). Users can double tap one button to rapidly open the basket and double tap the other to rapidly close it. Double tap operation provides the user with easy access to the pre-programmed commands using the top two buttons (R2, L2). Meanwhile, the other inputs on the controller can be used for other functions, including controlling other medical instruments, such as inserting the scope, articulating the scope, and/or inserting the basket. These other functions can be triggered at the same time as or independently of the pre-programmed motions. As will be apparent, other embodiments can configure the controller in different ways. For example, rapid open/rapid close can be triggered using other buttons and/or other interactions (e.g., using single tap, double tap, holding down a button, etc.). In one embodiment, the button mappings are switched, with rapid open triggered by the L2 button and rapid close triggered by the R2 button.


In one embodiment, a jiggle motion of the basket retrieval device can be triggered by holding both the top lower right button (R2 230) and the top lower left button (L2 240) at the same time. By requiring both buttons to be pressed, the buttons used for rapid open and rapid close (R2/L2) can be dual-purpose, allowing more commands to be mapped to the controller 200 inputs. In one embodiment, the R2 and L2 buttons are triple-purpose, with single taps of the R2 and L2 buttons, respectively, each triggering another action. For example, single tapping R2 may initiate regular speed open of the basket while single tapping L2 may initiate regular speed close of the basket, or vice versa. Other embodiments can trigger the jiggle motion with other button presses.
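
To make the dual- and triple-purpose button handling described above more concrete, the sketch below maps a single tap to regular open/close, a double tap within a short window to rapid open/close, and both buttons held together to the jiggle motion. The timing window and action names are assumptions made for the sketch, not disclosed values.

```python
DOUBLE_TAP_WINDOW = 0.4  # assumed maximum seconds between taps to count as a double tap

class ButtonMapper:
    """Maps R2/L2 presses to regular, rapid, or jiggle actions (illustrative only)."""
    def __init__(self):
        self.last_tap = {}   # button name -> time of the previous tap
        self.held = set()    # buttons currently held down

    def press(self, button: str, now: float):
        self.held.add(button)
        if {"R2", "L2"} <= self.held:          # both buttons held: jiggle motion
            return "jiggle"
        previous = self.last_tap.get(button, float("-inf"))
        self.last_tap[button] = now
        if now - previous <= DOUBLE_TAP_WINDOW:
            return {"R2": "rapid_open", "L2": "rapid_close"}.get(button)
        return {"R2": "regular_open", "L2": "regular_close"}.get(button)

    def release(self, button: str) -> None:
        self.held.discard(button)

# Example: single tap, double tap, then both buttons held.
mapper = ButtonMapper()
print(mapper.press("R2", now=0.0))   # regular_open
mapper.release("R2")
print(mapper.press("R2", now=0.3))   # rapid_open (second tap within the window)
mapper.release("R2")
print(mapper.press("L2", now=1.0))   # regular_close
print(mapper.press("R2", now=1.1))   # jiggle (both buttons held)
```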


As described in FIG. 1, the jiggle motion can include a variable movement. In one embodiment, the first joystick 205 may be configured to directly control insertion and retraction movement of the basket retrieval device 120. While holding down R2 and L2 to trigger the jiggle motion, the user can move the joystick 205 up to further insert the basket retrieval device 120 to a new location in the patient's body. Alternatively, the user can move the joystick 205 down to retract the basket retrieval device 120 to a new location towards the entry point into the body. Once the user releases the joystick 205 (while still holding down R2 and L2), the jiggle motion can continue with the locus at the new location. As will be apparent, insertion and retraction can be mapped to other controller inputs, such as the second joystick 215 or the directional pad 210.


Controller 200 operation can be customizable in some embodiments. The control system 140 can include a user interface that allows the user to assign pre-programmed motions (e.g., rapid open/close, jiggle, etc.) to a desired controller layout. For example, the user can assign rapid open or rapid close to any of the top buttons 220, the directional pad 210 or one of the joysticks 205, 215.


In some embodiments, triggering the pre-programmed motions can be at least partially automated. Machine learning or computer vision algorithms can be implemented to recognize when the basket retrieval device 120 is in the right position to perform a pre-programmed motion. For example, the medical system 100, using its imaging sensor 180, scope, or other sensing device, can recognize that the basket is sufficiently near to the urinary stone 165 that rapid open can be triggered to capture the stone. Once a threshold distance is reached, rapid open can be triggered to rapidly open the basket. Additional pre-programmed motions can also be chained together with the rapid open. For example, after opening the basket, a pre-programmed motion to further insert the basket retrieval device 120 such that the urinary stone is surrounded by the basket (“forward insertion” pre-programmed motion) can be triggered. Next, a rapid close could then be automatically triggered to capture the urinary stone 165.


As a safety precaution, pre-programmed motions can be configured to only automatically trigger in a specific mode (“auto-capture” mode) where automated motions are allowed. This auto-capture mode could be enabled using one of the buttons or other inputs on the controller 200 or via a menu setting in the control system 140 interface. In one scenario, the physician 160 moves the basket retrieval device 120 to the correct location near the urinary stone 165 using directly-controlled movement. The physician can then enable the auto-capture mode. If the stone is within the distance threshold, the auto-capture pre-programmed motions (e.g., rapid open, forward insertion, and/or rapid close) are automatically triggered. If the distance is greater than the distance threshold, the physician can further adjust the location of the basket retrieval device 120 until the auto-capture pre-programmed motions are triggered.
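
The auto-capture sequencing described above can be illustrated with a short sketch that chains the pre-programmed motions only when the mode is enabled and the estimated basket-to-stone distance is within a threshold. The distance source, threshold value, and motion names are assumptions used only for illustration.

```python
CAPTURE_DISTANCE_MM = 2.0  # assumed threshold distance for triggering auto-capture

def auto_capture(distance_to_stone_mm: float, auto_capture_enabled: bool, run_motion) -> bool:
    """Run the rapid-open / forward-insertion / rapid-close chain if allowed.

    run_motion -- callable that executes a named pre-programmed motion.
    Returns True if the capture sequence was triggered.
    """
    if not auto_capture_enabled:                    # safety: never trigger outside auto-capture mode
        return False
    if distance_to_stone_mm > CAPTURE_DISTANCE_MM:  # too far: the physician keeps adjusting position
        return False
    for motion in ("rapid_open", "forward_insertion", "rapid_close"):
        run_motion(motion)
    return True

# Example with a stand-in motion runner that just prints each motion name.
triggered = auto_capture(1.4, auto_capture_enabled=True, run_motion=print)
print("capture sequence triggered:", triggered)
```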


While FIGS. 2A-2B have illustrated one embodiment of a controller 200, other types of controllers or other input devices can also be used with the control system 140. For example, the input device 146 for the control system 140 can be wireless (e.g., Wi-Fi, Bluetooth, etc.) or wired (e.g., universal serial bus (USB)). In another example, the input device 146 can be a graphical user interface (GUI) implemented on a touch screen device, such as a tablet or smart phone. In one example, the controller can be a smart speaker with a built-in microphone that accepts voice commands. In another example, the input device 146 may include controllers for a virtual reality or augmented reality system.


Urinary Stone Capture



FIGS. 3A-3C illustrate a urinary stone capture procedure, according to certain embodiments. In these examples, the medical system 100 is arranged in an operating room to remove a kidney stone from the patient 130. In many instances of such a procedure, the patient 130 is positioned in a modified supine position with the patient 130 slightly tilted to the side to access the posterior or side of the patient 130. The urinary stone capture procedure may also be performed with the patient in a regular supine position, as shown in FIG. 1. Although FIGS. 3A-3C illustrate use of the medical system 100 to perform a minimally-invasive procedure to remove a kidney stone from the patient 130, the medical system 100 can be used to remove a kidney stone in other manners and/or to perform other procedures. Further, the patient 130 can be arranged in other positions as desired for a procedure. Various acts are described in FIGS. 3A-3C and throughout this disclosure as being performed by the physician 160. It should be understood that these acts can be performed directly by the physician 160, indirectly by the physician with the aid of the medical system 100, by a user under direction of the physician, by another user (e.g., a technician), and/or any other user.


Although particular robotic arms of the robotic system 110 are illustrated as performing particular functions in the context of FIGS. 3A-3C, any of the robotic arms 112 can be used to perform the functions. Further, any additional robotic arms and/or systems can be used to perform the procedure. Moreover, the robotic system 110 can be used to perform other parts of the procedure.


At block 305, the basket retrieval device 120 is maneuvered into the kidney 170 to approach the urinary stone 165. In some scenarios, the physician 160 or other user uses the input device 146 to directly control movement of the basket retrieval device 120. Such directly controlled movement can include insertion/retraction, flexing the basket retrieval device 120 left or right, rotation, and/or regular open/close of the basket. Using various movements, the basket retrieval device 120 is placed close to the stone.


At block 310, the rapid open pre-programmed movement is triggered in response to a user input (e.g., double-tapping a button). Rapid open causes the basket of the basket retrieval device 120 to open at an accelerated rate. Using rapid open allows the basket retrieval device 120 to more quickly get into position to capture the urinary stone 165, reducing the chances that extraneous movement (e.g., by the basket retrieval device or due to respiration/circulatory/urine flow movements within the kidney) moves the basket retrieval device 120 out of position. The rapid open movement can be triggered by double tapping a controller 200 button, using different inputs on the controller, or by using another type of input device, such as a voice command.


In some embodiments, a laser, shock wave device, or other device is used to break up the stone. The laser or other device may be incorporated into the basket retrieval device 120 or may be a separate medical instrument. The device for breaking up the stone may also be controlled by the same input device (e.g., controller 200) as for triggering rapid open, rapid close, and/or the jiggle motion. In some situations, the stone 165 is small enough that breaking up the stone into smaller pieces is not needed. In those cases, block 315 can be skipped and the process can proceed to block 320.


Optionally, at block 315, the pre-programmed jiggle motion is triggered to aid in clearing stone congestion or otherwise move the stone(s). For example, if the urinary stone 165 is broken into smaller pieces as discussed above, the jiggle motion can be used to separate the pieces from one another. The jiggle motion can be a pre-programmed motion where the basket retrieval device inserts forward and retracts backward for a small fixed amount of basket travel at a higher speed compared to a normal basket insertion speed. The pre-programmed jiggle motion can be triggered by pressing two buttons of the controller 200 simultaneously, using different inputs on the controller, or by using another type of input device, such as a voice command.


At block 320, the open basket is maneuvered to surround the urinary stone 165 or a smaller piece of the urinary stone. In some scenarios, maneuvering is accomplished by directly-controlled movement by the physician 160 of the basket retrieval device 120. In some embodiments, a forward insertion pre-programmed motion can be used to surround the stone with the basket. For example, if the basket is formed by wires, the forward insertion movement can include a forward movement with, optionally, a slight sideways offset to one side to avoid hitting the stone with the wires forming the distal end of the basket. Once the stone has passed the distal end, an optional sideways motion opposite the first sideways offset motion can be applied to the basket to center the basket around the stone. In other embodiments, such as with a basket formed by a plurality of tines, insertion of the basket is coordinated with closing the tines in order to center the basket longitudinally around the stone and avoid moving the stone during basket closure.


The forward insertion movement can be triggered by using inputs on the controller 200 or by using another type of input device, such as a voice command. While the above has described a pre-programmed forward insertion movement, this movement can also be accomplished by the user using directly controlled movement to surround the urinary stone.
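One way to picture the forward insertion movement is as a short sequence of relative moves in a device-centric frame, sketched below. The millimeter values and the move_relative() call are assumptions made for illustration, not parameters from the disclosure.

```python
# Relative moves (insertion, lateral, vertical) in millimeters; values are illustrative only.
FORWARD_INSERTION_DELTAS_MM = [
    (0.0, 1.0, 0.0),    # slight sideways offset so the distal wires clear the stone
    (5.0, 0.0, 0.0),    # advance until the stone has passed the distal end of the basket
    (0.0, -1.0, 0.0),   # opposite sideways move to re-center the basket around the stone
]


def run_forward_insertion(manipulator):
    """Play back the assumed waypoint sequence through a hypothetical relative-move API."""
    for d_insert, d_lateral, d_vertical in FORWARD_INSERTION_DELTAS_MM:
        manipulator.move_relative(d_insert, d_lateral, d_vertical)
```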


At block 325, the rapid close pre-programmed movement is triggered in response to a user input (e.g., double-tapping a button). Rapid close causes the basket of the basket retrieval device 120 to close at an accelerated rate. The closing motion can continue until the basket fully closes and/or a threshold torque is reached. When the drive mechanism of the basket reaches the torque threshold, that can indicate that the basket has closed over the stone, which is preventing further closing of the basket. Limiting the torque can protect the basket from damage due to increased stress/force applied during closure. Using rapid close allows the basket retrieval device 120 to more quickly capture the urinary stone 165, reducing the chances that extraneous movement (e.g., by the patient or the basket retrieval device 120) moves the basket retrieval device 120 out of position. The rapid close movement can be triggered by double tapping a controller 200 button, using different inputs on the controller, or by using another type of input device, such as a voice command. Optionally, the jiggle motion can be triggered to help adjust the stone 165 position for easier withdrawal of the stone from the kidney.


At block 330, the basket retrieval device 120 is withdrawn from the kidney 170 and then out of the patient's body. The rapid open movement can optionally be triggered once the basket retrieval device 120 is outside the patient's body in order to quickly release the captured stone.


If additional stones (or large pieces of a broken-up stone 165) exist, the basket retrieval device 120 may be reinserted into the patient to capture the remaining large pieces. In some embodiments, a vacuum instrument can be used to facilitate removal of the pieces. In some situations, the stone pieces may be sufficiently small that they can be passed by the patient naturally.


Example Basket Retrieval Device



FIG. 4A illustrates a basket retrieval device 120, according to certain embodiments. The basket retrieval device 120 can include a basket 405 formed on the distal side, a handle 410 on the proximal side, a sheath 415 between the basket and the handle, and a basket drive mechanism 420. The basket can be formed in a variety of ways to capture urinary stones. In some embodiments, the basket is formed by two or more wire loops 425 that expand to form a space into which a stone is maneuvered and contract around the stone to capture it. As shown in FIG. 4B, the wires can be configured to form various shapes, such as a bulb, teardrop, helical, bowl shape or the like. In other embodiments, the basket is formed by two substantially oval or round bowls with recesses facing each other to form a hollow area for the urinary stone. In some embodiments, the basket is formed by a plurality of tines configured to close around a stone. The basket can be made from a variety of materials, such as nitinol, nickel, titanium, steel, cobalt-chrome alloy, other types of metals, ceramics, polymers such as plastic, or combinations of the same.


The handle 410 of the basket retrieval device 120 can be operated by a user or a robot. In some embodiments, the basket drive mechanism 420 is built into the handle. Engaging the drive, for example, with a sliding or twisting motion, can cause the basket to open or close. In one embodiment, engaging the drive to the open position causes basket wires to extend out of the sheath into an open basket position. Engaging the drive to the closed position can cause the basket wires to retract towards the sheath, collapsing the basket. If a urinary stone is inside the basket, the stone is captured by the closing of the basket wires.


A scope (not shown), which may be part of the basket retrieval device 120 or used in conjunction with it, can be configured to navigate within the human anatomy, such as within a natural orifice or lumen of the human anatomy. For example, the scope can include a channel through which the distal portion of the basket retrieval device 120 can be inserted. A scope can include, for example, a ureteroscope (e.g., for accessing the urinary tract), a laparoscope, a nephroscope (e.g., for accessing the kidneys), a bronchoscope (e.g., for accessing an airway, such as the bronchus), a colonoscope (e.g., for accessing the colon), an arthroscope (e.g., for accessing a joint), a cystoscope (e.g., for accessing the bladder), and so on. A scope can also be articulable, such as a distal portion of the scope, so that the scope can be steered within the human anatomy. A scope can include telescoping parts, such as an inner leader portion and an outer sheath portion, which can be manipulated to telescopically extend the scope. In some embodiments, a scope includes a working channel for deploying medical instruments (e.g., lithotripters, basketing devices, forceps, etc.), irrigation, and/or aspiration to an operative region at a distal end of the scope. A scope can accommodate wires and/or optical fibers to transfer signals to/from an optical assembly and a distal end of the scope, which can include an imaging device, such as an optical camera. A scope can also accommodate optical fibers to carry light from proximally-located light sources, such as light-emitting diodes, to the distal end of the scope. The distal end of a scope can also include an opening for a working channel to deliver tools, irrigation, and/or aspiration to an operative site. The distal end of a scope can also include a port for an imaging device, such as a camera, that can be configured to capture images of an internal anatomical space. The distal end of a scope can include ports for light sources to illuminate an anatomical space when using an imaging device. In some embodiments, the scope is configured to be controlled by the robotic system 110. The scope can include components to engage with the robotic system 110.


Example Pre-Programmed Movements



FIG. 5 is a flow diagram of a pre-programmed rapid open process 500, according to certain embodiments. The rapid open process 500 can be performed by the robotic system 110 or by another component of the medical system 100 of FIG. 1. For example, the robotic system 110 may be manipulating a basket retrieval device 120 with one or more of its arms. The process 500 may be performed according to the inputs provided by a user (e.g., physician) or may be at least partially automated. While the following describes one possible sequence to the process, other embodiments can perform the process in a different order.


At block 505, a control system 140 of the robotic system 110 receives a first user input from an input device 146. For example, the input may be double-tapping a button, a voice command, or other input. In one embodiment, the first input is double-tapping a first button on the controller 200 of FIGS. 2A-2B.


At block 510, the robotic system 110 engages the basket drive mechanism to open the basket 405 of the basket retrieval device 120 at an accelerated speed. In some embodiments, the basket drive mechanism 420 is configured to operate in at least two speeds: a first speed corresponding to a regular opening speed of the basket, and a second speed faster than the first speed. In one embodiment, the regular opening speed is used when the basket is opened using directly controlled movement rather than a pre-programmed motion.


At block 515, a torque sensor, which may be located in the basket retrieval device 120 or the robotic arm 112, determines the torque being applied by the basket drive mechanism 420 to the basket 405. If the torque meets or exceeds a torque limit configured for the robotic system 110, then the drive mechanism is disengaged. For example, the torque limit may be reached if there is tissue preventing the basket from opening. If the torque limit is reached, then the process 500 proceeds to block 520. Otherwise, the process 500 proceeds to block 525.


At block 520, the torque limit is reached and the drive mechanism is disengaged. As something is preventing the basket from further opening, additional movement could damage the basket 405 or surrounding tissue.


At block 525, the rapid open movement is completed. The basket 405 may be fully open (e.g., as indicated by the drive mechanism torque) or have otherwise reached the desired open configuration. For example, the rapid open movement may be configured to stop at 50%, 60%, 70%, 80%, 90%, 100%, or other amount of the basket being fully open. In one embodiment, the target opening amount of the basket is set based on the detected size of the urinary stone.
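A compact sketch of the rapid open process of FIG. 5 is shown below, assuming a hypothetical drive API (engage, disengage, torque_nm, open_fraction); the torque limit, target fraction, and poll interval are illustrative values rather than disclosed ones.

```python
import time


def rapid_open(drive, torque_limit_nm=0.5, target_fraction=1.0, poll_s=0.01):
    """Open the basket at the accelerated speed until the target opening is
    reached (block 525) or the torque limit trips (blocks 515-520)."""
    drive.engage(direction="open", speed="fast")          # block 510: accelerated opening
    try:
        while drive.open_fraction() < target_fraction:    # block 525 completion check
            if drive.torque_nm() >= torque_limit_nm:      # block 515: torque check
                return "torque_limit_reached"             # block 520: something blocks the basket
            time.sleep(poll_s)
        return "completed"
    finally:
        drive.disengage()                                 # always release the drive mechanism
```

In practice, the target_fraction argument could be set from the detected stone size, as the passage above suggests, rather than defaulting to a fully open basket.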



FIG. 6 is a flow diagram of a pre-programmed rapid close process 600, according to certain embodiments. The rapid close process 600 can be performed by the robotic system 110 or by another component of the medical system 100. For example, the robotic system 110 may be manipulating a basket retrieval device 120 with one or more of its arms. The process 600 may be performed according to the inputs provided by a user (e.g., physician) or may be at least partially automated. While the following describes one possible sequence to the process, other embodiments can perform the process in a different order.


At block 605, the control system 140 of the robotic system 110 receives a second user input from the input device 146. For example, the input may be double-tapping a button, a voice command, or other input. In one embodiment, the second input is double-tapping a second button on the controller 200 of FIGS. 2A-2B.


At block 610, the robotic system 110 engages the basket drive mechanism to close the basket 405 of the basket retrieval device 120 at an accelerated speed. In some embodiments, the basket drive mechanism 420 is configured to operate in at least two speeds: a first speed corresponding to a regular closing speed of the basket, and a second speed faster than the first speed. In one embodiment, the regular closing speed is used when the basket is closed using directly controlled movement rather than a pre-programmed motion.


At block 615, the torque sensor determines the torque being applied by the basket drive mechanism 420 to the basket 405. If the torque meets or exceeds a torque limit configured for the robotic system 110, then the drive mechanism is disengaged. For example, the torque limit may be reached if there is tissue or a urinary stone preventing the basket from closing. If the torque limit is reached, then the process 600 proceeds to block 620. Otherwise, the process 600 proceeds to block 625.


At block 620, the torque limit is reached and the drive mechanism is disengaged. As something is preventing the basket from further closing, additional movement could damage the basket 405 or surrounding tissue.


At block 625, the rapid close movement is completed. The basket 405 may be fully closed (e.g., as indicated by the drive mechanism torque) or have otherwise reached the desired closed configuration. For example, the rapid close movement may be configured to stop at 50%, 60%, 70%, 80%, 90%, 100% or other amount of the basket being fully closed. In one embodiment, the target closed amount of the basket is set based on the detected size of the urinary stone 165. For example, if the urinary stone is large, then the basket may only close a small amount before contacting or surrounding the urinary stone.
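The rapid close process of FIG. 6 mirrors the rapid open sketch above with the drive direction reversed. The fragment below only illustrates how a target closed fraction might be derived from a detected stone size; the diameters, heuristic, and drive calls are assumptions, not values from the disclosure.

```python
import time


def rapid_close(drive, stone_diameter_mm, basket_diameter_mm=10.0,
                torque_limit_nm=0.5, poll_s=0.01):
    """Close the basket at the accelerated speed, stopping at a stone-size-based
    target (block 625) or when the torque limit trips (blocks 615-620)."""
    # Assumed heuristic: a larger stone leaves the basket proportionally less closed.
    target_fraction = max(0.0, 1.0 - stone_diameter_mm / basket_diameter_mm)
    drive.engage(direction="close", speed="fast")          # block 610: accelerated closing
    try:
        while drive.closed_fraction() < target_fraction:
            if drive.torque_nm() >= torque_limit_nm:       # stone or tissue resists further closure
                break                                      # block 620: stop to protect the basket
            time.sleep(poll_s)
        return drive.closed_fraction()
    finally:
        drive.disengage()
```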



FIG. 7 is a flow diagram of a pre-programmed jiggle motion process 700, according to certain embodiments. The jiggle process 700 can be performed by the robotic system 110 or by another component of the medical system 100. For example, the robotic system 110 may be manipulating a basket retrieval device 120 with one or more of its arms. The process 700 may be performed according to the inputs provided by a user (e.g., physician) or may be at least partially automated. While the following describes one possible sequence to the process, other embodiments can perform the process in a different order.


At block 705, the control system 140 of the robotic system 110 receives a third user input from the input device 146. For example, the input may be holding down one or more buttons, double-tapping a button, a voice command, or other input. In one embodiment, the third input is holding down the first button (associated with rapid open) and the second button (associated with rapid close) of the controller 200 of FIGS. 2A-2B.


At block 710, the robotic system 110 initiates a short-range forward and backward movement of the basket at an accelerated speed. For example, the basket may move a few millimeters back and forth. In some situations, the basket may move a few centimeters. In some embodiments, the basket drive mechanism 420 is configured to operate the basket in at least two speeds: a first speed corresponding to a regular movement speed of the basket, and a second speed faster than the first speed. In one embodiment, the regular movement speed is used when the basket is moving using directly controlled movement rather than a pre-programmed motion.


At block 715, the control system 140 receives a fourth user input corresponding to a movement input, while receiving the third user input. In one embodiment, the fourth input is movement along an axis (e.g., forward or backward) of a joystick of the controller 200. For example, the physician 160 can move the joystick while holding down the first and second buttons.


At block 720, the robotic system 110 moves the basket to a new location based on the fourth input. For example, if the joystick is moved up, the basket is further inserted into the patient. If the joystick is moved down, the basket is retracted towards the proximal end of the basket retrieval device 120.


At block 725, the robotic system 110 continues the short-range forward and backward movement of the basket at the new location. The locus or center of the movement is the new location.


At block 730, if the third input has ceased (e.g., the physician 160 has released the first button and second button), the process 700 proceeds to block 735 and the jiggle motion ceases. If the third input is still ongoing, the process 700 proceeds to block 725 and the robotic system 110 continues the forward and backward movement of the basket. In other embodiments, the jiggle motion stops in response to other input, after a certain amount of time has passed, or after a certain number of repetitions of the movement have completed.
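The following sketch ties the blocks of FIG. 7 together, assuming hypothetical drive and controller interfaces; the 3 mm stroke and the joystick-to-millimeter scaling are illustrative values only.

```python
def run_jiggle(drive, controller, stroke_mm=3.0):
    """Repeated short-range forward/backward movement (blocks 710-735), with the
    locus shifted by joystick input while the trigger buttons stay held."""
    locus_mm = drive.current_insertion_mm()                    # center of the oscillation
    while controller.jiggle_buttons_held():                    # block 730: stop when buttons are released
        locus_mm += controller.joystick_axis_mm()              # blocks 715-720: shift the locus
        drive.move_to(locus_mm + stroke_mm / 2, speed="fast")  # short forward stroke
        drive.move_to(locus_mm - stroke_mm / 2, speed="fast")  # short backward stroke
    drive.move_to(locus_mm, speed="regular")                   # block 735: settle at the final locus
```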


Example Robotic System



FIG. 8 illustrates example details of the robotic system 110 in accordance with one or more embodiments. In this example, the robotic system 110 is illustrated as a cart-based robotically-enabled system that is movable. However, the robotic system 110 can be implemented as a stationary system, integrated into a table, and so on.


The robotic system 110 can include the support structure 114 including an elongated section 114(A) (sometimes referred to as “the column 114(A)”) and a base 114(B). The column 114(A) can include one or more carriages, such as a carriage 1102 (alternatively referred to as “the arm support 1102”) for supporting the deployment of one or more of the robotic arms 112 (three shown in FIG. 8). The carriage 1102 can include individually configurable arm mounts that rotate along a perpendicular axis to adjust the base of the robotic arms 112 for positioning relative to a patient. The carriage 1102 also includes a carriage interface 1104 that allows the carriage 1102 to vertically translate along the column 114(A). The carriage interface 1104 is connected to the column 114(A) through slots, such as slot 1106, that are positioned on opposite sides of the column 114(A) to guide the vertical translation of the carriage 1102. The slot 1106 includes a vertical translation interface to position and hold the carriage 1102 at various vertical heights relative to the base 114(B). Vertical translation of the carriage 1102 allows the robotic system 110 to adjust the reach of the robotic arms 112 to meet a variety of table heights, patient sizes, physician preferences, etc. Similarly, the individually configurable arm mounts on the carriage 1102 allow a robotic arm base 1108 of the robotic arms 112 to be angled in a variety of configurations. The column 114(A) can internally comprise mechanisms, such as gears and/or motors, that are designed to use a vertically aligned lead screw to translate the carriage 1102 in a mechanized fashion in response to control signals generated in response to user inputs, such as inputs from the I/O device(s) 116.


In some embodiments, the slot 1106 can be supplemented with a slot cover(s) that is flush and/or parallel to the slot surface to prevent dirt and/or fluid ingress into the internal chambers of the column 114(A) and/or the vertical translation interface as the carriage 1102 vertically translates. The slot covers can be deployed through pairs of spring spools positioned near the vertical top and bottom of the slot 1106. The covers can be coiled within the spools until deployed to extend and retract from their coiled state as the carriage 1102 vertically translates up and down. The spring-loading of the spools can provide force to retract the cover into a spool when the carriage 1102 translates towards the spool, while also maintaining a tight seal when the carriage 1102 translates away from the spool. The covers can be connected to the carriage 1102 using, for example, brackets in the carriage interface 1104 to ensure proper extension and retraction of the covers as the carriage 1102 translates.


The base 114(B) can balance the weight of the column 114(A), the carriage 1102, and/or arms 112 over a surface, such as the floor. Accordingly, the base 114(B) can house heavier components, such as one or more electronics, motors, power supply, etc., as well as components that enable movement and/or immobilize the robotic system 110. For example, the base 114(B) can include rollable wheels 1116 (also referred to as “the casters 1116”) that allow for the robotic system 110 to move around the room for a procedure. After reaching an appropriate position, the casters 1116 can be immobilized using wheel locks to hold the robotic system 110 in place during the procedure. As shown, the robotic system 110 also includes a handle 1118 to assist with maneuvering and/or stabilizing the robotic system 110.


The robotic arms 112 can generally comprise robotic arm bases 1108 and end effectors 1110, separated by a series of linkages 1112 that are connected by a series of joints 1114. Each joint 1114 can comprise an independent actuator and each actuator can comprise an independently controllable motor. Each independently controllable joint 1114 represents an independent degree of freedom available to the robotic arm 112. For example, each of the arms 112 can have seven joints, and thus, provide seven degrees of freedom. However, any number of joints can be implemented with any degrees of freedom. In examples, a multitude of joints can result in a multitude of degrees of freedom, allowing for “redundant” degrees of freedom. Redundant degrees of freedom allow the robotic arms 112 to position their respective end effectors 1110 at a specific position, orientation, and/or trajectory in space using different linkage positions and/or joint angles. In some embodiments, the end effectors 1110 can be configured to engage with and/or control a medical instrument, a device, an object, and so on. The freedom of movement of the arms 112 can allow the robotic system 110 to position and/or direct a medical instrument from a desired point in space and/or allow a physician to move the arms 112 into a clinically advantageous position away from the patient to create access, while avoiding arm collisions.


As shown in FIG. 8, the robotic system 110 can also include the I/O device(s) 116. The I/O device(s) 116 can include a display, a touchscreen, a touchpad, a projector, a mouse, a keyboard, a microphone, a speaker, a controller, a camera (e.g., to receive gesture input), or another I/O device to receive input and/or provide output. The I/O device(s) 116 can be configured to receive touch, speech, gesture, or any other type of input. The I/O device(s) 116 can be positioned at the vertical end of column 114(A) (e.g., the top of the column 114(A)) and/or provide a user interface for receiving user input and/or for providing output. For example, the I/O device(s) 116 can include a touchscreen (e.g., a dual-purpose device) to receive input and provide a physician with pre-operative and/or intra-operative data. Example pre-operative data can include pre-operative plans, navigation, and/or mapping data derived from pre-operative computerized tomography (CT) scans, and/or notes from pre-operative patient interviews. Example intra-operative data can include optical information provided from a tool/instrument, sensor, and/or coordinate information from sensors, as well as vital patient statistics, such as respiration, heart rate, and/or pulse. The I/O device(s) 116 can be positioned and/or tilted to allow a physician to access the I/O device(s) 116 from a variety of positions, such as the side of the column 114(A) opposite the carriage 1102. From this position, the physician can view the I/O device(s) 116, the robotic arms 112, and/or a patient while operating the I/O device(s) 116 from behind the robotic system 110.


The robotic system 110 can include a variety of other components. For example, the robotic system 110 can include one or more control electronics/circuitry, power sources, pneumatics, optical sources, actuators (e.g., motors to move the robotic arms 112), memory, and/or communication interfaces (e.g. to communicate with another device). In some embodiments, the memory can store computer-executable instructions that, when executed by the control circuitry, cause the control circuitry to perform any of the operations discussed herein. For example, the memory can store computer-executable instructions that, when executed by the control circuitry, cause the control circuitry to receive input and/or a control signal regarding manipulation of the robotic arms 112 and, in response, control the robotic arms 112 to be positioned in a particular arrangement and/or to navigate a medical instrument connected to the end effectors 1110.


In some embodiments, robotic system 110 is configured to engage with and/or control a medical instrument, such as the basket retrieval device 120. For example, the robotic arms 112 can be configured to control a position, orientation, and/or tip articulation of a scope (e.g., a sheath and/or a leader of the scope). In some embodiments, the robotic arms 112 can be configured/configurable to manipulate the scope using elongate movement members. The elongate movement members can include one or more pull wires (e.g., pull or push wires), cables, fibers, and/or flexible shafts. To illustrate, the robotic arms 112 can be configured to actuate multiple pull wires coupled to the scope to deflect the tip of the scope. Pull wires can include any suitable or desirable materials, such as metallic and/or non-metallic materials such as stainless steel, Kevlar, tungsten, carbon fiber, and the like. In some embodiments, the scope is configured to exhibit nonlinear behavior in response to forces applied by the elongate movement members. The nonlinear behavior can be based on stiffness and compressibility of the scope, as well as variability in slack or stiffness between different elongate movement members.


Example Control System



FIG. 9 illustrates example details of the control system 140 in accordance with one or more embodiments. As illustrated, the control system 140 can include one or more of the following components, devices, modules, and/or units (referred to herein as “components”), either separately/individually and/or in combination/collectively: control circuitry 1202, data storage/memory 1204, one or more communication interfaces 1206, one or more power supply units 1208, one or more I/O components 1210, and/or one or more wheels 1212 (e.g., casters or other types of wheels). In some embodiments, the control system 140 can comprise a housing/enclosure configured and/or dimensioned to house or contain at least part of one or more of the components of the control system 140. In this example, the control system 140 is illustrated as a cart-based system that is movable with the one or more wheels 1212. In some cases, after reaching the appropriate position, the one or more wheels 1212 can be immobilized using wheel locks to hold the control system 140 in place. However, the control system 140 can be implemented as a stationary system, integrated into another system/device, and so on.


Although certain components of the control system 140 are illustrated in FIG. 9, it should be understood that additional components not shown can be included in embodiments in accordance with the present disclosure. Furthermore, certain of the illustrated components can be omitted in some embodiments. Although the control circuitry 1202 is illustrated as a separate component in the diagram of FIG. 9, it should be understood that any or all of the remaining components of the control system 140 can be embodied at least in part in the control circuitry 1202. That is, the control circuitry 1202 can include various devices (active and/or passive), semiconductor materials and/or areas, layers, regions, and/or portions thereof, conductors, leads, vias, connections, and/or the like, wherein one or more of the other components of the control system 140 and/or portion(s) thereof can be formed and/or embodied at least in part in/by such circuitry components/devices.


The various components of the control system 140 can be electrically and/or communicatively coupled using certain connectivity circuitry/devices/features, which may or may not be part of the control circuitry 1202. For example, the connectivity feature(s) can include one or more printed circuit boards configured to facilitate mounting and/or interconnectivity of at least some of the various components/circuitry of the control system 140. In some embodiments, two or more of the control circuitry 1202, the data storage/memory 1204, the communication interface(s) 1206, the power supply unit(s) 1208, and/or the input/output (I/O) component(s) 1210 can be electrically and/or communicatively coupled to each other.


As illustrated, the memory 1204 can include an input device manager 1216 and a user interface component 1218 configured to facilitate various functionality discussed herein. In some embodiments, the input device manager 1216 and/or the user interface component 1218 can include one or more instructions that are executable by the control circuitry 1202 to perform one or more operations. Although many embodiments are discussed in the context of the components 1216-1218 including one or more instructions that are executable by the control circuitry 1202, any of the components 1216-1218 can be implemented at least in part as one or more hardware logic components, such as one or more application-specific integrated circuits (ASICs), one or more field-programmable gate arrays (FPGAs), one or more application-specific standard products (ASSPs), one or more complex programmable logic devices (CPLDs), and/or the like. Furthermore, although the components 1216-1218 are illustrated as being included within the control system 140, any of the components 1216-1218 can be implemented at least in part within another device/system, such as the robotic system 110, the table 150, or another device/system. Similarly, any of the other components of the control system 140 can be implemented at least in part within another device/system.


The input device manager 1216 can be configured to receive inputs from the input device 146 and translate them into actions performable by the robotic system 110. For example, pre-programmed motions, such as rapid open, rapid close, and jiggle motion, can be stored in the input device manager 1216. These pre-programmed motions can then be assigned to the desired input (e.g., single or dual button presses, voice commands, joystick movements, etc.). In some implementations, the pre-programmed motions are determined by the manufacturer. In other implementations, users may be able to modify existing pre-programmed motions and/or create new ones.
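As a rough illustration of what the input device manager 1216 could store, the table and dispatch helper below map input events to named pre-programmed motions; the binding keys, dictionary layout, and callable registry are assumptions made for the example.

```python
# Assumed default bindings from input events to pre-programmed motion names.
DEFAULT_BINDINGS = {
    ("R2", "double_tap"): "rapid_open",
    ("L2", "double_tap"): "rapid_close",
    ("R2+L2", "hold"): "jiggle",
    ("voice", "open basket fast"): "rapid_open",
}


def dispatch(bindings, event, motions):
    """Run the pre-programmed motion bound to an input event, if any.

    `motions` is assumed to be a dict of callables registered elsewhere,
    e.g. {"rapid_open": robot.rapid_open, ...}; unbound events are ignored.
    """
    name = bindings.get(event)
    if name is not None:
        motions[name]()
```

Under this view, a user-editable controller layout (for example, reassigning rapid open to a joystick press) would simply amount to replacing entries in the bindings table through the user interface component 1218.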


The user interface component 1218 can be configured to facilitate one or more user interfaces (also referred to as “one or more graphical user interfaces (GUI)”). For example, the user interface component 1218 can generate a configuration menu for assigning pre-programmed motions to inputs or a settings menu for enabling certain modes of operation or disabling selected pre-programmed motions in specific situations. The user interface component 1218 can also provide user interface data 1222 for display to the user.


The one or more communication interfaces 1206 can be configured to communicate with one or more device/sensors/systems. For example, the one or more communication interfaces 1206 can send/receive data in a wireless and/or wired manner over a network. A network in accordance with embodiments of the present disclosure can include a local area network (LAN), wide area network (WAN) (e.g., the Internet), personal area network (PAN), body area network (BAN), etc. In some embodiments, the one or more communication interfaces 1206 can implement a wireless technology such as Bluetooth, Wi-Fi, near field communication (NFC), or the like.


The one or more power supply units 1208 can be configured to manage power for the control system 140 (and/or the robotic system 110, in some cases). In some embodiments, the one or more power supply units 1208 include one or more batteries, such as a lithium-based battery, a lead-acid battery, an alkaline battery, and/or another type of battery. That is, the one or more power supply units 1208 can comprise one or more devices and/or circuitry configured to provide a source of power and/or provide power management functionality. Moreover, in some embodiments the one or more power supply units 1208 include a mains power connector that is configured to couple to an alternating current (AC) or direct current (DC) mains power source.


The one or more I/O components 1210 can include a variety of components to receive input and/or provide output, such as to interface with a user. The one or more I/O components 1210 can be configured to receive touch, speech, gesture, or any other type of input. In examples, the one or more I/O components 1210 can be used to provide input regarding control of a device/system, such as to control the robotic system 110, navigate the scope or other medical instrument attached to the robotic system 110, control the table 150, control the fluoroscopy device 190, and so on. As shown, the one or more I/O components 1210 can include the one or more displays 142 (sometimes referred to as “the one or more display devices 142”) configured to display data. The one or more displays 142 can include one or more liquid-crystal displays (LCD), light-emitting diode (LED) displays, organic LED displays, plasma displays, electronic paper displays, and/or any other type(s) of technology. In some embodiments, the one or more displays 142 include one or more touchscreens configured to receive input and/or display data. Further, the one or more I/O components 1210 can include the one or more input devices 146, which can include a touchscreen, touch pad, controller, mouse, keyboard, wearable device (e.g., optical head-mounted display), virtual or augmented reality device (e.g., head-mounted display), etc. Additionally, the one or more I/O components 1210 can include one or more speakers 1226 configured to output sounds based on audio signals and/or one or more microphones 1228 configured to receive sounds and generate audio signals. In some embodiments, the one or more I/O components 1210 include or are implemented as a console.


Although not shown in FIG. 9, the control system 140 can include and/or control other components, such as one or more pumps, flow meters, valve controls, and/or fluid access components in order to provide controlled irrigation and/or aspiration capabilities to a medical instrument (e.g., a scope), a device that can be deployed through a medical instrument, and so on. In some embodiments, irrigation and aspiration capabilities can be delivered directly to a medical instrument through separate cable(s). Further, the control system 140 can include a voltage and/or surge protector designed to provide filtered and/or protected electrical power to another device, such as the robotic system 110, thereby avoiding placement of a power transformer and other auxiliary power components in robotic system 110, resulting in a smaller, more moveable robotic system 110.


The control system 140 can also include support equipment for sensors deployed throughout the medical system 100. For example, the control system 140 can include opto-electronics equipment for detecting, receiving, and/or processing data received from optical sensors and/or cameras. Such opto-electronics equipment can be used to generate real-time images for display in any number of devices/systems, including in the control system 140.


In some embodiments, the control system 140 can be coupled to the robotic system 110, the table 150, and/or a medical instrument, such as the scope and/or the basket retrieval device 120, through one or more cables or connections (not shown). In some implementations, support functionality from the control system 140 can be provided through a single cable, simplifying and de-cluttering an operating room. In other implementations, specific functionality can be coupled in separate cabling and connections. For example, while power can be provided through a single power cable, the support for controls, optics, fluidics, and/or navigation can be provided through a separate cable.


The term “control circuitry” is used herein according to its broad and ordinary meaning, and can refer to any collection of one or more processors, processing circuitry, processing modules/units, chips, dies (e.g., semiconductor dies including one or more active and/or passive devices and/or connectivity circuitry), microprocessors, micro-controllers, digital signal processors, microcomputers, central processing units, graphics processing units, field programmable gate arrays, programmable logic devices, state machines (e.g., hardware state machines), logic circuitry, analog circuitry, digital circuitry, and/or any device that manipulates signals (analog and/or digital) based on hard coding of the circuitry and/or operational instructions. Control circuitry can further comprise one or more storage devices, which can be embodied in a single memory device, a plurality of memory devices, and/or embedded circuitry of a device. Such data storage can comprise read-only memory, random access memory, volatile memory, non-volatile memory, static memory, dynamic memory, flash memory, cache memory, data storage registers, and/or any device that stores digital information. It should be noted that in embodiments in which control circuitry comprises a hardware state machine (and/or implements a software state machine), analog circuitry, digital circuitry, and/or logic circuitry, data storage device(s)/register(s) storing any associated operational instructions can be embedded within, or external to, the circuitry comprising the state machine, analog circuitry, digital circuitry, and/or logic circuitry.


The term “memory” is used herein according to its broad and ordinary meaning and can refer to any suitable or desirable type of computer-readable media. For example, computer-readable media can include one or more volatile data storage devices, non-volatile data storage devices, removable data storage devices, and/or nonremovable data storage devices implemented using any technology, layout, and/or data structure(s)/protocol, including any suitable or desirable computer-readable instructions, data structures, program modules, or other types of data.


Computer-readable media that can be implemented in accordance with embodiments of the present disclosure includes, but is not limited to, phase change memory, static random-access memory (SRAM), dynamic random-access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disk read-only memory (CD-ROM), digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transitory medium that can be used to store information for access by a computing device. As used in certain contexts herein, computer-readable media may not generally include communication media, such as modulated data signals and carrier waves. As such, computer-readable media should generally be understood to refer to non-transitory media.


Additional Embodiments

Depending on the embodiment, certain acts, events, or functions of any of the processes or algorithms described herein can be performed in a different sequence, can be added, merged, or left out altogether. Thus, in certain embodiments, not all described acts or events are necessary for the practice of the processes.


Conditional language used herein, such as, among others, “can,” “could,” “might,” “may,” “e.g.,” and the like, unless specifically stated otherwise, or otherwise understood within the context as used, is intended in its ordinary sense and is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without author input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment. The terms “comprising,” “including,” “having,” and the like are synonymous, are used in their ordinary sense, and are used inclusively, in an open-ended fashion, and do not exclude additional elements, features, acts, operations, and so forth. Also, the term “or” is used in its inclusive sense (and not in its exclusive sense) so that when used, for example, to connect a list of elements, the term “or” means one, some, or all of the elements in the list. Conjunctive language such as the phrase “at least one of X, Y, and Z,” unless specifically stated otherwise, is understood with the context as used in general to convey that an item, term, element, etc. may be either X, Y, or Z. Thus, such conjunctive language is not generally intended to imply that certain embodiments require at least one of X, at least one of Y, and at least one of Z to each be present.


It should be appreciated that in the above description of embodiments, various features are sometimes grouped together in a single embodiment, Figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. This method of disclosure, however, is not to be interpreted as reflecting an intention that any claim require more features than are expressly recited in that claim. Moreover, any components, features, or steps illustrated and/or described in a particular embodiment herein can be applied to or used with any other embodiment(s). Further, no component, feature, step, or group of components, features, or steps are necessary or indispensable for each embodiment. Thus, it is intended that the scope of the inventions herein disclosed and claimed below should not be limited by the particular embodiments described above, but should be determined only by a fair reading of the claims that follow.


It should be understood that certain ordinal terms (e.g., “first” or “second”) may be provided for ease of reference and do not necessarily imply physical characteristics or ordering. Therefore, as used herein, an ordinal term (e.g., “first,” “second,” “third,” etc.) used to modify an element, such as a structure, a component, an operation, etc., does not necessarily indicate priority or order of the element with respect to any other element, but rather may generally distinguish the element from another element having a similar or identical name (but for use of the ordinal term). In addition, as used herein, indefinite articles (“a” and “an”) may indicate “one or more” rather than “one.” Further, an operation performed “based on” a condition or event may also be performed based on one or more other conditions or events not explicitly recited.


Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which example embodiments belong. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and should not be interpreted in an idealized or overly formal sense unless expressly so defined herein.


Unless otherwise expressly stated, comparative and/or quantitative terms, such as “less,” “more,” “greater,” and the like, are intended to encompass the concepts of equality. For example, “less” can mean not only “less” in the strictest mathematical sense, but also, “less than or equal to.”

Claims
  • 1. A robotic system for performing a medical procedure, the robotic system comprising: a robotic manipulator configured to: manipulate a medical instrument comprising a basket, the medical instrument configured to access a human anatomy;open the basket at a first opening speed and a second opening speed faster than the first opening speed; andclose the basket at a first closing speed and a second closing speed faster than the first closing speed;an input device configured to receive one or more user interactions and initiate one or more actions by the robotic manipulator, the one or more actions comprising at least one of directly controlled movement and pre-programmed motions; andcontrol circuitry communicatively coupled to the input device and the robotic manipulator and configured to: in response to receiving a first user interaction via the input device, trigger a first pre-programmed motion of the robotic manipulator, the first pre-programmed motion comprising opening the basket at the second opening speed; andin response to receiving a second user interaction via the input device, trigger a second pre-programmed motion of the robotic manipulator, the second pre-programmed motion comprising closing the basket at the second closing speed.
  • 2. The robotic system of claim 1, wherein the medical instrument further comprises a ureteroscope.
  • 3. The robotic system of claim 1, wherein the medical procedure comprises ureteroscopy.
  • 4. The robotic system of claim 1, wherein the input device comprises a control pad, the control pad comprising: directional controls configured to direct movement of the robotic manipulator along a plurality of axes; anda plurality of buttons including a first button and a second button.
  • 5. The robotic system of claim 4, wherein the first user interaction comprises double tapping the first button.
  • 6. The robotic system of claim 5, wherein the second user interaction comprises double tapping the second button.
  • 7. The robotic system of claim 6, the control circuitry further configured to: in response to tapping the first button and the second button concurrently, trigger a third pre-programmed motion of the robotic manipulator comprising a repeated, short range, forward and backward movement at an accelerated speed.
  • 8. The robotic system of claim 1, the control circuitry further configured to: in response to receiving a third user interaction, trigger a third pre-programmed motion of the robotic manipulator comprising a repeated, short range, forward and backward movement at an accelerated speed.
  • 9. The robotic system of claim 1, wherein the second pre-programmed motion further comprises: detecting a torque on a drive mechanism of the basket; andin response to the torque exceeding a threshold, stopping the closing of the basket.
  • 10. The robotic system of claim 1, wherein the at least one of the first user interaction and the second user interaction comprises a voice command.
  • 11. A method for controlling a medical instrument using a robotic manipulator, the method comprising: manipulating, using the robotic manipulator, a medical instrument comprising a basket to access a human anatomy, the robotic manipulator configured to open the basket at a first opening speed and a second opening speed, the robotic manipulator further configured to close the basket at a first closing speed and a second closing speed;receiving, via an input device, one or more user interactions for triggering pre-programmed actions by the robotic manipulator;in response to receiving a first user interaction via the input device, triggering a first pre-programmed motion of the robotic manipulator, the first pre-programmed motion comprising opening the basket at the second opening speed, the second opening speed faster than the first opening speed; andin response to receiving a second user interaction via the input device, triggering a second pre-programmed motion of the robotic manipulator, the second pre-programmed motion comprising closing the basket at the second closing speed, the second closing speed faster than the first closing speed.
  • 12. The method of claim 11, wherein the first user interaction comprises double tapping a first button of the input device and second user interaction comprises double tapping a second button of the input device.
  • 13. The method of claim 12, the method comprising: in response to tapping the first button and the second button concurrently, triggering a third pre-programmed motion of the robotic manipulator, the third pre-programmed motion comprising a repeated, short range, forward and backward movement at an accelerated speed.
  • 14. The method of claim 13, the method comprising: in response to receiving a movement input along a first axis on the input device, moving a central locus of the third pre-programmed motion of the robotic manipulator along the first axis; andrepeating the short range, forward and backward movement at the central locus.
  • 15. The method of claim 13, wherein the third pre-programmed motion further comprises a repeated, rotational movement.
  • 16. The method of claim 11, the method comprising: manipulating, using the robotic manipulator, an endoscope to access a human anatomy, the endoscope configured to capture images of the medical instrument within the human anatomy.
  • 17. The method of claim 11, the method comprising: receiving, via an input device, a third user interaction for directly controlling movement of the medical instrument; andmanipulating, using the robotic manipulator, the medical instrument along one or more axes of movement based on the received third user interaction.
  • 18. The method of claim 11, wherein the second pre-programmed motion further comprises: detecting a torque on a drive mechanism of the basket; andin response to the torque exceeding a threshold, stopping the closing of the basket.
  • 19. A control system for controlling a robotic device for performing a medical procedure, the control system comprising: an input device configured to receive one or more user interactions and initiate one or more actions by the robotic device, the one or more actions comprising at least one of directly-controlled movement and pre-programmed motions;a communication interface configured to send commands to the robotic device corresponding to the directly-controlled movement and the pre-programmed motions, the commands comprising: movement, by the robotic device, of a medical instrument comprising a basket, the medical instrument configured to access a human anatomy;opening the basket at a first opening speed and a second opening speed faster than the first opening speed; andclosing the basket at a first closing speed and a second closing speed faster than the first closing speed; andcontrol circuitry communicatively coupled to the input device and the communication interface, the control circuitry configured to: in response to receiving a first user interaction, trigger a first pre-programmed motion of the robotic device, the first pre-programmed motion comprising opening the basket at the second opening speed; andin response to receiving a second user interaction, trigger a second pre-programmed motion of the robotic device, the second pre-programmed motion comprising closing the basket at the second closing speed.
  • 20. The control system of claim 19, wherein the input device comprises: directional controls configured to direct movement of the robotic device along a plurality of axes; anda plurality of buttons including a first button configured to trigger the first pre-programmed motion and a second button configured to trigger the second pre-programmed motion.
  • 21. The control system of claim 20, wherein: double-tapping the first button triggers the first pre-programmed motion; anddouble-tapping the second button triggers the second pre-programmed motion.
  • 22. The control system of claim 20, wherein: single tapping the first button triggers a third pre-programmed motion different from the first pre-programmed motion; andsingle-tapping the second button triggers a fourth pre-programmed motion different from the second pre-programmed motion.
  • 23. The control system of claim 20, the control circuitry further configured to: in response to tapping the first button and the second button concurrently, trigger a third pre-programmed motion of the robotic device, the third pre-programmed motion comprising a repeated, short range, forward and backward movement at an accelerated speed.
  • 24. The control system of claim 23, the control circuitry further configured to: in response to receiving, via the directional controls, a movement request along a first axis, move a central locus of the third pre-programmed motion of the robotic device along the first axis; and repeat the short range, forward and backward movement at the central locus.
  • 25. The control system of claim 19, wherein: the input device comprises a microphone configured to capture vocal user commands; and the control circuitry is further configured to identify a first vocal user command corresponding to the first user interaction, and a second vocal user command corresponding to the second user interaction.
  • 26. The control system of claim 19, wherein: the robotic device is located at a first geographic location different from a second geographic location of the control system; and the communication interface is further configured to send the commands over a wide area network.
  • 27. One or more non-transitory computer-readable media storing computer-executable instructions that, when executed by control circuitry, cause the control circuitry to perform operations comprising: manipulating, using a robotic device, a medical instrument comprising a basket to access a human anatomy, the robotic device configured to open the basket at a first opening speed and a second opening speed, the robotic device further configured to close the basket at a first closing speed and a second closing speed; receiving, via an input device, one or more inputs for triggering pre-programmed actions by the robotic device; in response to receiving a first input via the input device, triggering a first pre-programmed motion of the robotic device, the first pre-programmed motion comprising opening the basket at the second opening speed, the second opening speed faster than the first opening speed; and in response to receiving a second input via the input device, triggering a second pre-programmed motion of the robotic device, the second pre-programmed motion comprising closing the basket at the second closing speed, the second closing speed faster than the first closing speed.
  • 28. The one or more non-transitory computer-readable media of claim 27, wherein the first input comprises double tapping a first button of the input device and the second input comprises double tapping a second button of the input device.
  • 29. The one or more non-transitory computer-readable media of claim 28, the computer-executable instructions further configured to cause the control circuitry to perform operations comprising: in response to tapping the first button and the second button concurrently, triggering a third pre-programmed motion of the robotic device, the third pre-programmed motion comprising a repeated, short range, forward and backward movement at an accelerated speed.
  • 30. The one or more non-transitory computer-readable media of claim 27, the computer-executable instructions further configured to cause the control circuitry to perform operations comprising: receiving, via the input device, a third input for controlling direct movement of the robotic device; and manipulating, using the robotic device, the medical instrument along one or more axes of movement based on the received third input.
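The claims above describe control behaviors in functional terms; the short Python sketches that follow are illustrative only, and every interface name, direction label, and parameter value in them is an assumption made for the example rather than something specified by the patent. First, the directly controlled movement of claims 17 and 30, in which directional input jogs the instrument along one or more axes:

```python
from typing import Dict, Tuple

# Assumed mapping from control-pad directions to axis unit vectors; the
# actual axes and step size are not specified in the claims.
DIRECTIONS: Dict[str, Tuple[float, float, float]] = {
    "left": (-1.0, 0.0, 0.0),
    "right": (1.0, 0.0, 0.0),
    "up": (0.0, 0.0, 1.0),
    "down": (0.0, 0.0, -1.0),
    "insert": (0.0, 1.0, 0.0),
    "retract": (0.0, -1.0, 0.0),
}

def jog(position: Tuple[float, float, float], direction: str,
        step_mm: float = 0.5) -> Tuple[float, float, float]:
    """Return the instrument position after one directly controlled step."""
    dx, dy, dz = DIRECTIONS[direction]
    x, y, z = position
    return (x + dx * step_mm, y + dy * step_mm, z + dz * step_mm)

print(jog((0.0, 0.0, 0.0), "insert"))  # (0.0, 0.5, 0.0)
```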
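Claim 18 recites stopping the faster closing motion when torque on the basket's drive mechanism exceeds a threshold. A minimal sketch of that safeguard, assuming a hypothetical drive interface and a toy torque model in place of a real sensor:

```python
from dataclasses import dataclass

@dataclass
class BasketDrive:
    """Hypothetical drive interface; real torque would come from a sensor."""
    position: float = 1.0   # 1.0 = fully open, 0.0 = fully closed
    torque_nm: float = 0.0

    def step_close(self, speed: float, dt: float) -> None:
        self.position = max(0.0, self.position - speed * dt)
        # Toy model: torque rises as the basket closes around an object.
        self.torque_nm = 0.5 * (1.0 - self.position)

def close_basket(drive: BasketDrive, speed: float,
                 torque_limit_nm: float, dt: float = 0.01) -> str:
    """Close at `speed`, stopping early if the torque threshold is exceeded."""
    while drive.position > 0.0:
        drive.step_close(speed, dt)
        if drive.torque_nm > torque_limit_nm:
            return "stopped: torque limit exceeded"
    return "fully closed"

print(close_basket(BasketDrive(), speed=2.0, torque_limit_nm=0.3))
```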
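Claim 19's control circuitry maps user interactions to pre-programmed motions and forwards the corresponding commands through a communication interface. A sketch of that dispatch, with made-up command strings and class names:

```python
from enum import Enum, auto
from typing import Callable, Dict, List

class Interaction(Enum):
    FAST_OPEN = auto()    # the "first user interaction"
    FAST_CLOSE = auto()   # the "second user interaction"

class CommunicationInterface:
    """Stand-in for the link to the robotic device."""
    def __init__(self) -> None:
        self.sent: List[str] = []

    def send(self, command: str) -> None:
        self.sent.append(command)

class ControlCircuitry:
    def __init__(self, link: CommunicationInterface) -> None:
        self._link = link
        # Pre-programmed motions keyed by the interaction that triggers them.
        self._motions: Dict[Interaction, Callable[[], None]] = {
            Interaction.FAST_OPEN: lambda: link.send("open_basket speed=fast"),
            Interaction.FAST_CLOSE: lambda: link.send("close_basket speed=fast"),
        }

    def on_interaction(self, interaction: Interaction) -> None:
        motion = self._motions.get(interaction)
        if motion is not None:
            motion()

link = CommunicationInterface()
ControlCircuitry(link).on_interaction(Interaction.FAST_OPEN)
print(link.sent)  # ['open_basket speed=fast']
```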
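Claims 21 and 22 distinguish single taps from double taps on the same buttons. One common way to make that distinction is a timing window; the window length below is an assumption, and a production controller would typically defer the single-tap action until the window expires rather than reporting it immediately as this sketch does:

```python
import time
from typing import Optional

DOUBLE_TAP_WINDOW_S = 0.4  # illustrative value, not specified in the claims

class TapClassifier:
    """Classify button presses into single taps and double taps."""
    def __init__(self, window_s: float = DOUBLE_TAP_WINDOW_S) -> None:
        self._window_s = window_s
        self._last_tap: Optional[float] = None

    def on_tap(self, now: Optional[float] = None) -> str:
        now = time.monotonic() if now is None else now
        if self._last_tap is not None and now - self._last_tap <= self._window_s:
            self._last_tap = None
            return "double_tap"
        self._last_tap = now
        return "single_tap"

clf = TapClassifier()
print(clf.on_tap(now=0.0))  # single_tap
print(clf.on_tap(now=0.2))  # double_tap (second press within the window)
```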
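Claims 23 and 24 describe a repeated, short-range forward-and-backward motion whose central locus can be moved along an axis by the directional controls. A sketch of generating such a trajectory, where the amplitude, frequency, and sample rate are illustrative assumptions:

```python
import math
from typing import List

def jiggle_positions(central_locus: float, amplitude_mm: float,
                     frequency_hz: float, duration_s: float,
                     sample_rate_hz: float = 100.0) -> List[float]:
    """Positions for a short-range back-and-forth motion about `central_locus`."""
    n = int(duration_s * sample_rate_hz)
    return [
        central_locus
        + amplitude_mm * math.sin(2.0 * math.pi * frequency_hz * i / sample_rate_hz)
        for i in range(n)
    ]

# Moving the central locus along the axis shifts where the motion repeats.
path = jiggle_positions(central_locus=0.0, amplitude_mm=1.0,
                        frequency_hz=5.0, duration_s=0.5)
shifted = jiggle_positions(central_locus=2.0, amplitude_mm=1.0,
                           frequency_hz=5.0, duration_s=0.5)
print(round(max(path), 2), round(max(shifted), 2))  # 1.0 3.0
```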
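Claim 25 has the control circuitry identify which user interaction a recognized vocal command corresponds to. Assuming a speech recognizer has already produced a transcript, the identification step itself can be as simple as a lookup; the phrases below are invented for the example:

```python
from typing import Optional

# Assumed phrase-to-interaction mapping; the actual vocabulary is not
# specified in the claims.
VOICE_COMMANDS = {
    "open basket fast": "first_user_interaction",
    "close basket fast": "second_user_interaction",
}

def identify(transcript: str) -> Optional[str]:
    """Map a recognized phrase to the interaction it triggers, if any."""
    return VOICE_COMMANDS.get(transcript.strip().lower())

print(identify("Open basket fast"))   # first_user_interaction
print(identify("unrelated speech"))   # None
```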
CROSS-REFERENCE TO RELATED APPLICATIONS

The present application claims priority to U.S. Provisional Patent Application Ser. No. 62/956,071, filed Dec. 31, 2019, entitled ADVANCED BASKET DRIVE MODE, the disclosure of which is hereby incorporated by reference in its entirety.

US Referenced Citations (966)
Number Name Date Kind
2556601 Schofield Jun 1951 A
2566183 Forss Aug 1951 A
2623175 Finke Dec 1952 A
2730699 Gratian Jan 1956 A
2884808 Mueller May 1959 A
3294183 Riley et al. Dec 1966 A
3472083 Schnepel Oct 1969 A
3513724 Box May 1970 A
3595074 Johnson Jul 1971 A
3734207 Fishbein May 1973 A
3739923 Totsuka Jun 1973 A
3763860 Clarke Oct 1973 A
3784031 Niitu et al. Jan 1974 A
3790002 Guilbaud et al. Feb 1974 A
3921536 Savage Nov 1975 A
3926386 Stahmann et al. Dec 1975 A
4040413 Ohshiro Aug 1977 A
4141245 Brandstetter Feb 1979 A
4198960 Utsugi Apr 1980 A
4241884 Lynch Dec 1980 A
4243034 Brandt Jan 1981 A
4351493 Sonnek Sep 1982 A
4357843 Peck et al. Nov 1982 A
4384493 Grunbaum May 1983 A
4470407 Hussein Sep 1984 A
4507026 Lund Mar 1985 A
4530471 Inoue Jul 1985 A
4532935 Wang Aug 1985 A
4555960 King Dec 1985 A
4611594 Grayhack et al. Sep 1986 A
4685458 Leckrone Aug 1987 A
4688555 Wardle Aug 1987 A
4741335 Okada May 1988 A
4745908 Wardle May 1988 A
4747405 Leckrone May 1988 A
4784150 Voorhies et al. Nov 1988 A
4854301 Nakajima Aug 1989 A
4857058 W. Aug 1989 A
4898574 Uchiyama et al. Feb 1990 A
4899733 DeCastro et al. Feb 1990 A
4907168 Boggs Mar 1990 A
4945790 Golden Aug 1990 A
4983165 Loiterman Jan 1991 A
5029574 Shimamura et al. Jul 1991 A
5085659 Rydell Feb 1992 A
5150452 Pollack et al. Sep 1992 A
5190542 Nakao et al. Mar 1993 A
5190557 Borodulin et al. Mar 1993 A
5196023 Martin Mar 1993 A
5207128 Albright May 1993 A
5217465 Steppe Jun 1993 A
5234428 Kaufman Aug 1993 A
5256150 Quiachon et al. Oct 1993 A
5277085 Tanimura et al. Jan 1994 A
5308323 Sogawa et al. May 1994 A
5318589 Lichtman Jun 1994 A
5325848 Adams et al. Jul 1994 A
5342381 Tidemand Aug 1994 A
5344395 Whalen et al. Sep 1994 A
5350101 Godlewski Sep 1994 A
5353783 Nakao et al. Oct 1994 A
5370609 Drasler et al. Dec 1994 A
5372124 Takayama et al. Dec 1994 A
5411016 Kume et al. May 1995 A
5426687 Goodall et al. Jun 1995 A
5431649 Mulier et al. Jul 1995 A
5441485 Peters Aug 1995 A
5449356 Walbrink et al. Sep 1995 A
5450843 Moll et al. Sep 1995 A
5472426 Bonati et al. Dec 1995 A
5496267 Drasler et al. Mar 1996 A
5501667 Verduin, Jr. Mar 1996 A
5507725 Savage et al. Apr 1996 A
5520684 Imran May 1996 A
5524180 Wang et al. Jun 1996 A
5545170 Hart Aug 1996 A
5559294 Hoium et al. Sep 1996 A
5562239 Boiarski et al. Oct 1996 A
5562648 Peterson Oct 1996 A
5562678 Booker Oct 1996 A
5572999 Funda et al. Nov 1996 A
5573535 Viklund Nov 1996 A
5613973 Jackson et al. Mar 1997 A
5645083 Essig et al. Jul 1997 A
5653374 Young et al. Aug 1997 A
5658311 Baden Aug 1997 A
5695500 Taylor et al. Dec 1997 A
5697949 Giurtino et al. Dec 1997 A
5709661 Egmond et al. Jan 1998 A
5710870 Ohm et al. Jan 1998 A
5716325 Bonutti Feb 1998 A
5737500 Seraji et al. Apr 1998 A
5767840 Selker Jun 1998 A
5779623 Bonnell Jul 1998 A
5788667 Stoller Aug 1998 A
5788710 Bates et al. Aug 1998 A
5792135 Madhani et al. Aug 1998 A
5792165 Klieman et al. Aug 1998 A
5797900 Madhani et al. Aug 1998 A
5798627 Gilliland et al. Aug 1998 A
5810770 Chin et al. Sep 1998 A
5842390 Bouligny et al. Dec 1998 A
5855583 Wang et al. Jan 1999 A
5893869 Barnhart et al. Apr 1999 A
5897491 Kastenbauer et al. Apr 1999 A
5921968 Lampropoulos et al. Jul 1999 A
5924175 Lippitt et al. Jul 1999 A
5943056 Sato et al. Aug 1999 A
5967934 Ishida et al. Oct 1999 A
5969230 Sakai et al. Oct 1999 A
5989230 Frassica Nov 1999 A
6071281 Burnside et al. Jun 2000 A
6077219 Viebach et al. Jun 2000 A
6084371 Kress et al. Jul 2000 A
6093157 Chandrasekaran Jul 2000 A
6110171 Rydell Aug 2000 A
6120476 Fung et al. Sep 2000 A
6120498 Jani et al. Sep 2000 A
6154000 Rastegar et al. Nov 2000 A
6156030 Neev Dec 2000 A
6159220 Gobron et al. Dec 2000 A
6171234 White et al. Jan 2001 B1
6174318 Bates et al. Jan 2001 B1
6183435 Bumbalough et al. Feb 2001 B1
6183482 Bates et al. Feb 2001 B1
6185478 Koakutsu et al. Feb 2001 B1
6206903 Ramans Mar 2001 B1
6236906 Müller May 2001 B1
6272371 Shlomo Aug 2001 B1
6289579 Viza et al. Sep 2001 B1
6302895 Gobron et al. Oct 2001 B1
6322557 Nikolaevich et al. Nov 2001 B1
6375635 Moutafis et al. Apr 2002 B1
6394998 Wallace et al. May 2002 B1
6401572 Provost Jun 2002 B1
6405078 Moaddeb et al. Jun 2002 B1
6428563 Keller Aug 2002 B1
6436107 Wang et al. Aug 2002 B1
6440061 Wenner et al. Aug 2002 B1
6487940 Hart et al. Dec 2002 B2
6491701 Tierney et al. Dec 2002 B2
6508823 Gonon Jan 2003 B1
6522906 Salisbury et al. Feb 2003 B1
6577891 Jaross et al. Jun 2003 B1
6659939 Moll et al. Dec 2003 B2
6673080 Reynolds et al. Jan 2004 B2
6676668 Mercereau et al. Jan 2004 B2
6685698 Morley et al. Feb 2004 B2
6695818 Wollschläger Feb 2004 B2
6706050 Giannadakis Mar 2004 B1
6726675 Beyar Apr 2004 B1
6786896 Madhani et al. Sep 2004 B1
6827712 Tovey et al. Dec 2004 B2
6991602 Nakazawa et al. Jan 2006 B2
7044936 Harding et al. May 2006 B2
7172580 Hruska et al. Feb 2007 B2
7248944 Green Jul 2007 B2
7276044 Ferry et al. Oct 2007 B2
7282055 Tsuruta Oct 2007 B2
7524284 Murakami et al. Apr 2009 B2
7559934 Teague et al. Jul 2009 B2
7615042 Beyar et al. Nov 2009 B2
7635342 Ferry et al. Dec 2009 B2
7645283 Reynolds et al. Jan 2010 B2
7736356 Cooper et al. Jun 2010 B2
7766856 Ferry et al. Aug 2010 B2
7882841 Aljuri et al. Feb 2011 B2
7914540 Schwartz et al. Mar 2011 B2
7938809 Lampropoulos et al. May 2011 B2
7963911 Turliuc Jun 2011 B2
7972298 Wallace et al. Jul 2011 B2
7974674 Hauck et al. Jul 2011 B2
7987046 Peterman et al. Jul 2011 B1
7998020 Kidd et al. Aug 2011 B2
8002713 Heske et al. Aug 2011 B2
8004229 Nowlin et al. Aug 2011 B2
8016839 Wilk Sep 2011 B2
8038598 Khachi Oct 2011 B2
8043303 Razvi et al. Oct 2011 B2
8052636 Moll et al. Nov 2011 B2
8092397 Wallace et al. Jan 2012 B2
8157308 Pedersen Apr 2012 B2
8182415 Larkin et al. May 2012 B2
8187173 Miyoshi May 2012 B2
8257303 Moll et al. Sep 2012 B2
8277417 Fedinec Oct 2012 B2
8291791 Light et al. Oct 2012 B2
8414505 Weitzner et al. Apr 2013 B1
8425465 Nagano et al. Apr 2013 B2
8480595 Speeg et al. Jul 2013 B2
8523762 Miyamoto et al. Sep 2013 B2
8540748 Murphy et al. Sep 2013 B2
8541970 Nowlin et al. Sep 2013 B2
8602031 Reis et al. Dec 2013 B2
8671817 Bogusky Mar 2014 B1
8720448 Reis et al. May 2014 B2
8746252 McGrogan et al. Jun 2014 B2
8749190 Nowlin et al. Jun 2014 B2
8755124 Aschwanden et al. Jun 2014 B2
8786241 Nowlin et al. Jul 2014 B2
8820603 Shelton, IV et al. Sep 2014 B2
8821480 Burbank Sep 2014 B2
8827948 Romo et al. Sep 2014 B2
8870815 Bhat et al. Oct 2014 B2
8882660 Phee et al. Nov 2014 B2
8894610 MacNamara et al. Nov 2014 B2
8945163 Voegele et al. Feb 2015 B2
8956280 Eversull et al. Feb 2015 B2
8961533 Stahler et al. Feb 2015 B2
8968333 Yu et al. Mar 2015 B2
8992542 Hagag et al. Mar 2015 B2
9173713 Hart et al. Nov 2015 B2
9179979 Jinno Nov 2015 B2
9204933 Reis et al. Dec 2015 B2
9254123 Alvarez et al. Feb 2016 B2
9259280 Au et al. Feb 2016 B2
9259281 Griffiths et al. Feb 2016 B2
9259282 Azizian et al. Feb 2016 B2
9296104 Swarup et al. Mar 2016 B2
9326822 Lewis et al. May 2016 B2
9345456 Tsonton et al. May 2016 B2
9345544 Hourtash et al. May 2016 B2
9375284 Hourtash Jun 2016 B2
9408669 Kokish et al. Aug 2016 B2
9415510 Hourtash et al. Aug 2016 B2
9446177 Millman et al. Sep 2016 B2
9452018 Yu Sep 2016 B2
9457168 Moll et al. Oct 2016 B2
9460536 Hasegawa et al. Oct 2016 B2
9480534 Bowling et al. Nov 2016 B2
9498601 Tanner et al. Nov 2016 B2
9504604 Alvarez Nov 2016 B2
9510911 Hourtash Dec 2016 B2
9517106 Hourtash et al. Dec 2016 B2
9561083 Yu et al. Feb 2017 B2
9566125 Bowling et al. Feb 2017 B2
9592042 Titus Mar 2017 B2
9597152 Schaeffer et al. Mar 2017 B2
9622827 Yu et al. Apr 2017 B2
9636184 Lee et al. May 2017 B2
9636483 Hart et al. May 2017 B2
9668814 Kokish Jun 2017 B2
9675422 Hourtash et al. Jun 2017 B2
9687310 Nowlin et al. Jun 2017 B2
9713509 Schuh et al. Jul 2017 B2
9727963 Mintz et al. Aug 2017 B2
9730757 Brudniok Aug 2017 B2
9737371 Romo et al. Aug 2017 B2
9737373 Schuh Aug 2017 B2
9744335 Jiang Aug 2017 B2
9763741 Alvarez et al. Sep 2017 B2
9788910 Schuh Oct 2017 B2
9818681 Machida Nov 2017 B2
9844412 Bogusky et al. Dec 2017 B2
9867635 Alvarez et al. Jan 2018 B2
9918659 Chopra et al. Mar 2018 B2
9918681 Wallace et al. Mar 2018 B2
9931025 Graetzel et al. Apr 2018 B1
9943962 Sattler et al. Apr 2018 B2
9949749 Noonan et al. Apr 2018 B2
9955986 Shah May 2018 B2
9962228 Schuh et al. May 2018 B2
9980785 Schuh May 2018 B2
9993313 Schuh et al. Jun 2018 B2
9993614 Pacheco et al. Jun 2018 B2
10016900 Meyer et al. Jul 2018 B1
10022192 Ummalaneni Jul 2018 B1
10029367 Hourtash Jul 2018 B2
10046140 Kokish et al. Aug 2018 B2
10071479 Swarup et al. Sep 2018 B2
10080576 Romo et al. Sep 2018 B2
10117713 Barrera et al. Nov 2018 B2
10130429 Weir Nov 2018 B1
10136949 Felder et al. Nov 2018 B2
10136959 Mintz et al. Nov 2018 B2
10143360 Roelle et al. Dec 2018 B2
10145747 Lin et al. Dec 2018 B1
10149720 Romo Dec 2018 B2
10154822 Henderson et al. Dec 2018 B2
10159532 Ummalaneni Dec 2018 B1
10159533 Moll et al. Dec 2018 B2
10169875 Mintz et al. Jan 2019 B2
10176681 Plewe et al. Jan 2019 B2
10213264 Tanner et al. Feb 2019 B2
10219867 Saglam et al. Mar 2019 B2
10219868 Weir Mar 2019 B2
10219874 Yu et al. Mar 2019 B2
10231793 Romo Mar 2019 B2
10231867 Alvarez et al. Mar 2019 B2
10244926 Noonan et al. Apr 2019 B2
10258285 Hauck et al. Apr 2019 B2
10285574 Landey et al. May 2019 B2
10299870 Connolly et al. May 2019 B2
10314463 Agrawal et al. Jun 2019 B2
10314661 Bowling et al. Jun 2019 B2
10327855 Hourtash et al. Jun 2019 B2
10350017 Bowling et al. Jul 2019 B2
10350390 Moll et al. Jul 2019 B2
10363103 Alvarez et al. Jul 2019 B2
10383765 Alvarez et al. Aug 2019 B2
10398518 Yu et al. Sep 2019 B2
10405939 Romo Sep 2019 B2
10405940 Romo Sep 2019 B2
10426559 Graetzel et al. Oct 2019 B2
10426661 Kintz Oct 2019 B2
10434660 Meyer et al. Oct 2019 B2
10454347 Covington et al. Oct 2019 B2
10463440 Bowling et al. Nov 2019 B2
10464209 Ho et al. Nov 2019 B2
10470830 Hill et al. Nov 2019 B2
10478595 Kokish Nov 2019 B2
10482599 Mintz et al. Nov 2019 B2
10493239 Hart et al. Dec 2019 B2
10493241 Jiang Dec 2019 B2
10500001 Yu et al. Dec 2019 B2
10517692 Eyre et al. Dec 2019 B2
10524866 Srinivasan et al. Jan 2020 B2
10524867 Kokish et al. Jan 2020 B2
10539478 Lin et al. Jan 2020 B2
10543047 Yu Jan 2020 B2
10543048 Noonan Jan 2020 B2
10555778 Ummalaneni Feb 2020 B2
10556092 Yu et al. Feb 2020 B2
10569052 Kokish et al. Feb 2020 B2
10631949 Schuh et al. Apr 2020 B2
10639108 Romo et al. May 2020 B2
10639109 Bovay et al. May 2020 B2
10639114 Schuh et al. May 2020 B2
10667871 Romo et al. Jun 2020 B2
10667875 DeFonzo et al. Jun 2020 B2
10682189 Schuh et al. Jun 2020 B2
10687903 Lewis et al. Jun 2020 B2
10695536 Weitzner et al. Jun 2020 B2
10702348 Moll et al. Jul 2020 B2
10716461 Jenkins Jul 2020 B2
10743751 Landey et al. Aug 2020 B2
10744035 Alvarez et al. Aug 2020 B2
10751140 Wallace et al. Aug 2020 B2
10765303 Graetzel et al. Sep 2020 B2
10765487 Ho et al. Sep 2020 B2
10779898 Hill et al. Sep 2020 B2
10786329 Schuh et al. Sep 2020 B2
10786432 Jornitz et al. Sep 2020 B2
10792112 Kokish et al. Oct 2020 B2
10792464 Romo et al. Oct 2020 B2
10792466 Landey et al. Oct 2020 B2
10813539 Graetzel et al. Oct 2020 B2
10814101 Jiang Oct 2020 B2
10820947 Julian Nov 2020 B2
10820952 Yu Nov 2020 B2
10820954 Marsot et al. Nov 2020 B2
10827913 Ummalaneni et al. Nov 2020 B2
10828118 Schuh et al. Nov 2020 B2
10835153 Rafii-Tari et al. Nov 2020 B2
10850013 Hsu et al. Dec 2020 B2
10881280 Baez, Jr. Jan 2021 B2
10888386 Eyre et al. Jan 2021 B2
10898276 Graetzel et al. Jan 2021 B2
20010042643 Krueger et al. Nov 2001 A1
20020019644 Hastings et al. Feb 2002 A1
20020045905 Gerbi et al. Apr 2002 A1
20020068954 Foster Jun 2002 A1
20020098938 Milbourne et al. Jul 2002 A1
20020100254 Dharssi Aug 2002 A1
20020107573 Steinberg Aug 2002 A1
20020111608 Baerveldt et al. Aug 2002 A1
20020111621 Wallace et al. Aug 2002 A1
20020117017 Bernhardt et al. Aug 2002 A1
20020161355 Wollschlager Oct 2002 A1
20020161426 Iancea Oct 2002 A1
20020177789 Ferry et al. Nov 2002 A1
20030004455 Kadziauskas et al. Jan 2003 A1
20030040681 Ng et al. Feb 2003 A1
20030056561 Butscher et al. Mar 2003 A1
20030065358 Frecker et al. Apr 2003 A1
20030088254 Gregory et al. May 2003 A1
20030100882 Beden et al. May 2003 A1
20030109780 Coste-Maniere et al. Jun 2003 A1
20030109877 Morley et al. Jun 2003 A1
20030109889 Mercereau et al. Jun 2003 A1
20030158545 Hovda et al. Aug 2003 A1
20030167623 Lorenz Sep 2003 A1
20030208189 Payman Nov 2003 A1
20030212308 Bendall Nov 2003 A1
20030225419 Lippitt et al. Dec 2003 A1
20040015053 Bieger et al. Jan 2004 A1
20040122444 Gerard Jun 2004 A1
20040143253 Vanney et al. Jul 2004 A1
20040152972 Hunter Aug 2004 A1
20040153093 Donovan Aug 2004 A1
20040158261 Vu Aug 2004 A1
20040186349 Ewers et al. Sep 2004 A1
20040193146 Lee et al. Sep 2004 A1
20040210116 Nakao Oct 2004 A1
20040243147 Lipow Dec 2004 A1
20040253079 Sanchez Dec 2004 A1
20040254566 Plicchi et al. Dec 2004 A1
20040260246 Desmond Dec 2004 A1
20050004579 Schneider et al. Jan 2005 A1
20050033270 Ramans et al. Feb 2005 A1
20050054900 Mawn et al. Mar 2005 A1
20050059645 Bodor Mar 2005 A1
20050159645 Bertolero et al. Jul 2005 A1
20050177026 Hoeg et al. Aug 2005 A1
20050183532 Najaf et al. Aug 2005 A1
20050222554 Wallace et al. Oct 2005 A1
20050240178 Morley et al. Oct 2005 A1
20050261705 Gist Nov 2005 A1
20050267488 Hare et al. Dec 2005 A1
20060015133 Grayzel et al. Jan 2006 A1
20060041245 Ferry et al. Feb 2006 A1
20060058813 Teague et al. Mar 2006 A1
20060111692 Hlavka et al. May 2006 A1
20060116693 Weisenburgh et al. Jun 2006 A1
20060135963 Kick et al. Jun 2006 A1
20060142657 Quaid et al. Jun 2006 A1
20060146010 Schneider Jul 2006 A1
20060156875 McRury et al. Jul 2006 A1
20060161137 Orban et al. Jul 2006 A1
20060189891 Waxman et al. Aug 2006 A1
20060201688 Jenner et al. Sep 2006 A1
20060229587 Beyar et al. Oct 2006 A1
20060237205 Sia et al. Oct 2006 A1
20070000498 Glynn et al. Jan 2007 A1
20070000577 Chen Jan 2007 A1
20070013336 Nowlin et al. Jan 2007 A1
20070016164 Dudney et al. Jan 2007 A1
20070021768 Nance et al. Jan 2007 A1
20070027443 Rose et al. Feb 2007 A1
20070027534 Bergheim et al. Feb 2007 A1
20070032906 Sutherland et al. Feb 2007 A1
20070060879 Weitzner et al. Mar 2007 A1
20070086934 Huber et al. Apr 2007 A1
20070100201 Komiya et al. May 2007 A1
20070100254 Murakami et al. May 2007 A1
20070106304 Hammack et al. May 2007 A1
20070112355 Salahieh et al. May 2007 A1
20070119274 Devengenzo et al. May 2007 A1
20070123855 Morley et al. May 2007 A1
20070135803 Belson Jun 2007 A1
20070149946 Viswanathan et al. Jun 2007 A1
20070185485 Hauck et al. Aug 2007 A1
20070191177 Nagai et al. Aug 2007 A1
20070203475 Fischer et al. Aug 2007 A1
20070208252 Makower Sep 2007 A1
20070208375 Nishizawa et al. Sep 2007 A1
20070213668 Spitz Sep 2007 A1
20070239028 Houser et al. Oct 2007 A1
20070239178 Weitzner et al. Oct 2007 A1
20070245175 Zheng et al. Oct 2007 A1
20070250111 Lu et al. Oct 2007 A1
20070299427 Yeung et al. Dec 2007 A1
20080009884 Kennedy Jan 2008 A1
20080015566 Livneh Jan 2008 A1
20080021440 Solomon Jan 2008 A1
20080027464 Moll et al. Jan 2008 A1
20080033467 Miyamoto et al. Feb 2008 A1
20080039255 Jinno et al. Feb 2008 A1
20080045981 Margolin et al. Feb 2008 A1
20080046122 Manzo et al. Feb 2008 A1
20080065103 Cooper et al. Mar 2008 A1
20080065111 Blumenkranz et al. Mar 2008 A1
20080065112 Tovey et al. Mar 2008 A1
20080082109 Moll et al. Apr 2008 A1
20080091215 Saleh Apr 2008 A1
20080125698 Gerg et al. May 2008 A1
20080147011 Urmey Jun 2008 A1
20080177277 Huang et al. Jul 2008 A1
20080177285 Brock et al. Jul 2008 A1
20080187101 Gertner Aug 2008 A1
20080188864 Ducharme Aug 2008 A1
20080196533 Bergamasco et al. Aug 2008 A1
20080214925 Wilson et al. Sep 2008 A1
20080228104 Uber et al. Sep 2008 A1
20080243064 Stahler et al. Oct 2008 A1
20080249536 Stahler et al. Oct 2008 A1
20080253108 Ellenburg et al. Oct 2008 A1
20080262301 Gibbons et al. Oct 2008 A1
20080262480 Stahler et al. Oct 2008 A1
20080262513 Stahler et al. Oct 2008 A1
20080287963 Rogers et al. Nov 2008 A1
20080300592 Weitzner et al. Dec 2008 A1
20080302200 Tobey Dec 2008 A1
20080312521 Solomon Dec 2008 A1
20090005768 Sharareh et al. Jan 2009 A1
20090012507 Culbertson et al. Jan 2009 A1
20090030446 Measamer Jan 2009 A1
20090036900 Moll Feb 2009 A1
20090043305 Brodbeck et al. Feb 2009 A1
20090062602 Rosenberg et al. Mar 2009 A1
20090082634 Kathrani et al. Mar 2009 A1
20090082722 Munger et al. Mar 2009 A1
20090088774 Swarup et al. Apr 2009 A1
20090098971 Ho et al. Apr 2009 A1
20090105645 Kidd et al. Apr 2009 A1
20090105723 Dillinger Apr 2009 A1
20090131885 Akahoshi May 2009 A1
20090138025 Stahler et al. May 2009 A1
20090161827 Gertner et al. Jun 2009 A1
20090163948 Sunaoshi et al. Jun 2009 A1
20090171371 Nixon et al. Jul 2009 A1
20090192510 Bahney Jul 2009 A1
20090192524 Itkowitz et al. Jul 2009 A1
20090227998 Aljuri et al. Sep 2009 A1
20090247944 Kirschenman et al. Oct 2009 A1
20090248039 Cooper et al. Oct 2009 A1
20090248041 Williams et al. Oct 2009 A1
20090248043 Tierney et al. Oct 2009 A1
20090264878 Carmel et al. Oct 2009 A1
20090270760 Leimbach et al. Oct 2009 A1
20090287188 Golden et al. Nov 2009 A1
20090299352 Zerfas et al. Dec 2009 A1
20090312773 Cabrera et al. Dec 2009 A1
20100004642 Lumpkin Jan 2010 A1
20100010504 Simaan et al. Jan 2010 A1
20100011900 Burbank Jan 2010 A1
20100011901 Burbank Jan 2010 A1
20100016852 Manzo et al. Jan 2010 A1
20100016853 Burbank Jan 2010 A1
20100030023 Yoshie Feb 2010 A1
20100069833 Wenderow et al. Mar 2010 A1
20100073150 Olson et al. Mar 2010 A1
20100081965 Mugan et al. Apr 2010 A1
20100082017 Zickler et al. Apr 2010 A1
20100082041 Prisco Apr 2010 A1
20100125284 Tanner et al. May 2010 A1
20100130923 Cleary et al. May 2010 A1
20100130987 Wenderow et al. May 2010 A1
20100137846 Desai et al. Jun 2010 A1
20100175701 Reis et al. Jul 2010 A1
20100179632 Bruszewski et al. Jul 2010 A1
20100204605 Blakley et al. Aug 2010 A1
20100204646 Plicchi et al. Aug 2010 A1
20100210923 Li et al. Aug 2010 A1
20100217235 Thorstenson et al. Aug 2010 A1
20100225209 Goldberg et al. Sep 2010 A1
20100228191 Alvarez et al. Sep 2010 A1
20100228249 Mohr et al. Sep 2010 A1
20100248177 Mangelberger et al. Sep 2010 A1
20100249506 Prisco Sep 2010 A1
20100268211 Manwaring et al. Oct 2010 A1
20100274078 Kim et al. Oct 2010 A1
20100312141 Keast et al. Dec 2010 A1
20100331856 Carlson et al. Dec 2010 A1
20100331858 Simaan et al. Dec 2010 A1
20100332033 Diolaiti et al. Dec 2010 A1
20110009863 Marczyk et al. Jan 2011 A1
20110015483 Barbagli et al. Jan 2011 A1
20110015484 Alvarez et al. Jan 2011 A1
20110015648 Alvarez et al. Jan 2011 A1
20110015650 Choi et al. Jan 2011 A1
20110028790 Farr et al. Feb 2011 A1
20110028991 Ikeda et al. Feb 2011 A1
20110071541 Prisco et al. Mar 2011 A1
20110071543 Prisco et al. Mar 2011 A1
20110106146 Jeong May 2011 A1
20110125165 Simaan et al. May 2011 A1
20110130718 Kidd et al. Jun 2011 A1
20110147030 Blum et al. Jun 2011 A1
20110152880 Alvarez et al. Jun 2011 A1
20110160713 Neuberger Jun 2011 A1
20110160745 Fielding et al. Jun 2011 A1
20110167611 Williams Jul 2011 A1
20110184391 Aljuri et al. Jul 2011 A1
20110208211 Whitfield et al. Aug 2011 A1
20110213362 Cunningham et al. Sep 2011 A1
20110224660 Neuberger et al. Sep 2011 A1
20110238064 Williams Sep 2011 A1
20110238083 Moll et al. Sep 2011 A1
20110257641 Hastings et al. Oct 2011 A1
20110261183 Ma et al. Oct 2011 A1
20110270273 Moll et al. Nov 2011 A1
20110276085 Krzyzanowski Nov 2011 A1
20110277775 Holop et al. Nov 2011 A1
20110288573 Yates et al. Nov 2011 A1
20110306836 Ohline et al. Dec 2011 A1
20110313343 Milutinovic et al. Dec 2011 A1
20120048759 Disch et al. Mar 2012 A1
20120069167 Liu et al. Mar 2012 A1
20120071821 Yu Mar 2012 A1
20120071894 Tanner et al. Mar 2012 A1
20120071895 Stahler et al. Mar 2012 A1
20120132018 Tang et al. May 2012 A1
20120136419 Zarembo et al. May 2012 A1
20120138586 Webster et al. Jun 2012 A1
20120138660 Shelton, IV Jun 2012 A1
20120143226 Belson et al. Jun 2012 A1
20120150154 Brisson et al. Jun 2012 A1
20120186194 Schlieper Jul 2012 A1
20120191107 Tanner et al. Jul 2012 A1
20120209315 Girbau Aug 2012 A1
20120217457 Schena et al. Aug 2012 A1
20120232342 Reydel Sep 2012 A1
20120232476 Bhat et al. Sep 2012 A1
20120239012 Laurent et al. Sep 2012 A1
20120253277 Tah et al. Oct 2012 A1
20120253332 Moll Oct 2012 A1
20120259320 Loesel et al. Oct 2012 A1
20120277730 Salahieh et al. Nov 2012 A1
20120283747 Popovic Nov 2012 A1
20120296318 Wellhöfer et al. Nov 2012 A1
20130006144 Clancy et al. Jan 2013 A1
20130018400 Milton et al. Jan 2013 A1
20130035537 Wallace et al. Feb 2013 A1
20130053877 BenMaamer et al. Feb 2013 A1
20130066136 Palese et al. Mar 2013 A1
20130066335 Bärwinkel et al. Mar 2013 A1
20130085442 Shtul et al. Apr 2013 A1
20130085486 Boutoussov et al. Apr 2013 A1
20130096422 Boctor et al. Apr 2013 A1
20130096574 Kang et al. Apr 2013 A1
20130110042 Humphreys May 2013 A1
20130110107 Smith et al. May 2013 A1
20130116716 Bahls et al. May 2013 A1
20130144116 Cooper et al. Jun 2013 A1
20130144274 Stefanchik et al. Jun 2013 A1
20130144395 Stefanchik et al. Jun 2013 A1
20130190741 Moll et al. Jul 2013 A1
20130190796 Tilson et al. Jul 2013 A1
20130204124 Duindam et al. Aug 2013 A1
20130225997 Dillard et al. Aug 2013 A1
20130226151 Suehara Aug 2013 A1
20130226161 Hickenbotham Aug 2013 A1
20130231678 Wenderow et al. Sep 2013 A1
20130233908 Knodel et al. Sep 2013 A1
20130253267 Collins Sep 2013 A1
20130303876 Gelfand et al. Nov 2013 A1
20130304084 Beira et al. Nov 2013 A1
20130310819 Neuberger et al. Nov 2013 A1
20130317519 Romo et al. Nov 2013 A1
20130334281 Williams Dec 2013 A1
20130345519 Piskun et al. Dec 2013 A1
20130345686 Brown Dec 2013 A1
20140000411 Shelton, IV et al. Jan 2014 A1
20140005681 Gee et al. Jan 2014 A1
20140039517 Bowling et al. Feb 2014 A1
20140039681 Bowling et al. Feb 2014 A1
20140046308 Bischoff et al. Feb 2014 A1
20140051985 Fan et al. Feb 2014 A1
20140058365 Bille et al. Feb 2014 A1
20140058404 Hammack et al. Feb 2014 A1
20140058428 Christopher et al. Feb 2014 A1
20140066944 Taylor et al. Mar 2014 A1
20140069437 Reis et al. Mar 2014 A1
20140100445 Stenzel et al. Apr 2014 A1
20140142591 Alvarez et al. May 2014 A1
20140163318 Swanstrom Jun 2014 A1
20140163736 Azizian et al. Jun 2014 A1
20140166023 Kishi Jun 2014 A1
20140171778 Tsusaka et al. Jun 2014 A1
20140180063 Zhao et al. Jun 2014 A1
20140194859 Ianchulev Jul 2014 A1
20140194905 Kappel et al. Jul 2014 A1
20140222019 Brudniok Aug 2014 A1
20140222207 Bowling et al. Aug 2014 A1
20140243849 Saglam et al. Aug 2014 A1
20140246473 Auld Sep 2014 A1
20140257333 Blumenkranz Sep 2014 A1
20140275956 Fan Sep 2014 A1
20140276233 Murphy Sep 2014 A1
20140276389 Walker Sep 2014 A1
20140276394 Wong et al. Sep 2014 A1
20140276594 Tanner et al. Sep 2014 A1
20140276723 Parihar et al. Sep 2014 A1
20140276935 Yu Sep 2014 A1
20140276936 Kokish et al. Sep 2014 A1
20140276956 Crainich et al. Sep 2014 A1
20140277334 Yu et al. Sep 2014 A1
20140309649 Alvarez et al. Oct 2014 A1
20140309655 Gal et al. Oct 2014 A1
20140316203 Carroux et al. Oct 2014 A1
20140357984 Wallace et al. Dec 2014 A1
20140364870 Alvarez et al. Dec 2014 A1
20140375784 Massetti Dec 2014 A1
20140379000 Romo et al. Dec 2014 A1
20150012134 Robinson et al. Jan 2015 A1
20150028195 King et al. Jan 2015 A1
20150051592 Kintz et al. Feb 2015 A1
20150073439 Dannaher Mar 2015 A1
20150080879 Trees et al. Mar 2015 A1
20150090063 Lantermann et al. Apr 2015 A1
20150101442 Romo et al. Apr 2015 A1
20150119634 Jones Apr 2015 A1
20150119638 Yu et al. Apr 2015 A1
20150127045 Prestel May 2015 A1
20150133960 Lohmeier et al. May 2015 A1
20150133963 Barbagli May 2015 A1
20150142013 Tanner et al. May 2015 A1
20150144514 Brennan et al. May 2015 A1
20150148600 Ashinuma et al. May 2015 A1
20150150635 Kilroy et al. Jun 2015 A1
20150164522 Budiman et al. Jun 2015 A1
20150164594 Romo et al. Jun 2015 A1
20150164595 Bogusky et al. Jun 2015 A1
20150164596 Romo et al. Jun 2015 A1
20150182250 Conlon et al. Jul 2015 A1
20150190204 Popovi Jul 2015 A1
20150201917 Snow Jul 2015 A1
20150202085 Lemonis et al. Jul 2015 A1
20150231364 Blanchard et al. Aug 2015 A1
20150314110 Park Nov 2015 A1
20150335480 Alvarez et al. Nov 2015 A1
20150366629 Bowling et al. Dec 2015 A1
20150374445 Gombert et al. Dec 2015 A1
20150374446 Malackowski et al. Dec 2015 A1
20160000512 Gombert et al. Jan 2016 A1
20160001038 Romo et al. Jan 2016 A1
20160022289 Wan Jan 2016 A1
20160022466 Pedtke et al. Jan 2016 A1
20160030073 Isakov et al. Feb 2016 A1
20160045208 Ciulla Feb 2016 A1
20160051318 Manzo et al. Feb 2016 A1
20160066935 Nguyen et al. Mar 2016 A1
20160124220 Bueeler et al. May 2016 A1
20160151122 Alvarez et al. Jun 2016 A1
20160157945 Madhani et al. Jun 2016 A1
20160158490 Leeflang et al. Jun 2016 A1
20160166234 Zhang et al. Jun 2016 A1
20160183841 Duindam et al. Jun 2016 A1
20160184032 Romo et al. Jun 2016 A1
20160192860 Allenby et al. Jul 2016 A1
20160199984 Lohmeier et al. Jul 2016 A1
20160206389 Miller et al. Jul 2016 A1
20160213435 Hourtash et al. Jul 2016 A1
20160235478 Bonneau et al. Aug 2016 A1
20160235495 Wallace et al. Aug 2016 A1
20160242858 Barrera et al. Aug 2016 A1
20160249932 Rogers et al. Sep 2016 A1
20160270865 Landey et al. Sep 2016 A1
20160270866 Yu et al. Sep 2016 A1
20160279394 Moll et al. Sep 2016 A1
20160287279 Bovay et al. Oct 2016 A1
20160287840 Jiang Oct 2016 A1
20160296294 Moll et al. Oct 2016 A1
20160302871 Gregerson et al. Oct 2016 A1
20160303743 Rockrohr Oct 2016 A1
20160310146 Levy et al. Oct 2016 A1
20160331358 Gordon Nov 2016 A1
20160338783 Romo et al. Nov 2016 A1
20160338785 Kokish et al. Nov 2016 A1
20160346049 Allen et al. Dec 2016 A1
20160367324 Sato et al. Dec 2016 A1
20160374541 Agrawal et al. Dec 2016 A1
20170000577 Bowling et al. Jan 2017 A1
20170007279 Sharma Jan 2017 A1
20170007337 Dan Jan 2017 A1
20170020615 Koenig et al. Jan 2017 A1
20170049471 Gaffney et al. Feb 2017 A1
20170055995 Weir et al. Mar 2017 A1
20170065227 Marrs et al. Mar 2017 A1
20170065357 Schuh Mar 2017 A1
20170065363 Schuh et al. Mar 2017 A1
20170065364 Schuh et al. Mar 2017 A1
20170065365 Schuh Mar 2017 A1
20170071584 Suigetsu et al. Mar 2017 A1
20170086934 Devengenzo et al. Mar 2017 A1
20170095234 Prisco et al. Apr 2017 A1
20170095295 Overmyer Apr 2017 A1
20170100199 Yu et al. Apr 2017 A1
20170119411 Shah May 2017 A1
20170119412 Noonan et al. May 2017 A1
20170119413 Romo May 2017 A1
20170119481 Romo et al. May 2017 A1
20170135706 Frey et al. May 2017 A1
20170151028 Ogawa et al. Jun 2017 A1
20170151416 Kutikov et al. Jun 2017 A1
20170165009 Chaplin et al. Jun 2017 A1
20170165011 Bovay et al. Jun 2017 A1
20170172553 Chaplin et al. Jun 2017 A1
20170172673 Yu et al. Jun 2017 A1
20170172680 Bowling et al. Jun 2017 A1
20170202627 Sramek et al. Jul 2017 A1
20170202827 Genkin et al. Jul 2017 A1
20170209073 Sramek et al. Jul 2017 A1
20170252096 Felder et al. Sep 2017 A1
20170258534 Hourtash et al. Sep 2017 A1
20170265923 Privitera et al. Sep 2017 A1
20170265954 Burbank et al. Sep 2017 A1
20170281049 Yamamoto et al. Oct 2017 A1
20170290631 Lee et al. Oct 2017 A1
20170319289 Neff Nov 2017 A1
20170325932 Hoelzle Nov 2017 A1
20170333147 Bernstein Nov 2017 A1
20170333679 Jiang et al. Nov 2017 A1
20170340396 Romo et al. Nov 2017 A1
20170365055 Mintz et al. Dec 2017 A1
20170367782 Schuh et al. Dec 2017 A1
20180000563 Shanjani et al. Jan 2018 A1
20180025666 Ho et al. Jan 2018 A1
20180042464 Arai et al. Feb 2018 A1
20180042686 Peine Feb 2018 A1
20180049792 Eckert et al. Feb 2018 A1
20180049824 Harris et al. Feb 2018 A1
20180055583 Schuh et al. Mar 2018 A1
20180056044 Choi et al. Mar 2018 A1
20180079090 Koenig et al. Mar 2018 A1
20180080841 Cordoba et al. Mar 2018 A1
20180104820 Troy et al. Apr 2018 A1
20180116735 Tierney et al. May 2018 A1
20180140371 Hares et al. May 2018 A1
20180168681 Kirk et al. Jun 2018 A1
20180177383 Noonan et al. Jun 2018 A1
20180177556 Noonan Jun 2018 A1
20180177561 Mintz et al. Jun 2018 A1
20180193049 Heck et al. Jul 2018 A1
20180206927 Prisco et al. Jul 2018 A1
20180206931 Scheib Jul 2018 A1
20180214011 Graetzel et al. Aug 2018 A1
20180221038 Noonan et al. Aug 2018 A1
20180221039 Shah Aug 2018 A1
20180243048 Shan et al. Aug 2018 A1
20180250083 Schuh et al. Sep 2018 A1
20180250085 Simi et al. Sep 2018 A1
20180271616 Schuh et al. Sep 2018 A1
20180279852 Rafii-Tari et al. Oct 2018 A1
20180280660 Landey et al. Oct 2018 A1
20180289431 Draper et al. Oct 2018 A1
20180296285 Simi et al. Oct 2018 A1
20180296299 Iceman Oct 2018 A1
20180303566 Soundararajan et al. Oct 2018 A1
20180325499 Landey et al. Nov 2018 A1
20180326181 Kokish et al. Nov 2018 A1
20180333044 Jenkins Nov 2018 A1
20180360435 Romo Dec 2018 A1
20190000559 Berman et al. Jan 2019 A1
20190000560 Berman et al. Jan 2019 A1
20190000566 Graetzel et al. Jan 2019 A1
20190000576 Mintz et al. Jan 2019 A1
20190015166 Mahoney et al. Jan 2019 A1
20190017320 Lombardini Jan 2019 A1
20190069962 Tabandeh et al. Mar 2019 A1
20190083183 Moll et al. Mar 2019 A1
20190099231 Bruehwiler et al. Apr 2019 A1
20190099232 Soto et al. Apr 2019 A1
20190105776 Ho et al. Apr 2019 A1
20190105785 Meyer et al. Apr 2019 A1
20190107454 Lin et al. Apr 2019 A1
20190110839 Rafii-Tari et al. Apr 2019 A1
20190110843 Ummalaneni Apr 2019 A1
20190117324 Hibner et al. Apr 2019 A1
20190125465 Evans et al. May 2019 A1
20190142537 Covington May 2019 A1
20190151148 Alvarez et al. May 2019 A1
20190167366 Ummalaneni et al. Jun 2019 A1
20190175009 Mintz et al. Jun 2019 A1
20190175062 Rafii-Tari et al. Jun 2019 A1
20190175287 Hill et al. Jun 2019 A1
20190175799 Hsu et al. Jun 2019 A1
20190183585 Rafii-Tari et al. Jun 2019 A1
20190183587 Rafii-Tari et al. Jun 2019 A1
20190191967 Yamamoto et al. Jun 2019 A1
20190192249 Bowling et al. Jun 2019 A1
20190216548 Ummalaneni Jul 2019 A1
20190216550 Eyre et al. Jul 2019 A1
20190216576 Eyre et al. Jul 2019 A1
20190223967 Abbott et al. Jul 2019 A1
20190223974 Romo et al. Jul 2019 A1
20190228525 Mintz et al. Jul 2019 A1
20190228528 Mintz et al. Jul 2019 A1
20190231458 DiMaio et al. Aug 2019 A1
20190231460 DiMaio et al. Aug 2019 A1
20190239890 Stokes et al. Aug 2019 A1
20190246882 Graetzel et al. Aug 2019 A1
20190262086 Connolly et al. Aug 2019 A1
20190269468 Hsu et al. Sep 2019 A1
20190274764 Romo Sep 2019 A1
20190290109 Agrawal et al. Sep 2019 A1
20190298160 Ummalaneni et al. Oct 2019 A1
20190298460 Al-Jadda et al. Oct 2019 A1
20190298464 Abbott Oct 2019 A1
20190298465 Chin et al. Oct 2019 A1
20190298469 Ramstad et al. Oct 2019 A1
20190314616 Moll et al. Oct 2019 A1
20190328213 Landey et al. Oct 2019 A1
20190336238 Yu et al. Nov 2019 A1
20190365201 Noonan et al. Dec 2019 A1
20190365209 Ye et al. Dec 2019 A1
20190365479 Rafii-Tari Dec 2019 A1
20190365486 Srinivasan et al. Dec 2019 A1
20190374297 Wallace et al. Dec 2019 A1
20190375383 Auer Dec 2019 A1
20190380787 Ye et al. Dec 2019 A1
20190380797 Yu et al. Dec 2019 A1
20200000533 Schuh et al. Jan 2020 A1
20200000855 Xu et al. Jan 2020 A1
20200008874 Barbagli et al. Jan 2020 A1
20200022767 Hill et al. Jan 2020 A1
20200030046 Bowling et al. Jan 2020 A1
20200034526 Asokan et al. Jan 2020 A1
20200038123 Graetzel et al. Feb 2020 A1
20200039086 Meyer et al. Feb 2020 A1
20200040537 Groeneweg Feb 2020 A1
20200046434 Graetzel et al. Feb 2020 A1
20200054408 Schuh et al. Feb 2020 A1
20200055801 Fish et al. Feb 2020 A1
20200060516 Baez, Jr. et al. Feb 2020 A1
20200085516 DeFonzo et al. Mar 2020 A1
20200086087 Hart et al. Mar 2020 A1
20200091799 Covington et al. Mar 2020 A1
20200093549 Chin et al. Mar 2020 A1
20200093554 Schuh et al. Mar 2020 A1
20200100845 Julian Apr 2020 A1
20200100853 Ho et al. Apr 2020 A1
20200100855 Leparmentier et al. Apr 2020 A1
20200101264 Jiang Apr 2020 A1
20200107894 Wallace et al. Apr 2020 A1
20200121502 Kintz Apr 2020 A1
20200129252 Kokish et al. Apr 2020 A1
20200138531 Chaplin May 2020 A1
20200146769 Eyre et al. May 2020 A1
20200155245 Yu May 2020 A1
20200163726 Tanner et al. May 2020 A1
20200170720 Ummalaneni Jun 2020 A1
20200171660 Ho et al. Jun 2020 A1
20200188043 Yu et al. Jun 2020 A1
20200197109 Chaplin Jun 2020 A1
20200197112 Chin et al. Jun 2020 A1
20200206472 Ma et al. Jul 2020 A1
20200217733 Lin et al. Jul 2020 A1
20200222134 Schuh et al. Jul 2020 A1
20200230360 Yu et al. Jul 2020 A1
20200237458 Defonzo et al. Jul 2020 A1
20200261172 Romo et al. Aug 2020 A1
20200268459 Noonan Aug 2020 A1
20200268460 Tse et al. Aug 2020 A1
20200281787 Ruiz Sep 2020 A1
20200297437 Schuh et al. Sep 2020 A1
20200297444 Camarillo et al. Sep 2020 A1
20200305922 Yan et al. Oct 2020 A1
20200305983 Yampolsky et al. Oct 2020 A1
20200305989 Schuh et al. Oct 2020 A1
20200305992 Schuh et al. Oct 2020 A1
20200315717 Bovay et al. Oct 2020 A1
20200315723 Hassan et al. Oct 2020 A1
20200323596 Moll et al. Oct 2020 A1
20200330167 Romo et al. Oct 2020 A1
20200345216 Jenkins Nov 2020 A1
20200352420 Graetzel et al. Nov 2020 A1
20200360183 Alvarez et al. Nov 2020 A1
20200367726 Landey et al. Nov 2020 A1
20200367981 Ho et al. Nov 2020 A1
20200375678 Wallace et al. Dec 2020 A1
20200383735 Lewis et al. Dec 2020 A1
20200405411 Draper et al. Dec 2020 A1
20200405413 Kokish et al. Dec 2020 A1
20200405419 Mao et al. Dec 2020 A1
20200405420 Purohit et al. Dec 2020 A1
20200405423 Schuh Dec 2020 A1
20200405424 Schuh Dec 2020 A1
20200405434 Schuh et al. Dec 2020 A1
20200406002 Romo et al. Dec 2020 A1
20210007819 Schuh et al. Jan 2021 A1
20210008341 Landey et al. Jan 2021 A1
20210045819 Castillo et al. Feb 2021 A1
20210045822 Landey et al. Feb 2021 A1
20210045823 Landey et al. Feb 2021 A1
20210045824 Landey et al. Feb 2021 A1
20210059766 Graetzel et al. Mar 2021 A1
20210121052 Graetzel et al. Apr 2021 A1
20210169588 Graetzel et al. Jun 2021 A1
20210178032 Hsu et al. Jun 2021 A1
20210196293 Lin et al. Jul 2021 A1
20210196312 Plewe et al. Jul 2021 A1
20210196399 Ayvali et al. Jul 2021 A1
20210196410 Hsu et al. Jul 2021 A1
Foreign Referenced Citations (141)
Number Date Country
2017336790 Apr 2019 AU
2018243364 Oct 2019 AU
2018290831 Dec 2019 AU
2018292606 Jan 2020 AU
2018347472 Apr 2020 AU
2018378808 May 2020 AU
2019347767 Apr 2021 AU
2021204979 Aug 2021 AU
101161426 Apr 2008 CN
101443069 May 2009 CN
100515347 Jul 2009 CN
101495023 Jul 2009 CN
102015759 Apr 2011 CN
201884596 Jun 2011 CN
102316817 Jan 2012 CN
102327118 Jan 2012 CN
102458295 May 2012 CN
102665590 Sep 2012 CN
102834043 Dec 2012 CN
102973317 Mar 2013 CN
103037799 Apr 2013 CN
103298414 Sep 2013 CN
103735313 Apr 2014 CN
104619281 May 2015 CN
102947730 Jul 2015 CN
105005103 Oct 2015 CN
105147393 Dec 2015 CN
105559850 May 2016 CN
105559886 May 2016 CN
205729413 Nov 2016 CN
108882837 Nov 2018 CN
108990412 Dec 2018 CN
110831653 Feb 2020 CN
110868903 Mar 2020 CN
110891514 Mar 2020 CN
111386450 Jul 2020 CN
111432856 Jul 2020 CN
112472007 Mar 2021 CN
108369450 Apr 2021 CN
112770690 May 2021 CN
112804946 May 2021 CN
19649082 Jan 1998 DE
102004020465 Sep 2005 DE
102015016152 Jun 2017 DE
1321106 Jun 2003 EP
1442720 Aug 2004 EP
1321106 Oct 2005 EP
1849423 Oct 2007 EP
1849423 Dec 2007 EP
2043501 Apr 2009 EP
2046227 Apr 2009 EP
2210066 Jul 2010 EP
2239600 Oct 2010 EP
2567670 Mar 2013 EP
3025630 Jun 2016 EP
3387514 Jun 2019 EP
3518724 Aug 2019 EP
2577363 Jul 2020 EP
3676587 Jul 2020 EP
3752085 Dec 2020 EP
3600031 Jan 2021 EP
3644820 Mar 2021 EP
3645100 Mar 2021 EP
3820373 May 2021 EP
3856064 Aug 2021 EP
3684438 Sep 2021 EP
07136173 May 1995 JP
2005270464 Oct 2005 JP
2009139187 Jun 2009 JP
2010046384 Mar 2010 JP
2014159071 Sep 2014 JP
2015181495 Oct 2015 JP
2020512102 Apr 2020 JP
2020526252 Aug 2020 JP
2020526254 Aug 2020 JP
2019531807 Nov 2020 JP
2020536754 Dec 2020 JP
2021505287 Feb 2021 JP
2021513436 May 2021 JP
1020190119541 Oct 2019 KR
20190134968 Dec 2019 KR
20200023640 Mar 2020 KR
20200024873 Mar 2020 KR
20200071744 Jun 2020 KR
20200099127 Aug 2020 KR
20200122337 Oct 2020 KR
20210042134 Apr 2021 KR
20210073542 Jun 2021 KR
102297011 Sep 2021 KR
9414494 Jul 1994 WO
9622591 Jul 1996 WO
200274178 Sep 2002 WO
2007005976 Jan 2007 WO
2007088208 Aug 2007 WO
2007146987 Dec 2007 WO
2008014425 Jan 2008 WO
2008031077 Mar 2008 WO
2008017080 Oct 2008 WO
2008157399 Dec 2008 WO
2008101228 Jan 2009 WO
2009023801 Feb 2009 WO
2009064629 May 2009 WO
2009092059 Jul 2009 WO
2010133982 Nov 2010 WO
2010133982 Jan 2011 WO
2011005335 Jan 2011 WO
2011008922 Jan 2011 WO
2011150526 Dec 2011 WO
2011161218 Dec 2011 WO
2012037506 Mar 2012 WO
2012167043 Jan 2013 WO
2013107468 Jul 2013 WO
2013130895 Sep 2013 WO
2013179600 Dec 2013 WO
2015127231 Aug 2015 WO
2015153174 Oct 2015 WO
2016137612 Sep 2016 WO
2017059412 Apr 2017 WO
2017075574 May 2017 WO
2017097399 Jun 2017 WO
2017114855 Jul 2017 WO
2017151993 Sep 2017 WO
20171156070 Sep 2017 WO
2018064394 Apr 2018 WO
2018069679 Apr 2018 WO
2018094191 May 2018 WO
2018183727 Oct 2018 WO
2018189722 Oct 2018 WO
2019005872 Jan 2019 WO
2019005992 Jan 2019 WO
2019074669 Apr 2019 WO
2019113389 Jun 2019 WO
2019160865 Aug 2019 WO
2020033318 Feb 2020 WO
2020069430 Apr 2020 WO
2021028889 Feb 2021 WO
2021044297 Mar 2021 WO
2021137071 Jul 2021 WO
2021137106 Jul 2021 WO
2021137108 Jul 2021 WO
2021137109 Jul 2021 WO
Non-Patent Literature Citations (30)
Entry
International search report for PCT/IB2020/061905, dated Mar. 18, 2021, 8 pages.
Written opinion for PCT/IB2020/061905, dated Mar. 18, 2021, 6 pages.
Invitation to Pay Additional Fees in PCT appl No. PCT/US16/059686, dated Dec. 8, 2016.
Office action for U.S. Appl. No. 15/948,351, dated May 10, 2021, 11 pages.
Office action for U.S. Appl. No. 15/948,369, dated May 25, 2021, 14 pages.
Office action for U.S. Appl. No. 15/948,369, dated Nov. 20, 2020, 14 pages.
Advisory action for U.S. Appl. No. 15/948,351, dated Sep. 1, 2021, 3 pages.
AU Examination report for appl. No. 2016343849, dated Mar. 16, 2021, 5 pages.
Extended European Search Report for appl No. 16861031.9, dated Jun. 12, 2019.
Hernansanz et al., 2015, A multi-robot cooperation strategy for dexterous task oriented teleoperation, 2015, ELSEVIER, Robotics and Autonomous Systems, 68(205):156-172.
International search report and written opinion for appl No. PCT/IB2020/057704, dated Dec. 18, 2020, 11 pages.
JP office action dated Dec. 1, 2020, for patent appl No. 2018-521951, 4 pages.
Mayo Clinic, Robotic Surgery, https://www.mayoclinic.org/tests-procedures/robotic-surgery/about/pac- 20394974?p=1, downloaded from the internet on Jul. 12, 2018, 2 pp.
Notice of Acceptance for appl No. 2016343849, dated Aug. 5, 2021, 3 pages.
Notice of allowance for U.S. Appl. No. 15/948,369, dated Aug. 17, 2021, 10 pages.
Notice of allowance for U.S. Appl. No. 16/355,437, dated Oct. 28, 2021, 10 pages.
Office action for U.S. Appl. No. 15/948,351, dated Nov. 16, 2020, 10 pages.
Office action for U.S. Appl. No. 15/948,351, dated Sep. 29, 2021, 17 pages.
Office action for U.S. Appl. No. 16/865,904, dated Sep. 17, 2021, 13 pages.
Ramezanifard et al, 2007, A Novel Modeling Approach for Collision Avoidance in Robotic Surgery, 2007 Science Publications, American Journal of Applied Sciences 4(9):693-699, 7 pages.
International Preliminary Report on Patentability for PCT/US2016/059686 dated Feb. 17, 2017, 7 pages.
International Search Report for PCT/IB2020/057704 dated Feb. 18, 2021, 16 pages.
International Search Report for PCT/US2016/059686 dated Feb. 17, 2017, 4 pages.
Mayo Clinic, Robotic Surgery, https://www.mayoclinic.org/tests-procedures/robotic-surgery/about/pac- 20394974?p=1, downloaded from the internet on Jul. 12, 2018, 2 pgs.
Notice of Allowance for U.S. Appl. No. 15/948,369, dated Dec. 14, 2021, 11 pages.
PCT Invitation to Pay Additional Fees, PCT Application No. PCT/US2016/059686, Dec. 8, 2016, 2 pages.
U.S. Appl. No. 62/248,737, filed Oct. 30, 2015, Inventors: David P. Noonan et al., 76 pages.
Written Opinion of the International Searching Authority for PCT/IB2020/057704 dated Feb. 18, 2021, 9 pages.
Written Opinion of the International Searching Authority for PCT/US2016/059686 dated Feb. 17, 2017, 6 pages.
Notice of Allowance for U.S. Appl. No. 15/948,351, dated Mar. 2, 2022, 10 pages.
Related Publications (1)
Number Date Country
20210196293 A1 Jul 2021 US
Provisional Applications (1)
Number Date Country
62956071 Dec 2019 US