Various technologies including computing technologies, robotic technologies, medical technologies, and extended reality technologies (e.g., augmented reality technologies, virtual reality technologies, etc.) have made it possible for users such as surgeons to perform, and be trained to perform, various types of medical procedures. For example, users may perform and be trained to perform minimally-invasive medical procedures such as computer-assisted medical procedures in clinical settings (e.g., procedures on bodies of live human or animal patients), in non-clinical settings (e.g., procedures on bodies of human or animal cadavers, bodies of tissue removed from human or animal anatomies, etc.), in training settings (e.g., procedures on bodies of physical anatomical training models, bodies of virtual anatomy models in extended reality environments, etc.), and so forth.
During a procedure in any such setting, a user may view imagery of an anatomical space associated with a body (e.g., an area internal to the body) as the user controls instruments of a computer-assisted medical system to perform the procedure. The user may control the instruments using various input mechanisms on or coupled to the computer-assisted medical system. The input mechanisms may affect how efficiently and/or effectively the user is able to perform a procedure.
The following description presents a simplified summary of one or more aspects of the systems and methods described herein. This summary is not an extensive overview of all contemplated aspects and is intended to neither identify key or critical elements of all aspects nor delineate the scope of any or all aspects. Its sole purpose is to present one or more aspects of the systems and methods described herein as a prelude to the detailed description that is presented below.
An illustrative system includes a memory storing instructions and a processor communicatively coupled to the memory and configured to execute the instructions to direct a computer-assisted medical system, which includes a set of controls comprising a first control and a second control configured to receive user input, to operate in a manipulation input mode in which user input to the first control manipulates a first manipulator of the computer-assisted medical system, and user input to the second control manipulates a second manipulator of the computer-assisted medical system; and direct the computer-assisted medical system to operate in a hybrid input mode in which user input to the first control manipulates the first manipulator, and user input to the second control adjusts a parameter setting associated with a medical procedure.
An illustrative method includes a processor (e.g., a processor of a user input system) directing a computer-assisted medical system, which includes a set of controls comprising a first control and a second control configured to receive user input, to operate in a manipulation input mode in which user input to the first control manipulates a first manipulator of the computer-assisted medical system, and user input to the second control manipulates a second manipulator of the computer-assisted medical system; and directing the computer-assisted medical system to operate in a hybrid input mode in which user input to the first control manipulates the first manipulator, and user input to the second control adjusts a parameter setting associated with a medical procedure.
An illustrative computer-readable medium includes instructions that, when executed by a processor, cause the processor to direct a computer-assisted medical system, which includes a set of controls comprising a first control and a second control configured to receive user input, to operate in a manipulation input mode in which user input to the first control manipulates a first manipulator of the computer-assisted medical system, and user input to the second control manipulates a second manipulator of the computer-assisted medical system; and direct the computer-assisted medical system to operate in a hybrid input mode in which user input to the first control manipulates the first manipulator, and user input to the second control adjusts a parameter setting associated with a medical procedure.
An illustrative system includes a memory storing instructions and a processor communicatively coupled to the memory and configured to execute the instructions to direct a computer-assisted medical system, which includes a set of controls comprising a first control and a second control configured to receive user input, to operate in a manipulation input mode in which the first control is configured to receive a first input to manipulate a first manipulator of the computer-assisted medical system, and the second control is configured to receive a second input to manipulate a second manipulator of the computer-assisted medical system; and direct the computer-assisted medical system to operate in a hybrid input mode in which the first control is configured to receive a third input to manipulate the first manipulator, and the second control is configured to receive a fourth input to adjust a parameter setting associated with a medical procedure.
An illustrative method includes a processor (e.g., a processor of a user input system) directing a computer-assisted medical system, which includes a set of controls comprising a first control and a second control configured to receive user input, to operate in a manipulation input mode in which the first control is configured to receive a first input to manipulate a first manipulator of the computer-assisted medical system, and the second control is configured to receive a second input to manipulate a second manipulator of the computer-assisted medical system; and directing the computer-assisted medical system to operate in a hybrid input mode in which the first control is configured to receive a third input to manipulate the first manipulator, and the second control is configured to receive a fourth input to adjust a parameter setting associated with a medical procedure.
An illustrative computer-readable medium includes instructions that, when executed by a processor, cause the processor to direct a computer-assisted medical system, which includes a set of controls comprising a first control and a second control configured to receive user input, to operate in a manipulation input mode in which the first control is configured to receive a first input to manipulate a first manipulator of the computer-assisted medical system, and the second control is configured to receive a second input to manipulate a second manipulator of the computer-assisted medical system; and direct the computer-assisted medical system to operate in a hybrid input mode in which the first control is configured to receive a third input to manipulate the first manipulator, and the second control is configured to receive a fourth input to adjust a parameter setting associated with a medical procedure.
The accompanying drawings illustrate various embodiments and are a part of the specification. The illustrated embodiments are merely examples and do not limit the scope of the disclosure. Throughout the drawings, identical or similar reference numbers designate identical or similar elements.
User input systems and methods for a computer-assisted medical system are described herein. During a computer-assisted medical procedure, a user (e.g., a surgeon) may control (e.g., teleoperate) instruments through a set of controls that receive inputs configured to cause the instruments to be manipulated by the computer-assisted medical system. For example, user inputs received through the set of controls may cause elements of the computer-assisted medical system, such as manipulators of the computer-assisted medical system, to be manipulated or otherwise controlled in a manner that manipulates instruments connected to the manipulators.
The set of controls may be configured to operate in a manipulation input mode in which user inputs to the controls (e.g., manipulation of the controls by a user) may translate to manipulation of the instruments and/or manipulators by the computer-assisted medical system. As described herein, the set of controls may be further configured to operate in a hybrid input mode in which user input to one of the controls causes the computer-assisted medical system to manipulate an instrument and/or a manipulator and user input to another one of the controls causes one or more parameter settings associated with the computer-assisted medical procedure to be adjusted. When operating in the hybrid input mode, for example, user inputs to one control may be translated to manipulation (e.g., movement) of the instrument and/or manipulator while user inputs to another control may be translated to adjustment of a parameter setting associated with the medical procedure. Examples of such a parameter setting are described herein.
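The mode-dependent routing described above can be sketched as follows. This is an illustrative Python sketch only, assuming a simple two-control system; the names `InputMode`, `RoutedInput`, and `route_input` are hypothetical and not part of the system described:

```python
from dataclasses import dataclass
from enum import Enum, auto

class InputMode(Enum):
    MANIPULATION = auto()
    HYBRID = auto()

@dataclass
class RoutedInput:
    target: str      # destination of the input
    payload: tuple   # raw input data (e.g., a pose change)

def route_input(mode: InputMode, control_id: int, payload: tuple) -> RoutedInput:
    # In the hybrid input mode, input to the second control is routed
    # to parameter-setting adjustment instead of its manipulator.
    if mode is InputMode.HYBRID and control_id == 2:
        return RoutedInput("parameter_setting", payload)
    # Otherwise, each control drives its corresponding manipulator.
    return RoutedInput(f"manipulator_{control_id}", payload)
```

In this sketch, the same physical input on the second control produces different effects depending solely on the active input mode, which is the defining property of the hybrid input mode.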
Systems and methods described herein may provide various advantages and benefits. For example, systems and methods described herein provide a hybrid input mode in which a set of user controls is configured to receive inputs for manipulating an instrument while adjusting a parameter setting associated with a medical procedure, such as a parameter setting of the instrument. The hybrid input mode may increase efficiency and/or effectiveness of receiving inputs to a computer-assisted medical system when compared to conventional user input systems for computer-assisted medical systems (e.g., conventional user input systems in which manipulation of instruments and adjustment of parameter settings must be performed at different times and/or in separate input modes, conventional user input systems in which separate input mechanisms other than a set of controls used to manipulate instruments must be used to adjust parameter settings, etc.). Such an increase in efficiency and/or effectiveness of receiving inputs may result in more efficient and/or effective medical procedures, such as by facilitating more efficient and/or effective operation of a medical instrument. These and other advantages and benefits of systems and methods described herein will be made apparent herein.
Various embodiments will now be described in more detail with reference to the figures. The disclosed systems and methods may provide one or more of the benefits mentioned above and/or various additional and/or alternative benefits that will be made apparent herein.
As shown in
Storage facility 102 may maintain (e.g., store) executable data used by processing facility 104 to perform any of the functionality described herein. For example, storage facility 102 may store instructions 106 that may be executed by processing facility 104 to perform one or more of the operations described herein. Instructions 106 may be implemented by any suitable application, software, code, and/or other executable data instance. Storage facility 102 may also maintain any data received, generated, managed, used, and/or transmitted by processing facility 104.
Processing facility 104 may be configured to perform (e.g., execute instructions 106 stored in storage facility 102 to perform) various operations associated with a user input system for a computer-assisted medical system. Examples of such operations that may be performed by system 100 (e.g., by processing facility 104 of system 100) are described herein. In the description that follows, any references to functions performed by system 100 may be understood to be performed by processing facility 104 based on instructions 106 stored in storage facility 102.
Controls 204 may be implemented in any suitable manner to receive user input for manipulating manipulators 206. For instance, controls 204 may be implemented by a set of master controls, an example of which is described in relation to
Manipulators 206 may be implemented in any suitable manner to manipulate and/or control medical instruments based on user input received by controls 204. Medical instruments may include any device, tool, or other instrument that may be used in a medical procedure, such as surgical instruments, non-surgical instruments, imaging devices, etc. Manipulators 206 may include any suitable mechanism that translates input from controls 204 to operation of medical instruments coupled to the manipulators 206. For example, manipulators 206 may include configurations of manipulator arms, motorized joints, and/or other manipulator components that are configured to be moved based on user input received by controls 204. Medical instruments may be coupled to manipulator arms such that movement of the manipulator arms causes movement of the medical instruments.
Configuration 200 illustrates user input system 100 configuring computer-assisted medical system 202 to operate in a manipulation input mode 208-1. In manipulation input mode 208-1, computer-assisted medical system 202 may be configured to translate user input received by way of controls 204 to manipulation (e.g., movement) of manipulators 206 and/or instruments connected to manipulators 206. For instance, computer-assisted medical system 202 may receive a first input on first control 204-1 and a second input on second control 204-2. User input system 100 may translate the first input to a manipulation of first manipulator 206-1 and the second input to a manipulation of second manipulator 206-2. As an example, the first input may be a movement of first control 204-1 from right to left. In response, user input system 100 may translate the movement of first control 204-1 to a corresponding movement of an instrument coupled to first manipulator 206-1 (e.g., a movement of the instrument a corresponding amount from right to left based on the input movement). The second input may be a counterclockwise twisting of second control 204-2. In response, user input system 100 may translate the twisting of second control 204-2 to a corresponding rotation of another instrument coupled to second manipulator 206-2 (e.g., a counterclockwise rotation by an amount corresponding to the counterclockwise twisting input). Any other suitable manipulations (e.g., any changes in pose such as changes in position and/or orientation) of controls 204 through user inputs (e.g., movements of hands, wrists, arms, etc., a movement of fingers attached to finger portions of controls 204, an actuation of a button or other input mechanism to send a control signal, etc.) may be considered input received by controls 204 and may be translated into corresponding manipulations of manipulators 206 and/or instruments coupled to manipulators 206 by user input system 100 operating in manipulation input mode 208-1.
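The translation of a control pose change into a corresponding instrument pose change can be illustrated with a minimal sketch. The function name and the use of a uniform motion-scaling factor are assumptions for illustration, not details of the system described:

```python
def translate_motion(control_delta, scale=0.5):
    """Map a change in control pose (e.g., dx, dy, dz, droll) to a
    corresponding, scaled change in manipulator/instrument pose.
    A scale below 1.0 yields finer instrument motion than the hand
    motion that produced it."""
    return tuple(scale * d for d in control_delta)
```

For example, a hand movement of 2.0 units to the left would, with the default scale, move the instrument 1.0 unit to the left.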
As used herein, manipulation of a control, a manipulator, and/or an instrument generally refers to physical movement of the control, the manipulator, and/or the instrument. In certain implementations, for example, user input may be provided to a control that causes the control to be moved within a space. User input system 100 may translate the user input to manipulation of a manipulator and/or an instrument coupled to the manipulator.
Parameter settings associated with the medical procedure may include any adjustable properties and/or characteristics of the computer-assisted medical system, adjustable properties and/or characteristics of any of the medical instruments involved with the medical procedure, and/or adjustable properties and/or characteristics of any data provided to or by the computer-assisted medical system (e.g., display properties, user interface properties, etc.). For example, a parameter setting associated with the medical procedure may include a display parameter setting of a display device associated with the medical procedure (e.g., a brightness, color, saturation, hue, image layout, image format, etc. associated with the display device and/or content displayed by the display device such as a three-dimensional (3D) model or any other content), an imaging parameter setting of an imaging device and/or an image processing process associated with the medical procedure (e.g., one or more auto-exposure settings of an imaging device), an operation parameter setting of a medical instrument associated with the medical procedure, a user interface parameter setting of a user interface associated with the computer-assisted medical system, and/or any other adjustable parameter setting of the computer-assisted medical system and/or a medical instrument used in or otherwise associated with the medical procedure.
To illustrate, a user may operate a computer-assisted medical system to control a medical instrument such as an ultrasound probe located at an anatomical space associated with a medical procedure. Imagery of the anatomical space may be captured and presented in conjunction with the medical procedure.
When system 100 configures the computer-assisted medical system to operate in a manipulation input mode, first instrument 304-1 and second instrument 304-2 may be controlled by the first and second manipulators, which are in turn controlled by a user providing input via a set of controls of the computer-assisted medical system. For example, the user may provide input to a first control of the set of controls to manipulate the first manipulator to cause first instrument 304-1 to move (e.g., by changing a position and/or orientation of) ultrasound probe 306 in the anatomical space (e.g., so as to capture ultrasound imagery at different locations in the anatomical space, such as at different locations on anatomical object 302). The user may further provide input to a second control of the set of controls to manipulate the second manipulator to cause second instrument 304-2 to move in the anatomical space (e.g., to perform any suitable tasks in the anatomical space).
When system 100 configures the computer-assisted medical system to operate in a hybrid input mode, the user may manipulate first instrument 304-1 by providing input to the first control to manipulate the first manipulator. In this way, the user may continue to adjust the position and/or orientation of ultrasound probe 306 after system 100 switches from the manipulation input mode to the hybrid input mode during the medical procedure. In the hybrid input mode, the user may also adjust one or more parameter settings associated with the medical procedure by providing input to the second control. To this end, any user input that may be provided to the second control for manipulating the second manipulator during operation in the manipulation input mode may instead be translated, by system 100, to commands for adjusting parameter settings during operation in the hybrid input mode. For example, in the manipulation input mode, a clockwise twisting of the second control may result in a clockwise movement of the second manipulator and, accordingly, second instrument 304-2. In the hybrid input mode, the clockwise twisting of the second control may instead be translated to an adjustment to a parameter setting associated with the medical procedure, such as an increase in a value of a parameter setting associated with the medical procedure.
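The gesture-to-adjustment translation described above can be sketched as a simple mapping. This is an illustrative sketch under assumed names (`adjust_parameter`, the gesture strings, and the numeric step size are all hypothetical):

```python
def adjust_parameter(settings, name, gesture, step=1.0):
    """Translate a twisting input on the second control into an
    adjustment of the named parameter setting: a clockwise twist
    increases the value, a counterclockwise twist decreases it."""
    deltas = {"clockwise": step, "counterclockwise": -step}
    updated = dict(settings)  # leave the original settings untouched
    updated[name] += deltas[gesture]
    return updated
```

A clockwise twist while "brightness" is the selected setting would thus raise the brightness value by one step, mirroring the example in the text.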
In the hybrid input mode, system 100 may be configured to translate any user input received by the second control into an adjustment of a parameter setting in any suitable way. As an example, specific user inputs received by the second control may be translated into specific corresponding adjustments of a parameter setting (e.g., a clockwise twisting input is translated to an increase in a value of a parameter setting, a counterclockwise twisting input is translated to a decrease in a value of a parameter setting, a pinching input is translated to an adjustment of a parameter setting, etc.). As another example, specific user inputs received by the second control may be translated into operations in a user interface (e.g., a graphical user interface), which operations may be performed to adjust a parameter setting. For instance, user input received by the second control may cause a cursor to be moved and/or otherwise used within a graphical user interface to adjust a parameter setting. Examples of user interfaces that may be provided by system 100 and used to facilitate adjustment of parameter settings in the hybrid input mode will now be described.
Interface 400 further includes a user interface element 406 that includes icons 408-414 that indicate parameter settings that may be adjusted by the user via the second control in the hybrid input mode. In this example, icons 408-414 include a depth icon 408, a brightness icon 410, a Doppler icon 412, and a snapshot icon 414. Depth icon 408 may allow the user to adjust a depth of focus of ultrasound images 404 captured by ultrasound probe 306. Brightness icon 410 may allow the user to adjust a brightness of the display of ultrasound images 404. Doppler icon 412 may allow the user to change a mode of the ultrasound (e.g., turning on or off a Doppler mode). Snapshot icon 414 may allow the user to store still images or portions of imagery from ultrasound images 404 received from ultrasound probe 306. Thus, while the user is adjusting any of these (or any other suitable) parameter settings using the second control, the user may also move ultrasound probe 306 using the first control to capture ultrasound imagery from different locations in the anatomical space, such as different parts and/or angles of anatomical object 302.
For example,
As shown in
The interaction with interface 400 shown in
As another example,
While user interface element 406 shows four parameter settings that may be adjusted, more parameters, fewer parameters, or different parameters may be provided for adjustment in the hybrid input mode. For instance, other modes (e.g., modalities) of ultrasound may be adjusted in addition to or instead of Doppler mode. Further, any other suitable parameter settings associated with ultrasound imaging may be provided for adjustment, such as a gain, a frequency, an image width, an image processing setting, a color, etc.
In certain examples, the hybrid input mode may allow the user to adjust an overall brightness of interface 400, a brightness of a portion or portions of interface 400 (e.g., images 402, images 402 and ultrasound images 404, etc.), a size, position, and/or characteristic (e.g., transparency, color, visibility, etc.) of ultrasound images 404 shown in interface 400, etc. For instance, the user may desire to see more of image 402, e.g., to see more clearly a location of ultrasound probe 306 while in the hybrid input mode. The hybrid input mode may provide as an adjustable parameter setting the size, position, transparency, visibility, or any other suitable characteristic of ultrasound images 404 so that ultrasound images 404 take up less space on interface 400 and/or otherwise allow image 402 or elements of interest depicted in image 402 to be visible.
In a manipulation mode, the user may provide input to a first control of the set of controls to manipulate the first manipulator to cause first instrument 506-1 to move in the anatomical space. Similarly, the user may provide input to a second control of the set of controls to manipulate the second manipulator to cause second instrument 506-2 to move in the anatomical space. The user may further provide input to control the imaging device to change a view of image 502. For instance, the user may move the user's head, which may result in a corresponding movement of the imaging device to show a corresponding change in angle or view of image 502. Additionally or alternatively, the imaging device may change the view of image 502 to follow a movement of first instrument 506-1 and/or second instrument 506-2. Any other suitable input may be provided to change image 502.
In the manipulation mode, in response to a change in image 502 and/or the imaging device, image 508 may show a corresponding change to a view of 3D model 510. For instance, if the user causes the imaging device to zoom in to show a zoomed view of image 502, image 508 may show a similarly zoomed view of 3D model 510. If the user causes the imaging device to show a portion to the right of what is currently shown in image 502, image 508 may likewise show a portion of 3D model 510 to the right of what is currently shown in image 508. Thus, in the manipulation mode, the view of 3D model 510 shown in image 508 may track what is shown of anatomical object 504 in image 502.
When system 100 configures the computer-assisted medical system to operate in a hybrid input mode, the user may continue to manipulate first instrument 506-1 by providing input to the first control to manipulate the first manipulator. Input provided by the user to the second control may be translated to adjust a display parameter setting associated with the medical procedure.
For instance,
In certain examples, the parameters provided for adjustment may be dependent on context of the medical procedure. For example, the parameters provided for adjustment may be based on the instrument being controlled by the first control in the hybrid input mode. For instance, when an ultrasound probe is being controlled by user input to the first control in the hybrid input mode, parameters related to ultrasound and/or the ultrasound probe may be provided for adjustment by the second control. If a different instrument (e.g., a suturing tool) were being manipulated by the first control, different parameters related to the different instrument (e.g., a type of suture, a technique of suturing, etc.) may be presented for adjustment.
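The context-dependent selection of adjustable parameters described above can be sketched as a lookup keyed by the active instrument. The table contents echo the examples in the text; the function and instrument names are illustrative assumptions:

```python
# Hypothetical mapping from the instrument driven by the first
# control to the parameter settings offered on the second control.
PARAMETERS_BY_INSTRUMENT = {
    "ultrasound_probe": ("depth", "brightness", "doppler", "snapshot"),
    "suturing_tool": ("suture_type", "suturing_technique"),
}

def adjustable_parameters(instrument):
    """Return the parameter settings the second control may adjust,
    given which instrument the first control is currently driving."""
    return PARAMETERS_BY_INSTRUMENT.get(instrument, ())
```

Swapping the instrument on the first control would thus automatically change the menu of settings presented for adjustment.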
In some examples, system 100 may be configured to provide various mechanisms to assist a user of a computer-assisted medical system with operation in hybrid input mode. Such mechanisms may include mechanisms configured to mitigate risks and/or assist the user in handling an increased or different cognitive load than that associated with use of a manipulation input mode. For example, user input system 100 may include mechanisms to make abundantly clear when the user enters and exits the hybrid input mode. For instance, user input system 100 may be configured to provide a notification such as an alert upon entering and/or exiting the hybrid input mode. The notification may include one or more notifications and/or types of notifications, such as an audio notification, a visual notification, a haptic notification, etc.
In certain examples, user input system 100 may require relatively elaborate and/or deliberate input to enter and/or exit the hybrid input mode. For example, the input may include a series of inputs in a specified order, an input from an input mechanism that is not easily triggered, a plurality of inputs in succession on an input mechanism (e.g., a double-click, triple-click, etc.), a duration of time of input (e.g., holding a button for a predetermined amount of time, etc.), or any other suitable input that may mitigate an unintentional entering and/or exiting of the hybrid input mode.
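One of the deliberate-input checks mentioned above, holding a button for a predetermined duration, can be sketched as follows. The function name and the two-second threshold are assumptions for illustration:

```python
def is_deliberate_hold(press_time, release_time, hold_threshold=2.0):
    """Accept a button press as a mode-switch command only when it is
    held for at least hold_threshold seconds, so that brief accidental
    presses do not enter or exit the hybrid input mode."""
    return (release_time - press_time) >= hold_threshold
```

Analogous predicates could implement the other examples given, such as detecting a double-click within a short time window or matching a series of inputs in a specified order.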
In certain examples, user input system 100 may include mechanisms to make abundantly clear to the user that user input system 100 is operating in the hybrid input mode. For example, user input system 100 may provide a visualization of a user interface and/or images presented to the user in the hybrid input mode that is different from a visualization of a user interface and/or images presented to the user in the manipulation input mode and/or another mode of operation. Such different visualizations may include any suitable visual mechanisms, such as an addition of a prominent parameter setting user interface element, a changing of a visualization parameter (e.g., changing a color, changing to grayscale, changing a style of visualization, inverting the images, changing a size of the images, etc.), etc. Additionally or alternatively, such mechanisms may include an audio indicator indicating that the user is in the hybrid mode, such as a periodic beep or other alert, a playback of a sound for a duration of the hybrid mode, or any other suitable audio indicator.
In certain examples, user input system 100 may include mechanisms that transition the user into or out of the hybrid input mode. For example, user input system 100 may filter inputs received on controls for a predetermined amount of time upon entering and/or exiting the hybrid input mode. For instance, user input system 100 may filter out movements of the controls that exceed a threshold velocity and/or distance for an amount of time (e.g., a second, a portion of a second, several seconds, etc.) upon entering and/or exiting the hybrid input mode. Additionally or alternatively, user input system 100 may filter all movement (e.g., prevent all movement of controls and/or instruments) for a predetermined amount of time upon entering and/or exiting the hybrid input mode. Additionally or alternatively, user input system 100 may temporarily constrain movement of the controls, manipulators, and/or instruments, for instance, by setting a velocity of the controls, manipulators, and/or instruments upon entering and/or exiting the hybrid input mode. For example, user input system 100 may initially set the velocity of the controls, manipulators, and/or instruments to zero for a predetermined amount of time upon entering and/or exiting the hybrid input mode. User input system 100 may constrain movement of the controls, manipulators, and/or instruments in any other suitable manner.
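The transition filtering described above, constraining velocity during a settle window after a mode switch, can be sketched as follows. The function name, the one-second window, and the default zero velocity limit are illustrative assumptions:

```python
def clamp_transition_velocity(velocity, time_since_switch,
                              settle_time=1.0, max_velocity=0.0):
    """During a settle window after entering or exiting the hybrid
    input mode, clamp the commanded velocity to +/- max_velocity
    (zero by default, i.e., all motion is suppressed); after the
    window has elapsed, pass the velocity through unchanged."""
    if time_since_switch < settle_time:
        return max(-max_velocity, min(velocity, max_velocity))
    return velocity
```

Raising `max_velocity` above zero would implement the softer variant in the text, where only movements exceeding a threshold velocity are filtered out during the transition.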
In certain examples, user input system 100 may include mechanisms that restrict simultaneous operation of both controls in the hybrid input mode. For example, user input system 100 may filter inputs received on controls so that when user input is received on one control, user input system 100 may refrain from translating user input received on the other control. In this manner, user input system 100 may allow a user to either manipulate a manipulator or adjust a parameter setting one at a time but not simultaneously. User input system 100 may implement such a restriction in any suitable manner.
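The one-control-at-a-time restriction described above can be sketched as a small arbiter that admits input from only one control until it is released. The class and method names are hypothetical:

```python
class ExclusiveInputArbiter:
    """Admit input from only one control at a time in the hybrid
    input mode; input on the other control is ignored until the
    active control is released."""

    def __init__(self):
        self._active = None  # id of the control currently in use

    def accept(self, control_id, engaged):
        """Return True if input from control_id should be translated."""
        if not engaged:
            # Releasing the active control frees the arbiter.
            if self._active == control_id:
                self._active = None
            return False
        if self._active is None:
            self._active = control_id
        return self._active == control_id
```

With this arbiter, input on the second control is ignored while the first control is being used to manipulate the manipulator, and vice versa.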
User input system 100 may include any of the above-described example mechanisms, any combination or sub-combination thereof, and/or any other suitable mechanisms to clearly demarcate operation in the hybrid input mode so that the user is made fully aware of the effects of inputs provided at the controls.
In operation 602, a user input system may direct a computer-assisted medical system, which includes a set of controls comprising a first control and a second control configured to receive user input, to operate in a manipulation input mode in which user input to the first control manipulates a first manipulator of the computer-assisted medical system, and user input to the second control manipulates a second manipulator of the computer-assisted medical system. Operation 602 may be performed in any of the ways described herein.
In operation 604, the user input system may direct the computer-assisted medical system to operate in a hybrid input mode in which user input to the first control manipulates the first manipulator, and user input to the second control adjusts a parameter setting associated with a medical procedure. Operation 604 may be performed in any of the ways described herein.
In operation 702, the user input system operates in a manipulation input mode. For example, the user input system may configure the computer-assisted medical system to operate based on the manipulation input mode. Operation 702 may be performed in any of the ways described herein.
In operation 704, the user input system determines whether to change to operation in another input mode. The determination may be based on user input received by the user input system, and may include the user input system determining whether received user input matches a predefined command to change modes. If no, the user input system may return to operation 702 and continue to operate in the manipulation input mode. If yes, the user input system may move to operation 706 to change the active input mode from the manipulation input mode to the hybrid input mode.
In operation 706, the user input system operates in a hybrid input mode. For example, the user input system may configure the computer-assisted medical system to operate based on the hybrid input mode. Operation 706 may be performed in any of the ways described herein.
In operation 708, the user input system determines whether to change to operation in another input mode. Similar to operation 704, the determination may be based on user input received by the user input system, and may include the user input system determining whether received user input matches a predefined command to change modes. If no, the user input system may return to operation 706 and continue to operate in the hybrid input mode. If yes, the user input system may move to operation 702 to change the active input mode from the hybrid input mode to the manipulation input mode.
Any suitable user input may be defined and used to trigger a transition from the manipulation input mode to the hybrid input mode and/or a transition from the hybrid input mode to the manipulation input mode. The same user input or different user input may be used to trigger each transition.
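The mode-switch loop of operations 702 through 708 reduces to a small state transition. The sketch below assumes, purely for illustration, a single predefined command that toggles between the two modes (per the paragraph above, the same or different user input may trigger each transition).

```python
# Illustrative sketch of the transition logic in operations 704 and 708.
MANIPULATION, HYBRID = "manipulation", "hybrid"
MODE_CHANGE_COMMAND = "toggle_mode"  # hypothetical predefined command

def next_mode(active_mode: str, user_input: str) -> str:
    """Return the input mode that should be active after user_input."""
    if user_input == MODE_CHANGE_COMMAND:
        # Matching input toggles the active mode (operations 706 / 702).
        return HYBRID if active_mode == MANIPULATION else MANIPULATION
    # Non-matching input: remain in the current mode.
    return active_mode
```

A fuller implementation could use distinct commands per transition, or condition the transition on system state, without changing the overall loop structure.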
In operation 802, the user input system, while operating in a manipulation input mode, receives user input to a control. Operation 802 may be performed in any of the ways described herein.
In operation 804, the user input system manipulates an instrument based on the user input to the control received in the manipulation input mode. Operation 804 may be performed in any of the ways described herein.
In operation 806, the user input system, while operating in a hybrid input mode, receives user input to the control. In some examples, the input received may include a same type of input (e.g., a manipulation, an actuation of a button, etc.) as received in the manipulation input mode in operation 802. Operation 806 may be performed in any of the ways described herein.
In operation 808, the user input system adjusts a parameter setting based on the user input to the control received in the hybrid input mode. Operation 808 may be performed in any of the ways described herein.
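Operations 802 through 808 show that the same type of input to the same control has a different effect depending on the active mode. A hedged sketch, with hypothetical state keys (`instrument_position`, `parameter_setting`) and a scalar input standing in for an actual control manipulation:

```python
def handle_control_input(mode: str, delta: float, state: dict) -> dict:
    """Dispatch one unit of control input according to the active mode."""
    if mode == "manipulation":
        # Operation 804: manipulate the instrument based on the input.
        state["instrument_position"] += delta
    else:
        # Operation 808 (hybrid mode): adjust a parameter setting instead.
        state["parameter_setting"] += delta
    return state
```

The dispatch point is the only place the mode is consulted; the control itself and the type of input it produces are unchanged between modes.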
As shown, medical system 900 may include a manipulating system 902, a user control system 904, and an auxiliary system 906 communicatively coupled one to another. Medical system 900 may be utilized by a medical team to perform a computer-assisted medical procedure on a patient 908. As shown, the medical team may include a surgeon 910-1, an assistant 910-2, a nurse 910-3, and an anesthesiologist 910-4, all of whom may be collectively referred to as “medical team members 910.” Additional or alternative medical team members may be present during a medical session as may serve a particular implementation.
While
As shown in
Manipulator arms 912 and/or medical instruments attached to manipulator arms 912 may include one or more displacement transducers, orientational sensors, and/or positional sensors used to generate raw (i.e., uncorrected) kinematics information. One or more components of medical system 900 may be configured to use the kinematics information to track (e.g., determine positions of) and/or control the medical instruments.
User control system 904 may be configured to facilitate control by surgeon 910-1 of manipulator arms 912 and medical instruments attached to manipulator arms 912. For example, surgeon 910-1 may interact with user control system 904 to remotely manipulate manipulator arms 912 and the medical instruments. To this end, user control system 904 may provide surgeon 910-1 with imagery (e.g., high-definition 3D imagery) of a surgical area associated with patient 908 as captured by an imaging system (e.g., any of the medical imaging systems described herein). In certain examples, user control system 904 may include a stereo viewer having two displays where stereoscopic images of an anatomical space associated with patient 908 and generated by a stereoscopic imaging system may be viewed by surgeon 910-1. Surgeon 910-1 may utilize the imagery to perform one or more procedures with one or more medical instruments attached to manipulator arms 912.
To facilitate control of medical instruments, user control system 904 may include a set of master controls. These master controls may be manipulated by surgeon 910-1 to control movement of medical instruments (e.g., by utilizing robotic and/or teleoperation technology). The master controls may be configured to detect a wide variety of hand, wrist, and finger movements by surgeon 910-1. User control system 904 may be configured to receive, from the master controls, information such as position, movement, rotation, and interaction information to track movement of the master controls, and may translate the received information into corresponding movement and/or control of manipulator arms 912. In this way, surgeon 910-1 may use the master controls to manipulate medical instruments attached to manipulator arms 912 and intuitively perform a procedure with those instruments.
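One common design choice in teleoperation of this kind is motion scaling, in which tracked master-control displacements are scaled before being applied to the manipulators. The sketch below is illustrative only (the function name and scale factor are hypothetical, and the actual system's kinematic mapping is not described here):

```python
def translate_master_motion(master_delta, motion_scale=0.5):
    """Map a tracked (x, y, z) master-control displacement to a
    corresponding manipulator displacement via uniform motion scaling.
    The 0.5 factor is a hypothetical example value."""
    return tuple(motion_scale * d for d in master_delta)
```

Scaling down master motion in this way can let large, comfortable hand movements produce fine instrument movements.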
Auxiliary system 906 may include one or more computing devices configured to perform primary processing operations of medical system 900. In such configurations, the one or more computing devices included in auxiliary system 906 may control and/or coordinate operations performed by various other components (e.g., manipulating system 902 and user control system 904) of medical system 900. For example, a computing device included in user control system 904 may transmit instructions to manipulating system 902 by way of the one or more computing devices included in auxiliary system 906. As another example, auxiliary system 906 may receive, from manipulating system 902, and process image data representative of imagery captured by an imaging device attached to one of manipulator arms 912.
In some examples, auxiliary system 906 may be configured to present visual content to medical team members 910 who may not have access to the images provided to surgeon 910-1 at user control system 904. To this end, auxiliary system 906 may include a display monitor 914 configured to display one or more user interfaces, such as images (e.g., 2D images, 3D images) of the surgical area, information associated with patient 908 and/or the medical procedure, and/or any other visual content as may serve a particular implementation. For example, display monitor 914 may display images of the surgical area together with additional content (e.g., graphical content, contextual information, etc.) concurrently displayed with the images. In some embodiments, display monitor 914 is implemented by a touchscreen display with which medical team members 910 may interact (e.g., by way of touch gestures) to provide user input to medical system 900.
Manipulating system 902, user control system 904, and auxiliary system 906 may be communicatively coupled one to another in any suitable manner. For example, as shown in
Certain examples described herein are directed to implementations of system 100 with computer-assisted medical systems such as medical system 900. In such implementations, system 100 may be configured to selectively direct medical system 900 to operate in a manipulation input mode or a hybrid input mode as described herein. The active input mode governs how user input received by the master controls is translated to operations of medical system 900. In other examples, system 100 may be similarly implemented with other computer-assisted systems (e.g., surgical systems), robotic systems, etc.
In some examples, a non-transitory computer-readable medium storing computer-readable instructions may be provided in accordance with the principles described herein. The instructions, when executed by a processor of a computing device, may direct the processor and/or computing device to perform one or more operations, including one or more of the operations described herein. Such instructions may be stored and/or transmitted using any of a variety of known computer-readable media.
A non-transitory computer-readable medium as referred to herein may include any non-transitory storage medium that participates in providing data (e.g., instructions) that may be read and/or executed by a computing device (e.g., by a processor of a computing device). For example, a non-transitory computer-readable medium may include, but is not limited to, any combination of non-volatile storage media and/or volatile storage media. Illustrative non-volatile storage media include, but are not limited to, read-only memory, flash memory, a solid-state drive, a magnetic storage device (e.g., a hard disk, a floppy disk, magnetic tape, etc.), ferroelectric random-access memory (“RAM”), and an optical disc (e.g., a compact disc, a digital video disc, a Blu-ray disc, etc.). Illustrative volatile storage media include, but are not limited to, RAM (e.g., dynamic RAM).
As shown in
Communication interface 1002 may be configured to communicate with one or more computing devices. Examples of communication interface 1002 include, without limitation, a wired network interface (such as a network interface card), a wireless network interface (such as a wireless network interface card), a modem, an audio/video connection, and any other suitable interface.
Processor 1004 generally represents any type or form of processing unit capable of processing data and/or interpreting, executing, and/or directing execution of one or more of the instructions, processes, and/or operations described herein. Processor 1004 may perform operations by executing computer-executable instructions 1012 (e.g., an application, software, code, and/or other executable data instance) stored in storage device 1006.
Storage device 1006 may include one or more data storage media, devices, or configurations and may employ any type, form, and combination of data storage media and/or device. For example, storage device 1006 may include, but is not limited to, any combination of the non-volatile media and/or volatile media described herein. Electronic data, including data described herein, may be temporarily and/or permanently stored in storage device 1006. For example, data representative of computer-executable instructions 1012 configured to direct processor 1004 to perform any of the operations described herein may be stored within storage device 1006. In some examples, data may be arranged in one or more databases residing within storage device 1006.
I/O module 1008 may include one or more I/O modules configured to receive user input and provide user output. I/O module 1008 may include any hardware, firmware, software, or combination thereof supportive of input and output capabilities. For example, I/O module 1008 may include hardware and/or software for capturing user input, including, but not limited to, a keyboard or keypad, a touchscreen component (e.g., touchscreen display), a receiver (e.g., an RF or infrared receiver), motion sensors, and/or one or more input buttons.
I/O module 1008 may include one or more devices for presenting output to a user, including, but not limited to, a graphics engine, a display (e.g., a display screen), one or more output drivers (e.g., display drivers), one or more audio speakers, and one or more audio drivers. In certain embodiments, I/O module 1008 is configured to provide graphical data to a display for presentation to a user. The graphical data may be representative of one or more graphical user interfaces and/or any other graphical content as may serve a particular implementation.
In some examples, any of the facilities described herein may be implemented by or within one or more components of computing device 1000. For example, computer-executable instructions 1012 (e.g., one or more applications) residing within storage device 1006 may be configured to direct an implementation of processor 1004 to perform one or more operations or functions associated with processing facility 104 of system 100. Likewise, storage facility 102 of system 100 may be implemented by or within an implementation of storage device 1006.
One or more operations described herein may be performed in real time. As used herein, operations that are performed “in real time” will be understood to be performed immediately and without undue delay, even if it is not possible for there to be absolutely zero delay.
Any of the systems, devices, and/or components thereof may be implemented in any suitable combination or sub-combination. For example, any of the systems, devices, and/or components thereof may be implemented as an apparatus configured to perform one or more of the operations described herein.
In the preceding description, various illustrative embodiments have been described with reference to the accompanying drawings. It will, however, be evident that various modifications and changes may be made thereto, and additional embodiments may be implemented, without departing from the scope of the invention as set forth in the claims that follow. For example, certain features of one embodiment described herein may be combined with or substituted for features of another embodiment described herein. The description and drawings are accordingly to be regarded in an illustrative rather than a restrictive sense.
The present application claims priority to U.S. Provisional Patent Application No. 63/039,753, filed Jun. 16, 2020, the contents of which are hereby incorporated by reference in their entirety.
| Filing Document | Filing Date | Country | Kind |
| --- | --- | --- | --- |
| PCT/US2021/037475 | 6/15/2021 | WO | |

| Number | Date | Country |
| --- | --- | --- |
| 63039753 | Jun 2020 | US |