The present disclosure is directed to medical procedures and methods for manipulating tissue during medical procedures. More particularly, the present disclosure is directed to systems and methods for providing depth-aware synthetic indicators, indicators for components outside of a field of view, and indicators for components occluded in the field of view.
Minimally invasive medical techniques are intended to reduce the amount of extraneous tissue that is damaged during diagnostic or surgical procedures, thereby reducing patient recovery time, discomfort, and harmful side effects. Such minimally invasive techniques may be performed through natural orifices in a patient anatomy or through one or more surgical incisions. Through these natural orifices or incisions, clinicians may insert medical tools to reach a target tissue location. Minimally invasive medical tools include instruments such as therapeutic instruments, diagnostic instruments, and surgical instruments. Minimally invasive medical tools may also include imaging instruments such as endoscopic instruments that provide a user with a field of view within the patient anatomy.
Some minimally invasive medical tools may be robot-assisted including teleoperated, remotely operated, or otherwise computer-assisted. During a medical procedure, the clinician may be provided with a graphical user interface including an image of a three-dimensional field of view of the patient anatomy. To improve the clinician's experience and efficiency, various indicators may be needed to provide additional information about medical tools in the field of view, medical tools occluded in the field of view, and components outside of the field of view.
The embodiments of the invention are best summarized by the claims that follow the description.
In one example embodiment, a medical system may comprise a display system and a control system. The control system may include a processing unit including one or more processors. The processing unit may be configured to display, on the display system, an image of a field of view of a surgical environment, generated by an imaging component. The processing unit may also be configured to generate a three-dimensional synthetic indicator for a position of an instrument outside of the field of view of the surgical environment and display the three-dimensional synthetic indicator with the image of the field of view of the surgical environment.
In another embodiment, a medical system may comprise a display system and an input system including a first pedal and a second pedal. The first pedal may have a spatial relationship to the second pedal. The medical system may also comprise a control system. The control system may include a processing unit including one or more processors. The processing unit may be configured to display, on the display system, an image of a field of view of a surgical environment. The image may be generated by an imaging component. The processing unit may also be configured to generate a first synthetic indicator indicating an engagement status of the first pedal, generate a second synthetic indicator indicating an engagement status of the second pedal, and display, on the display system, the first synthetic indicator relative to the second synthetic indicator, based on the spatial relationship, with the image of the field of view of the surgical environment.
In another embodiment, a medical system may comprise a display system and a control system. The control system may include a processing unit including one or more processors. The processing unit may be configured to display, on the display system, an image of a field of view of a surgical environment. The image may be generated by an imaging component. The processing unit may also be configured to generate a first synthetic indicator associated with an instrument in the surgical environment, generate a depth mapping including the first synthetic indicator and a structure in the field of view, and determine, from the depth mapping, an occluded portion of the first synthetic indicator occluded by the structure. The processing unit may also be configured to display, on the display system, the first synthetic indicator. The occluded portion of the first synthetic indicator may have a differentiated graphic appearance from a non-occluded portion of the first synthetic indicator.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory in nature and are intended to provide an understanding of the present disclosure without limiting the scope of the present disclosure. In that regard, additional aspects, features, and advantages of the present disclosure will be apparent to one skilled in the art from the following detailed description.
Embodiments of the present disclosure and their advantages are best understood by referring to the detailed description that follows. It should be appreciated that like reference numerals are used to identify like elements illustrated in one or more of the figures, wherein showings therein are for purposes of illustrating embodiments of the present disclosure and not for purposes of limiting the same.
In robot-assisted medical procedures, endoscopic images of the surgical environment may provide a clinician with a field of view of the patient anatomy and any medical tools located in the patient anatomy. Augmenting the endoscopic images with various indicators may allow the clinician to access information while maintaining the field of view. Such indicators may include depth-aware graphical indicators, indicators for components outside of a field of view, and indicators for components occluded in the field of view.
In one or more embodiments, the medical system 10 may be a robot-assisted medical system that is under the teleoperational control of a surgeon. In alternative embodiments, the medical system 10 may be under the partial control of a computer programmed to perform the medical procedure or sub-procedure. In still other alternative embodiments, the medical system 10 may be a fully automated medical system that is under the full control of a computer programmed to perform the medical procedure or sub-procedure with the medical system 10. One example of the medical system 10 that may be used to implement the systems and techniques described in this disclosure is the da Vinci® Surgical System manufactured by Intuitive Surgical, Inc. of Sunnyvale, California.
As shown in
The medical instrument system 14 may comprise one or more medical instruments. In embodiments in which the medical instrument system 14 comprises a plurality of medical instruments, the plurality of medical instruments may include multiple of the same medical instrument and/or multiple different medical instruments. Similarly, the endoscopic imaging system 15 may comprise one or more endoscopes. In the case of a plurality of endoscopes, the plurality of endoscopes may include multiple of the same endoscope and/or multiple different endoscopes.
The operator input system 16 may be located at a surgeon's control console, which may be located in the same room as operating table O. In some embodiments, the surgeon S and the operator input system 16 may be located in a different room or a completely different building from the patient P. The operator input system 16 generally includes one or more control device(s) for controlling the medical instrument system 14. The control device(s) may include one or more of any number of a variety of input devices, such as hand grips, joysticks, trackballs, data gloves, trigger-guns, foot pedals, hand-operated controllers, voice recognition devices, touch screens, body motion or presence sensors, and other types of input devices.
In some embodiments, the control device(s) will be provided with the same degrees of freedom as the medical instrument(s) of the medical instrument system 14 to provide the surgeon with telepresence, which is the perception that the control device(s) are integral with the instruments so that the surgeon has a strong sense of directly controlling instruments as if present at the surgical site. In other embodiments, the control device(s) may have more or fewer degrees of freedom than the associated medical instruments and still provide the surgeon with telepresence. In some embodiments, the control device(s) are manual input devices that move with six degrees of freedom, and which may also include an actuatable handle for actuating instruments (for example, for closing grasping jaw end effectors, applying an electrical potential to an electrode, delivering a medicinal treatment, and actuating other types of instruments).
The assembly 12 supports and manipulates the medical instrument system 14 while the surgeon S views the surgical site through the operator input system 16. An image of the surgical site may be obtained by the endoscopic imaging system 15, which may be manipulated by the assembly 12. The assembly 12 may comprise multiple endoscopic imaging systems 15 and may similarly comprise multiple medical instrument systems 14 as well. The number of medical instrument systems 14 used at one time will generally depend on the diagnostic or surgical procedure to be performed and on space constraints within the operating room, among other factors. The assembly 12 may include a kinematic structure of one or more non-servo controlled links (e.g., one or more links that may be manually positioned and locked in place, generally referred to as a set-up structure) and a manipulator. When the manipulator takes the form of a teleoperational manipulator, the assembly 12 is a teleoperational assembly. The assembly 12 includes a plurality of motors that drive inputs on the medical instrument system 14. In an embodiment, these motors move in response to commands from a control system (e.g., control system 20). The motors include drive systems which, when coupled to the medical instrument system 14, may advance a medical instrument into a naturally or surgically created anatomical orifice. Other motorized drive systems may move the distal end of said medical instrument in multiple degrees of freedom, which may include three degrees of linear motion (e.g., linear motion along the X, Y, Z Cartesian axes) and three degrees of rotational motion (e.g., rotation about the X, Y, Z Cartesian axes). Additionally, the motors may be used to actuate an articulable end effector of the medical instrument for grasping tissue in the jaws of a biopsy device or the like. Medical instruments of the medical instrument system 14 may include end effectors having a single working member such as a scalpel, a blunt blade, an optical fiber, or an electrode. Other end effectors may include, for example, forceps, graspers, scissors, or clip appliers.
The medical system 10 also includes a control system 20. The control system 20 includes at least one memory 24 and at least one processor 22 for effecting control between the medical instrument system 14, the operator input system 16, and other auxiliary systems 26 which may include, for example, imaging systems, audio systems, fluid delivery systems, display systems, illumination systems, steering control systems, irrigation systems, and/or suction systems. A clinician may circulate within the medical environment 11 and may access, for example, the assembly 12 during a set up procedure or view a display of the auxiliary system 26 from the patient bedside.
Though depicted as being external to the assembly 12 in
Any of a wide variety of centralized or distributed data processing architectures may be employed. Similarly, the programmed instructions may be implemented as a number of separate programs or subroutines, or they may be integrated into a number of other aspects of the systems described herein, including teleoperational systems. In one embodiment, the control system 20 supports wireless communication protocols such as Bluetooth, IrDA, HomeRF, IEEE 802.11, DECT, and Wireless Telemetry.
The control system 20 is in communication with a database 27 which may store one or more clinician profiles, a list of patients and patient profiles, a list of procedures to be performed on said patients, a list of clinicians scheduled to perform said procedures, other information, or combinations thereof. A clinician profile may comprise information about a clinician, including how long the clinician has worked in the medical field, the level of education attained by the clinician, the level of experience the clinician has with the medical system 10 (or similar systems), or any combination thereof.
The database 27 may be stored in the memory 24 and may be dynamically updated. Additionally or alternatively, the database 27 may be stored on a device such as a server or a portable storage device that is accessible by the control system 20 via an internal network (e.g., a secured network of a medical facility or a teleoperational system provider) or an external network (e.g. the Internet). The database 27 may be distributed throughout two or more locations. For example, the database 27 may be present on multiple devices which may include the devices of different entities and/or a cloud server. Additionally or alternatively, the database 27 may be stored on a portable user-assigned device such as a computer, a mobile device, a smart phone, a laptop, an electronic badge, a tablet, a pager, and other similar user devices.
In some embodiments, control system 20 may include one or more servo controllers that receive force and/or torque feedback from the medical instrument system 14. Responsive to the feedback, the servo controllers transmit signals to the operator input system 16. The servo controller(s) may also transmit signals instructing assembly 12 to move the medical instrument system(s) 14 and/or endoscopic imaging system 15 which extend into an internal surgical site within the patient body via openings in the body. Any suitable conventional or specialized servo controller may be used. A servo controller may be separate from, or integrated with, assembly 12. In some embodiments, the servo controller and assembly 12 are provided as part of a teleoperational arm cart positioned adjacent to the patient's body.
The control system 20 can be coupled with the endoscopic imaging system 15 and can include a processor to process captured images for subsequent display, such as to a surgeon on the surgeon's control console, or on another suitable display located locally and/or remotely. For example, where a stereoscopic endoscope is used, the control system 20 can process the captured images to present the surgeon with coordinated stereo images of the surgical site. Such coordination can include alignment between the opposing images and can include adjusting the stereo working distance of the stereoscopic endoscope.
In alternative embodiments, the medical system 10 may include more than one assembly 12 and/or more than one operator input system 16. The exact number of assemblies 12 will depend on the surgical procedure and the space constraints within the operating room, among other factors. The operator input systems 16 may be collocated or they may be positioned in separate locations. Multiple operator input systems 16 allow more than one operator to control one or more assemblies 12 in various combinations. The medical system 10 may also be used to train and rehearse medical procedures.
The assembly 12 includes a drivable base 58. The drivable base 58 is connected to a telescoping column 57, which allows for adjustment of the height of arms 54. The arms 54 may include a rotating joint 55 that both rotates and moves up and down. Each of the arms 54 may be connected to an orienting platform 53. The arms 54 may be labeled to facilitate troubleshooting. For example, each of the arms 54 may be emblazoned with a different number, letter, symbol, other identifier, or combinations thereof. The orienting platform 53 may be capable of 360 degrees of rotation. The assembly 12 may also include a telescoping horizontal cantilever 52 for moving the orienting platform 53 in a horizontal direction.
In the present example, each of the arms 54 connects to a manipulator arm 51. The manipulator arms 51 may connect directly to a medical instrument, e.g., one of the surgical tools 30a-c. The manipulator arms 51 may be teleoperable. In some examples, the arms 54 connecting to the orienting platform 53 may not be teleoperable. Rather, such arms 54 may be positioned as desired before the surgeon S begins operation with the teleoperative components. Throughout a surgical procedure, medical instruments may be removed and replaced with other instruments such that instrument-to-arm associations may change during the procedure.
Endoscopic imaging systems (e.g., endoscopic imaging system 15 and imaging device 28) may be provided in a variety of configurations including rigid or flexible endoscopes. Rigid endoscopes include a rigid tube housing a relay lens system for transmitting an image from a distal end to a proximal end of the endoscope. Flexible endoscopes transmit images using one or more flexible optical fibers. Digital image-based endoscopes have a “chip on the tip” design in which a distal digital sensor, such as one or more charge-coupled device (CCD) or complementary metal oxide semiconductor (CMOS) devices, stores image data. Endoscopic imaging systems may provide two- or three-dimensional images to the viewer. Two-dimensional images may provide limited depth perception. Three-dimensional stereo endoscopic images may provide the viewer with more accurate depth perception. Stereo endoscopic instruments employ stereo cameras to capture stereo images of the patient anatomy. An endoscopic instrument may be a fully sterilizable assembly with the endoscope cable, handle, and shaft all rigidly coupled and hermetically sealed.
The operator input system 16 further includes one or more input control devices 36, which in turn cause the assembly 12 to manipulate one or more instruments of the endoscopic imaging system 15 and/or medical instrument system 14. The input control devices 36 can provide the same degrees of freedom as their associated instruments to provide the surgeon S with telepresence, or the perception that the input control devices 36 are integral with said instruments so that the surgeon has a strong sense of directly controlling the instruments. To this end, position, force, and tactile feedback sensors (not shown) may be employed to transmit position, force, and tactile sensations from the medical instruments, e.g., surgical tools 30a-c, or imaging device 28, back to the surgeon's hands through the input control devices 36. Input control devices 37 are foot pedals that receive input from a user's foot. Aspects of the operator input system 16, the assembly 12, and the auxiliary systems 26 may be adjustable and customizable to meet the physical needs, skill level, or preferences of the surgeon S.
During a medical procedure performed using the medical system 10, the surgeon S or another clinician may need to access medical tools in the patient anatomy that are outside of the field of view of the imaging system 15, may need to engage foot pedals to activate medical tools or perform other system functions, and/or may need to identify tools that are occluded in the field of view. Further, with a stereoscopic field of view, it may be desirable that synthetic elements presented with the field of view are displayed at depths that correspond with the tissue or components indicated by the synthetic elements. Thus, the synthetic elements may appear to be attached to the components in the field of view rather than floating in front of the field of view. The various embodiments described below provide methods and systems that allow the surgeon S to view depth-aware graphical indicators, indicators for components outside of a field of view, and indicators for components occluded in the field of view.
The graphical user interface 200 may also include one or more synthetic indicators 218, 220, 222 that may appear in the field of view portion 202 when a corresponding medical tool is in the surgical environment but outside the view of the imaging system and thus not visible in the field of view portion 202. The synthetic indicator 218 indicates the tool 204. The synthetic indicator 220 indicates the tool 206. The synthetic indicator 222 indicates the tool 208. Each synthetic indicator 218, 220, 222 may have a three-dimensional shape and may point in the three-dimensional direction of the corresponding tool outside of the field of view.
In
As the tool 204 moves within the surgical environment 201 or as the field of view 203 changes within the surgical environment, the synthetic indicator 218 may pivot such that the directional portion 252 remains pointed toward the tool 204 and the flat surface 253 remains visible to the viewer. In
In some embodiments, the synthetic indicator 218 or a portion thereof may have a color coding or other visual treatment to indicate a status and/or a control mode (e.g., active or inactive; location where a clutch was initiated) of the associated tool 204. In some embodiments, the orientation of the synthetic indicator 218 may be determined based on presentation objectives including visibility to the viewer. For example, the flat surface 253 may be oriented toward the endoscope, and the icon 254 may be oriented in the plane of the surface 253 to be upright in the view. The directional portion 252 may be constrained so that a normal to the flat surface 253 is oriented within a viewing cone or frustum of the endoscope to ensure legibility of the icon 254. The stereoscopic depth of the synthetic indicator 218 position may be constrained for ease of fusion, to reduce depth mismatch with endoscopic scene content, and to resolve occlusion and depth relative to other synthetic elements in the field of view portion 202. The apparent size of the synthetic indicator 218 may be constrained based on its depth.
In some embodiments, the position of the directional portion 252 along the perimeter 219 of the field of view portion 202 is computed by a ray intersection with the stereoscopic images.
For example, a ray extending from a point along a centerline of an imaging component (e.g., an endoscope) to a distal keypoint on the associated instrument (e.g., a predetermined point on the instrument end effector or joint) may be determined. This determination resolves a point along the perimeter 219 that may be visible to both eyes within a comfortable depth range for fusion.
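The perimeter placement can be pictured as projecting the endoscope-to-keypoint ray into the image plane and clamping the result to the view boundary. The following Python sketch illustrates this idea for a circular field-of-view portion; the function name, the pinhole projection, and the normalized coordinates are illustrative assumptions rather than the disclosed implementation.

```python
import numpy as np

def perimeter_anchor(keypoint_cam, fov_radius=1.0):
    """Project an offscreen instrument keypoint into the image plane and
    clamp it to the circular perimeter of the field-of-view portion.

    keypoint_cam: 3D keypoint in the endoscope (camera) frame, z forward.
    fov_radius:   normalized radius of the circular view perimeter.
    Returns (x, y) on the perimeter and the pointing angle in radians.
    """
    # Ray from a point on the endoscope centerline (the camera origin)
    # toward the instrument keypoint, projected with a pinhole model.
    x, y, z = keypoint_cam
    z = max(z, 1e-6)                      # keep the projection in front
    u, v = x / z, y / z                   # normalized image coordinates
    norm = np.hypot(u, v)
    if norm < 1e-9:
        return (0.0, 0.0), 0.0            # keypoint on the optical axis
    # Clamp to the perimeter: the intersection of the projected ray
    # with the circular boundary of the displayed field of view.
    scale = fov_radius / norm
    angle = np.arctan2(v, u)
    return (u * scale, v * scale), angle
```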
In some embodiments, the synthetic indicator may morph in shape and size as the endoscope and/or medical tools are moved. For example, the synthetic indicator may transition from a circular badge to a teardrop shape. The length of the teardrop or arrow shape may indicate the distance of the tool from the field of view. The synthetic indicator may also emphasize a direction and/or distance of travel to locate the offscreen tool. For example, when a tool is located farther than a threshold distance from the field of view, all or a portion of the synthetic indicator may be animated to produce a gestural cue emphasizing that the tool is beyond that threshold distance.
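One way to realize this morphing behavior is to map the tool's distance from the field of view onto the indicator's shape parameters, as in the minimal sketch below. The threshold values and the returned shape parameters are assumptions, not values specified by this disclosure.

```python
def indicator_shape(distance_m, near=0.01, far=0.10):
    """Morph the offscreen-tool indicator as a function of distance.

    Below `near` the indicator stays a circular badge; between `near`
    and `far` it stretches into a teardrop whose length grows with the
    tool's distance from the field of view; beyond `far` a gestural
    animation cue is enabled. Thresholds are assumed values in meters.
    """
    if distance_m <= near:
        return {"shape": "badge", "length": 0.0, "animate": False}
    t = min((distance_m - near) / (far - near), 1.0)
    return {"shape": "teardrop", "length": t, "animate": distance_m > far}
```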
A method 800 for displaying a three-dimensional synthetic indicator (e.g., a synthetic indicator 218, 220, or 222) is illustrated in the flowchart of
At a process 802, an image of the field of view (e.g., field of view portion 202) in a surgical environment (e.g., surgical environment 201) is displayed on, for example, a display 35. In some embodiments, the process 804 (described below) may include one or more of the processes 804a-804f. At a process 804a, the visibility of the instrument tip keypoints with respect to an endoscope field of view volume may be determined.
At a process 804b, a determination may be made about whether a synthetic indicator should be displayed for an offscreen instrument based on context and predetermined rules. Displaying the synthetic indicator at all times while a tool tip is outside of the field of view may introduce undesirable distractions. Therefore, predetermined rules may be imposed on when the synthetic indicator is shown so that it is more contextual and its visibility coincides with operator workflow steps that benefit from user awareness of the offscreen tool location. For example, the synthetic indicator may be displayed when endoscope movement is active either from bedside or the surgeon console. The synthetic indicator may be displayed when a guided tool change feature is active on the tool's manipulator arm. The synthetic indicator may be displayed when an instrument clutch is active for the manipulator arm controlling an offscreen tool. The synthetic indicator may be displayed when a surgeon console user is about to start control of a manipulator arm that controls an offscreen tool. The synthetic indicator may be displayed when the surgeon console user has started control of a manipulator arm that controls an offscreen tool. The synthetic indicator may be displayed when the surgeon console user is changing hand association to a manipulator arm coupled to an offscreen tool. The synthetic indicator may be displayed when a notification is displayed for a manipulator arm to which an offscreen tool is coupled.
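As a concrete illustration, the workflow conditions above may be expressed as a predicate over context flags. In the sketch below, the flag names and the context structure are hypothetical; the disclosure does not prescribe a particular data model.

```python
from dataclasses import dataclass

@dataclass
class IndicatorContext:
    """Workflow flags mirroring the conditions above (names assumed)."""
    endoscope_movement_active: bool = False   # bedside or surgeon console
    guided_tool_change_active: bool = False   # on the tool's manipulator arm
    instrument_clutch_active: bool = False    # arm controlling the offscreen tool
    control_start_pending: bool = False       # operator about to take control
    control_started: bool = False             # operator has taken control
    hand_association_changing: bool = False   # re-associating hands to arms
    arm_notification_displayed: bool = False  # notification for the arm

def should_show_offscreen_indicator(ctx: IndicatorContext) -> bool:
    """Show the indicator only when a workflow step benefits from it."""
    return any((
        ctx.endoscope_movement_active,
        ctx.guided_tool_change_active,
        ctx.instrument_clutch_active,
        ctx.control_start_pending,
        ctx.control_started,
        ctx.hand_association_changing,
        ctx.arm_notification_displayed,
    ))
```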
At a process 804c, a projected three-dimensional position of the synthetic indicator along lateral extents of the field of view volume may be determined. At a process 804d, an orientation of the three-dimensional synthetic indicator may be determined to face the endoscope tip within the visibility cone or frustum. At a process 804e, an upright orientation of the icon (e.g., icon 254) on a surface of the synthetic indicator may be computed. At a process 804f, both left and right views of the synthetic indicator may be rendered using a calibrated stereoscopic camera model that corresponds to the endoscope optics.
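For illustration, rendering the left and right views (process 804f) amounts to projecting the indicator geometry through per-eye view-projection matrices calibrated to the endoscope optics. In the sketch below, the `stereo_model.view_proj` accessor is an assumed interface, not an actual API.

```python
import numpy as np

def render_stereo_indicator(vertices_world, stereo_model):
    """Project indicator geometry into both eye views (illustrative).

    vertices_world: Nx3 array of indicator vertices in the world frame.
    stereo_model.view_proj(eye) is assumed to return a 4x4 view-projection
    matrix calibrated to the endoscope optics for that eye (placeholder).
    """
    homog = np.hstack([vertices_world, np.ones((len(vertices_world), 1))])
    views = {}
    for eye in ("left", "right"):
        vp = stereo_model.view_proj(eye)      # calibrated 4x4 matrix
        clip = homog @ vp.T
        ndc = clip[:, :3] / clip[:, 3:4]      # perspective divide
        views[eye] = ndc[:, :2]               # screen-space vertices
    return views                              # composited over the video
```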
At a process 804, a three-dimensional synthetic indicator (e.g., indicator 218) indicating a position of an instrument outside of the field of view may be generated. More specifically, in some embodiments, a composite rendering of the left and right synthetic indicators may be overlaid on the endoscopic video.
At a process 806, the three-dimensional synthetic indicator may be displayed with the image of the field of view of the surgical environment.
As shown in
As shown in
As shown in
The position and orientation of synthetic indicators 404, 420 may be determined to create the appearance that the synthetic indicators are decals adhered, for example, to the tool clevis or shaft. As the tools or endoscope providing the field of view are moved, the synthetic indicators 404, 420 may change orientation in three-dimensional space to maintain tangency to the tool surface and to preserve the spatial understanding of upper and lower pedals.
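The decal effect may be understood as continuously rebuilding the indicator's orientation frame from the local surface normal while keeping its vertical axis as upright as possible, so the upper/lower pedal layout reads correctly. The following sketch shows one conventional way to construct such a frame; the names and axis conventions are assumptions.

```python
import numpy as np

def decal_frame(surface_normal, world_up=np.array([0.0, 1.0, 0.0])):
    """Build an orientation frame for a decal-style indicator (sketch).

    The decal's z-axis follows the tool surface normal (tangency) while
    its y-axis stays as upright as possible in the view.
    Returns a 3x3 rotation matrix with columns (x, y, z).
    """
    z = surface_normal / np.linalg.norm(surface_normal)
    x = np.cross(world_up, z)
    if np.linalg.norm(x) < 1e-6:              # normal parallel to up
        x = np.cross(np.array([1.0, 0.0, 0.0]), z)
    x /= np.linalg.norm(x)
    y = np.cross(z, x)                        # completes the frame
    return np.column_stack([x, y, z])
```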
Various types, shapes, and configurations of synthetic indicators may be displayed to provide information about the status of foot pedal engagement. In an alternative embodiment, as shown in
In this embodiment, the synthetic indicators 450, 452 may perform a function similar to the synthetic indicator 404 in providing information about the set of foot pedals 302, 304. As shown in
As shown in
In alternative embodiments, audio cues may be provided instead of or in addition to the synthetic indicators to provide instructions or indicate spatial direction (e.g., up/down/left/right) to move the operator's foot into a hover position for a foot pedal. The system may distinguish between hovering a foot over a pedal and actuating the pedal, and there may be distinct visual and audio cues for the hover status versus the engaged or actuation status. The system may also depict when a pedal function is valid or invalid. For example, the highlight color may appear in gray when a pedal function is not valid (e.g., when the instrument function cable is not plugged in or the instrument function is not configured).
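To make the distinction concrete, the pedal states and their cues can be modeled as a small state-to-cue mapping, as sketched below. The gray highlight for an invalid function comes from the description above; the other color and sound identifiers are assumptions, since the disclosure requires only that hover, engaged, and invalid states be distinguishable.

```python
from enum import Enum, auto

class PedalState(Enum):
    IDLE = auto()
    HOVER = auto()       # foot over the pedal but not pressing it
    ENGAGED = auto()     # pedal actuated

def pedal_cues(state: PedalState, function_valid: bool) -> dict:
    """Map a pedal's state to visual and audio cues (illustrative)."""
    if not function_valid:
        # e.g., instrument function cable not plugged in or not configured
        return {"highlight": "gray", "audio": None}
    if state is PedalState.HOVER:
        return {"highlight": "hover_color", "audio": "hover_tone"}
    if state is PedalState.ENGAGED:
        return {"highlight": "engaged_color", "audio": "engage_tone"}
    return {"highlight": None, "audio": None}
```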
A method 820 for displaying synthetic indicators corresponding to a set of foot pedals is illustrated in the flowchart of
As shown in
As shown in
The badge 500 may remain at the original keypoint location if the keypoint location remains visible in the field of view portion 202.
The orientation of the badge 500 at a keypoint may be constrained so that the normal to the badge surface is within the viewing cone and thus is visible in the field of view portion 202. If the badge 500 cannot be oriented at a keypoint such that the normal is within the viewing cone, the badge 500 may be moved to a different keypoint. As shown in
The position, orientation, and depth of synthetic indicators associated with tools in the surgical environment may be determined based upon tool tracking by the control system and depth map analysis. Tool tracking alone may generate some residual error that may cause the synthetic indicators to appear to float over or interpenetrate the tool surface. This may be distracting to the viewer and may lead to fusion issues with the synthetic indicator and the associated tool. A depth map that provides information about the distance of the surfaces in the field of view portion 202 from the distal end of the endoscope may be used to refine placement of the synthetic indicator on the tool surface. More specifically, a raycast projection may be computed within a tolerance of a reference synthetic indicator position. The resulting error may be used to estimate a radial offset correction for more accurately placing the synthetic indicator on the surface of the tool. The depth map quality and accuracy may be better when the tool is static or quasi-static, as compared to when the tool is moving. Thus, the raycasting and updating of the radial offset correction may be performed when the instrument keypoint velocity is lower than a threshold velocity. Alternatively, projective texturing may be used to place the synthetic indicator directly onto an extracted depth map surface. Systems and methods for generating a depth map are further described in U.S. Pat. Nos. 7,907,166 and 8,830,224, which are incorporated by reference herein in their entirety.
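A minimal sketch of this correction follows, assuming hypothetical `project` and `sample_depth` interfaces and an assumed velocity threshold; it snaps the indicator onto the measured surface along the viewing ray only when the tool is quasi-static.

```python
import numpy as np

def refine_indicator_position(ref_pos_cam, project, sample_depth,
                              keypoint_speed, speed_threshold=0.005):
    """Snap a tool-mounted indicator onto the tool surface (sketch).

    Casts a ray from the endoscope tip through the indicator's reference
    position (camera frame), samples the depth map at the projected
    pixel, and applies a radial correction so the indicator sits on the
    surface instead of floating over or interpenetrating it. The update
    is gated on keypoint velocity because depth-map quality is better
    when the tool is static. `project(pos) -> (u, v)` and
    `sample_depth(u, v) -> float` are assumed interfaces.
    """
    if keypoint_speed > speed_threshold:
        return ref_pos_cam                    # keep the last correction
    u, v = project(ref_pos_cam)               # pixel under the indicator
    measured = sample_depth(u, v)             # surface distance at pixel
    ray = ref_pos_cam / np.linalg.norm(ref_pos_cam)
    return ray * measured                     # radial offset correction
```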
Three-dimensional synthetic indicators that are merely superimposed on a stereoscopic image, without depth consideration, may intersect or be occluded by content in the field of view. This can result in misleading spatial relationships between real and synthetic objects as well as stereoscopic fusion difficulties. Use of a depth map may improve the spatial appearance of synthetic indicators placed in the field of view. In some embodiments, depth mapping may be used for occlusion culling which causes portions of synthetic indicators that are deeper than the depth map to not be rendered and displayed. Complete or even partial culling of a synthetic indicator may result in a loss of physical co-location status information. In some examples, when a co-located synthetic indicator is being displayed in the presence of sub-optimal tracking or rendering conditions (e.g. depth occlusion, field of view culling, poor tracking performance, poor depth map quality, poor stereoscopic alignment, etc.), the graphical user interface may gradually fall back from the co-located indicators shown in
In other embodiments when using a depth map, the full synthetic indicator may be preserved, but otherwise occluded portions of the synthetic indicator may be rendered with a visual treatment (e.g., a translucent treatment) that differs from the unoccluded portions. To accomplish this special visual treatment for the occluded portion of a synthetic indicator, the rendering of the synthetic indicator may occur in two stages. In a first stage, the synthetic indicator may be rendered with a reduced opacity and without reference to or modification based on a depth map. In a second stage, the synthetic indicator may be rendered more opaquely while applying depth map culling so that only unoccluded pixels appear more opaquely and are rendered over the pixels generated in the first stage. Thus, the occluded portions of the synthetic indicator appear with reduced opacity (e.g., more translucent) and the unoccluded portions of the synthetic indicator appear with greater or full opacity. In some embodiments, with use of a stereoscopic display, the synthetic indicator rendering for one eye (e.g., the viewer's non-dominant eye) may cull the occluded portions of the synthetic indicator, and the synthetic indicator rendering for the other eye (e.g., the viewer's dominant eye) may be generated using a depth map and depth-aware blending. In some embodiments, a synthetic indicator may be generated based on a user-generated graphic. The user-generated graphic may be based on a single-eye image when the synthetic indicator is generated stereoscopically.
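The two-stage treatment can be sketched as two passes over the indicator's pixels: a translucent pass with no depth test, followed by an opaque pass gated by the depth map. The pixel structure, the dict-keyed framebuffer, and the blending helper below are illustrative assumptions, and the opacity values are arbitrary.

```python
from typing import Callable, NamedTuple, Tuple

class IndicatorPixel(NamedTuple):
    x: int
    y: int
    depth: float                        # indicator depth at this pixel
    color: Tuple[float, float, float]   # RGB in [0, 1]

def blend(framebuffer, x, y, color, alpha):
    """Minimal alpha-over blend into a dict-keyed framebuffer."""
    base = framebuffer.get((x, y), (0.0, 0.0, 0.0))
    framebuffer[(x, y)] = tuple(alpha * c + (1.0 - alpha) * b
                                for c, b in zip(color, base))

def render_depth_aware(pixels, depth_at: Callable[[int, int], float],
                       framebuffer):
    """Two stages: translucent everywhere, opaque where unoccluded."""
    # Stage 1: whole indicator at reduced opacity, no depth test.
    for px in pixels:
        blend(framebuffer, px.x, px.y, px.color, alpha=0.3)
    # Stage 2: depth-culled pass; only pixels nearer than the depth
    # map are redrawn opaquely over the stage-1 result.
    for px in pixels:
        if px.depth <= depth_at(px.x, px.y):
            blend(framebuffer, px.x, px.y, px.color, alpha=1.0)
```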
In some embodiments, the graphical user interface 200 may be used to display synthetic indicators for use in a guided tool change. The synthetic indicator may be rendered as a synthetic tube which serves as a path to guide the insertion of the new tool to a distal target mark. In some embodiments, all or portions of the synthetic tube may be occluded by tissue or other tools.
A method 840 for displaying partially occluded synthetic indicators is illustrated in the flowchart of
Elements described in detail with reference to one embodiment, implementation, or application optionally may be included, whenever practical, in other embodiments, implementations, or applications in which they are not specifically shown or described. For example, if an element is described in detail with reference to one embodiment and is not described with reference to a second embodiment, the element may nevertheless be claimed as included in the second embodiment. Thus, to avoid unnecessary repetition in the following description, one or more elements shown and described in association with one embodiment, implementation, or application may be incorporated into other embodiments, implementations, or aspects unless specifically described otherwise, unless the one or more elements would make an embodiment or implementation non-functional, or unless two or more of the elements provide conflicting functions.
Any alterations and further modifications to the described devices, systems, instruments, methods, and any further application of the principles of the present disclosure are fully contemplated as would normally occur to one skilled in the art to which the disclosure relates. In particular, it is fully contemplated that the features, components, and/or steps described with respect to one embodiment may be combined with the features, components, and/or steps described with respect to other embodiments of the present disclosure. In addition, dimensions provided herein are for specific examples and it is contemplated that different sizes, dimensions, and/or ratios may be utilized to implement the concepts of the present disclosure. To avoid needless descriptive repetition, one or more components or actions described in accordance with one illustrative embodiment can be used or omitted as applicable from other illustrative embodiments. For the sake of brevity, the numerous iterations of these combinations will not be described separately.
Various systems and portions of systems have been described in terms of their state in three-dimensional space. As used herein, the term “position” refers to the location of an object or a portion of an object in a three-dimensional space (e.g., three degrees of translational freedom along Cartesian X, Y, Z coordinates). As used herein, the term “orientation” refers to the rotational placement of an object or a portion of an object (three degrees of rotational freedom—e.g., roll, pitch, and yaw). As used herein, the term “pose” refers to the position of an object or a portion of an object in at least one degree of translational freedom and to the orientation of that object or portion of the object in at least one degree of rotational freedom (up to six total degrees of freedom).
Although some of the examples described herein refer to surgical procedures or instruments, or medical procedures and medical instruments, the techniques disclosed optionally apply to non-medical procedures and non-medical instruments. For example, the instruments, systems, and methods described herein may be used for non-medical purposes including industrial uses, general robotic uses, and sensing or manipulating non-tissue work pieces. Other example applications involve cosmetic improvements, imaging of human or animal anatomy, gathering data from human or animal anatomy, and training medical or non-medical personnel. Additional example applications include use for procedures on tissue removed from human or animal anatomies (without return to a human or animal anatomy) and performing procedures on human or animal cadavers. Further, these techniques can also be used for surgical and nonsurgical medical treatment or diagnosis procedures.
A computer is a machine that follows programmed instructions to perform mathematical or logical functions on input information to produce processed output information. A computer includes a logic unit that performs the mathematical or logical functions, and memory that stores the programmed instructions, the input information, and the output information. The term “computer” and similar terms, such as “processor” or “controller” or “control system,” are analogous.
While certain exemplary embodiments of the invention have been described and shown in the accompanying drawings, it is to be understood that such embodiments are merely illustrative of and not restrictive on the broad invention, and that the embodiments of the invention not be limited to the specific constructions and arrangements shown and described, since various other modifications may occur to those ordinarily skilled in the art.
This application claims the benefit of U.S. Provisional Application 63/119,549 filed Nov. 30, 2020, which is incorporated by reference herein in its entirety.
Filing Document: PCT/US2021/060917, filed Nov. 29, 2021 (WO)
Related Provisional Application: 63/119,549, filed Nov. 30, 2020 (US)