System and method for providing substantially stable control of a surgical tool

Abstract
A system for providing substantially stable control of a surgical instrument is provided. The system includes a surgical manipulator for manipulating the surgical instrument and at least one computer configured to identify a first subset and a second subset of interaction geometric primitives associated with a virtual tool; determine, based on the first subset, control forces in a first subspace; and determine based on the second subset, control forces in a second subspace having at least one additional dimension. Control forces in the additional dimension are only determined based on the second subset of primitives, which is different than the first subset of primitives. The computer is further configured to determine a torque to constrain an orientation of the surgical instrument, wherein determining the torque comprises defining a virtual tool normal and a control plane normal and using the virtual tool normal and control plane normal to calculate the torque.
Description
BACKGROUND

The invention relates generally to the field of haptics. Specifically, the invention relates to a system and method for providing substantially stable control in a system using a virtual tool.


The field of haptics relates to, among other things, human interactive devices that provide tactile and/or force feedback to a user to achieve a desired goal. Tactile feedback may include providing a user with tactile sensations such as, for example, vibration. Force feedback may include providing various forms of force to a user, such as a positive force or a resistance to movement.


A common use of haptics is to provide a user of a device with guidance or limits for manipulation of that device. For example, the device may be a robotic system having an object, such as a physical tool, for performing a specified function. The user's manipulation of the physical tool can be guided or limited through the use of haptics to provide feedback to the user during manipulation of the physical tool.


Often such guidance is provided by using a computer to create a virtual environment that effectively guides or limits manipulation of the physical tool. The computer may create an association between the physical tool and a virtual tool (a virtual representation of the physical tool) in the virtual environment. The computer also may construct a haptic object within the virtual environment. The haptic object may provide boundaries for guiding or limiting movement of the virtual tool. For example, when the virtual tool interacts with a boundary of the haptic object, tactile or force feedback may be provided to the user. The guidance or limitation resulting from the interaction between the virtual tool and the haptic object effectively provides guidance or limitation for the user's manipulation of the physical tool.


A specific example of such a robotic system using haptics can be found in computer-assisted surgical systems. In such systems, a physical tool, such as a bone-cutting tool, may be associated with a virtual tool in a virtual environment. A pre-operative surgical plan may be used to identify a region for bone resection, which will then be used to create a haptic object in the virtual environment. For example, the haptic object may represent the boundaries of the bone-resection region. The surgeon will receive tactile or force feedback when the virtual tool interacts with the boundaries of the haptic object. This feedback can assist the surgeon in keeping the bone-cutting tool within the bone-resection region, according to his/her pre-operative plan.


Feedback is generated based on the interaction of the virtual tool with the haptic object. A fundamental relationship often established for haptics is a linear elastic spring, where the contact force applied by the haptic object may be defined by the spring constant, K, and the displacement into the haptic object, Δx, such that

\vec{f} = K\,\Delta\vec{x}
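

By way of a non-limiting numerical illustration, the following short Python sketch evaluates this spring relationship; the stiffness value and penetration vector are hypothetical and are not taken from the disclosure.

```python
import numpy as np

# Hypothetical values chosen only for illustration.
K = 3000.0                          # spring constant (N/m)
dx = np.array([0.0, 0.0, 0.002])    # displacement into the haptic object (m)

# Contact force pushing the tool back toward the boundary: f = K * dx
f = K * dx
print(f)   # -> [0. 0. 6.] N
```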


To determine the appropriate feedback, the computer must be able to determine the interaction between the virtual tool and the haptic object. This interaction is often determined by identifying a single haptic interaction point (HIP) that represents the location of the virtual tool. For example, a HIP may be defined as the center of a spherical cutting tool, provided the haptic object is offset from the resection depth by the radius of the tool. In such cases, the computer uses the relationship between the HIP and the haptic object to determine the interaction of the virtual tool with the haptic object.


In certain circumstances, a single HIP may not be fully effective for determining the interaction between the virtual tool and the haptic object. For example, an irregular-shaped virtual tool may not be adequately represented by a single HIP. Due to the irregular shape of the virtual tool, a single HIP may not adequately account for variations on the cutting surface of the tool and/or rotation of the tool. In this case, multiple HIPs may be defined along the contour of the virtual tool. For example, as shown in FIG. 2, three HIPs 20 (points 1, 2, and 3) may be defined on the outer edge of a virtual tool 10 to provide a better estimation of the interaction between the virtual tool and the haptic object.


Even multiple HIPs may not be fully effective for determining the interaction between the virtual tool and the haptic object. For example, traditional multi-point haptic forces may not be stable because of the competing forces that exist along a bi-lateral constraint such as a plane or line. In the example illustrated in FIG. 2, the irregular-shaped virtual tool 10 has outer HIPs (points 1 and 3) and an inner HIP (point 2). The outer HIPs are in an antagonistic relationship, such that haptic forces determined from the interaction of those outer HIPs (points 1 and 3) with the boundary 40 of the haptic object are in opposite directions, leading to an undesirable oscillation about a haptic plane. This unstable behavior can be attributed to measurement noise, discretization errors, and the competing forces between contact points.


The same unstable behavior may be present in other systems that provide non-haptic force feedback or control of a surgical tool using one or more interaction points.


In view of the foregoing, a need exists for a system and method that can provide substantially stable control for a control object that will correct undesirable oscillation caused by behavior such as measurement noise, discretization errors, and competing forces between interaction/contact points.


SUMMARY

According to an aspect of the present invention, a system for providing substantially stable haptics is provided. The system includes at least one computer configured to identify a first subset and a second subset of haptic interaction geometric primitives for a virtual tool; determine, based on the first subset, haptic forces in a first subspace; and determine, based on the second subset, haptic forces in a second subspace different from the first subspace.


According to another aspect of the present invention, a method for providing substantially stable haptics is provided. The method includes the steps of identifying a first subset and a second subset of haptic interaction geometric primitives for a haptic object; determining, based on the first subset, by at least one computer, haptic forces in a first subspace; and determining, based on the second subset, by at least one computer, haptic forces in a second subspace different from the first subspace.


According to yet another aspect of the present invention, a system for providing substantially stable control of a surgical instrument is provided. The system includes a surgical manipulator for manipulating the surgical instrument and at least one computer. The computer is configured to identify a first subset and a second subset of interaction geometric primitives associated with a virtual tool representing the surgical instrument; determine, based on the first subset, control forces in a first subspace; and determine, based on the second subset, control forces in a second subspace having at least one additional dimension relative to the first subspace. Control forces in the additional dimension are determined based only on the second subset of interaction geometric primitives, which is different than the first subset of interaction geometric primitives. The computer is further configured to determine a torque to constrain an orientation of the surgical instrument, wherein determining the torque comprises defining a virtual tool normal and a control plane normal and using the virtual tool normal and control plane normal to calculate the torque.


According to yet another aspect of the present invention, a method for providing substantially stable control of a surgical instrument is provided. The method includes identifying a first subset and a second subset of interaction geometric primitives associated with a virtual tool representing a surgical instrument; determining, based on the first subset, by at least one computer, control forces in a first subspace; and determining, based on the second subset, by at least one computer, control forces in a second subspace having at least one additional dimension relative to the first subspace. Control forces in the additional dimension are determined based only on the second subset of interaction geometric primitives, which is different than the first subset of interaction geometric primitives. The method further includes determining a torque to constrain an orientation of the surgical instrument, wherein determining the torque comprises defining a virtual tool normal and a control plane normal and using the virtual tool normal and control plane normal to calculate the torque.





BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain principles of the invention.



FIG. 1 is a block diagram of a system for providing substantially stable haptics according to the present invention.



FIG. 2 shows a virtual tool interacting with a boundary of a haptic object.



FIG. 3 shows a virtual tool with subsets of haptic interaction points according to an embodiment of the present invention.



FIG. 4 shows a virtual tool with a haptic interaction arc segment and point according to an embodiment of the present invention.



FIG. 5 is a flow chart of a method for providing substantially stable haptics according to the present invention.





DETAILED DESCRIPTION

Presently preferred embodiments of the invention are illustrated in the drawings. An effort has been made to use the same or like reference numbers throughout the drawings to refer to the same or like parts.


Overview


The present invention relates to a system and method for providing substantially stable control of a tool. According to the present invention, a computer can be used to identify multiple interaction geometric primitives for a virtual tool. To overcome one or more problems in conventional systems, the present invention identifies subsets of the interaction geometric primitives and determines control forces in subspaces corresponding to those subsets. Those control forces may then be used to determine a total interaction force.


Although the disclosure that follows primarily discusses providing substantially stable haptics, it is to be understood that the following disclosure and calculations can be provided, implemented, and/or utilized by other systems that provide non-haptic force feedback or control of a surgical tool. For example, the following disclosure and force calculations, which provide substantially stable haptics for a haptic system, can be similarly applied to a system such as that described in U.S. Pat. No. 9,119,655 entitled “SURGICAL MANIPULATOR CAPABLE OF CONTROLLING A SURGICAL INSTRUMENT IN MULTIPLE MODES,” which is incorporated herein by reference in its entirety. The system disclosed in U.S. Pat. No. 9,119,655 provides for control of a surgical tool in a manual and a semi-autonomous mode, which maintains the surgical tool in a proper cutting pattern, path, or area, controlled by forces acting on the tool. The control forces provided by that system can be stabilized according to the teachings and calculations disclosed herein. In this way, references to haptic forces can also be read as non-haptic control forces; references to haptic geometric primitives or points can also be read as geometric primitives or points; and haptic control can be understood to mean control provided by forces or force feedback, not specifically haptic control.


In general, a subspace is a set of vectors that is closed under addition and scalar multiplication. Geometrically, a subspace of Rn can be viewed as a flat through the origin, i.e., a copy of a lower-dimensional (or equi-dimensional) Euclidean space sitting in n dimensions. For example, in three-dimensional space R3 there are types of subspaces that can be viewed as different from one another (even though they may overlap), including, without limitation:


(a) lines within R3, which are one-dimensional subspaces of R3;


(b) planes within R3, which are two-dimensional subspaces of R3; and


(c) the entire set R3, which is a three-dimensional subspace of itself.


In n-dimensional space Rn, there are subspaces of every dimension from 0 to n.


Embodiment of a System for Providing Substantially Stable Haptics



FIG. 1 is a block diagram of an embodiment of a computer-assisted system 100 for providing substantially stable haptics. The system includes a computer 102, which is in communication with an input unit 104 and a display 108.


In general, the computer 102 may be configured to determine haptic force(s) based on a virtual tool and, preferably, based on the interaction between a virtual tool and a haptic object. The computer 102 may be configured to, among other things, associate a physical tool with the virtual tool, identify HIGPs for the virtual tool, create the haptic object or objects, determine the interaction between the virtual tool and the haptic object, determine feedback to be provided to the user, and/or control the feedback provided to the user.


The computer 102 may be any known computing system but is preferably a programmable, processor-based system. For example, the computer 102 may include a microprocessor, a hard drive, random access memory (RAM), read only memory (ROM), input/output (I/O) circuitry, and any other well-known computer component. The computer 102 is preferably adapted for use with various types of storage devices (persistent and removable), such as, for example, a portable drive, magnetic storage (e.g., a floppy disk), solid state storage (e.g., a flash memory card), optical storage (e.g., a compact disc or CD), and/or network/Internet storage. The computer 102 may comprise one or more computers, including, for example, a personal computer (e.g., an IBM-PC compatible computer) or a workstation (e.g., a SUN or Silicon Graphics workstation) operating under a Windows, MS-DOS, UNIX, or other suitable operating system and preferably includes a graphical user interface (GUI).


The input unit 104 enables information to be communicated to the system 100, including the computer 102. The input unit 104 may be one or more devices used for communication of information, such as features of the virtual tool, features of the haptic object, the location of the physical tool, and/or the location of the workpiece upon which the physical tool is or will be working.


The input unit 104 is connected to the computer 102 and may include any device(s) enabling input to a computer. As specific examples, the input unit 104 can include a known input device, such as a keyboard, a mouse, a trackball, a touch screen, a touch pad, voice recognition hardware, dials, switches, buttons, a trackable probe, a foot pedal, a remote control device, a scanner, a camera, a microphone, and/or a joystick. The input unit 104 may also include surgical navigation equipment that provides data to the computer 102. For example, the input unit 104 can include a tracking system for tracking the position of surgical tools and patient anatomy. The tracking system may be, for example, an optical, electromagnetic, radio, acoustic, mechanical, or fiber optic tracking system.


The display 108 is a visual interface between the system 100 and the user. The display 108 enables information to be communicated from the system 100, including the computer 102, to the user. The display 108 may be one or more devices used for communication of information, such as features of the virtual tool, features of the haptic object, and/or the location of the virtual tool relative to the haptic object. In some embodiments, the display 108 displays graphical representations of virtual tools and haptic objects in a virtual environment.


The display 108 is connected to the computer 102 and may be any device suitable for displaying text, images, graphics, and/or other visual output. For example, the display 108 may include a standard display screen (e.g., LCD, CRT, plasma, etc.), a touch screen, a wearable display (e.g., eyewear such as glasses or goggles), a projection display, a head-mounted display, a holographic display, and/or any other visual output device. The display 108 may be disposed on or near the computer 102 (e.g., mounted within a cabinet also comprising the computer 102) or may be remote from the computer 102 (e.g., mounted on a wall of an operating room or other location suitable for viewing by the user). The display 108 is preferably adjustable so that the user can position/reposition the display 108 as needed during a surgical procedure. For example, the display 108 may be disposed on an adjustable arm (not shown) or in any other location well-suited for ease of viewing by the user. The display 108 may be used to display any information useful for a medical procedure, such as, for example, images of anatomy generated from an image data set obtained using conventional imaging techniques, graphical models (e.g., CAD models of implants, instruments, anatomy, etc.), graphical representations of a tracked object (e.g., anatomy, tools, implants, etc.), digital or video images, registration information, calibration information, patient data, user data, measurement data, software menus, selection buttons, status information, and the like. The terms model and representation can be used interchangeably to refer to any computerized display of a component (e.g., implant, bone, tissue, etc.) of interest.


This system 100 can be used to determine haptic forces based on a virtual tool. Preferably, the system 100 determines the haptic forces based on the interaction between the virtual tool and a haptic object. A specific configuration of a prior system having components that determine haptic forces based on a virtual tool is shown in U.S. Patent Appl. Pub. No. 2009/0012532 A1 to Quaid et al., published Jan. 8, 2009, and assigned to MAKO Surgical Corp., which is hereby incorporated herein by reference in its entirety. That prior system and its components could be modified to be used in accordance with the present invention. The present invention, however, differs from the prior system at least in that the present invention determines haptic forces in a new and advantageous way not contemplated in the prior system. In particular, the present invention determines the interaction between the virtual tool and the haptic object by using multiple HIGPs for the virtual tool, identifying subsets of the HIGPs, and determining haptic forces in subspaces corresponding to those subsets. Those haptic forces may then be used to determine a total haptic interaction force. A more detailed explanation of the process of determining those haptic forces is provided in the examples below.


Virtual Tool with Haptic Interaction Geometric Primitives (HIGPs)



FIGS. 3 and 4 show an example of an effective area of movement of a virtual tool 10. The virtual tool 10 may represent an object (e.g., a surgical tool) in a virtual environment. For example, the virtual tool 10 may represent the manipulation of a physical tool by a user (e.g., a surgeon) to perform a procedure on a patient, such as cutting a surface of a bone in preparation for installing an implant. As the surgeon manipulates the physical tool, the interaction of the virtual tool 10 with a haptic object (not shown) may guide or limit the surgeon by providing haptic (tactile or force) feedback that constrains the physical tool.


As shown in FIGS. 2, 3 and 4, the virtual tool 10 may have one or more haptic interaction geometric primitives (HIGPs) 20. The HIGPs 20 may correspond to a location on the virtual tool 10 (e.g., a tip of the virtual tool 10). A geometric primitive may be, for example, any one of a point, line, line segment, plane, circle, ellipse, triangle, polygon, or curved arc segment. For example, FIGS. 2 and 3 show a virtual tool 10 wherein the HIGPs 20 are points. Alternatively, FIG. 4 shows a virtual tool 10 wherein the HIGPs 20 are an arc segment and a point.


The computer 102 can determine the interaction of the HIGPs 20 of the virtual tool 10 with one or more boundaries of haptic object(s) in a virtual environment. Boundaries 40 of haptic objects are shown, for example, in FIGS. 2 and 4. A haptic object may be a representation of a pre-defined boundary or component. For example, in the computer-assisted system 100, a pre-operative surgical plan may be used to generate a region for bone resection, and a haptic object may be used to constrain the virtual tool 10 to stay inside boundaries 40 of the haptic object that correspond to the boundaries of the bone-resection region. Accordingly, the resulting haptic forces are generated to guide a surgeon according to his/her pre-operative plan.


As stated above, when haptic forces are determined from multiple HIGPs 20, competing forces may cause instability. For a given virtual tool 10, the system 100 of the present invention is configured to provide substantially stable haptics by identifying subsets of the HIGPs and determining haptic forces in subspaces corresponding to those subsets, as described in further detail below.


Embodiment Using Points as HIGPs


With reference to FIGS. 2, 3 and 5, an embodiment of a process for providing substantially stable haptics that can be executed by the system 100 will now be described. In this embodiment, the HIGPs 20 for a virtual tool 10 are all points. These HIGPs 20, in the form of points, will be referred to herein as haptic interaction points (HIPs). A plurality of these HIPs 20 may be, for example, disposed along an outer edge of the virtual tool 10. As shown in FIG. 3, HIPs 20 are located along a top edge of the virtual tool 10.


In step 210 of FIG. 5, subsets of HIPs 20 for a virtual tool 10 are identified. FIG. 3 shows a first subset A and a second subset B. The first subset A preferably includes a plurality of HIPs 20. The second subset B preferably includes only one HIP 20, though it may include more. As shown in FIG. 3, the HIP 20 of the second subset B is disposed between HIPs 20 of the first subset A.


In step 220, haptic forces are determined based on the interaction of the first subset A of HIPs 20 with a boundary 40 of a haptic object. The haptic forces are determined in a first subspace that omits at least one dimension. Preferably, as shown in FIG. 3, in a three-dimensional coordinate system R3, the first subspace C is defined in two dimensions {x, y} with the third dimension {z} being omitted. Haptic forces from HIPs 20 of subset A may be mapped to a subspace C of the special Euclidean group SE(3). Each force acting on HIPs 20 of subset A may be mapped to subspace C using a projection matrix, which is akin to a spatial filter. For example, the projection matrix that maps the force [fx, fy, fz]T in three-dimensional space R3 to the force [fx, fy, 0]T can be described as







P = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 0 \end{bmatrix}, \quad \text{such that} \quad P \begin{bmatrix} f_x \\ f_y \\ f_z \end{bmatrix} = \begin{bmatrix} f_x \\ f_y \\ 0 \end{bmatrix}






The haptic forces determined from the HIPs 20 in subset A are projected onto subspace C. Accordingly, haptic forces determined from the HIPs 20 of subset A in the x and y dimensions are retained, and haptic forces from the HIPs 20 of subset A in the z dimension are omitted. Mapping the haptic forces from each HIP of the first subset into the first subspace assists in stabilizing a virtual tool with multiple HIGPs by eliminating the unstable nature of competing haptic interaction forces.
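

As a non-limiting illustration of the projection described above, the following short Python sketch applies the matrix P to hypothetical per-HIP forces for subset A; the force values are assumptions for illustration only and are not taken from the disclosure.

```python
import numpy as np

# Projection onto the first subspace C = span{x, y}; the z component is discarded.
P = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0]])

# Hypothetical haptic forces computed at the outer HIPs of subset A.
f_hip1 = np.array([ 1.5, -0.3,  4.0])
f_hip3 = np.array([-1.2,  0.1, -3.8])

# The competing z components of the outer HIPs are filtered out by the projection.
print(P @ f_hip1)   # [ 1.5 -0.3  0. ]
print(P @ f_hip3)   # [-1.2  0.1  0. ]
```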


In step 230, haptic forces from the second subset B of HIPs 20 are determined in a second subspace different from the first subspace. Preferably, as shown in FIG. 3, in a three-dimensional coordinate system, the second subspace D is defined in three dimensions {x, y, z}, with the third dimension {z} (normal to x, y) being the dimension that was omitted in the first subspace C. Accordingly, the haptic forces for the second subset B of HIPs 20 are defined by forces in three dimensions {x, y, z} in the second subspace D. As shown in FIG. 3, the z dimension is normal to the x and y dimensions. As a result, haptic forces normal to subspace C (in the z dimension) are only defined based on the HIPs 20 in subset B. In the embodiment shown in FIG. 3, subset B includes a single HIP 20 positioned between HIPs 20 of subset A. Thus, the haptic forces normal to subspace C are only defined from the single center HIP 20. Relying solely on the single HIP 20 of subset B to define forces normal to the first subspace C reduces the amount of competing haptic interaction forces, which in turn increases stability.


In the preferred embodiment discussed above, the first subspace is a two-dimensional subspace of three-dimensional space R3, and the second subspace is a three-dimensional subspace of three-dimensional space R3. However, other subspaces and other n-dimensional spaces may be utilized. Preferably, the first subspace has at least one less dimension than the second subspace. Even more preferably, the first subspace has only one less dimension than the second subspace.


In step 240, a total haptic interaction force is determined, preferably by summing the haptic forces in the first subspace C and the second subspace D. The total haptic interaction force, f, can be a summation of the individual haptic forces, fi, from each HIP 20 in subset A and subset B, and can be described as:







\vec{f} = \sum_{i=1}^{n} P_i \vec{f}_i,








where P_i represents the projection matrix for each particular HIP 20.
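

As a non-limiting sketch of the summation of steps 220 through 240, the Python fragment below combines per-HIP forces using a projection matrix per subset; all numerical values are hypothetical and chosen only for illustration.

```python
import numpy as np

P_xy  = np.diag([1.0, 1.0, 0.0])    # projection for subset A: forces in {x, y} only
P_xyz = np.eye(3)                   # projection for subset B: full three-dimensional force

# (projection matrix, raw haptic force) for each HIP; values are illustrative only.
hip_forces = [
    (P_xy,  np.array([ 1.5, -0.3,  4.0])),   # HIP 1 (subset A)
    (P_xyz, np.array([ 0.2,  0.0,  2.5])),   # HIP 2 (subset B, center)
    (P_xy,  np.array([-1.2,  0.1, -3.8])),   # HIP 3 (subset A)
]

# Total haptic interaction force: f = sum_i P_i f_i; only HIP 2 contributes along z.
f_total = sum(P @ f for P, f in hip_forces)
print(f_total)
```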


In step 250, a torque can be generated to provide a further constraint on the orientation of the virtual tool 10 (and thus the physical tool). Preferably, the torque \vec{\tau} may be generated from a tool normal \vec{n}_{tool} and a haptic plane normal \vec{n}_{haptic} (see FIG. 2) to properly constrain the orientation. For example,

\vec{\tau} = K\theta\,(\vec{n}_{tool} \times \vec{n}_{haptic}) / \lVert \vec{n}_{tool} \times \vec{n}_{haptic} \rVert,

where K represents the rotational haptic stiffness, and θ is the angular displacement.
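

A minimal Python sketch of this torque computation is given below; it assumes the normals are unit vectors and takes the angular displacement θ as the angle between them, with a hypothetical rotational stiffness value.

```python
import numpy as np

def constraint_torque(n_tool, n_haptic, k_rot):
    """Torque that rotates the tool normal toward the haptic plane normal."""
    axis = np.cross(n_tool, n_haptic)
    norm = np.linalg.norm(axis)
    if norm < 1e-9:                     # normals already aligned: no corrective torque
        return np.zeros(3)
    # Angular displacement between the two unit normals.
    theta = np.arccos(np.clip(np.dot(n_tool, n_haptic), -1.0, 1.0))
    return k_rot * theta * axis / norm  # tau = K*theta*(n_tool x n_haptic)/||n_tool x n_haptic||

n_tool = np.array([0.0, 0.1, 1.0])
n_tool /= np.linalg.norm(n_tool)        # hypothetical, slightly tilted tool normal
n_haptic = np.array([0.0, 0.0, 1.0])    # haptic plane normal
print(constraint_torque(n_tool, n_haptic, k_rot=5.0))
```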


In addition to generating a torque, there are several strategies for reducing the contribution of forces so that the resultant force is stable, as described in A. Petersik, B. Pflesser, U. Tiede, K. Hohne and R. Leuwer, “Realistic Haptic Interaction in Volume Sculpting for Surgery Simulation”, Lecture Notes in Computer Science, vol. 2673, 2003.


Embodiment Using Complex Body as HIGPs


Alternatively, the HIGPs 20 may include other complex bodies, such as a curved arc segment, or may include a combination of complex bodies and point(s). As shown in FIG. 4, both the arc segment and point (point 1) are located along a top edge of the virtual tool 10. The process for providing substantially stable haptics for a virtual tool as shown in FIG. 4 is the same as described above. Preferably in step 210, the arc segment is identified as the first subset (subset A) and the point is identified as the second subset (subset B) of the geometric primitives 20.


In step 220, the resulting forces for the arc segment (subset A) are then mapped to a first subspace of the special Euclidean group SE(3). Here, haptic forces are defined from the resulting interference between the curved arc segment (subset A) and a boundary 40 of a haptic object. As shown in FIG. 4 specifically, haptic forces are determined from the penetration of a non-uniform virtual tool 10 into a boundary 40 (haptic wall). There are multiple elastic contact theories that may be used to estimate contact forces from this displacement, including Hertzian contact models, elastic half-space models (See N. Ahmadi, L. M. Keer, and T. Mura, “Non-Hertzian contact stress analysis for an elastic half-space normal and sliding contact,” Int. J. Solids Structures, vol. 19, no. 4, pp. 357-373, 1983.), and elastic foundation models.


In the elastic foundation model, contact forces are approximated through the assumption that the deformation at one location does not influence deformations at other locations throughout the object. See Y. Bei and B. J. Fregly, “Multibody dynamic simulation of knee contact mechanics,” Medical Engineering & Physics, vol. 26, pp. 777-789, 2004. This contact model comprises independent springs evenly distributed across the contact surface, representing a layer of elastic material. Accordingly, the pressure from any spring element on the surface may be written as








p = \frac{(1 - \nu)\,E}{(1 + \nu)(1 - 2\nu)} \cdot \frac{d}{h},




where E is the spring constant of the elastic layer, ν is Poisson's ratio, h is the thickness of the layer, and d is the spring deformation. In this model, the spring deformation is defined as the interpenetration of the undeformed contact surfaces in the direction of the midsurface normal. From this result, haptic forces arising from the penetration of an arc segment 20 into a boundary 40 of a haptic object may be determined through integration, such that

\vec{f}_i = \int_a^b \vec{p}\, dx.
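

As a non-limiting numerical sketch, the Python fragment below evaluates the elastic foundation pressure and integrates it over a hypothetical penetration profile of the arc segment; the material constants, layer thickness, and penetration profile are assumptions chosen only for illustration.

```python
import numpy as np

def foundation_pressure(d, E=1.0e6, nu=0.3, h=0.005):
    """Elastic foundation pressure for spring deformation d (SI units assumed)."""
    return ((1.0 - nu) * E) / ((1.0 + nu) * (1.0 - 2.0 * nu)) * d / h

# Hypothetical penetration depth d(x) of the arc segment into the haptic boundary,
# sampled along the contact interval [a, b].
a, b = 0.0, 0.01
x = np.linspace(a, b, 101)
d = 0.0005 * np.sin(np.pi * (x - a) / (b - a))   # deepest penetration at mid-contact

# f_i = integral from a to b of p dx, evaluated with the trapezoid rule.
p = foundation_pressure(d)
f_i = np.sum(0.5 * (p[1:] + p[:-1]) * np.diff(x))
print(f_i)
```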


In step 230, haptic forces for point 1 (subset B) are determined in a second subspace different from the first subspace. Preferably, the second subspace is normal to the first subspace. As a result, haptic forces normal to the first subspace are only defined based on point 1.


In step 240, the total haptic force on the virtual tool 10 can be defined from a summation of the individual components projected into the first subspace and second subspace, such that







\vec{f} = \sum_{i=1}^{n} P_i \vec{f}_i,








where f_i represents the contribution of forces from a particular arc segment, point, or surface. For example, the force f_1 may be defined from the penetration of the arc segment in the first subspace, and the force f_2 may be defined from the penetration of point 1 in the second subspace, normal to the first subspace.


In step 250, a torque is determined to further constrain the orientation of the virtual tool 10. Preferably, the torque may be generated from a tool normal \vec{n}_{tool} and a haptic plane normal \vec{n}_{haptic} (see FIG. 2) to properly constrain the orientation of the tool. For example,

\vec{\tau} = K\theta\,(\vec{n}_{tool} \times \vec{n}_{haptic}) / \lVert \vec{n}_{tool} \times \vec{n}_{haptic} \rVert,


where K represents the rotational haptic stiffness, and θ is the angular displacement.


CONCLUSION

The present invention is not limited to the embodiments disclosed above. Those embodiments, however, disclose examples of configurations that can advantageously provide substantially stable haptics by using a computer to identify multiple HIGPs for a virtual tool, identify subsets of the HIGPs, and determine haptic forces in subspaces corresponding to those subsets. Those haptic forces may then be used to determine a total haptic interaction force. Embodiments can be constructed to overcome the instability problems of prior haptic systems. The present invention can be implemented in a wide variety of configurations beyond those disclosed herein.


For example, the above described techniques can be implemented in a distributed computing system that includes a back-end component. The back-end component can, for example, be a data server, a middleware component, and/or an application server. The above described techniques can be implemented in a distributed computing system that includes a front-end component. The front-end component can, for example, be a client computer having a graphical user interface, a Web browser through which a user can interact with an example implementation, and/or other graphical user interfaces for a transmitting device. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network).


The system can include clients and servers. A client and a server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.


Communication networks can include, for example, the Internet, a carrier internet protocol (IP) network (e.g., local area network (LAN), wide area network (WAN), campus area network (CAN), metropolitan area network (MAN), home area network (HAN)), a private IP network, an IP private branch exchange (IPBX), a wireless network (e.g., radio access network (RAN), 802.11 network, 802.16 network, general packet radio service (GPRS) network, HiperLAN), and/or other packet-based networks. Circuit-based networks can include, for example, the public switched telephone network (PSTN), a private branch exchange (PBX), a wireless network (e.g., RAN, Bluetooth, code-division multiple access (CDMA) network, time division multiple access (TDMA) network, global system for mobile communications (GSM) network), and/or other circuit-based networks.


The transmitting device can include, for example, a computer, a computer with a browser device, a telephone, an IP phone, a mobile device (e.g., cellular phone, personal digital assistant (PDA) device, laptop computer, electronic mail device), and/or other communication devices. The browser device includes, for example, a computer (e.g., desktop computer, laptop computer) with a world wide web browser (e.g., Microsoft™ Internet Explorer™ available from Microsoft Corporation, Mozilla™ Firefox available from Mozilla Corporation). The mobile computing device includes, for example, a personal digital assistant (PDA).


Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. It is intended that the specification and examples be considered as exemplary only.

Claims
  • 1. A system for providing substantially stable control of a surgical instrument, comprising: a surgical manipulator for manipulating the surgical instrument; and at least one computer configured to: identify a first subset and a second subset of interaction geometric primitives associated with a virtual tool representing the surgical instrument; determine, based on the first subset, control forces in a first subspace; determine based on the second subset, control forces in a second subspace having at least one additional dimension to first subspace; wherein control forces in the additional dimension are only determined based on the second subset of interaction geometric primitives, which is different than the first subset of interaction geometric primitives; and determine a torque to constrain an orientation of the surgical instrument, wherein determining the torque comprises defining a virtual tool normal and a control plane normal and using the virtual tool normal and control plane normal to calculate the torque.
  • 2. The system of claim 1, wherein the first subspace has only one less dimension than the second subspace.
  • 3. The system of claim 1, wherein the first subspace is a two-dimensional subspace of three-dimensional space R3, and the second subspace is a three-dimensional subspace of three-dimensional space R3.
  • 4. The system of claim 1, wherein the at least one computer is configured such that a total interaction force is determined by summing at least the control forces in the first subspace and the control forces in the second subspace.
  • 5. The system of claim 1, wherein the interaction geometric primitives are at least one of a point, line, line segment, plane, circle, ellipse, triangle, polygon and curved arc segment.
  • 6. The system of claim 1, wherein the first subset includes a plurality of interaction geometric primitives.
  • 7. The system of claim 6, wherein the second subset includes only one interaction geometric primitive.
  • 8. The system of claim 7, wherein the interaction geometric primitive of the second subset is disposed between interaction geometric primitives of the first subset.
  • 9. The system of claim 1, wherein the first subset constitutes a non-linear boundary of the virtual tool.
  • 10. The system of claim 1, wherein the at least one computer is configured to determine a torque based at least in part on an orientation of the virtual tool.
  • 11. The system of claim 1, wherein the computer is further configured to selectively operate the surgical manipulator in a first operating mode and a second operating mode.
  • 12. The system of claim 11, wherein the first operating mode is a manual mode and the second operating mode is a semi-autonomous mode.
  • 13. A method for providing substantially stable control of a surgical instrument, comprising: identifying a first subset and a second subset of interaction geometric primitives associated with a virtual tool representing a surgical instrument; determining based on the first subset, by at least one computer, control forces in a first subspace; determining based on the second subset, by at least one computer, control forces in a second subspace having at least one additional dimension to the first subspace; wherein control forces in the additional dimension are only determined based on the second subset of interaction geometric primitives, which is different than the first subset of interaction geometric primitives; and determining a torque to constrain an orientation of the surgical instrument, wherein determining the torque comprises defining a virtual tool normal and a control plane normal and using the virtual tool normal and control plane normal to calculate the torque.
  • 14. The method of claim 13, wherein the first subspace has only one less dimension than the second subspace.
  • 15. The method of claim 13, wherein the first subspace is a two-dimensional subspace of three-dimensional space R3, and the second subspace is a three-dimensional subspace of three-dimensional space R3.
  • 16. The method of claim 13, further comprising: determining, by at least one computer, a total interaction force by summing at least the control forces in the first subspace and the control forces in the second subspace.
  • 17. The method of claim 13, wherein the interaction geometric primitives are at least one of a point, line, line segment, plane, circle, ellipse, triangle, polygon and curved arc segment.
  • 18. The method of claim 13, wherein the first subset includes a plurality of interaction geometric primitives.
  • 19. The method of claim 13, wherein the second subset includes only one interaction geometric primitive.
  • 20. The method of claim 19, wherein the interaction geometric primitive of the second subset is disposed between interaction geometric primitives of the first subset.
  • 21. The method of claim 13, wherein the first subset constitutes a non-linear boundary of the virtual tool.
  • 22. The method of claim 13, further comprising: determining a torque based at least in part on an orientation of the virtual tool.
CROSS-REFERENCE TO RELATED PATENT APPLICATIONS

This application is a continuation in part of U.S. application Ser. No. 13/339,369 filed Dec. 28, 2011, which claims the benefit of and priority to U.S. Provisional Application No. 61/428,210, filed Dec. 29, 2010, both of which are incorporated herein by reference in their entireties. This application is also a continuation in part of U.S. application Ser. No. 15/401,567 filed Jan. 9, 2017, which is a continuation of U.S. application Ser. No. 14/841,062 filed Aug. 31, 2015 and granted as U.S. Pat. No. 9,566,125, which is a divisional of U.S. application Ser. No. 13/958,070 filed Aug. 2, 2013 and granted as U.S. Pat. No. 9,119,655, which claims the benefit of and priority to U.S. Provisional Application No. 61/679,258 filed Aug. 3, 2012 and U.S. Provisional Application No. 61/792,251 filed Mar. 15, 2013, all of which are incorporated herein by reference in their entireties.

US Referenced Citations (410)
Number Name Date Kind
4425818 Asada et al. Jan 1984 A
4442493 Wakai et al. Apr 1984 A
4696167 Matsui et al. Sep 1987 A
4863133 Bonnell Sep 1989 A
4979949 Matsen et al. Dec 1990 A
5078140 Kwoh Jan 1992 A
5086401 Glassman et al. Feb 1992 A
5091861 Geller et al. Feb 1992 A
5154717 Matsen et al. Oct 1992 A
5231693 Backes et al. Jul 1993 A
5279309 Taylor et al. Jan 1994 A
5299288 Glassman et al. Mar 1994 A
5339799 Kami et al. Aug 1994 A
5343391 Mushabac Aug 1994 A
5397323 Taylor et al. Mar 1995 A
5399951 Lavallee et al. Mar 1995 A
5408409 Glassman et al. Apr 1995 A
5434489 Cheng et al. Jul 1995 A
5445144 Wodicka et al. Aug 1995 A
5562448 Mushabac Oct 1996 A
5569578 Mushabac Oct 1996 A
5576727 Rosenberg et al. Nov 1996 A
5629594 Jacobus et al. May 1997 A
5630431 Taylor May 1997 A
5682886 Delp et al. Nov 1997 A
5689159 Culp et al. Nov 1997 A
5691898 Rosenberg et al. Nov 1997 A
5695500 Taylor et al. Dec 1997 A
5699038 Ulrich et al. Dec 1997 A
5710870 Ohm et al. Jan 1998 A
5711299 Manwaring et al. Jan 1998 A
5721566 Rosenberg et al. Feb 1998 A
5730129 Darrow et al. Mar 1998 A
5731804 Rosenberg Mar 1998 A
5734373 Rosenberg et al. Mar 1998 A
5737500 Seraji et al. Apr 1998 A
5739811 Rosenberg et al. Apr 1998 A
5748767 Raab May 1998 A
5762458 Wang et al. Jun 1998 A
5767648 Morel et al. Jun 1998 A
5767839 Rosenberg Jun 1998 A
5769092 Williamson, Jr. Jun 1998 A
5769640 Jacobus et al. Jun 1998 A
5776136 Sahay et al. Jul 1998 A
5784542 Ohm et al. Jul 1998 A
5789890 Genov et al. Aug 1998 A
5792135 Madhani et al. Aug 1998 A
5792147 Evans et al. Aug 1998 A
5806518 Mittelstadt Sep 1998 A
5807377 Madhani et al. Sep 1998 A
5815640 Wang et al. Sep 1998 A
5820623 Ng Oct 1998 A
5824085 Sahay et al. Oct 1998 A
5831408 Jacobus et al. Nov 1998 A
5841950 Wang et al. Nov 1998 A
5847528 Hui et al. Dec 1998 A
5855553 Tajima et al. Jan 1999 A
5855583 Wang et al. Jan 1999 A
5871018 Delp et al. Feb 1999 A
5880976 Digioia, III et al. Mar 1999 A
5882206 Gillio Mar 1999 A
5891157 Day et al. Apr 1999 A
5907487 Rosenberg et al. May 1999 A
5907664 Wang et al. May 1999 A
5929607 Rosenberg et al. Jul 1999 A
5950629 Taylor et al. Sep 1999 A
5952796 Colgate et al. Sep 1999 A
5959613 Rosenberg et al. Sep 1999 A
5966305 Watari et al. Oct 1999 A
5971976 Wang et al. Oct 1999 A
5976156 Taylor et al. Nov 1999 A
5993338 Kato et al. Nov 1999 A
5995738 Digioia et al. Nov 1999 A
5999168 Rosenberg et al. Dec 1999 A
6002859 Digioia, III et al. Dec 1999 A
6020876 Rosenberg et al. Feb 2000 A
6024576 Bevirt et al. Feb 2000 A
6033415 Mittelstadt et al. Mar 2000 A
6037927 Rosenberg Mar 2000 A
6046727 Rosenberg et al. Apr 2000 A
6050718 Schena et al. Apr 2000 A
6063095 Wang et al. May 2000 A
6067077 Martin et al. May 2000 A
6084587 Tarr et al. Jul 2000 A
6097168 Katoh et al. Aug 2000 A
6102850 Wang et al. Aug 2000 A
6111577 Zilles et al. Aug 2000 A
6124693 Okanda et al. Sep 2000 A
6157873 Decamp et al. Dec 2000 A
6163124 Ito et al. Dec 2000 A
6181096 Hashimoto et al. Jan 2001 B1
6191796 Tarr Feb 2001 B1
6205411 Digioia, III et al. Mar 2001 B1
6228089 Wahrburg May 2001 B1
6233504 Das et al. May 2001 B1
6236875 Bucholz et al. May 2001 B1
6236906 Muller May 2001 B1
6278902 Hashimoto et al. Aug 2001 B1
6281651 Haanpaa et al. Aug 2001 B1
6300937 Rosenberg Oct 2001 B1
6304050 Skaar et al. Oct 2001 B1
6311100 Sarma et al. Oct 2001 B1
6314312 Wessels et al. Nov 2001 B1
6322567 Mittelstadt et al. Nov 2001 B1
6325808 Bernard et al. Dec 2001 B1
6329777 Itabashi et al. Dec 2001 B1
6329778 Culp et al. Dec 2001 B1
6330837 Charles et al. Dec 2001 B1
6336931 Hsu et al. Jan 2002 B1
6339735 Peless et al. Jan 2002 B1
6341231 Ferre et al. Jan 2002 B1
6342880 Rosenberg et al. Jan 2002 B2
6347240 Foley et al. Feb 2002 B1
6351659 Vilsmeier Feb 2002 B1
6351661 Cosman Feb 2002 B1
6352532 Kramer et al. Mar 2002 B1
6366272 Rosenberg et al. Apr 2002 B1
6368330 Hynes et al. Apr 2002 B1
6369834 Zilles et al. Apr 2002 B1
6377839 Kalfas et al. Apr 2002 B1
6385475 Cinquin et al. May 2002 B1
6385508 McGee et al. May 2002 B1
6385509 Das et al. May 2002 B2
6401006 Mizuno et al. Jun 2002 B1
6405072 Cosman Jun 2002 B1
6408253 Rosenberg et al. Jun 2002 B2
6411276 Braun et al. Jun 2002 B1
6413264 Jensen et al. Jul 2002 B1
6414711 Arimatsu et al. Jul 2002 B2
6417638 Guy et al. Jul 2002 B1
6421048 Shih et al. Jul 2002 B1
6423077 Carol et al. Jul 2002 B2
6424356 Chang et al. Jul 2002 B2
6430434 Mittelstadt Aug 2002 B1
6432112 Brock et al. Aug 2002 B2
6434415 Foley et al. Aug 2002 B1
6436107 Wang et al. Aug 2002 B1
6443894 Sumanaweera et al. Sep 2002 B1
6450978 Brosseau et al. Sep 2002 B1
6461372 Jensen et al. Oct 2002 B1
6463360 Terada et al. Oct 2002 B1
6466815 Saito et al. Oct 2002 B1
6468265 Evans et al. Oct 2002 B1
6473635 Rasche Oct 2002 B1
6486872 Rosenberg et al. Nov 2002 B2
6490467 Bucholz et al. Dec 2002 B1
6491702 Heilbrun et al. Dec 2002 B2
6494882 Lebouitz et al. Dec 2002 B1
6501997 Kakino Dec 2002 B1
6507165 Kato et al. Jan 2003 B2
6507773 Parker et al. Jan 2003 B2
6514082 Kaufman et al. Feb 2003 B2
6520228 Kennedy et al. Feb 2003 B1
6522906 Salisbury et al. Feb 2003 B1
6533737 Brosseau et al. Mar 2003 B1
6535756 Simon et al. Mar 2003 B1
6542770 Zylka et al. Apr 2003 B2
6562055 Walen May 2003 B2
6620174 Jensen et al. Sep 2003 B2
6636161 Rosenberg Oct 2003 B2
6639581 Moore et al. Oct 2003 B1
6665554 Charles et al. Dec 2003 B1
6671651 Goodwin et al. Dec 2003 B2
6676669 Charles et al. Jan 2004 B2
6697048 Rosenberg et al. Feb 2004 B2
6699177 Wang et al. Mar 2004 B1
6702805 Stuart Mar 2004 B1
6704002 Martin et al. Mar 2004 B1
6704683 Hasser Mar 2004 B1
6704694 Basdogan et al. Mar 2004 B1
6711432 Krause et al. Mar 2004 B1
6723106 Charles et al. Apr 2004 B1
6728599 Wang et al. Apr 2004 B2
6756761 Takahashi et al. Jun 2004 B2
6757582 Brisson et al. Jun 2004 B2
6778850 Adler et al. Aug 2004 B1
6778867 Ziegler et al. Aug 2004 B1
6781569 Gregorio et al. Aug 2004 B1
6785572 Yanof et al. Aug 2004 B2
6785593 Wang et al. Aug 2004 B2
6788999 Green Sep 2004 B2
6793653 Sanchez et al. Sep 2004 B2
6799106 Fukushima et al. Sep 2004 B2
6804547 Pelzer et al. Oct 2004 B2
6810314 Tashiro et al. Oct 2004 B2
6827723 Carson Dec 2004 B2
6832119 Miller Dec 2004 B2
6833846 Hasser Dec 2004 B2
6837892 Shoham Jan 2005 B2
6856888 Kawai Feb 2005 B2
6871117 Wang et al. Mar 2005 B2
6892110 Inoue et al. May 2005 B2
6892112 Wang et al. May 2005 B2
6892129 Miyano May 2005 B2
6895306 Ebisawa et al. May 2005 B2
6903721 Braun et al. Jun 2005 B2
6904823 Levin et al. Jun 2005 B2
6941224 Fukuyasu Sep 2005 B2
6958752 Jennings et al. Oct 2005 B2
6963792 Green Nov 2005 B1
6978166 Foley et al. Dec 2005 B2
6982700 Rosenberg et al. Jan 2006 B2
6999852 Green Feb 2006 B2
7003368 Koike et al. Feb 2006 B2
7006895 Green Feb 2006 B2
7030585 Iwashita et al. Apr 2006 B2
7034491 Kozai et al. Apr 2006 B2
7035711 Watanabe et al. Apr 2006 B2
7035716 Harris et al. Apr 2006 B2
7038657 Rosenberg et al. May 2006 B2
7042175 Watanabe May 2006 B2
7044039 Powell May 2006 B2
7047117 Akiyama et al. May 2006 B2
7055789 Libbey et al. Jun 2006 B2
7056123 Gregorio et al. Jun 2006 B2
7084596 Iwashita et al. Aug 2006 B2
7084867 Ho Aug 2006 B1
7086056 Fukushima Aug 2006 B2
7092791 Terada et al. Aug 2006 B2
7097640 Wang et al. Aug 2006 B2
7102314 Hayashi Sep 2006 B2
7102635 Shin et al. Sep 2006 B2
7103499 Goodwin et al. Sep 2006 B2
7139601 Bucholz et al. Nov 2006 B2
7155316 Sutherland et al. Dec 2006 B2
7181315 Watanabe et al. Feb 2007 B2
7193607 Moore et al. Mar 2007 B2
7204844 Jensen et al. Apr 2007 B2
7206626 Quaid, III Apr 2007 B2
7206627 Abovitz et al. Apr 2007 B2
7209117 Rosenberg et al. Apr 2007 B2
7215326 Rosenberg May 2007 B2
7221983 Watanabe et al. May 2007 B2
7225404 Zilles et al. May 2007 B1
7239940 Wang et al. Jul 2007 B2
7245202 Levin Jul 2007 B2
7249951 Bevirt et al. Jul 2007 B2
7260437 Senoo et al. Aug 2007 B2
7260733 Ichikawa et al. Aug 2007 B2
7280095 Grant Oct 2007 B2
7283120 Grant Oct 2007 B2
7319466 Tarr et al. Jan 2008 B1
7346417 Luth et al. Mar 2008 B2
7404716 Gregorio et al. Jul 2008 B2
7422582 Malackowski et al. Sep 2008 B2
7447604 Braun et al. Nov 2008 B2
7454268 Jinno Nov 2008 B2
7460104 Rosenberg Dec 2008 B2
7460105 Rosenberg et al. Dec 2008 B2
7466303 Yi et al. Dec 2008 B2
7468594 Svensson et al. Dec 2008 B2
7491198 Kockro Feb 2009 B2
7542826 Hanzawa Jun 2009 B2
7543588 Wang et al. Jun 2009 B2
7573461 Rosenberg Aug 2009 B2
7577504 Sawada et al. Aug 2009 B2
7590458 Endo et al. Sep 2009 B2
7625383 Charles et al. Dec 2009 B2
7648513 Green et al. Jan 2010 B2
7657356 Iwashita et al. Feb 2010 B2
7660623 Hunter et al. Feb 2010 B2
7667687 Cruz-Hernandez et al. Feb 2010 B2
7683565 Quaid et al. Mar 2010 B2
7714836 Rodomista et al. May 2010 B2
7725162 Malackowski et al. May 2010 B2
7742801 Neubauer et al. Jun 2010 B2
7744608 Lee et al. Jun 2010 B2
7747311 Quaid, III Jun 2010 B2
7765890 Inoue et al. Aug 2010 B2
7800609 Tarr et al. Sep 2010 B2
7813368 Ootaka Oct 2010 B2
7813784 Marquart et al. Oct 2010 B2
7813838 Sommer Oct 2010 B2
7815644 Masini Oct 2010 B2
7818044 Dukesherer et al. Oct 2010 B2
7824424 Jensen et al. Nov 2010 B2
7831292 Quaid et al. Nov 2010 B2
7835784 Mire et al. Nov 2010 B2
7843158 Prisco Nov 2010 B2
7853356 Tsai et al. Dec 2010 B2
7853358 Joly Dec 2010 B2
7864173 Handley et al. Jan 2011 B2
7881917 Nagatsuka et al. Feb 2011 B2
7892243 Stuart Feb 2011 B2
7914522 Morley et al. Mar 2011 B2
7916121 Braun et al. Mar 2011 B2
7950306 Stuart May 2011 B2
7969288 Braun et al. Jun 2011 B2
8004229 Nowlin et al. Aug 2011 B2
8005571 Sutherland et al. Aug 2011 B2
8005659 Nelson et al. Aug 2011 B2
8010180 Quaid et al. Aug 2011 B2
8013847 Anastas Sep 2011 B2
8049457 Okita et al. Nov 2011 B2
8049734 Rosenberg et al. Nov 2011 B2
8054028 Aoyama et al. Nov 2011 B2
8090475 Blanc et al. Jan 2012 B2
8095200 Quaid, III Jan 2012 B2
8271134 Kato et al. Sep 2012 B2
8287522 Moses et al. Oct 2012 B2
8391954 Quaid, III Mar 2013 B2
8498744 Odermatt et al. Jul 2013 B2
8560047 Haider et al. Oct 2013 B2
8571628 Kang Oct 2013 B2
8831779 Ortmaier et al. Sep 2014 B2
9364291 Bellettre et al. Jun 2016 B2
20020035321 Bucholz et al. Mar 2002 A1
20030069591 Carson et al. Apr 2003 A1
20030208296 Brisson et al. Nov 2003 A1
20030216816 Ito et al. Nov 2003 A1
20040010190 Shahidi Jan 2004 A1
20040024311 Quaid, III Feb 2004 A1
20040034283 Quaid Feb 2004 A1
20040034302 Abovitz et al. Feb 2004 A1
20040077939 Graumann Apr 2004 A1
20040106916 Quaid Jun 2004 A1
20040128030 Nagata et al. Jul 2004 A1
20040148036 Sunami Jul 2004 A1
20040157188 Luth et al. Aug 2004 A1
20040243147 Lipow Dec 2004 A1
20050171553 Schwarz et al. Aug 2005 A1
20060071625 Nakata et al. Apr 2006 A1
20060091842 Nishiyama May 2006 A1
20060109266 Itkowitz et al. May 2006 A1
20060111813 Nishiyama May 2006 A1
20060142657 Quaid Jun 2006 A1
20060155262 Kishi et al. Jul 2006 A1
20060176242 Jaramaz et al. Aug 2006 A1
20060257379 Giordano et al. Nov 2006 A1
20060284587 Teshima et al. Dec 2006 A1
20070013336 Nowlin et al. Jan 2007 A1
20070085496 Philipp et al. Apr 2007 A1
20070249911 Simon Oct 2007 A1
20070260394 Dean Nov 2007 A1
20070265527 Wohlgemuth Nov 2007 A1
20070270685 Kang et al. Nov 2007 A1
20070287911 Haid et al. Dec 2007 A1
20080001565 Nakashima et al. Jan 2008 A1
20080004633 Arata Jan 2008 A1
20080009697 Haider et al. Jan 2008 A1
20080010706 Moses et al. Jan 2008 A1
20080058776 Jo et al. Mar 2008 A1
20080065111 Blumenkranz et al. Mar 2008 A1
20080077158 Haider et al. Mar 2008 A1
20080114267 Lloyd et al. May 2008 A1
20080161829 Kang Jul 2008 A1
20090003975 Kuduvalli et al. Jan 2009 A1
20090012532 Quaid et al. Jan 2009 A1
20090043556 Axelson et al. Feb 2009 A1
20090068620 Knobel et al. Mar 2009 A1
20090082784 Meissner et al. Mar 2009 A1
20090088774 Swarup et al. Apr 2009 A1
20090096148 Usui Apr 2009 A1
20090099680 Usui Apr 2009 A1
20090102767 Shiomi Apr 2009 A1
20090112316 Umemoto et al. Apr 2009 A1
20090149867 Glozman et al. Jun 2009 A1
20090245992 Kato Oct 2009 A1
20090248038 Blumenkranz et al. Oct 2009 A1
20090259412 Brogardh Oct 2009 A1
20090308683 Suzuki Dec 2009 A1
20100076474 Yates et al. Mar 2010 A1
20100094312 Morales et al. Apr 2010 A1
20100137882 Quaid, III Jun 2010 A1
20100154578 Duval Jun 2010 A1
20100168950 Nagano Jul 2010 A1
20100174410 Greer et al. Jul 2010 A1
20100286826 Tsusaka et al. Nov 2010 A1
20100292707 Ortmaier et al. Nov 2010 A1
20100331859 Omori Dec 2010 A1
20110077590 Plicchi et al. Mar 2011 A1
20110082468 Hagag et al. Apr 2011 A1
20110106102 Balicki et al. May 2011 A1
20110118751 Balaji et al. May 2011 A1
20110130761 Plaskos et al. Jun 2011 A1
20110152676 Groszmann et al. Jun 2011 A1
20110160745 Fielding et al. Jun 2011 A1
20110257653 Hughes et al. Oct 2011 A1
20110263971 Nikou et al. Oct 2011 A1
20110264107 Nikou et al. Oct 2011 A1
20110264112 Nowlin et al. Oct 2011 A1
20110277580 Cooper et al. Nov 2011 A1
20110295268 Roelle et al. Dec 2011 A1
20110301500 Maguire et al. Dec 2011 A1
20110306985 Inoue et al. Dec 2011 A1
20120059378 Farrell Mar 2012 A1
20120071752 Sewell et al. Mar 2012 A1
20120071893 Smith et al. Mar 2012 A1
20120123441 Au et al. May 2012 A1
20120143084 Shoham Jun 2012 A1
20120173021 Tsusaka Jul 2012 A1
20120197182 Millman et al. Aug 2012 A1
20120245595 Kesavadas et al. Sep 2012 A1
20120330429 Axelson et al. Dec 2012 A1
20130006267 Odermatt et al. Jan 2013 A1
20130019883 Worm et al. Jan 2013 A1
20130035690 Mittelstadt et al. Feb 2013 A1
20130035696 Qutub Feb 2013 A1
20130060278 Bozung et al. Mar 2013 A1
20130096574 Kang et al. Apr 2013 A1
20130116706 Lee et al. May 2013 A1
20130172902 Lightcap et al. Jul 2013 A1
20130172905 Iorgulescu Jul 2013 A1
20130178868 Roh Jul 2013 A1
20130304258 Taylor et al. Nov 2013 A1
20130325029 Hourtash et al. Dec 2013 A1
20130345718 Crawford et al. Dec 2013 A1
20140135795 Yanagihara May 2014 A1
20140148818 Komuro et al. May 2014 A1
20140195205 Benker et al. Jul 2014 A1
Foreign Referenced Citations (47)
Number Date Country
101031236 Sep 2007 CN
101815981 Aug 2010 CN
1 680 007 Jul 2006 EP
1 871 267 Jan 2008 EP
1 973 487 Jan 2008 EP
2 666 428 Nov 2013 EP
WO-9611624 Apr 1996 WO
WO-9937220 Jul 1999 WO
WO-0021450 Apr 2000 WO
WO-0035366 Jun 2000 WO
WO-0059397 Oct 2000 WO
WO-0060571 Oct 2000 WO
WO-0200131 Jan 2002 WO
WO-0224051 Mar 2002 WO
WO-02060653 Aug 2002 WO
WO-02065931 Aug 2002 WO
WO-02074500 Sep 2002 WO
WO-02076302 Oct 2002 WO
WO-03086714 Oct 2003 WO
WO-03094108 Nov 2003 WO
WO-2004001569 Dec 2003 WO
WO-2004014244 Feb 2004 WO
WO-2004019785 Mar 2004 WO
WO-2004069036 Aug 2004 WO
WO-2005009215 Feb 2005 WO
WO-2005122916 Dec 2005 WO
WO-2006063156 Jun 2006 WO
WO-2006058633 Aug 2006 WO
WO-2006091494 Aug 2006 WO
WO-2007017642 Feb 2007 WO
WO-2007111749 Oct 2007 WO
WO-2007117297 Oct 2007 WO
WO-2007136739 Nov 2007 WO
WO-2007136768 Nov 2007 WO
WO-2007136769 Nov 2007 WO
WO-2007136771 Nov 2007 WO
WO-2009059330 May 2009 WO
WO-2011021192 Feb 2011 WO
WO-2011088541 Jul 2011 WO
WO-2011106861 Sep 2011 WO
WO-2011113483 Sep 2011 WO
WO-2011128766 Oct 2011 WO
WO-2011133873 Oct 2011 WO
WO-2011133927 Oct 2011 WO
WO-2011134083 Nov 2011 WO
WO-2012018816 Feb 2012 WO
WO-2013181507 Dec 2013 WO
Non-Patent Literature Citations (107)
Entry
Ahmadi et al., “Non-Hertzian Contact Stress Analysis for an Elastic Half-Space Normal and Sliding Contact,” Int. J. Solids Structures, vol. 19, No. 4, pp. 357-373, 1983.
Ansar et al., “Visual and haptic collaborative tele-presence,” Computers & Graphics, vol. 25, 2001, pp. 789-798.
B. Davies, “A review of robotics in surgery”, Proceedings of the Institution of Mechanical Engineers, Part H: Journal of Engineering in Medicine Jan. 1, 2000, vol. 214, No. 1, pp. 129-14.
B. Davies, “Computer-assisted and robotics surgery,” International Congress and Symposium Series, 1997, pp. 71-82.
B. Preising et al., "A Literature Review: Robots in Medicine," IEEE Engineering in Medicine and Biology Magazine, vol. 10, issue 2, Jun. 1991, pp. 13-22; 10 pages.
B.L. Davies, “Robotics in minimally invasive surgery, Through the Keyhole: Microengineering in Minimally Invasive Surgery,” IEEE Colloquium on Jun. 6, 1995, pp. 5/1-5/2.
Bærentzen, J.A., "Octree-based Volume Sculpting," Proc. Late Breaking Hot Topics, IEEE Visualization '98, 1998, pp. 9-12.
Bainville, et al., Concepts and Methods of Registration for Computer-Integrated Surgery, Computer Assisted Orthopedic Surgery (CAOS), 1999, Hogrefe & Huber Publishers, 22 pages.
Bargar et al., “Primary and Revision Total Hip Replacement Using the Robodoc System,” Clinical Orthopaedics and Related Research, No. 354, Sep. 1998, pp. 82-91.
Bei et al., “Multibody Dynamic Simulation of Knee Contact Mechanics,” Medical Engineering & Physics, vol. 26, pp. 777-789, 2004.
Bouazza-Marouf et al., “Robot-assisted invasive orthopaedic surgery,” Mechatronics in Surgery, vol. 6, issue 4, Jun. 1996, pp. 381-397.
Brandt et al., “CRIGOS: A Compact Robot for Image-Guided Orthopedic Surgery,” Information Technology in Biomedicine, IEEE Transactions on, vol. 3, No. 4, 1999, pp. 252-260.
Brisson et al., “Precision Freehand Sculpting of Bone,” Medical Image Computing and Computer-Assisted Intervention—MICCAI 2004, Lecture Notes in Computer Science, vol. 3217, 2004, pp. 105-112.
Burghart et al., “Robot Controlled Osteotomy in Craniofacial Surgery,” , First International Workshop on Haptic Devices in Medical Applications Proceedings, Jun. 23, 1999, pp. 12-22.
Burghart et al., “Robotergestutzte Osteotomie in der craniofacialen Chirurgie (Robot Clipped osteotomy in craniofacial surgery),” Jul. 1, 1999, 250 pages.
Catto et al., “Iterative Dynamics with Temporal Coherence,” Feb. 2005, 24 pages.
Catto, “Soft Constraints Reinventing the Spring,” Game Developer Conference, 2011, 51 pages.
Choi et al., “Flexure-based Manipulator for Active Handheld Microsurgical Instrument, Engineering in Medicine and Biology Society,” Proceedings of the 2005 IEEE Engineering in Medicine and Biology 27th Annual Conference of the Digital Object Identifier, 2005, pp. 5085-5088.
Colgate et al., "Issues in the Haptic Display of Tool Use," Intelligent Robots and Systems 95, 'Human Robot Interaction and Cooperative Robots', Proceedings, 1995 IEEE/RSJ International Conference on, vol. 3, 1995, pp. 140-145.
D. Engel et al., “A Safe Robot System for Craniofacial Surgery”, Robotics and Automation, 2001. Proceedings 2001 ICRA. IEEE International Conference on (vol. 2), pp. 2020-2024, IEEE; 5 pages.
Davies et al., “ACROBOT—using robots and surgeons synergistically in knee surgery,” Advanced Robotics, ICAR '97. Proceedings., 8th International Conference on, 1997, pp. 173-178.
Davies et al., Active compliance in robotic surgery—the use of force control as a dynamic constraint, Proceedings of the Institution of Mechanical Engineers, Part H: Journal of Engineering in Medicine, vol. 211, Apr. 1, 1997, pp. 285-292.
Davies et al., “Neurobot a special-purpose robot for neurosurgery,” Robotics and Automation, 2000. Proceedings. ICRA '00. IEEE International Conference on, vol. 4, 2000, pp. 4103-4108.
Davies, B., et al., “Active-Constraint Robotics for Surgery”, Proceedings of the IEEE, vol. 94, No. 9, pp. 1696-1704 (2006).
Delp et al., “Computer Assisted Knee Replacement,” Clinical Orthopaedics, vol. 354, Sep. 1998, pp. 49-56.
Digioia et al., “Computer Assisted Orthopaedic Surgery Image Guided and Robotic Assistive Technologies,” Clinical Orthopaedics & Related Research, Sep. 1998, vol. 354, pp. 8-16.
Doignon et al., “Segmentation and guidance of multiple rigid objects for intra-operative endoscopic vision,” Proceeding WDV'05/WDV'06/ICCV'05/ECCV'06 Proceedings of the 2005/2006 International Conference on Dynamical Vision, 2006, pp. 314-327.
Ellis et al., “A surgical planning and guidance system for high tibial osteotomy,” Computer Aided Surgery, vol. 4, Apr. 16, 1999, pp. 264-274.
Fadda et al., “Computer Assisted Planning for Total Knee Arthroplasty,” 1997, pp. 619-628.
Fadda et al., “Computer-Assisted Knee Arthroplasty at Rizzoli Institutes,” First International Symposium on Medical Robotics and Computer Assisted Surgery, Sep. 22-24, 1994, pp. 26-30.
Fadda et al., “Premiers Pas Vers La Dissectomie et la Realisation de Protheses du Genou a L'Aide de Robots,” lnnov. Tech. Bio. Med., vol. 13, No. 4, 1992, pp. 394-409.
Fleute et al., "Incorporating a statistically based shape model into a system for computer-assisted anterior cruciate ligament surgery," Medical Image Analysis, vol. 3, No. 3, Oct. 1999, pp. 209-222.
Gravel et al., Flexible robotic assembly efforts at Ford Motor Company, Intelligent Control, Proceedings of the 2001 IEEE International Symposium on 2001, pp. 173-182.
Gravel et al., Flexible Robotic Assembly, Measuring the Performance and Intelligence of Systems: Proceedings of the 2000 PerMIS Workshop, NIST Interagency/Internal Report (NISTIR), Aug. 2000, pp. 412-418.
Grueneis et al., “Clinical Introduction of the Caspar System Problems and Initial Results,” 4th International Symposium of Computer Assisted Orthopaedic Surgery, 1999, p. 160.
Haider et al., “Minimally Invasive Total Knee Arthroplasty Surgery Through Navigated Freehand Bone Cutting,” Journal of Arthroplasty, vol. 22, No. 4, Jun. 2007, pp. 535-542.
Ham et al., “Accuracy study on the registration of the tibia by means of an intramedullary rod in robot-assisted total knee arthroplasty,” Poster Session—Knee Arthroplasty, Orthopaedic Research Society, Mar. 12-50, 2000, p. 450.
Ham et al., “Machining and Accuracy Studies for a Tibial Knee Implant Using a Force-Controlled Robot,” Computer Aided Surgery, vol. 3, 1998, pp. 123-133.
Harris et al., “Experiences with Robotic Systems for Knee Surgery,” Lecture Notes in Computer Science, vol. 1205, 1997, pp. 757-766.
Harris et al., “Intra-operative Application of a Robotic Knee Surgery System, Medical Image Computing and Computer-Assisted Intervention MICCAI'99,” vol. 1679, 1999, pp. 1116-1124.
Haßfeld et al., "Intraoperative Navigation Techniques Accuracy Tests and Clinical Report," In: Computer Assisted Radiology and Surgery (CARS'98), Jun. 1998, pp. 670-675.
Ho et al., “Force Control for Robotic Surgery,” , ICAR '95, 1995, pp. 21-32.
Ho et al., “Robot Assisted Knee Surgery Establishing a force control strategy incorporating active motion constraint,” IEEE Engineering in Medicine and Biology, vol. 14, No. 3, May/Jun. 1995, pp. 292-300.
Hyosig et al., “Autonomous Suturing using Minimally Invasive Surgical Robots, Control Applications,” Proceedings of the 2000 IEEE International Conference on, Sep. 25-27, 2000, pp. 742-747.
Hyosig et al., “EndoBot a Robotic Assistant in Minimally Invasive Surgeries, Robotics and Automation,” IEEE International Conference on, vol. 2, 2001, pp. 2031-2036.
International Search Report and Written Opinion issued in corresponding International Application No. PCT/US2011/067202 dated May 7, 2012.
J. T. Lea, “Registration Graphs a Language for Modeling and Analyzing Registration in Image-Guided Surgery,” Dec. 1998, 49 pages.
Jakopec et al., “The first clinical application of a “hands-on” robotic knee surgery system,” Computer Aided Surgery, vol. 6, issue 6, 2001, pp. 329-339.
Jaramaz et al. “Range of Motion After Total Hip Arthroplasty Experimental Verification of the Analytical Simulator,” CVRMed-MRCAS'97, Lecture Notes in Computer Science, Feb. 20, 1997, vol. 1205, pp. 573-582.
Kato et al., “A frameless, armless navigational system for computer-assisted neurosurgery”. Technical note, Journal of Neurosurgery, vol. 74, May 1991, pp. 845-849; 5 pages.
Kazanzides et al., “Architecture of a Surgical Robot, Systems, Man and Cybernetics,” IEEE International Conference on, vol. 2, 1992, pp. 1624-1629.
Khadem et al., “Comparative Tracking Error Analysis of Five Different Optical Tracking Systems,” Computer Aided Surgery, vol. 5, 2000, pp. 98-107.
Kienzle et al., “An Integrated CAD-Robotics System for Total Knee Replacement Surgery, Systems, Man and Cybernetics” IEEE International Conference on, vol. 2, 1992, pp. 1609-1614.
Kienzle et al., “Total Knee Replacement Computer-assisted surgical system uses a calibrated robot” Engineering in Medicine and Biology, vol. 14, issue 3, May 1995, pp. 301-306.
Kim et al., “Haptic interaction and volume modeling techniques for realistic dental simulation”, Visual Computers, vol. 22, 2006, pp. 90-98.
Korb et al., “Development and First Patient Trial of a Surgical Robot for Complex Trajectory Milling,” Computer Aided Surgery, vol. 8, 2008, pp. 247-258.
Koseki et al., “Robotic assist for MR-guided surgery using leverage and parallelepiped mechanism,” Medical Image Computing and Computer-Assisted Intervention -MICCAI 2000, Lecture Notes in Computer Science, 2000, vol. 1935, pp. 940-948.
Kozlowski et al., Automated Force Controlled Assembly Utilizing a Novel Hexapod Robot Manipulator, Automation Congress, 2002, Proceedings of the 5th Biannual World, vol. 14, 2002, pp. 547-552; 6 pages.
Lavallee et al., “Computer Assisted Spine Surgery a technique for accurate transpedicular screw fixation using CT data and a 3-D optical localizer,” Journal of Image Guided Surgery, 1995, pp. 65-73.
Lea et al., “Registration and immobilization in robot-assisted surgery,” Journal of Image Guided Surgery, Computer Aided Surgery, col. 1, No. 2, 1995, pp. 80-87.
Leitner et al., “Computer-Assisted Knee Surgical Total Replacement,” Lecture Notes in Computer Science, vol. 1205, 1997, pp. 629-638.
Levison et al., Surgical Navigation for THR A Report on Clinical Trial Utilizing HipNav, MICCAI 2000, LNCS 1935, pp. 1185-1187.
Louhisalmi et al., “Development of a Robotic Surgical Assistant,” 1994, pp. 1043-1044.
Maquet et al., “An Automated Cell for Prosthesis Surgery,” Robotics World, No. 87, 1999, pp. 30-31.
Matsen et al., “Robotic Assistance in Orthopaedic Surgery a Proof of Principle Using Distal Femoral Arthroplasty,” Clinical Orthopaedic Related Research, Nov. 1993, vol. 296, pp. 178-186.
Meng et al., "Remote surgery case robot-assisted teleneurosurgery," Robotics and Automation, 2004. Proceedings. ICRA '04. 2004 IEEE International Conference on, Apr. 26-May 1, 2004, vol. 1, pp. 819-823.
Moctezuma et al., “A Computer and Robotic Aided Surgery System for Accomplishing Osteotomies”, First International Symposium on Medical Robotics and Computer Assisted Surgery, Sep. 22-24, 1994, Pittsburgh, Pennsylvania, US; 6 pages.
Nolte et al., “A Novel Approach to Computer Assisted Spine Surgery”, Proc. First International Symposium on Medical Robotics and Computer Assisted Surgery, Pittsburgh, 1994, pp. 323-328; 7 pages.
O'Toole et al., "Biomechanics for Preoperative Planning and Surgical Simulations in Orthopaedics," Computers in Biology and Medicine, vol. 25, issue 2, Mar. 1995, pp. 183-191.
P. Shinsuk, “Safety Strategies for Human-Robot Interaction in Surgical Environment,” SICE-ICASE, 2006. International Joint Conference, Oct. 18-21, 2006, pp. 1769-1773.
Paul et al., “A Surgical Robot for Total Hip Replacement Surgery, International Conference on Robotics and Automation,” IEEE, 1992, pp. 606-611.
Paul et al., “Development of a Surgical Robot for Cementless Total Hip Arthroplasty”, Clinical Orthopaedics and Related Research, No. 285, Dec. 1992, pp. 57-66.
Paul et al., “Robotic Execution of a Surgical Plan, Systems, Man and Cybernetics, 1992,” IEEE International Conference on, Oct. 18-21, 1992, pp. 1621-1623.
Petersik et al., "Realistic Haptic Interaction in Volume Sculpting for Surgery Simulation," Surgery Simulation and Soft Tissue Modeling, vol. 2673, Jan. 1, 2003, pp. 194-202.
Quaid et al., “Haptic Information Displays for Computer-Assisted Surgery, Robotics and Automation,” IEEE International Conference on, vol. 2, 2002, pp. 2092-2097.
R. Abovitz, “Digital surgery the future of medicine and human-robot symbiotic interaction,” Industrial Robot: An International Journal, vol. 28, issue 5, pp. 401-406.
R. Buckingham, “Robotics in surgery a new generation of surgical tools incorporate computer technology and mechanical actuation to give surgeons much finer control than previously possible during some operations,” IEEE Review, Sep. 1994, pp. 193-196.
R. Buckingham, Safe Active Robotic Devices for Surgery, Systems, Man and Cybernetics, 1993. Systems Engineering in the Service of Human, Conference Proceedings., International Conference on, Oct. 17-20, 1993, vol. 5, pp. 355-358.
R.A. Abovitz, Human-Interactive Medical Robotics, 2000, pp. 71-72.
Raczkowsky et al., Ein Robotersystem für craniomaxillofaciale chirurgische Eingriffe (A robotic system for craniomaxillofacial surgical procedures), Computer Forsch. Entw, vol. 14, 1999, pp. 24-35.
Redlich et al., “Robot assisted craniofacial surgery first clinical evaluation,” Computer Assisted Radiology and Surgery, 1999, pp. 828-833.
Rembold et al., “Surgical Robotics: an Introduction,” Journal of Intelligent and Robotic Systems, vol. 30, No. 1, 2001, pp. 1-28.
Riviere et al., “Modeling and Canceling Tremor in Human-Machine Interfaces,” Engineering in Medicine and Biology Magazine, vol. 15, issue 3, 1996, pp. 29-36.
Rohling et al., “Comparison of Relative Accuracy Between a Mechanical and an Optical Position Tracker for Image-Guided Neurosurgery,” Journal of Image Guided Surgery, vol. 1, No. 4, 1995, pp. 30-34.
S. Lembcke, Realtime Rigid Body Simulation Using Impulses, 2006, 5 pages.
Salisbury et al., “Active Stiffness Control of a Manipulator in Cartesian Coordinates, Decision and Control including the Symposium on Adaptive Processes,” IEEE, vol. 19, Dec. 1980, pp. 95-100.
Santos-Munné et al., “A Stereotactic/Robotic System for Pedicle Screw Placement, Interactive Technology and the New Paradigm for Healthcare”, (Proceedings of the Medicine Meets Virtual Reality III Conference, San Diego, 1995), pp. 326-333, IOS Press and Ohmsha; 8 pages.
Satava, R.M., “Surgical robotics the early chronicles a personal historical perspective,” Surgical Laparoscopic Endoscopic Percutaneous Technology, vol. 12, 2002, pp. 6-16.
Schmidt et al., “EasyGuide Neuro, A New System for Image-Guided Planning, Simulation and Navigation in Neurosurgery,” Biomedical Engineering, vol. 40, supplement 1, 1995, pp. 233-234.
Seibold et al., “Prototype of Instrument for Minimally Invasive Surgery with 6-Axis Force Sensing Capability, Robotics and Automation,” ICRA 2005. Proceedings of the 2005 IEEE International Conference on, 2005, pp. 498-503.
Siebert et al., “Technique and first clinical results of robot-assisted total knee replacement,” The Knee, vol. 9, issue 3, Sep. 2002, pp. 173-180.
Sim et al., “Image-Guided Manipulator Compliant Surgical Planning Methodology for Robotic Skull-Base Surgery, Medical Imaging and Augmented Reality,” Proceedings. International Workshop on, 2001, pp. 26-29.
Simon et al., “Accuracy validation in image-guided orthopaedic surgery,” In Medical Robotics and Computer Assisted Surgery, 1995, pp. 185-192.
Spencer, E.H., "The ROBODOC Clinical Trial a Robotic Assistant for Total Hip Arthroplasty," Orthopaedic Nursing, vol. 14, issue 1, 1996, pp. 9-14.
Spetzger et al., “Frameless Neuronavigation in Modern Neurosurgery, Minimally Invasive Neurosurgery,” vol. 38, Dec. 1995, pp. 163-166.
T. Wang et al., “A robotized surgeon assistant”, Intelligent Robots and Systems '94. 'Advanced Robotic Systems and the Real World', IROS '94. Proceedings of the IEEE/RSJ/GI International Conference on, Sep. 12-16, 1994, pp. 862-869, vol. 2, IEEE, Munich, Germany; 8 pages.
Taylor et al., “, An Image-directed Robotic System for Hip Replacement Surgery,” vol. 8, No. 5, 1990, pp. 111-116.
Taylor et al., “A Model-Based Optimal Planning and Execution System with Active Sensing and Passive Manipulation for Augmentation of Human Precision in Computer-Integrated Surgery, Section 4 Robotic Systems and Task-Level Programming, Experimental Robotics II” The 2nd International Symposium, Lecture Notes in Control and Information Sciences, vol. 190, 1991, pp. 177-195.
Taylor et al., “A Steady-Hand Robotic System for Microsurgical Augementation”, MICCAI99: the Second International Conference on Medical Image Computing and Computer-Assisted Intervention, Cambridge, England, Sep. 19-22, 1999. MICCAI99 Submission #1361999, pp. 1031-1041, Springer-Verlag Berlin Heidelberg; 11 pages.
Taylor et al., “An Image-Directed Robotic System for Precise Orthopaedic Surgery, Robotics and Automation,” IEEE Transactions on vol. 10, issue 3, 1994, pp. 261-275.
Tonet et al., "An Augmented Reality Navigation System for Computer Assisted Arthroscopic Surgery of the Knee," Medical Image Computing and Computer-Assisted Intervention—MICCAI 2000, Lecture Notes in Computer Science, vol. 1935, 2000, pp. 1158-1162.
Troccaz et al., “A passive arm with dynamic constraints a solution to safety problems in medical robotics”, Systems, Man and Cybernetics, 1993. ‘Systems Engineering in the Service of Humans’, Conference Proceedings., International Conference on, Oct. 17-20, 1993, pp. 166-171, vol. 3, IEEE, Le Touquet, FR; 6 pages.
Troccaz et al., “Guiding systems for computer-assisted surgery introducing synergistic devices and discussing the different approaches,” Medical Image Analysis, vol. 2, No. 2, 1998, pp. 101-119.
Troccaz et al., Semi-Active Guiding Systems in Surgery. A Two-DOF Prototype of the Passive Arm with Dynamic Constraints (PADyC), Mechatronics, vol. 6, issue 4, Jun. 1996, pp. 399-421.
Watanabe et al., "Three-Dimensional Digitizer (Neuronavigator); New Equipment for Computed Tomography-Guided Stereotaxic Surgery," Surgical Neurology, vol. 27, issue 6, Jun. 1987, pp. 543-547.
Written Opinion for Application No. PCT/US2013/053451 dated Mar. 19, 2014; 12 pages.
Zilles et al., “A Constraint-Based God-object Method for Haptic Display”, Intelligent Robots and Systems 95. 'Human Robot Interaction and Cooperative Robots', Proceedings. 1995 IEEE/RSJ International Conference on, Aug. 5-9, 1995, pp. 146-151, vol. 3, IEEE, MIT, Cambridge, MA, USA; 6 pages.
Related Publications (1)
Number Date Country
20170177191 A1 Jun 2017 US
Provisional Applications (3)
Number Date Country
61428210 Dec 2010 US
61679258 Aug 2012 US
61792251 Mar 2013 US
Divisions (1)
Number Date Country
Parent 13958070 Aug 2013 US
Child 14841062 US
Continuations (1)
Number Date Country
Parent 14841062 Aug 2015 US
Child 15401567 US
Continuation in Parts (2)
Number Date Country
Parent 13339369 Dec 2011 US
Child 15451257 US
Parent 15401567 Jan 2017 US
Child 13339369 US