The invention relates to a surgical system and, more particularly, to a surgical system and method for orthopedic joint replacement.
Minimally invasive surgery (MIS) is the performance of surgery through incisions that are considerably smaller than incisions used in traditional surgical approaches. For example, in an orthopedic application such as total knee replacement surgery, an MIS incision length may be in a range of about 4 to 6 inches whereas an incision length in traditional total knee surgery is typically in a range of about 6 to 12 inches. As a result of the smaller incision length, MIS procedures are generally less invasive than traditional surgical approaches, which minimizes trauma to soft tissue, reduces post-operative pain, promotes earlier mobilization, shortens hospital stays, and speeds rehabilitation.
One drawback of MIS is that the small incision size reduces a surgeon's ability to view and access the anatomy. For example, in minimally invasive orthopedic joint replacement, limited visibility and limited access to the joint increase the complexity of assessing proper implant position and of reshaping bone. As a result, accurate placement of implants may be more difficult. Conventional techniques for counteracting these problems include, for example, surgical navigation, positioning the leg for optimal joint exposure, and employing specially designed, downsized instrumentation and complex surgical techniques. Such techniques, however, typically require a large amount of specialized instrumentation, a lengthy training process, and a high degree of skill. Moreover, operative results for a single surgeon and among various surgeons are not sufficiently predictable, repeatable, and/or accurate. As a result, implant performance and longevity varies among patients.
In orthopedic applications, one drawback of both MIS and traditional surgical approaches is that healthy as well as diseased bone is removed when the bone is prepared to receive the implant. For example, a total knee replacement can require removal of up to ½ inch of bone on each of three compartments of the knee.
Another drawback of both MIS and traditional orthopedic surgical approaches is that such approaches do not enhance the surgeon's inherent surgical skill in a cooperative manner. For example, some conventional techniques for joint replacement include autonomous robotic systems to aid the surgeon. Such systems, however, typically serve primarily to enhance bone machining by performing autonomous cutting with a high speed burr or by moving a drill guide into place and holding the position of the drill guide while the surgeon inserts cutting tools through the guide. Although such systems enable precise bone resections for improved implant fit and placement, they act autonomously (rather than cooperatively with the surgeon) and thus require the surgeon to cede a degree of control to the robot. Additional drawbacks of autonomous systems include the large size of the robot, poor ergonomics, the need to rigidly clamp the bone during registration and cutting, increased incision length for adequate robot access, and limited acceptance by surgeons and regulatory agencies due to the autonomous nature of the system.
Other conventional robotic systems include robots that cooperatively interact with the surgeon. One drawback of conventional interactive robotic systems is that such systems lack the ability to adapt surgical planning and navigation in real-time to a dynamic intraoperative environment. For example, U.S. patent application Ser. No. 10/470,314 (Pub. No. US 2004/0128026), which is hereby incorporated by reference herein in its entirety, discloses an interactive robotic system programmed with a three-dimensional virtual region of constraint that is registered to a patient. The robotic system includes a three degree of freedom (3-DOF) arm having a handle that incorporates force sensors. The surgeon utilizes the handle to manipulate the arm to move the cutting tool. Moving the arm via the handle is required so that the force sensors can measure the force being applied to the handle by the surgeon. The measured force is then used in controlling motors to assist or resist movement of the cutting tool. For example, during a knee replacement operation, the femur and tibia of the patient are fixed in position relative to the robotic system. As the surgeon applies force to the handle to move the cutting tool, the interactive robotic system may apply an increasing degree of resistance to resist movement of the cutting tool as the cutting tool approaches a boundary of the virtual region of constraint. In this manner, the robotic system guides the surgeon in preparing the bone by maintaining the cutting tool within the virtual region of constraint. As with the above-described autonomous systems, however, the interactive robotic system functions primarily to enhance bone machining. The interactive robotic system also requires the relevant anatomy to be rigidly restrained and the robotic system to be fixed in a gross position and thus lacks real-time adaptability to the intraoperative scene. 
Moreover, the 3-DOF configuration of the arm and the requirement that the surgeon manipulate the arm using the force handle results in limited flexibility and dexterity, making the robotic system unsuitable for certain MIS applications.
In view of the foregoing, a need exists for a surgical system that can replace direct visualization in minimally invasive surgery, spare healthy bone in orthopedic joint replacement applications, enable intraoperative adaptability and surgical planning, and produce operative results that are sufficiently predictable, repeatable, and/or accurate regardless of surgical skill level. A surgical system need not meet any or all of these needs to constitute an advance, though a system meeting these needs would be more desirable.
One implementation of the present disclosure is a surgical system. The surgical system includes a robotic arm, an end effector coupled to the robotic arm, a divot at the end effector, a probe configured to be inserted into the divot, a tracking system configured to obtain data indicative of a position of the probe while the probe is in the divot, and circuitry configured to verify a proper physical configuration of the surgical system based on the data indicative of the position of the probe while the probe is in the divot.
Another implementation of the present disclosure is a method of operating a surgical robotics system. The method includes tracking a position of a probe, obtaining an indication that the probe is in a divot coupled to an end effector of a robotic device, and verifying a proper physical configuration of the surgical robotics system based on the position of the probe while the probe is received by the divot.
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain principles of the invention.
Presently preferred embodiments of the invention are illustrated in the drawings. Although this specification refers primarily to orthopedic procedures involving the knee joint, it should be understood that the subject matter described herein is applicable to other joints in the body, such as, for example, a shoulder, elbow, wrist, spine, hip, or ankle and to any other orthopedic and/or musculoskeletal implant, including implants of conventional materials and more exotic implants, such as orthobiologics, drug delivery implants, and cell delivery implants.
The computing system 20 includes hardware and software for operation and control of the surgical system 10. As shown in
The haptic device 30 is a surgical device configured to be manipulated by a user to move a surgical tool 50 to perform a procedure on a patient. During the procedure, the computing system 20 implements control parameters for controlling the haptic device 30 based, for example, on a relationship between an anatomy of the patient and a position, an orientation, a velocity, and/or an acceleration of a portion of the haptic device 30 (e.g., the surgical tool 50). In one embodiment, the haptic device 30 is controlled to provide a limit on user manipulation of the device (e.g., by limiting the user's ability to physically manipulate the haptic device 30). In another embodiment, the haptic device 30 is controlled to provide haptic guidance (i.e., tactile and/or force feedback) to the user. “Haptic” refers to a sense of touch, and the field of haptics involves research relating to human interactive devices that provide tactile and/or force feedback to an operator. Tactile feedback generally includes tactile sensations such as, for example, vibration, whereas force feedback refers to feedback in the form of force (e.g., resistance to movement) and/or torque (also known as “wrench”). Wrench includes, for example, feedback in the form of force, torque, or a combination of force and torque.
Guidance from the haptic device 30 coupled with computer aided surgery (CAS) enables a surgeon to actively and accurately control surgical actions (e.g., bone cutting) and delivery of localized therapies (e.g., in the brain). The computing system 20 can control the haptic device 30 to generate a force, a torque, and/or vibration based on the position of the tool 50 relative to the virtual object, the parameter, and/or the anatomy. Thus, in operation, as a surgeon manipulates the haptic device 30 to move the tool 50, virtual pathways may be used to guide the tool 50 to specific targets, virtual boundaries may be used to define cutting shapes or to prevent the tool 50 from contacting critical tissue, and predefined parameters may be used to limit travel of the tool 50 (e.g., to a predefined depth). The computing system 20 may also be programmed to adjust the control parameters in response to movement of the physical anatomy during the procedure (e.g., by monitoring detected movement of the physical anatomy and then adjusting the virtual object in response to the detected movement). In this manner, the surgical system 10 can supplement or replace direct visualization of the surgical site, enhance the surgeon's natural tactile sense and physical dexterity, and facilitate the targeting, repairing, and replacing of various structures in the body through portals ranging in size from conventional (e.g., 12 inches or greater in length) to as small as approximately 1 mm in diameter.
In orthopedic applications, for example, the haptic device 30 can be applied to the problems of inaccuracy, unpredictability, and non-repeatability in bone (or work piece) preparation by assisting the surgeon with proper sculpting of bone to thereby enable precise, repeatable bone resections while maintaining intimate involvement of the surgeon in the bone preparation process. Moreover, because the haptic device 30 haptically guides the surgeon in the bone cutting operation, the skill level of the surgeon is less critical. As a result, surgeons with varying degrees of skill and experience are able to perform accurate, repeatable bone resections. In one embodiment, for example, a surgical tool is coupled to the haptic device 30. The surgeon can operate the tool to sculpt bone by grasping and moving the tool and/or by grasping and manipulating the haptic device 30 to move the tool. As the surgeon performs the cutting operation, the surgical system 10 tracks the location of the tool (with the tracking system 40) and, in most cases, allows the surgeon to freely move the tool in the workspace. When the tool is in proximity to a virtual boundary in registration with the patient, however, the surgical system 10 controls the haptic device 30 to provide haptic guidance that tends to constrain the surgeon from penetrating the virtual boundary with the tool. For example, the virtual boundary may be defined by a haptic object, and the haptic guidance may comprise an output wrench (i.e., force and/or torque) that is mapped to the haptic object and experienced by the surgeon as resistance to further tool movement in the direction of the virtual boundary. A haptic object may have an associated spatial or geometric representation that can be graphically represented on the display device 23. A graphical representation may be selected so as to convey useful information to the user. For example, as shown in
The haptic device 30 may include a mechanical or electro-mechanical device adapted to transmit tactile feedback (e.g., vibration) and/or force feedback (e.g., wrench) to the user. The haptic device 30 may be robotic, non-robotic, or a combination of robotic and non-robotic systems.
In one embodiment, the haptic device 30 comprises a robot. In such an embodiment, as shown in
The arm 33 is disposed on the base 32 and is adapted to enable the haptic device 30 to be manipulated by the user. The arm 33 may be any suitable mechanical or electromechanical structure but is preferably an articulated arm having four or more degrees of freedom (or axes of movement), such as, for example, a robotic arm known as the “Whole-Arm Manipulator” or WAM™ currently manufactured by Barrett Technology, Inc. The arm 33 includes a proximal end disposed on the base 32 of the haptic device 30 and a distal end to which a surgical tool 50 is coupled. As described further below, the distal end of the arm 33 may include the end effector 35 and/or a tool holder 51 for the tool 50. In one embodiment, the arm 33 includes a first segment 33a, a second segment 33b, and a third segment 33c as shown in
Dexterity of the arm 33 may be enhanced, for example, by adding additional degrees of freedom. For example, the arm 33 may include a wrist 36. As shown in
The end effector 35 may comprise a working end of the haptic device 30 and can be configured to enable the user to perform various activities related to a surgical procedure. For example, in one embodiment, the end effector 35 functions as an adapter or coupling between the arm 33 and the tool 50. By decoupling the tool 50 from the end effector 35 and interchanging one tool 50 for another, the user can utilize the haptic device 30 for different activities, such as registration, bone (or work piece) preparation, measurement/verification, and/or implant installation. In one embodiment, as shown in
The tracking (or localizing) system 40 of the surgical system 10 is configured to determine a pose (i.e., position and orientation) of one or more objects during a surgical procedure to detect movement of the object(s). For example, the tracking system 40 may include a detection device that obtains a pose of an object with respect to a coordinate frame of reference (or coordinate system) of the detection device. As the object moves in the coordinate frame of reference, the detection device tracks the pose of the object to detect (or enable the surgical system 10 to determine) movement of the object. As a result, the computing system 20 can adjust the control parameters (e.g., by adjusting a virtual object) in response to movement of the tracked object. Tracked objects may include, for example, tools/instruments, patient anatomy, implants/prosthetic devices, work pieces, and components of the surgical system 10. Using pose data from the tracking system 40, the surgical system 10 is also able to register (or map or associate) coordinates in one space to those in another to achieve spatial alignment or correspondence (e.g., using a coordinate transformation process as is well known). Objects in physical space may be registered to any suitable coordinate system, such as a coordinate system being used by a process running on the computer 21 and/or the computer 31. For example, utilizing pose data from the tracking system 40, the surgical system 10 is able to associate the physical anatomy and the tool 50 (and/or the haptic device 30) with a representation of the anatomy (such as an image displayed on the display device 23). 
Based on tracked object and registration data, the surgical system 10 may determine, for example, (a) a spatial relationship between the image of the anatomy and the relevant anatomy and (b) a spatial relationship between the relevant anatomy and the tool 50 so that the computing system 20 can superimpose (and continually update) a virtual representation of the tool 50 on the image, where the relationship between the virtual representation and the image is substantially identical to the relationship between the tool 50 and the actual anatomy. Additionally, by tracking not only the tool 50 but also the relevant anatomy, the surgical system 10 can compensate for movement of the relevant anatomy during the surgical procedure (e.g., by adjusting a virtual object in response to the detected movement). As shown in
The tracking system 40 may be any tracking system that enables the surgical system 10 to continually determine (or track) a pose of the relevant anatomy of the patient and a pose of the tool 50 (and/or the haptic device 30). For example, the tracking system 40 may comprise a non-mechanical tracking system, a mechanical tracking system, or any combination of non-mechanical and mechanical tracking systems suitable for use in a surgical environment.
In one embodiment, as shown in
A non-mechanical tracking system may include a trackable element (or tracker) for each object the user desires to track. For example, in one embodiment, the non-mechanical tracking system includes an anatomy tracker 43 (to track patient anatomy), a haptic device tracker 45 (to track a global or gross position of the haptic device 30), an end effector tracker 47 (to track a distal end of the haptic device 30), and an instrument tracker 49 (to track an instrument/tool held manually by the user).
As shown in
As shown in
The instrument tracker 49 is adapted to be coupled to an instrument 150 that is held manually in the hand of the user (as opposed, for example, to the tool 50 that is attached to the end effector 35). The instrument 150 may be, for example, a probe, such as a registration probe (e.g., a straight or hooked probe). As shown in
The instrument tracker 49 may also be configured to verify calibration of the instrument 150. For example, another tracker (e.g., the tracker 43, 45, or 47) may include a divot into which the user can insert the tip of the instrument 150. In one embodiment, as shown in
The tracking system 40 may additionally or alternatively include a mechanical tracking system. In contrast to the non-mechanical tracking system (which includes a detection device 41 that is remote from the trackers 43, 45, 47, and 49), a mechanical tracking system may be configured to include a detection device (e.g., an articulating arm having joint encoders) that is mechanically linked (i.e., physically connected) to the tracked object. The tracking system 40 may include any known mechanical tracking system, such as, for example, a mechanical tracking system as described in U.S. Pat. Nos. 6,033,415 and/or 6,322,567, each of which is hereby incorporated by reference herein in its entirety. In one embodiment, the tracking system 40 includes a mechanical tracking system having a jointed mechanical arm 241 (e.g., an articulated arm having six or more degrees of freedom) adapted to track a bone of the patient. As shown in
When the tracking system 40 includes the mechanical tracking system, the arm 241 may be used to register the patient's anatomy. For example, the user may use the arm 241 to register the tibia T while the second arm (i.e., the arm that is identical to the arm 241 but that is affixed to the tibia T) tracks motion of the tibia T. Registration may be accomplished, for example, by pointing a tip of the distal end of the arm 241 to anatomical landmarks on the tibia T and/or by touching points on (or “painting”) a surface of the tibia T with the tip of the distal end of the arm 241. As the user touches landmarks on the tibia T and/or paints a surface of the tibia T, the surgical system 10 acquires data from the position sensors in the arm 241 and determines a pose of the tip of the arm 241. Simultaneously, the second arm provides data regarding motion of the tibia T so that the surgical system 10 can account for bone motion during registration. Based on the bone motion data and knowledge of the position of the tip of the arm 241, the surgical system 10 is able to register the tibia T to the diagnostic images and/or the anatomical model of the patient's anatomy in the computing system 20. In a similar manner, the second arm may be used to register the femur F while the arm 241 (which is affixed to the femur F) tracks motion of the femur F. The patient's anatomy may also be registered, for example, using a non-mechanical tracking system in combination with a tracked probe (e.g., the instrument 150 with the instrument tracker 49) and/or using the haptic device 30 (e.g., as described below in connection with step S8 of
A fault condition may exist if there is a system problem (e.g., a problem with the hardware or software), if the occlusion detection algorithm detects an occluded condition (e.g., as described below in connection with step S11 of
In one embodiment, a method of controlling the haptic device 30 based on the tool disabling features includes (a) enabling operation of the haptic device 30; (b) manipulating the haptic device 30 to perform a procedure on a patient; (c) determining whether a relationship between the anatomy of the patient and a position, an orientation, a velocity, and/or an acceleration of the tool 50 of the haptic device 30 corresponds to a desired relationship; and (d) issuing a fault signal if the relationship does not correspond to the desired relationship. The method may further include implementing control parameters for controlling the haptic device 30 to provide at least one of haptic guidance to the user and a limit on user manipulation of the surgical device based on the relationship. In one embodiment, in response to the fault signal, the surgical system 10 disables operation of the haptic device 30, locks a portion of the haptic device 30 in position, and/or places the haptic device 30 in a safety mode. In the safety mode, operation of and/or manipulation of the haptic device 30 is impeded.
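Steps (c) and (d) above can be sketched as a simple per-cycle check. This is a minimal illustration only; the function name, the particular state quantities checked, and the thresholds are all illustrative assumptions, not values from the specification:

```python
def check_tool_state(distance_to_boundary_mm, tool_speed_mm_s,
                     min_clearance_mm=0.0, max_speed_mm_s=50.0):
    """Issue a fault signal when the tool's measured state no longer
    corresponds to the desired relationship with the anatomy.
    All thresholds here are illustrative assumptions."""
    if distance_to_boundary_mm < min_clearance_mm:
        return "fault"  # tool has crossed the permitted region
    if tool_speed_mm_s > max_speed_mm_s:
        return "fault"  # velocity outside the desired relationship
    return "ok"         # normal operation may continue
```

On a "fault" result, the system could then disable the device, lock a portion of it in position, or enter the safety mode described above.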
In one embodiment, a method of compensating for motion of objects during a surgical procedure includes (a) determining a pose of the anatomy; (b) determining a pose of the tool 50; (c) determining at least one of a position, an orientation, a velocity, and an acceleration of the tool 50; (d) associating the pose of the anatomy, the pose of the tool 50, and a relationship between the pose of the anatomy and the at least one of the position, the orientation, the velocity, and the acceleration of the tool 50; and (e) updating the association in response to motion of the anatomy and/or motion of the tool 50. The relationship may be based, for example, on a desired interaction between the anatomy and a position, an orientation, a velocity, and/or an acceleration of the tool 50. In one embodiment, the relationship is defined by a virtual object or parameter positioned relative to the anatomy and representing a desired location of an implant and/or cut surfaces for installing the implant. The step of associating the pose of the anatomy, the pose of the tool 50, and the relationship may be accomplished, for example, using registration processes (e.g., step S8 of
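The association in steps (d) and (e) can be held as a single relative transform between the anatomy pose and the tool pose, recomputed each control cycle so that it updates automatically as either object moves. The following is an assumed formulation for illustration, not the specification's actual implementation:

```python
import numpy as np

def pose(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def tool_relative_to_anatomy(T_world_anatomy, T_world_tool):
    """Express the tool pose in the anatomy's frame; re-evaluating this
    each cycle updates the association as either object moves."""
    return np.linalg.inv(T_world_anatomy) @ T_world_tool
```

Because the relative transform is recomputed from tracked poses, motion of the anatomy (e.g., the bone shifting during cutting) is compensated without re-registration.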
One advantage of including a haptic rendering process in the surgical system 10 is that the haptic rendering process enables interaction between a surgical tool and a virtual environment. For example, the haptic rendering process can create a virtual environment including one or more virtual (or haptic) objects and a virtual representation of the physical tool 50. The physical tool 50 is associated with (e.g., registered to) the virtual environment and/or the virtual representation of the tool 50. Thus, as the user manipulates the physical tool 50, the virtual representation of the tool 50 interacts with virtual objects in the virtual environment. In this manner, the physical tool 50 is able to interact with the virtual environment. Interaction between the virtual objects and the virtual representation of the tool 50 may be based on point, ray (line), multiple point, and/or polygon models. In a preferred embodiment, the surgical system 10 employs point-based haptic interaction where only a virtual point, or haptic interaction point (HIP), interacts with virtual objects in the virtual environment. The HIP corresponds to a physical point on the haptic device 30, such as, for example, a tip of the tool 50. The HIP is coupled to the physical point on the physical haptic device 30 by a virtual spring/damper model. The virtual object with which the HIP interacts may be, for example, a haptic object having a surface and a haptic force normal vector Fn. A penetration depth di is a distance between the HIP and the nearest point on the surface. The penetration depth di represents the depth of penetration of the HIP into the haptic object.
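The point-based interaction above can be sketched as follows. This is a minimal illustration assuming a planar haptic surface; the spring and damper gains k and b are illustrative assumptions, not values from the specification:

```python
import numpy as np

def haptic_force(hip, surface_point, normal, k=1000.0, b=5.0,
                 hip_velocity=None):
    """Sketch of point-based haptic rendering against a planar surface.

    hip: position of the haptic interaction point (HIP)
    surface_point: nearest point on the haptic object's surface
    normal: outward haptic force normal Fn of the surface
    k, b: illustrative spring and damper gains (assumptions)
    """
    n = np.asarray(normal, dtype=float)
    n /= np.linalg.norm(n)
    # Penetration depth di: how far the HIP has passed the surface
    # along the inward direction; non-positive when the HIP is outside.
    di = np.dot(np.asarray(surface_point, float) - np.asarray(hip, float), n)
    if di <= 0.0:
        return np.zeros(3)  # free motion: no haptic guidance
    force = k * di * n      # spring term pushes the HIP back out along Fn
    if hip_velocity is not None:
        # damper term resists normal velocity (virtual spring/damper model)
        force -= b * np.dot(np.asarray(hip_velocity, float), n) * n
    return force
```

The force grows with penetration depth di, which the surgeon experiences as increasing resistance to further movement into the haptic object.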
During a surgical procedure, the computing system 20 guides the user through the procedure. For example, the computing system 20 may be programmed to generate a display configured to guide the user manipulating the haptic device 30 through the procedure. The display may comprise screens shown on the display device 23 that include, for example, predefined pages and/or images corresponding to specific steps of the procedure. The display may also prompt the user to perform one or more tasks. For example, the display may instruct a user to select anatomical landmarks on a representation of the anatomy (discussed below in connection with steps S3 and S4 of
Displays or screens associated with the surgical procedure may be configured to communicate visual information to the user regarding the procedure. For example, as shown in
In addition to communicating with the user visually, the computing system 20 may be programmed to emit audible signals (e.g., via the acoustic device). For example, in one embodiment, the computing system 20 may emit sounds (e.g., beeps) indicating that a cutting depth of the tool 50 is too shallow, approximately correct, or too deep. In another embodiment, the surgical system 10 may provide an audible indication of a distance between the tip of the tool 50 and a surface of a haptic object in registration with the patient as described, for example, in U.S. patent application Ser. No. 10/621,119 (Pub. No. US 2004/0106916), which is hereby incorporated by reference herein in its entirety. The computing system 20 may also be programmed to control the haptic device 30 to provide tactile feedback to the user, such as, for example, a vibration indicating that the tool 50 has reached or exceeded the desired cutting depth. The software of the computing system 20 may also include programs or processes that automatically prompt a user to perform certain tasks, such as, for example, segmenting an image of a diagnostic image data set, selecting points on the patient's anatomy to define a mechanical axis, touching (or “painting”) points on a surface of the bone with a registration probe, entering data (e.g., implant size, burr size, etc.), and the like.
In the embodiment of
Regarding the steps of
Similarly, in step S4, the user may designate tibial landmarks on an image of the tibia T. The tibial landmarks are used by the surgical system 10 to associate (or register) the patient's physical anatomy with the representation of the anatomy (e.g., diagnostic images, models generated from segmentation, anatomical models, etc.). The surgical system 10 generates screens to guide the user in specifying the tibial landmarks. For example, the surgical system 10 may direct the user to specify a medial malleolus, a lateral malleolus, a rotational landmark, and a knee center. In one embodiment, the user may select the tibial landmarks on a displayed image using a mouse or touch screen. In another embodiment, the computer may be programmed to determine the tibial landmarks, for example, using algorithms designed to locate distinguishing features in the diagnostic images.
In step S5, a homing process initializes the position sensors (e.g., encoders) of the haptic device 30 to determine an initial pose of the arm 33. Homing may be accomplished, for example, by manipulating the arm 33 so that each joint encoder is rotated until an index marker on the encoder is read. The index marker is an absolute reference on the encoder that correlates to a known absolute position of a joint. Thus, once the index marker is read, the control system of the haptic device 30 knows that the joint is in an absolute position. As the arm 33 continues to move, subsequent positions of the joint can be calculated based on the absolute position and subsequent displacement of the encoder. The surgical system 10 may guide the user through the homing process by providing instructions regarding the positions in which the user should place the arm 33. The instructions may include, for example, images displayed on the display device 23 showing the positions into which the arm 33 should be moved.
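Post-homing position tracking can be sketched as follows: once the index marker has been read, the joint's absolute angle is the known index position plus the signed encoder displacement accumulated since then. The index angle and encoder resolution below are illustrative assumptions:

```python
def absolute_joint_angle(index_angle_deg, counts_since_index,
                         counts_per_rev=4096):
    """Recover a joint's absolute angle after homing.

    index_angle_deg: known absolute joint angle at the encoder's index
        marker (assumed calibration constant)
    counts_since_index: signed encoder counts accumulated since the
        index marker was read
    counts_per_rev: assumed encoder resolution
    """
    degrees_per_count = 360.0 / counts_per_rev
    return index_angle_deg + counts_since_index * degrees_per_count
```

Until the index marker has been read, only relative displacement is available, which is why the user must move the arm 33 through the prescribed positions.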
In step S6, an instrument (e.g., a registration probe such as the instrument 150) is checked to verify that the instrument is calibrated. For example, step S6 may be used to verify that a registration probe has a proper physical configuration. As discussed above in connection with the instrument tracker 49, calibration of a probe that includes the instrument tracker 49 may be accomplished by inserting a tip of the probe into the divot 47a of the end effector tracker 47, holding the tip in place, and detecting the instrument tracker 49 and the end effector tracker 47 with the detection device 41. The detection device 41 acquires pose data, and the surgical system 10 compares an actual geometric relationship between the trackers 49 and 47 to an expected geometric relationship between the trackers 49 and 47. Deviation between the actual and expected geometric relationships indicates that one or more physical parameters of the probe are out of calibration. As shown in
In step S7, the surgical system 10 prompts the user to attach the anatomy trackers 43a and 43b to the patient. As shown in
In one embodiment, once the anatomy trackers 43a and 43b are attached, a range of motion (ROM) of the knee joint is captured (e.g., by moving the knee joint through the ROM while tracking the anatomy trackers 43a and 43b with the tracking system 40). The captured ROM data may be used to assess relative placement of the femoral and tibial implants. In this way, comprehensive placement planning for both implants can be performed before cutting any bone. The ROM data may also be used (e.g., during the implant planning steps S10 and S13) to display relative positions of the femoral and tibial implants at extension, flexion, and various angles between extension and flexion on the display device 23.
After the anatomy trackers 43a and 43b are fixed to the patient, the process proceeds to step S8 in which the patient's physical anatomy is registered to the representation of the anatomy. In other words, the physical anatomy is registered to image space. For example, the femur F and the tibia T of the patient may be registered in standard fashion using a paired-point/surface match approach based on the femoral and tibial landmarks specified in steps S3 and S4, respectively. The surgical system 10 generates screens to guide the user through the registration process. For example, a screen 86a (
Preferably, step S8 includes identifying at least one check point on the anatomy and digitizing the check point with a registration probe. During the surgical procedure, the check point enables the user to verify that the surgical system 10 is properly configured. For example, the user can touch the tip of the tool 50 to the check point to confirm that the tracking system 40 is properly configured (e.g., the tracking elements are not occluded and are still properly aligned relative to the anatomy and/or the haptic device 30, etc.), that the tool 50 is correctly installed (e.g., properly seated, shaft not bent, etc.), and/or that any other object is properly mounted, installed, calibrated, and the like. In this manner, the check point enables the surgical system 10 to confirm that all elements involved in relating the tip of the tool 50 to the anatomy of the patient remain in calibration and that the tracking elements are updating properly.
In one embodiment, the check point is established as follows. In step S8, after locating the hip center and prior to collecting any landmarks, the user establishes two anatomical reference points (or check points) on the anatomy—a first check point on the femur F and a second check point on the tibia T. Each check point should be placed so that it is accessible with a registration probe but is not located on a portion of the bone that will be removed during the surgical procedure. A check point may be established, for example, by marking the bone with methylene blue (or other clinical marking product), by creating a small (e.g., approximately 1 mm diameter) divot on the bone (e.g., using a drill bit), and/or by implanting a temporary fiducial marker in the bone as is well known. In this embodiment, the surgical system 10 displays a screen instructing the user to identify (e.g., touch or contact) the first check point on the femur F. When the user contacts the first check point with the tip of the registration probe, the surgical system 10 digitizes the first check point and establishes a vector from the first check point to the anatomy tracker 43a affixed to the femur F. The surgical system 10 may also instruct the user to verify (or re-touch) the first check point to ensure that the first check point is accurately captured. The surgical system 10 then displays a screen instructing the user to identify (e.g., touch or contact) the second check point on the tibia T. When the user contacts the second check point with the tip of the registration probe, the surgical system 10 digitizes the second check point and establishes a vector from the second check point to the anatomy tracker 43b affixed to the tibia T. The surgical system 10 may also instruct the user to verify (or re-touch) the second check point to ensure that the second check point is accurately captured. 
After the first and second check points have been established, the user proceeds with registration by selecting landmarks and painting surfaces of the femur F and the tibia T (as described above in connection with step S8). If desired, the first and second check points may be transformed into image space (e.g., using the registration transform obtained in step S8) and displayed on the display device 23 to aid in assessing the success of the registration.
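The paired-point stage of the registration in step S8 can be illustrated with the standard SVD-based (Kabsch) rigid-body fit between corresponding physical-space and image-space landmarks. The code below is a sketch under assumed names and does not reproduce the surface-match ("painting") refinement:

```python
import numpy as np

def paired_point_register(physical_pts, image_pts):
    """Rigid (rotation + translation) registration mapping physical
    (patient-space) landmark points onto their image-space counterparts
    using the SVD-based Kabsch method. Points are Nx3 arrays in
    corresponding order. Returns (R, t) such that
    image ~= R @ physical + t.
    """
    P = np.asarray(physical_pts, float)
    Q = np.asarray(image_pts, float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)               # cross-covariance of centered point sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection solution
    D = np.diag([1.0, 1.0, d])
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t
```

The resulting transform is what would carry the digitized check points into image space for display on the display device 23.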
In operation, the checkpoints may be used any time the user wants to validate the configuration of the surgical system 10, such as when the tool 50 is withdrawn from and then reinserted into the patient. Based on geometric data obtained during establishment of the first and second checkpoints, the surgical system 10 knows a location of the first checkpoint relative to the anatomy tracker 43a and a location of the second checkpoint relative to the anatomy tracker 43b. Based on geometric data obtained during calibration of the haptic device 30 (as described below in connection with step S9), the surgical system 10 knows a location of a center of the tip of the tool 50 from a pose of the haptic device tracker 45, a pose of the arm 33 of the haptic device 30, the geometry of the tool 50, and the geometric relationship between the tool 50 and the end effector 35. Based on this data, when the user touches the tip of the tool 50 to a checkpoint, the surgical system 10 can calculate a distance between the location of the center of the tip of the tool 50 and the location of the relevant checkpoint. A radius of the tip of the tool 50 is subtracted from the distance to obtain a verification value. Preferably, the verification value is approximately 0.00 mm, which indicates that the location of the tip of the tool 50 and the location of the checkpoint correspond. Some error, however, is acceptable. For example, in one embodiment, if the verification value is equal to or less than a predetermined tolerance (e.g., approximately 1 mm), the system configuration is deemed acceptable and the user may proceed with the surgical procedure. In contrast, if the verification value exceeds the predetermined tolerance, the surgical system 10 issues a warning (e.g., a visual, audible, and/or tactile warning) indicating a problem with the system configuration.
A problem may exist, for example, if one of the tracking elements was bumped by the user during a tool change and is now misaligned, if the tool shaft is bent, and the like. If a warning is issued, registration (step S8) and/or calibration (step S9) should be repeated.
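The checkpoint verification arithmetic described above (tip-to-checkpoint distance minus tip radius, compared against a tolerance) can be sketched as follows; the function names are illustrative, and the 1 mm default tolerance follows the embodiment described above:

```python
import numpy as np

def verification_value(tip_center, checkpoint, tip_radius):
    """Distance from the tool-tip center to the digitized checkpoint,
    minus the tip radius: approximately 0.0 mm when the tip rests on
    the checkpoint."""
    d = np.linalg.norm(np.asarray(tip_center, float)
                       - np.asarray(checkpoint, float))
    return float(d) - tip_radius

def check_configuration(tip_center, checkpoint, tip_radius, tolerance_mm=1.0):
    """True if the verification value is within tolerance; otherwise the
    system should warn and registration/calibration be repeated.

    The absolute value is used here so a tip pressed slightly past the
    checkpoint also reads as in-tolerance (a sketch-level choice)."""
    return abs(verification_value(tip_center, checkpoint, tip_radius)) <= tolerance_mm
```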
In addition to checking the entire system configuration, the checkpoints may also be used to determine whether the anatomy trackers 43a and 43b have moved relative to the femur F and the tibia T, respectively. For example, to determine whether the anatomy tracker 43a has moved relative to the femur F, the user returns to the checkpoint identification screen (e.g., screen 186a of
One advantage of the checkpoint verification procedure is that the procedure enables the user to confirm that various parts of the surgical system 10 are performing as intended prior to making any non-reversible cuts on the patient's anatomy. For example, the checkpoints can be used to verify registration, calibration of the haptic device 30, and proper operation of the tracking system 40 and tracking elements. As a result, the checkpoints enable the surgical system 10 to simultaneously verify movement of the anatomy trackers 43a and 43b, registration accuracy, movement of the haptic device tracker 45, kinematic calibration of the haptic device 30, proper mounting of the tool 50, and correct tool size.
In step S9, the haptic device 30 is calibrated to establish a geometric relationship or transformation (i.e., position and orientation) between a coordinate system of the haptic device tracker 45 and a coordinate system of the haptic device 30. If the haptic device tracker 45 is fixed in a permanent position on the haptic device 30, calibration is not necessary because the geometric relationship between the tracker 45 and the haptic device 30 is fixed and known (e.g., from an initial calibration during manufacture or setup). In contrast, if the tracker 45 can move relative to the haptic device 30 (e.g., if the arm 34 on which the tracker 45 is mounted is adjustable), calibration is necessary to determine the geometric relationship between the tracker 45 and the haptic device 30.
The surgical system 10 can initiate a calibration procedure by generating a screen instructing the user to calibrate the haptic device 30. Calibration involves securing the haptic device tracker 45 in a fixed position on the haptic device 30 and temporarily affixing the end effector tracker 47 to the end effector 35. The end effector 35 is then moved to various positions in a vicinity of the anatomy (e.g., positions above and below the knee joint, positions medial and lateral to the knee joint) while the tracking system 40 acquires pose data for the trackers 47 and 45 relative to the tracking system 40 in each of the positions. The calibration process of step S9 need not be performed if the haptic device tracker 45 has not moved with respect to the haptic device 30 since the previous calibration and the previously acquired calibration data is still reliable.
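The calibration of step S9 amounts to solving for the fixed transform between the haptic device tracker 45 and the haptic device 30 from simultaneous camera and arm-kinematics measurements at several end-effector poses. A simplified sketch follows, assuming the end effector tracker 47 frame coincides with the end-effector frame and using a naive matrix average; a real implementation would also calibrate the 47-to-end-effector offset and average rotations properly:

```python
import numpy as np

def calibrate_device_tracker(T_cam_45, samples):
    """Estimate the fixed transform from the haptic-device tracker (45)
    frame to the haptic-device base frame.

    `samples` is a list of (T_cam_47, T_base_ee) pairs collected while
    the end effector is moved to several poses: T_cam_47 is the
    end-effector tracker pose seen by the camera, T_base_ee the same
    pose from the arm's forward kinematics. All transforms are 4x4
    homogeneous matrices.
    """
    estimates = []
    for T_cam_47, T_base_ee in samples:
        # Camera-to-base from this sample, then tracker-45-to-base.
        T_cam_base = T_cam_47 @ np.linalg.inv(T_base_ee)
        estimates.append(np.linalg.inv(T_cam_45) @ T_cam_base)
    # Naive element-wise average; adequate only when noise is small.
    return sum(estimates) / len(estimates)
```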
In step S10, the user plans bone preparation for implanting a first implant on a first bone. In a preferred embodiment, the first bone is the tibia T, the first implant is the tibial component 74, and bone preparation is planned by selecting a location on a proximal end of the tibia T where the tibial component 74 will be installed. To facilitate implant planning, the surgical system 10 can generate a screen that includes various views of representations of the first and second bones (i.e., the tibia T and the femur F, respectively).
Steps S11 to S15 encompass the bone preparation process. In step S11, the first bone (e.g., the tibia T) is prepared to receive the first implant (e.g., the tibial component 74) by manipulating the tool 50 to sculpt the first bone. In step S12, a trial implant is fitted to the prepared feature on the first bone. In step S13, an initial placement of the second implant (e.g., the femoral component) is planned (or a previously planned placement of the second implant may be revisited and adjusted). In step S14, the second bone (e.g., the femur F) is prepared to receive the second implant after preparation of the first bone. In step S15, a trial implant is fitted to the prepared features on the second bone.
Throughout the surgical procedure, the surgical system 10 monitors the anatomy to detect movement and makes appropriate adjustments to the programs running on the computer 21 and/or the computer 31. The surgical system 10 can also adjust a virtual object associated with the anatomy in response to the detected movement.
In step S11, the first bone is prepared to receive the first implant by manipulating the tool 50 to sculpt the first bone. In one embodiment, the tibia T is prepared by forming the medial tibial pocket feature on the proximal end of the tibia T. Upon installation of the tibial component 74, the medial tibial pocket feature will mate with the surface 74a of the tibial component 74 (shown in
The occlusion detection algorithm is a safety feature adapted to mitigate risk during a cutting operation in the event tracking elements associated with the haptic device 30 and/or the anatomy (e.g., the haptic device tracker 45, the anatomy trackers 43a and 43b) become occluded. An occluded state may exist, for example, when the detection device 41 is unable to detect a tracking element (e.g., when a person or object is interposed between the tracking element and the detection device 41), when a lens of the detection device 41 is occluded (e.g., by dust), and/or when reflectivity of markers on a tracking element is occluded (e.g., by blood, tissue, dust, bone debris, etc.). If an occluded state is detected, the occlusion detection algorithm alerts the user, for example, by causing a warning message to be displayed on the display device 23, an audible alarm to sound, and/or tactile feedback (e.g., vibration) to be generated. The occlusion detection algorithm may also issue a control signal, such as a command to the surgical system 10 to shut off power to or otherwise disable the tool 50. In this manner, the occlusion detection algorithm prevents the tool 50 from damaging the anatomy when the tracking system 40 is not able to track relative positions of the tool 50 and the anatomy.
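The occlusion detection algorithm described above might be structured as a watchdog that records the last sighting of each required tracking element and, when any sighting becomes stale, warns the user and disables the tool. The class name, callback parameters, and timeout are illustrative assumptions:

```python
import time

class OcclusionGuard:
    """Safety watchdog: if any required tracking element has not been
    seen for longer than `timeout_s`, warn the user and disable the tool."""

    def __init__(self, required_trackers, timeout_s=0.2,
                 warn=print, disable_tool=lambda: None):
        self.timeout_s = timeout_s
        self.warn = warn
        self.disable_tool = disable_tool
        self.last_seen = {name: None for name in required_trackers}
        self.occluded = False

    def report_sighting(self, tracker_name, t=None):
        """Called whenever the detection device reports a tracker."""
        self.last_seen[tracker_name] = time.monotonic() if t is None else t

    def poll(self, t=None):
        """Periodic check; returns True while tracking is healthy."""
        now = time.monotonic() if t is None else t
        stale = [n for n, seen in self.last_seen.items()
                 if seen is None or now - seen > self.timeout_s]
        if stale and not self.occluded:
            self.occluded = True
            self.warn("tracking occluded: " + ", ".join(sorted(stale)))
            self.disable_tool()   # e.g., cut power to the cutting tool
        elif not stale:
            self.occluded = False
        return not self.occluded
```

In use, the tracking loop would call `report_sighting` for each detected element and `poll` at a fixed rate, so a blocked line of sight disables the tool within roughly one timeout interval.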
Step S12 is a trial reduction process in which the first implant (i.e., the tibial component 74) or a trial implant (e.g., a tibial trial) is fitted to the first bone (i.e., the prepared medial tibial pocket feature on the tibia T). The user assesses the fit of the tibial component or the tibial trial and may make any desired adjustments, such as, for example, repeating implant planning and/or bone sculpting to achieve an improved fit.
In step S13, the user plans bone preparation for implanting a second implant on a second bone after preparing the first bone. In a preferred embodiment, the second bone is the femur F, the second implant is the femoral component 72, and bone preparation is planned by selecting a location on a distal end of the femur F where the femoral component 72 will be installed. If the femoral component 72 has been previously planned (e.g., in step S10), the prior placement may be revisited and adjusted if desired.
In step S14, the second bone is prepared to receive the second implant by manipulating the tool 50 to sculpt the second bone. In one embodiment, the femur F is prepared by forming the medial femoral surface, post, and keel features on the distal end of the femur F. Upon installation of the femoral component 72, the medial femoral surface, post, and keel features will mate with a surface 72a, a post 72b, and a keel 72c, respectively, of the femoral component 72 (shown in
Step S15 is a trial reduction process in which the second implant (i.e., the femoral component 72) or a trial implant (e.g., a femoral trial) is fitted to the prepared medial femoral surface, post, and keel features on the femur F. The user assesses the fit of the femoral component 72 or the femoral trial and may make any desired adjustments, such as, for example, repeating implant planning and/or bone sculpting to achieve an improved fit. In step S15, adjustments may also be made to the tibia T. When the user is satisfied with the fit of the trial implants, the user may proceed with installation of the femoral component 72 and the tibial component 74 and completion of the surgical procedure.
Thus, embodiments of the present invention can be configured to provide a haptic guidance system and method that may replace direct visualization in minimally invasive surgery, spare healthy bone in orthopedic joint replacement applications, enable intraoperative adaptability and planning, and produce operative results that are sufficiently predictable, repeatable, and/or accurate regardless of surgical skill level.
Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. It is intended that the specification and examples be considered as exemplary only.
This application is a continuation of U.S. application Ser. No. 17/395,668, now U.S. Pat. No. 11,712,308, filed Aug. 6, 2021, which is a continuation of U.S. application Ser. No. 16/509,651, now U.S. Pat. No. 11,123,143, filed Jul. 12, 2019, which is a continuation of U.S. application Ser. No. 15/288,769, now U.S. Pat. No. 10,350,012, filed Oct. 7, 2016, which is a continuation of U.S. application Ser. No. 11/750,815, now U.S. Pat. No. 9,492,237, filed May 18, 2007, which claims the benefit of and priority to U.S. Provisional Application No. 60/801,378, filed May 19, 2006, all of which are incorporated by reference herein in their entireties.
Number | Name | Date | Kind |
---|---|---|---|
4747393 | Medwid | May 1988 | A |
4903536 | Salisbury et al. | Feb 1990 | A |
4979949 | Matsen et al. | Dec 1990 | A |
5046375 | Salisbury et al. | Sep 1991 | A |
5086401 | Glassman et al. | Feb 1992 | A |
5142930 | Allen et al. | Sep 1992 | A |
5154717 | Matsen et al. | Oct 1992 | A |
5207114 | Salisbury et al. | May 1993 | A |
5230338 | Allen et al. | Jul 1993 | A |
5236432 | Matsen et al. | Aug 1993 | A |
5299288 | Glassman et al. | Mar 1994 | A |
5343385 | Joskowicz et al. | Aug 1994 | A |
5383901 | McGregor et al. | Jan 1995 | A |
5388480 | Townsend | Feb 1995 | A |
5399951 | Lavallee et al. | Mar 1995 | A |
5408409 | Glassman et al. | Apr 1995 | A |
5445144 | Wodicka et al. | Aug 1995 | A |
5445166 | Taylor | Aug 1995 | A |
5452941 | Halse et al. | Sep 1995 | A |
5551429 | Fitzpatrick et al. | Sep 1996 | A |
5572999 | Funda et al. | Nov 1996 | A |
5576727 | Rosenberg et al. | Nov 1996 | A |
5587937 | Massie et al. | Dec 1996 | A |
5611353 | Dance et al. | Mar 1997 | A |
5625576 | Massie et al. | Apr 1997 | A |
5630431 | Taylor | May 1997 | A |
5638819 | Manwaring et al. | Jun 1997 | A |
5676673 | Ferre et al. | Oct 1997 | A |
5682886 | Delp et al. | Nov 1997 | A |
5688280 | Booth et al. | Nov 1997 | A |
5694013 | Stewart et al. | Dec 1997 | A |
5695500 | Taylor et al. | Dec 1997 | A |
5701140 | Rosenberg et al. | Dec 1997 | A |
5704791 | Gillio | Jan 1998 | A |
5727554 | Kalend et al. | Mar 1998 | A |
5792147 | Evans et al. | Aug 1998 | A |
5799055 | Peshkin et al. | Aug 1998 | A |
5806518 | Mittelstadt | Sep 1998 | A |
5831408 | Jacobus et al. | Nov 1998 | A |
5855553 | Tajima et al. | Jan 1999 | A |
5871018 | Delp et al. | Feb 1999 | A |
5887121 | Funda et al. | Mar 1999 | A |
5888220 | Felt et al. | Mar 1999 | A |
5898599 | Massie et al. | Apr 1999 | A |
5928137 | Green | Jul 1999 | A |
5950629 | Taylor et al. | Sep 1999 | A |
5971997 | Guthrie et al. | Oct 1999 | A |
5976156 | Taylor et al. | Nov 1999 | A |
5978696 | Vomlehn et al. | Nov 1999 | A |
5980535 | Barnett et al. | Nov 1999 | A |
5984930 | Maciunas et al. | Nov 1999 | A |
5987960 | Messner et al. | Nov 1999 | A |
6002859 | Digioia, III et al. | Dec 1999 | A |
6006126 | Cosman | Dec 1999 | A |
6006127 | Van Der Brug et al. | Dec 1999 | A |
6017305 | Bonutti | Jan 2000 | A |
6033415 | Mittelstadt | Mar 2000 | A |
6084587 | Tarr et al. | Jul 2000 | A |
6104158 | Jacobus et al. | Aug 2000 | A |
6109270 | Mah et al. | Aug 2000 | A |
6111577 | Zilles et al. | Aug 2000 | A |
6147674 | Rosenberg et al. | Nov 2000 | A |
6161032 | Acker | Dec 2000 | A |
6188728 | Hurst | Feb 2001 | B1 |
6191796 | Tarr | Feb 2001 | B1 |
6205411 | Digioia, III et al. | Mar 2001 | B1 |
6219032 | Rosenberg et al. | Apr 2001 | B1 |
6223100 | Green | Apr 2001 | B1 |
6226566 | Funda et al. | May 2001 | B1 |
6228089 | Wahrburg | May 2001 | B1 |
6231526 | Taylor et al. | May 2001 | B1 |
6233504 | Das et al. | May 2001 | B1 |
6259806 | Green | Jul 2001 | B1 |
6285902 | Kienzle et al. | Sep 2001 | B1 |
6288705 | Rosenberg et al. | Sep 2001 | B1 |
6292174 | Mallett et al. | Sep 2001 | B1 |
6300936 | Braun et al. | Oct 2001 | B1 |
6322467 | Hook et al. | Nov 2001 | B1 |
6322567 | Mittelstadt | Nov 2001 | B1 |
6325808 | Bernard et al. | Dec 2001 | B1 |
6337994 | Stoianovici et al. | Jan 2002 | B1 |
6366273 | Rosenberg et al. | Apr 2002 | B1 |
6369834 | Zilles et al. | Apr 2002 | B1 |
6377011 | Ben-Ur | Apr 2002 | B1 |
6377839 | Kalfas et al. | Apr 2002 | B1 |
6385475 | Cinquin et al. | May 2002 | B1 |
6385509 | Das et al. | May 2002 | B2 |
6393340 | Funda et al. | May 2002 | B2 |
6405072 | Cosman | Jun 2002 | B1 |
6405158 | Massie et al. | Jun 2002 | B1 |
6417638 | Guy et al. | Jul 2002 | B1 |
6421048 | Shih et al. | Jul 2002 | B1 |
6424885 | Niemeyer et al. | Jul 2002 | B1 |
6430434 | Mittelstadt | Aug 2002 | B1 |
6434416 | Mizoguchi et al. | Aug 2002 | B1 |
6443894 | Sumanaweera et al. | Sep 2002 | B1 |
6466815 | Saito et al. | Oct 2002 | B1 |
6468265 | Evans et al. | Oct 2002 | B1 |
6484118 | Govari | Nov 2002 | B1 |
6493608 | Niemeyer | Dec 2002 | B1 |
6494039 | Pratt et al. | Dec 2002 | B2 |
6522906 | Salisbury et al. | Feb 2003 | B1 |
6533737 | Brosseau et al. | Mar 2003 | B1 |
6546277 | Franck et al. | Apr 2003 | B1 |
6547782 | Taylor | Apr 2003 | B1 |
6551325 | Neubauer et al. | Apr 2003 | B2 |
6552722 | Shih et al. | Apr 2003 | B1 |
6583161 | Medina | Jun 2003 | B1 |
6642686 | Ruch | Nov 2003 | B1 |
6671651 | Goodwin et al. | Dec 2003 | B2 |
6674916 | Deman et al. | Jan 2004 | B1 |
6690964 | Bieger et al. | Feb 2004 | B2 |
6692485 | Brock et al. | Feb 2004 | B1 |
6701174 | Krause et al. | Mar 2004 | B1 |
6704694 | Basdogan et al. | Mar 2004 | B1 |
6711431 | Sarin et al. | Mar 2004 | B2 |
6711432 | Krause et al. | Mar 2004 | B1 |
6748819 | Maeguchi et al. | Jun 2004 | B2 |
6750877 | Rosenberg et al. | Jun 2004 | B2 |
6757582 | Brisson et al. | Jun 2004 | B2 |
6778850 | Adler et al. | Aug 2004 | B1 |
6785572 | Yanof et al. | Aug 2004 | B2 |
6786896 | Madhani et al. | Sep 2004 | B1 |
6801801 | Sati | Oct 2004 | B1 |
6810281 | Brock et al. | Oct 2004 | B2 |
6816148 | Mallett et al. | Nov 2004 | B2 |
6831640 | Shih et al. | Dec 2004 | B2 |
6845691 | Hsien | Jan 2005 | B2 |
6850794 | Shahidi | Feb 2005 | B2 |
6853965 | Massie et al. | Feb 2005 | B2 |
6859661 | Tuke | Feb 2005 | B2 |
6877239 | Leitner et al. | Apr 2005 | B2 |
6894678 | Rosenberg et al. | May 2005 | B2 |
6920347 | Simon et al. | Jul 2005 | B2 |
6985133 | Rodomista et al. | Jan 2006 | B1 |
6987504 | Rosenberg et al. | Jan 2006 | B2 |
7001346 | White | Feb 2006 | B2 |
7035716 | Harris et al. | Apr 2006 | B2 |
7039866 | Rosenberg et al. | May 2006 | B1 |
7131073 | Rosenberg et al. | Oct 2006 | B2 |
7168042 | Braun et al. | Jan 2007 | B2 |
7199790 | Rosenberg et al. | Apr 2007 | B2 |
7206626 | Quaid, III | Apr 2007 | B2 |
7206627 | Abovitz et al. | Apr 2007 | B2 |
7331436 | Pack et al. | Feb 2008 | B1 |
7491198 | Kockro | Feb 2009 | B2 |
7660623 | Hunter et al. | Feb 2010 | B2 |
7717932 | McFarlin et al. | May 2010 | B2 |
7742804 | Faul | Jun 2010 | B2 |
7747311 | Quaid, III | Jun 2010 | B2 |
7774044 | Sauer et al. | Aug 2010 | B2 |
7831292 | Quaid et al. | Nov 2010 | B2 |
8010180 | Quaid et al. | Aug 2011 | B2 |
8095200 | Quaid, III | Jan 2012 | B2 |
8391954 | Quaid, III | Mar 2013 | B2 |
8571628 | Kang et al. | Oct 2013 | B2 |
8911499 | Quaid et al. | Dec 2014 | B2 |
9002426 | Quaid et al. | Apr 2015 | B2 |
9492237 | Kang et al. | Nov 2016 | B2 |
9724165 | Arata et al. | Aug 2017 | B2 |
10028789 | Quaid et al. | Jul 2018 | B2 |
10231790 | Quaid et al. | Mar 2019 | B2 |
10350012 | Kang et al. | Jul 2019 | B2 |
10952796 | Arata | Mar 2021 | B2 |
11123143 | Kang | Sep 2021 | B2 |
20010034530 | Malackowski et al. | Oct 2001 | A1 |
20010037064 | Shahidi | Nov 2001 | A1 |
20010039422 | Carol et al. | Nov 2001 | A1 |
20010041838 | Holupka et al. | Nov 2001 | A1 |
20020035321 | Bucholz | Mar 2002 | A1 |
20020062177 | Hannaford et al. | May 2002 | A1 |
20020082498 | Wendt et al. | Jun 2002 | A1 |
20020107521 | Petersen et al. | Aug 2002 | A1 |
20020108054 | Moore et al. | Aug 2002 | A1 |
20020120188 | Brock et al. | Aug 2002 | A1 |
20030093103 | Malackowski et al. | May 2003 | A1 |
20030112281 | Sriram et al. | Jun 2003 | A1 |
20030187351 | Franck et al. | Oct 2003 | A1 |
20030195664 | Nowlin | Oct 2003 | A1 |
20030209096 | Pandey et al. | Nov 2003 | A1 |
20040009459 | Anderson et al. | Jan 2004 | A1 |
20040012806 | Murata | Jan 2004 | A1 |
20040024311 | Quaid, III | Feb 2004 | A1 |
20040034282 | Quaid et al. | Feb 2004 | A1 |
20040034283 | Quaid | Feb 2004 | A1 |
20040034302 | Abovitz et al. | Feb 2004 | A1 |
20040102866 | Harris et al. | May 2004 | A1 |
20040106916 | Quaid et al. | Jun 2004 | A1 |
20040115606 | Davies | Jun 2004 | A1 |
20040127788 | Arata | Jul 2004 | A1 |
20040128026 | Harris et al. | Jul 2004 | A1 |
20040143243 | Wahrburg | Jul 2004 | A1 |
20040157188 | Luth et al. | Aug 2004 | A1 |
20040167654 | Grimm et al. | Aug 2004 | A1 |
20040171924 | Mire et al. | Sep 2004 | A1 |
20040172044 | Grimm et al. | Sep 2004 | A1 |
20040236424 | Berez et al. | Nov 2004 | A1 |
20040242993 | Tajima | Dec 2004 | A1 |
20050001831 | Shih et al. | Jan 2005 | A1 |
20050053200 | Sukovic et al. | Mar 2005 | A1 |
20050062738 | Handley et al. | Mar 2005 | A1 |
20050093821 | Massie et al. | May 2005 | A1 |
20050107801 | Davies et al. | May 2005 | A1 |
20050113677 | Davies et al. | May 2005 | A1 |
20050154471 | Aram et al. | Jul 2005 | A1 |
20050165489 | Michelson | Jul 2005 | A1 |
20050193451 | Quistgaard et al. | Sep 2005 | A1 |
20050197800 | Goodwin et al. | Sep 2005 | A1 |
20050203384 | Sati et al. | Sep 2005 | A1 |
20050215879 | Chuanggui | Sep 2005 | A1 |
20050215888 | Grimm et al. | Sep 2005 | A1 |
20050217394 | Langley et al. | Oct 2005 | A1 |
20050222830 | Massie et al. | Oct 2005 | A1 |
20060033707 | Rodomista et al. | Feb 2006 | A1 |
20060058616 | Marquart et al. | Mar 2006 | A1 |
20060084867 | Tremblay et al. | Apr 2006 | A1 |
20060109266 | Itkowitz et al. | May 2006 | A1 |
20060133827 | Becouarn et al. | Jun 2006 | A1 |
20060142657 | Quaid et al. | Jun 2006 | A1 |
20060207419 | Okazaki et al. | Sep 2006 | A1 |
20060265179 | Jansen et al. | Nov 2006 | A1 |
20060293598 | Fraser | Dec 2006 | A1 |
20070142751 | Kang et al. | Jun 2007 | A1 |
20070260140 | Solar et al. | Nov 2007 | A1 |
20070270685 | Kang et al. | Nov 2007 | A1 |
20080004633 | Arata et al. | Jan 2008 | A1 |
20080010705 | Quaid et al. | Jan 2008 | A1 |
20080010706 | Moses et al. | Jan 2008 | A1 |
20080058945 | Hajaj et al. | Mar 2008 | A1 |
20080125637 | Geist et al. | May 2008 | A1 |
20090000626 | Quaid et al. | Jan 2009 | A1 |
20090000627 | Quaid et al. | Jan 2009 | A1 |
20090012531 | Quaid et al. | Jan 2009 | A1 |
20090012532 | Quaid et al. | Jan 2009 | A1 |
20100137882 | Quaid, III | Jun 2010 | A1 |
20100170362 | Bennett et al. | Jul 2010 | A1 |
20100198219 | Mcfarlin et al. | Aug 2010 | A1 |
20100225209 | Goldberg et al. | Sep 2010 | A1 |
20110130761 | Plaskos et al. | Jun 2011 | A1 |
20110270120 | Mcfarlin et al. | Nov 2011 | A1 |
20120176306 | Lightcap et al. | Jul 2012 | A1 |
20130096573 | Kang et al. | Apr 2013 | A1 |
20130096574 | Kang et al. | Apr 2013 | A1 |
20140350571 | Maillet et al. | Nov 2014 | A1 |
20160097676 | Kurasawa et al. | Apr 2016 | A1 |
20160124022 | Tadano | May 2016 | A1 |
20160153777 | Ni et al. | Jun 2016 | A1 |
20160155097 | Venkatesha | Jun 2016 | A1 |
20190290367 | Andersson et al. | Sep 2019 | A1 |
20210128253 | Kang | May 2021 | A1 |
20210361362 | Kang | Nov 2021 | A1 |
Number | Date | Country |
---|---|---|
1684729 | Oct 2005 | CN |
1 059 067 | Dec 2000 | EP |
1 184 684 | Mar 2002 | EP |
1 380 266 | Jan 2004 | EP |
1 574 186 | Jun 2008 | EP |
1 871 267 | Sep 2018 | EP |
08-215211 | Aug 1996 | JP |
09-330016 | Dec 1997 | JP |
2000-279425 | Oct 2000 | JP |
2002-102251 | Apr 2002 | JP |
3342969 | Nov 2002 | JP |
2003-053684 | Feb 2003 | JP |
2004-513684 | May 2004 | JP |
WO-9501757 | Jan 1995 | WO |
WO-9617552 | Jun 1996 | WO |
WO-0035336 | Jun 2000 | WO |
WO-0224051 | Mar 2002 | WO |
WO-02060653 | Aug 2002 | WO |
WO-02061371 | Aug 2002 | WO |
WO-03007101 | Jan 2003 | WO |
WO-03077101 | Sep 2003 | WO |
WO-2004069036 | Aug 2004 | WO |
WO-2004069040 | Aug 2004 | WO |
WO-2004069041 | Aug 2004 | WO |
WO-2004070573 | Aug 2004 | WO |
WO-2004070577 | Aug 2004 | WO |
WO-2004070580 | Aug 2004 | WO |
WO-2004070581 | Aug 2004 | WO |
WO-2004075987 | Sep 2004 | WO |
WO-2005009215 | Feb 2005 | WO |
WO-2005013841 | Feb 2005 | WO |
WO-2005072629 | Aug 2005 | WO |
WO-2005120380 | Dec 2005 | WO |
WO-2005122916 | Dec 2005 | WO |
WO-2006004894 | Jan 2006 | WO |
WO-2006091494 | Aug 2006 | WO |
WO-2007117297 | Oct 2007 | WO |
Entry |
---|
English translation of JP 3342969, Nov. 2002. (Year: 2002). |
Abovitz et al., “The Future Use of Networked Haptic Learning Information Systems in Computer-Assisted Surgery,” CAOS 2001, Jul. 6-8, 2001, pp. 337-338. |
Abovitz, “Digital surgery: the future of medicine and human-robot symbiotic interaction,” Industrial Robot: An International Journal, Oct. 2001, vol. 28, Issue 5, pp. 401-406 (abstract only). |
Abovitz, “Human-Interactive Medical Robotics,” CAOS 2000, Jun. 15-17, 2000, pp. 71-72. |
Abovitz, “Human-Interactive Medical Robotics,” CAOS 2001, Jul. 6-8, 2001, pp. 81-82. |
Acosta, et al., “Development of a Haptic Virtual Environment”, Computer-Based Medical Systems, Proceedings 12th IEEE Symposium. pp. 35-39, 1999. |
Bennett et al., “Autonomous Calibration Of Single-Loop Kinematic Chains Formed By Manipulators With Passive End-Point Constraints,” IEEE Transactions On Robotics And Automation, vol. 7, pp. 597-606, 1991. |
Bettini et al., “Vision assisted control for manipulation using virtual fixtures: Experiments at macro and micro scales,” in Proc. 2002 IEEE Intl. Conf. on Robotics and Automation, (Washington, DC), May 2002, 8 pages. |
Chapter II Demand and Response to Written Opinion for PCT/US2006/005700, submitted Dec. 15, 2006, 16 pages. |
Chapter II Demand and Response to Written Opinion for PCT/US2006/049216, submitted Jul. 15, 2008, 19 pages. |
Chen et al., “Force Feedback for Surgical Simulation,” Proceedings of the IEEE, New York, US, vol. 86, No. 3, Mar. 1, 1998. pp. 524-530. |
Cobb et al., “A robotic system for TKR surgery,” in Third Annual North American Program on Computer Assisted Orthopaedic Surgery, (Pittsburgh, PA), pp. 70-74, Jun. 1999. |
Colgate, J. Edward, et al., “Cobots: Robots for Collaboration with Human Operators,” proceedings of International Mechanical Engineering Congress & Exhibition, DSC-vol. 58, 1996, pp. 433-439. |
Davies et al., “Acrobot-using Robots and Surgeons Synergistically in Knee Surgery”, 1997 British Crown Copyright, pp. 173-178. |
Davies et al., “The use of force control in robot assisted knee surgery,” in Proceedings of the First Annual Symposium on Medical Robotics and Computer Assisted Surgery, vol. 2, (Pittsburgh, PA), pp. 258-262, Sep. 1994. |
Decision to Refuse a European Patent Application for EP Application No. 07756266.8 dated Aug. 3, 2016, 32 pages. |
Definition of devices, web thesaurus, printed Mar. 4, 2021. (Year: 2021). |
Definition of Object, web dictionary, printed Mar. 4, 2021 (Year: 2021). |
Examination report for EP 04757075.9, dated Jan. 12, 2011, 5 pages. |
Fritz, et al., “Design of a Haptic Data Visualization System for People with Visual Impairments”, IEEE Trans. on Rehabiliation Engineering, vol. 7, No. 3, Sep. 1999, 13 pages. |
Germano et al., Clinical Use of the Optical Digitizer for Intracranial Neuronavigation, Neurosurgery, vol. 45(2), Aug. 1999, 15 pages. |
Goswami, et al., “Identifying Robot Parameters Using Partial Pose Information,” IEEE Control Systems Magazine, vol. 13, No. 5, Oct. 1993, 11 pages. |
Ho, S.C. et al., “Robot Assisted Knee Surgery Establishing a Force Control Strategy Incorporating Active Motion Constraint,” IEEE Engineering in Medicine and Biology Magazine, vol. 14, No. 3, May 1, 1995, col. 2-3, p. 293. |
Hollerbach, J.M. & D. E. Johnson. Virtual Environment Rendering. To appear in Human and Machine Haptics, M. Cutkosky, R. Howe, K. Salisbury, and M. Srinivasan (eds.), MIT Press, 2000 (available at http://www.cs.ubc.ca/labs/spin/publications/related/hollerbach00.pdf), 25 pages. |
International Preliminary Examination Report for PCT/US2003/007063, dated Sep. 2, 2004 (2 pages). |
International Preliminary Report on Patentability for PCT/US2004/022978 including International Search Report and Written Opinion, dated Feb. 13, 2007 (6 pages). |
International Preliminary Report on Patentability for PCT/US2006/005700, dated May 8, 2007 (7 pages). |
International Preliminary Report on Patentability for PCT/US2006/049216, dated Sep. 10, 2008, 9 pages. |
International Search Report and Written Opinion for corresponding PCT Application No. PCT/US2006/049216, dated May 8, 2008 (15 pgs.). |
International Search Report and Written Opinion for PCT/US2006/005700, dated Jun. 27, 2006, 10 pages. |
International Search Report for PCT/US2003/007063, dated Apr. 16, 2004 (7 pages).
Kazanzides, Peter et al., "An Integrated System for Cementless Hip Replacement", Integrated Surgical Systems, Department of Orthopedic Surgery, Sutter General Hospital, May/Jun. 1995, pp. 307-313.
Leeser et al., "Computer-assisted teach and play: Novel user-friendly robot teach mode using gravity compensation and backdrivability," in Proceedings of the Robotics International/SME Fifth World Conference on Robotics Research, (Cambridge, MA), Sep. 1994, 7 pages.
Leeser, Karl, et al., "Control and Exploitation of Kinematic Redundancy in Torque-Controllable Manipulators via Multiple-Jacobian Superposition," International Conf. on Field & Service Robotics, Dec. 8-10, 1997, 7 pages.
London Press Services, "'Acrobot' capable of delicate knee surgery," Can. Med. Assoc. J., Jun. 15, 1997, 156(12), p. 1690.
Matsuoka, Yoky, et al., "Design of Life-Size Haptic Environments," Experimental Robotics VII, 2001, pp. 461-470.
Meggiolaro, et al., "Manipulator calibration using a single endpoint contact constraint," in 26th ASME Biennial Mechanisms Conference, (Baltimore, MD), 2000, 9 pages.
Moore, Carl A., et al., "Cobot Implementation of 3D Virtual Surfaces," Proceedings of the 2002 Institute of Electrical and Electronics Engineers International Conference on Robotics & Automation, May 2002, pp. 3242-3247.
Niki, et al., "Simple Haptic Display and Object Data Design", Proceedings of the 2000 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2000, pp. 967-972.
Otmane, S., et al., "Active Virtual Guides as an Apparatus for Augmented Reality Based Telemanipulation System on the Internet," presented at Institute of Electrical and Electronics Engineers Computer Society 33rd Annual Simulation Symposium ANSS 2000, held Apr. 16-20, 2000, pp. 185-191.
Park et al., "Virtual fixtures for robotic cardiac surgery," in Proc. Medical Image Computing and Computer-Assisted Intervention, (Utrecht, Netherlands), Oct. 2001, 2 pages.
PCT/US2006/049216, Partial Intl. Search Report, dated Jan. 18, 2008 (2 pages).
Press Release, "The Acrobot Company Wins Best Surgical Innovation Award," Acrobot Precision Surgical Systems, May 24, 2002, 1 page.
Provision of a Copy of the Minutes in Accordance with Rule 124(4) EPC for EP Application No. 07756266.8 dated Aug. 2, 2016, 5 pages.
Quaid et al., "Haptic Information Displays for Computer-Assisted Surgery," Proceedings of the 2002 IEEE International Conference on Robotics & Automation, May 2002, pp. 2092-2097.
Quaid, Arthur E., et al., "FGS WAM: First Cadaver Trial," Z-Kat, Inc. Confidential Material, Sep. 28, 2001, pp. 1-7.
Quaid, Arthur E., et al., "FGS WAM: Integration of Fluorotactic Guidance with the Whole-Arm Manipulator," Z-Kat, Inc. Confidential Material, Dec. 28, 2000, pp. 1-6.
Quaid, et al., "The Use of Haptic Information Displays for Assisting in the Execution of Image-Guided Surgery Plans," Syllabus of the Computer Assisted Orthopaedic Surgery Meeting, Jul. 2001, pp. 338-340.
Roche, "Changing the way surgeons plan and execute minimally invasive unicompartmental knee surgery," Orthopaedic Product News, Jul./Aug. 2006, pp. 16-18.
Rosenberg, "Virtual Fixtures: Perceptual Tools for Telerobotic Manipulation", 1993 IEEE, pp. 76-82.
Rosenberg, "Virtual Fixtures: Perceptual overlays enhance operator performance in telepresence tasks," PhD thesis, Stanford University, Aug. 1994, 7 pages.
Sayers, Craig P., et al., "An Operator Interface for Teleprogramming Employing Synthetic Fixtures," to appear in Presence, Special Issue on Networked Virtual Environments and Teleoperation, Jun. 1994, pp. 1-27.
Schneider, O., et al., "Synergistic Robotic Assistance to Cardiac Procedures," presented to Computer Assisted Radiology and Surgery on Jun. 23-26, 1999, 5 pages.
Sensable Technologies, Inc., "Freeform Feel the Difference", 2001, 4 pages.
Sensable Technologies, Inc., "FreeForm Modeling—Technical Features," 2003, 2 pages.
Staecker et al., "Use of the LandmarX™ Surgical Navigation System in Lateral Skull Base and Temporal Bone Surgery", Skull Base, vol. 11, No. 4, 2001, pp. 245-255, Thieme Medical Publishers, Inc., 11 pages.
Taylor, Russell et al., "An Image-Directed Robotic System for Precise Orthopaedic Surgery", IEEE Transactions on Robotics and Automation, vol. 10, No. 3, Jun. 1994, pp. 261-275.
Taylor, Russell et al., "Redundant Consistency Checking in a Precise Surgical Robot", Annual International Conference of the IEEE Engineering in Medicine and Biology Society, vol. 12, No. 5, 1990, pp. 1933-1935.
Taylor, Russell et al., "Robotic Joint Replacement Surgery", NSF Engineering Research Center for Computer-Integrated Surgical Systems and Technology, 2000, 2001, 2004, 71 pages.
Tognetti, Lawrence Joseph, "Actuator Design for a Passive Haptic Display," Georgia Institute of Technology, Jun. 1999, 33 pages.
Townsend et al., "Teleoperator slave—WAM design methodology," Industrial Robot, vol. 26, No. 3, 1999, pp. 167-177.
World Wide Web, http://haptics.me.jhu.edu/r_hapt.html, "Haptic Interfaces and Virtual Environments," printed on Jun. 12, 2003, 2 pages.
World Wide Web, http://haptics.me.jhu.edu/r_kine.html, "Robot Design and Kinematics," printed on Jun. 12, 2003, 2 pages.
World Wide Web, http://www.acrobot.co.uk/background.html, "The Acrobot Company Limited—Background," printed on Jul. 10, 2002, 1 page.
World Wide Web, http://www.acrobot.co.uk/home.html, "The Acrobot Company Limited—Precision Surgical Systems," printed on Jul. 10, 2002, 1 page.
World Wide Web, http://www.acrobot.co.uk/meetings.html, "The Acrobot Company Limited—Meetings and Publications," printed on Jul. 10, 2002, pp. 1-3.
World Wide Web, http://www.acrobot.co.uk/products.html, "The Acrobot Company Limited—Products," printed on Jul. 10, 2002, pp. 1-6.
World Wide Web, http://www.fcs-cs.com/robotics/content/assistance.htm, "Surgical Assistance," printed on Jun. 12, 2003, 1 page.
World Wide Web, http://www.fcs-cs.com/robotics/content/design.htm, "Virtual Design, Assembly & Maintenance," printed on Jun. 12, 2003, 1 page.
World Wide Web, http://www.fcs-cs.com/robotics/content/endeffectors.htm, "End effectors," printed on Jun. 12, 2003, 1 page.
World Wide Web, http://www.fcs-cs.com/robotics/content/hapticmaster.htm, "HapticMASTER", printed on Jun. 12, 2003, 1 page.
World Wide Web, http://www.fcs-cs.com/robotics/content/reality.htm, "Virtual Reality," printed on Jun. 12, 2003, 1 page.
World Wide Web, http://www.fcs-cs.com/robotics/content/rehabilitation.htm, "Rehabilitation," printed on Jun. 12, 2003, 1 page.
World Wide Web, http://www.fcs-cs.com/robotics/content/research.htm, "Research," printed on Jun. 12, 2003, 1 page.
World Wide Web, http://www.fcs-cs.com/robotics/content/simulation.htm, "Simulation & Training," printed on Jun. 12, 2003, 1 page.
World Wide Web, http://www.fcs-cs.com/robotics/content/software.htm, "Software," printed on Jun. 12, 2003, 1 page.
World Wide Web, http://www.merl.com/projects/surgSim99/, "Knee Arthroscopy Simulation," printed on Jun. 12, 2003, 2 pages.
Written Opinion for PCT/US2006/049216, dated May 8, 2008, 12 pages.
Zilles, et al., "A Constraint-Based God-object Method for Haptic Display", IEEE Proceedings, 1995, pp. 146-151.
Number | Date | Country
---|---|---
20220031404 A1 | Feb 2022 | US
Number | Date | Country
---|---|---
60801378 | May 2006 | US
Relation | Number | Date | Country
---|---|---|---
Parent | 17395668 | Aug 2021 | US
Child | 17504722 | | US
Parent | 16509651 | Jul 2019 | US
Child | 17395668 | | US
Parent | 15288769 | Oct 2016 | US
Child | 16509651 | | US
Parent | 11750815 | May 2007 | US
Child | 15288769 | | US