The present teachings relate generally to surgical navigation, and more particularly to a method for surgically navigating a trackable diagnostic scope along the surface of a bone to identify defects or abnormalities.
Surgical navigation systems, also known as computer assisted surgery and image guided surgery, aid surgeons in locating patient anatomical structures, guiding surgical instruments, and implanting medical devices with a high degree of accuracy. Surgical navigation has been compared to a global positioning system that aids vehicle operators in navigating the earth. A surgical navigation system typically includes a computer, a tracking system, and patient anatomical information. The patient anatomical information can be obtained by using an imaging modality such as fluoroscopy or computed tomography (CT), or by simply defining the location of patient anatomy with the surgical navigation system. Surgical navigation systems can be used for a wide variety of surgeries to improve patient outcomes.
To successfully implant a medical device, surgical navigation systems often employ various forms of computing technology together with intelligent instruments, digital touch devices, and advanced 3-D visualization software programs. These components enable surgeons to perform a wide variety of standard and minimally invasive surgical procedures and techniques. Moreover, these systems allow surgeons to more accurately plan, track and navigate the placement of instruments and implants relative to a patient's body, as well as conduct pre-operative and intra-operative body imaging.
To accomplish the accurate planning, tracking and navigation of surgical instruments, tools and/or medical devices during a surgical procedure utilizing surgical navigation, surgeons often use “tracking arrays” that are coupled to the surgical components. These tracking arrays allow surgeons to track the physical location of the surgical components, as well as that of the patient's bones, during the surgery. By knowing the physical location of the tracking array, software associated with the tracking system can accurately calculate the position of the tracked component relative to a surgical plan image.
It is known to use surgical navigation instruments to measure the size and general contour of a bone for use in the selection or manufacture of a prosthetic implant. This selection process allows the surgeon to choose a prosthetic implant that comfortably fits the general shape and size of the patient's anatomy. It would be desirable to improve upon these methods to reduce surgery time and improve prosthetic fit and/or function.
The present teachings provide a trackable diagnostic scope apparatus that is capable of identifying defects or abnormalities on the surface of a bone. These defects or abnormalities are registered by the surgical navigation system and then analyzed so that a prosthetic device can be custom manufactured to fit the flawed surface of the bone in a precise manner.
In one exemplary embodiment, the present teachings provide a method of performing a surgical procedure. The method comprises providing a tracking system and a diagnostic scope trackable by the tracking system, identifying an abnormality on a bone with the diagnostic scope, and acquiring a plurality of points on or near the abnormality with the diagnostic scope. The acquired points are then used to make an implant having a portion whose shape substantially matches that of the bone abnormality.
According to another exemplary embodiment herein, an image guided surgery system is provided. The system comprises a computer having surgical navigation utilities software, a tracking system having a measurement field, and a diagnostic scope that is trackable by the tracking system when exposed to the measurement field. The software comprises a program that, when executed, causes the system to acquire a plurality of points on or near a bone abnormality so that an implant can be made having a portion whose shape substantially matches that of the abnormality.
In yet another exemplary embodiment, a computer readable storage medium is provided. According to this embodiment, the storage medium stores instructions that, when executed by a computer, cause the computer to perform a surgical procedure. The surgical procedure comprises tracking a diagnostic scope with a tracking system when the diagnostic scope is exposed to a measurement field of the tracking system, and acquiring a plurality of points on or near a bone abnormality with the diagnostic scope so that an implant can be made having a portion whose shape substantially matches that of the abnormality.
The above-mentioned aspects of the present teachings and the manner of obtaining them will become more apparent and the invention itself will be better understood by reference to the following description of the embodiments of the invention taken in conjunction with the accompanying drawings, wherein:
a is a perspective view of a custom manufactured implant device in accordance with the present teachings;
b is a further perspective view of the implant device; and
c is a further perspective view of the implant device.
Corresponding reference characters indicate corresponding parts throughout the several views.
The embodiments of the present invention described below are not intended to be exhaustive or to limit the invention to the precise forms disclosed in the following detailed description. Rather, the embodiments are chosen and described so that others skilled in the art may appreciate and understand the principles and practices of the present invention.
The surgery is performed within a sterile field, adhering to the principles of asepsis by all scrubbed persons in the operating room. Patient 22, surgeon 21 and assisting clinician 50 are prepared for the sterile field through appropriate scrubbing and clothing. The sterile field will typically extend from operating table 24 upward in the operating room. Typically, both the computer display and fluoroscope display are located outside of the sterile field.
A representation of the patient's anatomy can be acquired with an imaging system, a virtual image, a morphed image, or a combination of imaging techniques. The imaging system can be any system capable of producing images that represent the patient's anatomy, such as a fluoroscope producing x-ray two-dimensional images, computed tomography (CT) producing a three-dimensional image, magnetic resonance imaging (MRI) producing a three-dimensional image, ultrasound imaging producing a two-dimensional image, and the like. A virtual image of the patient's anatomy can be created by defining anatomical points with the surgical navigation system 20 or by applying a statistical anatomical model. A morphed image of the patient's anatomy can be created by combining an image of the patient's anatomy with a data set, such as a virtual image of the patient's anatomy. Some imaging systems, such as C-arm fluoroscope 26, can require calibration. The C-arm can be calibrated with a calibration grid that enables determination of fluoroscope projection parameters for different orientations of the C-arm to reduce distortion. A registration phantom can also be used with a C-arm to coordinate images with the surgical navigation application program and improve scaling through the registration of the C-arm with the surgical navigation system. A more detailed description of a C-arm based navigation system is provided in James B. Stiehl et al., Navigation and Robotics in Total Joint and Spine Surgery, Chapter 3: C-Arm-Based Navigation, Springer-Verlag (2004).
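By way of illustration only, the calibration-grid idea can be sketched as a simple least-squares dewarping fit: corresponding grid points detected in the fluoroscopic image are mapped to their known physical locations with a low-order polynomial. The sketch below assumes such corresponding point sets are available; it is not a description of any particular calibration protocol.

```python
# Illustrative sketch only: fitting a quadratic dewarping map from detected
# calibration-grid points to their known locations (assumed inputs, N >= 6).
import numpy as np

def _basis(xy):
    x, y = xy[:, 0], xy[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x * y, x ** 2, y ** 2])

def fit_dewarp(detected_xy, true_xy):
    """detected_xy, true_xy: (N, 2) arrays of corresponding grid points."""
    coeffs, *_ = np.linalg.lstsq(_basis(detected_xy), true_xy, rcond=None)
    return coeffs  # (6, 2) polynomial coefficients

def dewarp(points_xy, coeffs):
    """Apply the fitted correction to arbitrary image points."""
    return _basis(np.asarray(points_xy, dtype=float)) @ coeffs
```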
Computer 112 can be any computer capable of properly operating surgical navigation devices and software, such as a computer similar to a commercially available personal computer that comprises a processor 126, working memory 128, core surgical navigation utilities 130, an application program 132, stored images 134, and application data 136. Processor 126 is a processor of sufficient power for computer 112 to perform desired functions, such as one or more microprocessors. Working memory 128 is memory sufficient for computer 112 to perform desired functions such as solid-state memory, random-access memory, and the like. Core surgical navigation utilities 130 are the basic operating programs, and include image registration, image acquisition, location algorithms, orientation algorithms, virtual keypad, diagnostics, and the like. Application program 132 can be any program configured for a specific surgical navigation purpose, such as orthopedic application programs for unicondylar knee (“uni-knee”), total knee, hip, spine, trauma, intramedullary (“IM”) nail/rod, and external fixator. Stored images 134 are those recorded during image acquisition using any of the imaging systems previously discussed. Application data 136 is data that is generated or used by application program 132, such as implant geometries, instrument geometries, surgical defaults, patient landmarks, and the like. Application data 136 can be pre-loaded in the software or input by the user during a surgical navigation procedure.
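For purposes of illustration only, the application data enumerated above might be organized along the following lines; the record and field names are hypothetical and are not part of the present teachings.

```python
# Illustrative sketch only: simple records for the kinds of application data 136
# described above (implant geometries, instrument geometries, surgical defaults,
# and patient landmarks collected during the procedure).
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

Point3 = Tuple[float, float, float]

@dataclass
class ApplicationData:
    implant_geometries: Dict[str, List[Point3]] = field(default_factory=dict)
    instrument_geometries: Dict[str, List[Point3]] = field(default_factory=dict)
    surgical_defaults: Dict[str, float] = field(default_factory=dict)
    patient_landmarks: List[Point3] = field(default_factory=list)  # acquired intra-operatively
```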
Output device 116 can be any device capable of creating an output useful for surgery, such as a visual output or an auditory output. The visual output device can be any device capable of creating a visual output useful for surgery, such as a two-dimensional image, a three-dimensional image, a holographic image, and the like. The visual output device can be a monitor for producing two- and three-dimensional images, a projector for producing two- and three-dimensional images, or indicator lights. The auditory output device can be any device capable of creating an auditory output useful for surgery, such as a speaker that can be used to provide a voice or tone output.
Removable storage device 118 can be any device having a removable storage media that would allow downloading data, such as application data 136 and patient anatomical data 124. The removable storage device can be a read-write compact disc (CD) drive, a read-write digital video disc (DVD) drive, a flash solid-state memory port, a removable hard drive, a floppy disc drive, and the like.
Tracking system 120 can be any system that can determine the three-dimensional location of devices carrying or incorporating markers that serve as tracking indicia. An active tracking system has a collection of infrared light-emitting diode (ILED) illuminators that surround the position sensor lenses to flood a measurement field of view with infrared light. A passive system incorporates retro-reflective markers that reflect infrared light back to the position sensor, and the system triangulates the real-time position (x, y, and z location) and orientation (rotation around the x, y, and z axes) of an array 122 and reports the result to the computer system with an accuracy of about 0.35 mm Root Mean Squared (RMS). An example of a passive tracking system is the Polaris® Passive System and an example of a marker is the NDI Passive Spheres™, both available from Northern Digital Inc., Ontario, Canada. A hybrid tracking system can detect active and active wireless markers in addition to passive markers. Active marker based instruments enable automatic tool identification, program control of visible LEDs, and input via tool buttons. An example of a hybrid tracking system is the Polaris® Hybrid System, available from Northern Digital Inc. A marker can be a passive IR reflector, an active IR emitter, an electromagnetic marker, or an optical marker used with an optical camera.
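By way of illustration, one conventional way such a system can recover an array's position and orientation from three or more tracked markers is a least-squares rigid fit of the array's known marker geometry to the measured marker locations (the Kabsch/SVD method). The sketch below is illustrative only and does not represent any vendor's implementation.

```python
# Illustrative sketch only: least-squares rigid fit (Kabsch/SVD) of an array's
# known marker layout to the marker positions reported by the position sensor.
import numpy as np

def estimate_array_pose(local_markers, measured_markers):
    """local_markers: (N, 3) marker coordinates in the array's own frame.
    measured_markers: (N, 3) corresponding coordinates measured by the camera.
    Returns (R, t) with measured ~= R @ local + t; requires N >= 3 non-collinear markers.
    """
    local = np.asarray(local_markers, dtype=float)
    meas = np.asarray(measured_markers, dtype=float)
    lc, mc = local.mean(axis=0), meas.mean(axis=0)
    H = (local - lc).T @ (meas - mc)                      # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))                # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T               # orientation (rotation matrix)
    t = mc - R @ lc                                       # position (x, y, z)
    return R, t
```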
As is generally known within the art, implants and instruments may also be tracked by electromagnetic tracking systems. These systems locate and track devices and produce a real-time, three-dimensional video display of the surgical procedure. This is accomplished by using electromagnetic field transmitters that generate a local magnetic field around the patient's anatomy. In turn, the localization system includes magnetic sensors that identify the position of tracked instruments as they move relative to the patient's anatomy. By not requiring a line of sight with the transmitter, electromagnetic systems are also adapted for in vivo use, and are also integrable, for instance, with ultrasound and CT imaging processes for performing interventional procedures by incorporating miniaturized tracking sensors into surgical instruments. By processing transmitted signals generated by the tracking sensors, the system is able to determine the position of the surgical instruments in space, as well as superimpose their relative positions onto pre-operatively captured CT images of the patient.
Arrays 122 can be probe arrays, instrument arrays, reference arrays, calibrator arrays, and the like. Arrays 122 can have any number of markers, but typically have three or more markers to define real-time position (x, y, and z location) and orientation (rotation around the x, y, and z axes). An array comprises a body and markers. The body comprises an area for spatial separation of the markers. In some embodiments, there are at least two arms, and some embodiments can have three arms, four arms, or more. The arms are typically arranged asymmetrically to facilitate specific array and marker identification by the tracking system. In other embodiments, such as a calibrator array, the body provides sufficient area for spatial separation of markers without the need for arms. Arrays can be disposable or non-disposable. Disposable arrays are typically manufactured from plastic and include installed markers. Non-disposable arrays are manufactured from a material that can be sterilized, such as aluminum, stainless steel, and the like. The markers are removable so that they can be taken off before sterilization.
Planning and collecting patient anatomical data 124 is a process by which a clinician inputs into the surgical navigation system actual or approximate anatomical data. Anatomical data can be obtained through techniques such as anatomic painting, bone morphing, CT data input, and other inputs, such as ultrasound and fluoroscope and other imaging systems.
The present teachings enhance surgical navigation system 20 by incorporating into the system a process for detecting bone abnormalities or defects with a diagnostic scope, and particularly to a process that can be used to custom manufacture a biomedical implant that is appropriately sized and shaped for implantation onto the flawed bone.
A fiber optic scope 408 (such as a 1.2 mm diagnostic scope) is also removably connected to the assembly 400 at one end. While the size of the fiber optic scope can vary depending on the surgical procedure to be performed, it should be understood that the scope should at least be sized so that it can easily penetrate the patient's body during the surgical diagnostic procedure. For instance, according to certain aspects of the present teachings, the removable fiber optic scope is approximately the size of an 18-gauge needle. Moreover, the fiber optic scope can also be disposable, as well as sterilely packaged in a single use pouch. To attach fiber optic scope 408 to assembly 400, any feasible attachment means may be used, such as, but not limited to, molding, fusing, threading, gluing, snapping, press-fitting or the like.
Fiber optic scope 408 is configured to house a cannula device 410 that mimics a needle or tube that can be inserted into a patient during a diagnostic procedure. As such, the size of the cannula device is typically the size of a gauge needle (e.g., about 1.9 mm in diameter). Cannula devices are generally known within the art and typically include a trocar, obturator and plug. Cannula device 410 is also formed of a molded plastic body having a stainless steel sheath to provide rigidity as the device is inserted into and maneuvered within the patient's body. Moreover, the cannula device may also have a luer port on its body to allow irrigation, and the plug may be configured to seal the scope port on the cannula during intra-articular injection.
Unlike traditional scope assemblies, scope assembly 400 further includes a reference array 420 that enables the surgical navigation system to locate and track in real-time the position of the scope assembly relative to other arrays and/or surgical instruments within the surgical field. More particularly, the tracking system determines the position of scope assembly 400 by detecting the position of markers 422 on reference array 420 in space using known triangulation methods. The relative location of scope assembly 400 can then be shown on a surgical plan image of a computer display positioned within the surgical field.
The principles upon which exemplary embodiments of the present invention rely can be understood with reference to the accompanying figures.
The tracking system detects the location of scope assembly 500 relative to bones 510, 512 by referencing the position of reference array 502 as it moves with respect to reference arrays 520 and 522, which are fixedly attached to the femur and tibia of patient 524.
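For purposes of illustration only, the relative tracking described above amounts to a change of coordinate frames. In the sketch below, the 4x4 homogeneous transforms reported for the scope array and the femoral reference array, together with a calibrated tip offset, are assumed inputs; the names are illustrative and do not correspond to any particular system interface.

```python
# Illustrative sketch only: expressing the scope tip in the femur's reference
# frame from the camera-reported poses of the two arrays.
import numpy as np

def tip_in_bone_frame(T_cam_scope, T_cam_femur, tip_in_scope):
    """T_cam_scope, T_cam_femur: 4x4 homogeneous poses reported by the camera.
    tip_in_scope: calibrated (x, y, z) tip offset in the scope array's frame."""
    T_femur_scope = np.linalg.inv(T_cam_femur) @ T_cam_scope        # scope pose relative to femur
    tip_h = np.append(np.asarray(tip_in_scope, dtype=float), 1.0)   # homogeneous point
    return (T_femur_scope @ tip_h)[:3]                               # tip in femur coordinates
```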
Once scope assembly 500 has been inserted into the patient's knee at the incision site (see the opening in the patient's knee indicated by reference numeral 528), the patient's bones 510, 512, as well as the interior of the knee joint and all its compartments, become visible to the surgeon on surgical plan image 516. At this point, the surgeon is able to freely move the scope around inside the incision cavity and thereby view real-time images of the patient's bones and ligaments. Such efforts allow the surgeon to identify any unnatural surface defects, flaws or abnormalities that may be present within the patient's knee region.
While not shown here, additional monitors may also be used to display other images of the patient's anatomy during the diagnostic scope procedure. For instance, it may be desirable to display other endoscopic images of the patient's anatomy that were taken during the instant procedure and/or images that were taken during a previous procedure. It should be understood that these images can be captured by any known imaging methodologies available within the surgical navigation art. Such imaging methodologies include, but are not limited to, fluoroscopy, computed tomography (CT), magnetic resonance imaging (MRI), ultrasound, and the like.
After surface defect 530 has been located, surgeon 508 can utilize scope assembly 500 to register or collect a series of points on the bone to define the general shape, location and dimensional parameters of the defect. To register or collect these points, a navigational application program is typically used that arranges the point acquisition process into sequential pages of surgical protocol configured according to a graphic user interface scheme.
To detect the location of scope assembly 500 relative to femur 510 and tibia 512, the tracking system references the position of markers 542 as they move with respect to reference arrays 520 and 522, which are fixedly attached to the femur and tibia of the patient. By tracking the relative position of the markers, the exact location of the individual points corresponding to the acquired locations 540 can be determined and shown on the surgical plan image (shown here as black dots 544 on surgical plan image 516). Once the assembly is positioned at a specific point to be acquired, the point can be selected or registered with the system by blocking the markers 542 of the scope assembly from the camera (e.g., selective gesturing) or by any other input means, such as by pushing or selectively activating one of the buttons 546 on the scope assembly or one or more buttons on a conventional computer mouse or keyboard (not shown) that is associated with the navigation system. By pushing a button on the scope assembly, a mouse or a keyboard, the navigation system can be programmed to instantaneously capture the exact location of markers 542 in real-time and translate this position into the surgical plan image.
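By way of illustration, the point-registration step described above might be sketched as follows. Here get_tip_in_bone and capture_requested are hypothetical stand-ins for the tracking loop and for the button, mouse, keyboard, or gesture input; they are not part of any actual navigation software.

```python
# Illustrative sketch only: record the tracked tip location each time the user
# signals a capture, until the desired number of points has been acquired.
from typing import Callable, List, Tuple

Point3 = Tuple[float, float, float]

def acquire_points(get_tip_in_bone: Callable[[], Point3],
                   capture_requested: Callable[[], bool],
                   n_points: int) -> List[Point3]:
    points: List[Point3] = []
    while len(points) < n_points:
        if capture_requested():               # button, mouse/keyboard, gesture, or tap
            points.append(get_tip_in_bone())  # tip location in bone coordinates now
    return points
```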
Selective gesturing is a procedure that allows a user to make a virtual mouse input by occluding the optical path between the markers of the scope assembly and the camera of the tracking system.
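For purposes of illustration only, one plausible way to turn such an occlusion into a virtual input, while ignoring momentary tracking dropouts, is to require that the array remain hidden for a deliberate but bounded interval. The thresholds in the sketch below are assumptions and are not taken from the present teachings.

```python
# Illustrative sketch only: treat a sustained (but bounded) loss of marker
# visibility as a deliberate "virtual click".
def detect_occlusion_click(visibility_samples, sample_period_s,
                           min_occlusion_s=0.5, max_occlusion_s=2.0):
    """visibility_samples: sequence of booleans, True when the array is visible."""
    run = 0
    for visible in list(visibility_samples) + [True]:    # sentinel closes a trailing run
        if not visible:
            run += 1
            continue
        duration = run * sample_period_s
        if min_occlusion_s <= duration <= max_occlusion_s:
            return True                                   # deliberate occlusion detected
        run = 0                                           # too short (dropout) or too long
    return False
```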
The system may also recognize and assign functionality to movement of the tip of scope assembly 500 away from the surface of a bone, i.e., along the z-axis. For example, a quick movement of the tip of the scope assembly away from femur 510 a few centimeters and then returning the tip to substantially the same spot on the femur may be interpreted as being equivalent to a single click of a conventional mouse. Similarly, two of these short “taps” may be interpreted as a double click. One of skill in the art would readily recognize many other functions or mouse inputs that could be assigned to various movements of the scope in the z-axis without straying from the present teachings. For a further description of virtual mouse inputting operations useful in accordance with the present teachings, see U.S. patent application Ser. No. 11/227,741, filed Sep. 15, 2005, entitled “Virtual Mouse for use in Surgical Navigation” and U.S. patent application Ser. No. 11/434,035, filed May 15, 2006, also entitled “Virtual Mouse for use in Surgical Navigation,” both disclosures of which are incorporated by reference herein in their entirety.
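By way of illustration, such a "tap" could be recognized from the tip-to-surface distance over time, as sketched below. For brevity the sketch checks only the lift distance and timing (not that the tip returns to substantially the same spot), and the thresholds are assumptions; two taps within a short window could then be mapped to a double click.

```python
# Illustrative sketch only: count quick lift-and-return movements of the scope
# tip ("taps") from a sampled tip-to-surface distance signal.
def count_taps(tip_to_bone_mm, sample_period_s, lift_mm=20.0, max_tap_s=0.7):
    """tip_to_bone_mm: sequence of tip-to-bone distances (millimeters) over time."""
    taps = 0
    lifted_at = None
    for i, distance in enumerate(tip_to_bone_mm):
        t = i * sample_period_s
        if lifted_at is None and distance > lift_mm:
            lifted_at = t                        # tip left the surface
        elif lifted_at is not None and distance <= lift_mm:
            if t - lifted_at <= max_tap_s:       # returned quickly: count as one tap
                taps += 1
            lifted_at = None
    return taps
```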
According to one aspect of the present teachings, after capturing the plurality of locations 540 on or near surface defect 530, software associated with the tracking system can collect and analyze the data and define the dimensional parameters of the abnormality (i.e., its size, shape, location, etc.). More particularly, the acquired points can be analyzed by the software program, which can then determine what implant component would be appropriate for correcting the defect. As will be described in detail below, a custom manufactured implant component can alternatively be created if the software program is not configured to suggest an implant component and/or the software program is unable to locate an implant component that is appropriately shaped and sized to correct the surface defect as needed.
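For purposes of illustration only, a very simple reduction of the acquired points to rough dimensional parameters is a centroid (location) together with extents along the principal axes (size and approximate shape), as sketched below; a clinical system would of course apply a more complete analysis.

```python
# Illustrative sketch only: summarize acquired defect points as a centroid plus
# extents along the principal axes of the point cloud.
import numpy as np

def defect_dimensions(points):
    """points: (N, 3) array of acquired locations on or near the defect."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)                                # approximate defect location
    centered = pts - centroid
    _, _, axes = np.linalg.svd(centered, full_matrices=False)  # rows are principal axes
    projected = centered @ axes.T                              # coordinates in that basis
    extents = projected.max(axis=0) - projected.min(axis=0)    # rough length, width, depth
    return centroid, axes, extents
```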
According to another aspect of the present teachings, after capturing the plurality of locations 540 on or near the surface defect, a three-dimensional model of the surface defect can optionally be generated on the surgical plan image.
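By way of illustration, one straightforward way to build such a model from the acquired points is to triangulate them in their best-fit plane. The sketch below assumes the scipy library is available; it is illustrative only and is not drawn from the present teachings.

```python
# Illustrative sketch only: triangulate the acquired points in their best-fit
# plane to obtain a simple mesh of the defect surface for display.
import numpy as np
from scipy.spatial import Delaunay

def triangulate_defect_surface(points):
    """points: (N, 3) acquired locations, N >= 3 and not collinear."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, axes = np.linalg.svd(pts - centroid, full_matrices=False)
    uv = (pts - centroid) @ axes[:2].T        # project onto the best-fit plane
    triangles = Delaunay(uv).simplices        # triangle connectivity (indices into pts)
    return pts, triangles                     # vertices and faces of the surface model
```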
If desired, a fiducial marker may be placed into the bone at or near the defect so that measurements taken in one procedure can be accurately overlaid on images acquired in a later procedure. The fiducial marker could be any stationary device that would provide a reproducible point of reference for aligning the navigation system. For example, the fiducial marker might be a screw, pin or other device (e.g., see the fiducial device inserted into femur 510, which is indicated by reference numeral 539) inserted into the bone at some distance from the surgical site so as not to be disturbed during the surgical procedure. The surgeon can also mark the area surrounding the defect for later reference in a subsequent procedure if desired. As such, it should be understood and appreciated herein that there are a variety of acceptable ways to define the three-dimensional reference space surrounding the defect for further reference and identification.
Once the system has calculated the dimensional parameters of the surface defect 530, the data is then analyzed by a software program, which is configured to determine what biomedical implant can be implanted into or onto the flawed bone to correct or fill the bone defect. For instance, the suggested implant can be displayed on surgical plan image 552 of computer monitor 554.
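For purposes of illustration only, the implant-selection step might be sketched as choosing, from a library of implant records, the component whose nominal dimensions most closely cover the measured defect extents, falling back to custom manufacture when none fits. The implant_library records below are hypothetical.

```python
# Illustrative sketch only: pick the smallest library implant that covers the
# measured defect extents; None signals that a custom implant should be made.
def select_implant(defect_extents, implant_library):
    """implant_library: iterable of (name, (length, width, depth)) entries."""
    best = None
    for name, dims in implant_library:
        if all(d >= e for d, e in zip(dims, defect_extents)):         # must cover the defect
            oversize = sum(d - e for d, e in zip(dims, defect_extents))
            if best is None or oversize < best[1]:
                best = (name, oversize)
    return best[0] if best is not None else None
```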
To expand upon the above process, in a diagnostic procedure the surgeon can transmit the data defining the defect or abnormality to an implant manufacturer that can produce a custom-fit prosthesis. More particularly, once the points on and surrounding the surface defect are identified and recorded by the computer system, an implant can be custom manufactured to match the patient's defect. According to this embodiment, the surgeon, in a follow-up procedure, will utilize information gathered during previous diagnostic procedures to register the patient's anatomy as needed. This is accomplished by using as a reference system the surface of the bone and the points acquired during the prior procedure. The defect area is prepared and the patient-matched implant can then be placed either with or without tracking the component.
As discussed in detail above, the tracking system detects the location of scope assembly 804 relative to femur 810 and tibia 814 by referencing the position of markers 832 as they move with respect to reference arrays 840 and 842, which are fixedly attached to the femur and tibia of the patient. By tracking the position of the markers relative to femur 810, the exact location of the acquired locations 812 can be determined and shown on the surgical plan image (shown here as black dots 844 on surgical plan image 846). As mentioned above, the specific points can be acquired by many different capturing means, such as, but not limited to, selective gesturing techniques, pressing a button on the scope assembly, a mouse or a keyboard, and/or tapping the tip of the scope assembly relative to the bone's surface.
After capturing the plurality of locations 812 on or near surface defect 830, a three-dimensional model of the surface defect can be generated on the surgical plan image.
As discussed above, after the data defining the defect or abnormality has been identified by the surgeon, this data can be transmitted to an implant manufacturer who can produce a custom-fit prosthesis.
While exemplary embodiments incorporating the principles of the present teachings have been disclosed hereinabove, the present teachings are not limited to the disclosed embodiments. Instead, this application is intended to cover any variations, uses, or adaptations of the invention using its general principles. Further, this application is intended to cover such departures from the present disclosure as come within known or customary practice in the art to which this invention pertains and which fall within the limits of the appended claims.
This application claims priority to U.S. Provisional Application Ser. No. 60/938,771, filed May 18, 2007, the entire contents of which are hereby incorporated by reference.
Muller PE, Pellengahr C, Witt M, Kircher J, Refior HJ, Jansson V. Influence of minimally invasive surgery on implant positioning and the functional outcome for medial unicompartmental knee arthroplasty. J Arthroplasty 2004;19(3):296-301.
“A Computer-Assisted Total Knee Replacement Surgical System Using a Calibrated Robot,” Thomas C. Kienzle III, S. David Stulburg, Michael Peshkin, Arthur Quaid, Jon Lea, Ambarish Goswami, and Chi-Haur Wu, in “Computer-Integrated Surgery: Technology and Clinical Applications,” ed. Russell H. Taylor et al., MIT Press, 1996 (28 pages).
“Real-Time Image Segmentation for Image-Guided Surgery,” Warfield, Simon; 14 pages; http://splweb.bwh.harvard.edu:8000/pages/papers/warfield/sc98/; accepted to appear at SC98.
“Acumen™ Surgical Navigation System, Surgical Navigation Applications” (2003) (2 pages).
Acumen™ Surgical Navigation System, Understanding Surgical Navigation (2003) (2 pages).
Bathis H, Perlick L, Tingart M, Luring C, Zurakowski D, Grifka J. Alignment in total knee arthroplasty: a comparison of computer-assisted surgery with the conventional technique. J Bone Joint Surg Br 2004;86(5):682-687.
Graetzel C, Fong TW, Grange S, Baur C. A non-contact mouse for surgeon-computer interaction. Technology and Health Care 2004;12(3):245-257.
Chauhan SK, Clark GW, Lloyd S, Scott RG, Breidhal W, Sikorski JM. Computer-assisted total knee replacement: a controlled cadaver study using a multi-parameter quantitative CT assessment of alignment (the Perth CT Protocol). J Bone Joint Surg [Br] 2004;86-B:818-823.
Stulberg SD. How accurate is current TKR instrumentation? Clin Orthop Nov. 2003;(416):177-184.
DiFranco DE, et al. Recovery of 3D Articulated Motion from 2D Correspondences. Cambridge Research Laboratory Technical Report CRL 99/7, Dec. 1999 (20 pages).
DiGioia AM, Jaramaz B, Colgan BD. Computer assisted orthopaedic surgery: image guided and robotic assistive technologies. Clin Orthop Sep. 1998;(354):8-16.
Eckhoff DG, Bach JM, Spitzer VM, Reinig KD, Bagur MM, Baldini TH, Rubinstein D, Humphries S. Three-Dimensional Morphology and Kinematics of the Distal Part of the Femur Viewed in Virtual Reality, Part II. J Bone Joint Surg Am 2003;85(Suppl 4):97-104.
Habets RJE. Computer assistance in orthopaedic surgery. Promoters: prof.dr.ir. A. Hasman, prof.dr.ir. F.A. Gerritsen; copromoter: dr.ir. J.A. Blom. Technische Universiteit Eindhoven, ISBN 90-386-1940-5, Nov. 4, 2002 (4 pages).
Stiehl JB, et al. Navigation and Robotics in Total Joint and Spine Surgery, Chapter 1: Basics of Computer-Assisted Orthopedic Surgery (CAOS). Springer-Verlag (2004) (9 pages).
Stiehl JB, et al. Navigation and Robotics in Total Joint and Spine Surgery, Chapter 3: C-Arm-Based Navigation. Springer-Verlag (2004) (9 pages).
Langlotz F, et al. Femoral Stem Navigation with the Surgi-GATE System. In: Navigation and Robotics in Total Joint and Spine Surgery. Springer, 2004, Chapter 13, pp. 102-109.
Luck JP, Debrunner C, Hoff W, He Q, Small D. Development and Analysis of a Real-Time Human Motion Tracking System. In: Proc. of Workshop on Applications of Computer Vision, 2002, Orlando, FL, IEEE (7 pages).
Traxtal Technologies—Virtual Keypad (printed May 23, 2005), pp. 1-2, http://www.traxtal.com/products/products—input—virtualkeypad.htm?print.
Visarius H, Gong J, Scheer C, Haralamb S, Nolte LP. Man-machine interfaces in computer assisted surgery. Comput Aid Surg 1997;2:102-107.