1. Field of the Invention
This invention relates to the determination of the shape of rigid or nearly rigid bodies. More particularly, this invention relates to shape determination of such bodies using a computer navigation system.
2. Description of the Background of the Invention
Computer determination of the location of bodies has been used in manufacturing and medical fields for a number of years. Computer navigation requires that the bodies to be tracked have a known shape, so that the system can properly track their orientation and position. Tracking is accomplished by either attaching a tracking device to the body or embedding the tracking device in the body. There are numerous tracking technologies, including active and passive optical tracking systems, magnetic systems, and inertial systems.
For many applications it is necessary to field calibrate bodies so that the navigation system can thereafter track the body and realistically render the body graphically on a computer display. Typically, this is done by attaching the tracking device in a fixed relation with the body and then inserting the body into a calibration device. These devices can be as simple as a divot in a known relation to the navigation system or can be a device that constrains the body in a predetermined attitude relative to the navigation system with the tip of the body located in a predetermined position. Current tracking calibration requires some physical contact between the body and a calibration device.
For certain situations, it may be desirable to minimize contact with other devices or bodies. For instance, in a surgical setting, sterility requirements dictate that the body to be used be sterile and that every body it contacts in any way also be sterile. This necessitates sterilizing the calibration device and maintaining the calibration device within the sterile field. With space at a premium in a surgical suite, this can be a problem.
In addition, bodies that include attachments, such as screwdrivers, drills, implant insertion devices, etc., need to be recalibrated each time a new attachment is inserted. Lastly, some devices do not have an axial shape with the result that these bodies have been difficult to field calibrate using known methods.
According to one aspect of the invention, a system determines the shape and orientation of a body relative to a tracking device. A sensing device generates a series of representations of the body. A tracking device capable of being detected by a computer navigation system is associated with the body such that the position of the body is located relative to the computer navigation system. The computer navigation system, having a central processing unit, processes the series of representations of the body and the relative location of the body in order to determine the shape and orientation of the body relative to the tracking device.
In accordance with another aspect of the invention, the shape and orientation of a body relative to an emitter are determined by a system. A sensing device generates a series of representations of the body. An emitter capable of being detected by a computer navigation system is associated with the body such that the position of the body is located relative to the computer navigation system. The computer navigation system, having a central processing unit, processes the series of representations of the body and the relative location of the body in order to determine the shape and orientation of the body relative to the emitter.
In accordance with a further aspect of the invention, a method to determine the shape and orientation of a body relative to a tracking device using a computer navigation system includes the step of generating a series of representations of the body and thereafter using these representations to determine a composite bounding volume of the body. The shape of the body is determined using the composite bounding volume. A position and an orientation of the body are determined using a tracking device associated with the body that communicates with the computer navigation system.
In yet a further aspect of the invention, a method to determine the shape and orientation of a body using a computer navigation system includes the step of generating a series of representations of the body from at least two perspectives. A composite bounding volume is determined from the series of representations and the shape of the body is determined from the composite bounding volume. The position and orientation of the body are determined from the shape of the body and the series of representations of the body.
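The composite bounding volume described in the aspects above is commonly estimated by silhouette intersection, also known as volume carving: a candidate point belongs to the volume only if it falls inside the body's silhouette in every view. The following is a minimal illustrative sketch, not part of the disclosed embodiments; the two analytic silhouette tests and the grid resolution are assumptions chosen only to make the example self-contained.

```python
def carve(silhouette_tests, grid_points):
    """Keep only the candidate 3-D points that fall inside every
    view's silhouette; the survivors approximate the bounding volume."""
    return [p for p in grid_points
            if all(test(p) for test in silhouette_tests)]

# Toy demonstration: two orthogonal views of a unit sphere. A point
# survives carving only if both of its projections lie inside a unit disc.
view_along_x = lambda p: p[1] ** 2 + p[2] ** 2 <= 1.0
view_along_z = lambda p: p[0] ** 2 + p[1] ** 2 <= 1.0

# Candidate points sampled on a coarse grid spanning the working volume.
grid = [(x / 2.0, y / 2.0, z / 2.0)
        for x in range(-2, 3) for y in range(-2, 3) for z in range(-2, 3)]
hull = carve([view_along_x, view_along_z], grid)
```

Note that the result is a bounding volume, not the exact shape: a grid point such as (1, 0, 1) passes both disc tests yet lies outside the true sphere, which is why additional views, and the database refinement described later, tighten the estimate.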
a is a schematic view of a further embodiment of the present invention;
b is a schematic view of a further embodiment of the present invention;
c is a schematic view of a further embodiment of the present invention;
a is a perspective view of a preferred calibration body;
b is a front view of the preferred calibration body of
c is a side view of the preferred calibration body of
d is a top view of the preferred calibration body of
The position tracking device 106 has a local coordinate system 120, and each of the cameras 108-1 through 108-M has its own local coordinate system 122-1 through 122-M. Suitable devices for use as the cameras 108-1 through 108-M include known digital video cameras, digital still cameras, image capture devices and the like.
The position tracking device 106 has a predetermined and fixed relationship to the body 102 and is calibrated to the computer navigation system 104. Furthermore, the position tracking device 106 is capable of tracking the position of a fixed point 124 on the surface of the body 102 with respect to either the coordinate system 120 of the position tracking device 106 or with respect to the coordinate system 112 of the navigation computer 104 because the two coordinate systems are calibrated to one another. The calibration of the two coordinate systems enables any measurements of the point 124 on the body 102 with respect to the coordinate system 120 of the position tracking device 106 to be mapped to the coordinate system 112 of the navigation computer 104 through a linear transformation.
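The linear transformation between two calibrated coordinate systems is a rigid transformation, a rotation followed by a translation. The sketch below is illustrative only and is not part of the disclosed embodiments; the particular rotation (90 degrees about the z axis) and translation values are assumptions chosen for the example.

```python
import math

def map_point(rotation, translation, p):
    """Map a point from the tracker frame to the navigation frame:
    p' = R @ p + t, written out without external libraries."""
    return tuple(
        sum(rotation[i][j] * p[j] for j in range(3)) + translation[i]
        for i in range(3)
    )

# Example calibration result: rotate 90 degrees about z, then shift in x.
theta = math.pi / 2
R = [[math.cos(theta), -math.sin(theta), 0.0],
     [math.sin(theta),  math.cos(theta), 0.0],
     [0.0,              0.0,             1.0]]
t = (10.0, 0.0, 0.0)

p_tracker = (1.0, 0.0, 0.0)          # point 124 in tracker coordinates
p_nav = map_point(R, t, p_tracker)   # same point in navigation coordinates
```

Because the transformation is determined once during calibration, every subsequent measurement of the point 124 can be mapped with the same R and t.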
The position tracking device 106 can be physically separate from the body 102, or alternatively, the position tracking device 106 can be attached to or otherwise incorporated into the body 102 and still provide the necessary position information. The point 124 can be located in a fixed position relative to the position tracking device 106 or can be determined by a calibration method described hereinafter. The point 124 can be the location of an emitter used by the position tracking device 106, as is discussed hereinafter.
The position tracking device 106 can be one of any of a number of position sensing devices known to those familiar with the art.
b shows the use of another type of an optical position tracking device 210 that makes use of another type of optical position sensor 212. Tracking a body with this alternate optical position tracking device 210 entails affixing a reflective body (e.g., a retro-reflector) 214 on the body 102. An emissive light source 216 is aligned with the reflective body 214 such that a beam of light 218 generated by the emissive light source 216 reflects off the reflective body 214 in such a manner that a light sensor 220 in the optical position sensor 212 may thereafter detect it and thus track the position of the reflective body 214 as it moves with the body 102. The Motus system by Peak Performance Technologies, Inc. (Englewood, Colo.) is an example of a position measurement device as shown in
c shows the use of a position tracking device 224 that uses magnets 226 affixed to the body 102. The changes in the direction and amplitude of magnetic flux of the magnets 226 as the magnets 226 move with the body 102 are sensed by the magnetic position sensor 228 and used to determine the position of the magnets 226. Examples of manufacturers of this type of tracking technology are Polhemus Incorporated of Colchester, Vt., and Ascension Technology Corporation of Burlington, Vt.
Although only one point 124 is depicted as being tracked on the surface of the body 102 to simplify the description, it should be evident that multiple tracking points may be tracked on the same body 102, each with a separate position-tracking device. In fact, multiple tracking points may be necessary to determine the full rotational orientation of the body 102. Multiple bodies 102 can also be tracked at the same time by a single system.
Referring once again to
The plurality of cameras 108-1 through 108-M positioned around the body capture images of the body from different perspectives. These cameras 108-1 through 108-M may be either fixed image cameras or video cameras, or some combination of the two camera technologies. If video cameras are used, then individual frames of the captured video are processed as single images. Preferably, all of the cameras capture frames nearly synchronously in time so that images from multiple viewpoints are correlated. The positions and coordinate systems 122-1 through 122-M of the cameras 108-1 through 108-M are calibrated to one another and to the global coordinate system 112 established by the navigation computer 104. One embodiment of the method of calibration of the cameras is described herein below. In the preferred embodiment, the cameras 108-1 through 108-M are standard video cameras with frame capture hardware in desktop personal computers, or FireWire- and USB-based cameras that are well known in the art.
Fixed backgrounds 110-1 through 110-N preferably are positioned around the body opposite the cameras. These backgrounds 110-1 through 110-N provide a known surround in an image captured by the cameras 108-1 through 108-M that aids in identifying the edges of the body 102 in the image. The backgrounds 110-1 through 110-N may be neutral, black, white, or any color that would increase the contrast between the portion of the image that represents the background 110-1 through 110-N and the portion of the image that represents the body 102. Further, the backgrounds may be backlit to further increase this contrast. It is possible to perform one embodiment of the method of the present invention without fixed backgrounds. However, this is not preferred because of the increased complexity of the shape determination that results from having to subtract the background image from the image of the body 102.
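With a known, high-contrast fixed background, separating the body from the background can be as simple as thresholding each pixel against the background's known intensity. The sketch below is illustrative only; the grayscale nested-list image representation and the threshold value are assumptions for the example.

```python
def silhouette_mask(image, background, threshold=30):
    """Mark pixels that differ from the known background by more than
    `threshold` intensity levels as belonging to the body 102."""
    return [[abs(px - bg) > threshold
             for px, bg in zip(img_row, bg_row)]
            for img_row, bg_row in zip(image, background)]

# Toy 1x4 example: two bright body pixels against a dark backdrop.
bg   = [[10, 10, 10, 10]]
img  = [[10, 200, 210, 12]]
mask = silhouette_mask(img, bg)   # [[False, True, True, False]]
```

The resulting per-view masks are the silhouettes from which the composite bounding volume is estimated; backlighting the backgrounds widens the intensity gap and makes the threshold choice less critical.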
The navigation computer 104 processes the images captured by the cameras 108-1 through 108-M. The navigation computer 104 may make use of a body database 130 populated with shape information regarding typical bodies that the navigation computer 104 may have to identify. The shape information of a body in the body database 130 is preferably coordinates of vertex points of the body as are typically available from a computer aided design system. The navigation computer 104 develops one or more comparison metrics by comparing the bounding volume estimated from processing the images from the cameras 108-1 through 108-M to the shape information that is stored in the body database 130. If the shape information for one of the bodies in the body database 130 is found to be highly correlated with the estimated bounding volume, the navigation computer may use the shape information for the body to refine the estimated bounding volume. For example, the navigation computer may develop a comparison metric by analyzing the distances between each vertex of the estimated bounding volume and a corresponding vertex stored as part of the shape information for a body in the body database 130. Another example of a comparison metric is the result of comparing the inertia moment axes of the estimated bounding volume with those of a body in the body database 130. Additional comparison metrics are known to those familiar with the art. A preferred embodiment uses a plurality of comparison metrics to determine the degree of correlation between the estimated bounding volume and a body stored in the body database 130.
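The vertex-distance comparison metric described above can be sketched as a mean Euclidean distance over corresponding vertices. This illustrative sketch, not part of the disclosed embodiments, assumes a vertex correspondence has already been established; positional ordering stands in for that correspondence here.

```python
import math

def vertex_distance_metric(estimated, reference):
    """Mean Euclidean distance between corresponding vertices of the
    estimated bounding volume and a database shape (lower = better match)."""
    dists = [math.dist(a, b) for a, b in zip(estimated, reference)]
    return sum(dists) / len(dists)

# Toy example: an estimated volume offset from the stored shape
# by exactly 1 unit along x, giving a metric of 1.0.
stored    = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
estimated = [(1, 0, 0), (2, 0, 0), (1, 1, 0), (1, 0, 1)]
score = vertex_distance_metric(estimated, stored)   # 1.0
```

A low score across several such metrics (vertex distances, inertia moment axes, and the like) signals a highly correlated database body whose shape information can refine the estimated bounding volume.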
It is not necessary for the cameras 108-1 through 108-M to image the entire body. Only the portion of the body 102 that is of interest needs to be imaged by the cameras 108-1 through 108-M. Furthermore, the body 102 and the cameras 108-1 through 108-M are preferably positioned with respect to one another so that the field of view of each camera captures approximately the same parts of the body.
One embodiment calibrates the coordinate systems 122-1 through 122-M of the cameras 108-1 through 108-M of the shape characterization system 100, with respect to each other and with respect to the coordinate system 112 of the computer navigation system, through the use of a calibration body of exactly known shape.
The shadow sensing devices can be calibrated using a body of known shape and dimension. A representative body 700 that can be used for calibration is depicted in
The algorithms used to estimate the shape of the body 102 can be any of those well known and used in the field of computer graphics. Such algorithms are described in publications used in the field, such as Computer Graphics: Principles and Practice, by James D. Foley, et al. (Addison-Wesley, 1990), which is incorporated herein by reference. Once the shape of the body 102 is determined, the system can then determine the location of the tip 126.
If at least two sensing devices (either cameras 108 or shadow sensing devices 904) are used, then the emitter 124 and the position tracking device 106 are not necessary, because the image of the body (or the shadow of the body) for one of the multiple devices provides information about the relative position of the body 102 with respect to the other devices. This information can be used to deduce the position of the body 102 with respect to the coordinate system 112 of the navigation computer 104 by, for example, stereographically determining multiple homologous point pairs in at least two camera views of the body 102. This is possible because the position of the sensing devices (either 108 or 904) with respect to the coordinate system 112 of the navigation computer 104 is known and tracked during operation of the shape characterization system, and a linear transformation can be used to map between the coordinate systems of the sensing devices 108 or 904 and the navigation computer 104.
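Recovering a 3-D position from a homologous point pair in two calibrated views amounts to intersecting, or nearly intersecting, two rays, one from each camera center through the matched image point. The following is an illustrative sketch of the classic midpoint method, not part of the disclosed embodiments; the camera positions and ray directions are assumptions for the example, and the directions need not be unit length.

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def triangulate_midpoint(o1, d1, o2, d2):
    """Given two rays (origin o, direction d), solve for the per-ray
    parameters of the closest approach, then return the midpoint of
    the shortest segment joining the rays. Rays must not be parallel."""
    w = tuple(p - q for p, q in zip(o1, o2))
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w), dot(d2, w)
    denom = a * c - b * b                 # zero only for parallel rays
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    p1 = tuple(o + s * di for o, di in zip(o1, d1))
    p2 = tuple(o + t * di for o, di in zip(o2, d2))
    return tuple((x + y) / 2 for x, y in zip(p1, p2))

# Two cameras at x = -1 and x = +1, each aimed at the point (0, 0, 5).
p = triangulate_midpoint((-1, 0, 0), (1, 0, 5), (1, 0, 0), (-1, 0, 5))
# p == (0.0, 0.0, 5.0)
```

With noisy image measurements the two rays rarely intersect exactly; the midpoint of their closest approach is the standard least-effort estimate, and repeating it over many homologous pairs fixes the body's position in the coordinate system 112.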
In addition, to further enhance the realism of the body 102 as it is displayed on a display monitor, coloration and/or texture can also optionally be created by known methods. In this case, one or more light sources 128 optionally can be simulated to shade the rendered view of the body 102 on a computer graphics screen.
Number | Name | Date | Kind |
---|---|---|---|
3942522 | Wilson | Mar 1976 | A |
4346717 | Haerten | Aug 1982 | A |
4370554 | Bohlen et al. | Jan 1983 | A |
4416019 | Weiss et al. | Nov 1983 | A |
4461016 | Weiss et al. | Jul 1984 | A |
4567896 | Barnea et al. | Feb 1986 | A |
4673352 | Hansen | Jun 1987 | A |
4722056 | Roberts et al. | Jan 1988 | A |
4757379 | Wright | Jul 1988 | A |
4836778 | Baumrind et al. | Jun 1989 | A |
4873651 | Raviv | Oct 1989 | A |
4908656 | Suwa et al. | Mar 1990 | A |
4972836 | Schenck et al. | Nov 1990 | A |
5050608 | Watanabe et al. | Sep 1991 | A |
5142930 | Allen et al. | Sep 1992 | A |
5155435 | Kaufman et al. | Oct 1992 | A |
5172331 | Yamada | Dec 1992 | A |
5186174 | Schlöndorff et al. | Feb 1993 | A |
5197476 | Nowacki et al. | Mar 1993 | A |
5198877 | Schulz | Mar 1993 | A |
5206893 | Hara | Apr 1993 | A |
5207681 | Ghadjar et al. | May 1993 | A |
5222499 | Allen et al. | Jun 1993 | A |
5230623 | Guthrie et al. | Jul 1993 | A |
5251127 | Raab | Oct 1993 | A |
5276337 | Starikov | Jan 1994 | A |
5299288 | Glassman et al. | Mar 1994 | A |
5305203 | Raab | Apr 1994 | A |
5309913 | Kormos et al. | May 1994 | A |
5365996 | Crook | Nov 1994 | A |
5383454 | Bucholz | Jan 1995 | A |
5389101 | Heilbrun et al. | Feb 1995 | A |
5393988 | Sakamoto | Feb 1995 | A |
5394875 | Lewis et al. | Mar 1995 | A |
5400428 | Grace | Mar 1995 | A |
5412811 | Hildenbrand et al. | May 1995 | A |
5419320 | Kawaguchi et al. | May 1995 | A |
5422491 | Sakamoto | Jun 1995 | A |
5447154 | Cinquin et al. | Sep 1995 | A |
5512946 | Murata et al. | Apr 1996 | A |
5517990 | Kalfas et al. | May 1996 | A |
5564437 | Bainville et al. | Oct 1996 | A |
5591207 | Coleman | Jan 1997 | A |
5617857 | Chader et al. | Apr 1997 | A |
5622170 | Schulz | Apr 1997 | A |
5637866 | Riener et al. | Jun 1997 | A |
5662111 | Cosman | Sep 1997 | A |
5676673 | Ferre et al. | Oct 1997 | A |
5682890 | Kormos et al. | Nov 1997 | A |
5690108 | Chakeres | Nov 1997 | A |
5697368 | Luber et al. | Dec 1997 | A |
5706811 | Takeda et al. | Jan 1998 | A |
5732703 | Kalfas et al. | Mar 1998 | A |
5740222 | Fujita et al. | Apr 1998 | A |
5748696 | Fujita et al. | May 1998 | A |
5772594 | Barrick | Jun 1998 | A |
5787886 | Kelly et al. | Aug 1998 | A |
5795294 | Luber et al. | Aug 1998 | A |
5797924 | Schulte et al. | Aug 1998 | A |
5807256 | Taguchi et al. | Sep 1998 | A |
5848126 | Fujita et al. | Dec 1998 | A |
5848967 | Cosman | Dec 1998 | A |
5851183 | Bucholz | Dec 1998 | A |
5855553 | Tajima et al. | Jan 1999 | A |
5876325 | Mizuno et al. | Mar 1999 | A |
5878103 | Sauer et al. | Mar 1999 | A |
5880846 | Hasman et al. | Mar 1999 | A |
5891034 | Bucholz | Apr 1999 | A |
5921992 | Costales et al. | Jul 1999 | A |
6006126 | Cosman | Dec 1999 | A |
6021343 | Foley et al. | Feb 2000 | A |
6081336 | Messner et al. | Jun 2000 | A |
6112113 | Van Der Brug et al. | Aug 2000 | A |
6167295 | Cosman | Dec 2000 | A |
6175415 | Pietrzak et al. | Jan 2001 | B1 |
6181815 | Marugame | Jan 2001 | B1 |
6226003 | Akeley | May 2001 | B1 |
6285902 | Kienzle, III et al. | Sep 2001 | B1 |
6301498 | Greenberg et al. | Oct 2001 | B1 |
6306126 | Moctezuma | Oct 2001 | B1 |
6317139 | Williams | Nov 2001 | B1 |
6356272 | Matsumoto et al. | Mar 2002 | B1 |
6442416 | Schultz | Aug 2002 | B1 |
6455835 | Bernardini et al. | Sep 2002 | B1 |
6512844 | Bouguet et al. | Jan 2003 | B2 |
6529192 | Waupotitsch | Mar 2003 | B1 |
6535219 | Marshall et al. | Mar 2003 | B1 |
6567156 | Kerner | May 2003 | B1 |
6592033 | Jennings et al. | Jul 2003 | B2 |
6662036 | Cosman | Dec 2003 | B2 |
6788062 | Schweikard et al. | Sep 2004 | B2 |
6788827 | Makram-Ebeid | Sep 2004 | B1 |
6792074 | Erbel et al. | Sep 2004 | B2 |
20030164953 | Bauch et al. | Sep 2003 | A1 |
20030195526 | Vilsmeier | Oct 2003 | A1 |
20040013305 | Brandt et al. | Jan 2004 | A1 |
20040170247 | Poole et al. | Sep 2004 | A1 |
20040170308 | Belykh et al. | Sep 2004 | A1 |
20040171922 | Rouet et al. | Sep 2004 | A1 |
20040175034 | Wiemker et al. | Sep 2004 | A1 |
20040181144 | Cinquin et al. | Sep 2004 | A1 |
20040181149 | Langlotz et al. | Sep 2004 | A1 |
20060036148 | Grimm | Feb 2006 | A1 |
Number | Date | Country |
---|---|---|
0 535 552 | Apr 1993 | EP |
0 501 812 | Sep 2002 | EP |
1 189 537 | Sep 2004 | EP |
1 340 470 | Sep 2004 | EP |
1 354 564 | Sep 2004 | EP |
05-111886 | Sep 1975 | JP |
55-73253 | Jun 1980 | JP |
55-81640 | Jun 1980 | JP |
55-81641 | Jun 1980 | JP |
55-94244 | Jul 1980 | JP |
55-110539 | Aug 1980 | JP |
56-45649 | Apr 1981 | JP |
57-021250 | Feb 1982 | JP |
57-122862 | Jul 1982 | JP |
57-195447 | Dec 1982 | JP |
7-53160 | Sep 1985 | JP |
60-185538 | Sep 1985 | JP |
61-25531 | Feb 1986 | JP |
61-31129 | Feb 1986 | JP |
61-73308 | May 1986 | JP |
62-057784 | Mar 1987 | JP |
63-53511 | Apr 1988 | JP |
63-59610 | Apr 1988 | JP |
01-236046 | Sep 1989 | JP |
01-245108 | Sep 1989 | JP |
01-288250 | Nov 1989 | JP |
03-032649 | Feb 1991 | JP |
03-057466 | Mar 1991 | JP |
3-73113 | Jul 1991 | JP |
03-155837 | Jul 1991 | JP |
03-193040 | Aug 1991 | JP |
03-210245 | Sep 1991 | JP |
04-161145 | Jun 1992 | JP |
05-007554 | Jan 1993 | JP |
5-8010 | Feb 1993 | JP |
05-049644 | Mar 1993 | JP |
05-184554 | Jul 1993 | JP |
06-038975 | Feb 1994 | JP |
06-019710 | Mar 1994 | JP |
06-063033 | Mar 1994 | JP |
06-149950 | May 1994 | JP |
06-205793 | Jul 1994 | JP |
06-251038 | Sep 1994 | JP |
07-194616 | Aug 1995 | JP |
07-236633 | Sep 1995 | JP |
07-255723 | Oct 1995 | JP |
07-303651 | Nov 1995 | JP |
07-308303 | Nov 1995 | JP |
07-313527 | Dec 1995 | JP |
07-323035 | Dec 1995 | JP |
07-328016 | Dec 1995 | JP |
08-010266 | Jan 1996 | JP |
08-024233 | Jan 1996 | JP |
08-038439 | Feb 1996 | JP |
08-038506 | Feb 1996 | JP |
08-038507 | Feb 1996 | JP |
08-107893 | Apr 1996 | JP |
08-112240 | May 1996 | JP |
08-150129 | Jun 1996 | JP |
08-173449 | Jul 1996 | JP |
08-215211 | Aug 1996 | JP |
08-224255 | Sep 1996 | JP |
08-238248 | Sep 1996 | JP |
08-238257 | Sep 1996 | JP |
08-275206 | Oct 1996 | JP |
09-019441 | Jan 1997 | JP |
26-00627 | Aug 1999 | JP |
WO 9611624 | Apr 1996 | WO |
WO 9632059 | Oct 1996 | WO |
WO 0004506 | Jan 2000 | WO |
WO 0100092 | Jan 2001 | WO |
WO 0111553 | Feb 2001 | WO |
Number | Date | Country | |
---|---|---|---|
20060241404 A1 | Oct 2006 | US |