1. Field of the Invention
The invention relates generally to the fields of dentistry and orthodontics and, more particularly, to subdividing a digital model of a patient's dentition.
Two-dimensional (2D) and three-dimensional (3D) digital image technology has recently been tapped as a tool to assist in dental and orthodontic treatment. Many treatment providers use some form of digital image technology to study the dentitions of patients. U.S. patent application Ser. No. 09/169,276, incorporated by reference above, describes the use of 2D and 3D image data in forming a digital model of a patient's dentition, including models of individual dentition components. Such models are useful, among other things, in developing an orthodontic treatment plan for the patient, as well as in creating one or more orthodontic appliances to implement the treatment plan.
The inventors have developed several computer automated techniques for subdividing, or segmenting, a digital dentition model into models of individual dentition components. These dentition components include, but are not limited to, tooth crowns, tooth roots, and gingival regions. The segmentation techniques include both human assisted and fully automated techniques. Some of the human assisted techniques allow a human user to provide “algorithmic hints” by identifying certain features in the digital dentition model. The identified features then serve as a basis for automated segmentation. Some techniques act on a volumetric 3D image model, or “voxel representation,” of the dentition, and other techniques act on a geometric 3D model, or “geometric representation.”
In one aspect, a computer implementing the invention receives a data set that forms a three-dimensional (3D) representation of the patient's dentition, applies a test to the data set to identify data elements that represent portions of the individual component, and creates a digital model of the individual component based upon the identified data elements. Some implementations require the computer to identify data elements that form one or more 2D cross-sections of the dentition in one or more 2D planes intersecting the dentition. In many of these embodiments, these 2D planes are roughly parallel to the dentition's occlusal plane. The computer analyzes the features of the 2D cross-sections to identify data elements that correspond to the individual component to be modeled. For example, one technique requires the computer to identify cusps in the 2D cross-sectional surface of the dentition, where the cusps represent the locations of an interproximal margin between teeth in the dentition. One variation of this technique allows the computer to confine its search for cusps in one 2D plane to areas in the vicinity of cusps already identified on another 2D plane. Another variation allows the computer to link cusps on adjacent 2D planes to form a solid surface representing the interproximal margin. Some embodiments allow the computer to receive input from a human user identifying the cusp locations in one or more of the 2D cross-sections.
Other embodiments require the computer to identify data elements that represent a structural core, or skeleton, of each individual component to be modeled. The computer creates the model by linking other data elements representing the individual component to the structural core.
In another aspect, a computer implementing the invention receives a three-dimensional (3D) data set representing the patient's dentition, applies a test to identify data elements that represent an interproximal margin between two teeth in the dentition, and applies another computer-implemented test to select data elements that lie on one side of the interproximal margin for inclusion in the digital model. Some implementations require the computer to identify data elements that form one or more 2D cross-sections of the dentition in one or more 2D planes intersecting the dentition roughly parallel to the dentition's occlusal plane.
In another aspect, a computer implementing the invention receives a 3D data set representing at least a portion of the patient's dentition, including at least a portion of a tooth and gum tissue surrounding the tooth; applies a test to identify data elements lying on a gingival boundary that occurs where the tooth and the gum tissue meet; and applies a test to the data elements lying on the boundary to identify other data elements representing portions of the tooth.
Other embodiments and advantages are apparent from the detailed description and the claims below.
U.S. patent application Ser. No. 09/169,276 describes techniques for generating a 3D digital model of a patient's dentition, including the crowns and roots of the patient's teeth as well as the surrounding gum tissue. One such technique involves creating a physical model of the dentition from a material such as plaster and then digitally imaging the model with a laser scanner or a destructive scanning system. The described techniques are used to produce a volumetric 3D image model (“volume element representation” or “voxel representation”) and a geometric 3D surface model (“geometric model”) of the dentition. The techniques described below act on one or both of these types of 3D dentition models. In creating a voxel representation, the physical model is usually embedded in a potting material that contrasts sharply with the color of the model to enhance detection of the dentition features. A white dentition model embedded in a black potting material provides the sharpest contrast. A wide variety of information is used to enhance the 3D model, including data taken from photographic images, 2D and 3D x-ray scans, computed tomography (CT) scans, and magnetic resonance imaging (MRI) scans of the patient's dentition.
Some computer-implemented techniques for segmenting a 3D dentition model into models of individual dentition components require a substantial amount of human interaction with the computer. One such technique is illustrated in the accompanying figures.
Another technique requiring substantial human interaction is likewise illustrated in the accompanying figures.
Other computer-implemented segmentation techniques require little or no human interaction during the segmentation process. One such technique is illustrated in the accompanying figures.
Once a skeleton has been identified, the computer uses the skeleton to divide the dentition model into 3D models of the individual teeth.
Once the branches are identified, the computer links other voxels in the model to the branches. The computer begins by identifying a reference voxel in each branch of the skeleton (step 144). For each reference voxel, the computer selects an adjacent voxel that does not lie on the skeleton (step 146). The computer then processes the selected voxel, determining whether the voxel lies outside of the dentition, i.e., whether the associated image value is above or below a particular threshold value (step 148); determining whether the voxel already is labeled as belonging to another tooth (step 150); and determining whether the voxel's distance measure is greater than the distance measure of the reference voxel (step 152). If none of these conditions is true, the computer labels the selected voxel as belonging to the same tooth as the reference voxel (step 154). The computer then repeats this test for all other voxels adjacent to the reference voxel (step 156). Upon testing all adjacent voxels, the computer selects one of the adjacent voxels as a new reference point, provided that the adjacent voxel is labeled as belonging to the same tooth, and then repeats the test above for each untested voxel that is adjacent to the new reference point. This process continues until all voxels in the dentition have been tested.
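The labeling procedure above amounts to a breadth-first region growing outward from the skeleton. The following Python sketch illustrates that idea under simplifying assumptions (a thresholded NumPy volume, a precomputed distance map, and 6-connected neighbors); all names are illustrative, and this is a sketch of the technique rather than the implementation described in the application.

```python
from collections import deque
import numpy as np

def grow_tooth_labels(volume, distance, seeds, threshold=128):
    """Label voxels by flood-filling outward from skeleton reference voxels.

    volume   : 3D array of image values (dentition voxels exceed threshold)
    distance : 3D array holding each voxel's distance measure
    seeds    : list of (tooth_label, (x, y, z)) skeleton reference voxels
    """
    labels = np.zeros(volume.shape, dtype=np.int32)
    queue = deque()
    for tooth, voxel in seeds:
        labels[voxel] = tooth
        queue.append((tooth, voxel))
    offsets = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    while queue:
        tooth, ref = queue.popleft()
        for off in offsets:
            v = tuple(r + o for r, o in zip(ref, off))
            if any(c < 0 or c >= s for c, s in zip(v, volume.shape)):
                continue
            if volume[v] <= threshold:        # voxel lies outside the dentition
                continue
            if labels[v]:                     # already belongs to another tooth
                continue
            if distance[v] > distance[ref]:   # moving away from this skeleton
                continue
            labels[v] = tooth
            queue.append((tooth, v))          # becomes a new reference voxel
    return labels
```

Rejecting voxels whose distance measure exceeds the reference voxel's is what plausibly keeps one tooth's label from leaking across an interproximal region toward a neighboring tooth's skeleton.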
The computer then calculates the rate of curvature (i.e., the derivative of the radius of curvature) at each voxel on the 2D cross-sectional surface 164 (step 176) and identifies all of the voxels at which local maxima in the rate of curvature occur (step 178). Each voxel at which a local maximum occurs represents a “cusp” in the 2D cross-sectional surface 164 and roughly coincides with an interproximal margin between teeth. In each 2D slice, the computer identifies pairs of these cusp voxels that correspond to the same interproximal margin (step 180), and the computer labels each pair to identify the interproximal margin with which it is associated (step 182). The computer then identifies the voxel pairs on all of the 2D slices that represent the same interproximal margins (step 184). For each interproximal margin, the computer fits a 3D surface 168 approximating the geometry of the interproximal margin among the associated voxel pairs (step 186).
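Steps 176–178 can be sketched roughly as follows, assuming the 2D cross-sectional surface is supplied as an ordered, closed contour of points. The discrete curvature estimate and the peak test are illustrative stand-ins, not the exact computation described above.

```python
import numpy as np

def cusp_candidates(contour):
    """Return indices of candidate cusp points on a closed, ordered 2D
    contour: local maxima in the rate of curvature (cf. steps 176-178)."""
    prev = np.roll(contour, 1, axis=0)
    nxt = np.roll(contour, -1, axis=0)
    v1 = contour - prev
    v2 = nxt - contour
    # Turning angle at each point, wrapped to (-pi, pi].
    turn = np.angle(np.exp(1j * (np.arctan2(v2[:, 1], v2[:, 0])
                                 - np.arctan2(v1[:, 1], v1[:, 0]))))
    curvature = turn / np.maximum(np.linalg.norm(v2, axis=1), 1e-9)
    # Discrete derivative of curvature along the contour.
    rate = np.abs(np.roll(curvature, -1) - curvature)
    n = len(rate)
    return [i for i in range(n)
            if rate[i] > rate[i - 1] and rate[i] >= rate[(i + 1) % n]]
```

On a contour with sharp indentations, the rate of curvature spikes where the surface turns abruptly, which is where the interproximal cusps lie.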
The computer eliminates these discontinuities by creating two new line segments 212, 214, each of which is bounded by one cusp voxel 202a–b, 204a–b from each original line segment 200, 206, as shown in the accompanying figures.
Automated segmentation is enhanced through a technique known as “seed cusp detection.” The term “seed cusp” refers to a location at which an interproximal margin meets the patient's gum tissue. In a volumetric representation of the patient's dentition, a seed cusp for a particular interproximal margin is found at the cusp voxel that lies closest to the gumline. By applying the seed cusp detection technique during the 2D slice analysis, the computer is able to identify all of the seed cusp voxels in the 3D model automatically.
In searching for the cusp voxels on the N+1 slice, the computer tests the image values for all voxels in the projected neighborhoods to identify those associated with the background image and those associated with the dentition (step 256). In the illustrated example, voxels in the background are black and voxels in the dentition are white. The computer identifies the cusp voxels on the N+1 slice by locating the pair of black voxels in the two neighborhoods that lie closest together (step 258). The computer then repeats this process for all remaining slices (step 259).
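The neighborhood-filtered search of steps 256–258 might be sketched as follows, assuming each slice is a 2D array with 0 (black) for background and 1 (white) for dentition. The function and parameter names are hypothetical.

```python
import numpy as np
from itertools import product

def track_cusps_to_next_slice(next_slice, cusp_a, cusp_b, radius=3):
    """Given the cusp pair found on slice N, locate the pair on slice N+1
    as the closest pair of background (black) voxels drawn one from each
    projected neighborhood (cf. steps 256-258).

    next_slice : 2D array, 0 = background (black), 1 = dentition (white)
    cusp_a, cusp_b : (row, col) cusp voxels detected on slice N
    radius : half-width of the square neighborhood projected onto slice N+1
    """
    def background_in_neighborhood(center):
        r0, c0 = center
        found = []
        for dr, dc in product(range(-radius, radius + 1), repeat=2):
            r, c = r0 + dr, c0 + dc
            if (0 <= r < next_slice.shape[0] and 0 <= c < next_slice.shape[1]
                    and next_slice[r, c] == 0):
                found.append((r, c))
        return found

    best, best_d2 = None, float("inf")
    # The new cusp pair is the closest pair of black voxels, one from each
    # projected neighborhood.
    for p in background_in_neighborhood(cusp_a):
        for q in background_in_neighborhood(cusp_b):
            d2 = (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
            if d2 < best_d2:
                best, best_d2 = (p, q), d2
    return best
```

Restricting the search to the projected neighborhoods keeps the per-slice work small and prevents the tracker from jumping to an unrelated interproximal margin.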
When applying the arch curve fitting technique, the computer begins by selecting a 2D slice (step 270) and identifying the voxels associated with the surface 262 of the cross-sectional arch 264 (step 272). The computer then defines a curve 260 that fits among the voxels on the surface 262 of the arch (step 274). The computer creates the curve using any of a variety of techniques, a few of which are discussed below. The computer then creates a series of line segments that are roughly perpendicular to the curve and are bounded by the cross-sectional surface 262 (step 276). The line segments are approximately evenly spaced with a spacing distance that depends upon the required resolution and the acceptable computing time. Greater resolution leads to more line segments and thus greater computing time. In general, a spacing on the order of 0.4 mm is sufficient in the initial pass of the arch curve fitting technique.
The computer calculates the length of each line segment (step 278) and then identifies those line segments that form local minima in length (step 280). These line segments roughly approximate the locations of the interproximal boundaries, and the computer labels the voxels that bound these segments as cusp voxels (step 282). The computer repeats this process for each of the 2D slices (step 284) and then uses the cusp voxels to define 3D cutting surfaces that approximate the interproximal margins.
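Steps 276–282 can be sketched as follows, assuming the fitted arch curve is supplied as a polyline of points inside a boolean dentition mask; the marching resolution and all names are illustrative assumptions.

```python
import numpy as np

def arch_segment_lengths(curve, mask, step=0.25, max_len=60.0):
    """Approximate the lengths of line segments perpendicular to the arch
    curve and bounded by the cross-sectional surface (cf. step 276).

    curve : (N, 2) distinct points (row, col) along the fitted arch curve
    mask  : 2D boolean array, True where a voxel belongs to the dentition
    """
    lengths = []
    for i in range(1, len(curve) - 1):
        tangent = curve[i + 1] - curve[i - 1]
        tangent = tangent / np.linalg.norm(tangent)
        normal = np.array([-tangent[1], tangent[0]])

        def march(sign):
            # Walk away from the curve until leaving the dentition.
            d = 0.0
            while d < max_len:
                p = curve[i] + sign * d * normal
                r, c = int(round(p[0])), int(round(p[1]))
                if not (0 <= r < mask.shape[0] and 0 <= c < mask.shape[1]):
                    break
                if not mask[r, c]:
                    break
                d += step
            return d

        lengths.append(march(+1.0) + march(-1.0))
    return lengths

def interproximal_indices(lengths):
    """Local minima in segment length approximate the interproximal
    boundaries (cf. steps 278-282)."""
    return [i for i in range(1, len(lengths) - 1)
            if lengths[i] < lengths[i - 1] and lengths[i] <= lengths[i + 1]]
```

The arch is narrowest where two teeth meet, so the shortest perpendicular segments fall at the interproximal margins.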
In some implementations, the computer refines the arch cusp determination by creating several additional sets of line segments, each centered around the arch cusps identified on the first pass. The line segments are spaced more narrowly on this pass to provide greater resolution in identifying the actual positions of the arch cusps.
The computer uses any of a variety of curve fitting techniques to create the curve through the arch. One technique involves the creation of a catenary curve with endpoints lying at the two ends 265, 267 of the arch.
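As an illustrative sketch of the catenary option just mentioned (not the prescribed method), a catenary-shaped curve between the two arch endpoints can be sampled from a cosh profile. The `sag` parameter is a hypothetical stand-in for choosing the catenary's free parameter.

```python
import numpy as np

def catenary_points(end_a, end_b, sag, n=50):
    """Sample a catenary-shaped curve between two arch endpoints.

    sag : how far the curve bows away from the straight chord at its midpoint
    """
    end_a = np.asarray(end_a, float)
    end_b = np.asarray(end_b, float)
    t = np.linspace(0.0, 1.0, n)
    chord = end_a + np.outer(t, end_b - end_a)      # straight chord
    # cosh profile rescaled to be 0 at both ends and 1 at the midpoint.
    u = 2.0 * t - 1.0
    profile = 2.0 - np.cosh(u * np.arccosh(2.0))
    # Unit normal to the chord; the bow is applied along this direction.
    d = end_b - end_a
    normal = np.array([-d[1], d[0]]) / np.linalg.norm(d)
    return chord + np.outer(sag * profile, normal)
```

A spline with user-adjustable control parameters would serve equally well here; the catenary merely gives a one-parameter family that is easy for a user to tune.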
In applying another technique for creating the curve, the computer first locates an end 265 of the arch (step 300) and creates a line segment 291 that passes through the arch 264 near this end 265 (step 301). The line segment 291 is bounded by voxels 292a–b lying on the surface of the arch. The computer then determines the midpoint 293 of the line segment 291 (step 302), selects a voxel 294 located a particular distance from the midpoint 293 (step 304), and creates a second line segment 295 that is parallel to the initial line segment 291 and that includes the selected voxel 294 (step 306). The computer then calculates the midpoint 296 of the second segment 295 (step 308) and rotates the second segment 295 to the orientation 295′ that gives the segment its minimum possible length (step 309). In some cases, the computer limits the second segment 295 to a predetermined amount of rotation (e.g., ±10°).
The computer then selects a voxel 297 located a particular distance from the midpoint 296 of the second segment 295 (step 310) and creates a third line segment 298 that is parallel to the second line segment 295 and that includes the selected voxel 297 (step 312). The computer calculates the midpoint 299 of the third segment 298 (step 314) and rotates the segment 298 to the orientation 298′ that gives the segment its shortest possible length (step 316). The computer continues adding line segments in this manner until the other end of the cross-sectional arch is reached (step 318). The computer then creates a curve that fits among the midpoints of the line segments (step 320) and uses this curve in applying the arch fitting technique described above.
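The rotate-to-minimum-length step (steps 309 and 316) might be sketched as a small angular search, limited to ±10° as suggested above. The sampling scheme and all names are assumptions made for illustration.

```python
import numpy as np

def shortest_chord_through(mask, midpoint, base_angle, max_rot=10.0, step=1.0):
    """Rotate a chord about `midpoint`, within ±max_rot degrees of
    base_angle, to the orientation giving its minimum length inside the
    cross-sectional arch (the rotation limit mirrors the ±10° in the text).

    mask : 2D boolean array, True inside the cross-sectional arch
    Returns (best_angle_degrees, best_length).
    """
    def chord_length(angle_deg):
        a = np.radians(angle_deg)
        direction = np.array([np.cos(a), np.sin(a)])

        def march(sign):
            # Walk from the midpoint until leaving the arch or the grid.
            t = 0.0
            while True:
                p = midpoint + sign * t * direction
                r, c = int(round(p[0])), int(round(p[1]))
                if not (0 <= r < mask.shape[0] and 0 <= c < mask.shape[1]):
                    return t
                if not mask[r, c]:
                    return t
                t += 0.25

        return march(+1.0) + march(-1.0)

    angles = np.arange(base_angle - max_rot, base_angle + max_rot + step, step)
    lengths = [chord_length(a) for a in angles]
    best = int(np.argmin(lengths))
    return float(angles[best]), lengths[best]
```

The segment snaps perpendicular to the local direction of the arch, so chaining these minimized segments traces the arch's centerline.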
In a related technique that uses planes instead of line segments, the computer first identifies one end of the arch in the dentition model (step 340). The computer then creates a vertical plane 330 through the arch near this end (step 342) and identifies the centerpoint 331 of the plane 330 (step 344). The computer then selects a voxel located a predetermined distance from the centerpoint (step 345) and creates a second plane 333 that is parallel to the initial plane and that includes the selected voxel (step 346). The computer calculates the midpoint of the second plane (step 348) and rotates the second plane about two orthogonal axes that intersect at the midpoint (step 350). The computer stops rotating the plane upon finding the orientation that yields the minimum cross-sectional area. In some cases, the computer limits the plane to a predetermined amount of rotation (e.g., ±10° about each axis). The computer then selects a voxel located a particular distance from the midpoint of the second plane (step 352) and creates a third plane that is parallel to the second plane and that includes the selected voxel (step 354). The computer calculates the midpoint of the third plane (step 356) and rotates the plane to the orientation that yields the smallest possible cross-sectional area (step 357). The computer continues adding and rotating planes in this manner until the other end of the arch is reached (step 358). The computer identifies the planes at which local minima in cross-sectional area occur and labels these planes as “interproximal planes,” which approximate the locations of the interproximal margins (step 360).
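The plane-rotation step might be approximated by a coarse grid search over rotations about two orthogonal axes, scoring each orientation by the number of dentition voxels within half a voxel of the plane as a proxy for cross-sectional area. Everything below is an illustrative approximation, not the described implementation.

```python
import numpy as np

def min_area_plane(volume, center, base_normal, max_rot=10.0, step=2.0):
    """Grid-search rotations of a cutting plane about two orthogonal axes
    through `center` (±max_rot degrees each, mirroring the ±10° limit) for
    the orientation with the smallest cross-section.

    volume : 3D boolean array, True inside the dentition
    Returns (voxel count, angle about axis 1, angle about axis 2).
    """
    pts = np.argwhere(volume).astype(float) - np.asarray(center, float)
    n0 = np.asarray(base_normal, float)
    n0 = n0 / np.linalg.norm(n0)
    # Two rotation axes orthogonal to the starting normal.
    a1 = np.cross(n0, [1.0, 0.0, 0.0])
    if np.linalg.norm(a1) < 1e-6:
        a1 = np.cross(n0, [0.0, 1.0, 0.0])
    a1 = a1 / np.linalg.norm(a1)
    a2 = np.cross(n0, a1)

    def rotate(v, axis, deg):
        # Rodrigues' rotation of vector v about a unit axis.
        a = np.radians(deg)
        return (v * np.cos(a) + np.cross(axis, v) * np.sin(a)
                + axis * np.dot(axis, v) * (1.0 - np.cos(a)))

    best = None
    for t1 in np.arange(-max_rot, max_rot + step, step):
        for t2 in np.arange(-max_rot, max_rot + step, step):
            normal = rotate(rotate(n0, a1, t1), a2, t2)
            # Count dentition voxels cut by the plane through `center`.
            area = int(np.sum(np.abs(pts @ normal) < 0.5))
            if best is None or area < best[0]:
                best = (area, float(t1), float(t2))
    return best
```

Scoring by cut-voxel count avoids constructing an explicit planar cross-section at each candidate orientation, at the cost of a voxel-scale approximation of area.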
One variation of this technique is also illustrated in the accompanying figures.
One technique is very similar to the neighborhood filtered cusp detection technique described above, in that voxel neighborhoods 388, 390 are defined on one of the 2D planes to focus the computer's search for cusps on an adjacent 2D plane. Upon detecting a pair of cusps 384, 386 on one 2D plane (step 400), the computer defines one or more neighborhoods 388, 390 to include a predetermined number of voxels surrounding the pair (step 402). The computer projects the neighborhoods onto an adjacent 2D plane by identifying the voxels on the adjacent plane that correspond to the voxels in the neighborhoods 388, 390 on the original plane (step 404). The computer then identifies the pair of black voxels that lie closest together in the two neighborhoods on the adjacent plane, labeling these voxels as lying in the cusp (step 406). The computer repeats this process for all remaining planes (step 408).
Many of these automated segmentation techniques are even more useful and efficient when used in conjunction with human-assisted techniques. For example, techniques that rely on the identification of the interproximal or gingival margins function more quickly and effectively when a human user first highlights the interproximal or gingival cusps in a graphical representation of the dentition model. One technique for receiving this type of information from the user is by displaying a 2D or 3D representation and allowing the user to highlight individual voxels in the display. Another technique allows the user to scroll through a series of 2D cross-sectional slices, identifying those voxels that represent key features such as interproximal or gingival cusps. Some of these techniques rely on user interface tools such as cursors and bounding-box markers.
In many instances, the computer creates proposals for segmenting the dentition model and then allows the user to select the best alternative. For example, one version of the arch curve fitting technique requires the computer to create a candidate catenary or spline curve, which the user is allowed to modify by manipulating the mathematical control parameters. Other techniques involve displaying several surfaces that are candidate cutting surfaces and allowing the user to select the appropriate surfaces.
Some implementations of the invention are realized in digital electronic circuitry, such as an application specific integrated circuit (ASIC); others are realized in computer hardware, firmware, and software, or in combinations of digital circuitry and computer components. The invention is usually embodied, at least in part, as a computer program tangibly stored in a machine-readable storage device for execution by a computer processor. In these situations, methods embodying the invention are performed when the processor executes instructions organized into program modules, operating on input data and generating output. Suitable processors include general and special purpose microprocessors, which generally receive instructions and data from read-only memory and/or random access memory devices. Storage devices that are suitable for tangibly embodying computer program instructions include all forms of non-volatile memory, including semiconductor memory devices, such as EPROM, EEPROM, and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM.
The invention has been described in terms of particular embodiments. Other embodiments are within the scope of the following claims.
The present application is a continuation of U.S. application Ser. No. 09/264,547, filed Mar. 8, 1999, now U.S. Pat. No. 7,063,532, which is a continuation-in-part of U.S. application Ser. No. 09/169,276, filed on Oct. 8, 1998, (now abandoned), and entitled “Computer Automated Development of an Orthodontic Treatment Plan and Appliance,” which is a continuation-in-part of PCT Application No. PCT/US98/12861, filed on Jun. 19, 1998, and entitled “Method and System for Incrementally Moving Teeth”, which is a continuation-in-part of U.S. application Ser. No. 08/947,080, filed on Oct. 8, 1997, and entitled “Method and System for Incrementally Moving Teeth” (now U.S. Pat. No. 5,975,893), which claims the benefit of U.S. Provisional Application No. 60/050,342, filed on Jun. 20, 1997, all of which are incorporated by reference into this application.
Number | Name | Date | Kind |
---|---|---|---|
2467432 | Kesling | Apr 1949 | A |
3407500 | Kesling | Oct 1968 | A |
3660900 | Andrews | May 1972 | A |
3683502 | Wallshein | Aug 1972 | A |
3860803 | Levine | Jan 1975 | A |
3916526 | Schudy | Nov 1975 | A |
3922786 | Lavin | Dec 1975 | A |
3950851 | Bergersen | Apr 1976 | A |
3983628 | Acevedo | Oct 1976 | A |
4014096 | Dellinger | Mar 1977 | A |
4195046 | Kesling | Mar 1980 | A |
4324546 | Heitlinger et al. | Apr 1982 | A |
4324547 | Arcan et al. | Apr 1982 | A |
4348178 | Kurz | Sep 1982 | A |
4478580 | Barrut | Oct 1984 | A |
4500294 | Lewis | Feb 1985 | A |
4504225 | Yoshii | Mar 1985 | A |
4505673 | Yoshii | Mar 1985 | A |
4526540 | Dellinger | Jul 1985 | A |
4575330 | Hull | Mar 1986 | A |
4575805 | Moermann et al. | Mar 1986 | A |
4591341 | Andrews | May 1986 | A |
4609349 | Cain | Sep 1986 | A |
4611288 | Duret et al. | Sep 1986 | A |
4656860 | Orthuber et al. | Apr 1987 | A |
4663720 | Duret et al. | May 1987 | A |
4664626 | Kesling | May 1987 | A |
4676747 | Kesling | Jun 1987 | A |
4742464 | Duret et al. | May 1988 | A |
4755139 | Abbatte et al. | Jul 1988 | A |
4763791 | Halverson et al. | Aug 1988 | A |
4793803 | Martz | Dec 1988 | A |
4798534 | Breads | Jan 1989 | A |
4836778 | Baumrind et al. | Jun 1989 | A |
4837732 | Brandestini et al. | Jun 1989 | A |
4850864 | Diamond | Jul 1989 | A |
4850865 | Napolitano | Jul 1989 | A |
4856991 | Breads et al. | Aug 1989 | A |
4877398 | Kesling | Oct 1989 | A |
4880380 | Martz | Nov 1989 | A |
4889238 | Batchelor | Dec 1989 | A |
4890608 | Steer | Jan 1990 | A |
4935635 | O'Harra | Jun 1990 | A |
4936862 | Walker et al. | Jun 1990 | A |
4937928 | van der Zel | Jul 1990 | A |
4941826 | Loran et al. | Jul 1990 | A |
4964770 | Steinbichler et al. | Oct 1990 | A |
4975052 | Spencer et al. | Dec 1990 | A |
4983334 | Adell | Jan 1991 | A |
5011405 | Lemchen | Apr 1991 | A |
5017133 | Miura | May 1991 | A |
5027281 | Rekow et al. | Jun 1991 | A |
5035613 | Breads et al. | Jul 1991 | A |
5055039 | Abbatte et al. | Oct 1991 | A |
5059118 | Breads et al. | Oct 1991 | A |
5100316 | Wildman | Mar 1992 | A |
5121333 | Riley et al. | Jun 1992 | A |
5125832 | Kesling | Jun 1992 | A |
5128870 | Erdman et al. | Jul 1992 | A |
5131843 | Hilgers et al. | Jul 1992 | A |
5131844 | Marinaccio et al. | Jul 1992 | A |
5139419 | Andreiko et al. | Aug 1992 | A |
5145364 | Martz et al. | Sep 1992 | A |
5176517 | Truax | Jan 1993 | A |
5184306 | Erdman et al. | Feb 1993 | A |
5186623 | Breads et al. | Feb 1993 | A |
5257203 | Riley et al. | Oct 1993 | A |
5273429 | Rekow et al. | Dec 1993 | A |
5278756 | Lemchen et al. | Jan 1994 | A |
5338198 | Wu et al. | Aug 1994 | A |
5340309 | Robertson | Aug 1994 | A |
5342202 | Deshayes | Aug 1994 | A |
5368478 | Andreiko et al. | Nov 1994 | A |
5382164 | Stern | Jan 1995 | A |
5395238 | Andreiko et al. | Mar 1995 | A |
5431562 | Andreiko et al. | Jul 1995 | A |
5440326 | Quinn | Aug 1995 | A |
5440496 | Andersson et al. | Aug 1995 | A |
5447432 | Andreiko et al. | Sep 1995 | A |
5452219 | Dehoff et al. | Sep 1995 | A |
5454717 | Andreiko et al. | Oct 1995 | A |
5456600 | Andreiko et al. | Oct 1995 | A |
5474448 | Andreiko et al. | Dec 1995 | A |
RE35169 | Lemchen et al. | Mar 1996 | E |
5518397 | Andreiko et al. | May 1996 | A |
5528735 | Strasnick et al. | Jun 1996 | A |
5533895 | Andreiko et al. | Jul 1996 | A |
5542842 | Andreiko et al. | Aug 1996 | A |
5549476 | Stern | Aug 1996 | A |
5562448 | Mushabac | Oct 1996 | A |
5587912 | Andersson et al. | Dec 1996 | A |
5605459 | Kuroda et al. | Feb 1997 | A |
5607305 | Andersson et al. | Mar 1997 | A |
5621648 | Crump | Apr 1997 | A |
5645420 | Bergersen | Jul 1997 | A |
5645421 | Slootsky | Jul 1997 | A |
5655653 | Chester | Aug 1997 | A |
5683243 | Andreiko et al. | Nov 1997 | A |
5692894 | Schwartz et al. | Dec 1997 | A |
5725376 | Poirier | Mar 1998 | A |
5725378 | Wang | Mar 1998 | A |
5733126 | Andersson et al. | Mar 1998 | A |
5740267 | Echerer et al. | Apr 1998 | A |
5742700 | Yoon et al. | Apr 1998 | A |
5799100 | Clarke et al. | Aug 1998 | A |
5800174 | Andersson | Sep 1998 | A |
5823778 | Schmitt et al. | Oct 1998 | A |
5848115 | Little et al. | Dec 1998 | A |
5857853 | van Nifterick et al. | Jan 1999 | A |
5866058 | Batchelder et al. | Feb 1999 | A |
5879158 | Doyle et al. | Mar 1999 | A |
5880961 | Crump | Mar 1999 | A |
5880962 | Andersson et al. | Mar 1999 | A |
5934288 | Avila et al. | Aug 1999 | A |
5957686 | Anthony | Sep 1999 | A |
5964587 | Sato | Oct 1999 | A |
5971754 | Sondhi et al. | Oct 1999 | A |
5975893 | Chishti et al. | Nov 1999 | A |
6015289 | Andreiko et al. | Jan 2000 | A |
6044309 | Honda | Mar 2000 | A |
6049743 | Baba | Apr 2000 | A |
6062861 | Andersson | May 2000 | A |
6068482 | Snow | May 2000 | A |
6099314 | Kopelman et al. | Aug 2000 | A |
6123544 | Cleary | Sep 2000 | A |
6152731 | Jordan et al. | Nov 2000 | A |
6183248 | Chishti et al. | Feb 2001 | B1 |
6190165 | Andreiko et al. | Feb 2001 | B1 |
6217334 | Hultgren | Apr 2001 | B1 |
6244861 | Andreiko et al. | Jun 2001 | B1 |
6309215 | Phan et al. | Oct 2001 | B1 |
6315553 | Sachdeva et al. | Nov 2001 | B1 |
6322359 | Jordan et al. | Nov 2001 | B1 |
6350120 | Sachdeva et al. | Feb 2002 | B1 |
6382975 | Poirier | May 2002 | B1 |
6398548 | Muhammad et al. | Jun 2002 | B1 |
6524101 | Phan et al. | Feb 2003 | B1 |
6554611 | Chishti et al. | Apr 2003 | B2 |
20020006597 | Andreiko et al. | Jan 2002 | A1 |
Number | Date | Country |
---|---|---|
3031677 | May 1979 | AU |
517102 | Jul 1981 | AU |
5598894 | Jun 1994 | AU |
1121955 | Apr 1982 | CA |
2749802 | May 1978 | DE |
69327661 | Jul 2000 | DE |
0091876 | Oct 1983 | EP |
0299490 | Jan 1989 | EP |
0376873 | Jul 1990 | EP |
0490848 | Jun 1992 | EP |
0667753 | Aug 1995 | EP |
0774933 | May 1997 | EP |
0541500 | Jun 1998 | EP |
0731673 | Sep 1998 | EP |
463897 | Jan 1980 | ES |
2369828 | Jun 1978 | FR |
2652256 | Mar 1991 | FR |
1550777 | Aug 1979 | GB |
53-058191 | May 1978 | JP |
04-028359 | Jan 1992 | JP |
08-508174 | Sep 1996 | JP |
WO 9008512 | Aug 1990 | WO |
WO 9104713 | Apr 1991 | WO |
WO 9410935 | May 1994 | WO |
WO 9832394 | Jul 1998 | WO |
WO 9844865 | Oct 1998 | WO |
WO 9858596 | Dec 1998 | WO |
Number | Date | Country | |
---|---|---|---|
20040175671 A1 | Sep 2004 | US |
Number | Date | Country | |
---|---|---|---|
60050342 | Jun 1997 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 09264547 | Mar 1999 | US |
Child | 10802124 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 09169276 | US | |
Child | 09264547 | US |