Smile treatment planning systems and methods

Information

  • Patent Grant
  • Patent Number
    12,121,411
  • Date Filed
    Tuesday, August 17, 2021
  • Date Issued
    Tuesday, October 22, 2024
Abstract
Smile treatment planning systems and methods are described herein. One method for adjusting an image of a smile may generally comprise receiving a three-dimensional (3D) digital model of a dental arch of a patient, receiving a digital facial image of the patient which includes an image of one or more teeth of the patient when smiling, registering the 3D digital model to the one or more teeth of the patient from the digital facial image, correcting the 3D digital model for scale and distortion to create a corrected 3D digital model, and overlaying the corrected 3D digital model onto the digital facial image.
Description
FIELD OF THE INVENTION

The present invention relates to methods and apparatus for orthodontics. More particularly, the present invention relates to methods and apparatus for orthodontic treatment planning of malocclusions and optimizing the corresponding smile of the patient with respect to the planned treatment.


BACKGROUND OF THE INVENTION

Orthodontics is a specialty of dentistry that is concerned with the study and treatment of malocclusions which can result from tooth irregularities, disproportionate facial skeleton relationships, or both. Orthodontics treats malocclusion through the displacement of teeth via bony remodeling and control and modification of facial growth.


This process has been accomplished by using a number of different approaches such as the application of static mechanical forces to induce bone remodeling, thereby enabling teeth to move. Devices such as braces use an archwire which interfaces with brackets affixed to each tooth. As the teeth respond to the pressure applied via the archwire by shifting their positions, the wires are tightened again to apply additional pressure. This widely accepted approach to treating malocclusions takes about twenty-four months on average to complete and is used to treat a number of different classifications of clinical malocclusion. Other treatments can also include the use of aligners which are positioned upon the teeth to effect the movement of one or more teeth.


However, corrections which are performed may result in a final arrangement of teeth which are straightened but which may or may not produce a corresponding smile which is aesthetically pleasing to the patient. This may be due to a number of factors such as a shifting of the facial features due to the teeth correction. Simply presenting a projected image of the corrected teeth positioning to the patient may not present the most accurate or aesthetically desirable smile which may correspond to the corrected dentition. Furthermore, it may be desirable to alter other factors relating to the patient's smile to achieve an aesthetically pleasing result. Accordingly, there exists a need for efficiently and effectively performing treatments for moving one or more teeth and optimizing a corresponding smile for presentation to the patient.


SUMMARY OF THE INVENTION

As part of the treatment planning, a three-dimensional (3D) digital scan of the patient's dental arch prior to treatment is typically obtained using any number of scanning methodologies and processes. This 3D scan of the dental arch may be used to generate an image of the patient's smile which results correspondingly from the correction treatment of the teeth positioning. The 3D model may be corrected via software, either automatically or manually, to adjust for any scale and/or distortion, and this corrected 3D model may then be overlaid onto the one or more facial photos. The 3D model may then be manipulated or adjusted in various ways to match a number of various features of the patient's anatomy. The visual image of the smile may be presented to the patient to demonstrate how their corresponding smile would appear after their teeth are corrected for malocclusions.


The image of the face of the patient may be adjusted for positioning using reference lines to allow for the user to reach a natural looking position. These reference lines and areas may be automatically detected upon the facial photo images and/or may be adjusted by the user in order to determine where the teeth of the patient are located upon the facial images.


With the 3D arch model initially overlaid upon the facial photo, the software may be used to highlight the 3D arch model and photo of the patient's teeth for registering the model to the image of the teeth. Various control features may be used upon the graphical user interface to control movement of the 3D arch model relative to the facial image to control fine movements of the model, e.g., linear and angular movement. A calibration process for auto-matching the 3D arch model to the photo image may be implemented in one method by utilizing a number of markers which are generated by the system and placed upon various landmarks of the patient's teeth both upon the 3D arch model and the photo image. Once the registration has been completed, the system may then replace the photo image with the 3D arch model in the facial image of the patient.
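The marker-based registration described above amounts to fitting a similarity transform between corresponding landmark points on the model and the photo. The sketch below is a hypothetical implementation using Umeyama's least-squares method (the patent does not specify a particular algorithm), with synthetic landmark coordinates standing in for real tooth-tip markers:

```python
import numpy as np

def fit_similarity_transform(model_pts, image_pts):
    """Least-squares similarity transform (scale s, rotation R, translation t)
    mapping 2D model landmarks onto image landmarks (Umeyama's method)."""
    src = np.asarray(model_pts, dtype=float)
    dst = np.asarray(image_pts, dtype=float)
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - mu_s, dst - mu_d
    cov = dst_c.T @ src_c / len(src)
    U, S, Vt = np.linalg.svd(cov)
    d = np.sign(np.linalg.det(U) * np.linalg.det(Vt))
    D = np.diag([1.0, d])               # guard against a reflection solution
    R = U @ D @ Vt
    s = np.trace(np.diag(S) @ D) / src.var(axis=0).sum()
    t = mu_d - s * R @ mu_s
    return s, R, t

# Synthetic check: six tooth-tip landmarks rotated, scaled, and shifted to
# simulate where the same features appear in the photo.
model = np.array([[0, 0], [10, 0], [20, 2], [30, 0], [40, 0], [20, -5]], dtype=float)
theta = np.deg2rad(5.0)
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
photo = 1.5 * model @ R_true.T + np.array([100.0, 200.0])

s, R, t = fit_similarity_transform(model, photo)
mapped = s * model @ R.T + t
print(np.allclose(mapped, photo))  # True
```

Recovering the scale term here also addresses the scale correction mentioned for the 3D model overlay.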


Once the registration has been completed so that the arch model is registered to the image of the teeth and the image has been replaced with the arch model, the color of the arch model may not match the actual color of the patient's teeth. The user may then select the color from the photo image and apply that color onto the 3D arch model. Additionally and/or alternatively, the color may be further adjusted to be darker or brighter depending upon the desired resulting image. Aside from adjusting the color of the teeth, the color of the gums on the 3D arch model may similarly be adjusted.


With the positioning and registration of the arch model matched to the facial image and with the color of the teeth and gums of the arch model also matched and corrected, the matched 3D arch model may be presented in the facial image and profile image.


Additional parameters of the 3D arch model may be adjusted to alter various features of the model to improve aesthetic features of the patient's smile. One method for adjusting aesthetic features may incorporate the use of a curve or arc which is generated from parameters of the patient's smile to create a “smile arc”. The parameters of the smile arc may be adjusted and the teeth of the patient (as well as other anatomical features) may be manipulated according to the smile arc being used as a guide for adjusting or improving the patient's smile.


The smile arc may be formed to have, e.g., five control points or locations, which may be adjusted and moved to allow for the curvature of the smile arc to be changed. The initial curvature of the smile arc may be obtained from the curvature of, e.g., the patient's lower lip, in order to be used as a guide for having the teeth follow the curvature of the lower lip to enhance the smile. The smile arc can be viewed with or without the frontal image depending upon the preference of the user. The control points may be moved simultaneously together or individually in order to create a symmetrical smile arc or asymmetrical smile arc based on the shape of the lower lip and the user's preferences.
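A five-control-point smile arc of the kind described can be sketched as a degree-four Bézier curve; the coordinates and the symmetric-adjustment helper below are illustrative, not taken from the patent:

```python
def bezier_point(ctrl, t):
    """Evaluate a Bezier curve at parameter t in [0, 1] using De Casteljau's
    algorithm; five control points give a degree-four curve."""
    pts = list(ctrl)
    while len(pts) > 1:
        pts = [((1 - t) * x0 + t * x1, (1 - t) * y0 + t * y1)
               for (x0, y0), (x1, y1) in zip(pts, pts[1:])]
    return pts[0]

# Illustrative smile arc: y grows downward as in image coordinates, so the
# curve dips at the center like a lower lip.
smile_arc = [(-20.0, 0.0), (-10.0, 4.0), (0.0, 6.0), (10.0, 4.0), (20.0, 0.0)]

def adjust_symmetric(ctrl, index, dy):
    """Move one control point and its mirror together, keeping the arc
    symmetric (control points may instead be moved individually for an
    asymmetric arc)."""
    ctrl = [list(p) for p in ctrl]
    mirror = len(ctrl) - 1 - index
    ctrl[index][1] += dy
    if mirror != index:
        ctrl[mirror][1] += dy
    return [tuple(p) for p in ctrl]

print(bezier_point(smile_arc, 0.5))  # (0.0, 4.25), the deepest point of the arc
```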


The smile arc may also be adjusted to move upward or downward relative to the patient's lower lip. As the smile arc is translated, the teeth shown in the arch model may be correlated to follow the location of the smile arc, e.g., by using the tips of the teeth (or an individual tooth) as well as the FACC lines as the indicator for the follow function to allow for the teeth movement. Also, the entire smile arc may be moved upwards and/or downwards while maintaining its curvature unchanged. This may allow for the user to adjust the treatment plan because, while the digital tooth movements may appear to be achievable, some or all of the teeth may not be movable clinically over the digitally specified long distances; furthermore, the gums may need to be reshaped, which the patient may or may not wish to have done. Hence, maintaining a curvature of the smile arc during its adjustment may allow for the smile arc to keep its shape for the smile without having to utilize such aggressive movements.
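The curvature-preserving translation and the follow behavior can be illustrated with a deliberately simplified arc model: a parabola y = a·x² + c, where a vertical shift changes only c (all names and coefficients here are hypothetical):

```python
# Hypothetical simplification of the smile arc as a parabola so that a
# vertical translation leaves the curvature term a*x**2 untouched.
a, c = -0.01, 5.0  # illustrative coefficients, not from the patent

def arc_y(x, dy=0.0):
    """Height of the smile arc at x, optionally translated vertically by dy."""
    return a * x * x + c + dy

# Snap each incisal tip vertically onto the arc translated down by 1.0 unit,
# mimicking the follow function that moves the teeth with the arc.
tooth_tips = [(-15.0, 3.2), (-5.0, 4.9), (5.0, 4.6), (15.0, 2.8)]
snapped = [(x, arc_y(x, dy=-1.0)) for x, _ in tooth_tips]
print(snapped)
```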


In some cases where the treatment may utilize brackets rather than aligners to effect the tooth movements, the smile arc may still be used as a guide for following the patient's smile. The 3D arch model may still incorporate the smile arc while preparing the 3D arch model for use with an indirect bonding (IDB) tray for the application of one or more brackets to the teeth.


A plane may be introduced into a 3D arch model which shows a final position of the corrected teeth after a bracket treatment to illustrate where the one or more brackets should be placed upon the teeth. This plane may represent a position of the brackets upon the teeth: as the correction treatment nears completion and the teeth are adjusted to their desired positions, the plane may function as a guide for keeping the brackets aligned relative to one another, since the bracket wire will become straightened near the end of a correction treatment.


Digitally, a treatment may be planned to bring the fully aligned brackets to the final stage where the teeth movements are completed. The teeth may then be digitally reverted back to their original pre-treatment positions to enable the user to see where the brackets should be placed at the outset of the treatment to achieve the final position of the teeth and the desired treatment plan.


Once any adjustments of the plane have been completed, rotation of the 3D arch model back to its front view may show the plane aligned in a horizontal orientation. With the plane suitably positioned, models of the brackets may be applied digitally along the plane and upon the teeth such that the wire receiving slot of each bracket is aligned with the plane so as to accommodate the arch wire which also becomes aligned with the plane at the completion of the bracket treatment.
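Aligning each bracket's wire-receiving slot with the plane is, geometrically, an orthogonal projection onto that plane. A minimal sketch (hypothetical coordinates; the patent does not describe the math):

```python
import numpy as np

def project_onto_plane(point, plane_point, plane_normal):
    """Orthogonally project a 3D point (e.g., a bracket's wire-slot center)
    onto the alignment plane so the arch wire lies flat at treatment end."""
    n = np.asarray(plane_normal, dtype=float)
    n = n / np.linalg.norm(n)
    p = np.asarray(point, dtype=float)
    return p - np.dot(p - np.asarray(plane_point, dtype=float), n) * n

# Slot centers scattered vertically; align them to the plane z = 2.
slots = [(0.0, 0.0, 2.5), (4.0, 1.0, 1.4), (8.0, 0.5, 2.2)]
aligned = [project_onto_plane(s, (0.0, 0.0, 2.0), (0.0, 0.0, 1.0)) for s in slots]
print(aligned)  # every z-coordinate moves onto the plane at z = 2.0
```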


With the brackets superimposed upon the 3D arch model, a distance from the pocket to the gumline and the distance from the pocket to the incisal edge may be measured in order to allow for the user to check and follow the guide for bracket placement. The brackets can also be moved freely when selected.


When the 3D arch model is reverted back to the initial pre-treatment stage, the brackets can be seen in their pre-treatment position for mounting upon the teeth. This may allow for the arch wire to be coupled through the wire receiving slot of each bracket for treatment.


Along with the positioning of the brackets, the smile arc may also be adjusted, as there may be occasions where the bracket cannot be placed clinically at the desired position because a tooth is too small or a region of the gums interferes. The 3D arch model could indicate that the bracket would need to be placed on the gums if the tooth or gum is not modified. For instance, a tooth may require lengthening with, e.g., a composite resin, or the gum may need to be shaved short to accommodate a bracket. In such a case, the smile arc may be adjusted by moving the arc upwards or downwards while still maintaining the same curvature to achieve the same smile.


In the event that the gums may need clinical adjustment, the gum line may be adjusted on the 3D arch model to mimic what the practitioner can potentially do with respect to, e.g., trimming the gums or applying a patch onto the gums to lengthen them. These results may be reflected in the arch model for presentation to the patient to show the patient what the expected clinical results may look like. In the event that a tooth or several teeth may need clinical adjustment, such as lengthening or reduction, another module may be introduced for adding geometry onto an identified tooth.


In the event that several teeth are to be lengthened, a mold such as an aligner-shaped device may be applied to the teeth. The shape of the mold with respect to the lengthened portions may be fabricated based upon the identified teeth and the shape of the extended teeth.


In addition to lengthening the teeth, another aligner-like device may be used for removing a portion of a tooth or several teeth. The aligner-like device may be fabricated with a portion of the aligner removed corresponding to the region of the tooth to be removed. The exposed portion of the tooth projecting from the aligner opening may be used as a reference guide to the user for removing this excess portion of the tooth.


Aside from the tooth extension or removal, yet another feature of the smile optimization process may include the adjustment of one or more facial features from the facial image. After the course of a correction treatment, the movement of one or more teeth may alter a number of facial features due to the repositioning of the underlying muscles and/or skin. The resulting smile of the patient may accordingly differ as well.


With the movement of the teeth known and the resulting teeth location, the areas likely to be affected are identified and the system may automatically adjust a position of the muscles and/or skin to alter the patient's facial features upon the image. The positions may also be manually adjusted by the user as well. The identified regions may be bounded where the facial regions may be freely moved within the bounds of the identified regions.
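The bounded free movement of identified facial regions can be sketched as a clamp on each region's control-point displacement; the function and coordinates below are illustrative only:

```python
def clamp_adjustment(point, displacement, bounds):
    """Apply a displacement to a facial-region control point but keep the
    result inside its bounded region (xmin, ymin, xmax, ymax), mirroring the
    bounded adjustment described. Names are illustrative, not from the patent."""
    xmin, ymin, xmax, ymax = bounds
    x = min(max(point[0] + displacement[0], xmin), xmax)
    y = min(max(point[1] + displacement[1], ymin), ymax)
    return (x, y)

# A displacement that would leave the region is clamped at its edge.
print(clamp_adjustment((5.0, 5.0), (10.0, -2.0), (0.0, 0.0, 8.0, 8.0)))  # (8.0, 3.0)
```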


In addition to the facial regions, the lips of the patient may be adjusted as well. A number of markers may be applied around each of the lip boundaries to allow for adjustment of the markers by the user. Depending upon the treatment, the upper lips and/or lower lips may be altered.


In yet another feature of the system for optimizing a patient's smile, a “smile score” may be generated to give the user and/or patient a relative scale indicating how optimized the resulting smile of the patient may appear. Factors such as the patient's smile arc, FACC line, width and height of the teeth, curvature of individual teeth, ABO score, etc., may be input into a smile score engine to automatically calculate the smile score. The user may alter any one of these input parameters to iteratively generate the corresponding smile score and, depending upon the results, the user may then implement one or more changes to further increase the corresponding smile score. The changes may then be optionally implemented by the user clinically to achieve an aesthetically pleasing smile.
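The patent lists the inputs to the smile score engine but not its formula, so the weights and normalization below are purely illustrative; the sketch shows how altering one input parameter iteratively changes the score:

```python
# Hypothetical smile-score engine: each factor is normalized to [0, 1] and
# combined with illustrative weights (not specified by the patent).
WEIGHTS = {
    "smile_arc_fit": 0.30,    # how closely incisal tips follow the smile arc
    "facc_alignment": 0.20,   # FACC line alignment
    "tooth_proportion": 0.20, # width/height proportionality of the teeth
    "tooth_curvature": 0.15,  # curvature of individual teeth
    "abo_score": 0.15,        # normalized ABO score
}

def smile_score(factors):
    """Weighted sum of normalized smile factors, scaled to 0-100."""
    return round(100 * sum(WEIGHTS[k] * v for k, v in factors.items()), 1)

before = {"smile_arc_fit": 0.55, "facc_alignment": 0.60, "tooth_proportion": 0.70,
          "tooth_curvature": 0.65, "abo_score": 0.50}
after = dict(before, smile_arc_fit=0.90)  # iteratively improving one parameter
print(smile_score(before), smile_score(after))
```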


Yet another feature optionally available through the system may include the generation of an animation of the patient's face. Such an animation can be video based, where the patient may be requested to maintain a natural head position while repeating one or more phrases while recorded. The recorded video may be altered to swap the patient's face with the facial image of the patient with the resulting smile from treatment. The patient may then be able to view the original video and altered video with the replaced arch model for comparison purposes.


While different features are discussed, the system may incorporate any number of different features into a single system in any number of combinations. A single system provided may, for example, include or incorporate every feature described herein or it may include a select number of features depending upon the desired system.


One method for adjusting an image of a smile may generally comprise receiving a three-dimensional (3D) digital model of a dental arch of a patient, receiving a digital facial image of the patient which includes an image of one or more teeth of the patient when smiling, registering the 3D digital model to the one or more teeth of the patient from the digital facial image, correcting the 3D digital model for scale and distortion to create a corrected 3D digital model, and overlaying the corrected 3D digital model onto the digital facial image.
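The method steps above can be sketched as a small pipeline; the function and callable names are illustrative stand-ins for the operations detailed elsewhere, not identifiers from the patent:

```python
def adjust_smile_image(arch_model, facial_image, register, correct, overlay):
    """Skeleton of the method flow: register the 3D model to the photographed
    teeth, correct it for scale and distortion, then overlay the corrected
    model onto the facial image."""
    registration = register(arch_model, facial_image)
    corrected = correct(arch_model, registration)
    return overlay(corrected, facial_image)

# Toy stand-ins showing the data flow through the pipeline steps.
result = adjust_smile_image(
    "arch_model", "facial_image",
    register=lambda model, image: {"model": model, "image": image},
    correct=lambda model, reg: f"corrected({model})",
    overlay=lambda model, image: f"{model} over {image}",
)
print(result)  # corrected(arch_model) over facial_image
```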


One method of adjusting a smile may generally comprise receiving a three-dimensional (3D) digital model of a dental arch of a patient, receiving a digital facial image of the patient which includes an image of one or more teeth of the patient when smiling, generating a smile curve or arc which corresponds to a curve or arc of a lower lip of the patient from the digital facial image, overlaying the smile curve or arc in proximity to the one or more teeth on the digital facial image, adjusting one or more parameters of the smile curve or arc, and manipulating one or more teeth from the 3D digital model according to the smile curve or arc.


One method of adjusting a facial image may generally comprise receiving a three-dimensional (3D) digital model of a dental arch of a patient, receiving a digital facial image of the patient which includes an image of one or more teeth of the patient when smiling, estimating a facial anatomy from the digital facial image of the patient, identifying one or more areas of the facial anatomy affected by a correction treatment of the one or more teeth, and adjusting the one or more areas of the facial anatomy corresponding to the correction treatment.


One method of improving a smile of a patient may generally comprise receiving a three-dimensional (3D) digital model of a dental arch of a patient, receiving a digital facial image of the patient which includes an image of one or more teeth of the patient when smiling, identifying one or more parameters relating to smile optimization, and generating a smile score based on the one or more parameters.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 shows a flow diagram of one variation of a method for determining and optimizing a smile of a patient corresponding to a treatment plan.



FIG. 2 shows a computer generated three-dimensional (3D) image of a patient's dental arch model obtained from a digital scan of the teeth.



FIGS. 3A and 3B show examples of images of a patient's face and profile taken by a device such as a digital camera or smartphone which may be used for generating the resulting smile of the patient after treatment.



FIG. 4 shows the various photo images of the patient which can be uploaded to a software program used for optimizing various parameters of the patient's smile.



FIG. 5 shows an image of an uploaded photo image of the patient with various indicator lines generated for determining various physical parameters of the patient.



FIGS. 6A to 6D show the patient's photo image being overlaid with the scanned 3D arch model in a corresponding manner.



FIGS. 7A to 7C show detail images for one variation of registering the teeth from the photo image to the 3D arch model.



FIG. 8 shows the photo image of the patient with the 3D arch model replacing the photo image of the patient's teeth.



FIGS. 9A to 9C show profile images of one variation of registering the teeth from the patient's profile photo image to the 3D arch model.



FIGS. 10A to 10C show various images of the patient image having the color of the teeth in the 3D arch model (here shown in the upper dentition) adjusted to match the color of the patient's actual teeth color.



FIG. 11 shows an image of the patient having the color of the gums in the 3D arch model adjusted to match the color of the patient's actual gums.



FIG. 12 shows an image of the patient's corrected teeth applied to the photo image.



FIG. 13 shows a flow diagram of one variation of a method for adjusting a patient's smile using a smile arc.



FIGS. 14A and 14B show how a smile arc is generated and adjusted for use in optimizing the patient's smile.



FIGS. 15A and 15B show an example of how the smile arc may be used for implementing brackets.



FIGS. 16A to 16D show perspective images of a wire plane positioned and superimposed through the 3D arch model with the position of the teeth corrected where it can be adjusted to function as a guide for the placement of brackets and the wire.



FIGS. 17A and 17B show front views of the corrected 3D arch model with the brackets applied.



FIGS. 18A to 18D show front views of the corrected 3D arch model and the initial positioning of the teeth prior to correction to view the location of the brackets and wire.



FIG. 19 shows a front view of the 3D arch model in which a position of the smile arc is adjustable depending upon any anatomical limitations.



FIG. 20 shows images of the patient having the brackets superimposed upon the 3D arch model.



FIGS. 21A and 21B show perspective views to illustrate where the gum line can be adjusted to allow for bracket placement or teeth movement.



FIGS. 22A and 22B show perspective views of another feature where the length or height of the teeth may be adjusted digitally.



FIGS. 23A to 23D show various views of how a tooth or multiple teeth may have their length or height adjusted.



FIG. 24 shows a front view of the 3D arch model to illustrate how a physical device such as an aligner can be used to determine the length to which a tooth or teeth may be adjusted.



FIG. 25 shows a flow diagram of one variation of a method for adjusting various facial anatomy features.



FIG. 26A shows a front view of the patient image illustrating the various areas of the patient face which may be adjustable by the software to account for changing facial anatomy when the tooth or teeth are corrected.



FIG. 26B shows a side view of the patient image having facial morphing features available.



FIGS. 27A and 27B show before and after images generated automatically or manually of how the patient's facial anatomy may change when the teeth are corrected.



FIGS. 28A and 28B show before and after images generated automatically or manually of how the patient's chin line may change when the teeth are corrected.



FIG. 29 shows a front view of how the patient's lips may be recognized by the computer for potential adjustment.



FIGS. 30A and 30B show front views of before and after images generated automatically or manually of how the patient's lips may change when the teeth are corrected.



FIG. 31 shows a flow diagram of one variation of a method for generating a smile score assigned to a patient.



FIG. 32 shows a front view of the digital model of the patient's dentition for illustrating a smile arc determination.



FIG. 33 shows an example of the patient's image with the digital model superimposed illustrating the smile arc.



FIGS. 34 and 35 show examples of the patient's image with the digital model superimposed illustrating the smile arc.



FIG. 36 shows a front view of the digital model of the patient's dentition for illustrating the incisor plane cant.



FIG. 37 shows an example of the patient's image with the digital model superimposed illustrating the incisor line.



FIG. 38 shows a front view of the digital model of the patient's dentition for illustrating the occlusal plane.



FIG. 39 shows an example of the patient's image with the digital model superimposed illustrating the occlusal line.



FIG. 40 shows a front view of the digital model of the patient's dentition for illustrating the max midline.



FIG. 41 shows an example of the patient's image with the digital model superimposed illustrating the max midline.



FIG. 42 shows a front view of the digital model of the patient's dentition for illustrating the max transverse display.



FIG. 43 shows an example of the patient's image with the digital model superimposed illustrating the max transverse display.



FIG. 44 shows a front view of the digital model of the patient's dentition for illustrating the cuspid inclination factor.



FIG. 45 shows a front view of the digital model of the patient's dentition for illustrating the curved lines and upright cuspid lines formed upon the teeth.



FIGS. 46 and 47 show examples of the patient's image with the digital model superimposed illustrating the upright cuspid lines.



FIG. 48 shows a front view of the digital model of the patient's dentition for illustrating the buccal segment inclination.



FIGS. 49 and 50 show examples of the patient's image with the digital model superimposed illustrating the buccal segment inclination.



FIG. 51 shows a front view of the digital model of the patient's dentition for illustrating the tooth proportionality.



FIG. 52 shows a front view of the digital model of the patient's dentition for illustrating the flow factor.



FIG. 53 shows a front view of the digital model of the patient's dentition for illustrating the flow factor.



FIG. 54 shows a front view of the digital model of the patient's dentition for illustrating the gingival display.



FIG. 55 shows an example of the patient's image with the digital model superimposed illustrating the gingival display.



FIG. 56 shows a side view of the digital model of the patient's dentition for illustrating the maxillary central inclination.



FIGS. 57 and 58 show examples of the patient's profile with the digital model superimposed illustrating the maxillary central inclination.



FIG. 59 shows a side view of the digital model of the patient's dentition for illustrating the COP factor.



FIGS. 60 and 61 show examples of the patient's profile with the digital model superimposed illustrating the COP factor.





DETAILED DESCRIPTION OF THE INVENTION

With treatment planning software, a treatment plan using aligners, brackets, etc. may be used to correct for any number of malocclusions with a patient's teeth. Particular treatment planning processes are described in further detail in U.S. Pat. Nos. 10,624,717; 10,335,250; 10,631,953; 10,357,336; 10,357,342; 10,588,723; 10,548,690, as well as U.S. Pat. Pubs. 2017/0100208; 2019/0321135; 2020/0205936; 2019/0343602; 2020/0170762; 2018/0078343; 2018/0078344; 2018/0078335; 2020/0146775. The details of these references are incorporated herein by reference in their entirety and for any purpose.


As part of the treatment planning, a three-dimensional (3D) digital scan of the patient's dental arch prior to treatment is typically obtained using any number of scanning methodologies and processes. This 3D scan of the dental arch may be used to generate an image of the patient's smile which results correspondingly from the correction treatment of the teeth positioning. FIG. 1 shows a flow diagram 10 of one variation of a method for determining and optimizing a smile of a patient corresponding to a treatment plan. The 3D model of the teeth 12 may be obtained and used by a computer to register the 3D model to one or more facial photos 14 showing different angles of the patient's face, e.g., front view, profile, etc., as well as different angles of the patient's teeth.


The 3D model may be corrected via the software either automatically or manually to adjust for any scale and/or distortion 16. The corrected 3D model may then be overlaid onto the one or more facial photos 18 and the 3D model may then be manipulated or adjusted in various ways (as described in further detail below) to match a number of various features 20 of the patient's anatomy. The visual image of the smile may be presented to the patient to demonstrate how their corresponding smile would appear after their teeth are corrected for malocclusions.



FIG. 2 shows an example of the 3D arch model 30 generated from a scan of the patient's dentition which may be used not only for the treatment planning process but for creating a smile case in order to optimize various parameters of the patient's smile for presentation to the patient using the processes described herein. With the scanned 3D arch model of the patient, various photo images of the patient may also be taken via a digital camera or a smartphone having a digital imager which may be used to take a front image of the patient's face, as shown in FIG. 3A, as well as a profile image both in a smiling pose and a resting pose, as shown in FIG. 3B. These images may be used directly upon a computer or other processing device such as a smartphone, tablet, etc. with the appropriate software.


As further shown in FIG. 3A, the image of the face of the patient may be adjusted for positioning using reference lines, as well as an oval (as shown), to allow for the user to reach a natural-looking position. These reference lines and areas may be automatically detected upon the facial photo images and/or may be adjusted by the user in order to determine where the teeth of the patient are located upon the facial images. Alternatively, the reference lines on the photo taking application (for example, on a smartphone, tablet, etc.) may be manually adjusted as well. The reference lines provided may be for the benefit of the practitioner viewing the image to have a better sense of the facial structures of the patient. During automatic detection, the various lines, such as lip lines, may also be generated so that editing various features, such as cropping out the teeth area for replacement with the digital 3D teeth model and filling any remaining areas with a color that matches the inside of the mouth, may be performed. Such editing may be done for both the upper arches and/or lower arches as well.



FIG. 4 shows an example of multiple facial images of front views, profile, and detail profile views of the patient's mouth as well as photo images taken of the patient's teeth from front and side views as well as occlusal views of both the upper and lower teeth. These photo images may be uploaded to the software for processing.



FIG. 5 shows a front view of the patient's face in a photo which has been uploaded and where reference lines have been automatically generated upon the patient's face. Examples of the reference lines generated by the software are shown in a vertical line of symmetry 72 of the patient's face, a horizontal line 74 centered at a tip of the patient's nose, a horizontal line 76 between the eyes (specifically the irises) of the patient's face, as well as a horizontal line 78 between various teeth (e.g., canine teeth) to determine whether the teeth are symmetrical or canted, etc. Additional and/or alternative reference lines may be generated so long as the reference lines are used to determine various parameters relating to the facial features of the patient, such as distances and/or angles between the various features. Furthermore, while these reference lines may be automatically generated upon the facial photo, these lines may also be manually created or adjusted by the user to define the various parameters.
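Parameters such as symmetry or canting can be derived from these reference lines by comparing line angles. The sketch below uses hypothetical pixel coordinates for the inter-iris and inter-canine lines to compute a cant angle (an illustration only; the patent does not give this computation):

```python
import math

def line_angle(p1, p2):
    """Angle of the line p1 -> p2 relative to horizontal, in degrees."""
    return math.degrees(math.atan2(p2[1] - p1[1], p2[0] - p1[0]))

# Hypothetical landmark coordinates in image pixels (y grows downward).
iris_left, iris_right = (120.0, 200.0), (220.0, 202.0)
canine_left, canine_right = (140.0, 330.0), (200.0, 336.0)

# Cant of the inter-canine line relative to the inter-iris line; a value
# near zero suggests the teeth are level with the eyes rather than canted.
cant = line_angle(canine_left, canine_right) - line_angle(iris_left, iris_right)
print(round(cant, 2))
```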


With the reference lines created upon the facial photo, the 3D arch model 80 may be imported and initially overlaid upon the facial photo, as shown in FIG. 6A, and in FIG. 6B which illustrates the 3D arch model 82 highlighted upon the facial photo. With the 3D arch model 82 initially overlaid upon the facial photo, the software may be used to highlight 84 the 3D arch model 82 and photo of the patient's teeth for registering the model 82 to the image of the teeth. Various control features may be used upon the graphical user interface to control movement of the 3D arch model 82 relative to the facial image to control fine movements of the model 82, e.g., linear 86 and angular 88 movement, as shown in FIG. 6D.



FIG. 7A shows a detail view of the highlighted region showing a detailed close-up view of the 3D arch model and the image of the patient's teeth. A calibration process for auto-matching the 3D arch model to the photo image may be implemented in one method by utilizing a number of markers (shown here as dots) which are generated by the system and placed upon various landmarks of the patient's teeth both upon the 3D arch model and the photo image. In this example, a series of six markers 90 may be automatically placed upon the 3D arch model 96 at features that are located along the line of the facial axis of the clinical crown (FACC) where the FACC line crosses an occlusal edge, e.g., tip of the incisal edge, so that the marker is located at the center of the tip of the tooth. As a first marker 90 is highlighted upon the 3D arch model 96, the user may click on a first corresponding location 94 located on the photo image in order to align the two via the markers, as shown in FIG. 7B. A second marker 92 may then be highlighted upon the 3D arch model 96 allowing for the user to then click on a second location upon the photo image. Each subsequent marker on the 3D arch model 96 may allow for the user to click on a corresponding location within the photo image so that the user is guided in locating each corresponding position. This process may be repeated for six different markers, although fewer or more than six markers may be used for registering the model 96 to the image. Furthermore, this registration process may be performed automatically by the system rather than manually. Additionally, the model 96 may also be rotated at various angles and/or translated, e.g., with a moving step of 0.2 mm, for fine tuning purposes, as shown in FIG. 7C.
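The fine-tuning translation in fixed 0.2 mm steps can be sketched as a greedy search that nudges the overlaid model along x and y until the mean marker-to-marker distance stops improving. This is a hypothetical refinement pass with synthetic marker coordinates, not the patent's algorithm:

```python
import math

STEP = 0.2  # mm, matching the moving step described for fine tuning

def mean_error(model_pts, image_pts, offset):
    """Mean distance between shifted model markers and image markers."""
    ox, oy = offset
    return sum(math.hypot(mx + ox - ix, my + oy - iy)
               for (mx, my), (ix, iy) in zip(model_pts, image_pts)) / len(model_pts)

def refine_offset(model_pts, image_pts, max_iters=200):
    """Greedy coordinate search over 0.2 mm moves along x and y."""
    offset = (0.0, 0.0)
    for _ in range(max_iters):
        best = offset
        for dx, dy in ((STEP, 0.0), (-STEP, 0.0), (0.0, STEP), (0.0, -STEP)):
            cand = (offset[0] + dx, offset[1] + dy)
            if mean_error(model_pts, image_pts, cand) < mean_error(model_pts, image_pts, best):
                best = cand
        if best == offset:  # no 0.2 mm move improves the fit
            break
        offset = best
    return offset

model = [(0.0, 0.0), (5.0, 1.0), (10.0, 0.0)]
image = [(1.0, -0.6), (6.0, 0.4), (11.0, -0.6)]  # model shifted by (1.0, -0.6)
print(refine_offset(model, image))
```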


Once the registration has been completed, the system may then replace the photo image with the 3D arch model in the facial image of the patient, as shown in FIG. 8. In this example, the 3D arch model 100 is shown in place of the upper arch of the patient upon the facial image. In other variations, both the upper and lower arches may be utilized or just the lower arch instead may be used.


Once the front view of the 3D arch model 96 has been registered to the front view of the facial image, the profile view may be registered as well, as shown in the profile images 110 of FIG. 9A. FIG. 9B shows the profile view where the model 96 and the profile facial image are highlighted 98. FIG. 9C shows the detailed highlighted image with the arch model 96 superimposed upon the profile image, where the arch model 96 may be adjusted by translation and/or rotation to adjust its positioning relative to the profile image. For the profile view, since relatively little key point information is available, the use of registration markers may optionally be omitted so that the user may manually adjust a position of the model 96 to match the profile view.


Once the registration has been completed so that the arch model is registered to the image of the teeth and the image has been replaced with the arch model, the color of the arch model may not match the actual color of the patient's teeth. FIG. 10A shows a front image where the arch model has replaced the teeth image and where the arch model is shown with an initial color 120. The user may then select the color from the photo image and apply that color 122 onto the 3D arch model, as shown in FIG. 10B. Additionally and/or alternatively, the color 124 may be further adjusted to be darker or brighter depending upon the desired resulting image, as shown in FIG. 10C.


Aside from adjusting the color of the teeth, the color of the gums on the 3D arch model may similarly be adjusted. FIG. 11 illustrates how the color of the gums from the facial image may be applied upon the gums 130 of the arch model to result in a realistic image of the patient with the 3D arch model shown. Global color adjustments may also be made in the event the practitioner wants to view the gums and teeth in their unadjusted color, since different viewing platforms (e.g., different monitors, screens, etc.) may present slightly different colors. Such a global setting may enable the practitioner to select certain colors from the software including, for example, (1) darker/brighter/warmer/colder colors, (2) standard teeth and gum shades, (3) RGB values calibrated for a particular monitor or screen so that colors are kept consistent for future cases, etc.
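The color-matching step above can be illustrated with a minimal sketch: an average RGB color is sampled from pixels of the photo's tooth or gum region and then optionally scaled darker or brighter. The helper names are hypothetical; the system's actual color pipeline is not specified here.

```python
def average_color(pixels):
    """Mean RGB of pixels sampled from the tooth (or gum) region of the photo.
    Each pixel is an (r, g, b) tuple of 0-255 integers."""
    n = len(pixels)
    return tuple(round(sum(p[i] for p in pixels) / n) for i in range(3))

def adjust_brightness(rgb, factor):
    """Scale an RGB color darker (factor < 1.0) or brighter (factor > 1.0),
    clamping each channel to the 0-255 range."""
    return tuple(min(255, max(0, round(ch * factor))) for ch in rgb)
```

The sampled average would then be applied as the base material color of the 3D arch model's teeth or gums.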


With the positioning and registration of the arch model matched to the facial image and with the color of the teeth and gums of the arch model also matched and corrected, the matched 3D arch model 140 may be presented in the facial image and profile image, as shown in FIG. 12.


Additional parameters of the 3D arch model may be adjusted to alter various features of the model to improve aesthetic features of the patient's smile. One method for adjusting aesthetic features may incorporate the use of a curve or arc which is generated from parameters of the patient's smile to create a “smile arc”. FIG. 13 shows a flow diagram 150 for a process where the smile arc may be initially generated based upon patient features such as the curve or arc of the patient's lower lip when they smile 152. The parameters of the smile arc may be adjusted 154 and the teeth of the patient (as well as other anatomical features) may be manipulated with the smile arc used as a guide 156 for adjusting or improving the patient's smile.



FIG. 14A shows a front facial image of the patient where the generated smile arc 160 is superimposed upon the 3D arch model of the facial image. The smile arc 160 may be formed to have, e.g., five control points or locations 164, which may be adjusted and moved to allow for the curvature of the smile arc 160 to be changed. The initial curvature of the smile arc 160 may be obtained from the curvature of, e.g., the patient's lower lip 162, in order to be used as a guide for having the teeth follow the curvature of the lower lip to enhance the smile. The smile arc 160 can be viewed with or without the frontal image depending upon the preference of the user. The control points 164 may be moved simultaneously together or individually in order to create a symmetrical smile arc 160 or asymmetrical smile arc 160 based on the shape of the lower lip 162 and the user's preferences.
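One plausible way to represent a smile arc with five movable control points, as described above, is a Bezier curve evaluated with De Casteljau's algorithm; the text does not specify the curve type, so this is an illustrative sketch only.

```python
def smile_arc_point(controls, t):
    """Evaluate a Bezier curve defined by the control points (e.g., the five
    smile arc control points) at parameter t in [0, 1] via De Casteljau's
    algorithm: repeatedly interpolate adjacent points until one remains."""
    pts = list(controls)
    while len(pts) > 1:
        pts = [((1 - t) * x0 + t * x1, (1 - t) * y0 + t * y1)
               for (x0, y0), (x1, y1) in zip(pts, pts[1:])]
    return pts[0]
```

Moving a single control point reshapes only part of the arc (allowing an asymmetrical curve), while translating all control points together moves the arc up or down without changing its curvature.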


As shown in the image of FIG. 14B, the smile arc 160 may also be adjusted to move upward or downward relative to the patient's lower lip 162. As the smile arc 160 is translated, the teeth shown in the arch model may be correlated to the location of the smile arc 160 either automatically (for example, via a single click, using a follow function which allows the teeth to move directly to where the smile arc is located) or manually, e.g., with the tips of the teeth (or of an individual tooth) as well as the FACC lines used as the indicators for the follow function to allow for the teeth movement. Also, the entire smile arc 160 may be moved upwards and/or downwards while maintaining its curvature unchanged. This may allow the user to adjust the treatment plan because, while the digital tooth movements may appear achievable, some or all of the teeth may not be clinically movable over the digitally specified long distances; furthermore, the gums may need to be reshaped, which the patient may or may not wish to have done. Hence, maintaining the curvature of the smile arc 160 during its adjustment may allow the smile arc to keep its shape for the smile without having to utilize such aggressive movements.
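The follow function described above can be sketched as computing, for each incisal tip, the vertical movement that places it on the smile arc, optionally flagging movements that exceed a clinically achievable limit. The names and the feasibility check are hypothetical simplifications of the considerations described above.

```python
def snap_tips_to_arc(tooth_tips, arc_y, max_move=None):
    """Per-tooth vertical movement bringing each incisal tip onto the smile arc.

    tooth_tips: mapping of tooth label -> (x, y) of the incisal tip / FACC point.
    arc_y: callable giving the smile arc height at a horizontal position x.
    max_move: optional clinical limit; larger movements are flagged infeasible.
    Returns a mapping of tooth label -> (vertical_movement, feasible).
    """
    moves = {}
    for tooth, (x, y) in tooth_tips.items():
        dy = arc_y(x) - y                     # signed vertical move to the arc
        feasible = max_move is None or abs(dy) <= max_move
        moves[tooth] = (dy, feasible)
    return moves
```

Teeth flagged infeasible would correspond to the cases above where the whole arc is instead translated, keeping its curvature, to avoid aggressive movements.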


In some cases where the treatment may utilize brackets rather than aligners to effect the tooth movements, the smile arc 160 may still be used as a guide for following the patient's smile. FIGS. 15A and 15B show an example where the 3D arch model may still incorporate the smile arc 160 while preparing the 3D arch model for use with an indirect bonding (IDB) tray for the application of one or more brackets to the teeth.


A plane 170 may be introduced into a 3D arch model which shows a final position of the corrected teeth after a bracket treatment to illustrate where the one or more brackets should be placed upon the teeth, as shown in the perspective view of FIG. 16A. This plane 170 may represent a position of the brackets upon the teeth: because the bracket wire becomes straightened near the end of a correction treatment, as the treatment nears completion and the teeth are adjusted to their desired positions, the plane 170 may function as a guide for keeping the bracket positions aligned relative to one another.


Digitally, a treatment may be planned to bring the fully aligned brackets to the final stage where the teeth movements are completed. The teeth may then be digitally reverted back to their original pre-treatment positions to enable the user to see where the brackets should be placed at the outset of the treatment to achieve the final position of the teeth and the desired treatment plan.


As further illustrated, the plane 170 may be adjusted through rotation relative to the 3D arch model, as shown in FIG. 16B, or the plane may be adjusted by a linear movement relative to the 3D arch model, as shown in FIG. 16C. Once any adjustments of the plane 170 have been completed, rotation of the 3D arch model back to its front view may show the plane 170 aligned in a horizontal orientation, as shown in FIG. 16D. With the plane 170 suitably positioned, models of the brackets 180 may be applied digitally along the plane 170 and upon the teeth such that the wire receiving slot 182 of each bracket 180 is aligned with the plane 170, as shown in FIGS. 17A and 17B, so as to accommodate the arch wire 184 which also becomes aligned with the plane 170 at the completion of the bracket treatment.


With the brackets 180 superimposed upon the 3D arch model, a distance from the pocket to the gumline and the distance from the pocket to the incisal edge may be measured, as indicated in FIGS. 18A and 18B, in order to allow for the user to check and follow the guide for bracket placement. The brackets 180 can also be moved freely when selected.


When the 3D arch model is reverted back to the initial pre-treatment stage, as shown in FIGS. 18C and 18D, the brackets 180 can be seen in their pre-treatment position for mounting upon the teeth. This may allow for the arch wire to be coupled through the wire receiving slot of each bracket for treatment.


Along with the positioning of the brackets, the smile arc 160 may also be adjusted, as shown in the front view of FIG. 19, as there may be occasions where a bracket cannot be placed clinically at the desired position because a tooth is too small or a region of the gums interferes. The 3D arch model could indicate that the bracket would be placed on the gums if the tooth or gum is not modified. For instance, a tooth may require lengthening with, e.g., a composite resin, or the gum may need to be shaved short to accommodate a bracket. In such a case, the smile arc 160 may be adjusted by moving the arc 160 upwards or downwards while still maintaining the same curvature to achieve the same smile.


With the addition of the brackets to the 3D arch model, the facial images of the patient with the arch model incorporated may be updated to include the brackets 180, as shown in FIG. 20. The modified image may then be presented to the patient for evaluation.


In the event that the gums may need clinical adjustment, the gum line 190 may be adjusted on the 3D arch model, as shown in FIG. 21A, to mimic what the practitioner can potentially do with respect to, e.g., trimming the gums or applying a patch onto the gums to lengthen them. These results may be reflected in the arch model for presentation to the patient to show what the expected clinical results may look like. A physical device can also be fabricated for clinical use as a gum adjustment reference. With the gum line adjusted, a clear aligner can be printed where the gum line can denote the cutting line 200 of the aligner edge, as shown in FIG. 21B. The cutting line 200 for such an aligner may be adjusted by the user.


In the event that a tooth or several teeth may need clinical adjustment, such as lengthening or reduction, another module may be introduced for adding geometry onto an identified tooth. As shown in the perspective view of FIG. 22A, the identified tooth 210 for lengthening is shown and the region 212 for lengthening is digitally identified as shown in FIG. 22B.


One example for lengthening a single tooth 220, such as a bicuspid, is illustrated showing how the composite material 222 may be applied upon the tooth 220 to lengthen it. A portion 224 of the added material may be removed, e.g., shaved down, to mimic a natural tooth, as shown in the front view of FIG. 23A.


In the event that several teeth are to be lengthened, a mold such as an aligner-shaped device may be applied to the teeth. FIGS. 23B and 23C show front views of an example where several teeth are to be lengthened with bonded composite 228 adhered via a bonding agent 226 to the natural teeth. The shape of the mold with respect to the lengthened portions may be fabricated based upon the identified teeth and the shape of the extended teeth.



FIG. 23D shows a side view of one variation of such a mold which may have a mold body 234 which is configured for placement over the teeth pre-treatment. The portion of the surface 232 to be extended for each respective tooth may be received into a molding channel 238 which is configured to have a shape corresponding to the lengthened portion of the tooth. The material, such as a composite resin in liquid form, may be introduced (e.g., up to 1 cc or more) into the mold through an opening 236 so that the resin enters into the molding channel 238 to form upon the tooth 232. The excess liquid may exit the molding channel 238 through an opening 240 located on an opposite side of the opening 236. The amount of material which is lengthened can be varied, e.g., anywhere up to 10 mm, while the amount of material physically removed can also be varied as well.


In addition to lengthening the teeth, another aligner-like device may be used for removing a portion of a tooth or several teeth. The aligner-like device may be fabricated with a portion of the aligner removed corresponding to the region of the tooth to be removed. FIG. 24 shows a front view of the 3D arch model illustrating a portion 250 of a tooth to be removed. The exposed portion 250 of the tooth projecting from the aligner opening may be used as a reference guide to the user for removing this excess portion of the tooth.


Aside from the tooth extension or removal, yet another feature of the smile optimization process may include the adjustment of one or more facial features from the facial image. Over the course of a correction treatment, the movement of one or more teeth may alter a number of facial features due to the repositioning of the underlying muscles and/or skin. The resulting smile of the patient may accordingly differ as well. FIG. 25 shows a flow diagram 260 of one method for digitally adjusting the facial anatomy on the facial images to produce an image which accurately represents the patient's resulting smile. The facial anatomy may be estimated to detect the muscle structures and skin areas 262. Once estimated, the areas of the face likely to be affected by the correction treatment are identified 264. The facial anatomy and/or skin areas may then be adjusted 266 either automatically by the software or manually by the user upon the facial images.



FIG. 26A shows a facial image, as described above, where the software may be used to estimate the facial anatomy by detecting muscle structures and skin areas such as the cheeks 270A, 270B, the regions adjacent to the mouth such as the perioral regions 272A, 272B, and the chin 274. The jawline 276 may also be identified and estimated. Other regions around the face may also be identified and estimated.



FIG. 26B shows a side view of the patient image having facial morphing features available. The software may optionally incorporate facial morphing by utilizing, e.g., one or more various markers 278 located on anatomical features to be adjusted accordingly. As shown, the patient image may integrate markers 278, e.g., along the nose, lips, cheeks, etc., for the purposes of morphing one or more of these features, if so desired.


With the movement of the teeth known and the resulting teeth location determined, the areas likely to be affected are identified and the system may automatically adjust a position of the muscles and/or skin to alter the patient's facial features upon the image. The positions may also be manually adjusted by the user. The identified regions may be bounded, as shown, where the facial regions may be freely moved within the bounds of the identified regions. FIGS. 27A and 27B show before and after images where the cheek regions 270A, 270B may be adjusted based on the resulting tooth movements to correlate the altered facial image to the treatment performed, resulting in a more accurate image of the patient, e.g., the cheek regions 270A, 270B may appear puffier due to a lifting of the underlying muscles from the tooth movements. FIGS. 28A and 28B show another example of before and after images where the jaw line 276 may be lengthened depending upon the treatment performed.


In addition to the facial regions, the lips of the patient may be adjusted as well. FIG. 29 shows a front view of the facial image where the upper lip 280 and lower lip 282 are detected and identified by the system so that the outlines of the lips are bounded 284, 286 by respective boundaries. A number of markers may be applied around each of the boundaries 284, 286 to allow for adjustment of the markers by the user. Depending upon the treatment, the upper lip 280 and/or lower lip 282 may be altered, as shown in the before and after facial images of FIGS. 30A and 30B.


In yet another feature of the system for optimizing a patient's smile, a “smile score” may be generated for the purpose of providing the user and/or patient with a relative scale indicating how optimized the resulting smile of the patient may appear. FIG. 31 shows a flow diagram 290 of one method for generating the smile score where a number of parameters may be initially input into the system 292. Factors such as the patient's smile arc, FACC line, width and height of the teeth, curvature of individual teeth, American Board of Orthodontics (ABO) score relating to a measurable digital model, etc., may be input into a smile score engine to automatically calculate the smile score 294. The user may alter any one of these input parameters to iteratively generate the corresponding smile score and, depending upon the results, the user may then implement one or more changes to further increase the corresponding smile score. The changes may then be optionally implemented by the user clinically to achieve an aesthetically pleasing smile.


In one variation, the smile score 294 may be comprised of multiple factors relating to a desirable smile and may be calculated by the following:







Smile Score=(Smile Arc)+(Incisor Plane Cant)+(Occlusal Plane Cant)+(Max Midline)+(Max Transverse Display)+(Cuspid Inclination)+(Buccal Segment Inclination)+(Tooth Proportionality)+(Flow)+(Gingival Display)+(Maxillary Central Inclination)+(COP)






Each of the individual factors shown above may be assigned a value of 1 to 5 (e.g., 1, 2, 3, 4, 5) in determining the smile score 294, where a maximum total value of 60 indicates a more aesthetically desirable smile and a lower value indicates a less aesthetically desirable smile. As noted above, one or more of these factors may be altered to iteratively generate the corresponding smile score and, depending upon the results, the user may then implement one or more changes to further increase the corresponding smile score. The changes may then be optionally implemented by the user clinically to achieve an aesthetically pleasing smile. Each of the factors is described in further detail below.
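As a sketch of the aggregation described above, the twelve factor values (each 1 to 5) may simply be summed to a maximum of 60. The factor keys below are hypothetical identifiers chosen for illustration.

```python
# Hypothetical identifiers for the twelve factors, each scored on the
# 1-5 scale described above (12 factors x 5 = maximum score of 60).
FACTORS = ("smile_arc", "incisor_plane_cant", "occlusal_plane_cant",
           "max_midline", "max_transverse_display", "cuspid_inclination",
           "buccal_segment_inclination", "tooth_proportionality", "flow",
           "gingival_display", "maxillary_central_inclination", "cop")

def smile_score(values):
    """Sum the twelve factor values (maximum 60). Raises ValueError if a
    factor is missing or falls outside the 1-5 range."""
    total = 0
    for name in FACTORS:
        v = values[name]
        if not 1 <= v <= 5:
            raise ValueError(f"{name} must be 1-5, got {v}")
        total += v
    return total
```

A user could recompute this sum after each adjustment of the input parameters to see whether a planned change raises or lowers the score.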


One such factor included in the calculation of the smile score 294 is a smile arc factor. As seen in the front view of the digital model 320 of the patient's dentition in FIG. 32, the smile arc 322 may be generated from a lower lip line 324 and the initial position of the tip of each tooth may be compared against the smile arc 322. While the curvature formed by the tips of the teeth is compared against the smile arc 322, the distance between the smile arc 322 and the lip line 324 is not necessarily considered. As shown, depending on the deviation between the tip of each tooth and the smile arc 322, a score value for the smile arc factor (shown in the chart of FIG. 32) may be used in the aggregate smile score calculation. For instance, if there is no deviation between the tips of the teeth and the smile arc 322, which is the ideal targeted value, a value of 5 may be assigned to the smile arc factor. A deviation of up to 1.5 or −1.5 may correlate to a smile arc factor of 3, while a deviation of up to 3.0 or −3.0 may correlate to a smile arc factor of 1.
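The chart's mapping from tip-to-arc deviation to a smile arc factor value can be sketched as follows. Millimeters are assumed as the unit, and behavior beyond ±3.0 is not specified in the text, so it is assumed here to remain at the lowest value.

```python
def smile_arc_value(max_deviation_mm):
    """Map the largest tip-to-arc deviation to the chart's 5/3/1 values:
    no deviation -> 5, within +/-1.5 -> 3, within +/-3.0 (and, by
    assumption, beyond) -> 1."""
    d = abs(max_deviation_mm)
    if d == 0:
        return 5
    if d <= 1.5:
        return 3
    return 1
```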



FIG. 33 illustrates an example of the patient's image with the digital model superimposed illustrating the smile arc 322 and the lip line 324 directly upon the image. The initial curve 330 formed by the connection between the tips of each tooth is shown for comparison against the smile arc 322 in determining the deviation of each tooth or several teeth.


For example, FIG. 34 illustrates an initial curve 330 generated from the tips of each tooth, and lip line 324 is shown for comparison. FIG. 35 illustrates how the positioning of the teeth, once corrected, may be adjusted to follow the smile arc 322.


Another factor which may be considered in the smile score calculation is an incisor plane cant (IPC) factor. As disclosed in FIG. 36, the incisal edges of the incisors I1, I2 are lined up to form a first horizontal incisor line 360 which is compared to a horizontal reference line 362. The number of degrees between the incisor line 360 and reference line 362 may be determined and, depending upon the difference, a value for the incisor plane cant may be assigned (shown in the chart of FIG. 36). For example, a difference of zero between the incisor line 360 and reference line 362, which is the targeted value, may result in an assigned value of 5. A difference between the incisor line 360 and reference line 362 of up to −3.0 degrees or 3.0 degrees may result in an assigned value of 3, while a difference of up to −5.0 degrees or 5.0 degrees may result in an assigned value of 1. These thresholds and points can be varied depending on the embodiment. A number of markers can be adjusted by a user to increase the smile score calculation.
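The same threshold pattern recurs for several cant-type factors, so it can be sketched as one generic helper. The default bands follow the incisor plane cant chart (exactly on target → 5, within ±3 degrees → 3, within ±5 degrees → 1); the behavior beyond the last band is not specified in the text and is assumed here to stay at the lowest value.

```python
def banded_value(measurement, bands=((0.0, 5), (3.0, 3), (5.0, 1))):
    """Map an absolute measurement (e.g., degrees of cant) to a factor value.
    bands is a sequence of (upper_limit, value) pairs in increasing order of
    limit; the first band whose limit covers the measurement wins."""
    m = abs(measurement)
    for limit, value in bands:
        if m <= limit:
            return value
    return bands[-1][1]  # beyond the last band: assumed to stay at the lowest value
```

The occlusal plane cant factor below uses the same default bands, while factors such as max midline (1.5 mm / 3.0 mm offsets) or cuspid inclination (3 / 6 degrees) could reuse the helper with different band limits.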



FIG. 37 illustrates an example of the patient's image with the digital model superimposed illustrating the incisor line 360 determined from the initial positioning of the incisors. The horizontal reference line 362 is shown superimposed for comparison against the incisor line 360 for determining the incisor plane cant value.


Yet another factor which may be considered in the smile score 294 calculation is an occlusal plane cant (OPC) factor. As disclosed in FIG. 38, an occlusal line 380 may be connected between the occlusal planes of each of the canine teeth C1, C2 for comparison against a horizontal reference line 382. Ideally, the occlusal line 380 should be level when compared to the horizontal reference line 382. For example, a difference of zero between the occlusal line 380 and reference line 382, which is the targeted value, may result in an assigned value of 5 (as shown in the chart of FIG. 38). A difference between the occlusal line 380 and reference line 382 of up to −3.0 degrees or 3.0 degrees may result in an assigned value of 3, while a difference of up to −5.0 degrees or 5.0 degrees may result in an assigned value of 1. These thresholds and points can be varied depending on the embodiment. A number of markers can be adjusted by a user to increase the smile score calculation.



FIG. 39 illustrates an example of the patient's image with the digital model superimposed illustrating the occlusal line 380 and horizontal reference line 382. This can be adjusted by a user to achieve a higher smile score. The horizontal reference line 382 is shown superimposed for comparison against the occlusal line 380 for determining the occlusal plane cant value.


A max midline factor may also be used to calculate a smile score 294. As disclosed in FIG. 40, a midline 400 between the front two incisors I1, I2 may be determined for comparison against a philtrum line 402 of the patient, where the philtrum line 402 is determined by the vertical groove between the base of the nose and the border of the upper lip. The deviation of the distance between the midline 400 and philtrum line 402 may be determined and a max midline value assigned based on the deviation from the target, which is the midline 400 and philtrum line 402 being coincident and parallel with one another. For example, if the midline 400 and philtrum line 402 are coincident and parallel, then a value of 5 may be assigned to the max midline (as shown in the chart of FIG. 40). However, an offset of up to 1.5 mm between the midline 400 and philtrum line 402 may result in a max midline value of 3 being assigned. Likewise, an offset of up to 3.0 mm may result in a max midline value of 1 being assigned. These thresholds and points can be varied depending on the embodiment. A number of markers can be adjusted by a user to increase the smile score calculation.



FIG. 41 illustrates an example of the patient's image with the digital model superimposed illustrating the application of the midline 400 and philtrum line 402 where the midline 400 and philtrum line 402 are offset by a larger deviation than an ideal target range. The deviation between the two may provide a max midline value for calculating the smile score.


A max transverse display (MTD) factor may also be used to calculate a smile score 294. As disclosed in FIG. 42, a number of teeth that are visible in an animated smile are determined, with the standard number of teeth visible being twelve teeth total, e.g., six teeth visible per side. A number of visible teeth in the smile are counted and compared against the target of six teeth per side. Depending on the number of teeth visible, a max transverse display value is assigned. For example, six teeth per side detected may result in a maximum value of 5 being assigned. Likewise, five teeth per side being detected may result in a value of 3 assigned, and four teeth per side being detected may result in a value of 1 assigned.



FIG. 43 illustrates an example of the patient's image with the digital model superimposed illustrating an example of the number of teeth per side being counted, with six per side being detected, giving a maximum point total for a max transverse display value under one embodiment.


A cuspid inclination factor may also be used to calculate a smile score 294. As disclosed in FIG. 44, upright cuspid lines are determined based on the vertical tangents of each cuspid/canine C1, C2 and are compared to a vertical line 440, 442, 444 to determine the vertical orientation of the cuspid/canine C1, C2. If the cuspid lines are parallel to the vertical line 440, 442, 444, which is the targeted value, the cuspid inclination is assigned a value of 5 (as shown in the chart of FIGS. 44 and 45). The larger the degree of discrepancy from the vertical line, the lower the value that is assigned. For instance, a difference of up to −3 degrees or 3 degrees results in a value of a cuspid inclination of 3 being assigned. Similarly, a difference of up to −6 degrees or 6 degrees results in a value of the cuspid inclination of 1 being assigned. These thresholds and points can be varied depending on the embodiment. A number of markers can be adjusted by a user to increase the smile score calculation.


The upright cuspid lines 452, 454 are determined by a process illustrated in FIG. 45, which shows how a series of curved lines 450 may be drawn from the center of the top or gingival edge of each tooth to the center of the bottom or occlusal edge of each tooth, including the cuspids, to create curved cuspid lines 452, 454. A tangential upright cuspid line 452, 454 may be determined relative to the curved lines 450 and these upright cuspid lines 452, 454 may be compared to the vertical line 440, 442, 444 to determine the cuspid inclination value, and they can be adjusted via markers by a user to increase the smile score calculation.



FIGS. 46 and 47 illustrate examples of the patient's image with the digital model superimposed, showing how an initial positioning of the patient's teeth reveals that the upright cuspid lines 452, 454 appear canted relative to the vertical lines 440, 442, indicating that the cuspid inclination value is off-target. In particular, FIG. 46 illustrates where the upright cuspid lines 452, 454 are far from parallel when compared to vertical lines 440, 442 while FIG. 47 illustrates how the corrected positioning of the teeth may change the curved cuspid lines 450, which in turn realigns the upright cuspid lines 452, 454 to be parallel with the vertical lines 440, 442. This creates a more desirable smile and increases the smile score by decreasing the degrees between the vertical lines 440, 442 and the upright cuspid lines 452, 454.


Another factor that may be included in a smile score 294 calculation is a buccal segment inclination (BSI) factor. As shown in FIG. 48, the #4 cuspids C4 and #5 cuspids C5 may each have a curved cuspid line formed upon each of these teeth, and a respective tangential upright line may be formed relative to each curved cuspid line. A first reference line 480, 482 which is tilted relative to a vertical line by, e.g., 1.5 degrees, may be used for comparison against the #4 cuspids C4, and a second reference line 484, 486 which is tilted relative to the vertical line by, e.g., 3 degrees, may be used for comparison against the #5 cuspids C5.



FIGS. 49 and 50 illustrate an example of the patient's image with the digital model superimposed illustrating the first reference lines 480, 482 each tilted at, e.g., 1.5 degrees, and second reference lines 484, 486 each tilted at, e.g., 3 degrees. Each of the first and second reference lines may, of course, be tilted at other angles depending upon the desired smile results. The curved cuspid lines may be seen formed upon each of the relevant teeth along with the corresponding tangential upright lines 490, 492 for each of the #4 cuspids and the corresponding tangential upright lines 494, 496 for each of the #5 cuspids. Comparison of the upright lines 490, 492 against the tilted first reference lines 480, 482 and comparison of the upright lines 494, 496 against the tilted second reference lines 484, 486 may each produce a resulting buccal segment inclination value where a zero degree difference may yield an assigned value of 5 (as shown in the chart of FIG. 48). A difference of up to −3 degrees or 3 degrees may yield an assigned value of 3, and likewise a difference of up to −5 degrees or 5 degrees may yield an assigned value of 1. The resulting buccal segment inclination may be used as one of the factors in determining the smile score 294. FIG. 50 illustrates how the upright lines 490, 492 relative to the tilted first reference lines 480, 482 and the upright lines 494, 496 relative to the tilted second reference lines 484, 486 may yield a higher value of the buccal segment inclination once the positioning of the teeth is corrected.


A tooth proportionality factor may also be included in calculating a smile score 294. As disclosed in FIG. 51, the tooth proportionality factor may be determined by using a recurring esthetic dental (RED) proportion which is calculated by dividing the width of each lateral incisor 480 by the width 484 of the adjacent central incisor 482 and multiplying the resulting number by 100. Alternatively, the tooth proportionality factor may also be determined using the Golden proportion, where the width 484 of the central incisor 482 is multiplied by 62% and compared with the width of the adjacent lateral incisor 480. Similar values indicate that the width 484 of the central incisor 482 is in golden proportion to the width of the lateral incisor 480.


In using the RED proportion, the ideal proportion may be within a targeted range of, for example, between 75-78%. A tooth proportionality of less than 68% or more than 81% may result in an assigned tooth proportionality value of 1. A tooth proportionality of between 68% and 72% or between 78% and 81% may result in an assigned tooth proportionality value of 3, while a tooth proportionality of between 72% and 78% may result in an assigned tooth proportionality value of 5 (as shown in the chart of FIG. 51). These thresholds and points can be varied depending on the embodiment. A number of markers can be adjusted by a user to increase the smile score calculation.
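The RED and Golden proportion calculations described above can be sketched as follows. Function names are hypothetical, and chart boundaries that fall exactly on a band edge are resolved toward the higher value here, which the text does not specify.

```python
def red_proportion(lateral_width, central_width):
    """Recurring Esthetic Dental (RED) proportion:
    lateral incisor width / central incisor width * 100."""
    return lateral_width / central_width * 100.0

def tooth_proportionality_value(red):
    """Chart values: 72-78% -> 5, 68-72% or 78-81% -> 3, otherwise -> 1."""
    if 72.0 <= red <= 78.0:
        return 5
    if 68.0 <= red <= 72.0 or 78.0 <= red <= 81.0:
        return 3
    return 1

def golden_proportion_target(central_width):
    """Golden-proportion check: the ideal lateral incisor width is ~62%
    of the central incisor width."""
    return 0.62 * central_width
```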


A flow factor may also be included in calculating a smile score 294. A number of different templates may be applied upon the teeth of the patient, as shown in FIG. 52, depending upon the type of desired results. These templates 520 may be applied upon the digital model of the patient's dentition for comparison against the initial fit of the teeth, as shown in FIG. 53, and differences 522 between the teeth and the template 520 may reveal that one or more of the teeth may require a coronoplasty for the addition or removal of material from the crown. If no teeth are shown to require any addition or removal of material relative to the template 520, a flow value of 5 may be assigned (as shown in the chart of FIG. 52). If one or two teeth show any issues relative to the template 520, then a flow value of 3 may be assigned; and if three or four teeth show any issues relative to the template 520, then a flow value of 1 may be assigned.
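The flow chart above reduces to a count of affected teeth. A minimal sketch, with an assumed fallback of 0 for five or more affected teeth (the chart does not assign a value beyond four):

```python
def flow_value(teeth_needing_coronoplasty: int) -> int:
    # Example chart: 0 teeth -> 5, 1-2 teeth -> 3, 3-4 teeth -> 1
    n = teeth_needing_coronoplasty
    if n == 0:
        return 5
    if n <= 2:
        return 3
    if n <= 4:
        return 1
    return 0  # five or more teeth; fallback assumed, not charted
```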


A gingival display (GD) factor may also be included in calculating a smile score 294. As disclosed in FIG. 54, the tips of all of the gums are connected and the distance between the resulting connection line and the upper lip is averaged 540 to calculate a mean deviation from a target value. The target distance can differ between men and women, with an example target of 2 mm for women and 1 mm for men. FIG. 55 illustrates an example of a patient's image with the digital model superimposed, illustrating the gum line 540 and upper lip edge 542. The distance between the tips of the gum line 540 and the upper lip edge 542 may be averaged and then compared against a target value to determine the deviation from the target (e.g., relative to a target distance of 2 mm for women and 1 mm for men). No deviation may result in a gingival display value of 5, while a deviation of between 0 and 4 may result in a gingival display value of 3, and a deviation of between −2 and 6 may result in a gingival display value of 1 (as shown in the chart of FIG. 54).
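One way to read the bands above (0 to 4, and −2 to 6, around a 2 mm target) is as symmetric windows of ±2 mm and ±4 mm around the target. That reading, the function name, and the out-of-band fallback are assumptions for this sketch:

```python
def gingival_display_value(mean_distance_mm: float, target_mm: float = 2.0) -> int:
    """Band the averaged gum-line-to-upper-lip distance against the target
    (e.g., 2 mm for women, 1 mm for men): exact match -> 5,
    within +/-2 mm -> 3, within +/-4 mm -> 1."""
    dev = abs(mean_distance_mm - target_mm)
    if dev == 0:
        return 5
    if dev <= 2.0:
        return 3
    if dev <= 4.0:
        return 1
    return 0  # outside the charted bands (assumed fallback)
```

For a male patient the same function would be called with `target_mm=1.0`.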


A maxillary central inclination factor may also be included in a smile score 294 calculation. As disclosed in FIG. 56, a facial surface line 560 may be determined based on a tangent of the buccal surfaces of the main incisors. The facial surface line 560 may then be compared to a true vertical line 562. A target value of zero between the facial surface line 560 and vertical line 562 may yield a maxillary central inclination value of 5. A difference of up to −5 degrees or up to 5 degrees between the two may yield a maxillary central inclination value of 3, and a difference of up to −10 degrees or up to 10 degrees between the two may yield a maxillary central inclination value of 1 (as shown in the chart of FIG. 56). These thresholds and points can be varied depending on the embodiment. A number of markers can be adjusted by a user to increase the smile score calculation.
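The maxillary central inclination bands follow the same pattern as the other angular factors. A minimal sketch (function name and out-of-band fallback are assumptions):

```python
def maxillary_central_inclination_value(facial_line_deg: float,
                                        true_vertical_deg: float = 0.0) -> int:
    # Example chart: 0 degrees from true vertical -> 5,
    # within +/-5 degrees -> 3, within +/-10 degrees -> 1
    diff = abs(facial_line_deg - true_vertical_deg)
    if diff == 0:
        return 5
    if diff <= 5:
        return 3
    if diff <= 10:
        return 1
    return 0  # outside the charted range (assumed fallback)
```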



FIGS. 57 and 58 illustrate an example of a patient's image with the digital model illustrating the difference between the facial surface line 560 and vertical line 562 prior to correction in FIG. 57, where a difference in the angles between the two may be seen, and post correction in FIG. 58. In FIG. 57, the facial surface line 560 is far from parallel to the true vertical line 562, which will result in a lower smile score. In FIG. 58, markers have been adjusted to change the projected facial surface 560. This creates a more desirable smile and increases the smile score by decreasing the angle between the facial surface line 560 and the true vertical line 562.


A COP factor may also be included in a smile score 294 calculation, where the COP line is the average line formed by the occlusal surfaces of the teeth, e.g., at least the teeth visible from a profile view of the patient. The COP line 592 may be compared against a true horizontal line 590, as shown in FIG. 59, and discrepancies between the COP line 592 and the true horizontal line may be determined.



FIGS. 60 and 61 illustrate an example of a patient's image with the digital model superimposed. As shown in FIG. 60 prior to correction, the COP line 592 may be determined and compared against the true horizontal line 590. Typically, the desired COP line 592 may be angled about 10 degrees relative to the horizontal line 590, so that a discrepancy of zero from this targeted angle may yield an assigned COP value of 5. An angle of down to 5 degrees or up to 15 degrees may yield an assigned COP value of 3, and an angle of down to 0 degrees or up to 20 degrees may yield an assigned COP value of 1 (as shown in the chart of FIG. 59). FIG. 61 shows the angle between the two lines 590, 592 at around 10 degrees post correction.
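Reading the COP bands (5 to 15 degrees, and 0 to 20 degrees) as symmetric windows around the roughly 10-degree target, the factor can be sketched like the other angular scores. The function name, the windowed interpretation, and the fallback are assumptions:

```python
def cop_value(cop_angle_deg: float, target_deg: float = 10.0) -> int:
    """Score the COP line's angle relative to true horizontal against the
    ~10-degree target: on target -> 5, within +/-5 degrees -> 3,
    within +/-10 degrees -> 1."""
    dev = abs(cop_angle_deg - target_deg)
    if dev == 0:
        return 5
    if dev <= 5:
        return 3
    if dev <= 10:
        return 1
    return 0  # outside the charted bands (assumed fallback)
```

A smile score 294 could then combine this with the other factor values (buccal segment inclination, tooth proportionality, flow, gingival display, and maxillary central inclination), e.g., by summation, though the patent leaves the exact aggregation open.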


Yet another feature optionally available through the system may include the generation of an animation of the patient's face. Such an animation can be video based, where the patient may be requested to maintain a natural head position and repeat one or more phrases while being recorded. The recorded video may be altered to swap the patient's face with the facial image of the patient showing the resulting smile from treatment. The patient may then be able to view the original video and the altered video with the replaced arch model for comparison purposes.


While different features are discussed, the system may incorporate any number of different features into a single system in any number of combinations. A single system provided may, for example, include or incorporate every feature described herein or it may include a select number of features depending upon the desired system.


The applications of the devices and methods discussed above are not limited to the one described but may include any number of further treatment applications. Modification of the above-described assemblies and methods for carrying out the invention, combinations between different variations as practicable, and variations of aspects of the invention that are obvious to those of skill in the art are intended to be within the scope of the claims.

Claims
  • 1. A method for adjusting an image of a smile, comprising: receiving a three-dimensional (3D) digital model of a dental arch of a patient; receiving a digital facial image of the patient which includes an image of one or more teeth of the patient when smiling; generating one or more reference lines relative to facial features of the patient for display upon the digital facial image; determining one or more parameters of the image of the one or more teeth of the patient relative to the one or more reference lines; registering the 3D digital model to the one or more teeth of the patient from the digital facial image; correcting the 3D digital model for scale and distortion to create a corrected 3D digital model; replacing the image of the one or more teeth with the corrected 3D digital model onto the digital facial image; and manipulating one or more teeth from the 3D digital model according to the one or more parameters obtained from the image of the one or more teeth and the one or more reference lines.
  • 2. The method of claim 1 wherein receiving the digital facial image comprises receiving a front view and a profile view of the patient.
  • 3. The method of claim 1 wherein receiving the digital facial image comprises determining a parameter relating to one or more facial features of the patient from the digital facial image.
  • 4. The method of claim 1 wherein registering the 3D digital model comprises registering one or more locations from the 3D digital model to one or more corresponding locations on the digital facial image.
  • 5. The method of claim 1 wherein registering the 3D digital model comprises registering the 3D digital model to a front facial image and a profile facial image of the patient.
  • 6. The method of claim 1 wherein correcting the 3D digital model comprises adjusting a color of one or more teeth from the 3D digital model to match a color of the one or more teeth from the digital facial image.
  • 7. The method of claim 1 wherein correcting the 3D digital model comprises adjusting a color of gums from the 3D digital model to match a color of gums from the digital facial image.
  • 8. A method of adjusting a smile, comprising: receiving a three-dimensional (3D) digital model of a dental arch of a patient; receiving a digital facial image of the patient which includes an image of one or more teeth of the patient when smiling; generating a smile curve or arc which corresponds to a curve or arc of a lower lip of the patient from the digital facial image; overlaying the smile curve or arc in proximity to the one or more teeth on the digital facial image; adjusting one or more parameters of the smile curve or arc; and manipulating one or more teeth from the 3D digital model according to the smile curve or arc.
  • 9. The method of claim 8 wherein adjusting one or more parameters comprises adjusting a relative position of the smile curve or arc relative to the one or more teeth on the digital facial image.
  • 10. The method of claim 8 further comprising overlaying a plane upon the one or more teeth of the digital facial image to determine a position of one or more brackets upon the one or more teeth.
  • 11. The method of claim 10 further comprising overlaying images of the one or more brackets upon the one or more teeth on the digital facial image in a corresponding manner.
  • 12. The method of claim 8 wherein manipulating one or more teeth comprises adjusting a length of the one or more teeth from the 3D digital model.
  • 13. A method of adjusting a facial image, comprising: receiving a three-dimensional (3D) digital model of a dental arch of a patient; receiving a digital facial image of the patient which includes an image of one or more teeth of the patient when smiling; estimating a facial anatomy from the digital facial image of the patient; identifying one or more areas of the facial anatomy affected by a correction treatment of the one or more teeth; adjusting the one or more areas of the facial anatomy corresponding to the correction treatment.
  • 14. The method of claim 13 wherein estimating the facial anatomy comprises estimating the facial anatomy in a cheek region, perioral region, chin region, or jawline of the digital facial image.
  • 15. The method of claim 13 wherein estimating the facial anatomy comprises presenting a boundary area around the one or more areas of the facial anatomy.
  • 16. The method of claim 13 wherein identifying one or more areas further comprises identifying an upper lip or lower lip from the digital facial image.
  • 17. A method of improving a smile of a patient, comprising: receiving a three-dimensional (3D) digital model of a dental arch of a patient; receiving a digital facial image of the patient which includes an image of one or more teeth of the patient when smiling; identifying one or more parameters relating to smile optimization; assigning a numerical value to each of the one or more parameters; and generating a smile score based on the numerical value assigned to each of the one or more parameters such that an increase in the smile score is indicative of an increase of an aesthetic smile.
  • 18. The method of claim 17 wherein the one or more parameters comprise a smile curve or arc, a line of a facial axis of a clinical crown, width or height of a tooth, or curvature of an individual tooth.
  • 19. The method of claim 17 further comprising altering the one or more parameters such that a corresponding smile score is generated.
  • 20. The method of claim 17 wherein the one or more parameters for generating the smile score is selected from the group consisting of smile arc, incisor plane cant, occlusal plane cant, max midline, max transverse display, cuspid inclination, buccal segment inclination, tooth proportionality, flow, gingival display, maxillary central inclination, and COP.
  • 21. A method for adjusting an image of a smile, comprising: receiving a three-dimensional (3D) digital model of a dental arch of a patient; receiving a digital facial image of the patient which includes an image of one or more teeth of the patient when smiling; registering the 3D digital model to the one or more teeth of the patient from the digital facial image; correcting the 3D digital model for scale and distortion to create a corrected 3D digital model; overlaying the corrected 3D digital model onto the digital facial image; generating a smile curve or arc which corresponds to a curve or arc of a lower lip of the patient when the lower lip is formed into a smile in the digital facial image; overlaying the smile curve or arc in proximity to the one or more teeth on the digital facial image; adjusting one or more parameters of the smile curve or arc; and manipulating one or more teeth from the 3D digital model according to the smile curve or arc.
  • 22. The method of claim 21 further comprising: estimating a facial anatomy from the digital facial image of the patient; identifying one or more areas of the facial anatomy affected by a correction treatment of the one or more teeth; adjusting the one or more areas of the facial anatomy corresponding to the correction treatment.
  • 23. The method of claim 21 further comprising: identifying one or more parameters relating to smile optimization; and generating a smile score based on the one or more parameters.
  • 24. The method of claim 23 further comprising altering one or more parameters to adjust the smile score.
  • 25. The method of claim 23 wherein the one or more parameters for generating the smile score is selected from the group consisting of smile arc, incisor plane cant, occlusal plane cant, max midline, max transverse display, cuspid inclination, buccal segment inclination, tooth proportionality, flow, gingival display, maxillary central inclination, and COP.
  • 26. A method for adjusting an image of a smile, comprising: receiving a three-dimensional (3D) digital model of a dental arch of a patient; receiving a digital facial image of the patient which includes an image of one or more teeth of the patient when smiling; registering the 3D digital model to the one or more teeth of the patient from the digital facial image; correcting the 3D digital model for scale and distortion to create a corrected 3D digital model; overlaying the corrected 3D digital model onto the digital facial image; estimating a facial anatomy including muscle structures and skin areas from the digital facial image of the patient; identifying one or more areas of the facial anatomy affected by a correction treatment of the one or more teeth; and adjusting the one or more areas of the facial anatomy corresponding to the correction treatment.
  • 27. The method of claim 26 further comprising: generating a smile curve or arc which corresponds to a curve or arc of a lower lip of the patient from the digital facial image; overlaying the smile curve or arc in proximity to the one or more teeth on the digital facial image; adjusting one or more parameters of the smile curve or arc; and manipulating one or more teeth from the 3D digital model according to the smile curve or arc.
  • 28. The method of claim 26 further comprising: identifying one or more parameters relating to smile optimization; and generating a smile score based on the one or more parameters.
  • 29. The method of claim 28 further comprising altering one or more parameters to adjust the smile score.
  • 30. The method of claim 28 wherein the one or more parameters for generating the smile score is selected from the group consisting of smile arc, incisor plane cant, occlusal plane cant, max midline, max transverse display, cuspid inclination, buccal segment inclination, tooth proportionality, flow, gingival display, maxillary central inclination, and COP.
  • 31. A method for adjusting an image of a smile, comprising: receiving a three-dimensional (3D) digital model of a dental arch of a patient; receiving a digital facial image of the patient which includes an image of one or more teeth of the patient when smiling; registering the 3D digital model to the one or more teeth of the patient from the digital facial image; correcting the 3D digital model for scale and distortion to create a corrected 3D digital model; overlaying the corrected 3D digital model onto the digital facial image; identifying one or more parameters relating to smile optimization; generating a smile score based on the one or more parameters, wherein the smile score comprises a relative scale which is indicative of an optimized smile of the patient; and altering one or more parameters to adjust the smile score.
  • 32. The method of claim 31 further comprising: generating a smile curve or arc which corresponds to a curve or arc of a lower lip of the patient from the digital facial image; overlaying the smile curve or arc in proximity to the one or more teeth on the digital facial image; adjusting one or more parameters of the smile curve or arc; and manipulating one or more teeth from the 3D digital model according to the smile curve or arc.
  • 33. The method of claim 31 further comprising: estimating a facial anatomy from the digital facial image of the patient; identifying one or more areas of the facial anatomy affected by a correction treatment of the one or more teeth; adjusting the one or more areas of the facial anatomy corresponding to the correction treatment.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of priority to U.S. Prov. 63/067,769 filed Aug. 19, 2020, which is incorporated herein by reference in its entirety.

US Referenced Citations (560)
Number Name Date Kind
3521355 Pearlman Jul 1970 A
4068379 Miller et al. Jan 1978 A
4597739 Rosenberg Jul 1986 A
4889485 Iida Dec 1989 A
4983334 Adell Jan 1991 A
5055039 Abbatte et al. Oct 1991 A
5186623 Breads et al. Feb 1993 A
5259762 Farrell Nov 1993 A
5506607 Sanders et al. Apr 1996 A
5691905 Dehoff et al. Nov 1997 A
5863198 Doyle Jan 1999 A
5975893 Chishti et al. Nov 1999 A
6120287 Chen Sep 2000 A
6183248 Chishti et al. Feb 2001 B1
6210162 Chishti et al. Apr 2001 B1
6217325 Chishti et al. Apr 2001 B1
6227850 Chishti et al. May 2001 B1
6227851 Chishti et al. May 2001 B1
6250918 Sachdeva et al. Jun 2001 B1
6293790 Hilliard Sep 2001 B1
6299440 Phan et al. Oct 2001 B1
6309215 Phan et al. Oct 2001 B1
6315553 Sachdeva et al. Nov 2001 B1
6386878 Pavlovskaia et al. May 2002 B1
6390812 Chishti et al. May 2002 B1
6394801 Chishti et al. May 2002 B2
6398548 Chishti et al. Jun 2002 B1
6454565 Phan et al. Sep 2002 B2
6463344 Pavloskaia Oct 2002 B1
6471511 Chishti et al. Oct 2002 B1
6485298 Chishti et al. Nov 2002 B2
6488499 Miller Dec 2002 B1
6524101 Phan et al. Feb 2003 B1
6554611 Chishti et al. Apr 2003 B2
6572372 Phan et al. Jun 2003 B1
6582227 Phan et al. Jun 2003 B2
6602070 Miller et al. Aug 2003 B2
6607382 Kuo et al. Aug 2003 B1
6626666 Chishti et al. Sep 2003 B2
6629840 Chishti et al. Oct 2003 B2
6682346 Chishti et al. Jan 2004 B2
6688885 Sachdeva Feb 2004 B1
6699037 Chishti et al. Mar 2004 B2
6702575 Hilliard Mar 2004 B2
6705861 Chishti et al. Mar 2004 B2
6705863 Phan et al. Mar 2004 B2
6722880 Chishti et al. Apr 2004 B2
6729876 Chishti et al. May 2004 B2
6761560 Miller Jul 2004 B2
6783360 Chishti Aug 2004 B2
6786721 Chishti et al. Sep 2004 B2
6802713 Chishti et al. Oct 2004 B1
6830450 Knopp et al. Dec 2004 B2
6846179 Chapouland et al. Jan 2005 B2
6857429 Eubank Feb 2005 B2
6886566 Eubank May 2005 B2
6964564 Phan et al. Nov 2005 B2
7011517 Nicozisis Mar 2006 B2
7029275 Rubbert et al. Apr 2006 B2
7037108 Chishti et al. May 2006 B2
7040896 Pavlovskaia et al. May 2006 B2
7056115 Phan et al. Jun 2006 B2
7059850 Phan et al. Jun 2006 B1
7063533 Phan et al. Jun 2006 B2
7074038 Miller Jul 2006 B1
7077647 Choi et al. Jul 2006 B2
7092784 Simkins Aug 2006 B1
7104790 Cronauer Sep 2006 B2
7121825 Chishti et al. Oct 2006 B2
7125248 Phan et al. Oct 2006 B2
7134874 Chishti et al. Nov 2006 B2
7156661 Choi et al. Jan 2007 B2
7160110 Imgrund et al. Jan 2007 B2
7172417 Sporbert et al. Feb 2007 B2
7192275 Miller Mar 2007 B2
7220122 Chishti May 2007 B2
7320592 Chishti et al. Jan 2008 B2
7326051 Miller Feb 2008 B2
7331783 Chishti et al. Feb 2008 B2
7347688 Kopelman et al. Mar 2008 B2
7416407 Cronauer Aug 2008 B2
7434582 Eubank Oct 2008 B2
7435083 Chishti et al. Oct 2008 B2
7442041 Imgrund et al. Oct 2008 B2
7458812 Sporbert et al. Dec 2008 B2
7476100 Kuo Jan 2009 B2
7481121 Cao Jan 2009 B1
7553157 Abolfathi et al. Jun 2009 B2
7559328 Eubank Jul 2009 B2
7578673 Wen et al. Aug 2009 B2
7590462 Rubbert et al. Sep 2009 B2
7637262 Bailey Dec 2009 B2
7641828 Desimone et al. Jan 2010 B2
7658610 Knopp Feb 2010 B2
7689398 Cheng et al. Mar 2010 B2
7717708 Sachdeva et al. May 2010 B2
7771195 Knopp et al. Aug 2010 B2
7802987 Phan et al. Sep 2010 B1
7824180 Abolfathi et al. Nov 2010 B2
7826646 Pavlovskaia et al. Nov 2010 B2
7840247 Liew et al. Nov 2010 B2
7841858 Knopp et al. Nov 2010 B2
7845938 Kim et al. Dec 2010 B2
7854609 Chen et al. Dec 2010 B2
7878801 Abolfathi et al. Feb 2011 B2
7878804 Korytov et al. Feb 2011 B2
7878805 Moss et al. Feb 2011 B2
7883334 Li et al. Feb 2011 B2
7901207 Knopp et al. Mar 2011 B2
7905724 Kuo et al. Mar 2011 B2
7914283 Kuo Mar 2011 B2
7942672 Kuo May 2011 B2
7943079 Desimone et al. May 2011 B2
7957824 Boronvinskih et al. Jun 2011 B2
7987099 Kuo et al. Jul 2011 B2
8001972 Eubank Aug 2011 B2
8002543 Kang et al. Aug 2011 B2
8021147 Sporbert et al. Sep 2011 B2
8033282 Eubank Oct 2011 B2
8038444 Kitching et al. Oct 2011 B2
8070487 Chishti et al. Dec 2011 B2
8075306 Kitching et al. Dec 2011 B2
8099268 Kitching et al. Jan 2012 B2
8099305 Kuo et al. Jan 2012 B2
8105080 Chishti et al. Jan 2012 B2
8123519 Schultz Feb 2012 B2
8152518 Kuo Apr 2012 B2
8152523 Sporbert et al. Apr 2012 B2
8177551 Sachdeva et al. May 2012 B2
8235713 Phan et al. Aug 2012 B2
8272866 Chun et al. Sep 2012 B2
8275180 Kuo et al. Sep 2012 B2
8292617 Brandt et al. Oct 2012 B2
8303302 Teasdale Nov 2012 B2
8348665 Kuo Jan 2013 B2
8356993 Marston Jan 2013 B1
8401686 Moss et al. Mar 2013 B2
8401826 Cheng et al. Mar 2013 B2
8439672 Matov et al. May 2013 B2
8439673 Korytov et al. May 2013 B2
8444412 Baughman et al. May 2013 B2
8465280 Sachdeva et al. Jun 2013 B2
8469705 Sachdeva et al. Jun 2013 B2
8469706 Kuo Jun 2013 B2
8496474 Chishti et al. Jul 2013 B2
8512037 Andreiko Aug 2013 B2
8517726 Kakavand et al. Aug 2013 B2
8535580 Puttler et al. Sep 2013 B2
8562337 Kuo et al. Oct 2013 B2
8562338 Kitching et al. Oct 2013 B2
8562340 Chishti et al. Oct 2013 B2
8636509 Miller Jan 2014 B2
8636510 Kitching et al. Jan 2014 B2
8690568 Chapoulaud et al. Apr 2014 B2
8708697 Li et al. Apr 2014 B2
8734149 Phan et al. May 2014 B2
8734150 Chishti et al. May 2014 B2
8738165 Cinader, Jr. et al. May 2014 B2
8765031 Li et al. Jul 2014 B2
8777611 Cios Jul 2014 B2
8780106 Chishti et al. Jul 2014 B2
8807999 Kuo et al. Aug 2014 B2
8858226 Phan et al. Oct 2014 B2
8864493 Leslie-Martin et al. Oct 2014 B2
8899976 Chen et al. Dec 2014 B2
8899978 Kitching et al. Dec 2014 B2
8930219 Trosien et al. Jan 2015 B2
8936464 Kopelman Jan 2015 B2
8998608 Trosien et al. Jan 2015 B2
8944812 Kuo et al. Feb 2015 B2
8961173 Miller Feb 2015 B2
8986003 Valoir Mar 2015 B2
8992215 Chapoulaud et al. Mar 2015 B2
9004915 Moss et al. Apr 2015 B2
9022781 Kuo et al. May 2015 B2
9026238 Kraemer et al. May 2015 B2
9060829 Sterental et al. Jun 2015 B2
9107722 Matov et al. Aug 2015 B2
9119691 Namiranian et al. Sep 2015 B2
9119696 Giordano et al. Sep 2015 B2
9161823 Morton et al. Oct 2015 B2
9161824 Chishti et al. Oct 2015 B2
9204942 Phan et al. Dec 2015 B2
9211166 Kuo et al. Dec 2015 B2
9241774 Li et al. Jan 2016 B2
9301814 Kaza et al. Apr 2016 B2
9320575 Chishti et al. Apr 2016 B2
9326830 Kitching et al. May 2016 B2
9326831 Cheang May 2016 B2
9333052 Miller May 2016 B2
9345557 Anderson et al. May 2016 B2
9351809 Phan et al. May 2016 B2
9364297 Kitching et al. Jun 2016 B2
9375300 Matov et al. Jun 2016 B2
9414897 Wu et al. Aug 2016 B2
9433476 Khardekar et al. Sep 2016 B2
9492245 Sherwood et al. Nov 2016 B2
9820829 Kuo Nov 2017 B2
9844420 Cheang Dec 2017 B2
9917868 Ahmed Mar 2018 B2
9922170 Trosien et al. Mar 2018 B2
10011050 Kitching et al. Jul 2018 B2
10022204 Cheang Jul 2018 B2
10335250 Wen Jul 2019 B2
10357336 Wen Jul 2019 B2
10357342 Wen Jul 2019 B2
10548690 Wen Feb 2020 B2
10588723 Falkel Mar 2020 B2
10624717 Wen Apr 2020 B2
10631953 Wen Apr 2020 B2
10881486 Wen Jan 2021 B2
10925698 Falkel Feb 2021 B2
10952821 Falkel Mar 2021 B2
11051913 Wen Jul 2021 B2
11096763 Akopov Aug 2021 B2
11207161 Brant Dec 2021 B2
11348257 Lang May 2022 B2
11364098 Falkel Jun 2022 B2
11553989 Wen et al. Jan 2023 B2
11583365 Wen Feb 2023 B2
11638628 Wen May 2023 B2
11663383 Cao May 2023 B2
11707180 Falkel Jul 2023 B2
11771524 Wen Oct 2023 B2
11833006 Wen Dec 2023 B2
12064315 Schueller et al. Aug 2024 B2
20010002310 Chishti et al. May 2001 A1
20020009686 Loc et al. Jan 2002 A1
20020010568 Rubbert et al. Jan 2002 A1
20020025503 Chapoulaud et al. Feb 2002 A1
20020042038 Miller et al. Apr 2002 A1
20020051951 Chishti et al. May 2002 A1
20020072027 Chisti Jun 2002 A1
20020094503 Chishti et al. Jul 2002 A1
20020110776 Abels et al. Aug 2002 A1
20020150859 Imgrund et al. Nov 2002 A1
20020177108 Pavlovskaia et al. Nov 2002 A1
20030003416 Chishti et al. Jan 2003 A1
20030008259 Kuo et al. Jan 2003 A1
20030039940 Miller Feb 2003 A1
20030059736 Lai et al. Mar 2003 A1
20030190576 Phan et al. Oct 2003 A1
20030207224 Lotte Nov 2003 A1
20040023188 Pavlovskaia et al. Feb 2004 A1
20040029068 Sachdeva et al. Feb 2004 A1
20040038168 Choi et al. Feb 2004 A1
20040134599 Wang et al. Jul 2004 A1
20040142299 Miller Jul 2004 A1
20040152036 Abolfathi Aug 2004 A1
20040166456 Chishti et al. Aug 2004 A1
20040166462 Phan et al. Aug 2004 A1
20040166463 Wen et al. Aug 2004 A1
20040197728 Abolfathi et al. Oct 2004 A1
20040202983 Tricca et al. Oct 2004 A1
20040219471 Cleary et al. Nov 2004 A1
20040229183 Knopp et al. Nov 2004 A1
20040242987 Liew et al. Dec 2004 A1
20040253562 Knopp Dec 2004 A1
20050010450 Hultgren et al. Jan 2005 A1
20050019721 Chishti Jan 2005 A1
20050048432 Choi et al. Mar 2005 A1
20050095552 Sporbert et al. May 2005 A1
20050095562 Sporbert et al. May 2005 A1
20050118555 Sporbert et al. Jun 2005 A1
20050153255 Sporbert et al. Jul 2005 A1
20050192835 Kuo et al. Sep 2005 A1
20050194022 Schwartz Sep 2005 A1
20050238967 Rogers et al. Oct 2005 A1
20050241646 Sotos et al. Nov 2005 A1
20050244781 Abels et al. Nov 2005 A1
20050244782 Chishti et al. Nov 2005 A1
20050271996 Sporbert et al. Dec 2005 A1
20060003283 Miller et al. Jan 2006 A1
20060035197 Hishimoto Feb 2006 A1
20060068353 Abolfathi et al. Mar 2006 A1
20060078840 Robson Apr 2006 A1
20060078841 Desimone et al. Apr 2006 A1
20060084030 Phan et al. Apr 2006 A1
20060093982 Wen May 2006 A1
20060099546 Bergersen May 2006 A1
20060115785 Li et al. Jun 2006 A1
20060147872 Andreiko Jul 2006 A1
20060177789 O'Bryan Aug 2006 A1
20060188834 Hilliard Aug 2006 A1
20060199142 Liu et al. Sep 2006 A1
20060223022 Solomon Oct 2006 A1
20060223023 Lai et al. Oct 2006 A1
20060275731 Wen et al. Dec 2006 A1
20060275736 Wen et al. Dec 2006 A1
20070003907 Chishti et al. Jan 2007 A1
20070238065 Sherwood et al. Oct 2007 A1
20070264606 Muha et al. Nov 2007 A1
20070283967 Bailey Dec 2007 A1
20080032248 Kuo Feb 2008 A1
20080044786 Kalili Feb 2008 A1
20080050692 Hilliard Feb 2008 A1
20080051650 Massie et al. Feb 2008 A1
20080057461 Cheng et al. Mar 2008 A1
20080057462 Kitching et al. Mar 2008 A1
20080076086 Kitching et al. Mar 2008 A1
20080085487 Kuo et al. Apr 2008 A1
20080113314 Pierson et al. May 2008 A1
20080115791 Heine May 2008 A1
20080118882 Su May 2008 A1
20080141534 Hilliard Jun 2008 A1
20080182220 Chishti et al. Jul 2008 A1
20080206702 Hedge et al. Aug 2008 A1
20080215176 Borovinskih et al. Sep 2008 A1
20080233528 Kim et al. Sep 2008 A1
20080233530 Cinader Sep 2008 A1
20080248438 Desimone et al. Oct 2008 A1
20080248443 Chisti et al. Oct 2008 A1
20080261165 Steingart et al. Oct 2008 A1
20080268400 Moss et al. Oct 2008 A1
20080280247 Sachdeva et al. Nov 2008 A1
20080305451 Kitching et al. Dec 2008 A1
20080305453 Kitching et al. Dec 2008 A1
20090081604 Fisher Mar 2009 A1
20090098502 Andreiko Apr 2009 A1
20090117510 Minium May 2009 A1
20090191502 Cao et al. Jul 2009 A1
20090269714 Knopp Oct 2009 A1
20090280450 Kuo Nov 2009 A1
20090291407 Kuo Nov 2009 A1
20090291408 Stone-Collonge et al. Nov 2009 A1
20100036682 Trosien et al. Feb 2010 A1
20100055635 Kakavand Mar 2010 A1
20100086890 Kuo Apr 2010 A1
20100138025 Morton et al. Jun 2010 A1
20100167225 Kuo Jul 2010 A1
20100173266 Lu et al. Jul 2010 A1
20100179789 Sachdeva et al. Jul 2010 A1
20100239992 Brandt et al. Sep 2010 A1
20100280798 Pattijn et al. Nov 2010 A1
20110005527 Andrew et al. Jan 2011 A1
20110015591 Hanson et al. Jan 2011 A1
20110020761 Kalili Jan 2011 A1
20110039223 Li et al. Feb 2011 A1
20110091832 Kim et al. Apr 2011 A1
20110114100 Alvarez et al. May 2011 A1
20110123944 Knopp et al. May 2011 A1
20110129786 Chun et al. Jun 2011 A1
20110159451 Kuo et al. Jun 2011 A1
20110165533 Li et al. Jul 2011 A1
20110269092 Kuo et al. Nov 2011 A1
20110269097 Sporbert et al. Nov 2011 A1
20110270588 Kuo et al. Nov 2011 A1
20110281229 Abolfathi Nov 2011 A1
20120028221 Williams Feb 2012 A1
20120035901 Kitching et al. Feb 2012 A1
20120123577 Chapoulaud et al. May 2012 A1
20120150494 Anderson et al. Jun 2012 A1
20120186589 Singh Jul 2012 A1
20120199136 Urbano Aug 2012 A1
20120214121 Greenberg Aug 2012 A1
20120225399 Teasdale Sep 2012 A1
20120225400 Chishti et al. Sep 2012 A1
20120225401 Kitching et al. Sep 2012 A1
20120227750 Tucker Sep 2012 A1
20120244488 Chishti et al. Sep 2012 A1
20120270173 Pumphrey et al. Oct 2012 A1
20120288818 Vendittelli Nov 2012 A1
20130004634 McCaskey et al. Jan 2013 A1
20130022255 Chen et al. Jan 2013 A1
20130052625 Wagner Feb 2013 A1
20130078593 Andreiko Mar 2013 A1
20130081271 Farzin-Nia et al. Apr 2013 A1
20130085018 Jensen et al. Apr 2013 A1
20130095446 Andreiko et al. Apr 2013 A1
20130122445 Marston May 2013 A1
20130122448 Kitching May 2013 A1
20130157213 Arruda Jun 2013 A1
20130201450 Bailey et al. Aug 2013 A1
20130204583 Matov et al. Aug 2013 A1
20130230819 Arruda Sep 2013 A1
20130231899 Khardekar et al. Sep 2013 A1
20130236848 Arruda Sep 2013 A1
20130266906 Soo Oct 2013 A1
20130302742 Li et al. Nov 2013 A1
20130308846 Chen et al. Nov 2013 A1
20130317800 Wu et al. Nov 2013 A1
20130323665 Dinh et al. Dec 2013 A1
20130325431 See et al. Dec 2013 A1
20140023980 Kitching et al. Jan 2014 A1
20140072926 Valoir Mar 2014 A1
20140073212 Lee Mar 2014 A1
20140076332 Luco Mar 2014 A1
20140122027 Andreiko May 2014 A1
20140124968 Kim May 2014 A1
20140167300 Lee Jun 2014 A1
20140172375 Grove Jun 2014 A1
20140178830 Widu Jun 2014 A1
20140193765 Kitching et al. Jul 2014 A1
20140193767 Li et al. Jul 2014 A1
20140229878 Wen et al. Aug 2014 A1
20140242532 Arruda Aug 2014 A1
20140255864 Machata et al. Sep 2014 A1
20140272757 Chishti Sep 2014 A1
20140287376 Hultgren et al. Sep 2014 A1
20140288894 Chishti et al. Sep 2014 A1
20140315153 Kitching Oct 2014 A1
20140315154 Jung et al. Oct 2014 A1
20140067335 Andreiko Nov 2014 A1
20140329194 Sachdeva et al. Nov 2014 A1
20140349242 Phan et al. Nov 2014 A1
20140358497 Kuo et al. Dec 2014 A1
20140363779 Kopelman Dec 2014 A1
20140370452 Tseng Dec 2014 A1
20150004553 Li et al. Jan 2015 A1
20150004554 Cao et al. Jan 2015 A1
20150018956 Steinmann et al. Jan 2015 A1
20150025907 Trosien et al. Jan 2015 A1
20150044623 Rundlett Feb 2015 A1
20150044627 German Feb 2015 A1
20150057983 See et al. Feb 2015 A1
20150064641 Gardner Mar 2015 A1
20150093713 Chen et al. Apr 2015 A1
20150093714 Kopelman Apr 2015 A1
20150125802 Tal May 2015 A1
20150128421 Mason et al. May 2015 A1
20150157421 Martz et al. Jun 2015 A1
20150182303 Abraham et al. Jul 2015 A1
20150182321 Karazivan et al. Jul 2015 A1
20150216626 Ranjbar Aug 2015 A1
20150216627 Kopelman Aug 2015 A1
20150238280 Wu et al. Aug 2015 A1
20150238282 Kuo et al. Aug 2015 A1
20150238283 Tanugula et al. Aug 2015 A1
20150238284 Wu et al. Aug 2015 A1
20150245887 Izugami et al. Sep 2015 A1
20150254410 Sterental et al. Sep 2015 A1
20150265376 Kopelman Sep 2015 A1
20150289949 Moss et al. Oct 2015 A1
20150289950 Khan Oct 2015 A1
20150305830 Howard et al. Oct 2015 A1
20150305831 Cosse Oct 2015 A1
20150305919 Stubbs et al. Oct 2015 A1
20150313687 Blees et al. Nov 2015 A1
20150320518 Namiranian et al. Nov 2015 A1
20150320532 Matty et al. Nov 2015 A1
20150335399 Caraballo Nov 2015 A1
20150335404 Webber et al. Nov 2015 A1
20150336299 Tanugula et al. Nov 2015 A1
20150342464 Wundrak et al. Dec 2015 A1
20150351870 Mah Dec 2015 A1
20150351871 Chishti et al. Dec 2015 A1
20150359609 Khan Dec 2015 A1
20150366637 Kopelman et al. Dec 2015 A1
20150366638 Kopelman et al. Dec 2015 A1
20160000527 Arruda Jan 2016 A1
20160008095 Matov et al. Jan 2016 A1
20160008097 Chen et al. Jan 2016 A1
20160051341 Webber Feb 2016 A1
20160051342 Phan et al. Feb 2016 A1
20160051348 Boerjes et al. Feb 2016 A1
20160067013 Morton et al. Mar 2016 A1
20160067014 Kottemann et al. Mar 2016 A1
20160074137 Kuo et al. Mar 2016 A1
20160074138 Kitching et al. Mar 2016 A1
20160095668 Kuo et al. Apr 2016 A1
20160095670 Witte et al. Apr 2016 A1
20160106521 Tanugula et al. Apr 2016 A1
20160120617 Lee May 2016 A1
20160120621 Li et al. May 2016 A1
20160128803 Webber et al. May 2016 A1
20160135924 Choi et al. May 2016 A1
20160135925 Mason et al. May 2016 A1
20160135926 Djamchidi May 2016 A1
20160135927 Boltunov et al. May 2016 A1
20160157961 Lee Jun 2016 A1
20160166363 Varsano Jun 2016 A1
20160175068 Cai et al. Jun 2016 A1
20160175069 Korytov et al. Jun 2016 A1
20160184129 Liptak et al. Jun 2016 A1
20160193014 Morton et al. Jul 2016 A1
20160199216 Cam et al. Jul 2016 A1
20160203604 Gupta et al. Jul 2016 A1
20160206402 Kitching et al. Jul 2016 A1
20160220200 Sanholm et al. Aug 2016 A1
20160228213 Tod et al. Aug 2016 A1
20160256240 Shivapuja et al. Sep 2016 A1
20160310235 Derakhshan et al. Oct 2016 A1
20160338799 Wu et al. Nov 2016 A1
20160367339 Khardekar et al. Dec 2016 A1
20170007359 Kopelman et al. Jan 2017 A1
20170065373 Martz et al. Mar 2017 A1
20170079748 Andreiko Mar 2017 A1
20170100207 Wen Apr 2017 A1
20170100208 Wen Apr 2017 A1
20170100209 Wen Apr 2017 A1
20170100210 Wen Apr 2017 A1
20170100211 Wen Apr 2017 A1
20170100214 Wen Apr 2017 A1
20170224444 Viecilli et al. Aug 2017 A1
20170231721 Akeel et al. Aug 2017 A1
20170325911 Marshall Nov 2017 A1
20180014912 Radmand Jan 2018 A1
20180028065 Elbaz et al. Feb 2018 A1
20180042708 Caron et al. Feb 2018 A1
20180055611 Sun et al. Mar 2018 A1
20180078335 Falkel Mar 2018 A1
20180078343 Falkel Mar 2018 A1
20180078344 Falkel Mar 2018 A1
20180078347 Falkel Mar 2018 A1
20180092714 Kitching et al. Apr 2018 A1
20180092715 Kitching et al. Apr 2018 A1
20180125610 Carrier, Jr. May 2018 A1
20180158544 Trosien et al. Jun 2018 A1
20180161126 Marshall et al. Jun 2018 A1
20180168781 Kopelman et al. Jun 2018 A1
20180174367 Marom Jun 2018 A1
20180333226 Tsai et al. Nov 2018 A1
20180344431 Kuo et al. Dec 2018 A1
20190008612 Kitching et al. Jan 2019 A1
20190046297 Kopelman et al. Feb 2019 A1
20190090987 Hung Mar 2019 A1
20190155789 Dorman May 2019 A1
20190231478 Kopelman Aug 2019 A1
20190321135 Wen Oct 2019 A1
20190343602 Wen Nov 2019 A1
20190350680 Chekh Nov 2019 A1
20190358002 Falkel Nov 2019 A1
20190388189 Shivapuja et al. Dec 2019 A1
20200000552 Mednikov et al. Jan 2020 A1
20200047868 Young et al. Feb 2020 A1
20200081413 Georg et al. Mar 2020 A1
20200105028 Gao Apr 2020 A1
20200146775 Wen May 2020 A1
20200170762 Falkel Jun 2020 A1
20200205936 Wen Jul 2020 A1
20200214598 Li et al. Jul 2020 A1
20200214801 Wang et al. Jul 2020 A1
20200253693 Wen Aug 2020 A1
20200316856 Mojdeh et al. Oct 2020 A1
20200345459 Schueller et al. Nov 2020 A1
20200357186 Pokotilov et al. Nov 2020 A1
20200360120 Inoue et al. Nov 2020 A1
20200390523 Sato et al. Dec 2020 A1
20210106404 Wen Apr 2021 A1
20210153981 Falkel May 2021 A1
20210186668 Falkel Jun 2021 A1
20210244518 Ryu et al. Aug 2021 A1
20210282899 Wen Sep 2021 A1
20210369417 Wen et al. Dec 2021 A1
20210393376 Wu et al. Dec 2021 A1
20210393385 Parkar et al. Dec 2021 A1
20220265395 Falkel Aug 2022 A1
20220266577 Sharma et al. Aug 2022 A1
20220323182 Lee Oct 2022 A1
20220409338 Cao Dec 2022 A1
20230005593 Raslambekov Jan 2023 A1
20230053766 Cao et al. Feb 2023 A1
20230058890 Kenworthy Feb 2023 A1
20230233288 Wen Jul 2023 A1
20230240808 Schueller et al. Aug 2023 A1
20230320565 Falkel Oct 2023 A1
20230380936 Wen Nov 2023 A1
20230380938 Sharma et al. Nov 2023 A1
20230380939 Lai et al. Nov 2023 A1
20230414324 Wen Dec 2023 A1
Foreign Referenced Citations (71)
Number Date Country
2557573 Jul 2012 CA
1575782 Feb 2005 CN
1997324 Jul 2007 CN
101427256 May 2009 CN
101636122 Jan 2010 CN
1973291 Sep 2010 CN
102438545 May 2012 CN
101528152 Dec 2012 CN
103932807 Jul 2014 CN
105748163 Jul 2016 CN
106580509 Apr 2017 CN
1474062 Apr 2011 EP
2056734 Sep 2015 EP
2957252 Dec 2015 EP
40004866 Aug 2022 HK
2005-515826 Jun 2005 JP
2006-500999 Jan 2006 JP
2008-532563 Aug 2008 JP
2009-202031 Sep 2009 JP
4323322 Sep 2009 JP
2010-502246 Jan 2010 JP
2010-528748 Aug 2010 JP
4566746 Oct 2010 JP
2012-139540 Jul 2012 JP
5015197 Aug 2012 JP
5015765 Aug 2012 JP
5149898 Feb 2013 JP
2013-081785 May 2013 JP
5291218 Sep 2013 JP
2007-525289 Sep 2017 JP
2019-013463 Jan 2019 JP
2019-529042 Oct 2019 JP
2019-537033 Dec 2019 JP
2004-46323 Oct 2009 KR
10-1450866 Oct 2014 KR
2018-0090481 Aug 2018 KR
WO 2001082192 Nov 2001 WO
WO 2002047571 Jun 2002 WO
WO 2003063721 Aug 2003 WO
WO 2004028391 Apr 2004 WO
WO 2005086058 Sep 2005 WO
WO 2004098379 Nov 2005 WO
WO 2006050452 May 2006 WO
WO 2006096558 Sep 2006 WO
WO 2008026064 Mar 2008 WO
WO 2008102132 Aug 2008 WO
WO 2008118546 Oct 2008 WO
WO 2008149222 Dec 2008 WO
WO 2009057937 May 2009 WO
WO 2009068892 Jun 2009 WO
WO 2016004415 Jan 2016 WO
WO 2016100577 Jun 2016 WO
WO 2017062207 Apr 2017 WO
WO 2017062208 Apr 2017 WO
WO 2017062209 Apr 2017 WO
WO 2017062210 Apr 2017 WO
WO 2018057622 Mar 2018 WO
WO 2018112273 Jun 2018 WO
WO 2018118200 Jun 2018 WO
WO 2020222905 Nov 2020 WO
WO 2020223384 Nov 2020 WO
WO 2020239429 Dec 2020 WO
WO 2020257724 Dec 2020 WO
WO 2021105878 Jun 2021 WO
WO 2021247145 Dec 2021 WO
WO 2021247950 Dec 2021 WO
WO 2022040671 Feb 2022 WO
WO 2022178514 Aug 2022 WO
WO 2023023417 Feb 2023 WO
WO 2023023418 Feb 2023 WO
WO 2023230460 Nov 2023 WO
Non-Patent Literature Citations (1)
Entry
Kovach, I. V. et al., "Clinic, diagnosis, treatment, prevention, and prosthetics of various dentofacial anomalies and deformities," DMA, 2018.
Related Publications (1)
Number Date Country
20220054232 A1 Feb 2022 US
Provisional Applications (1)
Number Date Country
63067769 Aug 2020 US