This invention generally relates to methods and user interfaces for determining the shade of a patient's tooth or teeth and for utilizing the determined tooth shades in designing and manufacturing dental restorations.
When designing and manufacturing a dental restoration for a patient, such as a crown or a bridge restoration, it is advantageous that both the shape and the shade of the manufactured restoration are adapted to the patient's natural teeth surrounding the restoration. If the shade of the restoration differs significantly from the surrounding natural teeth, e.g. is significantly darker or brighter than these, the restoration appears artificial and deteriorates the aesthetic impression of the patient's smile.
The tooth color can be represented in many different color spaces, such as the L*C*h* color space representing color in terms of Lightness, Chroma and Hue, or the L*a*b* color space as described e.g. by Hassel et al. (Hassel 2012) and Dozic et al. (Dozic 2007). The L*a*b* color space has the advantage that it is designed to approximate human vision, with the L* component closely matching human perception of lightness.
In order to aid dental technicians in their manual work of manufacturing a restoration which appears natural, the tooth colors are often expressed in terms of reference tooth shade values of a tooth shade system (often referred to as a tooth shade guide). Each reference tooth shade value in a tooth shade guide represents a predetermined and known tooth color value and often corresponds to the color of commercially available ceramics for the production of dental restorations. This is e.g. the case for the VITA 3D-Master or the VITA Classic shade guides provided by VITA Zahnfabrik, Germany.
In the VITA 3D-Master system, tooth shades are expressed in codes referring to the L*C*h* color space, where each code is constructed as (Lightness, Hue, Chroma). One example of a tooth shade value is 3R1.5, where “3” refers to the lightness, “R” to the hue and “1.5” to the chroma of the tooth. This allows the dentist to describe the shade of the patient's tooth in terms that a dental technician immediately understands, such that the technician knows from which ceramics to manufacture the restoration so that it has the correct shade.
When manually determining which reference tooth shade value best matches the color of a patient's tooth, the dentist holds different pre-manufactured teeth of the shade guide at the tooth for comparison. Often a picture is taken with the pre-manufactured structures arranged at the teeth. The technician who produces the prosthetic then uses the picture to evaluate which ceramics must be used for the different parts of the restoration. This process is both time consuming and inaccurate.
Disclosed is a method for determining shade of a patient's tooth, wherein the method comprises:
Disclosed is a user interface for determining and displaying shade of a patient's tooth, wherein the user interface is configured for:
The texture data of the digital 3D representation expresses the texture of the tooth. The texture data can be a texture profile expressing the variation in the texture over the tooth. The shape data of the digital 3D representation expresses the shape of the tooth.
In some embodiments, the texture information comprises at least one of tooth color or surface roughness.
When the texture information comprises tooth color information, the texture data expressing a texture profile of the tooth may be color data expressing a color profile of the tooth, and the tooth shade value for a point on the tooth may be derived by comparing the color data of the corresponding point of the digital 3D representation with known color values of one or more reference tooth shade values.
Determining the tooth shade value from a digital 3D representation comprising both shape data expressing the shape of the tooth and texture data expressing a texture profile of the tooth provides the advantage that shade and geometry information are directly linked. This is advantageous e.g. in CAD/CAM dentistry, where dental restorations are designed using Computer Aided Design (CAD) tools and subsequently manufactured from the design using Computer Aided Manufacturing (CAM) tools. The material used for the manufacture of the dental restoration can then be selected based on the determined tooth shade value.
In many cases, the dental restoration is manufactured with a shade profile where the shade varies from the incisal edge towards the cervical end of the restoration. The disclosed invention allows the operator to determine tooth shade values for several points on the tooth such that a shade profile can be determined for the dental restoration. Multi-shaded milling blocks exist which mimic standard tooth shade profiles. Having the shape data and the tooth shade values linked via the digital 3D representation provides that the correct portion of the multi-shaded milling block can be milled out. The remaining portion of the multi-shaded milling block forming the dental restoration will then have a shape and shade profile which closely resemble those of a natural tooth.
In some embodiments, obtaining the digital 3D representation of the tooth comprises recording a series of sub-scans of the tooth, where at least one of said sub-scans comprises both texture information and geometry information for said tooth, and generating the digital 3D representation of the tooth from the recorded series of sub-scans.
When a plurality of the sub-scans comprise texture information, the texture data for the digital 3D representation can be derived by combining the texture information of the several sub-scans.
The recorded sub-scans comprise at least data of the tooth for which the shade is determined, but potentially also of the neighboring teeth, such that for example the shape and location of the neighboring teeth can be taken into account when designing a dental restoration for the tooth. Texture information and texture data for the neighboring teeth can also be used to determine the shade value for the tooth, e.g. by interpolation of the shades determined for the neighboring teeth.
In some embodiments, the method comprises creating a shade profile for the tooth from shade values determined for one or more points on the tooth.
The shade profile of natural teeth often has a brighter shade at the incisal edge of the tooth and gradually changes into a darker shade towards the cervical end of the tooth, i.e. the end at the patient's gingiva.
When the tooth shade value is determined for one point only, the tooth shade profile may be generated based on knowledge of the normal tooth shade profile for that particular type of tooth and patient. This knowledge may relate to how the shade profile normally changes over the tooth, the age and gender of the patient, etc.
Often the profile will be based on tooth shades determined in several points on the tooth to provide the most reliable tooth shade profile.
In some embodiments the user interface is configured for creating a shade profile for the tooth from tooth shade values determined for one or more points on the tooth.
In some embodiments, the tooth shade profile can be created by interpolation of tooth shade values determined for points distributed over the tooth surface with some distance between the points. The tooth shade value for some parts of the tooth surface is then not derived directly from sub-scan texture information relating to these parts but from the determined tooth shade values for other parts/points on the tooth surface. A tooth shade profile for the entire labial/buccal surface of the tooth can thus be created from a selection of points on the surface, providing a fast and often sufficiently accurate procedure for creating the tooth shade profile. The interpolation of the tooth shade values can be realized by an interpolation in each of the coordinates of the color space used to describe the tooth color.
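As an illustration, such a per-coordinate interpolation could be implemented as in the following sketch (Python with NumPy). The inverse-distance weighting used here is an assumption chosen for simplicity; the text does not prescribe a particular interpolation scheme.

```python
import numpy as np

def interpolate_lab(sample_points, sample_labs, query_points, power=2.0):
    """Inverse-distance-weighted interpolation of L*a*b* values.

    sample_points: (N, 3) surface points where shades were determined.
    sample_labs:   (N, 3) corresponding L*, a*, b* values.
    query_points:  (M, 3) points where the shade profile is wanted.
    Each color coordinate is interpolated independently, as the text suggests.
    """
    d = np.linalg.norm(query_points[:, None, :] - sample_points[None, :, :], axis=2)
    d = np.maximum(d, 1e-9)               # avoid division by zero at sample points
    w = 1.0 / d**power                    # closer samples dominate
    w /= w.sum(axis=1, keepdims=True)
    return w @ sample_labs                # (M, 3) interpolated Lab values
```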
In some embodiments, the tooth shade profile comprises one or more tooth shade regions on the tooth surface, where an average tooth shade is derived for each region from tooth shade values determined for a number of points within the region.
The tooth shade region can be defined by a structure encircling a portion of the tooth surface in the digital 3D representation, where either the operator or a computer implemented algorithm decides where each geometric structure is located on the digital 3D representation. Different shapes (e.g. circles, squares, or rectangles) and sizes (e.g. corresponding to a few millimeters) of the geometric structure can be used. The number of points within the geometrical structure can be increased to provide a more accurate measure of the shade or reduced to provide a faster calculation.
The average tooth shade value for a region can e.g. be derived as a weighted average, where the tooth shade value for points in the center of the structure is assigned a higher weight than the tooth shade value of points closer to the boundary.
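One possible realization is a Gaussian falloff of the weight with distance from the region center, sketched below. The Gaussian form and the sigma parameter are assumptions; the text only requires that central points get higher weights than points near the boundary.

```python
import numpy as np

def region_average_lab(points, labs, center, sigma=1.0):
    """Weighted average shade for a region: points near the region center
    get higher weight than points near the boundary (Gaussian falloff).

    points: (N, 3) points inside the region, labs: (N, 3) their Lab values,
    center: (3,) region center; sigma controls how fast the weight drops.
    """
    d2 = np.sum((points - center)**2, axis=1)
    w = np.exp(-d2 / (2.0 * sigma**2))    # higher weight at the center
    return (w[:, None] * labs).sum(axis=0) / w.sum()
```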
The tooth surface can also be divided into a coronal, a middle and a cervical region. Some natural teeth have a shade profile which can be expressed by such a division, and many dentists and dental technicians are familiar with it.
In some embodiments, tooth shade values are determined for a plurality of teeth, i.e. on parts of the digital 3D representation corresponding to two or more teeth, and a tooth shade value and/or a tooth shade profile for each of these teeth is created from the determined tooth shade values.
In some embodiments, the texture data are at least partly derived by combining the texture information from corresponding parts of a number of the sub-scans.
The digital 3D representation can be generated through registration of sub-scans into a common coordinate system by matching overlapping sections of sub-scans, i.e. the sections of the sub-scans which relate to the same region of the tooth. When two or more sub-scans also comprise texture information relating to the same region of the tooth, deriving the texture data for this region in the digital 3D representation can comprise combining the corresponding texture information, i.e. the texture information in the sub-scans corresponding to the same sections of the tooth.
Deriving the texture data based on texture information from two or more sub-scans can provide a more accurate measurement of the texture data. The texture information of one sub-scan for a particular region of the tooth may be unreliable, e.g. due to the angle between the surface in this region and the scanner when this particular sub-scan was recorded. The combination of texture information from several sub-scans can provide a more reliable color measurement.
In some embodiments, combining the texture information from the sub-scans comprises interpolating the texture information, i.e. texture information from parts of the sub-scans corresponding to a point on the tooth is interpolated to determine the texture data for that point.
Such an interpolation can provide that the determined texture data is more accurate e.g. in cases where the texture information for a point on the tooth is not linearly varying over the sub-scans such that a simple averaging will not provide the best result.
In some embodiments, combining the texture information from the sub-scans comprises calculating an average value of the texture information, i.e. texture data for a point on the digital 3D representation are determined by averaging the texture information of the sub-scans corresponding to that point on the tooth.
In some embodiments, the calculated average value is a weighted average of the texture information.
This approach has the advantage that the derived texture data of the digital 3D representation are not as sensitive to errors in the texture information of a single sub-scan.
Such errors can be caused by several factors. One factor is the angle between the optical path of the probe light at the tooth surface and the tooth surface itself. When utilizing e.g. the focus scanning technique, the texture data for a point on the tooth is preferably derived from a number of sub-scans where at least some of the sub-scans are recorded at different orientations of the scanner relative to the teeth. The sections of the sub-scans relating to this point are hence acquired at different angles relative to the tooth surface at this point.
A portion of a sub-scan recorded from a surface perpendicular to the optical path of the probe light at the tooth may be dominated by specularly reflected light, which does not describe the texture of the tooth but rather the spectral distribution of the probe light. The signal from a tooth surface almost parallel to the optical path is often quite weak and hence often provides an erroneous detection of the texture at that point.
In some embodiments, the texture information from parts of a sub-scan relating to a tooth surface which is substantially perpendicular or parallel to the optical path is assigned a low weight in the weighted averaging of the texture information to determine the texture data for the point.
The orientation of the scanner relative to the tooth when a sub-scan is acquired can be determined from the shape of the sub-scan. Parts of the sub-scan relating to tooth surfaces which are substantially parallel or perpendicular to the optical path can thus immediately be detected in the sub-scan, such that the texture information of the corresponding parts is assigned a low weight when determining the texture data for this point from a series of sub-scans.
A specular reflection from the tooth often has an intensity which is significantly higher than that of e.g. diffuse light from surfaces which have an oblique angle relative to the optical path. In some cases the specular reflection will saturate the pixels of the image sensor used for the recording of the sub-scans.
In some embodiments, the method comprises detecting saturated pixels in the recorded sub-scans and assigning a low weight to the texture information of the saturated pixels when combining the texture information from the sub-scans, i.e. when calculating the weighted average of the texture information.
Specular reflection from a tooth surface may also be detected from a comparison between the spectrum of the light received from the tooth and that of the probe light. If these spectra are very similar, it would indicate that the tooth has a perfectly white surface, which does not occur naturally. Such texture information may thus be assigned a low weight in a weighted average of texture information.
In some embodiments determining the tooth shade value for the point comprises selecting the reference tooth shade value with known texture value closest to the texture data of the point.
When the texture data comprises color data, selecting the tooth shade value of the point can comprise calculating the color difference between the determined color data in the point and the color data of the reference tooth shade values. This difference can e.g. be calculated as a Euclidean distance in the used color space. As an example, Dozic et al. (Dozic 2007) describe that the Euclidean distance ΔE between two points (L1*, a1*, b1*) and (L2*, a2*, b2*) in the L*a*b* color space is given by:

ΔE = √((L1* − L2*)² + (a1* − a2*)² + (b1* − b2*)²)
Selecting the tooth shade value can then comprise determining for which of the reference tooth shades the color difference, i.e. the Euclidean distance, is the smallest.
In some embodiments determining the tooth shade value for the point comprises an interpolation of the two or more reference tooth shade values having known texture values close to the texture data of the point.
This interpolation provides that the tooth shade can be represented with a finer resolution than what is provided by the tooth shade standard used to describe the tooth shade. For instance, when using a Lightness-Hue-Chroma code, a tooth shade value of 1.5M2.5 can be determined for the tooth by interpolation of Lightness values of 1 and 2, and Chroma values of 2 and 3.
The tooth shade value can be displayed in a user interface, e.g. together with the digital 3D representation of the tooth. If the digital 3D representation also contains parts relating to other teeth, the tooth shade value for the tooth is preferably displayed at the tooth, such as at the point for which the tooth shade value has been determined.
The tooth shade value can also be represented as a color mapped onto the digital 3D representation.
When a dental restoration is designed based on the determined tooth shade value this can provide a visualization of how the restoration will appear together with neighboring teeth also contained in the digital 3D representation obtained by scanning the teeth.
In some embodiments, the method comprises deriving a certainty score expressing the certainty of the determined tooth shade value.
Deriving a certainty score for the determined tooth shade value provides the advantage that a measure of how accurate the determined value is can be displayed to the operator, preferably when the patient is still at the clinic such that further scanning can be performed if this is required to provide a more precise tooth shade value.
In some embodiments, the method comprises generating a visual representation of the certainty score and displaying this visual representation in a user interface.
In some embodiments, the method comprises generating a certainty score profile at least for a portion of the tooth, where the certainty score profile represents the certainty scores for tooth shade values determined for a number of points on the tooth, such as for the values in a tooth shade profile for the tooth. The certainty score profile can be mapped onto the digital 3D representation of the tooth and visualized in a user interface. When the tooth shade profile also is mapped onto the digital 3D representation of the tooth, the operator may be allowed to toggle between having the tooth shade profile and having the certainty score profile visualized on the digital 3D representation.
In some embodiments the visual representation of the certainty score is displayed together with or is mapped onto the digital 3D representation of the tooth.
In some embodiments, the method comprises comparing the derived certainty score with a range of acceptable certainty score values. This is done to verify that the certainty score is acceptable, i.e. that the determined tooth shade value is sufficiently reliable.
One boundary of the range can be defined by a threshold value. When a high certainty score indicates that the determined shade value most likely is correct, the threshold value may define the lower boundary of the range, and vice versa.
A visual representation of the certainty score or of the result of the comparison of the certainty score with the range can be generated and displayed in a user interface. Preferably, this visual representation is displayed together with the determined tooth shade value.
In some embodiments, the method comprises deciding based on the certainty score whether the determined tooth shade value or tooth shade profile is acceptable. This may be based on the comparison of the derived certainty score and the range of acceptable certainty score values, e.g. where it is decided that the determined tooth shade value is acceptable if the certainty score is within the range of acceptable values.
In some embodiments, the certainty measure relates to how uniform the sub-scan texture information is at the point.
If large variations are found in the texture information in the vicinity of the parts corresponding to the point for a substantial fraction of the sub-scans, the texture data derived therefrom may be unreliable and the tooth shade value derived for this point is accordingly not very reliable.
In some embodiments, the certainty measure relates to how close the texture data is to the known texture value of the determined tooth shade value. In particular, the certainty measure may relate to how close one parameter of the color data of the digital 3D representation is to the corresponding parameter of the known color for the determined tooth shade value. For example, the certainty measure may relate to the difference in the lightness parameter between the point of the digital 3D representation and the determined tooth shade value.
The Euclidean distance between the color data and the known color of the selected reference tooth shade value can also be used in determining the certainty measure. If the Euclidean distance is above a threshold value, the uncertainty is evaluated to be too large. The color data can here relate both to the color data of the point and to the average color data for a region surrounding the point.
In some embodiments, the certainty measure relates to the amount of texture information used to derive the texture data at the point.
When the texture data for the point is derived from a limited amount of texture information the texture data, and accordingly the tooth shade value derived therefrom, may be less reliable than the tooth shade values derived from large amounts of texture information.
In some embodiments, the visual representation of the certainty score comprises a binary code, such as red for certainty scores outside a range of acceptable certainty score values, and green for certainty scores within the range, a bar structure with a color gradient, a numerical value, and/or a comparison between the texture data and the known texture value of the determined tooth shade value.
In some embodiments, the visual representation of the certainty score comprises a certainty score indicator.
The certainty score indicator may comprise a bar structure with a color gradient going from a first color representing a low certainty score to a second color representing a high certainty score. The first color may be red and the second color green. The color gradient of the bar structure may be configured to have an intermediate color, e.g. yellow, representing the threshold value for the certainty score. The certainty score indicator may comprise a marker which is arranged relative to the color gradient of the bar structure such that it indicates the certainty score.
In some embodiments, the visual representation of the certainty score comprises a numerical value, such as a numerical value in an interval extending from a lower limit indicating a low certainty, i.e. a relatively uncertain tooth shade value, to a higher limit indicating a high certainty, i.e. a relatively certain tooth shade value.
In some embodiments, the one or more reference tooth shade values relate to shade values for natural teeth with intact surface and/or to shade values for teeth prepared for a dental restoration.
The reference tooth shade values used for determining the tooth shade can be selected based on the tooth. Intact and healthy teeth normally have tooth shades in one range of tooth shade values, whereas a tooth prepared for a dental restoration has a tooth shade in another range, which may overlap with the range for healthy teeth. It may thus be advantageous that the operator enters whether the tooth is intact or prepared for a restoration, such that the appropriate color space is used in the comparison with the texture data.
If the color data in the point on the digital 3D representation of the tooth has a poor match to all the reference tooth shade values of the selected tooth shade system/guide, the point may e.g. be on the gingiva of the patient or relate to a silver filling.
In some embodiments, the method comprises comparing the texture data with known texture values for soft oral tissue, such as gum tissue and gingiva.
This may e.g. be relevant when the certainty scores are outside said range of acceptable certainty score values for all tooth shade values of a tooth shade system, i.e. if there is a poor match between the texture data and the known texture for all the tooth shades of the reference set.
In a user interface for implementing the method, it may be suggested to the operator that the point perhaps is not on a tooth surface but on the gums or gingiva of the patient. This suggestion may be provided when the texture data has been found to give a good match with known texture values of gum/gingiva and/or when the texture data has a poor match with the known texture values of the reference tooth shade values in the tooth shade system or systems.
In some embodiments, the method comprises determining an alternative tooth shade value for the point when said certainty score is outside said range of acceptable certainty score values.
In some embodiments, the method comprises displaying the alternative tooth shade value in the user interface optionally together with the digital 3D representation of the patient's set of teeth and/or the initially determined tooth shade value.
The digital 3D representation of the tooth is generated at least partly from the geometry information of the sub-scans. In some embodiments, the texture information of the sub-scans is also taken into account when generating the digital 3D representation of the tooth.
Sub-scans comprising texture information and geometry information may be recorded for more than said tooth, such that the generated digital 3D representation may comprise shape data expressing the shape and texture data expressing the texture profile of several of the patient's teeth.
In some embodiments, the user interface is configured for deriving a certainty score expressing the certainty of the determined tooth shade value for said point.
In some embodiments, the user interface comprises a virtual tool which when activated on a point of the digital 3D representation of the tooth provides that
The user interface can then provide the operator with an opportunity to decide based on the visualized certainty score and/or the visual representations whether the determined tooth shade value or tooth shade profile is acceptable.
In some embodiments, the visual representation of the comparison of the derived certainty score with the range of acceptable certainty score values comprises a binary code, such as red for certainty scores outside a range of acceptable certainty score values, and green for certainty scores within the range. Other means for this visualization are described above.
The visualized certainty score and/or the representation(s) of the certainty score or comparison of the certainty score with the range of acceptable certainty score values may be displayed at the digital 3D representation in the user interface or in a shade value region of the user interface.
In some embodiments, the user interface is configured for determining an alternative shade value for the point and for displaying the alternative shade value when the certainty score is outside a range of acceptable certainty score values.
Disclosed is a method for designing a dental restoration for a patient, wherein the method comprises:
The digital restoration design can e.g. be for the manufacture of dental prosthetic restoration for the patient, such as a crown or a bridge restoration, where the digital restoration design expresses a desired shape and shade profile of the dental restoration. Such digital restoration designs can be in the form of a CAD model of the dental restoration.
In some embodiments, the method comprises suggesting a dental material for manufacturing the dental restoration from the digital restoration design based on the determined restoration shade.
In cases where the dental restoration is designed and manufactured for an existing tooth which has an acceptable shade, the tooth shade value or tooth shade profile can be determined for the existing tooth, and the shade of the digital restoration design can be based on the tooth shade value or tooth shade profile of the existing tooth.
This may e.g. be advantageous for the crown portions of a bridge restoration in the case where the tooth which is intended to accept the crown portion of the bridge is a healthy tooth.
In some cases the dental restoration is designed and manufactured for a tooth which either is damaged or has an undesired shade profile, such as for a broken or dead tooth. In such cases it can be advantageous to determine the tooth shade value or tooth shade profile for one or more of the neighboring teeth and to select the restoration shade of the digital restoration design from e.g. an interpolation of the tooth shade values/profiles of the neighboring teeth.
Disclosed is a method for designing a dental restoration for a first tooth, wherein the method comprises:
In some embodiments, the desired texture profile is derived by interpolation or averaging of the texture data of the digital 3D representation of the neighboring teeth.
In some embodiments, one or more of the sub-scans comprise texture information for the patient's soft tissue, and optionally geometry information for said soft tissue. The generated digital 3D representation may then comprise shape data expressing the shape of the soft tissue and texture data expressing a texture profile of the soft tissue.
From this information, an aesthetically pleasing denture can be designed where the color of the soft tissue part of the denture is selected based on the texture profile of the corresponding part of the digital 3D representation.
Knowledge of the texture of the soft tissue, such as of the color of the soft tissue, can also be used for diagnostics. When the texture data of a point on the digital 3D representation corresponding to soft tissue does not provide a sufficiently good match with a known range of texture values for soft tissue, a warning may be prompted in a user interface to alert the operator that the soft tissue is suspicious.
Disclosed is a system for determining shade of a patient's tooth, wherein the system comprises:
In some embodiments, the sub-scans are recorded using an intra-oral scanner, such as the 3Shape TRIOS intra-oral scanner.
The intra-oral scanner may be configured for utilizing focus scanning, where the sub-scans of the scanned teeth are reconstructed from in-focus images acquired at different focus depths. The focus scanning technique can be performed by generating a probe light and transmitting this probe light towards the set of teeth such that at least a part of the set of teeth is illuminated. Light returning from the set of teeth is transmitted towards a camera and imaged onto an image sensor in the camera by means of an optical system, where the image sensor/camera comprises an array of sensor elements. The position of the focus plane on/relative to the set of teeth is varied by means of focusing optics while images are obtained from/by means of said array of sensor elements. Based on the images, the in-focus position(s) of each of a plurality of the sensor elements or each of a plurality of groups of the sensor elements may be determined for a sequence of focus plane positions.
The in-focus position can e.g. be calculated by determining the maximum of a correlation measure for each of a plurality of the sensor elements or each of a plurality of groups of the sensor elements for a range of focus planes as described in WO2010145669. From the in-focus positions, sub-scans of the set of teeth can be derived with geometry information relating to the shape of the scanned surface. When e.g. the image sensor is a color sensor and the light source provides a multispectral signal a plurality of the sub-scans can include both geometry information and texture information, such as color information, for said tooth.
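A minimal sketch of such a per-pixel in-focus determination from a focus stack is given below (Python with NumPy). The gradient-based focus measure and the simple argmax peak detection are illustrative assumptions, not the correlation measure of WO2010145669.

```python
import numpy as np

def in_focus_positions(focus_stack, focus_depths):
    """Estimate the in-focus depth for each sensor element (pixel).

    focus_stack:  (K, H, W) array of images acquired at K focus plane positions.
    focus_depths: (K,) focus plane position for each image.
    Returns an (H, W) map of estimated in-focus depths.
    """
    focus_depths = np.asarray(focus_depths)
    # Illustrative focus measure: local contrast (squared image gradient).
    gy, gx = np.gradient(focus_stack.astype(float), axis=(1, 2))
    measure = gx**2 + gy**2                  # (K, H, W), peaks where in focus

    best = np.argmax(measure, axis=0)        # focus index maximizing the measure
    return focus_depths[best]                # map index to focus plane position
```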
A digital 3D representation of the set of teeth can then be generated from the recorded sub-scans by e.g. the use of an Iterative Closest Point (ICP) algorithm. Iterative Closest Point (ICP) is an algorithm employed to minimize the difference between two clouds of points. ICP can be used to reconstruct 2D or 3D surfaces from different scans or sub-scans. The algorithm is conceptually simple and is commonly used in real-time applications. It iteratively revises the transformation, i.e. translation and rotation, needed to minimize the distance between the points of two raw scans or sub-scans. The inputs are points from two raw scans or sub-scans, an initial estimation of the transformation, and criteria for stopping the iteration. The output is the refined transformation. Essentially the algorithm steps are: (1) associate points by the nearest neighbor criteria; (2) estimate the transformation parameters, i.e. rotation and translation, using a mean square cost function; (3) transform the points using the estimated parameters; and (4) iterate, i.e. re-associate the points and continue until the stopping criteria are met.
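The loop below sketches these steps (Python with NumPy). Brute-force nearest-neighbor search and SVD-based (Kabsch) transform estimation are common textbook choices, used here as assumptions rather than the scanner's actual implementation.

```python
import numpy as np

def icp(src, dst, iters=50, tol=1e-6):
    """Rigid ICP: align point cloud src (N, 3) to dst (M, 3).
    Returns rotation R (3, 3) and translation t (3,)."""
    R, t = np.eye(3), np.zeros(3)
    prev_err = np.inf
    for _ in range(iters):
        moved = src @ R.T + t
        # 1) Associate each source point with its nearest destination point.
        d = np.linalg.norm(moved[:, None, :] - dst[None, :, :], axis=2)
        nn = dst[np.argmin(d, axis=1)]
        # 2) Estimate the rigid transform minimizing mean squared distance (SVD).
        mu_s, mu_d = moved.mean(0), nn.mean(0)
        U, _, Vt = np.linalg.svd((moved - mu_s).T @ (nn - mu_d))
        R_step = Vt.T @ U.T
        if np.linalg.det(R_step) < 0:        # guard against reflections
            Vt[-1] *= -1
            R_step = Vt.T @ U.T
        t_step = mu_d - R_step @ mu_s
        # 3) Accumulate and apply the transformation.
        R, t = R_step @ R, R_step @ t + t_step
        # 4) Iterate until the alignment error stops improving.
        err = np.mean(np.min(d, axis=1))
        if abs(prev_err - err) < tol:
            break
        prev_err = err
    return R, t
```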
The generated digital 3D representation formed by such a procedure comprises shape data expressing the shape of the tooth. The texture information of the sub-scans can be used in various ways to provide that the generated digital 3D representation also comprises texture data expressing a texture profile of the tooth. For a number of the sub-scans, the part of the sub-scan relating to the same point on the tooth can be identified, e.g. during the ICP procedure. The corresponding texture information of these parts of the sub-scans can then be combined to provide the texture data for that point.
Furthermore, the invention relates to a computer program product comprising program code means for causing a data processing system to perform the method according to any of the embodiments when said program code means are executed on the data processing system, and a computer program product comprising a computer-readable medium having stored thereon the program code means.
The present invention relates to different aspects including the method and user interface described above and in the following, and corresponding methods and user interfaces, each yielding one or more of the described advantages, and each having one or more embodiments corresponding to the embodiments described above and/or disclosed in the appended claims.
The above and/or additional objects, features and advantages of the present invention, will be further elucidated by the following illustrative and non-limiting detailed description of embodiments of the present invention, with reference to the appended drawings, wherein:
In the following description, reference is made to the accompanying figures, which show by way of illustration how the invention may be practiced.
In step 102 a series of sub-scans of the patient's set of teeth is recorded, where a plurality of said sub-scans comprises both texture information and shape information for the tooth.
In step 103 a digital 3D representation of the tooth is generated from said sub-scans, where the digital 3D representation comprises texture data expressing a texture profile of the tooth. The digital 3D representation further comprises shape data expressing the shape of the tooth such that the shape of the tooth can be visualized in a user interface.
In step 104 a tooth shade value for a point on the tooth is determined based on the texture data. This is done at least in part by comparing the texture data of the corresponding point of the digital 3D representation with a known texture value of one or more reference tooth shade values. The reference tooth shade values may be provided in the form of a library file and comprise tooth shade values and corresponding texture values based on e.g. the VITA 3D-Master and/or the VITA Classic tooth shade systems.
The point or points on the tooth for which the tooth shade value(s) is/are determined can be selected by an operator. This can be the case e.g. when the digital 3D representation of the tooth is visualized in a user interface and the operator uses a pointing tool, such as a computer mouse, to indicate where on the digital 3D representation of the tooth he wishes to determine the tooth shade value. The point or points can also be selected by a computer implemented algorithm based on predetermined positions on the digital 3D representation of the tooth, such as a point arranged at a certain distance from the incisal edge of the tooth.
The screen shot 210 seen in
The screen shot 310 seen in
The second region 314 is located at the patient's soft tissue. An anatomically correct tooth shade value can hence not be calculated from the texture data of that part of the digital 3D representation of the patient's teeth, and the corresponding certainty score is accordingly very low, as seen in the vertical bars of tooth value section 319.
In step 531 a digital restoration design is created, e.g. based on the shape data of a digital 3D representation of the patient's set of teeth and/or on a template digital restoration design loaded from a library. Template digital restoration designs may e.g. be used when the tooth is broken.
In step 532 the tooth shade values of different points or regions of the teeth are derived from the texture data of the digital 3D representation of the patient's set of teeth. From the derived tooth shade values, or from tooth shade profiles created based on the derived tooth shade values, a desired shade profile for the dental restoration can be determined. This can be based on e.g. feature extraction, where shade values are extracted from the other teeth by e.g. identifying shade zones on these teeth and copying these zones to the dental restoration. It can also be based on established shade rules for teeth, e.g. a rule describing a relation between the tooth shade values or profiles of the canines and the anterior teeth.
In step 533 the desired tooth shade value(s) for the dental restoration is merged into the digital restoration design.
When the dental restoration is to be milled from a multicolored milling block, it is important that the dental restoration is milled from the correct parts of the milling block. In step 534 a CAD model of the milling block is provided, where the CAD model comprises information of the shade profile of the milling block material. The optimal position of the digital restoration design relative to the CAD model of the milling block is then determined in step 535, where different criteria can be applied to provide the best fit between the desired shade profile and what actually can be obtained as dictated by the shade profile of the milling block.
In step 536 the dental restoration is manufactured from the milling block by removing milling block material until the dental restoration is shaped according to the digital restoration design.
Many scanning devices have Bayer color filters with Red, Green and Blue filters and hence record color information in the RGB color space. For instance, a focus scanner can record a series of 2D color images for the generation of sub-scans, where the color information is provided in the RGB color space. The processor 644 then comprises algorithms for transforming the recorded color data into e.g. the L*a*b* or L*C*h* color spaces.
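For illustration, a conversion of a single pixel from sRGB to L*a*b* could look as follows (Python). The sRGB primaries and the D65 white point are assumptions, as the document does not state which RGB space the sensor data is referenced to.

```python
def rgb_to_lab(r, g, b):
    """Convert one sRGB pixel (components in 0..1) to CIE L*a*b* (D65 white)."""
    def linearize(c):                     # undo the sRGB gamma curve
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    r, g, b = linearize(r), linearize(g), linearize(b)
    # Linear sRGB to CIE XYZ (D65), scaled so that Y of white = 100.
    x = (0.4124 * r + 0.3576 * g + 0.1805 * b) * 100.0
    y = (0.2126 * r + 0.7152 * g + 0.0722 * b) * 100.0
    z = (0.0193 * r + 0.1192 * g + 0.9505 * b) * 100.0

    def f(t):                             # CIE L*a*b* nonlinearity
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29

    fx, fy, fz = f(x / 95.047), f(y / 100.0), f(z / 108.883)
    return 116.0 * fy - 16.0, 500.0 * (fx - fy), 200.0 * (fy - fz)
```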
The system may further comprise a unit 648 for transmitting a digital restoration design and a CAD model of a milling block to e.g. a computer aided manufacturing (CAM) device 649 for manufacturing a shaded dental restoration or to another computer system e.g. located at a milling center where the dental restoration is manufactured. The unit for transmitting the digital restoration design can be a wired or a wireless connection.
The scanning of the patient's set of teeth using the scanning device 641 can be performed at a dentist while deriving the tooth shade values can be performed at a dental laboratory. In such cases the digital 3D representation of the patient's set of teeth can be provided via an internet connection between the dentist and the dental laboratory.
Different scanner configurations can be used to acquire sub-scans comprising both shape and texture information. In some scanner designs the scanner is mounted on axes with encoders which provides that the sub-scans acquired from different orientations can be combined using position and orientation readings from the encoders. When the scanner operates by the focus-scanning technique the individual sub-scans of the tooth are derived from a sequence of 2D images obtained while scanning a focus plane over a portion of the tooth. The focus scanning technique is described in detail in WO2010145669. The shape information of the sub-scans for an object, such as a tooth, can be combined by algorithms for stitching and registration as widely known in the literature. Texture data relating to the tooth color can be obtained using a scanner having a multi-chromatic light source, e.g. a white light source and a color image sensor. Color information from multiple sub-scans can be interpolated and averaged by methods such as texture weaving, or by simply averaging corresponding color components of the sub-scans corresponding to the same point/location on the surface. Texture weaving is described by e.g. Callieri M, Cignoni P, Scopigno R. “Reconstructing textured meshes from multiple range rgb maps”. VMV 2002, Erlangen, Nov. 20-22, 2002.
A digital 3D representation of the tooth can be generated by combining sub-scans acquired from different orientations relative to the teeth, e.g. by sub-scan registration. Sub-scans acquired from three such different orientations are illustrated in the accompanying figure.
One way of doing this is to calculate the average value for each of the parameters used to describe the texture. For example, when the L*a*b* color system is used to describe the color information provided in each sub-scan, the color data of the digital 3D representation can be derived by averaging over each of the L*, a*, and b* parameters of the sub-scans. For example, the L* parameter of the color data for a given point P is then given by L*(P) = (1/N)·Σi=1..N Li*(P), where N is the number of sub-scans used in deriving the texture data and Li*(P) is the L* parameter of the i'th sub-scan for the segment relating to P. Equivalent expressions are true for the a* and b* parameters for point P. The color parameters for each point on the digital 3D representation of the tooth can be determined for sections of or the entire surface of the tooth, such that the generated digital 3D representation comprises both shape and texture information about the tooth. The spatial resolution of the color data does not necessarily have to be identical to the resolution of the shape data of the digital 3D representation. The point P can be described e.g. in Cartesian, cylindrical or polar coordinates.
When the color data is derived for a point on the tooth, the tooth shade value for that point can be determined by comparing the derived color data with the known color data of the reference tooth shade values of a tooth shade guide such as the VITA 3D-Master.
In order to obtain more precise color data, the averaging of the color information described above can be replaced by a weighted averaging, where the weight assigned to the color information of each sub-scan depends e.g. on the orientation of the tooth surface relative to the optical path when that sub-scan was recorded.
This can be expressed by a modification of the equation given above. For a weighted averaging of the color information, the L* parameter of the color data for a given point P is given by L*(P) = (Σi=1..N αi(P)·Li*(P)) / (Σi=1..N αi(P)), where αi(P) is the weight factor for the color information of the i'th sub-scan in the segment at P. When a given sub-scan is recorded at an angle relative to the tooth surface which causes the optical path to be e.g. perpendicular to the tooth surface at P, the corresponding weight factor αi(P) is given a lower value than the weight factors of sub-scans acquired with an oblique angle between the optical path and the tooth surface.
Equivalent equations are true for the a* and b* parameters of the color data for point P.
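The following sketch combines this weighted averaging with the angle- and saturation-based down-weighting discussed earlier (Python with NumPy). The sin(2θ) weighting curve and the near-zero weight for saturated pixels are illustrative assumptions.

```python
import numpy as np

def combine_subscan_colors(lab_samples, view_angles, saturated):
    """Weighted average of the color information for one point P on the tooth.

    lab_samples: (N, 3) L*a*b* values of the N sub-scans at P.
    view_angles: (N,) angle in radians between the optical path and the
                 surface normal at P for each sub-scan.
    saturated:   (N,) True where the sensor pixels were saturated.
    """
    # Weight peaks at oblique viewing (~45 deg) and is low when the optical
    # path is perpendicular to the surface (specular risk) or almost
    # parallel to it (weak signal).
    w = np.clip(np.sin(2.0 * view_angles), 0.0, None)
    w[saturated] *= 0.01                  # near-zero weight for saturated pixels
    if w.sum() == 0:                      # fall back to a plain average
        w = np.ones(len(lab_samples))
    return (w[:, None] * lab_samples).sum(axis=0) / w.sum()
```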
For a given point P on the digital 3D representation, the color data (Lp*,ap*,bp*) has been determined, e.g. by combining the color information of a series of sub-scans used for generating the digital 3D representation. If the color information originally is recorded using the RGB color space it is transformed into the L*a*b* color space using algorithms known to the skilled person.
In one example, the reference shade values of the VITA Classic shade guide are used: B1, A1, B2, D2, A2, C1, C2, D4, A3, D3, B3, A3.5, B4, C3, A4, and C4. The color data of these reference shades can be provided by scanning the corresponding pre-manufactured teeth of the shade guide. These color data are then also initially obtained in the RGB color space and can be converted to the L*a*b* color space using the same algorithms applied to the color information/data for the point P.
The tooth shade value for the point is determined as the reference tooth shade value which has the smallest Euclidean distance to the point in the L*a*b* color space. The Euclidean distance ΔEP-Ri from the color (LP*, aP*, bP*) to the known colors of the reference tooth shade values is calculated using the expression:

ΔEP-Ri = √((LP* − LRi*)² + (aP* − aRi*)² + (bP* − bRi*)²)

where Ri refers to the i'th reference tooth shade.
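A minimal sketch of this nearest-shade lookup (Python with NumPy). The numeric L*a*b* values below are placeholders, since the real reference colors come from scanning the shade guide; the returned distance can serve as a basis for the certainty score discussed below.

```python
import numpy as np

# Placeholder reference colors; in practice these are measured by scanning
# the pre-manufactured shade guide teeth and converting to L*a*b*.
REFERENCE_SHADES = {
    "B1": (78.0, 1.0, 14.0),
    "A1": (76.0, 1.5, 16.0),
    "A2": (73.0, 2.5, 19.0),
    "C4": (62.0, 2.0, 21.0),
}

def nearest_shade(lab_point):
    """Return the reference shade with the smallest Euclidean distance ΔE
    to the measured color, together with that distance (a larger ΔE means
    a less certain match)."""
    names = list(REFERENCE_SHADES)
    refs = np.array([REFERENCE_SHADES[n] for n in names])
    dists = np.linalg.norm(refs - np.asarray(lab_point), axis=1)
    i = int(np.argmin(dists))
    return names[i], float(dists[i])
```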
The certainty score for the tooth shade value determined for point P depends on how close the color data of the point P is to the known color value of the selected reference tooth shade value. This can be quantified by the Euclidean distance, and since point P is not particularly close to R2 in the illustrated example, the resulting certainty score is correspondingly reduced.
An alternative approach to using the Euclidean distance is to determine the individual parameters of the tooth shade value one at a time. This approach can be used e.g. when the reference tooth shade values are those of the Vita 3D-master system.
The reference tooth shade values of the Vita 3D-master shade guide are expressed in codes consisting of the three parameters Lightness-hue-Chroma, where Lightness is given in values between 1 and 5, the Chroma in values between 1 and 3, and the hue as one of “L”, “M”, or “R”. A shade code in the Vita 3D-master can e.g. be 2M1, where the Lightness parameter equals 2, the Chroma 1 and the hue “M”.
The known color data of the VITA 3D-master shade guide reference shades can be provided by scanning the pre-manufactured teeth of the shade guide. These color data are then also initially obtained in the RGB color space and can be converted to the L*a*b color space using the same algorithms applied to the color information/data for the point P. The known color data of each reference shade guide (having a code expressed in terms of Lightness, hue and Chroma) is then provided in terms of the L*a*b color space.
Since the lightness L* has the largest impact on the human perception of the tooth color, the value of the Lightness parameter LP* in the point is determined first. The value of LP* is compared with the values of the L* parameters for the reference tooth shades. If LP* is close to the L*-value for the i'th reference tooth shade value, LRi*, the L* parameter for point P may be set equal to LRi*.
In some cases the Lightness parameter is not close to any of the references but instead is located almost in the middle between two L*-values, for example when LP* in the point is between the values of LRi* = 2 and LRi+1* = 3 with almost equal distance to each of these. In such a case the Lightness parameter for the point can be determined by interpolation, e.g. as 2.5.
The same procedure is then performed, first for the Chroma parameter and finally for the hue, such that the three parameters of the tooth shade value are determined.
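A sketch of this parameter-wise construction of the shade code (Python). Snapping each parameter to the nearest half step is an assumption chosen to reproduce interpolated codes such as 1.5M2.5 mentioned earlier.

```python
def nearest_half(value, lo, hi):
    """Snap a measured parameter to the nearest 0.5 step within [lo, hi],
    so a value midway between two references becomes e.g. 2.5."""
    snapped = round(value * 2.0) / 2.0
    return min(max(snapped, lo), hi)

def shade_code(lightness, chroma, hue):
    """Build a Vita 3D-Master-style code, e.g. (2.5, 1.5, 'M') -> '2.5M1.5'.
    Lightness is determined first, then Chroma, then hue, mirroring the
    order described in the text."""
    L = nearest_half(lightness, 1.0, 5.0)   # Lightness codes run from 1 to 5
    C = nearest_half(chroma, 1.0, 3.0)      # Chroma codes run from 1 to 3
    assert hue in ("L", "M", "R")           # the three hue groups of the guide
    fmt = lambda x: str(int(x)) if x == int(x) else str(x)
    return f"{fmt(L)}{hue}{fmt(C)}"
```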
Although some embodiments have been described and shown in detail, the invention is not restricted to them, but may also be embodied in other ways within the scope of the subject matter defined in the following claims. In particular, it is to be understood that other embodiments may be utilized and structural and functional modifications may be made without departing from the scope of the present invention.
In device claims enumerating several means, several of these means can be embodied by one and the same item of hardware. The mere fact that certain measures are recited in mutually different dependent claims or described in different embodiments does not indicate that a combination of these measures cannot be used to advantage.
A claim may refer to any of the preceding claims, and “any” is understood to mean “any one or more” of the preceding claims.
It should be emphasized that the term “comprises/comprising” when used in this specification is taken to specify the presence of stated features, integers, steps or components but does not preclude the presence or addition of one or more other features, integers, steps, components or groups thereof.
The features of the method described above and in the following may be implemented in software and carried out on a data processing system or other processing means caused by the execution of computer-executable instructions. The instructions may be program code means loaded in a memory, such as a RAM, from a storage medium or from another computer via a computer network. Alternatively, the described features may be implemented by hardwired circuitry instead of software or in combination with software.
The present application is a continuation of U.S. application Ser. No. 16/946,186, filed on Jun. 9, 2020, which is a continuation of U.S. application Ser. No. 15/888,764, filed on Feb. 5, 2018, now U.S. Pat. No. 10,695,151, which is a continuation of U.S. application Ser. No. 15/117,078, filed on Aug. 5, 2016, now U.S. Pat. No. 10,010,387, which is a U.S. national stage of International Application No. PCT/EP2015/052537, filed on Feb. 6, 2015, which claims the benefit of Danish Application No. PA 2014-70066, filed on Feb. 7, 2014. The entire contents of each of U.S. application Ser. No. 16/946,186, U.S. application Ser. No. 15/888,764, U.S. application Ser. No. 15/117,078, International Application No. PCT/EP2015/052537, and Danish Application No. PA 2014-70066 are hereby incorporated herein by reference in their entirety.
20080070684 | Haigh-Hutchinson | Mar 2008 | A1 |
20080071143 | Gattani et al. | Mar 2008 | A1 |
20080118886 | Liang et al. | May 2008 | A1 |
20080131028 | Pillman et al. | Jun 2008 | A1 |
20080132886 | Cohen et al. | Jun 2008 | A1 |
20080194928 | Bandic et al. | Aug 2008 | A1 |
20080194950 | Mejia et al. | Aug 2008 | A1 |
20080316898 | Itoh et al. | Dec 2008 | A1 |
20090040175 | Xu et al. | Feb 2009 | A1 |
20090061381 | Durbin et al. | Mar 2009 | A1 |
20090076321 | Suyama et al. | Mar 2009 | A1 |
20090087050 | Gandyra | Apr 2009 | A1 |
20090097108 | Fox et al. | Apr 2009 | A1 |
20090103103 | Berner | Apr 2009 | A1 |
20090133260 | Durbin | May 2009 | A1 |
20090160858 | Chen et al. | Jun 2009 | A1 |
20090167948 | Berman et al. | Jul 2009 | A1 |
20090177050 | Griffiths et al. | Jul 2009 | A1 |
20090217207 | Kagermeier et al. | Aug 2009 | A1 |
20090231649 | Sirat | Sep 2009 | A1 |
20090233253 | Mrazek | Sep 2009 | A1 |
20090279103 | Thiel et al. | Nov 2009 | A1 |
20090291417 | Rubbert et al. | Nov 2009 | A1 |
20090298017 | Boerjes et al. | Dec 2009 | A1 |
20090322676 | Kerr et al. | Dec 2009 | A1 |
20100009308 | Wen et al. | Jan 2010 | A1 |
20100079581 | Russell et al. | Apr 2010 | A1 |
20100085636 | Berner | Apr 2010 | A1 |
20100108873 | Schwertner | May 2010 | A1 |
20100156901 | Park et al. | Jun 2010 | A1 |
20100157086 | Segale et al. | Jun 2010 | A1 |
20100201986 | Inglese et al. | Aug 2010 | A1 |
20100231509 | Boillot et al. | Sep 2010 | A1 |
20100239136 | Gandyra et al. | Sep 2010 | A1 |
20100268069 | Liang | Oct 2010 | A1 |
20110125304 | Schneider et al. | May 2011 | A1 |
20110188726 | Nathaniel et al. | Aug 2011 | A1 |
20110200249 | Minear et al. | Aug 2011 | A1 |
20110310449 | Kim et al. | Dec 2011 | A1 |
20110316978 | Dillon et al. | Dec 2011 | A1 |
20120015316 | Sachdeva et al. | Jan 2012 | A1 |
20120062557 | Dillon et al. | Mar 2012 | A1 |
20120141949 | Bodony et al. | Jun 2012 | A1 |
20120179035 | Boudier | Jul 2012 | A1 |
20120195471 | Newcombe et al. | Aug 2012 | A1 |
20130034823 | Liang et al. | Feb 2013 | A1 |
20130110469 | Kopelman | May 2013 | A1 |
20130158694 | Rubbert et al. | Jun 2013 | A1 |
20130218530 | Deichmann et al. | Aug 2013 | A1 |
20130218531 | Deichmann et al. | Aug 2013 | A1 |
20130244197 | Tjioe et al. | Sep 2013 | A1 |
20130260340 | Stegall | Oct 2013 | A1 |
20130335417 | McQueston et al. | Dec 2013 | A1 |
20140022352 | Fisker et al. | Jan 2014 | A1 |
20140022356 | Fisker et al. | Jan 2014 | A1 |
20140146142 | Duret et al. | May 2014 | A1 |
20140255878 | Jesenko et al. | Sep 2014 | A1 |
20140377718 | Korten et al. | Dec 2014 | A1 |
20150054922 | Fisker et al. | Feb 2015 | A1 |
20160022389 | Esbech et al. | Jan 2016 | A1 |
20160067018 | Korten et al. | Mar 2016 | A1 |
20180153664 | Esbech et al. | Jun 2018 | A1 |
20180255293 | Fisker et al. | Sep 2018 | A1 |
20190124323 | Fisker et al. | Apr 2019 | A1 |
20190200006 | Fisker et al. | Jun 2019 | A1 |
20190289283 | Fisker et al. | Sep 2019 | A1 |
20200169722 | Fisker et al. | May 2020 | A1 |
20200352688 | Esbech et al. | Nov 2020 | A1 |
20210211638 | Fisker et al. | Jul 2021 | A1 |
20210306617 | Fisker et al. | Sep 2021 | A1 |
20220086418 | Fisker et al. | Mar 2022 | A1 |
Number | Date | Country |
---|---|---|
1067573 | Jan 1993 | CN |
1906678 | Jan 2007 | CN |
1934481 | Mar 2007 | CN |
101426085 | May 2009 | CN |
101513350 | Aug 2009 | CN |
19524855 | Jan 1997 | DE |
19642247 | Jan 1998 | DE |
10321863 | Dec 2004 | DE |
10321883 | Dec 2004 | DE |
102007005726 | Aug 2008 | DE |
102009023952 | Dec 2010 | DE |
0837659 | Apr 1998 | EP |
2200332 | Jun 2010 | EP |
2325771 | May 2011 | EP |
2620733 | Jul 2013 | EP |
2664272 | Nov 2013 | EP |
2799032 | Nov 2014 | EP |
62-100716 | May 1987 | JP |
06-505096 | Jun 1994 | JP |
06-201337 | Jul 1994 | JP |
3321866 | Sep 2002 | JP |
2004-029685 | Jan 2004 | JP |
2005-098833 | Apr 2005 | JP |
2007-072103 | Mar 2007 | JP |
2008-194108 | Aug 2008 | JP |
2009-098146 | May 2009 | JP |
2009-238245 | Oct 2009 | JP |
8807695 | Oct 1988 | WO |
9214118 | Aug 1992 | WO |
9215034 | Sep 1992 | WO |
9702788 | Jan 1997 | WO |
9714932 | Apr 1997 | WO |
9845745 | Oct 1998 | WO |
9947964 | Sep 1999 | WO |
0008415 | Feb 2000 | WO |
0111193 | Feb 2001 | WO |
0184479 | Nov 2001 | WO |
0276327 | Oct 2002 | WO |
0360587 | Jul 2003 | WO |
0373457 | Sep 2003 | WO |
2004066615 | Aug 2004 | WO |
2005067389 | Jul 2005 | WO |
2006065955 | Jun 2006 | WO |
2007084727 | Jul 2007 | WO |
2008125605 | Oct 2008 | WO |
2009026645 | Mar 2009 | WO |
2009034157 | Mar 2009 | WO |
2009063088 | May 2009 | WO |
2009089126 | Jul 2009 | WO |
2010064156 | Jun 2010 | WO |
2010106379 | Sep 2010 | WO |
2010145669 | Dec 2010 | WO |
2011011193 | Jan 2011 | WO |
2011047731 | Apr 2011 | WO |
2011120526 | Oct 2011 | WO |
2012000511 | Jan 2012 | WO |
2012007003 | Jan 2012 | WO |
2012076013 | Jun 2012 | WO |
2012083960 | Jun 2012 | WO |
2012115862 | Aug 2012 | WO |
2013010910 | Jan 2013 | WO |
2013122662 | Aug 2013 | WO |
2014125037 | Aug 2014 | WO |
Entry |
---|
Complaint, 3Shape A/S v. Carestream Dental, LLC, Civil Action No. 6:21-cv-1110, 59 pages. |
Declaration of Dr. Chandrajit L. Bajaj, (IPR2018-00197, Ex. 1003), Jul. 25, 2018, 142 pages. |
Exhibit 6—Latest Optical 3D Measurement, Nov. 20, 2006, pp. 1-16. |
U.S. Pat. No. 9,962,244, “Declaration of Dr. Chandrajit L. Bajaj, Ph.D. in Support of Post-Grant Review”, Align Technology, Inc. Petitioner v. 3Shape A/S Patent Owner, Case No. PGR2018-00103, Oct. 30, 2018, 318 pages. |
U.S. Pat. No. 9,962,244, “Corrected Petition for Post-Grant Review”, Align Technology, Inc. Petitioner, 3Shape A/S Patent Owner, Case No. PGR2018-00103, Oct. 30, 2018, 119 pages. |
Record of Oral Hearing in IPR2018-00197, U.S. Pat. No. 9,329,675, Feb. 4, 2019, 67 pages. |
Remondino, et al., “Image-Based 3D Modelling: A Review”, The Photogrammetric Record, vol. 21, No. 115, Sep. 2006, pp. 269-291. |
Reply Declaration of Lambertus Hesselink, Exhibit 1057. |
Report and Recommendation, 3Shape A/S v. Align Technology, Inc., Case No. 1:18-886-LPS, May 6, 2020, 24 pages. |
Richard J. Cherry “New Techniques of Optical Microscopy and Microspectroscopy”, The Macmillan Press Ltd., 1991 (3 pages). |
Sato Yoichi, “Object Shape and Reflectance Modeling from Color Image Sequence”, The Robotics Institute: Carnegie Mellon University, Jan. 1997, 158 pages. |
Savarese et al., “3D Reconstruction by Shadow Carving: Theory and Practical Evaluation”, International Journal of Computer Vision, vol. 71, No. 3, Mar. 2007, pp. 1-48. |
Schendel et al., “3D Orthognathic Surgery Simulation Using Image Fusion”, Seminars in Orthodontics, vol. 15, No. 1, Mar. 2009, pp. 48-56 (11 pages). |
Second Office Action dated Nov. 18, 2015, issued in the corresponding Chinese Patent Application No. 201180066956.6, 27 pages including 16 pages of English Translation. |
Sinescu et al., “Laser Beam Used in Dental Scanning for CAD/CAM Technologies”, TMJ, vol. 57, No. 2-3, 2007, pp. 187-191 (6 pages). |
Slabaugh, “Novel Volumetric Scene Reconstruction Methods for New View Synthesis”, PhD Thesis in Electrical and Computer Engineering at Georgia Institute of Technology, Nov. 2002, 209 pages. |
Slabaugh, G.G., et al., “Methods for Volumetric Reconstruction of Visual Scenes”, International Journal of Computer Vision, vol. 57, 2004, pp. 179-199. |
Smith Warren J., “Modern Optical Engineering: The Design of Optical Systems”, Third Edition, Exhibit 1065, 2000, 105 pages. |
Smith, “Digital Signal Processing: A Practical Guide for Engineers and Scientists,” Demystifying Technology Series, pp. 138, 262, 307-308 (1998). |
Spencer et al., “General Ray-Tracing Procedure”, Journal of the Optical Society of America, vol. 52, No. 6, Jun. 1962, pp. 672-678. |
Steele et al., “Bodies in Motion: Monitoring Daily Activity and Exercise with Motion Sensors in People with Chronic Pulmonary Disease”, Journal of Rehabilitation Research & Development, vol. 40, No. 5, Suppl. 2, Oct. 2003, pp. 45-58. |
Steinbach, et al., “3-D Object Reconstruction Using Spatially Extended Voxels and Multi-Hypothesis Voxel Coloring”, In Proceedings 15th International Conference on Pattern Recognition, ICPR, vol. 1, IEEE, 2000, pp. 774-777. |
Tang, et al., “Automatic Reconstruction of As-Built Building Information Models from Laser-Scanned Point Clouds: A Review of Related Techniques”, Automation in Construction, vol. 19, No. 7, Nov. 1, 2010, pp. 829-843. |
Taxonomies of Input in Developing a Taxonomy of Input, (IPR2018-00197, Ex. 2010) Available at https://www.billbuxton.com/inputo4.Taxonomies.pdf., Jan. 4, 2009, 16 pages. |
Tiziani et al., “Theoretical Analysis of Confocal Microscopy with Microlenses”, Applied Optics vol. 35, Issue 1, Jan. 1, 1996, pp. 120-125 (7 pages). |
Transcript of Alexander V. Sergienko, Ph.D., Align Technology, Inc. v. 3Shape A/S et al., Exhibit 1056, Jul. 16, 2021, 212 pages. |
Transcript of Apr. 21, 2020 Video Claim Construction Hearing, 3Shape A/S v. Align Technology, Inc., Case No. 1:18-886-LPS, Apr. 21, 2020, 137 pages. |
Tsukizawa, et al., “3D Digitization of a Hand-held Object with a Wearable Vision Sensor”, Published in International Workshop on Computer Vision in Human-Computer Interaction, CVHCI 2004: Computer Vision in Human-Computer Interaction, 2004, pp. 129-141. |
Turner Daniel, “Hack: The Nintendo Wii”, MIT Technology Review, Jul. 1, 2007, 3 pages. |
U.S. Appl. No. 10/744,869, (IPR2018-00197, Ex. 2005), 69 pages. |
U.S. Pat. No. 9,329,675, “Declaration of Dr. Chandrajit L. Bajaj, Ph.D. in Support of Inter Partes Review”, Align Technology, Inc. Petitioner v. 3Shape A/S Patent Owner, Case IPR2018-00198, Ex. 1003, 123 pages. |
U.S. Pat. No. 9,329,675, “PTAB Trial Certificate Inter Partes Review Certificate”, IPR Trial No. IPR2018-00197, Oct. 25, 2019, 2 pages. |
U.S. Pat. No. RE48,221, “Petition (1 of 2) for Inter Partes Review”, Align Technology, Inc., Petitioner, 3Shape A/S, Patent Owner, Case No. IPR2022-00144, 99 pages. |
U.S. Pat. No. RE48,221, “Petition (2 of 2) for Inter Partes Review”, Align Technology, Inc., Petitioner, 3Shape A/S, Patent Owner, Case No. IPR2022-00145, 97 pages. |
Vedula, et al., “Shape and Motion Carving in 6D”, Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2000, 7 pages. |
Vivid 910: Non-Contact 3-D Digitizer, www.minolta.com. (3 pages). |
Vogt et al., “An AR System With Intuitive User Interface for Manipulation and Visualization of 3D Medical Data”, Studies in Health Technology and Informatics, vol. 98, 2004, pp. 397-403. |
Welch et al., “Motion Tracking: No Silver Bullet, but a Respectable Arsenal”, IEEE Computer Graphics and Applications, vol. 22, No. 6, Dec. 10, 2002, pp. 24-38. |
Welch et al., “High-Performance Wide-Area Optical Tracking The HiBall Tracking System”, Presence: Teleoperators and Virtual Environments, vol. 10, No. 1, Feb. 2001, pp. 1-22. |
Westphal et al., “Correction of Geometric and Refractive Image Distortions in Optical Coherence Tomography Applying Fermat's Principle”, Optics Express, vol. 10, No. 9, May 6, 2002, pp. 397-404. |
Wilson et al., “Confocal Microscopy by Aperture Correlation”, Optics Letters vol. 21, Issue 23, 1996, pp. 1879-1881 (4 pages). |
Wilson et al., “Dynamic Lens Compensation for Active Color Imaging and Constant Magnification Focusing”, The Robotics Institute, Carnegie Mellon University, Pittsburgh, Pennsylvania 15213, Exhibit 2027, Nov. 1991, 52 pages. |
Wilson et al., “Real-Time Three-Dimensional Imaging of Macroscopic Structures”, Journal of Microscopy, vol. 191, No. 2, Aug. 1998, pp. 116-118. |
Xia et al., “Three-Dimensional Virtual-Reality Surgical Planning and Soft-Tissue Prediction for Orthognathic Surgery”, IEEE Transactions on Information Technology in Biomedicine, vol. 5, No. 2, Jun. 2001, pp. 97-107. |
Xiao, et al., “Efficient Partial-Surface Registration for 3D Objects”, Computer Vision and Image Understanding, vol. 98, No. 2, 2005, pp. 271-294. |
Yamany et al., “Free-Form Surface Registration Using Surface Signatures”, The Proceedings of the Seventh IEEE International Conference on Computer Vision, vol. 2, 1999, 7 pages. |
Yang, et al., “Dealing with Textureless Regions and Specular Highlights—A Progressive Space Carving Scheme Using a Novel Photo-Consistency Measure”, Proceedings of the Ninth IEEE International Conference on Computer Vision (ICCV 2003) 2—Volume Set, 2003, 9 pages. |
Yoshida et al., “Intraoral Ultrasonic Scanning as a Diagnostic Aid”, Journal of Cranio-Maxillofacial Surgery, vol. 15, 1987, pp. 306-311. |
Yoshizawa Toru, “Handbook of Optical Metrology: Principles and Applications”, Second Edition, Feb. 25, 2009, 15 pages. |
Yuan et al., “Inferring 3D Volumetric Shape of Both Moving Objects and Static Background Observed by a Moving Camera”, IEEE Conference on Computer Vision and Pattern Recognition, 2007, 8 pages. |
Zhang et al., “A 3-dimensional Vision System for Dental Applications”, Proceedings of the 29th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Aug. 23-26, 2007, pp. 3369-3372. |
Jahne, et al., “Handbook of Computer Vision and Applications”, Systems and Applications, vol. 3, Academic Press, 1999, 955 pages. |
Jerald Jason, “The VR Book: Human-Centered Design for Virtual Reality”, (IPR2018-00197 Ex-2014), 2016, 4 pages. |
Jethwa Manish, “Efficient Volumetric Reconstruction from Multiple Calibrated Cameras”, PhD Thesis in Electrical Engineering and Computer Science at MIT, Sep. 2004, 143 pages. |
Karatas et al., “Three-Dimensional Imaging Techniques: A Literature Review”, European Journal of Dentistry, vol. 8 Issue 1, Jan.-Mar. 2014, pp. 132-140. |
Kaufmann Hannes, “Applications of Mixed Reality”, Thesis, Vienna University of Technology, May 27, 2009, 95 pages. |
Li, et al., “Empty Space Skipping and Occlusion Clipping for Texture-based Volume Rendering”, In IEEE Visualization (VIS'03), 2003, pp. 317-324. |
Litomisky et al., “Removing Moving Objects from Point Cloud Scenes”, Advances in Depth Image Analysis and Applications, Jan. 2013, pp. 1-10. |
Litomisky, et al., “Removing moving objects from point cloud scenes”, International Workshop on Depth Image Analysis and Applications, Springer, Berlin, Heidelberg, 2012, version listed at pdf.edu, pp. 1-10. |
Liu, et al., “A Complete Statistical Inverse Ray Tracing Approach to Multi-View Stereo”, In CVPR, IEEE, 2011, pp. 913-920. |
Logozzo et al., “Recent Advances in Dental Optics—Part I: 3D Intraoral Scanners for Restorative Dentistry”, Optics and Lasers in Engineering, vol. 54, Mar. 2014, pp. 203-221 (1-19). |
Lovi David, “Incremental Free-Space Carving for Real-Time 3D Reconstruction”, Master of Science Thesis in Computer Science at University of Alberta, 2011, 74 pages. |
MacKinlay et al., “A Semantic Analysis of the Design Space of Input Devices”, Human Computer Interaction, vol. 5, 1990, pp. 145-190. |
Memorandum Order, 3Shape A/S v. Align Technology, Inc., C.A. No. 18-886-LPS, Exhibit 2022, Dec. 28, 2020, 3 pages. |
Michael P. Keating “Geometric, Physical, and Visual Optics”, Butterworth Publishers, 1988 (3 pages). |
Montes et al., “An Overview of BRDF Models”, University of Granada, 2012, pp. 1-26. |
Moran et al., “A Comparison of the Imaging Performance of High Resolution Ultrasound Scanners for Preclinical Imaging”, Ultrasound in Medicine & Biology, vol. 37, No. 3, Mar. 2011, pp. 493-501. |
Myers Brad A., “Graphical User Interface Programming”, CRC Handbook of Computer Science and Engineering, 2d. Ed., Allen B. Tucker, Jan. 27, 2003, 30 pages. |
Watanabe et al., “Telecentric Optics for Constant-Magnification Imaging”, Department of Computer Science, Columbia University, Sep. 1995, 22 pages. |
Nasiri Steven, “A Critical Review of MEMS Gyroscopes Technology and Commercialization Status”, InvenSense, 2005, 8 pages. |
Nitschke et al., “Real-Time Space Carving Using Graphics Hardware”, IEICE Transactions on Information and Systems, Aug. 2007, pp. 1175-1184 (11 pages). |
Noguchi et al., “Microscopic Shape from Focus Using a Projected Illumination Pattern”, Mathematical and Computer Modelling, vol. 24, No. 5/6, Sep. 1996, pp. 31-48. |
Notice of Opposition issued in corresponding European Patent No. 2 442 720, dated Aug. 24, 2016 (5 pages). |
Notice of Opposition issued in corresponding European Patent No. 2 442 720, dated Aug. 24, 2016 (6 pages). |
Notice of Opposition issued in corresponding European Patent No. 2 442 720, dated Jan. 16, 2019 (15 pages). |
Notice of Opposition issued in corresponding European Patent No. 2 442 720, dated May 22, 2017 (41 pages). |
Notice of Opposition issued in corresponding European Patent No. 2 442 720, dated May 24, 2017 (23 pages). |
Notification of Information Statement dated Aug. 2, 2016, by the Japanese Patent Office in corresponding Japanese Patent Application No. 2014-234653 and English translation. (2 pages). |
Notification of third party observations concerning JP 2014-234653 mailed Oct. 27, 2015, and translation of notification (31 pages). |
Ogami Moria, “Exhibit 2—3D Imagery Handbook”, First Edition, Feb. 20, 2006, pp. 1-4. |
Ojelund et al., “Inter Partes Review Certificate”, U.S. Pat. No. 9,329,675 K1, 2 pages. |
Ojelund Provisional, U.S. Appl. No. 61/420,138, filed Dec. 6, 2010, 45 pages. |
Order, Lipocine Inc. v. Clarus Therapeutics, Inc., C.A. No. 19-622 (WCB), Exhibit 1052, Nov. 12, 2020, 2 pages. |
Paris, et al., “A surface reconstruction method using global graph cut optimization”, International Journal of Computer Vision, vol. 66, No. 2, HAL id: inria-00510219, 2010, pp. 141-161. |
Patent Owner's Preliminary Response to the Petition for Inter Partes Review in IPR2018-00198, U.S. Pat. No. 9,329,675, Mar. 3, 2018, 66 pages. |
Patent Owner's Preliminary Response to the Petition for Inter Partes Review of U.S. Pat. No. 10,349,042, Align Technology, Inc. v. 3Shape A/S, Case No. IPR2020-01088 (Oct. 23, 2020), Exhibit 1062. |
Patent Owner's Preliminary Response to the Petition for Inter Partes Review in IPR2018-00197, U.S. Pat. No. 9,329,675, Mar. 3, 2018, 66 pages. |
Patent Owner's Response to the Petition for Inter Partes Review in IPR2018-00197, U.S. Pat. No. 9,329,675, Aug. 20, 2018, 57 pages. |
Patent Owner's Submission of Demonstratives for Oral Argument in IPR2018-00197, U.S. Pat. No. 9,329,675, Jan. 31, 2019, 42 pages. |
Petition for Inter Partes Review of U.S. Pat. No. 10,349,042, Case No. IPR2020-01087, Align Technology, Inc., Petitioner, v. 3Shape A/S, Patent Owner (89 pages). |
Petition for Inter Partes Review of U.S. Pat. No. 10,349,042, Case No. IPR2020-01088, Align Technology, Inc., Petitioner, v. 3Shape A/S, Patent Owner (81 pages). |
Petition for Inter Partes Review of U.S. Pat. No. 10,349,042, Case No. IPR2020-01089, Align Technology, Inc., Petitioner, v. 3Shape A/S, Patent Owner (76 pages). |
Petition for Inter Partes Review of U.S. Pat. No. 8,363,228, 3Shape A/S et al. v. Align Technology, Inc., Case No. IPR2019-00154, Nov. 10, 2018, 100 pages. |
Petition for Inter Partes Review of U.S. Pat. No. 8,363,228, 3Shape A/S et al. v. Align Technology, Inc., Case No. IPR2019-00157 (Nov. 8, 2018), Exhibit 1063. |
Petitioner Align Technology, Inc.'s Demonstratives in IPR2018-00197, U.S. Pat. No. 9,329,675, Jan. 31, 2019, 30 pages. |
Petitioner Align Technology, Inc.'s Reply to Patent Owner Response in IPR2018-00197, U.S. Pat. No. 9,329,675, Nov. 14, 2018, 35 pages. |
Petitioner Align Technology, Inc.'s Request for Rehearing in IPR2018-00198, U.S. Pat. No. 9,329,675, Jun. 29, 2018, 14 pages. |
Plaintiff and Counterclaim Defendant Align Technology, Inc.'s Stipulation Regarding IPR2022-00144 and IPR2022-00145, Case No. 6:20-cv-00979 (W.D. Tex.), Dec. 16, 2021, 4 pages. |
Pollard et al., “Change Detection in a 3-D World”, IEEE Conference on Computer Vision and Pattern Recognition, Jun. 1, 2007, 6 pages. |
Pulli, et al., “Surface Reconstruction and Display from Range and Color Data”, Graphical Models, vol. 62, Issue 3, 2000, pp. 165-201. |
Pulli, et al., “View-based Rendering: Visualizing Real Objects From Scanned Range and Color Data”, In Rendering techniques'97, Springer, Vienna, 1997, pp. 23-34. |
Defendant Align Technology, Inc.'s Initial Invalidity Contentions, 3Shape A/S v. Align Technology, Inc., C.A. No. 1:18-cv-00886-LPS, Nov. 21, 2017, 393 pages. |
Defendant Align Technology, Inc.'s Stipulation of Invalidity Contentions, 3Shape A/S v. Align Technology, Inc., C.A. No. 18-886-LPS, Exhibit 1053, Nov. 13, 2020, 3 pages. |
Defendant's Identification of Invalidity References, 3Shape A/S, Plaintiff, v. Align Technology, Inc., Defendant, C.A. No. 1:18-cv-00886-LPS, in the United States District Court for the District of Delaware. (72 pages). |
Deposition Transcript of Chandrajit Bajaj, Ph.D. with Errata Sheet, (IPR2018-00197 Ex-2008), Jul. 25, 2018, 142 pages. |
Deposition Transcript of Dr. Lambertus Hesselink taken Apr. 2, 2021, Exhibit 2023. |
Deposition Transcript of Dr. Lambertus Hesselink taken Sep. 10, 2021, Exhibit 2030. |
Deposition Transcript of Dr. Ravin Balakrishnan, Nov. 5, 2018, 101 pages. |
Eisert et al., “Automatic Reconstruction of Stationary 3-D Objects from Multiple Uncalibrated Camera Views”, IEEE Transactions on Circuits and Systems for Video Technology, vol. 10, No. 2, Mar. 2000, pp. 261-277. |
Eisert, “Reconstruction of Volumetric 3D Models”, 3D Videocommunication: Algorithms, Concepts and Real-Time Systems in Human Centered Communication, John Wiley & Sons, Ltd., 2001, 20 pages. |
Eisert, et al., “Multi-Hypothesis, Volumetric Reconstruction of 3D Objects From Multiple Calibrated Camera Views”, ICASSP'99, Phoenix, USA, Mar. 1999, pp. 3509-3512. |
Elgammal, “CS 534: Computer Vision Texture,” Department of Computer Science, Rutgers University, (Spring 2003). (22 pages). |
EPO Prosecution History dated Jun. 19, 2013, issued in the European Patent Application No. 11847582.1, 180 pages. |
File History (IPR2018-00197, Ex. 1002) (IPR2018-00198, Ex. 1002), U.S. Pat. No. 9,329,675, 625 pages. |
File History, U.S. Pat. No. RE48,221, 1004 pages. |
Final Written Decision—Termination Decision Document from IPR2018-00197, U.S. Pat. No. 9,329,675 B2, May 29, 2019, 64 pages. |
Final Written Decision, Align Technology, Inc. v. 3Shape A/S, IPR2020-01087, Jan. 19, 2022. |
First Office Action dated Apr. 3, 2015, issued in the corresponding Chinese Patent Application No. 201180066956.6, 13 pages. |
First Office Action dated Dec. 2, 2016, issued in the corresponding Chinese Patent Application No. 201510098304.0, 15 pages including 8 pages of English Translation. |
First Office Action dated Feb. 20, 2014, issued in the corresponding Chinese Patent Application No. CN201080027248.7, 22 pages including 13 pages of English Translation. |
Fisher, et al., “Dictionary of Computer Vision & Image Processing”, Wiley, Second Edition, 2014, 386 pages. |
Fisker et al., “Focus Scanning Apparatus”, U.S. Appl. No. 61/187,744, filed Jun. 17, 2009, 90 pages. |
Fisker et al., “Focus Scanning Apparatus”, U.S. Appl. No. 61/231,118, filed Aug. 4, 2009, 127 pages. |
Foley et al., “Introduction to Computer Graphics”, Addison-Wesley, “Chapter 2.2: Basic Interaction Handling,” “Chapter 6: Viewing in 3D,” and “Chapter 8: Input Devices, Interaction Techniques, and Interaction Tasks,” 1994, 66 pages. |
Forne Christopher J., “3-D Scene Reconstruction From Multiple Photometric Images”, PhD Thesis in Electrical and Computer Engineering at the University of Canterbury, Christchurch, New Zealand, Apr. 30, 2007, 179 pages. |
Fraser et al., “Zoom-Dependent Camera Calibration in Digital Close-Range Photogrammetry”, Photogrammetric Engineering & Remote Sensing, vol. 72, No. 9, Exhibit 1064, Sep. 2006, pp. 1017-1026. |
Gao et al., “3D Shape Reconstruction of Teeth by Shadow Speckle Correlation Method”, Optics and Lasers in Engineering, vol. 44, 2006, pp. 455-465. |
Gehrung et al., “An Approach to Extract Moving Objects From MLS Data Using a Volumetric Background Representation”, ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences, vol. IV-1/W1, Jun. 2017, pp. 107-114. |
Giammanco et al., “Using 3D Laser Scanning Technology to Create Digital Models of Hailstones”, American Meteorological Society, Jul. 2017, pp. 1341-1347 (8 pages). |
Gmitro et al., “Confocal Microscopy through a Fiber-Optic Imaging Bundle”, Optics Letters, vol. 18, No. 8, Apr. 15, 1993, pp. 565-567 (4 pages). |
Graetzel et al., “A Non-Contact Mouse for Surgeon-Computer Interaction”, Technology and Health Care, vol. 12, No. 3, 2004, pp. 245-257. |
Grant et al., “Glossary of Digital Dental Terms: American College of Prosthodontists”, Journal of Prosthodontics, vol. 25, Suppl. 2, Oct. 2016, pp. S2-S9. |
Guan et al., “Multi-view Occlusion Reasoning for Probabilistic Silhouette-Based Dynamic Scene Reconstruction”, International Journal of Computer Vision, vol. 90, 2010, pp. 283-303. |
Guehring Jens, “Dense 3D Surface Acquisition by Structured Light using off-the-Shelf Components”, Proceedings SPIE 4309, Videometrics and Optical Methods for 3D Shape Measurement, Dec. 22, 2000, pp. 220-231 (13 pages). |
Hajeer et al., “Current Products and Practices Applications of 3D Imaging in Orthodontics: Part II”, Journal of Orthodontics, vol. 31, 2004, pp. 154-162. |
Hale et al., “Measuring Free-Living Physical Activity in Adults with and Without Neurologic Dysfunction with a Triaxial Accelerometer”, Archives of Physical Medicine and Rehabilitation, vol. 89, No. 9, Sep. 2008, pp. 1765-1771. |
Havemann et al., “Seven Research Challenges of Generalized 3D Documents”, IEEE Computer Graphics and Applications, vol. 27, No. 3, May-Jun. 2007, pp. 70-76. |
Hearn et al., “Computer Graphics”, 2d. Ed., Prentice Hall, “Chapter 2: Overview of Graphics Systems,” “Chapter 8: Graphical User Interfaces and Interactive Input Methods,” and “Chapter 9: Three-Dimensional Concepts,” 1994, 83 pages. |
Horn, et al., “Calculating the Reflectance Map”, Applied Optic, vol. 18, No. 11, Jun. 1979, pp. 1770-1779. |
IEEE Xplore Search Results (4 pages), accessed Mar. 2, 2018; this document was made of record by the Examiner on Mar. 13, 2018, in the parent U.S. Appl. No. 15/117,078. |
Information Statement issued on Jul. 28, 2016, by the Japanese Patent Office in corresponding Japanese Patent Application No. 2014-234653 and English translation. (25 pages). |
Institution Decision entered in IPR2018-00197, U.S. Pat. No. 9,329,675, May 30, 2018, 32 pages. |
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/DK2010/050148, dated Jan. 5, 2012, 10 pages. |
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/EP2015/052537, dated Aug. 18, 2016, 11 pages. |
International Search Report (PCT/ISA/210) dated Feb. 22, 2012, issued in the International Patent Application No. PCT/DK2011/050461, 6 pages. |
International Search Report and Written Opinion received for PCT Patent Application No. PCT/DK2010/050148, dated Oct. 6, 2010, 13 pages. |
International Search Report and Written Opinion received for PCT Patent Application No. PCT/EP2015/052537, dated May 11, 2015, 14 pages. |
Introducing Wii MotionPlus, Nintendo's upcoming accessory for the revolutionary Wii Remote, Nintendo, The Wayback Machine, Jul. 14, 2008, 2 pages. |
Ireland et al., “3D Surface Imaging in Dentistry—What we are Looking at”, British Dental Journal, vol. 205, No. 7, Oct. 11, 2008, pp. 387-392. |
Jahne, et al., “Handbook of Computer Vision and Applications”, Sensors and Imaging, vol. 1, Academic Press, 1999, 657 pages. |
Jahne, et al., “Handbook of Computer Vision and Applications”, Signal Processing and Pattern Recognition, Academic Press, vol. 2, 1999, 967 pages. |
3Shape A/S Markman Hearing Presentation, Case No. 1:18-886-LPS, Apr. 21, 2020, 104 pages. |
3Shape A/S v. Align Technology, Inc., IPR2021-01383, Petition for Inter Partes Review, U.S. Pat. No. 10,728,519, Aug. 20, 2021, 112 pages. |
Ahn et al., “Development of Three-Dimensional Dental Scanning Apparatus Using Structured Illumination”, Sensors, vol. 17, Issue 7, 1634 (IPR2018-00197, Ex. 2004) (IPR2018-00198, Ex. 2002), 2017, 9 pages. |
Align Technology, Inc. Petitioner v. 3Shape A/S, Patent Owner, Patent Owner's Preliminary Response to the Petition for Post-Grant Review of U.S. Pat. No. 9,962,244, Case No. PGR2018-00104, U.S. Pat. No. 9,962,244, filed Feb. 19, 2019, 64 pages. |
Align Technology, Inc. Petitioner v. 3Shape A/S, Patent Owner, Patent Owner's Preliminary Response to the Petition for Inter Partes Review of U.S. Pat. No. 9,962,244, Case No. IPR2019-00117, U.S. Pat. No. 9,962,244, filed Mar. 4, 2019, 63 pages. |
Align Technology, Inc. Petitioner v. 3Shape A/S, Patent Owner, Patent Owner's Preliminary Response to the Petition for Inter Partes Review of U.S. Pat. No. 9,962,244, Case No. IPR2019-00118, U.S. Pat. No. 9,962,244, filed Mar. 4, 2019, 62 pages. |
Align Technology, Inc. Petitioner v. 3Shape A/S, Patent Owner, Patent Owner's Preliminary Response to the Petition for Post-Grant Review of U.S. Pat. No. 9,962,244, Case No. PGR2018-00103, U.S. Pat. No. 9,962,244, filed Feb. 19, 2019, 64 pages. |
Align Technology, Inc. Petitioner v. 3Shape A/S, Patent Owner, Petition for Inter Partes Review of U.S. Pat. No. 9,962,244, Case No. IPR2019-00117, U.S. Pat. No. 9,962,244, filed Nov. 5, 2018, 98 pages. |
Align Technology, Inc. Petitioner v. 3Shape A/S, Patent Owner, Petition for Inter Partes Review of U.S. Pat. No. 9,962,244, Case No. IPR2019-00118, U.S. Pat. No. 9,962,244, filed Nov. 5, 2018, 93 pages. |
Align Technology, Inc. Petitioner v. 3Shape A/S, Patent Owner, Petition for Post-Grant Review of U.S. Pat. No. 9,962,244, Case No. PGR2018-00103, U.S. Pat. No. 9,962,244, filed Oct. 30, 2018, 119 pages. |
Align Technology, Inc. Petitioner v. 3Shape A/S, Patent Owner, Petition for Post-Grant Review of U.S. Pat. No. 9,962,244, Case No. PGR2018-00104, U.S. Pat. No. 9,962,244, filed Oct. 26, 2018, 107 pages. |
Align Technology, Inc. Petitioner, Declaration of Dr. Chandrajit L. Bajaj, Ph.D. in Support of Inter Partes Review of U.S. Pat. No. 9,962,244, Case No. IPR2019-00117, U.S. Pat. No. 9,962,244, 3Shape A/S Patent Owner, filed Nov. 5, 2018, 316 pages. |
Align Technology, Inc. Petitioner, Declaration of Dr. Chandrajit L. Bajaj, Ph.D. in Support of Inter Partes Review of U.S. Pat. No. 9,962,244, Case No. IPR2019-00118, U.S. Pat. No. 9,962,244, 3Shape A/S Patent Owner, filed Nov. 5, 2018, 316 pages. |
Align Technology, Inc. Petitioner, Declaration of Dr. Chandrajit L. Bajaj, Ph.D. in Support of Post-Grant Review of U.S. Pat. No. 9,962,244, Case No. PGR2018-00103, U.S. Pat. No. 9,962,244, 3Shape A/S Patent Owner, filed Oct. 30, 2018, 318 pages. |
Align Technology, Inc. Petitioner, Declaration of Dr. Chandrajit L. Bajaj, Ph.D. in Support of Post-Grant Review of U.S. Pat. No. 9,962,244, Case No. PGR2018-00104, U.S. Pat. No. 9,962,244, 3Shape A/S Patent Owner, filed Oct. 26, 2018, 318 pages. |
Align Technology, Inc. Petitioner, Second Corrected Declaration of Dr. Chandrajit L. Bajaj, Ph.D. in Support of Post-Grant Review of U.S. Pat. No. 9,962,244, Case No. PGR2018-00103, U.S. Pat. No. 9,962,244, 3Shape A/S Patent Owner, filed Oct. 30, 2018, 318 pages. |
Align Technology, Inc. Petitioner, Second Corrected Petition for Post-Grant Review of U.S. Pat. No. 9,962,244, Case No. PGR2018-00103, U.S. Pat. No. 9,962,244, 3Shape A/S Patent Owner, filed Oct. 30, 2018, 119 pages. |
Align Technology, Inc., Petitioner v. 3Shape A/S Patent Owner, Case IPR2018-00197—U.S. Pat. No. 9,329,675, Decision Institution of Inter Partes Review, May 30, 2018, 32 pages. |
Align Technology, Inc., Petitioner v. 3Shape A/S Patent Owner, Case IPR2018-00197—U.S. Pat. No. 9,329,675, Petition for Inter Partes Review, Nov. 22, 2017, 67 pages. |
Align Technology, Inc., Petitioner v. 3Shape A/S Patent Owner, Case IPR2018-00198—U.S. Pat. No. 9,329,675, Decision Denying Institution of Inter Partes Review, May 30, 2018, 15 pages. |
Align Technology, Inc., Petitioner v. 3Shape A/S Patent Owner, Case IPR2018-00198—U.S. Pat. No. 9,329,675, Petition for Inter Partes Review, Nov. 22, 2017, 78 pages. |
Amended Complaint, 3Shape A/S v. Align Technology, Inc., Case No. 1:18-886-LPS, Aug. 30, 2019, 166 pages. |
Answer, Affirmative Defenses, and Counterclaims of Align Technology, Inc., 3Shape A/S v. Align Technology, Inc.,C.A. No. 18-886-LPS, Oct. 21, 2019, 46 pages. |
Atieh Mohammad A., “Accuracy Evaluation of Intra-Oral Optical Impressions: A Novel Approach”, Thesis, University of North Carolina at Chapel Hill, 2016, 87 pages. |
Bajaj, Declaration of Dr. Chandrajit L. Bajaj, Ph.D., 3Shape A/S, Patent Owner, in Support of Inter Partes Review of U.S. Pat. No. 9,329,675, Case IPR2018-00197, 127 pages. |
Bernardini, et al., “High-Quality Texture Reconstruction from Multiple Scans”, IEEE Transactions on Visualization and Computer Graphics, vol. 7, No. 4, Oct.-Dec. 2001, pp. 318-332. |
Birnbaum et al., “Dental Impressions Using 3D Digital Scanners: Virtual Becomes Reality”, Compendium of Continuing Education in Dentistry, vol. 29, No. 8, Oct. 2008, 18 pages. |
Bob Johnstone, “Cameras give semiconductor industry a boost”, New Scientist, Nov. 7, 1985 (1 page). |
Bornik et al., “A Hybrid User Interface for Manipulation of Volumetric Medical Data”, 3D User Interfaces, 2006, pp. 29-36 (8 pages). |
Bowman et al., “3D User Interfaces: Theory and Practice”, § 4.1.1 “Input Device Characteristics” pp. 88-89; § 4.2.2 “2D Mice and Trackballs” pp. 91-92; § 4.8.2 “Input Device Taxonomies” pp. 128-132, Addison Wesley (IPR2018-00197, Ex. 2013), 2005, 20 pages. |
Bowman et al., “3D User Interfaces: Theory and Practice”, (IPR2018-00197, Ex. 1038), Jul. 2004, pp. 96-101 (9 pages). |
Broadbent, B.H., “A New X-Ray Technique and Its Application to Orthodontia,” The Angle Orthodontist, vol. 1, No. 2, 1931, pp. 45-66. |
Broadhurst et al., “A Probabilistic Framework for Space Carving”, Proceedings Eighth IEEE International Conference on Computer Vision, vol. 1, 2001, 6 pages. |
Callier et al., “Reconstructing Textured Meshes From Multiple Range+rgb Maps”, 7th International Fall Workshop on Vision, Modeling, and Visualization, Nov. 2002, 8 pages. |
Swiss Priority Document 01580/08, Oct. 6, 2008, with English Translation. |
Chen, et al., “A Volumetric Stereo Matching Method: Application to Image-Based Modeling”, IEEE Computer Society Conference on Computer Vision and Pattern Recognition (Cat. No. PR00149), vol. 1, 1999, 6 pages. |
Chua et al., “SonoDEX: 3D Space Management and Visualization of Ultrasound Data”, International Congress Series, vol. 1281, (IPR2018-00197, Ex. 2006), May 2005, pp. 143-148. |
Corrected Declaration of Dr. Chandrajit L. Bajaj, Ph.D. in Support of Post-Grant Review of U.S. Pat. No. 9,962,244, Align Technology, Inc. Petitioner, Case Nos. PGR2018-00103 U.S. Pat. No. 9,962,244, 3Shape A/S Patent Owner, filed Oct. 30, 2018, 318 pages. |
Curriculum Vitae of Dr. Chandrajit L. Bajaj, (IPR2018-00197, Ex. 1004) (IPR2018-00198, Ex. 1004), 49 pages. |
Curriculum Vitae of Ravin Balakrishnan, Ph.D. (IPR2018-00197, Ex. 2012), 30 pages. |
Darrell et al., “Pyramid Based Depth from Focus”, Proceedings CVPR '88: The Computer Society Conference on Computer Vision and Pattern Recognition, 1988, pp. 504-509. |
Decision Denying Petitioner's Request for Rehearing in IPR2018-00198, U.S. Pat. No. 9,329,675, Dec. 4, 2018, 8 pages. |
Declaration of Alexander Sergienko, Ph.D. in 3Shape A/S et al. v. Align Technology, Inc., Case No. IPR2020-01622, Exhibit 1061. |
Declaration of Alexander Sergienko, Ph.D., Align Technology, Inc. v. 3Shape A/S, Case No. IPR2020-01087, Exhibit 2019. |
Declaration of Dr. Chandrajit Bajaj (“Bajaj Decl.”) in support of Petition for Inter Partes Review, U.S. Pat. No. RE48,221, 377 pages. |
Declaration of Lambertus Hesselink, Ph.D., Exhibit 1002. |
Declaration of Ravin Balakrishnan, (IPR2018-00197, Ex. 2011), 55 pages. |
Declaration of Sylvia Hall-Ellis, Ph.D. with attachments, Align Technology, Inc. v. 3Shape A/S, Case No. IPR2020-01087, Exhibit 2029. |
Agrawal, Color and Shade Management in Esthetic Dentistry, Dec. 1, 2013, 120-127, vol. 3, Issue 3. |
Baltzer et al., “The Determination of the Tooth Colors”, Quintessenz Zahntechnik, 30(7), 2004, 726-740. |
Borse et al., Tooth shade analysis and selection in prosthodontics: A systematic review and meta-analysis, Apr. 7, 2020, J Indian Prosthodont Soc, 20, 131-140. |
Chu, Dental Color Matching Instruments and Systems. Review of Clinical and Research Aspects, Journal of Dentistry, 2010, e2-e16, vol. 38, Supplement 2. |
Corcodel et al., “Metameric effect between natural teeth and the shade tabs of a shade guide”, May 11, 2010, European Journal of Oral Sciences, pp. 311-316. |
Curriculum Vitae of Dr. Ioannis A. Kakadiaris (56 pages). |
Curriculum Vitae of Dr. James L. Mullins (13 pages). |
Declaration of Dr. James L. Mullins (94 pages). |
Declaration of Ioannis A. Kakadiaris (104 pages). |
Defendants' Initial Invalidity Contentions (NDGA—Civil Action No. 1:22-cv-01829-WMR), dated Sep. 12, 2022 (92 pages). |
Defendants' First Supplemental Invalidity Contentions (NDGA—Civil Action No. 1:22-cv-01829-WMR), dated Nov. 7, 2022 (101 pages). |
DK priority document for Patent Application No. PA 2014 70066 (29 pages). |
Douglas et al., “Intraoral determination of the tolerance of dentists for perceptibility and acceptability of shade mismatch”, Apr. 1, 2007, Journal of Prosthetic Dentistry, 97(4), 200-208. |
Gonzalez et al., Digital Image Processing, Pearson Prentice Hall, 2008, Third Edition (977 pages). |
Ishikawa-Nagai et al., “Reproducibility of Tooth Color Gradation Using A Computer Color-Matching Technique Applied to Ceramic Restorations”, J. Prosthetic Dentistry (Feb. 2005), pp. 129-137. |
Kang, “Three-Dimensional Lookup Table with Interpolation”, Computational Color Technology, Chapter 9, 2006, pp. 151-159. |
Lagouvardos et al., “Repeatability and Interdevice Reliability of Two Portable Color Selection Devices in Matching and Measuring Tooth Color”, J. Prosthetic Dentistry (2007), 40-45. |
Petition for Inter Partes Review of U.S. Pat. No. 10,695,151, dated Dec. 21, 2022 (87 pages). |
Richert et al., Intraoral Scanner Technologies; A Review to Make a Successful Impression, Journal of Healthcare Engineering, 2017, 9 pages. |
Schropp, Shade Matching Assisted by Digital Photography and Computer Software, Jan. 1, 2009, Journal of Prosthodontics, 235-241. |
Tam, Dental Shade Matching Using a Digital Camera, Dec. 1, 2012, Journal of Dentistry, e3-e10, vol. 40, Supplement 2. |
Toriwaki, et al., Fundamentals of Three-Dimensional Digital Image Processing, Springer, 2009, 278 pages. |
Tung et al., “The Repeatability of an Intraoral Dental Colorimeter”, Dec. 1, 2002, J. Prosthetic Dentistry, 585-590. |
U.S. Pat. No. 10,349,042, “Petition for Inter Partes Review”, Align Technology, Inc, Petitioner, v. 3Shape A/S, Patent Owner, Case No. IPR2020-01089, 76 pages. |
U.S. Office Action for U.S. Appl. No. 15/888,764 (8 pages). |
Vadher, et al., Basics of Color in Dentistry: A Review, Sep. 1, 2014, IOSR Journal of Dental and Medical Sciences, 13(9), 78-85. |
Netravali, Digital Pictures: Representation and Compression (Applications of Communications Theory), 1988 (701 pages). |
Number | Date | Country | |
---|---|---|---|
20220265403 A1 | Aug 2022 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 16946186 | Jun 2020 | US |
Child | 17742955 | US | |
Parent | 15888764 | Feb 2018 | US |
Child | 16946186 | US | |
Parent | 15117078 | US | |
Child | 15888764 | US |