As will be described in greater detail below, the present disclosure describes various systems and methods for imaging and determining the shades of a patient's teeth.
All patents, applications, and publications referred to and identified herein are hereby incorporated by reference in their entirety, and shall be considered fully incorporated by reference even though referred to elsewhere in the application.
A better understanding of the features, advantages and principles of the present disclosure will be obtained by reference to the following detailed description that sets forth illustrative embodiments, and the accompanying drawings of which:
The following detailed description provides a better understanding of the features and advantages of the inventions described in the present disclosure in accordance with the embodiments disclosed herein. Although the detailed description includes many specific embodiments, these are provided by way of example only and should not be construed as limiting the scope of the inventions disclosed herein.
With reference to
In various implementations, the dental professional system 200 is configured to interface with a dental professional. A “dental professional” (used interchangeably with dentist, orthodontist, and doctor herein) as used herein, may include any person with specialized training in the field of dentistry, and may include, without limitation, general practice dentists, orthodontists, dental technicians, dental hygienists, etc. A dental professional may include a person who can assess, diagnose, and/or treat a dental condition. “Assessment” of a dental condition, as used herein, may include an estimation of the existence of a dental condition. An assessment of a dental condition need not be a clinical diagnosis of the dental condition. In some embodiments, an “assessment” of a dental condition may include an image-based shade assessment, that is, an assessment of a shade of a patient's tooth or dentition based in part or in whole on photos and/or images (e.g., images that are not used to stitch a mesh or form the basis of a clinical scan) taken of the dental condition. A “diagnosis” of a dental condition, as used herein, may include a clinical identification of the nature of an illness or other problem by examination of the symptoms. “Treatment” of a dental condition, as used herein, may include prescription and/or administration of care to address the dental conditions. Examples of treatments to dental conditions include prescription and/or administration of brackets/wires, clear aligners, and/or other appliances to address orthodontic conditions, prescription and/or administration of restorative elements, such as crowns, bridges, and veneers, to address functional and/or aesthetic aspects of the patient's dentition, etc. The dental professional system 200 may provide to a user software (e.g., one or more webpages, standalone applications (e.g., dedicated treatment planning and/or treatment visualization applications), mobile applications, etc.) 
that allows the user to interact with patient data, treatment data, scan data, shade data, prosthetic data, and treatment planning tools, control image capturing devices, display information, etc.
As illustrated in
As illustrated in
In some embodiments, the dental professional system 200 may include one or more image capturing devices 250. Image capturing devices 250 may comprise a camera, scanner, or other optical sensor. Image capturing devices 250 may include one or more lenses, one or more camera devices, and/or one or more other optical sensors. In some examples, image capturing devices 250 may include other sensors and/or devices which may aid in capturing optical data, such as one or more lights, depth sensors, etc. In some examples, the image capturing devices 250 may include a near-infrared imaging device that is sensitive to near-infrared wavelengths of light. In some embodiments, the image capturing devices 250 may include a filter, such as a near-infrared band-pass filter that allows passage of near-infrared light and blocks light in the visible and/or far infrared wavelengths.
Network interface 280 may include an interface for communication between the system 200 and its physical processor 130 and a network. The network may be a computer network that facilitates communication or data transfer using wireless and/or wired connections. Examples of a network include, without limitation, an intranet, a Wide Area Network (WAN), a Local Area Network (LAN), a Personal Area Network (PAN), the Internet, Power Line Communications (PLC), a cellular network (e.g., a Global System for Mobile Communications (GSM) network), portions of one or more of the same, variations or combinations of one or more of the same, and/or any other suitable network. The network may also comprise a connection between elements inside a single device (e.g., a bus, any communications infrastructure (e.g., communications infrastructure 1912 shown in
Dental care datastore(s) 120 includes one or more datastores configured to store any type or form of data that may be used for dental care. In some embodiments, the dental care datastore(s) 120 include, without limitation, patient data 222 and treatment data 224. Patient data 222 may include data collected from patients, such as patient dentition information, patient historical data, patient scans, patient information, etc. Patient data may include information related to which of the patient's teeth are present or missing. Patient data 222 may include data to identify the patient, such as their name, age, and other information.
Treatment data 224 may include the tooth or teeth selected for restoration, the type of restoration, such as a crown, bridge, or veneer, and other information related to the treatment of the patient, such as their dental treatment prescription. In some embodiments, treatment data 224 may include data used for treating patients, such as treatment plans, state of treatment, success of treatment, changes to treatment, notes regarding treatment, etc. The treatment plan data 224 may include the positions and orientations of each of the patient's teeth for each stage of a treatment plan. In some embodiments, the positions and orientations of the teeth may be stored as three-dimensional positional locations and angular orientations of each tooth and the patient's upper and lower arches. In some embodiments, the positions and orientations of the patient's teeth may be stored as a collection of three-dimensional segmented models of the patient's upper and lower arches for each stage of treatment.
Patient data and treatment data may also include one or more of 3D scan data 226, shade data 228, and restorative data 232.
The dental care datastore(s) 120 may include 3D scan data 226 which may include data representing three-dimensional models of the patient's dentition, including the teeth and gingiva. In some embodiments, the 3D scan data 226 may include segmented 3D models of the patient's dentition wherein each individual tooth of the patient and/or the gingiva is separated from the other teeth and/or gingiva of the dentition. The three-dimensional models may be generated based on an initial three-dimensional (or 3D) intraoral scan of the patient's teeth. As shown in
The dental care datastore(s) 120 may include shade data 228. Shade data may include two-dimensional (or 2D) color image data representing two-dimensional images of the patient's mouth and teeth. In some embodiments, the two-dimensional images are captured using the systems and methods described herein, for example by using an image capture device 250, discussed herein. The two-dimensional images may include buccal images of the patient's teeth.
The dental care datastore(s) 120 may include restorative data 232, which may include data relating to the physical properties, appearance, and shape of a restorative. For example, restorative data 232 may include the size and shape of the external surfaces of the restorative prosthetic, such as the crown, bridge, or veneer. The external surfaces may include the occlusal, lingual, buccal, and interproximal surfaces of the crown, as well as the intaglio surface shape that matches the shape of a prepared tooth. Restorative data may also include the shade, transparency, thickness, materials, and other properties of the restoration to be used to treat the patient.
Example system 200 in
As will be described in greater detail below, one or more of dental care modules 108 and/or the dental care datastore(s) 120 in
In some embodiments, the elements of the system 200 (e.g., the dental care modules 108 and/or the dental care datastore(s) 120) may be operative to provide shade capture and analysis, and to aid in the capture of images relevant thereto, using the image capture device 250 on the dental professional system 200.
The system 200 may include a network interface 280 and a device interface 270. The network interface 280, which may be coupled to a network and to the processor 130, may transmit signals to and receive signals from other wired or wireless devices, including remote (e.g., cloud-based) storage devices, cameras, and/or displays. For example, the network interface 280 may include wired (e.g., serial, ethernet, or the like) and/or wireless (Bluetooth, Wi-Fi, cellular, or the like) transceivers that may communicate with any other feasible device through any feasible network.
The device interface 270, which is coupled to the processor 130, may be used to interface with any feasible input and/or output device. For example, the device interface 270 may be coupled to and interface with one or more image capturing devices 250. Example image capturing devices may include 3D scanners, such as intraoral 3D scanners, optical cameras, x-ray devices, panoramic x-ray devices, shade capture devices, or the like. In another example, the device interface 270 may be coupled to and interface with a display device 260. Through the display device 260, the processor 130 may display images, feedback information, instructions, or the like.
In some examples, the display device 260 may be an integral part of the device 200. In other words, the display device 260 may share a common housing or enclosure with the processors, data stores, etc.
With reference to
The image capture device 250 may also capture near-infrared data, which penetrates into the tooth. Near-infrared data may include information related to the transparency and internal structure of the tooth. The near-infrared data may be aligned and/or mapped onto the 3D model.
The image capture device 250 may also capture color data of the surface of the patient's dentition. The surface color may be texture mapped to the 3D model.
At block 306 the system may receive an input for the type of shade capture and analysis to be performed. In some embodiments, the input may be a procedure type, such as crown, bridge, or veneer. In some embodiments, the input may include a status of the tooth for which the shade capture and analysis is to be performed. For example, if the tooth has already been prepared, then the shade may be matched to an adjacent tooth. In some embodiments, a tooth to be restored may have an undesirable shade, for example, the tooth may be dead, in which case capturing and analysis may result in shading a restored tooth to the shade of a dead tooth. In some embodiments, the input may be in the form of a prescription or treatment plan that includes information as to the type of tooth shade to be captured and analyzed, such as an original shade or an adjacent tooth shade. If the capture and analysis is to the original shade, then the process proceeds to block 310. If the capture and analysis is to the shade of one or more adjacent teeth, then the process proceeds to block 320.
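The branching at block 306 can be sketched in code as follows. This is a hypothetical illustration only; the function and field names are not from the disclosure, and the routing rule simply encodes the two cases described above (a prepared or dead tooth has no usable original shade, so adjacent teeth are matched instead).

```python
# Illustrative sketch of the block-306 branch: choose which shade source
# to capture. Names and the boolean inputs are assumptions for clarity.

def select_shade_workflow(tooth_prepared: bool, tooth_is_dead: bool) -> str:
    """Return which shade source to capture: 'original' or 'adjacent'."""
    # A prepared tooth no longer has its original surface, and a dead
    # tooth has an undesirable shade; in both cases match adjacent teeth.
    if tooth_prepared or tooth_is_dead:
        return "adjacent"   # proceed to block 320
    return "original"       # proceed to block 310
```

In practice this decision may instead arrive pre-made, e.g., encoded in the prescription or treatment plan as described above.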
At block 310 the original shade of the tooth is captured. The shade may be measured as discussed herein with a second image capture device 250, such as a shade capture device, which may be a 2D image capture device. A shade-taking device is preferably calibrated to work at a specific range of distances from the patient's teeth and a specific range of angles to the tooth, such as relative to the surface normal of the tooth. The 2D image capture device may not have 3D measurement capabilities. Accordingly, it may not be able to estimate its distance and angle with respect to the tooth based on just the 2D images or without additional information, such as the 3D model of the patient's dentition, as discussed herein.
As discussed herein, with knowledge of the 3D shape of the tooth, such as the 3D model of the patient's dentition, the position, distance, angle, and orientation of the shade capture device relative to the tooth may be determined based on the 2D images from the shade-taking device. If the distance and angle are too far from accepted positions, the system can provide feedback to the user, such as asking them to retake the image of the tooth or teeth, or indicating how to adjust the position of the shade capture device to capture the tooth or teeth.
In some embodiments, if the difference from the desired distance and angle is small, the differences between the measured shade and the actual shade may be compensated for. For example, if the distance is off by 5% or less, the illumination brightness will be different than the calibrated brightness, and the captured shade may be compensated for based on the difference in distance, for example, by increasing or decreasing the brightness.
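The compensation described above can be sketched as follows. This is a minimal illustration under the assumption that the device's own illumination dominates, so observed brightness falls off roughly with the inverse square of distance; the 5% tolerance mirrors the example above, while the function name and RGB scaling are assumptions, not from the disclosure.

```python
import numpy as np

def compensate_shade_for_distance(measured_rgb, actual_mm, calibrated_mm,
                                  max_deviation=0.05):
    """Scale a measured shade back to the calibrated capture distance.

    Assumes inverse-square falloff of the device's own illumination.
    Raises if the distance error is too large to compensate reliably.
    """
    deviation = abs(actual_mm - calibrated_mm) / calibrated_mm
    if deviation > max_deviation:
        raise ValueError("distance error too large; retake the image")
    # An image taken closer than calibrated appears too bright, so the
    # inverse-square ratio (actual/calibrated)^2 < 1 scales it back down.
    scale = (actual_mm / calibrated_mm) ** 2
    return np.clip(np.asarray(measured_rgb, dtype=float) * scale, 0.0, 255.0)
```

A production system would more likely apply a calibrated per-channel correction rather than a single scalar gain; the inverse-square scalar here only illustrates the direction of the adjustment.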
The image capture device 250 may be guided to the correct tooth for shade capture and analysis. The image capture device 250 may begin capturing images of the patient's dentition. The teeth in the images may be compared to the teeth in the treatment plan, such as in a 3D model of the teeth captured at block 302.
In some embodiments, the teeth in the image, which may be a 2D image, may be aligned with a projection of the teeth in the 3D model. For example, the system may receive camera data, such as the focal length, focal distance, field of view, image sensor size, etc. The process may generate multiple 2D projections of the 3D model from different angles, distances, etc., based on the camera data. Outlines of the teeth in the 2D projection of the 3D model may be generated and compared to generated outlines of the teeth in the 2D image. When the system finds a match between the teeth in the 3D model (or the 2D projection thereof) and the teeth in the 2D image, the system may determine that the tooth for which shade capture and analysis is desired is within the field of view of the shade imaging device and begin capturing shade data. If the field of view of the imaging device does not contain the tooth of interest, then feedback may be provided to the user to guide the movement of the imaging device and its field of view towards the tooth of interest. Feedback may include guidance, such as arrows or other directional indicators on the display, indicating a direction to move the image capture device.
In some embodiments, the teeth in the field of view of the imaging device may also be determined based on matching a shape of the tooth silhouette, the shape of the gingiva, such as the gingival margin, the incisal edges of the teeth, and/or the color or color patterns of the teeth in the 2D image with the teeth in the 3D model, with or without projecting the 3D model into a 2D image plane. In some embodiments, the system may receive input from a user identifying the tooth for which shade capture and analysis is to be performed. For example, the images may be captured by the image capture device and a user may touch the display or click with an input device on the tooth. In some embodiments, the user may provide an indication of an area, such as by drawing a circle or other enclosed shape on the image or images of the teeth.
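The projection-and-outline comparison described above can be sketched as follows. This is a simplified illustration, not the disclosed implementation: it uses a basic pinhole projection of 3D model points into the 2D image plane, rasterizes the projected points into a coarse occupancy mask, and scores the match against the image's tooth mask with intersection-over-union. All names and the grid-cell size are assumptions.

```python
import numpy as np

def project_points(points_3d, focal_px, image_wh):
    """Pinhole projection of Nx3 camera-frame points to pixel coordinates."""
    pts = np.asarray(points_3d, dtype=float)
    x = focal_px * pts[:, 0] / pts[:, 2] + image_wh[0] / 2
    y = focal_px * pts[:, 1] / pts[:, 2] + image_wh[1] / 2
    return np.stack([x, y], axis=1)

def occupancy_mask(pixels, image_wh, cell=8):
    """Coarse binary mask marking each grid cell a projected point lands in."""
    w, h = image_wh
    mask = np.zeros((h // cell, w // cell), dtype=bool)
    cols = np.clip((pixels[:, 0] // cell).astype(int), 0, mask.shape[1] - 1)
    rows = np.clip((pixels[:, 1] // cell).astype(int), 0, mask.shape[0] - 1)
    mask[rows, cols] = True
    return mask

def silhouette_iou(mask_a, mask_b):
    """Intersection-over-union of two binary silhouette masks (1.0 = match)."""
    inter = np.logical_and(mask_a, mask_b).sum()
    union = np.logical_or(mask_a, mask_b).sum()
    return inter / union if union else 0.0
```

In use, the system would render candidate projections over a range of poses and accept the pose whose silhouette IoU against the segmented 2D image exceeds a threshold; a real implementation would rasterize full mesh silhouettes rather than sparse vertices.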
At block 312 the tooth selected for restoration is prepared. Tooth preparation may include a material removal operation to shape the tooth to receive the restoration. After preparation, the tooth may be scanned with a 3D scanner to determine the final shape of the prepared tooth. The final shape of the prepared tooth may be used in preparing the size and shape of the physical restoration.
At block 316 the shade of the prepared tooth may be captured. During treatment, the restoration is placed over the prepared tooth. The transparency of the material used to make the restoration may allow a portion of the shade of the prepared tooth to impact the apparent shade of the restored tooth. For example, some of the color of the prepared tooth may be visible through the restoration.
The shade of the prepared tooth may be measured as discussed herein with an image capture device 250. The image capture device 250 may be guided to the correct prepared tooth for shade capture and analysis. The image capture device 250 may begin capturing images of the patient's dentition. The teeth in the images may be compared to the teeth in the treatment plan, such as in a 3D model of the teeth captured at block 302.
In some embodiments, the teeth in the image, which may be a 2D image, may be aligned with a projection of the teeth in the 3D model. For example, the system may receive camera data, such as the focal length, focal distance, field of view, image sensor size, etc. The process may generate multiple 2D projections of the 3D model from different angles, distances, etc., based on the camera data. Outlines of the teeth in the 2D projection of the 3D model may be generated and compared to generated outlines of the teeth in the 2D image. When the system finds a match between the teeth in the 3D model (or the 2D projection thereof) and the teeth in the 2D image, the system may determine that the tooth for which shade capture and analysis is desired is within the field of view of the shade imaging device and begin capturing shade data. If the field of view of the imaging device does not contain the prepared tooth, then feedback may be provided to the user to guide the movement of the imaging device and its field of view towards the prepared tooth. Feedback may include guidance, such as arrows or other directional indicators on the display, indicating a direction to move the image capture device to capture the prepared tooth.
In some embodiments, the teeth in the field of view of the imaging device may also be determined based on matching a shape of the prepared tooth silhouette, the shape of the gingiva, such as the gingival margin, the incisal edges of the teeth, and/or the color or color patterns of the teeth in the 2D image with the teeth in the 3D model, with or without projecting the 3D model into a 2D image plane. In some embodiments, the system may receive input from a user identifying the tooth for which shade capture and analysis is to be performed. For example, the images may be captured by the image capture device and a user may touch the display or click with an input device on the prepared tooth. In some embodiments, the user may provide an indication of an area, such as by drawing a circle or other enclosed shape on the image or images of the teeth.
At block 330 the shade information may be updated based on the shade of the tooth before preparation, the shade of the prepared tooth, the position and shape of the prepared tooth, the position and shape of the teeth adjacent the prepared tooth, transparency of the tooth, the surface texture of the original tooth surface before the preparation, and other factors.
At block 320 the tooth selected for restoration is prepared. Tooth preparation may include a material removal operation to shape the tooth to receive the restoration. After preparation, the tooth may be scanned with a 3D scanner to determine the final shape of the prepared tooth. The final shape of the prepared tooth may be used in preparing the size and shape of the physical restoration.
At block 322 the shade of the prepared tooth and teeth adjacent the prepared tooth may be captured. The shade of the teeth may be measured as discussed herein with an image capture device 250. The image capture device 250 may be guided to the correct tooth for shade capture and analysis. The image capture device 250 may begin capturing images of the patient's dentition. The teeth in the images may be compared to the teeth in the treatment plan, such as in a 3D model of the teeth captured at block 302.
In some embodiments, the teeth in the image, which may be a 2D image, may be aligned with a projection of the teeth in the 3D model. For example, the system may receive camera data, such as the focal length, focal distance, field of view, image sensor size, etc. The process may generate multiple 2D projections of the 3D model from different angles, distances, etc., based on the camera data. Outlines of the teeth in the 2D projection of the 3D model may be generated and compared to generated outlines of the teeth in the 2D image. When the system finds a match between the teeth in the 3D model (or the 2D projection thereof) and the teeth in the 2D image, the system may determine that the teeth for which shade capture and analysis is desired are within the field of view of the shade imaging device and begin capturing shade data of the teeth. If the field of view of the imaging device does not contain the teeth of interest, then feedback may be provided to the user to guide the movement of the imaging device and its field of view towards the teeth of interest. Feedback may include guidance, such as arrows or other directional indicators on the display, indicating a direction to move the image capture device to capture the teeth of interest.
In some embodiments, the teeth in the field of view of the imaging device may also be determined based on matching a shape of the tooth silhouette, the shape of the gingiva, such as the gingival margin, the incisal edges of the teeth, and/or the color or color patterns of the teeth in the 2D image with the teeth in the 3D model, with or without projecting the 3D model into a 2D image plane. In some embodiments, the system may receive input from a user identifying the tooth for which shade capture and analysis is to be performed. For example, the images may be captured by the image capture device and a user may touch the display or click with an input device on the teeth. In some embodiments, the user may provide an indication of an area, such as by drawing a circle or other enclosed shape on the image or images of the teeth.
At block 330 the shade information may be updated based on the shade of the tooth before preparation, the shade of the prepared tooth, the position and shape of the prepared tooth, the position and shape of the teeth adjacent the prepared tooth, transparency of the tooth, the surface texture of the original tooth surface before the preparation, and other factors.
At block 332 the shade information is matched to the teeth of the patient. For example, the shade information may be associated with a tooth in the treatment plan or the prescription. The shade may also be matched or compared with the teeth in the 3D scan of the patient's dentition.
At block 336 the shade match and quality of the shade images are evaluated. For example, the focus, the brightness, the angle of the camera with respect to the tooth surface normal, whether the distance from the tooth is too far, etc., may be evaluated based on the 2D images and the 3D model, such as by deriving the location and orientation of the shade imaging device with respect to the tooth of interest based on a comparison of the teeth in the shade image with the teeth in the 3D model. If the shade and quality are acceptable, then the process may proceed to block 340. If the shade and/or quality are not acceptable, then the process may go back to blocks 310 or 320. In some embodiments, while the process may have proceeded through blocks 310, 312, and 316 in a first pass, the process may proceed to blocks 320 and 322 in a second pass. For example, because the original surface shape of the tooth may no longer exist after preparation, the adjacent teeth may be used to determine the shade of the restoration.
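A block-336 style quality gate can be sketched as follows. This is an illustrative simplification: the sharpness proxy (variance of pixel gradients), the brightness bounds, and the distance/angle thresholds are all assumed values, and in the described system the distance and angle would come from the 2D-to-3D-model comparison discussed above.

```python
import numpy as np

def evaluate_shade_image_quality(gray, distance_mm, angle_deg,
                                 max_distance_mm=15.0, max_angle_deg=20.0):
    """Return (acceptable, problems) for one shade image.

    `gray` is a 2D grayscale array; `distance_mm` and `angle_deg` are the
    camera pose relative to the tooth, derived elsewhere from the 3D model.
    All thresholds are illustrative assumptions.
    """
    g = gray.astype(float)
    problems = []
    # Focus proxy: blurred images have low gradient variance.
    sharpness = np.var(np.diff(g, axis=0)) + np.var(np.diff(g, axis=1))
    if sharpness < 10.0:
        problems.append("poor focus")
    brightness = g.mean()
    if not 40.0 <= brightness <= 220.0:
        problems.append("too dark or too bright")
    if distance_mm > max_distance_mm:
        problems.append("too far from tooth")
    if angle_deg > max_angle_deg:
        problems.append("angle too far from surface normal")
    return (len(problems) == 0, problems)
```

On failure, the returned problem list could drive the feedback described above (retake prompts, repositioning guidance) before the process returns to block 310 or 320.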
At block 340 a digital model of the prepared tooth and a prosthetic is generated. For example, the digital model of the prepared tooth may be based on the scan of the prepared tooth, such as generated at block 312 or 320. The 3D model of the prosthetic may be based on the thickness of the restoration and the design of the external surface of the crown and the 3D scan of the prepared tooth. In some embodiments, material properties, such as transparency, light scattering, etc., for a plurality of wavelengths of light may be applied to the volume of the restoration. In some embodiments, a material and a prosthetic fabrication method may be selected and the prosthetic may be modeled based on the material and, in some embodiments, also based on the fabrication method. For example, the material may be a single-shade monoblock, a multi-shade monoblock, such as a three-shade monoblock, or a sintering material. A monoblock may be milled. Sintering material may be sintered.
The transparency of the material may be modeled based on the choice of material and the material's transparency, thickness, the shade of the preparation, and the desired restoration shade based on the shade determined according to this process, such as the shade of the tooth the restoration is restoring or the teeth adjacent to the tooth the restoration is restoring.
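The transparency modeling described above can be illustrated with a deliberately simplified layered-color sketch: a Beer-Lambert style transmittance of the restoration material determines how much of the prepared tooth's (substrate's) color shows through, with the remainder contributed by the restoration's own body color. The linear blend, the attenuation coefficient, and all names are assumptions; a realistic model would be wavelength-dependent and include scattering.

```python
import numpy as np

def predict_restored_shade(restoration_rgb, substrate_rgb, thickness_mm,
                           attenuation_per_mm=0.8):
    """Sketch the apparent shade of a translucent restoration over a tooth.

    Transmittance T = exp(-k * thickness) of the restoration lets a
    fraction of the substrate (prepared tooth) color show through; the
    rest is the restoration's body color. Coefficient is illustrative.
    """
    t = np.exp(-attenuation_per_mm * thickness_mm)  # fraction showing through
    restoration = np.asarray(restoration_rgb, dtype=float)
    substrate = np.asarray(substrate_rgb, dtype=float)
    return (1.0 - t) * restoration + t * substrate
```

This captures the qualitative behavior discussed above: a thicker or more opaque restoration masks the preparation shade, while a thin or highly translucent one lets a dark preparation shift the apparent restored shade.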
The shade may also be modeled based on the surface texture of the original tooth or the adjacent teeth. The texture and features, such as the fine 3D surface roughness, may be determined based on the 3D scan of the patient's original tooth or the adjacent teeth.
At block 342 the shade of the restoration is simulated based on the model generated at block 340. The simulation may be a ray tracing or other light and optical simulation of light illuminating the restoration and the rest of the patient's dentition.
With reference to
In some embodiments, the intraoral scanner may be used as the shade capture device used in process 300. When using an intraoral scanner, the 3D scanning functions of the scanner may be used along with 2D shade image capture. The dental professional system 200 may switch between 3D scanning and shade capture functions automatically, based on a process flow of the system, or may receive an input to change from 3D scanning to 2D shade image capture.
The shade tabs and their labels may be automatically identified by the dental professional system in the images captured by the intraoral scanner through image segmentation, wherein the edges of the gingiva, teeth, and shade tabs are determined and the images are then segmented. The shades of the shade references, such as the shade tabs, may be determined based on character recognition of the shade ID 524 on the shade tabs.
In some embodiments, a 3D scan of the teeth may be conducted close in time with the shade image capture, such as within 100 ms before or after the shade image capture, in order to determine the distance, angle, position, and orientation of the intraoral scanner with respect to the target tooth or teeth.
The system may check that the shade images taken are in a desired distance and angle range with respect to the tooth or teeth.
In some embodiments, images without illumination provided by the intraoral scanner may also be used to estimate ambient background illumination and shade. If strong non-uniformity of ambient background illumination is detected, such as, for example, when an overhead dental projector is used but is partly obstructed by a doctor's hand, feedback, such as a warning, may be provided on the display.
The system may also display the captured shade images and identify the teeth for which sufficient shade data has been captured and for which insufficient shade data has been captured.
In some embodiments, the system may determine a difference between an estimated shade of the tooth and the shade of the shade references. If the difference between the estimated shade of the tooth and the shade of the shade references exceeds a threshold, the system may provide feedback to the user regarding the difference. In some embodiments, the system may suggest a different shade reference to be used.
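One common way to quantify such a shade difference is a color distance in CIELAB space. The sketch below uses the CIE76 formula (Euclidean distance in L*a*b*); the threshold value and function names are illustrative assumptions, not from the disclosure, which does not specify a color-difference metric.

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 color difference: Euclidean distance between two L*a*b* values."""
    return math.dist(lab1, lab2)

def check_against_reference(tooth_lab, reference_lab, threshold=2.0):
    """Compare an estimated tooth shade to a shade reference.

    Returns (delta_e, needs_feedback). A CIE76 delta E around 2 is roughly
    a perceptible step; the default threshold is an illustrative choice.
    """
    de = delta_e_76(tooth_lab, reference_lab)
    return de, de > threshold
```

When `needs_feedback` is true, the system could report the magnitude of the mismatch and, as described above, suggest a closer shade reference from the available tabs.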
The shade images captured by the 3D scanner using the sleeve 400, along with the 3D scan of the patient's dentition, color texture for the 3D scan, near IR data, such as near IR data that includes the location of the dentin in the tooth, transparency, and internal tooth texture and structure, the prescription, and other patient and treatment data, may be provided to a restoration design and fabrication system for use in selecting materials and shades for the restoration, as discussed herein. In some embodiments, low quality images, such as images with poor focus or movement blur, or images that are too dark or too light, may be removed from the data sent to the restoration design and fabrication system, which may be a remote dental laboratory.
The data may be displayed within a dental restoration preparation environment, such as within dental restoration CAD design software. The patient and treatment data may be presented and the tooth being restored may be selected from the 3D model of the patient's dentition or from the prescription.
Computing system 1910 broadly represents any single or multi-processor computing device or system capable of executing computer-readable instructions. Examples of computing system 1910 include, without limitation, workstations, laptops, client-side terminals, servers, distributed computing systems, handheld devices, or any other computing system or device. In its most basic configuration, computing system 1910 may include at least one processor 1914 and a system memory 1916.
Processor 1914 generally represents any type or form of physical processing unit (e.g., a hardware-implemented central processing unit) capable of processing data or interpreting and executing instructions. In certain embodiments, processor 1914 may receive instructions from a software application or module. These instructions may cause processor 1914 to perform the functions of one or more of the example embodiments described and/or illustrated herein.
System memory 1916 generally represents any type or form of volatile or non-volatile storage device or medium capable of storing data and/or other computer-readable instructions. Examples of system memory 1916 include, without limitation, Random Access Memory (RAM), Read Only Memory (ROM), flash memory, or any other suitable memory device. Although not required, in certain embodiments computing system 1910 may include both a volatile memory unit (such as, for example, system memory 1916) and a non-volatile storage device (such as, for example, primary storage device 1932, as described in detail below). In one example, one or more of dental care modules 108 from
In some examples, system memory 1916 may store and/or load an operating system 1940 for execution by processor 1914. In one example, operating system 1940 may include and/or represent software that manages computer hardware and software resources and/or provides common services to computer programs and/or applications on computing system 1910. Examples of operating system 1940 include, without limitation, LINUX, JUNOS, MICROSOFT WINDOWS, WINDOWS MOBILE, MAC OS, APPLE'S IOS, UNIX, GOOGLE CHROME OS, GOOGLE'S ANDROID, SOLARIS, variations of one or more of the same, and/or any other suitable operating system.
In certain embodiments, example computing system 1910 may also include one or more components or elements in addition to processor 1914 and system memory 1916. For example, as illustrated in
Memory controller 1918 generally represents any type or form of device capable of handling memory or data or controlling communication between one or more components of computing system 1910. For example, in certain embodiments memory controller 1918 may control communication between processor 1914, system memory 1916, and I/O controller 1920 via communication infrastructure 1912.
I/O controller 1920 generally represents any type or form of module capable of coordinating and/or controlling the input and output functions of a computing device. For example, in certain embodiments I/O controller 1920 may control or facilitate transfer of data between one or more elements of computing system 1910, such as processor 1914, system memory 1916, communication interface 1922, display adapter 1926, input interface 1930, and storage interface 1934.
As illustrated in
As illustrated in
Additionally or alternatively, example computing system 1910 may include additional I/O devices. For example, example computing system 1910 may include I/O device 1936. In this example, I/O device 1936 may include and/or represent a user interface that facilitates human interaction with computing system 1910. Examples of I/O device 1936 include, without limitation, a computer mouse, a keyboard, a monitor, a printer, a modem, a camera, a scanner, a microphone, a touchscreen device, variations, or combinations of one or more of the same, and/or any other I/O device.
Communication interface 1922 broadly represents any type or form of communication device or adapter capable of facilitating communication between example computing system 1910 and one or more additional devices. For example, in certain embodiments communication interface 1922 may facilitate communication between computing system 1910 and a private or public network including additional computing systems. Examples of communication interface 1922 include, without limitation, a wired network interface (such as a network interface card), a wireless network interface (such as a wireless network interface card), a modem, and any other suitable interface. In at least one embodiment, communication interface 1922 may provide a direct connection to a remote server via a direct link to a network, such as the Internet. Communication interface 1922 may also indirectly provide such a connection through, for example, a local area network (such as an Ethernet network), a personal area network, a telephone or cable network, a cellular telephone connection, a satellite data connection, or any other suitable connection.
In certain embodiments, communication interface 1922 may also represent a host adapter configured to facilitate communication between computing system 1910 and one or more additional network or storage devices via an external bus or communications channel. Examples of host adapters include, without limitation, Small Computer System Interface (SCSI) host adapters, Universal Serial Bus (USB) host adapters, Institute of Electrical and Electronics Engineers (IEEE) 1394 host adapters, Advanced Technology Attachment (ATA), Parallel ATA (PATA), Serial ATA (SATA), and External SATA (eSATA) host adapters, Fibre Channel interface adapters, Ethernet adapters, or the like. Communication interface 1922 may also allow computing system 1910 to engage in distributed or remote computing. For example, communication interface 1922 may receive instructions from a remote device or send instructions to a remote device for execution.
In some examples, system memory 1916 may store and/or load a network communication program 1938 for execution by processor 1914. In one example, network communication program 1938 may include and/or represent software that enables computing system 1910 to establish a network connection 1942 with another computing system (not illustrated in
Although not illustrated in this way in
As illustrated in
In certain embodiments, storage devices 1932 and 1933 may be configured to read from and/or write to a removable storage unit configured to store computer software, data, or other computer-readable information. Examples of suitable removable storage units include, without limitation, a floppy disk, a magnetic tape, an optical disk, a flash memory device, or the like. Storage devices 1932 and 1933 may also include other similar structures or devices for allowing computer software, data, or other computer-readable instructions to be loaded into computing system 1910. For example, storage devices 1932 and 1933 may be configured to read and write software, data, or other computer-readable information. Storage devices 1932 and 1933 may also be a part of computing system 1910 or may be a separate device accessed through other interface systems.
Many other devices or subsystems may be connected to computing system 1910. Conversely, all of the components and devices illustrated in
The computer-readable medium containing the computer program may be loaded into computing system 1910. All or a portion of the computer program stored on the computer-readable medium may then be stored in system memory 1916 and/or various portions of storage devices 1932 and 1933. When executed by processor 1914, a computer program loaded into computing system 1910 may cause processor 1914 to perform and/or be a means for performing the functions of one or more of the example embodiments described and/or illustrated herein. Additionally or alternatively, one or more of the example embodiments described and/or illustrated herein may be implemented in firmware and/or hardware. For example, computing system 1910 may be configured as an Application Specific Integrated Circuit (ASIC) adapted to implement one or more of the example embodiments disclosed herein.
Client systems 2010, 2020, and 2030 generally represent any type or form of computing device or system, such as example computing system 1910 in
As illustrated in
Servers 2040 and 2045 may also be connected to a Storage Area Network (SAN) fabric 2080. SAN fabric 2080 generally represents any type or form of computer network or architecture capable of facilitating communication between a plurality of storage devices. SAN fabric 2080 may facilitate communication between servers 2040 and 2045 and a plurality of storage devices 2090 (1)-(N) and/or an intelligent storage array 2095. SAN fabric 2080 may also facilitate, via network 2050 and servers 2040 and 2045, communication between client systems 2010, 2020, and 2030 and storage devices 2090 (1)-(N) and/or intelligent storage array 2095 in such a manner that devices 2090 (1)-(N) and array 2095 appear as locally attached devices to client systems 2010, 2020, and 2030. As with storage devices 2060 (1)-(N) and storage devices 2070 (1)-(N), storage devices 2090 (1)-(N) and intelligent storage array 2095 generally represent any type or form of storage device or medium capable of storing data and/or other computer-readable instructions.
In certain embodiments, and with reference to example computing system 1910 of
In at least one embodiment, all or a portion of one or more of the example embodiments disclosed herein may be encoded as a computer program and loaded onto and executed by server 2040, server 2045, storage devices 2060 (1)-(N), storage devices 2070 (1)-(N), storage devices 2090 (1)-(N), intelligent storage array 2095, or any combination thereof. All or a portion of one or more of the example embodiments disclosed herein may also be encoded as a computer program, stored in server 2040, run by server 2045, and distributed to client systems 2010, 2020, and 2030 over network 2050.
As detailed above, computing system 1910 and/or one or more components of network architecture 2000 may perform and/or be a means for performing, either alone or in combination with other elements, one or more steps of any of the methods disclosed herein.
While the foregoing disclosure sets forth various embodiments using specific block diagrams, flowcharts, and examples, each block diagram component, flowchart step, operation, and/or component described and/or illustrated herein may be implemented, individually and/or collectively, using a wide range of hardware, software, or firmware (or any combination thereof) configurations. In addition, any disclosure of components contained within other components should be considered example in nature since many other architectures can be implemented to achieve the same functionality.
In some examples, all or a portion of example system 200 in
In various embodiments, all or a portion of example system 200 in
According to various embodiments, all or a portion of example system 200 in
In some examples, all or a portion of example system 200 in
In addition, all or a portion of example system 200 in
In some embodiments, all or a portion of example system 200 in
The process parameters and sequence of steps described and/or illustrated herein are given by way of example only and can be varied as desired. For example, while the steps illustrated and/or described herein may be shown or discussed in a particular order, these steps do not necessarily need to be performed in the order illustrated or discussed. The various example methods described and/or illustrated herein may also omit one or more of the steps described or illustrated herein or include additional steps in addition to those disclosed.
While various embodiments have been described and/or illustrated herein in the context of fully functional computing systems, one or more of these example embodiments may be distributed as a program product in a variety of forms, regardless of the particular type of computer-readable media used to actually carry out the distribution. The embodiments disclosed herein may also be implemented using software modules that perform certain tasks. These software modules may include script, batch, or other executable files that may be stored on a computer-readable storage medium or in a computing system. In some embodiments, these software modules may configure a computing system to perform one or more of the example embodiments disclosed herein.
As described herein, the computing devices and systems described and/or illustrated herein broadly represent any type or form of computing device or system capable of executing computer-readable instructions, such as those contained within the modules described herein. In their most basic configuration, these computing device(s) may each comprise at least one memory device and at least one physical processor.
The term “memory” or “memory device,” as used herein, generally represents any type or form of volatile or non-volatile storage device or medium capable of storing data and/or computer-readable instructions. In one example, a memory device may store, load, and/or maintain one or more of the modules described herein. Examples of memory devices comprise, without limitation, Random Access Memory (RAM), Read Only Memory (ROM), flash memory, Hard Disk Drives (HDDs), Solid-State Drives (SSDs), optical disk drives, caches, variations, or combinations of one or more of the same, or any other suitable storage memory.
In addition, the term “processor” or “physical processor,” as used herein, generally refers to any type or form of hardware-implemented processing unit capable of interpreting and/or executing computer-readable instructions. In one example, a physical processor may access and/or modify one or more modules stored in the above-described memory device. Examples of physical processors comprise, without limitation, microprocessors, microcontrollers, Central Processing Units (CPUs), Field-Programmable Gate Arrays (FPGAs) that implement softcore processors, Application-Specific Integrated Circuits (ASICs), portions of one or more of the same, variations or combinations of one or more of the same, or any other suitable physical processor.
Although illustrated as separate elements, the method steps described and/or illustrated herein may represent portions of a single application. In addition, in some embodiments one or more of these steps may represent or correspond to one or more software applications or programs that, when executed by a computing device, may cause the computing device to perform one or more tasks, such as the method step.
In addition, one or more of the devices described herein may transform data, physical devices, and/or representations of physical devices from one form to another. Additionally or alternatively, one or more of the modules recited herein may transform a processor, volatile memory, non-volatile memory, and/or any other portion of a physical computing device from one form of computing device to another form of computing device by executing on the computing device, storing data on the computing device, and/or otherwise interacting with the computing device.
The term “computer-readable medium,” as used herein, generally refers to any form of device, carrier, or medium capable of storing or carrying computer-readable instructions. Examples of computer-readable media comprise, without limitation, transmission-type media, such as carrier waves, and non-transitory-type media, such as magnetic-storage media (e.g., hard disk drives, tape drives, and floppy disks), optical-storage media (e.g., Compact Disks (CDs), Digital Video Disks (DVDs), and BLU-RAY disks), electronic-storage media (e.g., solid-state drives and flash media), and other distribution systems.
A person of ordinary skill in the art will recognize that any process or method disclosed herein can be modified in many ways. The process parameters and sequence of the steps described and/or illustrated herein are given by way of example only and can be varied as desired. For example, while the steps illustrated and/or described herein may be shown or discussed in a particular order, these steps do not necessarily need to be performed in the order illustrated or discussed.
The various exemplary methods described and/or illustrated herein may also omit one or more of the steps described or illustrated herein or comprise additional steps in addition to those disclosed. Further, a step of any method as disclosed herein can be combined with any one or more steps of any other method as disclosed herein.
The processor as described herein can be configured to perform one or more steps of any method disclosed herein. Alternatively or in combination, the processor can be configured to combine one or more steps of one or more methods as disclosed herein.
Unless otherwise noted, the terms “connected to” and “coupled to” (and their derivatives), as used in the specification and claims, are to be construed as permitting both direct and indirect (i.e., via other elements or components) connection. In addition, the terms “a” or “an,” as used in the specification and claims, are to be construed as meaning “at least one of.” Finally, for ease of use, the terms “including” and “having” (and their derivatives), as used in the specification and claims, are interchangeable with and shall have the same meaning as the word “comprising.”
The processor as disclosed herein can be configured with instructions to perform any one or more steps of any method as disclosed herein.
It will be understood that the terms “first,” “second,” “third,” etc. may be used herein to describe various layers, elements, components, regions, or sections without referring to any particular order or sequence of events. These terms are merely used to distinguish one layer, element, component, region, or section from another layer, element, component, region, or section. A first layer, element, component, region, or section as described herein could be referred to as a second layer, element, component, region, or section without departing from the teachings of the present disclosure.
As used herein, the term “or” is used inclusively to refer to items in the alternative and in combination.
As used herein, characters such as numerals refer to like elements.
The present disclosure includes the following numbered clauses.
Clause 1. A system for tooth shade imaging in dental treatment, the system comprising: one or more processors and memory comprising instructions that when executed by the one or more processors cause the system to: receive scan data of a patient's dentition; determine that a tooth of interest for shade capture is within a field of view of a shade capture device; capture a first shade of the tooth of interest before tooth preparation; capture a second shade of the tooth of interest after tooth preparation; generate a digital model of the tooth of interest after tooth preparation; and model a third shade resulting from placing a prosthetic on the tooth of interest after tooth preparation using the digital model of the tooth of interest after tooth preparation including the second shade and a digital model of the prosthetic.
Clause 2. The system of clause 1, wherein the instructions that when executed further cause the system to determine a shade of the prosthetic so that the third shade matches the first shade.
Clause 3. The system of clause 1, wherein the instructions to model the third shade include instructions to model the volume of the restoration.
Clause 4. The system of clause 1, wherein the instructions to model the third shade include instructions to model light characteristics of a material of the volume of the restoration.
Clause 5. The system of clause 4, wherein the light characteristics include the transparency of the material of the volume of the restoration.
Clause 6. The system of clause 4, wherein the light characteristics include the light scattering of the material of the volume of the restoration.
Clause 7. The system of any one of clauses 1-6, wherein the instructions to model the third shade include instructions to model the third shade when a plurality of wavelengths of light illuminate the prosthetic on the tooth of interest after tooth preparation.
Clause 8. The system of clause 1, wherein the instructions that determine that a tooth of interest for shade capture is within a field of view of a shade capture device include instructions to: receive data from the shade capture device; compare the data from the shade capture device to the scan data of the patient's dentition; and identify that the tooth of interest for shade capture is within the field of view of the shade capture device based on the comparison.
Clause 9. The system of clause 1, wherein the instructions that determine that a tooth of interest for shade capture is within a field of view of a shade capture device include instructions to: receive data from the shade capture device; compare the data from the shade capture device to the scan data of the patient's dentition; identify the teeth within the field of view of the shade capture device based on the comparison; determine whether or not the tooth of interest is identified within the field of view; and provide feedback to a user based on the determination.
Clause 10. The system of clause 1, wherein the instructions that when executed by the one or more processors further cause the system to: determine an angle between the shade capture device and the tooth of interest; and provide feedback to a user of the shade capture device based on the angle.
Clause 11. The system of clause 10, wherein the feedback includes an indication of how to adjust the position or orientation of the shade capture device so that it is within an angle range of the normal of a surface of the tooth of interest for shade capture.
Clause 12. The system of clause 1, wherein the instructions that when executed by the one or more processors further cause the system to: determine a distance between the shade capture device and the tooth of interest; and provide feedback to a user of the shade capture device based on the distance.
Clause 13. The system of clause 12, wherein the feedback includes an indication of how to adjust the position or orientation of the shade capture device so that it is within a distance range of a surface of the tooth of interest for shade capture.
Clause 14. A system for tooth shade imaging in dental treatment, the system comprising: one or more processors and memory comprising instructions that when executed by the one or more processors cause the system to: receive scan data of a patient's dentition; determine a tooth for restoration and a tooth of interest for shade capture; determine that the tooth of interest for shade capture is within a field of view of a shade capture device; capture a first shade of the tooth of interest; capture a second shade of the tooth for restoration after tooth preparation; generate a digital model of the tooth for restoration after tooth preparation; and model a third shade resulting from the application of a prosthetic on the tooth for restoration after tooth preparation using the digital model of the tooth for restoration after tooth preparation having the second shade and a digital model of the prosthetic.
Clause 15. The system of clause 14, wherein the instructions that when executed further cause the system to determine a shade of the prosthetic so that the third shade matches the first shade.
Clause 16. The system of clause 14, wherein the instructions to model the shade include instructions to model the volume of the restoration.
Clause 17. The system of clause 14, wherein the instructions to model the shade include instructions to model light characteristics of a material of the volume of the restoration.
Clause 18. The system of clause 17, wherein the light characteristics include the transparency of the material of the volume of the restoration.
Clause 19. The system of clause 17, wherein the light characteristics include the light scattering of the material of the volume of the restoration.
Clause 20. The system of any one of clauses 14-19, wherein the instructions to model the shade include instructions to model the shade when a plurality of wavelengths of light illuminate the prosthetic on the tooth for restoration after tooth preparation.
Clause 21. The system of clause 14, wherein the instructions that determine that a tooth of interest for shade capture is within a field of view of a shade capture device include instructions to: receive data from the shade capture device; compare the data from the shade capture device to the scan data of the patient's dentition; and identify that the tooth of interest for shade capture is within the field of view of the shade capture device based on the comparison.
Clause 22. The system of clause 14, wherein the instructions that determine that a tooth of interest for shade capture is within a field of view of a shade capture device include instructions to: receive data from the shade capture device; compare the data from the shade capture device to the scan data of the patient's dentition; identify the teeth within the field of view of the shade capture device based on the comparison; determine whether or not the tooth of interest is identified within the field of view; and provide feedback to a user based on the determination.
Clause 23. The system of clause 14, wherein the instructions that when executed by the one or more processors further cause the system to: determine an angle between the shade capture device and the tooth of interest; and provide feedback to a user of the shade capture device based on the angle.
Clause 24. The system of clause 23, wherein the feedback includes an indication of how to adjust the position or orientation of the shade capture device so that it is within an angle range of the normal of a surface of the tooth of interest for shade capture.
Clause 25. The system of clause 14, wherein the instructions that when executed by the one or more processors further cause the system to: determine a distance between the shade capture device and the tooth of interest; and provide feedback to a user of the shade capture device based on the distance.
Clause 26. The system of clause 25, wherein the feedback includes an indication of how to adjust the position or orientation of the shade capture device so that it is within a distance range of a surface of the tooth of interest for shade capture.
Clause 27. A system for tooth shade imaging in dental treatment, the system comprising: one or more processors and memory comprising instructions that when executed by the one or more processors cause the system to: receive scan data of a patient's dentition; determine that a tooth of interest for shade capture is within a field of view of a shade capture device; automatically determine a location of one or more shade reference aids within the field of view of the shade capture device; capture a first shade of the tooth of interest along with one or more second shades of the one or more shade reference aids; and adjust the first shade based on the one or more second shades.
Clause 28. The system of clause 27, wherein the instructions that automatically determine a location of one or more shade reference aids within a field of view of a shade capture device include instructions to: identify characters within the field of view of the shade capture device; and determine the location of the one or more shade reference aids within the field of view of the shade capture device based on a location of the characters within the field of view.
Clause 29. The system of clause 27, wherein the instructions that automatically determine a location of one or more shade reference aids within a field of view of a shade capture device include instructions to: segment an area of the shade reference aids within the field of view of the shade capture device.
Clause 30. The system of clause 27, wherein the instructions that when executed by the one or more processors further cause the system to: determine the tooth of interest within a field of view of a shade capture device.
Clause 31. The system of clause 30, wherein the instructions that when executed by the one or more processors further cause the system to: determine a first distance between the shade capture device and the shade reference aids; determine a second distance between the shade capture device and the tooth of interest; compare a difference between the first distance and the second distance to a range; and if the difference is outside the range, provide feedback.
Clause 32. The system of clause 30, wherein the instructions that when executed by the one or more processors further cause the system to: determine a first angle between the shade capture device and the shade reference aids; determine a second angle between the shade capture device and the tooth of interest; compare a difference between the first angle and the second angle to a range; and if the difference is outside the range, provide feedback.
Clause 33. The system of clause 27, wherein the instructions to capture a first shade of the tooth of interest along with one or more second shades of the one or more shade reference aids includes instructions that when executed by the one or more processors further cause the system to: illuminate the tooth of interest and the one or more shade reference aids with illumination from the shade capture device.
Clause 34. The system of clause 33, wherein the instructions that when executed by the one or more processors further cause the system to: capture a third shade of the tooth of interest along with one or more fourth shades of the one or more shade reference aids without illumination from the shade capture device.
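By way of a purely illustrative, non-limiting sketch (not the claimed subject matter itself), the positioning-guidance logic recited in clauses 10-13 and 23-26 above can be thought of as comparing a measured capture angle and capture distance against acceptable ranges and emitting corrective feedback. The function name, thresholds, and feedback strings below are hypothetical and are provided only to illustrate one way such a check might be structured:

```python
# Hypothetical sketch only; thresholds, names, and messages are illustrative
# and do not limit the clauses above.

def positioning_feedback(angle_deg, distance_mm,
                         max_angle_deg=15.0,
                         distance_range_mm=(5.0, 20.0)):
    """Return guidance messages for positioning a shade capture device.

    angle_deg:   angle between the device axis and the normal of the
                 surface of the tooth of interest.
    distance_mm: distance between the device and that surface.
    """
    messages = []
    # Angle check (clauses 10-11, 23-24): guide the user toward the
    # surface normal when the capture angle is outside the range.
    if angle_deg > max_angle_deg:
        messages.append("Tilt the device toward the tooth surface normal.")
    # Distance check (clauses 12-13, 25-26): guide the user into the
    # acceptable distance range.
    lo, hi = distance_range_mm
    if distance_mm < lo:
        messages.append("Move the device farther from the tooth.")
    elif distance_mm > hi:
        messages.append("Move the device closer to the tooth.")
    if not messages:
        messages.append("Position acceptable; capture shade.")
    return messages
```

For example, a capture attempted at 20 degrees off-normal and only 3 mm from the tooth would produce both a tilt correction and a distance correction, whereas a capture at 5 degrees and 10 mm would be reported as acceptable.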
Embodiments of the present disclosure have been shown and described as set forth herein and are provided by way of example only. One of ordinary skill in the art will recognize numerous adaptations, changes, variations, and substitutions without departing from the scope of the present disclosure. Several alternatives and combinations of the embodiments disclosed herein may be utilized without departing from the scope of the present disclosure and the inventions disclosed herein. Therefore, the scope of the presently disclosed inventions shall be defined solely by the scope of the appended claims and the equivalents thereof.
This application claims the benefit under 35 U.S.C. § 119 (e) of U.S. Provisional Patent Application No. 63/542,829, filed Oct. 6, 2023, and titled “SYSTEMS AND METHODS FOR TOOTH SHADE IMAGING IN DENTAL TREATMENT,” and U.S. Provisional Patent Application No. 63/621,037, filed Jan. 15, 2024, and titled “SYSTEMS AND METHODS FOR TOOTH SHADE IMAGING IN DENTAL TREATMENT,” each of which is incorporated, in its entirety, by this reference. Determining the correct shade for dental restorations, such as crowns, bridges, or veneers, is a challenging task for a number of reasons. Natural teeth are not monochromatic and can have multiple shades, translucencies, and intrinsic characteristics that can be difficult to capture with current imaging devices and techniques and to replicate with dental materials. The light under which the tooth shade is selected can significantly influence the appearance of the tooth. Different lights (LED, fluorescent, incandescent, natural daylight) can make the same tooth appear as a different shade. Lack of control over the lighting during tooth shade determination also leads to incorrect or unnatural-looking dental restorations. Similarly, two objects can appear to match under one light source or lighting condition but look different under another light source or in another lighting condition, a phenomenon known as metamerism. Matching shades of teeth can be problematic if a shade is chosen in one lighting condition, but the restoration is viewed in another. One method of determining a tooth shade is through the use of shade guides. However, even if a matching shade guide appears close in shade to a tooth, slight variations can exist between the shade tab and the patient's natural tooth. Moreover, shade guides also may not match the translucency, absorption, scattering, and other material properties of a patient's natural tooth, leading to incorrect shading of a restoration.
Even after a shade is chosen, different dental materials used to fabricate the restoration, such as ceramics or composites, have different optical properties from each other and from a patient's teeth. A shade that works well with one material might not look the same or work as well with another. Current shade imaging and determination systems are less than ideal for at least the above-described reasons. Work in relation to the present disclosure has found that current intraoral scanners can produce inaccurate shading results. For example, they fail to separate reflected light, which is mainly the color of the light source, from diffused light, which is mainly the color of the object. They also do not account for lateral and axial nonuniformity of the tooth shade when the light and/or the imaging device is off-axis, too far, or too close to the teeth. As another example, single-purpose shade devices also lack the ability to account for the uncertainties of lighting, tooth material properties, and the other factors discussed above. They are also not connected to the dental treatment systems needed to properly and accurately identify the teeth, leading to less than ideal shades in dental restorations and user error.
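One way to reason about the illumination uncertainty described above is the reference-aid correction of clauses 27-34: because the shade reference aids have known colors, the measured tooth color can be rescaled, per channel, by the ratio of each reference aid's known color to its measured color, which factors out much of the unknown lighting. The following is a hedged, minimal sketch under that assumption; the function name and values are hypothetical and are not the claimed method:

```python
# Hypothetical sketch of per-channel shade correction using a reference aid
# of known color; names and values are illustrative only.

def correct_shade(measured_tooth_rgb, measured_ref_rgb, known_ref_rgb):
    """Scale each channel of the measured tooth color by the ratio of the
    reference aid's known color to its measured color, compensating for
    the (unknown) illumination present at capture time."""
    return tuple(
        # Guard against a zero measurement; leave that channel unchanged.
        tooth * (known / measured) if measured else tooth
        for tooth, measured, known in zip(
            measured_tooth_rgb, measured_ref_rgb, known_ref_rgb)
    )
```

For instance, under a warm light that doubles the red channel, a neutral reference patch known to be (100, 100, 100) might be measured as (200, 100, 100); a tooth measured as (240, 180, 160) under that same light would be corrected to (120, 180, 160). A more complete implementation could average over several reference aids and operate in a perceptual color space rather than raw RGB.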
| Number | Date | Country |
|---|---|---|
| 63/621,037 | Jan. 15, 2024 | US |
| 63/542,829 | Oct. 6, 2023 | US |