The present disclosure relates to systems and methods of scanning a dental impression to obtain a digital model of a patient's dentition as an input to computer aided design (CAD) and computer aided manufacturing (CAM) methods for producing dental prostheses.
Dental prostheses are typically manufactured at specialized dental laboratories that employ computer-aided design (CAD) and computer-aided manufacturing (CAM) milling systems to produce dental prostheses according to patient-specific specifications provided by dentists. In a typical workflow, information about the oral situation of a patient is received from a dentist, the dental laboratory designs the dental prosthesis, and the prosthesis is manufactured using a mill or other fabrication system. When CAD and CAM are used in dentistry, a digital model of the patient's dentition is required as an input to the process. Despite the rise of intraoral scanning technology, the prevalent method of acquiring digital model data is still scanning a stone model cast from an impression. Even in more technically advanced markets, it is estimated that only 10% of clinicians own an intraoral scanner; therefore, improvements to the conventional process are likely to remain relevant and to benefit patients and clinicians alike for some time. Accordingly, improvements to methods of acquiring digital models of patients' dentition are desirable.
Certain embodiments of the disclosure concern systems and methods for scanning a physical impression of a patient's dentition and constructing a virtual surface image of the patient's dentition from the scan data thereby obtained. In some embodiments, the virtual surface image of the patient's dentition is constructed by defining isosurfaces and density gradients of a volumetric image and then directly creating a surface image based upon void spaces that correspond to the patient's dentition. This avoids the unnecessary step of first creating a surface image of the impression and then digitally or virtually reversing that surface image to obtain a surface image intended to correspond with the patient's dentition. Instead, the isosurfaces and density gradients are selected and oriented directly to define the patient's dentition, thereby providing a surface image that is suitable for use in a dental restoration design program.
In some embodiments, the physical impression of a patient's dentition comprises a three-way dental impression tray that is adapted to obtain a physical impression containing information relating to a patient's upper jaw, lower jaw, and bite registration for at least a portion of the patient's dentition. In some embodiments, a three-way dental impression tray is scanned and the data from a single scan is used to generate virtual models of at least a portion of the patient's upper jaw, lower jaw, and bite registration. The virtual models thereby obtained are suitable for use in designing a dental restoration using known digital design products.
The foregoing and other objects, features, and advantages of the disclosed embodiments will become more apparent from the following detailed description, which proceeds with reference to the accompanying figures.
For purposes of this description, certain aspects, advantages, and novel features of the embodiments of this disclosure are described herein. The disclosed methods, apparatus, and systems should not be construed as being limiting in any way. Instead, the present disclosure is directed toward all novel and nonobvious features and aspects of the various disclosed embodiments, alone and in various combinations and sub-combinations with one another. The methods, apparatus, and systems are not limited to any specific aspect or feature or combination thereof, nor do the disclosed embodiments require that any one or more specific advantages be present or problems be solved.
Although the operations of some of the disclosed embodiments are described in a particular, sequential order for convenient presentation, it should be understood that this manner of description encompasses rearrangement, unless a particular ordering is required by specific language set forth below. For example, operations described sequentially may in some cases be rearranged or performed concurrently. Moreover, for the sake of simplicity, the attached figures may not show the various ways in which the disclosed methods can be used in conjunction with other methods. Additionally, the description sometimes uses terms like “provide” or “achieve” to describe the disclosed methods. These terms are high-level abstractions of the actual operations that are performed. The actual operations that correspond to these terms may vary depending on the particular implementation and are readily discernible by one of ordinary skill in the art.
As used in this application and in the claims, the singular forms “a,” “an,” and “the” include the plural forms unless the context clearly dictates otherwise. Additionally, the term “includes” means “comprises.” Further, the terms “coupled” and “associated” generally mean electrically, electromagnetically, and/or physically (e.g., mechanically or chemically) coupled or linked and do not exclude the presence of intermediate elements between the coupled or associated items absent specific contrary language.
In some examples, values, procedures, or apparatus may be referred to as “lowest,” “best,” “minimum,” or the like. It will be appreciated that such descriptions are intended to indicate that a selection among many alternatives can be made, and such selections need not be better, smaller, or otherwise preferable to other selections.
In the following description, certain terms may be used such as “up,” “down,” “upper,” “lower,” “horizontal,” “vertical,” “left,” “right,” and the like. These terms are used, where applicable, to provide some clarity of description when dealing with relative relationships. But, these terms are not intended to imply absolute relationships, positions, and/or orientations. For example, with respect to an object, an “upper” surface can become a “lower” surface simply by turning the object over. Nevertheless, it is still the same object.
As noted above, in a typical workflow, information about the oral situation of a patient is received from a dentist, the dental laboratory designs the dental prosthesis, and the prosthesis is manufactured using a mill or other fabrication system. When CAD and CAM are used in dentistry, a digital model of the patient's dentition is required as an input to the process. Despite the rise of intraoral scanning technology, the prevalent method of acquiring digital model data is still scanning a stone model cast from a physical negative impression of the patient's dentition.
A physical negative impression of the patient's dentition is typically obtained by the use of a dental impression tray containing impression material. An example of an impression tray is shown in
For example, in
As noted above, in a conventional workflow, a physical dental impression formed in the manner described above would be used to cast a model of the patient's dentition formed of stone, polymeric, or other suitable material. The cast model would then be scanned using a laser scanner in order to obtain a digital model, and the digital model would then be used to design one or more restorations, or for other purposes. This conventional workflow creates potential sources of error or inaccuracy that could be avoided by alternative methods or workflows that eliminate the step of forming the cast model and, instead, proceed directly from the physical impression to a digital model.
In one embodiment of the present method, a computed tomography (CT) scanner uses x-rays to make a detailed image of a physical impression. A plurality of such images are then combined to form a 3D model of the patient's dentition. A schematic diagram of an example of a CT scanning system 140 is shown in
An example of a suitable scanning system 140 is the Nikon Model XTH 255 CT Scanner, which is commercially available from Nikon Corporation. The example scanning system includes a 225 kV microfocus x-ray source with a 3 μm focal spot size to provide high-performance image acquisition and volume processing. The processor 150 may include a storage medium that is configured with instructions to manage the data collected by the scanning system.
As noted above, during operation of the scanning system 140, the impression 146 is located between the x-ray source 142 and the x-ray detector 148. A series of images of the impression 146 are collected by the processor 150 as the impression 146 is rotated in place between the source 142 and the detector 148. An example of a single image 160 is shown in
The plurality of images 160 of the impression 146 are generated by the scanning system 140 and stored within a storage medium contained within the processor 150, where they may be used by software contained within the processor to perform additional operations. For example, in an embodiment, the plurality of images 160 undergo tomographic reconstruction in order to generate a 3D virtual image 170 (see
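By way of a hedged illustration of the tomographic reconstruction step described above, the following sketch simulates projection data for a single cross-sectional slice and reconstructs that slice with filtered back projection using scikit-image. The phantom geometry, the angle sampling, and the helper name reconstruct_slice are assumptions introduced for illustration; they are not the reconstruction software of the scanning system 140.

```python
# Minimal sketch of filtered back projection for one CT slice.
# The phantom, angles, and helper name are illustrative assumptions.
import numpy as np
from skimage.transform import radon, iradon

def reconstruct_slice(slice_image: np.ndarray, num_projections: int = 360) -> np.ndarray:
    """Simulate projection data for one slice and reconstruct it."""
    # Angles at which the impression is (virtually) rotated between
    # the x-ray source and the detector.
    angles = np.linspace(0.0, 180.0, num_projections, endpoint=False)

    # Forward projection: each column of the sinogram corresponds to
    # one x-ray image of this slice at one rotation angle.
    sinogram = radon(slice_image, theta=angles)

    # Filtered back projection recovers a density map of the slice.
    return iradon(sinogram, theta=angles)

if __name__ == "__main__":
    # Synthetic "impression" slice: a dense ring of impression
    # material surrounding an empty void.
    phantom = np.zeros((256, 256), dtype=float)
    yy, xx = np.mgrid[:256, :256]
    r = np.hypot(yy - 128, xx - 128)
    phantom[(r > 60) & (r < 90)] = 1.0
    reconstructed = reconstruct_slice(phantom)
    print(reconstructed.shape)
```

Stacking many reconstructed slices along the rotation axis yields a density volume analogous to the volumetric image 170.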
In one embodiment, the volumetric image 170 is converted into a surface image 180 (see, e.g.,
In one embodiment, the surface imaging algorithm used to convert the volumetric image 170 into a surface image 180 is configured to construct the surface image of the dentition 180 directly from the volumetric image 170 without including an intermediate step of constructing a surface image of the impression. For example,
In one embodiment, the surface imaging algorithm used to convert the volumetric image 170 of the dental impression into a surface image 180 of the patient's dentition relies on defining isosurfaces and density gradients and then directly creating a surface image based upon the void spaces 126 and 128 (see, e.g.,
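One way a surface construction of this general kind could be realized, offered as a minimal sketch under stated assumptions rather than as the specific algorithm of the embodiments, is to extract an isosurface at a density level lying between air and impression material and to orient it with respect to the local density gradient so that the resulting mesh represents the void spaces, i.e., the patient's dentition, rather than the impression material. The iso level, the gradient_direction choice, and the file name in the usage comment below are illustrative assumptions.

```python
# Minimal sketch: extract a dentition-facing isosurface directly from
# a reconstructed CT density volume. The iso level and orientation
# convention are illustrative assumptions.
import numpy as np
from skimage.measure import marching_cubes

def dentition_surface(volume: np.ndarray, iso_level: float,
                      voxel_size=(1.0, 1.0, 1.0)):
    """Return vertices, triangular faces, and gradient-based normals of
    an isosurface taken at the air/impression-material boundary."""
    # 'gradient_direction' controls which side of the density boundary
    # the surface is considered to face; choosing it so the mesh faces
    # the low-density voids avoids first building a surface of the
    # impression material and then reversing it.
    verts, faces, normals, _ = marching_cubes(
        volume,
        level=iso_level,
        spacing=voxel_size,
        gradient_direction="ascent",  # assumed convention for the voids
    )
    return verts, faces, normals

# Hypothetical usage: 'ct_volume.npy' stands in for the reconstructed
# volumetric image 170, with the iso level between air and material.
# ct_volume = np.load("ct_volume.npy")
# verts, faces, normals = dentition_surface(ct_volume, iso_level=0.5)
```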
In the embodiment shown, as described above, a dental impression is obtained using a triple tray 100 dental impression tray, thereby capturing an upper impression 122, a lower impression 124, and a bite registration in a single step. As a result, after scanning, reconstruction, and generation of a volumetric image of the triple tray and impression 146 (see
For example, in
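As a rough, simplified illustration of how a single triple-tray scan might be separated into upper-jaw and lower-jaw surfaces, the sketch below classifies triangles of the extracted mesh by the vertical component of their gradient-based normals, assuming the tray is approximately aligned with the scanner's z-axis (upward-facing surfaces belong to the lower impression, downward-facing surfaces to the upper impression). The threshold value and the alignment assumption are illustrative only and do not reflect the specific segmentation of the embodiments.

```python
# Minimal sketch: split one triple-tray surface mesh into upper-jaw and
# lower-jaw face sets by the vertical orientation of face normals.
# Assumes the scan's z-axis approximates the occlusal direction.
import numpy as np

def split_upper_lower(faces: np.ndarray, normals: np.ndarray,
                      z_threshold: float = 0.2):
    """Return (upper_faces, lower_faces) as arrays of triangle indices."""
    # Average the per-vertex normals over each triangle.
    face_normals = normals[faces].mean(axis=1)
    nz = face_normals[:, 2]

    lower_faces = faces[nz > z_threshold]    # facing up: lower-jaw surface
    upper_faces = faces[nz < -z_threshold]   # facing down: upper-jaw surface
    return upper_faces, lower_faces

# Hypothetical usage with the output of the isosurface sketch above.
# Because both face sets share one coordinate frame from a single scan,
# their relative position preserves the bite registration.
# upper_faces, lower_faces = split_upper_lower(faces, normals)
```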
The above descriptions of the scanning system and the algorithms used to perform the scanning, imaging, reconstruction, and surface imaging functions are not intended to suggest any limitation as to scope of use or functionality, as the innovations may be implemented in diverse general-purpose or special-purpose computing systems. For example, the computing environment used to perform these functions can be any of a variety of computing devices (e.g., desktop computer, laptop computer, server computer, tablet computer, gaming system, mobile device, programmable automation controller, etc.) that can be incorporated into a computing system comprising one or more computing devices.
For example, a computing environment may include one or more processing units and memory. The processing units execute computer-executable instructions. A processing unit can be a central processing unit (CPU), a processor in an application-specific integrated circuit (ASIC), or any other type of processor. In a multi-processing system, multiple processing units execute computer-executable instructions to increase processing power. For example, a representative computing environment may include a central processing unit as well as a graphics processing unit or co-processing unit. The tangible memory may be volatile memory (e.g., registers, cache, RAM), non-volatile memory (e.g., ROM, EEPROM, flash memory, etc.), or some combination of the two, accessible by the processing unit(s). The memory stores software implementing one or more innovations described herein, in the form of computer-executable instructions suitable for execution by the processing unit(s).
A computing system may have additional features. For example, in some embodiments, the computing environment includes storage, one or more input devices, one or more output devices, and one or more communication connections. An interconnection mechanism such as a bus, controller, or network, interconnects the components of the computing environment. Typically, operating system software provides an operating environment for other software executing in the computing environment, and coordinates activities of the components of the computing environment.
The tangible storage may be removable or non-removable, and includes magnetic or optical media such as magnetic disks, magnetic tapes or cassettes, CD-ROMs, DVDs, or any other medium that can be used to store information in a non-transitory way and can be accessed within the computing environment. The storage stores instructions for the software implementing one or more innovations described herein.
The input device(s) may be, for example: a touch input device, such as a keyboard, mouse, pen, or trackball; a voice input device; a scanning device; any of various sensors; another device that provides input to the computing environment; or combinations thereof. For video encoding, the input device(s) may be a camera, video card, TV tuner card, or similar device that accepts video input in analog or digital form, or a CD-ROM or CD-RW that reads video samples into the computing environment. The output device(s) may be a display, printer, speaker, CD-writer, or another device that provides output from the computing environment.
The communication connection(s) enable communication over a communication medium to another computing entity. The communication medium conveys information, such as computer-executable instructions, audio or video input or output, or other data in a modulated data signal. A modulated data signal is a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media can use an electrical, optical, RF, or other carrier.
Any of the disclosed methods can be implemented as computer-executable instructions stored on one or more computer-readable storage media (e.g., one or more optical media discs, volatile memory components (such as DRAM or SRAM), or nonvolatile memory components (such as flash memory or hard drives)) and executed on a computer (e.g., any commercially available computer, including smart phones, other mobile devices that include computing hardware, or programmable automation controllers) (e.g., the computer-executable instructions cause one or more processors of a computer system to perform the method). The term computer-readable storage media does not include communication connections, such as signals and carrier waves. Any of the computer-executable instructions for implementing the disclosed techniques as well as any data created and used during implementation of the disclosed embodiments can be stored on one or more computer-readable storage media. The computer-executable instructions can be part of, for example, a dedicated software application or a software application that is accessed or downloaded via a web browser or other software application (such as a remote computing application). Such software can be executed, for example, on a single local computer (e.g., any suitable commercially available computer) or in a network environment (e.g., via the Internet, a wide-area network, a local-area network, a client-server network (such as a cloud computing network), or other such network) using one or more network computers.
For clarity, only certain selected aspects of the software-based implementations are described. Other details that are well known in the art are omitted. For example, it should be understood that the disclosed technology is not limited to any specific computer language or program. For instance, the disclosed technology can be implemented by software written in C++, Java, Perl, Python, JavaScript, Adobe Flash, or any other suitable programming language. Likewise, the disclosed technology is not limited to any particular computer or type of hardware. Certain details of suitable computers and hardware are well known and need not be set forth in detail in this disclosure.
It should also be well understood that any functionality described herein can be performed, at least in part, by one or more hardware logic components, instead of software. For example, and without limitation, illustrative types of hardware logic components that can be used include Field-Programmable Gate Arrays (FPGAs), Application-Specific Integrated Circuits (ASICs), Application-Specific Standard Products (ASSPs), System-on-a-Chip systems (SOCs), Complex Programmable Logic Devices (CPLDs), etc.
Furthermore, any of the software-based embodiments (comprising, for example, computer-executable instructions for causing a computer to perform any of the disclosed methods) can be uploaded, downloaded, or remotely accessed through a suitable communication means. Such suitable communication means include, for example, the Internet, the World Wide Web, an intranet, software applications, cable (including fiber optic cable), magnetic communications, electromagnetic communications (including RF, microwave, and infrared communications), electronic communications, or other such communication means.
In view of the many possible embodiments to which the principles of the disclosure may be applied, it should be recognized that the illustrated embodiments are only preferred examples and should not be taken as limiting the scope of the disclosure. Rather, the scope of the invention is defined by all that comes within the scope and spirit of the following claims.
This application claims priority to and the benefit of U.S. provisional patent application No. 62/423,460, filed Nov. 17, 2016, which is hereby incorporated by reference in its entirety.