The disclosure relates generally to the field of medical imaging and more particularly relates to apparatus and methods for supporting implant surgery with 3-D imaging.
Dental implants are used to replace missing or badly damaged teeth. In order to mount a dental implant securely in bony tissue, a hole is drilled into the mandible or jawbone of the patient. The implant portion or abutment that holds the artificial tooth is usually made of titanium or a titanium alloy and must be able to rapidly integrate with the bone of the patient. Once the implant is seated and secure, the artificial tooth can be installed. The abutment between the implant and the prosthesis can include an elbow, so that the axis of insertion of the prosthesis does not necessarily coincide with the axis of insertion of the implant.
Osteotomy, that is, the drilling of a hole in the jaw or mandibular bone at the proper angle and dimension, requires accuracy so that the implant fits correctly without damage to surrounding tissue or structures and so that the completed work is aesthetically acceptable. For edentulous or at least partially edentulous patients, implant planning is carefully executed. Based on information from x-ray or computerized tomography (CT) imaging of the patient's dental arch, dedicated software tools allow the dentist to define the location, diameter, length or drill depth, shape, and angulation of the implant to be affixed on the patient's jawbone. One consideration in this planning is reducing the risk of damage to nearby nerves or blood vessels.
A step of the implantology process is acquiring sufficient information related to the dental clinical situation of the patient. For this purpose, a Cone Beam Computerized Tomography (CBCT) scan can be performed on the patient's dentition and a three-dimensional scan of the jawbone is obtained. The image is particularly helpful to determine the position of teeth, roots, sinus, blood vessels, and nerves, as well as the thickness of the bones. Depending on this anatomical information, implant planning can begin. This planning includes defining the position, diameter, length, and tilt of the implant to be screwed into the jawbone. Among planning considerations is bone health and robustness; the implant must be screwed into bone that is sufficiently thick and strong to be drilled and to support the effort of chewing after the prosthesis is installed. A hole is then virtually defined on the three-dimensional image of the patient's anatomy.
The so-called standard double scan protocol is a method used to define the implant planning. A radiographic guide, defined based on a mould of the patient's mouth, is manufactured, such as by using a rapid prototyping process. This guide generally includes some prosthetic teeth that are missing in the patient's mouth, some grooves and gaps that surround existing teeth, and some radio-opaque markers. A first CBCT scan of the guide is performed, along with a second CBCT scan of the patient's jaw with the guide in the patient's mouth. By registering the markers on both 3D images, the volume images can be merged and a 3D image featuring the prosthesis in the patient's mouth is obtained. The implant planning is then performed using the combined image data.
Once the drill hole is defined, in terms of length, diameter, tilt, and location, the results of implant planning, representing the hole in the 3D image of the patient's jaw, are sent to a laboratory to manufacture a surgical guide. Custom-fabricated for each patient and shaped to conform to at least a portion of the patient's dental arch, the surgical guide is fitted to the patient's mouth and includes one or more guide holes to guide the dental drill down into the jawbone according to the implant planning. The surgical guide generally has the form of a template worn in the patient's mouth and provided with at least one hole fitted with a metallic sleeve and having geometric characteristics related to the holes defined in the implant planning. The laboratory sends the manufactured surgical guide to the dentist for use in the implant procedure.
At the start of surgery, the surgical guide is positioned in the patient's mouth. The dentist inserts the drilling tool into the metallic sleeve of the hole in the surgical guide and the tool is guided for drilling into the patient's jaw and jawbone. The implant can then be screwed into the bone.
There are a number of drawbacks to the existing process for implant preparation and execution. Fabrication of the surgical guide is complex and time-consuming, so that the guide can be fairly costly, with a number of workflow steps that must be carefully and correctly executed. Although some dentists are equipped with a milling machine that enables them to mill the surgical guide on-site, many dentists do not have in-house milling equipment and must work with a laboratory to manufacture the guide. It would be advantageous for saved time and cost to reduce the requirements for guide accuracy or to eliminate the need for the surgical guide altogether.
There are other practical difficulties to the surgery process, even where advanced CBCT scans and a precision fabricated surgical guide are used. During the drill placement and drilling procedure, the practitioner cannot simultaneously view the implant plan on the display while working on the patient. The need for constant reference back to the displayed implant plan interrupts the process and can contribute to errors, even causing misalignment or other problems.
Thus, it can be appreciated that there is a need for solutions that reduce the cost and complexity of implant surgery and that improve visualization for the dental practitioner in developing and implementing an implant plan.
An object of the present disclosure is to advance the art of dental imaging for implant surgery. Among other aspects, embodiments of the present disclosure process and display images obtained from volume image data and modified to show dental implant planning. Embodiments of the present disclosure can reduce or eliminate the need to form a surgical guide for implant procedure in some cases, helping to save time and reduce the cost of implant surgery.
These objects are given only by way of illustrative example, and such objects may be exemplary of one or more embodiments of the invention. Other desirable objectives and advantages inherently achieved by the disclosed methods and apparatus may occur or become apparent to those skilled in the art. The invention is defined by the appended claims.
According to one aspect of the disclosure, there is provided a method for guiding the position of a dental drill for implant treatment of a patient, the method comprising: acquiring a volume image of patient anatomy; superimposing an image of a planned drill hole on a display of the acquired volume image according to observer instructions to form an implant plan; displaying at least a portion of the implant plan in stereoscopic form on a head-mounted device worn by an observer and tracking patient position so that the displayed portion of the implant plan is registered to the patient anatomy that lies in the observer's field of view; and highlighting the location of the planned drill hole on the head-mounted device display.
The foregoing and other objects, features, and advantages of the invention will be apparent from the following more particular description of the embodiments of the invention, as illustrated in the accompanying drawings. The elements of the drawings are not necessarily to scale relative to each other.
The following is a detailed description of the preferred embodiments, reference being made to the drawings in which the same reference numerals identify the same elements of structure in each of the several figures.
The terms “first”, “second”, and so on, do not necessarily denote any ordinal, sequential, or priority relation, but are used to distinguish one step, element, or set of elements from another, unless specified otherwise.
The term “volume image” is synonymous with the terms “3-Dimensional image” or “3-D image”.
The terms “viewer”, “observer”, “user”, and “viewing practitioner” have equivalent meaning and refer generally to the practitioner or technician who views displayed, computer-generated image content.
For the image processing steps described herein, the terms “pixels” for picture image data elements, conventionally used with respect to 2-D imaging and image display, and “voxels” for volume image data elements, often used with respect to 3-D imaging, can be used interchangeably. The 3-D volume image is itself synthesized from image data obtained as pixels on a 2-D sensor array and displays as a 2-D image from some angle of view. Thus, 2-D image processing and image analysis techniques can be applied to the 3-D volume image data. In the description that follows, techniques described as operating upon pixels may alternately be described as operating upon the 3-D voxel data that is stored and represented in the form of 2-D pixel data for display. In the same way, techniques that operate upon voxel data can also be described as operating upon pixels.
Embodiments of the present disclosure can be used with volume data from any of a number of sources, including computed tomography (CT), CBCT, or other volume image modalities. Methods of the present disclosure generate 3-D volume data from a set of 2-D projection images.
The term “energizable” relates to a device or set of components that perform an indicated function upon receiving power and, optionally, upon receiving an enabling signal.
The term “actuable” has its conventional meaning, relating to a device or component that is capable of effecting an action in response to a stimulus, such as in response to an electrical signal, for example.
The term “highlighting” for a displayed feature has its conventional meaning as is understood to those skilled in the information and image display arts. In general, highlighting uses some form of localized display enhancement to attract the attention of the viewer. Highlighting a portion of an image, such as an individual organ, tooth, bone, or structure, or a path from one object to the next, for example, can be achieved in any of a number of ways, including, but not limited to, annotating, displaying a nearby or overlaying symbol, outlining or tracing, display in a different color or at a markedly different intensity or gray scale value than other image or information content, blinking or animation of a portion of a display, or display at higher sharpness or contrast.
The phrase “left-eye image” denotes the image formed by a display apparatus and intended for viewing by the left eye of the viewer. Likewise, the phrase “right-eye image” refers to the complementary image that is intended for viewing from the right eye of the viewer. The term “stereo pair” denotes the combination of right-eye image and corresponding complementary left-eye image for a stereoscopic view. A stereo pair can be hyperstereoscopic where there is an abnormally large separation distance between the angular views for the complementary left- and right-eye images, relative to the pupil-to-pupil distance of an average viewer. A stereo pair can be hypostereoscopic where there is an abnormally small separation distance between the angular views for left- and right-eye images. The separation distance is sometimes referred to as the “stereo base”.
The terms “virtual view” and “virtual image” are used to connote computer-generated or computer-processed images that are displayed stereoscopically to the viewer. The virtual image can be formed by the display optics using any of a number of well-known techniques, for example, using convergence or divergence of light.
An image is considered to be “in register” with a subject that is in the field of view when the image and subject are visually aligned from the perspective of the observer. As the term “registered” is used in the current disclosure, a registered feature of a computer-generated or virtual image is sized, positioned, and oriented on the display so that its appearance represents the planned or intended size, position, and orientation for the corresponding object, correlated to the field of view of the observer. Registration is in three dimensions, so that, from the view perspective of the practitioner/observer, the registered feature is rendered at the position and angular orientation that is appropriate for the patient who is in the treatment chair and in the visual field of the observing practitioner. Thus, for example, where the computer-generated feature is a drill hole for a patient's tooth, and where the observer is looking into the mouth of the patient, the display of the drill hole appears as if superimposed or overlaid in position within the mouth.
The logic flow diagram of
In a volume reconstruction step S120, the acquired projection image data is used to generate a reconstructed 3-D volume image. This can be a standard reconstructed volume image formed from a set of 2-D projection images or may be an image generated from combined sets of 2-D projection image data, such as the fused image volume generated as described in commonly assigned U.S. 2013/0004041 entitled “Methods and Apparatus for Texture Based Filter Fusion for CBCT System and Cone-beam Image Reconstruction” by Yang et al., incorporated herein by reference in its entirety. The reconstructed volume image can then be displayed and manipulated, such as by rotation, panning, and other image manipulation utilities that are well known to those skilled in the 3-D volume image display arts.
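By way of illustration only, the following Python sketch shows a highly simplified filtered backprojection of 2-D parallel-beam projections. It is not the cone-beam (e.g., FDK) reconstruction used for actual CBCT data; the phantom, scaling, and all function names are illustrative assumptions rather than elements of the disclosure.

```python
# Minimal 2-D parallel-beam filtered backprojection sketch (illustration only;
# clinical CBCT reconstruction uses calibrated cone-beam projection data).
import numpy as np
from scipy.ndimage import rotate

def project(slice_2d, angles_deg):
    """Simulate parallel-beam projections (a sinogram) of a 2-D slice."""
    return np.stack([rotate(slice_2d, a, reshape=False, order=1).sum(axis=0)
                     for a in angles_deg])

def ramp_filter(sinogram):
    """Ramp-filter each projection along the detector axis, in the frequency domain."""
    ramp = np.abs(np.fft.fftfreq(sinogram.shape[1]))
    return np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp, axis=1))

def backproject(filtered, angles_deg, size):
    """Smear each filtered projection back across the image plane and sum."""
    recon = np.zeros((size, size))
    for proj, a in zip(filtered, angles_deg):
        recon += rotate(np.tile(proj, (size, 1)), -a, reshape=False, order=1)
    return recon * np.pi / (2 * len(angles_deg))

size = 128
phantom = np.zeros((size, size))
phantom[40:90, 50:80] = 1.0                      # crude stand-in for a dense structure
angles = np.linspace(0.0, 180.0, 180, endpoint=False)
recon = backproject(ramp_filter(project(phantom, angles)), angles, size)
print("mean reconstruction error:", np.abs(recon - phantom).mean())
```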
In a plan development step S130, an implant plan 40 is developed interactively, by viewing the reconstructed image volume of the jaw at an appropriate angle, modifying the volume image by adding an image that represents an implant or drill tool device to the displayed image, and adjusting the displayed virtual implant or drill tool position within the image of the jaw until the planned implant is accurately modeled.
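For illustration, an implant plan of this kind might be represented in software by a small structure holding the planned hole geometry. The sketch below is a hypothetical representation only; the field names, units, and the nerve-clearance check are assumptions and not elements defined by the disclosure.

```python
# Hypothetical representation of a planned drill hole produced in step S130.
from dataclasses import dataclass
import numpy as np

@dataclass
class PlannedDrillHole:
    entry_mm: np.ndarray     # hole entry point, in volume-image coordinates (mm)
    axis: np.ndarray         # unit vector giving the planned drilling direction
    diameter_mm: float
    depth_mm: float

    def bottom_mm(self) -> np.ndarray:
        """Planned position of the hole bottom (a target such as target 24)."""
        return self.entry_mm + self.axis * self.depth_mm

    def clearance_mm(self, point_mm: np.ndarray) -> float:
        """Shortest distance from a critical structure (e.g., a nerve point) to the planned hole."""
        v = point_mm - self.entry_mm
        t = np.clip(np.dot(v, self.axis), 0.0, self.depth_mm)
        return float(np.linalg.norm(point_mm - (self.entry_mm + t * self.axis)))

axis = np.array([0.0, 0.17, -0.985])
hole = PlannedDrillHole(entry_mm=np.array([12.0, 34.0, 20.0]),
                        axis=axis / np.linalg.norm(axis),
                        diameter_mm=3.5, depth_mm=11.0)
print(hole.bottom_mm(), hole.clearance_mm(np.array([12.0, 38.0, 6.0])))
```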
Procedures for interactively visualizing and adjusting the proposed location of an implant within a 3-D image of the corresponding patient anatomy are well known to those skilled in the medical and dental imaging arts. Methods for placement of a 3-D object into position relative to another 3-D object, for example, are known and widely used in computer visualization utilities.
Referring to
The logic flow diagram of
For the sequence of
A drill registration step S220, which can be executed simultaneously with step S210, registers the actual drill that is held by the dentist with the planned drill hole and related image content specified in implant plan 40. To do this, the visualization apparatus used by the practitioner detects the position of the drill in the dentist's hand and tracks this position relative to the intended drill position in implant plan 40. As is represented by the dashed lines shown in
A drill monitoring step S230 continues to track drill progress once the drill is in position and indicates when the drilling operation is done and this phase of the surgery is completed. Drill monitoring step S230 is refreshed regularly, as indicated in
Practitioner 12 views a volume image 28 from an appropriate perspective and identifies the desired location for an implant using the display 22 and suitable operator interface utilities. In addition, practitioner 12 can also indicate other features in the volume image 28, such as a facial nerve 26 and one or more targets 24, such as the location of the top or bottom of a hole for the implant, or a central axis for drilling the hole. Operator interface 20 also displays a 3-D view of an implant 30 for placement on the displayed volume image 28.
The side view of
Embodiments of the present disclosure project the volume image of the implant plan in a visualization apparatus that is worn by the practitioner or otherwise disposed so that the displayed volume image is in the visual field of the practitioner and superimposed on the view of the area of the patient's mouth. The reconstructed and modified image of the implant plan appears in stereoscopic form, that is, with display of a right-eye image and a left-eye image.
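As a simplified illustration of forming such a stereo pair, the reconstructed volume can be rendered from two view angles separated by a small angular stereo base. The sketch below uses a maximum-intensity projection as a stand-in for the actual HMD rendering pipeline; the 3-degree half-separation and all names are assumptions.

```python
# Illustrative stereo-pair rendering from a reconstructed volume (not the HMD pipeline).
import numpy as np
from scipy.ndimage import rotate

def render_view(volume, azimuth_deg):
    """Rotate the volume about its vertical axis and project along the viewing axis."""
    rotated = rotate(volume, azimuth_deg, axes=(0, 2), reshape=False, order=1)
    return rotated.max(axis=2)                     # maximum-intensity projection

def stereo_pair(volume, view_azimuth_deg=0.0, half_separation_deg=3.0):
    """Return (left-eye image, right-eye image) for the given view direction."""
    return (render_view(volume, view_azimuth_deg - half_separation_deg),
            render_view(volume, view_azimuth_deg + half_separation_deg))

volume = np.zeros((64, 64, 64))
volume[20:40, 25:35, 30:50] = 1.0                  # stand-in block in place of jaw data
left_img, right_img = stereo_pair(volume)
print(left_img.shape, right_img.shape)
```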
According to an embodiment of the present disclosure, as shown in the top view of
To correlate the obtained CBCT image data with the dentist's view of the patient, and to apply this correlation in real time, HMD 50 performs a number of visualization functions simultaneously.
HMD devices and related wearable devices that have cameras, sensors, and other integrated components are known in the art and are described, for example, in U.S. Pat. No. 6,091,546 to Spitzer et al.; U.S. Pat. No. 8,582,209 to Amirparviz; U.S. Pat. No. 8,576,276 to Bar-Zeev et al.; and in U.S. Patent Application Publication 2013/0038510 to Brin et al.
For the superimposition of computer-generated image 64 from CBCT imaging on the real-world view of the patient's mouth, computer-generated image 64 is positionally registered with the view that is detected by cameras 56l and 56r. Registration can be performed in a number of ways; methods for registration of a computer-generated image to its real-world counterpart are known to those skilled in the arts, including the use of multiple markers and object recognition, for example. According to an embodiment of the present disclosure, a registration sequence is provided, in which the practitioner follows initial procedural instructions for setting up registration coordinates, such as to view the patient from a specified angle to allow registration software to detect features of the patient anatomy. According to an alternate embodiment of the present disclosure, image feature recognition software is used to detect features of the face and mouth of the patient that help to correlate the visual field to the volume image data so that superposition of the virtual and real images is achieved. Image feature recognition software algorithms are well known to those skilled in the image processing arts. According to an embodiment of the present invention, feature recognition software processing uses stored patient image data and is also used to verify patient identification so that the correct information is shown.
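One possible registration approach, consistent with the feature-detection option above but not specifically prescribed by the disclosure, is to estimate a rigid transform between a few anatomical landmark points located in the CBCT volume and the same landmarks located in the camera-based view. The following sketch shows the standard least-squares (Kabsch) solution for that rigid transform; landmark detection itself is not shown, and all names and values are assumptions.

```python
# Rigid (rotation + translation) landmark alignment, as one illustrative registration step.
import numpy as np

def rigid_register(src_pts, dst_pts):
    """Least-squares rotation R and translation t such that dst ≈ R @ src + t."""
    src_c, dst_c = src_pts.mean(axis=0), dst_pts.mean(axis=0)
    H = (src_pts - src_c).T @ (dst_pts - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                      # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    return R, dst_c - R @ src_c

# Landmarks in CBCT coordinates and the same landmarks as located in the camera view (mm).
cbct_pts = np.array([[10., 0., 0.], [0., 12., 0.], [0., 0., 15.], [8., 8., 3.]])
true_R = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])   # simulated patient pose
camera_pts = cbct_pts @ true_R.T + np.array([5., 2., -1.])
R, t = rigid_register(cbct_pts, camera_pts)
print(np.allclose(cbct_pts @ R.T + t, camera_pts, atol=1e-6))    # True
```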
Once the CBCT computer-generated image 64 is registered with the patient anatomy, proper drill positioning and operation can be displayed to assist the practitioner. As shown in
Advantageously, the apparatus and method allow interaction between the displayed image content and the position of the dental practitioner and drill or other tool. The computer-generated display is updated as the position of the dentist's head changes relative to the patient and as the positioning of drill 70 changes relative to the hole 34 and target 24.
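The kind of comparison implied here can be pictured as a check of the tracked drill pose against the planned hole axis: a lateral offset and an angular deviation, tested against tolerances that could drive the indicator or highlighting state. The tolerance values and all names below are illustrative assumptions only.

```python
# Illustrative drill-to-plan comparison; not a disclosed tolerance specification.
import numpy as np

def alignment_error(drill_tip_mm, drill_axis, plan_entry_mm, plan_axis):
    """Return (lateral offset of the drill tip from the planned axis in mm, angular deviation in degrees)."""
    drill_axis = drill_axis / np.linalg.norm(drill_axis)
    plan_axis = plan_axis / np.linalg.norm(plan_axis)
    offset = drill_tip_mm - plan_entry_mm
    lateral = np.linalg.norm(offset - np.dot(offset, plan_axis) * plan_axis)
    angle = np.degrees(np.arccos(np.clip(np.dot(drill_axis, plan_axis), -1.0, 1.0)))
    return float(lateral), float(angle)

def in_position(lateral_mm, angle_deg, tol_mm=0.5, tol_deg=2.0):
    """Simple go/no-go test that could drive an on-display indicator."""
    return lateral_mm <= tol_mm and angle_deg <= tol_deg

lat, ang = alignment_error(np.array([12.3, 34.1, 20.4]), np.array([0.02, 0.18, -0.98]),
                           np.array([12.0, 34.0, 20.0]), np.array([0.0, 0.17, -0.985]))
print(round(lat, 2), round(ang, 1), in_position(lat, ang))
```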
Detecting drill depth can be performed using any of a number of techniques. According to an embodiment of the present invention, identifiable image features near the drill hole, such as the height of nearby teeth, the position of the gumline, or jawbone dimensions, are used to calculate and monitor drill depth as the hole is being drilled.
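As a simple numeric illustration of this idea: if the working length of the drill bit is known and a nearby feature of known height provides the image scale, the depth reached can be estimated from the bit length that remains visible. The values and names below are assumptions, not a calibration method defined by the disclosure.

```python
# Illustrative depth estimate from the visible portion of the drill bit.
def estimate_drill_depth_mm(visible_length_px, reference_height_px,
                            reference_height_mm, bit_length_mm):
    """Estimate penetration depth from the portion of the bit still visible in the camera view."""
    mm_per_px = reference_height_mm / reference_height_px   # scale from a feature of known size
    visible_mm = visible_length_px * mm_per_px
    return max(0.0, bit_length_mm - visible_mm)

# Example: a 16 mm working length with 7.5 mm still visible has reached about 8.5 mm.
print(estimate_drill_depth_mm(visible_length_px=150, reference_height_px=200,
                              reference_height_mm=10.0, bit_length_mm=16.0))
```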
The head-mounted device 50 of the present disclosure can be used in any of a number of dental or medical procedures in addition to implant surgery. By providing tools for 3-D visualization of a plan for placement of a device relative to the patient's anatomy using a CBCT scan or other volume image data, then displaying an image based on this data overlaid with the field of view of the practitioner, the method and apparatus of the present disclosure allow the practitioner to carry out steps of a procedure without waiting for fabrication of intermediate guides that are used to direct the drilling of holes in bone structures or the placement of prosthetic devices.
Head-mounted devices (HMDs) are known to those skilled in the visualization art and operate by displaying a computer-generated image that correlates to the real-world image that lies in the field of view of an observer, so that the computer-generated image appears to be superimposed on the real-world image. This appearance of superposition can be executed in any of a number of ways. According to an embodiment of the present invention, display elements 54l and 54r have pixels spaced apart so that the computer-generated image only obstructs a portion of the real-world view and both views are visible at the same time.
According to an alternate embodiment, the computer-generated view is opaque, and the display that appears on display elements 54l and 54r is rapidly alternated with a clear display through lenses 52l and 52r, such as 20 times per second or more, so that the appearance of simultaneous viewing is provided to the HMD viewer.
Display elements 54l and 54r can be devices that incorporate a spatial light modulator, such as a digital micro-mirror array or similar device, or can be emissive devices, such as organic light-emitting diode (OLED) arrays, for example.
Gaze sensing and other methods can be used to detect head or eye movement for the person wearing the HMD and to report changes to processor 60 so that the displayed stereoscopic images can be adjusted. Gaze sensing can be used, for example, to adjust the view angle for the volume image content.
In one embodiment, at least a portion of the implant plan is displayed in stereoscopic form on a head-mounted device worn by an observer, and patient position is tracked so that the displayed portion of the implant plan is registered to the patient anatomy that lies in the observer's field of view. In response to an observer instruction, registration of the implant plan to the patient's mouth anatomy can then be disabled, allowing the view angle of the implant plan to be changed. From another aspect, an embodiment also enables a visualization mode that is independent of the real-world field of view. Using this mode, as shown in
Applicants have described a method for guiding the position of a dental drill for implant treatment of a patient, comprising: acquiring a volume image of patient anatomy; superimposing an image of a planned drill hole on a display of the acquired volume image according to observer instructions to form an implant plan; displaying at least a portion of the implant plan in stereoscopic form on a head-mounted device worn by an observer and tracking patient anatomy position and movement so that the displayed portion of the implant plan is registered to the patient anatomy that lies in the observer's field of view; and highlighting the location of the planned drill hole on the head-mounted device display.
The stereoscopic image of the at least a portion of the implant plan can alternate with the real-world view from the head-mounted device at least 20 times per second. The volume image can be acquired using cone-beam computed tomography imaging. Displaying the at least a portion of the implant plan on the head-mounted device can include energizing an emissive display device or energizing a spatial light modulator. Highlighting the location of the planned drill hole can include displaying a drill axis. The method can also track the position of the dental drill relative to the highlighted location and indicate, on the display, when the dental drill is in position for drilling the planned drill hole. Tracking the position of the dental drill can include analyzing images from one or more cameras on the head-mounted device. The method can further provide a message that indicates that the drill has reached a predetermined drill depth.
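The tracking and messaging behavior described above can be pictured as a simple polling loop that refreshes the display state and reports when the predetermined depth is reached. The tracker and display callables in the sketch below are stand-ins (assumptions), not components specified by the disclosure.

```python
# Illustrative monitoring/refresh loop with a completion message.
import time

def monitor_drilling(read_depth_mm, planned_depth_mm, update_display, period_s=0.05):
    """Poll the tracked drill depth, refresh the display, and report completion."""
    while True:
        depth = read_depth_mm()
        done = depth >= planned_depth_mm
        update_display(depth, planned_depth_mm, done)
        if done:
            return depth
        time.sleep(period_s)

# Example with simulated tracking: depth readings advance 0.5 mm per poll.
readings = iter(i * 0.5 for i in range(1, 40))
monitor_drilling(lambda: next(readings), 11.0,
                 lambda d, p, done: print(f"{d:4.1f} / {p} mm", "-- done" if done else ""),
                 period_s=0.0)
```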
Embodiments allow the viewer to adjust stereoscopic left-/right-eye separation so that it is more acceptable to the visual characteristics of a particular practitioner. For example, stereo image separation can be widened or narrowed, to provide slightly hyperstereoscopic or hypostereoscopic view conditions, respectively. Separation adjustment can be performed using the operator interface, for example.
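A minimal sketch of such an adjustment is shown below, continuing the angular stereo-base convention used in the earlier stereo-pair sketch; this parameterization is an assumption, not a disclosed interface.

```python
# Illustrative stereo-base adjustment: scale the angular separation between eye views.
def stereo_half_angles(base_scale=1.0, nominal_half_deg=3.0):
    """base_scale > 1 gives a hyperstereoscopic pair, < 1 a hypostereoscopic pair."""
    half = nominal_half_deg * base_scale
    return -half, +half          # angular offsets for the left-eye and right-eye views

print(stereo_half_angles(1.5))   # widened separation: (-4.5, 4.5)
print(stereo_half_angles(0.5))   # narrowed separation: (-1.5, 1.5)
```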
According to an alternate embodiment, one or more markers are used as guides to positioning. In addition, visual indicators are provided for assisting in placement and use of the dental drill. Using the sequence shown in
According to an alternate embodiment, a moire pattern can be displayed and used as a reference marker. The moire pattern is registered to a feature, such as a portion of a tooth or filling, and displays to the viewer wearing head-mounted device 50. The moire pattern is advantaged for stereoscopic viewing, since the appearance of the pattern is dependent on viewing angle. This would allow the use of a single camera, instead of the two cameras that are required for conventional stereoscopic viewing of a marker.
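The viewing-angle sensitivity of a moire pattern can be illustrated numerically: superimposing two fine gratings of slightly different pitch produces coarse beat fringes whose phase moves noticeably for even a small parallax-like shift. The pitches and shift below are arbitrary illustrative values, not parameters defined by the disclosure.

```python
# Illustrative moire fringe generation from two superimposed gratings.
import numpy as np

def moire_fringes(width=400, pitch_a=10.0, pitch_b=10.5, shift_px=0.0):
    """Product of two fine gratings; the result shows a low-frequency beat (moire) pattern."""
    x = np.arange(width)
    grating_a = 0.5 * (1.0 + np.cos(2.0 * np.pi * x / pitch_a))
    grating_b = 0.5 * (1.0 + np.cos(2.0 * np.pi * (x + shift_px) / pitch_b))
    return grating_a * grating_b

fringes = moire_fringes()
fringes_shifted = moire_fringes(shift_px=2.0)           # a small parallax-like shift
print(np.abs(fringes - fringes_shifted).max() > 0.1)    # the fringe pattern moves visibly
```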
Continuing with the
Continuing with the
At the conclusion of step S130, implant plan 940 is formed, in which an image representing the implant or drill hole for the implant is registered within the volume image for the patient's jaw or corresponding portion of the patient's jaw to provide a virtual view that can be controlled and manipulated in 3-D space. Implant plan 940 includes reference positioning information that is inherently obtained from the positioning of markers 68. Implant plan 940 can include additional metadata supporting the image data, with information about the patient, data on relative bone density, implant material type, hole diameter and depth, and other information.
The logic flow diagram of
In the
As is represented by the dashed lines shown for step S1110 in
A drill registration step S1120, which can be executed simultaneously with step S1110, registers the actual drill that is held by the dentist with the planned drill hole 34 and related image content specified in implant plan 940. To do this, the visualization apparatus used by the practitioner detects the position of the drill in the dentist's hand and tracks this position relative to the intended drill position in implant plan 940. A number of indicators suggestive of drill movement can be displayed, appearing within the field of view of the practitioner, as described subsequently.
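One way such detection could work, by way of illustration only, is to track a marker fixed to the drill body (an approach also suggested by the marker 72 embodiment described later) and to transform a known tip offset from the marker's own frame into camera coordinates. The offsets, pose values, and names below are assumptions; marker detection itself is not shown.

```python
# Illustrative recovery of drill tip and axis from a tracked drill-mounted marker pose.
import numpy as np

def drill_tip_and_axis(R_marker, t_marker_mm, tip_offset_mm, axis_in_marker):
    """Map the known tip offset and drill axis from the marker frame into camera coordinates."""
    tip_cam = R_marker @ tip_offset_mm + t_marker_mm
    axis_cam = R_marker @ axis_in_marker
    return tip_cam, axis_cam / np.linalg.norm(axis_cam)

R = np.eye(3)                                # marker facing the camera squarely (example pose)
t = np.array([10.0, -5.0, 250.0])            # marker about 250 mm from the camera
tip, axis = drill_tip_and_axis(R, t,
                               tip_offset_mm=np.array([0.0, 0.0, -65.0]),   # tip 65 mm below the marker
                               axis_in_marker=np.array([0.0, 0.0, -1.0]))
print(tip, axis)
```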
According to an alternate embodiment, as shown in
As represented by the dashed lines shown in
Other options for stereoscopic indicators include indicators that utilize visual patterns that are responsive to the relative accuracy of positioning and alignment. Moire patterns, for example, can be advantaged because they can have stereoscopic effects, without requiring stereoscopic image generation.
As a form of highlighting, indicators can change state, such as changing color, flashing, or sending out some other visible sign when drilling has been completed or when an error or problem has been detected.
According to an alternate embodiment, a marker 72, optionally positioned on the drill as shown in
According to at least one embodiment, the system utilizes a computer program with stored instructions that operate on image data accessed from an electronic memory. As can be appreciated by those skilled in the image processing arts, a computer program of an embodiment of the present disclosure can be utilized by a suitable, general-purpose computer system, such as a personal computer or workstation. However, many other types of computer systems can be used to execute the computer program of the present disclosure, including an arrangement of networked processors, for example. The computer program for performing the method of the present disclosure may be stored in a computer readable storage medium. This medium may comprise, for example: magnetic storage media such as a magnetic disk, such as a hard drive or removable device, or magnetic tape; optical storage media such as an optical disc, optical tape, or machine-readable optical encoding; solid state electronic storage devices such as random access memory (RAM) or read only memory (ROM); or any other physical device or medium employed to store a computer program. The computer program for performing the method of the present disclosure may also be stored on a computer readable storage medium that is connected to the image processor by way of the internet or other network or communication medium. Those skilled in the art will further readily recognize that the equivalent of such a computer program product may also be constructed in hardware.
It is noted that the term “memory”, equivalent to “computer-accessible memory” in the context of the present disclosure, can refer to any type of temporary or more enduring data storage workspace used for storing and operating upon image data and accessible to a computer system, including a database. The memory could be non-volatile, using, for example, a long-term storage medium such as magnetic or optical storage. Alternately, the memory could be of a more volatile nature, using an electronic circuit, such as random-access memory (RAM) that is used as a temporary buffer or workspace by a microprocessor or other control logic processor device. Display data, for example, is typically stored in a temporary storage buffer that is directly associated with a display device and is periodically refreshed as needed in order to provide displayed data. This temporary storage buffer can also be considered to be a memory, as the term is used in the present disclosure. Memory is also used as the data workspace for executing and storing intermediate and final results of calculations and other processing. Computer-accessible memory can be volatile, non-volatile, or a hybrid combination of volatile and non-volatile types.
It is understood that the computer program product of the present disclosure may make use of various image manipulation algorithms and processes that are well known. It will be further understood that the computer program product embodiment of the present disclosure may embody algorithms and processes not specifically shown or described herein that are useful for implementation. Such algorithms and processes may include conventional utilities that are within the ordinary skill of the image processing arts. Additional aspects of such algorithms and systems, and hardware and/or software for producing and otherwise processing the images or co-operating with the computer program product of the present invention, are not specifically shown or described herein and may be selected from such algorithms, systems, hardware, components and elements known in the art.
This application claims the benefit of and is a U.S. National Phase filing of PCT Application PCT/IB14/02021, filed May 15, 2014, entitled “METHOD FOR IMPLANT SURGERY USING AUGMENTED VISUALIZATION”, in the name of Bothorel et al., which itself claims the benefit of U.S. Provisional Application Ser. No. 61/929,725, filed on Jan. 21, 2014, entitled “METHOD FOR IMPLANT SURGERY USING AUGMENTED VISUALIZATION”, all of which are incorporated herein in their entirety.
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/IB2014/002021 | 5/15/2014 | WO | 00 |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2015/110859 | 7/30/2015 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
6091546 | Spitzer | Jul 2000 | A |
7720191 | Muller | May 2010 | B2 |
7804933 | Nyholm | Sep 2010 | B2 |
8172573 | Sonenfeld | May 2012 | B2 |
8576276 | Bar-Zeev et al. | Nov 2013 | B2 |
8582209 | Amirparviz | Nov 2013 | B1 |
8705177 | Miao | Apr 2014 | B1 |
20030004041 | Hartman et al. | Jan 2003 | A1 |
20030227470 | Genc | Dec 2003 | A1 |
20050203380 | Sauer | Sep 2005 | A1 |
20060184396 | Dennis | Aug 2006 | A1 |
20060291968 | Greenberg | Dec 2006 | A1 |
20080088529 | Tang | Apr 2008 | A1 |
20090191503 | Matov | Jul 2009 | A1 |
20100103247 | Lim | Apr 2010 | A1 |
20100141905 | Burke | Jun 2010 | A1 |
20100149213 | Navab | Jun 2010 | A1 |
20130038510 | Brin et al. | Feb 2013 | A1
20130122463 | Csillag | May 2013 | A1 |
20130131504 | Daon | May 2013 | A1 |
20130141421 | Mount | Jun 2013 | A1 |
20130278631 | Border | Oct 2013 | A1 |
20140022283 | Chan | Jan 2014 | A1 |
20140081659 | Nawana | Mar 2014 | A1 |
20140111639 | Tanaka | Apr 2014 | A1 |
20140178832 | Choi | Jun 2014 | A1 |
20140307315 | Bohn | Oct 2014 | A1 |
20170168296 | Giwnewer | Jun 2017 | A1 |
Number | Date | Country |
---|---|---|
2010067267 | Jun 2010 | WO |
2013022544 | Feb 2013 | WO |
Entry |
---|
D. Katic, G. Sudra, S. Speidel, G. Castrillon-Oberndorfer, G. Eggers, and R. Dillmann, Knowledge-Based Situation Interpretation for Context-Aware Augmented Reality in Dental Implant Surgery, 2010, Proceedings of the 5th International Workshop on Medical Imaging and Augmented Reality MIAR 2010, pp. 531-540. |
Nardy Casap, Sahar Nadel, Eyal Tarazi, and Ervin I. Weiss, Evaluation of a Navigation System for Dental Implantation as a Tool to Train Novice Dental Practitioners, 2011, Journal of Oral Maxillofacial Surgery 69(10):2548-2546 (Year: 2011). |
Eszter Somogyi-Ganss, Evaluation of the Accuracy of NaviDent, a Novel Dynamic Computer-Guided Navigation System for Placing Implants, 2013, Master's Thesis, Department of Prosthodontics, University of Toronto, Toronto, CA, (Year: 2013). |
Junchen Wang, Hideyuki Suenaga, Kazuto Hoshi, Liangjing Yang, Etsuko Kobayashi, Ichiro Sakuma, and Hongen Liao, Augmented Reality Navigation With Automatic Marker-Free Image Registration Using 3-D Image Overlay for Dental Surgery, Apr. 2014, IEEE Transactions on Biomedical Engineering, 61(4):1295-1304 (Year: 2014). |
International Search Report, International application No. PCT/IB2014/002021, dated Jan. 8, 2015, 2 pages. |
Number | Date | Country | |
---|---|---|---|
20160324598 A1 | Nov 2016 | US |
Number | Date | Country | |
---|---|---|---|
61929725 | Jan 2014 | US |