Dental aligners are plastic trays custom made to fit tightly over the teeth of a user to force the teeth to move to a desired location. Creating a three-dimensional model of teeth is useful in creating dental aligners. Conventional ways to create a three-dimensional model of teeth include making a physical impression of the teeth or taking an intraoral scan of the teeth.
To make a physical impression of teeth, a patient must either go to the office of an orthodontic professional to take the impression or use an at-home dental impression kit. To scan teeth intraorally, the orthodontic professional inserts a scanning device into the mouth of the patient such that images of the teeth can be captured from inside the mouth. Scanning teeth in the office of an orthodontic or dental professional is inconvenient (e.g., the user must travel to the office), costly (e.g., the user must pay the professional for the service), and limiting (e.g., if a scan is incomplete or corrupted, the user must take time to travel back to the office for an additional scan). Using an at-home dental impression kit can be difficult for a user to execute properly, which can result in the need for repeat impressions.
A scanning device for a user to accurately scan the user's own teeth without requiring a visit to a dental or orthodontic professional is desirable to avoid the complications associated with physical impressions and scanning in a professional setting.
An embodiment relates to a method. The method includes providing an imaging system configured to acquire images of an intraoral cavity of a user. The imaging system is configured to be operable by the user and to allow the user to visually monitor the intraoral cavity while the images of the intraoral cavity are acquired. The imaging system includes an image acquisition device configured to acquire the images and an orientation element configured to orient the image acquisition device with respect to the intraoral cavity to facilitate acquisition of the images. The method further includes acquiring the images at a location where a dentist or orthodontist is present, and at least one of determining an oral condition of the user based on the acquired images or providing feedback to the user based on at least one of the acquired images or the user's operation of the imaging system.
Another embodiment relates to a method. The method includes providing an imaging system configured to acquire images of an intraoral cavity of a user. The imaging system is configured to be operable by the user and to allow the user to visually monitor the intraoral cavity while the images of the intraoral cavity are acquired. The imaging system includes an image acquisition device configured to acquire the images, and an orientation element configured to orient the image acquisition device with respect to the intraoral cavity to facilitate acquisition of the images. The method further includes acquiring the images at a location remote from a dentist or orthodontist, providing communication between the user and the dentist or orthodontist during acquisition of the images, and at least one of determining an oral condition of the user based on the acquired images or providing feedback to the user based on at least one of the acquired images or the user's operation of the imaging system.
Another embodiment relates to a method. The method includes providing an imaging system configured to acquire images of an intraoral cavity. The imaging system is configured to allow a user to visually monitor the intraoral cavity while acquiring the images of the intraoral cavity. The imaging system includes an image acquisition device configured to acquire the images, and an orientation element configured to orient the image acquisition device with respect to the intraoral cavity to facilitate acquisition of the images. The method further includes acquiring the images, and determining an oral condition of the user based on comparing the acquired images with previously acquired images of the intraoral cavity.
Before turning to the figures, which illustrate certain exemplary embodiments in detail, it should be understood that the present disclosure is not limited to the details or methodology set forth in the description or illustrated in the figures. It should also be understood that the terminology used herein is for the purpose of description only and should not be regarded as limiting.
For a patient to begin a treatment plan using dental aligners, it is beneficial to create a three-dimensional (“3D”) model of the teeth of the patient. As described, the 3D model can be generated using conventional impression techniques or a scanning device. However, a patient may need to visit an orthodontic professional to have the impressions taken or teeth scanned with the scanning device. In some instances, a patient may need to visit a different location where a scan is conducted by a professional who has been trained in the scanning process (e.g., a professional who is not a dental or orthodontic professional). In another example, a professional trained in the scanning process may bring a scanner to the patient to conduct a scan. A conventional scanner is generally large and can require significant structural support (e.g., attachment to a vehicle or a rolling cart, etc.). In any of these instances, the scan must be scheduled such that a professional trained in use of a scanner (e.g., a dental or orthodontic professional, an individual with specific training, etc.) can conduct the scan of the patient. A patient may therefore be more interested in scanning teeth with a portable device (e.g., a device that does not require structural support, and is significantly lighter than a conventional device such that the entire device can be held in the hand of the patient) that can be operated by the user (e.g., without assistance and/or direction from a trained professional) to scan teeth. Such a device provides the patient the ability to scan the patient's teeth as the patient desires. For instance, after conducting an initial scan, the patient may desire to conduct a subsequent scan. With a conventional scanner, the patient would have to schedule another scan and then wait until the scheduled date and time to conduct the scan. With a portable device, the patient can conduct subsequent scans as often as the patient desires.
As used herein, the term “extraoral scan” refers to a scan of the teeth of a patient where the lens of the scanner does not enter the mouth of the patient. As used herein, the term “intraoral scan” refers to scanning the teeth by inserting a lens of a scanning device (e.g., a camera, a 3D scanner, or any other scanning device that is capable of capturing images of the teeth) into the mouth to capture images of the teeth. As used herein, the terms “user” and “patient” may be used interchangeably such that the user may be the patient and the patient may be the user. As used herein, the term “dental aligner” is intended to cover any type of dental appliance and can refer to any orthodontic device or device intended for use in a patient's mouth, including but not limited to dental aligners for repositioning one or more teeth of the patient, a denture, a mouth guard, or a retainer.
Referring to
The scanning device 102 can be any device configured to scan a 3D surface. Examples of the scanning device 102 include, but are not limited to, non-contact active 3D scanners (e.g., time-of-flight, triangulation, conoscopic holography, or any other kind of non-contact active 3D scanner), hand held laser 3D scanners, structured light 3D scanners, modulated light 3D scanners, and non-contact passive 3D scanners (e.g., stereoscopic, photometric, silhouette, or any other kind of non-contact passive 3D scanner).
The scanning device 102 includes a processing circuit 104, a scanning circuit 110, a communications circuit 112, a machine learning circuit 114, and an analysis circuit 116. The processing circuit 104 is further shown to include a processor 106 and a memory 108. The processor 106 can be any type of processor capable of performing the functions described herein. The processor 106 may be a single or multi-core processor(s), digital signal processor, microcontroller, or other processor or processing/controlling circuit. Similarly, the memory 108 can be any type of volatile or non-volatile memory or data storage capable of performing the functions described herein. In operation, the memory 108 may store various data and software used during operation of the scanning device 102, such as operating systems, applications, programs, libraries, and drivers. The memory 108 is communicatively coupled to the processor 106 such that the processor 106 can execute files located in the memory 108.
The scanning circuit 110 is communicably coupled to the processor 106 and is configured to conduct a scan of one or more objects. In this regard, the scanning circuit 110 gathers images of the object(s) being scanned (e.g., the size, shape, color, depth, tracking distance, and other physical characteristics) such that the data can be provided to other circuits in the scanning device 102 (e.g., the analysis circuit 116 and the machine learning circuit 114). To appropriately scan the target objects, the scanning circuit 110 can include a wide variety of sensors including, but not limited to, gyroscopes, accelerometers, magnetometers, inertial measurement units (“IMU”), depth sensors, and color sensors.
The communications circuit 112 is communicably coupled to the scanning circuit 110, the machine learning circuit 114, and the analysis circuit 116 and is configured to send communications to, and receive communications from, the mobile device 122. For example, the communications circuit 112 can communicate information regarding the sufficiency of a scan to the mobile device 122 such that the mobile device 122 can display the information to the patient conducting the scan. As another example, the communications circuit 112 can provide images of the scan to the mobile device 122 in real time for the mobile device 122 to display to the patient during the scan. The communications circuit 112 can communicate with the mobile device 122 in a variety of ways including, but not limited to, Bluetooth, a WiFi network, a wired local area network (LAN), Zigbee, or any other suitable way for devices to exchange information.
The machine learning circuit 114 is configured to receive the images gathered by the scanning circuit 110 during a scan and determine characteristics of the scan. For example, the machine learning circuit 114 can determine, based on the images received, whether the patient initiated a scan from the correct location or whether the patient held the scanning device 102 in the correct orientation during the scan. The machine learning circuit 114 is also communicably coupled to the communications circuit 112 and is further configured to receive updates to improve the capabilities of the machine learning circuit 114.
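As a purely illustrative sketch (not a description of the claimed machine learning circuit 114), the orientation check described above could be reduced to a simple rule over inertial-sensor readings. The field names and angular tolerances below are assumptions for illustration only.

```python
# Minimal sketch of flagging an incorrect starting orientation from IMU readings.
# All names and thresholds are illustrative assumptions, not claimed values.
from dataclasses import dataclass

@dataclass
class OrientationSample:
    pitch_deg: float   # rotation about the left-right axis
    roll_deg: float    # rotation about the front-back axis

def orientation_ok(sample: OrientationSample,
                   pitch_limit: float = 15.0,
                   roll_limit: float = 10.0) -> bool:
    """Return True when the device is held within an assumed angular tolerance."""
    return abs(sample.pitch_deg) <= pitch_limit and abs(sample.roll_deg) <= roll_limit

# Example: a scan started with the device pitched 25 degrees would be flagged.
print(orientation_ok(OrientationSample(pitch_deg=25.0, roll_deg=3.0)))  # False
```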
The analysis circuit 116 is configured to receive information from the scanning circuit 110 and the machine learning circuit 114 and analyze the information. For example, the analysis circuit 116 can receive image data from the scanning circuit 110 and determine a level of confidence that the scan being conducted by the patient will be acceptable. As another example, the analysis circuit 116 can merge multiple scans together such that the acceptable portions of the scans are combined to generate an acceptable scan. The analysis circuit 116 can provide the results of the analysis to the machine learning circuit 114 and the communications circuit 112.
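The merge step described above can be illustrated with a hypothetical sketch in which, for each region of the mouth, the version from whichever scan attempt received the best quality score is kept. The region names and scores are assumptions and do not describe the analysis circuit 116 itself.

```python
# Hypothetical sketch: combine acceptable portions of multiple scan attempts by
# keeping, per region, the capture with the highest quality score.
def merge_scans(attempts: list[dict[str, dict]]) -> dict[str, dict]:
    """attempts: list of {region: {'quality': float, 'data': ...}} mappings."""
    merged: dict[str, dict] = {}
    for attempt in attempts:
        for region, capture in attempt.items():
            if region not in merged or capture["quality"] > merged[region]["quality"]:
                merged[region] = capture
    return merged

first_try = {"upper_right": {"quality": 0.60, "data": b"..."},
             "upper_left": {"quality": 0.90, "data": b"..."}}
second_try = {"upper_right": {"quality": 0.95, "data": b"..."}}
print({k: v["quality"] for k, v in merge_scans([first_try, second_try]).items()})
# {'upper_right': 0.95, 'upper_left': 0.9}
```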
The mobile device 122 can be any type of portable device configured to run a mobile application (“application”). As used herein, the term “application” refers to software that can be loaded on to a piece of hardware (e.g., the mobile device 122), where the software communicates with both the hardware and a server to perform the desired functions. Examples of the mobile device 122 include, but are not limited to, a mobile phone, a tablet computer, a laptop computer, a smart watch, a fitness tracker, and any other Internet-connected device that is capable of running an application. The mobile device 122 can be a personal mobile device 122 of the patient. For example, the mobile device 122 can be the patient's own mobile phone.
The mobile device 122 is shown to include a processing circuit 124, an analysis circuit 130, a communication circuit 132, and a display circuit 134. The processing circuit 124 is further shown to include a processor 126 and a memory 128. The processor 126 can be any type of processor capable of performing the functions described herein. The processor 126 may be a single or multi-core processor(s), digital signal processor, microcontroller, or other processor or processing/controlling circuit. Similarly, the memory 128 can be any type of volatile or non-volatile memory or data storage capable of performing the functions described herein. In operation, the memory 128 may store various data and software used during operation of the mobile device 122, such as operating systems, applications, programs, libraries, and drivers. The memory 128 is communicatively coupled to the processor 126 such that the processor 126 can execute files located in the memory 128.
The communication circuit 132 is communicably coupled to the analysis circuit 130 and the display circuit 134 and is configured to send communications to, and receive communications from, the scanning device 102, and to send communications to, and receive communications from, the cloud server 142. For example, the communication circuit 132 can communicate information regarding images received from the scanning device 102 to the cloud server 142 such that the cloud server 142 can perform further operations. The communication circuit 132 can communicate with the scanning device 102 and the cloud server 142 in a variety of ways including, but not limited to, Bluetooth, a WiFi network, a wired local area network (LAN), Zigbee, or any other suitable way for devices to exchange information. In some embodiments, the communication circuit 132 includes a radiofrequency transceiver to communicate with one or more radio towers to transmit and/or receive information using radio waves. In some embodiments, the radiofrequency transceiver includes a single band transceiver. In some embodiments, the radiofrequency transceiver includes a dual band transceiver.
The analysis circuit 130 is configured to receive information from the communication circuit 132 and analyze the information. For example, the communication circuit 132 can receive confidence level data from the scanning device 102 and provide the data to the analysis circuit 130. The confidence level data can be based on a comparison of the accuracy of the current scan to the accuracy of the scan of an average user. The analysis circuit 130 can determine whether the patient is scanning more or less accurately throughout the scan and can provide the results of the analysis to the display circuit 134 and the cloud server 142. Additionally, the analysis circuit 130 is operable to perform additional operations using the images recorded by the scanning device 102. For example, the analysis circuit 130 can determine the depth of the scan based on the information provided to it by the scanning device 102.
The display circuit 134 is configured to receive information from the scanning device 102 and the analysis circuit 116 and display the information to the patient. For example, the display circuit 134 can receive information regarding the accuracy of the scan being performed. The display circuit 134 can provide the scan accuracy information to the patient by showing the scan accuracy information on the display of the mobile device 122. The accuracy information can be provided as color coded information, where teeth that have been scanned successfully will show as green and teeth that have either been missed or scanned unsuccessfully will show as red. As another example, the display circuit 134 can receive images of the teeth of the patient from the scanning device 102, and the display circuit 134 can display the images of the teeth of the patient on the display of the mobile device 122 in real time as the patient scans the teeth.
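A minimal sketch of the color coding described above is shown below; the tooth identifiers and statuses are hypothetical and the mapping is illustrative only, not a description of the display circuit 134.

```python
# Illustrative sketch: map per-tooth scan status to the green/red color coding.
def color_code(scan_status: dict[str, bool]) -> dict[str, str]:
    """Map each tooth to 'green' (scanned successfully) or 'red' (missed/failed)."""
    return {tooth: ("green" if ok else "red") for tooth, ok in scan_status.items()}

print(color_code({"upper_right_molar": True, "upper_left_incisor": False}))
# {'upper_right_molar': 'green', 'upper_left_incisor': 'red'}
```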
The cloud server 142 is communicably coupled to the mobile device 122 and the aligner fabrication center 162. In addition to being communicably coupled to the mobile device 122, the cloud server 142 can be communicably coupled to a plurality of mobile devices, with each of the plurality of mobile devices being communicably coupled to a separate scanning device. The cloud server 142 is configured to receive scan data from the mobile device 122 and perform additional operations on the data in order to prepare the scan data to send to the aligner fabrication center 162. The additional operations the cloud server 142 can perform include, but are not limited to, high-resolution reconstruction of scanned images and converting the scanned images to one or more 3D images. The cloud server 142 can communicate the results of the additional operations to the aligner fabrication center 162.
In addition, the cloud server 142 is configured to analyze the data received from the plurality of mobile devices to which the cloud server 142 is communicably coupled, and provide the output of the analysis to the machine learning circuit 114 via the mobile device 122. For example, the cloud server 142 may determine, based on the analysis of a plurality of scan data from various scans, that patients must hold the scanning device 102 within a certain angular tolerance. The cloud server 142 can provide that information to the machine learning circuit 114 (and the other machine learning circuits associated with the plurality of mobile devices connected to the cloud server 142) such that the scanning device 102 can ensure the patient begins the scan with the scanning device 102 in the proper orientation.
The aligner fabrication center 162 is communicably coupled to the cloud server 142 and is configured to receive the 3D images from the cloud server 142 and generate one or more orthodontic aligners based on the 3D images. The aligner fabrication center 162 is shown to include an aligner fabrication computer system 164 and an aligner fabrication system 174. The aligner fabrication computer system 164 is configured to determine, based on the 3D images, the optimal way to reposition the teeth of the patient from a first configuration to a second configuration. The aligner fabrication computer system 164 includes a communications circuit 166, an image recognition circuit 168, an image stitching circuit 170, and a 3D model creation circuit 172. The aligner fabrication system 174 is configured to create, based on one or more physical 3D models, one or more aligners operable to reposition the teeth of a patient.
The communications circuit 166 is communicably coupled to the cloud server 142 such that the cloud server 142 provides image data to the aligner fabrication computer system 164 via the communications circuit 166. The communications circuit 166 can also communicate information to the cloud server 142 for the cloud server 142 to provide to the machine learning circuit 114 via the mobile device 122. For example, if the images provided to the aligner fabrication computer system 164 were not sufficient to generate viable aligners, the communications circuit 166 notifies the cloud server 142 of the deficiency, the cloud server 142 updates the machine learning circuit 114, and the machine learning circuit 114 prevents similar deficiencies from being accepted in future scans.
The image recognition circuit 168 is operable to receive the image data from the communications circuit 166 and determine the location of the images for further processing. For example, the image recognition circuit 168 can receive four images of the mouth of a patient, where the four images represent the scan of the entire mouth. The image recognition circuit 168 can determine which image represents the lower right quadrant of the mouth, the lower left quadrant of the mouth, the upper right quadrant of the mouth, and the upper left quadrant of the mouth, and can organize the images such that the images can be stitched together.
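As a hypothetical sketch of the organizing step, each image could be keyed by its mouth quadrant before stitching, assuming each image carries simple pose metadata (arch and side) captured during the scan. This is not the recognition method of the image recognition circuit 168; the metadata fields are assumptions.

```python
# Hypothetical sketch: organize four scan images by mouth quadrant before stitching.
def organize_by_quadrant(images: list[dict]) -> dict[str, dict]:
    """Key each image by quadrant, e.g. 'upper_right', so stitching can proceed in order."""
    organized = {}
    for img in images:
        quadrant = f"{img['arch']}_{img['side']}"   # e.g. 'lower_left'
        organized[quadrant] = img
    return organized

scan = [{"arch": "upper", "side": "right", "pixels": ...},
        {"arch": "upper", "side": "left", "pixels": ...},
        {"arch": "lower", "side": "right", "pixels": ...},
        {"arch": "lower", "side": "left", "pixels": ...}]
print(sorted(organize_by_quadrant(scan).keys()))
# ['lower_left', 'lower_right', 'upper_left', 'upper_right']
```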
The image stitching circuit 170 receives the recognized images from the image recognition circuit 168 and stitches the images together to create images of the top and bottom teeth. In some embodiments, the image stitching circuit 170 creates one image for the top teeth and one image for the bottom teeth. In some embodiments, the image stitching circuit creates a single image that includes both the top teeth and the bottom teeth. The image stitching circuit 170 can implement any known method of stitching to stitch the images together. For example, the image stitching circuit 170 can stitch the images together using keypoint detection (e.g., finding distinct regions in images used to match images together), registration (e.g., matching features in images that minimize the sum of absolute differences between overlapping pixels), or any other known stitching method. The stitched image is provided to the 3D model creation circuit 172 for further processing.
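The keypoint detection and registration steps named above can be illustrated with a minimal OpenCV sketch (the opencv-python and numpy packages are assumed to be installed). This is one possible approach for demonstration only; the image stitching circuit 170 is not limited to it.

```python
# Minimal keypoint-based stitching sketch: detect ORB keypoints, match them,
# estimate a RANSAC homography, and warp the second image into the first frame.
import cv2
import numpy as np

def stitch_pair(img_a, img_b):
    """Warp img_b into img_a's frame using ORB keypoints and a RANSAC homography."""
    orb = cv2.ORB_create(nfeatures=2000)
    kps_a, desc_a = orb.detectAndCompute(img_a, None)
    kps_b, desc_b = orb.detectAndCompute(img_b, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(desc_b, desc_a), key=lambda m: m.distance)[:200]

    src = np.float32([kps_b[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kps_a[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    homography, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

    h, w = img_a.shape[:2]
    # Warp the second image onto a canvas wide enough to hold both views.
    return cv2.warpPerspective(img_b, homography, (w * 2, h))
```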
The 3D model creation circuit 172 receives the stitched image and creates a 3D model from the stitched image. The 3D model creation circuit 172 can use any known method of generating a 3D model including, but not limited to, depth-based conversion, depth from motion, depth from focus, depth from perspective, or any other known method. The 3D model creation circuit 172 generates a treatment plan based on the 3D model of the teeth of the patient. The treatment plan includes a first tooth configuration based on the 3D model of the teeth, and a second tooth configuration based on an optimal tooth configuration (e.g., the treatment plan can include moving the teeth from a crooked configuration to a straight configuration). The 3D model creation circuit 172 determines a number of steps in the treatment plan required to move the teeth from the first configuration to the second configuration, and generates 3D models of the teeth for each step in the treatment plan. The 3D model creation circuit 172 can also create physical representations of the 3D models via any known method including, but not limited to, 3D printing, machining, molding, or any other method capable of creating a physical 3D model. The 3D model creation circuit 172 sends the physical 3D models to the aligner fabrication system 174.
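The step count and intermediate configurations of a treatment plan can be illustrated with the following sketch, in which each tooth position is linearly interpolated between the first and second configurations. The per-step movement limit is an assumption for illustration, not a value taken from this disclosure.

```python
# Illustrative sketch: derive the number of treatment steps and the intermediate
# tooth positions between a current configuration and a target configuration.
import math

def plan_steps(first: dict[str, float], second: dict[str, float],
               max_move_per_step_mm: float = 0.25) -> list[dict[str, float]]:
    """Linearly interpolate each tooth position across the computed number of steps."""
    largest_move = max(abs(second[t] - first[t]) for t in first)
    n_steps = max(1, math.ceil(largest_move / max_move_per_step_mm))
    return [{t: first[t] + (second[t] - first[t]) * (i / n_steps) for t in first}
            for i in range(1, n_steps + 1)]

# Example: one tooth must move 1.0 mm, so four 0.25 mm steps are planned.
print(len(plan_steps({"tooth_8": 0.0}, {"tooth_8": 1.0})))  # 4
```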
The aligner fabrication system 174 receives the physical 3D models from the 3D model creation circuit 172 and generates aligners for repositioning the teeth of a patient. The aligner fabrication system 174 includes a thermoforming machine 176, a cutting machine 178, and a laser etching machine 180. The thermoforming machine 176 is operable to create an aligner by placing a sheet of polymeric material on top of one or more 3D models. The polymeric material is heated and drawn tightly over the 3D models (e.g., via a vacuum system, a press, or any other known method). The polymeric material is allowed to cool, and the thermoformed polymeric material is removed from the 3D model.
The cutting machine 178 receives the thermoformed polymeric material from the thermoforming machine 176 and is operable to trim excess polymeric material from the thermoformed polymeric material. The excess material is trimmed with a cutting system (e.g., using lasers, mechanical methods, or other known cutting methods) to generate the aligners.
The laser etching machine 180 is operable to include identification marks on the aligners via a laser etching process. The identification marks can include a patient specific number, a sequence number indicating how the aligner should be worn in sequence with other aligners, or any other type of marks that can provide an identifying feature.
As described, the images recorded by the scanning device 102 can be processed and modified by the scanning device 102, the mobile device 122, the cloud server 142, and/or the aligner fabrication center 162. In some embodiments, the images recorded by the scanning device 102 can be at least partially processed by a combination of at least two or more of the scanning device 102, the mobile device 122, the cloud server 142, or the aligner fabrication center 162 such that the combination of partial processing results in fully processed and modified images.
Referring to
To send a signal from an originating location to a destination location, each tower can send a signal to a tower within an adjacent area. For example, the tower 182 can send a signal to the tower 183 because the area 193 is adjacent to the area 192. Additionally, the tower 182 can send a signal to the tower 185 because the area 195 is adjacent to the area 192.
As described with reference to
In some embodiments, the signals sent by the mobile device 122 include multiple digital signals, multiple analog signals, or a combination of multiple digital and analog signals. In such embodiments, the multiple signals are combined into one signal using multiplexing protocols to reduce the resources required to send the signal. In an example embodiment, frequency division multiplexing (FDM) can be used to combine the signals. In FDM, each user is assigned a different frequency from the complete frequency spectrum such that all frequencies can travel simultaneously. In another example embodiment, time division multiplexing (TDM) can be used to combine the signals. In TDM, a single radio frequency is divided into multiple slots and each slot is assigned to a different user such that multiple users can be supported simultaneously. In yet another example embodiment, code division multiple access (CDMA) can be used to combine the signals. In CDMA, several users share the same frequency spectrum simultaneously and are differentiated via unique codes assigned to each user. The receiver is supplied with the unique codes such that each user can be identified by the receiver.
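A toy sketch of the TDM idea described above is shown below: a single channel is divided into repeating time slots, and each transmitting device is assigned one slot. The slot count and frame layout are illustrative assumptions.

```python
# Toy sketch of time division multiplexing: assign each device a recurring slot.
def build_tdm_frame(devices: list[str], slots_per_frame: int = 4) -> list[str]:
    """Assign each device to a recurring slot; unused slots stay idle."""
    frame = ["idle"] * slots_per_frame
    for slot, device in enumerate(devices[:slots_per_frame]):
        frame[slot] = device
    return frame

print(build_tdm_frame(["scanner_A", "scanner_B", "scanner_C"]))
# ['scanner_A', 'scanner_B', 'scanner_C', 'idle']
```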
In some embodiments, the signal is sent via a packet switching process. In a packet switching process, the signal (e.g., the data being sent) is divided into smaller parts called packets. The packets are then sent individually from the source (e.g., the mobile device 122) to the destination (e.g., the cloud server 142). In some embodiments, each packet can follow a different path to the destination, and the packets can arrive out of order at the destination, where the packets are assembled in order (e.g., the datagram approach). In some embodiments, each packet follows the same path to the destination, and the packets arrive in the correct order (e.g., the virtual circuit approach).
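The datagram-style packet switching described above can be illustrated with the following sketch: the payload is split into numbered packets that may arrive out of order and are reassembled by sequence number at the destination. The packet size is an arbitrary assumption.

```python
# Sketch of datagram-style packet switching: split, deliver out of order, reassemble.
def packetize(data: bytes, packet_size: int = 1024) -> list[tuple[int, bytes]]:
    """Split data into (sequence_number, chunk) packets."""
    return [(i, data[offset:offset + packet_size])
            for i, offset in enumerate(range(0, len(data), packet_size))]

def reassemble(packets: list[tuple[int, bytes]]) -> bytes:
    """Restore the original payload regardless of packet arrival order."""
    return b"".join(chunk for _, chunk in sorted(packets, key=lambda p: p[0]))

payload = b"scan-image-bytes" * 200
shuffled = list(reversed(packetize(payload)))      # simulate out-of-order arrival
assert reassemble(shuffled) == payload
```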
Accordingly, any data acquired by the scanning device 102 can be communicated to the cloud server 142, or to the aligner fabrication center 162 either by way of the cloud server 142 or directly, using the tower network 181. In some embodiments, any data acquired by the scanning device 102 can be communicated using one or more towers 182-186 of the tower network 181. For example, 3D images of the user's teeth captured by the scanning device 102 can be communicated to the cloud server 142 or the aligner fabrication center 162 by way of one or more of the towers 182-186 of the tower network 181. Additionally, 3D images of the user's teeth captured by the scanning device 102 can be communicated directly to other locations (e.g., the office of a dental professional, etc.) by way of one or more of the towers 182-186 for diagnostic purposes, treatment purposes, or any other purpose. In some cases, communicating 3D image data over the tower network 181 is advantageous over other methods of communication. For example, the 3D image data can be more efficiently communicated when broken up into smaller packets or smaller file sizes and separately communicated over the tower network 181 rather than being communicated as a large file or in larger packets when using other methods of communication.
Referring now to
The first camera 206 and the second camera 208 are spaced apart and secured to the body 202 such that the lenses of the first camera 206 and the second camera 208 extend beyond the front portion 204. In some embodiments, the first camera 206 and the second camera 208 can be equivalent cameras. For example, the first camera 206 and the second camera 208 can be two-dimensional cameras that capture two-dimensional images along with some depth information. In some embodiments, the first camera 206 and the second camera 208 can be different cameras. For example, the first camera 206 can be a two-dimensional camera that captures depth information, and the second camera 208 can be a 3D camera.
The projector 207 is shown to be located in between the first camera 206 and the second camera 208; however, the projector 207 can be located anywhere on the body 202. The projector 207 receives the images from the first camera 206 and the second camera 208 and overlays the images from the first camera 206 and the second camera 208. The overlaid images are provided to the analysis circuit 116, and the analysis circuit 116 can stereoscopically calculate the depth of the images from the first camera 206 and the second camera 208. In some embodiments, the projector 207 includes a light source and the projector 207 projects light from the light source toward the object being scanned. The light source can be, for example, a near infrared light or a light in the visible spectrum (e.g., light with a wavelength between approximately four hundred nanometers and approximately seven hundred nanometers). In some embodiments, the light source is a light with a wavelength of between approximately three hundred eighty nanometers and approximately five hundred nanometers.
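The stereoscopic depth calculation mentioned above can be illustrated with the standard pinhole relation depth = focal length × baseline / disparity. The focal length and camera baseline values below are illustrative assumptions, not specifications of the first camera 206 or the second camera 208.

```python
# Minimal sketch of stereoscopic depth from disparity between two camera views.
def depth_from_disparity(disparity_px: float,
                         focal_length_px: float = 800.0,
                         baseline_mm: float = 10.0) -> float:
    """Return the estimated depth in millimeters for a matched image feature."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_mm / disparity_px

# A feature shifted 200 pixels between the two images lies about 40 mm away.
print(depth_from_disparity(200.0))  # 40.0
```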
The scanning device 102 may also include a light source 214 to project light on the surface for scanning. In some embodiments, the light source 214 can project visible light onto the surface for scanning (e.g., blue light or any other type of visible light). In some embodiments, the light source 214 can project infrared light (or other non-visible light, e.g., near-infrared light) onto the surface for scanning. The light source 214 can be turned on prior to the scanning procedure, either by the user (e.g., via a button, switch, or other type of actuator that can open or close a circuit coupled to the light source 214) or automatically.
In some embodiments, the scanning device 102 includes sensors 216 to receive light reflected from the surfaces being scanned. In some embodiments, the sensors 216 are configured to receive infrared light (or other non-visible light) reflected from the surfaces being scanned. In some embodiments, the sensors 216 are configured to receive visible light (e.g., blue light or any other type of visible light) reflected from the surfaces being scanned. In embodiments where the light source 214 is turned on automatically, the sensors 216 can be configured to detect the motion and/or orientation of the scanning device 102. When the sensors 216 detect motion indicative of a scanning procedure (e.g., the scanning device 102 is oriented at a certain angle or angles, the scanning device 102 is moved according to a certain pattern, etc.) the sensors 216 send a signal to the light source 214 such that the light source 214 turns on.
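A hypothetical sketch of the automatic turn-on logic described above is shown below, in which the light source 214 is enabled when the sensors 216 report motion and orientation consistent with a scan. The angle and acceleration thresholds are assumptions for illustration only.

```python
# Hypothetical sketch: enable the light source when motion/orientation indicate a scan.
def should_enable_light(pitch_deg: float, accel_g: float) -> bool:
    """Enable the light when the device is tilted for scanning and is moving."""
    tilted_for_scan = 30.0 <= abs(pitch_deg) <= 90.0
    in_motion = accel_g > 0.05
    return tilted_for_scan and in_motion

print(should_enable_light(pitch_deg=45.0, accel_g=0.2))  # True
print(should_enable_light(pitch_deg=5.0, accel_g=0.2))   # False
```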
In some embodiments, the scanning device 102 may include bone scanning technology. Bone scanning technology may include various systems and methods to scan the exterior and interior of a bone. For example, in a dental environment, a bone scan may provide information regarding the internal health of teeth. A bone scan may also provide information regarding the health of the roots of the teeth, or other portions of the teeth that may not be visible such that they cannot be scanned by a conventional scanning device. In some embodiments, the bone scanning technology enables acquisition of the shape of teeth roots and the position of teeth roots with respect to one another, teeth of the user, or a jawbone of the user.
The guide 210 extends from the front portion 204 and defines a guide tip 212, where the guide tip 212 is located at the end of the guide 210 opposite the front portion 204. The guide 210 can be constructed from the same material as the body 202; however, the guide 210 can also be constructed from different materials. The guide 210 is operable to extend between the teeth of a patient and the body 202 during a scan of the teeth of the patient; therefore, the guide 210 is substantially rigid such that the body 202 can be maintained at a substantially constant distance (e.g., between 2 and 6 centimeters (cm), around 3 cm, around 4 cm, etc.) from the teeth of the patient during the scan.
The guide tip 212 is operable to contact the teeth of the patient throughout the scan such that the body 202 can be maintained at a substantially constant distance from the teeth of the patient during the scan. As shown, the guide tip 212 is a flat surface; however, the guide tip 212 can be any shape that directs the patient to maintain contact between the guide tip 212 and the teeth during a scan. For example, the guide tip 212 can be shaped like a “V” such that the teeth can be inserted into the “V” to maintain contact throughout the scan. As another example, the guide tip 212 can be shaped like a “U” such that the teeth can be inserted into the “U” to maintain contact throughout the scan. The guide tip 212 can be oriented in any configuration to promote accurate scanning. For example, in the embodiment where the guide tip 212 is “V” shaped, the “V” can be oriented such that the opening of the “V” is facing down (e.g., away from the first camera 206 and the second camera 208). As another example, in the embodiment where the guide tip 212 is “U” shaped, the “U” can be oriented such that the opening of the “U” is facing up (e.g., toward the first camera 206 and the second camera 208).
In embodiments where the scanning device 102 scans structures inside of a mouth (e.g., teeth, gums, etc.), the scanning device 102 is preferably capable of scanning at a resolution of approximately 100 microns when scanning an individual tooth. Furthermore, the scanning device 102 is preferably capable of a resolution of approximately 300 microns when conducting a cross-arch scan (e.g., scanning all the upper teeth or all the lower teeth). In some embodiments, the scanning resolution is preferably finer (e.g., between approximately 50 and 100 microns when scanning an individual tooth and between approximately 100 and 300 microns when conducting a cross-arch scan).
Referring to
To generate a scan of the teeth, the patient 302 places the guide tip 212 in a specific section of the mouth, as instructed. For example, the guide tip 212 may be positioned on the back molar on the right side of lower teeth 308 to begin the scan. By placing the guide tip 212 on the lower teeth 308, the scan of the teeth will begin by scanning in the scanning area 310 (e.g., the upper teeth 306) to prevent the scanning device 102 from confusing the guide tip 212 with the teeth to be scanned. After the scan begins, the patient 302 moves the scanning device 102 around the mouth in the direction of the arrow 312 while keeping the guide tip 212 in contact with the lower teeth 308. As the patient 302 moves the scanning device 102 around the mouth, the first camera 206 and the second camera 208 record images of the upper teeth 306. The patient 302 moves the scanning device 102 until the guide tip 212 contacts the back molar on the left side of the lower teeth 308. The scan of the upper teeth 306 is complete.
The patient 302 rotates the scanning device 102 by 180 degrees and places the guide tip 212 in a specific section of the mouth, as instructed. For example, the guide tip 212 may be positioned on the back molar on the right side of upper teeth 306 to begin the scan of the lower teeth 308. With the scanning device 102 oriented in this manner, the scanning area 310 is focused on the lower teeth 308. The patient 302 moves the scanning device 102 around the mouth in the direction of the arrow 312 while keeping the guide tip 212 in contact with the upper teeth 306. As the patient 302 moves the scanning device 102 around the mouth, the first camera 206 and the second camera 208 record images of the lower teeth 308. The patient 302 moves the scanning device 102 until the guide tip 212 contacts the back molar on the left side of the upper teeth 306. The scan of the lower teeth 308 is complete.
In some embodiments, the guide 210 is not included on the scanning device 102. In such embodiments, the scanning device 102 may be inserted into the mouth to bring the camera 206 and the camera 208 closer to the teeth for scanning. In some embodiments, the guide 210 is included on the scanning device 102, but the user can determine whether to use the guide 210 or insert the scanning device 102 into the mouth to scan the teeth.
In some embodiments, after the scan of the lower teeth 308 and the upper teeth 306 is complete, the images are sent to the cloud server 142 for further processing. In some embodiments, the images are sent to the cloud server 142 in batches. For example, after scanning the right quadrant of the upper teeth 306, the images of the right quadrant of the upper teeth 306 are sent to the cloud server 142. Furthermore, after scanning the left quadrant of the upper teeth 306, the images of the left quadrant of the upper teeth 306 are sent to the cloud server 142. In embodiments where the images are sent to the cloud server 142 in batches, the patient 302 may be instructed to stop scanning at certain points to allow the images to be sent to the cloud server 142. For example, after scanning the right quadrant of the upper teeth 306, the patient 302 may be instructed to stop the scan to allow the images to be sent to the cloud server 142, and then instructed to begin the scan again after the images are sent.
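An illustrative sketch of sending scan images in per-quadrant batches is shown below. The upload function and payload layout are hypothetical placeholders, not an API defined by this disclosure.

```python
# Illustrative sketch: send each quadrant's images as one batch to a server.
def upload_in_batches(images_by_quadrant: dict[str, list[bytes]], send) -> None:
    """Send each quadrant's images as one batch, pausing the scan between batches."""
    for quadrant, images in images_by_quadrant.items():
        send({"quadrant": quadrant, "count": len(images), "images": images})

# Example with a stand-in sender that just reports batch sizes.
upload_in_batches(
    {"upper_right": [b"img1", b"img2"], "upper_left": [b"img3"]},
    send=lambda batch: print(batch["quadrant"], batch["count"]),
)
```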
Referring to
Referring to
Referring to
Referring to
In some embodiments, a plurality of aligners is created, with each aligner including a slightly different geometry such that the aligner forces the teeth to be gradually repositioned. In embodiments where a plurality of aligners is created, the upper right quadrant 502 and the upper left quadrant 504 may have geometries slightly different from the upper right quadrant 402 and upper left quadrant 404. When the patient 302 inserts, for example, the upper teeth 306 into the aligner 500, the aligner 500 may exert forces on the upper teeth 306 that cause the upper teeth 306 to move.
Referring to
After the patient 302 installs the application 600 on the mobile device 122, the patient 302 initiates the application 600 to begin the scanning process. Upon initiating the application 600, the application 600 displays a message 604 on the display 602 asking the patient 302 if the patient 302 is ready to begin the scanning process. If the patient 302 selects the no button 606, the application 600 may close or direct the patient 302 to begin the application 600 again when the patient 302 is ready to begin scanning. If the patient 302 selects the yes button 608, the application 600 will move to the next screen.
The mobile application 600 displays a message 608 regarding the initial position of the scanning device 102 relative to the teeth of the patient 302. As shown in image 610 of
As shown, the patient 302 can conduct the scan of teeth using the guide 210 so the camera 206 and the camera 208 do not enter the mouth. However, the patient can also conduct the scan of the teeth by inserting the scanning device 102 into the mouth to scan the teeth, as described.
The mobile application 600 displays a message 616 prompting the patient 302 to begin the scan by pressing the start button 618. Upon pressing the start button 618, the message 620 is displayed to provide the patient 302 with instructions as to how to conduct the scan. An image 622 is also shown on the display to guide the patient 302 during the scan. In some embodiments, the image 622 is a short video showing the patient 302 how to conduct the scan. In some embodiments, the scanning device 102 communicates its position to the mobile application 600 in real time such that the display 602 can superimpose an image of the scanning device 102 with the image 622 and the patient 302 can attempt to match the position on the image to provide for a more accurate scan. In some arrangements, a confidence level is shown on the display 602. The confidence level provides the user with real time feedback regarding the accuracy of the scan. For example, the display 602 may indicate that the confidence level during the scan is 40%. Furthermore, the message may be highlighted in red to indicate that the confidence level is too low for an accurate scan. As another example, the display 602 may indicate that the confidence level during the scan is 95%, and the message may be highlighted in green to indicate that the confidence level is high enough for an accurate scan. The scanning device 102 or the application 600 may alert the user when the confidence level does not meet a threshold (e.g., an audio, visual, or tactile alert when the confidence level is less than 95%).
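The confidence feedback described above can be illustrated with a short sketch that formats the running confidence level, colors it, and raises an alert when it falls below a threshold. The 95% threshold matches the example in the text; the color cutoff and return fields are assumptions.

```python
# Sketch of confidence feedback: label, color, and alert flag for the display.
def confidence_feedback(confidence_pct: float, threshold_pct: float = 95.0) -> dict:
    return {
        "label": f"Confidence: {confidence_pct:.0f}%",
        "color": "green" if confidence_pct >= threshold_pct else "red",
        "alert": confidence_pct < threshold_pct,   # trigger audio/visual/tactile alert
    }

print(confidence_feedback(40.0))  # {'label': 'Confidence: 40%', 'color': 'red', 'alert': True}
print(confidence_feedback(95.0))  # alert is False
```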
After the scan of the first set of teeth is complete, the application provides a message 624 to the patient 302 instructing the patient 302 to orient the scanning device 102 to scan the other set of teeth. The message 628 is displayed to provide the patient 302 with instructions as to how to conduct the scan. An image 630 is also shown on the display to guide the patient 302 during the scan. In some embodiments, the image 630 is a short video showing the patient 302 how to conduct the scan. In some embodiments, the scanning device 102 communicates its position to the mobile application 600 in real time such that the display 602 can superimpose an image of the scanning device 102 with the image 630 such that the patient 302 can attempt to match the position on the image to provide for a more accurate scan. As described, confidence information can also be displayed in real time during the scan.
In some embodiments, the scan may not be successful. In such embodiments, the scanning device 102 provides the application 600 with data regarding the portion(s) of the scan that were unsuccessful, and the application 600 provides the patient 302 with the message 632 indicating that the scan was unsuccessful. An image 634 provides a highlighted section 636 to indicate to the user the specific portion of the scan that was unsuccessful. The patient 302 can then initiate a new scan in an attempt to correct the deficiency.
In some arrangements, as shown in
When the scan is complete, a message 646 is provided to the patient 302 to advise the patient that the scan is being evaluated to determine whether it is acceptable or unacceptable. During this time, the scan data may be sent to the cloud server 142 to verify whether the scan is acceptable.
The patient 302 receives a message 648 indicating that the scan was successful, or the patient 302 receives a message 650 indicating that the scan was unsuccessful and that a new scan should be started by pressing the start scan button 652.
Referring to
At 704, a tooth scanning application on the mobile device 122 is opened. For example, after downloading the application 600 to the mobile device 122, the patient 302 opens the application 600 on the mobile device 122.
At 706, the guide 210 is placed on the bottom teeth on one side of the mouth. For example, as shown in
At 708, the scan is started. For example, as shown in
At 710, the scanning device 102 is moved from one side of the mouth to the other. For example, as shown in
At 712, the scanning device 102 is turned over and the guide 210 is placed on the upper teeth 306. For example, the patient 302 places the guide tip 212 of the guide 210 on the upper teeth 306 on one side of the mouth.
At 714, the scanning device 102 is moved from one side of the mouth to the other. For example, as shown in
At 716, a determination is made as to whether the scan is acceptable. The determination can be made by either the scanning device 102 or the cloud server 142. In either case, if the scan is not acceptable, at 718 feedback is provided regarding the scan. For example, as shown in
At 722, the images are stitched together to create a single image. For example, the image stitching circuit 170 stitches multiple images of the teeth together to create one 3D image of the top teeth and one 3D image of the bottom teeth from the scan. In some embodiments, a single 3D image including both the top teeth and the bottom teeth is created. In some embodiments, a single 3D image of the top teeth in contact with the bottom teeth (e.g., bite articulation) is created based on one or more scans conducted with the top teeth and bottom teeth in a bite articulation position.
At 724, a treatment plan is created based on the 3D image of the teeth. For example, the 3D image of the teeth of the patient 302 is compared to the 3D image of straightened teeth, and the aligner fabrication computer system 164 determines the number of aligners needed to move the teeth from the initial configuration to the straightened configuration.
At 726, one or more 3D models of the teeth are created. For example, the 3D model creation circuit 172 generates a mesh similar to the mesh 440 of
At 728, aligners are manufactured from the 3D models. For example, aligners are manufactured using the thermoforming machine 176, the cutting machine 178, and the laser etching machine 180, as described. Aligners can be manufactured singly, or in batches. When manufactured in batches, a plurality of physical 3D models are arranged such that a large sheet of polymeric material fits over all the physical 3D models. The polymeric material is heated and formed to the physical 3D models to create the aligners, as described.
Referring to
At 804, a mobile application is provided to the patient 302. For example, when the patient 302 receives the scanning device 102, the patient is instructed to download the mobile application 600 in order to properly use the scanning device 102.
At 806, the patient is directed to use the scanning device 102 with the mobile application 600. For example, as shown in
At 808, data from the scanning device 102 is received via the mobile device 122. For example, when the scan is complete, the scanning device 102 provides the image data to the mobile device 122, and the mobile device 122 provides the image data to the cloud server 142. In some embodiments, the image data is provided to the cloud server 142 in batches (e.g., after one quadrant of teeth is scanned) instead of after all teeth have been scanned.
At 810, a treatment plan is determined based on the image data. For example, the aligner fabrication computer system 164 compares the scanned images to images of an ideal smile and determines the number of aligners required to move the teeth of the patient 302 to match the ideal smile. The aligner fabrication computer system 164 further determines the geometry of each aligner such that the aligners can be manufactured.
At 812, the aligners are manufactured. For example, the 3D model creation circuit 172 generates a physical 3D model for each aligner determined by the aligner fabrication computer system 164, and the aligners are created via the thermoforming machine 176, the cutting machine 178, and the laser etching machine 180, as described.
Referring to
During the scan, the scanning device 102 may determine that the upper teeth 906 or the lower teeth 908 include dental conditions that require intervention or additional care. For example, the scanning device 102 may find that the user 902 has a cavity 910 on the upper teeth 906. In addition to finding the cavity 910, the scanning device 102 is capable of finding and diagnosing other dental conditions including, but not limited to, cracked teeth, broken crowns, gingivitis, and other dental issues that may require intervention or additional care.
In some embodiments, the scanning device 102 makes determinations about dental conditions that require intervention or care, as described. In some embodiments, the scanning device 102 provides the scan data to the mobile device 122 and the mobile device makes determinations about dental conditions that require intervention or care. In some embodiments, the mobile device 122 communicates with the cloud server 142, which communicates with a teledentistry center, and the cloud server 142 or the teledentistry center makes determinations about dental conditions that require intervention or care. In some embodiments, a combination of two or more of the scanning device 102, the mobile device 122, the cloud server 142, and the teledentistry center make determinations about dental conditions that require intervention or care.
Referring to
Referring to
At 1104, a mobile application is provided to the user 902. For example, when the user 902 receives the scanning device 102, the user 902 is instructed to download the mobile application 600 to use the scanning device 102 properly.
At 1106, the user 902 is directed to use the scanning device 102 with the mobile application 600. For example, as shown in
At 1108, data from the scanning device 102 is received via the mobile device 122. For example, when the scan is complete, the scanning device 102 provides the image data to the mobile device 122, and the mobile device 122 provides the image data to the cloud server 142 for analysis. As described, in some embodiments the analysis of scanned images is completed by one or more of the scanning device 102, the mobile device 122, the cloud server 142, and a teledentistry center.
At 1110, a dental condition is diagnosed from the data. For example, after receiving the image data from the scan, a determination is made that the user 902 has a cavity or some other dental condition (e.g., receding gumline, gingivitis, broken tooth, etc.).
At 1112, a treatment plan is determined based on the dental condition. For example, the user 902 may have a receding gumline. The display 602 may provide the user with instructions as to how to prevent the receding gumline from worsening. The user 902 may also be connected to a dental practitioner through the mobile application 600 such that the dental practitioner can provide the user 902 with advice as to how to care for the dental condition. As another example, the user 902 may have a cracked tooth. The display 602 may provide the user 902 with instructions as to how to perform a temporary repair, and also connect the user 902 with a dental practitioner near the user 902 such that the dental practitioner can fix the problem in a more permanent fashion.
The embodiments described herein refer to scanning teeth to provide for 3D models from which aligners can be constructed for orthodontic treatment. In other embodiments, the scanning device 102 can be used for other dental or orthodontic purposes. For example, the scanning device 102 can be used to provide images to determine the oral health of a patient (e.g., to detect gum disease, cavities, cracked teeth or crowns, and other dental health concerns). Additionally, the scanning device 102 can be used during orthodontic treatment to determine, for example, whether an aligner fits properly, whether an orthodontic change has occurred since starting treatment, whether the patient has regressed, and other orthodontic concerns. Furthermore, the scanning device 102 can be used to determine whether an orthodontic correction is required while the user is adhering to a treatment plan (e.g., a mid-course correction and/or refinement). The scanning device 102 can also be used to provide information to a dental and/or orthodontic professional for regular check-ups. In some instances, the dental and/or orthodontic professional can use the information from the scanning device 102 to determine whether the user needs to travel to the office of the dental and/or orthodontic professional for additional care.
Referring to
Additionally, the scanning device 2600 can include a processing circuit substantially similar to the processing circuit 104, a scanning circuit substantially similar to the scanning circuit 110, a communications circuit substantially similar to the communications circuit 112, a machine learning circuit substantially similar to the machine learning circuit 114, and an analysis circuit substantially similar to the analysis circuit 116.
The scanning device 2600 includes a body 2602 and a guide portion 2604. The body 2602 includes a first body portion 2606, a second body portion 2608, a base body portion 2610, a button 2612, an electrical connection 2614, and a lens 2616. The guide portion 2604 includes a stem 2618, a guide 2620, and a coupler 2622.
The first body portion 2606, the second body portion 2608, and the base body portion 2610 can be manufactured from any material suitable for use in a scanning device. Examples of suitable materials include, but are not limited to, plastics (e.g., acrylonitrile butadiene styrene (ABS), polyurethane, polycarbonate, etc.), metals (e.g., aluminum, stainless steel, etc.), or any combination thereof. In some embodiments, the base body portion 2610 and the second body portion 2608 are of unitary construction. In some embodiments, the base body portion 2610 and the second body portion 2608 are separate components. The second body portion 2608, the first body portion 2606, and the base body portion 2610 can be coupled by any suitable method to form the body 2602. Suitable methods include, but are not limited to, physical connections (e.g., screws, rivets, bolts, etc.), chemical connections (e.g., adhesives, etc.), and designed connections (e.g., press fit, snap fit, etc.).
The body 2602 is configured to contain the components described above (e.g., the processing circuit, the scanning circuit, the communications circuit, the machine learning circuit, and the analysis circuit). In some embodiments, the components described above may be included on a printed circuit board (PCB). In some embodiments, the components described above may be included as separate components. The body 2602 may also be configured to contain additional components. For example, the body 2602 may be configured to contain a haptic motor, a battery, a sealing component, and optics, which will be further described with reference to
The base body portion 2610 may be constructed from the same materials as the second body portion 2608 and the first body portion 2606, and includes the electrical connection 2614. The electrical connection 2614 is recessed within the base body portion 2610 and includes electrical components electrically coupled to the battery such that the battery can be recharged when the electrical connection 2614 is coupled to a power source. The electrical connection 2614 can be any type of conventional electrical connection. For example, the electrical connection 2614 can be a USB connection (e.g., USB-A, USB-B, USB-C, mini-USB, or micro-USB) or a proprietary connection (e.g., Lightning). The electrical connection 2614 can also be any other type of connection configured to couple to a power source and receive power to charge a battery.
The button 2612 is disposed on, or within, the second body portion 2608. In some embodiments, the button 2612 is flush with an outer surface of the second body portion 2608. In some embodiments, the button 2612 protrudes above the outer surface of the second body portion 2608. In some embodiments, the button 2612 is recessed below the outer surface of the second body portion 2608. The button 2612 is configured to actuate the electrical connection between the battery and the other components located within the body 2602. For example, when the scanning device 2600 is off, the electrical connection between the battery and the other components is open. When the user desires to turn on the scanning device 2600, the user depresses the button 2612, which closes the connection between the battery and the other components such that the other components draw power from the battery, and the scanning device 2600 turns on.
The lens 2616 is disposed on, or within the second body portion 2608, and is situated between the optics and the scanning target. For example, the scanning target may be a set of teeth. When the scanning device 2600 is oriented such that the optics are pointed at the set of teeth, the lens is positioned between the optics and the set of teeth. In some embodiments, the lens may be formed to a desired shape by grinding, polishing, or molding processes. In some embodiments, the lens 2616 is configured to focus the optics on the scanning target such that the image of the scanning target is clear. In some embodiments, the lens 2616 is configured to prevent foreign bodies (e.g., dust, particles, etc.) from entering the scanning device 2600 and does not focus the optics. In such embodiments, the lens 2616 may not include a specific shape but may be a transparent or translucent barrier (e.g., plastic, glass, etc.). The lens can be manufactured from any suitable material, including glass and plastic.
The guide portion 2604 is releasably coupled to the body 2602 and includes a stem 2618, a guide 2620, and a coupler 2622. The guide portion 2604 may be coupled to the body 2602 by any type of releasable connection. Examples of releasable connections include, but are not limited to, magnetic connections, threaded connections, and bayonet connections.
Referring to
The second body portion 2608 also includes a lens cavity 2626 and a button cavity 2628. The lens cavity 2626 is sized and configured to receive the lens 2616. The lens 2616 is coupled to the lens cavity 2626 by any suitable coupling mechanism (e.g., adhesive, mechanical, etc.) such that the lens 2616 is rigidly coupled to the lens cavity 2626. The button cavity 2628 is sized and configured to receive the button 2612. In some embodiments, the button 2612 is coupled to the button cavity 2628 such that the button 2612 is movable relative to the button cavity 2628. In some embodiments, the button 2612 is coupled to the button cavity 2628 such that the button 2612 is rigidly connected to the button cavity 2628.
The body 2602 is configured to secure optics 2630, a motherboard 2632, a daughterboard 2634, a battery 2636, and a haptic motor 2638. The body 2602 may also include a sealing component (not shown) positioned between the second body portion 2608 and the first body portion 2606. The optics 2630 may include the first camera 206, the second camera 208, and the projector 207. In some embodiments, the first camera 206 and the second camera 208 may include lenses that are fixed such that a focal length defined by the lenses is also fixed. In other embodiments, the first camera 206 and the second camera 208 may include lenses that are movable such that the focal length defined by the lenses is adjustable.
In some embodiments, the scanning device 2600 may also include a light source (not shown) to project light on the surface for scanning. In some embodiments, the light source can project visible light on to the surface for scanning. In some embodiments, the light source can project infrared light (or other non-visible light) on to the surface for scanning.
In some embodiments, the scanning device 2600 includes sensors (not shown) to receive light reflected from the surfaces being scanned. In some embodiments, the sensors are configured to receive infrared light (or other non-visible light) reflected from the surfaces being scanned. In some embodiments, the sensors are configured to receive visible light reflected from the surfaces being scanned.
In some embodiments, the scanning device 2600 may include bone scanning technology. Bone scanning technology may include various methods to scan the exterior and interior of a bone. For example, in a dental environment, a bone scan may provide information regarding the internal health of teeth. A bone scan may also provide information regarding the health of the roots of the teeth, or other portions of the teeth that are not visible and therefore cannot be scanned by a conventional scanning device. In some embodiments, the bone scanning technology enables acquisition of the shape of teeth roots and the position of teeth roots with respect to one another, other teeth of the user, or a jawbone of the user.
The motherboard 2632 is electrically coupled to the daughterboard 2634, the battery 2636, the optics 2630, and the haptic motor 2638. The motherboard 2632 may include a processor and a memory. In some embodiments, the processor can be any type of processor capable of performing the functions described herein. The processor may be a single or multi-core processor(s), digital signal processor, microcontroller, or other processor or processing/controlling circuit. Similarly, the memory can be any type of volatile or non-volatile memory or data storage capable of performing the functions described herein. In operation, the memory may store various data and software used during operation of the scanning device 2600, such as operating systems, applications, programs, libraries, and drivers. The memory is communicatively coupled to the processor such that the processor can execute files located in the memory.
The daughterboard 2634 is electrically coupled to the motherboard 2632 and is configured to expand the functionality of the motherboard 2632. In some embodiments, the daughterboard 2634 includes portions to connect to the optics 2630, the button 2612, and the haptic motor 2638. Arranged in this manner, the daughterboard 2634 reduces the footprint of the components positioned within the body 2602, thereby reducing the overall size of the scanning device 2600.
The battery 2636 is electrically coupled to the motherboard 2632, the daughterboard 2634, the optics 2630, and the haptic motor 2638, and is operable to provide power to allow the scanning device 2600 to operate. In some embodiments, the battery 2636 is an alkaline battery (e.g., an AA battery, an AAA battery, etc.). In embodiments where the battery 2636 is an alkaline battery, the battery 2636 may be replaced as needed by separating the second body portion 2608 from the first body portion 2606. In some embodiments, the battery 2636 is a rechargeable battery (e.g., a lithium-ion battery, a nickel-cadmium battery, a nickel-metal-hydride battery, etc.). In such embodiments, the battery 2636 can be recharged by connecting the electrical connection 2614 to a power source. The electrical connection 2614 is electrically coupled to the battery 2636. In some embodiments, the battery 2636 is provided partially charged such that a user can complete a number of scans (e.g., 6 scans, 12 scans, 18 scans, etc.) before needing to charge the battery 2636. In some embodiments, the battery 2636 is provided fully charged such that a user can complete a larger number of scans (e.g., 20 scans, 25 scans, 30 scans, etc.) before needing to charge the battery 2636.
The haptic motor 2638 is electrically coupled to the battery 2636, the motherboard 2632, and the daughterboard 2634, and is configured to provide haptic feedback to the user regarding the use of the scanner. For example, the haptic motor may induce a vibration in the scanning device 2600 when the user is holding the scanning device 2600 to notify the user of an event. The event may include the initiation of a scan, the completion of a scan, an error during scanning, or other events associated with the scanning device 2600.
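As an illustration of the event-driven haptic feedback described above, the sketch below maps example events to vibration pulses. The event names and pulse durations are assumptions for illustration only.

```python
# Hypothetical mapping of scanner events to haptic pulses (durations in milliseconds).
HAPTIC_PATTERNS_MS = {
    "scan_initiated": [100],         # short pulse when a scan starts
    "scan_completed": [100, 100],    # two short pulses when a scan finishes
    "scan_error": [400],             # one long pulse on a scanning error
}


def notify_user(event, vibrate):
    """Drive the haptic motor once per pulse associated with the event."""
    for duration_ms in HAPTIC_PATTERNS_MS.get(event, []):
        vibrate(duration_ms)


# Example usage with a stand-in for the motor driver:
notify_user("scan_completed", vibrate=lambda ms: print(f"vibrate {ms} ms"))
```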
The sealing component may be positioned between the second body portion 2608 and the first body portion 2606 such that the sealing component prevents fluids from entering the body 2602. In embodiments where the base body portion 2610 is a separate component from the second body portion 2608 and the first body portion 2606, the sealing component may include multiple sealing components between the second body portion 2608, the first body portion 2606, and the base body portion 2610 such that fluid is prevented from entering the body 2602.
Depending on the length of the stem 2618 and other features included in the scanning device 2600, the scanning device 2600 can scan teeth either intraorally or extraorally. For example, the distance between the lens 2616 and the teeth is at least partially dependent on the length of the stem 2618. A relatively short stem 2618 may result in the lens being at least partially within the mouth during a scan (e.g., an intraoral scan). A relatively long stem 2618 may result in the lens being outside of the mouth during a scan (e.g., an extraoral scan).
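A short sketch of the relationship described above between stem length and scan type; the numeric threshold is an assumption, since the description only distinguishes relatively short and relatively long stems.

```python
def scan_type(stem_length_mm, threshold_mm=40):
    # threshold_mm is a hypothetical cutoff; a relatively short stem places the lens
    # at least partially inside the mouth, a relatively long stem keeps it outside.
    return "intraoral" if stem_length_mm < threshold_mm else "extraoral"


print(scan_type(25))   # intraoral
print(scan_type(80))   # extraoral
```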
Referring to
Additionally, the scanning device 2700 can include a processing circuit substantially similar to the processing circuit 104, a scanning circuit substantially similar to the scanning circuit 110, a communications circuit substantially similar to the communications circuit 112, a machine learning circuit substantially similar to the machine learning circuit 114, and an analysis circuit substantially similar to the analysis circuit 116.
The scanning device 2700 includes a body 2702 and a guide portion 2704. The body 2702 includes a first body portion 2706, a second body portion 2708, a base body portion 2710, a button 2712, an electrical connection 2714, and a lens 2716. The guide portion 2704 includes a stem 2718, a guide 2720, and a coupler 2722.
The first body portion 2706, the second body portion 2708, and the base body portion 2710 can be manufactured from any material suitable for use in a scanning device. Examples of suitable materials include, but are not limited to, plastics (e.g., acrylonitrile butadiene styrene (ABS), polyurethane, polycarbonate, etc.), metals (e.g., aluminum, stainless steel, etc.), or any combination thereof. In some embodiments, the base body portion 2710 and the second body portion 2708 are of unitary construction. In some embodiments, the base body portion 2710 and the second body portion 2708 are separate components. The second body portion 2708, the first body portion 2706, and the base body portion 2710 can be coupled by any suitable method to form the body 2702. Suitable methods include, but are not limited to, physical connections (e.g., screws, rivets, bolts, etc.), chemical connections (e.g., adhesives, etc.), and designed connections (e.g., press fit, snap fit, etc.).
The body 2702 is configured to contain the components described above (e.g., the processing circuit, the scanning circuit, the communications circuit, the machine learning circuit, and the analysis circuit). In some embodiments, the components described above may be included on a printed circuit board (PCB). In some embodiments, the components described above may be included as separate components. The body 2702 may also be configured to contain additional components. For example, the body 2702 may be configured to contain a haptic motor, a battery, a sealing component, and optics, which will be further described with reference to
The base body portion 2710 may be constructed from the same materials as the second body portion 2708 and the first body portion 2706, and includes the electrical connection 2714. The electrical connection 2714 is recessed within the base body portion 2710 and includes electrical components electrically coupled to a battery such that the battery can be recharged when the electrical connection 2714 is coupled to a power source. The electrical connection 2714 can be any type of conventional electrical connection. For example, the electrical connection 2714 can be a USB connection (e.g., USB-A, USB-B, USB-C, mini-USB, micro-USB, Lightning). The electrical connection 2714 can also be any other type of connection configured to couple to a power source and receive power to charge a battery.
The button 2712 is disposed on, or within, the second body portion 2708. In some embodiments, the button 2712 is flush with an outer surface of the second body portion 2708. In some embodiments, the button 2712 protrudes above the outer surface of the second body portion 2708. In some embodiments, the button 2712 is recessed below the outer surface of the second body portion 2708. The button 2712 is configured to actuate the electrical connection between the battery and the other components located within the body 2702. For example, when the scanning device 2700 is off, the electrical connection between the battery and the other components is open. When the user desires to turn on the scanning device 2700, the user depresses the button 2712, which closes the connection between the battery and the other components such that the other components draw power from the battery, and the scanning device 2700 turns on.
The lens 2716 is disposed on, or within, the second body portion 2708, and is situated between the optics and the scanning target. For example, the scanning target may be a set of teeth. When the scanning device 2700 is oriented such that the optics are pointed at the set of teeth, the lens is positioned between the optics and the set of teeth. In some embodiments, the lens may be formed to a desired shape by grinding, polishing, or molding processes. In some embodiments, the lens 2716 is configured to focus the optics on the scanning target such that the image of the scanning target is clear. In some embodiments, the lens 2716 is configured to prevent foreign bodies (e.g., dust, particles, etc.) from entering the scanning device 2700 and does not focus the optics. In such embodiments, the lens 2716 may not include a specific shape but may be a transparent or translucent barrier (e.g., plastic, glass, etc.). The lens can be manufactured from any suitable material, including glass and plastic.
The guide portion 2704 is releasably coupled to the body 2702 and includes a stem 2718, a guide 2720, and a coupler 2722. The guide portion 2704 may be coupled to the body 2702 by any type of releasable connection. Examples of releasable connections include, but are not limited to, magnetic connections, threaded connections, and bayonet connections.
Referring to
The second body portion 2708 also includes a lens cavity 2726 and a button cavity 2728. The lens cavity 2726 is sized and configured to receive the lens 2716. The lens 2716 is coupled to the lens cavity 2726 by any suitable coupling mechanism (e.g., adhesive, mechanical, etc.) such that the lens 2716 is rigidly coupled to the lens cavity 2726. The button cavity 2728 is sized and configured to receive the button 2712. In some embodiments, the button 2712 is coupled to the button cavity 2728 such that the button 2712 is movable relative to the button cavity 2728. In some embodiments, the button 2712 is coupled to the button cavity 2728 such that the button 2712 is rigidly connected to the button cavity 2728.
The body 2702 is configured to secure optics 2730, a motherboard 2732, a daughterboard 2734, a battery 2736, and a haptic motor 2738. The body 2702 may also include a sealing component 2740 positioned between the second body portion 2708 and the first body portion 2706. The optics 2730 may include the first camera 206, the second camera 208, and the projector 207. In some embodiments, the first camera 206 and the second camera 208 may include lenses that are fixed such that a focal length defined by the lenses is also fixed. In other embodiments, the first camera 206 and the second camera 208 may include lenses that are movable such that the focal length defined by the lenses is adjustable.
In some embodiments, the scanning device 2700 may also include a light source (not shown) to project light on the surface to be scanned. In some embodiments, the light source can project visible light on to the surface to be scanned. In some embodiments, the light source can project infrared light (or other non-visible light) on to the surface to be scanned.
In some embodiments, the scanning device 2700 includes sensors (not shown) to receive light reflected from the surfaces being scanned. In some embodiments, the sensors are configured to receive infrared light (or other non-visible light) reflected from the surfaces being scanned. In some embodiments, the sensors are configured to receive visible light reflected from the surfaces being scanned.
In some embodiments, the scanning device 2700 may include bone scanning technology. Bone scanning technology may include various methods to scan the exterior and interior of a bone. For example, in a dental environment, a bone scan may provide information regarding the internal health of teeth. A bone scan may also provide information regarding the health of the roots of the teeth, or other portions of the teeth that are not visible and therefore cannot be scanned by a conventional scanning device. In some embodiments, the bone scanning technology enables acquisition of the shape of teeth roots and the position of teeth roots with respect to one another, other teeth of the user, or a jawbone of the user.
The motherboard 2732 is electrically coupled to the daughterboard 2734, the battery 2736, the optics 2730, and the haptic motor 2738. The motherboard 2732 may include a processor and a memory. In some embodiments, the processor can be any type of processor capable of performing the functions described herein. The processor may be a single or multi-core processor(s), digital signal processor, microcontroller, or other processor or processing/controlling circuit. Similarly, the memory can be any type of volatile or non-volatile memory or data storage capable of performing the functions described herein. In operation, the memory may store various data and software used during operation of the scanning device 2700, such as operating systems, applications, programs, libraries, and drivers. The memory is communicatively coupled to the processor such that the processor can execute files located in the memory.
The daughterboard 2734 is electrically coupled to the motherboard 2732 and is configured to expand the functionality of the motherboard 2732. In some embodiments, the daughterboard 2734 includes portions to connect to the optics 2730, the button 2712, and the haptic motor 2738. Arranged in this manner, the daughterboard 2734 reduces the footprint of the components positioned within the body 2702, thereby reducing the overall size of the scanning device 2700.
The battery 2736 is electrically coupled to the motherboard 2732, the daughterboard 2734, the optics 2730, and the haptic motor 2738, and is operable to provide power to allow the scanning device 2700 to operate. In some embodiments, the battery 2736 is an alkaline battery (e.g., an AA battery, an AAA battery, etc.). In embodiments where the battery 2736 is an alkaline battery, the battery 2736 may be replaced as needed by separating the second body portion 2708 from the first body portion 2706. In some embodiments, the battery 2736 is a rechargeable battery (e.g., a lithium-ion battery, a nickel-cadmium battery, a nickel-metal-hydride battery, etc.). In such embodiments, the battery 2736 can be recharged by connecting the electrical connection 2714 to a power source. The electrical connection 2714 is electrically coupled to the battery 2736. In some embodiments, the battery 2736 is provided partially charged such that a user can complete a number of scans (e.g., 6 scans, 12 scans, 18 scans, etc.) before needing to charge the battery 2736. In some embodiments, the battery 2736 is provided fully charged such that a user can complete a larger number of scans (e.g., 20 scans, 25 scans, 30 scans, etc.) before needing to charge the battery 2736.
The haptic motor 2738 is electrically coupled to the battery 2736, the motherboard 2732, and the daughterboard 2734, and is configured to provide haptic feedback to the user regarding the use of the scanner. For example, the haptic motor may induce a vibration in the scanning device 2700 when the user is holding the scanning device 2700 to notify the user of an event. The event may include the initiation of a scan, the completion of a scan, an error during scanning, or other events associated with the scanning device 2700.
The sealing component 2740 may be positioned between the second body portion 2708 and the first body portion 2706 such that the sealing component prevents fluids from entering the body 2702. In embodiments where the base body portion 2710 is a separate component from the second body portion 2708 and the first body portion 2706, the sealing component may include multiple sealing components 2740 between the second body portion 2708, the first body portion 2706, and the base body portion 2710 such that fluid is prevented from entering the body 2702.
Depending on the length of the stem 2718 and other features included in the scanning device 2700, the scanning device 2700 can scan teeth either intraorally or extraorally. For example, the distance between the lens 2716 and the teeth is at least partially dependent on the length of the stem 2718. A relatively short stem 2718 may result in the lens being at least partially within the mouth during a scan (e.g., an intraoral scan). A relatively long stem 2718 may result in the lens being outside of the mouth during a scan (e.g., an extraoral scan).
Referring to
Additionally, the scanning device 2800 can include a processing circuit substantially similar to the processing circuit 104, a scanning circuit substantially similar to the scanning circuit 110, a communications circuit substantially similar to the communications circuit 112, a machine learning circuit substantially similar to the machine learning circuit 114, and an analysis circuit substantially similar to the analysis circuit 116.
The scanning device 2800 includes a body 2802 and a guide portion 2804. The body 2802 includes a first body portion 2806, a second body portion 2808, a base body portion 2810, a button 2812, an electrical connection 2814, and a lens 2816. The guide portion 2804 includes a stem 2818, a guide 2820, and a coupler 2822.
The first body portion 2806, the second body portion 2808, and the base body portion 2810 can be manufactured from any material suitable for use in a scanning device. Examples of suitable materials include, but are not limited to, plastics (e.g., acrylonitrile butadiene styrene (ABS), polyurethane, polycarbonate, etc.), metals (e.g., aluminum, stainless steel, etc.), or any combination thereof. In some embodiments, the base body portion 2810 and the second body portion 2808 are of unitary construction. In some embodiments, the base body portion 2810 and the second body portion 2808 are separate components. The second body portion 2808, the first body portion 2806, and the base body portion 2810 can be coupled by any suitable method to form the body 2802. Suitable methods include, but are not limited to, physical connections (e.g., screws, rivets, bolts, etc.), chemical connections (e.g., adhesives, etc.), and designed connections (e.g., press fit, snap fit, etc.).
The body 2802 is configured to contain the components described above (e.g., the processing circuit, the scanning circuit, the communications circuit, the machine learning circuit, and the analysis circuit). In some embodiments, the components described above may be included on a printed circuit board (PCB). In some embodiments, the components described above may be included as separate components. The body 2802 may also be configured to contain additional components. For example, the body 2802 may be configured to contain a haptic motor, a battery, a sealing component, and optics, which will be further described with reference to
The haptic motor may be configured to provide haptic feedback to the user regarding the use of the scanner. For example, the haptic motor may induce a vibration in the scanning device 2800 when the user is holding the scanning device 2800 to notify the user of an event. The event may include the initiation of a scan, the completion of a scan, an error during scanning, or other events associated with the scanning device 2800.
The battery is configured to provide power to the scanning device 2800. In some embodiments, the battery is a rechargeable battery that is configured to recharge when the scanning device 2800 is coupled to a power source. In some embodiments, the battery is a disposable battery that can be replaced by separating the second body portion 2808 and the first body portion 2806 such that the battery is exposed and can be removed and replaced with a new battery.
The sealing component may be positioned between the second body portion 2808 and the first body portion 2806 such that the sealing component prevents fluids from entering the body 2802. In embodiments where the base body portion 2810 is a separate component from the second body portion 2808 and the first body portion 2806, the sealing component may include multiple sealing components between the second body portion 2808, the first body portion 2806, and the base body portion 2810 such that fluid is prevented from entering the body 2802.
In some embodiments, the optics component may include the first camera 206, the second camera 208, and the projector 207. In some embodiments, the first camera 206 and the second camera 208 may include lenses that are fixed such that a focal length defined by the lenses is also fixed. In other embodiments, the first camera 206 and the second camera 208 may include lenses that are movable such that the focal length defined by the lenses is adjustable.
The base body portion 2810 may be constructed from the same materials as the second body portion 2808 and the first body portion 2806, and includes the electrical connection 2814. The electrical connection 2814 is recessed within the base body portion 2810 and includes electrical components electrically coupled to the battery such that the battery can be recharged when the electrical connection 2814 is coupled to a power source. The electrical connection 2814 can be any type of conventional electrical connection. For example, the electrical connection 2814 can be a USB connection (e.g., USB-A, USB-B, USB-C, mini-USB, micro-USB, Lightning). The electrical connection 2814 can also be any other type of connection configured to couple to a power source and receive power to charge a battery.
The button 2812 is disposed on, or within, the second body portion 2808. In some embodiments, the button 2812 is flush with an outer surface of the second body portion 2808. In some embodiments, the button 2812 protrudes above the outer surface of the second body portion 2808. In some embodiments, the button 2812 is recessed below the outer surface of the second body portion 2808. The button 2812 is configured to actuate the electrical connection between the battery and the other components located within the body 2802. For example, when the scanning device 2800 is off, the electrical connection between the battery and the other components is open. When the user desires to turn on the scanning device 2800, the user depresses the button 2812, which closes the connection between the battery and the other components such that the other components draw power from the battery, and the scanning device 2800 turns on.
The lens 2816 is disposed on, or within, the second body portion 2808, and is situated between the optics and the scanning target. For example, the scanning target may be a set of teeth. When the scanning device 2800 is oriented such that the optics are pointed at the set of teeth, the lens is positioned between the optics and the set of teeth. In some embodiments, the lens may be formed to a desired shape by grinding, polishing, or molding processes. In some embodiments, the lens 2816 is configured to focus the optics on the scanning target such that the image of the scanning target is clear. In some embodiments, the lens 2816 is configured to prevent foreign bodies (e.g., dust, particles, etc.) from entering the scanning device 2800 and does not focus the optics. In such embodiments, the lens 2816 may not include a specific shape but may be a transparent or translucent barrier (e.g., plastic, glass, etc.). The lens can be manufactured from any suitable material, including glass and plastic.
The guide portion 2804 is releasably coupled to the body 2802 and includes a stem 2818, a guide 2820, and a coupler 2822. The guide portion 2804 may be coupled to the body 2802 by any type of releasable connection. Examples of releasable connections include, but are not limited to, magnetic connections, threaded connections, and bayonet connections.
Referring to
The second body portion 2808 also includes a lens cavity 2826 and a button cavity 2828. The lens cavity 2826 is sized and configured to receive the lens 2816. The lens 2816 is coupled to the lens cavity 2826 by any suitable coupling mechanism (e.g., adhesive, mechanical, etc.) such that the lens 2816 is rigidly coupled to the lens cavity 2826. The button cavity 2828 is sized and configured to receive the button 2812. In some embodiments, the button 2812 is coupled to the button cavity 2828 such that the button 2812 is movable relative to the button cavity 2828. In some embodiments, the button 2812 is coupled to the button cavity 2828 such that the button 2812 is rigidly connected to the button cavity 2828.
The body 2802 is configured to secure optics 2830, a motherboard 2832, a daughterboard 2834, a battery 2836, and a haptic motor 2838. The body 2802 may also include a disc 2840 positioned between the second body portion 2808 and the first body portion 2806. The optics 2830 may include the first camera 206, the second camera 208, and the projector 207. In some embodiments, the first camera 206 and the second camera 208 may include lenses that are fixed such that a focal length defined by the lenses is also fixed. In other embodiments, the first camera 206 and the second camera 208 may include lenses that are movable such that the focal length defined by the lenses is adjustable.
In some embodiments, the scanning device 2800 may also include a light source (not shown) to project light on the surface to be scanned. In some embodiments, the light source can project visible light on to the surface to be scanned. In some embodiments, the light source can project infrared light (or other non-visible light) on to the surface to be scanned.
In some embodiments, the scanning device 2800 includes sensors (not shown) to receive light reflected from the surfaces being scanned. In some embodiments, the sensors are configured to receive infrared light (or other non-visible light) reflected from the surfaces being scanned. In some embodiments, the sensors are configured to receive visible light reflected from the surfaces being scanned.
In some embodiments, the scanning device 2800 may include bone scanning technology. Bone scanning technology may include various methods to scan the exterior and interior of a bone. For example, in a dental environment, a bone scan may provide information regarding the internal health of teeth. A bone scan may also provide information regarding the health of the roots of the teeth, or other portions of the teeth that are not visible and therefore cannot be scanned by a conventional scanning device. In some embodiments, the bone scanning technology enables acquisition of the shape of teeth roots and the position of teeth roots with respect to one another, other teeth of the user, or a jawbone of the user.
The motherboard 2832 is electrically coupled to the daughterboard 2834, the battery 2836, the optics 2830, and the haptic motor 2838. The motherboard 2832 may include a processor and a memory. In some embodiments, the processor can be any type of processor capable of performing the functions described herein. The processor may be a single or multi-core processor(s), digital signal processor, microcontroller, or other processor or processing/controlling circuit. Similarly, the memory can be any type of volatile or non-volatile memory or data storage capable of performing the functions described herein. In operation, the memory may store various data and software used during operation of the scanning device 2800, such as operating systems, applications, programs, libraries, and drivers. The memory is communicatively coupled to the processor such that the processor can execute files located in the memory.
The daughterboard 2834 is electrically coupled to the motherboard 2832 and is configured to expand the functionality of the motherboard 2832. In some embodiments, the daughterboard 2834 includes portions to connect to the optics 2830, the button 2812, and the haptic motor 2838. Arranged in this manner, the daughterboard 2834 reduces the footprint of the components positioned within the body 2802, thereby reducing the overall size of the scanning device 2800.
The battery 2836 is electrically coupled to the motherboard 2832, the daughterboard 2834, the optics 2830, and the haptic motor 2838, and is operable to provide power to allow the scanning device 2800 to operate. In some embodiments, the battery 2836 is an alkaline battery (e.g., an AA battery, an AAA battery, etc.). In embodiments where the battery 2836 is an alkaline battery, the battery 2836 may be replaced as needed by separating the second body portion 2808 from the first body portion 2806. In some embodiments, the battery 2836 is a rechargeable battery (e.g., a lithium-ion battery, a nickel-cadmium battery, a nickel-metal-hydride battery, etc.). In such embodiments, the battery 2836 can be recharged by connecting the electrical connection 2814 to a power source. The electrical connection 2814 is electrically coupled to the battery 2836. In some embodiments, the battery 2836 is provided partially charged such that a user can complete a number of scans (e.g., 6 scans, 12 scans, 18 scans, etc.) before needing to charge the battery 2836. In some embodiments, the battery 2836 is provided fully charged such that a user can complete a larger number of scans (e.g., 20 scans, 25 scans, 30 scans, etc.) before needing to charge the battery 2836.
The haptic motor 2838 is electrically coupled to the battery 2836, the motherboard 2832, and the daughterboard 2834, and is configured to provide haptic feedback to the user regarding the use of the scanner. For example, the haptic motor may induce a vibration in the scanning device 2800 when the user is holding the scanning device 2800 to notify the user of an event. The event may include the initiation of a scan, the completion of a scan, an error during scanning, or other events associated with the scanning device 2800.
The sealing component (not shown) may be positioned between the second body portion 2808 and the first body portion 2806 such that the sealing component prevents fluids from entering the body 2802. In embodiments where the base body portion 2810 is a separate component from the second body portion 2808 and the first body portion 2806, the sealing component may include multiple sealing components between the second body portion 2808, the first body portion 2806, and the base body portion 2810 such that fluid is prevented from entering the body 2802.
A disc 2840 may be positioned between the stem 2818 and the body 2802. In some embodiments, the disc 2840 may be magnetic such that the stem 2818 (which may also be magnetic, in some embodiments) can be releasably coupled to the disc 2840. The disc 2840 may be rigidly coupled to the body 2802 such that, when the stem 2818 is removed from the disc 2840, the disc 2840 remains coupled to the body 2802.
Depending on the length of the stem 2818 and other features included in the scanning device 2800, the scanning device 2800 can scan teeth either intraorally or extraorally. For example, the distance between the lens 2816 and the teeth is at least partially dependent on the length of the stem 2818. A relatively short stem 2818 may result in the lens being at least partially within the mouth during a scan (e.g., an intraoral scan). A relatively long stem 2818 may result in the lens being outside of the mouth during a scan (e.g., an extraoral scan).
Referring to
The body 2922 is shown to enclose optics 2926, a motherboard 2928, a daughterboard 2930, and a battery 2932. The optics 2926, the motherboard 2928, the daughterboard 2930, and the battery 2932 are substantially similar to the similarly named components of
The battery 2932 is positioned toward the bottom of the body 2922 such that no other components are positioned between the battery 2932 and the bottom of the body 2922. The motherboard 2928 and the daughterboard 2930 are positioned between the battery 2932 and the optics 2926. The motherboard 2928 and the daughterboard 2930 are arranged such that the motherboard 2928 and the daughterboard 2930 are adjacent to each other.
Referring to
The body 2952 is shown to enclose optics 2956, a motherboard 2958, a daughterboard 2960, a battery 2962, a button 2964, and a haptic motor 2968. The optics 2956, the motherboard 2958, the daughterboard 2960, the battery 2962, the button 2964, and the haptic motor 2968 are substantially similar to the similarly named components of
The motherboard 2958 is positioned toward the bottom of the body 2952 such that no other components are positioned between the motherboard 2958 and the bottom of the body 2952. The haptic motor 2968 is positioned adjacent to the motherboard 2958 toward the bottom of the body 2952. The battery 2962 is positioned between the motherboard 2958 and the optics 2956, and is adjacent to the daughterboard 2960. The button 2964 extends from the daughterboard 2960 such that the button 2964 is accessible to a user. In some embodiments, the button 2964 extends through the body 2952 for user access. In some embodiments, the button 2964 is flush with an outer surface of the body 2952 for user access.
Referring to
Referring to
Referring to
Referring to
Referring to
The guide 3300 includes a base 3302 and a boss 3308 protruding from the base 3302. The base 3302 includes a base top 3310 and a base bottom 3312, with the base 3302 extending from the base top 3310 to the base bottom 3312. A first side portion 3304 protrudes from the base 3302 and extends from the base top 3310 to the base bottom 3312. The first side portion 3304 defines a concave shape such that the first side portion curves toward the boss 3308 as the first side portion extends from the base top 3310 toward the boss 3308, and the first side portion curves away from the boss 3308 as the first side portion extends from the boss 3308 toward the base bottom 3312. A second side portion 3306 defines a concave shape such that the second side portion curves toward the boss 3308 as the second side portion extends from the base top 3310 toward the boss 3308, and the second side portion curves away from the boss 3308 as the second side portion extends from the boss 3308 toward the base bottom 3312.
The boss 3308 includes a first protrusion 3314, a second protrusion 3316, a third protrusion 3318, and a fourth protrusion 3320. The first protrusion 3314 protrudes from the base 3302 and the second side portion 3306. The second protrusion 3316 protrudes from the base 3302 and the second side portion 3306. The first protrusion 3314 and the second protrusion 3316 meet at an axis extending across the base 3302 approximately midway between the base top 3310 and the base bottom 3312. The third protrusion 3318 protrudes from the base 3302 and the first side portion 3304. The fourth protrusion 3320 protrudes from the base 3302 and the first side portion 3304. The third protrusion 3318 and the fourth protrusion 3320 meet at the axis extending across the base 3302 approximately midway between the base top 3310 and the base bottom 3312. The first protrusion 3314, the second protrusion 3316, the third protrusion 3318, and the fourth protrusion 3320 meet at an intersection of the axis extending across the base 3302 approximately midway between the base top 3310 and the base bottom 3312 and an axis extending across the base 3302 approximately midway between the first side portion 3304 and the second side portion 3306. Accordingly, the shape of the boss 3308 may be described as a “bow tie” shape.
Referring to
The guide 3300 is shown moving over molar teeth 3404 in
Referring to
The guide 3500 includes a base 3502 and a boss 3508 protruding from the base 3502. The base 3502 includes a base top 3510 and a base bottom 3512, with the base 3502 extending from the base top 3510 to the base bottom 3512. A first side portion 3504 protrudes from the base 3502 and extends from the base top 3510 to the base bottom 3512. The first side portion 3504 defines a concave shape such that the first side portion curves toward the boss 3508 as the first side portion extends from the base top 3510 toward the boss 3508, and the first side portion curves away from the boss 3508 as the first side portion extends from the boss 3508 toward the base bottom 3512. A second side portion 3506 defines a concave shape such that the second side portion curves toward the boss 3508 as the second side portion extends from the base top 3510 toward the boss 3508, and the second side portion curves away from the boss 3508 as the second side portion extends from the boss 3508 toward the base bottom 3512.
The boss 3508 includes a first protrusion 3514, a second protrusion 3516, a third protrusion 3518, and a fourth protrusion 3520. The first protrusion 3514 protrudes from the base 3502 and extends toward the midpoint of the base 3502. The second protrusion 3516 protrudes from the base 3502 and extends toward the midpoint of the base 3502. The third protrusion 3518 protrudes from the base 3502 and extends toward the midpoint of the base 3502, and the fourth protrusion 3520 protrudes from the base 3502 and extends toward the midpoint of the base 3502. The first protrusion 3514, the second protrusion 3516, the third protrusion 3518, and the fourth protrusion 3520 meet at an intersection of an axis extending across the base 3502 approximately midway between the base top 3510 and the base bottom 3512 and an axis extending across the base approximately midway between the first side portion 3504 and the second side portion 3506. Accordingly, the shape of the boss 3508 may be described as a “diamond” or “pyramid” shape, with the apex located along an axis coaxial with an approximate center of the base 3502.
Referring to
The guide 3500 is shown moving over molar teeth 3404 in
Referring to
The guide 3700 includes a base 3702 (not shown) rotatably coupled to the stem 2618, and a boss 3708. A first side portion 3704 extends away from the base 3702 at an angle (e.g., 30 degrees, 40 degrees, 50 degrees, etc.). A second side portion 3706 extends away from the base 3702 at an angle opposite the angle of the first side portion 3704 (e.g., −30 degrees, −40 degrees, −50 degrees, etc.). The lengths of the first side portion 3704 and the second side portion 3706 are substantially similar such that the first side portion 3704 and the second side portion 3706 create two sides of an isosceles triangle, with the third side being open to allow the guide 3700 to receive a tooth.
The boss 3708 includes a first protrusion 3714 and a second protrusion 3716. The first protrusion 3714 extends from the first side portion 3704 and away from the stem 2618. The second protrusion 3716 extends from the second side portion 3706 and away from the stem 2618. Both the first protrusion 3714 and the second protrusion 3716 extend from the first side portion 3704 and the second side portion 3706, respectively, such that the first protrusion 3714 and the second protrusion 3716 are thickest where the first protrusion 3714 meets the second protrusion 3716 (e.g., in the approximate center of the guide 3700). The thickness of the first protrusion 3714 decreases as the first protrusion 3714 extends away from the approximate center of the guide 3700. The thickness of the second protrusion 3716 also decreases as the second protrusion 3716 extends away from the approximate center of the guide 3700. In some embodiments, the first protrusion 3714 and the second protrusion 3716 are one continuous protrusion that extends from a center portion of the base 3702 along the first side portion 3704 and the second side portion 3706.
Referring to
The guide 3700 is shown moving over molar teeth 3404 in
Referring to
The guide 3900 includes a base 3902, a first side portion 3908, and a second side portion 3910. The base 3902 is coupled to the coupler 2622 via a spindle 3924 that extends through the base 3902. In some embodiments, the spindle 3924 is rotatably coupled to the coupler 2622 and is rigidly coupled to the base 3902 such that the base 3902 rotates relative to the stem 2618. In some embodiments, the spindle 3924 is rigidly coupled to the coupler 2622 and is rotatably coupled to the base 3902 such that the base 3902 rotates relative to the spindle 3924 and the stem 2618. The base 3902 includes a first cavity 3904 and a second cavity 3906. The first cavity 3904 and the second cavity 3906 are openings in the base 3902 that extend entirely through the base 3902. The first cavity 3904 provides one or more surfaces to which the first side portion 3908 is coupled, and the second cavity 3906 provides one or more surfaces to which the second side portion 3910 is coupled.
The first side portion 3908 includes a first extension 3912, a first flange 3916, and a first tooth contact surface 3920. The first extension 3912 is rigidly coupled to the base 3902 and extends from the base 3902 such that the first extension 3912 is substantially perpendicular (e.g., between 80° and 100°) to the base 3902. The first flange 3916 extends from the first extension 3912 such that the first flange 3916 is substantially perpendicular (e.g., between 80° and 100°) to the first extension 3912. The first tooth contact surface 3920 is a concave surface positioned on the first flange 3916 opposite the first extension 3912. The first tooth contact surface 3920 is configured to contact one or more teeth during a scan.
The second side portion 3910 includes a second extension 3914, a second flange 3918, and a second tooth contact surface 3922. The second extension 3914 is rigidly coupled to the base 3902 and extends from the base 3902 such that the second extension 3914 is substantially perpendicular (e.g., between 80° and 100°) to the base 3902. The second flange 3918 extends from the second extension 3914 such that the second flange 3918 is substantially perpendicular (e.g., between 80° and 100°) to the second extension 3914. The second tooth contact surface 3922 is a concave surface positioned on the second flange 3918 opposite the second extension 3914. The second tooth contact surface 3922 is configured to contact one or more teeth during a scan. The first tooth contact surface 3920 and the second tooth contact surface 3922 are positioned opposite each other to contact opposite sides of one or more teeth during a scan.
Referring to
The guide 3900 is shown moving over molar teeth 3404 in
Referring to
The top 4102 and the bottom 4104 are coupled such that the top 4102 and the bottom 4104 cannot be separated, and the top 4102 and the bottom 4104 are pivotally coupled to each other. For example, the top 4102 and the bottom 4104 can be a single molded component with a living hinge that provides for pivotal movement between the top 4102 and the bottom 4104. As another example, the top 4102 and the bottom 4104 can be separate molded components coupled by a hinge that provides for pivotal movement between the top 4102 and the bottom 4104. In yet another example, the top 4102 and the bottom 4104 can be separate molded components coupled by incorporating a snap-fit connection that provides for pivotal movement between the top 4102 and the bottom 4104. The top 4102 and the bottom 4104 can be manufactured from any type of plastic that provides the desired properties to protect the scanning device 2700. For example, the top 4102 and the bottom 4104 can be manufactured from ABS, polycarbonate, polyurethane, polyethylene, or any other type of material that provides the desired properties.
The top latch 4106 is integrated with the top 4102 and is configured to mate with the bottom latch 4108 (which is integrated with the bottom 4104) to secure the case 4100 in a closed position. In some embodiments, the top latch 4106 and the bottom latch 4108 are magnets that attract each other when they are close together. For example, when closing the case 4100, the top latch 4106 and the bottom latch 4108 may pull the case 4100 closed when the top latch 4106 and the bottom latch 4108 are within a threshold distance of each other (e.g., 4 mm, 6 mm, 8 mm, etc.). In some embodiments, the top latch 4106 includes a tab and the bottom latch 4108 includes a slot configured to receive the tab. In such embodiments, the user must close the case 4100 such that the top latch 4106 engages with the bottom latch 4108 to prevent the case 4100 from opening inadvertently.
The insert 4110 is configured to fit within the bottom 4104 and provide secure storage areas for various components. The insert 4110 includes a first cavity 4112, a second cavity 4114, and a third cavity 4116. The first cavity 4112 is a recessed portion of the insert 4110 and is sized to receive the body 2702 of the scanning device 2700. In various other embodiments, the first cavity 4112 is sized and configured to receive the body 2602 of the scanning device 2600, the body 2802 of the scanning device 2800, or any other body of another scanning device. The second cavity 4114 and the third cavity 4116 are sized and configured to receive various guide portions (e.g., the guide portion 2604, the guide portion 2704, the guide portion 2804, etc.) or other guides (e.g., the guide 3300, the guide 3500, the guide 3700, the guide 3900, etc.). In some embodiments, the guide portions or guides stored in the second cavity 4114 and the third cavity 4116 can be the same guide (e.g., the second cavity 4114 and the third cavity 4116 may each store a guide portion 2704). In some embodiments, the guide portions or guides stored in the second cavity 4114 and the third cavity 4116 can be different guides (e.g., the second cavity 4114 may store the guide portion 2704 and the third cavity 4116 may store the guide 3300).
In some embodiments, the case 4100 includes an electrical connection such that the case 4100 can be plugged in to a standard outlet to provide power to one or more components stored within the case 4100.
Referring to
The mobile device cavity 4202 is positioned such that a bottom of the mobile device 4208 can rest within the mobile device cavity 4202 such that a back of the mobile device contacts the back portion 4206 to maintain the mobile device 4208 in a substantially upright (e.g., between 45° and 90° to horizontal) position. In some embodiments, the first cavity 4112 does not include the mobile device cavity 4202, and the mobile device 4208 rests within the first cavity 4112 such that the back of the mobile device 4208 contacts the back portion 4206 to maintain the mobile device 4208 in a substantially upright position.
The scanning device cavity 4204 is sized and configured such that a base of a scanning device (e.g., the base body portion 2610, the base body portion 2710, the base body portion 2810, etc.) fits within the scanning device cavity 4204. The scanning device cavity 4204 may also include an electrical charger to mate with an electrical connection of a scanning device (e.g., the electrical connection 2614, the electrical connection 2714, the electrical connection 2814, etc.). For example, when the scanning device 2700 is secured within the scanning device cavity 4204, the electrical charger mates with the electrical connection 2714 such that the scanning device 2700 charges when the case 4100 is connected to a power source.
Referring to
At 4304, a mobile application is provided to the user. For example, after providing the user's email address, the treatment provider may send the user an email and prompt the user to download a mobile application for use on a mobile device (e.g., a mobile phone, a tablet computer, etc.). The mobile application may be associated with the treatment provider and may provide data to the treatment provider to develop a treatment plan. The mobile application may also provide information to the user regarding the progress of the user according to the treatment plan.
At 4306, information is provided to the user regarding orthodontic treatment. For example, after the user downloads the mobile application and opens the mobile application, the mobile application may provide the user with information regarding orthodontic treatment. The mobile application may show the user different orthodontic treatment options, including wire and bracket systems (e.g., braces) and aligner systems, and show the user how each system works. The mobile application may also show the user advantages and disadvantages of each system.
At 4308, the user is prompted to purchase a scanning device. For example, the mobile application may provide the user the option to order a scanning device to conduct a scan to begin orthodontic treatment with aligners. The user may choose to purchase the scanning device at that time, or the user may choose to defer the purchase of the scanning device to a later date. If the user chooses to purchase the scanning device at that time, the user will be prompted to enter payment information to purchase the scanner.
At 4310, the user is prompted to create an account with the treatment provider. For example, the user can create an account through the mobile application by entering additional information and creating a username and password. As another example, the user can create an account through the website of the treatment provider by entering additional information and creating a login and password.
Though the steps of method 4300 are shown in a particular order, the steps can be performed in any order. For example, the user may be provided with information regarding orthodontic treatment before the treatment provider receives information from the user and before the user is provided with the mobile application. Additionally, the user may be prompted to purchase a scanning device after creating an account or before providing the mobile application to the user. Other orders of the steps of method 4300 are also possible, and the examples above are not intended to be limiting.
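One way to picture the flexible ordering of method 4300 is as a list of independent steps run in whatever sequence is configured. The sketch below uses only the steps named above; the function names and the alternative ordering shown are illustrative.

```python
# Hypothetical sketch of method 4300's steps; per the description above, the
# steps can be performed in any order.

def provide_mobile_application():  print("4304: provide the mobile application to the user")
def provide_treatment_info():      print("4306: provide information regarding orthodontic treatment")
def prompt_scanner_purchase():     print("4308: prompt the user to purchase a scanning device")
def prompt_account_creation():     print("4310: prompt the user to create an account")


def run_onboarding(steps):
    for step in steps:
        step()


# The order described above:
run_onboarding([provide_mobile_application, provide_treatment_info,
                prompt_scanner_purchase, prompt_account_creation])

# An alternative order, e.g., prompting the purchase after account creation:
run_onboarding([provide_mobile_application, provide_treatment_info,
                prompt_account_creation, prompt_scanner_purchase])
```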
Referring to
At 4404, a determination is made as to whether the user has an account. If the user has an account, then at 4406 a determination is made as to whether the user account is a complete account. For example, the user may have entered the minimum amount of information when initially setting up the account, and therefore the user only has a partial account. As another example, the user may have entered all of the required information, and therefore the user has a complete account. If it is determined that the user has a complete account, at 4408 verification information is received from the user. For example, the user may enter the username and password to verify the complete account and sign in to the mobile application. As another example, the user may provide a biometric marker (e.g., a fingerprint, retina scan, facial scan, etc.) to verify the complete account and sign in to the mobile application.
At 4418, a determination is made as to whether a scan has already been completed. For example, if a scan has already been completed, the user is notified by the mobile application at 4430 that the scanned images are being processed. Image processing will be further described with reference to
The treatment provider receives the order for the scanning device and ships the scanning device to the user. Upon shipping the scanning device, the mobile application may provide a notification to the user that the scanning device has been shipped, along with an estimated delivery date. The mobile application may also send the user a notification when the scanning device is delivered.
Returning to 4420, if the user has a scanning device (e.g., if the user previously purchased a scanning device or if the user received the scanning device after ordering the scanning device from the mobile application prompt), the user responds to the question regarding the scanning device by indicating the user has a scanning device. The user is then introduced to the scanning device at 4424. For example, the mobile application may instruct the user how to turn on the device. The mobile application may also provide the user with an overview of the scanning device by showing the user the different parts of the scanning device.
At 4426, the scanning device is connected to the mobile device. For example, the mobile application may instruct the user how to initiate pairing of the scanning device and the mobile device via Bluetooth or other short range connection mechanism. Additionally, the mobile application may instruct the user how to initiate pairing of the scanning device and the mobile device via an Internet connection (e.g., a WiFi connection or a wired Internet connection). When the scanning device and the mobile device are paired, the mobile application may provide feedback to the user to indicate that the devices are paired. The scanning device may also provide feedback (e.g., by vibration, sound, light, etc.) that it has been paired to the mobile device.
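The pairing step above could be sketched as follows; the ScannerConnection class and its methods are hypothetical stand-ins rather than an actual device or mobile API.

```python
# Hypothetical sketch of pairing the scanning device with the mobile device over
# Bluetooth or an Internet (e.g., WiFi) connection, with feedback on both devices.

class ScannerConnection:
    def __init__(self, transport="bluetooth"):
        self.transport = transport   # "bluetooth" or "wifi"
        self.paired = False

    def pair(self):
        # A real implementation would perform the radio handshake here.
        self.paired = True
        return self.paired


def connect_scanner(app_notify, scanner_notify, transport="bluetooth"):
    connection = ScannerConnection(transport)
    if connection.pair():
        app_notify("Scanning device paired")   # feedback in the mobile application
        scanner_notify("vibrate")              # feedback on the scanning device itself
    return connection


connect_scanner(app_notify=print, scanner_notify=print)
```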
At 4428, the mobile application provides training to the user regarding how to use the scanning device. For example, the mobile application may play a video that provides a detailed overview of the scanning process. The mobile application may also provide step-by-step instructions (e.g., a combination of pictures, illustrations, and text) where the user can swipe the screen of the mobile device to move forward or backward through the steps so the user can learn each step of the operation of the scanning device. The mobile application may also provide an augmented reality training video that incorporates the user into the training video. For example, a front facing camera of the mobile device can record an image of the user as the user views the training video on the mobile device. The image of the user can be included in the training video such that the user can see how to use the scanning device relative to the user's face and/or mouth.
Returning to 4406, if the account is not a complete account, the user is prompted to complete the account at 4410 by entering additional information via the mobile application. After entering the additional information, the determination is made as to whether the user has a scanning device at 4420, as described.
Returning to 4404, if the determination is made that the user does not have an account, a determination is then made as to whether the user has a scanning device at 4412. For example, the mobile application may display a question to the user to ask the user if the user has a scanning device. If the user does not have a scanning device, the user is then prompted to purchase a scanning device at 4416. The user then enters information required to purchase the scanning device (payment information, shipping address, etc.) and submits the purchase to the treatment provider. After the user receives the scanning device, the user is prompted to complete the account set up at 4414. The user is then introduced to the scanning device at 4424, as described. If the user already has a scanning device, the user is prompted to complete the account set up at 4414, and then the user is introduced to the scanning device at 4424, as described.
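The account and device branching described above (4404 through 4424) can be summarized as a small decision routine. The sketch below is only a summary of that branching; the step names are shorthand for the numbered steps and the function itself is hypothetical.

    def onboarding_steps(has_account: bool, account_complete: bool, has_scanner: bool) -> list:
        """Return the ordered onboarding steps implied by the branching at 4404-4424."""
        steps = []
        if has_account:
            if account_complete:
                steps.append("receive_verification_info")   # 4408
            else:
                steps.append("complete_account")             # 4410
            if not has_scanner:
                steps.append("order_scanning_device")        # prompt to purchase a device
        else:
            if not has_scanner:
                steps.append("order_scanning_device")        # 4416
            steps.append("complete_account_setup")           # 4414
        steps.append("introduce_scanning_device")            # 4424
        return steps

    # onboarding_steps(has_account=False, account_complete=False, has_scanner=True)
    # -> ['complete_account_setup', 'introduce_scanning_device']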
Referring to
At 4604, information is displayed regarding how to conduct an acceptable scan. In some embodiments, the user may be prompted to prepare a space in which the scan will be conducted. For example, the user may be prompted to clear an area in which the case 4100 can be placed, and to put the mobile device in the case such that the mobile device is visible to the user, as described with reference to
At 4606, the mobile application receives scanned images. For example, the user may initiate the scan by pressing a button on the scanning device. In some embodiments, the user may initiate the scan by pressing an icon on the mobile device, which communicates with the scanning device and initiates the scan. In some embodiments, the scanning device can detect its position and communicate with the mobile application when the scanning device is oriented to initiate a scan (e.g., the user holds the scanning device in a substantially horizontal position). The user proceeds to scan the user's teeth using the scanning device. For example, the user may scan the user's teeth using the scanning device 2600. The user places the guide 2620 on the lower teeth and moves the guide around the lower teeth as the scanning device scans the upper teeth. The user then places the guide 2620 on the upper teeth and moves the guide around the upper teeth as the scanning device scans the lower teeth. The images are then provided to the mobile application for further analysis.
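The orientation-based trigger mentioned above (initiating a scan when the scanning device is held in a substantially horizontal position) can be approximated from an accelerometer reading. The sketch below is an illustration only; the axis convention (the device's long axis as x) and the tolerance value are assumptions.

    import math

    def is_substantially_horizontal(ax: float, ay: float, az: float,
                                    tolerance_deg: float = 10.0) -> bool:
        """Return True if the device's long (x) axis is within tolerance of horizontal.

        ax, ay, az are accelerometer readings taken while the device is held
        still, so the measured vector is dominated by gravity.
        """
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        if magnitude == 0.0:
            return False
        # The tilt of the long axis out of the horizontal plane is 0 degrees when
        # gravity has no component along x and 90 degrees when x points straight down.
        tilt_deg = math.degrees(math.asin(min(1.0, abs(ax) / magnitude)))
        return tilt_deg <= tolerance_deg

    # is_substantially_horizontal(0.5, 0.2, 9.78)  -> True (device held nearly level)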
At 4608, the mobile application determines if the scan is valid. In some embodiments, the data from the scanner is transferred from the scanning device to the mobile device via a short-range wireless connection, such as Bluetooth or WiFi. While the data is being transferred from the scanning device to the mobile device, the scanning device may provide indications that the transfer is occurring. For example, the scanning device may vibrate or flash a light (in embodiments where the scanning device is equipped with a light) to indicate the transfer is occurring. The mobile device may also provide indications that the transfer is occurring and a status of the transfer (e.g., 25% complete, 50% complete, 75% complete). For example, the mobile application may display a message to the user that the data transfer is occurring and explain that the user may have to wait for a specified period of time before conducting an additional scan.
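A transfer-status indication of the kind described (25% complete, 50% complete, and so on) can be driven by a chunked transfer with a progress callback, as in the generic sketch below; the helper names are placeholders and do not reflect a particular transport.

    from typing import Callable, Iterable

    def transfer_with_progress(chunks: Iterable[bytes], total_bytes: int,
                               send_chunk: Callable[[bytes], None],
                               on_progress: Callable[[int], None]) -> None:
        """Send scan data chunk by chunk, reporting percent complete after each chunk."""
        sent = 0
        for chunk in chunks:
            send_chunk(chunk)                 # e.g., write to the Bluetooth or WiFi link
            sent += len(chunk)
            on_progress(int(100 * sent / total_bytes))  # e.g., update the status shown to the user

    # transfer_with_progress(chunks, total, radio_send, lambda p: print(f"{p}% complete"))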
After the mobile application receives the data, the mobile application analyzes the images obtained by the scanning device and determines whether the data is sufficient to create a 3D model of the teeth. If the scan is not valid (e.g., the scan includes areas with missing data such that a 3D model cannot be created), the user is instructed to conduct the scan again. If the scan is valid, the mobile application determines whether all scans are complete at 4610. For example, the scanning process may include conducting a scan of the upper teeth and determining the validity of the scan prior to scanning the lower teeth, or vice versa. If the mobile application determines that not all scans are complete, the user is instructed to conduct the next scan, and may be provided with additional information regarding how to conduct an acceptable scan. If the mobile application determines that all scans are complete, the scan data is transferred to a server (e.g., the server 142) at 4612. In some embodiments, the scan data is transferred via a wired connection. In some embodiments, the scan data is transferred via a wireless connection.
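The mobile-side logic described here (reject a scan with missing data and repeat until both arches have a valid scan) could be organized along the lines of the following sketch. The coverage threshold and the helper names are assumptions for illustration.

    REQUIRED_COVERAGE = 0.95  # assumed fraction of the arch that must contain usable data

    def scan_is_valid(scan) -> bool:
        """A scan is valid when enough of the arch is covered to build a 3D model."""
        covered = sum(1 for region in scan.regions if region.has_data)
        return covered / len(scan.regions) >= REQUIRED_COVERAGE

    def collect_valid_scans(acquire_scan, arches=("upper", "lower")):
        """Scan each arch in turn, repeating any arch whose scan is not valid."""
        results = {}
        for arch in arches:
            while True:
                scan = acquire_scan(arch)     # the user conducts the scan with the device
                if scan_is_valid(scan):
                    results[arch] = scan
                    break
                # otherwise the user is instructed to conduct this scan again
        return results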
At 4614, the server determines whether the scan is valid. For example, the server (e.g., the server 142) analyzes the scan data for the user's upper teeth and lower teeth and determines whether a 3D model can be created from the scan data provided. If a 3D model cannot be created, the server notifies the mobile application, and the mobile application prompts the user to rescan. For example, the server 142 may determine that not all teeth (e.g., canine, molar, cuspid, pre-molar, etc.) can be identified by the scan, thus requiring a rescan. Furthermore, the server 142 may determine that appropriate differentiation between upper teeth and lower teeth cannot be achieved (e.g., the boundaries of upper and/or lower teeth cannot be properly determined), thus requiring a rescan. Additionally, the server 142 may determine that a surface scanned by the scanner and identified as a tooth is not, in fact, a tooth, thus requiring a rescan. If a 3D model can be created, the server notifies the mobile application, and the mobile application displays a success message to the user. In some embodiments, while the server is attempting to validate the scan, the mobile application provides the user with a progress bar to indicate to the user the progress of the scan validation process. The mobile application may also prompt the user to take selfie photos of the user's teeth to provide photos of the user's teeth prior to beginning the orthodontic treatment. For example, the mobile application may prompt the user to take a selfie photo including all of the user's teeth. In some embodiments, the mobile application may prompt the user to take a photo of only the upper teeth or the lower teeth. Additionally, the mobile application may prompt the user to answer additional medical questions (e.g., medical history, dental history, etc.) during the scan validation process.
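On the server side, the failure modes listed above (teeth that cannot be identified, upper and lower teeth that cannot be differentiated, and a surface misidentified as a tooth) amount to a list of rescan reasons. The following sketch illustrates returning such reasons; every attribute and helper shown is a placeholder for whatever analysis the server actually performs.

    EXPECTED_TOOTH_TYPES = {"incisor", "canine", "premolar", "molar"}

    def validate_scan_on_server(scan) -> list:
        """Return the reasons a rescan is required; an empty list means the scan is valid."""
        reasons = []

        identified = {tooth.tooth_type for tooth in scan.identified_teeth}
        if not EXPECTED_TOOTH_TYPES.issubset(identified):
            reasons.append("not all teeth could be identified")

        if not scan.arches_separable:        # placeholder flag from segmentation
            reasons.append("upper and lower teeth could not be differentiated")

        if any(surface.misclassified for surface in scan.tooth_surfaces):
            reasons.append("a surface identified as a tooth is not a tooth")

        return reasons

    # If validate_scan_on_server(scan) returns an empty list, the server notifies the
    # mobile application of success; otherwise the user is prompted to rescan.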
Referring to
At 4704, identification information is received from the guest. For example, the type of identification information provided by the guest may be substantially similar to the type of identification information provided by the user to create an account. For example, the guest may be prompted to provide the guest's name, address, phone number, email address, etc.
At 4706, the guest is introduced to the scanning device. The guest may be introduced to the scanning device in substantially the same manner in which the user was introduced to the scanning device in step 4424 of
At 4708, a scan is conducted. For example, the guest may conduct the scan in a substantially similar manner as the user conducted the scan in
At 4710, the scan is transferred and validated. For example, the guest scan may be transferred and validated in a substantially similar manner as steps 4606-4614 of
At 4712, the guest scan is complete. For example, the mobile application may provide a message to the guest stating that the guest scan is complete.
At 4714, the mobile application is provided to the guest for the guest to download. For example, after the guest scan is complete, the mobile application may send a communication to the guest (e.g., an email to the guest's email account, a text to the user's mobile phone, etc.) prompting the guest to download the application on the guest's device.
At 4716, information is provided to the guest regarding orthodontic treatment. For example, the mobile application may provide the guest information regarding different orthodontic treatment options, including wire and bracket systems (e.g., braces) and aligner systems, and show the user how each system works. The mobile application may also show the user advantages and disadvantages of each system.
At 4718, the guest may be prompted to create an account. For example, the mobile application on the user's mobile device may prompt the guest to create an account such that the login process on the guest's mobile device is faster for the guest when the guest downloads the mobile application. The account created by the guest may be substantially similar to the complete account described in step 4410 of
At 4720, the guest's data is linked to the guest account. For example, when the guest downloads the mobile application on the guest's mobile device and logs in to the mobile application using the login credentials created on the user's mobile device, the mobile application may determine that the guest's scan data is linked to the user's mobile device and not the guest's mobile device. The mobile application may then prompt the guest to link the guest's scan data to the guest's mobile device, thereby eliminating the link between the guest's scan data and the user's mobile device.
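Re-linking the guest's scan data to the guest's own account and device is, in effect, an update of the record's associations. The sketch below shows one way that update could look; the record fields are assumptions made for the example.

    def relink_guest_scan(scan_record: dict, guest_account_id: str, guest_device_id: str) -> dict:
        """Associate a guest scan with the guest's own account and mobile device."""
        updated = dict(scan_record)                  # copy of the stored record
        updated["account_id"] = guest_account_id     # the account the guest created
        updated["device_id"] = guest_device_id       # the guest's device, replacing the host user's
        return updated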
Referring to
At 4804, information regarding scanner care is provided. For example, the mobile application may provide information to the user regarding how to clean the scanning device and any guides that were used during the scan. Information may be provided to the user regarding how to store the scanning device and guides to prevent damage.
At 4806, the mobile application provides an option to refer a friend. For example, during the image processing step, the mobile application may prompt the user to enter contact information for one or more friends the user wants to refer for treatment. After the user enters the contact information for the one or more friends, information is sent from the treatment provider to the one or more friends regarding the treatment, and a link to download the mobile application is provided. If the friends determine that they desire orthodontic treatment and download the mobile application, the friends will be directed to purchase a scanner and conduct a scan, as described.
At 4808, a determination is made whether the scan is valid. For example, after the server attempts to create a 3D model from the scanned images, the server will determine whether the scanned images are usable or unusable for that purpose. If the server determines that a 3D model cannot be created from the scanned images, the mobile application will prompt the user to retake the scans at 4810. If the server determines that a 3D model can be created from the scanned images, the mobile device may display a confirmation message to the user to notify the user that the images were acceptable and a 3D model was created.
At 4812, the user's current 3D smile is displayed. For example, the mobile application may display a message to the user that the user's scan is ready to be viewed, and provide the user with a button to push to view the scan. After pressing the button, the user's scan will be displayed on the mobile device. In some embodiments, the user's scan is shown as a 3D model of the current configuration of the user's teeth. The user may be able to manipulate the 3D model (e.g., zoom, pan, rotate) to view the 3D model from various orientations.
At 4814, the user's treatment plan is displayed. For example, the mobile application may display a message to the user that the user's treatment plan is ready to be viewed, and provide the user with a button to push to view the treatment plan. After pressing the button, the user's treatment plan will be displayed on the mobile device. In some embodiments, the user's treatment plan will be shown as a progression of 3D models of the user's teeth throughout the treatment plan. The user may be able to select any of the individual 3D models to provide a larger image of the 3D model, which the user may be able to manipulate (e.g., pan, zoom, rotate) to view the 3D model from various orientations. For example, the user may desire to view the 3D model of the final step in the treatment plan, which represents the final tooth configuration for the user after treatment. The user can select the final 3D model by selecting the model on the mobile device, and a larger image of the final 3D model will be displayed on the screen of the mobile device. The user can then manipulate the 3D model to view the final configuration of the user's teeth from various angles and orientations.
At 4816, the user is prompted to purchase the treatment plan. For example, if the user has not already purchased the plan at a different point in the process, the mobile application will prompt the user to enter payment information to purchase the treatment plan. After entering the payment information, the user's scan data is sent to the aligner fabrication center 162 such that the user's aligners can be manufactured and sent to the user to begin the orthodontic treatment.
Referring to
To use the scanning device 5104, the user removes the components from the packaging 5102 and plugs the base 5110 into a power source. The user assembles the scanning device 5104 with the first guide 5106 or the second guide 5108 and places the mobile device 5118 in the slot 5120. The slot 5120 is positioned such that the display of the mobile device 5118 can be viewed when the user is scanning the user's teeth, as shown. As the user scans the user's teeth, the user follows along on the display of the mobile device 5118 to create an accurate scan. After the scan is complete, the user stores the charging equipment 5112 inside the opening 5114 and places the scanning device 5104 in the recess 5116. The user may also leave the mobile device 5118 in the slot 5120 if the user chooses.
Referring to
To use the scanning device 5204, the user removes the components from the packaging 5202 and plugs the base 5210 into a power source to charge the base 5210. After the base 5210 is charged, the base 5210 can be coupled to a smooth surface 5222 (e.g., a wall, a mirror, a countertop) using securement devices 5224 (e.g., suction cups, adhesive strips, etc.). The user assembles the scanning device 5204 with the first guide 5206 or the second guide 5208 and places the mobile device 5218 in the slot 5220. The slot 5220 is positioned such that the display of the mobile device 5218 can be viewed when the user is scanning the user's teeth, as shown. As the user scans the user's teeth, the user follows along on the display of the mobile device 5218 to create an accurate scan. The user can also view the positioning of the scanning device 5204 and the first guide 5206 in the smooth surface 5222 during the scan. After the scan is complete, the user stores the charging equipment 5212 inside the opening 5214 and places the scanning device 5204 in the recess 5216. The user may also leave the mobile device 5218 in the slot 5220 if the user chooses.
Referring to
To use the scanning device 5304, the user removes the components from the packaging 5302 and plugs the base 5310 into a power source. The user assembles the scanning device 5304 with the first guide 5306 or the second guide 5308 and places the mobile device 5318 in the slot 5320. The slot 5320 is positioned such that the display of the mobile device 5318 can be viewed when the user is scanning the user's teeth, as shown. As the user scans the user's teeth, the user follows along on the display of the mobile device 5318 to create an accurate scan. After the scan is complete, the user stores the charging equipment 5312 inside the opening 5314 and places the scanning device 5304 in the recess 5316. The user may also leave the mobile device 5318 in the slot 5320 if the user chooses.
Referring to
The scanning device 5420 includes a handle 5422, an extension 5424, a track 5426, a first cutout 5438, a second cutout 5440, and a scanner 5428. The scanning device 5420 may be manufactured from any material suitable to facilitate scanning of the user's teeth. Suitable materials may include, but are not limited to, metals, plastics, and composites. The handle 5422 is coupled to the extension 5424 and is configured to provide a surface for the user to grasp when inserting the scanning device 5420 into the user's mouth. The extension 5424 provides a surface that can be positioned over the teeth of the user such that the scanner 5428 can scan the teeth. The track 5426 is coupled to the extension 5424 and is configured to receive the scanner 5428 and provide a space along which the scanner 5428 can travel during a scan. The first cutout 5438 is a space positioned along the extension 5424 that is configured to engage with the first engagement portion 5404 to secure the scanning device 5420 in place. The second cutout 5440 is a space positioned along the extension 5424 that is configured to engage with the second engagement portion 5406 to secure the scanning device 5420 in place.
The scanner 5428 includes a first camera 5430, a second camera 5432, a top 5434, and a base 5436. The scanner 5428 may also include a projector (not shown), a power source (not shown), and a motor (not shown). The first camera 5430 and the second camera 5432 may be substantially similar to the first camera 206 and the second camera 208, and they may communicate with the projector to scan the teeth of the user accurately. The first camera 5430 and the second camera 5432 are positioned on the top 5434 such that, as shown, the scanning device 5420 can scan the upper teeth of the user. The top 5434 is coupled to the base 5436, and the base is configured to be received by the track 5426. The base may include protrusions that are configured to engage with slots located along the track 5426. When the motor of the scanner 5428 is activated, the protrusions may rotate in one direction to propel the scanner 5428 along the track 5426 from one end of the track 5426 to the other end of the track 5426.
To scan the user's teeth using the scanning device 5400, the user places the insert 5402 in the user's mouth and then couples the scanning device 5420 to the insert 5402. The user may then engage with the mobile application to begin the scan, and the mobile application may send a signal to the scanner 5428 to begin the scan. The scanner 5428 will move along the track at a predetermined rate to scan the top teeth of the user. When the scan of the top teeth is complete, the mobile application may instruct the user to remove the scanning device 5420, rotate it such that the scanner is facing the user's bottom teeth, and couple the scanning device 5420 to the insert 5402 in that orientation. The mobile device may send another signal to the scanner 5428 to begin the scan, and the scanner will move along the track at the predetermined rate to scan the bottom teeth of the user. The scans are then provided to the mobile device and the server for further processing, as described with respect to the other embodiments disclosed herein.
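The predetermined rate at which the scanner 5428 travels along the track 5426 translates into a step interval for the motor. A minimal sketch of that timing follows; the track length, step size, and rate are illustrative values, not dimensions from this disclosure.

    import time

    TRACK_LENGTH_MM = 60.0     # assumed travel distance along the track
    STEP_SIZE_MM = 0.5         # assumed advance per motor step
    SCAN_RATE_MM_PER_S = 5.0   # assumed predetermined scan rate

    def traverse_track(step_motor, capture_frame) -> None:
        """Move the scanner end to end at the predetermined rate, capturing as it goes."""
        steps = int(TRACK_LENGTH_MM / STEP_SIZE_MM)
        step_interval_s = STEP_SIZE_MM / SCAN_RATE_MM_PER_S
        for _ in range(steps):
            step_motor()           # advance the scanner one step along the track
            capture_frame()        # acquire images from the cameras at this position
            time.sleep(step_interval_s)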
The scanning devices described herein (e.g., the scanning devices 102, 2600, 2700, 2800, 2920, 2950) are sized to be held and easily transported by the user. For example, the scanning devices described herein are approximately the same size as an electric toothbrush. In some embodiments, the scanning devices described herein are between approximately five inches and twelve inches long. In some embodiments, the scanning devices described herein are between approximately six inches and ten inches long. In some embodiments, the scanning devices described herein weigh between 0.25 pounds and three pounds. In some embodiments, the scanning devices described herein weigh between one pound and two pounds.
Furthermore, the scanning devices described herein are configured to conduct a scan without being tethered to additional large equipment (e.g., equipment that requires a cart or other transportation device to move the equipment). In some embodiments, the scanning devices described herein are configured to conduct a scan without a wired tether to any other systems or devices, and can conduct a scan while being wirelessly coupled to other components of the scanning system (e.g., a mobile device). In some embodiments, the scanning devices described herein are configured to conduct a scan while being tethered to a mobile device with a wire.
Additionally, the scanning devices described herein provide an efficient, inexpensive way to conduct a scan of teeth. A scanning operation conducted with the scanning devices described herein can be executed by a user that is not a dental or orthodontic professional. The data generated by the scanning devices described herein can be provided to various entities (e.g., dental or orthodontic professionals, manufacturing facilities for manufacturing orthodontic aligners, etc.) for a variety of purposes (e.g., dental diagnosis and/or treatment and/or checkups, orthodontic treatment planning or mid-course correction, orthodontic aligner manufacturing, orthodontic checkups, etc.).
Referring back to
The images acquired during a scan may be two-dimensional images, such as still photographs. However, the disclosure is not limited to still photographs. For example, the scanning device 102 may be configured to capture video of the oral cavity during the scan. After the video is captured, the scanning device 102 (which, as noted above, may be part of the mobile device 122) or the cloud server 142 may determine whether the scan is acceptable. In some embodiments, the scanning device 102 or the cloud server 142 may determine that a first portion of the video is acceptable and a second portion of the video is unacceptable. In such embodiments, the scanning device 102 or the cloud server 142 may discard the unacceptable portion and retain the acceptable portion. The scanning device 102 may then prompt the user to scan the unacceptable portion again to obtain an acceptable scan. The acceptable portions of the scan may be used to determine a treatment plan and/or an oral condition of the patient, as described above.
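Splitting a captured video into acceptable and unacceptable portions can be treated as grouping consecutive frames by a per-frame quality check, as in the sketch below; the quality check itself is left as a placeholder.

    def split_video_by_quality(frames, frame_is_acceptable):
        """Group consecutive frames into (acceptable, frames) segments.

        frames: an iterable of video frames.
        frame_is_acceptable: a placeholder predicate (e.g., based on focus or coverage).
        """
        segments = []
        for frame in frames:
            ok = frame_is_acceptable(frame)
            if segments and segments[-1][0] == ok:
                segments[-1][1].append(frame)
            else:
                segments.append((ok, [frame]))
        return segments

    # kept = [seg for ok, seg in split_video_by_quality(frames, check) if ok]
    # The remaining segments indicate which portions the user should be prompted to rescan.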
In some embodiments, the mobile application directs the user to conduct multiple scans of the oral cavity. For example, the mobile application directs the user to conduct the initial scan. After the initial scan is complete, the mobile application may prompt the user to conduct one or more subsequent scans (e.g., a second scan, a third scan, etc.). In some arrangements, the mobile application prompts the user to conduct subsequent scans at regular intervals such that the progress of the user is monitored regularly. The prompt from the mobile application may occur at a time set by the user or at predetermined times (e.g., based on when the initial scan is performed). For instance, the mobile application may include a default setting in which the user is prompted to scan the oral cavity every two weeks after the initial scan is completed. In some examples, the mobile application may prompt the user to scan the oral cavity, or the user may set the mobile application to prompt the user to scan the oral cavity, every week, every three weeks, every four weeks, every month, every two months, every three months, every six months, every 10 days, every 20 days, every 30 days, or every 90 days. The mobile application may prompt the user to scan the oral cavity over any recurring period of time. The data from a subsequent scan can be compared to data from another one of the subsequent scans and/or the initial scan to monitor the oral condition (e.g., a progress of the rearrangement of the user's teeth, whether the alignment of the user's teeth is retained within a threshold, whether a hygienic condition has changed, whether a cavity has formed, etc.). In some examples, data from the second scan can be compared to data from the initial scan, data from the fourth scan can be compared to data from the second scan, etc., to monitor progress between the various subsequent scans.
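The recurring prompts described above reduce to generating a series of reminder dates from the date of the initial scan. A minimal sketch follows, using the two-week default mentioned above; the count of reminders is an assumption.

    from datetime import date, timedelta

    def scan_reminder_dates(initial_scan: date, interval_days: int = 14, count: int = 12):
        """Yield the dates on which the user should be prompted to scan again.

        interval_days defaults to 14 (the two-week default); any other recurring
        period (7, 21, 30, 90 days, and so on) can be passed instead.
        """
        for i in range(1, count + 1):
            yield initial_scan + timedelta(days=i * interval_days)

    # list(scan_reminder_dates(date(2024, 1, 1)))[:3]
    # -> [date(2024, 1, 15), date(2024, 1, 29), date(2024, 2, 12)]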
In some embodiments, the scanning device 102 is a limited use device. For example, the scanning device 102 may be constructed for purposes of only being used by a single user for a typical treatment duration. For example, the scanning device 102 may be configured to be able to take a limited number of scans (e.g., due to a memory constraint, other hardware constraint, etc.). For example, each scan may consume a certain amount of memory on the scanning device 102. Therefore, the scanning device 102 may only be capable of conducting a certain number of scans (e.g., 3 scans, 10 scans, 50 scans, etc.). The scanning device 102 may be configured to take only a certain number of scans needed for a treatment duration. For example, if a typical treatment duration is one year, the scanning device 102 may be configured to take only twelve scans so that the user is able to take one scan a month over the course of the entire treatment duration, or the scanning device 102 may be configured to take 52 scans so that the user is able to take one scan every week over the entire treatment duration. In some embodiments, the scanning device 102 is capable of performing a single scan (i.e., it is a single-use device). In some embodiments, the scanning device 102 can be refurbished or otherwise returned to an entity that can reuse the scanning device 102. For example, the scanning device 102 can be used by a first user during the first user's treatment duration, returned to an entity associated with providing the treatment, then after cleaning and/or refurbishing, the scanning device 102 can be sent to a second user for the second user to use during the second user's treatment duration.
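Enforcing a limited number of scans can be as simple as a counter checked before each scan, whether the limit reflects a memory constraint or the treatment duration (for example, twelve monthly scans or 52 weekly scans over a one-year treatment). The sketch below is illustrative only.

    class LimitedUseScanner:
        """Tracks how many scans remain on a limited-use scanning device."""

        def __init__(self, max_scans: int = 12):  # e.g., one scan per month for a one-year treatment
            self.max_scans = max_scans
            self.scans_taken = 0

        def can_scan(self) -> bool:
            return self.scans_taken < self.max_scans

        def record_scan(self) -> None:
            if not self.can_scan():
                raise RuntimeError("scan limit reached for this device")
            self.scans_taken += 1

    # scanner = LimitedUseScanner(max_scans=52)  # one scan per week over a one-year treatment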
In another embodiment, the scanning device 102 (which may be combined with the mobile device 122 as a unitary device, as described) is configured to detect other intraoral cavity conditions, such as odors or disease. For example, the scanning device 102 is equipped with an odor detector (e.g., an electronic nose, a chemosensor, etc.) configured to detect odors within the oral cavity. In some instances, when a user has an oral condition (e.g., a cavity, a cracked tooth, an ulcer, etc.) within the oral cavity, the oral cavity may produce an odor that is different from an odor the oral cavity would produce in the absence of the oral condition. In such instances, the odor detector detects the odor and communicates with the scanning device 102 and/or the cloud server 142 to provide an indication of the odor detected in the oral cavity. The scanning device 102 and/or the cloud server 142 then determines the oral condition based on the indication of the odor produced by the oral cavity and detected by the odor detector. The oral condition may then be communicated to the user via the mobile device 122. In another example, the mobile application or server system may be configured to process the detected odors apart from or along with the acquired images to detect signs of disease, such as gum disease (gingivitis), cancerous cells, dry mouth, bacteria, tooth decay, sores, and tooth erosion. The mobile application or server system may be able to detect such conditions based on detecting bad breath, discolored (e.g., red) gums, swollen gums, bleeding gums, sores, lumps, rough areas in the mouth, canker sores, fever blisters, cold sores, thrush, etc.
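Processing detected odors apart from or along with the acquired images suggests a simple fusion step: combine the odor indication with whatever the image analysis reports and flag conditions supported by either or both. The sketch below is a rule-based illustration only; the marker names and thresholds are assumptions and it is not presented as a diagnostic method.

    def flag_possible_conditions(odor_markers: dict, image_findings: set) -> set:
        """Combine odor-detector markers with image-analysis findings into condition flags.

        odor_markers: e.g., {"volatile_sulfur": 0.8}, an assumed 0-1 scale from the odor detector.
        image_findings: e.g., {"red_gums", "swollen_gums"}, labels from the image analysis.
        """
        flags = set()
        if odor_markers.get("volatile_sulfur", 0.0) > 0.5:
            flags.add("bad_breath")
        if {"red_gums", "swollen_gums", "bleeding_gums"} & image_findings or "bad_breath" in flags:
            flags.add("possible_gum_disease")
        if "dark_spot_on_tooth" in image_findings:
            flags.add("possible_tooth_decay")
        return flags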
In some embodiments, the user is directed and/or supervised by a dentist and/or an orthodontist during a scanning procedure (e.g., before, during, and after a scan is conducted). For example, the scanning device 102 may be configured to provide a communication (e.g., a video communication, a phone communication, or online streaming) with the dentist and/or orthodontist such that the dentist and/or orthodontist is able to view the scanning process in real time. In some instances, the dentist and/or orthodontist can direct the user to conduct the scan. For example, the dentist and/or orthodontist may communicate with the user and inform the user of any changes to the position of the scanning device 102 that must occur prior to conducting the scan. Furthermore, during the scan the dentist and/or orthodontist may communicate with the user regarding various parameters of the scan including, but not limited to, the angle of the scanning device 102, the speed of the scan, the available lighting, relative motion between the scanning device 102 and the oral cavity of the user, etc. For example, the dentist and/or orthodontist may notify the user that the scanning device 102 is positioned at too steep or too shallow of an angle relative to the oral cavity to prompt the user to change the position. The dentist and/or orthodontist may notify the user that the user is moving the scanning device too quickly and prompt the user to slow down or start over to generate a valid scan. The dentist and/or orthodontist may also notify the user that the images are too dark and that the user should move to an area with more light or use a light incorporated with the scanning device 102. In addition, the dentist and/or orthodontist may notify the user that the user's head is moving too much during the scan and prompt the user to keep the user's head stationary during the scan.
The dentist and/or orthodontist may supervise the user during the scan and provide comments after the scan is complete. For instance, the dentist and/or orthodontist may view the entire scan conducted by the user, and then when the scan is complete the dentist and/or orthodontist can provide feedback regarding the various parameters described above. The dentist and/or orthodontist may find it more useful to provide feedback after the scan than during the scan, as it may be difficult for a user to follow the direction of the dentist and/or orthodontist while attempting to scan the oral cavity.
In some instances, the user and the dentist and/or orthodontist are in different physical locations (e.g., the user may be at the user's home and the dentist and/or orthodontist may be at an office or at their own respective home). The user and the dentist and/or orthodontist may be in the same physical location (e.g., the user and the dentist and/or orthodontist may be at the user's home or some other physical location, such as an office associated with the dentist and/or orthodontist). In some embodiments, the user can conduct the scan at an office space where a dentist and/or orthodontist is present, even if the dentist and/or orthodontist are not in close proximity to the user (e.g., the dentist and/or orthodontist is in another room or assisting another user but the dentist and/or orthodontist is nonetheless in the same building).
In instances where the user and the dentist and/or orthodontist are in the same physical location, the scan can be conducted in a variety of ways. For example, the user may conduct the scan with the dentist and/or orthodontist assisting the user for the duration of the scan (e.g., the dentist and/or orthodontist helps the user set up the scanning device 102, initiate the scan, orient the scanning device 102 properly, move the scanning device 102 along the teeth, complete the scan, upload the images, etc.). The user may also conduct the scan with the dentist and/or orthodontist assisting the user for only portions of the scan (e.g., the dentist and/or orthodontist helps the user with one or more of setting up the scanning device 102, initiating the scan, orienting the scanning device 102 properly, moving the scanning device 102 along the teeth, completing the scan, uploading the images, etc.). In some embodiments, the dentist and/or orthodontist may be available to the user only when the user asks for assistance (e.g., the dentist and/or orthodontist is in the same location as the user but does not interact with the user unless the user needs assistance). For example, the user may encounter difficulty initiating the scan (e.g., opening or configuring the mobile application, logging in, selecting the appropriate options for a scan, etc.) and may require assistance, which the dentist and/or orthodontist may provide to the user at the user's request. After the dentist and/or orthodontist assists with the initiation of the scan, the user may execute the scan without additional assistance. In another example, the user may initiate the scan without encountering difficulty, but may encounter difficulty in executing the scan (e.g., orienting the scanning device 102 properly, moving the scanning device 102 to scan the teeth, etc.) and may ask for assistance. The dentist and/or orthodontist may then assist the user in executing the scan. As yet another example, the user may initiate the scan and conduct the scan without assistance, but may encounter difficulty when attempting to complete the scan (e.g., upload images, etc.) and may ask for assistance. The dentist and/or orthodontist may then assist the user in completing the scan.
In some instances, the dentist and/or orthodontist provides the user with a tutorial on the scanning process prior to the user initiating the scan. The tutorial may include the dentist and/or orthodontist showing the user how to conduct a scan (e.g., via discussion and demonstration with a scanning device). The tutorial may also include a printed instruction manual for the user to read and/or a recorded video for the user to watch prior to initiating the scan. In some embodiments, the dentist and/or orthodontist reviews the results of the scan after the scan is complete to verify whether the scan is accurate or complete, and whether it should be repeated, in whole or in part. For example, the dentist and/or orthodontist may determine that a first portion of the scan is acceptable, and a second portion of the scan is not acceptable. The dentist and/or orthodontist may prompt the user to conduct an additional scan of the second portion. As another example, the dentist and/or orthodontist may determine that the entire scan must be repeated due to incomplete data or unreliable data. In such instances, the dentist and/or orthodontist may prompt the user to conduct an additional scan of the user's teeth.
In instances where the user is at a different physical location than the dentist and/or orthodontist, the scan can be conducted in a variety of ways. For example, the user can be in contact with the dentist and/or orthodontist (e.g., via video call, audio call, online streaming, etc.) for the duration of the scanning process so the dentist and/or orthodontist can assist the user throughout the scanning process. In some instances, the dentist and/or orthodontist may be in contact with the user prior to the user conducting the scan, but then may allow the user to conduct the scan without further assistance, unless the user requires assistance or is unable to complete the scan. If the user requires assistance, the user may contact the dentist and/or orthodontist (e.g., via video call, voice call, online streaming, etc.) for assistance. In some embodiments, the dentist and/or orthodontist can follow the progress of the user on another device (e.g., a computer, mobile device, etc.) that is in communication with the scanning device 102 and/or mobile device 122 of the user. If the dentist and/or orthodontist determines that the user needs assistance, the dentist and/or orthodontist can contact the user to assist the user in completing the scan. In all of the above examples, regardless of whether the user is at a different physical location than the dentist and/or orthodontist or at the same location, it will be appreciated that a technician who is not a dentist or an orthodontist may alternatively, or additionally, assist the user.
In some arrangements, the dentist and/or orthodontist contacts the user to provide feedback regarding physical impressions sent by the user for processing. The impressions may not provide the required information to develop a treatment plan, and in such instances, the dentist and/or orthodontist can contact the user (e.g., via video call, audio call, online streaming, etc.) and provide the user with guidance as to how to properly make an impression of the teeth of the user.
It is important to note that the construction and arrangement of the systems, apparatuses, and methods shown in the various exemplary embodiments is illustrative only. Additionally, any element disclosed in one embodiment may be incorporated or utilized with any other embodiment disclosed herein. For example, any of the exemplary embodiments described in this application can be incorporated with any of the other exemplary embodiments described in the application. Although only one example of an element from one embodiment that can be incorporated or utilized in another embodiment has been described above, it should be appreciated that other elements of the various embodiments may be incorporated or utilized with any of the other embodiments disclosed herein.
Although the figures and description may illustrate a specific order of method steps, the order of such steps may differ from what is depicted and described, unless specified differently above. Also, two or more steps may be performed concurrently or with partial concurrence, unless specified differently above. Such variation may depend, for example, on the software and hardware systems chosen and on designer choice. All such variations are within the scope of the disclosure. Likewise, software implementations of the described methods could be accomplished with standard programming techniques with rule-based logic and other logic to accomplish the various connection steps, processing steps, comparison steps, and decision steps.
References herein to the positions of elements (e.g., “top,” “bottom,” “above,” “below”) are merely used to describe the orientation of various elements in the figures. It should be noted that the orientation of various elements may differ according to other exemplary embodiments, and that such variations are intended to be encompassed by the present disclosure.
The term “or,” as used herein, is used in its inclusive sense (and not in its exclusive sense) so that when used to connect a list of elements, the term “or” means one, some, or all of the elements in the list. Conjunctive language such as the phrase “at least one of X, Y, and Z,” unless specifically stated otherwise, is understood to convey that an element may be either X, Y, Z; X and Y; X and Z; Y and Z; or X, Y, and Z (i.e., any combination of X, Y, and Z). Thus, such conjunctive language is not generally intended to imply that certain embodiments require at least one of X, at least one of Y, and at least one of Z to each be present, unless otherwise indicated.
The term “coupled” and variations thereof, as used herein, means the joining of two members directly or indirectly to one another. Such joining may be stationary (e.g., permanent or fixed) or moveable (e.g., removable or releasable). Such joining may be achieved with the two members coupled directly to each other, with the two members coupled to each other using a separate intervening member and any additional intermediate members coupled with one another, or with the two members coupled to each other using an intervening member that is integrally formed as a single unitary body with one of the two members. If “coupled” or variations thereof are modified by an additional term (e.g., directly coupled), the generic definition of “coupled” provided above is modified by the plain language meaning of the additional term (e.g., “directly coupled” means the joining of two members without any separate intervening member), resulting in a narrower definition than the generic definition of “coupled” provided above. Such coupling may be mechanical, electrical, or fluidic.
As utilized herein, the term “substantially” and similar terms are intended to have a broad meaning in harmony with the common and accepted usage by those of ordinary skill in the art to which the subject matter of this disclosure pertains. It should be understood by those of skill in the art who review this disclosure that these terms are intended to allow a description of certain features described and claimed without restricting the scope of these features to the precise numerical ranges provided. Accordingly, these terms should be interpreted as indicating that insubstantial or inconsequential modifications or alterations of the subject matter described and claimed are considered to be within the scope of the disclosure as recited in the appended claims.
The hardware and data processing components used to implement the various processes, operations, illustrative logics, logical blocks, modules and circuits described in connection with the embodiments disclosed herein may be implemented or performed with a general purpose single- or multi-chip processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general-purpose processor may be a microprocessor or any conventional processor, controller, microcontroller, or state machine. A processor also may be implemented as a combination of computing devices, such as a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. In some embodiments, particular processes and methods may be performed by circuitry that is specific to a given function. The memory (e.g., memory, memory unit, storage device) may include one or more devices (e.g., RAM, ROM, Flash memory, hard disk storage) for storing data and/or computer code for completing or facilitating the various processes, layers and modules described in the present disclosure. The memory may be or include volatile memory or non-volatile memory, and may include database components, object code components, script components, or any other type of information structure for supporting the various activities and information structures described in the present disclosure. According to an exemplary embodiment, the memory is communicably connected to the processor via a processing circuit and includes computer code for executing (e.g., by the processing circuit or the processor) the one or more processes described herein.
The present disclosure contemplates methods, systems and program products on any machine-readable media for accomplishing various operations. The embodiments of the present disclosure may be implemented using existing computer processors, or by a special purpose computer processor for an appropriate system, incorporated for this or another purpose, or by a hardwired system. Embodiments within the scope of the present disclosure include program products comprising machine-readable media for carrying or having machine-executable instructions or data structures stored thereon. Such machine-readable media can be any available media that can be accessed by a general purpose or special purpose computer or other machine with a processor. By way of example, such machine-readable media can comprise RAM, ROM, EPROM, EEPROM, optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to carry or store desired program code in the form of machine-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer or other machine with a processor. Combinations of the above are also included within the scope of machine-readable media. Machine-executable instructions include, for example, instructions and data which cause a general-purpose computer, special purpose computer, or special purpose processing machines to perform a certain function or group of functions.
This application is a continuation-in-part of U.S. patent application Ser. No. 17/397,550, filed Aug. 9, 2021, which is a continuation of U.S. patent application Ser. No. 17/106,635, filed Nov. 30, 2020, now U.S. Pat. No. 11,083,551, which is a continuation of U.S. patent application Ser. No. 16/711,173, filed Dec. 11, 2019, now U.S. Pat. No. 10,849,723, which claims the benefit of and priority to U.S. Provisional Application No. 62/844,694, filed May 7, 2019, which are each hereby incorporated by reference in their entirety.
Number | Date | Country
62/844,694 | May 2019 | US

Relation | Number | Date | Country
Parent | 17/106,635 | Nov. 2020 | US
Child | 17/397,550 | | US
Parent | 16/711,173 | Dec. 2019 | US
Child | 17/106,635 | | US

Relation | Number | Date | Country
Parent | 17/397,550 | Aug. 2021 | US
Child | 17/843,381 | | US