This disclosure relates to an imaging apparatus and an imaging system for imaging an intraoral image of a subject person.
Hitherto, there has been known a diagnosis of, for example, a viral cold based on a doctor's observation of a change in oral state of a subject person. “Posterior Pharyngeal Wall Follicles as a Diagnostic Marker of Influenza During Physical Examination: Considering Their Meaning and Value” by Miyamoto and Watanabe, Journal of Nihon University Medical Association 72(1): 11 to 18 (2013) reports that lymph follicles appearing at a deepest part of a pharynx in an oral cavity have a pattern peculiar to influenza. Lymph follicles having the peculiar pattern are referred to as “influenza follicles”. Influenza follicles are a characteristic sign of influenza, and are considered to appear about two hours after the onset of influenza. However, appropriate identification of influenza follicles requires intensive training through a large number of cases, and thus is not easy for general doctors.
Thus, in view of the above-mentioned technology, this disclosure has an object to provide an imaging apparatus and an imaging system according to various embodiments, which are suitable to make a diagnosis based on an object image obtained by imaging an intraoral image.
According to one aspect of this disclosure, there is provided “an imaging apparatus, including: a main body which has a proximal end and a distal end, and is formed into a columnar shape having a predetermined length between the proximal end and the distal end so as to enable insertion of at least the distal end into an oral cavity; one or a plurality of light sources which are provided on the proximal end side or the distal end side of the main body, or to the main body, and are configured to emit light having a predetermined frequency band; a diffuser plate which is provided on the proximal end side or the distal end side of the main body, or to the main body, and is configured to diffuse the light emitted from the light sources toward the oral cavity; and a camera which is provided on the proximal end side or the distal end side of the main body, or to the main body, and is configured to image an object image based on reflected light which has been diffused by the diffuser plate and reflected from the oral cavity.”
According to one aspect of this disclosure, there is provided “an imaging system, including: the above-mentioned imaging apparatus; and a processing device which is connected to the above-mentioned imaging apparatus in a wired or wireless manner so as to be communicable with the imaging apparatus, and is configured to process an object image imaged by the above-mentioned imaging apparatus.”
According to various embodiments of this disclosure, the imaging apparatus and the imaging system, which are suitable to make a diagnosis based on the object image obtained by imaging the intraoral image, can be provided.
The above-mentioned effects are mere exemplifications for convenience of description, and are not limitative. In addition to or in place of the above-mentioned effects, any effects described in this disclosure and effects apparent to those skilled in the art can be obtained.
Various embodiments of this disclosure are described with reference to the accompanying drawings. The same components in the drawings are denoted by the same reference symbols.
1. Overview of Imaging System 1
An imaging system 1 according to this disclosure is mainly used to image an intraoral image of a subject person so as to obtain an object image. In particular, the imaging system 1 is used to image an image of an intraoral region around the back of a throat, more specifically, a pharynx. Thus, the imaging system 1 according to this disclosure, which is used to image an image of the pharynx, is mainly described below. However, the pharynx is merely an example of a region to be imaged. It is apparent that the imaging system 1 according to this disclosure can be suitably used for any other intraoral region.
As an example, the imaging system 1 according to this disclosure is used to image an image of a pharynx in an oral cavity of a subject person to determine a probability of influenza infection of the subject person. Thus, a case in which the imaging system 1 is used to determine a probability of influenza infection is described below. However, the determination of a probability of influenza infection is merely one example of the use of the imaging system of this disclosure. It is apparent that the imaging system of this disclosure may be suitably used for the determination of any disease that manifests as distinctive intraoral findings as a result of infection. Examples of such diseases include streptococcal infections, adenovirus infections, Epstein-Barr (EB) virus infections, mycoplasma infections, and hypertension.
In this disclosure, the terms relating to diseases, such as “determination” and “diagnosis”, are used. However, these terms do not always refer to a definitive determination or diagnosis made by a doctor. For example, it is apparent that the terms “determination” and “diagnosis” may also refer to a determination and a diagnosis made by a processing device 100 included in the imaging system 1 when an imaging subject person himself or herself or a user other than doctors uses the imaging system 1 of this disclosure.
The thus imaged object image (image of the pharynx 715) is transmitted from the imaging apparatus 200 to the processing device 100, which is connected to the imaging apparatus 200 in a wired manner so as to be communicable therewith. The processor of the processing device 100 that has received the object image executes the programs stored in a memory of the processing device 100, determines a probability of influenza infection based on the object image, and outputs a result thereof to, for example, a display.
2. Configuration of Imaging System 1
At least the distal end of the imaging apparatus 200 is inserted into an oral cavity of the subject person to image an intraoral image, in particular, an image of a pharynx. Specific imaging processing is described later. The imaged object image is transmitted to the processing device 100 via a wired connection cable.
The processor 111 functions as a control unit that controls other components of the imaging system 1 based on the programs stored in the memory 112. The processor 111 controls driving of the camera 211 and driving of the light source 212, and stores the object image received from the imaging apparatus 200 to process the stored object image based on the programs stored in the memory 112. More specifically, the processor 111 receives an instruction input to the input interface 113 by the user to execute processing based on the programs stored in the memory 112. The processing includes, for example, processing for turning on the light source 212 to instruct the camera 211 to perform imaging, processing for extracting a still image to be used for a determination from the imaged images, processing for performing a predetermined process on the extracted image to determine likelihood of a predetermined disease, and processing for outputting a result of determination to, for example, the display 114. The processor 111 mainly includes one or a plurality of CPUs. However, the processor 111 may suitably include a GPU or other units in combination therewith.
The memory 112 is formed of, for example, a RAM, a ROM, a non-volatile memory, or an HDD, and functions as a storage unit. The memory 112 stores instructions and commands for various kinds of control on the imaging system 1 according to this embodiment as programs. More specifically, the memory 112 receives an instruction input to the input interface 113 by the user to store the programs that allow the processor 111 to execute the processing. The processing includes, for example, the processing for turning on the light source 212 to instruct the camera 211 to perform imaging, the processing for extracting a still image to be used for determination from the imaged images, processing for performing a predetermined process on the extracted image to determine likelihood of a predetermined disease, and the processing for outputting a result of determination to, for example, the display 114. Further, the memory 112 stores, in addition to the programs, the object image imaged by the camera 211 of the imaging apparatus 200, a result of determination of likelihood of a disease, which is made by the processor 111, various kinds of information about subject persons including the subject person 700, and other information.
The input interface 113 functions as an input unit that receives an instruction issued to the processing device 100 and the imaging apparatus 200, which is input by the user. Examples of the input interface 113 include, as illustrated in
The display 114 functions as a display unit for displaying the object image imaged by the imaging apparatus 200 and outputting a result of determination made by the processor 111. The display 114 is formed of a liquid crystal panel. However, the display 114 is not limited to a liquid crystal panel, and may also be formed of, for example, an organic electroluminescence (EL) display or a plasma display.
The camera 211 functions as an imaging unit. The imaging unit is driven in response to an instruction output from the processing device 100, and detects light reflected by the oral cavity being an object, to thereby generate the object image. To detect the light, the camera 211 includes, as an example, a CMOS image sensor, a lens system, and a drive system. The lens system and the drive system are provided to achieve desired functions. The image sensor is not limited to a CMOS image sensor, and other sensors such as a CCD image sensor may be used. Although not particularly shown, the camera 211 may have an autofocus function. It is preferred that the camera 211 be set so as to be focused on a specific region in front of a lens, for example. Further, the camera 211 may have a zooming function. It is preferred that the camera 211 be set to image an image at a suitable magnification in accordance with a size of a pharynx or influenza follicles.
In this embodiment, the camera 211 is inserted into the oral cavity of the subject person, and is used to image an image of the pharynx located in the back of the oral cavity. Thus, a distance between the camera 211 and the object is relatively small. Thus, the camera 211 has an angle of view (2θ) that allows values calculated by [(distance from distal end portion of camera 211 to rear wall of pharynx) × tan θ] to be 20 mm or larger as a vertical range and 40 mm or larger as a horizontal range. The use of the camera having the above-mentioned angle of view enables imaging over a wider range even when the camera 211 and an object are close to each other. Thus, a general camera may be used as the camera 211, or a so-called wide angle camera or super-wide angle camera may be used.
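As a worked illustration of this relationship, the minimum full angle of view 2θ can be estimated as in the following sketch. The camera-to-pharynx distance of 50 mm used below is only an assumed value for the example and is not a value specified in this disclosure.

```python
import math

# Worked example of the angle-of-view condition described above.
# The camera-to-pharynx distance of 50 mm is an assumed value used only
# for illustration; it is not specified in this disclosure.
distance_mm = 50.0
required_ranges_mm = {"vertical": 20.0, "horizontal": 40.0}

for direction, required_mm in required_ranges_mm.items():
    # Condition from the text: distance * tan(theta) >= required range
    theta_rad = math.atan(required_mm / distance_mm)
    full_angle_deg = math.degrees(2 * theta_rad)
    print(f"{direction}: 2*theta >= {full_angle_deg:.1f} degrees")

# Under these assumptions the horizontal requirement dominates and calls
# for a full angle of view of roughly 77 degrees or more, i.e. a wide-angle
# rather than a narrow-angle camera.
```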
Further, in this embodiment, influenza follicles, which are a target of imaging by the camera 211, are formed at the pharynx in the oral cavity. The pharynx is generally located at the back in a depth direction. Thus, when a depth of field is shallow, a region between an anterior part of the pharynx and a posterior part of the pharynx may be out of focus. Hence, it is difficult to obtain an appropriate object image to be used for the determination made by the processing device 100. Thus, the camera 211 has a depth of field of 20 mm or larger, preferably 30 mm or larger. The use of the camera having such a depth of field allows an in-focus object image of any region between the anterior part of the pharynx and the posterior part of the pharynx to be obtained.
The light source 212 is driven in response to an instruction output from the processing device 100, and functions as an optical source unit for irradiating the oral cavity with light. The light source 212 includes one or a plurality of light sources. In this embodiment, the light source 212 is formed of one or a plurality of light-emitting diodes (LEDs). Light having a predetermined frequency band is radiated from each of the LEDs in a direction toward the oral cavity. As light emitted from the light source 212, light having a desired band selected from an ultraviolet light band, a visible light band, and an infrared light band, or a combination thereof is used. In particular, when influenza follicles are irradiated with light having a short-wavelength band in the ultraviolet light band, a specific component in the influenza follicles reacts therewith. Thus, likelihood of a disease can be more reliably determined.
In this embodiment, description has been given of the case in which the processing device 100 and the imaging apparatus 200 are connected to each other through a wired connection cable so as to be communicable with each other. However, it is apparent that the connection therebetween is not limited to the wired connection. The processing device 100 and the imaging apparatus 200 may be connected through wireless communication.
3. Flow of Processing Executed by Processing Device 100
In this embodiment, at least one image that satisfies predetermined conditions is extracted from object images (video image) imaged by the imaging apparatus 200. For example, the video image imaged by the imaging apparatus 200 includes still images at a predetermined frame rate. The still images include those imaged under desirable imaging conditions and those imaged under undesirable imaging conditions. In this case, several frames (still images) imaged under desirable imaging conditions are selected from the imaged video image. For example, conditions under which a specific disease (for example, influenza) is diagnostically determined with high accuracy are found in advance based on a number of pieces of similar image data. The conditions include all or some of a specific imaging angle, a degree of light emission from a specific light source, wideness of a specific field of view, a position at which a specific region (for example, a pharynx) is extracted, and a degree of focus on the specific region. The conditions are set as criteria for determining whether imaging conditions are desirable or undesirable. Then, the likelihood of a disease is determined by processing several still images extracted based on the criteria. Machine learning such as a deep learning function may be used for the extraction processing. In this embodiment, description has been given of the case in which still images are extracted from a video image. However, a plurality of still images may be imaged so that several appropriate ones are extracted therefrom. As another example, only the number of still images used for the determination may be imaged, and the extraction processing may be omitted.
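A minimal sketch of such frame extraction is shown below. It uses the variance of the Laplacian as one possible focus criterion; the video file name and the number of retained frames are hypothetical, and the actual criteria described above may also cover imaging angle, illumination, field of view, and the position of the extracted region.

```python
import cv2  # OpenCV, used here only as one possible implementation


def extract_sharp_frames(video_path: str, num_frames: int = 5):
    """Pick the frames with the highest focus measure from a video.

    The focus measure used here (variance of the Laplacian) is only one
    example of a criterion for "desirable imaging conditions".
    """
    capture = cv2.VideoCapture(video_path)
    scored = []
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()
        scored.append((sharpness, frame))
    capture.release()
    # Keep the num_frames sharpest frames for the determination step.
    scored.sort(key=lambda item: item[0], reverse=True)
    return [frame for _, frame in scored[:num_frames]]


# "pharynx.mp4" is a hypothetical file name used for illustration only.
frames = extract_sharp_frames("pharynx.mp4", num_frames=5)
```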
The processor 111 then extracts still images appropriate for the determination from a plurality of still images included in the acquired video image (Step S113). In this step, the extracted still images may be shown on the display 114. Then, the processor 111 determines likelihood of a specific disease (for example, likelihood of influenza) based on the extracted still images and a determination algorithm (Step S114). Then, the processor 111 performs control so as to output the result of determination to, for example, the display 114 (Step S115). As a result, a series of processing steps ends.
As an example, machine learning is used for the determination based on the determination algorithm in Step S114. More specifically, an image of the pharynx of an influenza patient and a label attached thereto, which indicates whether or not the patient is infected with influenza, are collected in, for example, a medical institution. At this time, a ground truth label is attached based on, for example, a result of an influenza rapid test, a PCR test, or a virus isolation culture test, each using a swab, which the patient has undergone. The PCR test and the virus isolation culture test require several weeks to give results. However, the PCR test and the virus isolation culture test have extremely high accuracy. Thus, the results thereof are suitable as ground truth labels.
Next, training data (image data) labelled with a result of determination of the PCR test or the virus isolation culture test as a ground truth is generated. Machine learning is performed based on the training data to generate the determination algorithm. The determination algorithm serves to determine, when an image is given, whether or not the image shows the likelihood of influenza. The determination algorithm enables a probability of influenza to be quantified, for example, as “98.5%”. The probability may be indicated by an appropriate combination of, for example, a “positive” or “negative” determination and a grade evaluation as “high”, “middle”, or “low”.
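A minimal sketch of how such a determination algorithm could be trained is shown below. The folder layout `data/positive` and `data/negative` is hypothetical, and the small generic CNN is used purely as a placeholder; this disclosure does not limit the determination algorithm to any particular model, training procedure, or framework.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Hypothetical dataset layout: data/positive/*.jpg and data/negative/*.jpg,
# labeled with PCR / virus isolation culture results as ground truth.
transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
dataset = datasets.ImageFolder("data", transform=transform)
loader = DataLoader(dataset, batch_size=16, shuffle=True)

# Generic CNN used as a stand-in for the determination algorithm; in practice
# pretrained weights or a purpose-built model would typically be used.
model = models.resnet18(weights=None)
model.fc = nn.Linear(model.fc.in_features, 1)  # single logit: influenza likelihood
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.BCEWithLogitsLoss()

model.train()
for epoch in range(5):
    for images, labels in loader:
        optimizer.zero_grad()
        logits = model(images).squeeze(1)
        loss = criterion(logits, labels.float())
        loss.backward()
        optimizer.step()

# At inference time the sigmoid of the logit gives a probability,
# e.g. 0.985, which can be reported as "98.5%".
```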
4. Configuration of Imaging Apparatus 200
The imaging system 1 according to this disclosure includes the imaging apparatus 200 that picks up an image to be processed in the processing device 100.
In this embodiment, the imaging apparatus 200 includes a record button 215. Thus, the user can instruct the start and end of imaging with the record button of the input interface 113 of the processing device 100, and can also instruct the start and end of imaging with the record button 215 of the imaging apparatus 200.
The main body 214 is formed in a hollow cylindrical columnar shape with a perfect circular cross section. A wall portion 224 thereof may be made of any material that allows light to be guided therein, and may be made of a thermoplastic resin as an example. As the thermoplastic resin, for example, polyolefin-based resins, such as chain polyolefin-based resins (such as polypropylene-based resins) and cyclic polyolefin-based resins (such as norbornene-based resins), cellulosic ester-based resins, such as triacetyl cellulose and diacetyl cellulose, polyester-based resins, polycarbonate-based resins, (meth)acrylic resins, polystyrene-based resins, or a mixture or a copolymer thereof is used. Specifically, the main body 214 functions as a light guiding member for guiding light emitted from the light source into the oral cavity or in a direction toward the diffuser plate.
The main body 214 is hollow, and hence has an accommodation space 223 defined by an inner surface of the wall portion 224. The camera 211 is accommodated in the accommodation space 223. The main body 214 is only required to be formed in a columnar shape having the accommodation space 223. Thus, the accommodation space 223 is not required to have a cylindrical shape with a perfect circular cross section, and may have an elliptical or polygonal sectional shape. Further, the main body 214 is not always required to be hollow.
In this case, as an example, a length of the main body 214 is determined based on a positional relationship with the incisor teeth of the subject person.
The handgrip 213 has a distal end connected to the proximal end 220 of the main body 214. The user holds the handgrip 213 to perform an operation such as removal and insertion of the imaging apparatus 200. The handgrip 213 has a shape suitable to be held in a user's palm. Specifically, the handgrip 213 is tapered toward its distal end connected to the main body 214 and is upcurved toward its proximal end on the opposite side of the main body 214. In this embodiment, the handgrip 213 has a perfect circular cross section. However, the cross section of the handgrip 213 is not required to be perfectly circular, and may also be elliptical or polygonal.
In this case, as an example, a width (distance D2) of the main body 214 in a direction perpendicular to a direction that connects the distal end 221 and the proximal end 220 of the main body 214 is determined based on a relationship with an opening width of a subject person's mouth in a vertical direction.
The handgrip 213 has engagement protrusions 217 and a positioning protrusion 218 in the vicinity of the proximal end 220 of the main body 214. The engagement protrusions 217 and the positioning protrusion 218 are formed so as to enable positioning of the auxiliary member 300. The engagement protrusions 217 are engaged with engagement protrusions 318 of the auxiliary member 300. Further, the positioning protrusion 218 is inserted into an insertion hole 321 of the auxiliary member 300, to thereby position the imaging apparatus 200 and the auxiliary member 300 with respect to each other. In this embodiment, as the engagement protrusions 217 of the main body 214, a total of four engagement protrusions (engagement protrusions 217-1, 217-2, 217-3, and 217-4) are arranged at equal intervals in the vicinity of the proximal end 220 of the main body 214 on a surface of the handgrip 213. A single positioning protrusion 218 is arranged between the engagement protrusions 217 on the surface of the handgrip 213 in the vicinity of the proximal end 220 of the main body 214. However, the arrangement of the engagement protrusions 217 and the positioning protrusion 218 is not limited to that described above. Alternatively, only either the engagement protrusions 217 or the positioning protrusion 218 may be formed. Further, any number of engagement protrusions 217 and positioning protrusions 218 may be formed as long as one or a plurality of engagement protrusions 217 and one or a plurality of positioning protrusions 218 are formed.
The diffuser plate 219 is arranged at the distal end 221 of the main body 214, and diffuses light, which has been emitted from the light source 212 and has passed through the main body 214, into the oral cavity. The diffuser plate 219 has a shape in conformity with a sectional shape of a part of the main body 214, which is formed so as to be capable of guiding the light. In this embodiment, the main body 214 is formed in a hollow cylindrical shape. Thus, the diffuser plate 219 is also formed to have a donut-like sectional shape in conformity with the shape of the main body 214.
The camera 211 is used to detect the light, which has been diffused by the diffuser plate 219 to be radiated into the oral cavity and has been reflected by the object, to generate an object image. The camera 211 is arranged inside the inner surface of the wall portion 224 of the main body 214, specifically, in the accommodation space 223 defined inside the main body 214. In this embodiment, only one camera 211 is provided. However, the imaging apparatus 200 may include a plurality of cameras. When the object images are generated with use of a plurality of cameras, the object images include information about three-dimensional shapes of influenza follicles. As a result, the influenza follicles are more accurately detected based on the determination algorithm to enable determination of a probability of influenza infection. Further, in this embodiment, the camera 211 is arranged in the accommodation space 223 of the main body 214. However, the camera 211 may also be provided to the distal end 221 of the main body 214 or to the main body 214 (may be arranged in the main body 214 or on an outer periphery of the main body 214).
The light source 212 is provided to a rear part of the main body 214 behind the diffuser plate 219, specifically, at a predetermined position on a side closer to the handgrip 213. Light is emitted from the light source 212 in a direction toward the main body 214. The light, which has passed through the main body 214, is diffused into the oral cavity through the diffuser plate 219. In this embodiment, the light source 212 is arranged in conformity with the sectional shape of the main body 214. Specifically, four light sources (light sources 212-1, 212-2, 212-3, and 212-4) are arranged at equal intervals on a board 225 (not shown).
5. Configuration of Auxiliary Member 300
In this embodiment, the auxiliary member 300 is manufactured by integral molding of a resin. However, the auxiliary member 300 may be made of other materials such as paper, cloth, a metal, or a combination thereof. Further, it is desired that the auxiliary member 300 be of disposable type. However, the auxiliary member 300 may also be of reusable type.
The main body 312 is formed so as to cover at least a part of the main body 214 of the imaging apparatus 200 on the distal end 221 side, and thus is formed in a cylindrical shape having a predetermined length corresponding to a length of the main body 214. The main body 312 has a proximal end-side opening 320 on the proximal end 316 side and a distal end-side opening 313 on the distal end 314 side. The proximal end-side opening 320 receives the imaging apparatus 200 inserted therein. The distal end-side opening 313 allows light emitted from the light source 212 of the imaging apparatus 200 and reflected light from the object to pass therethrough. In this embodiment, the distal end 314 of the auxiliary member 300 is first inserted into the oral cavity.
Further, in this embodiment, the main body 214 of the imaging apparatus 200 has an inner diameter and an outer diameter that are substantially constant from the proximal end 220 to the distal end 221. Thus, the main body 312 of the auxiliary member 300 is also formed to have a substantially constant inner diameter and outer diameter in a longitudinal direction so as to conform with the shape of the main body 214. It is preferred that the main body 312 be formed to have a sectional shape in conformity with the sectional shape of the main body 214 of the imaging apparatus 200. However, the main body 312 may have any sectional shape, such as a perfect circular shape, an elliptical shape, or a polygonal shape, as long as the main body 214 is insertable therein.
The gripping plates 311 are arranged along the proximal end 316 of the main body 312. The gripping plates 311 are gripped by the user with a hand when the user inserts the auxiliary member 300 into the oral cavity of the subject person. In this embodiment, the gripping plates 311 also function as a regulating member. When the auxiliary member 300 is inserted into the oral cavity, distal end-side surfaces of the gripping plates 311 are brought into contact with, for example, lips of the subject person, to thereby regulate further insertion of the auxiliary member 300. In this embodiment, as the gripping plates 311, a pair of gripping plates are provided to the main body 312 on the proximal end 316 side so as to be symmetric in the vertical direction. However, a gripping plate may be formed in any shape such as a donut-like shape extending along the proximal end 316 of the main body 312.
The tongue depressor 315 is provided to a lower part (side closer to a tongue in the oral cavity) of the main body 312, and is formed in a blade-like shape extending in a direction toward the tongue. When images are imaged by the imaging apparatus 200, movement of the tongue of the subject person in front of the imaging apparatus 200 hinders the pickup of intraoral images. The tongue depressor 315 pushes the tongue downward to restrict the movement of the tongue in the oral cavity and prevent the tongue from being located in front of the imaging apparatus 200. Thus, in this embodiment, the tongue depressor 315 is formed in a blade-like shape. However, the tongue depressor 315 may have any shape as long as the above-mentioned function is achievable.
Further, the main body 312 has mechanisms for positioning the imaging apparatus 200 on the proximal end 316 side of the main body 312 when the imaging apparatus 200 is inserted from the proximal end 316 side. More specifically, the main body 312 includes a pair of grooves 319, regions 317, the engagement protrusions 318, and the insertion hole 321. The pair of grooves 319 are formed to extend in a direction perpendicular to an outer periphery of the main body 312 to have a predetermined length. The regions 317 are defined by the pairs of grooves 319. The engagement protrusions 318 are arranged in the regions 317, and are configured to be engaged with the engagement protrusions 217 formed on the distal end of the handgrip 213 of the imaging apparatus 200. The positioning protrusion 218 of the handgrip 213 is inserted into the insertion hole 321.
The engagement protrusions 318 are formed at positions corresponding to the engagement protrusions 217 formed on the handgrip 213 of the imaging apparatus 200 so as to protrude in a direction toward an inner surface of the main body 312. Thus, in this embodiment, four engagement protrusions 318-1, 318-2, 318-3, and 318-4 are arranged at positions corresponding to four engagement protrusions 217-1, 217-2, 217-3, and 217-4, respectively. Further, the insertion hole 321 is formed at a position corresponding to the positioning protrusion 218 formed on the handgrip 213 of the imaging apparatus 200. Thus, in this embodiment, one insertion hole 321 is formed at a position corresponding to the single positioning protrusion 218.
Next, when the imaging apparatus 200 is inserted so as to climb over the engagement protrusions 318 of the auxiliary member 300, the positioning protrusion 218, which is formed on the handgrip 213 of the imaging apparatus 200 so as to protrude outwardly therefrom, is inserted into the insertion hole 321 of the auxiliary member 300. Then, the positioning protrusion 218 of the imaging apparatus 200 is brought into abutment against a wall surface of the insertion hole 321 formed on the distal end 314 side. As a result, further insertion of the imaging apparatus 200 in a direction toward the distal end 314 of the auxiliary member 300 is regulated.
Specifically, in this embodiment, the movement of the imaging apparatus 200 in the direction of insertion of the imaging apparatus 200 is regulated by the positioning protrusion 218 and the insertion hole 321 under a state in which the imaging apparatus 200 is fully inserted into the auxiliary member 300. The movement of the imaging apparatus 200 in a direction opposite to the above-mentioned direction, specifically, a direction of removal of the imaging apparatus 200 is regulated by the engagement protrusions 217 and the engagement protrusions 318.
6. Configuration of Light Source 212
The light sources 212-1 to 212-4 may be configured so as to be independently controllable. For example, when light is emitted from some of the light sources 212-1 to 212-4, the object image can include a shadow of influenza follicles each having a three-dimensional shape. As a result, the object image includes information about the three-dimensional shape of the influenza follicles. Thus, influenza follicles can be more accurately detected based on the determination algorithm to enable determination of a probability of influenza infection.
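As a minimal sketch of how images captured under individual light sources could be combined to emphasize such shading cues, consider the following. The file names are hypothetical placeholders for images captured with each light source turned on separately, the images are assumed to share the same resolution, and the per-pixel intensity range used here is only one simple way to visualize the shadow information; this disclosure does not limit how the shape information is exploited.

```python
import numpy as np
from PIL import Image

# Hypothetical inputs: one image captured with each of the light sources
# 212-1 to 212-4 turned on individually (file names are placeholders).
paths = ["led1.png", "led2.png", "led3.png", "led4.png"]
stack = np.stack(
    [np.asarray(Image.open(p).convert("L"), dtype=np.float32) for p in paths]
)

# A pixel whose brightness changes strongly with the illumination direction
# lies on a sloped or raised surface, so the per-pixel intensity range gives
# a crude cue to the three-dimensional relief of, for example, follicles.
shading_range = stack.max(axis=0) - stack.min(axis=0)
relief_map = (255 * shading_range / max(shading_range.max(), 1e-6)).astype(np.uint8)
Image.fromarray(relief_map).save("relief_map.png")
```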
Further, in this embodiment, the light sources 212-1 to 212-4 are provided on the proximal end 220 side of the main body 214. However, the light sources 212-1 to 212-4 may be provided to the distal end 221 of the main body 214 or to the main body 214 (inside the main body 214 or on the outer periphery of the main body 214).
In this embodiment, the diffuser plate 219 is used to prevent only a part of the oral cavity from being irradiated with the light emitted from the light source 212 and generate uniform light. Thus, as an example, micro lens arrays are formed on a surface of the diffuser plate 219 so that a lens-shaped diffuser plate having a suitable diffusion angle is used. In place of such a diffuser plate, a diffuser plate capable of diffusing light by other methods, such as a diffuser plate having a surface with randomly arranged micro concavities and convexities to achieve a light diffusing function, may be used. Further, the diffuser plate 219 may be formed integrally with the main body 214. Such a diffuser plate can be achieved by, for example, a method of forming micro concavities and convexities at the distal end portion of the main body 214.
Further, the diffuser plate 219 is provided on the distal end 221 side of the main body 214 in this embodiment. However, a position of the diffuser plate 219 is not limited to that described above. The diffuser plate 219 may be provided at any position between the light source 212 and the oral cavity being a target of irradiation. For example, the diffuser plate 219 may be provided to the distal end 221 of the main body 214 or to the main body 214 (inside of the main body 214 or on the outer periphery of the main body 214).
As described above, the imaging apparatus 200 according to this embodiment can be inserted into the oral cavity of the subject person to image images. Thus, in this embodiment, the likelihood of a disease can be appropriately determined by processing the object images obtained by the imaging.
In the embodiment described above, description has been given of the case in which the processing device 100 and the imaging apparatus 200 are connected through a wired connection cable so as to be communicable with each other. However, the connection is not limited to the wired connection. The processing device 100 and the imaging apparatus 200 may be connected through wireless communication. In this case, for example, the object image imaged by the imaging apparatus 200 is transmitted to a remotely installed server device. When the server device is controlled to function as a processing device, the likelihood of a disease can be determined. As another example, the object image is transmitted to a remotely installed terminal device for a doctor. When the terminal device is controlled to function as a processing device, the determination of a disease can be accompanied by doctor's findings.
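A minimal sketch of such a transmission, using a plain HTTPS upload to a remotely installed server, is shown below. The URL, field names, and file name are hypothetical placeholders; this disclosure does not limit the transfer protocol or message format used between the imaging apparatus side and the server device.

```python
import requests

# The URL and the field names are hypothetical placeholders.
SERVER_URL = "https://example.com/api/object-images"


def upload_object_image(image_path: str, subject_id: str) -> dict:
    """Send one object image to a remote processing server and return the
    determination result (e.g. likelihood of influenza) it replies with."""
    with open(image_path, "rb") as f:
        response = requests.post(
            SERVER_URL,
            files={"image": f},
            data={"subject_id": subject_id},
            timeout=30,
        )
    response.raise_for_status()
    return response.json()


# "pharynx_0001.jpg" and "subject-700" are illustrative values only.
result = upload_object_image("pharynx_0001.jpg", "subject-700")
print(result)
```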
Further, in the embodiment described above, description has been given of the case in which the processing device 100 and the imaging apparatus 200 are connected through a wired connection cable so as to be communicable with each other. However, the configurations of the processing device 100 and the imaging apparatus 200 are not limited to those described above. The imaging apparatus 200 itself may have a determination function of the processing device 100 so as to determine the likelihood of a disease. In this case, a result of the determination may be, for example, output from an output unit of the imaging apparatus 200 or output to a terminal device of the user, which is connected to the imaging apparatus 200 in a wired or wireless manner.
Further, in the embodiment described above, description has been given of the case in which the imaging apparatus 200 and the auxiliary member 300 are formed separately. However, the configurations of the imaging apparatus 200 and the auxiliary member 300 are not limited to those described above, and the imaging apparatus 200 and the auxiliary member 300 may be formed integrally. For example, the imaging apparatus 200 may include a tongue depressor for restricting the movement of the tongue inside the oral cavity. The tongue depressor is provided to a surface of the main body 214, which is closer to the tongue, so as to be arranged on a side closer to the tongue in the oral cavity. In this manner, images can be imaged without using the auxiliary member 300.
This application is a U.S. National Phase Application under 35 U.S.C. 371 of International Application No. PCT/JP2020/040329, filed on Oct. 27, 2020, which claims priority to Japanese Patent Application No. 2019-198736, filed on Oct. 31, 2019. The entire disclosures of the above applications are expressly incorporated by reference herein.