The disclosure relates to a technique for clinical sign detection.
In modern medicine, inspection and observation are the first steps of a clinical examination by healthcare professionals. General observation begins at the first encounter between a patient or examinee and the healthcare professional and continues throughout the physical examination.
A method and an imaging system for clinical sign detection are disclosed.
According to one of the exemplary embodiments, the method is applied to an imaging system having an RGB image sensor and a processing device. The method includes the following steps. An image of a patient or examinee is captured by the RGB image sensor to generate an RGB image. Clinical signs of the patient or examinee are detected by the processing device based on the RGB image.
According to one of the exemplary embodiments, the imaging system includes an RGB image sensor and a processing device having a memory and a processor. The RGB image sensor is configured to capture an image of a patient or examinee to generate an RGB image. The memory is configured to store data. The processor is configured to detect clinical signs of the patient or examinee based on the RGB image.
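As a rough illustration of the capture-and-detect flow described above (a sketch only, not the disclosed implementation), the following outlines how a frame from an RGB sensor might be handed to a detector. All function names are hypothetical, and the placeholder "detector" merely computes color statistics where a trained model would run:

```python
import numpy as np

def capture_rgb_image(camera_index: int = 0) -> np.ndarray:
    """Capture one frame from an RGB image sensor (e.g., a webcam).

    Uses OpenCV when a physical camera is present; the import is
    deferred so the rest of the sketch runs without a camera.
    """
    import cv2  # opencv-python, assumed available on the device
    cap = cv2.VideoCapture(camera_index)
    ok, frame_bgr = cap.read()
    cap.release()
    if not ok:
        raise RuntimeError("RGB image sensor returned no frame")
    # OpenCV delivers BGR channel order; convert to RGB downstream.
    return cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)

def detect_clinical_signs(rgb_image: np.ndarray) -> dict:
    """Placeholder detector: derives simple per-channel statistics
    that a trained model (e.g., a DNN) could map to clinical signs."""
    mean_r, mean_g, mean_b = rgb_image.reshape(-1, 3).mean(axis=0)
    # A real system would run a trained classifier here; this toy
    # heuristic only illustrates the data flow from image to signs.
    return {"mean_rgb": (float(mean_r), float(mean_g), float(mean_b))}
```

In an all-in-one device such as a smartphone, both functions would run on the same processor; in a distributed embodiment, the capture step would run near the sensor and the detection step on the connected processing device.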
In order to make the aforementioned features and advantages of the disclosure comprehensible, some embodiments accompanied with figures are described in detail below. It is to be understood that both the foregoing general description and the following detailed description are exemplary, and are intended to provide further explanation of the disclosure as claimed.
It should be understood, however, that this summary may not contain all the aspects and embodiments of the disclosure and is therefore not meant to be limiting or restrictive in any manner. Also, the disclosure would include improvements and modifications which are obvious to one skilled in the art.
The accompanying drawings are included to provide a further understanding of the disclosure, and are incorporated in and constitute a part of this specification. The drawings illustrate embodiments of the disclosure and, together with the description, serve to explain the principles of the disclosure.
Advances in digital health enable a paradigm shift from precision medicine to precision health. It is desirable and beneficial for a user to monitor his or her wellness and health condition whenever possible.
As described herein, inspections, observations, and examinations are within the scope of the present disclosure and may be implemented through the use of the methods and systems disclosed herein. The conditions of an examinee, including development, nutritional status, body figure, physical status, mentality, facial expression, position, and posture, can be evaluated. Goal-directed inspection of specific body parts provides crucial information and signs that help clinical reasoning and diagnosis.
The inspection usually starts from the examination of the head, eyes, ears, nose, throat/mouth, and neck (the "HEENT examination"). The hands, extremities, and skin also provide important clues for evaluating underlying chronic illnesses or acute conditions. The external manifestations of certain illnesses can be visualized by a detailed inspection, even without specific equipment. The information obtained through inspection may be thoughtfully integrated with the patient's medical history and current health condition.
The observation includes the identification of body landmarks and the assessment of their size, relative location to other landmarks, shape, position, alignment, color, symmetry, and unusual features. A longer visual observation helps to detect movement problems and the respiratory pattern of examinees.
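As one concrete illustration of the symmetry assessment mentioned above, a left-right asymmetry score can be computed on an aligned frontal image by comparing one half with a mirror of the other. This toy metric is an assumption for illustration, not the disclosed method, and presumes the image is already centered on the body part of interest:

```python
import numpy as np

def left_right_asymmetry(rgb_image: np.ndarray) -> float:
    """Mean absolute difference between the left half and the
    mirrored right half, normalized to [0, 1].

    0.0 indicates a perfectly symmetric (centered) image; larger
    values indicate stronger left-right asymmetry.
    """
    h, w, _ = rgb_image.shape
    half = w // 2
    left = rgb_image[:, :half].astype(np.float64)
    # Take the rightmost `half` columns and flip them horizontally.
    right_mirrored = rgb_image[:, w - half:][:, ::-1].astype(np.float64)
    return float(np.abs(left - right_mirrored).mean() / 255.0)
```

A real system would first align and segment the body part (e.g., a face) before scoring symmetry; this sketch shows only the comparison step.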
In modern medicine, inspection and observation are the first step of a clinical examination, which usually starts from the HEENT examination. However, the inspection is subjective and depends on a physician's experience, usually requiring years of training and clinical practice. In addition, self-monitoring is important for examinees with acute illnesses or chronic diseases, and in regions or situations with limited access to health care.
In the disclosure, an imaging system and a method for objective and effective self-inspection are proposed to detect informative clinical signs that reveal early, subtle, but critical information related to an examinee's health condition. The information would help healthcare providers or users identify abnormal physical signs related to health conditions or certain diseases, and could reveal new physical signs before the onset of diseases.
Some embodiments of the disclosure will now be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all embodiments of the application are shown. Indeed, various embodiments of the disclosure may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will satisfy applicable legal requirements. Like reference numerals refer to like elements throughout. Also, it should be noted that the terms “patient” and “examinee” are used interchangeably throughout the disclosure.
In some embodiments, the processor 124 would be configured to perform clinical sign detection and may be one or more of a North Bridge, a South Bridge, a field programmable gate array (FPGA), a programmable logic device (PLD), an application specific integrated circuit (ASIC), or other similar device or a combination thereof. In some embodiments, the processor 124 may also be a central processing unit (CPU), a programmable general purpose or special purpose microprocessor, a digital signal processor (DSP), a graphics processing unit (GPU), other similar devices, integrated circuits, or a combination thereof.
In one exemplary embodiment, the RGB image sensor 110 and the processing device 120 may be integrated in the imaging system 100 as an all-in-one device such as a smart phone, a tablet computer, and so forth. In another exemplary embodiment, the processing device 120 may be a desktop computer, a laptop computer, a server computer, a tablet computer, a work station, a cloud storage and computation device, or a computer system or platform that can be connected to the RGB image sensor 110 by wire or wirelessly through a communication interface. In some embodiments, the communication interface may be a transmission interface compatible with any wired connection or wireless communication standard for transmitting data to and from other devices.
It should also be noted that, in some embodiments, color calibration may be a pre-processing step before clinical sign detection that is performed on the RGB image.
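One common choice for such a color-calibration pre-processing step is gray-world white balancing: assume the average scene color should be a neutral gray and rescale each channel accordingly. This particular algorithm is an illustrative assumption; the disclosure does not specify which calibration method is used:

```python
import numpy as np

def gray_world_balance(rgb_image: np.ndarray) -> np.ndarray:
    """Rescale the R, G, and B channels so their means are equal.

    Under the gray-world assumption this compensates for a color
    cast introduced by the lighting or the sensor, which matters
    when downstream detection relies on skin or mucosal color.
    """
    img = rgb_image.astype(np.float64)
    channel_means = img.reshape(-1, 3).mean(axis=0)
    gray = channel_means.mean()
    # Scale each channel so its mean moves to the overall gray level.
    balanced = img * (gray / channel_means)
    return np.clip(balanced, 0, 255).astype(np.uint8)
```

More rigorous embodiments could instead calibrate against a physical color-reference chart placed in the scene, which yields absolute rather than relative color correction.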
In addition, Table 4 illustrates a collection of the clinical signs and medical conditions described in Tables 1-3. This list is termed the Clinical Sign Representation in Medical Records (CSRMR), a novel representation of clinical signs in text-based medical records that can be reviewed for each patient. With the CSRMR translating image-based clinical signs into text-based medical records, it becomes feasible to apply DNNs or other machine learning approaches for recommended examinations and predicted outcomes.
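To make the image-to-text idea concrete, detected signs could be serialized into a structured text record in the spirit of the CSRMR. The field names and example values below are hypothetical illustrations, not the actual CSRMR schema from Table 4:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class ClinicalSignEntry:
    body_part: str      # e.g., "conjunctiva", "tongue"
    sign: str           # e.g., "pallor", "cyanosis"
    confidence: float   # detector confidence in [0, 1]

def to_csrmr_text(patient_id: str, entries: list) -> str:
    """Serialize detected clinical signs into a JSON text record that
    can be appended to a medical record and later consumed by a DNN
    or other machine-learning model for examination recommendation
    and outcome prediction."""
    record = {
        "patient_id": patient_id,
        "clinical_signs": [asdict(e) for e in entries],
    }
    return json.dumps(record, sort_keys=True)
```

Because the output is plain text, such records could be stored alongside conventional written findings and reviewed serially for each patient.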
As illustrated above, the imaging system may be a mono-modality optical device such as a consumer-grade RGB camera on a smartphone or tablet, or may be a more comprehensive imaging apparatus with additional modalities such as a thermal image sensor and/or a hyperspectral image sensor in accordance with some embodiments. Clinical signs may be obtained from the general appearance of a patient's head, eyes, ears, nose, throat/mouth, tongue, neck, hands, extremities, skin, and any body part that could provide important clues for evaluating underlying medical conditions. The former system could potentially cover the list of clinical signs demonstrated in Table 1.
In utilization, the systems and devices disclosed herein may be widely used by individuals at home before visiting clinics and hospitals, and at senior centers and nursing homes for telemedicine. Further, they can be used by physicians to record the inspection with images (rather than written text), which is more accurate and transferrable and enables serial comparison. The latter system is more comprehensive and could detect the additional clinical signs listed in Table 2 and Table 3. As a more sophisticated and expensive system in some embodiments, it may be used in medical centers and teaching hospitals to provide critical information for precision health and precision medicine. Compared to prior studies, the system disclosed herein could detect a wider and more comprehensive collection of clinical signs based on both images and videos, as well as machine learning analytics such as DNN-based computational models for further recommended examinations and outcome predictions.
In operation, an image of a patient is captured by the RGB image sensor to generate an RGB image, and clinical signs of the patient are detected by the processing device based on the RGB image.
In view of the aforementioned descriptions, the imaging system and method may be used to detect informative clinical signs based on the HEENT examination and any body part that may reveal subtle and critical information related to a patient's health condition, enabling the patient to understand his or her health condition and discover diseases at an early stage in a non-invasive and convenient manner.
No element, act, or instruction used in the detailed description of disclosed embodiments of the present application should be construed as absolutely critical or essential to the present disclosure unless explicitly described as such. Also, as used herein, each of the indefinite articles "a" and "an" could include more than one item. If only one item is intended, the terms "a single" or similar language would be used. Furthermore, the term "any of" followed by a listing of a plurality of items and/or a plurality of categories of items, as used herein, is intended to include "any of", "any combination of", "any multiple of", and/or "any combination of multiples of" the items and/or the categories of items, individually or in conjunction with other items and/or other categories of items. Further, as used herein, the term "set" is intended to include any number of items, including zero. Further, as used herein, the term "number" is intended to include any number, including zero.
It will be apparent to those skilled in the art that various modifications and variations can be made to the structure of the disclosed embodiments without departing from the scope or spirit of the disclosure. In view of the foregoing, it is intended that the disclosure cover modifications and variations of this disclosure provided they fall within the scope of the following claims and their equivalents.
This application claims the priority benefit of U.S. provisional application Ser. No. 62/872,695, filed on Jul. 10, 2019. The entirety of the above-mentioned patent application is hereby incorporated by reference herein and made a part of this specification.
Published as US 20210007606 A1, Jan. 2021, US.