Ptosis describes sagging or prolapse of an organ or body part, including a droopy upper eyelid, a condition also known as blepharoptosis or blepharochalasis. For medical or aesthetic reasons it may be desirable to treat a patient having a droopy upper eyelid. Medical reasons include impaired vision; a purely aesthetically unpleasing droopy eyelid, by contrast, does not impair the patient's vision.
The severity of eyelid prolapse determines whether corrective surgery is classified as medical or aesthetic, a classification that has consequences for insurance coverage and logistics.
Traditionally, this distinction has been made through a visual evaluation of the patient's droopy eyelid.
The subject matter regarded as the invention is particularly pointed out and distinctly claimed in the concluding portion of the specification. The invention is best understood in view of the accompanying drawings in which:
It will be appreciated that for the sake of clarity, elements shown in the figures may not be drawn to scale and reference numerals may be repeated in different figures to indicate corresponding or analogous elements.
In the following description, certain details are set forth to facilitate understanding; however, it should be understood by those skilled in the art that the present invention may be practiced without these specific details. Furthermore, well-known methods, procedures, and components have been omitted so as not to obscure the invention.
Visual characterization of droopy eyelids is subjective in nature and often inaccurate. Therefore, there is a need for a system and method for objective and accurate differentiation between an aesthetic and a vision-impairing droopy eyelid.
Embodiments pertain to a droopy upper eyelid evaluation system operative to differentiate between aesthetic and vision-impairing ptosis by identifying, for a subject (also: patient), a degree of unaided prolapse of the upper eyelid, based on facial image data of the patient. In some embodiments, the droopy upper eyelid evaluation system is configured and/or operable to automatically or semi-automatically evaluate or characterize, based on facial image data of the patient, a droopy eyelid condition for one or both eyes of the patient, simultaneously or separately.
In some embodiments, the evaluation system is operable to determine, based on facial image data of the patient, whether the subject is making attempts to malinger a droopy eyelid in general and, optionally, a vision-impairing droopy eyelid in particular. In some examples, the evaluation system is operable to determine, based on facial image data of the patient, whether the subject is making attempts to forcefully exaggerate a pre-existing, merely aesthetic droopy eyelid so that it appears, at the time of the patient evaluation, to be a vision-impairing droopy eyelid.
In some examples, the evaluation system may employ a rule-based engine and/or artificial intelligence functionalities which are based, for example, on a machine learning model. The machine learning model may be trained on a multiplicity of facial image data of patients. The rule-based engine and/or the machine learning model are configured to determine whether a patient is malingering a droopy eyelid or not. In some embodiments, the rule-based engine and/or the machine learning model are configured to distinguish, based on the patient's facial image data, between malingered and non-malingered vision-impairing droopy eyelids. The patient's image data may be a sequence of images captured in a video recording session.
In some embodiments, a rule-based engine may be employed for determining whether the patient's droopy eyelid condition is aesthetical in nature or vision-impairing. A machine learning algorithm may then be employed for characterizing the vision-impairing droopy eyelid condition, for example, as malingered or not.
In some embodiments, a machine learning algorithm may first be employed to determine whether the patient is making attempts to malinger a droopy eyelid condition or not. After the evaluation system has determined that the patient is not making attempts to malinger a droopy eyelid condition, the evaluation system employs a rule-based engine for determining whether a detected droopy eyelid condition is vision-impairing or not (i.e., merely aesthetical in nature).
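By way of a non-limiting illustration, the following Python sketch shows one possible realization of this two-stage ordering. The feature extraction and the trained `malingering_model` are placeholders assumed for the sketch, as is the 0.5 probability cut-off; the 2 mm MRD1 threshold corresponds to the rule-based criterion discussed further below.

```python
MRD1_THRESHOLD_MM = 2.0  # rule-based cut-off; see the MRD1 discussion below

def evaluate_droopy_eyelid(sequence_features, mrd1_mm, malingering_model):
    """Two-stage characterization of a patient's droopy eyelid condition.

    `sequence_features` is a feature vector extracted from the facial
    video sequence and `mrd1_mm` a measured Marginal Reflex Distance 1;
    both extraction steps are placeholders in this sketch.
    """
    # Stage 1: the machine learning model screens for malingering attempts.
    p_malinger = malingering_model.predict_proba([sequence_features])[0][1]
    if p_malinger > 0.5:
        return "suspected malingering"
    # Stage 2: the rule-based engine classifies the trusted measurement.
    if mrd1_mm <= MRD1_THRESHOLD_MM:
        return "vision-impairing droopy eyelid"
    return "aesthetic (non-vision-impairing) droopy eyelid"
```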
Turning now to the figures,
In some embodiments, one or more criteria may pertain to (e.g., geometric) facial features of a patient such as, for example, a distance between a center of a patient's pupil and, for the same eye, a feature of the patient's upper eyelid including, for example, the lower central edge of the patient's upper eyelid.
It is noted that characterizing (e.g., classifying) a droopy upper eyelid may also encompass characterizing whether the droopy upper eyelid is more likely vision impairing than not. For example, the system may be operable to determine a probability of a vision-impairing droopy eyelid. In a further example, the system may be operable to determine a probability of an obstructive or non-obstructive upper eyelid.
In some embodiments, grid 50 overlays the images. Grid 50 facilitates machine (e.g., automated) detection of various degrees of eyelid prolapse that could be indicative of patient malingering, since the degree of prolapse is expected to remain within a certain range, or remain constant, within the time frame needed to capture a series of images, for example, under different light settings. It should be noted that, in certain embodiments, grid 50 is not displayed but is implemented as an internal coordinate system providing a basis of reference for tracking the degree of eyelid prolapse, for instance, over a certain period of time.
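By way of a non-limiting illustration, the sketch below implements such a consistency check: the degree of prolapse (expressed here as MRD1 in millimetres) measured across a series of frames is expected to stay within a narrow band, and a larger spread may hint at deliberate exaggeration. The 0.5 mm tolerance is an assumption made for the example.

```python
import numpy as np

def prolapse_is_consistent(mrd1_per_frame_mm, tolerance_mm=0.5):
    """True if MRD1 stays within `tolerance_mm` of its median across frames."""
    measurements = np.asarray(mrd1_per_frame_mm, dtype=float)
    spread = np.abs(measurements - np.median(measurements)).max()
    return spread <= tolerance_mm

print(prolapse_is_consistent([1.8, 1.9, 1.7, 1.8]))  # True  -> plausible
print(prolapse_is_consistent([1.8, 3.4, 0.9, 2.6]))  # False -> suspicious
```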
In some embodiments, additional facial features may be captured by a camera and processed to determine, for example, whether the patient is trying to exaggerate a droopy upper eyelid to malinger vision impairment. Such facial features can pertain, for example, to a comparison with the patient's other eye, facial expressions, and/or movements of the patient's mouth, eyebrows, cheekbones and/or forehead.
For instance, patient malingering (or lack thereof) may for example be detected by an evaluation system based on artificial intelligence functionalities which are based on a machine learning model (e.g., an artificial neural network and/or other deep learning machine learning models; regression-based analysis; a decision tree; and/or the like), and/or by a rule-based engine. For example, the evaluation system may be configured to analyze image data descriptive of a patient's facial muscle features, muscle activation, facial expressions, and/or the like, and provide an output indicating whether the patient is malingering a vision-impairing droopy eyelid, or not. For example, a machine learning model may be trained with images of video sequences of facial expressions, labelled by an expert either as “malingering” or “not malingering”.
In some examples, a droopy eyelid may be classified by comparing features of one eye with features of the other eye of the same patient, e.g., by analyzing the patient's facial muscle features, muscle activation, facial expressions, and/or the like.
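By way of a non-limiting illustration, the sketch below trains a simple scikit-learn classifier on expert-labelled, per-sequence feature vectors. The synthetic features and labels merely stand in for real annotated video data; an inter-eye MRD1 difference, eyebrow elevation and a forehead-muscle activation score are examples of features that could occupy the three columns, and are assumptions made for the example.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))    # placeholder per-sequence feature vectors
y = (X[:, 0] > 0.4).astype(int)  # placeholder expert labels (1 = malingering)

model = LogisticRegression().fit(X, y)

new_sequence = np.array([[0.9, -0.2, 0.1]])  # features of an unseen sequence
print("P(malingering) =", model.predict_proba(new_sequence)[0, 1])
```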
In some embodiments, a criterion for characterizing (e.g., classifying) a droopy upper eyelid as vision impairing or as not vision-obstructive may be based on measuring a distance D between a center C of pupil 20 and a feature of eyelid 10 such as, for example, lower central edge of the upper eyelid 12. This distance may herein also be referred to as Marginal Reflex Distance Test 1 or MRD1. The position of center C of pupil 20 in a captured image frame may be determined based on light reflected from the pupil. Merely for the sake of clarity, the distances D(A) and D(B) are respectively depicted in
In some examples, if MRD1 is 2 mm or less, the droopy upper eyelid may be characterized as vision-impairing, justifying, e.g., coverage and/or reimbursement of the costs of a medical procedure to treat the vision-impairing droopy eyelid, for example through corrective surgery. Otherwise, the droopy upper eyelid may be characterized as a (purely) aesthetic problem, not necessarily justifying coverage and/or reimbursement of the costs of a medical procedure for treatment thereof.
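A minimal sketch of this MRD1 rule follows. The pixel coordinates and the pixel-to-millimetre scale (which in practice would come from camera calibration) are assumptions made for the example.

```python
import math

def mrd1_mm(pupil_center_px, eyelid_edge_px, mm_per_px):
    """Distance D between pupil centre C and the lower central eyelid edge."""
    dx = eyelid_edge_px[0] - pupil_center_px[0]
    dy = eyelid_edge_px[1] - pupil_center_px[1]
    return math.hypot(dx, dy) * mm_per_px

def classify_mrd1(mrd1):
    return "vision-impairing" if mrd1 <= 2.0 else "aesthetic"

d = mrd1_mm(pupil_center_px=(320, 240), eyelid_edge_px=(320, 215), mm_per_px=0.06)
print(f"MRD1 = {d:.2f} mm -> {classify_mrd1(d)}")  # MRD1 = 1.50 mm -> vision-impairing
```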
Optionally, a criterion may also relate to a geometric feature of the imaged pupil 20. The geometric feature may include, for example, a contour of the portion of pupil 20 that is visually non-impaired; the pupil area that is visible in the image; the entire pupil area, diameter and/or radius when not occluded by the patient's eyelid; pupil diameter; pupil curvature; and/or the like.
In some embodiments, the droopy upper eyelid evaluation system may be adapted to determine parameter values of a geometric feature of a pupil even if the pupil is not fully visible in the captured image. For example, the droopy upper eyelid evaluation system may complement parameter values of non-visible geometric features, e.g., based on geometric features of the pupil that are visible. For instance, the entire pupil area may be determined based on the partially visible portion of the pupil.
Optionally, data descriptive of a geometric reference object relating to, for example, the entire pupil area may be generated. The geometric reference object may be used as reference for droopy upper eyelid characterization (e.g., classification). The geometric reference object may be a circular object indicating the contour of the entire pupil area. Characteristics of the circular object may be compared against characteristics of the visible pupil portion for differentiating between (purely) aesthetic and vision impairing droopy upper eyelid.
In some embodiments, different imaging parameter values may be selected for capturing a patient's region of interest (ROI). For example, images of a facial ROI of the patient may be captured (e.g., through video) in the visible wavelength range and/or in the infrared wavelength range to generate one or more frames of facial image data for conducting droopy upper eyelid characterization (e.g., classification). The frames may be captured from different distances, fields of view (FOVs) and viewing angles, at different imaging resolutions, under different light conditions, etc.
In some embodiments, the ROI may not only include the patient's eye or eyes, but also additional portions of the patient's face such as the forehead, nose, cheek, etc., for example, to capture and analyze the patient's facial muscle movement. Capturing images of the patient's face may facilitate determining whether a patient attempts to malinger or fake a vision-impairing droopy eyelid, or not. In some examples, the ROI may also include non-facial portions, for instance, to capture a patient's body posture, which may also provide an indication of whether a patient attempts to malinger a vision-impairing droopy eyelid or not.
In some embodiments, imaging parameter values may be standardized to ensure that droopy upper eyelid characterization (e.g., classification) is performed in a standardized manner for any patient.
In some embodiments, facial image data may be processed and/or analyzed with a variety of processing and/or analysis techniques including, for example, edge detection, high-pass filtering, low-pass filtering, deblurring, and/or the like.
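By way of a non-limiting illustration, the sketch below chains two of the named techniques using OpenCV: low-pass filtering to suppress sensor noise, followed by edge detection to delineate the eyelid margin. The file name and filter parameters are assumptions made for the example.

```python
import cv2

image = cv2.imread("patient_roi.png", cv2.IMREAD_GRAYSCALE)  # assumed input frame
assert image is not None, "expects a captured ROI frame on disk"

blurred = cv2.GaussianBlur(image, (5, 5), 0)               # low-pass filtering
edges = cv2.Canny(blurred, threshold1=50, threshold2=150)  # edge detection
cv2.imwrite("eyelid_edges.png", edges)
```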
In some embodiments, the patient profile may be used to search population data (e.g., inter-subject measurement data) of subjects having profiles similar to the patient's, to determine the likelihood of vision-impairing droopy upper eyelid and/or patient malingering, on the basis of common prolapse rates found among data of a population.
In some embodiments, historic same-patient data (e.g., intra-subject measurement data) may be used to determine, for example, the likelihood of vision impairment and/or patient malingering.
In some embodiments, the droopy upper eyelid evaluation system may be operable to identify relevant demographic and/or health parameters conducive to evaluating whether an aesthetic prolapse will advance into a case of vision-impairing droopy eyelid or will remain vision non-obstructive. Optionally, artificial intelligence techniques may be employed for identifying such parameters.
In some embodiments, a droopy upper eyelid evaluation system 100 may provide a user of the system with indicators (e.g., visual and/or auditory) regarding a desired patient head orientation and, optionally, body posture, relative to camera 122 during the capturing of images of one or more facial features of the patient. For example, droopy upper eyelid evaluation system 100 may provide reference markings to indicate a desired yaw, pitch and/or roll orientation of the patient's head relative to camera 122. Capturing facial features at a desired head orientation may, for example, reduce, minimize or eliminate the probability of false positives (i.e., erroneous determinations that the droopy eyelid is vision impairing) and/or of false negatives (erroneous determinations that the droopy eyelid is not vision impairing).
In some embodiments, face detection or facial feature detection algorithms may be employed for characterizing (e.g., classifying) a droopy upper eyelid.
Communication module 113 may, for example, include I/O device drivers (not shown) and network interface drivers (not shown) for enabling the transmission and/or reception of data over a network. A device driver may, for example, interface with a keypad or a USB port. A network interface driver may, for example, execute protocols for the Internet, or an intranet, Wide Area Network (WAN), Local Area Network (LAN) (employing, e.g., Wireless Local Area Network (WLAN)), Metropolitan Area Network (MAN), Personal Area Network (PAN), extranet, 2G, 3G, 3.5G, 4G, 5G, 6G mobile networks, 3GPP, LTE, LTE advanced, Bluetooth® (e.g., Bluetooth smart), ZigBee™, near-field communication (NFC) and/or any other current or future communication network, standard, and/or system. Evaluation system 100 may further include a power module 130 configured to power the various components of the system. Power module 130 may comprise an internal power supply (e.g., a rechargeable battery) and/or an interface for allowing connection to an external power supply.
As shown, in step 410 the system captures one or more images of a patient's face including at least one eye of the patient together with its droopy upper eyelid.
In step 420, the system identifies eye-related and, optionally, additional facial features. For example, the system identifies the iris 30 and pupil 20 as shown in
In step 430, the system identifies (e.g., calculates) one or more geometric features of the patient's eye(s). Such features include, inter alia, pupil diameter, pupil area, pupil curvature, center C of pupil 20 and/or a feature of eyelid 10 such as, for example, lower central edge of the upper eyelid 12, pupillary distance, and/or the like.
In step 440, the system analyzes a geometric feature of the eye.
In step 450, the system determines, based on the analysis, whether the droopy upper eyelid is vision impairing or not.
In some embodiments, step 440 of analyzing a geometric feature of the eye may comprise determining a distance D between a center C of pupil 20 and a feature of eyelid 10 such as, for example, lower central edge of the upper eyelid 12. As mentioned herein, the distance D may also be referred to as Marginal Reflex Distance Test 1 or MRD1. The position of center C of pupil 20 in a captured image frame may be determined based on light reflected from the pupil. Merely for the sake of clarity, the distances D(A) and D(B) are respectively depicted in
Further referring to
The method may then include, for example, calculating the area of the reference circle (step 444) and determining the difference between the area of the reference circle and the area of the visible part of the imaged pupil (step 446).
If the difference exceeds a vision-impairment threshold value, the droopy upper eyelid is characterized as vision-impairing (step 448). If the difference does not exceed the vision-impairment threshold value, the patient's droopy upper eyelid is characterized as not vision-impairing (step 449). Droopy upper eyelid characterization may then be output (step 450).
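The following sketch illustrates one possible realization of steps 444 through 449, together with the generation of the reference circle by a least-squares ("Kåsa") fit to the visible pupil contour. The 25% area-deficit threshold is an assumption made for the example.

```python
import numpy as np

def fit_circle(points):
    """Least-squares circle through 2-D contour points; returns (cx, cy, r)."""
    pts = np.asarray(points, dtype=float)
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    b = (pts ** 2).sum(axis=1)       # solves x^2 + y^2 = a*x + b*y + c
    a0, a1, a2 = np.linalg.lstsq(A, b, rcond=None)[0]
    cx, cy = a0 / 2.0, a1 / 2.0
    r = np.sqrt(a2 + cx ** 2 + cy ** 2)
    return cx, cy, r

def characterize(visible_pupil_area_px2, contour_points, threshold=0.25):
    _, _, r = fit_circle(contour_points)  # reference circle from visible contour
    full_area = np.pi * r ** 2            # step 444: area of the reference circle
    deficit = (full_area - visible_pupil_area_px2) / full_area  # step 446
    # steps 448/449: threshold the difference
    return "vision-impairing" if deficit > threshold else "not vision-impairing"
```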
In step 450, the droopy upper eyelid characterization may be output through an output device such as a printer, display or speaker, or to another computer in communication with the system.
Additional reference is made to
As shown, in step 510 an image of a droopy upper eyelid is captured, e.g., together with the iris and the pupil. As noted above, the images may be captured under the same lighting conditions and from the same angle of image capture.
In step 520, the system identifies facial components such as, for example, iris 30 and pupil 20, of
In step 530, the distance between the corneal light reflex at the pupillary center and the margin of the upper eyelid is automatically measured as a function of time, for example, continuously (e.g., by imagers comprised in glasses worn by the patient), or at regular or irregular intervals, e.g., once or several times a day, once a week, once a month, or once a year, all in accordance with patient needs. The measurements may be used, e.g., to determine a statistical parameter value for evaluating whether the patient is malingering a droopy eyelid or not, e.g., by determining a deviation between measurements, and/or to determine a trend (also: disease progress) of the patient's droopy eyelid condition.
In step 540, a prolapse rate of the droopy upper eyelid is determined (e.g., calculated) on the basis of at least two images, each captured at a different time.
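A minimal sketch of this rate calculation follows, assuming MRD1 (in millimetres) as the measure of prolapse.

```python
from datetime import date

def prolapse_rate_mm_per_year(mrd1_early, date_early, mrd1_late, date_late):
    """Prolapse rate from two timestamped MRD1 measurements (mm/year)."""
    years = (date_late - date_early).days / 365.25
    return (mrd1_early - mrd1_late) / years  # positive = eyelid descending

rate = prolapse_rate_mm_per_year(3.1, date(2019, 6, 1), 2.4, date(2021, 6, 1))
print(f"{rate:.2f} mm/year")  # ~0.35 mm/year
```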
In step 550, the system identifies a prolapse rate indicative of future vision impairment within a population. For example, the system determines a statistical likelihood of the prolapse advancing into a state of vision-impairing prolapse, for example, by searching a database of droopy upper eyelid sufferers for those having a history of a similar prolapse rate that advanced to a vision-impairing stage, and/or the patient's own prolapse rate may serve as a reference. In some examples, the system determines a statistical likelihood of future vision impairment based on the patient data. Optionally, in certain embodiments, additional demographic and/or health data are employed to refine the search.
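By way of a non-limiting illustration, the sketch below estimates such a likelihood from a hypothetical population database; both the database rows and the similarity window are assumptions made for the example.

```python
import numpy as np

# Columns: (prolapse rate in mm/year, became vision-impairing: 1/0)
population = np.array([
    [0.10, 0], [0.15, 0], [0.30, 1], [0.35, 1], [0.40, 1], [0.20, 0],
])

def likelihood_of_impairment(patient_rate, window=0.1):
    """Fraction of patients with a similar rate who advanced to impairment."""
    similar = population[np.abs(population[:, 0] - patient_rate) <= window]
    return similar[:, 1].mean() if len(similar) else float("nan")

print(likelihood_of_impairment(0.35))  # 1.0 for this toy database
```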
In some embodiments, machine learning techniques employing, for example, Bayesian networks, artificial neural networks and/or other techniques providing such functionality are employed to identify relevant parameters associated with vision impairment, and these parameters are used in the search.
In step 560, a present droopy eyelid is characterized, e.g., it is determined whether it has become vision-impairing or not, e.g., by implementing the steps outlined with respect to step 440.
In some embodiments, a series of images is captured in each of a variety of lighting conditions. The different lighting conditions compel a patient to open the eyes widely in low-intensity lighting and to squint in high-intensity lighting. The variable light conditions make it more difficult for a patient to exaggerate eyelid prolapse.
The term “processor”, as used herein, may additionally or alternatively refer to a controller. A processor may be implemented by various types of processor devices and/or processor architectures including, for example, embedded processors, communication processors, graphics processing unit (GPU)-accelerated computing, soft-core processors and/or general purpose processors.
According to some embodiments, memory 112 may include one or more types of computer-readable storage media. Memory 112 may include transactional memory and/or long-term storage memory facilities and may function as file storage, document storage, program storage, or as a working memory. The latter may, for example, be in the form of a static random access memory (SRAM), dynamic random access memory (DRAM), read-only memory (ROM), cache and/or flash memory. As a working memory, memory 112 may, for example, include temporally-based and/or non-temporally-based instructions. As long-term memory, memory 112 may, for example, include a volatile or non-volatile computer storage medium, a hard disk drive, a solid state drive, a magnetic storage medium, a flash memory and/or other storage facility. A hardware memory facility may, for example, store a fixed information set (e.g., software code) including, but not limited to, a file, program, application, source code, object code, data, and/or the like.
It will be appreciated that separate modules and/or components can be allocated for each of the functions of evaluation system 100. However, for simplicity and without this being construed in a limiting manner, the description and claims may refer to a single module and/or component. For example, although processor 111 may be implemented by several processors, the following description will refer to processor 111 as the component that conducts all the necessary processing functions of system 100.
It is important to note that the methods described herein and illustrated in the accompanying diagrams shall not be construed in a limiting manner. For example, methods described herein may include additional or even fewer processes or operations in comparison to what is described herein and/or illustrated in the diagrams. In addition, method steps are not necessarily limited to the chronological order as illustrated and described herein.
Any digital computer system, unit, device, module and/or engine exemplified herein can be configured or otherwise programmed to implement a method disclosed herein, and to the extent that the system, module and/or engine is configured to implement such a method, it is within the scope and spirit of the disclosure. Once the system, module and/or engine are programmed to perform particular functions pursuant to computer readable and executable instructions from program software that implements a method disclosed herein, it in effect becomes a special purpose computer particular to embodiments of the method disclosed herein. The methods and/or processes disclosed herein may be implemented as a computer program product that may be tangibly embodied in an information carrier including, for example, in a non-transitory tangible computer-readable and/or non-transitory tangible machine-readable storage device. The computer program product may be directly loadable into an internal memory of a digital computer, comprising software code portions for performing the methods and/or processes as disclosed herein.
The methods and/or processes disclosed herein may be implemented as a computer program that may be intangibly embodied by a computer readable signal medium. A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a non-transitory computer or machine-readable storage device and that can communicate, propagate, or transport a program for use by or in connection with apparatuses, systems, platforms, methods, operations and/or processes discussed herein.
The terms “non-transitory computer-readable storage device” and “non-transitory machine-readable storage device” encompass distribution media, intermediate storage media, execution memory of a computer, and any other medium or device capable of storing for later reading by a computer program implementing embodiments of a method disclosed herein. A computer program product can be deployed to be executed on one computer or on multiple computers at one site or distributed across multiple sites and interconnected by one or more communication networks.
These computer readable and executable instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable and executable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable and executable instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The term “engine” may comprise one or more computer modules, wherein a module may be a self-contained hardware and/or software component that interfaces with a larger system. A module may comprise machine-executable instructions. A module may be embodied by a circuit or a controller programmed to cause the system to implement the method, process and/or operation as disclosed herein. For example, a module may be implemented as a hardware circuit comprising, e.g., custom VLSI circuits or gate arrays, an application-specific integrated circuit (ASIC), off-the-shelf semiconductors such as logic chips, transistors, and/or other discrete components. A module may also be implemented in programmable hardware devices such as field programmable gate arrays, programmable array logic, programmable logic devices and/or the like.
In the discussion, unless otherwise stated, adjectives such as “substantially” and “about” that modify a condition or relationship characteristic of a feature or features of an embodiment of the invention, are to be understood to mean that the condition or characteristic is defined to within tolerances that are acceptable for operation of the embodiment for an application for which it is intended.
Unless otherwise specified, the terms “substantially”, “about” and/or “close” with respect to a magnitude or a numerical value may refer to an inclusive range of −10% to +10% of the respective magnitude or value.
“Coupled with” can mean indirectly or directly “coupled with”.
It is important to note that the method is not limited to those diagrams or to the corresponding descriptions. For example, the method may include additional or even fewer processes or operations in comparison to what is described in the figures. In addition, embodiments of the method are not necessarily limited to the chronological order as illustrated and described herein.
Discussions herein utilizing terms such as, for example, “processing”, “computing”, “calculating”, “determining”, “establishing”, “analyzing”, “checking”, “estimating”, “deriving”, “selecting”, “inferring” or the like, may refer to operation(s) and/or process(es) of a computer, a computing platform, a computing system, or other electronic computing device, that manipulate and/or transform data represented as physical (e.g., electronic) quantities within the computer's registers and/or memories into other data similarly represented as physical quantities within the computer's registers and/or memories or other information storage medium that may store instructions to perform operations and/or processes. The term determining may, where applicable, also refer to “heuristically determining”.
It should be noted that where an embodiment refers to a condition of “above a threshold”, this should not be construed as excluding an embodiment referring to a condition of “equal or above a threshold”. Analogously, where an embodiment refers to a condition “below a threshold”, this should not be construed as excluding an embodiment referring to a condition “equal or below a threshold”. It is clear that should a condition be interpreted as being fulfilled if the value of a given parameter is above a threshold, then the same condition is considered as not being fulfilled if the value of the given parameter is equal or below the given threshold. Conversely, should a condition be interpreted as being fulfilled if the value of a given parameter is equal or above a threshold, then the same condition is considered as not being fulfilled if the value of the given parameter is below (and only below) the given threshold.
It should be understood that where the claims or specification refer to “a” or “an” element and/or feature, such reference is not to be construed as there being only one of that element. Hence, reference to “an element” or “at least one element” for instance may also encompass “one or more elements”.
Terms used in the singular shall also include the plural, except where expressly otherwise stated or where the context otherwise requires.
In the description and claims of the present application, each of the verbs “comprise”, “include” and “have”, and conjugates thereof, is used to indicate that the object or objects of the verb are not necessarily a complete listing of components, elements or parts of the subject or subjects of the verb.
Unless otherwise stated, the use of the expression “and/or” between the last two members of a list of options for selection indicates that a selection of one or more of the listed options is appropriate and may be made. Further, the use of the expression “and/or” may be used interchangeably with the expressions “at least one of the following”, “any one of the following” or “one or more of the following”, followed by a listing of the various options.
As used herein, the phrase “A,B,C, or any combination of the aforesaid” should be interpreted as meaning all of the following: (i) A or B or C or any combination of A, B, and C, (ii) at least one of A, B, and C; (iii) A, and/or B and/or C, and (iv) A, B and/or C. Where appropriate, the phrase A, B and/or C can be interpreted as meaning A, B or C. The phrase A, B or C should be interpreted as meaning “selected from the group consisting of A, B and C”. This concept is illustrated for three elements (i.e., A,B,C), but extends to fewer and greater numbers of elements (e.g., A, B, C, D, etc.).
It is appreciated that certain features of the invention, which are, for clarity, described in the context of separate embodiments or examples, may also be provided in combination in a single embodiment. Conversely, various features of the invention, which are, for brevity, described in the context of a single embodiment, example and/or option, may also be provided separately or in any suitable sub-combination or as suitable in any other described embodiment, example or option of the invention. Certain features described in the context of various embodiments, examples and/or optional implementation are not to be considered essential features of those embodiments, unless the embodiment, example and/or optional implementation is inoperative without those elements.
It is noted that the terms “in some embodiments”, “according to some embodiments”, “for example”, “e.g.”, “for instance” and “optionally” may herein be used interchangeably.
The number of elements shown in the Figures should by no means be construed as limiting and is for illustrative purposes only.
“Real-time” as used herein generally refers to the updating of information at essentially the same rate as the data is received. More specifically, in the context of the present invention “real-time” is intended to mean that the image data is acquired, processed, and transmitted from a sensor at a high enough data rate and at a low enough time delay that when the data is displayed, data portions presented and/or displayed in the visualization move smoothly without user-noticeable judder, latency or lag.
It is noted that the term “operable to” can encompass the meaning of the term “modified or configured to”. In other words, a machine “operable to” perform a task can in some embodiments, embrace a mere capability (e.g., “modified”) to perform the function and, in some other embodiments, a machine that is actually made (e.g., “configured”) to perform the function.
Throughout this application, various embodiments may be presented in and/or relate to a range format. It should be understood that the description in range format is merely for convenience and brevity and should not be construed as an inflexible limitation on the scope of the embodiments. Accordingly, the description of a range should be considered to have specifically disclosed all the possible subranges as well as individual numerical values within that range. For example, description of a range such as from 1 to 6 should be considered to have specifically disclosed subranges such as from 1 to 3, from 1 to 4, from 1 to 5, from 2 to 4, from 2 to 6, from 3 to 6 etc., as well as individual numbers within that range, for example, 1, 2, 3, 4, 5, and 6. This applies regardless of the breadth of the range.
The phrases “ranging/ranges between” a first indicated number and a second indicated number and “ranging/ranges from” a first indicated number “to” a second indicated number are used herein interchangeably and are meant to include the first and second indicated numbers and all the fractional and integral numerals therebetween.
While the invention has been described with respect to a limited number of embodiments, these should not be construed as limitations on the scope of the invention, but rather as exemplifications of some of the embodiments.
Example 1 includes a method for characterizing droopy upper eyelid performed on a computer having a processor, memory, and one or more code sets stored in the memory and executed in and/or by the processor, the method comprising:
Example 2 includes the subject matter of example 1 and, optionally, further comprising providing an output indicating whether the droopy upper eyelid is vision impairing or not, or whether the droopy upper eyelid is more likely vision impairing than not vision-impairing.
Example 3 includes the subject matter of example 1 and/or example 2 and, optionally, wherein the determining includes identifying at least one geometric feature of the pupil for determining, based on the at least one geometric feature, whether the droopy upper eyelid is vision impairing or not, or whether the droopy upper eyelid is more likely vision impairing than not vision-impairing.
Example 4 includes the subject matter of example 3 and, optionally, wherein the at least one geometric feature of the pupil is the pupil diameter.
Example 5 includes the subject matter of example 3 or example 4 and, optionally, wherein the at least one geometric feature of the pupil is the pupil curvature.
Example 6 includes the subject matter of any one or more of the Examples 3 to 5 and, optionally, wherein the at least one geometric feature of the pupil includes a pupil area visible in the image.
Example 7 includes the subject matter of example 6 and, optionally, wherein the determining is implemented through comparison of the pupil area to a circular geometric object having a diameter matching the diameter of the pupil.
Example 8 includes the subject matter of Examples 6 and/or 7 and, optionally, wherein the determining is implemented through comparison of the pupil area to a circle having a curvature matching the pupil curvature.
Example 9 includes the subject matter of any one or more of the examples 1 to 8 and, optionally, determining a distance D between a center C of the pupil and a feature of the upper eyelid.
Example 10 includes the subject matter of Example 9 and, optionally, wherein the feature of the upper eyelid is the lower central edge of the upper eyelid.
Example 11 includes the subject matter of any one or more of the Examples 1 to 10 and, optionally, determining a Marginal Reflex Distance Test 1.
Example 12 includes the subject matter of any one or more of the Examples 7 to 11 and, optionally, wherein the position of center C of the pupil in a captured image frame may be determined based on light reflected from the pupil.
Example 13 includes the subject matter of any one or more of the examples 1 to 12 and, optionally, further comprising characterizing a droopy eyelid as the result of patient malingering or not; or characterizing how likely the droopy eyelid is the result of patient malingering or not.
Example 14 includes the subject matter of any one or more of the examples 1 to 13 and, optionally, further comprising characterizing a vision-impairing droopy eyelid as being the result of patient malingering or not, or characterizing how likely the vision-impairing droopy eyelid is the result of patient malingering or not.
Example 15 includes the subject matter of example 14 and, optionally, wherein the characterizing of the vision-impairing droopy eyelid as being due to the result of patient malingering or not (or characterizing how likely the vision-impairing droopy eyelid is the result of patient malingering or not), is performed by a machine learning model implemented as an artificial neural network.
Example 16 pertains to a system for identifying vision-impairing droopy eyelid, the system comprising:
Example 17 includes the subject matter of Example 16 and, optionally, wherein the at least one geometric feature of the pupil is the pupil diameter.
Example 18 includes the subject matter of examples 16 and/or 17 and, optionally, wherein the at least one geometric feature of the pupil is the pupil curvature.
Example 19 includes the subject matter of any one or more of the Examples 16 to 18 and, optionally, wherein the at least one geometric feature of the pupil includes a pupil area visible in the image.
Example 20 includes the subject matter of any one or more of the Examples 16 to 19 and, optionally, wherein the determining is implemented through comparison of the pupil area to a circular geometric object having a diameter matching the diameter of the pupil.
Example 21 includes the subject matter of any one or more of the Examples 16 to 20 and, optionally, wherein the determining is implemented through comparison of the pupil area to a circle having a curvature matching the pupil curvature.
Example 22 includes the subject matter of any one or more of the examples 16 to 21 and, optionally, wherein the determining comprises: determining a distance D between a center C of the pupil and a feature of the upper eyelid.
Example 23 includes the subject matter of Example 22 and, optionally, wherein the feature of the upper eyelid is the lower central edge of the upper eyelid.
Example 24 includes the subject matter of any one or more of examples 16 to 23 and, optionally, further comprises determining a Marginal Reflex Distance Test 1.
Example 25 includes the subject matter of any one or more of the examples 22 to 24 and, optionally, wherein the position of center C of the pupil in a captured image frame may be determined based on light reflected from the pupil.
Example 26 includes the subject matter of any one or more of the examples 16 to 25 and, optionally, further comprising characterizing a droopy eyelid as being due to patient malingering or not.
Example 27 includes the subject matter of any one or more of the examples 16 to 26 and, optionally, further comprising characterizing a vision-impairing droopy eyelid as being due to patient malingering or not.
Example 28 includes the subject matter of example 27 and, optionally, wherein the characterizing of the vision-impairing droopy eyelid as being due to patient malingering or not, is performed by a machine learning model implemented as an artificial neural network.
Example 29 includes a method for identifying vision-impairing droopy eyelid performed on a computer having a processor, memory, and one or more code sets stored in the memory and executed in the processor, the method comprising:
Example 30 includes the subject matter of example 29 and, optionally, wherein the output indicates whether the prolapse is due to patient malingering, or not.
Example 31 includes a system for identifying vision-impairing droopy eyelid, the system comprising a processor, memory, and one or more code sets stored in the memory and executed in the processor for performing:
Example 32 includes the subject matter of example 31 and, optionally, wherein the output indicates whether the prolapse is due to patient malingering, or not.
It should be appreciated that the droopy upper eyelid evaluation system embodies an advance in droopy upper eyelid analysis, capable of providing a more reliable characterization of droopy upper eyelids, and therefore can reduce, if not entirely eliminate, erroneous characterizations (e.g., classifications or evaluations). Erroneous characterization of aesthetic, non-vision-impairing droopy upper eyelids as vision impairing causes medical resources, such as physicians and operating rooms, to be directed to corrective vision-restoration surgery when in fact the procedure is entirely optional. Furthermore, insurance providers benefit in that the reliable characterization enables them to accurately apply policies that differentiate between crucial vision-restoration surgery and optional, aesthetic surgery. Furthermore, the system enables insurance providers to identify patient malingering directed at securing insurance funding for corrective surgery of a medical condition when in fact the desired surgery is an optional aesthetic procedure.
It should be appreciated that embodiments formed from combinations of features set forth in separate embodiments are also within the scope of the present invention.
While certain features of the invention have been illustrated and described herein, modifications, substitutions, and equivalents are included within the scope of the invention.
| Number | Date | Country | Kind |
|---|---|---|---|
| PCT/IB2020/055938 | Jun 2020 | WO | international |
This application claims priority from PCT/IB2020/055938 filed on Jun. 23, 2020, which is expressly incorporated herein by reference in its entirety.
| Filing Document | Filing Date | Country | Kind |
|---|---|---|---|
| PCT/IB2021/055451 | 6/21/2021 | WO |