This application is a 35 U.S.C. §371 national phase application of PCT/KR2012/000711, filed on Jan. 31, 2012, entitled “METHOD OF TRACKING A POSITION OF AN EYE AND A MEDICAL HEAD LAMP USING THE SAME”, which application claims priority to and the benefit of Korean Patent Application No. 2011-0022856, filed Mar. 15, 2011, the disclosure of which is incorporated herein by reference in its entirety.
1. Field of the Invention
The present invention relates to a method of tracking a position of an eye and a medical head lamp using the same, and more particularly, to a method of tracking a position of an eye that makes it possible to illuminate a dark diseased part or other desired part by adjusting the lighting direction of a lamp based on the positions of a physician's eyes, without direct manual operation of the head lamp, while the physician medically examines and treats a patient, and to a medical head lamp using the same.
2. Discussion of Related Art
Generally, a head lamp has been used in the fields of mining, medicine, and particularly surgery. Typically, the head lamp is attached to a head band, a hat or a helmet.
Such a head lamp has an advantage in that a region at which a user wearing the head lamp looks can be illuminated without using a fixed floodlight or a portable lamp.
In particular, when the head lamp is used for medical service, that is, when a physician observes and operates on a diseased part, a desired part is brightly illuminated so that the medical procedure can be practiced more easily.
However, the medical head lamp is fixed on the head regardless of the direction of the physician's gaze. Therefore, when a position of the head lamp has to be adjusted during surgery, a nurse rather than an operating physician manually adjusts the direction of the head lamp worn on the head of the physician so as to prevent the patient from being contaminated.
Here, if the physician were to adjust the head lamp mounted on his or her own head, the head lamp could be placed at the desired position more quickly and accurately, but the physician's surgical gloves and tools could be contaminated.
The present invention is directed to providing a method of tracking a position of an eye. Here, the method includes binarizing an image of a region of an eye taken by an imaging unit and extracting a feature point from the image.
Also, the present invention is directed to providing a medical head lamp which uses an imaging unit to image a physician's eye, calculates a position of the eye from an obtained image using the method of tracking a position of an eye, and enables a lighting direction of a head lamp to automatically correspond to the direction of the physician's gaze based on the calculated position of the eye.
One aspect of the present invention provides a method of tracking a position of an eye, which includes:
i) a binarization step to convert an obtained image into a binary number image;
ii) a feature point extraction step to classify the corner of the eye and the maximum and minimum values of the pupil of the eye, and extract a feature point from the binary number image on which the binarization step is completed;
iii) a distance and inclination calculation step to calculate a distance and inclination between the corner and the pupil of the eye using the feature point extracted through the feature point extraction step; and
iv) an eye gaze direction determination step to determine a gaze direction of the eye based on the distance and inclination calculated through the distance and inclination calculation step.
The above and other objects, features and advantages of the present invention will become more apparent to those of ordinary skill in the art by describing in detail exemplary embodiments thereof with reference to the attached drawings, in which:
Hereinafter, the exemplary embodiments of the present invention will be described in detail. However, the present invention is not limited to the embodiments disclosed below, but can be implemented in various forms. The following embodiments are described in order to enable those of ordinary skill in the art to embody and practice the present invention.
Although the terms first, second, etc. may be used to describe various elements, these elements are not limited by these terms. These terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and similarly, a second element could be termed a first element, without departing from the scope of the exemplary embodiments. The term “and/or” includes any and all combinations of one or more of the associated listed items.
It will be understood that when an element is referred to as being “connected” or “coupled” to another element, it can be directly connected or coupled to the other element or intervening elements may be present. In contrast, when an element is referred to as being “directly connected” or “directly coupled” to another element, there are no intervening elements present.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the exemplary embodiments. The singular forms “a,” “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises,” “comprising,” “includes” and/or “including,” when used herein, specify the presence of stated features, integers, steps, operations, elements, components and/or groups thereof, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components and/or groups thereof.
With reference to the appended drawings, the exemplary embodiments of the present invention will be described in detail below. To aid in understanding the present invention, like numbers refer to like elements throughout the description of the figures, and the description of the same elements will not be reiterated.
According to one aspect, the present invention provides a method of tracking a position of an eye, which includes i) a binarization step to convert an obtained image into a binary number image; ii) a feature point extraction step to classify the corner of the eye and the maximum and minimum values of the pupil of the eye, and extract a feature point from the binary number image on which the binarization step is completed; iii) a distance and inclination calculation step to calculate a distance and inclination between the corner and the pupil of the eye using the feature point extracted through the feature point extraction step; and iv) an eye gaze direction determination step of determining a gaze direction of the eye based on the distance and inclination calculated through the distance and inclination calculation step.
According to another aspect, the present invention provides a medical head lamp capable of controlling a lighting direction of the medical head lamp using the method of tracking a position of an eye.
In particular, the medical head lamp uses an imaging unit to take an image of a physician's eye, calculates a position of the eye from the obtained image using the method of tracking a position of an eye, and enables the lighting direction of the head lamp to automatically correspond to the direction of the physician's gaze based on the calculated position of the eye.
Here, the medical head lamp functions to automatically illuminate a dark diseased part or a desired part based on positions of the physician's eyes, preferably, positions of the pupils of the eyes without the direct manual operation of a head lamp when the physician medically examines and treats a patient. Therefore, head lamps may be used without limitation as long as they are used for this purpose.
In particular, the head lamp according to the present invention may apply to medical services and applications in which a user wants to wear the head lamp on the head and illuminate a desired position based on the direction of the user's gaze, for example, repair work, mining, etc.
Hereinafter, the present invention will be described in further detail with reference to the accompanying drawings. However, it should be understood that the description below pertains merely to a preferable example for the purpose of illustration only and is not intended to limit the scope of the invention.
As shown in
The binarization step of Step i) according to the present invention is performed to convert an obtained image into a binary number image. Here, binarization methods may be used without limitation as long as they are used for this purpose.
In particular, the binarization step includes dividing an obtained image, preferably an image of an obtained eye region, into partitions and converting the image pixels in each of the partitions into binary numbers, for example, “0” or “1,” based on a predetermined critical value.
Here, the critical value may vary according to the user's choice. Upon binarization of the image pixels, the binary numbers “0” and “1” may also be reversely set based on the critical value, depending on the user's choice.
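The binarization described above can be sketched as follows. This is a minimal illustration only, not the implementation of the invention: the use of Python and NumPy, the function name, and the default critical value of 128 are all assumptions for the sake of example.

```python
import numpy as np

def binarize(gray_image, critical_value=128, invert=False):
    """Convert a grayscale eye-region image into a binary (0/1) image.

    Pixels at or above the critical value become 1 and the rest 0;
    `invert` reverses the assignment, as the user's choice allows.
    """
    binary = (np.asarray(gray_image) >= critical_value).astype(np.uint8)
    return 1 - binary if invert else binary

# A tiny 2x3 "image": dark pixels map to 0, bright pixels to 1.
patch = np.array([[10, 200, 90],
                  [255, 40, 128]])
print(binarize(patch, critical_value=128).tolist())  # [[0, 1, 0], [1, 0, 1]]
```

In practice the critical value would be chosen per application, and the image would first be divided into partitions, each binarized separately as the text describes.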
Also, the obtained image is an image taken of a region of an eye positioned on the face. In this case, the image may be an image taken of each of the left and right eye regions, and may be either a black-and-white image or a color image.
In this case, each of the obtained images of the left/right eye regions may sequentially undergo the binarization step, the feature point extraction step, and the distance and inclination calculation step to track a position of the eye.
In particular, prior to the binarization step, the method of tracking a position of an eye according to the present invention may further include a skin color removal operation to remove skin color from the obtained image.
In this case, the obtained image used for the skin color removal operation may be a color image.
Also, the removal of the skin color may be performed by binarizing the obtained image only when red, green and blue pixel values in the image satisfy a predetermined range.
In this case, for example, the set pixel values are in the range of 50 to 100 for red pixels, 20 to 50 for green pixels, and 200 to 255 for blue pixels. When the pixel values are binarized, a pixel that satisfies all of these range values corresponds to the white of the eye, and thus its binarized value may be set to “0”; a pixel that does not satisfy the range values corresponds to the contour of the eye, and thus its binarized value may be set to “1.”
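The skin color removal operation described above can be sketched as follows, using the example per-channel ranges stated in the text. The vectorized NumPy form is an illustrative assumption; only the ranges and the 0/1 assignment come from the description.

```python
import numpy as np

# Example per-channel ranges from the text: a pixel within all three
# ranges is treated as the white of the eye (binary 0); any other pixel
# is treated as part of the eye contour (binary 1).
RED_RANGE = (50, 100)
GREEN_RANGE = (20, 50)
BLUE_RANGE = (200, 255)

def remove_skin_color(rgb_image):
    """Binarize a color eye-region image using the stated RGB ranges."""
    img = np.asarray(rgb_image)
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    is_white_of_eye = ((RED_RANGE[0] <= r) & (r <= RED_RANGE[1]) &
                       (GREEN_RANGE[0] <= g) & (g <= GREEN_RANGE[1]) &
                       (BLUE_RANGE[0] <= b) & (b <= BLUE_RANGE[1]))
    # White of the eye -> 0, contour of the eye -> 1.
    return np.where(is_white_of_eye, 0, 1).astype(np.uint8)

pixels = np.array([[[70, 30, 230],     # within all ranges -> white of eye -> 0
                    [200, 150, 90]]])  # outside the ranges -> contour -> 1
print(remove_skin_color(pixels).tolist())  # [[0, 1]]
```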
The feature point extraction step of Step ii) according to the present invention is performed by classifying the corner of the eye and the maximum and minimum values of the pupil of the eye, and extracting a feature point from the binary number image on which the binarization step of Step i) is completed. Here, typical methods may be used without limitation as long as they are used for these purposes.
Here, the corner of the eye refers to a lateral palpebral commissure of the eye.
In particular, the feature point extraction step includes: translocating a mask from one end of the binarized image to the other, advancing one pixel at a time; storing a summation value at each pixel position during the translocation; regarding the position at which the summation value first becomes significant, that is, at which the contour of the eye is found for the first time (for example, where the summation value is 3 or more), as the corner of the eye; regarding the position at which the difference between mask values summated at regular distances is greatest as the pupil region; and searching for the maximum and minimum values at the position of the pupil, as shown in
Here, the mask means a region used to search a portion of the entire image, and the summation means the sum of the binarized values in the searched region. Also, the maximum value measured through the summation mask refers to the highest contour point of a pupil region, and the minimum value refers to the lowest contour point of the pupil region.
The distance and inclination calculation step of Step iii) according to the present invention is performed by calculating a distance and inclination between the corner and the pupil of the eye using the feature point extracted through the feature point extraction step of Step ii).
The positions of the corner and pupil of the eye are extracted through the feature point extraction step of Step ii), and the distance and inclination between the corner and pupil of the eye are then calculated according to the following Equations 1 and 2.
Distance = (maximum value of pupil + minimum value of pupil)/2 − corner of the eye (Equation 1)
Inclination = maximum value of pupil − minimum value of pupil (Equation 2)
Here, the maximum and minimum values of the pupil of the eye and the corner of the eye defined in Equations 1 and 2 refer to the pixel positions of a binarized image.
In particular, the maximum value of the pupil refers to the highest contour point of a pupil region, the minimum value of the pupil refers to the lowest contour point of the pupil region, and the corner of the eye refers to a position value of a region in which the contour of the eye is found.
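Equations 1 and 2 translate directly into code. The function names and the sample pixel positions below are illustrative only; the formulas themselves are those given above.

```python
def eye_distance(pupil_max, pupil_min, corner):
    """Equation 1: distance between the corner and the pupil of the eye,
    all arguments being pixel positions in the binarized image."""
    return (pupil_max + pupil_min) / 2 - corner

def eye_inclination(pupil_max, pupil_min):
    """Equation 2: inclination between the corner and the pupil of the eye."""
    return pupil_max - pupil_min

# Illustrative pixel positions: pupil contour at 40 and 30, corner at 20.
print(eye_distance(40, 30, 20))   # 15.0
print(eye_inclination(40, 30))    # 10
```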
The eye gaze direction determination step of Step iv) according to the present invention is performed by determining a gaze direction of the eye based on the distance and inclination calculated through the distance and inclination calculation step of Step iii). Typical determination methods may be used without limitation as long as they are used for this purpose.
As one example of the eye gaze direction determination method, when the difference between the distance of the right eye and the distance of the left eye is higher than a threshold, the eye gaze is directed to the left; when the difference between the distance of the left eye and the distance of the right eye is higher than the threshold, the eye gaze is directed to the right; and when neither condition is met, the eye gaze is directed to the left-right center, as shown in
Also, when the average of the inclination of the right eye and the inclination of the left eye is higher than an upward threshold, the eye gaze is directed upward; when the average is lower than a downward threshold, the eye gaze is directed downward; and when neither condition is met, the eye gaze is directed at the up-down center.
Here, each threshold is an experimentally calculated value that may be set in advance according to the user's choice.
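The determination rules above can be sketched as follows. The threshold values are placeholders for the experimentally set ones, and the tuple-based interface is an illustrative assumption.

```python
def gaze_direction(left, right, lr_threshold=5, up_threshold=3, down_threshold=-3):
    """Determine the gaze direction from per-eye (distance, inclination) pairs.

    `left` and `right` are (distance, inclination) values computed with
    Equations 1 and 2; the thresholds are illustrative stand-ins for the
    experimentally calculated values.
    """
    left_dist, left_incl = left
    right_dist, right_incl = right

    # Left-right determination from the difference of the two distances.
    if right_dist - left_dist > lr_threshold:
        horizontal = "left"
    elif left_dist - right_dist > lr_threshold:
        horizontal = "right"
    else:
        horizontal = "center"

    # Up-down determination from the average of the two inclinations.
    avg_incl = (left_incl + right_incl) / 2
    if avg_incl > up_threshold:
        vertical = "up"
    elif avg_incl < down_threshold:
        vertical = "down"
    else:
        vertical = "center"
    return horizontal, vertical

print(gaze_direction((10, 4), (20, 6)))  # ('left', 'up')
```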
Meanwhile, the above-described method of tracking a position of an eye according to the present invention is applicable to all fields in which it is necessary to track a position of the eye.
Here, the medical head lamp includes: a body 2 mounted on the head; a driving unit 4 coupled to one side of the body 2 to provide a driving force; a position adjustment unit 6 coupled to one side of the driving unit 4; a lamp unit 8 which is coupled to one side of the position adjustment unit 6, moves in response to the movement of the position adjustment unit 6, and emits light forward; an imaging unit 10 coupled to one side of the body 2 to take an image of an eye region; a controller 12 coupled to the driving unit 4 and the imaging unit 10 to analyze the image taken by the imaging unit 10, preferably calculate a position of the eye using the method of tracking a position of an eye according to the present invention, and activate the driving unit 4 based on the calculated position of the eye; and a power supply unit 14 coupled to the controller 12, the driving unit 4 and the lamp unit 8 to apply an electric current thereto.
The body 2 is mounted on a human's head. Here, typical devices known in the art may be used without particular limitation as long as they are used for this purpose.
A hair band, head band and the like may be used as the body 2.
The driving unit 4 is coupled to one side of the body 2, and more particularly, to one side of the body 2 that the direction of the human's gaze reaches, to provide a driving force to activate the position adjustment unit 6 coupled to the lamp unit 8 configured to emit light.
In this case, typical devices may be used without limitation as long as they function to provide a driving force. In this case, a motor may be used as the driving unit 4.
The position adjustment unit 6 is coupled to one side of the driving unit 4, moves in response to the activation of the driving unit 4, and adjusts a position of the lamp unit 8.
Also, the lamp unit 8 is coupled to one side of the position adjustment unit 6 so that the position adjustment unit 6 can adjust a position of the lamp unit 8 in response to the activation of the driving unit 4.
Meanwhile, in order to automatically adjust a position of the lamp unit 8, a driving unit configured to translocate the lamp unit 8 to an x-coordinate and a driving unit configured to translocate the lamp unit 8 to a y-coordinate should be provided separately. However, the position adjustment unit 6 according to the present invention may adjust a position of the lamp unit 8 using one driving unit 4.
For this purpose, a driving shaft of the driving unit 4, for example, a rotational shaft of a motor, is positioned at the center of gravity 26 of the position adjustment unit 6, and the lamp unit 8 is coupled to one side of the position adjustment unit 6, spaced a predetermined distance apart from the center of gravity 26 coupled to the driving shaft. As a result, the position adjustment unit 6 rotates upon driving of the driving unit 4, thereby adjusting the position of the lamp unit 8.
In this case, there is no limitation on the shape of the position adjustment unit 6 as long as the position adjustment unit 6 is coupled to the driving shaft of the driving unit 4. Preferably, the shape of the position adjustment unit 6 may be that of a plate, and more preferably, a plate extending in a radial direction, a circular plate, or a plate extending in a longitudinal direction.
The lamp unit 8 according to the present invention is coupled to the position adjustment unit 6 to emit light forward, thereby providing lighting to a user. Here, a typical lamp unit 8 known in the art may be used without particular limitation as long as it is used for this purpose.
In this case, the lamp unit 8 is coupled to the position adjustment unit 6 and moves in response to the movement of the position adjustment unit 6, which moves by action of the driving unit 4, and a position of the lamp unit 8 is then changed.
According to a specific exemplary embodiment, the lamp unit 8 according to the present invention includes a housing 24 configured to provide an appearance of the lamp unit 8, a body 20 positioned so that grooves can be formed on the inside surface of the housing 24, and a lamp 22 inserted and anchored in the grooves of the body 20.
In this case, as the body 20 rotates, the lamp 22 moves back and forth inside the body 20, changing the exposed area of the lamp 22 and thereby changing the lighting range of the lamp 22.
By way of example, as shown in
The imaging unit 10 according to the present invention is coupled to one side of the body 2 to take an image of the eye. Here, a typical imaging unit 10 may be used without particular limitation as long as it is used for this purpose.
In this case, a camera, for example, a digital camera or a CCD camera may be used as the imaging unit 10. Here, the imaging unit 10 may be coupled to one side of a guide bar 16 longitudinally extending from one side surface of the body 2 to ensure that the user has a clear line of sight.
Also, two imaging units 10 may be provided in the vicinity of the left and right eyes, as necessary, so that images can be taken of both the left and right eyes of the user.
The controller 12 according to the present invention may be coupled to the driving unit 4 and the imaging unit 10 to analyze the image taken by the imaging unit 10; to track, and preferably calculate, a position of the eye using the method of tracking a position of an eye according to the present invention; to activate the driving unit 4 based on the calculated position of the eye; and to adjust a direction such that the lamp unit 8 coupled to the position adjustment unit 6 emits light corresponding to the position of the eye.
A typical controller 12 known in the art may be used without limitation as long as it is used for this purpose. Preferably, a printed circuit board may be used as the controller 12.
The power supply unit 14 according to the present invention is coupled to the controller 12, the driving unit 4 and the lamp unit 8 to apply an electric current thereto. Typical power supply units 14 known in the art may be used without limitation as long as they are used for this purpose.
Here, the power supply unit 14 may be coupled to the controller 12, the driving unit 4 and/or the lamp unit 8 by means of an electric wire 18.
According to one specific exemplary embodiment, the power supply unit 14 is positioned independently of the medical head lamp and may apply an electric current to the medical head lamp via an electric wire. Alternatively, a rechargeable cell or battery may be coupled to one side of the body 2 of the medical head lamp so that a user can move freely.
Here, when the power supply unit 14 is coupled to one side of the body 2 of the medical head lamp, the power supply unit 14 may be provided in the other side of the body 2 in which the lamp unit 8 is positioned and which faces one side of the body 2.
Furthermore, the power supply unit 14 may also be formed integrally with the controller 12.
The method of tracking a position of an eye according to the present invention can be useful in binarizing an image of an eye region taken by a camera, extracting a feature point from the image, tracking a position of the eye, and controlling a lighting angle of a lamp provided in a head lamp to automatically correspond to the direction of a user's gaze.
While the invention has been shown and described with reference to certain exemplary embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the scope of the invention as defined by the appended claims.
| Number | Date | Country | Kind |
| --- | --- | --- | --- |
| 10-2011-0022856 | Mar 2011 | KR | national |

| Filing Document | Filing Date | Country | Kind | 371c Date |
| --- | --- | --- | --- | --- |
| PCT/KR2012/000711 | 1/31/2012 | WO | 00 | 6/12/2013 |

| Publishing Document | Publishing Date | Country | Kind |
| --- | --- | --- | --- |
| WO2012/124893 | 9/20/2012 | WO | A |
| Number | Date | Country |
| --- | --- | --- |
| 20140039273 A1 | Feb 2014 | US |