This application is based upon and claims the benefit of priority of the prior Japanese Patent Application No. 2016-031477, filed on Feb. 22, 2016, the entire contents of which are incorporated herein by reference.
The embodiment discussed herein is related to a technique to detect an eye gaze of a subject.
There is a technique of a corneal reflection method that uses a near-infrared light source and a near-infrared camera to detect an eye gaze of a person as a subject (for example, "An Eye Tracking System Based on Eye Ball Model—Toward Realization of Gaze Controlled Input Device" by Takehiko Ohno et al., the research report of Information Processing Society of Japan, 2001-HI-93, 2001, pp. 47-54). In the corneal reflection method, near-infrared light from the near-infrared light source is reflected by the cornea (corneal reflection), and a center position of the corneal reflection and a center position of the pupil are obtained by image processing. Then, in the corneal reflection method, the eye gaze of the subject is detected based on a positional relationship between the center position of the corneal reflection and the center position of the pupil.
Moreover, when the subject whose eye gaze is to be detected is a person wearing eyeglasses, near-infrared light emitted from the near-infrared light source is reflected not only on the corneal surface but also on the lens surfaces of the eyeglasses. Hereinafter, the reflection on the lens surface of the eyeglasses is referred to as eyeglass reflection. In the corneal reflection method, when the eyeglass reflection is mistakenly detected as the corneal reflection, a wrong direction or position is detected as the eye-gaze direction or eye-gaze position.
Therefore, various eye-gaze detection techniques that take the eyeglass reflection into account have been proposed (for example, Japanese Laid-open Patent Publication Nos. 2006-167256 and 2012-239550). For example, a certain corneal reflection determination device extracts an eye region from each frame of image data and detects corneal reflection candidates from the eye region. Then, the corneal reflection determination device finds, among the multiple corneal reflection candidates, a corneal reflection candidate that is extracted only intermittently across the frames of the image data, and determines that candidate as the corneal reflection. More specifically, the corneal reflection determination device excludes continuously extracted corneal reflection candidates as the eyeglass reflection, by utilizing the fact that, in a frame of image data captured at the moment of a blink, the corneal reflection disappears while the eyeglass reflection does not.
According to an aspect of the invention, a detection system for detecting an eye gaze of a subject includes an image sensor configured to generate an image by capturing reflected light from a subject side irradiated with light from a light source, and a processor. The processor is configured to extract, from the image, regions in which an intensity of the reflected light is equal to or larger than a specific value; when the extraction result contains two or more regions and a first region having a characteristic indicating movement over time coexists with a second region having a characteristic indicating no movement over time, determine the first region as a corneal reflection; and detect the eye gaze of the subject based on the corneal reflection.
The object and advantages of the invention will be realized and attained by means of the elements and combinations particularly pointed out in the claims.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are not restrictive of the invention, as claimed.
As described above, in order to detect the eye gaze accurately, attempts have been made to detect the corneal reflection accurately by excluding the eyeglass reflection. However, the corneal reflection determination device described above may fail to exclude the eyeglass reflection in some cases, depending on the capabilities of the camera and other factors. For example, if the camera fails to capture image data at the moment of a blink, the eyeglass reflection cannot be excluded. For this reason, the camera is required to have a certain frame rate or higher.
Therefore, according to one aspect, it is an object of the technique disclosed herein to provide another method for detecting an eye gaze of a subject wearing eyeglasses.
Hereinafter, with reference to the drawings, detailed description is given of an embodiment of an eye-gaze detection technique. Note that the technique disclosed herein is not limited to this embodiment.
Two reflections are captured in a part 100 of an image.
Next, two reflections (122 and 124) are similarly captured in a part 120 of another image.
On the other hand, the part 110 of the image is an image in which a reflection having a characteristic indicating movement over time coexists with a reflection having a characteristic indicating no movement over time.
Note that, as human characteristics, eye-gaze movements and saccades frequently occur even without facial or head movement. Such eye-gaze movements and saccades also occur when the face is in motion; however, the facial movement is then larger, and thus the influence of the facial movement becomes dominant.
The inventors have focused attention on the existence of such characteristic images.
As described in detail below, in this embodiment, a detection device configured to detect a corneal reflection used to detect an eye gaze of a subject detects reflected light from the subject side irradiated with light from a light source, extracts a region where the reflected light has intensity of a predetermined value or higher, based on the detection result, and determines, when there are two or more regions in which a first region having a characteristic indicating movement over time coexists with a second region having a characteristic indicating no movement over time, the first region as a corneal reflection region representing the corneal reflection.
As described above, in this embodiment, the eyeglass reflection and the corneal reflection may be distinguished from each other by using characteristic images even when there is eyeglass reflection, regardless of the capabilities of the camera. Therefore, the technique disclosed in this embodiment accurately detects the corneal reflection and reduces erroneous eye-gaze detection by using the detection result on the corneal reflection.
The embodiment is described in detail below.
The detection unit 7 detects light reflected from the subject side irradiated with the light from the light source 6. The detection unit 7 is, for example, a camera sensitive to the light having the predetermined wavelength, which is emitted by the light source 6. Thus, the detection unit 7 detects light reflected from the subject side irradiated with the light from the light source 6, by shooting the subject. Note that the subject is a person whose eye gaze is to be detected. Also, the subject side represents the subject and other objects (including eyeglasses).
In this embodiment, near-infrared light invisible to the subject is used as the light having the predetermined wavelength. Thus, the camera as an example of the detection unit 7 is a near-infrared light camera. The light source 6 is an LED (Light Emitting Diode) that emits near-infrared light. An image to be acquired as the detection result by the detection unit 7 is a near-infrared image. The near-infrared image captures the situation of the subject side at brightness levels corresponding to the intensities of the reflection of the near-infrared light emitted by the light source 6 and of reflection of near-infrared light (for example, natural light or fluorescent light) emitted from another light source.
The eye-gaze detection device 1 detects an eye gaze of the subject. Note that the eye-gaze detection device 1 is, for example, a computer including a processor configured to execute various kinds of processing and a memory configured to store information.
In the eye-gaze detection system 10, the light source 6 and the detection unit 7 are connected to the eye-gaze detection device 1. Note, however, that such connection may be realized not only by wired connection but also by wireless communication. For example, when eye-gaze detection processing is started, the detection unit 7 takes images of the subject side at regular time intervals under the control of the eye-gaze detection device 1, and outputs the images taken to the eye-gaze detection device 1. Also, when the eye-gaze detection processing is started, the eye-gaze detection device 1 controls lighting of the light source 6.
In this embodiment, the eye-gaze detection device 1 detects the eye gaze of the subject by executing the eye-gaze detection processing on a near-infrared image acquired from the detection unit 7. For example, the eye-gaze detection device 1 uses the near-infrared image to detect a center position of corneal reflection of the light emitted from the light source 6 and a center position of the pupil of the subject, thereby detecting the eye gaze using a corneal reflection method. Note that, for processing other than the detection of the corneal reflection according to this embodiment, a conventional technique of the corneal reflection method is basically used. For example, the method described in Ohno et al. is adopted.
Here, the result of the processing by the eye-gaze detection device 1 is used for marketing analysis, for example. To be more specific, when the detection unit 7 and the light source 6 are installed on a shelf or a display stand of a product in a store, the eye-gaze detection device 1 detects an eye gaze of a customer (subject) who comes into the store. Then, in the marketing analysis, which product the customer is interested in is estimated from the result of the eye-gaze detection, for example. Moreover, a product displayed in a position where many eye gazes are focused may be specified by statistically processing the results of the eye-gaze detection. Thus, a product in which many customers are interested may be figured out based on the output (eye-gaze detection result) from the eye-gaze detection device 1.
Moreover, the processing result obtained by the eye-gaze detection device 1 is used to detect dangerous driving, for example. To be more specific, when the detection unit 7 is installed in a position where images of the eyes of a driver seated in a driver seat may be taken and the light source 6 is installed in a position where the eyes of the driver may be irradiated with light, the eye-gaze detection device 1 detects an eye gaze of the driver (subject). In detection of dangerous driving, whether or not the driver performs proper driving while paying attention in various directions, whether or not there is a risk of drowsy driving, and the like are estimated from the result of the eye gaze detection.
The eye-gaze detection device 1 includes an acquisition unit 2, a controller 3, a storage unit 4, and an output unit 5. The acquisition unit 2 acquires, as the detection result, the image taken by the detection unit 7, and inputs the image to the controller 3.
The controller 3 detects the corneal reflection and the pupil from the detection result input from the acquisition unit 2, and controls eye-gaze detection processing to detect the eye gaze based on a center position of the corneal reflection and a center position of the pupil. Note that, in detection of the corneal reflection, the controller 3 uses an image in which a corneal reflection candidate having a characteristic indicating movement over time coexists with a corneal reflection candidate having a characteristic indicating no movement over time. A corneal reflection candidate is a region in which the intensity of the reflected light in the detection result is a predetermined value or higher. The corneal reflection candidate is, for example, a region where pixels having brightness of 200 or more are assembled in a near-infrared image (in the case of an 8-bit image, the maximum brightness value is 255). The details thereof are described later.
The storage unit 4 stores various information required for the eye-gaze detection processing according to this embodiment. The storage unit 4 stores, for example, the detection result (image information) from the detection unit 7, which is acquired by the acquisition unit 2, information on the corneal reflection candidates, and the like. The details thereof are described later.
The output unit 5 outputs the detected eye-gaze information, as the eye-gaze detection result, to another device. Such another device may be a device configured to perform the marketing analysis described above, a device configured to detect the dangerous driving described above, or a display device such as a display. Note that the output unit 5 functions as a transmitter when the output unit 5 and another device communicate with each other through wireless communication.
Next, the controller 3 is described in detail.
The extraction unit 31 extracts an approximately circular region where the reflection intensity (for example, brightness) is a predetermined value or higher, as a corneal reflection candidate, based on the detection result from the detection unit 7.
For example, the extraction unit 31 detects a face region from the acquired image, and further detects an eye region from the face region. Note that a conventional facial recognition technique is used for facial recognition and eye recognition. In the facial recognition technique, a template obtained by learning characteristics of a face and parts of the face is scanned on a process target image. Thus, a region having characteristics similar to the template is recognized as the face or parts of the face.
Next, the extraction unit 31 generates a binarized image by binarizing the eye region. Note that, in generation of the binarized image, the extraction unit 31 gives “1” to a pixel having brightness of 200 or more and “0” to a pixel having brightness smaller than 200, for example. Then, the extraction unit 31 further groups regions where a predetermined number of pixels or more with “1” given thereto are assembled. Note that such grouping processing is also called labeling processing.
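For illustration only, this binarize-and-label step may be sketched in Python as follows, assuming the eye region is available as an 8-bit grayscale array. The function name and the minimum group size are illustrative assumptions; the threshold of 200 follows the description above.

```python
from scipy import ndimage

def extract_candidate_groups(eye_region, threshold=200, min_pixels=4):
    """Binarize an 8-bit eye-region image and group bright pixels.

    eye_region: 2-D numpy array of brightness values (0 to 255).
    Returns a list of boolean masks, one per group of connected
    pixels whose brightness is `threshold` or more.
    """
    # Give "1" to pixels with brightness of 200 or more, "0" otherwise.
    binary = eye_region >= threshold
    # Labeling processing: group connected "1" pixels.
    labels, num_groups = ndimage.label(binary)
    groups = []
    for n in range(1, num_groups + 1):
        mask = labels == n
        # Keep only groups with a predetermined number of pixels or more.
        if mask.sum() >= min_pixels:
            groups.append(mask)
    return groups
```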
Next, the extraction unit 31 determines whether or not the shape of each of the grouped regions is approximately circular. For such approximately circular shape determination, the technique disclosed in “Digital Image Processing”, pp. 224-225, by Wilhelm Burger et al. or “Analysis of Snowflake Shape by a Region and Contour Approach” by Kenichiro Muramoto et al., The Journal of the Institute of Electronics, Information and Communication Engineers, May 1993, Vol. J76-D-II, No. 5, pp. 949-958, for example, may be used. For example, the extraction unit 31 evaluates the circularity among the characteristics of each region, and determines that the region is approximately circular when the circularity is a threshold or more.
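The approximately-circular determination may likewise be sketched as below. The exact formula from the cited references is not reproduced in this text, so the sketch assumes the common circularity measure 4πA/P² (1.0 for a perfect circle) with an illustrative threshold.

```python
import numpy as np
from scipy import ndimage

def is_approximately_circular(mask, circularity_threshold=0.7):
    """Judge whether a grouped region is approximately circular,
    using the circularity measure 4*pi*area / perimeter**2.
    The threshold value is an illustrative assumption.
    """
    area = mask.sum()
    # Approximate the perimeter by the number of boundary pixels,
    # i.e. the pixels removed by a one-pixel erosion.
    perimeter = (mask & ~ndimage.binary_erosion(mask)).sum()
    if perimeter == 0:
        return False
    circularity = 4.0 * np.pi * area / perimeter ** 2
    return circularity >= circularity_threshold
```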
Then, the extraction unit 31 stores the region determined to be approximately circular as a corneal reflection candidate in the storage unit 4. For example, the extraction unit 31 stores positional information on the corneal reflection candidates and identification information on each of the corneal reflection candidates in the storage unit 4. Note that the positional information on a corneal reflection candidate includes the coordinates of the barycenter position and of pixels on the circumference. Furthermore, the extraction unit 31 may extract image information of the regions stored as the corneal reflection candidates from the image information (not the binarized image), and store the extracted image information.
An example of the stored information on the corneal reflection candidates is as follows.
For example, as for a region of a corneal reflection candidate having the identification information "1" given thereto, the barycenter position is (XG1, YG1) and the coordinates of circumferential pixels are (X11, Y11), (X12, Y12), (X13, Y13), and (X14, Y14), respectively.
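For illustration, a record of this stored information may be expressed by the following sketch; the class and field names are hypothetical and only mirror the fields described above.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class CornealReflectionCandidate:
    """Stored record per candidate: identification information,
    barycenter position, and circumference pixel coordinates."""
    candidate_id: int                     # identification information, e.g. 1 to m
    barycenter: Tuple[float, float]       # (XG, YG)
    circumference: List[Tuple[int, int]]  # [(X1, Y1), (X2, Y2), ...]
```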
Next, the generation unit 32 generates, for each of the extracted corneal reflection candidates, characteristic information used to determine whether or not the candidate has movement over time.
Hereinafter, description is given of an example where the generation unit 32 generates brightness gradients of the corneal reflection candidate region in two or more directions (the X direction, the Y direction, and others) as the characteristic information. Note that, in generation of the characteristic information, the generation unit 32 uses the image information acquired from the detection unit 7 (the information before being binarized), rather than the binarized information.
For example, the generation unit 32 obtains a brightness gradient θX11 on one side of the corneal reflection candidate 112 based on Equation 1, and sets the brightness gradient θX11 as a brightness gradient θX1 in the X direction of the corneal reflection candidate 112. Alternatively, the generation unit 32 may obtain both left and right brightness gradients (θX11 and θX12) and set the average thereof as the brightness gradient θX1. In this case, the generation unit 32 obtains the brightness gradient θX12 based on Equation 1 as well, and sets the average of θX11 and θX12 as the brightness gradient θX1 in the X direction of the corneal reflection candidate 112.
Moreover, the method for obtaining the brightness gradient is not limited to Equation 1. For example, the generation unit 32 may obtain a brightness gradient φX11 based on Equation 2. The brightness gradient φX11 is equivalent to the slope of the double-dashed dotted line illustrated in the corresponding figure.
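Equations 1 to 3 themselves are not reproduced in this text. The following sketch therefore assumes a simple rise-over-run slope between the brightness peak at the barycenter and the brightness at the edge of the candidate region, with the left and right gradients averaged as described above; all names and the half-width parameter are illustrative.

```python
def brightness_gradient_x(image, barycenter, half_width):
    """Average of the left and right brightness gradients of a
    candidate along the X direction (theta_X11 and theta_X12).

    Assumes each gradient is a rise-over-run slope between the
    brightness peak at the barycenter (given as (X, Y)) and the
    brightness at the edge of the candidate region; `half_width`
    is the pixel distance from the barycenter to that edge.
    """
    xc, yc = int(barycenter[0]), int(barycenter[1])
    peak = float(image[yc, xc])
    # Left gradient theta_X11 and right gradient theta_X12.
    theta_x11 = (peak - float(image[yc, xc - half_width])) / half_width
    theta_x12 = (peak - float(image[yc, xc + half_width])) / half_width
    # The average is used as the gradient theta_X1 in the X direction;
    # a Y-direction gradient is obtained the same way along a
    # vertical brightness profile.
    return (theta_x11 + theta_x12) / 2.0
```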
The generation unit 32 obtains a brightness gradient θX21 in the X direction in the same manner as Equation 1, and generates a brightness gradient θX2 in the X direction of the corneal reflection candidate 114. Then, the generation unit 32 obtains a brightness gradient θY21 in the Y direction in the same manner as Equation 3, and generates a brightness gradient θY2 in the Y direction of the corneal reflection candidate 114. Note that, as in the case of the brightness gradient θX1, the average of the gradients obtained on both sides of the region may be used.
The movement determination unit 33 determines, based on the generated characteristic information, whether or not each corneal reflection candidate has a characteristic indicating movement over time. For example, the movement determination unit 33 determines that a corneal reflection candidate has movement when the difference between the brightness gradients in the two or more directions is not smaller than a threshold, and determines that the corneal reflection candidate has no movement when the difference is smaller than the threshold.
Next, when there are two or more corneal reflection candidates and a first corneal reflection candidate having a characteristic indicating movement over time coexists with a second corneal reflection candidate having a characteristic indicating no movement over time, the determination unit 34 determines the first corneal reflection candidate as a corneal reflection region representing the corneal reflection. On the other hand, when there are two or more corneal reflection candidates but such a first candidate and such a second candidate do not coexist (that is, when all candidates have movement or all candidates have no movement), the determination unit 34 determines the corneal reflection as unidentifiable. Moreover, when there is only one corneal reflection candidate, the determination unit 34 determines this corneal reflection candidate as the corneal reflection.
For example, the determination unit 34 uses an image in which a corneal reflection candidate whose brightness gradients in the two or more directions (brightness gradient θX2 and brightness gradient θY2) differ by a threshold or more coexists with a corneal reflection candidate whose brightness gradients in the two or more directions (brightness gradient θX1 and brightness gradient θY1) differ by less than the threshold, and detects the corneal reflection candidate with movement as the corneal reflection.
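Putting the movement determination and the coexistence rule together, a minimal sketch could look as follows; the tuple layout and the gradient threshold value are assumptions for illustration, not part of the embodiment.

```python
def determine_corneal_reflection(candidates, gradient_threshold):
    """Determination rule of the determination unit 34 (sketch).

    candidates: list of (candidate, theta_x, theta_y) tuples, where
    the thetas are the brightness gradients in the X and Y directions.
    Returns the candidate determined as the corneal reflection, or
    None when the corneal reflection is unidentifiable.
    """
    if len(candidates) == 1:
        # Only one candidate: determine it as the corneal reflection.
        return candidates[0][0]
    # A gradient difference not smaller than the threshold indicates
    # a characteristic with movement; smaller indicates no movement.
    moving = [c for c, tx, ty in candidates if abs(tx - ty) >= gradient_threshold]
    still = [c for c, tx, ty in candidates if abs(tx - ty) < gradient_threshold]
    if moving and still:
        # Candidates with and without movement coexist: the one with
        # movement is the corneal reflection (the still candidates
        # are treated as eyeglass reflection).
        return moving[0]
    # All moving or all still: unidentifiable; fall back to tracking.
    return None
```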
Then, the determination unit 34 inputs the position of the corneal reflection candidate identified as the corneal reflection to the eye-gaze detection unit 37 to be described later. Furthermore, the determination unit 34 stores the position of the corneal reflection candidate identified as the corneal reflection in the storage unit 4 while associating the position with a frame number of the process target image.
Here, detailed description is given of the determination method by the determination unit 34. For example, when both of the corneal reflection candidates in a process target eye region are determined to have no movement, candidates with movement and without movement do not coexist, and thus the determination unit 34 determines the corneal reflection as unidentifiable. Meanwhile, when a corneal reflection candidate determined to have movement coexists with a corneal reflection candidate determined to have no movement, the determination unit 34 determines the candidate with movement as the corneal reflection. Furthermore, when both of the corneal reflection candidates are determined to have movement, the determination unit 34 likewise determines the corneal reflection as unidentifiable.
Next, description is given of the tracking unit 35. When the determination unit 34 determines the corneal reflection as unidentifiable, the tracking unit 35 identifies the corneal reflection by using the position of the corneal reflection detected in a previous image.
Here, the part 120 of the image described above, in which the two reflections (122 and 124) are captured, is taken as the process target image.
The corneal reflection candidate 122 and the corneal reflection candidate 124 in the process target image 120 are both determined to have a characteristic with movement by the movement determination unit 33, and the corneal reflection is determined as unidentifiable by the determination unit 34.
Therefore, the tracking unit 35 uses the corneal reflection 114 in the previous image 110 and determines the corneal reflection candidate closest to the position of the corneal reflection 114 as the corneal reflection.
Next, the pupil detection unit 36 detects a pupil. For example, the pupil detection unit 36 detects a region having a predetermined characteristic as the pupil from a near-infrared process target image. Note that a conventional pupil detection technique is adopted for the pupil detection. For example, the pupil detection unit 36 performs matching against templates based on shapes and characteristics of pupils.
The eye-gaze detection unit 37 uses the position of the corneal reflection and the position of the pupil to detect an eye gaze position of the subject. Note that the conventional corneal reflection method is adopted for a method for determining the eye gaze position based on the position of the corneal reflection and the position of the pupil.
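As a deliberately simplified stand-in for that conventional method, the mapping from the two positions to a gaze position may be sketched as below. The linear calibration is an assumption for illustration; actual corneal reflection methods (e.g., Ohno et al.) use an eyeball model and a calibration procedure.

```python
def estimate_gaze_point(pupil_center, reflection_center, coeff_x, coeff_y):
    """Estimate a gaze position from the pupil center and the corneal
    reflection center, both given as (X, Y) image coordinates.

    The offset between the two centers is mapped to a gaze position
    by per-axis (slope, intercept) calibration coefficients obtained
    beforehand; this linear mapping is an illustrative simplification.
    """
    dx = pupil_center[0] - reflection_center[0]
    dy = pupil_center[1] - reflection_center[1]
    # Map the image-plane offset to gaze coordinates.
    gaze_x = coeff_x[0] * dx + coeff_x[1]
    gaze_y = coeff_y[0] * dy + coeff_y[1]
    return (gaze_x, gaze_y)
```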
The output control unit 38 generates an eye-gaze detection result, and controls the output unit 5 to output the eye-gaze detection result to another device.
Next, description is given of eye-gaze detection processing including processing of determining the corneal reflection according to this embodiment.
First, the acquisition unit 2 acquires a near-infrared image from the detection unit 7 (camera) and inputs the near-infrared image to the controller 3 (Op. 10). The extraction unit 31 in the controller 3 extracts a face region from the near-infrared process target image (Op. 12). Furthermore, the extraction unit 31 extracts an eye region from the face region (Op. 14). Next, the extraction unit 31 generates a binarized image by binarizing the eye region (Op. 16). Then, the extraction unit 31 groups regions where a predetermined number of pixels or more with "1" given thereto are assembled, by performing labeling processing on the binarized image (Op. 18). In this event, arbitrary identification information is given to each group. For example, the regions are grouped into Group 1, Group 2, . . . , and Group n.
With 1 set as an initial value to a counter N indicating the identification information of the group (N=1) and Group 1 as the processing target, the extraction unit 31 determines whether or not the shape of the group is approximately circular (Op. 20). When it is determined that the shape is approximately circular (YES in Op. 20), the extraction unit 31 stores the region determined to be approximately circular as a corneal reflection candidate in the storage unit 4 (Op. 22). To be more specific, the coordinates of the barycenter position of the corneal reflection candidate and of the pixels on the circumference, as well as identification information on each corneal reflection candidate, are stored in the storage unit 4. For example, when m corneal reflection candidates are extracted, the integers 1 to m are given to the respective corneal reflection candidates as the identification information.
On the other hand, when it is determined that the shape is not approximately circular (NO in Op. 20), and after Op. 22, the extraction unit 31 determines whether or not the shape determination is finished for all the groups (Op. 24). When the shape determination is not finished for all the groups (NO in Op. 24), the process target group is changed to the next one (N=N+1) and Op. 20 is repeated.
On the other hand, when the shape determination is finished for all the groups (in the case of N=n) (YES in Op. 24), the generation unit 32 determines whether or not there are two or more corneal reflection candidates (Op. 26). When there are not two or more corneal reflection candidates (NO in Op. 26), the generation unit 32 notifies the determination unit 34 to that effect. Then, the determination unit 34 determines the single corneal reflection candidate as the corneal reflection, and stores the corneal reflection information in the storage unit 4 (Op. 46).
On the other hand, when there is more than one corneal reflection candidate (YES in Op. 26), the generation unit 32 sets 1 as an initial value to a counter M indicating identification information of the corneal reflection candidate (M=1), and obtains brightness gradients in the two or more directions of the corneal reflection candidate (Op. 28). For example, a brightness gradient in the X direction and a brightness gradient in the Y direction are obtained using the method described above.
Then, the movement determination unit 33 determines whether or not a difference between the brightness gradients in the respective directions is not smaller than a threshold (Op. 30). When the difference is not smaller than the threshold (YES in Op. 30), the movement determination unit 33 determines that the process target corneal reflection candidate has movement (Op. 32). On the other hand, when the difference is smaller than the threshold (NO in Op. 30), the movement determination unit 33 determines that the process target corneal reflection candidate has no movement (Op. 34).
Next, the movement determination unit 33 determines whether or not the movement determination processing is finished for all the corneal reflection candidates (Op. 36). Note that, when M=m, it is determined that the movement determination processing is finished. When the movement determination processing is not finished for all the corneal reflection candidates (NO in Op. 36), the process target corneal reflection candidate is changed to the next one (M=M+1) and Op. 28 is repeated.
On the other hand, when the movement determination processing is finished for all the corneal reflection candidates (YES in Op. 36), the determination unit 34 determines whether or not the corneal reflection candidate with movement coexists with the corneal reflection candidate without movement (Op. 38). When the corneal reflection candidate with movement coexists with the corneal reflection candidate without movement (YES in Op. 38), the determination unit 34 determines the corneal reflection candidate with movement as the corneal reflection, and stores the corneal reflection information in the storage unit 4 (Op. 40).
On the other hand, when the corneal reflection candidate with movement and the corneal reflection candidate without movement do not coexist (NO in Op. 38), the determination unit 34 determines the corneal reflection as unidentifiable and requests the tracking unit 35 for the tracking processing (Op. 42). Upon request of the determination unit 34, the tracking unit 35 executes the tracking processing (Op. 44).
Next, the tracking unit 35 calculates a difference (distance) between the previous corneal reflection position and the position of each of the corneal reflection candidates in the process target image (Op. 62). Then, the tracking unit 35 identifies the corneal reflection candidate having the smallest difference (Op. 64). Furthermore, the tracking unit 35 determines whether or not the difference for the identified corneal reflection candidate is not larger than a threshold (Op. 66). When the difference is not larger than the threshold (YES in Op. 66), the tracking unit 35 determines the corneal reflection candidate identified in Op. 64 as the corneal reflection, and stores the corneal reflection information in the storage unit 4 (Op. 68). On the other hand, when the difference is larger than the threshold (NO in Op. 66), the tracking unit 35 determines the corneal reflection as unidentifiable (Op. 70). Then, the tracking unit 35 terminates the tracking processing.
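This tracking flow may be sketched as follows; the function name and the data layout are illustrative assumptions.

```python
import math

def track_corneal_reflection(prev_position, candidate_positions, threshold):
    """Tracking processing (Op. 62 to Op. 70, sketch).

    prev_position: (X, Y) of the corneal reflection in the previous image.
    candidate_positions: (X, Y) barycenters of the current candidates.
    Returns the tracked corneal reflection position, or None when the
    corneal reflection is unidentifiable.
    """
    if not candidate_positions:
        return None

    # Op. 62: difference (distance) to each candidate.
    def distance(p):
        return math.hypot(p[0] - prev_position[0], p[1] - prev_position[1])

    # Op. 64: identify the candidate with the smallest difference.
    best = min(candidate_positions, key=distance)
    # Op. 66/68/70: accept the closest candidate only when its
    # difference is not larger than the threshold.
    return best if distance(best) <= threshold else None
```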
Next, when any of Op. 40, Op. 44, and Op. 46 is finished, the controller 3 determines whether or not the corneal reflection is detected (Op. 48). When the corneal reflection is detected (YES in Op. 48), the pupil detection unit 36 detects the pupil (Op. 50), and the eye-gaze detection unit 37 detects the eye gaze based on the position of the corneal reflection and the position of the pupil (Op. 52).
When no corneal reflection is detected (NO in Op. 48) or when Op. 52 is finished, the output unit 5 outputs the eye-gaze detection result depending on the various processing results to another device under the control of the output control unit 38 (Op. 54). Note that, when no corneal reflection is detected or when no pupil or eye gaze is detected, information indicating that no eye gaze is detectable is output as the eye-gaze detection result. When the eye gaze is detected, eye-gaze information indicating the direction and position of the eye gaze is output as the eye-gaze detection result.
Then, the controller 3 determines whether to terminate the eye-gaze detection processing (Op. 56). When terminating the eye-gaze detection processing (YES in Op. 56), the controller 3 terminates the series of eye-gaze detection processing. On the other hand, when not terminating the eye-gaze detection processing (NO in Op. 56), the controller 3 returns to Op. 10 to repeat the series of processing.
Through the above processing, the technique disclosed in this embodiment determines, in an image where a corneal reflection candidate indicating reflection on a moving object coexists with a corneal reflection candidate indicating reflection on a stationary object, the former as the corneal reflection. This enables eye-gaze detection using the corneal reflection method even when there is eyeglass reflection.
Even when the eyeglass reflection and the corneal reflection coexist, the technique disclosed in this embodiment may distinguish the eyeglass reflection from the corneal reflection by using a characteristic image in which a region of a corneal reflection candidate having a characteristic indicating movement over time coexists with a region of a corneal reflection candidate having a characteristic indicating no movement over time, regardless of the capabilities of the camera.
Moreover, even when the process target image is not the characteristic image, the corneal reflection may be tracked using the detection result of the previous corneal reflection. Therefore, the technique disclosed in this embodiment may detect the corneal reflection even when the eyeglass reflection is generated, thereby improving the accuracy of the eye-gaze detection result using the corneal reflection.
Next, description is given of a hardware configuration example of the eye-gaze detection system including the eye-gaze detection device.
The eye-gaze detection device 1 included in the eye-gaze detection system 10 includes, as the hardware configuration, a processor 1001, a Read Only Memory (ROM) 1002, a Random Access Memory (RAM) 1003, a Hard Disk Drive (HDD) 1004, a communication device 1005, an input device 1008, a display device 1009, and a medium reader 1010. Furthermore, the eye-gaze detection system 10 includes an interface circuit 1012, a light source 1006, and a camera 1007, in addition to the hardware configuration of the eye-gaze detection device 1. The processor 1001, the ROM 1002, the RAM 1003, the HDD 1004, the communication device 1005, the input device 1008, the display device 1009, the medium reader 1010, and the interface circuit 1012 are connected to each other through a bus 1011, so that data may be transmitted and received among these components under the control of the processor 1001.
A program related to the eye-gaze detection processing is recorded in a recording medium that may be read by the eye-gaze detection system 10. The recording medium that may be read by the eye-gaze detection system 10 includes a magnetic recording device, an optical disk, a magneto-optical recording medium, a semiconductor memory, and the like. The magnetic recording device includes an HDD, a flexible disk (FD), a magnetic tape (MT), and the like.
The optical disk includes a Digital Versatile Disc (DVD), a DVD-RAM, a Compact Disc-Read Only Memory (CD-ROM), a Compact Disc-Recordable/ReWritable (CD-R/RW), and the like. The magneto-optical recording medium includes a Magneto-Optical disk (MO), and the like. When distributing a program describing the processing according to the embodiment, it is conceivable to sell a portable recording medium, such as a DVD or a CD-ROM, on which the program is recorded.
The medium reader 1010 in the eye-gaze detection system 10 that executes a program according to this embodiment reads the program from a recording medium recording the program. The processor 1001 stores the read program in the HDD 1004 or in the ROM 1002 or the RAM 1003.
The processor 1001 controls the entire operations of the eye-gaze detection device 1. The processor 1001 includes an electronic circuit such as a Central Processing Unit (CPU), for example.
The processor 1001 functions as the acquisition unit 2 and the controller 3 in the eye-gaze detection device 1 by reading the program describing the processing according to this embodiment from the HDD 1004 and executing the program. The communication device 1005 functions as the acquisition unit 2 under the control of the processor 1001. The HDD 1004 stores various information and functions as the storage unit 4 under the control of the processor 1001. The various information may be stored in the ROM 1002 or the RAM 1003 accessible to the processor 1001, as in the case of the program. Furthermore, various information temporarily generated and held through the processing is stored in the RAM 1003, for example.
The input device 1008 receives various inputs. The input device 1008 is a keyboard or a mouse, for example. The display device 1009 displays the various information. The display device 1009 is a display, for example.
As described above, the functional units of the eye-gaze detection device 1 are realized by the processor 1001 executing the program describing the processing according to this embodiment.
Moreover, the eye-gaze detection processing according to this embodiment may be executed on a cloud. In this case, the light source 6 and the detection unit 7 are disposed in a space where the subject is present. Then, upon receipt of the detection result from the detection unit 7, the eye-gaze detection device 1 (one or more servers) executes the eye-gaze detection processing described above.
All examples and conditional language recited herein are intended for pedagogical purposes to aid the reader in understanding the invention and the concepts contributed by the inventor to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions, nor does the organization of such examples in the specification relate to a showing of the superiority and inferiority of the invention. Although the embodiment of the present invention has been described in detail, it should be understood that the various changes, substitutions, and alterations could be made hereto without departing from the spirit and scope of the invention.