Systems and methods of eye tracking control on mobile device

Information

  • Patent Grant
  • Patent Number
    9,952,666
  • Date Filed
    Wednesday, March 8, 2017
  • Date Issued
    Tuesday, April 24, 2018
Abstract
Methods and systems to facilitate eye tracking control on mobile devices are provided. An image of a portion of a user is received at an eye tracking device, where the image includes reflections caused by light emitted on the user from one or more light sources located within the eye tracking device. One or more eye features associated with an eye of the user is detected using the reflections. Point of regard information is determined using the one or more eye features, where the point of regard information indicates a location on a display of a computing device coupled to the eye tracking device at which the user was looking when the image of the portion of the user was taken. The point of regard information is sent to an application capable of performing a subsequent operation using the point of regard information.
Description
TECHNICAL FIELD

The present disclosure generally relates to mobile devices and, more specifically, to systems and methods for facilitating eye tracking control on mobile devices.





BRIEF DESCRIPTION OF THE DRAWINGS

Some embodiments are illustrated by way of example and not of limitation in the figures of the accompanying drawings.



FIG. 1 is a device diagram of an example mobile device coupled to a docking device capable of facilitating eye tracking control, according to some embodiments;



FIG. 2 is a device diagram of another example of a mobile device coupled to a docking device capable of facilitating eye tracking control, according to some embodiments;



FIGS. 3A-3C are device diagrams of example mobile devices capable of facilitating eye tracking control, according to some embodiments;



FIG. 4 is a block diagram of an example software architecture for facilitating eye tracking control, according to some embodiments;



FIG. 5 is a block diagram of an example flow of data used to facilitate eye tracking control, according to some embodiments;



FIG. 6 is a flowchart of an example method for facilitating eye tracking control, according to some embodiments; and



FIG. 7 is a block diagram of a machine in the example form of a computer system within which a set of instructions, for causing the machine to perform any one or more of the methodologies discussed herein, may be executed, according to some embodiments.





DETAILED DESCRIPTION

Example systems and methods to facilitate eye tracking control on mobile devices are described. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of example embodiments. It will be evident, however, to one skilled in the art that the present technology may be practiced without these specific details.


A user of a mobile device may interact with and control objects and applications displayed on the mobile device through the user's eye movement. An image of the user's eyes and/or face, captured by a front-facing camera on or coupled to the mobile device, may be analyzed using computer-vision algorithms, such as, for example, eye tracking algorithms and gaze detection algorithms. For example, the captured images may be processed to extract information relating to one or more features of the user's eyes and/or face. The mobile device may then use the extracted information to determine the location of the user's eyes and estimate the location on the display at which the user is looking. For example, the mobile device may be able to estimate at which icon on the display the user is looking. The estimation of where the user is looking may be used to direct one or more objects, applications, and the like to perform a particular operation. For example, the user may direct and control the movement of an object on the screen depending on where the user is looking on the display of the mobile device, including controlling scrolling functions, the movement of objects in a virtual game, and the like.
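
As a rough illustration of this loop, the sketch below wires together the capture, feature-extraction, gaze-estimation, and application-control steps; the function and class names and placeholder bodies are assumptions for illustration, not the actual implementation.

```python
# Minimal sketch of the loop described above: capture a frame, extract eye
# features, estimate the point of regard, and pass it to an application.
# The helper names and placeholder bodies are illustrative assumptions only.
from dataclasses import dataclass
from typing import List, Optional, Tuple


@dataclass
class EyeFeatures:
    pupil_center: Tuple[float, float]                # image coordinates (pixels)
    corneal_reflections: List[Tuple[float, float]]   # glints produced by the LEDs


def detect_eye_features(frame) -> Optional[EyeFeatures]:
    """Placeholder for the computer-vision step (eye tracking algorithms)."""
    ...


def estimate_point_of_regard(features: EyeFeatures) -> Tuple[float, float]:
    """Placeholder for the gaze-detection step that maps eye features to a
    location on the display."""
    ...


def on_new_frame(frame, handle_gaze) -> None:
    features = detect_eye_features(frame)
    if features is None:
        return                                  # no eyes found in this frame
    x, y = estimate_point_of_regard(features)   # display coordinates
    handle_gaze(x, y)                           # e.g., scroll or move an object
```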



FIG. 1 is a device diagram 100 of an example mobile device 102 coupled to a docking device 104 capable of facilitating eye tracking control. The mobile device 102 may be any type of mobile device, including, but not limited to, a smart phone, a personal digital assistant (PDA), a mobile phone, a computing tablet, an electronic reader, and the like. During eye tracking control, the mobile device 102 may be used by the user by holding the mobile device 102 with one hand, both hands, or while the mobile device 102 is on a stand.


A docking device 104 may be coupled to the mobile device 102 in any manner, such as through a USB port or a micro USB port on the mobile device 102, and the like. While the docking device 104 of FIG. 1 is depicted at the bottom of the mobile device 102, one of ordinary skill in the art will appreciate that the docking device 104 may be located at any suitable location relative to the mobile device 102. The docking device 104 may include a camera module 108 and one or more light-emitting diodes (LEDs) 106. For explanatory purposes, LEDs 106 are depicted and described throughout the disclosure. However, one of ordinary skill in the art will appreciate that any appropriate light-emitting source may be used (e.g., an infrared laser).


The docking device 104 may include any number of infrared LEDs 106 that may be placed in a suitable location in any manner within the docking device 104 (e.g., tilted at an angle such that they point toward the user's face). In some embodiments, the docking device 104 may have either three or six LEDs 106 (e.g., for batteries with voltage=5V). However, any number of LEDs in any arrangement may be used, with or without voltage divider circuitry. In a specific embodiment, the one or more LEDs 106 may have any one or more of the following features or characteristics: a rating of 1 Watt or 3 Watts emitting at 850 nm, a field of emission of approximately 30-40 degrees, placement in pairs (e.g., to bring the voltage level to an appropriate value (e.g., from 3.7V to 1.85V) based on the current the LEDs need), blinking at the same frequency at which the camera records, and the like.
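
The "placed in pairs" note can be illustrated with a small worked calculation: two well-matched LEDs wired in series split the supply voltage between them, which is how a 3.7V rail maps to roughly 1.85V per LED in the example above. The snippet below is only that arithmetic, under the stated assumptions (identical LEDs, no series resistor).

```python
# Worked example of the "placed in pairs" arrangement mentioned above:
# two identical LEDs wired in series split the supply voltage between them,
# so a 3.7 V rail yields about 1.85 V per LED. Purely illustrative arithmetic;
# a real design would also size the drive current and any series resistance.
supply_voltage_v = 3.7     # e.g., a single Li-ion cell
leds_in_series = 2         # LEDs "placed in pairs"

voltage_per_led_v = supply_voltage_v / leds_in_series
print(f"Approximate voltage across each LED: {voltage_per_led_v:.2f} V")  # 1.85 V
```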


In some embodiments, the docking device 104 may also include a suitable type of infrared pass filter (e.g., active, mechanical, high-pass, band-pass, etc.). In some embodiments, a high-pass filter that blocks light below 800 nm and allows light above 800 nm is used. In some embodiments, the infrared pass filter may allow only light between 800 nm and 900 nm to enter the one or more cameras of the camera module 108.


The camera module 108 may include one or more front-facing cameras placed in any suitable location in any manner within the docking device 104 (e.g., tilted at an angle such that they point toward the user's face) and may be used to capture images of the user's eyes and/or face. The one or more cameras may be placed at an appropriate distance from the LEDs to optimize capture of the infrared light. In some embodiments, a camera on the mobile device 102 is used in combination with the camera module 108 in stereo mode. The camera module 108 may include any one or more of the following: a black and white (e.g., monochrome) or color (e.g., RGB) CMOS sensor running at an appropriate frame rate (e.g., minimum high-definition at 30 frames per second), a lens without an infrared block filter and with an appropriate field of view (e.g., approximately 35 degrees) and depth of field (e.g., approximately 40-80 cm), and the like. The one or more cameras in the camera module 108 may be positioned such that they are tilted upward (e.g., toward a user's face).


The images captured by the camera may need to be rotated. The eye tracking software can use sensors on the mobile device 102 (e.g., accelerometer, magnetometer, etc.) to detect the orientation of the mobile device 102 and rotate the image accordingly so that it can be properly processed.
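
A sketch of that orientation compensation is shown below, assuming the sensors report one of four coarse orientations (0, 90, 180, or 270 degrees); the mapping and the helper name are illustrative, not the actual eye tracking software.

```python
# Sketch of orientation compensation: rotate the captured frame according to
# the device orientation reported by the accelerometer/magnetometer so the
# face is upright for processing. The orientation-to-rotation mapping is an
# illustrative assumption.
import numpy as np

# Quarter turns needed to bring the image upright for each coarse device
# orientation (degrees).
_QUARTER_TURNS = {0: 0, 90: 1, 180: 2, 270: 3}


def upright_frame(frame: np.ndarray, device_orientation_deg: int) -> np.ndarray:
    """Rotate a frame so it can be processed regardless of how the device is held."""
    k = _QUARTER_TURNS[device_orientation_deg % 360]
    return np.rot90(frame, k=k)


# Example: a 480x640 frame captured while the device is held at 90 degrees.
frame = np.zeros((480, 640, 3), dtype=np.uint8)
print(upright_frame(frame, 90).shape)   # (640, 480, 3)
```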


The LEDs 106 emit light that is focused and centered toward the eyes of the user. The infrared light from the LEDs 106 is reflected in the pupil and on the cornea of the user and recorded by the cameras in the camera module 108. The LEDs 106 may be synchronized with the one or more cameras so that the LEDs 106 are on only when the one or more cameras are grabbing an image. In some embodiments, to improve the image quality, the visible light below 800 nm is filtered out using an infrared pass filter. The field of view and depth of field of the lenses of the one or more cameras in the camera module 108 may allow the user to move around, thereby accommodating head pose variance of the user. The eye tracking control software may analyze the images taken by the camera module 108 to provide x,y coordinates of where the user is looking on the display of the mobile device 102. The x,y coordinates may be used for any number of applications (e.g., scrolling, moving objects, selecting icons, playing games, etc.).
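
As one illustration of how the x,y coordinates might drive a scrolling function, the sketch below scrolls when the gaze dwells near the top or bottom edge of the display; the edge threshold and scroll callback are assumptions, not part of the disclosure.

```python
# Illustrative use of the x,y point-of-regard coordinates for scrolling: gazing
# near the top or bottom band of the display scrolls the content. The edge
# fraction and the scroll callback are assumptions, not part of the disclosure.
def scroll_from_gaze(gaze_y: float, display_height: int, scroll, edge_fraction: float = 0.15) -> None:
    """Scroll up or down when the gaze dwells near the top or bottom edge."""
    if gaze_y < display_height * edge_fraction:
        scroll(-1)      # gaze near the top: scroll content up
    elif gaze_y > display_height * (1 - edge_fraction):
        scroll(+1)      # gaze near the bottom: scroll content down


# Example with a stand-in scroll handler on a 1920-pixel-tall display.
scroll_from_gaze(gaze_y=1850, display_height=1920, scroll=lambda step: print("scroll", step))
```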


The LEDs 106 and the camera module 108 may be turned on and/or off in any manner, such as by utilizing an external slider, a dedicated on-off button on the side or on the back of either the mobile device 102 or the docking device 104, an application or a digital button on the screen, movement or shaking of the mobile device 102 and/or the docking device 104, voice commands, on-screen capacitive buttons, touch pad(s), bio-signals (e.g., EMG, EEG, etc.), and the like. As such, in some embodiments, the eye tracking components may consume power only while the LEDs and the camera are turned on (e.g., when the user is using the eye tracking features).


In some embodiments, the eye tracking features are optimized when the camera is located at the bottom of the mobile device 102 (e.g., with respect to the perspective of the user). The user may rotate the mobile device 102 coupled to the docking device 104 to properly orient the camera module 108 such that it is located at the bottom of the mobile device 102. In some embodiments, using the accelerometer and/or magnetometer of the mobile device 102, the LEDs, the pass filter, and/or the camera may be turned on and/or off depending on the orientation of the mobile device 102 and the docking device 104 (e.g., turn off the LEDs and the camera when the mobile device 102 and the docking device 104 are rotated such that the camera module 108 is located at the top of the mobile device 102 with respect to the perspective of the user).


The LEDs and the camera may be turned off when the user's face is not recognized for a predetermined amount of time (e.g., 5-10 seconds) and may turn on again when the user's face is detected and recognized.
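
A minimal sketch of this power-saving behavior is given below, assuming a callback that can switch the LEDs and camera on or off and a 5-second default timeout; the class and parameter names are illustrative.

```python
# Sketch of the timeout behavior: switch the LEDs and camera off once no face
# has been recognized for a predetermined interval, and back on when a face is
# detected again. The callback and the 5-second default are illustrative.
import time


class IlluminationPowerManager:
    def __init__(self, set_hardware_enabled, timeout_s: float = 5.0) -> None:
        self._set_hardware_enabled = set_hardware_enabled   # turns LEDs + camera on/off
        self._timeout_s = timeout_s
        self._last_face_seen = time.monotonic()
        self._enabled = True

    def update(self, face_detected: bool) -> None:
        now = time.monotonic()
        if face_detected:
            self._last_face_seen = now
            if not self._enabled:
                self._enabled = True
                self._set_hardware_enabled(True)    # user is back: resume tracking
        elif self._enabled and now - self._last_face_seen > self._timeout_s:
            self._enabled = False
            self._set_hardware_enabled(False)       # no face for a while: save power
```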



FIG. 2 is a device diagram 200 of another example of a mobile device 202 coupled to a docking device 204 capable of facilitating eye tracking control. The example shown in FIG. 2 may operate similarly to the example shown in FIG. 1 and may incorporate any one or combination of features described for FIG. 1. However, FIG. 2 shows that the docking device 204 may be integrated with LEDs 206, and the camera module 208 of the mobile device 202 may be used (instead of the camera module being integrated with the docking device 204). In some embodiments that couple the mobile device 202 with the docking device 204 using a USB port, a micro-USB port, or a proprietary port, the configuration depicted in FIG. 2 may allow for faster transfer of images, since the camera of the mobile device 202 is used to capture the images. The front-facing camera may be utilized for eye tracking control while one or more back-facing cameras are used simultaneously.



FIGS. 3A-3C are device diagrams of example mobile devices capable of facilitating eye tracking control. The examples shown in FIGS. 3A-3C may operate similarly to the example shown in FIG. 1 and may incorporate any one or combination of features described for FIG. 1. However, the LEDs and camera modules are integrated into the mobile device (instead of being part of a docking device). FIGS. 3A-3C depict mobile devices 300, 310, 320, respectively, with LEDs 302, 312, 322 and camera modules 304, 314, 324 integrated into the mobile devices 300, 310, and 320 in different example configurations (with respect to the user's perspective).


The LEDs and the camera modules may be located in any one of a number of configurations on the mobile devices. FIG. 3A shows the LEDs 302 and the camera module 304 located at the bottom of the mobile device 300. FIG. 3B shows the LEDs 312 located on one side of the mobile device 310 while the camera module 314 is located on the opposite side of the mobile device 310. FIG. 3C shows the LEDs 322 and the camera module 324 located on the same side of the mobile device 320.



FIG. 4 is a block diagram of an example software architecture for facilitating eye tracking control. Any one or more of the components of the software architecture may run on either a central processing unit (CPU) of the mobile device or a combination of a CPU and a graphics processing unit (GPU) of the mobile device. In some embodiments, any one or more of the components of the software architecture may run on a dedicated chip. The software may run as a background process (e.g., as part of the operating system (OS), in a web browser, etc.) and may provide an application programming interface (API) that other applications can access. The API may fire an event or use some other similar mechanism to send information about where the user is looking on the screen to other applications.
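
The event-firing API might look something like the sketch below, in which applications register callbacks and the background eye tracking process pushes each new gaze estimate to them; the class and method names are assumptions, not the actual interface.

```python
# Sketch of an event-based API: the background eye tracking process pushes
# each new gaze estimate to every application callback registered through the
# API. Class and method names are assumptions, not the actual interface.
from typing import Callable, List

GazeListener = Callable[[float, float], None]   # receives display x, y


class EyeTrackingAPI:
    def __init__(self) -> None:
        self._listeners: List[GazeListener] = []

    def add_gaze_listener(self, listener: GazeListener) -> None:
        """Applications register here to be told where the user is looking."""
        self._listeners.append(listener)

    def fire_gaze_event(self, x: float, y: float) -> None:
        """Called by the eye tracking layer for each new point of regard."""
        for listener in self._listeners:
            listener(x, y)


# Example: an application subscribing to gaze updates.
api = EyeTrackingAPI()
api.add_gaze_listener(lambda x, y: print(f"gaze at ({x:.0f}, {y:.0f})"))
api.fire_gaze_event(540.0, 960.0)
```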


The software architecture may be divided into different layers. The bottom layer would correspond to the hardware (e.g. the camera(s), the infrared illumination, etc.). A camera layer may be in charge of communicating with the camera(s) in order to perform camera operations such as, for example, starting the camera, grabbing images, controlling the camera properties, and the like. This layer may also synchronize the one or more cameras and the infrared emitters so that the lights are turned on when there is an image being captured and off the rest of the time (e.g. strobing).
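
The strobing described here could be structured roughly as in the sketch below, which keeps the infrared LEDs lit only while a frame is being grabbed; the led_on, led_off, and grab_frame callables are stand-ins for the real hardware interfaces.

```python
# Sketch of the strobing behavior: the LEDs are driven only while a frame is
# being grabbed. The led_on/led_off/grab_frame callables are stand-ins for the
# real hardware interfaces.
from contextlib import contextmanager


@contextmanager
def illuminated(led_on, led_off):
    """Keep the infrared LEDs lit only for the duration of one capture."""
    led_on()
    try:
        yield
    finally:
        led_off()


def grab_synchronized_frame(grab_frame, led_on, led_off):
    with illuminated(led_on, led_off):
        return grab_frame()   # LEDs are off again once the frame has been read
```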


The camera layer may deliver images to the eye tracking layer. In the eye tracking layer, images may be processed to find features such as the face location, eye region location, pupil center, pupil size, location of the corneal reflections, eye corners, iris center, iris size, and the like. These features are used in the gaze estimation stage, which may be in charge of calculating the point of regard of the user, that is, the location on the display where the user is looking. The gaze estimation stage may also calculate the optical and visual axes of the user's eyes.
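
One plausible shape for the data handed from the eye tracking layer to the gaze estimation stage is sketched below; the field names simply mirror the features listed above and are assumptions, not taken from the actual implementation.

```python
# Sketch of a per-frame feature record the eye tracking layer might hand to
# the gaze estimation stage; field names mirror the features listed above and
# are illustrative only.
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

Point = Tuple[float, float]     # image coordinates, in pixels


@dataclass
class FrameFeatures:
    face_location: Optional[Point] = None
    eye_region: Optional[Tuple[Point, Point]] = None    # bounding-box corners
    pupil_center: Optional[Point] = None
    pupil_size: Optional[float] = None
    corneal_reflections: List[Point] = field(default_factory=list)
    eye_corners: List[Point] = field(default_factory=list)
    iris_center: Optional[Point] = None
    iris_size: Optional[float] = None


@dataclass
class GazeEstimate:
    point_of_regard: Point                                # location on the display
    optical_axis: Optional[Tuple[float, float, float]] = None
    visual_axis: Optional[Tuple[float, float, float]] = None
```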


The API layer may be used for communication between the eye tracking layer and applications that use eye gaze information (e.g., OS API, games that employ eye gaze information, etc.). The API may send data calculated by the eye tracking layer, such as coordinates of the point of regard, three-dimensional (3-D) location of the user's eyes, pupil size, and the like. The API may also accept commands from an application to the eye tracking layer (e.g., to start and/or stop the eye tracking engine, query for specific information, etc.). An application may connect to the eye tracker's API and use eye gaze information for any suitable purpose (e.g., control an app or a game, record eye data for visual behavior studies, etc.).
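
The command side of such an API might resemble the sketch below, in which an application starts the engine, queries it for specific information, and stops it again; all names are illustrative assumptions.

```python
# Sketch of the command side of the API: an application can start or stop the
# eye tracking engine and query it for specific information. All names are
# illustrative assumptions.
from typing import Optional, Tuple


class EyeTrackingEngine:
    def __init__(self) -> None:
        self.running = False
        self._latest_point_of_regard: Optional[Tuple[float, float]] = None

    # Commands from an application to the eye tracking layer.
    def start(self) -> None:
        self.running = True

    def stop(self) -> None:
        self.running = False

    # Query for specific information.
    def point_of_regard(self) -> Optional[Tuple[float, float]]:
        return self._latest_point_of_regard


# Example: an application starts the engine, polls it, and stops it again.
engine = EyeTrackingEngine()
engine.start()
print(engine.running, engine.point_of_regard())   # True None (no frame processed yet)
engine.stop()
```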



FIG. 5 is a block diagram of an example flow of data used to facilitate eye tracking control. The one or more cameras and the infrared LED illumination may capture an image of the user and use the captured data to detect eye features (e.g., location of eye(s), pupils, corneal reflections, etc.). Using the detected eye features, the gaze estimation module may estimate the user's point of regard, which may then be used to control aspects of an application.


A calibration process may be conducted the first time the user uses the eye tracking functionality in order to calculate personal parameters (e.g., vertical and horizontal offset between optical and visual axes). These personal parameters and the information of the face and eyes are then employed to estimate where the user is looking on the screen through a gaze estimation algorithm.
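
A heavily simplified sketch of this idea appears below: the user fixates a few known targets, the average horizontal and vertical error of the uncalibrated estimates is stored as the personal offset, and that offset is applied to later estimates. This is an illustrative stand-in, not the disclosed gaze estimation algorithm.

```python
# Simplified calibration sketch: estimate a constant (dx, dy) offset from a
# few known on-screen targets and apply it to subsequent gaze estimates.
# Illustrative only; the real personal parameters relate to the offset between
# the optical and visual axes.
from statistics import mean
from typing import List, Tuple

Point = Tuple[float, float]


def calibrate(targets: List[Point], raw_estimates: List[Point]) -> Point:
    """Return the (dx, dy) offset that best aligns raw estimates with targets."""
    dx = mean(t[0] - e[0] for t, e in zip(targets, raw_estimates))
    dy = mean(t[1] - e[1] for t, e in zip(targets, raw_estimates))
    return dx, dy


def apply_calibration(raw_estimate: Point, offset: Point) -> Point:
    return raw_estimate[0] + offset[0], raw_estimate[1] + offset[1]


# Example with three calibration points on a 1080x1920 display.
targets = [(100.0, 100.0), (540.0, 960.0), (980.0, 1820.0)]
raw = [(130.0, 80.0), (575.0, 945.0), (1005.0, 1795.0)]
offset = calibrate(targets, raw)
print(apply_calibration((575.0, 945.0), offset))   # roughly back on the middle target
```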



FIG. 6 is a flowchart of an example method 600 for facilitating eye tracking control. The method 600 may be performed using the mobile device, cameras, and LEDs in any configuration.


In operation 602, an image of a portion of a user is received at the mobile device. The image includes reflections (e.g., corneal reflections) caused by light emitted on the user.


In operation 604, eye features (e.g., pupil location, size, corneal reflection location, eye corners, iris location, etc.) of the user are detected using the reflections received.


In operation 606, a point of regard is determined for the user using the eye features detected. Optical and/or visual axes may also be determined. The determination of the point of regard may account for the location of the one or more cameras and the LEDs with respect to the screen.


In operation 608, the point of regard information is sent to an application capable of using the point of regard information in a subsequent operation.
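
Under stated assumptions, operations 602-608 could be strung together as in the sketch below; the detector and estimator are placeholders, and the adjustment for the camera and LED locations (operation 606) is modelled as a fixed pixel offset between the camera origin and the display origin.

```python
# Sketch of method 600 (operations 602-608) under stated assumptions: the
# feature detector and gaze estimator are placeholders, and the adjustment for
# the camera/LED location relative to the screen (operation 606) is modelled
# as a fixed pixel offset between the camera origin and the display origin.
from typing import Callable, Optional, Tuple

Point = Tuple[float, float]


def method_600(
    image,                                                   # operation 602: received image
    detect_features: Callable[[object], Optional[object]],   # operation 604
    estimate_gaze: Callable[[object], Point],                 # operation 606 (camera frame)
    camera_offset_px: Point,                                  # camera origin vs. display origin
    send_to_application: Callable[[Point], None],             # operation 608
) -> None:
    features = detect_features(image)
    if features is None:
        return
    gaze_x, gaze_y = estimate_gaze(features)
    point_of_regard = (gaze_x + camera_offset_px[0], gaze_y + camera_offset_px[1])
    send_to_application(point_of_regard)
```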


Certain embodiments are described herein as including logic or a number of components, modules, or mechanisms. Modules may constitute either software modules (e.g., code embodied on a machine-readable medium or in a transmission signal) or hardware modules. A hardware module is a tangible unit capable of performing certain operations and may be configured or arranged in a certain manner. In example embodiments, one or more computer systems (e.g., a standalone, client or server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.


In various embodiments, a hardware module may be implemented mechanically or electronically. For example, a hardware module may comprise dedicated circuitry or logic that is permanently configured (e.g., as a special-purpose processor, such as a field programmable gate array (FPGA) or an application-specific integrated circuit (ASIC)) to perform certain operations. A hardware module may also comprise programmable logic or circuitry (e.g., as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software to perform certain operations. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.


Accordingly, the term “hardware module” should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired) or temporarily configured (e.g., programmed) to operate in a certain manner and/or to perform certain operations described herein. Considering embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For example, where the hardware modules comprise a general-purpose processor configured using software, the general-purpose processor may be configured as respective different hardware modules at different times. Software may accordingly configure a processor, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time.


Hardware modules can provide information to, and receive information from, other hardware modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple of such hardware modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) that connect the hardware modules. In embodiments in which multiple hardware modules are configured or instantiated at different times, communications between such hardware modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware modules have access. For example, one hardware module may perform an operation, and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware module may then, at a later time, access the memory device to retrieve and process the stored output. Hardware modules may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information).


The various operations of example methods described herein may be performed, at least partially, by one or more processors that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions. The modules referred to herein may, in some example embodiments, comprise processor-implemented modules.


Similarly, the methods described herein may be at least partially processor-implemented. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented modules. The performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the processor or processors may be located in a single location (e.g., within a home environment, an office environment or as a server farm), while in other embodiments the processors may be distributed across a number of locations.


The one or more processors may also operate to support performance of the relevant operations in a “cloud computing” environment or as a “software as a service” (SaaS). For example, at least some of the operations may be performed by a group of computers (as examples of machines including processors), these operations being accessible via a network (e.g., the Internet) and via one or more appropriate interfaces (e.g., Application Program Interfaces (APIs)).


Example embodiments may be implemented in digital electronic circuitry, or in computer hardware, firmware, software, or in combinations of them. Example embodiments may be implemented using a computer program product, e.g., a computer program tangibly embodied in an information carrier, e.g., in a machine-readable medium for execution by, or to control the operation of, data processing apparatus, e.g., a programmable processor, a computer, or multiple computers.


A computer program can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, subroutine, or other unit suitable for use in a computing environment. A computer program can be deployed to be executed on one computer or on multiple computers at one site or distributed across multiple sites and interconnected by a communication network.


In example embodiments, operations may be performed by one or more programmable processors executing a computer program to perform functions by operating on input data and generating output. Method operations can also be performed by, and apparatus of example embodiments may be implemented as, special purpose logic circuitry (e.g., a FPGA or an ASIC).


The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. In embodiments deploying a programmable computing system, it will be appreciated that both hardware and software architectures require consideration. Specifically, it will be appreciated that the choice of whether to implement certain functionality in permanently configured hardware (e.g., an ASIC), in temporarily configured hardware (e.g., a combination of software and a programmable processor), or a combination of permanently and temporarily configured hardware may be a design choice. Below are set out hardware (e.g., machine) and software architectures that may be deployed, in various example embodiments.



FIG. 7 is a block diagram of a machine in the example form of a computer system 700 within which instructions, for causing the machine to perform any one or more of the methodologies discussed herein, may be executed. In alternative embodiments, the machine operates as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine may operate in the capacity of a server or a client machine in server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. The machine may be a personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.


Example computer system 700 includes a processor 702 (e.g., a central processing unit (CPU), a graphics processing unit (GPU) or both), a main memory 704, and a static memory 706, which communicate with each other via a bus 708. Computer system 700 may further include a video display device 710 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)). Computer system 700 also includes an alphanumeric input device 712 (e.g., a keyboard), a user interface (UI) navigation device 714 (e.g., a mouse or touch sensitive display), a disk drive unit 716, a signal generation device 718 (e.g., a speaker) and a network interface device 720.


Disk drive unit 716 includes a machine-readable medium 722 on which is stored one or more sets of instructions and data structures (e.g., software) 724 embodying or utilized by any one or more of the methodologies or functions described herein. Instructions 724 may also reside, completely or at least partially, within main memory 704, within static memory 706, and/or within processor 702 during execution thereof by computer system 700, main memory 704 and processor 702 also constituting machine-readable media.


While machine-readable medium 722 is shown in an example embodiment to be a single medium, the term “machine-readable medium” may include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more instructions or data structures. The term “machine-readable medium” shall also be taken to include any tangible medium that is capable of storing, encoding or carrying instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present technology, or that is capable of storing, encoding or carrying data structures utilized by or associated with such instructions. The term “machine-readable medium” shall accordingly be taken to include, but not be limited to, solid-state memories, and optical and magnetic media. Specific examples of machine-readable media include non-volatile memory, including by way of example semiconductor memory devices, e.g., Erasable Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.


Instructions 724 may further be transmitted or received over a communications network 726 using a transmission medium. Instructions 724 may be transmitted using network interface device 720 and any one of a number of well-known transfer protocols (e.g., HTTP). Examples of communication networks include a local area network (“LAN”), a wide area network (“WAN”), the Internet, mobile telephone networks, Plain Old Telephone (POTS) networks, and wireless data networks (e.g., WiFi and WiMAX networks). The term “transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding or carrying instructions for execution by the machine, and includes digital or analog communications signals or other intangible media to facilitate communication of such software.


Although an embodiment has been described with reference to specific example embodiments, it will be evident that various modifications and changes may be made to these embodiments without departing from the broader spirit and scope of the technology. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense. The accompanying drawings that form a part hereof, show by way of illustration, and not of limitation, specific embodiments in which the subject matter may be practiced. The embodiments illustrated are described in sufficient detail to enable those skilled in the art to practice the teachings disclosed herein. Other embodiments may be utilized and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. This Detailed Description, therefore, is not to be taken in a limiting sense, and the scope of various embodiments is defined only by the appended claims, along with the full range of equivalents to which such claims are entitled.


Such embodiments of the inventive subject matter may be referred to herein, individually and/or collectively, by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any single invention or inventive concept if more than one is in fact disclosed. Thus, although specific embodiments have been illustrated and described herein, it should be appreciated that any arrangement calculated to achieve the same purpose may be substituted for the specific embodiments shown. This disclosure is intended to cover any and all adaptations or variations of various embodiments. Combinations of the above embodiments, and other embodiments not specifically described herein, will be apparent to those of skill in the art upon reviewing the above description.

Claims
  • 1. A method comprising: controlling, at an eye tracking device, one or more light sources located within the eye tracking device to synchronize emission of light from the one or more light sources with a timing of image capture by a camera, wherein the controlling further includes filtering the light from the light sources using an infrared (IR) pass filter that allows light within an IR passband between 800 nm and 900 nm to enter the camera and prevents light outside of the IR passband from entering the camera, and wherein the camera has a field of view that overlaps with a field of emission of the one or more light sources; receiving, at the eye tracking device, an image of a portion of a user captured by the camera, the image including reflections caused by light emitted on the user from the one or more light sources located within the eye tracking device; detecting one or more eye features associated with an eye of the user using the reflections; determining point of regard information using the one or more eye features, the point of regard information indicating a location on a display of a computing device coupled to the eye tracking device at which the user was looking when the image of the portion of the user was taken; and sending the point of regard information to an application capable of performing a subsequent operation using the point of regard information, wherein at least one of the camera and the one or more light sources are turned off when the user is not detected for a predetermined amount of time and are turned on when the user is detected again.
  • 2. The method of claim 1, wherein the camera is located within the eye tracking device.
  • 3. The method of claim 1, wherein the camera has a depth of field ranging from 40 cm to 80 cm.
  • 4. The method of claim 1, wherein the subsequent operation includes moving an object displayed on the display based on the point of regard information.
  • 5. The method of claim 1, wherein the field of emission is approximately 30-40 degrees.
  • 6. A non-transitory machine-readable storage medium storing instructions which, when executed by one or more processors, cause the one or more processors to perform operations comprising: controlling, at an eye tracking device, one or more light sources located within the eye tracking device to synchronize emission of light from the one or more light sources with a timing of image capture by a camera, wherein the controlling further includes filtering the light from the light sources using an infrared (IR) pass filter that allows light within an IR passband between 800 nm and 900 nm to enter the camera and prevents light outside of the IR passband from entering the camera, and wherein the camera has a field of view that overlaps with a field of emission of the one or more light sources; receiving, at the eye tracking device, an image of a portion of a user captured by the camera, the image including reflections caused by light emitted on the user from the one or more light sources located within the eye tracking device; detecting one or more eye features associated with an eye of the user using the reflections; determining point of regard information using the one or more eye features, the point of regard information indicating a location on a display of a computing device coupled to the eye tracking device at which the user was looking when the image of the portion of the user was taken; and sending the point of regard information to an application capable of performing a subsequent operation using the point of regard information, wherein at least one of the camera and the one or more light sources are turned off when the user is not detected for a predetermined amount of time and are turned on when the user is detected again.
  • 7. The non-transitory machine-readable storage medium of claim 6, wherein the camera is located within the eye tracking device.
  • 8. The non-transitory machine-readable storage medium of claim 6, wherein the camera has a depth of field ranging from 40 cm to 80 cm.
  • 9. The non-transitory machine-readable storage medium of claim 6, wherein the subsequent operation includes moving an object displayed on the display based on the point of regard information.
  • 10. The non-transitory machine-readable storage medium of claim 6, wherein the field of emission is approximately 30-40 degrees.
  • 11. An eye tracking device comprising: one or more light sources configured to emit light, wherein the one or more light sources further comprise an infrared (IR) pass filter that allows light within an IR passband between 800 nm and 900 nm to enter the camera and prevents light outside of the IR passband from entering the camera; and one or more hardware processors in communication with the one or more light sources, the one or more hardware processors configured to: control the one or more light sources to synchronize emission of light from the one or more light sources with a timing of image capture by a camera, wherein the camera has a field of view that overlaps with a field of emission of the one or more light sources, receive an image of a portion of a user captured by the camera, the image including reflections caused by light emitted on the user from the one or more light sources, detect one or more eye features associated with an eye of the user including detecting the one or more eye features using the reflections, determine point of regard information using the one or more eye features, the point of regard information indicating a location on a display of a computing device coupled to the eye tracking device at which the user was looking when the image of the portion of the user was taken, and send the point of regard information to an application capable of performing a subsequent operation using the point of regard information, wherein at least one of the camera and the one or more light sources are turned off when the user is not detected for a predetermined amount of time and are turned on when the user is detected again.
  • 12. The eye tracking device of claim 11, wherein the camera is located within the eye tracking device.
  • 13. The eye tracking device of claim 11, wherein the predetermined amount of time is five seconds or greater.
  • 14. The eye tracking device of claim 11, wherein the subsequent operation includes moving an object displayed on the display based on the point of regard information.
  • 15. The eye tracking device of claim 11, wherein the predetermined amount of time is five seconds or greater.
  • 16. The eye tracking device of claim 11, wherein the field of emission is approximately 30-40 degrees.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 14/089,609, filed Nov. 25, 2013, which claims the benefit of Provisional Application No. 61/730,407, filed Nov. 27, 2012, both of which are incorporated herein by reference in their entirety.

US Referenced Citations (110)
Number Name Date Kind
4772885 Uehara et al. Sep 1988 A
4973149 Hutchinson Nov 1990 A
5218387 Ueno et al. Jun 1993 A
5293427 Ueno et al. Mar 1994 A
5360971 Kaufman et al. Nov 1994 A
5481622 Gerhardt et al. Jan 1996 A
5638176 Hobbs et al. Jun 1997 A
5649061 Smyth Jul 1997 A
5748371 Cathey et al. May 1998 A
5835083 Nielsen et al. Nov 1998 A
5963300 Horwitz Oct 1999 A
6082858 Grace et al. Jul 2000 A
6152371 Schwartz et al. Nov 2000 A
6152563 Hutchinson et al. Nov 2000 A
6162186 Scinto et al. Dec 2000 A
6204828 Amir et al. Mar 2001 B1
6320610 Van Sant et al. Nov 2001 B1
6323884 Bird et al. Nov 2001 B1
6351273 Lemelson et al. Feb 2002 B1
6377284 Lentz et al. Apr 2002 B1
6421064 Lemelson et al. Jul 2002 B1
6478425 Trajkovic et al. Nov 2002 B2
6526159 Nickerson Feb 2003 B1
6577329 Flickner et al. Jun 2003 B1
6643721 Sun Nov 2003 B1
6704034 Rodriguez et al. Mar 2004 B1
7013258 Su et al. Mar 2006 B1
7202793 Grace et al. Apr 2007 B2
7487461 Zhai et al. Feb 2009 B2
7513620 Dai et al. Apr 2009 B2
7572008 Elvesjo et al. Aug 2009 B2
7657062 Pilu Feb 2010 B2
7808587 Shirasaka et al. Oct 2010 B2
7963652 Vertegaal et al. Jun 2011 B2
8066375 Skogö et al. Nov 2011 B2
RE42998 Teiwes Dec 2011 E
8120577 Bouvin et al. Feb 2012 B2
8185845 Bjorklund et al. May 2012 B2
8220926 Blixt et al. Jul 2012 B2
8235529 Raffle et al. Aug 2012 B1
8314707 Kobetski et al. Nov 2012 B2
8562136 Blixt et al. Oct 2013 B2
20020036617 Pryor Mar 2002 A1
20020070965 Austin Jun 2002 A1
20020070966 Austin Jun 2002 A1
20020070968 Austin et al. Jun 2002 A1
20020075384 Harman Jun 2002 A1
20020105482 Lemelson et al. Aug 2002 A1
20020180801 Doyle et al. Dec 2002 A1
20030038754 Goldstein et al. Feb 2003 A1
20030123027 Amir et al. Jul 2003 A1
20030179314 Nozaki Sep 2003 A1
20040001100 Wajda Jan 2004 A1
20040005083 Fujimura Jan 2004 A1
20040061831 Aughey et al. Apr 2004 A1
20040073827 Tsirkel et al. Apr 2004 A1
20040213463 Morrison Oct 2004 A1
20050047629 Farrell et al. Mar 2005 A1
20050050686 Kurokawa Mar 2005 A1
20050100191 Harbach et al. May 2005 A1
20050110887 Shin et al. May 2005 A1
20050119642 Grecu et al. Jun 2005 A1
20050199783 Wenstrand et al. Sep 2005 A1
20050243054 Beymer et al. Nov 2005 A1
20050289363 Tsirkel et al. Dec 2005 A1
20060047386 Kanevsky et al. Mar 2006 A1
20060093998 Vertegaal May 2006 A1
20060120599 Steinberg et al. Jun 2006 A1
20060238707 Elvesjo et al. Oct 2006 A1
20060239670 Cleveland Oct 2006 A1
20070078552 Rosenberg Apr 2007 A1
20070122140 Ito et al. May 2007 A1
20070159599 Yamada Jul 2007 A1
20070164990 Bjorklund et al. Jul 2007 A1
20070230933 Sugimoto et al. Oct 2007 A1
20070282506 Breed et al. Dec 2007 A1
20080079687 Cernasov Apr 2008 A1
20080122792 Izadi et al. May 2008 A1
20080284980 Skogo et al. Nov 2008 A1
20080297614 Lieberman et al. Dec 2008 A1
20090027336 Lin et al. Jan 2009 A1
20090067680 Dowski et al. Mar 2009 A1
20090073128 Marsden Mar 2009 A1
20090073142 Yamashita et al. Mar 2009 A1
20090103048 Tsukiji Apr 2009 A1
20090125849 Bouvin et al. May 2009 A1
20090179853 Beale Jul 2009 A1
20090251407 Flake et al. Oct 2009 A1
20090263012 Georgis et al. Oct 2009 A1
20090273562 Baliga et al. Nov 2009 A1
20100066975 Rehnstrom Mar 2010 A1
20100077421 Cohen et al. Mar 2010 A1
20100079508 Hodge et al. Apr 2010 A1
20100195881 Orderud et al. Aug 2010 A1
20100205667 Anderson et al. Aug 2010 A1
20100262281 Suzuki et al. Oct 2010 A1
20100272363 Steinberg et al. Oct 2010 A1
20110013007 Holmberg et al. Jan 2011 A1
20110029918 Yoo et al. Feb 2011 A1
20110051239 Daiku Mar 2011 A1
20110170061 Gordon Jul 2011 A1
20110304706 Border et al. Dec 2011 A1
20120105486 Lankford May 2012 A1
20120105490 Pasquero et al. May 2012 A1
20120272179 Stafford Oct 2012 A1
20120319972 Tse et al. Dec 2012 A1
20120327101 Blixt et al. Dec 2012 A1
20130076885 Kobetski et al. Mar 2013 A1
20130106681 Eskilsson et al. May 2013 A1
20130107214 Blixt et al. May 2013 A1
Foreign Referenced Citations (38)
Number Date Country
101282680 Oct 2008 CN
100589752 Feb 2010 CN
19731301 Jan 1990 DE
4122752 Jan 1993 DE
0 816 983 Jan 1998 EP
1 114 608 Jul 2001 EP
1 336 372 Aug 2003 EP
1 679 577 Jul 2006 EP
2 037 352 Mar 2009 EP
2 581 034 Apr 2013 EP
682680 Mar 1994 JP
9211310 Aug 1997 JP
10221016 Aug 1998 JP
2000137792 May 2000 JP
2011081807 Apr 2011 JP
2011115606 Jun 2011 JP
5297486 Sep 2013 JP
10-2008-0106218 Dec 2008 KR
10-2011-0124246 Nov 2011 KR
WO 9927412 Jun 1999 WO
WO 0152722 Jul 2001 WO
WO 2004045399 Jun 2004 WO
WO 2004090581 Oct 2004 WO
WO 2005046465 May 2005 WO
WO 2006016366 Feb 2006 WO
WO 2006020408 Feb 2006 WO
WO 2007076479 Jul 2007 WO
WO 2007085682 Aug 2007 WO
WO 2007101690 Sep 2007 WO
WO 2008056274 May 2008 WO
WO 2008066460 Jun 2008 WO
WO 2009022924 Feb 2009 WO
WO 2009027773 Mar 2009 WO
WO 2009101238 Aug 2009 WO
WO 2010127714 Nov 2010 WO
WO 2011089199 Jul 2011 WO
WO 2013060826 May 2013 WO
WO 2013102551 Jul 2013 WO
Non-Patent Literature Citations (5)
Entry
“DynaVox Announces the EyeMax: New Eye Gaze System Provides Greater Independence”, [Online]. Retrieved from the Internet: <URL: http://www.dynavoxtech.com/company/press/release/detail.aspx?id=11>, (Aug. 6, 2008), 2 pgs.
Talukder, Ashit, et al., “Real-time Non-Intrusive Eyetracking and Gaze-point Determination for Human Computer Interaction and Biomedicine”, [Online]. Retrieved from the Internet: <URL: http://www.wseas.us/e-library/conferences/skiathos2002/papers/447-297.pdf>, (2002), 6 pgs.
United States Office Action, U.S. Appl. No. 14/089,609, dated Jun. 16, 2016, 16 pages.
United States Office Action, U.S. Appl. No. 14/089,609, dated Feb. 2, 2016, 12 pages.
United States Office Action, U.S. Appl. No. 14/089,609, dated Jul. 8, 2015, 10 pages.
Related Publications (1)
Number Date Country
20170177081 A1 Jun 2017 US
Provisional Applications (1)
Number Date Country
61730407 Nov 2012 US
Continuations (1)
Number Date Country
Parent 14089609 Nov 2013 US
Child 15453727 US