Unless otherwise indicated herein, the materials described in this section are not prior art to the claims in this application and are not admitted to be prior art by inclusion in this section.
Computing devices such as personal computers, laptop computers, tablet computers, cellular phones, and countless types of Internet-capable devices are increasingly prevalent in numerous aspects of modern life. Over time, the manner in which these devices are providing information to users is becoming more intelligent, more efficient, more intuitive, and/or less obtrusive.
The trend toward miniaturization of computing hardware, peripherals, as well as of sensors, detectors, and image and audio processors, among other technologies, has helped open up a field sometimes referred to as “wearable computing.” In the area of image and visual processing and production, in particular, it has become possible to consider wearable displays that place a very small image display element close enough to a wearer's (or user's) eye(s) such that the displayed image fills or nearly fills the field of view, and appears as a normal sized image, such as might be displayed on a traditional image display device. The relevant technology may be referred to as “near-eye displays.”
Near-eye displays are fundamental components of wearable displays, also sometimes called “head-mountable displays” (HMDs). A head-mountable display can place a graphic display or displays close to one or both eyes of a wearer. To generate the images on a display, a computer processing system may be used. Such displays may occupy a wearer's entire field of view, or occupy only part of a wearer's field of view. Further, head-mountable displays may be as small as a pair of glasses or as large as a helmet.
Emerging and anticipated uses of wearable displays include applications in which users interact in real time with an augmented or virtual reality. Such applications can be mission-critical or safety-critical, such as in a public safety or aviation setting. The applications can also be recreational, such as interactive gaming.
In one aspect, an example method can include: (a) emitting infrared (IR) radiation from an IR radiation source associated with a head-mountable display toward a target location, (b) receiving, at an IR sensor associated with the head-mountable display, reflected IR radiation, wherein the reflected IR radiation includes IR radiation emitted by the IR radiation source and reflected from the target location, (c) generating amplitude data for the reflected IR radiation received at the IR sensor, and (d) determining, based on the amplitude data and using the head-mountable display, a viewing state of the target location from among a no-wearer-present viewing state, a wearer-present viewing state, a closed-eye viewing state, an open-eye viewing state, a non-display-viewing viewing state, and a display-viewing viewing state.
In another aspect, an article of manufacture can include a non-transitory computer-readable medium having instructions stored thereon. Upon execution of the instructions by a computing device, the instructions can cause the computing device to perform functions. The functions can include: (a) emitting infrared (IR) radiation from an IR radiation source associated with a head-mountable display toward a target location, (b) receiving reflected IR radiation, where the reflected IR radiation includes IR radiation emitted by the IR radiation source and reflected from the target location, (c) generating amplitude data for the reflected IR radiation, and (d) determining, based on the amplitude data, a viewing state of the target location from among a no-wearer-present viewing state, a wearer-present viewing state, a closed-eye viewing state, an open-eye viewing state, a non-display-viewing viewing state, and a display-viewing viewing state.
In yet another aspect, a head-mountable device can include: an infrared (IR) radiation source, an IR sensor, and a processor. The IR radiation source can be configured to emit IR radiation toward a target location. The IR sensor can be configured to receive IR radiation reflected from the target location and to generate amplitude data for the reflected IR radiation. The processor can be configured to: (a) receive the amplitude data, and (b) determine, based on the amplitude data, a viewing state of the target location using the head-mountable display, from among a no-wearer-present viewing state, a wearer-present viewing state, a closed-eye viewing state, an open-eye viewing state, a non-display-viewing viewing state, and a display-viewing viewing state.
These as well as other aspects, advantages, and alternatives, will become apparent to those of ordinary skill in the art by reading the following detailed description, with reference where appropriate to the accompanying drawings.
Devices such as head-mountable displays (HMDs) can benefit from the use of gaze detection. For example, an HMD can be configured to respond to a wearer only when the wearer is looking at the display portion of the HMD. In particular, a simple, low-cost, low-power solution for determining when the wearer is “proximate” (i.e., when the wearer is wearing the HMD and/or looking at the display portion) is beneficial, as HMDs often have limited resources for operation, including power.
One solution to the proximate-wearer problem is to analyze light that is emitted by a light-emitting diode (LED) from the display portion, reflected at a “target location” (a position from which the display portion of the HMD can be readily viewed), and received at a sensor. For example, the LED can emit infrared (IR) radiation that can be reflected off an eye of a wearer at the target location and then detected by an IR sensor. IR radiation has the advantages of being invisible to the wearer and of not causing damage to the wearer's eye.
When IR radiation is emitted from the HMD, it can be reflected from an object, such as an eye of a wearer, at the target location. However, when the HMD is not being worn, an eye or other object is not present at the target location to reflect IR radiation. Thus, little or no IR radiation is likely to be reflected.
When an eye is present at the target location, some IR radiation will be absorbed and some will be reflected. When the eye at the target location is looking at the display portion, much of the IR radiation falls on darker portions of the eye, such as the pupil and iris. These darker portions of the eye absorb more radiation than lighter portions of the eye, such as the sclera, or white of the eye. As such, less reflected radiation is observed when the eye is looking at the display portion than when the eye is looking away from the display portion.
When the eye at the target location is closed, the eyelid can reflect the IR radiation. The eyelid is a better reflector of IR radiation than the surface of the eye, and so more IR radiation is reflected when the eye is closed than when open. Thus, an IR sensor would observe an increase in IR radiation when the eye is closed relative to when the eye is open.
Based on these observations, a computing device can use IR intensity data received from the IR sensor to determine a viewing state of the eye. Example viewing states include: no-wearer-present, wearer-present, closed-eye, open-eye, non-display-viewing, and display-viewing states. Other viewing states are possible as well. In scenarios where sensors are used for both eyes of a wearer, the indications from the two eyes can be combined to determine whether both eyes are looking at an object, both eyes are closed, or the eyes are in different states (e.g., winking).
These techniques can also sense blinking of the wearer, because the eyelid is more reflective than the iris. In these embodiments, intentional blinks can be used as inputs (e.g., “one blink for yes, two blinks for no”), as they differ in some measure, such as duration, from unintentional blinks. As such, the HMD can determine a blinking state as well as, or separately from, the viewing state.
Viewing states, blinking states, and/or changes of viewing and/or blinking state can be used to generate signals or commands that change a state of a user interface (UI). Example UI state-change commands include commands to start or stop the display light source, take a photo, and start or stop video and/or audio recording. For example, to operate a video camera associated with the HMD, the HMD could record video while in the display-viewing viewing state, pause the video upon transitioning to a non-display-viewing viewing state, toggle audio recording upon observing intentional blinks, and stop recording when a no-wearer-present state is observed. Many other examples are possible as well.
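By way of illustration only, the following Python sketch shows one hypothetical way viewing-state transitions and blink states might be mapped to UI commands for such a video camera application; the function name, command strings, and transition table are assumptions made for this sketch and are not prescribed by any embodiment described herein.

    # Hypothetical mapping from viewing-state transitions and blink states to
    # user-interface commands for a video camera application on an HMD.
    UI_COMMANDS = {
        ("display-viewing", "non-display-viewing"): "pause_video",
        ("non-display-viewing", "display-viewing"): "resume_video",
        ("wearer-present", "no-wearer-present"): "stop_recording",
    }

    def ui_command_for(previous_state, current_state, blink_state=None):
        """Return a UI command for a state change, or None if no command applies."""
        if blink_state == "voluntary blink":
            # An intentional blink is treated as an input in this sketch.
            return "toggle_audio_recording"
        return UI_COMMANDS.get((previous_state, current_state))

In this sketch, for example, a transition from the display-viewing state to the non-display-viewing state would yield the hypothetical "pause_video" command.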
Example Head Mountable Displays
In operation, visible light generated by display light source 148 can be guided by display beam splitter 152 toward distal beam splitter 140. Distal beam splitter 140 can substantially reflect the received light toward proximal beam splitter 144 and then to image former 145. Image former 145 can include a concave mirror or other optical component that forms a virtual image viewable through proximal beam splitter 144. In this manner, a viewable image can be delivered to a target location for viewing by HMD wearer's eye 128.
Infrared light or radiation emitted by an IR LED at position 160p follows a common path with the visible light described in the paragraph immediately above to reach eye 128. When an IR LED is at position 160a, some of the infrared light or radiation emitted by the IR LED can reach eye 128 via distal beam splitter 140, proximal beam splitter 144, and image former 145. These pathways of infrared light are shown as IR beams 164a and 164b, drawn in black in the corresponding figure.
After being reflected by proximal beam splitter 144, both visible and infrared light is directed toward eye 128, where some of both types of light can reflect from the surface of eye 128. Some of the infrared light or radiation can be received at sensor 162 as an infrared signal. In response, sensor 162 can generate amplitude data for the received infrared signal and transmit the amplitude data to computer 114.
Amplitude data can be processed by computer 114 (or another computing device). Computer 114 can compare the input signal amplitudes to calibration data to determine a “viewing state”. Example viewing states for HMD 100 or HMD 100a are indicated in Table 1 below.
Example Determinations of Viewing States
The second non-heading row of table 200 shows eye 214b at target location 210b wearing head mountable device 100 and looking at proximal beam splitter 144. A portion of eye 214b looking directly at proximal beam splitter 144 is a relatively dark portion of the eye, including the pupil, which is black, and the iris, which is most frequently darker than the surrounding sclera. That is, IR radiation reflected to IR sensor 162 will be greater when a wearer is looking at proximal beam splitter 144 than when no wearer is present, but will be relatively low. Thus, when an eye is at target location 210b, the observed IR intensity 220b at IR sensor 162 is relatively low. Based on a relatively low IR intensity, a viewing state can be determined to be a viewing state 230b of “Wearer present and looking at display.”
The third non-heading row of table 200 shows eye 214c at target location 210c wearing head mountable device 100 and looking away from proximal beam splitter 144, as indicated by gaze 212c. A portion of eye 214c directly under proximal beam splitter 144 is a relatively light portion of the eye, partially or completely including sclera. IR radiation reflected to IR sensor 162 will be greater than when a wearer is looking directly at proximal beam splitter 144, but will be lower than when an eye is present and closed. Thus, when an eye is at target location 210c, the observed IR intensity 220c at IR sensor 162 is relatively moderate. Based on a relatively moderate IR intensity, a viewing state can be determined to be a viewing state 230c of “Wearer present and not looking at display.”
The fourth non-heading row of table 200 shows closed eye 214d at target location 210d wearing head mountable device 100. A closed eye, such as eye 214d, can act as a better reflector of IR radiation than an open eye. As such, IR radiation reflected to IR sensor 162 will be greater than when a wearer's eye is open or when the eye is not present. Thus, the observed IR intensity 220d at IR sensor 162 is relatively high. Based on a relatively high IR intensity, a viewing state can be determined to be a viewing state 230d of “Wearer present with eye closed.”
While I is greater than or equal to Tpresent, a wearer can be determined to be present at the target location.
Specific values of Tpresent, Topen, and/or Tviewing can be determined using pre-determined information, such as, but not limited to, stored empirical data, data from reference sources, and/or data from other sources. In some embodiments, specific values of Tpresent, Topen, and/or Tviewing can be determined using a calibration process, where a viewer is instructed to perform certain actions (e.g., take off the HMD, put on the HMD, look at the display, look away from the display, blink, etc.). During the calibration process, data related to Tpresent, Topen, and/or Tviewing can be gathered and processed to determine the threshold value(s) being calibrated.
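As one illustrative possibility, if labeled intensity samples are gathered while the viewer performs each instructed action, thresholds might be placed midway between the mean intensities of adjacent viewing states. The Python sketch below assumes this midpoint heuristic and hypothetical state labels; it is a sketch under those assumptions, not a prescribed calibration procedure.

    def calibrate_thresholds(samples):
        """Derive Tpresent, Tviewing, and Topen from labeled calibration samples.

        `samples` maps a hypothetical state label ("absent", "viewing",
        "not_viewing", "closed") to a list of reflected-IR intensity readings
        gathered while the viewer performed the corresponding action.
        """
        mean = {state: sum(vals) / len(vals) for state, vals in samples.items()}
        # Place each threshold halfway between the means of adjacent states,
        # assuming absent < viewing < not_viewing < closed in reflected intensity.
        t_present = (mean["absent"] + mean["viewing"]) / 2
        t_viewing = (mean["viewing"] + mean["not_viewing"]) / 2
        t_open = (mean["not_viewing"] + mean["closed"]) / 2
        return t_present, t_viewing, t_open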
Method 350 begins at block 360, where intensity data I is received from the IR sensor. The intensity data can include time-varying amplitude data of reflected IR radiation as received at the IR sensor.
At block 362, I is compared to the Tpresent threshold. If I is less than Tpresent, then control proceeds to block 364; otherwise, control proceeds to block 366.
At block 364, a determination is made that the viewing state is “wearer not-present” and control proceeds to block 380.
At block 366, I is compared to the Topen threshold. If I is greater than or equal to Topen, then control proceeds to block 368; otherwise, control proceeds to block 370.
At block 368, a determination is made that the viewing state is “wearer present” and “eye closed” and control proceeds to block 380.
At block 370, I is compared to the Tviewing threshold. If I is greater than or equal to Tviewing, then control proceeds to block 372; otherwise, control proceeds to block 374.
At block 372, a determination is made that the viewing state is “wearer present”, “eye open”, and “not viewing display” and control proceeds to block 380.
At block 374, a determination is made that the viewing state is “wearer present”, “eye open”, and “viewing display” and control proceeds to block 380.
At block 380, a determination is made as to whether method 350 is completed. If method 350 is determined not to be completed, control proceeds to block 360. Otherwise, method 350 is terminated.
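The threshold cascade of method 350 can be summarized in a few lines of code. The Python sketch below is illustrative only; it assumes the thresholds are ordered Tpresent < Tviewing < Topen, consistent with the intensity ordering discussed above, and the function and state names are assumptions made for this sketch.

    def viewing_state(intensity, t_present, t_viewing, t_open):
        """Classify one reflected-IR intensity sample, mirroring blocks 362-374."""
        if intensity < t_present:
            return "wearer not present"                               # block 364
        if intensity >= t_open:
            return "wearer present, eye closed"                       # block 368
        if intensity >= t_viewing:
            return "wearer present, eye open, not viewing display"    # block 372
        return "wearer present, eye open, viewing display"            # block 374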
Blink durations can be calculated from a given sequence of viewing states and corresponding times. For example, suppose that viewing states are determined using method 350, or some equivalent method, for amplitude data of reflected IR radiation collected every T seconds; e.g., T≦0.25. An example set of viewing states can then be generated as shown in Table 2 below, for T=0.1 seconds, starting at a time T1.
Table 2 shows transitions between an eye-open related viewing state and an eye-closed related viewing state. For example, at time T1, the viewing state is shown as “Wearer present+eye open+viewing display” and at T1+0.1 sec., the viewing state is shown as “Wearer present+eye closed.” Based on these two viewing states, a transition between an eye of the wearer being open at time T1 and being closed at time T1+0.1 sec. can be detected. Table 2 shows this transition as an “Open to Closed” transition at time T1+0.1 sec.
Table 2 shows that at time T1+0.2 sec., the viewing state is shown as “Wearer present+eye closed” and at T1+0.3 sec., the viewing state is shown as “Wearer present+eye open+viewing display.” Based on these two viewing states, a transition between the eye of the wearer being closed at time T1+0.2 sec. and being open at time T1+0.3 sec. can be detected. Table 2 shows this transition as a “Closed to Open” transition at time T1+0.3 sec.
Using the terminology shown in Table 2, a blink can be defined as an “Open to Closed” transition followed by a “Closed to Open” transition. A blink duration B can then be defined as the time at which the “Closed to Open” transition is detected minus the time at which the preceding “Open to Closed” transition is detected. In this example, blink duration B=(T1+0.3 sec.)−(T1+0.1 sec.)=0.2 sec. Table 2 also shows an “Open to Closed” transition at T1+1.1 sec. and a “Closed to Open” transition at T1+1.5 sec., giving a blink duration B=(T1+1.5 sec.)−(T1+1.1 sec.)=0.4 sec.
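For illustration, blink durations can be extracted from such a timestamped sequence of viewing states by scanning for open-to-closed and closed-to-open transitions. The Python sketch below makes the assumptions noted in its comments (a fixed sampling period and wearer-present states only) and is a sketch rather than a prescribed implementation.

    def blink_durations(states, period=0.1):
        """Return blink durations (seconds) from a sequence of viewing states.

        `states` is a list of viewing-state strings sampled every `period`
        seconds, assumed to contain only wearer-present states; a blink is an
        "Open to Closed" transition followed by a "Closed to Open" transition.
        """
        durations = []
        closed_at = None
        for i in range(1, len(states)):
            prev_open = "eye open" in states[i - 1]
            prev_closed = "eye closed" in states[i - 1]
            now_open = "eye open" in states[i]
            now_closed = "eye closed" in states[i]
            if prev_open and now_closed:                       # "Open to Closed"
                closed_at = i * period
            elif prev_closed and now_open and closed_at is not None:
                durations.append(i * period - closed_at)       # "Closed to Open"
                closed_at = None
        return durations

Applied to the first blink in the Table 2 example (open, closed, closed, open sampled every 0.1 seconds), this sketch returns a single blink duration of 0.2 seconds.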
This sequence of viewing states and corresponding blink durations can be used to control the head mountable display. For example, a user interface can receive viewing state data, such as shown above in Table 2, and act in accordance with viewing states, blinks, and blink durations, as discussed in detail in the example user interface scenario below.
Typically, voluntary blinks take longer than involuntary blinks. Accordingly, an involuntary-blink duration range R1 can be bounded by a minimum duration BInMin and a maximum duration BInMax, and a voluntary-blink duration range R2 can be bounded by a minimum duration BVMin and a maximum duration BVMax.
Specific values of BInMin, BInMax, BVMin, and/or BVMax can be determined using pre-determined information, such as, but not limited to, stored empirical data, data from reference sources, and/or data from other sources. In some embodiments, specific values of BInMin, BInMax, BVMin, and/or BVMax can be determined using a calibration process, where a viewer is instructed to perform certain actions (e.g., wear the HMD for a period of time to detect involuntary blinks, blink a number of times in succession, etc.). During the calibration process, data, such as blink durations, related to BInMin, BInMax, BVMin, and/or BVMax can be gathered and processed to determine the threshold value(s) being calibrated.
Method 450 begins at block 460, where a blink duration value B is received.
At block 462, B is compared to the R1 range. If B is within R1, that is, BInMin≦B≦BInMax, then control proceeds to block 464; otherwise, control proceeds to block 466.
At block 464, a determination is made that the blink state is an “involuntary blink” and control proceeds to block 480.
At block 466, B is compared to the R2 range. If B is within R2, that is, BVMin≦B≦BVMax, then control proceeds to block 468; otherwise, control proceeds to block 470.
At block 468, a determination is made that the blink state is a “voluntary blink” and control proceeds to block 480.
At block 470, a determination is made that the blink state is “not determined” and control proceeds to block 480.
At block 480, a determination is made as to whether method 450 is completed. If method 450 is determined not to be completed, control proceeds to block 460. Otherwise, method 450 is terminated.
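A compact illustration of this range test follows in Python; the function name is an assumption, and the bounds correspond to the calibrated values BInMin, BInMax, BVMin, and BVMax discussed above.

    def blink_state(duration, b_in_min, b_in_max, b_v_min, b_v_max):
        """Classify a blink duration, mirroring blocks 462-470 of method 450."""
        if b_in_min <= duration <= b_in_max:       # range R1: involuntary blinks
            return "involuntary blink"             # block 464
        if b_v_min <= duration <= b_v_max:         # range R2: voluntary blinks
            return "voluntary blink"               # block 468
        return "not determined"                    # block 470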
Example User Interface Scenario
Scenario 500 begins with a wearer 510 named Joe using a video camera application of HMD 100 that is equipped to perform in accordance with viewing states and blink states. During scenario 500, wearer 510 looks at the display to record, blinks to toggle between video-only recording and audio-video recording, and takes off HMD 100 to terminate the video camera application.
The video camera application of scenario 500 is configured to receive viewing state and blink state information and generate user interface commands based on the viewing and blink states.
In scenario 500, Joe then asks Bro “how is Mom doing?” via speech 542.
Scenario 500 concludes with wearer 510 removing HMD 100 and uttering speech 552 of “Nice seeing you again, Bro!”
Many other example scenarios, user interfaces, user interface commands, and uses of viewing states and/or blink states are possible as well.
Example Methods for Determining Viewing States
Method 600 may be implemented to determine a viewing state at a target location. Method 600 begins at block 610, where a head-mountable display can emit IR radiation (a.k.a. IR light) from an associated IR radiation source toward a target location. The head-mountable display can include an optical system, where the optical system can include an IR radiation source configured to emit IR radiation and a display light source configured to emit visible light toward the target location. The optical system can be configured so that the IR radiation and the visible light follow a common path to the target location. In some embodiments, the IR sensor can be configured to be physically separate from the optical system.
At block 620, an IR sensor associated with the head-mountable display can receive reflected IR radiation. The reflected IR radiation can include IR radiation emitted by the IR radiation source and reflected from the target location.
At block 630, amplitude data can be generated for the reflected IR radiation.
At block 640, the head-mountable display can be used to determine a viewing state of the target location. The viewing state can be based on the amplitude data. The viewing state can be determined from among a no-wearer-present viewing state, a wearer-present viewing state, a closed-eye viewing state, an open-eye viewing state, a non-display-viewing viewing state, and a display-viewing viewing state.
In some embodiments, determining the viewing state of the target location can include comparing the amplitude data to calibration data corresponding to a plurality of viewing states.
In some embodiments, method 600 can further include generating a user-interface command for the head-mountable display based on the viewing state.
In particular embodiments, method 600 can further include: (a) generating time-varying amplitude data for the reflected IR radiation received at the IR sensor, (b) determining a sequence of viewing states of the target location based on the time-varying amplitude data, and (c) controlling the head-mountable display based on the sequence of viewing states. In some of the particular embodiments, the sequence of viewing states can include a first viewing state at a first time and a second viewing state at a second time. The second time can be after the first time. Then, controlling the head-mountable display based on the sequence of viewing states can include generating a user-interface command for the head-mountable display, in response to the second viewing state differing from the first viewing state.
In other particular embodiments, the sequence of viewing states can include a first viewing state at a first time and a second viewing state at a second time. The second time can be after the first time. Then, controlling the head-mountable display based on the sequence of viewing states can include generating a user-interface command for the head-mountable display, in response to the second viewing state equaling the first viewing state.
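To tie these blocks together, the Python sketch below outlines blocks 610 through 640 as a simple polling loop, reusing the hypothetical viewing_state function sketched earlier; the sensor stub and all names are assumptions standing in for actual HMD hardware interfaces, not part of any described embodiment.

    import time

    class HmdIrSensorStub:
        """Hypothetical stand-in for an HMD's IR emitter and IR sensor."""

        def emit_ir(self):
            # Block 610: emit IR radiation toward the target location.
            pass

        def read_amplitude(self):
            # Blocks 620-630: receive reflected IR and return amplitude data.
            return 0.0

    def run_method_600(sensor, thresholds, period=0.1, duration=1.0):
        """Perform blocks 610-640 repeatedly and collect the viewing states."""
        t_present, t_viewing, t_open = thresholds
        states = []
        for _ in range(int(duration / period)):
            sensor.emit_ir()
            amplitude = sensor.read_amplitude()
            # Block 640: classify using the viewing_state sketch given earlier.
            states.append(viewing_state(amplitude, t_present, t_viewing, t_open))
            time.sleep(period)
        return states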
Example methods and systems are described herein. It should be understood that the word “exemplary” is used herein to mean “serving as an example, instance, or illustration.” Any embodiment or feature described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other embodiments or features. The example embodiments described herein are not meant to be limiting. It will be readily understood that certain aspects of the disclosed systems and methods can be arranged and combined in a wide variety of different configurations, all of which are contemplated herein.
The above detailed description describes various features and functions of the disclosed systems, devices, and methods with reference to the accompanying figures. In the figures, similar symbols typically identify similar components, unless context dictates otherwise. The illustrative embodiments described in the detailed description, figures, and claims are not meant to be limiting. Other embodiments can be utilized, and other changes can be made, without departing from the spirit or scope of the subject matter presented herein. It will be readily understood that the aspects of the present disclosure, as generally described herein, and illustrated in the figures, can be arranged, substituted, combined, separated, and designed in a wide variety of different configurations, all of which are explicitly contemplated herein.
With respect to any or all of the ladder diagrams, scenarios, and flow charts in the figures and as discussed herein, each block and/or communication may represent a processing of information and/or a transmission of information in accordance with example embodiments. Alternative embodiments are included within the scope of these example embodiments. In these alternative embodiments, for example, functions described as blocks, transmissions, communications, requests, responses, and/or messages may be executed out of order from that shown or discussed, including substantially concurrent or in reverse order, depending on the functionality involved. Further, more or fewer blocks and/or functions may be used with any of the ladder diagrams, scenarios, and flow charts discussed herein, and these ladder diagrams, scenarios, and flow charts may be combined with one another, in part or in whole.
A block that represents a processing of information may correspond to circuitry that can be configured to perform the specific logical functions of a herein-described method or technique. Alternatively or additionally, a block that represents a processing of information may correspond to a module, a segment, or a portion of program code (including related data). The program code may include one or more instructions executable by a processor for implementing specific logical functions or actions in the method or technique. The program code and/or related data may be stored on any type of computer readable medium such as a storage device including a disk or hard drive or other storage medium.
The computer readable medium may also include non-transitory computer readable media such as computer-readable media that stores data for short periods of time like register memory, processor cache, and random access memory (RAM). The computer readable media may also include non-transitory computer readable media that stores program code and/or data for longer periods of time, such as secondary or persistent long term storage, like read only memory (ROM), optical or magnetic disks, compact-disc read only memory (CD-ROM), for example. The computer readable media may also be any other volatile or non-volatile storage systems. A computer readable medium may be considered a computer readable storage medium, for example, or a tangible storage device.
Moreover, a block that represents one or more information transmissions may correspond to information transmissions between software and/or hardware modules in the same physical device. However, other information transmissions may be between software modules and/or hardware modules in different physical devices.
While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope and spirit being indicated by the following claims.