This disclosure relates generally to controlling an electronic contact lens using eye gestures.
An electronic contact lens may include various integrated electronic components such as projectors, imaging devices, sensors, and batteries. The electronic contact lens may enable augmented reality applications in which images are projected by the electronic contact lens onto the user's retina to augment the user's view of the external environment. The electronic contact lens may include integrated motion sensors for tracking eye movements that may be used to control various functions of the electronic contact lens.
A system includes an electronic contact lens that can detect eye gestures for initiating various actions. The electronic contact lens includes integrated sensors for obtaining sensor measurements characterizing eye motion. The sensor measurements are processed to detect gestures mapped to specific actions such as changing a power state of the electronic contact lens, activating or deactivating a user interface or other feature, or selecting an item from a virtual menu.
To provide a quality user experience, eye gestures should be detectable with low rates of false positives. For example, it is desirable for the electronic contact lens system to consistently distinguish between intentional eye gestures and other eye or head movements that are not intended to invoke an action. Furthermore, it is desirable to utilize eye gestures that are detectable with low rates of false negatives and that are easy for a user to perform consistently.
An example class of eye gestures that meets the above characteristics involves the user initiating the gesture by pausing the eye at a starting pitch, executing a first change in eye pitch in a first direction, executing a second change in eye pitch in the opposite direction, and then again pausing the eye to complete the gesture. The eye gesture may furthermore be subject to various motion and timing constraints. For example, in one embodiment, a gesture may be deemed valid only if (1) the eye motion crosses a first pitch threshold in a first direction from the starting pitch and subsequently crosses a second pitch threshold in an opposite direction from the starting pitch; (2) the eye motion achieves a span (i.e., total change in pitch) that exceeds a minimum span threshold; and (3) the different aspects of the motion are performed within a set of configured timing constraints. In another embodiment, different motion segments are individually scored against respective target criteria, and the scores are combined and evaluated to holistically determine whether the motion is a valid gesture. In this embodiment, failure to meet an individual criterion does not necessarily invalidate the gesture. The electronic contact lens system can be programmed to invoke different actions depending on the type and direction of the detected eye gesture.
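The following Python sketch illustrates the first embodiment's three constraints (crossing two pitch thresholds in opposite directions from the starting pitch, achieving a minimum span, and completing within a timing limit). It is a minimal sketch rather than the claimed implementation; the sample format, the offsets defining the two thresholds, the minimum span, and the timing limit are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Optional, Sequence

@dataclass
class PitchSample:
    t: float      # timestamp in seconds
    pitch: float  # eye pitch in degrees

def is_valid_gesture(samples: Sequence[PitchSample],
                     start_pitch: float,
                     first_offset: float = +10.0,   # first threshold above the start (assumed)
                     second_offset: float = -10.0,  # second threshold below the start (assumed)
                     min_span: float = 25.0,        # minimum total change in pitch (assumed)
                     max_duration: float = 1.0) -> bool:
    """True if the pitch crosses the first threshold, then the second threshold in the
    opposite direction, achieves the minimum span, and does so within the timing limit."""
    first_threshold = start_pitch + first_offset
    second_threshold = start_pitch + second_offset
    t_first: Optional[float] = next(
        (s.t for s in samples if s.pitch >= first_threshold), None)
    if t_first is None:
        return False                              # first threshold never crossed
    t_second: Optional[float] = next(
        (s.t for s in samples if s.t > t_first and s.pitch <= second_threshold), None)
    if t_second is None:
        return False                              # second threshold never crossed
    pitches = [s.pitch for s in samples]
    span = max(pitches) - min(pitches)            # total change in pitch
    return span >= min_span and (t_second - t_first) <= max_duration
```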
As shown in
The optional femtoprojector 120 is a small projector that projects images inward onto the user's retina. It is located in a central region of the contact lens 110, so that light from the femtoprojector 120 propagates through the user's pupil to the retina. The femtoprojector 120 typically includes an electronics backplane (e.g., driver circuitry), a front plane of light emitting elements (e.g., an LED array) and projection optics. The front plane produces an image (referred to as the source image), which is optically projected by the projection optics through the various eye structures and onto the retina 105, as shown in
The optional femtoimager 130 is a small imager that is outward facing and captures images of the external environment. In this example, it is located outside the central region of the contact lens 110 so that it does not block light from entering the user's eye. The femtoimager 130 typically includes imaging optics, a sensor array, and sensor circuitry. The imaging optics images a portion of the external environment onto the sensor array, which captures the image. The sensor array may be an array of photosensors.
The femtoprojector 120 and femtoimager 130 typically are not larger than 2 mm wide. They may fit within a 2 mm×2 mm×2 mm volume. In an embodiment, the electronic contact lens 110 has a thickness that is less than two millimeters.
The sensors 140 and other associated electronics may be mounted on a flexible bus located in a peripheral zone of the electronic contact lens 110. The sensors 140 may include motion sensors such as an accelerometer and a gyroscope. The sensors 140 may furthermore include a magnetometer and additional sensors such as temperature sensors, light sensors, and audio sensors. Sensed data from the sensors 140 may be combined to estimate position, velocity, acceleration, orientation, angular velocity, angular acceleration, or other motion parameters of the eye. For example, in one embodiment, gyroscope data, magnetometer data, and accelerometer data may be combined in a filter to estimate the eye orientation. Furthermore, gyroscope measurements may be compensated for variations in temperature.
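As one hedged illustration of how such measurements might be combined, the sketch below fuses a temperature-compensated gyroscope rate with an accelerometer-derived pitch using a simple complementary filter. The described system may use a different filter (e.g., one that also incorporates magnetometer data); the gains, axis convention, and linear temperature-bias model are assumptions.

```python
import math

def estimate_pitch(prev_pitch_deg: float, gyro_pitch_rate_dps: float,
                   accel_xyz, dt: float, temp_c: float,
                   gyro_bias_dps_per_c: float = 0.02, alpha: float = 0.98) -> float:
    """Blend an integrated gyroscope pitch with an accelerometer pitch estimate."""
    # Compensate the gyroscope rate for an assumed temperature-dependent bias.
    corrected_rate = gyro_pitch_rate_dps - gyro_bias_dps_per_c * (temp_c - 25.0)

    # Pitch from the accelerometer's gravity vector, in degrees (axis convention assumed).
    ax, ay, az = accel_xyz
    accel_pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))

    # Complementary filter: trust the gyroscope short-term, the accelerometer long-term.
    gyro_pitch = prev_pitch_deg + corrected_rate * dt
    return alpha * gyro_pitch + (1.0 - alpha) * accel_pitch
```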
The electronic contact lens 110 may furthermore include various other electronic components (not shown) such as a radio transceiver, power circuitry, an antenna, a battery, or inductive charging coils. The electronic contact lens 110 may also include cosmetic elements, for example covering the sensors 140 or other electronic components. The cosmetic elements may be surfaces colored to resemble the iris and/or sclera of the user's eye.
As shown in
A processing module 220 interfaces with the electronic contact lens 110 to track motion data of the electronic contact lens, detect eye gestures, and initiate actions responsive to the detected eye gestures. The processing module 220 may furthermore perform other functions of the electronic contact lens 110 such as generating virtual images for display using the femtoprojector 120, processing images obtained from the femtoimager 130, or other tasks.
Various components of the processing module 220 may be implemented in whole or in part in the electronic contact lens 110, the accessory device 212, the server 216, or a combination thereof. In some implementations, certain time-sensitive functions of the processing module 220 may be implemented directly on the electronic contact lenses 110 for low latency while other more computationally intensive functions may be offloaded to the accessory device 212 or to the server 216 to enable the electronic contact lens 110 to operate with relatively light computational and storage requirements. For example, in one implementation, the electronic contact lens 110 transfers the raw sensor data to the accessory device 212 for processing. The accessory device 212 may process the data directly or may offload one or more functions in whole or in part to the server 216. Alternatively, the electronic contact lens 110 may perform some lightweight initial processing on the sensor data and send the initially processed sensor data to the accessory device 212. For example, the electronic contact lens 110 may perform some filtering or compression of the sensor data. Responsibility for other tasks such as generating virtual images and processing captured image data may similarly be shared between the electronic contact lenses 110, accessory device 212, and server 216 in different ways.
The processing module 220 includes a motion analysis module 222, a power state control module 224, and an interface control module 226. Other embodiments may include different, additional, or fewer components.
The motion analysis module 222 processes sensor measurements from the electronic contact lens 110 to detect occurrences of one or more eye gestures. Here, the motion analysis module 222 may apply various filters and/or functions to the raw sensor data (e.g., from the accelerometer, gyroscope, magnetometer, or other sensors) to detect a sequence of movements consistent with a predefined eye gesture.
In an embodiment, the sensor measurements processed by the motion analysis module 222 may include image data from the femtoimager 130. Here, for example, the motion analysis module 222 may perform image-based motion analysis techniques on images captured from the femtoimager 130 over time that may be used alone or in conjunction with other sensor data to estimate changes in eye orientation and detect eye gestures.
In an embodiment, the motion analysis module 222 may optionally obtain and analyze sensor data from sensors external to an electronic contact lens 110. For example, head-mounted sensors or external cameras may be used to track head position. The motion analysis module 222 may utilize this data to estimate gaze orientation relative to the head (e.g., whether the gaze position is centered or at a peripheral region).
In an example implementation, the motion analysis module 222 comprises a state machine having a sequence of states that each correspond to one of the motion segments of the eye gesture. Beginning at a starting state corresponding to a first motion segment, the state machine compares motion data in a recent time window to motion criteria defining the first motion segment. The state machine progresses to the next state when the detected motion is consistent with the first motion segment. In the next state, a different set of criteria is applied to a subsequent time window of motion data to determine if the motion data in the subsequent time window is consistent with the next defined motion segment of the eye gesture. The state machine continues to progress in this manner as each segment of the eye gesture is detected. Otherwise, if the detected motion at any given state is inconsistent with the defined motion segment for that state, the state machine returns to the starting state.
In an embodiment, each stage of the state machine determines whether or not the criteria associated with that motion segment are met. In this case, an eye gesture is detected when the state machine reaches the end state, indicating that the full set of motion segments are sequentially detected according to their respective criteria.
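A schematic sketch of this per-segment state machine is shown below. Each segment supplies a predicate applied to the most recent window of motion samples; treating any non-matching window as an immediate reset (rather than distinguishing "not yet met" from "inconsistent") is a simplifying assumption for illustration.

```python
from typing import Callable, List, Sequence

class GestureStateMachine:
    def __init__(self, segment_criteria: List[Callable[[Sequence], bool]]):
        self.criteria = segment_criteria   # one predicate per motion segment
        self.state = 0                     # index of the segment being matched

    def update(self, window: Sequence) -> bool:
        """Feed a window of recent samples; return True when the full gesture completes."""
        if self.criteria[self.state](window):
            self.state += 1                # segment matched: advance to the next state
            if self.state == len(self.criteria):
                self.state = 0             # end state reached: gesture detected
                return True
        else:
            self.state = 0                 # inconsistent motion: return to the starting state
        return False
```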
In another embodiment, the motion analysis module 222 compares the motion against two different sets of criteria at each stage of the state machine. First, the motion analysis module 222 compares the motion against state transition criteria that represent the minimum requirements for transitioning to the next state. Second, the motion analysis module 222 compares the motion against target criteria to generate a score indicating how closely the motion conforms to a target motion. The state transition criteria may be more relaxed than the target criteria. The scores may then be combined (e.g., as a sum, weighted sum, or weighted average) and a gesture is detected if the total score exceeds a threshold. In this case, reaching the final state of the state machine does not necessarily indicate a detection of the gesture since it may be possible to reach the final state based on the state transition criteria without achieving a total score sufficient to detect the gesture.
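The score-combination step of this variant might look like the sketch below, which forms a weighted average of per-state scores and compares it to an activation threshold. The per-state scoring functions, the weights, and the threshold value are assumptions.

```python
from typing import Optional, Sequence

def evaluate_scored_gesture(state_scores: Sequence[float],
                            weights: Optional[Sequence[float]] = None,
                            activation_threshold: float = 0.75) -> bool:
    """Combine per-state scores (each in [0, 1]) into a weighted average and
    detect the gesture only if the combined score exceeds the threshold."""
    if weights is None:
        weights = [1.0] * len(state_scores)
    combined = sum(w * s for w, s in zip(weights, state_scores)) / sum(weights)
    return combined >= activation_threshold
```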
In an embodiment, the motion analysis module 222 may comprise two or more state machines executing in parallel. Here, when one state machine advances past the initial state, another state machine may initiate in the initial state to determine if a subsequent eye movement corresponds to the first motion segment. This embodiment ensures that the start of the gesture is not missed when the initial state machine advances past the initial state but fails to detect later motion segments of an eye gesture.
For each state of the state machine, the motion criteria may be defined positively (i.e., the state machine progresses when the specified criteria for the current state are met), negatively (i.e., the state machine is reset to the starting state when the criteria for the current state are met), or as a combination thereof. The criteria for detecting each motion segment of the eye gesture may be based on factors such as changes in orientation, velocity, or acceleration associated with movements, durations of time associated with movements or in between movements, or other factors that collectively describe a detectable eye gesture. In other embodiments, the criteria for each state may be defined in terms of specific types of detectable eye movements (such as saccades, microsaccades, smooth pursuits, drifts, fixations, etc.) and characteristics of those movements. Specific examples of eye gestures and techniques for detecting them are described in further detail below with respect to
In other embodiments, the motion analysis module 222 detects a gesture without necessarily using a state machine. For example, in another implementation, the motion analysis module 222 obtains a set of samples associated with a time window and independently characterizes the motion in each of a plurality of sub-windows. The characterized motions can then be compared against target motions to evaluate whether or not the gesture is detected.
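As a hedged sketch of this window-based alternative, the example below splits a candidate window into sub-windows, characterizes each by its net pitch change, and compares the result against a target motion profile. The target profile, the number of sub-windows, and the tolerance are illustrative assumptions.

```python
from typing import Sequence

def detect_gesture_windowed(pitch_samples: Sequence[float],
                            target_profile: Sequence[float],
                            tolerance_deg: float = 5.0) -> bool:
    """target_profile holds the expected net pitch change (degrees) for each
    sub-window; detection succeeds when the mean deviation is small enough."""
    n = len(target_profile)
    size = len(pitch_samples) // n
    if size == 0:
        return False
    observed = [pitch_samples[(i + 1) * size - 1] - pitch_samples[i * size]
                for i in range(n)]                 # net pitch change per sub-window
    mean_error = sum(abs(o - t) for o, t in zip(observed, target_profile)) / n
    return mean_error <= tolerance_deg             # similarity metric vs. threshold
```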
The power state control module 224 controls a power state of the electronic contact lens 110. The power state may be controlled, at least in part, in response to a detected eye gesture. In an embodiment, the electronic contact lens 110 can operate in at least a low power state and a full power state. In some embodiments, additional power states may be available. In the low power state, the electronic contact lens 110 operates with limited functionality to conserve power. In one example implementation, the electronic contact lens 110 may enable only functions for detecting a trigger event that causes the electronic contact lens 110 to transition to the full power state. Thus, at least the femtoimager 130 and femtoprojector 120 may be deactivated in the low power state.
In one embodiment, the electronic contact lens 110 furthermore disables the gyroscope in the low power state. In this case, the electronic contact lens 110 uses only the accelerometer and magnetometer data to detect an eye gesture that activates the full power state, which then enables the gyroscope, the femtoimager 130, the femtoprojector 120, and/or other components. In another embodiment, only the magnetometer is enabled during the low power state and the accelerometer and other sensors are disabled until the full power state is activated. In embodiments where only the magnetometer and/or accelerometer are active in the low power state, the gesture for activating the full power state may be evaluated based only on changes in pitch detectable by the accelerometer and/or magnetometer, without regard to changes in yaw.
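The table below sketches one way to express the component gating for the embodiment in which only the accelerometer and magnetometer run in the low power state. The component names and the two-state enumeration are assumptions for illustration.

```python
from enum import Enum, auto

class PowerState(Enum):
    LOW = auto()
    FULL = auto()

# Components enabled in each power state (illustrative; additional states may exist).
ENABLED_COMPONENTS = {
    PowerState.LOW:  {"accelerometer", "magnetometer"},
    PowerState.FULL: {"accelerometer", "magnetometer", "gyroscope",
                      "femtoimager", "femtoprojector"},
}

def components_to_enable(state: PowerState) -> set:
    return ENABLED_COMPONENTS[state]
```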
When operating in the full power state, the electronic contact lens 110 may activate a wider set of sensors (e.g., the gyroscope), the femtoimager 130, and/or the femtoprojector 120 to enable various user functions. An eye gesture may furthermore be utilized to transition the electronic contact lens 110 from the full power state back to the low power state.
In other implementations, the power state control module 224 may perform some automatic transitions between power states. For example, if the user is wearing two electronic contact lenses 110, one lens 110 may operate in the low power state described above while the other lens 110 may operate in a sleep state in which it does not track eye motion. When the lens 110 in the low power state detects an eye gesture for transitioning to the full power state, it transitions to the full power state and sends a signal to the other contact lens 110 to cause it to enter the full power state. In an embodiment, the lenses 110 may automatically switch which lens 110 operates in the low power state and which operates in the sleep state. The lenses 110 may switch periodically or based on their relative battery levels. For example, the lens 110 with the lower battery level may be configured to operate in the sleep state while the lens 110 with the higher battery level operates in the low power state.
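A minimal sketch of the battery-based role assignment described above follows; the function and role names, and the tie-breaking rule when the levels are equal, are assumptions.

```python
def assign_lens_roles(battery_left: float, battery_right: float):
    """Return (left_role, right_role): the lens with the higher battery level
    handles low-power gesture detection while the other lens sleeps."""
    if battery_left >= battery_right:      # ties resolved in favor of the left lens (assumed)
        return "low_power", "sleep"
    return "sleep", "low_power"
```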
The interface control module 226 controls various user functions of the electronic contact lens 110 that may be invoked responsive to a detected eye gesture or other command input (e.g., a voice input). The interface control module 226 may generate a user interface displayed by the femtoprojector 120 including virtual elements that the user may interact with such as virtual objects, text, menus, or other elements. Eye gestures may be detected to initiate actions such as activating or deactivating a virtual menu, selecting an item of a virtual menu, switching between virtual menus, interacting with virtual objects, or controlling settings of the electronic contact lens 110. In an embodiment, different types of eye gestures or performing eye gestures of the same type in different directions may invoke different actions. For example, the direction of the eye gesture may control a position of the user interface display.
In some embodiments, the same eye gesture may be mapped to different functions in different power states. For example, the same eye gesture could be used to transition from the low power state to the full power state and vice versa.
In an embodiment, a pre-activation filter may operate to only initiate detection of a particular type of eye gesture when the electronic contact lens 110 is in a specific state. For example, when the electronic contact lens 110 is in a low power state, it may operate to detect an eye gesture for transitioning to the full power state but does not necessarily operate to detect other types of eye gestures that are only applicable in the full power state. Similarly, when the electronic contact lens 110 is in the full power state, it does not necessarily operate to detect an eye gesture that solely operates to invoke a transition to the full power state.
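One way to picture the pre-activation filter and the state-dependent mapping of gestures to actions is a dispatch table keyed on the current power state and gesture type, as sketched below. All gesture and action names are assumptions; unmapped combinations are simply ignored.

```python
from typing import Optional

# Illustrative mapping of (power state, gesture type) -> action.
GESTURE_ACTIONS = {
    ("low",  "pitch_up_down"): "enter_full_power",
    ("full", "pitch_up_down"): "enter_low_power",   # same gesture, different action
    ("full", "pitch_down_up"): "toggle_menu",
}

def action_for_gesture(power_state: str, gesture: str) -> Optional[str]:
    # Gestures with no mapping in the current power state are ignored,
    # mirroring the pre-activation filter described above.
    return GESTURE_ACTIONS.get((power_state, gesture))
```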
In the example gesture 300, the user first pauses at an initial position 304, then executes a first motion 310 to the second position 306 (e.g., near a bottom edge of the range 302), then executes a second motion 312 to the third position 308 (e.g., near a top edge of the range 302). The shorter duration of the pause at position 306 may be consistent with a user attempting to transition between the motions quickly (i.e., without deliberately trying to stabilize the eye at position 306), while the longer durations of the pauses at positions 304, 308 may be consistent with deliberate pauses at these positions.
In the illustrated example, the pitch of the starting position 304 is in between the pitches of the second position 306 and third position 308, i.e., p2<p1<p3 where p1, p2, and p3 are the pitches of the first position 304, second position 306, and third position 308 respectively.
As illustrated, the gestures 300, 400 of
In an embodiment, the eye gestures 300, 400 of
In Example A 610, the motion constitutes a valid activation associated with the gesture. Here, both pitch thresholds Q, R are crossed and the total span exceeds the minimum span SMIN. Example B 620 illustrates an example of a motion that does not result in an activation associated with the gesture. In this case, both thresholds Q and R are crossed, but the minimum span SMIN criterion is not met. Example C 630 illustrates another example of a motion that fails to result in an activation associated with the gesture. Here, the minimum span SMIN is met and the threshold R is crossed, but the gesture fails because the threshold Q is not crossed.
In a first state S1, the motion analysis module 222 detects when the eye meets a stability metric. For example, the motion analysis module 222 may determine that the stability metric is met when Δt1>Δt1-MIN where Δt1 is a time window during which the variation in pitch stays within a predefined stability range (e.g., I±δ where I is an arbitrary starting pitch) and Δt1-MIN is a predefined time threshold. In an embodiment, the threshold Δt1-MIN may be set to, for example, Δt1-MIN=0.2 seconds. In a typical gesture, the period Δt1 may last, for example, 2-3 seconds. The starting pitch I is not necessarily a predefined pitch and may represent, for example, an average pitch during the stable period Δt1 that may be computed once the stability metric is met. After determining the starting pitch I associated with the stable period Δt1, the pitch thresholds Q, R may be determined as predefined offsets from the starting pitch I. The state machine transitions from the first state S1 to the second state S2 after the stability metric is met and the pitch subsequently exits the stability range (e.g., crosses a pitch threshold I+δ). In an embodiment, the pitch threshold I+δ for transitioning to the second state S2 may be in between I and Q.
When in the second state S2, the motion analysis module 222 detects if the pitch crosses the first threshold Q and subsequently crosses the second threshold R such that the time between crossing Q and R is within a predefined time window. For example, the motion analysis module 222 detects if Δt2-MIN<Δt2<Δt2-MAX, where Δt2 is the time between crossing threshold Q and threshold R, and Δt2-MIN, Δt2-MAX are predefined time thresholds. In an example embodiment, the time window is defined as Δt2-MIN=0.2 seconds and Δt2-MAX=1 second. If the second state criteria are met, the state machine moves to the third state S3. If the second state criteria are not met within the maximum time Δt2-MAX, the activation fails and the state machine may reset to the starting state S1.
In the third state S3, the motion analysis module 222 detects if the total span of the pitch exceeds a minimum span SMIN within a predefined time period. For example, the third state criteria are met if pPEAK−pt>SMIN (i.e., the span criterion is met) and Δt3<Δt3-MAX, where pPEAK is the peak pitch detected during the second state S2, pt is the current pitch, SMIN is a predefined minimum span, Δt3 is the time between crossing the R threshold and the span criterion being met, and Δt3-MAX is a predefined time limit. For example, in an embodiment, Δt3-MAX=0.4 seconds. If the span criterion is not met within the maximum time period Δt3-MAX, the activation fails and the state machine may reset to the starting state S1.
In the fourth state S4, the motion analysis module 222 detects when the velocity of the pitch trajectory sufficiently slows or changes direction to indicate that the pitch is stabilizing. For example, the fourth state criteria may be met when |dp/dt|<vMIN or when dp/dt changes sign, where vMIN is a predefined velocity threshold. The final pitch F is determined as the pitch when the fourth state criteria are met. In an embodiment, state S4 is not constrained by a time limit.
In the fifth state S5, the motion analysis module 222 detects if an ending stability metric is met. Here, the ending stability criteria may be met when the pitch remains relatively stable (e.g., within a predefined pitch range) for at least a minimum time period, i.e., Δt5>Δt5-MIN, where Δt5 is the period of time that the pitch remains within a predefined pitch range of the ending pitch F determined in state S4 (e.g., the pitch remains in the range F±δ during the time period Δt5) and Δt5-MIN is a predefined time threshold. In an example embodiment, the minimum time period is set as Δt5-MIN=0.2 seconds. In a typical gesture, the time period Δt5 may last, for example, 2-3 seconds. Once the ending criteria are met, the gesture is detected and the associated action may be activated 710.
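The sketch below consolidates states S1 through S5 into a streaming detector, using the example timing values given above (0.2 s, 0.2-1 s, 0.4 s, 0.2 s). It is not the claimed implementation: the stability band δ, the offsets defining Q and R, the minimum span, the velocity threshold, the use of the first in-band sample as the starting pitch I, and the reset behavior on failed criteria are all illustrative assumptions. Feeding timestamped pitch samples (e.g., from the orientation estimate) into update() returns True at the point corresponding to activation 710.

```python
class PitchGestureDetector:
    DT1_MIN, DT2_MIN, DT2_MAX = 0.2, 0.2, 1.0     # seconds (from the text)
    DT3_MAX, DT5_MIN = 0.4, 0.2                   # seconds (from the text)
    DELTA = 1.5                                   # stability band, degrees (assumed)
    Q_OFFSET, R_OFFSET, S_MIN = +8.0, -8.0, 20.0  # degrees (assumed)
    V_MIN = 5.0                                   # deg/s, stabilization threshold (assumed)

    def __init__(self):
        self.prev = None          # previous (t, pitch) sample
        self.reset()

    def reset(self):
        self.state = "S1"
        self.win_start = None     # start time of the current stability window
        self.i_pitch = None       # I: pitch of the stable starting period
        self.t_q = None           # time the pitch crossed threshold Q
        self.t_r = None           # time the pitch crossed threshold R
        self.peak = None          # peak pitch seen after entering S2
        self.f_pitch = None       # F: ending pitch determined in S4

    def update(self, t, pitch):
        """Feed one (time, pitch) sample; return True when the gesture is detected."""
        detected = False
        if self.state == "S1":
            if self.i_pitch is None:
                self.win_start, self.i_pitch = t, pitch
            elif abs(pitch - self.i_pitch) <= self.DELTA:
                pass                                           # still within the stability band
            elif (t - self.win_start) >= self.DT1_MIN and pitch > self.i_pitch + self.DELTA:
                self.state, self.peak = "S2", pitch            # stability met, band exited upward
            else:
                self.win_start, self.i_pitch = t, pitch        # stability broken: restart window
        elif self.state == "S2":
            self.peak = max(self.peak, pitch)
            if self.t_q is None:
                if pitch >= self.i_pitch + self.Q_OFFSET:
                    self.t_q = t                               # crossed threshold Q
            elif pitch <= self.i_pitch + self.R_OFFSET:
                if self.DT2_MIN < (t - self.t_q) < self.DT2_MAX:
                    self.t_r, self.state = t, "S3"             # crossed R within the time window
                else:
                    self.reset()                               # timing constraint failed
            elif (t - self.t_q) > self.DT2_MAX:
                self.reset()                                   # took too long to reach R
        elif self.state == "S3":
            if (self.peak - pitch) > self.S_MIN:
                self.state = "S4"                              # span criterion met
            elif (t - self.t_r) > self.DT3_MAX:
                self.reset()                                   # span not reached in time
        elif self.state == "S4":
            if self.prev is not None and t > self.prev[0]:
                velocity = (pitch - self.prev[1]) / (t - self.prev[0])
                if abs(velocity) < self.V_MIN or velocity > 0:
                    self.f_pitch, self.win_start = pitch, t    # F: pitch is stabilizing
                    self.state = "S5"
        elif self.state == "S5":
            if abs(pitch - self.f_pitch) > self.DELTA:
                self.reset()                                   # ending stability broken
            elif (t - self.win_start) >= self.DT5_MIN:
                detected = True                                # gesture detected (activation 710)
                self.reset()
        self.prev = (t, pitch)
        return detected
```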
In the example timing diagram of
As described above, in an alternative implementation, the activation 710 does not necessarily occur when the criteria of state S5 are met. Instead, the motion analysis module 222 may generate a score for each state based on a set of target criteria that may be different than the criteria for transitioning between states. Then, upon reaching the end of the state machine, the motion analysis module 222 combines the scores (e.g., as an average or weighted sum) and compares the combined score to an activation threshold to determine whether or not the gesture is detected.
Furthermore, as described above, instead of evaluating different criteria during sequential states of a state machine, the motion analysis module 222 may instead evaluate an overall similarity metric of a set of samples captured over a time window against a target motion. The similarity metric may then be compared to a threshold to determine whether or not the gesture is detected.
In an alternative embodiment, the techniques described herein can apply to an augmented reality system, a virtual reality system, or a displayless eye-tracking system that is not necessarily embodied as an electronic contact lens 110. For example, in an embodiment, the described eye gestures can be recognized by a glasses-type augmented reality device or a different type of head-mounted device. In these embodiments, motion data may be captured from an eye-facing camera integrated in the head-mounted device instead of from motion sensors mounted directly to the eye. Here, images captured from the integrated camera are processed to estimate eye movements and to detect gestures from those eye movements using the same techniques described above in
Although the detailed description contains many specifics, these should not be construed as limiting the scope of the invention but merely as illustrating different examples. It should be appreciated that the scope of the disclosure includes other embodiments not discussed in detail above. Various other modifications, changes and variations which will be apparent to those skilled in the art may be made in the arrangement, operation and details of the method and apparatus disclosed herein without departing from the spirit and scope as defined in the appended claims. Therefore, the scope of the invention should be determined by the appended claims and their legal equivalents.
Alternate embodiments are implemented in computer hardware, firmware, software and/or combinations thereof. Implementations can be implemented in a computer program product tangibly embodied in a non-transitory computer-readable storage device for execution by a programmable processor; and method steps can be performed by a programmable processor executing a program of instructions to perform functions by operating on input data and generating output. Embodiments can be implemented advantageously in one or more computer programs that are executable on a programmable system including at least one programmable processor coupled to receive data and instructions from, and to transmit data and instructions to, a data storage system, at least one input device and at least one output device. Each computer program can be implemented in a high-level procedural or object-oriented programming language, or in assembly or machine language if desired; and in any case, the language can be a compiled or interpreted language. Suitable processors include, by way of example, both general and special purpose microprocessors. Generally, a processor will receive instructions and data from a read-only memory and/or a random access memory. Generally, a computer will include one or more mass storage devices for storing data files. Any of the foregoing can be supplemented by, or incorporated in, ASICs (application-specific integrated circuits) and other forms of hardware.