This application relates generally to analysis of mental states and more particularly to mental state analysis using eye blink rates.
An individual's mental state may manifest itself in many different ways. Some of these manifestations are externally detectable, such as facial expressions, heart rate, sweating, and changes to respiration and blood pressure. A person's mental state may be impacted by many types of external stimuli. One increasingly common external stimulus is interaction with a computer. People spend ever-larger amounts of time interacting with computers, and consume a vast amount of computer-delivered media. This interaction may be for many different reasons, such as a desire for educational content, entertainment, social media interaction, document creation, and gaming, to name a few.
In some cases the human-computer interaction can take the form of a person performing a task using the computer and a software tool running on the computer. Examples of this can include creating a document, editing a video, or doing one or more of the other activities enabled by modern computers. The person might find certain activities interesting or even exciting to perform, and might be surprised at how easy it is to accomplish the activities. The person can become excited, happy, or content as they perform those activities. On the other hand, the person might find some activities difficult to perform, and might become frustrated or even angry with the computer, even though the computer is oblivious to his or her emotions. In other cases of human-computer interaction, the person might be consuming content or media, such as news, pictures, music or video. A person's mental state can prove useful in determining whether the person enjoys the media.
In some cases, users can be surveyed in an attempt to determine their mental state in reaction to a stimulus, such as the previously mentioned human-computer interaction. Survey results are often unreliable because surveys are typically administered well after the activity was performed. Additionally, survey participation rates may be low, and people might not provide accurate and honest answers to the survey. In another manner of determining mental state reactions, people can self-rate media to communicate personal preferences by entering a specific number of stars corresponding to a level of like or dislike. These types of subjective evaluations are, in many cases, neither a reliable nor a practical way to evaluate personal response to media. Recommendations based on such methods are imprecise, subjective, and unreliable, and are often further subject to problems stemming from the small number of individuals willing to participate in such evaluations.
A mental state analysis includes obtaining video of an individual as the individual is interacting with a computer, either by performing various operations or by consuming a media presentation. The video is then analyzed to determine eye blink information on the individual, such as eye blink rate and/or eye blink duration. A mental state of the individual is then inferred based on the eye blink information. A computer-implemented method for mental state analysis is disclosed comprising obtaining video of an individual; analyzing the video to detect a blink event; and inferring mental states of the individual based on the blink event.
The method may include evaluating blink duration for the blink event. The method may further comprise using the blink event and one or more other blink events to determine blink-rate information. The method can further include aggregating the blink-rate information for the individual with blink-rate information for a plurality of other people. The method may include determining a classifier for a blink. The mental states inferred may include one or more of attention, concentration, boredom, or fatigue.
In embodiments, a computer program product embodied in a non-transitory computer readable medium for mental state analysis may comprise: code for obtaining video of an individual; code for analyzing the video to determine a blink event; and code for inferring mental states of the individual based on the blink event. In some embodiments, a computer system for mental state analysis may comprise: a memory which stores instructions; one or more processors attached to the memory wherein the one or more processors, when executing the instructions which are stored, are configured to: obtain video of an individual; analyze the video to determine a blink event; and infer mental states of the individual based on the blink event. In embodiments, a computer-implemented method for mental state analysis may comprise: receiving eye blink-rate information obtained from video of an individual; and inferring mental states of the individual based on the eye blink-rate information. In some embodiments, a computer-implemented method for mental state analysis may comprise: capturing video of an individual; analyzing the video to determine eye blink-rate information; and sending the eye blink-rate information for inferring mental states. In embodiments, a computer-implemented method for mental state analysis may comprise: receiving eye blink-rate information based on video of an individual; receiving mental state information inferred from the eye blink-rate information; and rendering one or more of the eye blink-rate information and the mental state information which was inferred.
Various features, aspects, and advantages of various embodiments will become more apparent from the following further description.
The following detailed description of certain embodiments may be understood by reference to the accompanying figures.
Many manifestations of an individual's mental state can be observed through the individual's actions and/or behavior. Such external manifestations that can be related to mental state include facial movements such as smiling, frowning, grimacing, and laughing. One additional facial movement that can be related to an individual's mental state is eye blinking. That is, the rate at which an individual blinks his or her eyes and/or the duration of a single eye blink can be related to an individual's mental state.
An individual's mental state can be impacted by his or her interaction with a computer. Understanding the individual's mental state during such interactions can be valuable for a variety of reasons, such as improving the program that the individual is using, rating a media presentation, or optimizing an advertisement. Traditional methods of monitoring mental state often prove ineffective during an individual's interaction with a computer, for a variety of reasons. For example, surveys or rating systems are prone to non-participation and inaccurate reporting, and even though physiological information can in some instances provide an accurate measure of mental state, traditional physiological monitoring devices are intrusive and not available at most computer workstations.
In contrast, a webcam is able to unobtrusively monitor an individual as they interact with the computer. Many computer systems today already include a webcam, and for systems that do not already have one, a webcam can be easily and inexpensively added to nearly any modern computer workstation. An individual can interact with a computer to view a media presentation or to perform some type of task on the computer while being monitored by a webcam. In some embodiments, some other type of image capture device, for example, a security camera or a camera on a mobile device such as a tablet or a phone, is used to monitor the individual in place of, or in addition to, the webcam. The video from the webcam is then analyzed to determine eye blink information. The eye blink information can include eye-blink rate, eye-blink duration, time between blinks, and/or other information related to one or more eye blinks by the individual being monitored.
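By way of a non-limiting illustration, the frame-capture step described above might be realized as in the following Python sketch, assuming the OpenCV library (cv2) is available and the webcam is the default capture device at index 0; this is illustrative only, not a required implementation.

```python
# A minimal sketch of unobtrusive frame capture, assuming OpenCV (cv2)
# is installed and the webcam is the default capture device (index 0).
import cv2

def capture_frames(device_index=0, max_frames=300):
    """Yield video frames while the individual interacts with the computer."""
    cap = cv2.VideoCapture(device_index)
    try:
        for _ in range(max_frames):
            ok, frame = cap.read()
            if not ok:
                break  # camera unavailable or stream ended
            yield frame
    finally:
        cap.release()
```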
Once the eye blink information is determined, the eye blink information can be correlated with context, for example, the activity being performed by the user, demographic information about the user such as the user's age and/or gender, the time of day, the brightness of the screen and/or environment, or other contextual information. In some embodiments, the eye-blink information is compensated, or adjusted, based on the context. The eye blink information can then be used to infer the mental state of the individual, which is correlated to context in some embodiments. The mental state can be used to modify the activity being performed, a game being played, a choice of advertisement to be displayed, a media presentation, or some other activity. In some embodiments, an output is rendered to display the mental states and/or eye blink information, which can be correlated with the context, such as the timeline of a media presentation.
The flow 100 further comprises analyzing the video 120 to detect a blink event. A blink event can start with an eye being open but starting to close. The blink event can conclude with the eye opening or returning to its normal state. The analysis of the video can include detecting, on each frame of the video or a portion of the video, whether an eye is open, closed, or in between. By analyzing surrounding frames, and possibly the video as a whole, a blink can be differentiated from a wink, sleeping or relaxing, looking down, and the like. The analyzing can comprise determining a classifier 121 for a blink in order to identify eye blinks in the video. In some embodiments, the blink event is detected using the classifier. The flow 100 includes using the blink event and one or more other blink events to determine blink-rate information 130. The analyzing can filter out single eye winks 122, as eye winks sometimes represent a conscious act and may not be a reliable indicator of mental state. The analyzing can filter out looking down 123 by the individual. As the individual looks down, the individual's eyes can give an appearance of blinking, depending on the position of the camera, even if the eyes do not actually blink. Likewise, eye closures which are longer than blinks can be filtered out. In at least some embodiments, the classifier is configured to perform the filtering and differentiation for winks, looking down, and eye closures.
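As a non-limiting sketch of the detection and filtering just described, the following Python fragment uses a per-eye "openness" score per frame (for example, an eye aspect ratio computed from facial landmarks) as a stand-in for the classifier 121; the threshold and frame limits are illustrative assumptions, and the gaze analysis needed to filter looking down 123 is omitted.

```python
# A sketch of blink-event detection over per-frame eye-openness scores.
# EAR_CLOSED and MAX_BLINK_FRAMES are assumed values, not from the source.
EAR_CLOSED = 0.2        # below this, treat an eye as closed (assumed)
MAX_BLINK_FRAMES = 7    # longer closures are filtered as non-blinks (assumed)

def detect_blink_events(frames_ear):
    """frames_ear: list of (left_openness, right_openness) per frame.
    Returns (start_frame, end_frame) pairs for detected blink events."""
    events, start = [], None
    for i, (left, right) in enumerate(frames_ear):
        left_closed, right_closed = left < EAR_CLOSED, right < EAR_CLOSED
        if left_closed != right_closed:
            start = None                   # one-eyed closure: filter out winks
        elif left_closed and right_closed:
            if start is None:
                start = i                  # both eyes just closed: candidate blink
        elif start is not None:
            if i - start <= MAX_BLINK_FRAMES:
                events.append((start, i))  # blink event: close, then reopen
            start = None                   # longer spans are closures, not blinks
    return events
```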
The video is analyzed for information in addition to eye blink-rate information in some embodiments. For example, the flow 100 can further comprise evaluating blink duration 124, because the length of time that an individual's eyes are closed can be indicative of one or more mental states. Some embodiments further comprise evaluating average blink duration 124. The blink-rate information can include information on blink duration. Some embodiments further comprise determining context for the individual. Some embodiments determine context directly from the video, such as lighting conditions, the number of people in the room, or other context. Additional context can be gathered from other sources such as direct input by the user, login credentials, the programs currently running, file names being accessed, various types of sensors such as thermometers, or the computer's clock/calendar, among other inputs. Some embodiments include compensating blink-rate information for a context 126. For example, the brightness of the monitor or room can have an impact on the blink-rate that is unrelated to the individual's mental state, and therefore can be compensated for so that the eye blink-rate more accurately reflects the mental state of the individual.
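The compensation step might, as one illustrative assumption, take a simple linear form against a reference screen brightness, shown in the sketch below together with a helper that evaluates blink duration 124; nothing in the disclosure mandates this particular correction, and the reference point and coefficient are assumed values.

```python
# A sketch of compensating blink-rate information for context 126 and of
# evaluating blink duration 124. Reference point and coefficient are assumed.
REFERENCE_BRIGHTNESS = 0.5   # normalized 0..1 display brightness (assumed)
BRIGHTNESS_COEFF = 4.0       # blinks/min change per unit brightness (assumed)

def compensate_blink_rate(raw_rate_bpm, screen_brightness):
    """Remove the brightness-driven component of a raw blink rate."""
    return raw_rate_bpm - BRIGHTNESS_COEFF * (screen_brightness - REFERENCE_BRIGHTNESS)

def mean_blink_duration_s(events, fps):
    """Average blink duration in seconds from (start, end) frame pairs."""
    if not events:
        return 0.0
    return sum(end - start for start, end in events) / (len(events) * fps)
```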
The flow 100 further comprises inferring mental states of the individual based on the eye blink-rate information 140. The inferring can be based on the blink duration. The inferring of mental states can include one or more of attention, concentration, boredom, or fatigue. In some embodiments, the inferring of mental states includes one or more mental states of frustration, confusion, disappointment, hesitation, cognitive overload, focusing, engagement, exploration, confidence, trust, delight, disgust, skepticism, doubt, satisfaction, excitement, laughter, calmness, stress, sadness, anger, happiness, and curiosity. While various values of eye blink-rates and/or durations, as well as changes in the eye blink-rates and/or durations, can be indicative of various mental states, a higher blink rate can indicate a mental state of being focused. In some embodiments, the inferring can include evaluation of an impaired state, such as being ill or under the influence of alcohol or drugs. Various steps in the flow 100 may be changed in order, repeated, omitted, or the like without departing from the disclosed concepts. Various embodiments of the flow 100 may be included in a computer program product embodied in a non-transitory computer readable medium that includes code executable by one or more processors.
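For illustration only, a coarse mapping from the compensated blink metrics to some of the mental states named above might look like the following sketch; the cutoffs are assumptions rather than values from the disclosure, beyond the stated point that a higher blink rate can indicate focus.

```python
# A sketch of inferring a mental state from blink metrics. Cutoffs assumed.
def infer_mental_state(blink_rate_bpm, mean_duration_s):
    if mean_duration_s > 0.5:     # unusually long blinks (assumed): fatigue
        return "fatigue"
    if blink_rate_bpm > 25:       # elevated rate (assumed): focused, per the text
        return "focused"
    if blink_rate_bpm < 8:        # suppressed rate (assumed): concentration
        return "concentration"
    return "neutral"
```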
Some embodiments use the mental state information to render an output 220. The output can include the eye blink-rate information 222 and/or the mental states 224 which were inferred. The output can display a correlation between the blink-rate information and a stimulus which the individual is encountering. The inferred mental states can be correlated to a context for the individual. In some embodiments, the mental states and/or the context trigger an action to be taken 230. The actions which may be taken based on inferred mental state include selecting an advertisement 232, modifying a game 234, modifying a media presentation 236, or the like. Various steps in the flow 200 may be changed in order, repeated, omitted, or the like without departing from the disclosed concepts. Various embodiments of the flow 200 may be included in a computer program product embodied in a non-transitory computer readable medium that includes code executable by one or more processors.
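The action-taking step 230 might be wired up as in the sketch below; select_advertisement, modify_game, and modify_media are hypothetical placeholders for application-specific code, not functions defined by this disclosure.

```python
# A sketch of triggering actions 232-236 from an inferred mental state.
def select_advertisement(context, engaged):   # hypothetical placeholder
    pass

def modify_game(context, difficulty):         # hypothetical placeholder
    pass

def modify_media(context, pacing):            # hypothetical placeholder
    pass

def act_on_mental_state(state, context):
    if state == "boredom":
        modify_media(context, pacing="faster")       # re-engage the viewer
    elif state == "frustration":
        modify_game(context, difficulty="easier")    # reduce the challenge
    elif state in ("focused", "attention"):
        select_advertisement(context, engaged=True)  # capitalize on engagement
```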
The webcam 330 can capture video, audio, and/or still images of the individual 310. A webcam, as the term is used herein, can include a video camera, a still camera, a thermal imager, a CCD device, a phone camera, a three-dimensional camera, a depth camera, multiple webcams used to show different views of a person, or any other type of image capture apparatus that can allow captured data to be used in an electronic system. The images of the person 310 from the webcam 330 can be captured by a video capture unit 340. In some embodiments, video is captured, while in others, one or more still images are captured. The system 300 can include analyzing the video for eye blink-rate information 350, eye blink duration, facial data, and/or physiological data. The facial data includes information on facial expressions, action units, head gestures, smiles, smirks, brow furrows, squints, lowered eyebrows, raised eyebrows, or attention, in various embodiments. Analysis of physiological data can also be performed based on the video. Respiration, heart rate, heart rate variability, perspiration, temperature, and other physiological indicators of mental state can be determined by analyzing the video.
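One way such physiological data might be recovered from the same video is sketched below using the known remote-plethysmography approach of averaging the green channel over the face region and locating the dominant frequency; the band limits are assumptions, and face-region selection is not shown.

```python
# A sketch of heart-rate estimation from facial video. green_means is the
# per-frame mean green intensity over a face region (selection not shown).
import numpy as np

def heart_rate_from_video(green_means, fps):
    signal = np.asarray(green_means, dtype=float)
    signal -= signal.mean()                  # drop the DC component
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= 0.75) & (freqs <= 4.0)  # 45-240 bpm physiological band
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return 60.0 * peak_hz                    # beats per minute
```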
A process for mental state analysis can comprise collecting physiological data or accelerometer data with a biosensor. Mental states can be inferred based on physiological data (such as the physiological data captured by the sensor 412) along with blink-rate information. Mental states can also be inferred based, in part, on facial expressions and head gestures observed by a webcam or a combination of data from the webcam along with data from the sensor 412. The mental states can be analyzed based on arousal and valence. Arousal can range from being highly activated, such as when someone is agitated, to being entirely passive, such as when someone is bored. Valence can range from being very positive, such as when someone is happy, to being very negative, such as when someone is angry. Physiological data can include one or more of electrodermal activity (EDA), heart rate, heart rate variability, skin temperature, respiration, skin conductance or galvanic skin response (GSR), accelerometer readings, and other types of analysis of a human being. It will be understood that both here and elsewhere in this document, physiological information can be obtained either by biosensor 412 or by facial observation via the webcam 330.
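As an illustrative sketch, arousal and valence might be represented and combined from standardized signals as follows; the weights are assumptions for demonstration, not values from the disclosure.

```python
# A sketch of placing inferred mental state on arousal/valence axes.
from dataclasses import dataclass

@dataclass
class MentalStatePoint:
    arousal: float  # -1.0 (passive, e.g. bored) .. +1.0 (activated, e.g. agitated)
    valence: float  # -1.0 (negative, e.g. angry) .. +1.0 (positive, e.g. happy)

def combine_signals(blink_rate_z, eda_z, smile_score):
    """blink_rate_z, eda_z: z-scored signals; smile_score: 0..1 facial cue.
    The weights below are assumed for illustration."""
    arousal = 0.5 * eda_z + 0.3 * blink_rate_z
    valence = 2.0 * smile_score - 1.0
    clamp = lambda v: max(-1.0, min(1.0, v))
    return MentalStatePoint(clamp(arousal), clamp(valence))
```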
Electrodermal activity can also be collected. The electrodermal activity can be analyzed 430 to indicate arousal, excitement, boredom, or other mental states based on observed changes in skin conductance. Skin temperature can also be collected and/or recorded on a periodic basis and in turn may be analyzed 432. Changes in skin temperature can indicate arousal, excitement, boredom, or other mental states. Heart rate information can be collected and recorded and can also be analyzed 434. A high heart rate can indicate excitement, arousal or another mental state. Accelerometer data can be collected and used to track one, two, or three dimensions of motion. The accelerometer data can be recorded. The accelerometer data can be used to create an actigraph showing an individual's activity level over time. The accelerometer data can be analyzed 436 and can indicate a sleep pattern, a state of high activity, a state of lethargy, or another state. The various data collected by the biosensor 412 can be used along with the eye blink-rate information captured by the webcam in the analysis of mental state. Contextual information can also be based on one or more of skin temperature or accelerometer data.
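The per-signal analyses 430 through 436 might, in a simple illustrative form, reduce each sensor stream to an indicator, as in the sketch below; the window size is an assumed parameter.

```python
# A sketch of biosensor stream analysis: a skin-conductance trend as an
# arousal indicator, and an actigraph from accelerometer magnitudes.
import math

def eda_trend(eda_samples):
    """Rising skin conductance across the window can indicate arousal."""
    return eda_samples[-1] - eda_samples[0] if len(eda_samples) > 1 else 0.0

def actigraph(accel_xyz, window=60):
    """Collapse (x, y, z) accelerometer samples into per-window activity."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in accel_xyz]
    return [sum(mags[i:i + window]) for i in range(0, len(mags), window)]
```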
A second track 562 can include continuously collected mental state data such as electrodermal activity data 530. A third track 564 can include mental state data such as facial data 540, which can be collected on an intermittent basis by a first camera (although in some embodiments the facial data is collected continuously). The facial data can be collected intermittently when the individual is looking toward a camera. The facial data 540 can include one or more still photographs, videos, or abstracted facial expressions, which can be collected when the user looks in the direction of the camera.
A fourth track 566 can include eye blink-rate information which can be determined using video. The video is collected sporadically, in some embodiments, so the blink-rate information may not be continuous. A first set of blink-rate information 544 can be determined for a first period of time, a second set of blink-rate information 546 can be determined for a second period of time, and a third set of blink-rate information 548 can be determined for a third period of time.
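The per-period blink-rate sets 544, 546, and 548 might be computed from detected blink timestamps and the intervals during which video was actually collected, as in this sketch.

```python
# A sketch of computing blink rates for discrete collection periods.
def blink_rates_per_period(blink_times_s, periods_s):
    """blink_times_s: timestamps (seconds) of detected blinks.
    periods_s: (start, end) intervals with video coverage.
    Returns blinks per minute for each period."""
    rates = []
    for start, end in periods_s:
        count = sum(1 for t in blink_times_s if start <= t < end)
        duration = end - start
        rates.append(60.0 * count / duration if duration > 0 else 0.0)
    return rates
```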
A fifth track 568 can include contextual data, which is collected along with the collection of the mental state data. In the example shown, the fifth track 568 includes location 554, environmental information 556, and time 558, although other types of contextual data can be collected in other embodiments. In the embodiment shown, the fifth track 568 allows contextual data to be associated with, and correlated to, the fourth track 566 containing the eye blink-rate information. Some analysis can evaluate and combine multiple tracks of additional data associated with one or more tracks of mental state data. For example, another track can include identity information about the individual being monitored by a camera, which, in embodiments, is the same camera that captures the third track 564 or the fourth track 566 of mental state data.
Additional tracks, through the nth track 570, of mental state data or additional data of any type can be collected. The additional tracks 570 can be collected on a continuous or on an intermittent basis. The intermittent basis can be either occasional or periodic. Analysis can further comprise interpolating mental state data when the mental state data collected is intermittent, and/or imputing additional mental state data where the mental state data is missing. One or more interpolated tracks 576 can be included and can be associated with mental state data that can be collected on an intermittent basis, such as the eye blink-rate data of the fourth track 566. Interpolated data 545 and a second instance of interpolated data 547 can contain interpolations of the eye blink-rate data of the fourth track 566 for the time periods where no blink-rate data was collected in that track. Other embodiments can interpolate data for periods where other types of information are missing. In other embodiments, analysis includes interpolating mental state analysis when the collected mental state data is intermittently available.
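Linear interpolation is one plausible, though not mandated, way to realize the interpolated track 576; a sketch using numpy follows.

```python
# A sketch of interpolating intermittent blink-rate data across gaps.
import numpy as np

def interpolate_blink_rate(sample_times_s, sample_rates_bpm, query_times_s):
    """Estimate blink rates at times where no video was collected."""
    return np.interp(query_times_s, sample_times_s, sample_rates_bpm)

# Example: rates of 12 and 20 blinks/min measured at t=0s and t=120s
# impute 16 blinks/min at t=60s:
# interpolate_blink_rate([0, 120], [12.0, 20.0], [60])  ->  array([16.])
```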
In embodiments, the flow 600 includes evaluating blinking for a group of people 650 of which the individual is a part. If a group of people is simultaneously viewing an event, a video, or another media presentation, then the members of the group will often blink at the same time. The blinking can occur at a scene change, a lighting change, and so on. If someone is not paying attention, then that person's blinking can occur at different times from the blinking of those who are paying attention. The method can include evaluating synchronicity of blinking for the group. In some embodiments, the method includes determining a difference in blinking between the individual and a remainder of the group. The difference can be used to determine a mental state for the individual. In some cases, the inferred mental state includes a lack of attention. Various steps in the flow 600 may be changed in order, repeated, omitted, or the like without departing from the disclosed concepts. Various embodiments of the flow 600 may be included in a computer program product embodied in a non-transitory computer readable medium that includes code executable by one or more processors.
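Synchronicity evaluation might, as a sketch, score how often the individual's blinks coincide with blinks from the remainder of the group; the tolerance window below is an assumed parameter.

```python
# A sketch of evaluating blink synchronicity for a group 650.
def synchrony_score(individual_blinks_s, group_blinks_s, tolerance_s=0.25):
    """Fraction of the individual's blinks landing within tolerance_s of a
    blink by the rest of the group; a low score can suggest lack of attention."""
    if not individual_blinks_s:
        return 0.0
    matched = sum(
        1 for t in individual_blinks_s
        if any(abs(t - g) <= tolerance_s for g in group_blinks_s)
    )
    return matched / len(individual_blinks_s)
```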
An individual can interact with the mental state collection machine 720, interact with another computer, or view a media presentation on another electronic display, among other activities. The mental state collection machine 720 can capture video of the interacting individual, and determine blink-rate information for the individual based on the video. The mental state collection machine 720 can then infer mental states based on the blink-rate information or in some way process mental state data that was collected. The mental state collection machine 720 can then send the blink-rate information 730 to another computer (such as the analysis server 750) across the internet 710 or using another computer-aided communications medium. In some embodiments, the mental state collection machine 720 sends the raw video showing a blinking individual to another machine. In other embodiments, the mental state collection machine infers mental states and sends the mental states to another machine, such as the rendering machine 770. In some embodiments, the one or more processors 724 can be configured to perform a computer-implemented method for mental state analysis comprising capturing video of an individual, analyzing the video to determine eye blink-rate information, and sending the eye blink-rate information.
Some embodiments can include an analysis server 750. The analysis server 750 can include one or more processors 754 coupled to a memory 756 to store instructions. In embodiments, the analysis server 750 includes a display 752. The analysis server 750 can receive the blink rate information 740 from the mental state collection machine 720 through the internet 710. The one or more processors 754 can be configured to perform a computer-implemented method for mental state analysis, which, in embodiments, comprises receiving eye blink-rate information obtained from video of an individual and inferring mental states of the individual based on the eye blink-rate information. In some embodiments, the analysis server 750 is configured as a web server, so the inferring of the mental states can be performed by a web service.
The system 700 can include a rendering machine 770. The rendering machine can include one or more processors 774 coupled to a memory 776 to store instructions, and a display 772. The rendering machine 770 can receive mental state information from the internet 710 or another computer-aided communication method. The mental state information can include eye blink-rate information from the analysis server 750 or from the mental state collection machine 720, and the rendering machine 770 can render an output to the display 772. Thus, the system 700 can enable a computer-implemented method for mental state analysis comprising receiving eye blink-rate information based on video of an individual, receiving mental state information inferred from the eye blink-rate information, and rendering one or more of the eye blink-rate information and the mental state information which was inferred. The system 700 can comprise a computer program product embodied in a non-transitory computer readable medium for mental state analysis including code for obtaining video of an individual, code for analyzing the video to determine eye blink-rate information, and code for inferring mental states of the individual based on the eye blink-rate information.
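The division of labor in system 700 might be realized, purely as a sketch, with JSON over HTTP; the requests library and the server URL below are assumptions, since the disclosure requires only the internet or another computer-aided communications medium.

```python
# A sketch of a collection machine sending blink-rate information to an
# analysis server and receiving inferred mental states for rendering.
import requests  # third-party HTTP client (assumed transport)

ANALYSIS_SERVER_URL = "http://analysis.example.com/infer"  # hypothetical

def send_blink_rate_info(blink_rates_bpm, periods_s):
    payload = {"blink_rates_bpm": blink_rates_bpm, "periods_s": periods_s}
    response = requests.post(ANALYSIS_SERVER_URL, json=payload, timeout=10)
    response.raise_for_status()
    return response.json()  # e.g. {"mental_states": [...]} for rendering
```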
Each of the above methods may be executed on one or more processors on one or more computer systems. Embodiments may include various forms of distributed computing, client/server computing, and cloud-based computing. Further, it will be understood that the depicted steps or boxes contained in this disclosure's flow charts are solely illustrative and explanatory. The steps may be modified, omitted, repeated, or re-ordered without departing from the scope of this disclosure. Further, each step may contain one or more sub-steps. While the foregoing drawings and description set forth functional aspects of the disclosed systems, no particular implementation or arrangement of software and/or hardware should be inferred from these descriptions unless explicitly stated or otherwise clear from the context. All such arrangements of software and/or hardware are intended to fall within the scope of this disclosure.
The block diagrams and flowchart illustrations depict methods, apparatus, systems, and computer program products. The elements and combinations of elements in the block diagrams and flow diagrams show functions, steps, or groups of steps of the methods, apparatus, systems, computer program products and/or computer-implemented methods. Any and all such functions—generally referred to herein as a “circuit,” “module,” or “system”—may be implemented by computer program instructions, by special-purpose hardware-based computer systems, by combinations of special purpose hardware and computer instructions, by combinations of general purpose hardware and computer instructions, and so on.
A programmable apparatus which executes any of the above mentioned computer program products or computer-implemented methods may include one or more microprocessors, microcontrollers, embedded microcontrollers, programmable digital signal processors, programmable devices, programmable gate arrays, programmable array logic, memory devices, application specific integrated circuits, or the like. Each may be suitably employed or configured to process computer program instructions, execute computer logic, store computer data, and so on.
It will be understood that a computer may include a computer program product from a computer-readable storage medium and that this medium may be internal or external, removable and replaceable, or fixed. In addition, a computer may include a Basic Input/Output System (BIOS), firmware, an operating system, a database, or the like that may include, interface with, or support the software and hardware described herein.
Embodiments of the present invention are limited neither to conventional computer applications nor to the programmable apparatus that run them. To illustrate: the embodiments of the presently claimed invention could include an optical computer, quantum computer, analog computer, or the like. A computer program may be loaded onto a computer to produce a particular machine that may perform any and all of the depicted functions. This particular machine provides a means for carrying out any and all of the depicted functions.
Any combination of one or more computer readable media may be utilized including but not limited to: a non-transitory computer readable medium for storage; an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor computer readable storage medium or any suitable combination of the foregoing; a portable computer diskette; a hard disk; a random access memory (RAM); a read-only memory (ROM), an erasable programmable read-only memory (EPROM, Flash, MRAM, FeRAM, or phase change memory); an optical fiber; a portable compact disc; an optical storage device; a magnetic storage device; or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
It will be appreciated that computer program instructions may include computer executable code. A variety of languages for expressing computer program instructions may include without limitation C, C++, Java, JavaScript™, ActionScript™, assembly language, Lisp, Perl, Tcl, Python, Ruby, hardware description languages, database programming languages, functional programming languages, imperative programming languages, and so on. In embodiments, computer program instructions may be stored, compiled, or interpreted to run on a computer, a programmable data processing apparatus, a heterogeneous combination of processors or processor architectures, and so on. Without limitation, embodiments of the present invention may take the form of web-based computer software, which includes client/server software, software-as-a-service, peer-to-peer software, or the like.
In embodiments, a computer may enable execution of computer program instructions including multiple programs or threads. The multiple programs or threads may be processed approximately simultaneously to enhance utilization of the processor and to facilitate substantially simultaneous functions. By way of implementation, any and all methods, program codes, program instructions, and the like described herein may be implemented in one or more threads which may in turn spawn other threads, which may themselves have priorities associated with them. In some embodiments, a computer may process these threads based on priority or other order.
Unless explicitly stated or otherwise clear from the context, the verbs “execute” and “process” may be used interchangeably to indicate execute, process, interpret, compile, assemble, link, load, or a combination of the foregoing. Therefore, embodiments that execute or process computer program instructions, computer-executable code, or the like may act upon the instructions or code in any and all of the ways described. Further, the method steps shown are intended to include any suitable method of causing one or more parties or entities to perform the steps. The parties performing a step, or portion of a step, need not be located within a particular geographic location or country boundary. For instance, if an entity located within the United States causes a method step, or portion thereof, to be performed outside of the United States then the method is considered to be performed in the United States by virtue of the causal entity.
While the invention has been disclosed in connection with preferred embodiments shown and described in detail, various modifications and improvements thereon will become apparent to those skilled in the art. Accordingly, the foregoing examples should not limit the spirit and scope of the present invention; rather, the invention should be understood in the broadest sense allowable by law.
This application claims the benefit of U.S. provisional patent applications “Mental State Analysis Using Blink Rate” Ser. No. 61/789,038, filed Mar. 15, 2013, “Mental State Analysis Using Heart Rate Collection Based on Video Imagery” Ser. No. 61/793,761, filed Mar. 15, 2013, “Mental State Data Tagging for Data Collected from Multiple Sources” Ser. No. 61/790,461, filed Mar. 15, 2013, “Mental State Well Being Monitoring” Ser. No. 61/798,731, filed Mar. 15, 2013, “Personal Emotional Profile Generation” Ser. No. 61/844,478, filed Jul. 10, 2013, “Heart Rate Variability Evaluation for Mental State Analysis” Ser. No. 61/916,190, filed Dec. 14, 2013, “Mental State Analysis Using an Application Programming Interface” Ser. No. 61/924,252, filed Jan. 7, 2014, and “Mental State Analysis for Norm Generation” Ser. No. 61/927,481, filed Jan. 15, 2014. This application is also a continuation-in-part of U.S. patent application “Mental State Analysis Using Web Services” Ser. No. 13/153,745, filed Jun. 6, 2011, which claims the benefit of U.S. provisional patent applications “Mental State Analysis Through Web Based Indexing” Ser. No. 61/352,166, filed Jun. 7, 2010, “Measuring Affective Data for Web-Enabled Applications” Ser. No. 61/388,002, filed Sep. 30, 2010, “Sharing Affect Data Across a Social Network” Ser. No. 61/414,451, filed Nov. 17, 2010, “Using Affect Within a Gaming Context” Ser. No. 61/439,913, filed Feb. 6, 2011, “Recommendation and Visualization of Affect Responses to Videos” Ser. No. 61/447,089, filed Feb. 27, 2011, “Video Ranking Based on Affect” Ser. No. 61/447,464, filed Feb. 28, 2011, and “Baseline Face Analysis” Ser. No. 61/467,209, filed Mar. 24, 2011. The foregoing applications are each hereby incorporated by reference in their entirety.