This patent is related to U.S. patent application Ser. No. 12/056,190; U.S. patent application Ser. No. 12/056,211; U.S. patent application Ser. No. 12/056,221; U.S. patent application Ser. No. 12/056,225; U.S. patent application Ser. No. 12/113,863; U.S. patent application Ser. No. 12/113,870; U.S. patent application Ser. No. 12/122,240; U.S. patent application Ser. No. 12/122,253; U.S. patent application Ser. No. 12/122,262; U.S. patent application Ser. No. 12/135,066; U.S. patent application Ser. No. 12/135,074; U.S. patent application Ser. No. 12/182,851; U.S. patent application Ser. No. 12/182,874; U.S. patent application Ser. No. 12/199,557; U.S. patent application Ser. No. 12/199,583; U.S. patent application Ser. No. 12/199,596; U.S. patent application Ser. No. 12/200,813; U.S. patent application Ser. No. 12/234,372; U.S. patent application Ser. No. 12/135,069; U.S. patent application Ser. No. 12/234,388; U.S. patent application Ser. No. 12/544,921; U.S. patent application Ser. No. 12/544,934; U.S. patent application Ser. No. 12/546,586; U.S. patent application Ser. No. 12/544,958; U.S. patent application Ser. No. 12/846,242; U.S. patent application Ser. No. 12/410,380; U.S. patent application Ser. No. 12/410,372; U.S. patent application Ser. No. 12/413,297; U.S. patent application Ser. No. 12/545,455; U.S. patent application Ser. No. 12/608,660; U.S. patent application Ser. No. 12/608,685; U.S. patent application Ser. No. 13/444,149; U.S. patent application Ser. No. 12/608,696; U.S. patent application Ser. No. 12/731,868; U.S. patent application Ser. No. 13/045,457; U.S. patent application Ser. No. 12/778,810; U.S. patent application Ser. No. 12/778,828; U.S. patent application Ser. No. 13/104,821; U.S. patent application Ser. No. 13/104,840; U.S. patent application Ser. No. 12/884,034; U.S. patent application Ser. No. 12/868,531; U.S. patent application Ser. No. 12/913,102; U.S. patent application Ser. No. 12/853,213; and U.S. patent application Ser. No. 
13/105,774.
The present disclosure relates to using neuro-response data to evaluate marketing and entertainment in virtual reality environments.
Conventional systems for evaluating marketing materials typically involve monitoring and surveying individuals exposed to materials such as products, packages, advertisements, and services. Attempts have been made to present marketing materials in their natural environments such as showrooms, store shelves, displays, etc. However, mechanisms for presenting marketing materials in natural environments are limited. In some examples, individuals are asked to respond to surveys quickly after exposure to marketing materials in actual environments, but information collected is typically limited. Furthermore, conventional systems are subject to brain pattern, semantic, syntactic, metaphorical, cultural, and interpretive errors that prevent accurate and repeatable analyses.
Consequently, it is desirable to provide improved methods and apparatus for evaluating marketing materials in natural environments that use neuro-response data such as central nervous system, autonomic nervous system, and effector system measurements along with survey based data.
The disclosure may best be understood by reference to the following description taken in conjunction with the accompanying drawings, which illustrate particular example embodiments.
Reference will now be made in detail to some specific examples of the invention including the best modes contemplated by the inventors for carrying out the invention. Examples of these specific embodiments are illustrated in the accompanying drawings. While the invention is described in conjunction with these specific embodiments, it will be understood that it is not intended to limit the invention to the described embodiments. On the contrary, it is intended to cover alternatives, modifications, and equivalents as may be included within the spirit and scope of the invention as defined by the appended claims.
For example, the techniques and mechanisms of the present invention will be described in the context of particular types of stimulus materials. However, it should be noted that the techniques and mechanisms of the present invention apply to a variety of different types of stimulus materials including marketing and entertainment materials. It should be noted that various mechanisms and techniques can be applied to any type of stimuli. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. Particular example embodiments of the present invention may be implemented without some or all of these specific details. In other instances, well known process operations have not been described in detail in order not to unnecessarily obscure the present invention.
Various techniques and mechanisms of the present invention will sometimes be described in singular form for clarity. However, it should be noted that some embodiments include multiple iterations of a technique or multiple instantiations of a mechanism unless noted otherwise. For example, a system uses a processor in a variety of contexts. However, it will be appreciated that a system can use multiple processors while remaining within the scope of the present invention unless otherwise noted. Furthermore, the techniques and mechanisms of the present invention will sometimes describe a connection between two entities. It should be noted that a connection between two entities does not necessarily mean a direct, unimpeded connection, as a variety of other entities may reside between the two entities. For example, a processor may be connected to memory, but it will be appreciated that a variety of bridges and controllers may reside between the processor and memory. Consequently, a connection does not necessarily mean a direct, unimpeded connection unless otherwise noted.
Overview
A system presents stimulus materials such as products, product packages, displays, services, offerings, etc., in virtual reality environments such as market aisles, store shelves, showroom floors, etc. Sensory experiences output to the user via the virtual reality environment elicit user interactivity. User activity and responses are used to modify marketing materials and/or virtual reality environments. Neuro-response data including electroencephalography (EEG) data is collected from users in order to evaluate the effectiveness of marketing materials in virtual reality environments. In particular examples, neuro-response data is used to modify marketing materials and virtual reality environments presented to the user.
Marketing materials such as products, product packages, brochures, displays, signs, offerings, and arrangements are typically evaluated by surveying individuals exposed to the marketing materials. Survey responses and focus groups elicit user opinions about the marketing materials. The survey responses and focus groups provide some limited information about the effectiveness of the marketing materials. It is recognized that user responses to marketing materials in a laboratory or evaluation setting can sometimes be different than user responses to the marketing materials in a natural environment, such as a store shelf, a supermarket aisle, a tradeshow floor, building, or a showroom. However, opportunities to evaluate the effectiveness of marketing materials in natural environments are limited.
In some instances, efforts are made to elicit user responses to marketing materials after users visit actual showrooms, stores, or tradeshows. In other instances, model stores, displays, and mock presentations are set up to test the effectiveness of particular packages, displays, presentations, offerings, etc. However, using actual displays or establishing mock presentations is cumbersome and inflexible. It is highly inefficient to test a variety of presentations or make changes to presentations based on user feedback. Furthermore, even when ample user feedback is obtained, user feedback is subject to brain pattern, semantic, syntactic, metaphorical, cultural, and interpretive errors that prevent accurate and repeatable analyses.
Consequently, the techniques of the present invention provide mechanisms for evaluating marketing materials presented in virtual reality environments by using neuro-response data. In some examples, neuro-response data along with other survey and focus group data is used to test marketing presentations and displays in virtual reality environments. Virtual reality environments and virtual reality environment templates can be generated to allow reuse, customization, and integration with generated marketing materials.
Neuro-response data is analyzed to determine the effectiveness of marketing materials presented in various virtual reality environments to particular individuals. Individuals are provided with mechanisms to interact with the virtual reality environment and marketing materials are manipulated in the framework of the virtual reality environment. Sensors, cameras, microphones, motion detectors, gyroscopes, temperature sensors, etc., can all be used to monitor user responses to allow not only manipulation of the virtual reality environment but modification of the marketing materials presented. In particular embodiments, neuro-response data is used to evaluate the effectiveness of marketing materials and make real-time adjustments and modifications to marketing materials or the virtual reality environment presented.
Neuro-response measurements such as central nervous system, autonomic nervous system, and effector measurements can be used to evaluate subjects during stimulus presentation. Some examples of central nervous system measurement mechanisms include Functional Magnetic Resonance Imaging (fMRI), Electroencephalography (EEG), Magnetoencephalography (MEG), and Optical Imaging. Optical imaging can be used to measure the absorption or scattering of light related to concentration of chemicals in the brain or neurons associated with neuronal firing. MEG measures magnetic fields produced by electrical activity in the brain. fMRI measures blood oxygenation in the brain that correlates with increased neural activity. However, current implementations of fMRI have poor temporal resolution of a few seconds. EEG measures electrical activity associated with postsynaptic currents occurring in the millisecond range. Subcranial EEG can measure electrical activity with the most accuracy, as the bone and dermal layers weaken transmission of a wide range of frequencies. Nonetheless, surface EEG provides a wealth of electrophysiological information if analyzed properly. Even portable EEG with dry electrodes provides a large amount of neuro-response information.
Autonomic nervous system measurement mechanisms include Electrocardiograms (EKG), pupillary dilation, etc. Effector measurement mechanisms include Electrooculography (EOG), eye tracking, facial emotion encoding, reaction time, etc.
Multiple modes and manifestations of precognitive neural signatures are blended with cognitive neural signatures and post cognitive neurophysiological manifestations to more accurately perform neuro-response analysis. In some examples, autonomic nervous system measures are themselves used to validate central nervous system measures. Effector and behavior responses are blended and combined with other measures. According to various embodiments, central nervous system, autonomic nervous system, and effector system measurements are aggregated into a measurement that allows evaluation of stimulus material effectiveness in particular environments.
In particular embodiments, subjects are exposed to stimulus material and data such as central nervous system, autonomic nervous system, and effector data is collected during exposure. According to various embodiments, data is collected in order to determine a resonance measure that aggregates multiple component measures that assess resonance data. In particular embodiments, specific event related potential (ERP) analyses and/or event related power spectral perturbations (ERPSPs) are evaluated for different regions of the brain both before a subject is exposed to stimulus and each time after the subject is exposed to stimulus.
According to various embodiments, pre-stimulus and post-stimulus differential measurements, as well as target and distracter differential measurements, of ERP time domain components at multiple regions of the brain are determined (DERP). Event related time-frequency analysis of the differential response (DERPSPs) is performed to assess attention, emotion, and memory retention across multiple frequency bands including but not limited to theta, alpha, beta, gamma, and high gamma. In particular embodiments, single trial and/or averaged DERPs and/or DERPSPs can be used to enhance the resonance measure and determine priming levels for various products and services.
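By way of illustration, the pre-stimulus versus post-stimulus differential ERP computation described above may be sketched as follows. The function names, epoch lengths, and sample values are illustrative assumptions, not part of the disclosure:

```python
def average_epochs(epochs):
    """Point-wise mean across a list of equal-length event-locked epochs."""
    n = len(epochs)
    return [sum(samples) / n for samples in zip(*epochs)]

def differential_erp(pre_epochs, post_epochs):
    """DERP sketch: post-exposure ERP average minus pre-exposure baseline average."""
    pre_avg = average_epochs(pre_epochs)
    post_avg = average_epochs(post_epochs)
    return [post - pre for post, pre in zip(post_avg, pre_avg)]

# Two baseline epochs and two post-stimulus epochs, 4 samples each (microvolts).
pre = [[0.0, 1.0, 2.0, 1.0], [0.0, 3.0, 2.0, 1.0]]
post = [[1.0, 4.0, 5.0, 2.0], [1.0, 2.0, 5.0, 2.0]]
derp = differential_erp(pre, post)  # point-wise post/pre differential
```

The same differencing can be applied per brain region and per frequency band to build the DERPSP measures discussed above.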
According to various embodiments, enhanced neuro-response data is generated using a data analyzer that performs both intra-modality measurement enhancements and cross-modality measurement enhancements. According to various embodiments, brain activity is measured not just to determine the regions of activity, but to determine interactions and types of interactions between various regions. The techniques and mechanisms of the present invention recognize that interactions between neural regions support orchestrated and organized behavior. Attention, emotion, memory, and other abilities are not merely based on one part of the brain but instead rely on network interactions between brain regions.
The techniques and mechanisms of the present invention further recognize that different frequency bands used for multi-regional communication can be indicative of the effectiveness of stimuli. In particular embodiments, evaluations are calibrated to each subject and synchronized across subjects. In particular embodiments, templates are created for subjects to create a baseline for measuring pre and post stimulus differentials. According to various embodiments, stimulus generators are intelligent and adaptively modify specific parameters such as exposure length and duration for each subject being analyzed.
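To make the band-based analysis concrete, the frequency bands named above can be represented as labeled ranges. The specific boundary values below are conventional assumptions for illustration only; the disclosure does not fix exact boundaries:

```python
# Illustrative EEG band boundaries in Hz (assumed, not specified by the disclosure).
BANDS = [
    ("theta", 4, 8),
    ("alpha", 8, 13),
    ("beta", 13, 30),
    ("gamma", 30, 80),
    ("high gamma", 80, 150),
]

def band_of(freq_hz):
    """Map a frequency in Hz to its named band, or None if out of range."""
    for name, lo, hi in BANDS:
        if lo <= freq_hz < hi:
            return name
    return None
```

A band-power measure computed per region can then be keyed by these labels when assessing multi-regional communication.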
A variety of modalities can be used including EEG, GSR, EKG, pupillary dilation, EOG, eye tracking, facial emotion encoding, reaction time, etc. Individual modalities such as EEG are enhanced by intelligently recognizing neural region communication pathways. Cross modality analysis is enhanced using a synthesis and analytical blending of central nervous system, autonomic nervous system, and effector signatures. Synthesis and analysis by mechanisms such as time and phase shifting, correlating, and validating intra-modal determinations allow generation of a composite output characterizing the significance of various data responses.
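The time-shifting and correlating used to align intra-modal determinations before blending can be sketched as a lagged correlation search. This is a toy, pure-Python version under assumed names; a production system would use proper cross-correlation over sampled signals:

```python
def best_lag_correlation(a, b, max_lag):
    """Slide signal b against signal a over lags in [-max_lag, max_lag] and
    return the (lag, score) pair with the largest dot-product correlation."""
    best = (0, float("-inf"))
    for lag in range(-max_lag, max_lag + 1):
        score = sum(
            a[i] * b[i - lag]
            for i in range(len(a))
            if 0 <= i - lag < len(b)  # only overlapping samples contribute
        )
        if score > best[1]:
            best = (lag, score)
    return best

# A short pulse in signal a matches template b best at a lag of 2 samples.
lag, score = best_lag_correlation([0, 0, 1, 2, 1, 0], [1, 2, 1], 3)
```

The recovered lag can then be used to time-shift one modality's response relative to another before cross-modality validation.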
According to various embodiments, survey based and actual expressed responses and actions for particular groups of users are integrated with neuro-response data and stored in a stimulus material and virtual reality environment. According to particular embodiments, pre-articulation predictions of expressive response for various stimulus material can be made by analyzing neuro-response data.
The integrated marketing materials and virtual reality environment are provided to a presentation device 121. The presentation device 121 may include screens, headsets, domes, multidimensional displays, speakers, motion simulation devices, movable platforms, smell generators, etc., to provide the subject 123 with a simulated environment. Subject response collection mechanism 131 may include cameras, recorders, motion detectors, etc., that capture subject activity and responses. According to various embodiments, neuro-response data collection mechanisms are also used to capture neuro-response data such as electroencephalography (EEG) data for the subject presented with stimulus materials. In particular embodiments, feedback and modification mechanism 141 uses subject responses to modify marketing materials and/or the virtual reality environment based on subject actions. According to various embodiments, product packages may be manipulated by a subject in the virtual reality environment. In particular embodiments, store displays may be viewed from different angles, products may be opened, etc.
According to various embodiments, neuro-response data including EEG data is used to make real-time modifications to marketing materials and virtual reality environments. In particular embodiments, lack of interest is detected using neuro-response data and different marketing materials are dynamically presented to the user as the user moves along in a grocery aisle.
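A minimal sketch of the real-time modification logic described above, assuming an attention score has already been derived from the EEG data; the function name, threshold, and rotation policy are illustrative assumptions:

```python
def select_material(attention_score, current, alternatives, threshold=0.3):
    """If the EEG-derived attention score falls below an assumed threshold,
    swap in the next alternative marketing material; otherwise keep the
    current presentation unchanged."""
    if attention_score >= threshold or not alternatives:
        return current
    return alternatives[0]

# Low attention triggers a dynamic swap as the user moves along the aisle.
shown = select_material(0.12, "cereal_display_v1", ["cereal_display_v2"])
```

A fuller system would feed this decision back into the virtual reality environment renderer each frame or each aisle segment.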
The stimuli can involve a variety of senses and occur with or without human supervision. Continuous and discrete modes are supported. According to various embodiments, the virtual reality environment presentation device 151 also has protocol generation capability to allow intelligent customization of stimulus and environments provided to multiple subjects in different settings such as laboratory, corporate, and home settings.
According to various embodiments, virtual reality environment presentation device 151 could include devices such as headsets, goggles, projection systems, display devices, speakers, tactile surfaces, etc., for presenting the stimulus in virtual reality environments.
According to various embodiments, the subjects 153 are connected to data collection devices 155. The data collection devices 155 may include a variety of neuro-response measurement mechanisms including neurological and neurophysiological measurement systems such as EEG, EOG, MEG, pupillary dilation, eye tracking, facial emotion encoding, and reaction time devices, etc. According to various embodiments, neuro-response data includes central nervous system, autonomic nervous system, and effector data. In particular embodiments, the data collection devices 155 include EEG 161, EOG 163, and fMRI 165. In some instances, only a single data collection device is used. Data collection may proceed with or without human supervision.
The data collection device 155 collects neuro-response data from multiple sources. This includes a combination of devices such as central nervous system sources (EEG), autonomic nervous system sources (EKG, pupillary dilation), and effector sources (EOG, eye tracking, facial emotion encoding, reaction time). In particular embodiments, data collected is digitally sampled and stored for later analysis. In particular embodiments, the data collected could be analyzed in real-time. According to particular embodiments, the digital sampling rates are adaptively chosen based on the neurophysiological and neurological data being measured.
In one particular embodiment, the system includes EEG 161 measurements made using scalp level electrodes, EOG 163 measurements made using shielded electrodes to track eye data, fMRI 165 measurements performed using a differential measurement system, a facial muscular measurement through shielded electrodes placed at specific locations on the face, and a facial affect graphic and video analyzer adaptively derived for each individual.
In particular embodiments, the data collection devices are clock synchronized with a virtual reality environment presentation device 151. In particular embodiments, the data collection devices 155 also include a condition evaluation subsystem that provides auto triggers, alerts and status monitoring and visualization components that continuously monitor the status of the subject, data being collected, and the data collection instruments. The condition evaluation subsystem may also present visual alerts and automatically trigger remedial actions. According to various embodiments, the data collection devices include mechanisms for not only monitoring subject neuro-response to stimulus materials, but also include mechanisms for identifying and monitoring the stimulus materials. For example, data collection devices 155 may be synchronized with a set-top box to monitor channel changes. In other examples, data collection devices 155 may be directionally synchronized to monitor when a subject is no longer paying attention to stimulus material. In still other examples, the data collection devices 155 may receive and store stimulus material generally being viewed by the subject, whether the stimulus is a program, a commercial, printed material, or a scene outside a window. The data collected allows analysis of neuro-response information and correlation of the information to actual stimulus material and not mere subject distractions.
According to various embodiments, the virtual reality stimulus presentation system also includes a data cleanser device 171. In particular embodiments, the data cleanser device 171 filters the collected data to remove noise, artifacts, and other irrelevant data using fixed and adaptive filtering, weighted averaging, advanced component extraction (like PCA, ICA), vector and component separation methods, etc. This device cleanses the data by removing both exogenous noise (where the source is outside the physiology of the subject, e.g. a phone ringing while a subject is viewing a video) and endogenous artifacts (where the source could be neurophysiological, e.g. muscle movements, eye blinks, etc.).
The artifact removal subsystem includes mechanisms to selectively isolate and review the response data and identify epochs with time domain and/or frequency domain attributes that correspond to artifacts such as line frequency, eye blinks, and muscle movements. The artifact removal subsystem then cleanses the artifacts by either omitting these epochs, or by replacing these epoch data with an estimate based on the other clean data (for example, an EEG nearest neighbor weighted averaging approach).
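The epoch-level cleansing strategy described above (omit artifact epochs, or replace them with an estimate from neighboring clean data) can be sketched as follows. The amplitude threshold and the neighbor-averaging rule are illustrative assumptions standing in for the system's adaptive detectors:

```python
def is_artifact(epoch, threshold):
    """Flag an epoch whose amplitude exceeds an assumed artifact threshold
    (a stand-in for eye blink / muscle movement / line frequency detection)."""
    return any(abs(s) > threshold for s in epoch)

def clean_epochs(epochs, threshold=100.0):
    """Replace artifact epochs with the mean of the nearest clean neighbors
    on either side; omit them entirely when no clean neighbor exists."""
    flags = [is_artifact(e, threshold) for e in epochs]
    cleaned = []
    for i, (epoch, bad) in enumerate(zip(epochs, flags)):
        if not bad:
            cleaned.append(epoch)
            continue
        left = next((epochs[j] for j in range(i - 1, -1, -1) if not flags[j]), None)
        right = next((epochs[j] for j in range(i + 1, len(epochs)) if not flags[j]), None)
        neighbors = [n for n in (left, right) if n is not None]
        if neighbors:
            # Point-wise average of the nearest clean neighbor epochs.
            cleaned.append([sum(v) / len(neighbors) for v in zip(*neighbors)])
        # else: no clean neighbor on either side, so the epoch is omitted.
    return cleaned

# The middle epoch contains a 500 uV excursion and is replaced by its neighbors.
result = clean_epochs([[1, 2], [500, 2], [3, 4]])
```

This corresponds to the "nearest neighbor weighted averaging" replacement option; the omission branch corresponds to dropping the epoch outright.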
According to various embodiments, the data cleanser device 171 is implemented using hardware, firmware, and/or software. It should be noted that although a data cleanser device 171 is shown located after a data collection device 155, the data cleanser device 171 like other components may have a location and functionality that varies based on system implementation. For example, some systems may not use any automated data cleanser device whatsoever while in other systems, data cleanser devices may be integrated into individual data collection devices.
In particular embodiments, a survey and interview system collects and integrates user survey and interview responses to combine with neuro-response data to more effectively perform virtual reality stimulus presentation. According to various embodiments, the survey and interview system obtains information about user characteristics such as age, gender, income level, location, interests, buying preferences, hobbies, etc.
According to various embodiments, the virtual reality stimulus presentation system includes a data analyzer 173 associated with the data cleanser 171. The data analyzer 173 uses a variety of mechanisms to analyze underlying data in the system to determine resonance. According to various embodiments, the data analyzer 173 customizes and extracts the independent neurological and neuro-physiological parameters for each individual in each modality, and blends the estimates within a modality as well as across modalities to elicit an enhanced response to the presented stimulus material. In particular embodiments, the data analyzer 173 aggregates the response measures across subjects in a dataset.
According to various embodiments, neurological and neuro-physiological signatures are measured using time domain analyses and frequency domain analyses. Such analyses use parameters that are common across individuals as well as parameters that are unique to each individual. The analyses could also include statistical parameter extraction and fuzzy logic based attribute estimation from both the time and frequency components of the synthesized response.
In some examples, statistical parameters used in a blended effectiveness estimate include evaluations of skew, peaks, first and second moments, distribution, as well as fuzzy estimates of attention, emotional engagement and memory retention responses.
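Two of the statistical parameters mentioned above, skew and peak counts, can be computed directly from a response signal. This is a minimal sketch using standard definitions; the fuzzy attribute estimation is not shown:

```python
from statistics import mean, pstdev

def skewness(samples):
    """Third standardized moment of a response signal (population form)."""
    m, sd = mean(samples), pstdev(samples)
    return sum(((x - m) / sd) ** 3 for x in samples) / len(samples)

def count_peaks(samples):
    """Count local maxima strictly above both neighboring samples."""
    return sum(
        1 for i in range(1, len(samples) - 1)
        if samples[i] > samples[i - 1] and samples[i] > samples[i + 1]
    )
```

First and second moments correspond to `mean` and `pstdev` squared; peak counts and skew then feed the blended effectiveness estimate.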
According to various embodiments, the data analyzer 173 may include an intra-modality response synthesizer and a cross-modality response synthesizer. In particular embodiments, the intra-modality response synthesizer is configured to customize and extract the independent neurological and neurophysiological parameters for each individual in each modality and blend the estimates within a modality analytically to elicit an enhanced response to the presented stimuli. In particular embodiments, the intra-modality response synthesizer also aggregates data from different subjects in a dataset.
According to various embodiments, the cross-modality response synthesizer or fusion device blends different intra-modality responses, including raw signals and processed signal outputs. The combination of signals enhances the measures of effectiveness within a modality. The cross-modality response fusion device can also aggregate data from different subjects in a dataset.
According to various embodiments, the data analyzer 173 also includes a composite enhanced effectiveness estimator (CEEE) that combines the enhanced responses and estimates from each modality to provide a blended estimate of the effectiveness. In particular embodiments, blended estimates are provided for each exposure of a subject to stimulus materials. The blended estimates are evaluated over time to assess resonance characteristics. According to various embodiments, numerical values are assigned to each blended estimate. The numerical values may correspond to the intensity of neuro-response measurements, the significance of peaks, the change between peaks, etc. Higher numerical values may correspond to higher significance in neuro-response intensity. Lower numerical values may correspond to lower significance or even insignificant neuro-response activity. In other examples, multiple values are assigned to each blended estimate. In still other examples, blended estimates of neuro-response significance are graphically represented to show changes after repeated exposure.
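One simple way to realize the composite enhanced effectiveness estimator is a normalized weighted combination of per-modality estimates. The modality names, weights, and scores below are illustrative assumptions, not values from the disclosure:

```python
def blended_effectiveness(modality_scores, weights):
    """Combine per-modality effectiveness estimates into one blended value,
    normalizing so the weights sum to one."""
    total = sum(weights.values())
    return sum(score * weights[m] / total for m, score in modality_scores.items())

# Hypothetical per-modality estimates on a 0..1 scale, with EEG weighted double.
scores = {"EEG": 0.8, "EKG": 0.6, "eye_tracking": 0.7}
weights = {"EEG": 2.0, "EKG": 1.0, "eye_tracking": 1.0}
estimate = blended_effectiveness(scores, weights)
```

Repeating this per exposure yields the sequence of blended estimates that is tracked over time to assess resonance.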
According to various embodiments, a data analyzer 173 passes data to a resonance estimator that assesses and extracts resonance patterns. In particular embodiments, the resonance estimator determines entity positions in various stimulus segments and matches position information with eye tracking paths while correlating saccades with neural assessments of attention, memory retention, and emotional engagement. In particular embodiments, the resonance estimator stores data in the priming repository system. As with a variety of the components in the system, various repositories can be co-located with the rest of the system and the user, or could be implemented in remote locations.
Data from various repositories is blended and passed to a virtual reality stimulus presentation engine to generate patterns, responses, and predictions 175. In some embodiments, the virtual reality stimulus presentation engine compares patterns and expressions associated with prior users to predict expressions of current users. According to various embodiments, patterns and expressions are combined with orthogonal survey, demographic, and preference data. In particular embodiments, linguistic, perceptual, and/or motor responses are elicited and predicted. Response expression selection and pre-articulation prediction of expressive responses are also evaluated.
According to various embodiments, forces applied by electrodes 221 and 223 counterbalance forces applied by electrodes 261 and 263. In particular embodiments, forces applied by electrodes 231 and 233 counterbalance forces applied by electrode 251. In particular embodiments, the EEG dry electrodes operate to detect neurological activity with minimal interference from hair and without use of any electrically conductive gels. According to various embodiments, the neuro-response data collection mechanism also includes EOG sensors such as sensors used to detect eye movements.
According to various embodiments, data acquisition using electrodes 221, 223, 231, 233, 251, 261, and 263 is synchronized with stimulus material presented to a user. Data acquisition can be synchronized with stimulus material presented by using a shared clock signal. The shared clock signal may originate from the stimulus material presentation mechanism, a headset, a cell tower, a satellite, etc. The data collection mechanism 201 also includes a transmitter and/or receiver to send collected neuro-response data to a data analysis system and to receive clock signals as needed. In some examples, a transceiver transmits all collected media such as video and/or audio, neuro-response, and sensor data to a data analyzer. In other examples, a transceiver transmits only interesting data provided by a filter. According to various embodiments, neuro-response data is correlated with timing information for stimulus material presented to a user.
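The correlation of neuro-response data with stimulus timing under a shared clock can be sketched as tagging each timestamped sample with the stimulus item on screen at that moment. The event names and timestamps are illustrative assumptions:

```python
def correlate_with_stimulus(samples, stimulus_events):
    """Tag each (timestamp, value) neuro-response sample with the most recent
    stimulus item whose onset precedes it. stimulus_events is assumed to be a
    list of (onset_timestamp, item) pairs sorted by onset, both expressed in
    the shared clock's time base."""
    tagged = []
    for ts, value in samples:
        current = None
        for event_ts, item in stimulus_events:
            if event_ts <= ts:
                current = item  # latest onset not after the sample
        tagged.append((ts, value, current))
    return tagged

events = [(0.0, "store shelf"), (2.0, "product package")]
samples = [(1.0, 5.2), (2.5, 7.1)]
tagged = correlate_with_stimulus(samples, events)
```

Because both streams share one clock signal, the tagging is exact up to clock resolution, regardless of which device (headset, cell tower, satellite) originated the clock.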
In some examples, the transceiver can be connected to a computer system that then transmits data over a wide area network to a data analyzer. In other examples, the transceiver sends data over a wide area network to a data analyzer. Other components such as fMRI and MEG that are not yet portable but may become portable at some point may also be integrated into a headset.
It should be noted that some components of a neuro-response data collection mechanism have not been shown for clarity. For example, a battery may be required to power components such as amplifiers and transceivers. Similarly, a transceiver may include an antenna that is similarly not shown for clarity purposes. It should also be noted that some components are also optional. For example, filters or storage may not be required.
In particular embodiments, a subject attribute data model 315 includes a subject name 317 and/or identifier, contact information 321, and demographic attributes 319 that may be useful for review of neurological and neuro-physiological data. Some examples of pertinent demographic attributes include marriage status, employment status, occupation, household income, household size and composition, ethnicity, geographic location, sex, and race. Other fields that may be included in data model 315 include shopping preferences, entertainment preferences, and financial preferences. Shopping preferences include favorite stores, shopping frequency, categories shopped, and favorite brands. Entertainment preferences include network/cable/satellite access capabilities, favorite shows, favorite genres, and favorite actors. Financial preferences include favorite insurance companies, preferred investment practices, banking preferences, and favorite online financial instruments. A variety of subject attributes may be included in a subject attributes data model 315 and data models may be preset or custom generated to suit particular purposes.
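The subject attribute data model 315 can be sketched as a simple record type. The field names and example values below are illustrative assumptions mirroring the attributes listed above:

```python
from dataclasses import dataclass, field

@dataclass
class SubjectAttributes:
    """Sketch of subject attribute data model 315; field names are illustrative."""
    name: str                       # subject name 317 and/or identifier
    contact: str                    # contact information 321
    demographics: dict = field(default_factory=dict)        # attributes 319
    shopping_preferences: dict = field(default_factory=dict)
    entertainment_preferences: dict = field(default_factory=dict)
    financial_preferences: dict = field(default_factory=dict)

subject = SubjectAttributes(
    name="S-042",
    contact="s042@example.com",
    demographics={"household_size": 3, "geographic_location": "US-CA"},
    shopping_preferences={"favorite_stores": ["hypothetical-mart"]},
)
```

Preset or custom-generated variants of the model amount to choosing which of these fields are populated for a given study.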
Other data models may include a data collection data model 337. According to various embodiments, the data collection data model 337 includes recording attributes 339, equipment identifiers 341, modalities recorded 343, and data storage attributes 345. In particular embodiments, equipment identifiers 341 include an amplifier identifier and a sensor identifier.
Modalities recorded 343 may include modality specific attributes like EEG cap layout, active channels, sampling frequency, and filters used. EOG specific attributes include the number and type of sensors used, location of sensors applied, etc. Eye tracking specific attributes include the type of tracker used, data recording frequency, data being recorded, recording format, etc. According to various embodiments, data storage attributes 345 include file storage conventions (format, naming convention, dating convention), storage location, archival attributes, expiry attributes, etc.
A preset query data model 349 includes a query name 351 and/or identifier, an accessed data collection 353 such as the data segments involved (models, databases/cubes, tables, etc.), access security attributes 355 including who has what type of access, and refresh attributes 357 such as the expiry of the query, refresh frequency, etc. Other fields such as push-pull preferences can also be included to identify an auto push reporting driver or a user driven report retrieval system.
Other queries may retrieve stimulus material recorded based on shopping preferences of subject participants, countenance, physiological assessment, and completion status. For example, a user may query for data associated with product categories, products shopped, shops frequented, subject eye correction status, color blindness, subject state, signal strength of measured responses, alpha frequency band ringers, muscle movement assessments, segments completed, etc.
Response assessment based queries 437 may include attention scores 439, emotion scores 441, retention scores 443, and effectiveness scores 445. Such queries may obtain materials that elicited particular scores. Response measure profile based queries may use mean measure thresholds, variance measures, number of peaks detected, etc. Group response queries may include group statistics like mean, variance, kurtosis, p-value, etc., group size, and outlier assessment measures. Still other queries may involve testing attributes like test location, time period, test repetition count, test station, and test operator fields. A variety of types and combinations of types of queries can be used to efficiently extract data.
According to various embodiments, client cumulative reports 511 include media grouped reporting 513 of all stimulus assessed, campaign grouped reporting 515 of stimulus assessed, and time/location grouped reporting 517 of stimulus assessed. According to various embodiments, industry cumulative and syndicated reports 521 include aggregate assessment response measures 523, top performer lists 525, bottom performer lists 527, outliers 529, and trend reporting 531. In particular embodiments, tracking and reporting includes specific products, categories, companies, and brands.
At 607, stimulus material is integrated into a virtual reality environment and presented to a user. At 609, interaction data is received from users exposed to the stimulus material. Interaction data may be received from haptic gloves, platforms, sensors, cameras, microphones, magnetic fields, controllers, etc.
At 611, neuro-response data is received from the subject neuro-response data collection mechanism. In some particular embodiments, EEG, EOG, pupillary dilation, facial emotion encoding data, video, images, audio, GPS data, etc., can all be transmitted from the subject to a neuro-response data analyzer. In particular embodiments, only EEG data is transmitted. At 613, the stimulus material and the virtual reality environment are modified based on user interaction. In particular embodiments, products may be manipulated by the user in the virtual reality environment. According to various embodiments, stimulus material and/or the virtual reality environment can also be modified based on neuro-response data at 615. In particular embodiments, if a user is determined to be losing interest in a product, a different product may be presented. Alternatively, a different environment displaying the product may be presented after a transition from one store to another. According to various embodiments, neuro-response and associated data is transmitted directly from an EEG cap wide area network interface to a data analyzer. In particular embodiments, neuro-response and associated data is transmitted to a computer system that then performs compression and filtering of the data before transmitting the data to a data analyzer over a network.
According to various embodiments, data is also passed through a data cleanser to remove noise and artifacts that may make data more difficult to interpret. According to various embodiments, the data cleanser removes EEG electrical activity associated with blinking and other endogenous/exogenous artifacts. Data cleansing may be performed before or after data transmission to a data analyzer.
At 617, neuro-response data is synchronized with timing, environment, and other stimulus material data. In particular embodiments, neuro-response data is synchronized with a shared clock source. According to various embodiments, neuro-response data such as EEG and EOG data is tagged to indicate what the subject is viewing or listening to at a particular time.
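Tagging against a shared clock, as described above, can be sketched as follows; the function name, sampling times, and stimulus labels are hypothetical examples, not part of the specification:

```python
import bisect

def tag_samples(sample_times, stimulus_events):
    """Tag each neuro-response sample with the stimulus active at that time.

    sample_times: sorted timestamps in seconds on the shared clock.
    stimulus_events: sorted (onset_time, label) pairs on the same clock.
    """
    onsets = [t for t, _ in stimulus_events]
    tags = []
    for t in sample_times:
        # Find the most recent stimulus onset at or before time t.
        i = bisect.bisect_right(onsets, t) - 1
        tags.append(stimulus_events[i][1] if i >= 0 else None)
    return tags

# EEG sampled at 4 Hz against two stimulus segments on the shared clock.
tags = tag_samples([0.0, 0.25, 0.5, 0.75, 1.0],
                   [(0.0, "product_A"), (0.6, "product_B")])
# tags == ["product_A", "product_A", "product_A", "product_B", "product_B"]
```

In practice the same tagging would be applied per modality (EEG, EOG, etc.) after any per-modality clock offset has been corrected.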
At 619, data analysis is performed. Data analysis may include intra-modality response synthesis and cross-modality response synthesis to enhance effectiveness measures. It should be noted that in some particular instances, one type of synthesis may be performed without performing other types of synthesis. For example, cross-modality response synthesis may be performed with or without intra-modality synthesis.
A variety of mechanisms can be used to perform data analysis at 619. In particular embodiments, a stimulus attributes repository is accessed to obtain attributes and characteristics of the stimulus materials, along with purposes, intents, objectives, etc. In particular embodiments, EEG response data is synthesized to provide an enhanced assessment of effectiveness. According to various embodiments, EEG measures electrical activity resulting from thousands of simultaneous neural processes associated with different portions of the brain. EEG data can be classified in various bands. According to various embodiments, brainwave frequencies include delta, theta, alpha, beta, and gamma frequency ranges. Delta waves are classified as those less than 4 Hz and are prominent during deep sleep. Theta waves have frequencies between 3.5 and 7.5 Hz and are associated with memories, attention, emotions, and sensations. Theta waves are typically prominent during states of internal focus.
Alpha frequencies reside between 7.5 and 13 Hz and typically peak around 10 Hz. Alpha waves are prominent during states of relaxation. Beta waves have a frequency range between 14 and 30 Hz. Beta waves are prominent during states of motor control, long range synchronization between brain areas, analytical problem solving, judgment, and decision making. Gamma waves occur between 30 and 60 Hz and are involved in binding of different populations of neurons together into a network for the purpose of carrying out a certain cognitive or motor function, as well as in attention and memory. Because the skull and dermal layers attenuate waves in this frequency range, brain waves above 75-80 Hz are difficult to detect and are often not used for stimuli response assessment.
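The band edges given in the two preceding paragraphs can be tabulated directly; the code below is a minimal sketch using those nominal boundaries, which overlap slightly as stated in the text:

```python
# Nominal EEG band edges in Hz, as given in the specification.
# The delta/theta boundary overlaps (delta < 4 Hz, theta 3.5-7.5 Hz).
BANDS = [
    ("delta", 0.0, 4.0),
    ("theta", 3.5, 7.5),
    ("alpha", 7.5, 13.0),
    ("beta", 14.0, 30.0),
    ("gamma", 30.0, 60.0),
]

def classify_frequency(freq_hz):
    """Return the names of all bands containing freq_hz (ranges can overlap)."""
    return [name for name, lo, hi in BANDS if lo <= freq_hz < hi]
```

For example, `classify_frequency(10.0)` returns `["alpha"]` (the typical alpha peak), while `classify_frequency(3.8)` returns both `"delta"` and `"theta"` because of the overlapping boundary.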
However, the techniques and mechanisms of the present invention recognize that analyzing high gamma band (kappa-band: above 60 Hz) measurements, in addition to theta, alpha, beta, and low gamma band measurements, enhances neurological attention, emotional engagement and retention component estimates. In particular embodiments, EEG measurements including difficult to detect high gamma or kappa band measurements are obtained, enhanced, and evaluated. Subject and task specific signature sub-bands in the theta, alpha, beta, gamma and kappa bands are identified to provide enhanced response estimates. According to various embodiments, high gamma waves (kappa-band) above 80 Hz (typically detectable with sub-cranial EEG and/or magnetoencephalography) can be used in inverse model-based enhancement of the frequency responses to the stimuli.
Various embodiments of the present invention recognize that particular sub-bands within each frequency range have particular prominence during certain activities. A subset of the frequencies in a particular band is referred to herein as a sub-band. For example, a sub-band may include the 40-45 Hz range within the gamma band. In particular embodiments, multiple sub-bands within the different bands are selected while remaining frequencies are band-pass filtered. In particular embodiments, multiple sub-band responses may be enhanced, while the remaining frequency responses may be attenuated.
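One way to sketch sub-band selection with attenuation of the remaining frequencies is a simple FFT-domain mask; this is an illustrative approach under assumed parameters (the 40-45 Hz gamma sub-band from the example above), not the specific filtering method of the invention:

```python
import numpy as np

def select_subbands(signal, fs, subbands, attenuation=0.0):
    """Keep the listed frequency sub-bands and attenuate everything else.

    signal: 1-D array of samples; fs: sampling rate in Hz;
    subbands: list of (low_hz, high_hz) ranges to retain, e.g. [(40, 45)].
    """
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = np.zeros_like(freqs, dtype=bool)
    for lo, hi in subbands:
        mask |= (freqs >= lo) & (freqs <= hi)
    spectrum[~mask] *= attenuation      # attenuate out-of-band responses
    return np.fft.irfft(spectrum, n=len(signal))

# A 42 Hz tone plus a 10 Hz tone; keep only the 40-45 Hz gamma sub-band.
fs = 256
t = np.arange(fs) / fs
x = np.sin(2 * np.pi * 42 * t) + np.sin(2 * np.pi * 10 * t)
y = select_subbands(x, fs, [(40, 45)])
```

Setting `attenuation` to a small nonzero value enhances the selected sub-bands relative to the rest rather than removing the rest outright.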
An information theory based band-weighting model is used for adaptive extraction of selective dataset specific, subject specific, and task specific bands to enhance the effectiveness measure. Adaptive extraction may be performed using fuzzy scaling. Stimuli can be presented and enhanced measurements determined multiple times to determine the variation profiles across multiple presentations. Determining variation profiles provides an enhanced assessment of the primary responses as well as the longevity (wear-out) of the marketing and entertainment stimuli. The synchronous response of multiple individuals to stimuli presented in concert is measured to determine an enhanced across subject synchrony measure of effectiveness. According to various embodiments, the synchronous response may be determined for multiple subjects residing in separate locations or for multiple subjects residing in the same location.
Although a variety of synthesis mechanisms are described, it should be recognized that any number of mechanisms can be applied—in sequence or in parallel with or without interaction between the mechanisms.
Although intra-modality synthesis mechanisms provide enhanced significance data, additional cross-modality synthesis mechanisms can also be applied. A variety of mechanisms such as EEG, Eye Tracking, GSR, EOG, and facial emotion encoding are connected to a cross-modality synthesis mechanism. Other mechanisms as well as variations and enhancements on existing mechanisms may also be included. According to various embodiments, data from a specific modality can be enhanced using data from one or more other modalities. In particular embodiments, EEG typically makes frequency measurements in different bands like alpha, beta and gamma to provide estimates of significance. However, the techniques of the present invention recognize that significance measures can be enhanced further using information from other modalities.
For example, facial emotion encoding measures can be used to enhance the valence of the EEG emotional engagement measure. EOG and eye tracking saccadic measures of object entities can be used to enhance the EEG estimates of significance including but not limited to attention, emotional engagement, and memory retention. According to various embodiments, a cross-modality synthesis mechanism performs time and phase shifting of data to allow data from different modalities to align. In some examples, it is recognized that an EEG response will often occur hundreds of milliseconds before a facial emotion measurement changes. Correlations can be drawn and time and phase shifts made on an individual as well as a group basis. In other examples, saccadic eye movements may be determined as occurring before and after particular EEG responses. According to various embodiments, time corrected GSR measures are used to scale and enhance the EEG estimates of significance including attention, emotional engagement and memory retention measures.
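The time-shift alignment described above (an EEG response preceding a facial emotion measurement by hundreds of milliseconds) can be sketched with a cross-correlation lag estimate; the signals and lag below are synthetic examples, not measured data:

```python
import numpy as np

def estimate_lag(reference, other, fs):
    """Estimate the time shift (seconds) of `other` relative to `reference`
    via cross-correlation; positive means `other` lags the reference."""
    ref = reference - np.mean(reference)
    oth = other - np.mean(other)
    corr = np.correlate(oth, ref, mode="full")
    lag_samples = np.argmax(corr) - (len(ref) - 1)
    return lag_samples / fs

fs = 100
t = np.arange(300) / fs
eeg = np.sin(2 * np.pi * 2 * t)        # stand-in for an EEG response measure
facial = np.roll(eeg, 20)              # facial measure lagging by 200 ms
lag = estimate_lag(eeg, facial, fs)    # approximately 0.2 seconds
```

Once the lag is estimated, the lagging modality can be shifted back by that amount before correlations are drawn, on an individual or group basis.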
Evidence of the occurrence or non-occurrence of specific time domain difference event-related potential components (like the DERP) in specific regions correlates with subject responsiveness to specific stimulus. According to various embodiments, ERP measures are enhanced using EEG time-frequency measures (ERPSP) in response to the presentation of the marketing and entertainment stimuli. Specific portions of the response data are extracted and isolated to determine which ERP, DERP and ERPSP analyses to perform. In particular embodiments, an EEG frequency estimation of attention, emotion and memory retention (ERPSP) is used as a co-factor in enhancing the ERP, DERP and time-domain response analysis.
EOG measures saccades to determine the presence of attention to specific objects of stimulus. Eye tracking measures the subject's gaze path, location and dwell on specific objects of stimulus. According to various embodiments, EOG and eye tracking is enhanced by measuring the presence of lambda waves (a neurophysiological index of saccade effectiveness) in the ongoing EEG in the occipital and extrastriate regions, triggered by the slope of saccade-onset to estimate the significance of the EOG and eye tracking measures. In particular embodiments, specific EEG signatures of activity such as slow potential shifts and measures of coherence in time-frequency responses at the Frontal Eye Field (FEF) regions that precede saccade-onset are measured to enhance the effectiveness of the saccadic activity data.
According to various embodiments, facial emotion encoding uses templates generated by measuring facial muscle positions and movements of individuals expressing various emotions prior to the testing session. These individual specific facial emotion encoding templates are matched with the individual responses to identify subject emotional response. In particular embodiments, these facial emotion encoding measurements are enhanced by evaluating inter-hemispherical asymmetries in EEG responses in specific frequency bands and measuring frequency band interactions. The techniques of the present invention recognize that not only are particular frequency bands significant in EEG responses, but particular frequency bands used for communication between particular areas of the brain are significant. Consequently, these EEG responses enhance the EMG, graphic and video based facial emotion identification.
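Matching individual responses against pre-session facial emotion templates, as described above, reduces to a nearest-template comparison; the feature vectors and emotion labels below are hypothetical stand-ins for measured facial muscle positions:

```python
import math

def match_emotion(measurement, templates):
    """Match a facial muscle-position feature vector against an individual's
    pre-recorded emotion templates; return the closest emotion label."""
    def dist(a, b):
        # Euclidean distance between two feature vectors.
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(templates, key=lambda emotion: dist(measurement, templates[emotion]))

# Hypothetical templates recorded for one individual before the testing session.
templates = {
    "joy":     [0.9, 0.1, 0.7],
    "neutral": [0.1, 0.1, 0.1],
    "anger":   [0.2, 0.9, 0.3],
}
result = match_emotion([0.8, 0.2, 0.6], templates)   # closest to "joy"
```

In the embodiments described, the matched label would then be cross-checked against EEG inter-hemispherical asymmetry measures rather than used alone.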
Integrated responses are generated at 621. According to various embodiments, the data communication device transmits data to the response integration system using protocols such as the File Transfer Protocol (FTP) and Hypertext Transfer Protocol (HTTP), along with a variety of conventional, bus, wired network, wireless network, satellite, and proprietary communication protocols. The data transmitted can include the data in its entirety, excerpts of data, converted data, and/or elicited response measures. According to various embodiments, data is sent using telecommunications, wireless, Internet, satellite, or any other communication mechanism that is capable of conveying information from multiple subject locations for data integration and analysis. The mechanism may be integrated in a set top box, computer system, receiver, mobile device, etc.
In particular embodiments, the data communication device sends data to the response integration system. According to various embodiments, the response integration system combines analyzed and enhanced responses to the stimulus material while using information about stimulus material attributes. In particular embodiments, the response integration system also collects and integrates user behavioral and survey responses with the analyzed and enhanced response data to more effectively measure and track neuro-responses to stimulus materials. According to various embodiments, the response integration system obtains attributes such as requirements and purposes of the stimulus material presented.
Some of these requirements and purposes may be obtained from a variety of databases. According to various embodiments, the response integration system also includes mechanisms for the collection and storage of demographic, statistical and/or survey based responses to different entertainment, marketing, advertising and other audio/visual/tactile/olfactory material. If this information is stored externally, the response integration system can include a mechanism for the push and/or pull integration of the data, such as querying, extraction, recording, modification, and/or updating.
The response integration system can further include an adaptive learning component that refines user or group profiles and tracks variations in the neuro-response data collection system to particular stimuli or series of stimuli over time. This information can be made available for other purposes, such as use of the information for presentation attribute decision making. According to various embodiments, the response integration system builds and uses responses of users having similar profiles and demographics to provide integrated responses at 621. In particular embodiments, stimulus and response data is stored in a repository at 623 for later retrieval and analysis.
According to various embodiments, various mechanisms such as the data collection mechanisms, the intra-modality synthesis mechanisms, cross-modality synthesis mechanisms, etc. are implemented on multiple devices. However, it is also possible that the various mechanisms be implemented in hardware, firmware, and/or software in a single system.
According to particular example embodiments, a system 700 suitable for implementing particular embodiments of the present invention includes a processor 701, a memory 703, an interface 711, and a bus 715 (e.g., a PCI bus). When acting under the control of appropriate software or firmware, the processor 701 is responsible for tasks such as pattern generation. Various specially configured devices can also be used in place of a processor 701 or in addition to processor 701. The complete implementation can also be done in custom hardware. The interface 711 is typically configured to send and receive data packets or data segments over a network. Particular examples of interfaces the device supports include host bus adapter (HBA) interfaces, Ethernet interfaces, frame relay interfaces, cable interfaces, DSL interfaces, token ring interfaces, and the like.
In addition, various very high-speed interfaces may be provided such as fast Ethernet interfaces, Gigabit Ethernet interfaces, ATM interfaces, HSSI interfaces, POS interfaces, FDDI interfaces and the like. Generally, these interfaces may include ports appropriate for communication with the appropriate media. In some cases, they may also include an independent processor and, in some instances, volatile RAM. The independent processors may control such communications intensive tasks as data synthesis.
According to particular example embodiments, the system 700 uses memory 703 to store data, algorithms and program instructions. The program instructions may control the operation of an operating system and/or one or more applications, for example. The memory or memories may also be configured to store received data and process received data.
Because such information and program instructions may be employed to implement the systems/methods described herein, the present invention relates to tangible, machine readable media that include program instructions, state information, etc. for performing various operations described herein. Examples of machine-readable media include, but are not limited to, magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROM disks and DVDs; magneto-optical media such as optical disks; and hardware devices that are specially configured to store and perform program instructions, such as read-only memory devices (ROM) and random access memory (RAM). Examples of program instructions include both machine code, such as produced by a compiler, and files containing higher level code that may be executed by the computer using an interpreter.
Although the foregoing invention has been described in some detail for purposes of clarity of understanding, it will be apparent that certain changes and modifications may be practiced within the scope of the appended claims. Therefore, the present embodiments are to be considered as illustrative and not restrictive and the invention is not to be limited to the details given herein, but may be modified within the scope and equivalents of the appended claims.
Number | Name | Date | Kind |
---|---|---|---|
2549836 | McIntyre et al. | Apr 1951 | A |
3490439 | Rolston | Jan 1970 | A |
3572322 | Wade | Mar 1971 | A |
3735753 | Pisarski | May 1973 | A |
3880144 | Coursin et al. | Apr 1975 | A |
3901215 | John | Aug 1975 | A |
3998213 | Price | Dec 1976 | A |
4075657 | Weinblatt | Feb 1978 | A |
4149716 | Scudder | Apr 1979 | A |
4201224 | John | May 1980 | A |
4279258 | John | Jul 1981 | A |
4411273 | John | Oct 1983 | A |
4417592 | John | Nov 1983 | A |
4537198 | Corbett | Aug 1985 | A |
4557270 | John | Dec 1985 | A |
4610259 | Cohen et al. | Sep 1986 | A |
4613951 | Chu | Sep 1986 | A |
4632122 | Johansson et al. | Dec 1986 | A |
4683892 | Johansson et al. | Aug 1987 | A |
4695879 | Weinblatt | Sep 1987 | A |
4736751 | Gevins et al. | Apr 1988 | A |
4800888 | Itil et al. | Jan 1989 | A |
4802484 | Friedman et al. | Feb 1989 | A |
4846190 | John | Jul 1989 | A |
4885687 | Carey | Dec 1989 | A |
4894777 | Negishi et al. | Jan 1990 | A |
4913160 | John | Apr 1990 | A |
4955388 | Silberstein | Sep 1990 | A |
4967038 | Gevins et al. | Oct 1990 | A |
4987903 | Keppel et al. | Jan 1991 | A |
5003986 | Finitzo et al. | Apr 1991 | A |
5010891 | Chamoun | Apr 1991 | A |
5038782 | Gevins et al. | Aug 1991 | A |
5052401 | Sherwin | Oct 1991 | A |
5083571 | Prichep | Jan 1992 | A |
RE34015 | Duffy | Aug 1992 | E |
5137027 | Rosenfeld | Aug 1992 | A |
5213338 | Brotz | May 1993 | A |
5226177 | Nickerson | Jul 1993 | A |
5243517 | Schmidt et al. | Sep 1993 | A |
5273037 | Itil et al. | Dec 1993 | A |
5291888 | Tucker | Mar 1994 | A |
5293867 | Oommen | Mar 1994 | A |
5295491 | Gevins | Mar 1994 | A |
5339826 | Schmidt et al. | Aug 1994 | A |
5357957 | Itil et al. | Oct 1994 | A |
5363858 | Farwell | Nov 1994 | A |
5392788 | Hudspeth | Feb 1995 | A |
5406956 | Farwell | Apr 1995 | A |
5447166 | Gevins | Sep 1995 | A |
5474082 | Junker | Dec 1995 | A |
5479934 | Imran | Jan 1996 | A |
5518007 | Becker | May 1996 | A |
5537618 | Boulton et al. | Jul 1996 | A |
5617855 | Waletzky et al. | Apr 1997 | A |
5655534 | Ilmoniemi | Aug 1997 | A |
5676138 | Zawilinski | Oct 1997 | A |
5720619 | Fisslinger | Feb 1998 | A |
5724987 | Gevins et al. | Mar 1998 | A |
5729205 | Kwon | Mar 1998 | A |
5740035 | Cohen et al. | Apr 1998 | A |
5762611 | Lewis et al. | Jun 1998 | A |
5771897 | Zufrin | Jun 1998 | A |
5787187 | Bouchard et al. | Jul 1998 | A |
5800351 | Mann | Sep 1998 | A |
5812642 | Leroy | Sep 1998 | A |
5817029 | Gevins et al. | Oct 1998 | A |
5848399 | Burke | Dec 1998 | A |
5945863 | Coy | Aug 1999 | A |
5961332 | Joao | Oct 1999 | A |
5983129 | Cowan et al. | Nov 1999 | A |
6001065 | DeVito | Dec 1999 | A |
6021346 | Ryu et al. | Feb 2000 | A |
6052619 | John | Apr 2000 | A |
6099319 | Zaltman | Aug 2000 | A |
6120440 | Goknar | Sep 2000 | A |
6128521 | Marro et al. | Oct 2000 | A |
6154669 | Hunter et al. | Nov 2000 | A |
6161030 | Levendowski et al. | Dec 2000 | A |
6173260 | Slaney | Jan 2001 | B1 |
6175753 | Menkes et al. | Jan 2001 | B1 |
6228038 | Claessens | May 2001 | B1 |
6236885 | Hunter et al. | May 2001 | B1 |
6254536 | DeVito | Jul 2001 | B1 |
6280198 | Calhoun et al. | Aug 2001 | B1 |
6286005 | Cannon | Sep 2001 | B1 |
6289234 | Mueller | Sep 2001 | B1 |
6292688 | Patton | Sep 2001 | B1 |
6301493 | Marro et al. | Oct 2001 | B1 |
6315569 | Zaltman | Nov 2001 | B1 |
6330470 | Tucker et al. | Dec 2001 | B1 |
6334778 | Brown | Jan 2002 | B1 |
6374143 | Berrang et al. | Apr 2002 | B1 |
6381481 | Levendowski et al. | Apr 2002 | B1 |
6398643 | Knowles et al. | Jun 2002 | B1 |
6422999 | Hill | Jul 2002 | B1 |
6434419 | Gevins et al. | Aug 2002 | B1 |
6453194 | Hill | Sep 2002 | B1 |
6487444 | Mimura | Nov 2002 | B2 |
6488617 | Katz | Dec 2002 | B1 |
6510340 | Jordan | Jan 2003 | B1 |
6520905 | Surve et al. | Feb 2003 | B1 |
6545685 | Dorbie | Apr 2003 | B1 |
6575902 | Burton | Jun 2003 | B1 |
6577329 | Flickner et al. | Jun 2003 | B1 |
6585521 | Obrador | Jul 2003 | B1 |
6594521 | Tucker | Jul 2003 | B2 |
6598006 | Honda et al. | Jul 2003 | B1 |
6654626 | Devlin et al. | Nov 2003 | B2 |
6662052 | Sarwal et al. | Dec 2003 | B1 |
6665560 | Becker et al. | Dec 2003 | B2 |
6688890 | von Buegner | Feb 2004 | B2 |
6708051 | Durousseau | Mar 2004 | B1 |
6712468 | Edwards | Mar 2004 | B1 |
6754524 | Johnson, Jr. | Jun 2004 | B2 |
6757556 | Gopenathan et al. | Jun 2004 | B2 |
6788882 | Geer et al. | Sep 2004 | B1 |
6792304 | Silberstein | Sep 2004 | B1 |
6842877 | Robarts et al. | Jan 2005 | B2 |
6904408 | McCarthy et al. | Jun 2005 | B1 |
6950698 | Sarkela et al. | Sep 2005 | B2 |
6958710 | Zhang et al. | Oct 2005 | B2 |
6973342 | Swanson | Dec 2005 | B1 |
6993380 | Modarres | Jan 2006 | B1 |
7120880 | Dryer et al. | Oct 2006 | B1 |
7130673 | Tolvanen-Laakso et al. | Oct 2006 | B2 |
7150715 | Collura et al. | Dec 2006 | B2 |
7164967 | Etienne-Cummings et al. | Jan 2007 | B2 |
7177675 | Suffin et al. | Feb 2007 | B2 |
7222071 | Neuhauser et al. | May 2007 | B2 |
7272982 | Neuhauser et al. | Sep 2007 | B2 |
7286871 | Cohen | Oct 2007 | B2 |
7340060 | Tomkins et al. | Mar 2008 | B2 |
7391835 | Gross et al. | Jun 2008 | B1 |
7408460 | Crystal et al. | Aug 2008 | B2 |
7420464 | Fitzgerald et al. | Sep 2008 | B2 |
7443292 | Jensen et al. | Oct 2008 | B2 |
7460827 | Schuster et al. | Dec 2008 | B2 |
7460903 | Pineda et al. | Dec 2008 | B2 |
7463143 | Forr et al. | Dec 2008 | B2 |
7463144 | Crystal et al. | Dec 2008 | B2 |
7471987 | Crystal et al. | Dec 2008 | B2 |
7483835 | Neuhauser et al. | Jan 2009 | B2 |
7496400 | Hoskonen et al. | Feb 2009 | B2 |
7548774 | Kurtz et al. | Jun 2009 | B2 |
7551952 | Gevins et al. | Jun 2009 | B2 |
7557775 | Gross | Jul 2009 | B2 |
7592908 | Zhang et al. | Sep 2009 | B2 |
7614066 | Urdang et al. | Nov 2009 | B2 |
7623823 | Zito et al. | Nov 2009 | B2 |
7636456 | Collins et al. | Dec 2009 | B2 |
7650793 | Jensen et al. | Jan 2010 | B2 |
7689272 | Farwell | Mar 2010 | B2 |
7697979 | Martinerie et al. | Apr 2010 | B2 |
7698238 | Barletta et al. | Apr 2010 | B2 |
7720351 | Levitan | May 2010 | B2 |
7729755 | Laken | Jun 2010 | B2 |
7809420 | Hannula et al. | Oct 2010 | B2 |
7840248 | Fuchs et al. | Nov 2010 | B2 |
7840250 | Tucker | Nov 2010 | B2 |
7865394 | Calloway | Jan 2011 | B1 |
7892764 | Xiong et al. | Feb 2011 | B2 |
7908133 | Neuhauser | Mar 2011 | B2 |
7917366 | Levanon et al. | Mar 2011 | B1 |
7962315 | Jensen et al. | Jun 2011 | B2 |
7988557 | Soderland | Aug 2011 | B2 |
7996264 | Kusumoto et al. | Aug 2011 | B2 |
8014847 | Shastri et al. | Sep 2011 | B2 |
8055722 | Hille | Nov 2011 | B2 |
8069125 | Jung et al. | Nov 2011 | B2 |
8082215 | Jung et al. | Dec 2011 | B2 |
8086563 | Jung et al. | Dec 2011 | B2 |
8098152 | Zhang et al. | Jan 2012 | B2 |
8103328 | Turner et al. | Jan 2012 | B2 |
8135606 | Dupree | Mar 2012 | B2 |
8165916 | Hoffberg et al. | Apr 2012 | B2 |
8209224 | Pradeep et al. | Jun 2012 | B2 |
8229469 | Zhang et al. | Jul 2012 | B2 |
8270814 | Pradeep et al. | Sep 2012 | B2 |
20010020236 | Cannon | Sep 2001 | A1 |
20010056225 | DeVito | Dec 2001 | A1 |
20020065826 | Bell et al. | May 2002 | A1 |
20020072952 | Hamzey et al. | Jun 2002 | A1 |
20020077534 | DuRousseau | Jun 2002 | A1 |
20020155878 | Lert, Jr. et al. | Oct 2002 | A1 |
20020156842 | Signes et al. | Oct 2002 | A1 |
20020188216 | Kayyali et al. | Dec 2002 | A1 |
20020188217 | Farwell | Dec 2002 | A1 |
20020193670 | Garfield et al. | Dec 2002 | A1 |
20030013981 | Gevins et al. | Jan 2003 | A1 |
20030036955 | Tanaka et al. | Feb 2003 | A1 |
20030059750 | Bindler et al. | Mar 2003 | A1 |
20030073921 | Sohmer et al. | Apr 2003 | A1 |
20030100998 | Brunner et al. | May 2003 | A2 |
20030104865 | Itkis et al. | Jun 2003 | A1 |
20030165270 | Endrikhovski et al. | Sep 2003 | A1 |
20030177488 | Smith et al. | Sep 2003 | A1 |
20030233278 | Marshall | Dec 2003 | A1 |
20040005143 | Tsuru et al. | Jan 2004 | A1 |
20040013398 | Miura et al. | Jan 2004 | A1 |
20040015608 | Ellis et al. | Jan 2004 | A1 |
20040073129 | Caldwell et al. | Apr 2004 | A1 |
20040092809 | DeCharms | May 2004 | A1 |
20040098298 | Yin | May 2004 | A1 |
20040187167 | Maguire et al. | Sep 2004 | A1 |
20040210159 | Kibar | Oct 2004 | A1 |
20040220483 | Yeo et al. | Nov 2004 | A1 |
20040236623 | Gopalakrishnan | Nov 2004 | A1 |
20050010475 | Perkowski et al. | Jan 2005 | A1 |
20050076359 | Pierson et al. | Apr 2005 | A1 |
20050079474 | Lowe | Apr 2005 | A1 |
20050097594 | O'Donnell et al. | May 2005 | A1 |
20050107716 | Eaton et al. | May 2005 | A1 |
20050143629 | Farwell | Jun 2005 | A1 |
20050154290 | Langleben | Jul 2005 | A1 |
20050177058 | Sobell | Aug 2005 | A1 |
20050197590 | Osorio et al. | Sep 2005 | A1 |
20050203798 | Jensen et al. | Sep 2005 | A1 |
20050223237 | Barletta et al. | Oct 2005 | A1 |
20050227233 | Buxton et al. | Oct 2005 | A1 |
20050240956 | Smith et al. | Oct 2005 | A1 |
20050272017 | Neuhauser et al. | Dec 2005 | A1 |
20050273017 | Gordon | Dec 2005 | A1 |
20050273802 | Crystal et al. | Dec 2005 | A1 |
20050288954 | McCarthy et al. | Dec 2005 | A1 |
20050289582 | Tavares et al. | Dec 2005 | A1 |
20060003732 | Neuhauser et al. | Jan 2006 | A1 |
20060035707 | Nguyen et al. | Feb 2006 | A1 |
20060053110 | McDonald et al. | Mar 2006 | A1 |
20060093998 | Vertegaal | May 2006 | A1 |
20060111044 | Keller | May 2006 | A1 |
20060111644 | Guttag et al. | May 2006 | A1 |
20060129458 | Maggio | Jun 2006 | A1 |
20060167376 | Viirre et al. | Jul 2006 | A1 |
20060168613 | Wood et al. | Jul 2006 | A1 |
20060168630 | Davies | Jul 2006 | A1 |
20060256133 | Rosenberg | Nov 2006 | A1 |
20060257834 | Lee et al. | Nov 2006 | A1 |
20060259360 | Flinn et al. | Nov 2006 | A1 |
20060293921 | McCarthy et al. | Dec 2006 | A1 |
20070048707 | Caamano et al. | Mar 2007 | A1 |
20070055169 | Lee et al. | Mar 2007 | A1 |
20070060831 | Le et al. | Mar 2007 | A1 |
20070066874 | Cook | Mar 2007 | A1 |
20070066915 | Frei et al. | Mar 2007 | A1 |
20070066916 | Lemos | Mar 2007 | A1 |
20070067007 | Schulman et al. | Mar 2007 | A1 |
20070078706 | Datta et al. | Apr 2007 | A1 |
20070079331 | Datta et al. | Apr 2007 | A1 |
20070106170 | Dunseath, Jr. et al. | May 2007 | A1 |
20070135727 | Virtanen et al. | Jun 2007 | A1 |
20070135728 | Snyder et al. | Jun 2007 | A1 |
20070225585 | Washbon et al. | Sep 2007 | A1 |
20070225674 | Molnar et al. | Sep 2007 | A1 |
20070226760 | Neuhauser et al. | Sep 2007 | A1 |
20070235716 | Delic et al. | Oct 2007 | A1 |
20070238945 | Delic et al. | Oct 2007 | A1 |
20070250846 | Swix et al. | Oct 2007 | A1 |
20070265507 | de Lemos | Nov 2007 | A1 |
20070294132 | Zhang et al. | Dec 2007 | A1 |
20070294705 | Gopalakrishnan | Dec 2007 | A1 |
20070294706 | Neuhauser et al. | Dec 2007 | A1 |
20080001600 | deCharms | Jan 2008 | A1 |
20080010110 | Neuhauser et al. | Jan 2008 | A1 |
20080027345 | Kumada et al. | Jan 2008 | A1 |
20080040740 | Plotnick et al. | Feb 2008 | A1 |
20080059997 | Plotnick et al. | Mar 2008 | A1 |
20080065468 | Berg et al. | Mar 2008 | A1 |
20080065721 | Cragun | Mar 2008 | A1 |
20080081961 | Westbrook et al. | Apr 2008 | A1 |
20080082019 | Ludving et al. | Apr 2008 | A1 |
20080086356 | Glassman et al. | Apr 2008 | A1 |
20080091512 | Marci et al. | Apr 2008 | A1 |
20080097854 | Young | Apr 2008 | A1 |
20080109840 | Walter et al. | May 2008 | A1 |
20080125110 | Ritter | May 2008 | A1 |
20080147488 | Tunick et al. | Jun 2008 | A1 |
20080152300 | Knee et al. | Jun 2008 | A1 |
20080204273 | Crystal et al. | Aug 2008 | A1 |
20080208072 | Fadem et al. | Aug 2008 | A1 |
20080214902 | Lee et al. | Sep 2008 | A1 |
20080221400 | Lee et al. | Sep 2008 | A1 |
20080221472 | Lee et al. | Sep 2008 | A1 |
20080221969 | Lee et al. | Sep 2008 | A1 |
20080222670 | Lee et al. | Sep 2008 | A1 |
20080222671 | Lee et al. | Sep 2008 | A1 |
20080228077 | Wilk et al. | Sep 2008 | A1 |
20080255949 | Genco et al. | Oct 2008 | A1 |
20080295126 | Lee et al. | Nov 2008 | A1 |
20090024049 | Pradeep et al. | Jan 2009 | A1 |
20090024447 | Pradeep et al. | Jan 2009 | A1 |
20090024448 | Pradeep et al. | Jan 2009 | A1 |
20090024449 | Pradeep et al. | Jan 2009 | A1 |
20090024475 | Pradeep et al. | Jan 2009 | A1 |
20090025023 | Pradeep et al. | Jan 2009 | A1 |
20090025024 | Beser et al. | Jan 2009 | A1 |
20090030287 | Pradeep et al. | Jan 2009 | A1 |
20090030303 | Pradeep et al. | Jan 2009 | A1 |
20090030717 | Pradeep et al. | Jan 2009 | A1 |
20090030762 | Lee et al. | Jan 2009 | A1 |
20090030930 | Pradeep et al. | Jan 2009 | A1 |
20090036755 | Pradeep et al. | Feb 2009 | A1 |
20090036756 | Pradeep et al. | Feb 2009 | A1 |
20090037575 | Crystal et al. | Feb 2009 | A1 |
20090060240 | Coughlan et al. | Mar 2009 | A1 |
20090062629 | Pradeep et al. | Mar 2009 | A1 |
20090062679 | Tan et al. | Mar 2009 | A1 |
20090062680 | Sandford | Mar 2009 | A1 |
20090062681 | Pradeep et al. | Mar 2009 | A1 |
20090063255 | Pradeep et al. | Mar 2009 | A1 |
20090063256 | Pradeep et al. | Mar 2009 | A1 |
20090069652 | Lee et al. | Mar 2009 | A1 |
20090070798 | Lee et al. | Mar 2009 | A1 |
20090082643 | Pradeep et al. | Mar 2009 | A1 |
20090082689 | Guttag et al. | Mar 2009 | A1 |
20090083129 | Pradeep et al. | Mar 2009 | A1 |
20090088610 | Lee et al. | Apr 2009 | A1 |
20090089830 | Chandratillake et al. | Apr 2009 | A1 |
20090094286 | Lee et al. | Apr 2009 | A1 |
20090094627 | Lee et al. | Apr 2009 | A1 |
20090094628 | Lee et al. | Apr 2009 | A1 |
20090094629 | Lee et al. | Apr 2009 | A1 |
20090097689 | Prest et al. | Apr 2009 | A1 |
20090099474 | Pineda et al. | Apr 2009 | A1 |
20090112077 | Nguyen et al. | Apr 2009 | A1 |
20090131764 | Lee et al. | May 2009 | A1 |
20090133047 | Lee et al. | May 2009 | A1 |
20090150919 | Lee et al. | Jun 2009 | A1 |
20090158308 | Weitzenfeld et al. | Jun 2009 | A1 |
20090163777 | Jung et al. | Jun 2009 | A1 |
20090195392 | Zalewski | Aug 2009 | A1 |
20090214060 | Chuang et al. | Aug 2009 | A1 |
20090248484 | Surendran et al. | Oct 2009 | A1 |
20090248496 | Hueter et al. | Oct 2009 | A1 |
20090253996 | Lee et al. | Oct 2009 | A1 |
20090259137 | Delic et al. | Oct 2009 | A1 |
20090292587 | Fitzgerald | Nov 2009 | A1 |
20090318773 | Jung et al. | Dec 2009 | A1 |
20090318826 | Green et al. | Dec 2009 | A1 |
20090327068 | Pradeep et al. | Dec 2009 | A1 |
20090328089 | Pradeep et al. | Dec 2009 | A1 |
20100004977 | Marci et al. | Jan 2010 | A1 |
20100022821 | Dubi et al. | Jan 2010 | A1 |
20100041962 | Causevic et al. | Feb 2010 | A1 |
20100060300 | Mueller et al. | Mar 2010 | A1 |
20100125219 | Harris et al. | May 2010 | A1 |
20100145176 | Himes | Jun 2010 | A1 |
20100145215 | Pradeep et al. | Jun 2010 | A1 |
20100145217 | Otto et al. | Jun 2010 | A1 |
20100183279 | Pradeep et al. | Jul 2010 | A1 |
20100186031 | Pradeep et al. | Jul 2010 | A1 |
20100186032 | Pradeep et al. | Jul 2010 | A1 |
20100198042 | Popescu et al. | Aug 2010 | A1 |
20100214318 | Pradeep et al. | Aug 2010 | A1 |
20100215289 | Pradeep et al. | Aug 2010 | A1 |
20100218208 | Holden | Aug 2010 | A1 |
20100249538 | Pradeep et al. | Sep 2010 | A1 |
20100249636 | Pradeep et al. | Sep 2010 | A1 |
20100250325 | Pradeep et al. | Sep 2010 | A1 |
20100250458 | Ho | Sep 2010 | A1 |
20100257052 | Zito et al. | Oct 2010 | A1 |
20100268540 | Arshi et al. | Oct 2010 | A1 |
20100268573 | Jain et al. | Oct 2010 | A1 |
20100269127 | Krug | Oct 2010 | A1 |
20100274152 | McPeck et al. | Oct 2010 | A1 |
20100274153 | Tucker et al. | Oct 2010 | A1 |
20100317988 | Terada et al. | Dec 2010 | A1 |
20100325660 | Holden | Dec 2010 | A1 |
20100331661 | Nakagawa | Dec 2010 | A1 |
20110004089 | Chou | Jan 2011 | A1 |
20110015503 | Joffe et al. | Jan 2011 | A1 |
20110015539 | deCharms | Jan 2011 | A1 |
20110040202 | Luo et al. | Feb 2011 | A1 |
20110046473 | Pradeep et al. | Feb 2011 | A1 |
20110046502 | Pradeep et al. | Feb 2011 | A1 |
20110046503 | Pradeep et al. | Feb 2011 | A1 |
20110046504 | Pradeep et al. | Feb 2011 | A1 |
20110047121 | Pradeep et al. | Feb 2011 | A1 |
20110059422 | Masaoka | Mar 2011 | A1 |
20110085700 | Lee | Apr 2011 | A1 |
20110098593 | Low et al. | Apr 2011 | A1 |
20110105937 | Pradeep et al. | May 2011 | A1 |
20110106621 | Pradeep et al. | May 2011 | A1 |
20110106750 | Pradeep et al. | May 2011 | A1 |
20110119124 | Pradeep et al. | May 2011 | A1 |
20110119129 | Pradeep et al. | May 2011 | A1 |
20110131274 | Hille | Jun 2011 | A1 |
20110144519 | Causevic | Jun 2011 | A1 |
20110153391 | Tenbrock | Jun 2011 | A1 |
20110208515 | Neuhauser | Aug 2011 | A1 |
20110224569 | Isenhart et al. | Sep 2011 | A1 |
20110237923 | Picht et al. | Sep 2011 | A1 |
20110237971 | Pradeep et al. | Sep 2011 | A1 |
20110248729 | Mueller et al. | Oct 2011 | A2 |
20110257502 | Lee | Oct 2011 | A1 |
20110257937 | Lee | Oct 2011 | A1 |
20110270620 | Pradeep et al. | Nov 2011 | A1 |
20110276504 | Pradeep et al. | Nov 2011 | A1 |
20110282231 | Pradeep et al. | Nov 2011 | A1 |
20110282232 | Pradeep et al. | Nov 2011 | A1 |
20110282749 | Pradeep et al. | Nov 2011 | A1 |
20110298706 | Mann | Dec 2011 | A1 |
20110319975 | Ho et al. | Dec 2011 | A1 |
20120004899 | Arshi | Jan 2012 | A1 |
20120022391 | Leuthardt | Jan 2012 | A1 |
20120036005 | Pradeep et al. | Feb 2012 | A1 |
20120054018 | Pradeep et al. | Mar 2012 | A1 |
20120072289 | Pradeep et al. | Mar 2012 | A1 |
20120108995 | Pradeep et al. | May 2012 | A1 |
20120114305 | Holden | May 2012 | A1 |
20120130800 | Pradeep et al. | May 2012 | A1 |
20120239407 | Jain et al. | Sep 2012 | A1 |
20120245978 | Crystal et al. | Sep 2012 | A1 |
Number | Date | Country |
---|---|---|
1374658 | Nov 1974 | GB |
2221759 | Feb 1990 | GB |
95-18565 | Jul 1995 | WO |
9717774 | May 1997 | WO |
9740745 | Nov 1997 | WO |
9741673 | Nov 1997 | WO |
02-100241 | Dec 2002 | WO |
02-102238 | Dec 2002 | WO |
2004049225 | Jun 2004 | WO |
2008-077178 | Jul 2008 | WO |
2008-109694 | Sep 2008 | WO |
2008-109699 | Sep 2008 | WO |
2008121651 | Oct 2008 | WO |
2008137579 | Nov 2008 | WO |
2008154410 | Dec 2008 | WO |
2009018374 | Feb 2009 | WO |
2009052833 | Apr 2009 | WO |
2011-055291 | May 2011 | WO |
2011-056679 | May 2011 | WO |
Entry |
---|
“Virtual reality” definition, downloaded Aug. 3, 2012 from http://www.merriam-webster.com/dictionary/virtual%20reality. |
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/056,211, on Jul. 8, 2011, 16 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/056,211, on Jan. 7, 2011, 19 pages. |
Office Action issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/056,221, on Apr. 15, 2011, 24 pages. |
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/113,863, on Jun. 9, 2011, 12 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/113,863, on Dec. 27, 2010, 15 pages. |
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/113,870, on Apr. 21, 2011, 10 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/113,870, on Dec. 3, 2010, 12 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/122,240, on Jun. 10, 2011, 12 pages. |
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/122,262, on May 26, 2011, 15 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/122,262, on Dec. 9, 2010, 13 pages. |
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/135,066, on Jan. 21, 2011, 16 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/135,066, on Oct. 28, 2010, 14 pages. |
Notice of Panel Decision from Pre-Appeal Brief Review, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/135,066, on May 31, 2011, 2 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/135,074, on Dec. 23, 2010, 14 pages. |
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/135,074, on Jun. 9, 2011, 10 pages. |
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/182,874, on Jul. 7, 2011, 14 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/182,874, on Dec. 27, 2010, 17 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/199,557, on Dec. 27, 2010, 14 pages. |
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/199,557, on Jun. 9, 2011, 12 pages. |
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/199,583, on Jun. 21, 2011, 14 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/199,583, on Dec. 27, 2010, 17 pages. |
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/199,596, on Jun. 14, 2011, 13 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/199,596, on Dec. 27, 2010, 17 pages. |
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/200,813, on Jul. 6, 2011, 13 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/200,813, on Dec. 27, 2010, 14 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/234,372, on Jun. 7, 2011, 10 pages. |
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/135,069, on Feb. 17, 2011, 32 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/135,069, on Oct. 29, 2010, 21 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/357,315, on May 4, 2011, 9 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/410,380, on Jun. 7, 2011, 9 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/413,297, on Jul. 18, 2011, 9 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/608,685, on Jul. 12, 2011, 15 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/056,190, on Aug. 10, 2011, 28 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/357,322, on Aug. 23, 2011, 12 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/135,069, on Aug. 26, 2011, 33 pages. |
Restriction Requirement, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/122,253, on Sep. 2, 2011, 7 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/410,372, on Sep. 12, 2011, 12 pages. |
Restriction Requirement, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/182,851, on Sep. 12, 2011, 7 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/135,066, on Sep. 29, 2011, 37 pages. |
Restriction Requirement, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/056,225, on Oct. 3, 2011, 6 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/234,388, on Oct. 12, 2011, 27 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/234,372, on Oct. 13, 2011, 22 pages. |
International Preliminary Report on Patentability, issued by the International Bureau of WIPO in connection with International Application No. PCT/US08/058264, on Sep. 29, 2009, 1 page. |
International Search Report, issued by the International Searching Authority in connection with International Application No. PCT/US08/058264, on Aug. 1, 2008, 2 pages. |
Written Opinion, issued by the International Searching Authority in connection with International Application No. PCT/US08/058264, on Aug. 1, 2008, 5 pages. |
International Preliminary Report on Patentability, issued by the International Bureau of WIPO in connection with International Application No. PCT/US08/062273, on Nov. 3, 2009, 1 page. |
International Search Report, issued by the International Searching Authority in connection with International Application No. PCT/US08/062273, on Sep. 5, 2008, 2 pages. |
Written Opinion, issued by the International Searching Authority in connection with International Application No. PCT/US08/062273, on Sep. 5, 2008, 4 pages. |
International Preliminary Report on Patentability, issued by the International Bureau of WIPO in connection with International Application No. PCT/US08/062275, on Nov. 3, 2009, 1 page. |
International Search Report, issued by the International Bureau in connection with International Application No. PCT/US08/062275, on Sep. 22, 2008, 2 pages. |
Written Opinion, issued by the International Bureau in connection with International Application No. PCT/US08/062275, on Sep. 22, 2008, 6 pages. |
International Preliminary Report on Patentability, issued by the International Bureau of WIPO in connection with International Application No. PCT/US08/063984, on Nov. 17, 2009, 1 page. |
International Search Report, issued by the International Bureau in connection with International Application No. PCT/US08/063984, on Sep. 29, 2008, 3 pages. |
Written Opinion, issued by the International Bureau in connection with International Application No. PCT/US08/063984, on Sep. 29, 2008, 4 pages. |
International Preliminary Report on Patentability, issued by the International Bureau of WIPO in connection with International Application No. PCT/US08/063989, on Nov. 17, 2009, 1 page. |
International Search Report, issued by the International Bureau in connection with International Application No. PCT/US08/063989, on Jul. 17, 2008, 2 pages. |
Written Opinion, issued by the International Bureau in connection with International Application No. PCT/US08/063989, on Jul. 17, 2008, 4 pages. |
International Preliminary Report on Patentability, issued by the International Bureau of WIPO in connection with International Application No. PCT/US08/066166, on Dec. 7, 2009, 1 page. |
International Search Report, issued by the International Bureau in connection with International Application No. PCT/US08/066166, on Aug. 25, 2008, 2 pages. |
Written Opinion, issued by the International Bureau in connection with International Application No. PCT/US08/066166, on Aug. 25, 2008, 6 pages. |
International Preliminary Report on Patentability, issued by the International Bureau of WIPO in connection with International Application No. PCT/US08/071639, on Feb. 2, 2010, 1 page. |
International Search Report, issued by the International Bureau in connection with International Application No. PCT/US08/071639, on Oct. 22, 2008, 3 pages. |
Written Opinion, issued by the International Bureau in connection with International Application No. PCT/US08/071639, on Oct. 22, 2008, 4 pages. |
International Preliminary Report on Patentability, issued by the International Bureau of WIPO in connection with International Application No. PCT/US08/074467, on Mar. 2, 2010, 1 page. |
International Search Report, issued by the International Bureau in connection with International Application No. PCT/US08/074467, on Nov. 17, 2008, 2 pages. |
Written Opinion, issued by the International Bureau in connection with International Application No. PCT/US08/074467, on Nov. 17, 2008, 4 pages. |
International Preliminary Report on Patentability, issued by the International Bureau in connection with International Application No. PCT/US10/021535, on Jul. 26, 2011, 1 page. |
International Search Report, issued by the International Bureau in connection with International Application No. PCT/US10/021535, on Mar. 23, 2010, 3 pages. |
Written Opinion, issued by the International Bureau in connection with International Application No. PCT/US10/021535, on Mar. 23, 2010, 4 pages. |
International Preliminary Report on Patentability, issued by the International Bureau in connection with International Application No. PCT/US09/065368, on Jun. 23, 2011, 2 pages. |
International Search Report, issued by the International Bureau in connection with International Application No. PCT/US09/065368, on Jan. 21, 2010, 3 pages. |
Written Opinion, issued by the International Bureau in connection with International Application No. PCT/US09/065368, on Jan. 21, 2010, 7 pages. |
Extended European Search Report, issued by the European Patent Office in connection with European Application No. 08744383.4-2221/2130146, on Jul. 27, 2011, 6 pages. |
Extended European Search Report, issued by the European Patent Office in connection with European Application No. 10173095.0-2221, on Dec. 17, 2010, 3 pages. |
Extended European Search Report, issued by the European Patent Office in connection with European Application No. 10189294.1-2221, on Mar. 21, 2011, 7 pages. |
First Office Action, issued by the State Intellectual Property Office of P.R. China in connection with Patent Application No. 200880104982.1, on Jan. 25, 2011, 15 pages. |
First Office Action, issued by the State Intellectual Property Office of P.R. China in connection with Patent Application No. 2008801015007, on May 25, 2011, 8 pages. |
First Office Action, issued by the State Intellectual Property Office of P.R. China in connection with Patent Application No. 200880019166.0, on Jul. 22, 2011, 16 pages. |
Decision of Rejection, issued by the State Intellectual Property Office of P.R. China in connection with Patent Application No. 200880104982.1, on Sep. 23, 2011, 10 pages. |
Edgar, et al., “Digital Filters in ERP Research,” in Event-Related Potentials: A Methods Handbook pp. 85-113, (Todd C. Handy, ed., 2005), 15 pages. |
Simon-Thomas, et al., “Behavioral and Electrophysiological Evidence of a Right Hemisphere Bias for the Influence of Negative Emotion on Higher Cognition,” Journal of Cognitive Neuroscience, pp. 518-529, Massachusetts Institute of Technology (2005), 12 pages. |
Flinker, A. et al., “Sub-centimeter language organization in the human temporal lobe,” Brain and Language, Elsevier Inc., (2010), doi.org/10.1016/j.bandl.2010.09.009, 7 pages. |
Friedman, et al., “Event-Related Potential (ERP) Studies of Memory Encoding and Retrieval: A Selective Review,” Microscopy Research and Technique 51:6-26, Wiley-Liss, Inc. (2000), 23 pages. |
Gaillard, “Problems and Paradigms in ERP Research,” Biological Psychology, Elsevier Science Publisher B.V. (1988), 10 pages. |
Hopf, et al., “Neural Sources of Focused Attention in Visual Search,” Cerebral Cortex, 10:1233-1241, Oxford University Press, (Dec. 2000), 9 pages. |
Swick, et al., “Contributions of Prefrontal Cortex to Recognition Memory: Electrophysiological and Behavioral Evidence,” Neuropsychology, vol. 13, No. 2, pp. 155-170, American Psychological Association, Inc. (1999), 16 pages. |
Luck, et al., “The speed of visual attention in schizophrenia: Electrophysiological and behavioral evidence,” Schizophrenia Research, pp. 174-195, Elsevier B.V. www.sciencedirect.com, (2006), 22 pages. |
Makeig, et al., “Mining event-related brain dynamics,” Trends in Cognitive Sciences, vol. 8, No. 5, (May 2004), www.sciencedirect.com, 7 pages. |
Herrmann, et al., “Mechanisms of human attention: event-related potentials and oscillations,” Neuroscience and Biobehavioral Reviews, pp. 465-476, Elsevier Science Ltd., www.elsevier.com/locate/neubiorev, (2001), 12 pages. |
Knight, “Consciousness Unchained: Ethical Issues and the Vegetative and Minimally Conscious State,” The American Journal of Bioethics, 8:9, 1-2, http://dx.doi.org/10.1080/15265160802414524, (Sep. 1, 2008), 3 pages. |
Kishiyama, et al., “Novelty Enhancements in Memory Are Dependent on Lateral Prefrontal Cortex,” The Journal of Neuroscience, pp. 8114-8118, Society for Neuroscience (Jun. 24, 2009), 5 pages. |
Paller, et al., “Validating neural correlates of familiarity,” Trends in Cognitive Sciences, vol. 11, No. 6, www.sciencedirect.com, (May 2, 2007), 8 pages. |
Picton, et al., “Guidelines for using human event-related potentials to study cognition: Recording standards and publication criteria,” Psychophysiology, pp. 127-152, Society for Psychophysiological Research, (2000), 26 pages. |
Yamaguchi, et al., “Rapid Prefrontal-Hippocampal Habituation to Novel Events,” The Journal of Neuroscience, pp. 5356-5363, Society for Neuroscience, (Apr. 29, 2004), 8 pages. |
Rugg, et al., “Event-related potentials and recognition memory,” Trends in Cognitive Sciences, vol. 11, No. 6, www.sciencedirect.com, (May 3, 2007), 7 pages. |
Rugg, et al., “The ERP and cognitive psychology: conceptual issues,” (Sep. 1996), 7 pages. |
Keren, et al., “Saccadic spike potentials in gamma-band EEG: Characterization, detection and suppression,” NeuroImage, doi:10.1016/j.neuroimage.2009.10.057, (Oct. 2009), 16 pages. |
Kishiyama, et al., “Socioeconomic Disparities Affect Prefrontal Function in Children,” Journal of Cognitive Neuroscience pp. 1106-1115, Massachusetts Institute of Technology, (2008), 10 pages. |
Spencer, “Averaging, Detection, and Classification of Single-Trial ERPs,” in Event-Related Potentials: A Methods Handbook, pp. 209-227, (Todd C. Handy, ed., 2005), 10 pages. |
Srinivasan, “High-Resolution EEG: Theory and Practice,” in Event-Related Potentials: A Methods Handbook, pp. 167-188, (Todd C. Handy, ed., 2005), 12 pages. |
Taheri, et al., “A dry electrode for EEG recording,” Electroencephalography and clinical Neurophysiology, pp. 376-383, Elsevier Science Ireland Ltd. (1994), 8 pages. |
Talsma, et al., “Methods for the Estimation and Removal of Artifacts and Overlap in ERP Waveforms,” in Event-Related Potentials: A Methods Handbook, pp. 115-148, (Todd C. Handy, ed., 2005), 22 pages. |
Davidson, et al., “The functional neuroanatomy of emotion and affective style,” Trends in Cognitive Sciences, vol. 3, No. 1, (Jan. 1999), 11 pages. |
Vogel, et al., “Electrophysiological Evidence for a Postperceptual Locus of Suppression During the Attentional Blink,” Journal of Experimental Psychology: Human Perception and Performance, vol. 24, No. 6, pp. 1656-1674, (1998), 19 pages. |
Rizzolatti et al., “The Mirror-Neuron System,” Annu. Rev. Neurosci., vol. 27, pp. 169-192, (Mar. 5, 2004), 30 pages. |
Voytek, et al., “Prefrontal cortex and basal ganglia contributions to visual working memory,” PNAS Early Edition, www.pnas.org/doi/10.1073/pnas.1007277107, (2010), 6 pages. |
Voytek, et al., “Hemicraniectomy: A New Model for Human Electrophysiology with High Spatio-temporal Resolution,” Journal of Cognitive Neuroscience, vol. 22, No. 11, pp. 2491-2502, Massachusetts Institute of Technology, (Nov. 2009) 12 pages. |
Wang, “Neurophysiological and Computational Principles of Cortical Rhythms in Cognition,” Physiol Rev 90: pp. 1195-1268, American Physiological Society, www.prv.org, (2010), 75 pages. |
Woldorff, “Distortion of ERP averages due to overlap from temporally adjacent ERPs: Analysis and correction,” Psychophysiology, Society for Psychophysiological Research, Cambridge University Press (1993), 22 pages. |
Woodman, et al., “Serial Deployment of Attention During Visual Search,” Journal of Experimental Psychology: Human Perception and Performance, vol. 29, No. 1, pp. 121-138, American Physiological Association (2003), 18 pages. |
Filler, “MR Neurography and Diffusion Tensor Imaging: Origins, History & Clinical Impact of the first 50,000 Cases With an Assortment of Efficacy and Utility in a Prospective 5,000 Patient Study Group,” Institute for Nerve Medicine, (Nov. 7, 2008), 56 pages. |
Yuval-Greenberg, et al., “Transient Induced Gamma-Band Response in EEG as a Manifestation of Miniature Saccades,” Neuron, vol. 58, pp. 429-441, Elsevier Inc. (May 8, 2008), 13 pages. |
Knight, et al., “Prefrontal cortex regulates inhibition and excitation in distributed neural networks,” Acta Psychologica vol. 101, pp. 159-178, Elsevier (1999), 20 pages. |
Akam, et al., “Oscillations and Filtering Networks Support Flexible Routing of Information,” Neuron, vol. 67, pp. 308-320, Elsevier, (Jul. 29, 2010), 13 pages. |
Gargiulo et al., “A Mobile EEG System With Dry Electrodes,” (Nov. 2008), 4 pages. |
Badre, et al. “Frontal Cortex and the Discovery of Abstract Action Rules,” Neuron, vol. 66, pp. 315-326, Elsevier, (Apr. 29, 2010), 13 pages. |
Buschman, et al., “Top-Down Versus Bottom-Up Control of Attention in the Prefrontal and Posterior Parietal Cortices,” Science, vol. 315, www.sciencemag.org/cgi/content/full/315/5820/1860, American Association for the Advancement of Science, (2007), 4 pages. |
Buschman, et al., “Serial, Covert Shifts of Attention during Visual Search Are Reflected by the Frontal Eye Fields and Correlated with Population Oscillations,” Neuron, vol. 63, pp. 386-396, Elsevier, (Aug. 13, 2009), 11 pages. |
Cheng, et al., “Gender Differences in the Mu Rhythm of the Human Mirror-Neuron System,” PLoS ONE, vol. 3, Issue 5, www.plosone.org, (May 2008), 7 pages. |
Fogelson, et al., “Prefrontal cortex is critical for contextual processing: evidence from brain lesions,” Brain: A Journal of Neurology, vol. 132, pp. 3002-3010, doi:10.1093/brain/awp230, (Aug. 27, 2009), 9 pages. |
Fuster, “Cortex and Memory: Emergence of a New Paradigm,” Journal of Cognitive Neuroscience, vol. 21, No. 11, pp. 2047-2072, Massachusetts Institute of Technology, (Nov. 2009), 26 pages. |
D'Esposito, “From cognitive to neural models of working memory,” Phil. Trans. R. Soc. B, doi: 10.1098/rstb.2007.2086, (Mar. 30, 2007), 12 pages. |
Dien, et al., “Application of Repeated Measures ANOVA to High-Density ERP Datasets: A Review and Tutorial,” in Event-Related Potentials: A Methods Handbook pp. 57-82, (Todd C. Handy, ed., 2005), 14 pages. |
Neurofocus—Neuroscientific Analysis for Audience Engagement, accessed on Jan. 8, 2010 at http://web.archive.org/web/20080621114525/www.neurofocus.com/BrandImage.htm, (2008), 2 pages. |
Ambler, “Salience and Choice: Neural Correlates of Shopping Decisions,” Psychology & Marketing, vol. 21, No. 4, pp. 247-261, Wiley Periodicals, Inc., doi: 10.1002/mar.20004, (Apr. 2004), 16 pages. |
Hazlett, et al., “Emotional Response to Television Commercials: Facial EMG vs. Self-Report,” Journal of Advertising Research, (Apr. 1999), 17 pages. |
Makeig, et al., “Dynamic Brain Sources of Visual Evoked Responses,” Science, vol. 295, www.sciencemag.org, (Jan. 25, 2002), 5 pages. |
William, “Brain Signals to Control Movement of Computer Cursor,” Blog article: Brain Signals to Control Movement of Computer Cursor, Artificial Intelligence, retrieved from the Internet on Aug. 17, 2011, http://whatisartificialintelligence.com/899/brain-signals-to-control-movement-of-computer-cursor/, (Feb. 17, 2010), 3 pages. |
Lewis et al., “Market Researchers make Increasing use of Brain Imaging,” ACNR, vol. 5, No. 3, pp. 36-37, (Jul./Aug. 2005), 2 pages. |
Sutherland, “Neuromarketing: What's it all about?” Retrieved from Max Sutherland's Weblog on Aug. 23, 2011, http://www.sutherlandsurvey.com/Column_pages/Neuromarketing_whats_it_all_about.htm, (Mar. 2007), 5 pages. |
Haq, “This Is Your Brain on Advertising,” BusinessWeek, Market Research, (Oct. 8, 2007), 3 pages. |
EEG Protocols, “Protocols for EEG Recording,” retrieved from the Internet on Aug. 23, 2011, http://www.q-metrx.com/EEGrecordingProtocols.pdf, (Nov. 13, 2007), 3 pages. |
Aaker et al., “Warmth in Advertising: Measurement, Impact, and Sequence Effects,” Journal of Consumer Research, vol. 12, No. 4, pp. 365-381, (Mar. 1986), 17 pages. |
Belch et al., “Psychophysiological and cognitive Response to Sex in Advertising,” Advances in Consumer Research, vol. 9, pp. 424-427, (1982), 6 pages. |
Ruchkin et al., “Modality-specific processing streams in verbal working memory: evidence from spatio-temporal patterns of brain activity,” Cognitive Brain Research, vol. 6, pp. 95-113, Elsevier, (1997), 19 pages. |
Page et al., “Cognitive Neuroscience, Marketing and Research,” Congress 2006—Foresight—The Predictive Power of Research Conference Papers, ESOMAR Publications, (Sep. 17, 2006), 25 pages. |
Braeutigam, “Neuroeconomics—From neural systems to economic behavior,” Brain Research Bulletin, vol. 67, pp. 355-360, Elsevier, (2005), 6 pages. |
Lee et al., “What is ‘neuromarketing’? A discussion and agenda for future research,” International Journal of Psychophysiology, vol. 63, pp. 199-204, Elsevier (2006), 6 pages. |
Crawford et al., “Self-generated happy and sad emotions in low and highly hypnotizable persons during waking and hypnosis: laterality and regional EEG activity differences,” International Journal of Psychophysiology, vol. 24, pp. 239-266, (Dec. 1996), 28 pages. |
Desmet, “Measuring Emotion: Development and Application of an Instrument to Measure Emotional Responses to Products,” to be published in Funology: From Usability to Enjoyment, pp. 111-123, Kluwer Academic Publishers, (Blythe et al., eds., 2004), 13 pages. |
Bagozzi et al., “The Role of Emotions in Marketing,” Journal of the Academy of Marketing Science, vol. 27, No. 2, pp. 184-206, Academy of Marketing Science (1999), 23 pages. |
Blakeslee, “If You Have a ‘Buy Button’ in Your Brain, What Pushes It?” The New York Times, www.nytimes.com, (Oct. 19, 2004), 3 pages. |
Kay et al., “Identifying natural images from human brain activity,” Nature, vol. 452, pp. 352-356, Nature Publishing Group, (Mar. 20, 2008), 5 pages. |
Anonymous, “Functional magnetic resonance imaging,” retrieved online from Wikipedia, the Free Encyclopedia on Aug. 23, 2011, at http://en.wikipedia.org/w/index.php?title=Functional_magnetic_resonance_imaging&oldid=319601772, (Oct. 13, 2009), 8 pages. |
Osborne, “Embedded Watermarking for Image Verification in Telemedicine,” Thesis submitted for the degree of Doctor of Philosophy, Electrical and Electronic Engineering, University of Adelaide (2005), 219 pages. |
Nielsen, “Neuroinformatics in Functional Neuroimaging,” Informatics and Mathematical Modeling, Technical University of Denmark, (Aug. 30, 2002), 241 pages. |
Arousal in Sport, in Encyclopedia of Applied Psychology, vol. 1, p. 159, retrieved from Google Books, (Spielberger, ed., Elsevier Academic Press, 2004), 1 page. |
Ziegenfuss, “Neuromarketing: Advertising Ethical & Medical Technology,” The Brownstone Journal, vol. XII, Boston University, pp. 69-73, (May 2005), 5 pages. |
Zyga, “A Baseball Cap That Can Read Your Mind,” PhysOrg.com, located at www.physorg.com/news130152277.html, (May 16, 2008), 11 pages. |
Lekakos, “Personalized Advertising Services Through Hybrid Recommendation Methods: The Case of Digital Interactive Television,” Department of Informatics, Cyprus University, (2004), 11 pages. |
Yap et al., “TIMER: Tensor Image Morphing for Elastic Registration,” NeuroImage, vol. 47, (May 3, 2009), 15 pages. |
Clifford, “Billboards That Look Back,” The New York Times, NYTimes.com, available at http://www.nytimes.com/2008/05/31/business/media/31billboard.html, (May 31, 2008), 4 pages. |
Ambler et al., “Ads on the Brain; A Neuro-Imaging Comparison of Cognitive and Affective Advertising Stimuli,” London Business School, Centre for Marketing Working Paper, No. 00-902, (Mar. 2000), 23 pages. |
U.S. Appl. No. 13/045,457, filed Mar. 10, 2011, (unpublished). |
U.S. Appl. No. 12/778,810, filed May 12, 2010, (unpublished). |
U.S. Appl. No. 12/778,828, filed May 12, 2010, (unpublished). |
U.S. Appl. No. 13/104,821, filed May 10, 2011, (unpublished). |
U.S. Appl. No. 13/104,840, filed May 10, 2011, (unpublished). |
U.S. Appl. No. 12/884,034, filed Sep. 16, 2010, (unpublished). |
U.S. Appl. No. 12/868,531, filed Aug. 25, 2010, (unpublished). |
U.S. Appl. No. 12/913,102, filed Oct. 27, 2010, (unpublished). |
U.S. Appl. No. 12/853,213, filed Aug. 9, 2010, (unpublished). |
U.S. Appl. No. 13/105,774, filed May 11, 2011, (unpublished). |
Notice of Allowance, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/357,302, on May 7, 2012, 16 pages. |
Notice of Allowance, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/868,531, on May 8, 2012, 16 pages. |
Notice of Allowance, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/608,696, on May 15, 2012, 16 pages. |
Restriction Requirement, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/545,455, on Jun. 13, 2012, 5 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/056,225, on Jun. 15, 2012, 9 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/544,934, on Jun. 18, 2012, 11 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/135,066, on Jun. 21, 2012, 9 pages. |
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/608,660, on Jul. 10, 2012, 13 pages. |
Second Office Action, issued by the State Intellectual Property Office of P.R. China in connection with Patent Application No. 200880019166.0, on Jun. 5, 2012, 8 pages. |
Second Office Action, issued by the State Intellectual Property Office of P.R. China in connection with Patent Application No. 200880104982.1, on Jun. 29, 2012, 5 pages. |
Barreto et al., “Physiologic Instrumentation for Real-time Monitoring of Affective State of Computer Users,” WSEAS International Conference on Instrumentation, Measurement, Control, Circuits and Systems (IMCCAS), (2004), 6 pages. |
Jung et al., “Analysis and Visualization of Single-Trial Event-Related Potentials,” Human Brain Mapping vol. 14, 166-185 (2001), 20 pages. |
Krugman, “Brain Wave Measures of Media Involvement,” Journal of Advertising Research vol. 11, (Feb. 3-9, 1971), 7 pages. |
The Mathworks, Inc., “MATLAB Data Analysis: Version 7,” p. 4-19 (2005), 3 pages. |
Klimesch, “EEG alpha and theta oscillations reflect cognitive and memory performance a review and analysis,” Brain Research Reviews, vol. 29, 169-195, (1999), 27 pages. |
Krakow et al., “Methodology: EEG-correlated fMRI,” Functional Imaging in the Epilepsies, (Lippincott Williams & Wilkins, 2000), 17 pages. |
Allen et al., “A Method of Removing Imaging Artifact from Continuous EEG Recorded during Functional MRI,” Neuroimage, vol. 12, 230-239, (Aug. 2000). |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/868,531, on Mar. 1, 2012, 6 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/182,851, on Mar. 12, 2012, 14 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/608,685, on Mar. 29, 2012, 17 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/846,242, on Mar. 29, 2012, 15 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/234,388, on Apr. 6, 2012, 6 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/357,315, on Apr. 9, 2012, 17 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/544,958, on May 2, 2012, 14 pages. |
English Translation of Office Action, issued by the Israel Patent Office in connection with Patent Application No. 203176, on Feb. 21, 2012, 2 pages. |
English Translation of Office Action, issued by the Israel Patent Office in connection with Patent Application No. 203177, on Mar. 1, 2012, 2 pages. |
Second Office Action, issued by the State Intellectual Property Office of P.R. China in connection with Patent Application No. 200880101500.7, on Apr. 5, 2012, 5 pages. |
Padgett et al., “Categorical Perception in Facial Emotion Classification,” In Proceedings of the 18th Annual Conference of the Cognitive Science Society, pp. 249-253 (1996), 5 pages. |
de Gelder et al., “Categorical Perception of Facial Expressions: Categories and their Internal Structure,” Cognition and Emotion, vol. 11(1), pp. 1-23 (1997), 23 pages. |
Bimler et al., “Categorical perception of facial expressions of emotion: Evidence from multidimensional scaling,” Cognition and Emotion, vol. 15(5), pp. 633-658 (Sep. 2001), 26 pages. |
Newell et al., “Categorical perception of familiar objects,” Cognition, vol. 85, Issue 2, pp. 113-143 (Sep. 2002), 31 pages. |
Merriam-Webster Online Dictionary, Definition of Virtual Reality, available at http://www.meriam-webster.com/dictionary/virtual%20reality, 2 pages. |
Griss et al., “Characterization of micromachined spiked biopotential electrodes,” Biomedical Engineering, IEEE Transactions (Jun. 2002), 8 pages. |
“User monitoring,” Sapien Systems, available at http://web.archive.org/web/20030818043339/http:/www.sapiensystems.com/eyetracking.html, (Aug. 18, 2003), 1 page. |
Sullivan et al., “A brain-machine interface using dry-contact, low-noise EEG sensors,” In Proceedings of the 2008 IEEE International Symposium on Circuits and Systems, (May 18, 2008), 4 pages. |
Lui, Tsz-Wai, et al., “Marketing Strategies in Virtual Worlds,” The Data Base for Advances in Information Systems, vol. 38, No. 4, Nov. 2007, pp. 77-80. |
Barcelo, Francisco, et al., “Prefrontal Modulation of Visual Processing in Humans,” Nature Neuroscience, vol. 3, No. 4, Apr. 2000, pp. 399-403. |
Canolty, R.T., et al., “High Gamma Power Is Phase-Locked to Theta Oscillations in Human Neocortex,” Science, vol. 313, Sep. 15, 2006, pp. 1626-1628. |
Engel, Andreas, et al., “Dynamic Predictions: Oscillations and Synchrony in Top-Down Processing,” Macmillan Magazines Ltd, vol. 2, Oct. 2001, pp. 704-716. |
Fries, Pascal, “A Mechanism for Cognitive Dynamics: Neuronal Communication Through Neuronal Coherence,” Trends in Cognitive Sciences, vol. 9, No. 10, Oct. 2005, pp. 474-480. |
Gazzaley, Adam, et al., “Top-down Enhancement and Suppression of the Magnitude and Speed of Neural Activity,” Journal of Cognitive Neuroscience, vol. 17, No. 3, pp. 507-517. |
Hartikainen, Kaisa, et al., “Emotionally Arousing Stimuli Compete with Attention to Left Hemispace,” Editorial Manager(tm) for NeuroReport, Manuscript Draft, Manuscript No. NR-D-07-5935R1, submitted Sep. 8, 2007, 26 pages. |
Knight, Robert T., “Contribution of Human Hippocampal Region to Novelty Detection,” Nature, vol. 383, Sep. 19, 1996, pp. 256-259. |
Knight, Robert T., “Decreased Response to Novel Stimuli After Prefrontal Lesions in Man,” Electroencephalography and Clinical Neurophysiology, vol. 59, 1984, pp. 9-20. |
Miltner, Wolfgang H.R., et al., “Coherence of Gamma-band EEG Activity as a Basis for Associative Learning,” Nature, vol. 397, Feb. 4, 1999, pp. 434-436. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/410,380, on Oct. 19, 2011, 21 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/357,315, on Oct. 26, 2011, 41 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/122,240, on Oct. 27, 2011, 39 pages. |
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/056,221, on Nov. 28, 2011, 44 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/608,660, on Dec. 7, 2011, 8 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/113,863, on Dec. 22, 2011, 17 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/122,262, on Dec. 22, 2011, 17 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/135,074, on Dec. 22, 2011, 16 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/199,557, on Dec. 22, 2011, 17 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/199,596, on Dec. 22, 2011, 15 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/200,813, on Dec. 22, 2011, 18 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/199,583, on Dec. 29, 2011, 18 pages. |
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/410,372, on Jan. 3, 2012, 10 pages. |
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/413,297, on Jan. 4, 2012, 10 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/544,921, on Jan. 9, 2012, 13 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/357,302, on Jan. 17, 2012, 11 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/056,225, on Jan. 20, 2012, 12 pages. |
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/135,066, on Jan. 24, 2012, 12 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/546,586, on Feb. 1, 2012, 17 pages. |
Restriction Requirement, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/544,958, on Feb. 10, 2012, 6 pages. |
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/135,069, on Feb. 14, 2012, 35 pages. |
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/357,322, on Feb. 14, 2012, 14 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/056,211, on Feb. 16, 2012, 15 pages. |
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/056,190, on Feb. 17, 2012, 22 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/122,253, on Feb. 17, 2012, 20 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/182,874, on Feb. 17, 2012, 15 pages. |
Extended European Search Report, issued by the European Patent Office in connection with European Application No. 11006934.1-2221, on Oct. 25, 2011, 5 pages. |
First Office Action, issued by the State Intellectual Property Office of P.R. China in connection with Patent Application No. 200880017883.X, on Nov. 30, 2011, 16 pages. |
Merriam-Webster Online Dictionary, definition for “tangible,” available at http://www.meriam-webster.com/dictionary/tangible, 1 page. |
Mosby's Dictionary of Medicine, Nursing, & Health Professions, 2009, Mosby, Inc., Definition of Alpha Wave, 1 page. |
Mosby's Dictionary of Medicine, Nursing, & Health Professions, 2009, Mosby, Inc., Definition of Beta Wave, 1 page. |
U.S. Appl. No. 13/249,512, filed Sep. 30, 2011, (unpublished). |
U.S. Appl. No. 13/249,525, filed Sep. 30, 2011, (unpublished). |
U.S. Appl. No. 13/288,504, filed Nov. 3, 2011, (unpublished). |
U.S. Appl. No. 13/288,571, filed Nov. 3, 2011, (unpublished). |
U.S. Appl. No. 12/304,234, filed Nov. 3, 2011, (unpublished). |
Examiner's Answer, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/234,372, on May 23, 2012, 11 pages. |
Advisory Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/234,388, on Aug. 28, 2012, 3 pages. |
Examiner's Answer, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/410,380, on Jun. 8, 2012, 12 pages. |
Examiner's Answer, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/410,372, on Aug. 3, 2012, 8 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/545,455, on Aug. 29, 2012, 11 pages. |
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/608,685, on Jul. 30, 2012, 15 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/778,810, on Aug. 31, 2012, 12 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/778,828, on Aug. 30, 2012, 9 pages. |
Notice of Allowance, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/056,190, on Sep. 17, 2012, 11 pages. |
Notice of Allowance, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/122,262, on Sep. 17, 2012, 11 pages. |
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/122,253, on Sep. 17, 2012, 17 pages. |
Examiner's Answer, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/413,297, on Sep. 18, 2012, 18 pages. |
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/546,586, on Sep. 18, 2012, 17 pages. |
Notice of Allowance, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/056,211, on Sep. 19, 2012, 10 pages. |
Notice of Allowance, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/135,074, on Sep. 19, 2012, 10 pages. |
Notice of Allowance, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/200,813, on Sep. 20, 2012, 11 pages. |
Second Office Action, issued by the State Intellectual Property Office of China in connection with Chinese Patent Application No. 200880017883.X, on Aug. 10, 2012 (9 pages). |
Oberman et al., “EEG evidence for mirror neuron activity during the observation of human and robot actions: Toward an analysis of the human qualities of interactive robots,” Elsevier, Neurocomputing vol. 70 (2007), Jan. 2, 2007 (10 pages). |
Notice of Allowance, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/853,213, on Sep. 7, 2012, 9 pages. |
Notice of Allowance, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/199,583, on Sep. 26, 2012, 14 pages. |
Notice of Allowance, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/199,596, on Sep. 27, 2012, 14 pages. |
English Translation of Office Action, issued by the Israel Patent Office in connection with Patent Application No. 203176, on Sep. 27, 2012, 1 page. |
English Translation of Office Action, issued by the Israel Patent Office in connection with Patent Application No. 203177, on Sep. 27, 2012, 1 page. |
Notice of Allowance, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/199,557, on Sep. 28, 2012, 12 pages. |
Notice of Allowance, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/113,863, on Oct. 1, 2012, 12 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 13/444,149, on Oct. 4, 2012, 9 pages. |
Final Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/182,851, on Oct. 4, 2012, 14 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/234,388, on Oct. 5, 2012, 6 pages. |
Office Action, issued by the Japanese Patent Office in connection with Patent Application No. 2010-501190, on Oct. 5, 2012, 5 pages. |
Notice of Allowance, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/868,531, on Oct. 22, 2012, 5 pages. |
English Translation of Office Action, issued by the Japanese Patent Office in connection with Patent Application No. 2010-506646, on Oct. 23, 2012, 3 pages. |
Final Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/846,242, on Nov. 29, 2012, 14 pages. |
Clemons, “Resonance Marketing in the Age of the Truly Informed Consumer: Creating Profits through Differentiation and Delight,” Wharton Information Strategy & Economics Blog 2, available at http://opim.wharton.upenn.edu/˜clemons/blogs/resonanceblog.pdf, (Mar. 28, 2007), 8 pages. |
Notice of Allowance, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/200,813, on Nov. 2, 2012, 5 pages. |
Notice of Allowance, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/056,190, on Nov. 2, 2012, 5 pages. |
Restriction Requirement, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/056,225, on Nov. 2, 2012, 5 pages. |
Final Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/135,066, on Nov. 13, 2012, 9 pages. |
Notice of Allowance, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/113,863, on Nov. 16, 2012, 5 pages. |
Notice of Allowance, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/056,211, on Nov. 21, 2012, 5 pages. |
Notice of Allowance, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/868,531, on Nov. 23, 2012, 5 pages. |
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/913,102, on Dec. 7, 2012, 7 pages. |
Final Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/544,958, on Dec. 10, 2012, 16 pages. |
Notice of Allowance, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/056,190, on Dec. 21, 2012, 14 pages. |
Notice of Allowance, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/056,211, on Dec. 21, 2012, 10 pages. |
Notice of Allowance, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/122,262, on Dec. 21, 2012, 19 pages. |
Notice of Allowance, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/135,074, on Dec. 21, 2012, 12 pages. |
Notice of Allowance, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/199,557, on Dec. 21, 2012, 14 pages. |
Notice of Allowance, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/199,596, on Dec. 21, 2012, 17 pages. |
Notice of Allowance, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/200,813, on Dec. 21, 2012, 9 pages. |
Notice of Allowance, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/853,213, on Dec. 21, 2012, 10 pages. |
Notice of Allowance, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/868,531, on Dec. 26, 2012, 2 pages. |
Notice of Allowance, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/113,863, on Dec. 31, 2012, 5 pages. |
Notice of Allowance, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/199,583, on Dec. 31, 2012, 10 pages. |
Final Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/182,874, on Jan. 4, 2013, 17 pages. |
Notice of Allowance, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/199,583, on Jan. 11, 2013, 11 pages. |
Final Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/056,225, on Jan. 11, 2013, 11 pages. |
English Translation of Office Action, issued by the Israeli Patent Office in connection with Patent Application No. 201187, on Nov. 27, 2012, 2 pages. |
English Translation of Office Action, issued by the State Intellectual Property Office of P.R. China in connection with Patent Application No. 200880101500.7, on Nov. 21, 2012, 5 pages. |
Extended European Search Report, issued by the European Patent Office in connection with European Application No. 08796890.5-2319/2170161, on Dec. 7, 2012, 9 pages. |
Palva et al., “Phase Synchrony Among Neuronal Oscillations in the Human Cortex,” Journal of Neuroscience, vol. 25 (2005), pp. 3962-3972, 11 pages. |
Lachaux et al., “Measuring Phase Synchrony in Brain Signals,” Human Brain Mapping 8 (1999), 194-208, 15 pages. |
Number | Date | Country
---|---|---
20120036004 A1 | Feb 2012 | US