Methods and apparatus for providing alternate media for video decoders

Information

  • Patent Grant
  • Patent Number
    9,826,284
  • Date Filed
    Thursday, March 31, 2016
  • Date Issued
    Tuesday, November 21, 2017
Abstract
A system provides programming and advertising to a video decoder such as a digital video recorder, computer system, software or hardware player, etc. When a user makes a request to skip a commercial by issuing a command such as 30 second skip forward, alternate media is provided. In some examples, an image advertisement is provided for a predetermined period of time either during the commercial break or when regular programming resumes. In other examples, a substitute commercial is shown. The substitute commercial may be shortened or compressed. The alternate media may be perceptually encoded in a video stream, hidden in a video stream, or provided in a separate stream. In some examples, survey based and neuro-response based data is used to evaluate and select alternate media for particular programming.
Description
TECHNICAL FIELD

The present disclosure relates to providing alternate media, such as different commercials or advertisements, for video played on devices such as video recorders, computer systems, hardware and software players, set top boxes, etc.


BACKGROUND

A variety of conventional systems are available for delivering and manipulating video. In some instances, personal video recorders or digital video recorders store video and audio to allow user playback and/or manipulation of the video. A user may fast forward, rewind, skip forward, and/or play video back at varying speeds. Computing systems may also hold video in memory that allows playback and manipulation of the video.


Although a variety of video delivery and manipulation mechanisms are available, the ability to provide alternate media is limited. Consequently, it is desirable to provide improved methods and apparatus for presenting alternate media.





BRIEF DESCRIPTION OF THE DRAWINGS

The disclosure may best be understood by reference to the following description taken in conjunction with the accompanying drawings, which illustrate particular example embodiments.



FIG. 1 illustrates one example of a system for providing alternate media in a video recorder system.



FIG. 2 illustrates one example of a video with commercial breaks.



FIG. 3 illustrates one example of a series of video frames.



FIG. 4 illustrates another example of a series of video frames.



FIG. 5 illustrates one example of a system for analyzing alternate media.



FIG. 6 illustrates one example of a technique for providing alternate media.



FIG. 7 illustrates one example of a technique for performing data analysis for video.



FIG. 8 provides one example of a system that can be used to implement one or more mechanisms.





DETAILED DESCRIPTION

Reference will now be made in detail to some specific examples of the invention including the best modes contemplated by the inventors for carrying out the invention. Examples of these specific embodiments are illustrated in the accompanying drawings. While the invention is described in conjunction with these specific embodiments, it will be understood that it is not intended to limit the invention to the described embodiments. On the contrary, it is intended to cover alternatives, modifications, and equivalents as may be included within the spirit and scope of the invention as defined by the appended claims.


For example, the techniques and mechanisms of the present invention will be described in the context of particular types of video and video players. However, it should be noted that the techniques and mechanisms of the present invention apply to a variety of different types of video and video players. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. Particular example embodiments of the present invention may be implemented without some or all of these specific details. In other instances, well known process operations have not been described in detail in order not to unnecessarily obscure the present invention.


Various techniques and mechanisms of the present invention will sometimes be described in singular form for clarity. However, it should be noted that some embodiments include multiple iterations of a technique or multiple instantiations of a mechanism unless noted otherwise. For example, a system uses a processor in a variety of contexts. However, it will be appreciated that a system can use multiple processors while remaining within the scope of the present invention unless otherwise noted. Furthermore, the techniques and mechanisms of the present invention will sometimes describe a connection between two entities. It should be noted that a connection between two entities does not necessarily mean a direct, unimpeded connection, as a variety of other entities may reside between the two entities. For example, a processor may be connected to memory, but it will be appreciated that a variety of bridges and controllers may reside between the processor and memory. Consequently, a connection does not necessarily mean a direct, unimpeded connection unless otherwise noted.


Overview


A system provides programming and advertising to a video decoder such as a digital video recorder, computer system, software or hardware player, etc. When a user makes a request to skip a commercial by issuing a command such as 30 second skip forward, alternate media is provided. In some examples, an image advertisement is provided for a predetermined period of time either during the commercial break or when regular programming resumes. In other examples, a substitute commercial is shown. The substitute commercial may be shortened or compressed. The alternate media may be perceptually encoded in a video stream, hidden in a video stream, or provided in a separate stream. In some examples, survey based and neuro-response based data is used to evaluate and select alternate media for particular programming.


Example Embodiments

Conventional video decoders included in devices such as personal video recorders, digital video recorders, televisions, hardware and software players, and computer systems allow viewers to skip over content using mechanisms such as fast forward and 30 second skip forward. Viewers are particularly keen on skipping over commercial and advertising content. However, having a large number of viewers skipping over commercial and advertising content decreases the value of commercial and advertising content for programming, leading to decreased revenue for content providers and other parties. This in turn creates a disincentive to provide programming supported by commercial and advertising content.


As commercial skipping becomes more prevalent, the techniques of the present invention recognize that it is useful to provide advertisers, content providers, and service providers with a mechanism for introducing alternate advertising and commercial content to viewers. According to various embodiments, a receiver would introduce an image or logo associated with a commercial when the commercial is skipped. The image or logo may be displayed for a predetermined period of time.


However, there is still a substantial decrease in the value of the commercial slot, as the image or logo may be significantly less effective than the full length commercial. Consequently, the techniques of the present invention provide alternate commercials to a video recorder. According to various embodiments, alternate commercials and advertisements of varying length are stored on a device. When a viewer attempts to skip over an advertisement during a commercial break, the device provides an alternate advertisement. For example, when a viewer attempts to skip over a home furnishings commercial, an alternate restaurant commercial may be displayed instead. In particular embodiments, when the viewer attempts to skip over the restaurant commercial, an alternate beverage commercial may be displayed.


If a commercial break originally includes three commercials, the device may require that three commercials be viewed in their entirety before regular programming continues. Alternatively, if a commercial break is two minutes, the device may require viewing of two minutes of commercials total. This allows viewers to select which commercials they would like to watch and to skip over those that are not interesting. Viewers are provided with some choice in what they decide to view. Content providers can still generate revenue by providing high value, selected advertising.
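The viewing-quota policy described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the class and method names are assumptions introduced here:

```python
class CommercialBreakPolicy:
    """Illustrative sketch: track viewed commercial time against a
    per-break quota before regular programming may resume."""

    def __init__(self, required_seconds):
        self.required_seconds = required_seconds
        self.viewed_seconds = 0

    def record_viewing(self, seconds):
        """Credit fully viewed commercial time toward the break quota."""
        self.viewed_seconds += seconds

    def may_resume_program(self):
        """Regular programming resumes only once the quota is met."""
        return self.viewed_seconds >= self.required_seconds


# A two-minute break: the viewer skips one commercial but watches
# two alternates totalling 120 seconds.
policy = CommercialBreakPolicy(required_seconds=120)
policy.record_viewing(60)           # first alternate commercial
print(policy.may_resume_program())  # False: quota not yet met
policy.record_viewing(60)           # second alternate commercial
print(policy.may_resume_program())  # True
```

The same structure covers the three-commercial variant by counting commercials instead of seconds.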


In other embodiments, a request to skip forward over a commercial will play a shortened version of the commercial that still elicits similar neurological and neurophysiological responses from a viewer. In still other embodiments, a request to skip forward over a commercial will result in an advertisement being shown or an image being shown when regular programming resumes.


Alternate media may be provided in the same video stream in a hidden or embedded format. In other embodiments, alternate media may be provided on a different channel or at a different time to a device. Any device capable of receiving video and playing video can be used. Examples include personal video recorders, digital video recorders, computer systems, hardware and software players, televisions, etc. According to various embodiments, recorders and players need to be modified so that replacement commercials can be selected and displayed.


The techniques and mechanisms of the present invention also optionally provide a neuro-response analyzer to determine the effectiveness of alternate media. The system may also determine what type of alternate media to provide and how to provide the alternate media.



FIG. 1 illustrates one example of a system for providing alternate media to a video recorder. In particular embodiments, a video library 111 provides video to a video recorder receiver 123. An alternate media library 121 also provides media to a video recorder receiver 123. The alternate media library 121 may include images, commercials, advertisements, logos, etc. and may deliver media to the video recorder receiver 123 on the same channel the video library 111 uses or may use a different channel. In some instances, content providers send both video and alternate media to the video recorder receiver 123. According to various embodiments, alternate media is embedded in a video stream and shown when a skip forward command is issued. In other embodiments, alternate media is hidden in a video stream and shown when any fast forward type command is issued during a commercial break.


The alternate media library 121 and the video library 111 may send data to the video recorder receiver 123 at the same or different times. According to various embodiments, an alternate media library 121 is sending alternate media to a video recorder receiver 123 when a viewer is not using the video recorder. Although a video library 111 and an alternate media library 121 are shown, in some examples, video and alternate media may be provided from another source, such as a live feed or a satellite feed. According to various embodiments, the video and alternate media are stored at the video recorder using alternate media storage 125 and live program storage 131. In some examples, both video and alternate media are buffered on a disk or memory associated with a video recorder.


A processor 133 accesses the alternate media and the video as necessary to provide to a display 135. According to various embodiments, the processor 133 provides video to the display 135 until the processor 133 receives a skip request such as a skip forward or fast forward request from a viewer during a commercial break. In particular embodiments, the processor 133 provides alternate media such as an image or logo in place of the skipped commercial. The image may be displayed for a predetermined period of time. In other examples, the processor 133 provides an alternate commercial when a commercial is skipped.


The alternate commercial may relate to a different subject matter than the skipped commercial. In other embodiments, a skip request such as a 30 second skip forward or a fast forward request brings up a menu showing several different alternative advertisements that a viewer may select. It is recognized that viewers may be more willing to watch commercials if they have some choice in which commercials they watch. According to various embodiments, a processor 133 requires that three commercials be viewed in full during a commercial break.



FIG. 2 illustrates one example of a video and an alternate media stream. A video includes programming 201 and 221 interrupted with three default commercials 211, 213, and 215. If a viewer issues a command such as a skip forward command, a video recorder may enforce a policy that the viewer needs to view at least three alternate commercials or images from a pool of alternate media 231, 233, 235, 237, and 243. Alternate media may include substitute commercials, overlay images, graphics, logos, purchase offers, etc. Alternate media can also vary in length. In some examples, alternate media is merely an image displayed while material is being skipped. In particular embodiments, a video player only allows a certain number of skips before commercial viewing is required.
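Selecting substitutes from the alternate media pool while capping the number of allowed skips, as described above, might look like the following sketch; the identifiers mirror the reference numerals in FIG. 2 but are otherwise illustrative assumptions:

```python
import random


class AlternateMediaSelector:
    """Illustrative sketch: substitute commercials from a pool of
    alternate media and cap the number of skips allowed."""

    def __init__(self, pool, max_skips):
        self.pool = list(pool)
        self.max_skips = max_skips
        self.skips_used = 0

    def handle_skip(self, skipped_id):
        if self.skips_used >= self.max_skips:
            return None  # skip limit reached; commercial viewing required
        self.skips_used += 1
        # Do not substitute the item the viewer just skipped.
        candidates = [m for m in self.pool if m != skipped_id]
        return random.choice(candidates)


selector = AlternateMediaSelector(
    pool=["alt_231", "alt_233", "alt_235", "alt_237", "alt_243"],
    max_skips=2)
print(selector.handle_skip("alt_231"))  # some other alternate
print(selector.handle_skip("alt_233"))
print(selector.handle_skip("alt_235"))  # None: skip limit reached
```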


According to various embodiments, alternate media is introduced in place of regular video or is a partial screen or full screen overlay on video frames. The alternate media may be blended with video. In particular embodiments, the alternate media is a fully or partially opaque overlay on video frames. In some examples, making skip requests during a commercial break will introduce a banner type advertisement that is displayed during program 221. The screen for showing the video program may be reduced while the banner type advertisement is displayed. According to various embodiments, if a viewer skips commercials 211, 213, and 215, during a commercial break, company logos associated with the skipped commercials 211, 213, and 215 are displayed for a predetermined period of time during viewing of program 221. The logos may be obtained from alternate media 231, 233, 235, 237, and 243.



FIG. 3 illustrates one example of a sequence of video frames and alternate media frames. According to various embodiments, video frames 311, 313, 315, 317, 319, 321, 323, 325, 327, 329, 331, and 333 include information indicating whether the frame shows program content, commercial content, or other content. Alternate media includes a series of frames 341, 343, 345, 347, 351, and 353. In some examples, the alternate media may be a series of video frames used to replace frames shown during a commercial break. However, alternate media may also be images, logos, banners, etc.



FIG. 4 illustrates one example of a sequence of encoded video frames and alternate media frames. Many video encoding mechanisms include different types of frames. According to various embodiments, frames include intra-coded frames (I-frames), predicted frames (P-frames), and bi-predictive frames (B-frames). I-frames provide substantially all of the data needed to present a full picture. On the other hand, P-frames and B-frames provide information about differences between the predictive frame and an I-frame. Predictive frames such as B-frames and P-frames are smaller and more bandwidth efficient than I-frames.
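The I-frame/predictive-frame distinction above can be illustrated with a toy decoder in which an I-frame carries absolute pixel values and a P-frame carries only a delta against the previously decoded picture (B-frames, which reference frames in both directions, are omitted for brevity; all names and values are illustrative):

```python
def decode(frames):
    """Reconstruct pictures from a list of (type, data) frames.
    'I' frames carry absolute pixel values; 'P' frames carry
    differences against the previously decoded picture."""
    picture = None
    decoded = []
    for ftype, data in frames:
        if ftype == "I":
            picture = list(data)                              # full picture
        elif ftype == "P":
            picture = [a + d for a, d in zip(picture, data)]  # apply delta
        decoded.append(list(picture))
    return decoded


stream = [("I", [10, 20, 30]),   # full picture
          ("P", [1, 0, -2]),     # small delta: cheaper to transmit
          ("P", [0, 1, 1])]
print(decode(stream)[-1])        # [11, 21, 29]
```

The deltas are far smaller than the full picture, which is why predictive frames are more bandwidth efficient than I-frames.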


According to various embodiments, frame sequences 411, 413, 415, 417, 419, 421, 423, 425, 427, 429, 431, and 433 include I-frames 411, 419, 425, and 433. The frame sequence also includes predictive frames including P-frames 413, 417, 421, 423, and 427 as well as B-frames 415, 429, and 431. Alternate media includes I-frame 441 and P-frames 443 and 445.


A variety of survey based and neuro-response based mechanisms can be used to determine the effectiveness of alternate media. Using feedback from survey based and/or neuro-response based mechanisms allows adjustment of alternate media presentation. For example, survey based and/or neuro-response mechanisms may determine that alternate media is more effectively provided as logos played during regular programming. Survey based and/or neuro-response mechanisms may also be used to select what alternate media commercials would be most appropriate for particular programs.



FIG. 5 illustrates one example of a system for evaluating alternate media using central nervous system, autonomic nervous system, and/or effector measures. According to various embodiments, the alternate media system includes a stimulus presentation device 501. In particular embodiments, the stimulus presentation device 501 is merely a display, monitor, screen, etc., that displays stimulus material to a user. Continuous and discrete modes are supported. According to various embodiments, the stimulus presentation device 501 also has protocol generation capability to allow intelligent customization of stimuli provided to multiple subjects in different markets.


According to various embodiments, stimulus presentation device 501 could include devices such as televisions, cable consoles, computers and monitors, projection systems, display devices, speakers, tactile surfaces, etc., for presenting the video from different networks, local networks, cable channels, syndicated sources, websites, internet content aggregators, portals, service providers, etc.


According to various embodiments, the subjects 503 are connected to data collection devices 505. The data collection devices 505 may include a variety of neuro-response measurement mechanisms including neurological and neurophysiological measurement systems. According to various embodiments, neuro-response data includes central nervous system, autonomic nervous system, and effector data.


Some examples of central nervous system measurement mechanisms include Functional Magnetic Resonance Imaging (fMRI) and Electroencephalography (EEG). fMRI measures blood oxygenation in the brain that correlates with increased neural activity. However, current implementations of fMRI have poor temporal resolution of a few seconds. EEG measures electrical activity associated with post synaptic currents occurring in the millisecond range. Subcranial EEG can measure electrical activity with the most accuracy, as the bone and dermal layers weaken transmission of a wide range of frequencies. Nonetheless, surface EEG provides a wealth of electrophysiological information if analyzed properly.


Autonomic nervous system measurement mechanisms include Galvanic Skin Response (GSR), Electrocardiograms (EKG), pupillary dilation, etc. Effector measurement mechanisms include Electrooculography (EOG), eye tracking, facial emotion encoding, reaction time, etc.


According to various embodiments, the techniques and mechanisms of the present invention intelligently blend multiple modes and manifestations of precognitive neural signatures with cognitive neural signatures and post cognitive neurophysiological manifestations to more accurately allow assessment of alternate media. In some examples, autonomic nervous system measures are themselves used to validate central nervous system measures. Effector and behavior responses are blended and combined with other measures. According to various embodiments, central nervous system, autonomic nervous system, and effector system measurements are aggregated into a measurement that allows definitive evaluation of stimulus material.


In particular embodiments, the data collection devices 505 include EEG 511, EOG 513, and GSR 515. In some instances, only a single data collection device is used. Data collection may proceed with or without human supervision.


The data collection device 505 collects neuro-response data from multiple sources. This includes a combination of devices such as central nervous system sources (EEG), autonomic nervous system sources (GSR, EKG, pupillary dilation), and effector sources (EOG, eye tracking, facial emotion encoding, reaction time). In particular embodiments, data collected is digitally sampled and stored for later analysis. In particular embodiments, the data collected could be analyzed in real-time. According to particular embodiments, the digital sampling rates are adaptively chosen based on the neurophysiological and neurological data being measured.


In one particular embodiment, the alternate media system includes EEG 511 measurements made using scalp level electrodes, EOG 513 measurements made using shielded electrodes to track eye data, GSR 515 measurements performed using a differential measurement system, a facial muscular measurement through shielded electrodes placed at specific locations on the face, and a facial affect graphic and video analyzer adaptively derived for each individual.


In particular embodiments, the data collection devices are clock synchronized with a stimulus presentation device 501. In particular embodiments, the data collection devices 505 also include a condition evaluation subsystem that provides auto triggers, alerts and status monitoring and visualization components that continuously monitor the status of the subject, data being collected, and the data collection instruments. The condition evaluation subsystem may also present visual alerts and automatically trigger remedial actions. According to various embodiments, the data collection devices include mechanisms for not only monitoring subject neuro-response to stimulus materials, but also include mechanisms for identifying and monitoring the stimulus materials. For example, data collection devices 505 may be synchronized with a set-top box to monitor channel changes. In other examples, data collection devices 505 may be directionally synchronized to monitor when a subject is no longer paying attention to stimulus material. In still other examples, the data collection devices 505 may receive and store stimulus material generally being viewed by the subject, whether the stimulus is a program, a commercial, printed material, or a scene outside a window. The data collected allows analysis of neuro-response information and correlation of the information to actual stimulus material and not mere subject distractions.


According to various embodiments, the alternate media system also includes a data cleanser and analyzer device 521. In particular embodiments, the data cleanser and analyzer device 521 filters the collected data to remove noise, artifacts, and other irrelevant data using fixed and adaptive filtering, weighted averaging, advanced component extraction (like PCA, ICA), vector and component separation methods, etc. This device cleanses the data by removing both exogenous noise (where the source is outside the physiology of the subject, e.g. a phone ringing while a subject is viewing a video) and endogenous artifacts (where the source could be neurophysiological, e.g. muscle movements, eye blinks, etc.).


The artifact removal subsystem includes mechanisms to selectively isolate and review the response data and identify epochs with time domain and/or frequency domain attributes that correspond to artifacts such as line frequency, eye blinks, and muscle movements. The artifact removal subsystem then cleanses the artifacts by either omitting these epochs, or by replacing these epoch data with an estimate based on the other clean data (for example, an EEG nearest neighbor weighted averaging approach).
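The epoch-cleansing step above might be sketched as follows: epochs whose peak amplitude exceeds a threshold (e.g. an eye blink) are replaced by the average of their nearest clean neighbors, or omitted when no clean data is available. The threshold and single-channel layout are illustrative assumptions:

```python
def cleanse_epochs(epochs, threshold):
    """Replace artifact epochs with a nearest-neighbor estimate
    built from the surrounding clean epochs."""
    is_artifact = [max(abs(v) for v in ep) > threshold for ep in epochs]
    cleaned = []
    for i, ep in enumerate(epochs):
        if not is_artifact[i]:
            cleaned.append(list(ep))
            continue
        # Find the nearest clean neighbor on each side.
        left = next((epochs[j] for j in range(i - 1, -1, -1)
                     if not is_artifact[j]), None)
        right = next((epochs[j] for j in range(i + 1, len(epochs))
                      if not is_artifact[j]), None)
        neighbors = [n for n in (left, right) if n is not None]
        if neighbors:
            cleaned.append([sum(vals) / len(neighbors)
                            for vals in zip(*neighbors)])
        else:
            cleaned.append(None)  # omit: no clean data to estimate from
    return cleaned


epochs = [[1.0, 2.0], [90.0, -80.0], [3.0, 4.0]]  # middle epoch: blink artifact
print(cleanse_epochs(epochs, threshold=50.0)[1])  # [2.0, 3.0]
```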


According to various embodiments, the data cleanser and analyzer device 521 is implemented using hardware, firmware, and/or software.


The data analyzer portion uses a variety of mechanisms to analyze underlying data in the system to determine resonance. According to various embodiments, the data analyzer customizes and extracts the independent neurological and neuro-physiological parameters for each individual in each modality, and blends the estimates within a modality as well as across modalities to elicit an enhanced response to the presented stimulus material. In particular embodiments, the data analyzer aggregates the response measures across subjects in a dataset.


According to various embodiments, neurological and neuro-physiological signatures are measured using time domain analyses and frequency domain analyses. Such analyses use parameters that are common across individuals as well as parameters that are unique to each individual. The analyses could also include statistical parameter extraction and fuzzy logic based attribute estimation from both the time and frequency components of the synthesized response.


In some examples, statistical parameters used in a blended effectiveness estimate include evaluations of skew, peaks, first and second moments, distribution, as well as fuzzy estimates of attention, emotional engagement and memory retention responses.
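Several of the statistical parameters just named (first and second moments, skew, peak value) can be computed for a synthesized response signal as in the sketch below; the function name and sample values are illustrative:

```python
def response_statistics(signal):
    """Compute first/second moments, skew, and peak for a response signal."""
    n = len(signal)
    mean = sum(signal) / n                          # first moment
    var = sum((x - mean) ** 2 for x in signal) / n  # second central moment
    std = var ** 0.5
    skew = (sum((x - mean) ** 3 for x in signal) / n) / std ** 3
    return {"mean": mean, "variance": var, "skew": skew, "peak": max(signal)}


stats = response_statistics([0.1, 0.2, 0.2, 0.9, 0.3])
print(round(stats["mean"], 2))  # 0.34
print(stats["peak"])            # 0.9
```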


According to various embodiments, the data analyzer may include an intra-modality response synthesizer and a cross-modality response synthesizer. In particular embodiments, the intra-modality response synthesizer is configured to customize and extract the independent neurological and neurophysiological parameters for each individual in each modality and blend the estimates within a modality analytically to elicit an enhanced response to the presented stimuli. In particular embodiments, the intra-modality response synthesizer also aggregates data from different subjects in a dataset.


According to various embodiments, the cross-modality response synthesizer or fusion device blends different intra-modality responses, including raw signals and signals output. The combination of signals enhances the measures of effectiveness within a modality. The cross-modality response fusion device can also aggregate data from different subjects in a dataset.


According to various embodiments, the data analyzer also includes a composite enhanced effectiveness estimator (CEEE) that combines the enhanced responses and estimates from each modality to provide a blended estimate of the effectiveness. In particular embodiments, blended estimates are provided for each exposure of a subject to stimulus materials. The blended estimates are evaluated over time to assess resonance characteristics. According to various embodiments, numerical values are assigned to each blended estimate. The numerical values may correspond to the intensity of neuro-response measurements, the significance of peaks, the change between peaks, etc. Higher numerical values may correspond to higher significance in neuro-response intensity. Lower numerical values may correspond to lower significance or even insignificant neuro-response activity. In other examples, multiple values are assigned to each blended estimate. In still other examples, blended estimates of neuro-response significance are graphically represented to show changes after repeated exposure.
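One simple way to realize the blended estimate described above is a weighted combination of per-modality effectiveness estimates; the weights and scores below are illustrative assumptions, not values from the patent:

```python
def blended_estimate(modality_scores, weights):
    """Weighted blend of enhanced per-modality effectiveness estimates."""
    total_weight = sum(weights[m] for m in modality_scores)
    return sum(modality_scores[m] * weights[m]
               for m in modality_scores) / total_weight


scores = {"EEG": 0.8, "GSR": 0.6, "EOG": 0.4}   # enhanced per-modality estimates
weights = {"EEG": 0.5, "GSR": 0.3, "EOG": 0.2}  # assumed blend weights
print(round(blended_estimate(scores, weights), 2))  # 0.66
```

Tracking this single number across repeated exposures gives the kind of over-time resonance assessment the passage describes.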


According to various embodiments, a data analyzer passes data to a resonance estimator that assesses and extracts resonance patterns. In particular embodiments, the resonance estimator determines entity positions in various stimulus segments and matches position information with eye tracking paths while correlating saccades with neural assessments of attention, memory retention, and emotional engagement. In particular embodiments, the resonance estimator stores data in the priming repository system. As with a variety of the components in the system, various repositories can be co-located with the rest of the system and the user, or could be implemented in remote locations.



FIG. 6 illustrates an example of a technique for providing alternate media. At 601, video is received. According to various embodiments, video is received over a variety of different media in a real-time or a time-delayed manner. Video may be received over satellite, cable, wireless networks, wired networks, physical media, etc. At 603, alternate media is received. According to various embodiments, the alternate media is received on the same channel and medium as the video. However, it is recognized that alternate media may be received on different channels, different media, and/or different time slots. In some examples, alternate media is received when a video recorder or other video player is not being used.


At 605, alternate media is maintained. Alternate media may be maintained on persistent storage such as disks or temporary storage such as memory buffers. At 609, video is provided to a user in real-time or a time-delayed manner. According to various embodiments, video is displayed using a system that allows user requests, such as skip forward, fast forward, and rewind requests. At 611, a skip request is received during a commercial break. At 613, alternate media is provided in place of the skipped commercial. In particular embodiments, the alternate media is an overlay image or logo provided during the commercial break or during regular programming. In other embodiments, the alternate media is a substitute commercial. According to various embodiments, the system tracks the number of skip requests and may only allow a certain number of commercial skips.
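The steps of FIG. 6 can be sketched as a simple playback loop; the event names, store layout, and skip limit are illustrative assumptions:

```python
def playback(events, alternate_store, max_skips=3):
    """Yield what the display shows for each playback event:
    program and commercial content plays normally, while skip
    requests during a commercial substitute alternate media (613)
    until the skip limit is reached."""
    skips = 0
    for event in events:
        if event == "program":
            yield "program"
        elif event == "commercial":
            yield "commercial"
        elif event == "skip" and skips < max_skips and alternate_store:
            skips += 1
            yield alternate_store.pop(0)  # substitute alternate media
        elif event == "skip":
            yield "commercial"            # limit reached; original plays


shown = list(playback(["program", "commercial", "skip", "program"],
                      alternate_store=["alt_ad_1", "alt_ad_2"]))
print(shown)  # ['program', 'commercial', 'alt_ad_1', 'program']
```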



FIG. 7 illustrates one example of using neuro-response based feedback for evaluating and selecting alternate media. At 701, stimulus material is provided to multiple subjects. According to various embodiments, stimulus includes streaming video and audio. In particular embodiments, subjects view stimulus in their own homes in group or individual settings. In some examples, verbal and written responses are collected for use without neuro-response measurements. In other examples, verbal and written responses are correlated with neuro-response measurements. At 703, subject neuro-response measurements are collected using a variety of modalities, such as EEG, ERP, EOG, GSR, etc. At 705, data is passed through a data cleanser to remove noise and artifacts that may make data more difficult to interpret. According to various embodiments, the data cleanser removes EEG electrical activity associated with blinking and other endogenous/exogenous artifacts.


According to various embodiments, data analysis is performed. Data analysis may include intra-modality response synthesis and cross-modality response synthesis to enhance effectiveness measures. It should be noted that in some particular instances, one type of synthesis may be performed without performing other types of synthesis. For example, cross-modality response synthesis may be performed with or without intra-modality synthesis.


A variety of mechanisms can be used to perform data analysis. In particular embodiments, a stimulus attributes repository is accessed to obtain attributes and characteristics of the stimulus materials, along with purposes, intents, objectives, etc. In particular embodiments, EEG response data is synthesized to provide an enhanced assessment of effectiveness. According to various embodiments, EEG measures electrical activity resulting from thousands of simultaneous neural processes associated with different portions of the brain. EEG data can be classified in various bands. According to various embodiments, brainwave frequencies include delta, theta, alpha, beta, and gamma frequency ranges. Delta waves are classified as those less than 4 Hz and are prominent during deep sleep. Theta waves have frequencies between 3.5 and 7.5 Hz and are associated with memories, attention, emotions, and sensations. Theta waves are typically prominent during states of internal focus.


Alpha frequencies reside between 7.5 and 13 Hz and typically peak around 10 Hz. Alpha waves are prominent during states of relaxation. Beta waves have a frequency range between 14 and 30 Hz. Beta waves are prominent during states of motor control, long range synchronization between brain areas, analytical problem solving, judgment, and decision making. Gamma waves occur between 30 and 60 Hz and are involved in binding of different populations of neurons together into a network for the purpose of carrying out a certain cognitive or motor function, as well as in attention and memory. Because the skull and dermal layers attenuate waves in this frequency range, brain waves above 75-80 Hz are difficult to detect and are often not used for stimuli response assessment.
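The band boundaries given above can be expressed as a simple lookup. The delta/theta overlap around 3.5-4 Hz is resolved here in favor of theta, an illustrative choice, and frequencies above 60 Hz are labeled with the kappa band the following paragraph introduces:

```python
def classify_band(freq_hz):
    """Map an EEG frequency (Hz) to its conventional band name."""
    if freq_hz < 3.5:
        return "delta"   # deep sleep
    if freq_hz <= 7.5:
        return "theta"   # internal focus, memory, emotion
    if freq_hz <= 13:
        return "alpha"   # relaxation; typical peak near 10 Hz
    if freq_hz <= 30:
        return "beta"    # motor control, judgment, decision making
    if freq_hz <= 60:
        return "gamma"   # binding, attention, memory
    return "kappa"       # high gamma band


print(classify_band(10))  # 'alpha'
print(classify_band(45))  # 'gamma'
```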


However, the techniques and mechanisms of the present invention recognize that analyzing high gamma band (kappa-band: above 60 Hz) measurements, in addition to theta, alpha, beta, and low gamma band measurements, enhances neurological attention, emotional engagement and retention component estimates. In particular embodiments, EEG measurements including difficult to detect high gamma or kappa band measurements are obtained, enhanced, and evaluated. Subject and task specific signature sub-bands in the theta, alpha, beta, gamma and kappa bands are identified to provide enhanced response estimates. According to various embodiments, high gamma waves (kappa-band) above 80 Hz (typically detectable with subcranial EEG and/or magnetoencephalography) can be used in inverse model-based enhancement of the frequency responses to the stimuli.
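As a minimal illustration of the band definitions above (delta, theta, alpha, beta, gamma, and the kappa-band above 60 Hz), a frequency can be mapped to its named band. Where the stated ranges overlap or leave gaps (e.g. delta below 4 Hz versus theta from 3.5 Hz), the tie-breaking rule in this sketch is an assumption:

```python
# Sketch only: maps an EEG frequency (Hz) to the band names used above.
# Where the stated ranges overlap (delta < 4 Hz vs. theta from 3.5 Hz),
# the lower band wins; that tie-breaking rule is an assumption.
def classify_band(freq_hz):
    if freq_hz < 4:
        return "delta"
    if freq_hz <= 7.5:
        return "theta"
    if freq_hz <= 13:
        return "alpha"
    if freq_hz <= 30:
        return "beta"
    if freq_hz <= 60:
        return "gamma"
    return "kappa"  # high gamma band, above 60 Hz
```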


Various embodiments of the present invention recognize that particular sub-bands within each frequency range have particular prominence during certain activities. A subset of the frequencies in a particular band is referred to herein as a sub-band. For example, a sub-band may include the 40-45 Hz range within the gamma band. In particular embodiments, multiple sub-bands within the different bands are selected while remaining frequencies are band pass filtered. In particular embodiments, multiple sub-band responses may be enhanced, while the remaining frequency responses may be attenuated.
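One way to realize the sub-band selection described above is to retain only the chosen frequency bins and zero the rest, a crude stand-in for band-pass filtering. The FFT-based approach and the 40-45 Hz example values are illustrative assumptions, not the filtering method of the invention:

```python
import numpy as np

# Hypothetical sketch of sub-band selection: responses inside chosen
# sub-bands (e.g. 40-45 Hz within gamma) are kept while all other
# frequency content is zeroed, approximating band-pass filtering.
def select_subbands(signal, fs, subbands):
    """signal: 1-D array; fs: sampling rate in Hz;
    subbands: list of (low_hz, high_hz) tuples to retain."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    keep = np.zeros_like(freqs, dtype=bool)
    for lo, hi in subbands:
        keep |= (freqs >= lo) & (freqs <= hi)
    spectrum[~keep] = 0.0  # attenuate everything outside the sub-bands
    return np.fft.irfft(spectrum, n=len(signal))
```

For example, a mixture of 10 Hz and 42 Hz components passed through a 40-45 Hz sub-band keeps the 42 Hz response and suppresses the 10 Hz response.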


An information theory based band-weighting model is used for adaptive extraction of selective dataset-specific, subject-specific, and task-specific bands to enhance the effectiveness measure. Adaptive extraction may be performed using fuzzy scaling. Stimuli can be presented and enhanced measurements determined multiple times to determine the variation profiles across multiple presentations. Determining variation profiles provides an enhanced assessment of the primary responses as well as the longevity (wear-out) of the marketing and entertainment stimuli. The synchronous response of multiple individuals to stimuli presented in concert is measured to determine an enhanced across-subject synchrony measure of effectiveness. According to various embodiments, the synchronous response may be determined for multiple subjects residing in separate locations or for multiple subjects residing in the same location.
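The across-subject synchrony measure could, for instance, be approximated as the mean pairwise correlation of subjects' response time series. This simple estimator is an assumption for illustration, not the patented information-theory weighting model:

```python
import numpy as np

# Illustrative sketch: across-subject synchrony estimated as the mean
# pairwise Pearson correlation of the subjects' response time series.
def across_subject_synchrony(responses):
    """responses: 2-D array, one row of samples per subject."""
    n = len(responses)
    corr = np.corrcoef(responses)  # n x n correlation matrix
    pairs = [corr[i, j] for i in range(n) for j in range(i + 1, n)]
    return float(np.mean(pairs))
```

Subjects watching the same stimulus in concert would score near 1, while unrelated responses score near 0.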


Although a variety of synthesis mechanisms are described, it should be recognized that any number of mechanisms can be applied—in sequence or in parallel with or without interaction between the mechanisms.


Although intra-modality synthesis mechanisms provide enhanced significance data, additional cross-modality synthesis mechanisms can also be applied. A variety of mechanisms such as EEG, Eye Tracking, GSR, EOG, and facial emotion encoding are connected to a cross-modality synthesis mechanism. Other mechanisms as well as variations and enhancements on existing mechanisms may also be included. According to various embodiments, data from a specific modality can be enhanced using data from one or more other modalities. In particular embodiments, EEG typically makes frequency measurements in different bands like alpha, beta and gamma to provide estimates of significance. However, the techniques of the present invention recognize that significance measures can be enhanced further using information from other modalities.


For example, facial emotion encoding measures can be used to enhance the valence of the EEG emotional engagement measure. EOG and eye tracking saccadic measures of object entities can be used to enhance the EEG estimates of significance including but not limited to attention, emotional engagement, and memory retention. According to various embodiments, a cross-modality synthesis mechanism performs time and phase shifting of data to allow data from different modalities to align. In some examples, it is recognized that an EEG response will often occur hundreds of milliseconds before a facial emotion measurement changes. Correlations can be drawn and time and phase shifts made on an individual as well as a group basis. In other examples, saccadic eye movements may be determined as occurring before and after particular EEG responses. According to various embodiments, time corrected GSR measures are used to scale and enhance the EEG estimates of significance including attention, emotional engagement and memory retention measures.
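The time-shifting step can be illustrated with a cross-correlation lag estimate: find the sample offset at which one modality's signal (e.g. a facial emotion measure) best matches another (e.g. EEG), then shift accordingly. The estimator below is a generic sketch, not the specific alignment method of the invention:

```python
import numpy as np

# Hedged sketch of cross-modality time alignment: estimate the lag (in
# samples) at which `delayed` best matches `reference` by locating the
# peak of their full cross-correlation.
def estimate_lag(reference, delayed):
    """Returns the sample lag that best aligns `delayed` to `reference`."""
    xcorr = np.correlate(delayed, reference, mode="full")
    # index (len(reference) - 1) of the full output corresponds to lag 0
    return int(np.argmax(xcorr)) - (len(reference) - 1)
```

For an EEG response that leads a facial measure by hundreds of milliseconds, the estimated lag (converted via the sampling rate) gives the shift to apply before correlating the two modalities.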


Evidence of the occurrence or non-occurrence of specific time domain difference event-related potential components (like the DERP) in specific regions correlates with subject responsiveness to specific stimulus. According to various embodiments, ERP measures are enhanced using EEG time-frequency measures (ERPSP) in response to the presentation of the marketing and entertainment stimuli. Specific portions are extracted and isolated to identify ERP, DERP and ERPSP analyses to perform. In particular embodiments, an EEG frequency estimation of attention, emotion and memory retention (ERPSP) is used as a co-factor in enhancing the ERP, DERP and time-domain response analysis.


EOG measures saccades to determine the presence of attention to specific objects of stimulus. Eye tracking measures the subject's gaze path, location and dwell on specific objects of stimulus. According to various embodiments, EOG and eye tracking are enhanced by measuring the presence of lambda waves (a neurophysiological index of saccade effectiveness) in the ongoing EEG in the occipital and extrastriate regions, triggered by the slope of saccade-onset, to estimate the significance of the EOG and eye tracking measures. In particular embodiments, specific EEG signatures of activity such as slow potential shifts and measures of coherence in time-frequency responses at the Frontal Eye Field (FEF) regions that precede saccade-onset are measured to enhance the effectiveness of the saccadic activity data.
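As a rough sketch of the saccade events assumed as input above, an onset can be marked where the slope of the EOG eye-position signal first exceeds a threshold, since saccades are fast position changes. The threshold and units here are placeholder assumptions:

```python
# Rough illustration of saccade-onset detection from an EOG trace:
# mark the sample where the slope first exceeds a threshold.
# Threshold and units are placeholder assumptions.
def saccade_onsets(eog, slope_threshold):
    onsets = []
    previously_fast = False
    for i in range(1, len(eog)):
        fast = abs(eog[i] - eog[i - 1]) > slope_threshold
        if fast and not previously_fast:
            onsets.append(i - 1)  # sample where the fast movement begins
        previously_fast = fast
    return onsets
```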


GSR typically measures the change in general arousal in response to stimulus presented. According to various embodiments, GSR is enhanced by correlating EEG/ERP responses and the GSR measurement to get an enhanced estimate of subject engagement. The GSR latency baselines are used in constructing a time-corrected GSR response to the stimulus. The time-corrected GSR response is co-factored with the EEG measures to enhance GSR significance measures.
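A minimal sketch of the time-corrected GSR co-factoring, assuming a known latency in samples: shift the slow GSR trace earlier so it lines up with the EEG response, then use the correlation as a crude engagement co-factor. The latency value and hold-last-value padding are illustrative choices, not values from the text:

```python
import numpy as np

# Sketch of GSR time-correction: shift the (slow) GSR trace earlier by
# its latency baseline so it lines up with the EEG response. The
# latency and the hold-last-value padding are illustrative assumptions.
def time_corrected_gsr(gsr, latency_samples):
    corrected = np.empty_like(gsr)
    corrected[:len(gsr) - latency_samples] = gsr[latency_samples:]
    corrected[len(gsr) - latency_samples:] = gsr[-1]  # pad the tail
    return corrected

def engagement_cofactor(eeg_measure, gsr, latency_samples):
    corrected = time_corrected_gsr(gsr, latency_samples)
    return float(np.corrcoef(eeg_measure, corrected)[0, 1])
```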


According to various embodiments, facial emotion encoding uses templates generated by measuring facial muscle positions and movements of individuals expressing various emotions prior to the testing session. These individual specific facial emotion encoding templates are matched with the individual responses to identify subject emotional response. In particular embodiments, these facial emotion encoding measurements are enhanced by evaluating inter-hemispherical asymmetries in EEG responses in specific frequency bands and measuring frequency band interactions. The techniques of the present invention recognize that not only are particular frequency bands significant in EEG responses, but particular frequency bands used for communication between particular areas of the brain are significant. Consequently, these EEG responses enhance the EMG, graphic and video based facial emotion identification.
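The template-matching step might be sketched as nearest-neighbor matching of a response against per-subject emotion templates; the feature representation and the Euclidean distance metric here are assumptions for illustration:

```python
import math

# Hypothetical sketch of facial emotion template matching: per-emotion
# feature vectors (e.g. facial muscle positions) recorded before the
# session serve as templates; a response is labeled with the nearest
# template by Euclidean distance. Features and values are illustrative.
def match_emotion(templates, response):
    """templates: dict mapping emotion name -> feature vector."""
    return min(templates, key=lambda emo: math.dist(templates[emo], response))
```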


According to various embodiments, post-stimulus versus pre-stimulus differential measurements of ERP time domain components in multiple regions of the brain (DERP) are measured at 707. The differential measures give a mechanism for eliciting responses attributable to the stimulus. For example, the messaging response attributable to an advertisement or the brand response attributable to multiple brands is determined using pre-resonance and post-resonance estimates.
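In the spirit of the differential measurement at 707, a post- versus pre-stimulus difference of averaged epochs isolates the component attributable to the stimulus; the epoch extraction details are assumed for illustration:

```python
import numpy as np

# Sketch of a differential (DERP-style) measurement: average the
# post-stimulus epochs and subtract the averaged pre-stimulus baseline,
# leaving the component attributable to the stimulus.
def differential_erp(pre_epochs, post_epochs):
    """pre_epochs, post_epochs: 2-D arrays (trials x samples)."""
    return np.mean(post_epochs, axis=0) - np.mean(pre_epochs, axis=0)
```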


At 709, target versus distracter stimulus differential responses are determined for different regions of the brain (DERP). At 711, event-related time-frequency analyses of the differential responses (DERPSPs) are used to assess the attention, emotion and memory retention measures across multiple frequency bands. According to various embodiments, the multiple frequency bands include theta, alpha, beta, gamma and high gamma or kappa. At 713, priming levels and resonance for various products, services, and offerings are determined at different locations in the stimulus material. In some examples, priming levels and resonance are manually determined. In other examples, priming levels and resonance are automatically determined using neuro-response measurements. According to various embodiments, video streams are modified with different alternate media based on priming levels and resonance of the video.


At 717, multiple trials are performed to enhance priming and resonance measures. In some examples, multiple trials are performed to enhance resonance measures.


In particular embodiments, the priming and resonance measures are sent to a priming repository 719. The priming repository 719 may be used to automatically select alternate media.
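A hypothetical sketch of the priming repository 719: store priming and resonance scores per stimulus location and candidate media, and select the best-scoring alternate media automatically. The class and method names, and the additive scoring rule, are illustrative assumptions:

```python
# Hypothetical sketch of a priming repository: stores priming and
# resonance scores per (program location, candidate media) pair and
# selects the best-scoring alternate media automatically. Names and
# the additive scoring rule are illustrative assumptions.
class PrimingRepository:
    def __init__(self):
        self._scores = {}  # (location, media_id) -> (priming, resonance)

    def record(self, location, media_id, priming, resonance):
        self._scores[(location, media_id)] = (priming, resonance)

    def select_alternate_media(self, location):
        candidates = [
            (priming + resonance, media_id)
            for (loc, media_id), (priming, resonance) in self._scores.items()
            if loc == location
        ]
        return max(candidates)[1] if candidates else None
```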


According to various embodiments, various mechanisms such as the data collection mechanisms, the intra-modality synthesis mechanisms, cross-modality synthesis mechanisms, etc. are implemented on multiple devices. However, it is also possible that the various mechanisms be implemented in hardware, firmware, and/or software in a single system.



FIG. 8 provides one example of a system that can be used to implement one or more mechanisms. For example, the system shown in FIG. 8 may be used to implement an alternate media system.


According to particular example embodiments, a system 800 suitable for implementing particular embodiments of the present invention includes a processor 801, a memory 803, an interface 811, and a bus 815 (e.g., a PCI bus). When acting under the control of appropriate software or firmware, the processor 801 is responsible for tasks such as pattern generation. Various specially configured devices can also be used in place of a processor 801 or in addition to processor 801. The complete implementation can also be done in custom hardware. The interface 811 is typically configured to send and receive data packets or data segments over a network. Particular examples of interfaces the device supports include host bus adapter (HBA) interfaces, Ethernet interfaces, frame relay interfaces, cable interfaces, DSL interfaces, token ring interfaces, and the like.


According to particular example embodiments, the system 800 uses memory 803 to store data, algorithms and program instructions. The program instructions may control the operation of an operating system and/or one or more applications, for example. The memory or memories may also be configured to store received data and process received data.


Because such information and program instructions may be employed to implement the systems/methods described herein, the present invention relates to tangible, machine readable media that include program instructions, state information, etc. for performing various operations described herein. Examples of machine-readable media include, but are not limited to, magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROM disks and DVDs; magneto-optical media such as optical disks; and hardware devices that are specially configured to store and perform program instructions, such as read-only memory devices (ROM) and random access memory (RAM). Examples of program instructions include both machine code, such as produced by a compiler, and files containing higher level code that may be executed by the computer using an interpreter.


Although the foregoing invention has been described in some detail for purposes of clarity of understanding, it will be apparent that certain changes and modifications may be practiced within the scope of the appended claims. Therefore, the present embodiments are to be considered as illustrative and not restrictive and the invention is not to be limited to the details given herein, but may be modified within the scope and equivalents of the appended claims.

Claims
  • 1. A method comprising: detecting, by executing an instruction with a processor, a skip request, provided via a user input, to skip a first default commercial presented on a display device; presenting, by executing an instruction with the processor, a first alternate commercial on the display device in response to the skip request, the first alternate commercial being different than the first default commercial; identifying, by executing an instruction with the processor, a number of skip requests during a commercial break segment; and presenting, by executing an instruction with the processor, at least one of a second default commercial or a second alternate commercial without allowing additional skip requests when the identified number of skip requests exceeds a threshold number of skip requests allowable during the commercial break segment.
  • 2. The method of claim 1, further including: in response to the identified number of skip requests exceeding the threshold number of skip requests, presenting at least one of a third default commercial or a third alternate commercial on the display device until at least one of a threshold number of commercials or a threshold time period associated with the commercial break segment is met.
  • 3. The method of claim 1, wherein the presenting of the first alternate commercial includes presenting the first alternate commercial as a partially opaque overlay on video frames presented on the display device.
  • 4. The method of claim 1, wherein the presenting of the first alternate commercial includes presenting the first alternate commercial as a banner advertisement on the display device.
  • 5. The method of claim 1, further including: presenting on the display device a menu of a plurality of alternate commercials including the first alternate commercial; and receiving an input selecting the first alternate commercial.
  • 6. The method of claim 1, wherein the first alternate commercial is a shortened version of the first default commercial.
  • 7. The method of claim 1, wherein the first alternate commercial is a logo associated with the first default commercial.
  • 8. An apparatus comprising: a display device to receive a video stream including a program and at least one of a plurality of default commercials or a plurality of alternate commercials to be presented when the program is interrupted during a commercial break segment; and a processor to: present a first default commercial from the plurality of default commercials; detect a skip request to skip the first default commercial presented on the display device; present a first alternate commercial on the display device in response to the skip request, the first alternate commercial being different than the first default commercial; identify a number of skip requests during the commercial break segment; and present at least one of a second default commercial or a second alternate commercial without allowing additional skip requests when the identified number of skip requests exceeds a threshold number of skip requests allowable during the commercial break segment.
  • 9. The apparatus of claim 8, wherein the processor is to present at least one of a third default commercial or a third alternate commercial on the display device without allowing additional skip requests until at least one of a threshold number of commercials or a threshold time period associated with the commercial break segment is met.
  • 10. The apparatus of claim 8, wherein the processor is to present the first alternate commercial as a partially opaque overlay on video frames presented on the display device.
  • 11. The apparatus of claim 8, wherein the processor is to present the first alternate commercial as a banner advertisement on the display device.
  • 12. The apparatus of claim 8, wherein the processor is to select the first alternate commercial by: presenting a menu of a plurality of alternate commercials including the first alternate commercial; and receiving an input selecting the first alternate commercial.
  • 13. The apparatus of claim 8, wherein the first alternate commercial is a logo associated with the first default commercial.
  • 14. A machine readable memory or storage disc comprising instructions that, when executed, cause a machine to at least: detect a skip request to skip a first default commercial from a plurality of default commercials presented on a display device during a commercial break segment associated with programming to be presented on the display device; present a first alternate commercial from a plurality of alternate commercials on the display device in response to the skip request, the first alternate commercial being different than the first default commercial; identify a number of skip requests during the commercial break segment; and present at least one of a second default commercial or a second alternate commercial without allowing additional skip requests when the identified number of skip requests exceeds a threshold number of skip requests allowable during the commercial break segment.
  • 15. The machine readable memory or storage disc of claim 14, wherein the instructions, when executed, cause the machine to present at least one of a third default commercial or a third alternate commercial on the display device without allowing skip requests until at least one of a threshold number of commercials or a threshold time period associated with the commercial break segment is met.
  • 16. The machine readable memory or storage disc of claim 14, wherein the instructions, when executed, cause the machine to present the first alternate commercial as a partially opaque overlay on video frames presented on the display device.
  • 17. The machine readable memory or storage disc of claim 14, wherein the instructions, when executed, cause the machine to present the first alternate commercial as a banner advertisement on the display device.
  • 18. The machine readable memory or storage disc of claim 14, wherein the instructions, when executed, cause the machine to: present on the display device a menu of a plurality of alternate commercials including the first alternate commercial; and receive an input selecting the first alternate commercial.
  • 19. The machine readable memory or storage disc of claim 14, wherein the first alternate commercial is a logo associated with the first default commercial.
  • 20. The machine readable memory or storage disc of claim 14, wherein the first alternate commercial is a shortened version of the first default commercial.
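The skip-limiting behavior recited in the claims (substitute an alternate commercial on each skip request, then stop honoring skips once a per-break threshold is exceeded) can be sketched as follows; the identifiers and threshold value are illustrative assumptions, not part of the claimed subject matter:

```python
# Illustrative sketch of the claimed skip-limiting behavior: alternate
# commercials are substituted on skip requests until a per-break
# threshold is reached, after which further skips are ignored.
# Identifiers and the threshold value are assumptions.
class CommercialBreak:
    def __init__(self, default_ads, alternate_ads, max_skips):
        self.default_ads = list(default_ads)
        self.alternate_ads = list(alternate_ads)
        self.max_skips = max_skips
        self.skips = 0
        self.current = self.default_ads[0]  # break opens with a default ad

    def request_skip(self):
        if self.skips >= self.max_skips:
            return self.current  # skip ignored; current media keeps playing
        self.skips += 1
        self.current = self.alternate_ads[(self.skips - 1) % len(self.alternate_ads)]
        return self.current
```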
RELATED APPLICATION

This patent arises from a continuation of U.S. application Ser. No. 12/357,315, titled “Methods and Apparatus for Providing Alternate Media for Video Decoders,” filed Jan. 21, 2009, which is incorporated herein by this reference in its entirety.

US Referenced Citations (368)
Number Name Date Kind
2549836 McIntyre et al. Apr 1951 A
3490439 Rolston Jan 1970 A
3572322 Wade Mar 1971 A
3735753 Pisarski May 1973 A
3880144 Coursin et al. Apr 1975 A
3901215 John Aug 1975 A
3998213 Price Dec 1976 A
4075657 Weinblatt Feb 1978 A
4149716 Scudder Apr 1979 A
4201224 John May 1980 A
4279258 John Jul 1981 A
4411273 John Oct 1983 A
4417592 John Nov 1983 A
4537198 Corbett Aug 1985 A
4557270 John Dec 1985 A
4610259 Cohen et al. Sep 1986 A
4613951 Chu Sep 1986 A
4632122 Johansson et al. Dec 1986 A
4683892 Johansson et al. Aug 1987 A
4695879 Weinblatt Sep 1987 A
4736751 Gevins et al. Apr 1988 A
4800888 Itil et al. Jan 1989 A
4802484 Friedman et al. Feb 1989 A
4846190 John Jul 1989 A
4885687 Carey Dec 1989 A
4894777 Negishi et al. Jan 1990 A
4913160 John Apr 1990 A
4955388 Silberstein Sep 1990 A
4967038 Gevins et al. Oct 1990 A
4987903 Keppel et al. Jan 1991 A
5003986 Finitzo et al. Apr 1991 A
5038782 Gevins et al. Aug 1991 A
5052401 Sherwin Oct 1991 A
5083571 Prichep Jan 1992 A
RE34015 Duffy Aug 1992 E
5137027 Rosenfeld Aug 1992 A
5213338 Brotz May 1993 A
5226177 Nickerson Jul 1993 A
5243517 Schmidt et al. Sep 1993 A
5291888 Tucker Mar 1994 A
5293867 Oommen Mar 1994 A
5295491 Gevins Mar 1994 A
5339826 Schmidt et al. Aug 1994 A
5357957 Itil et al. Oct 1994 A
5363858 Farwell Nov 1994 A
5392788 Hudspeth Feb 1995 A
5406956 Farwell Apr 1995 A
5447166 Gevins Sep 1995 A
5474082 Junker Dec 1995 A
5479934 Imran Jan 1996 A
5518007 Becker May 1996 A
5537618 Boulton et al. Jul 1996 A
5617855 Waletzky et al. Apr 1997 A
5655534 Ilmoniemi Aug 1997 A
5676138 Zawilinski Oct 1997 A
5720619 Fisslinger Feb 1998 A
5724987 Gevins et al. Mar 1998 A
5729205 Kwon Mar 1998 A
5762611 Lewis et al. Jun 1998 A
5771897 Zufrin Jun 1998 A
5787187 Bouchard et al. Jul 1998 A
5800351 Mann Sep 1998 A
5812642 Leroy Sep 1998 A
5817029 Gevins et al. Oct 1998 A
5848399 Burke Dec 1998 A
5945863 Coy Aug 1999 A
5953083 Sharp Sep 1999 A
5961332 Joao Oct 1999 A
5983129 Cowan et al. Nov 1999 A
6001065 DeVito Dec 1999 A
6021346 Ryu et al. Feb 2000 A
6052619 John Apr 2000 A
6099319 Zaltman et al. Aug 2000 A
6120440 Goknar Sep 2000 A
6128521 Marro et al. Oct 2000 A
6154669 Hunter et al. Nov 2000 A
6161030 Levendowski et al. Dec 2000 A
6173260 Slaney Jan 2001 B1
6175753 Menkes et al. Jan 2001 B1
6191890 Baets et al. Feb 2001 B1
6228038 Claessens May 2001 B1
6236885 Hunter et al. May 2001 B1
6254536 DeVito Jul 2001 B1
6280198 Calhoun et al. Aug 2001 B1
6286005 Cannon Sep 2001 B1
6289234 Mueller Sep 2001 B1
6292688 Patton Sep 2001 B1
6301493 Marro et al. Oct 2001 B1
6315569 Zaltman Nov 2001 B1
6330470 Tucker et al. Dec 2001 B1
6334778 Brown Jan 2002 B1
6374143 Berrang et al. Apr 2002 B1
6381481 Levendowski et al. Apr 2002 B1
6398643 Knowles et al. Jun 2002 B1
6422999 Hill Jul 2002 B1
6434419 Gevins et al. Aug 2002 B1
6453194 Hill Sep 2002 B1
6487444 Mimura Nov 2002 B2
6488617 Katz Dec 2002 B1
6510340 Jordan Jan 2003 B1
6520905 Surve et al. Feb 2003 B1
6545685 Dorbie Apr 2003 B1
6575902 Burton Jun 2003 B1
6577329 Flickner et al. Jun 2003 B1
6585521 Obrador Jul 2003 B1
6594521 Tucker Jul 2003 B2
6598006 Honda et al. Jul 2003 B1
6654626 Devlin et al. Nov 2003 B2
6662052 Sarwal et al. Dec 2003 B1
6665560 Becker et al. Dec 2003 B2
6688890 von Buegner Feb 2004 B2
6708051 Durousseau Mar 2004 B1
6710831 Winker et al. Mar 2004 B1
6712468 Edwards Mar 2004 B1
6754524 Johnson, Jr. Jun 2004 B2
6757556 Gopinathan et al. Jun 2004 B2
6764182 Ito et al. Jul 2004 B2
6788882 Geer et al. Sep 2004 B1
6792304 Silberstein Sep 2004 B1
6842877 Robarts et al. Jan 2005 B2
6904408 McCarthy et al. Jun 2005 B1
6950698 Sarkela et al. Sep 2005 B2
6973342 Swanson Dec 2005 B1
6993380 Modarres Jan 2006 B1
7043746 Ma May 2006 B2
7120880 Dryer et al. Oct 2006 B1
7130673 Tolvanen-Laakso et al. Oct 2006 B2
7150715 Collura et al. Dec 2006 B2
7164967 Etienne-Cummings et al. Jan 2007 B2
7177675 Suffin et al. Feb 2007 B2
7286871 Cohen Oct 2007 B2
7340060 Tomkins et al. Mar 2008 B2
7391835 Gross et al. Jun 2008 B1
7496400 Hoskonen et al. Feb 2009 B2
7548774 Kurtz et al. Jun 2009 B2
7551952 Gevins et al. Jun 2009 B2
7623823 Zito et al. Nov 2009 B2
7636456 Collins et al. Dec 2009 B2
7689272 Farwell Mar 2010 B2
7697979 Martinerie et al. Apr 2010 B2
7698238 Barletta et al. Apr 2010 B2
7720351 Levitan May 2010 B2
7729755 Laken Jun 2010 B2
7809420 Hannula et al. Oct 2010 B2
7840248 Fuchs et al. Nov 2010 B2
7840250 Tucker Nov 2010 B2
7865394 Calloway et al. Jan 2011 B1
7892764 Xiong et al. Feb 2011 B2
7917366 Levanon et al. Mar 2011 B1
7941817 Khusheim May 2011 B2
7988557 Soderlund Aug 2011 B2
8014847 Shastri et al. Sep 2011 B2
8046787 Cerrato Oct 2011 B2
8069125 Jung et al. Nov 2011 B2
8082215 Jung et al. Dec 2011 B2
8086563 Jung et al. Dec 2011 B2
8103328 Turner et al. Jan 2012 B2
8209224 Pradeep et al. Jun 2012 B2
8270814 Pradeep et al. Sep 2012 B2
8392250 Pradeep et al. Mar 2013 B2
8392251 Pradeep et al. Mar 2013 B2
8396744 Pradeep et al. Mar 2013 B2
8464288 Pradeep et al. Jun 2013 B2
8655428 Pradeep et al. Feb 2014 B2
8955010 Pradeep et al. Feb 2015 B2
8977110 Pradeep et al. Mar 2015 B2
20010020236 Cannon Sep 2001 A1
20010056225 DeVito Dec 2001 A1
20020065826 Bell et al. May 2002 A1
20020072952 Hamzy et al. Jun 2002 A1
20020077534 DuRousseau Jun 2002 A1
20020155878 Lert, Jr. et al. Oct 2002 A1
20020156842 Signes et al. Oct 2002 A1
20020188216 Kayyali et al. Dec 2002 A1
20020188217 Farwell Dec 2002 A1
20020191950 Wang Dec 2002 A1
20020193670 Garfield et al. Dec 2002 A1
20030013981 Gevins et al. Jan 2003 A1
20030036955 Tanaka et al. Feb 2003 A1
20030059750 Bindler et al. Mar 2003 A1
20030073921 Sohmer et al. Apr 2003 A1
20030100998 Brunner et al. May 2003 A2
20030104865 Itkis et al. Jun 2003 A1
20030154128 Liga et al. Aug 2003 A1
20030165270 Endrikhovski et al. Sep 2003 A1
20030172376 Coffin, III Sep 2003 A1
20030192060 Levy Oct 2003 A1
20030233278 Marshall Dec 2003 A1
20030233656 Sie et al. Dec 2003 A1
20040005143 Tsuru et al. Jan 2004 A1
20040015608 Ellis et al. Jan 2004 A1
20040025174 Cerrato Feb 2004 A1
20040073129 Caldwell et al. Apr 2004 A1
20040092809 DeCharms May 2004 A1
20040098298 Yin May 2004 A1
20040103429 Carlucci et al. May 2004 A1
20040187167 Maguire et al. Sep 2004 A1
20040210159 Kibar Oct 2004 A1
20040220483 Yeo et al. Nov 2004 A1
20050010475 Perkowski et al. Jan 2005 A1
20050076359 Pierson Apr 2005 A1
20050079474 Lowe Apr 2005 A1
20050097594 O'Donnell et al. May 2005 A1
20050107716 Eaton et al. May 2005 A1
20050143629 Farwell Jun 2005 A1
20050154290 Langleben Jul 2005 A1
20050177058 Sobell Aug 2005 A1
20050197590 Osorio et al. Sep 2005 A1
20050223237 Barletta et al. Oct 2005 A1
20050227233 Buxton et al. Oct 2005 A1
20050240956 Smith et al. Oct 2005 A1
20050273017 Gordon Dec 2005 A1
20050288954 McCarthy et al. Dec 2005 A1
20050289582 Tavares et al. Dec 2005 A1
20060035707 Nguyen et al. Feb 2006 A1
20060093998 Vertegaal May 2006 A1
20060111644 Guttag et al. May 2006 A1
20060129458 Maggio Jun 2006 A1
20060167376 Viirre et al. Jul 2006 A1
20060168630 Davies Jul 2006 A1
20060256133 Rosenberg Nov 2006 A1
20060257834 Lee et al. Nov 2006 A1
20060259360 Flinn et al. Nov 2006 A1
20060293921 McCarthy et al. Dec 2006 A1
20070048707 Caamano et al. Mar 2007 A1
20070055169 Lee et al. Mar 2007 A1
20070066874 Cook Mar 2007 A1
20070066915 Frei et al. Mar 2007 A1
20070066916 Lemos Mar 2007 A1
20070067800 Wachtfogel Mar 2007 A1
20070078706 Datta et al. Apr 2007 A1
20070079331 Datta et al. Apr 2007 A1
20070106170 Dunseath, Jr. et al. May 2007 A1
20070135727 Virtanen et al. Jun 2007 A1
20070225585 Washbon et al. Sep 2007 A1
20070225674 Molnar et al. Sep 2007 A1
20070235716 Delic et al. Oct 2007 A1
20070238945 Delic et al. Oct 2007 A1
20070250846 Swix et al. Oct 2007 A1
20070265507 de Lemos Nov 2007 A1
20080001600 deCharms Jan 2008 A1
20080027345 Kumada et al. Jan 2008 A1
20080040740 Plotnick et al. Feb 2008 A1
20080059997 Plotnick et al. Mar 2008 A1
20080065468 Berg et al. Mar 2008 A1
20080081961 Westbrook et al. Apr 2008 A1
20080082019 Ludving et al. Apr 2008 A1
20080091512 Marci et al. Apr 2008 A1
20080097854 Young Apr 2008 A1
20080109840 Walter et al. May 2008 A1
20080125110 Ritter May 2008 A1
20080147488 Tunick et al. Jun 2008 A1
20080152300 Knee et al. Jun 2008 A1
20080155585 Craner Jun 2008 A1
20080208072 Fadem et al. Aug 2008 A1
20080214902 Lee et al. Sep 2008 A1
20080221400 Lee et al. Sep 2008 A1
20080221472 Lee et al. Sep 2008 A1
20080221969 Lee et al. Sep 2008 A1
20080222670 Lee et al. Sep 2008 A1
20080222671 Lee et al. Sep 2008 A1
20080228077 Wilk et al. Sep 2008 A1
20080255949 Genco et al. Oct 2008 A1
20080295126 Lee et al. Nov 2008 A1
20090024049 Pradeep et al. Jan 2009 A1
20090024447 Pradeep et al. Jan 2009 A1
20090024448 Pradeep et al. Jan 2009 A1
20090024449 Pradeep et al. Jan 2009 A1
20090024475 Pradeep et al. Jan 2009 A1
20090025023 Pradeep et al. Jan 2009 A1
20090025024 Beser et al. Jan 2009 A1
20090030287 Pradeep et al. Jan 2009 A1
20090030303 Pradeep et al. Jan 2009 A1
20090030717 Pradeep et al. Jan 2009 A1
20090030762 Lee et al. Jan 2009 A1
20090030930 Pradeep et al. Jan 2009 A1
20090036755 Pradeep et al. Feb 2009 A1
20090036756 Pradeep et al. Feb 2009 A1
20090060240 Coughlan et al. Mar 2009 A1
20090062629 Pradeep et al. Mar 2009 A1
20090062679 Tan et al. Mar 2009 A1
20090062680 Sandford Mar 2009 A1
20090062681 Pradeep et al. Mar 2009 A1
20090063255 Pradeep et al. Mar 2009 A1
20090063256 Pradeep et al. Mar 2009 A1
20090069652 Lee et al. Mar 2009 A1
20090070798 Lee et al. Mar 2009 A1
20090082643 Pradeep et al. Mar 2009 A1
20090083129 Pradeep et al. Mar 2009 A1
20090088610 Lee et al. Apr 2009 A1
20090094286 Lee et al. Apr 2009 A1
20090094627 Lee et al. Apr 2009 A1
20090094628 Lee et al. Apr 2009 A1
20090094629 Lee et al. Apr 2009 A1
20090097689 Prest et al. Apr 2009 A1
20090112077 Nguyen et al. Apr 2009 A1
20090131764 Lee et al. May 2009 A1
20090133047 Lee et al. May 2009 A1
20090150919 Lee et al. Jun 2009 A1
20090158308 Weitzenfeld et al. Jun 2009 A1
20090163777 Jung et al. Jun 2009 A1
20090195392 Zalewski Aug 2009 A1
20090214060 Chuang et al. Aug 2009 A1
20090248496 Hueter et al. Oct 2009 A1
20090253996 Lee et al. Oct 2009 A1
20090259137 Delic et al. Oct 2009 A1
20090280215 Yotsumoto Nov 2009 A1
20090318773 Jung et al. Dec 2009 A1
20090318826 Green et al. Dec 2009 A1
20090327068 Pradeep et al. Dec 2009 A1
20090328089 Pradeep et al. Dec 2009 A1
20100004768 Dunning et al. Jan 2010 A1
20100004977 Marci et al. Jan 2010 A1
20100022821 Dubi et al. Jan 2010 A1
20100041962 Causevic et al. Feb 2010 A1
20100060300 Müller et al. Mar 2010 A1
20100125219 Harris et al. May 2010 A1
20100145176 Himes Jun 2010 A1
20100145215 Pradeep et al. Jun 2010 A1
20100145217 Otto et al. Jun 2010 A1
20100166389 Knee et al. Jul 2010 A1
20100175079 Braun et al. Jul 2010 A1
20100183279 Pradeep et al. Jul 2010 A1
20100186031 Pradeep et al. Jul 2010 A1
20100198042 Popescu et al. Aug 2010 A1
20100214318 Pradeep et al. Aug 2010 A1
20100215289 Pradeep et al. Aug 2010 A1
20100218208 Holden Aug 2010 A1
20100249538 Pradeep et al. Sep 2010 A1
20100249636 Pradeep et al. Sep 2010 A1
20100250325 Pradeep et al. Sep 2010 A1
20100251288 Carlucci et al. Sep 2010 A1
20100257052 Zito et al. Oct 2010 A1
20100274152 McPeck et al. Oct 2010 A1
20100325660 Holden Dec 2010 A1
20110004089 Chou Jan 2011 A1
20110046473 Pradeep et al. Feb 2011 A1
20110046502 Pradeep et al. Feb 2011 A1
20110046503 Pradeep et al. Feb 2011 A1
20110046504 Pradeep et al. Feb 2011 A1
20110047121 Pradeep et al. Feb 2011 A1
20110059422 Masaoka Mar 2011 A1
20110105937 Pradeep et al. May 2011 A1
20110106621 Pradeep et al. May 2011 A1
20110106750 Pradeep et al. May 2011 A1
20110119124 Pradeep et al. May 2011 A1
20110119129 Pradeep et al. May 2011 A1
20110237971 Pradeep et al. Sep 2011 A1
20110248729 Mueller et al. Oct 2011 A2
20110270620 Pradeep et al. Nov 2011 A1
20110276504 Pradeep et al. Nov 2011 A1
20110282231 Pradeep et al. Nov 2011 A1
20110282232 Pradeep et al. Nov 2011 A1
20110282749 Pradeep et al. Nov 2011 A1
20110319975 Ho et al. Dec 2011 A1
20120036004 Pradeep et al. Feb 2012 A1
20120036005 Pradeep et al. Feb 2012 A1
20120054018 Pradeep et al. Mar 2012 A1
20120072289 Pradeep et al. Mar 2012 A1
20120083668 Pradeep et al. Apr 2012 A1
20120084139 Pradeep et al. Apr 2012 A1
20120108995 Pradeep et al. May 2012 A1
20120114305 Holden May 2012 A1
20120130800 Pradeep et al. May 2012 A1
20120284112 Pradeep et al. Nov 2012 A1
20120284332 Pradeep et al. Nov 2012 A1
20120290409 Pradeep et al. Nov 2012 A1
20120301120 Pradeep et al. Nov 2012 A1
Foreign Referenced Citations (14)
Number Date Country
1374658 Nov 1974 GB
2221759 Feb 1990 GB
9717774 May 1997 WO
9740745 Nov 1997 WO
9741673 Nov 1997 WO
2004049225 Jun 2004 WO
2008-077178 Jul 2008 WO
2008-109694 Sep 2008 WO
2008-109699 Sep 2008 WO
2008121651 Oct 2008 WO
2008137579 Nov 2008 WO
2008154410 Dec 2008 WO
2009018374 Feb 2009 WO
2009052833 Apr 2009 WO
Non-Patent Literature Citations (226)
Entry
Aaker et al., “Warmth in Advertising: Measurement, Impact, and Sequence Effects,” Journal of Consumer Research, vol. 12, No. 4, pp. 365-381, (Mar. 1986), 18 pages.
Allen et al., “A Method of Removing Imaging Artifact from Continuous EEG Recorded during Functional MRI,” Neuroimage, vol. 12, 230-239, (Aug. 2000), 12 pages.
Ambler et al., “Ads on the Brain; A Neuro-Imaging Comparison of Cognitive and Affective Advertising Stimuli,” London Business School, Centre for Marketing Working Paper, No. 00-902, (Mar. 2000), 23 pages.
Ambler, “Salience and Choice: Neural Correlates of Shopping Decisions,” Psychology & Marketing, vol. 21, No. 4, pp. 247-261, Wiley Periodicals, Inc., doi: 10.1002/mar.20004, (Apr. 2004), 16 pages.
Author Unknown, “Arousal in Sport, in Encyclopedia of Applied Psychology,” vol. 1, p. 159, retrieved from Google Books, (Spielberger, ed., Elsevier Academic Press, 2004), 1 page.
Bagozzi et al., “The Role of Emotions in Marketing,” Journal of the Academy of Marketing Science, vol. 27, No. 2, pp. 184-206, Academy of Marketing Science (1999), 23 pages.
Barcelo et al., “Prefrontal Modulation of Visual Processing in Humans,” Nature Neuroscience, vol. 3, No. 4, Apr. 2000, pp. 399-403, 5 pages.
Barreto et al., “Physiologic Instrumentation for Real-time Monitoring of Affective State of Computer Users,” WSEAS International Conference on Instrumentation, Measurement, Control, Circuits and Systems (IMCCAS), (2004), 6 pages.
Belch et al., “Psychophysiological and Cognitive Response to Sex in Advertising,” Advances in Consumer Research, vol. 9, pp. 424-427, (1982), 6 pages.
Bimler et al., “Categorical perception of facial expressions of emotion: Evidence from multidimensional scaling,” Cognition and Emotion, vol. 15(5), pp. 633-658 (Sep. 2001), 26 pages.
Blakeslee, “If You Have a ‘Buy Button’ in Your Brain, What Pushes It?” The New York Times, www.nytimes.com, (Oct. 19, 2004), 3 pages.
Braeutigam, “Neuroeconomics—From neural systems to economic behaviour,” Brain Research Bulletin, vol. 67, pp. 355-360, Elsevier, (2005), 6 pages.
Buschman, et al., “Top-Down versus Bottom-Up Control of Attention in the Prefrontal and posterior Parietal Cortices,” Science, vol. 315, American Association for the Advancement of Science, (2007), 4 pages.
Canolty, R.T., et al., “High Gamma Power Is Phase-Locked to Theta Oscillations in Human Neocortex,” Science, vol. 313, Sep. 15, 2006, pp. 1626-1628, 3 pages.
Cheng et al. “Gender Differences in the Mu Rhythm of the Human Mirror-Neuron System,” Plos ONE, vol. 3, Issue 5, www.plosone.org, (May 2008), 7 pages.
Clifford, “Billboards That Look Back,” The New York Times, NYTimes.com, available at http://www.nytimes.com/2008/05/31/business/media/31billboard.html, (May 31, 2008), 4 pages.
Crawford et al., “Self-generated happy and sad emotions in low and highly hypnotizable persons during waking and hypnosis: laterality and regional EEG activity differences,” International Journal of Psychophysiology, vol. 24, pp. 239-266, (Dec. 1996), 28 pages.
Davidson, et al., “The functional neuroanatomy of emotion and affective style,” Trends in Cognitive Sciences, vol. 3, No. 1, (Jan. 1999), 11 pages.
de Gelder et al., “Categorical Perception of Facial Expressions: Categories and their Internal Structure,” Cognition and Emotion, vol. 11(1), pp. 1-23 (1997), 23 pages.
Desmet, “Measuring Emotion: Development and Application of an Instrument to Measure Emotional Responses to Products,” to be published in Funology: From Usability to Enjoyment, pp. 111-123, Kluwer Academic Publishers, (Blythe et al., eds., 2004), 13 pages.
D'Esposito, “From cognitive to neural models of working memory,” Phil. Trans. R. Soc. B, (Mar. 30, 2007), 12 pages.
Dien et al., “Application of Repeated Measures ANOVA to High-Density ERP Dataset: A Review and Tutorial,” in Event-Related Potentials: A Methods Handbook pp. 57-82, (Todd C. Handy, ed., 2005), 14 pages.
Edgar, et al., “Digital Filters in ERP Research,” in Event-Related Potentials: A Methods Handbook pp. 85-113, (Todd C. Handy, ed., 2005), 15 pages.
EEG Protocols, “Protocols for EEG Recording,” retrieved from the Internet on Aug. 23, 2011, (Nov. 13, 2007), 3 pages.
Engel, Andreas, et al., “Dynamic Predictions: Oscillations and Synchrony in Top-Down Processing,” Macmillan Magazines Ltd, vol. 2, Oct. 2001, pp. 704-716, 13 pages.
Filler, “MR Neurography and Diffusion Tensor Imaging: Origins, History & Clinical Impact of the first 50,000 Cases With an Assessment of Efficacy and Utility in a Prospective 5,000 Patent Study Group,” Institute for Nerve Medicine, (Nov. 7, 2008), 56 pages.
Friedman, et al., “Event-Related Potential (ERP) Studies of Memory Encoding and Retrieval: A Selective Review,” Microscopy Research and Technique 51:6-28, Wiley-Liss, Inc. (2000), 23 pages.
Fries, Pascal, “A Mechanism for Cognitive Dynamics: Neuronal Communication Through Neuronal Coherence,” Trends in Cognitive Sciences, vol. 9, No. 10, Oct. 2005, p. 474-480, 7 pages.
Gaillard, “Problems and Paradigms in ERP Research,” Biological Psychology, Elsevier Science Publisher B.V. (1988), 10 pages.
Gargiulo et al., “A Mobile EEG System With Dry Electrodes,” (Nov. 2008), 4 pages.
Gazzaley, Adam, et al., “Top-down Enhancement and Suppression of the Magnitude and Speed of Neural Activity,” Journal of Cognitive Neuroscience, vol. 17, No. 3, pp. 507-517, 11 pages.
Griss et al., “Characterization of micromachined spiked biopotential electrodes,” Biomedical Engineering, IEEE Transactions (Jun. 2002), 8 pages.
Haq, “This Is Your Brain on Advertising,” Business Week, Market Research, (Oct. 8, 2007), 4 pages.
Hartikainen, Kaisa, et al., “Emotionally Arousing Stimuli Compete with Attention to Left Hemispace,” Editorial Manager(tm) for NeuroReport, Manuscript Draft, Manuscript No. NR-D-07-5935R1, submitted Sep. 8, 2007, 26 pages.
Hazlett, et al., “Emotional Response to Television Commercials: Facial EMG vs. Self-Report,” Journal of Advertising Research, (Apr. 1999), 17 pages.
Herrmann, et al., “Mechanisms of human attention: event-related potentials and oscillations,” Neuroscience and Biobehavioral Reviews, pp. 465-476, Elsevier Science Ltd., www.elsevier.com/locate/neubiorev, (2001), 12 pages.
Hopf, et al., “Neural Sources of Focused Attention in Visual Search,” Cerebral Cortex, 10:1233-1241, Oxford University Press, (Dec. 2000), 9 pages.
Jung et al., “Analysis and Visualization of Single-Trial Event-Related Potentials,” Human Brain Mapping vol. 14, 166-185 (2001), 20 pages.
Kay et al., “Identifying natural images from human brain activity,” Nature, vol. 452, pp. 352-356, Nature Publishing Group, (Mar. 20, 2008), 5 pages.
Kishiyama et al., “Socioeconomic Disparities Affect Prefrontal Function in Children,” Journal of Cognitive Neuroscience pp. 1106-1115, Massachusetts Institute of Technology, (2008), 10 pages.
Klimesch, “EEG alpha and theta oscillations reflect cognitive and memory performance: a review and analysis,” Brain Research Reviews, vol. 29, 169-195, (1999), 27 pages.
Knight, Robert T., “Decreased Response to Novel Stimuli After Prefrontal Lesions in Man,” Electroencephalography and Clinical Neurophysiology, vol. 59, 1984, pp. 9-20, 12 pages.
Knight, “Consciousness Unchained: Ethical Issues and the Vegetative and minimally Conscious State,” The American Journal of Bioethics (Sep. 1, 2008), 3 pages.
Knight, et al., “Prefrontal cortex regulates inhibition and excitation in distributed neural networks,” Acta Psychologica vol. 101, pp. 159-178, Elsevier (1999), 20 pages.
Knight, Robert T., “Contribution of Human Hippocampal Region to Novelty Detection,” Nature, vol. 383, Sep. 19, 1996, p. 256-259, 4 pages.
Krakow et al., “Methodology: EEG-correlated fMRI,” Functional Imaging in the Epilepsies, (Lippincott Williams & Wilkins, 2000), 17 pages.
Krugman, “Brain Wave Measures of Media Involvement,” Journal of Advertising Research vol. 11, 3-9 (Feb. 1971), 7 pages.
Lee et al., “What is ‘neuromarketing’? A discussion and agenda for future research,” International Journal of Psychophysiology, vol. 63, pp. 199-204, Elsevier (2006), 6 pages.
Lekakos, George, “Personalized Advertising Services Through Hybrid Recommendation Methods: The Case of Digital Interactive Television,” Department of Informatics, Cyprus University, 2004, 11 pages.
Lewis et al., “Market Researchers make Increasing use of Brain Imaging,” ACNR, vol. 5, No. 3, pp. 36-37, (Jul./Aug. 2005), 2 pages.
Luck, et al., “The speed of visual attention in schizophrenia: Electrophysiological and behavioral evidence,” Schizophrenia Research, pp. 174-195, Elsevier B.V. www.sciencedirect.com, (2006), 22 pages.
Lui et al., “Marketing Strategies in Virtual Worlds,” The Data Base for Advances in Information Systems, vol. 38, No. 4, pp. 77-80, (Nov. 2007), 4 pages.
Makeig, et al., “Dynamic Brain Sources of Visual Evoked Responses,” Science, vol. 295, (Jan. 25, 2002), 5 pages.
Makeig, et al., “Mining event-related brain dynamics,” Trends in Cognitive Sciences, vol. 8, No. 5, (May 2004), www.sciencedirect.com, 7 pages.
Merriam-Webster Online Dictionary, Definition of Virtual Reality, retrieved from [http://www.merriamwebster.com/dictionary/virtual%20reality] on Feb. 25, 2012, 2 pages.
Merriam-Webster Online Dictionary definition for “tangible,” retrieved from [http://www.merriamwebster.com/dictionary/tangible] on Jan. 17, 2012, 1 page.
Miltner, Wolfgang H.R., et al., “Coherence of Gamma-band EEG Activity as a Basis for Associative Learning,” Nature, vol. 397, Feb. 4, 1999, pp. 434-436, 3 pages.
Neurofocus, “Neuroscientific Analysis for Audience Engagement”, retrieved from [http://web.archive.org/web/20080621114525/www.neurofocus.com/Brandimage.htm] on Jan. 8, 2010, (2008), 2 pages.
Newell et al., “Categorical perception of familiar objects,” Cognition, vol. 85, Issue 2, pp. 113-143 (Sep. 2002), 31 pages.
Nielsen, “Neuroinformatics in Functional Neuroimaging,” Informatics and Mathematical Modelling, Technical University of Denmark, (Aug. 30, 2002), 241 pages.
Oberman et al., “EEG evidence for mirror neuron activity during the observation of human and robot actions: Toward an analysis of the human qualities of interactive robots,” Neurocomputing 70 (2007) 2194-2203, 10 pages.
Osborne, “Embedded Watermarking for image Verification in Telemedicine,” Thesis submitted for the degree of Doctor of Philosophy, Electrical and Electronic Engineering, University of Adelaide (2005), 219 pages.
Padgett et al., “Categorical Perception in Facial Emotion Classification,” In Proceedings of the 18th Annual Conference of the Cognitive Science Society, pp. 249-253 (1996), 5 pages.
Page et al., “Cognitive Neuroscience, Marketing and Research,” Congress 2006—Foresight—The Predictive Power of Research Conference Papers, ESOMAR Publications, (Sep. 17, 2006), 25 pages.
Paller, et al., “Validating neural correlates of familiarity,” Trends in Cognitive Sciences, vol. 11, No. 6, www.sciencedirect.com, (May 2, 2007), 8 pages.
Picton, et al., “Guidelines for using human event-related potentials to study cognition: Recording standards and publication criteria,” Psychophysiology, pp. 127-152, Society for Psychophysiological Research, (2000), 26 pages.
Rizzolatti et al., “The Mirror-Neuron System,” Annu. Rev. Neurosci., vol. 27, pp. 169-192, (Mar. 5, 2004), 30 pages.
Ruchkin et al., “Modality-specific processing streams in verbal working memory: evidence from spatio-temporal patterns of brain activity,” Cognitive Brain Research, vol. 6, pp. 95-113, Elsevier, (1997), 19 pages.
Rugg, et al., “Event-related potentials and recognition memory,” Trends in Cognitive Sciences, vol. 11, No. 6 (May 3, 2007), 7 pages.
Rugg, et al., “The ERP and cognitive psychology: conceptual issues,” (Sep. 1996), 7 pages.
Sapien Systems, “User monitoring,” http://web.archive.org/web/20030818043339/http:/www.sapiensystems.com/eyetracking.html, (Aug. 18, 2003), 1 page.
Simon-Thomas, et al, “Behavioral and Electrophysiological Evidence of a Right Hemisphere Bias for the Influence of Negative Emotion on Higher Cognition,” Journal of Cognitive Neuroscience, pp. 518-529, Massachusetts Institute of Technology (2005), 12 pages.
Spencer, “Averaging, Detection, and Classification of Single-Trial ERPs,” in Event-Related Potentials: A Methods Handbook, pp. 209-227, (Todd C. Handy, ed., 2005), 10 pages.
Srinivasan, “High-Resolution EEG: Theory and Practice,” in Event-Related Potentials: A Methods Handbook, pp. 167-188, (Todd C. Handy, ed., 2005), 12 pages.
Sullivan et al., “A brain-machine interface using dry-contact, low-noise EEG sensors,” In Proceedings of the 2008 IEEE International Symposium on Circuits and Systems, (May 18, 2008), 4 pages.
Sutherland, “Neuromarketing: What's it all about?” Retrieved from [http//:sutherlandsurvey.com] on Aug. 23, 2011, (Mar. 2007), 5 pages.
Swick, et al., “Contributions of Prefrontal Cortex to Recognition Memory: Electrophysiological and Behavioral Evidence,” Neuropsychology, vol. 13, No. 2, pp. 155-170, American Psychological Association, Inc. (1999), 16 pages.
Taheri, et al., “A dry electrode for EEG recording,” Electroencephalography and clinical Neurophysiology, pp. 376-383, Elsevier Science Ireland Ltd. (1994), 8 pages.
Talsma, et al., “Methods for the Estimation and Removal of Artifacts and Overlap in ERP Waveforms,” in Event-Related Potentials: A Methods Handbook, pp. 115-148, (Todd C. Handy, ed., 2005), 22 pages.
The Mathworks, Inc., “MATLAB The Language of Technical Computing: Data Analysis: Version 7,” p. 4-19 (2005), 3 pages.
Vogel, et al., “Electrophysiological Evidence for a Postperceptual Locus of Suppression During the Attentional Blink,” Journal of Experimental Psychology: Human Perception and Performance, vol. 24, No. 6, pp. 1656-1674, (1998), 19 pages.
Woldorff, “Distortion of ERP averages due to overlap from temporally adjacent ERPs: Analysis and correction,” Psychophysiology, Society for Psychophysiological Research, Cambridge University Press (1993), 22 pages.
Woodman, et al., “Serial Deployment of Attention During Visual Search,” Journal of Experimental Psychology: Human Perception and Performance, vol. 29, No. 1, pp. 121-138, American Psychological Association (2003), 18 pages.
Yamaguchi, et al., “Rapid-Prefrontal-Hippocampal Habituation to Novel Events,” The Journal of Neuroscience, pp. 5356-5363, Society for Neuroscience, (Apr. 29, 2004), 8 pages.
Yuval-Greenberg, et al., “Transient Induced Gamma-Bands Response in EEG as a Manifestation of Miniature Saccades,” Neuron, vol. 58, pp. 429-441, Elsevier Inc. (May 8, 2008), 13 pages.
Ziegenfuss, “Neuromarketing: Advertising Ethical & Medical Technology,” The Brownstone Journal, vol. XII, Boston University, pp. 69-73, (May 2005), 9 pages.
Zyga, “A Baseball Cap That Reads Your Mind,” retrieved from [http://www.physorg.com/News130152277.html] on Oct. 21, 2010 (May 16, 2008), 11 pages.
English Translation of Office Action, issued by the Israel Patent Office in connection with Patent Application No. 203176, on Feb. 21, 2012, 2 pages.
English Translation of Office Action, issued by the Israel Patent Office in connection with Patent Application No. 203177, on Mar. 1, 2012, 2 pages.
Supplementary European Search Report, issued by the European Patent Office in connection with European Application No. 08744383.4, on Jul. 27, 2011, 6 pages.
Extended European Search Report, issued by the European Patent Office in connection with European Application No. 10173095.0, on Dec. 17, 2010, 3 pages.
Extended European Search Report, issued by the European Patent Office in connection with European Application No. 10189294.1, on Mar. 21, 2011, 7 pages.
Extended European Search Report, issued by the European Patent Office in connection with European Application No. 11006934.1, on Oct. 25, 2011, 5 pages.
First Office Action, issued by the State Intellectual Property Office of P.R. China in connection with Patent Application No. 200880019166.0, on Jul. 22, 2011, 16 pages.
Second Office Action, issued by the State Intellectual Property Office of P.R. China in connection with Patent Application No. 200880019166.0, on Jun. 5, 2012, 22 pages.
Second Office Action, issued by the State Intellectual Property Office of P.R. China in connection with Patent Application No. 200880101500.7, on May 25, 2011, 7 pages.
Second Office Action, issued by the State Intellectual Property Office of P.R. China in connection with Patent Application No. 200880101500.7, on Apr. 5, 2012, 6 pages.
First Office Action, issued by the State Intellectual Property Office of P.R. China in connection with Patent Application No. 200880104982.1, on Jan. 25, 2011, 15 pages.
Decision of Rejection, issued by the State Intellectual Property Office of P.R. China in connection with Patent Application No. 200880104982.1, on Sep. 23, 2011, 10 pages.
First Office Action, issued by the State Intellectual Property Office of P.R. China in connection with Patent Application No. 200880017883.X, on Nov. 30, 2011, 16 pages.
Second Office Action, issued by the State Intellectual Property Office of P.R. China in connection with Patent Application No. 200880104982.1, on Jun. 29, 2012, 11 pages.
Written Opinion of the International Searching Authority, issued by the International Bureau in connection with International Application No. PCT/US08/62275, on Sep. 22, 2008, 6 pages.
Written Opinion of the International Searching Authority, issued by the International Bureau in connection with International Application No. PCT/US08/66166, on Aug. 25, 2008, 6 pages.
International Preliminary Report on Patentability, issued by the International Bureau of WIPO in connection with International Application No. PCT/US06/62275, on Mar. 3, 2006, 6 pages.
Written Opinion of the International Searching Authority, issued by the International Bureau in connection with International Application No. PCT/US06/62275, on Apr. 28, 2008, 5 pages.
International Preliminary Report on Patentability, issued by the International Bureau of WIPO in connection with International Application No. PCT/US08/58264, on Sep. 29, 2009, 6 pages.
Written Opinion of the International Searching Authority, issued by the International Bureau in connection with International Application No. PCT/US08/58264, on Aug. 4, 2008, 5 pages.
International Preliminary Report on Patentability, issued by the International Bureau of WIPO in connection with International Application No. PCT/US08/62273, on Nov. 3, 2009, 5 pages.
Written Opinion of the International Searching Authority, issued by the International Bureau in connection with International Application No. PCT/US08/62273, on Sep. 5, 2008, 4 pages.
International Preliminary Report on Patentability, issued by the International Bureau of WIPO in connection with International Application No. PCT/US08/62275, on Nov. 3, 2009, 7 pages.
International Preliminary Report on Patentability, issued by the International Bureau of WIPO in connection with International Application No. PCT/US08/63984, on Nov. 17, 2009, 5 pages.
Written Opinion of the International Searching Authority, issued by the International Bureau in connection with International Application No. PCT/US08/63984, on Sep. 29, 2008, 4 pages.
International Preliminary Report on Patentability, issued by the International Bureau of WIPO in connection with International Application No. PCT/US08/63989, on Nov. 17, 2009, 5 pages.
Written Opinion of the International Searching Authority, issued by the International Bureau in connection with International Application No. PCT/US08/63989, on Jul. 17, 2008, 4 pages.
International Preliminary Report on Patentability, issued by the International Bureau of WIPO in connection with International Application No. PCT/US08/66166, on Dec. 7, 2009, 7 pages.
International Preliminary Report on Patentability, issued by the International Bureau of WIPO in connection with International Application No. PCT/US08/71639, on Feb. 2, 2010, 5 pages.
Written Opinion of the International Searching Authority, issued by the International Bureau in connection with International Application No. PCT/US08/71639, on Oct. 22, 2008, 4 pages.
International Preliminary Report on Patentability, issued by the International Bureau of WIPO in connection with International Application No. PCT/US08/74467, on Mar. 2, 2010, 5 pages.
Written Opinion of the International Searching Authority, issued by the International Bureau in connection with International Application No. PCT/US08/74467, on Nov. 17, 2008, 4 pages.
International Preliminary Report of Patentability and Written Opinion, issued by the International Bureau in connection with International Application No. PCT/US09/65368, on Jun. 14, 2011, 8 pages.
International Search Report and Written Opinion, issued by the International Bureau in connection with International Application No. PCT/US09/65368, on Jan. 21, 2010, 9 pages.
International Preliminary Report of Patentability and Written Opinion, issued by the International Bureau in connection with International Application No. PCT/US10/21535, on Jul. 26, 2011, 5 pages.
International Search Report and Written Opinion of the International Searching Authority, issued by the International Bureau in connection with International Application No. PCT/US2010/021535, on Mar. 23, 2010, 6 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/056,225, on Jan. 20, 2012, 36 pages.
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/135,066, on Jan. 24, 2012, 12 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/846,242, on Mar. 29, 2012, 57 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/056,190, on Aug. 10, 2011, 28 pages.
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/056,211, on Jul. 8, 2011, 16 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/056,211, on Jan. 7, 2011, 19 pages.
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/056,221, on Nov. 28, 2011, 46 pages.
Office Action issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/056,221, on Apr. 15, 2011, 24 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/056,225, on Jun. 15, 2012, 19 pages.
Restriction Requirement, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/056,225, on Oct. 3, 2011, 6 pages.
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/113,863, on Jun. 9, 2011, 12 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/113,863, on Dec. 22, 2011, 48 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/113,863, on Dec. 27, 2010, 17 pages.
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/113,870, on Apr. 21, 2011, 10 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/113,870, on Dec. 3, 2010, 12 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/122,240, on Jun. 10, 2011, 17 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/122,240, on Oct. 27, 2011, 39 pages.
Restriction Requirement, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/122,253, on Sep. 2, 2011, 7 pages.
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/122,262, on May 26, 2011, 15 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/122,262, on Dec. 22, 2011, 48 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/122,262, on Dec. 9, 2010, 13 pages.
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/135,066, on Jan. 21, 2011, 16 pages.
Notice of Panel Decision from Pre-Appeal Brief Review, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/135,066, on May 31, 2011, 2 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/135,066, on Jun. 21, 2012, 14 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/135,066, on Oct. 28, 2010, 14 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/135,066, on Sep. 29, 2011, 37 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/135,069, on Oct. 29, 2010, 21 pages.
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/135,074, on Jun. 9, 2011, 10 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/135,074, on Dec. 22, 2011, 47 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/135,074, on Dec. 23, 2010, 13 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/182,851, on Mar. 14, 2012, 37 pages.
Restriction Requirement, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/182,851, on Sep. 12, 2011, 7 pages.
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/182,874, on Jul. 7, 2011, 14 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/182,874, on Dec. 27, 2010, 17 pages.
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/199,557, on Jun. 9, 2011, 12 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/199,557, on Dec. 22, 2011, 33 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/199,557, on Dec. 27, 2010, 14 pages.
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/199,583, on Jun. 21, 2011, 14 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/199,583, on Dec. 27, 2010, 17 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/199,583, on Dec. 29, 2011, 48 pages.
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/199,596, on Jun. 14, 2011, 13 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/199,596, on Dec. 22, 2011, 46 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/199,596, on Dec. 27, 2010, 17 pages.
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/200,813, on Jul. 6, 2011, 13 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/200,813, on Dec. 22, 2011, 51 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/200,813, on Dec. 27, 2010, 15 pages.
Examiner's Answer, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/234,372, on May 23, 2012, 11 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/234,372, on Jun. 7, 2011, 11 pages.
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/234,372, on Oct. 13, 2011, 22 pages.
Advisory Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/234,388, on Aug. 28, 2012, 3 pages.
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/234,388, on Apr. 6, 2012, 16 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/234,388, on Oct. 12, 2011, 27 pages.
Notice of Allowance, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/357,302, on May 7, 2012, 21 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/357,302, on Jan. 17, 2012, 11 pages.
Notice of Allowance, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/357,322, on Mar. 29, 2013, 18 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/357,322, on Nov. 27, 2012, 44 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/357,322, on Aug. 23, 2011, 12 pages.
Examiner's Answer, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/410,372, on Aug. 3, 2012, 8 pages.
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/410,372, on Jan. 3, 2012, 46 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/410,372, on Sep. 12, 2011, 12 pages.
Examiner's Answer, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/410,380, on Jun. 8, 2012, 12 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/410,380, on Jun. 7, 2011, 10 pages.
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/410,380, on Oct. 19, 2011, 21 pages.
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/413,297, on Jan. 4, 2012, 48 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/413,297, on Jul. 18, 2011, 10 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/544,921, on Jan. 9, 2012, 29 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/544,934, on Jun. 18, 2012, 31 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/544,958, on May 2, 2012, 64 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/545,455, on Aug. 29, 2012, 11 pages.
Restriction Requirement, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/545,455, on Jun. 13, 2012, 5 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/546,586, on Feb. 1, 2012, 36 pages.
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/608,660, on Jul. 10, 2012, 28 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/608,660, on Dec. 7, 2011, 8 pages.
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/608,685, on Jul. 30, 2012, 15 pages.
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/608,685, on Mar. 29, 2012, 45 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/608,685, on Jul. 12, 2011, 15 pages.
Notice of Allowance, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/608,696, on May 15, 2012, 17 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/778,810, on Aug. 31, 2012, 82 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/778,828, on Aug. 30, 2012, 79 pages.
Notice of Allowance, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/868,531, on May 8, 2012, 11 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/868,531, on Mar. 1, 2012, 7 pages.
Notice of Allowance, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 13/570,946, on Oct. 30, 2014, 18 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 13/570,946, on Jul. 15, 2014, 40 pages.
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 13/914,571, on May 16, 2014, 16 pages.
Notice of Allowance, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 13/914,571, on Sep. 26, 2014, 66 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 13/914,571, on Dec. 11, 2013, 17 pages.
Restriction Requirement, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/544,958, on Feb. 10, 2012, 6 pages.
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/135,069, on Feb. 14, 2012, 35 pages.
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/357,322, on Feb. 14, 2012, 14 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/056,211, on Feb. 16, 2012, 15 pages.
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/056,190, on Feb. 17, 2012, 22 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/122,253, on Feb. 17, 2012, 20 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/182,874, on Feb. 17, 2012, 15 pages.
Mosby's Dictionary of Medicine, Nursing, & Health Professions, 2009, Mosby, Inc., Definition of Alpha Wave, 1 page.
Mosby's Dictionary of Medicine, Nursing, & Health Professions, 2009, Mosby, Inc., Definition of Beta Wave, 1 page.
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/135,069, on Aug. 26, 2011, 10 pages.
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/135,069, on Feb. 17, 2011, 31 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/182,851, on Mar. 14, 2012, 17 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/846,262, on May 13, 2011, 8 pages.
International Search Report, issued by the International Bureau of WIPO in connection with International Application No. PCT/US08/58264, on Aug. 4, 2008, 1 page.
Final Rejection, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/846,262, on Apr. 18, 2013, 6 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/846,262, on Dec. 19, 2014, 8 pages.
Office Action, issued by the United States Patent and Trademark Office in connection with U.S. Appl. No. 12/846,262, on Jul. 10, 2015, 6 pages.
Related Publications (1)
Number: US 20160219345 A1 — Date: Jul. 2016 — Country: US
Continuations (1)
Parent: U.S. Appl. No. 12/357,315 — Date: Jan. 2009 — Country: US
Child: U.S. Appl. No. 15/087,048 — Country: US