Advertisers, media producers, educators and other interested parties have long sought to understand the responses of their targets—customers, clients and pupils—to a particular stimulus, in order to tailor their information or media instances to better suit the needs of those targets and/or to increase the effectiveness of the media instance created. A key to making a high-performing media instance is to make sure that every event in the media instance elicits the desired responses from the viewers, not responses very different from what the creator of the media instance expected. The media instance herein can be, but is not limited to, a video, an advertisement clip, a movie, a computer application, a printed medium (e.g., a magazine), a video game, a website, an online advertisement, a recorded video, a live performance, a debate, and other types of media instances from which a viewer can learn information or be emotionally impacted.
It is well established that physiological response is a valid measurement of viewers' changes in emotion, and that an effective media instance that connects with its audience/viewers is able to elicit the desired physiological responses from the viewers. Every media instance may have its key events/moments—moments that, if they do not evoke the intended physiological responses from the viewers, can cause the effectiveness of the media instance to suffer significantly. For a non-limiting example, if an ad is intended to engage the viewers by making them laugh, but the viewers do not find a 2-second-long punch line funny, such negative responses to this small piece of the ad may drive the overall reaction to the ad. Although survey questions such as “do you like this ad or not” have long been used to gather viewers' subjective reactions to a media instance, they are unable to provide insight into why the viewers reacted the way they did and what caused those reactions.
An approach enables an event-based framework for evaluating a media instance based on key events of the media instance. First, physiological responses are derived and aggregated from the physiological data of viewers of the media instance. The key events in the media instance can then be identified, wherein such key events drive and determine the viewers' responses to the media instance. A causal relationship between the viewers' responses to the key events and their surveyed feelings about the media instance can further be established to identify what might have caused the viewers to feel the way they do, and why.
Such an approach provides information that can be leveraged by a creator of the media instance to improve the media instance. For a non-limiting example, if a joke in an advertisement is found to drive purchase intent of the product advertised, but the advertisement's target demographic does not respond to the joke, the joke can be changed so that the advertisement achieves its goal: increasing product purchase intent in the target demographic.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. These and other advantages of the present invention will become apparent to those skilled in the art upon a reading of the following descriptions and a study of the several figures of the drawings.
(a)-(c) depict exemplary traces of physiological responses measured and exemplary dividing lines of events in a media instance.
(a)-(c) depict exemplary event identification results based on different event-defining approaches.
(a)-(b) depict exemplary correlations between physiological responses from viewers to key jokes in an ad and the surveyed intent of the viewers to tell others about the ad.
The invention is illustrated by way of example and not by way of limitation in the figures of the accompanying drawings in which like references indicate similar elements. It should be noted that references to “an” or “one” or “some” embodiment(s) in this disclosure are not necessarily to the same embodiment, and such references mean at least one. Although the subject matter is described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
Although the diagrams depict components as functionally separate, such depiction is merely for illustrative purposes. It will be apparent to those skilled in the art that the components portrayed in this figure can be arbitrarily combined or divided into separate software, firmware and/or hardware components. Furthermore, it will also be apparent to those skilled in the art that such components, regardless of how they are combined or divided, can execute on the same computing device or multiple computing devices, and wherein the multiple computing devices can be connected by one or more networks.
Physiological data, which includes but is not limited to heart rate, brain waves, electroencephalogram (EEG) signals, blink rate, breathing, motion, muscle movement, galvanic skin response and any other response correlated with changes in emotion of a viewer of a media instance, can give a trace (e.g., a line drawn by a recording instrument) of the viewer's responses while he/she is watching the media instance. The physiological data can be measured by one or more physiological sensors, each of which can be but is not limited to, an electroencephalograph, an accelerometer, a blood oxygen sensor, a galvanometer, an electromyograph, a skin temperature sensor, a breathing sensor, and any other physiological sensor.
Physiological data measured from a viewer's body has been shown to correlate with the viewer's changes in emotion. Thus, from the measured “low level” physiological data, “high level” (i.e., easier to understand, intuitive to look at) physiological responses of the viewers of the media instance can be created. An effective media instance that connects with its audience/viewers is able to elicit the desired emotional response. Here, the high level physiological responses include, but are not limited to: liking (valence)—positive/negative responses to events in the media instance; intent to purchase or recall; emotional engagement in the media instance; thinking—amount of thought and/or immersion in the experience of the media instance; and adrenaline—anger, distraction, frustration, and other emotional experiences of events in the media instance. In addition, the physiological responses may also include responses to other types of sensory stimulation, such as taste and/or smell, if the subject matter is food or a scented product instead of a media instance.
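By way of a minimal sketch of one possible “low level” to “high level” derivation (the disclosure does not fix a specific formula, so the function name, smoothing window, and z-scoring below are illustrative assumptions only):

```python
import numpy as np

def derive_high_level_response(raw: np.ndarray, window: int = 32) -> np.ndarray:
    """Illustrative mapping from a raw physiological trace to an intuitive
    'high level' trace: smooth with a moving average, then z-score so that
    peaks read as heightened response. A sketch, not the disclosed method."""
    kernel = np.ones(window) / window
    smoothed = np.convolve(raw, kernel, mode="same")      # moving average
    return (smoothed - smoothed.mean()) / (smoothed.std() + 1e-9)
```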
The response module 102 is a software component which, while in operation, first accepts and/or records physiological data from each of a plurality of viewers watching a media instance, then derives and aggregates physiological responses from the collected physiological data. Such derivation can be accomplished via a plurality of statistical measures, which include but are not limited to: average value, deviation from mean, 1st order derivative of the average value, 2nd order derivative of the average value, coherence, positive response, negative response, etc., using the physiological data of the viewers as inputs. Facial expression recognition, “knob” and other measures of emotion can also be used as inputs with comparable validity. Here, the physiological data may either be retrieved from a storage device or measured via one or more physiological sensors, each of which can be but is not limited to, an electroencephalograph, an accelerometer, a blood oxygen sensor, a galvanometer, an electromyograph, and any other physiological sensor, in either separate or integrated form. The derived physiological responses can then be aggregated over the plurality of viewers watching one or more media instances.
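A minimal sketch of such aggregation, assuming the per-viewer traces arrive as a NumPy array of shape (viewers, samples); the function name and the positive/negative thresholds are illustrative assumptions:

```python
import numpy as np

def aggregate_responses(traces: np.ndarray) -> dict:
    """Aggregate per-viewer response traces using the kinds of statistical
    measures named above (a sketch only)."""
    mean_trace = traces.mean(axis=0)                  # average value
    return {
        "average": mean_trace,
        "deviation": traces.std(axis=0),              # deviation from mean
        "d1": np.gradient(mean_trace),                # 1st order derivative
        "d2": np.gradient(np.gradient(mean_trace)),   # 2nd order derivative
        "positive": (traces > 0).mean(axis=0),        # share responding positively
        "negative": (traces < 0).mean(axis=0),        # share responding negatively
    }
```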
The event defining module 104 is a software component which, while in operation, defines and marks occurrences and durations of a plurality of events happening in the media instance. The duration of each event in the media instance can be constant, non-linear, or semi-linear in time. Such event definition may happen either before or after the physiological data of the plurality of viewers has been measured; in the latter case, the media instance can be divided into the plurality of events based on the physiological data measured from the plurality of viewers.
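One plausible representation for such marked events, sketched with illustrative labels and timings (the class and field names are assumptions, not part of the disclosure):

```python
from dataclasses import dataclass

@dataclass
class Event:
    """One marked event in a media instance (illustrative representation)."""
    label: str
    start_s: float   # occurrence, in seconds from the start of the media
    end_s: float     # occurrence + duration; need not be constant per event

    @property
    def duration_s(self) -> float:
        return self.end_s - self.start_s

# Hypothetical event marks for an advertisement:
events = [Event("joke #1", 4.0, 6.0), Event("montage", 10.0, 24.5)]
```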
The key event module 106 is a software component which, while in operation, identifies one or more key events in the media instance and reports the key events to an interested party of the media instance, wherein the key events drive and determine the viewers' physiological responses to the media instance. Key events in the media instance can be used to pinpoint whether and/or which parts of the media instance need to be improved or changed, and which parts of the media instance should be kept intact. For non-limiting examples, the key event module may identify which key event(s) in the media instance trigger the most positive or negative responses from the viewers, or alternatively, which key event(s) are polarizing events, e.g., events that cause large discrepancies in the physiological responses from different demographic groups of viewers, such as between groups of men and women. In addition, the key event module is operable to establish a causal relationship between the viewers' responses to the events in the media instance and their surveyed feelings about the media instance, so that the creator of the media instance may gain insight into which key events might have caused the viewers to feel the way they do, and why.
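A hedged sketch of such key-event spotting, assuming per-viewer, per-event response scores and a demographic label per viewer; the group labels and the simple “largest gap” rule for polarization are illustrative assumptions:

```python
import numpy as np

def key_events(event_scores: dict[str, np.ndarray],
               groups: np.ndarray) -> dict[str, str]:
    """event_scores maps an event label to per-viewer response scores;
    groups holds one demographic label per viewer. Returns the events
    with the most positive, most negative, and most polarized responses."""
    means = {e: s.mean() for e, s in event_scores.items()}
    gap = {e: abs(s[groups == "men"].mean() - s[groups == "women"].mean())
           for e, s in event_scores.items()}
    return {
        "most_positive": max(means, key=means.get),
        "most_negative": min(means, key=means.get),
        "most_polarizing": max(gap, key=gap.get),
    }
```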
The reaction database 108 stores pertinent data of the media instance the viewers are watching, wherein the pertinent data includes but is not limited to survey questions asked of each of the plurality of viewers before, during, and/or after their viewing of the media instance, together with the results of those questions. In addition, the pertinent data may also include but is not limited to the following:
While the system 100 depicted in
Referring to
In some embodiments, the event defining module 104 is operable to define occurrence and duration of events in the media instance based on salient positions identified in the media instance. Once salient positions in the media instance are identified, the events corresponding to the salient positions can be extracted. For a non-limiting example, an event in a video game may be defined as a “battle tank” appearing in the player's screen and lasting as long as it remains on the screen. For another non-limiting example, an event in a movie may be defined as occurring every time a joke is made. While defining humor is difficult, punch line events that are unexpected, absurd, and comically exaggerated often qualify as joke events.
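One plausible way to turn salient positions into events, sketched under the assumption that salience has already been detected as a per-frame boolean signal (e.g., “battle tank visible”); the function name and frame rate are illustrative:

```python
import numpy as np

def intervals_from_salience(on_screen, fps: float = 30.0):
    """Convert a per-frame boolean salience signal into (start_s, end_s)
    event intervals via run-length extraction. How salience is detected
    in the first place is left open above."""
    flags = np.asarray(on_screen, dtype=int)
    edges = np.diff(np.concatenate(([0], flags, [0])))
    starts = np.where(edges == 1)[0]    # frame where a salient run begins
    ends = np.where(edges == -1)[0]     # frame just after the run ends
    return [(s / fps, e / fps) for s, e in zip(starts, ends)]
```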
(a) shows an exemplary trace of the physiological response “Engagement” for a player playing Call of Duty 3 on the Xbox 360. The trace is a time series, with the beginning of the session on the left and the end on the right. Two event instances 301 and 302 are circled, where 301 on the left shows low “Engagement” recorded during a boring tutorial section of game play, and 302 shows a high “Engagement” section recorded when the player experiences the first battle of the game.
In some embodiments, the event defining module 104 is operable to define occurrence and duration of events in the media instance via one or more of the following approaches. The events so identified by the event defining module 104 are then provided to the key event module 106 to test for “significance” as key events in the media instance, as described below.
In some embodiments, the key event module 106 is operable to accept the events defined by the event defining module 104 and automatically spot statistically significant/important points in the aggregated physiological responses from the viewers in order to identify the key moments/events in the media instance. More specifically, the key event module is operable to determine, for non-limiting examples, which events trigger the most positive or most negative aggregated responses, and which events produce statistically significant discrepancies in responses between groups of viewers.
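A minimal stand-in for such significance spotting, flagging samples where the aggregated trace deviates from its own mean by more than k standard deviations; the threshold and the test itself are illustrative assumptions, not the disclosed method:

```python
import numpy as np

def significant_moments(agg: np.ndarray, k: float = 2.0) -> np.ndarray:
    """Return sample indices where the aggregated response is an outlier
    relative to its own distribution (a simple z-score rule)."""
    z = (agg - agg.mean()) / (agg.std() + 1e-9)
    return np.where(np.abs(z) > k)[0]
```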
In some embodiments, the key events found can be used to improve the media instance. Here “improving the media instance” can be defined as, but is not limited to, changing the media instance so that it is more likely to achieve the goals of the interested party or creator of the media instance.
In some embodiments, the key event module 106 is further operable to establish a causal relationship between surveyed feelings about the media instance and the key events identified based on the physiological responses from the viewers. In other words, it establishes a correlation between the physiological responses from the viewers to key events in the media instance and a surveyed outcome, i.e., the viewers' reported feelings on a survey, and reports to the interested parties (e.g., the creator of the media instance) which key events in the media instance actually caused the outcome. Here, the outcome can include but is not limited to, liking, effectiveness, purchase intent, post-viewing product selection, etc. For a non-limiting example, if the viewers indicate on a survey that they did not like the media instance, something about the media instance might have caused them to feel this way. While the cause may be a reaction to the media instance in general, it can often be pinned down to a reaction to one or more key events in the media instance, as discussed above. The established causal relationship thus explains, without human input, why the viewers report their general feelings about the media instance on the survey the way they do.
In some embodiments, the key event module 106 is operable to adopt multivariate regression analysis via a multivariate model that incorporates the physiological responses from the viewers as well as the surveyed feelings from the viewers to determine which events, on average, are key events in driving reported feelings (surveyed outcome) about the media instance. Here, the multivariate regression analysis examines the relationship among many factors (the independent variables) and a single dependent variable, whose variation is thought to be at least partially explained by the independent variables. For a non-limiting example, the amount of rain that falls on a given day varies, so there is variation in daily rainfall. Both the humidity in the air and the number of clouds in the sky on a given day can be hypothesized to explain this variation in daily rainfall. This hypothesis can be tested via multivariate regression, with daily rainfall as the dependent variable, and humidity and number of clouds as independent variables.
In some embodiments, the multivariate model may have each individual viewer's reactions to certain key events in the media instance as independent variables and their reported feeling about the media instance as the dependent variable. The coefficients from regressing the dependent variable on the independent variables would determine which key events are causing the reported feelings. Such a multivariate model could be adopted here to determine which set of key events most strongly affects reported feelings from the viewers about the media instance, such as a joke in an advertisement. One characterization of such event(s) is that the more positively (negatively) the viewers respond to the event(s), the more likely the viewers were to express positive feelings about the media instance. For a non-limiting example, a multivariate regression can be run on multiple events (1, 2 . . . n) within an entire montage sequence of an advertisement to determine which events drive liking the most, using the relationship between reported feelings about the ad and the emotional responses from the viewers to the events in the ad as input. The results of the multivariate regression runs shown in
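A hedged sketch of such a regression on synthetic data, using ordinary least squares; the data, event names, and coefficient reading below are illustrative assumptions, not results from the disclosure:

```python
import numpy as np

# Hypothetical data: rows are viewers, columns are reactions to events 1..n.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                  # reactions to jokes #1..#3
y = 1.5 * X[:, 1] + rng.normal(size=200)       # reported liking (joke #2 drives it)

X1 = np.column_stack([np.ones(len(X)), X])     # add intercept column
coef, *_ = np.linalg.lstsq(X1, y, rcond=None)  # OLS fit of y on the reactions
print(dict(zip(["intercept", "joke1", "joke2", "joke3"], coef.round(2))))
# The largest coefficient (joke2) marks the key event driving reported liking.
```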
In an automated process, this multivariate regression may be run stepwise, which essentially tries various combinations of independent variables and determines which combination has the strongest explanatory power. This is a step toward establishing the causal relationship between the viewers' responses to the events and their surveyed feelings about the media instance. For a non-limiting example, if response to joke #2 is correlated with indicated intent to purchase when holding gender and responses to jokes #1 and #3 constant, a causal conclusion can be made that joke #2 triggers the viewers' intent to purchase.
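A sketch of one greedy forward stepwise run; the R² criterion and improvement threshold are illustrative assumptions (real stepwise procedures often use significance tests instead):

```python
import numpy as np

def r2(X, y):
    """R^2 of an OLS fit of y on X (with intercept)."""
    X1 = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ coef
    return 1 - resid.var() / y.var()

def forward_stepwise(X, y, names, improve=0.01):
    """Keep adding whichever event response most improves R^2 until the
    gain falls below `improve` -- a sketch of the stepwise run above."""
    chosen, best = [], 0.0
    while True:
        scores = {j: r2(X[:, chosen + [j]], y)
                  for j in range(X.shape[1]) if j not in chosen}
        if not scores:
            break
        j = max(scores, key=scores.get)
        if scores[j] - best < improve:
            break
        chosen.append(j)
        best = scores[j]
    return [names[j] for j in chosen], best
```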
In some embodiments, the key event module 106 identifies the key polarizing event(s) that cause a statistically significant difference in the surveyed outcome from different demographic groups of viewers, and provides insight into, for non-limiting examples, why women do not like the show or which issue actually divides people in a political debate. The key event module 106 may collect demographic data from the overall population of the viewers and categorize the viewers into groups in order to differentiate the responses of each subset, wherein the viewers can be grouped by one or more of: race, gender, age, education, other demographics, income, buying habits, intent to purchase, and intent to tell. Such grouping information can be included in the regressions to determine how different groups report different reactions to the media instance in the survey. Furthermore, grouping/event-response interaction variables can be included to determine how different groups respond differently to the key events in the media instance. For key events that are polarizing, demographic information and/or interaction variables of the viewers can also be included in the multivariate model to capture the combined effect of the demographic factor and the reaction to the polarizing key events.
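A sketch of adding a group dummy and group × event interaction columns to the regression design matrix; the column layout, function name, and gender dummy are assumptions for illustration:

```python
import numpy as np

def with_interactions(event_resp: np.ndarray, is_female: np.ndarray):
    """Augment per-viewer event responses (viewers x events) with a group
    dummy and group x event interaction columns, so a regression can
    separate 'this group reports differently overall' from 'this group
    responds differently to this key event'."""
    g = is_female.astype(float).reshape(-1, 1)         # group dummy column
    return np.hstack([event_resp, g, event_resp * g])  # main + interaction terms
```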
For a non-limiting example, the viewers of an ad can first be asked a survey question, “How likely are you to tell someone about this particular commercial—meaning tell a friend about this ad you've just seen,” as shown in
One embodiment may be implemented using a conventional general purpose or a specialized digital computer or microprocessor(s) programmed according to the teachings of the present disclosure, as will be apparent to those skilled in the computer art. Appropriate software coding can readily be prepared by skilled programmers based on the teachings of the present disclosure, as will be apparent to those skilled in the software art. The invention may also be implemented by the preparation of integrated circuits or by interconnecting an appropriate network of conventional component circuits, as will be readily apparent to those skilled in the art.
One embodiment includes a computer program product which is a machine readable medium (media) having instructions stored thereon/in which can be used to program one or more computing devices to perform any of the features presented herein. The machine readable medium can include, but is not limited to, one or more types of disks including floppy disks, optical discs, DVD, CD-ROMs, micro drive, and magneto-optical disks, ROMs, RAMs, EPROMs, EEPROMs, DRAMs, VRAMs, flash memory devices, magnetic or optical cards, nanosystems (including molecular memory ICs), or any type of media or device suitable for storing instructions and/or data. Stored on any one of the computer readable medium (media), the present invention includes software for controlling both the hardware of the general purpose/specialized computer or microprocessor, and for enabling the computer or microprocessor to interact with a human viewer or other mechanism utilizing the results of the present invention. Such software may include, but is not limited to, device drivers, operating systems, execution environments/containers, and applications.
The foregoing description of the preferred embodiments of the present invention has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations will be apparent to the practitioner skilled in the art. Particularly, while the concept “module” is used in the embodiments of the systems and methods described above, it will be evident that such concept can be interchangeably used with equivalent concepts such as, class, method, type, interface, bean, component, object model, and other suitable concepts. Embodiments were chosen and described in order to best describe the principles of the invention and its practical application, thereby enabling others skilled in the art to understand the invention, the various embodiments and with various modifications that are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the following claims and their equivalents.