The present invention relates generally to interactive television, and more particularly, to a system and method for selecting or modifying advertisements to display to a viewer.
The distribution of advertisements via television is well established. Advertisers typically purchase advertising time on a specific channel and time period, with the rate being set by the popularity of the programs airing within that time period. The more popular the underlying program or time slot, the more expensive the advertising rate. With the increase in the number of network stations, advertisers are confronted with the task of determining which stations are appropriate for their products or services. It is also difficult for advertisers to select a type of advertisement that will appeal to a broad cross section of the population (e.g., males, females, young and old).
Commercials are often targeted based on the type of television shows a viewer watches. For example, an advertisement for tulips is preferably displayed to a viewer who watches gardening programs. Similarly, an advertiser on the Internet may track the web sites that a user visits and use this information to determine that the user is interested in gardening. A banner ad for tulips may then be displayed to the user. Internet advertisement companies often use a targeting technology that allows advertisers to target consumers through the use of profiling criteria developed based on users' activity on the Internet. The decision as to which advertisement to display is made remotely (e.g., at a cable headend system or server). This requires that information about a user be transferred to a remote collection area, which can create privacy problems. A user may not want personal information collected on his television viewing or “surfing” habits and provided to advertisers or other service providers.
Furthermore, from a consumer's perspective, one of the most irritating aspects of advertising is the consumer's inability to exercise any control over the content of the advertisement information. The viewer can only watch what is presented; he cannot switch between different advertisements as he can with programs, nor interact with the advertisement.
New technology makes skipping commercials even easier, and consumers today often skip commercials that they do not want to see. However, commercials pay for almost everything on commercial television. High-priced shows are paid for by the value that advertisers perceive in displaying their ads during the show. As commercials become less effective, more ad dollars are being devoted to product placement within the television shows. For example, advertisers pay high prices for an actor to use their product in a show. Ideally, advertisers want to deliver a message appropriate for each viewer, and they are willing to pay more for such focused targeting than for merely reaching a group of people watching a specific show.
There is, therefore, a need for a method and system that can be used to target advertisements to specific viewers or create interest in the advertisement by allowing viewers to interact with the advertisement.
A method for displaying interactive advertisements on a television having a controller connected thereto and configured for receiving input from a viewer of the television is disclosed. The controller has a receiver operable to receive advertisements and a processor operable to modify the advertisements. The method generally comprises requesting action by the viewer of the television, modifying an advertisement based on the action of the viewer, and displaying the modified advertisement on the television.
The action requested of the viewer may include, for example, answering a question displayed on the television, allowing a camera to take a picture of a person in the room, inputting a name into the controller, interacting with a game displayed on the television, inputting a personal profile into the controller, or rotating a product displayed by a video catalog. The controller may further include a camera, microphone, light sensor, temperature sensor, or motion sensor for collecting information about the viewer or the viewing environment.
A system of the present invention generally comprises a controller configured for connection to the television and operable to receive input from a viewer of the television. The controller has a receiver operable to receive advertisements and a processor operable to request action by the viewer of the television, modify the advertisement based on the action of the viewer, and display the modified advertisement on the television.
In another aspect of the invention a system for displaying advertisements on a television comprising a controller coupled to the television generally comprises a memory device operable to store information about at least one viewer of the television and a receiver configured for receiving an advertisement scheduled for display on the television at a specified time. The system further includes a processor operable to modify the received advertisement based on action by the viewer.
The above is a brief description of some deficiencies in the prior art and advantages of the present invention. Other features, advantages, and embodiments of the invention will be apparent to those skilled in the art from the following description, drawings, and claims.
Corresponding reference characters indicate corresponding parts throughout the several views of the drawings.
The following description is presented to enable one of ordinary skill in the art to make and use the invention. Descriptions of specific embodiments and applications are provided only as examples and various modifications will be readily apparent to those skilled in the art. The general principles described herein may be applied to other embodiments and applications without departing from the scope of the invention. Thus, the present invention is not to be limited to the embodiments shown, but is to be accorded the widest scope consistent with the principles and features described herein. For purpose of clarity, details relating to technical material that is known in the technical fields related to the invention have not been described in detail.
As previously discussed, it is important to target advertisements to specific viewers to make advertisements more effective. The present invention provides a method and system for displaying advertisements to consumers who are interested in the message content of the advertisements, or for making advertisements interactive so that consumers have more control over what type of advertisement they view and find the advertisement more interesting.
In one embodiment, the method utilizes targeting of advertisements based on passive techniques to convey a message to the viewer without requiring active participation by the viewer. In another embodiment, the speed of communication networks, such as the Internet or digital cable or satellite systems, is used to provide an interactive advertisement. Encouraging a viewer to interact with a commercial message is one way to capture the viewer's attention. The viewer's response also shows that the viewer is paying attention to the advertiser's message. As described below, a fast video line and an intelligent box in the viewer's home allow the viewer to interact with the advertisement and give the viewer a reason to pay attention to the advertiser's message. The following describes both passive (targeting) and active (interactive) techniques of the present invention.
The advertisements are preferably displayed on a television connected to a network system such as cable network system 30 shown in
It is to be understood that the system 30 described above and shown herein is only one example of a system used to convey signals to the television 45. The television network system may be different than described herein without departing from the scope of the invention.
The video signals and program control signals received by the set top box 38 correspond to television programs, advertisements, and menu selections that the viewer may access through a viewer interface (
The set top box 38 may be configured for receiving analog signals, digital signals, or both analog and digital signals. If digital signals are received by the set top box 38, the advertisements may be variants of one another (e.g., same video stream with different graphic overlays), as further described below. The set top box 38 may be configured for use with an interactive digital system which provides a forward path to the user and a return path to the local network gateway. The return path provides a two-way data stream to enable interactivity.
The set top box 38 may be configured, for example, to receive the following input: analog video channels; digital video channels which support broadband communications using Quadrature Amplitude Modulation (QAM); and control channels for two-way signaling and messaging. The digital QAM channels carry compressed and encoded multiprogram MPEG (Moving Picture Experts Group) transport streams. A transport system extracts the desired program or advertisement from the transport stream and separates the audio, video, and data components, which are routed to an audio decoder, a video decoder, and RAM, respectively. The set top box 38 may further include a compositor which combines graphics and text with MPEG or analog video.
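As a rough illustration of the transport-system step described above, the following sketch routes MPEG transport-stream packets to audio, video, and data buffers by packet identifier (PID). It is not the claimed transport system; the specific PID values are assumptions chosen only for the example, and packet-header handling is simplified.

```python
# Sketch only: not the claimed transport system. PID values are illustrative
# assumptions, and transport-packet header handling is simplified.

TS_PACKET_SIZE = 188
SYNC_BYTE = 0x47

VIDEO_PID = 0x100   # hypothetical PID carrying the advertisement's video
AUDIO_PID = 0x101   # hypothetical PID carrying the advertisement's audio
DATA_PID = 0x102    # hypothetical PID carrying private data (e.g., overlay hints)


def demultiplex(transport_stream: bytes):
    """Split a raw MPEG transport stream into audio, video, and data payloads by PID."""
    audio, video, data = bytearray(), bytearray(), bytearray()
    for offset in range(0, len(transport_stream) - TS_PACKET_SIZE + 1, TS_PACKET_SIZE):
        packet = transport_stream[offset:offset + TS_PACKET_SIZE]
        if packet[0] != SYNC_BYTE:
            continue  # skip packets not aligned on the MPEG sync byte
        pid = ((packet[1] & 0x1F) << 8) | packet[2]   # 13-bit packet identifier
        payload = packet[4:]                          # adaptation fields ignored here
        if pid == AUDIO_PID:
            audio.extend(payload)   # would be routed to the audio decoder
        elif pid == VIDEO_PID:
            video.extend(payload)   # would be routed to the video decoder
        elif pid == DATA_PID:
            data.extend(payload)    # would be routed to RAM
    return bytes(audio), bytes(video), bytes(data)
```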
A broadband analog signal (e.g., 680, 750, 860 MHz) received by the set top box 38 carries multiple channels and is conveyed to a tuner 48 which selects one frequency band out of the available spectrum (
It is to be understood that the system used to select a channel and convert the analog signal to digital may be different than shown in
In a first embodiment of the present invention, passive technologies are used to improve an advertiser's ability to convey a message to a viewer. As described below, these technologies include identifying the viewer, the viewer's preferences, or the viewing environment.
Preferably, the system does not transmit or publish any personal information about the user. In one aspect of the invention, several forms of rich-media advertising are delivered to the user's home and the set top box decides which content to deliver to the viewer. The system may report back to a home system which commercials are viewed, without providing identifying or personal information about the viewer.
A viewer may also create a personal profile by entering demographics and related information into the set top box 38 (e.g., by querying the viewer on his birthplace, education level, type of employment, age, sex, and the like). The personal profile may also be created based on viewing history of the user through an information gathering process. The personal profile may be set up during initialization of the set top box 38 the first time the viewer uses the box, for example. Once the information is entered, it will be stored in the set top box memory 76 and may later be modified by the viewer, or additional profiles may be added for new viewers. Since this information is stored on the viewer's set top box 38 and not transferred to a remote data collection site, the viewer does not need to be concerned about privacy issues.
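A minimal sketch of how such a locally held profile might be represented in the set top box memory 76 is shown below. The particular field names are assumptions used only for illustration; the essential point is that the data stays in the box.

```python
# Illustrative only: field names are assumptions. The profiles stay in the set top
# box memory and are never transferred to a remote data collection site.

profiles = {}   # viewer name -> profile dictionary, held locally in memory 76


def create_or_update_profile(name, **fields):
    """Create a profile during initialization, or modify it (or add new viewers) later."""
    profiles.setdefault(name, {}).update(fields)


# Example entries made the first time the viewer uses the box:
create_or_update_profile("viewer1", birthplace="Chicago", education="college",
                         employment="teacher", age=34, sex="F")
```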
Various equipment may also be used to detect the environment of the room (e.g., temperature, light, motion of viewers, noise, number of viewers). This information may be used to determine the mood of the viewer and select an advertisement that is appropriate. For example, if the room is dark and the volume is low, the viewer is most likely not interested in seeing a loud bright commercial full of people and activity.
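The following sketch illustrates one way sensor readings might be turned into an advertisement choice along the lines just described. The thresholds and the "energy" tag attached to each advertisement are assumptions for illustration, not part of the specification.

```python
# Illustrative only: sensor thresholds and the "energy" tag are assumptions.

def pick_ad_for_environment(ads, light_level, audio_volume, motion_level):
    """Prefer calm advertisements when the room is dark, quiet, and still."""
    calm_room = light_level < 0.3 and audio_volume < 0.3 and motion_level < 0.2
    preferred = "calm" if calm_room else "energetic"
    candidates = [ad for ad in ads if ad.get("energy") == preferred]
    return candidates[0] if candidates else ads[0]
```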
There are many sources of information about the viewer's interests. Each time the viewer clicks the remote control 42, there is a bit more information about what he likes to view. The camera 60 and microphone 62 in the set top box 38 may be used to provide information on who is in the room and how active they are. Even knowing that the room is dark and the volume on the TV set is low provides information about the mood and what messages the viewer will most likely be interested in. It is difficult to absolutely recognize faces without help from the users (if only to provide a name to go with the face). But it is easy to note that one face is always in the room when watching Oprah and another face watches Monday Night Football, and from that correspondence tell who is watching the national news. The most appropriate commercial is then chosen based on who is in the room.
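A simple sketch of this face-to-program correspondence idea follows: anonymous face identifiers are tallied against the programs playing while they are seen, and the tallies are later used to guess who is in the room. No face recognition is performed in the sketch itself; the identifiers are assumed to come from the face-id software discussed later.

```python
from collections import defaultdict

# Illustrative only: face identifiers are assumed to be anonymous labels produced
# elsewhere; no face recognition is performed in this sketch.

cooccurrence = defaultdict(lambda: defaultdict(int))   # face id -> program -> count


def record_observation(face_ids, program):
    """Called periodically with the faces currently seen while a program is playing."""
    for face_id in face_ids:
        cooccurrence[face_id][program] += 1


def likely_viewer(program):
    """Return the face id most often present when this program is on, if any."""
    best_face, best_count = None, 0
    for face_id, counts in cooccurrence.items():
        if counts.get(program, 0) > best_count:
            best_face, best_count = face_id, counts[program]
    return best_face
```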
An especially valuable bit of information is knowing which content keeps the viewer's attention. Actively pressing a button to switch channels provides input that the viewer was not interested in that particular commercial. Playing the same commercial again will probably not fare any better. On the other hand, if the viewer stops changing channels after landing on a specific ad, then it should probably be played again. This is a simple example of reinforcement learning. The memory device 76 and CPU 78 may be used for reinforcement learning and video based tracking.
Reinforcement learning may be used as follows. Each commercial may contain one or more keywords indicating the content of the commercial. The keywords are used to determine which advertisements a consumer is most likely to watch. The keywords preferably include a product name, a product category, and a bit about the style of the advertisement. Each time a commercial is played, the viewer's behavior is captured and used to modify the weight attached to each keyword. Thus, a commercial might be labeled “Doritos, snack food, sexy female.” Each time a viewer watches the entire commercial, the values for those three keywords are increased. Likewise, if the viewer quickly changes channels away from that particular commercial, the values of those keywords are decreased. When an advertisement is ready to be displayed, the information stored in the set top box is used to select the commercial whose keywords have the highest value.
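The following is a minimal sketch of this keyword-weighting scheme. The unit reward and penalty values and the commercial record format are assumptions; the specification states only that keyword values are increased or decreased based on the viewer's behavior.

```python
# Illustrative only: the unit reward/penalty and record format are assumptions.

keyword_weights = {}    # keyword -> learned value, kept in set top box memory

WATCH_REWARD = 1.0      # viewer watched the entire commercial
SKIP_PENALTY = -1.0     # viewer quickly changed channels away


def update_weights(commercial_keywords, watched_to_end):
    """Adjust the value of each keyword attached to the commercial just played."""
    delta = WATCH_REWARD if watched_to_end else SKIP_PENALTY
    for keyword in commercial_keywords:
        keyword_weights[keyword] = keyword_weights.get(keyword, 0.0) + delta


def select_commercial(stored_commercials):
    """Pick the stored commercial whose keywords have the highest total value."""
    def score(commercial):
        return sum(keyword_weights.get(k, 0.0) for k in commercial["keywords"])
    return max(stored_commercials, key=score)
```

In the example above, a commercial labeled “Doritos, snack food, sexy female” would have those three keyword weights raised each time it is watched to the end and lowered each time the viewer changes channels away from it.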
As previously described, information about the viewer or environment may be used to select one advertisement from a plurality of advertisements delivered to the set top box 38. The system shown in
The advertisements may also include links to other information such as detailed information about the advertised product, purchasing information and the like. For example, if after viewing an advertisement, the viewer wants to obtain additional information about an advertised product, the viewer may select an option listed in a menu which immediately directs the viewer to a new commercial or information piece providing additional information on the product.
After the advertisement is displayed, aggregate viewing feedback may be used to report back who actually viewed the commercial and a bit of information about how it was received by the viewer. As previously described, the system includes two-way channels so that information can be sent back to the headend 36 and advertiser. The set top box 38 may be used, for example, to gather information about the viewing habits of the household. The set top box 38 may report back that a household watched a particular commercial. More detailed information can be collected by noting correspondences between which shows are watched and which commercials are seen. Using this tracking technology, detailed information about who is actually in the room during the displaying of an advertisement can be collected. The following is an example of the type of data that may be collected and provided to an advertiser regarding the success of their advertisements:
Viewer Statistics:
In addition to sending separate advertisements as shown in
Alternatively, an advertisement modification may be sent down to the viewer's set top box 38 ahead of time and stored in memory 76. Then, using data portions of the MPEG video stream, the areas of the video that should be modified are identified. The set top box 38 can then use the viewer profile or other information to find the right pixels to modify and perform the image processing operations.
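A sketch of this overlay step is given below, with the frame and the stored modification treated as simple two-dimensional pixel arrays. The rectangular region format is an assumption standing in for whatever area description the data portion of the MPEG stream carries.

```python
# Illustrative only: the (top, left, height, width) region format stands in for
# whatever area description the data portion of the MPEG stream actually carries.

def apply_overlay(frame, overlay, region):
    """Copy stored overlay pixels into the decoded frame at the signalled region.

    frame   -- decoded video frame as a list of rows of pixel values
    overlay -- stored advertisement modification, sized to match the region
    region  -- (top, left, height, width) identified from the MPEG data stream
    """
    top, left, height, width = region
    for y in range(height):
        for x in range(width):
            frame[top + y][left + x] = overlay[y][x]
    return frame
```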
The information collected about viewers may be used to let advertisers bid for different viewers. For example, one advertiser may be willing to pay 5 cents for any female between the ages of 20 and 40, while another advertiser may pay 10 cents for twenty-five year-old females. This allows the advertisement to be selected based on the highest bidder. Orders may be collected from advertisers and the commercial selected based on the highest bidder for that specific viewer.
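The bidding step might look like the following sketch, in which each advertiser supplies a bid together with the audience it applies to, and the highest matching bid wins. The bid record fields (sex, age range, bid in cents) are assumptions patterned on the example above.

```python
# Illustrative only: the bid record fields are assumptions patterned on the example above.

def select_by_bid(bids, viewer):
    """Return the advertisement of the highest bidder whose targeting matches the viewer."""
    def matches(bid):
        t = bid["targeting"]
        return (viewer.get("sex") == t["sex"]
                and t["min_age"] <= viewer.get("age", -1) <= t["max_age"])

    matching = [bid for bid in bids if matches(bid)]
    return max(matching, key=lambda bid: bid["bid_cents"])["ad"] if matching else None


bids = [
    {"ad": "ad_A", "bid_cents": 5, "targeting": {"sex": "F", "min_age": 20, "max_age": 40}},
    {"ad": "ad_B", "bid_cents": 10, "targeting": {"sex": "F", "min_age": 25, "max_age": 25}},
]
```

With these example bids, a twenty-five-year-old female viewer is shown ad_B, while a thirty-five-year-old female viewer is shown ad_A.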
The techniques described above cater to the passive nature of most television viewing. However, people often like to be in control of what they view. For example, people spend time at Amazon's web site because they like the content and the feel of the electronic store. Interaction between an advertiser and a consumer can help to make a sale. The following describes a number of different techniques that allow consumers to interact with advertisements and thus improve the value to advertisers.
Local intelligence in the set top box 38 allows for many different types of viewer interaction. In one embodiment, the advertiser presents the viewer with a question and the viewer selects an answer with the remote control 42. The advertisement may solicit an answer to a question to keep the viewer engaged (e.g., have you tried the Pepsi challenge yet?). The set top box 38 is used to report back the answer, along with viewing statistics using a set top box modem or other connection. The advertiser can then use this information to issue coupons to the appropriate viewers.
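A sketch of this question-and-answer flow is shown below. The display, remote-control, and modem interfaces are passed in as placeholder callables because their actual form depends on the set top box; the report format is likewise an assumption.

```python
# Illustrative only: the display, remote-control, and modem interfaces are placeholder
# callables, and the report format is an assumption.

def run_question(show_prompt, wait_for_choice, send_report, question):
    """Present the advertiser's question, capture the remote-control answer, report it back."""
    show_prompt(question["text"], question["choices"])     # e.g., the Pepsi challenge question
    choice = wait_for_choice(len(question["choices"]))     # index chosen with the remote control 42
    send_report({
        "ad_id": question["ad_id"],
        "answer": question["choices"][choice],
        # viewing statistics may be attached here; no personal identifiers are sent
    })
```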
Presenting an advertisement in the form of a game is another method that may be used to capture a viewer's attention. For example, a beer advertiser may let viewers shoot at frogs or lizards on the screen. Another example is to put a viewer in a James Bond road race game with a BMW Z3 so that viewers can see if they like the experience. The game may be downloaded through a digital connection into the set top box 38, for example. A pointing device which interacts with the television 45 and set top box 38 is preferably included in the system. The advertiser may also send a coupon (electronic or paper) to viewers that score high in a simple 30-second game. Viewers may also choose their ending to an advertisement. Multiple versions of the ad may be sent to the set top box 38, as previously described. The set top box 38 then plays the appropriate ending, based on the user's interactions.
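The ending-selection step might be sketched as follows, assuming the alternative endings have already been delivered to the set top box 38 and stored as described. The score threshold and the ending labels are illustrative assumptions.

```python
# Illustrative only: the score threshold and ending labels are assumptions.

def choose_ending(stored_endings, game_score, viewer_choice=None):
    """Pick which of the pre-delivered endings to play, preferring an explicit viewer choice."""
    if viewer_choice is not None and viewer_choice in stored_endings:
        return stored_endings[viewer_choice]
    if game_score >= 1000:     # a high score in the 30-second game might also earn a coupon
        return stored_endings.get("winner", stored_endings["default"])
    return stored_endings["default"]
```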
Another method for capturing the viewer's attention is to personalize an advertisement. For example, camera 60 installed on the set top box 38 can be used to take a picture of a viewer's baby. A furniture commercial is then displayed with the baby playing on the floor surrounded by furniture in the advertisement. Similarly, a family dog can be displayed in a dog food commercial or a face from the family can be looking out of a hotel window. The advertisement may also contain the viewer along with their favorite movie star.
The set top box 38 is preferably configured to process real-time video. Video Rewrite technology may be used to build models of a human face. (See, for example, “Video Rewrite: Driving Visual Speech with Audio”, Bregler et al., ACM SIGGRAPH 97.) This technology may be used interactively by requesting the viewer to stand in front of the camera or via passive viewing of the audience using the set top box 38. A dialog box that requests one of the children to stand in front of the camera may be presented to the viewer. Face-id software may be used to identify faces in the room and then extract the faces for later use. (See, for example, “Probabilistic Visual Learning for Object Representation”, B. Moghaddam et al., IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 19, No. 7, July 1997.)
Computer vision may also be used to track what the viewer is doing and tailor the commercial to fit the viewer's activities. This may be as subtle as allowing a spokesperson's eyes to follow the viewer around the room as he goes about his activities. It may also be more direct by inserting the viewer's name or other personalized information into the pitch. In another example, a dog in the advertisement may do something appropriate to get the attention of viewers in the room. The video may be modified so that a spokesperson speaks a viewer's name.
Computer vision technology may also be used to sense where the user is. (See, for example, “A Virtual Mirror Interface Using Real-time Robust Face Tracking”, Darrell et al., Proceedings of the Third International Conference on Face and Gesture Recognition, IEEE Computer Society Press, April 1998, Nara, Japan.) Using technology such as Video Rewrite (or generic computer graphics) images that follow the user around the room can be generated. Facial identification may be used to determine who is in the room and a character in the advertisement may use the viewer's name.
Custom video catalogs may also be used to attract a viewer's attention. The catalogs may be used to get more information about a product so that the consumer can make an informed buying decision. Many advertisers do this today by providing a web address, but interactive television can do much more by giving viewers a chance to spend more time with the product. The advertisements may contain three-dimensional video models of the product which can be turned and stopped by the viewer to see how the product looks from different angles. Technology such as fly-around video may be used, for example. Fly-around video allows a viewer to see a product from all angles. The set top box 38 preferably includes a system such as Apple's QuickTime VR Authoring (QTVR) or similar technology to store video and allow real-time video processing to synthesize product views. A graphical user interface provides the user with control of viewing angles. Apple's QTVR technology, or Fly-Around Video (see, e.g., U.S. patent application Ser. No. 60/152,352, “Head-Tracked Light Field Movies: A New Multimedia Data Type” by G. Miller et al., filed Sep. 7, 1999), allows a user to view an object from many different directions. When a user is browsing through the video catalog, all objects are preferably three-dimensional so that the user can navigate around the object using the remote control or other user interface techniques.
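As a rough sketch of the viewer-controlled rotation, the example below steps through pre-rendered views of a product in response to remote-control arrow presses. The 10-degree step and the view indexing are assumptions; QTVR and Fly-Around Video provide richer view synthesis than this simple lookup.

```python
# Illustrative only: a lookup of pre-rendered views, far simpler than QTVR or
# Fly-Around Video view synthesis. The 10-degree step is an assumption.

class FlyAroundViewer:
    def __init__(self, views, degrees_per_view=10):
        self.views = views             # views[i] is the product rendered at i * degrees_per_view
        self.step = degrees_per_view
        self.angle = 0

    def rotate(self, direction):
        """direction is +1 (right arrow) or -1 (left arrow) on the remote control."""
        self.angle = (self.angle + direction * self.step) % 360
        return self.current_view()

    def current_view(self):
        return self.views[(self.angle // self.step) % len(self.views)]
```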
The system may also be used to put a viewer into a virtual environment. This includes putting an image of the viewer on the couch of a furniture display or draping clothes over the image. The viewer's body size can be captured electronically, subtly slimmed down, and then draped with the advertiser's product. Electric Planet articulated pose estimation technology may be used to see how the person is moving and to move the corresponding video model in real time. (See, for example, U.S. Pat. No. 6,141,463 issued Oct. 31, 2000, by M. Covell et al., “Method and System for Estimating Jointed-Figure Configurations.”)
All references cited above are incorporated herein by reference.
Although the present invention has been described in accordance with the embodiments shown, one of ordinary skill in the art will readily recognize that there could be variations made to the embodiments without departing from the scope of the present invention. Accordingly, it is intended that all matter contained in the above description and shown in the accompanying drawings shall be interpreted as illustrative and not in a limiting sense.
The present application is a continuation of U.S. patent application Ser. No. 13/475,822 filed May 18, 2012 which is a continuation of U.S. patent application Ser. No. 12/793,540 filed Jun. 3, 2010 (now U.S. Pat. No. 8,185,923), which is a continuation of U.S. patent application Ser. No. 09/789,926 filed Feb. 20, 2001, which claims the benefit of U.S. Provisional Patent Application No. 60/185,182, filed Feb. 25, 2000, each of which is incorporated by reference herein in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
3868675 | Firmin | Feb 1975 | A |
4258386 | Cheung et al. | Mar 1981 | A |
4319286 | Hanpachern | Mar 1982 | A |
4390904 | Johnston et al. | Jun 1983 | A |
4546382 | McKenna et al. | Oct 1985 | A |
4602279 | Freeman | Jul 1986 | A |
4750052 | Poppy et al. | Jun 1988 | A |
4750053 | Allen | Jun 1988 | A |
4752834 | Koombes | Jun 1988 | A |
4782401 | Faerber et al. | Nov 1988 | A |
4841291 | Swix et al. | Jun 1989 | A |
4845658 | Gifford | Jul 1989 | A |
4858000 | Lu | Aug 1989 | A |
4907079 | Turner et al. | Mar 1990 | A |
4931865 | Scarampi | Jun 1990 | A |
5105184 | Pirani et al. | Apr 1992 | A |
5151788 | Blum | Sep 1992 | A |
5164992 | Turk et al. | Nov 1992 | A |
5223924 | Strubbe | Jun 1993 | A |
5231494 | Wachob | Jul 1993 | A |
5295064 | Malec et al. | Mar 1994 | A |
5305195 | Murphy | Apr 1994 | A |
5347632 | Filepp et al. | Sep 1994 | A |
5410344 | Graves et al. | Apr 1995 | A |
5422986 | Neely | Jun 1995 | A |
5436637 | Gayraud et al. | Jul 1995 | A |
5440337 | Henderson et al. | Aug 1995 | A |
5446919 | Wilkins | Aug 1995 | A |
5481294 | Thomas et al. | Jan 1996 | A |
5497185 | Dufresne et al. | Mar 1996 | A |
5498002 | Gechter | Mar 1996 | A |
5504518 | Ellis et al. | Apr 1996 | A |
5515098 | Carles | May 1996 | A |
5532735 | Blahut et al. | Jul 1996 | A |
5546071 | Zdunich | Aug 1996 | A |
5550735 | Slade et al. | Aug 1996 | A |
5550928 | Lu et al. | Aug 1996 | A |
5550965 | Gabbe et al. | Aug 1996 | A |
5559549 | Hendricks et al. | Sep 1996 | A |
5572643 | Judson | Nov 1996 | A |
5579055 | Hamilton et al. | Nov 1996 | A |
5583560 | Florin et al. | Dec 1996 | A |
5589892 | Knee et al. | Dec 1996 | A |
5596373 | White et al. | Jan 1997 | A |
5600364 | Hendricks et al. | Feb 1997 | A |
5600573 | Hendricks et al. | Feb 1997 | A |
5608445 | Mischler et al. | Mar 1997 | A |
5619709 | Caid et al. | Apr 1997 | A |
5659350 | Hendricks et al. | Aug 1997 | A |
5661516 | Carles | Aug 1997 | A |
5682195 | Hendricks et al. | Oct 1997 | A |
5704017 | Heckerman et al. | Dec 1997 | A |
5708478 | Tognazzini | Jan 1998 | A |
5717814 | Abecassis | Feb 1998 | A |
5717923 | Dedrick | Feb 1998 | A |
5724091 | Freeman et al. | Mar 1998 | A |
5724424 | Gifford | Mar 1998 | A |
5724472 | Abecassis | Mar 1998 | A |
5724521 | Dedrick | Mar 1998 | A |
5729279 | Fuller | Mar 1998 | A |
5734853 | Hendricks et al. | Mar 1998 | A |
5740549 | Reilly et al. | Apr 1998 | A |
5754939 | Herz et al. | May 1998 | A |
5758258 | Shoff et al. | May 1998 | A |
5761606 | Wolzien | Jun 1998 | A |
5767857 | Neely | Jun 1998 | A |
5767894 | Fuller et al. | Jun 1998 | A |
5768528 | Stumm | Jun 1998 | A |
5771307 | Lu et al. | Jun 1998 | A |
5774170 | Hite et al. | Jun 1998 | A |
5786845 | Tsuria et al. | Jul 1998 | A |
5793409 | Tetsumura et al. | Aug 1998 | A |
5794210 | Goldhaber et al. | Aug 1998 | A |
5798785 | Hendricks et al. | Aug 1998 | A |
5801747 | Bedard | Sep 1998 | A |
5805974 | Hite et al. | Sep 1998 | A |
5812647 | Beaumont et al. | Sep 1998 | A |
5812732 | Dettmer et al. | Sep 1998 | A |
5818512 | Fuller | Oct 1998 | A |
5819284 | Farber et al. | Oct 1998 | A |
5826165 | Echeita et al. | Oct 1998 | A |
5828839 | Moncreiff | Oct 1998 | A |
5835667 | Wactlar et al. | Nov 1998 | A |
5862324 | Collins | Jan 1999 | A |
5870151 | Korber | Feb 1999 | A |
5872588 | Aras et al. | Feb 1999 | A |
5872850 | Klein et al. | Feb 1999 | A |
5873068 | Beaumont et al. | Feb 1999 | A |
5874986 | Gibbon et al. | Feb 1999 | A |
5877755 | Hellhake | Mar 1999 | A |
5892535 | Allen et al. | Apr 1999 | A |
5892554 | DiCicco et al. | Apr 1999 | A |
5892691 | Fowler | Apr 1999 | A |
5900919 | Chen et al. | May 1999 | A |
5903816 | Broadwin et al. | May 1999 | A |
5907322 | Kelly et al. | May 1999 | A |
5913040 | Rakavy et al. | Jun 1999 | A |
5915243 | Smolen | Jun 1999 | A |
5917553 | Honey et al. | Jun 1999 | A |
5918014 | Robinson | Jun 1999 | A |
5926207 | Vaughan et al. | Jul 1999 | A |
5929849 | Kikinis | Jul 1999 | A |
5933150 | Ngo et al. | Aug 1999 | A |
5933811 | Angles et al. | Aug 1999 | A |
5945988 | Williams et al. | Aug 1999 | A |
5946646 | Schena et al. | Aug 1999 | A |
5948061 | Merriman et al. | Sep 1999 | A |
5953076 | Astle et al. | Sep 1999 | A |
5959623 | van Hoff et al. | Sep 1999 | A |
5966120 | Arazi et al. | Oct 1999 | A |
5974398 | Hanson et al. | Oct 1999 | A |
5977964 | Williams et al. | Nov 1999 | A |
5990927 | Hendricks et al. | Nov 1999 | A |
6002393 | Hite et al. | Dec 1999 | A |
6002833 | Abecassis | Dec 1999 | A |
6005564 | Ahmad et al. | Dec 1999 | A |
6006197 | d'Eon et al. | Dec 1999 | A |
6006257 | Slezak | Dec 1999 | A |
6008802 | Iki et al. | Dec 1999 | A |
6011895 | Abecassis | Jan 2000 | A |
6012051 | Sammon et al. | Jan 2000 | A |
6020883 | Herz et al. | Feb 2000 | A |
6020931 | Bilbrey et al. | Feb 2000 | A |
6026369 | Capek | Feb 2000 | A |
6029045 | Picco et al. | Feb 2000 | A |
6034652 | Freiberger et al. | Mar 2000 | A |
6036601 | Heckel | Mar 2000 | A |
6038367 | Abecassis | Mar 2000 | A |
6044376 | Kurtzman, II | Mar 2000 | A |
6052492 | Bruckhaus | Apr 2000 | A |
6052554 | Hendricks et al. | Apr 2000 | A |
6075551 | Berezowski et al. | Jun 2000 | A |
6100941 | Dimitrova et al. | Aug 2000 | A |
6104425 | Kanno | Aug 2000 | A |
6112192 | Capek | Aug 2000 | A |
6141010 | Hoyle | Oct 2000 | A |
6141463 | Covell et al. | Oct 2000 | A |
6160570 | Sitnik | Dec 2000 | A |
6169542 | Hooks et al. | Jan 2001 | B1 |
6177931 | Alexander et al. | Jan 2001 | B1 |
6208386 | Wilf et al. | Mar 2001 | B1 |
6237022 | Bruck et al. | May 2001 | B1 |
6240555 | Shoff et al. | May 2001 | B1 |
6243104 | Murray | Jun 2001 | B1 |
6282713 | Kitsukawa et al. | Aug 2001 | B1 |
6286005 | Cannon | Sep 2001 | B1 |
6314569 | Chernock et al. | Nov 2001 | B1 |
6324519 | Eldering | Nov 2001 | B1 |
6351265 | Bulman | Feb 2002 | B1 |
6357043 | Ellis et al. | Mar 2002 | B1 |
6438751 | Voyticky et al. | Aug 2002 | B1 |
6457010 | Eldering et al. | Sep 2002 | B1 |
6484148 | Boyd | Nov 2002 | B1 |
6519769 | Hopple et al. | Feb 2003 | B1 |
6526215 | Hirai et al. | Feb 2003 | B2 |
6560281 | Black et al. | May 2003 | B1 |
6570499 | Kaganer | May 2003 | B2 |
6574793 | Ngo et al. | Jun 2003 | B1 |
6597405 | Iggulden | Jul 2003 | B1 |
6615408 | Kaiser et al. | Sep 2003 | B1 |
6681393 | Bauminger et al. | Jan 2004 | B1 |
6684194 | Eldering et al. | Jan 2004 | B1 |
6698020 | Zigmond et al. | Feb 2004 | B1 |
6708335 | Ozer et al. | Mar 2004 | B1 |
6735776 | Legate | May 2004 | B1 |
6750880 | Freiberger et al. | Jun 2004 | B2 |
6788314 | Freiberger et al. | Sep 2004 | B1 |
6906732 | Li et al. | Jun 2005 | B1 |
6937658 | Suito et al. | Aug 2005 | B1 |
6968565 | Slaney et al. | Nov 2005 | B1 |
6993245 | Harville | Jan 2006 | B1 |
7134130 | Thomas | Nov 2006 | B1 |
7134132 | Ngo et al. | Nov 2006 | B1 |
7209631 | Tada et al. | Apr 2007 | B2 |
7272295 | Christopher | Sep 2007 | B1 |
7348935 | Freiberger et al. | Mar 2008 | B1 |
7409437 | Ullman et al. | Aug 2008 | B2 |
7661116 | Slaney et al. | Feb 2010 | B2 |
7720432 | Colby et al. | May 2010 | B1 |
7765574 | Maybury et al. | Jul 2010 | B1 |
7778519 | Harville | Aug 2010 | B2 |
7844985 | Hendricks et al. | Nov 2010 | B2 |
20020046084 | Steele et al. | Apr 2002 | A1 |
20020062481 | Slaney et al. | May 2002 | A1 |
20030055831 | Ryan et al. | Mar 2003 | A1 |
20030110499 | Knudson et al. | Jun 2003 | A1 |
20030200128 | Doherty | Oct 2003 | A1 |
20040194131 | Ellis et al. | Sep 2004 | A1 |
20090210902 | Slaney et al. | Aug 2009 | A1 |
20100242063 | Slaney et al. | Sep 2010 | A1 |
20100281499 | Harville | Nov 2010 | A1 |
20130014157 | Harville | Jan 2013 | A1 |
Number | Date | Country |
---|---|---|
2054331 | Feb 1990 | JP |
403179895 | Aug 1991 | JP |
403184484 | Aug 1991 | JP |
403229597 | Oct 1991 | JP |
4051628 | Feb 1992 | JP |
05037870 | Feb 1993 | JP |
406303569 | Oct 1994 | JP |
7507169 | Aug 1995 | JP |
408235676 | Sep 1996 | JP |
9269923 | Oct 1997 | JP |
409284706 | Oct 1997 | JP |
411008835 | Jan 1999 | JP |
611261909 | Sep 1999 | JP |
WO-9319427 | Sep 1993 | WO |
WO-9413107 | Jun 1994 | WO |
WO-9416442 | Jul 1994 | WO |
WO-9515649 | Jun 1995 | WO |
WO 9515658 | Jun 1995 | WO |
WO-9624115 | Aug 1996 | WO |
WO-9630864 | Oct 1996 | WO |
WO-9700494 | Jan 1997 | WO |
WO-9700581 | Jan 1997 | WO |
WO-9700582 | Jan 1997 | WO |
WO-9741683 | Nov 1997 | WO |
WO-9824242 | Jun 1998 | WO |
WO-9824243 | Jun 1998 | WO |
WO-9828906 | Jul 1998 | WO |
WO-9904561 | Jan 1999 | WO |
WO-9938320 | Jul 1999 | WO |
WO-9945702 | Sep 1999 | WO |
WO-9952285 | Oct 1999 | WO |
WO-9955066 | Oct 1999 | WO |
WO-9960789 | Nov 1999 | WO |
WO-0022818 | Apr 2000 | WO |
WO-0033160 | Jun 2000 | WO |
WO-0033163 | Jun 2000 | WO |
WO-0033228 | Jun 2000 | WO |
WO-0033233 | Jun 2000 | WO |
Entry |
---|
U.S. Appl. No. 13/475,822, filed May 18, 2012, Slaney et al. |
“About GAIN Ad Vehicles,” http://www.gainpublishing.com/about, pp. 1-2 [accessed Apr. 26, 2004]. |
“Double Click Press Kit,” www.doubleclick.com, 2 pages, Feb. 22, 2000. |
“GAIN—Support Center,” http://www.gainpublishing.com/help/gainfaq.html, pp. 1-5 [accessed Apr. 26, 2004]. |
“GAIN Publishing-Software,” http://www.gainpublishing.com/software, pp. 1-3 [Internet accessed Apr. 26, 2004]. |
“General Instrument & ACTV to Offer a Complete Solution for Addressable, Targeted Digital Television Advertising,” ACTV and The BOX Music Network, Press Release, www.actv.com/newpage/press/actvgiad.html, Jun. 14, 1999, 3 pages. |
“MIT Media Lab's Hypersoap uses hyperlinks to mix shopping, entertainment,” MIT News, Nov. 9, 1998, 3 pgs. [Internet accessed on Nov. 23, 1999]. |
“Scientific-Atlanta's Explorer 2000 Advanced Digital Set-Top Will Support ACTV's ‘Inidvidualized Television’” ACTV and Scientific-Atlanta, Press Release, www.actv.com/newpage/press/actvsatl.html, Jan. 25, 1999, 3 pages. |
“What is Wink: How wink works,” www.wink.com, 3 pages, Jan. 1, 2000. |
Adauction.com, http://web.archive.org/web/20000302051902/http://www.adauction.com, pp. 1-2, internet archive date of Mar. 2, 2000 [accessed Apr. 6, 2007]. |
AsSeenIn.com, http://www.asseenin.com/asseenin/infor, 3 pgs. [Internet accessed on Jan. 12, 2000]. |
Bove, M., et al., “Adding Hyperlinks to Digital Television,” MIT Media Laboratory, Proc. 140th SMPTE Technical Conference, 1998, 11 pages. |
Bregler et al., “Video Rewrite: Driving Visual Speech with Audio, Interval Research Corporation,” Abstract and ACM SIGGRAPH 97 Paper, 10 pages. |
CLARIA, Products and Services Overview, http://www.clairia.com/products/index.html, pp. 1-3 [Internet accessed Apr. 26, 2004]. |
Darrell et al., “A Virtual Mirror Interface using Real-time Robust Face Tracking,” Proceedings of the Third International Conference on Face and Gesture Recognition, Apr. 1998, IEEE Computer Society Press, Nara, Japan, 20 pages. |
Delio, M., “TV Commercials Get Personal,” Wired News, Sep. 20, 2000, www.wired.com/news/print/0.1294.38754.00.html, 3 pages. |
Ebert, R., “The Incredible Shrinking Media Lab,” Y-Life: Roger Ebert-Critical Eye, Mar. 1999, http://www.zdnet.com, 3 pgs. [Internet accessed on Nov. 23, 1999]. |
Fujikawa, “Subrogation of User Account by Advertisement”, Nikkei Communications, Nikkei BP Company, Mar. 4, 1996, Japan, No. 217, p. 54. |
Gomes, “Upstart's Internet ‘TV’ Has Microsoft Tuned in,” Wall Street Journal, Aug. 1996, 1 page. |
Koenen, R., “Overview of the MPEG-4 Standard,” International Organisation for Standardisation Organisation Internatioanale De Normalisation ISO/IEC JTC1/SC29/WG11 Coding of Moving Pictures and Audio, ISO/IEC JTC1/SC29/WG11 N4030, Mar. 2001, http://www.cselt.it/mpeg/standars/mpeg-4/mpeg-4.htm, pp. 1-78. |
Letter of Interrogation for Japanese Application No. 1997-533741, Mail Date Jun. 16, 2009, 7 pages. |
Lyon, R.F., “The Optical Mouse, and an Architectural Methodology for Smart Digital Sensors,” Xerox PARC, VLSI-81-1, Aug. 1981, 38 pages. |
Moghaddam et al., “Research Index: Probabilistic Visual learning for Object Detection,” http://citeseer.nj.nec.com/moghaddam95probabilistic.html, 1995, 3 pages. |
Moghaddam, B. et al., “Abstract: Probabilistic Visual learning for Object Detection,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 19, No. 7, Jul. 1997, www.computer.org/tpami/tp1997/i0696abs.htm, 1 page. |
Rigdon, “Screen Savers Go Beyond Fish, Flying Toasters,” Wall Street Journal, Feb. 13, 1996, 1 page. |
Staff Reporter, “PointCast Inc. is Testing New Screen-Saver Product,” Wall Street Journal, May 1996, 1 page. |
Sutton et al., “Reinforcement Learning: An Introduction,” http://www-anw.cs.umass.edu/˜rich/book/the-book.html, MIT Press, Cambridge, MA, 1998, 4 pages. |
Tokuda, “To Make the Internet Free—Mr. Y. Itakura (president of Hyper-net company)”, Nikkei Business Publications, Inc., Mar. 11, 1996, Japan, No. 831, p. 136-140. |
Number | Date | Country | |
---|---|---|---|
20130086608 A1 | Apr 2013 | US |
Number | Date | Country | |
---|---|---|---|
60185182 | Feb 2000 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 13475822 | May 2012 | US |
Child | 13619661 | US | |
Parent | 12793540 | Jun 2010 | US |
Child | 13475822 | US | |
Parent | 09789926 | Feb 2001 | US |
Child | 12793540 | US |