This application is related to U.S. patent application Ser. No. 12/683,737, filed on Jan. 7, 2010 and entitled OFFERING ITEMS IDENTIFIED IN A MEDIA STREAM, which is incorporated herein by reference in its entirety.
While consuming a content stream, users may view or hear scores of advertisements or other pieces of content relating to an array of different items. For instance, a user watching a video stream, such as broadcast television, may view an advertisement for a particular product that he or she may wish to obtain (e.g., purchase) or learn more about. To do so, however, the user must typically either go to a brick-and-mortar store that offers the desired product or use a computer to locate information about the product or a merchant that offers the product. Both of these options require the user to direct his or her attention away from the object currently of interest, namely the video stream that is currently being broadcast and consumed by the user in this example.
The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different figures indicates similar or identical items.
Overview
This disclosure is directed, in part, to techniques to identify items within a content stream and output information pertaining to these items. In some instances, this information may be output with the content stream. For instance, the techniques may monitor an audio and/or video stream to identify products, geographical locations, particular people, or any other item of interest. In response to identifying an item of interest, the techniques obtain an instruction that determines what type of information to output with the content stream.
With these techniques, in one example a video stream being displayed on an output device (e.g., a television, a laptop computer, etc.) may include an advertisement for an item offered for acquisition (e.g., purchase, rent, lease, etc.). A device connected to the output device may monitor the output video stream to identify this advertisement and/or an item associated with this advertisement. In one example, a content provider that provides the content stream may have embedded a content identifier (ID) within the content stream corresponding to the advertisement. For instance, the content provider may have embedded a barcode or a universal product code (UPC) associated with the advertised item within the portion of the content stream corresponding to the advertisement.
The device that monitors the video stream may identify this barcode or UPC and, in response, may access a database mapping this barcode or UPC to the corresponding item and potentially to an instruction. The instruction may direct the device to take a certain action in response to locating this particular item. For instance, the instruction may direct the device to determine a current price or availability of the item and to display this price or availability over the content stream. The instruction may also instruct the device to display a selectable icon over the content stream that, when selected, allows the user to obtain more information about the item or that allows the user to purchase the item from an offering service that offers the item. In this regard, the connected device may implement the techniques described in U.S. patent application Ser. No. 12/683,737, incorporated by reference in its entirety above.
In response to mapping the content ID to an item and an instruction, the connected device may obtain the information according to the instruction. For instance, the connected device may request a current price or availability for the item from an offering service that offers the item. After receiving this information, the connected device may then display this price or availability along with a selectable icon to obtain the item or additional information about the item. Therefore, when a user views the video stream and the described advertisement is displayed, the user also views the current price or availability of the advertised item and an icon that, when selected, initiates acquisition of the item or provides the user with additional information regarding the item.
In some instances, the content provider does not explicitly embed a content ID into the content stream. In these instances, the connected device may identify the content (and any items associated with the content) using other recognition techniques. For instance, the device or another entity may continually or periodically analyze frames of the content stream to identify a unique audio and/or visual fingerprint associated with the content. Once the device or another entity identifies a unique fingerprint, the device or the other entity may access the database discussed above that maps this identified piece of content (e.g., an advertisement, etc.) to a particular item and an instruction. Also as described above, the connected device may then display or otherwise output the information pertaining to the item in accordance with the instruction.
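As one hedged illustration of the kind of fingerprinting described above, the following sketch computes a simple average-hash over a single grayscale video frame and measures how far two fingerprints differ. The 2D-list frame representation, the 8x8 reduction, and the bit-string form of the fingerprint are illustrative assumptions rather than the specific recognition technique contemplated by this disclosure.

```python
# A minimal sketch of computing a visual fingerprint for a video frame.
# The frame is assumed to be a 2D list of grayscale pixel values (0-255);
# the 8x8 average-hash used here stands in for whatever fingerprinting
# scheme the connected device or a recognition service actually employs.

def average_hash(frame, hash_size=8):
    """Reduce the frame to hash_size x hash_size blocks and threshold on the mean."""
    rows, cols = len(frame), len(frame[0])
    row_step, col_step = rows // hash_size, cols // hash_size

    # Downsample by averaging each block of pixels.
    blocks = []
    for r in range(hash_size):
        for c in range(hash_size):
            block = [
                frame[r * row_step + i][c * col_step + j]
                for i in range(row_step)
                for j in range(col_step)
            ]
            blocks.append(sum(block) / len(block))

    mean = sum(blocks) / len(blocks)
    # Each bit records whether a block is brighter than the frame average.
    return ''.join('1' if b > mean else '0' for b in blocks)


def hamming_distance(fp_a, fp_b):
    """Number of differing bits; small distances indicate matching content."""
    return sum(a != b for a, b in zip(fp_a, fp_b))
```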
In still other instances, the connected device may provide one or more frames of the content stream to a human evaluation group for identifying any items therein. Each user of the human evaluation group may provide his or her input as to the identity of any items within the content stream. The connected device or another entity may then use these identifications to make a determination of any items within the content stream. Again, the connected device may then map these item(s) to an instruction directing the device to output certain information with the content stream.
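One plausible way to reduce the evaluators' individual answers to a single identification is a simple majority vote, sketched below under the assumption that each evaluator submits a free-form item identifier; the agreement threshold is likewise an assumption rather than a prescribed aggregation rule.

```python
from collections import Counter

def consolidate_identifications(responses, min_agreement=0.5):
    """Pick the item identified most consistently by the human evaluation group.

    responses: list of item identifiers submitted by individual evaluators,
               e.g. ["camera4", "camera4", "tripod1"] (hypothetical names).
    Returns the winning identifier, or None if agreement is too weak.
    """
    if not responses:
        return None
    item, votes = Counter(responses).most_common(1)[0]
    return item if votes / len(responses) >= min_agreement else None


# Example: two of three evaluators identify the digital camera.
print(consolidate_identifications(["camera4", "camera4", "tripod1"]))  # camera4
```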
In still other instances, users that consume the content stream (e.g., users watching a video stream) may provide identifications of items within the content stream, either as they consume the stream or as feedback to information presented on the content stream by the connected device. Again, the connected device may use this information in future instances of identifying items within the content stream.
The techniques described above and below may be implemented in a number of ways and in a number of contexts. Several example implementations and contexts are provided with reference to the following figures, as described below in more detail. However, the following implementations and contexts are but a few of many. For instance, and as discussed above, these techniques apply to a variety of content output devices and for a variety of content streams, including audio streams, video streams and any other form of content stream.
Illustrative Architecture
The content provider 102 may comprise a cable television provider, a satellite television provider, a satellite radio provider, or any other type of provider capable of producing, creating, and/or providing a content stream to a content output device 104. In the illustrated example, a connected device 108 monitors the content stream to detect when items are displayed on the content output device 104, as introduced above and described in detail below. Although illustrated as separate from the content output device 104, the connected device 108 may be integral with the content output device 104 in other implementations. Furthermore, the connected device 108 may comprise a set-top box, a game console, a media center, or any other computing device that receives the content stream that is displayed or otherwise output on the content output device 104.
Generally, the connected device 108 monitors the content stream being output on the content output device 104 to identify items within the stream. In response, the connected device 108 displays or otherwise outputs information pertaining to these items on the content output device 104. In the illustrated example, the connected device 108 identifies that the content stream includes an item 112 comprising a digital camera. In response, the connected device 108 displays information 114 pertaining to the item. Here, this information 114 includes a current price of the camera at a particular merchant, as well as a selectable icon that in this example allows a user to receive more information regarding the illustrated camera.
In this example, the content stream being monitored by the connected device 108 and being output by the content output device 104 includes a content identifier (ID) 116. Here, the content ID comprises a barcode that identifies the illustrated item 112. In other instances, the content ID may comprise a universal product code (UPC) or any other unique identifier associated with the item 112. In still other instances, the content ID may comprise a unique trace or fingerprint of the audio and/or visual portions of the content stream. This unique trace or fingerprint may comprise a series of sounds, colors, and/or the like that is unique to a particular portion of content, such as an advertisement or the like.
As illustrated, the connected device 108 includes one or more processors 118 and memory 120, which stores a monitoring module 122, a mapping module 124, and an output module 126.
The monitoring module 122 functions to monitor the content stream provided by the content provider 102 and output by the content output device 104. The monitoring module 122 may continuously monitor this stream in order to attempt to locate a content ID, or the monitoring module 122 may periodically monitor this stream. For instance, the module 122 may periodically take screenshots of a video stream in order to analyze these screenshots for the presence of one or more content IDs.
In some instances, the content provider 102 inserts the content IDs into the content stream. The content provider 102 may insert these IDs in a consistent and pre-specified area of the stream (e.g., in a top-right corner of the frames), in which case the monitoring module 122 may be configured to monitor this particular area of the stream.
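To make this monitoring concrete, the following sketch periodically samples the stream, crops the pre-specified area of each frame, and hands that area to a detector. The capture_frame and detect_content_id callables are hypothetical stand-ins for the device's actual frame grabber and barcode/UPC decoder.

```python
import time

def crop_region(frame, top, left, height, width):
    """Extract the pre-specified area of the frame (e.g., the top-right corner)."""
    return [row[left:left + width] for row in frame[top:top + height]]


def monitor_stream(capture_frame, detect_content_id, region, interval_s=1.0):
    """Periodically sample the stream and scan only the agreed-upon region.

    capture_frame: callable returning the current frame as a 2D pixel list
                   (a hypothetical stand-in for the device's frame grabber).
    detect_content_id: callable returning a content ID string or None
                       (a stand-in for a barcode/UPC decoder).
    region: (top, left, height, width) of the area the provider uses for IDs.
    """
    while True:
        frame = capture_frame()
        corner = crop_region(frame, *region)
        content_id = detect_content_id(corner)
        if content_id is not None:
            return content_id
        time.sleep(interval_s)
```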
The mapping module 124, meanwhile, functions to map an identified content ID to one or more particular items, as well as to one or more instructions that direct the connected device 108 to output certain types of information. In some instances, the mapping module 124 provides a located content ID to an offering service 128. As illustrated, the offering service 128 includes an electronic catalog 130 of one or more digital and/or physical items 132(1), 132(2), . . . , 132(P). The offering service 128 may offer some or all of these items 132(1)-(P) for acquisition (e.g., purchase, download, etc.).
When the offering service 128 receives a content ID from the mapping module 124, the offering service 128 may attempt to determine the item associated with the ID (e.g., the barcode, the UPC, etc.). If successful, the offering service 128 may return the identification of this item to the mapping module 124 of the connected device 108.
In instances where the content provider 102 has not inserted a barcode, UPC or the like into the content stream, the monitoring module 122 may send a portion (e.g., a frame) of the content stream to the offering service 128. If the offering service 128 is unable to identify an item within the received portion of the content stream (e.g., with reference to a unique fingerprint of the content), the offering service may provide this portion of the stream to a human evaluation group 134. One or more users of the human evaluation group 134 may analyze this portion of the stream to identify one or more items therein. The offering service may receive the responses of this group and may provide, to the monitoring module 122, the identity of an item most consistently identified by the group 134.
After receiving an identity of an item from the offering service 128, or in lieu of sending the content ID to the offering service 128 at all, the mapping module 124 accesses a database 136 to determine (1) an identity of an item referenced by a content ID (if not already received), and (2) an instruction. As illustrated, the database 136 includes a table 138 that maps content IDs to items and instructions. For instance, the table 138 may map a particular barcode number (“2234466”) to a particular item (“camera4,” the illustrated camera in this example) and an instruction (“Price/Info”). The instruction directs the connected device 108 to output certain types of information based on the item.
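A minimal in-memory stand-in for the table 138 might look like the following sketch. The example row is taken from the description above (barcode "2234466" mapping to "camera4" and the "Price/Info" instruction), while the dictionary-based data structure itself is only an assumption.

```python
from typing import NamedTuple, Optional

class Mapping(NamedTuple):
    item_id: str       # identifier of the item in the electronic catalog
    instruction: str   # what the connected device should output for the item

# Illustrative contents of table 138: barcode "2234466" maps to the camera
# and to an instruction to show its price plus a "more info" icon.
TABLE_138 = {
    "2234466": Mapping(item_id="camera4", instruction="Price/Info"),
}

def map_content_id(content_id: str) -> Optional[Mapping]:
    """Resolve a located content ID to an item and an output instruction."""
    return TABLE_138.get(content_id)
```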
In the illustrated example, for instance, the table 138 directs the connected device to determine and output a price of the camera, as well as an icon that allows a user to receive more information regarding the camera. The instruction may additionally or alternatively instruct the connected device to output specifications of the item, availability of the item, options associated with the item (e.g., colors, sizes, etc.), an option to acquire the item, or any other information pertaining to the item. The table may also instruct the connected device 108 to output an icon that allows the user to obtain the item, to add the item to a cart of the user, or the like.
In the instant example, the mapping module 124 maps the content ID to the camera and an instruction directing the connected device 108 to output a current price of the camera and an icon to allow the user to receive more information regarding the camera. In response, the mapping module 124 queries the offering service 128 for a current price of the camera.
After determining the item within the content stream and the information pertaining to the item, the output module 126 outputs this information with the content stream. For instance, the output module 126 may cause display of this information on the content output device 104. Here, for instance, the output module 126 causes display of the current price of the illustrated camera ($199) and an icon that, when selected, allows a user to receive more information regarding the camera from the manufacturer of the camera, an offering service (e.g., a merchant) that offers the camera, or the like.
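One way the output module 126 might package this information before handing it to the device's graphics layer is sketched below; the overlay structure, the icon label, and the action name are illustrative assumptions.

```python
def build_overlay(item_name, price, instruction):
    """Assemble the on-screen information the output module will render.

    Returns a simple dict describing the text and selectable icons; an
    actual implementation would pass this to the device's display pipeline.
    """
    overlay = {"item": item_name, "lines": [], "icons": []}
    if "Price" in instruction:
        overlay["lines"].append(f"${price:.2f}")
    if "Info" in instruction:
        overlay["icons"].append({"label": "More Info", "action": "show_details"})
    return overlay

# Example matching the illustrated camera.
print(build_overlay("camera4", 199.00, "Price/Info"))
```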
By monitoring the content stream output by the content output device 104, the connected device 108 is able to provide information pertaining to items within the content stream on a real-time basis. For instance, while a user views broadcast television, the user may also view information pertaining to items advertised or otherwise displayed within the stream. The user may also potentially learn more information about the item or request to obtain the item, depending upon the icons displayed by the connected device 108.
Illustrative Flow Diagrams and User Interfaces
The process 200 is illustrated as a collection of acts, each individually performed by a particular actor described above.
The process 200 includes, at 202, the content provider 102 inserting a content ID into a content stream. This may include, for instance, embedding a barcode, a universal product code (UPC), or the like into the content. When embedding a content ID into a visual portion of the content stream, the provider 102 may embed this content ID into the portion of the content stream that is viewable to a user or in the portion that is outside the viewable region of the content.
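On the provider side, embedding might amount to stamping a block of pixels that encodes the content ID into an agreed-upon corner of each frame of the advertisement, as in this hedged sketch; the pixel-level representation is purely illustrative, and the id_pixels block is assumed to come from a separate barcode renderer.

```python
def embed_content_id(frame, id_pixels, top=0, left=None):
    """Stamp a small block of pixels encoding the content ID into the frame.

    frame: 2D list of grayscale pixels for one frame of the advertisement.
    id_pixels: 2D list of pixels encoding the barcode/UPC (assumed to be
               produced elsewhere by a barcode renderer).
    By default the block is placed in the top-right corner, matching the
    pre-specified region the monitoring module watches.
    """
    if left is None:
        left = len(frame[0]) - len(id_pixels[0])
    for r, row in enumerate(id_pixels):
        for c, value in enumerate(row):
            frame[top + r][left + c] = value
    return frame
```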
At 208, the content provider 102 provides the content stream to a collection of devices. For instance, the content provider 102 may broadcast a live stream of content to these devices. The dotted line indicates that the content provider 102 may continue to broadcast the content stream during the process 200. At 210, the connected device 108 receives the content stream and causes display of the content on the content output device 104. At 212, the content output device displays the content. At 214, the connected device 108 monitors the content stream for the purpose of identifying items therein. While the process 200 illustrates that the connected device 108 begins monitoring the stream after causing display of the stream, the device 108 may begin monitoring the stream prior to, or simultaneously with, causing display of this stream.
At 216, the connected device 108 identifies a content ID, such as a barcode, a UPC, or the like. In this example, the device 108 has identified the content ID associated with the digital camera currently displayed on the content output device 104. At 218, the connected device 108 maps the identified content ID to an item and an instruction. This may include providing the content ID to the database 136 to determine the item and instruction. At 220, the database 136 receives the content ID.
To obtain the information specified by the instruction, the connected device 108 sends a request for a price to an offering service 128 that offers the particular item (potentially identified with the received instruction, originally registered by the content provider 102 at 204). At 228, the offering service 128 receives the request and returns the information. Here, the offering service 128 returns the price of the digital camera ($199). At 230, the connected device 108 receives this information and, at 232, causes display of this information and the icon to allow a user to receive additional information about the camera.
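The price exchange described here could be as simple as the request/response pair sketched below. Both sides are stubbed in-process for illustration; the catalog data and the shape of the response are assumptions, not an actual offering-service API.

```python
# A hedged sketch of the price exchange between the connected device and the
# offering service. Both sides run in-process here; a real deployment would
# use whatever network interface the offering service actually exposes.

CATALOG_PRICES = {"camera4": 199.00}  # illustrative catalog data

def handle_price_request(item_id):
    """Offering-service side: look up and return the item's current price."""
    price = CATALOG_PRICES.get(item_id)
    return {"item": item_id, "price": price, "available": price is not None}

def request_price(item_id, send=handle_price_request):
    """Connected-device side: ask the offering service for current pricing."""
    response = send(item_id)
    return response["price"] if response["available"] else None

print(request_price("camera4"))  # 199.0
```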
At 234, the content output device 104 displays the information, which here comprises the price and the selectable icon. At this point, the process may implement techniques described in U.S. patent application Ser. No. 12/683,737, incorporated by reference in its entirety above.
The process 600 includes the content provider 102 broadcasting the content stream at 602. At 604, the connected device 108 receives the content stream and causes display of the content stream on the content output device 104. At 606, the content output device 104 outputs the content (here, a video stream). At 608, the connected device 108 monitors the content stream. In this instance, this may include taking screenshots of the content stream and sending these screenshots to the offering service 128 or to another entity for analysis. In some instances, a user consuming the stream may provide an instruction that triggers the sending of the screenshots. At 610, the connected device 108 sends a portion (e.g., one or more screenshots or frames) to the offering service 128.
At 612, the offering service 128 receives the portion of the content stream and, at 614, the offering service queries as to whether the portion of the content stream includes a known item. For instance, the offering service 128 may determine whether the offering service 128 has previously associated the received portion of the content stream with an item from the electronic catalog 130. The offering service 128 may make this determination by identifying a unique trace or fingerprint (audio and/or visual) and comparing this unique fingerprint to a database of known fingerprints.
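Building on the fingerprinting sketch in the Overview, the offering service's comparison might reduce to a nearest-fingerprint search over its database of known fingerprints, with a distance threshold deciding whether any catalog item is a match. The threshold value and the dictionary-based database layout below are assumptions.

```python
def match_fingerprint(fingerprint, known_fingerprints, max_distance=6):
    """Find the catalog item whose stored fingerprint is closest to the query.

    known_fingerprints: dict mapping item identifiers to bit-string
                        fingerprints previously computed for catalog content.
    Returns the best-matching item, or None if nothing is close enough.
    """
    def hamming(a, b):
        return sum(x != y for x, y in zip(a, b))

    best_item, best_dist = None, max_distance + 1
    for item_id, stored in known_fingerprints.items():
        dist = hamming(fingerprint, stored)
        if dist < best_dist:
            best_item, best_dist = item_id, dist
    return best_item if best_dist <= max_distance else None
```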
If the offering service 128 determines that the portion of the content does contain a known item, then the offering service identifies the item and the associated instruction at 616. The process 600 then proceeds to 632. If, however, the offering service 128 does not recognize the received portion of content, then the offering service 128 may send the portion of the content stream to the human evaluation group 134 at 618.
Illustrative Processes
The process 800 includes monitoring, at operation 802, a video stream being output by a content output device, such as a television, a laptop computer, a personal computing device, a personal media player, a portable digital assistant, an electronic book reader device, or any other type of device. Operation 804 represents identifying a content identifier (ID) within the video stream. For instance, the connected device 108 may identify a barcode, a UPC, or may identify a unique fingerprint of the content.
The process 800 further includes, at operation 806, mapping this content ID to an item and an instruction. An operation 808 then determines information to display with the content stream based on the mapped instruction. The process 800 then concludes with an operation 810, which causes display of the determined information on the content output device.
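Pulling the earlier sketches together, process 800 might be orchestrated roughly as follows. Every helper referenced here (monitor_stream, map_content_id, request_price, build_overlay) is one of the hypothetical functions sketched earlier in this description, not part of any actual device firmware.

```python
def process_800(capture_frame, detect_content_id, region):
    """A rough pass over operations 802-810, composed from the earlier sketches."""
    # 802/804: monitor the video stream and identify a content ID within it.
    content_id = monitor_stream(capture_frame, detect_content_id, region)

    # 806: map the content ID to an item and an instruction.
    mapping = map_content_id(content_id)
    if mapping is None:
        return None  # unrecognized content ID; nothing to display

    # 808: determine the information to display (here, the current price).
    price = request_price(mapping.item_id)
    if price is None:
        return None  # item not currently offered; skip the overlay

    # 810: cause display of the determined information on the output device.
    return build_overlay(mapping.item_id, price, mapping.instruction)
```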
An operation 906 then determines information pertaining to the item to output on the content output device 104. For instance, this may include determining an instruction associated with the item (e.g., display item price, options, availability, etc.) and then querying an offering service for this information. Finally, an operation 908 displays the determined information.
In some instances, users that consume the content stream may provide feedback as to the identity of the item. For instance, operation 908 may display a selectable icon that allows users to agree or disagree with the identification of the item (e.g., “Did we correctly identify this item? Yes or No”). In response to a threshold number of users providing feedback that the item was incorrectly identified, the identity of the item may be changed or updated at operation 910. Conversely, consistent feedback indicating that the item has been correctly identified may strengthen the confidence in the identification of the item.
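The feedback loop described here could be modeled with a simple per-item tally and a disagreement threshold, as in the sketch below; the threshold value and the in-memory store are assumptions.

```python
class FeedbackTracker:
    """Track viewer feedback on an item identification and flag re-review.

    A threshold number of "incorrectly identified" votes triggers updating
    the identification; "correct" votes simply strengthen confidence.
    """

    def __init__(self, disagreement_threshold=10):
        self.threshold = disagreement_threshold
        self.votes = {}  # item_id -> {"correct": int, "incorrect": int}

    def record(self, item_id, correctly_identified):
        tally = self.votes.setdefault(item_id, {"correct": 0, "incorrect": 0})
        tally["correct" if correctly_identified else "incorrect"] += 1

    def needs_reidentification(self, item_id):
        tally = self.votes.get(item_id, {"incorrect": 0})
        return tally["incorrect"] >= self.threshold
```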
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as exemplary forms of implementing the claims.
Number | Name | Date | Kind |
---|---|---|---|
5249044 | Von Kohorn | Sep 1993 | A |
5872588 | Aras | Feb 1999 | A |
5880769 | Nemirofsky et al. | Mar 1999 | A |
5929849 | Kikinis | Jul 1999 | A |
5978013 | Jones et al. | Nov 1999 | A |
6282713 | Kitsukawa et al. | Aug 2001 | B1 |
6438751 | Voyticky | Aug 2002 | B1 |
6490725 | Kikinis | Dec 2002 | B2 |
7001279 | Barber et al. | Feb 2006 | B1 |
7058963 | Kendall | Jun 2006 | B2 |
7150028 | Ranta | Dec 2006 | B1 |
7158676 | Rainsford | Jan 2007 | B1 |
7231651 | Pong | Jun 2007 | B2 |
7237252 | Billmaier | Jun 2007 | B2 |
7269837 | Redling | Sep 2007 | B1 |
7346917 | Gatto et al. | Mar 2008 | B2 |
7383209 | Hudetz | Jun 2008 | B2 |
7577979 | Feinleib et al. | Aug 2009 | B2 |
7793316 | Mears et al. | Sep 2010 | B2 |
7856644 | Nicholson et al. | Dec 2010 | B2 |
8068781 | Ilan et al. | Nov 2011 | B2 |
8160840 | Caruso | Apr 2012 | B2 |
8213916 | Yankovich et al. | Jul 2012 | B1 |
8601504 | Stone et al. | Dec 2013 | B2 |
9071730 | Livesey | Jun 2015 | B2 |
20010021916 | Takai | Sep 2001 | A1 |
20010023436 | Srinivasan et al. | Sep 2001 | A1 |
20010052133 | Pack et al. | Dec 2001 | A1 |
20010056350 | Calderone et al. | Dec 2001 | A1 |
20020056091 | Bala et al. | May 2002 | A1 |
20020104090 | Stettner | Aug 2002 | A1 |
20020120934 | Abrahams | Aug 2002 | A1 |
20020120935 | Huber et al. | Aug 2002 | A1 |
20020174444 | Gatto et al. | Nov 2002 | A1 |
20030101104 | Dimitrova et al. | May 2003 | A1 |
20030182658 | Alexander | Sep 2003 | A1 |
20040117839 | Watson et al. | Jun 2004 | A1 |
20040125125 | Levy | Jul 2004 | A1 |
20040249726 | Linehan | Dec 2004 | A1 |
20060064757 | Poslinski | Mar 2006 | A1 |
20060150218 | Lazar et al. | Jul 2006 | A1 |
20060273893 | Warner | Dec 2006 | A1 |
20070030385 | Crawford | Feb 2007 | A1 |
20070039020 | Cansler et al. | Feb 2007 | A1 |
20070061845 | Barnes | Mar 2007 | A1 |
20070079335 | McDonough | Apr 2007 | A1 |
20070124769 | Casey et al. | May 2007 | A1 |
20070150360 | Getz | Jun 2007 | A1 |
20080021786 | Stenberg | Jan 2008 | A1 |
20080052226 | Agarwal et al. | Feb 2008 | A1 |
20080098425 | Welch | Apr 2008 | A1 |
20080109841 | Heather et al. | May 2008 | A1 |
20080155637 | Du Breuil | Jun 2008 | A1 |
20080255961 | Livesey | Oct 2008 | A1 |
20080281689 | Blinnikka | Nov 2008 | A1 |
20080304747 | Marinkovich et al. | Dec 2008 | A1 |
20080319852 | Gardner | Dec 2008 | A1 |
20090077459 | Morris et al. | Mar 2009 | A1 |
20090123025 | Deng | May 2009 | A1 |
20090125559 | Yoshino | May 2009 | A1 |
20090150330 | Gobeyn et al. | Jun 2009 | A1 |
20090193463 | Choi et al. | Jul 2009 | A1 |
20090276805 | Andrews, II et al. | Nov 2009 | A1 |
20100060802 | Huegel | Mar 2010 | A1 |
20100175080 | Yuen et al. | Jul 2010 | A1 |
20110078001 | Archer et al. | Mar 2011 | A1 |
20110093884 | Wachtfogel et al. | Apr 2011 | A1 |
20110135283 | Poniatowki et al. | Jun 2011 | A1 |
20110167456 | Kokenos et al. | Jul 2011 | A1 |
20110225604 | Bova | Sep 2011 | A1 |
20110289535 | Saffari et al. | Nov 2011 | A1 |
20120066708 | Lee et al. | Mar 2012 | A1 |
20120167145 | Incorvia | Jun 2012 | A1 |
20140109118 | Kokenos et al. | Apr 2014 | A1 |
20140282674 | Conradt et al. | Sep 2014 | A1 |
Number | Date | Country |
---|---|---|
1214837 | Apr 1999 | CN |
101305611 | Nov 2008 | CN |
101529770 | Sep 2009 | CN |
0672993 | Sep 1995 | EP |
2002108668 | Apr 2002 | JP |
2005503598 | Feb 2005 | JP |
2006031200 | Feb 2006 | JP |
2008271196 | Nov 2008 | JP |
WO2011044270 | Apr 2011 | WO |
Entry |
---|
The EP Search Report mailed Mar. 14, 2011 for PCT Application No. PCT/US10/61984, a counterpart application of U.S. Appl. No. 12/683,737. |
U.S. Appl. No. 12/683,737, filed on Jan. 7, 2010, Kokenos, et al., “Offering Items Identified in a Media Stream”. |
Ad-ID Advertising Identification and Management, Advertising Digital Identification, LLC, Copyright 2002-2003, retrieved on Feb. 24, 2010 at <<https://www.ad-id.org/>> and <<https://www.ad-id.org/help/help—detailNEW.cfm>>, 2 pgs. |
Google Goggles Labs, Retrieved on Feb. 24, 2010 at <<http://www.google.com/mobile/goggles/#landmark>>, 1 pg. |
OpenCable, CableLabs, Retrieved on Feb. 24, 2010 at <<http://www.cablelabs.com/opencable/>>, 1 pg. |
The Extended European Search Report mailed Jul. 9, 2013 for European patent application No. 10842750.1, 6 pages. |
Office Action for U.S. Appl. No. 12/683,737, mailed on Apr. 22, 2013, Kokenos et al., “Offering Items Identified in a Media Stream”, 18 pages. |
Translated Japanese Office Action mailed Dec. 2, 2014 for Japanese patent application No. 2012-548037, a counterpart foreign application of U.S. Appl. No. 12/683,737, 7 pages. |
The Chinese Office Action mailed Apr. 3, 2015 for Chinese patent application No. 201080065280.4, a counterpart foreign application of U.S. Appl. No. 12/683,737, 18 pages. |
The European Office Action mailed Mar. 11, 2015 for European patent application No. 10842750.1, a counterpart foreign application of U.S. Appl. No. 12/683,737, 5 pages. |
Office action for U.S. Appl. No. 14/098,241, mailed on Apr. 29, 2016, Kokenos et al., “Offering Items Identified in a Media Stream”, 19 pages. |
Translated Chinese Office Action mailed Nov. 26, 2015 for CN patent application No. 201080065280.4, a counterpart foreign application of U.S. Pat. No. 8,627,379, 7 pages. |
The Summons to Attend Oral Proceedings mailed Feb. 8, 2016 for European patent application No. 10842750.1, a counterpart foreign application of U.S. Pat. No. 8,627,379, 5 pages. |
Translated Japanese Notice of Allowance mailed Mar. 29, 2016 for Japanese Patent Application No. 2012-548037, a counterpart foreign application of U.S. Pat. No. 8,627,379, 6 pages. |
Office action for U.S. Appl. No. 12/683,737, mailed on Sep. 14, 2012, Kokenos et al., “Offering Items Identified in a Media Stream”, 18 pages. |
Office action for U.S. Appl. No. 14/098,241, mailed on Sep. 17, 2015, Kokenos et al., “Offering Items Identified in a Media Stream”, 15 pages. |
The Canadian Office Action mailed Nov. 16, 2015 for Canadian Patent Application No. 2786587, a counterpart foreign application of U.S. Pat. No. 8,627,379, 4 pages. |
Translated Japanese Office Action mailed Jul. 28, 2015 for Japanese patent application No. 2012-548037, a counterpart foreign application of U.S. Appl. No. 12/683,737, 6 pages. |