Tracking user interaction from a receiving device

Information

  • Patent Grant
  • Patent Number
    10,382,807
  • Date Filed
    Thursday, July 20, 2017
  • Date Issued
    Tuesday, August 13, 2019
Abstract
Measuring and tracking user interaction with a television receiver, such as a set top box or cable box. The television receiver may create and display a matrix code that includes temporal information, user identification information, geographic information, and/or a user selection. The matrix code may be captured by a matrix reading device and transmitted to a monitoring entity. Optionally, the matrix reading device may decode the matrix code and transmit associated data to the monitoring entity. The monitoring entity may use the code or data to track and distinguish between user interactions at different points in time.
Description
TECHNICAL FIELD

Embodiments described herein relate generally to measuring and tracking user interaction with a television receiver, such as a set top box or cable box, and more particularly to tracking and distinguishing between user interactions at different points in time.


BACKGROUND

Interactivity between users and audiovisual content has been increasing for some time. In many cases, interactivity is achieved by having a user call a telephone number displayed as part of the audiovisual content, or access a Web site at an address similarly displayed. Generally, such information is passively displayed and relies on the viewer to acknowledge it and act on it.


Often, users may misdial a telephone number, mistype a Web address, or the like. Not only do these errors destroy interactivity, but they may cause undesired interactivity. For example, if a viewer misdials a telephone number by a single digit, he or she may cast an undesired vote, as telephone numbers corresponding to voting options may vary only by a single digit.


Even in situations where undesired interactivity is avoided, errors or faulty memory may prevent interaction entirely. By the time a viewer realizes he has made a mistake, the number, Web site, or other interactive information may no longer be displayed.


Accordingly, what is needed is an enhanced ability to track user interaction from a receiving device.


SUMMARY

Generally, embodiments discussed herein permit tracking user interaction with a television receiver, such as a set-top box, cable box, or other electronic device appropriately configured to receive, decode and display audiovisual information. For example, certain embodiments may be configured to track user interaction with a computing device, such as a desktop computer, laptop computer, tablet device, mobile telephone, personal digital assistant and the like. Further, embodiments discussed herein may generally distinguish between user interactions at different points in time and track the time of the various interactions. Such tracking may be accomplished through the use of a matrix code displayed on a display device along with audiovisual content; the matrix code may be captured by a reader device that processes the matrix code to retrieve data embedded therein. The data may be transmitted by the reader device to a monitoring party who, in turn, may process the data to track user interaction, audience participation, the time at which audiovisual content was viewed and/or stored, and the like.


Certain embodiments may receive an information code as part of, or along with, audiovisual content. The information code may be a two-dimensional code (also referred to as a “matrix barcode”) containing various data regarding the audiovisual content, content provider, content recipient, time of transmission, and the like. The information code may be integrated with the audiovisual content or may be sent separately. If integrated, the code is typically part of the video or graphical portion of the content. If sent separately, it may be transmitted as metadata or in a separate data stream associated with the audiovisual content. As one example, if the audiovisual content is transmitted across a satellite network to a set-top box, the information code may be transmitted in a stream having packets tagged with a packet identifier (PID) that indicates the stream and its contents are associated with the audiovisual content. The stream carrying the information code may carry additional data or may be dedicated to the code.


One embodiment may take the form of a method for audience metering, including the operations of: receiving audiovisual content; receiving data relating to the audiovisual content; determining a time; creating an electronic construct from the data relating to the audiovisual content and the time; and displaying the electronic construct with the audiovisual content.


Another embodiment may take the form of a method for tracking a time at which audiovisual content is viewed, including the operations of: receiving the audiovisual content; receiving data related to the audiovisual content; storing the audiovisual content; receiving a command to play the stored audiovisual content; in response to the command, determining a time; and generating an electronic construct containing a data set, the data set including the data related to the audiovisual content and the time; wherein at least one of the electronic construct and the data set is used by a monitoring party to determine the time at which the audiovisual content is replayed.


Still another embodiment may take the form of an apparatus for outputting a data construct including embedded temporal information, comprising: a storage medium operative to store audiovisual content; a processing unit operative to receive a command to replay the audiovisual content from the storage medium; a matrix code module operatively connected to the processing unit, the matrix code module operative to generate a matrix code including a temporal identifier and a content identifier; and an output component operative to display the content and the matrix code.


Yet another embodiment may be a method for tracking content viewing, comprising the operations of: transmitting content across a first network to a receiver; receiving a digital construct associated with the content across a second network; processing the digital construct to obtain a time and an identifier associated with the content; and using the time and identifier to track viewing of the content.





BRIEF DESCRIPTION OF THE FIGURES


FIG. 1 depicts a sample embodiment in a sample operating environment.



FIG. 2 is a flowchart depicting a sample method of operation for generating, capturing and processing a matrix code in order to measure audience metrics and/or participation.



FIG. 3 is a flowchart depicting a sample method of operation for a television receiver to generate and display a matrix barcode having temporal and/or replay data included therein.





DETAILED DESCRIPTION
I. Introduction

Generally, embodiments discussed herein permit tracking user interaction with a television receiver, such as a set-top box, cable box, or other electronic device appropriately configured to receive, decode and display audiovisual information. For example, certain embodiments may be configured to track user interaction with a computing device, such as a desktop computer, laptop computer, tablet device, mobile telephone, personal digital assistant and the like. Further, embodiments discussed herein may generally distinguish between user interactions at different points in time and track the time of the various interactions.


Certain embodiments may receive an information code as part of, or along with, audiovisual content. The information code may be a two-dimensional code (also referred to as a “matrix barcode”) containing various data regarding the audiovisual content, content provider, content recipient, time of transmission, and the like. The information code may be integrated with the audiovisual content or may be sent separately. If integrated, the code is typically part of the video or graphical portion of the content. If sent separately, it may be transmitted as metadata or in a separate data stream associated with the audiovisual content. As one example, if the audiovisual content is transmitted across a satellite network to a set-top box, the information code may be transmitted in a stream having packets tagged with a packet identifier (PID) that indicates the stream and its contents are associated with the audiovisual content. The stream carrying the information code may carry additional data or may be dedicated to the code.
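
The patent does not specify a transport format, but as a rough sketch of how a receiver might isolate a dedicated stream of this kind, the following assumes an MPEG-2 transport stream and a purely hypothetical PID value (0x1FF0), yielding the payload of packets tagged with that PID:

```python
# Sketch only: assumes the information code travels in an MPEG-2 transport
# stream on a hypothetical PID; the patent specifies neither the transport
# format nor a PID value.
MATRIX_CODE_PID = 0x1FF0   # hypothetical PID carrying matrix code data
TS_PACKET_SIZE = 188       # standard MPEG-TS packet length in bytes

def extract_matrix_code_payloads(ts_bytes: bytes):
    """Yield payload bytes from packets tagged with the matrix code PID."""
    for offset in range(0, len(ts_bytes) - TS_PACKET_SIZE + 1, TS_PACKET_SIZE):
        packet = ts_bytes[offset:offset + TS_PACKET_SIZE]
        if packet[0] != 0x47:          # sync byte check
            continue
        pid = ((packet[1] & 0x1F) << 8) | packet[2]
        if pid == MATRIX_CODE_PID:
            yield packet[4:]           # payload (ignores adaptation fields for brevity)
```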


Typically, the code may be overlaid on the video portion of audiovisual content by the television receiver, so that it is displayed in a portion of a display screen associated with the television receiver. Thus, if the receiver obtains the two-dimensional (or “matrix”) code in a dedicated PID, it may overlay that code on the video prior to transmitting it to a display. Likewise, if the television receiver obtains metadata or other information from which it may construct a matrix code, the constructed matrix code may be overlaid on video that is then sent to an associated display device.
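
As a minimal illustration of this overlay step, the sketch below assumes the third-party `qrcode` and Pillow packages; neither library, nor the size or placement of the code, is prescribed by the patent.

```python
# Minimal sketch of overlaying a generated matrix code onto a decoded video
# frame; libraries, dimensions, and placement are illustrative assumptions.
import io

import qrcode
from PIL import Image

def overlay_matrix_code(frame: Image.Image, payload: str) -> Image.Image:
    """Return a copy of the frame with a QR code pasted in the lower-right corner."""
    buf = io.BytesIO()
    qrcode.make(payload).save(buf)      # qrcode's PIL backend defaults to PNG
    buf.seek(0)
    code_img = Image.open(buf).convert("RGB").resize((200, 200))
    out = frame.copy()
    margin = 20                         # arbitrary margin, in pixels
    out.paste(code_img, (out.width - code_img.width - margin,
                         out.height - code_img.height - margin))
    return out
```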


In several embodiments, the matrix code may contain time and/or date information (“temporal data”). This may be included in the video transmission, as metadata or retrieved by the television receiver and inserted into the matrix code. Typically, although not necessarily, the temporal data includes the date/time on which the audiovisual content was transmitted to the television receiver, the date/time on which the audiovisual content was played by the television receiver, or both. These dates and times may be different, for example, when the television receiver stores the audiovisual content on a storage device such as a hard drive. In such an instance, the temporal data for the transmission of the content may be stored with the content, while temporal data for playback of the content may be added to the matrix code by the television receiver upon playback.


Matrix codes containing temporal data may be used for a variety of purposes, many of which are described herein in more detail. Temporal data may be used to track or limit the timing of voting or audience interaction. As one example, a matrix code may be shown on a display screen during an audience participation segment of an audiovisual program. Watchers may capture an image of the matrix code with a smart phone, digital camera and the like (“reader device”) and transmit it across a network to a recipient. The recipient, in turn, may process the matrix code in order to facilitate audience participation.


As one example, many television shows permit a watcher to vote for a favorite television actor, dancer, and the like. A unique matrix code may be presented to viewers for each voting option. The viewer may capture the matrix code corresponding to their desired vote with an appropriately-configured reader device. The matrix code may contain not only the voting information, such as a name or other designation of the person for whom the viewer is voting, but also instructions that may be interpreted by the reader device; such instructions may direct the reader device to access a certain Web site or other Internet location and record the viewer's vote. Alternately, the instructions may facilitate the reader device transmitting the vote through a telephone network, perhaps as an SMS or other text message. Ultimately, data included in the matrix code may be transmitted to a monitoring party who may use that data to study audience participation, voting, viewing habits and so forth. The temporal data in the matrix code may be used by the monitoring party to determine when an audience member watched the audiovisual content, when the audience member replayed the audiovisual content, when the audience member captured an image of the matrix code, and the like. In some embodiments, the matrix code itself may be relayed by the reader device to the monitoring party.


II. System Overview


FIG. 1 is a block diagram illustrating a system 100 for facilitating user participation with audiovisual content through use of a matrix code. The system 100 may include a television receiver 101 (which may be any kind of appropriately-configured electronic device such as a television, a television receiver, a digital video recorder, a digital video disc player, a computing device, a mobile telephone, a video game system, and so on), at least one display device 102 (which may be any kind of display device such as a cathode ray tube display, a liquid crystal display, a television, a computer monitor, and so on), and a reader device 103 (which may be any kind of device capable of detecting and decoding a matrix code such as a telephone equipped with a camera, a mobile computing device that includes a camera, and so on). The system may also include a monitoring party 104.


The television receiver 101 may receive data across a network from a content provider; the content provider may be the same as the monitoring party 104 in certain embodiments. The receiver 101 may include one or more processing units 105, one or more non-transitory storage media 106 (which may take the form of, but is not limited to, a magnetic storage medium; optical storage medium; magneto-optical storage medium; random access memory; erasable programmable memory; flash memory; and so on), and one or more output components 107. The storage medium 106 may store audiovisual content that is received by the television receiver 101 for review at a later time. (The receiver may include various elements, components, hardware and the like that permit it to receive transmitted content, process that content and display it; such components are not shown for purposes of simplicity.) Additionally, although the display device 102 is illustrated as separate from the television receiver, it is understood that in various implementations the display device may be incorporated into the television receiver. The processing unit of the television receiver may execute instructions stored in the non-transitory storage medium to derive information specific to the television receiver that relates to operation of the television receiver, dynamically generate one or more matrix codes (such as one or more QR codes) that include the information specific to the television receiver as well as temporal data related to the date and/or time on which audiovisual content was received and/or played back, and transmit the dynamically generated matrix code to the display device utilizing the output component. Such instructions may take the form of a matrix code module that may be executed by the processor to perform the foregoing functions. Alternately, the matrix code module may be implemented in hardware or firmware.
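
A minimal sketch of what such a matrix code module might assemble is shown below; the JSON payload and its field names are illustrative assumptions rather than a format taken from the patent.

```python
# Hedged sketch of a "matrix code module" payload; field names and the JSON
# encoding are illustrative assumptions.
import json
from datetime import datetime, timezone

def build_code_payload(receiver_id: str, content_id: str, *, replayed: bool) -> str:
    """Assemble receiver-specific and temporal data for encoding into a matrix code."""
    return json.dumps({
        "receiver_id": receiver_id,   # information specific to the television receiver
        "content_id": content_id,     # identifies the audiovisual content
        "mode": "replay" if replayed else "live",
        "timestamp": datetime.now(timezone.utc).isoformat(),  # temporal data
    })
```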


Subsequently, the reader device 103 may detect the matrix code displayed by the display device 102, decode the data relating to the audiovisual content and any temporal data, and initiate one or more transmissions of such data to the monitoring party 104. The matrix code data may include information such as the audiovisual content being watched, whether such content is being watched live or replayed from storage, a time and/or date on which the content was viewed, an option selected by the viewer, an address to which such information should be transmitted (e.g., an Internet or Web address, a telephone number, and so on), information identifying the viewer (a unique code associated with the viewer and/or the television receiver, such as a serial number, a viewer email address or physical address, a viewer telephone number and so on) and the like. It should be appreciated that different embodiments may vary the information contained in the matrix code and accessed by the reader device. Typically, the matrix code includes data that instructs the reader device 103 how to initiate a transmission to the monitoring party, as well as the data to be transmitted to the monitoring party. It should be appreciated that not all of the foregoing information is necessarily included in a matrix code. Likewise, the foregoing list should be considered examples of suitable information to be included in a matrix code as opposed to an exhaustive list. Further, the information included in any given matrix code may vary depending on whether the code is generated by a content provider or other third party and embedded in video, or is generated by the television receiver.
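
The sketch below illustrates this reader-side relay under stated assumptions: the decoded payload is JSON, a hypothetical `report_url` field carries the address to contact, and the endpoint URL shown is a placeholder.

```python
# Reader-device sketch: decode the captured code's payload and forward the
# remaining fields to the monitoring party. The field name and URL are
# hypothetical; the patent only says the code may carry such an address.
import json
import urllib.request

def relay_to_monitoring_party(decoded_text: str) -> int:
    """POST the decoded data to the address named in the payload; return HTTP status."""
    data = json.loads(decoded_text)
    endpoint = data.pop("report_url", "https://example.invalid/metering")  # hypothetical field/URL
    req = urllib.request.Request(
        endpoint,
        data=json.dumps(data).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```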


The reader device may include one or more processing units 109 which execute instructions stored in one or more non-transitory storage media 111 in order to perform the above described functions. The reader device may also include an optical input device 110 (such as a camera, a barcode scanner, and so on) for detecting the matrix code displayed by the display device as well as a communication component 112 for communicating with the monitoring party.


In some implementations, the television receiver 101 may dynamically generate the matrix codes upon a viewer (e.g., user) viewing audiovisual content either in a live fashion or during playback from a storage device. Matrix codes may be generated from metadata or other data transmitted with, or as part of, the audiovisual content. The television receiver 101 may use such data to create the matrix code and overlay it on a video stream prior to transmitting the video stream to the display device. In this manner, a viewer perceives the matrix code generated by the television receiver as part of the audiovisual content experience. If the television receiver 101 creates the matrix code, it may include temporal data therein.


Alternatively, the television receiver 101 may transmit the one or more matrix codes by themselves to the display device 102 via the output component 107 for the display device to display only the one or more matrix codes when the audiovisual content is selected for viewing.


III. Audience Metering and Participation

A time-encoded matrix code may be useful for audience metering and/or participation. Such matrix codes may be used when voting (for example, for a contestant on a reality show), as an indication of support for a particular measure or point of view (for example, during debates on news channels), when selecting from among competing products (for example, during competitive advertising soliciting viewer feedback) and in many other instances.


As a general example, consider a reality show that permits viewers to vote for or otherwise support contestants. Rather than calling a telephone number or sending a text message, embodiments discussed herein may permit audience participation by displaying a matrix code that may be captured by a viewer. The matrix code may instruct the receiving device to transmit data included in the code to a monitoring party for registration as a vote for a particular contestant.


In one embodiment, a number of different matrix codes may be shown during display of audiovisual content, each of which may contain different data. In the reality show voting example, the matrix code data may encode the name of each contestant. At different intervals during the audiovisual content, a matrix code encoding the name of each contestant may be shown. A viewer may capture the matrix code associated with the contestant the viewer desires to support. The data therein may be transmitted to the monitoring party by the reader device 103 and registered as a vote for the particular contestant.


Alternately, the matrix codes may omit any information identifying a particular contestant (or option, choice, participation information and the like). Instead, a series of matrix codes may be displayed, each varying from the other codes by having different temporal data encoded therein. For example, a matrix code may encode the time at which it is displayed on the screen. The matrix code may be overlaid on, integrated with, or otherwise displayed with audiovisual content that identifies a particular option for which an audience member may vote. The member may capture the matrix code with a reader device 103 as previously described, and transmit the data in the code to the monitoring party 104. Since the monitoring party receives the temporal data indicating when the matrix code was displayed and/or captured, it need only match the timestamp of the temporal data to a window of time during which an option was displayed. Each option may have a different window of time associated with it, which generally reflects the time during which the corresponding matrix code was displayed. Accordingly, if the time the matrix code was captured is known, the option supported by the viewer may be determined.
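
A sketch of this time-window matching on the monitoring party's side appears below; the option names and window boundaries are invented examples.

```python
# Sketch of the monitoring party's time-window matching; windows and option
# names are hypothetical.
from datetime import datetime

VOTING_WINDOWS = [  # (option shown on screen, window start, window end)
    ("Contestant A", datetime(2019, 8, 13, 20, 0), datetime(2019, 8, 13, 20, 5)),
    ("Contestant B", datetime(2019, 8, 13, 20, 5), datetime(2019, 8, 13, 20, 10)),
]

def option_for_capture(capture_time: datetime):
    """Return the option whose display window contains the capture timestamp, if any."""
    for option, start, end in VOTING_WINDOWS:
        if start <= capture_time < end:
            return option
    return None

# e.g. option_for_capture(datetime(2019, 8, 13, 20, 7)) -> "Contestant B"
```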



FIG. 2 is a flowchart depicting a sample method of operation for generating, capturing and processing a matrix code in order to measure audience metrics and/or participation. It should be appreciated that the flowchart of FIG. 2 is an overview of a general method in which individual operations may be performed by different entities or elements. In the instant flowchart, operations 205-225 are typically performed by the television receiver 101, operations 230-235 are typically performed by the reader device 103 and operations 240-245 are typically performed by the monitoring party 104. The activities of multiple entities or elements are shown in a single flowchart to provide an overview of the method that is generalized and easy to follow.


Initially, the method 200 begins in operation 205, in which the television receiver 101 receives audiovisual content and data associated with the content. The data, which may be metadata, may be used by the television receiver to construct a matrix code for display in conjunction with the audiovisual content. In operation 210, the receiver may obtain temporal data. Typically, this temporal data is the time at which the audiovisual content is to be displayed or when it is received (presuming the content is displayed at substantially the same time it is received).


In operation 215, the television receiver may generate a matrix code from the data received in operation 205 and the temporal data retrieved in operation 210. Thus, the matrix code typically includes the temporal data, although in some embodiments it may be omitted. It may also include data identifying the audiovisual content and/or a viewer/user/receiver (such as a receiver number or subscriber number). In some embodiments, geographic data identifying a region or area in which the viewer/receiver is located may be included as well. If the temporal data is omitted, operation 210 may be skipped.


In operation 220, the television receiver receives an indicator that the matrix code is to be displayed. Accordingly, in operation 225, the receiver outputs the code and audiovisual content. Typically the code and content are outputted together, but in some embodiments they may be outputted serially for display. It should also be appreciated that operations 205-225 may be repeated multiple times so that multiple matrix codes, each with different temporal data, may be created and outputted at the appropriate time. Thus, as the audiovisual content continues to be displayed, a series of matrix codes may likewise be displayed, each of which relates to a segment of the overall content.
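
As a rough sketch of how operations 205-225 might repeat across a program, the following assumes fixed-length segments and an illustrative JSON payload per code; neither detail comes from the patent.

```python
# Sketch of emitting one matrix-code payload per content segment, each with
# its own temporal data; segment length and field names are assumptions.
import json
from datetime import datetime, timedelta, timezone

def codes_for_segments(content_id: str, segment_count: int, segment_len_s: int = 300):
    """Yield one matrix-code payload per content segment."""
    start = datetime.now(timezone.utc)
    for index in range(segment_count):
        display_time = start + timedelta(seconds=index * segment_len_s)
        yield json.dumps({
            "content_id": content_id,
            "segment": index,                          # which portion of the content this code accompanies
            "display_time": display_time.isoformat(),  # temporal data for this segment
        })
```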


In operation 230, the reader device 103 may capture the matrix code from the associated display 102. The matrix code may be deciphered and data therein processed by the reader device, after which at least a portion of the data stored in the code may be transmitted to the monitoring party 104 in operation 235. (Because part of the data in the matrix code is typically a Web address or other means of establishing a transmission link to the monitoring party 104, that portion of the matrix code data may be omitted from the transmission.) The data transmitted in operation 235 typically, although not necessarily, includes the temporal data. The transmitted data may also include geographical data, data identifying the content, data identifying a user or receiver, and the like. Further, the reader device 103 may add such information to the transmitted data if it was not included in the matrix code.


In operation 240, the monitoring party 104 may receive the data from the reader device 103. Next, in operation 245 the monitoring party may process the data and register the viewer's participation/vote/preference/input accordingly.


IV. Replay Monitoring

As discussed above, embodiments may receive a matrix code as part of, or associated with, reception of audiovisual content. In some cases the audiovisual content may be stored for later viewing or interaction. When content is stored, the associated matrix code (or data used to generate a matrix code) may likewise be stored.


Upon playback of stored audiovisual content, it may be useful to update temporal data in the associated matrix code. For example, it may be useful to determine if the date on which a user votes or otherwise participates in an interactive audience metering session is past a cutoff date for participation. This may be done by updating the temporal data associated with, or embodied in, the matrix code with the time and/or date on which stored content is played back. Thus, the information sent to the monitoring party via the reader device 103 may reflect the playback date, rather than the recording or receipt date. This permits the monitoring party to use the temporal data in any fashion it desires. As one example, it may disregard audience participation information transmitted with the temporal data if the temporal data is later than a cutoff date or time. As another option, audience participation information may be weighted such that information captured at a certain time, as indicated by the temporal data, may be more or less heavily considered by the monitoring party 104. As still another example, the audience participation information may be an indication of what audiovisual content was viewed and by whom it was viewed, similar to the NIELSEN audience measurement system used to determine the audience size and composition for television programming. By including temporal data in the matrix code (and thus ultimately transmitting that temporal data to the monitoring party), it can be determined how long after audiovisual content is stored that it is viewed.
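
One hedged example of such a policy is sketched below: captures after a cutoff are disregarded and later replays are weighted at half the value of near-live captures, an arbitrary rule chosen only to illustrate the idea.

```python
# Illustrative cutoff/weighting policy; the one-hour live window, the 0.5
# replay weight, and the cutoff itself are invented examples.
from datetime import datetime, timedelta

def participation_weight(capture_time: datetime, broadcast_time: datetime, cutoff: datetime) -> float:
    """Return 0 for captures after the cutoff; otherwise weight near-live captures more heavily."""
    if capture_time > cutoff:
        return 0.0                                            # disregard late participation
    if capture_time - broadcast_time <= timedelta(hours=1):
        return 1.0                                            # captured at or near the live broadcast
    return 0.5                                                # captured during a later replay
```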


Still other embodiments may use temporal data to open or close windows of opportunity for viewers, and permit viewers to participate in these windows by relaying data in a matrix code displayed on the display. As one example, an item may be placed on sale for a specific period of time that is longer than the duration of a live broadcast. If a viewer captures the matrix code shown during replay of the content and within the period, he or she may qualify for the sale. This qualification may be tracked through the temporal data in the matrix code. Similar methodologies may be used for contests, prize qualifications, and so forth.


In alternative embodiments, the television receiver 101 may omit temporal data from any matrix code it generates for display. Instead, the receiver 101 may include a counter indicating how many times a particular piece of audiovisual content has been played back. A user capturing the matrix code and transmitting it to a monitoring party may thus indicate the number of times the content has been viewed. This may be useful in determining a viewer's tastes and/or preferences. It may also be useful in determining how long (e.g., how many replays) it takes for a user to respond to a matrix code-enabled offer. Some embodiments may include both temporal information and a counter for audiovisual content in a single matrix code.
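
A minimal sketch of such a counter, assuming an illustrative JSON payload, might look like the following.

```python
# Sketch of a per-content replay counter kept by the receiver and embedded in
# the code payload; the payload format is an assumption.
import json
from collections import defaultdict

class ReplayCounter:
    """Track how many times each piece of stored content has been played back."""

    def __init__(self):
        self._plays = defaultdict(int)          # content_id -> playback count

    def payload_for_playback(self, content_id: str) -> str:
        """Bump the counter and return a matrix-code payload carrying the count."""
        self._plays[content_id] += 1
        return json.dumps({"content_id": content_id,
                           "play_count": self._plays[content_id]})
```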


Generally, then, it may be useful to modify a matrix code upon playback of stored audiovisual content. FIG. 3 depicts a sample method for such modification. The method 300 begins in operation 305, where a replay command is received by the television receiver 101. Typically, such a command is initiated by a user. Next, in operation 310 the television receiver 101 retrieves the current time and/or date information.


In operation 315, the television receiver 101 determines if a matrix code is present in the audiovisual content as stored. That is, the receiver determines if the matrix code is a portion of the video signal or is to be overlaid on a video signal. If the code is not a portion of the video signal, the television receiver may execute operation 320 and generate a matrix code having temporal and/or replay counter data therein from metadata or other data stored and associated with the audiovisual content. Following operation 320 the process continues to operation 340 as described below.


If the matrix code is included in the audiovisual content, operation 325 is executed. In operation 325, the television receiver determines if the matrix code can be modified. The code may be modified, for example, if it may be stripped from the video or an overlay can be generated to be placed atop all or part of the matrix code as necessary. If the code cannot be modified, then the original, unmodified matrix code is displayed along with the audiovisual content in operation 330.


If the matrix code can be modified, it is updated with temporal and/or replay counter data in operation 335. Next, in operation 340, the updated code is displayed along with the audiovisual content.


Following either operation 330 or 340, the process ends in end state 345.
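
A compact sketch of the FIG. 3 decision flow is given below; the dictionary keys (`embedded_code`, `metadata`, `code_modifiable`) are hypothetical stand-ins for checks and data the patent leaves to the implementation, and payloads are modeled simply as dictionaries.

```python
# Sketch of the FIG. 3 decisions; keys and payload representation are assumptions.
from datetime import datetime, timezone

def matrix_code_payload_for_replay(stored_item: dict) -> dict:
    """Generate, update, or pass through a code payload when stored content is replayed."""
    now = datetime.now(timezone.utc).isoformat()                        # operation 310: current time/date
    embedded = stored_item.get("embedded_code")
    if embedded is None:                                                # operation 315: no code in the video
        return {**stored_item.get("metadata", {}), "replay_time": now}  # operation 320: build from metadata
    if stored_item.get("code_modifiable", False):                       # operation 325: can the code be changed?
        return {**embedded, "replay_time": now}                         # operation 335: refresh temporal data
    return dict(embedded)                                               # operation 330: display unmodified code
```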


The matrix code containing temporal and/or replay data may be captured by a reader device 103 and transmitted to a monitoring party as described above during either operation 330 or 340, as appropriate.


V. Conclusion

Although the foregoing has been described with respect to particular physical structures, methods of operation and data structures, it should be appreciated that alternatives to any of these exist and are embraced within the scope of this disclosure. As one example, a linear bar code may be used in place of a matrix code. As another example, data relating to multiple captured matrix codes may be stored by the reader device 103 and transmitted in a burst fashion rather than serially and as captured. Accordingly, the proper scope of protection is set forth in the following claims.

Claims
  • 1. An apparatus for outputting a data construct including embedded temporal information, the apparatus comprising: a storage medium operative to store audiovisual content that the apparatus receives via a first network at a first time, the audiovisual content corresponding to television content; a processing unit operative to receive a command to replay the audiovisual content from the storage medium; a matrix code module operatively connected to the processing unit, the matrix code module operative to generate a matrix code including a temporal identifier and a content identifier corresponding to the audiovisual content; and an output component operative to output, in response to the command to replay the audiovisual content, the audiovisual content and the matrix code for display at a second time with the matrix code overlaid on the audiovisual content; wherein the matrix code specifies whether the audiovisual content is displayed live or as a replay after being stored in the storage medium and, consequent to the audiovisual content being replayed, the matrix code, including the temporal identifier, specifies the second time at which the audiovisual content is replayed.
  • 2. The apparatus of claim 1, wherein the first time corresponds to a broadcast time of the audiovisual content, the second time is subsequent to the broadcast time, and the matrix code, including the temporal identifier, further specifies a date on which the audiovisual content was replayed.
  • 3. The apparatus of claim 2, wherein the matrix code further includes information corresponding to one or a combination of: the first time at which the audiovisual content was received by the apparatus; data identifying a viewer; and/or a geographic identifier corresponding to a geographic locale at which the audiovisual content is replayed.
  • 4. The apparatus of claim 3, wherein the matrix code further includes an Internet address to which the information should be transmitted via a second network.
  • 5. The apparatus of claim 4, wherein the information is indicative of a vote cast by the viewer.
  • 6. The apparatus of claim 4, wherein: the processing unit is further operative to receive a subsequent command to replay the audiovisual content at a third time subsequent to the second time; and the matrix code module is further operative to, upon subsequent replay of the audiovisual content, regenerate the matrix code to update the matrix code to further include a number of times the audiovisual content has been output for display.
  • 7. The apparatus of claim 4, wherein: the matrix code module is operative to generate a series of matrix codes each having different temporal identifiers corresponding to different time windows during replay of the audiovisual content, wherein the series of matrix codes comprises the matrix code; and the output component is further operative to output the audiovisual content and the series of matrix codes for display with the series of matrix codes overlaid on the audiovisual content.
  • 8. A method for outputting a data construct including embedded temporal information, the method comprising: receiving, by a television receiver, audiovisual content via a first network at a first time, the audiovisual content corresponding to television content; storing, by the television receiver, the audiovisual content in a storage medium; receiving, by the television receiver, a command to replay the audiovisual content from the storage medium; generating, by the television receiver, a matrix code including a temporal identifier and a content identifier corresponding to the audiovisual content; in response to the command to replay the audiovisual content, outputting, by the television receiver, the audiovisual content and the matrix code for display at a second time with the matrix code overlaid on the audiovisual content; wherein the matrix code specifies whether the audiovisual content is displayed live or as a replay after being stored in the storage medium and, consequent to the audiovisual content being replayed, the matrix code, including the temporal identifier, specifies the second time at which the audiovisual content is replayed.
  • 9. The method of claim 8, wherein the first time corresponds to a broadcast time of the audiovisual content, the second time is subsequent to the broadcast time, and the matrix code, including the temporal identifier, further specifies a date on which the audiovisual content was replayed.
  • 10. The method of claim 9, wherein the matrix code further includes information corresponding to one or a combination of: the first time at which the audiovisual content was received by the television receiver; data identifying a viewer; and/or a geographic identifier corresponding to a geographic locale at which the audiovisual content is replayed.
  • 11. The method of claim 10, wherein the matrix code further includes an Internet address to which the information should be transmitted via a second network.
  • 12. The method of claim 11, wherein the information is indicative of a vote cast by the viewer.
  • 13. The method of claim 11, further comprising: receiving, by the television receiver, a subsequent command to replay the audiovisual content at a third time subsequent to the second time; upon subsequent replay of the audiovisual content, regenerating, by the television receiver, the matrix code to update the matrix code to further include a number of times the audiovisual content has been output for display.
  • 14. The method of claim 11, further comprising: generating, by the television receiver, a series of matrix codes each having different temporal identifiers corresponding to different time windows during replay of the audiovisual content, wherein the series of matrix codes comprises the matrix code; and outputting, by the television receiver, the audiovisual content and the series of matrix codes for display with the series of matrix codes overlaid on the audiovisual content.
  • 15. One or more non-transitory, machine-readable storage media having machine-readable instructions for outputting a data construct including embedded temporal information, the machine-readable instructions to cause one or more processors to perform: processing audiovisual content via a first network at a first time, the audiovisual content corresponding to television content; storing the audiovisual content in a storage medium; processing a command to replay the audiovisual content from the storage medium; generating a matrix code including a temporal identifier and a content identifier corresponding to the audiovisual content; in response to the command to replay the audiovisual content, causing output of the audiovisual content and the matrix code for display at a second time with the matrix code overlaid on the audiovisual content; wherein the matrix code specifies whether the audiovisual content is displayed live or as a replay after being stored in the storage medium and, consequent to the audiovisual content being replayed, the matrix code, including the temporal identifier, specifies the second time at which the audiovisual content is replayed.
  • 16. The one or more non-transitory, machine-readable storage media of claim 15, wherein the first time corresponds to a broadcast time of the audiovisual content, the second time is subsequent to the broadcast time, and the matrix code, including the temporal identifier, further specifies a date on which the audiovisual content was replayed.
  • 17. The one or more non-transitory, machine-readable storage media of claim 16, wherein the matrix code further includes information corresponding to one or a combination of: the first time at which the audiovisual content was received by the one or more processors; data identifying a viewer; a geographic identifier corresponding to a geographic locale at which the audiovisual content is replayed; and/or an Internet address to which the information should be transmitted via a second network.
  • 18. The one or more non-transitory, machine-readable storage media of claim 17, wherein the information is indicative of a vote cast by the viewer.
  • 19. The one or more non-transitory, machine-readable storage media of claim 17, the machine-readable instructions to further cause one or more processors to perform: processing a subsequent command to replay the audiovisual content at a third time subsequent to the second time; upon subsequent replay of the audiovisual content, regenerating the matrix code to update the matrix code to further include a number of times the audiovisual content has been output for display.
  • 20. The one or more non-transitory, machine-readable storage media of claim 17, the machine-readable instructions to further cause one or more processors to perform: generating a series of matrix codes each having different temporal identifiers corresponding to different time windows during replay of the audiovisual content, wherein the series of matrix codes comprises the matrix code; and causing output of the audiovisual content and the series of matrix codes for display with the series of matrix codes overlaid on the audiovisual content.
CROSS-REFERENCE TO RELATED APPLICATIONS

The present application is a continuation of U.S. patent application Ser. No. 13/302,717, filed Nov. 22, 2011, entitled “Tracking User Interaction from a Receiving Device”, which claims priority to U.S. Patent Application No. 61/417,076, filed Nov. 24, 2010, entitled “Tracking User Interaction From a Receiving Device”, the contents of each of which are incorporated herein by reference, in their entireties.

International Search Report and Written Opinion of PCT/US2012/024923 dated May 22, 2012, 12 pages.
International Preliminary Report on Patentability for PCT/US2012/024923 dated Aug. 29, 2013, 8 pages.
International Search Report and Written Opinion of PCT/US2012/024956 dated Jun. 11, 2012, 10 pages.
International Preliminary Report on Patentability for PCT/US2012/024956 dated Aug. 29, 2013, 7 pages.
International Search Report and Written Opinion of PCT/US2012/025502 dated Jun. 8, 2012, 13 pages.
International Preliminary Report on Patentability of PCT/US2012/025502 dated Sep. 6, 2013, 9 pages.
International Search Report and Written Opinion of PCT/US2012/025607 dated Jun. 8, 2012, 13 pages.
International Preliminary Report on Patentability for PCT/US2012/025607 dated Sep. 12, 2013, 8 pages.
International Search Report and Written Opinion of PCT/US2012/025634 dated May 7, 2012, 8 pages.
International Preliminary Report on Patentability for PCT/US2012/025634 dated Sep. 6, 2013, 5 pages.
International Search Report and Written Opinion of PCT/US2012/026373 dated Jun. 13, 2012, 14 pages.
International Preliminary Report on Patentability for PCT/US2012/026373 dated Sep. 12, 2013, 10 pages.
International Search Report and Written Opinion of PCT/US2012/026624 dated Aug. 29, 2012, 14 pages.
International Preliminary Report on Patentability for PCT/US2012/026624 dated Sep. 12, 2013, 12 pages.
International Search Report and Written Opinion of PCT/US2012/026722 dated Jun. 28, 2012, 11 pages.
International Search Report and Written Opinion of PCT/US2012/048032, dated Oct. 16, 2012, 14 pages.
International Search Report and Written Opinion of PCT/US2011/060109 dated Feb. 14, 2012, 7 pages.
International Preliminary Report on Patentability for PCT/US2011/060109 dated Jun. 20, 2013, 7 pages.
First Examination Report from European Patent Office dated Feb. 4, 2015 for EP 12716751.8, 4 pages.
First Office Action for CN 201180065044.7 dated Feb. 13, 2015 by the State Intellectual Property Office (SIPO), 4 pages.
First Office Action with Search Report for CN 201280013891.3 dated Jan. 15, 2016, 13 pages.
Second Office Action CN 201280013891.3 dated Aug. 12, 2016, all pages.
Second Office Action for CN 201180065044.7 dated Sep. 9, 2015 by the State Intellectual Property Office (SIPO), 23 pages.
Office Action from European Patent Office for Application No. 12716728.6 dated Feb. 26, 2015, 5 pages.
Notice of Allowance and Search Report for ROC (Taiwan) Patent Application No. 101106288 received May 29, 2015, 9 pages.
Office Action of the Intellectual Property Office for ROC Patent App. No. 101101486 dated Aug. 5, 2014, 4 pages.
Office Action of the Intellectual Property Office for ROC Patent App. No. 100143194 dated Sep. 23, 2014, 10 pages.
Office Action of the Intellectual Property Office for ROC Patent App. No. 100142978 dated Sep. 23, 2014, 9 pages.
Office Action from State Intellectual Property Office for CN Appln. No. 201180056242.7 received Jun. 17, 2015, 10 pages.
Second Office Action from State Intellectual Property Office for CN Appln. No. 201180056242.7 dated Jan. 26, 2016, all pages.
Third Office Action from State Intellectual Property Office for CN Appln. No. 201180056242.7 dated Jul. 28, 2016, all pages.
First Office Action and Search Report from State Intellectual Property Office for CN Appln. No. 201180064527.5 dated Oct. 23, 2015, 10 pages.
Second Office Action from State Intellectual Property Office for CN Appln. No. 201180064527.5 dated Jun. 12, 2016, all pages.
(Translation) Rejection Decision for CN Appln. No. 201180064527.5 dated Oct. 9, 2016, all pages.
The First Office Action dated Sep. 11, 2014 for Mexican Patent Application No. MX/a/2013/007672 is not translated into English, 2 pages.
The Second Office Action dated Jun. 1, 2015 for Mexican Patent Application No. MX/a/2013/007672 is not translated into English, 2 pages.
Office Action dated Mar. 2, 2017 for KR 10-2013-7020865, all pages.
Notice to Grant received Jun. 9, 2017 for KR 10-2013-7020865, all pages.
Notice of Allowance dated Nov. 10, 2015 for Mexican Patent Application No. MX/a/2013/007672, 1 page.
The First Office Action dated Jul. 13, 2015 for Mexican Patent Application No. MX/a/2013/009791 is not translated into English, 2 pages.
Notice of Allowance for Mexican Patent Application No. MX/a/2013/009791 dated Mar. 15, 2016, 1 page.
Office Action dated Nov. 12, 2014 for Mexican Patent Application No. MX/a/2013/009794, 2 pages.
Office Action dated Oct. 17, 2016 for European Patent Appln. No. 12701638.4, all pages.
Notice of Allowance dated Feb. 18, 2015 for Mexican Patent Application No. MX/a/2013/009794, 1 page.
The First Office Action dated Aug. 7, 2014 for Mexican Patent Application No. MX/a/2013/006262 is not translated into English, 2 pages.
Office Action dated Feb. 10, 2015 for Mexican Patent Application No. MX/a/2013/006770, 2 pages.
Office Action dated Feb. 6, 2015 for Mexican Patent Application No. MX/a/2013/006520, 2 pages.
Office Action dated Jan. 28, 2015 for Mexican Patent Application No. MX/a/2013/006973, 9 pages.
Notice of Allowance for Mexican Patent Application No. MX/a/2013/006973 dated Sep. 4, 2015, 1 page.
Office Action dated Dec. 5, 2014 for Mexican Patent Application No. MX/a/2013/009882, 2 pages.
Office Action for European Patent App. 12704473.3 dated Apr. 29, 2016, all pages.
The Second Office Action dated Apr. 22, 2015 for Mexican Patent Application No. MX/a/2013/009883, 2 pages.
Supplementary European Search Report for EP 11843423 completed Mar. 23, 2016, 8 pages.
Supplementary European Search Report for EP 11843045 completed Mar. 31, 2016, all pages.
Kato et al, “2D barcodes for mobile phones”, Mobile Technology, Applications and Systems, 2005 2nd International Conference on Guangzhou, China Nov. 15-17, 2005, Piscataway, NJ, USA, IEEE, Piscataway, NJ, USA, Nov. 15, 2005, pp. 8pp-8, XP031887368, DOI: 10.1109/MTAS.2005.207166; ISBN: 978-981-05-4573-4, 8 pages.
Liu, Yue et al., "Recognition of QR code with mobile phones," 2008 Chinese Control and Decision Conference (CCDC 2008), Jul. 2-4, 2008, pp. 203-206.
Ngee, S., "Data Transmission Between PDA and PC Using WiFi for Pocket Barcode Application," Thesis, Universiti Teknologi Malaysia, May 2007, 126 pp. Found online at http://eprints.utm.my/6421/1/SeahYeowNgeeMFKE20007TTT.pdf, Oct. 22, 2010.
Olson, E., "Bar Codes Add Detail on Items in TV Ads," New York Times, Sep. 2010, 3 pp. Found online at http://www.nytimes.com/2010/09/27/business/media/27bluefly.html?src=busin, Oct. 22, 2010.
Publication of BR 11 2014 020007-6 A2 on Jun. 20, 2017, 1 page.
Publication of PCT/US2011/059977 by the India Controller General of Patents Designs and Trademarks as India Patent Publication No. 4694/CHENP/2013 A on Sep. 5, 2014, 1 page.
Publication of PCT/US2012/025634 by the India Controller General of Patents Designs and Trademarks as India Patent Publication No. 6967/CHENP/2013 A on Aug. 1, 2014, 1 page.
Rekimoto, J., et al., “Augment-able Reality: Situated Communication Through Physical and Digital Spaces,” Sony Computer Science Laboratory, 2002, 8 pp. Found online at Citeseer: 10.1.1.20.34[1].pdf, Oct. 22, 2010.
Schmitz, A., et al., “Ad-Hoc Multi-Displays for Mobile Interactive Applications,” 31st Annual Conference of the European Association for Computer Graphics (Eurographics 2010), May 2010, vol. 29, No. 2, 8 pages.
Silverstein, B., “QR Codes and TV Campaigns Connect,” ReveNews, Sep. 2010, 5 pp. Found online at http://www.revenews.com/barrysilverstein/qr-codes-and-tv-campaigns-connect/, Oct. 22, 2010.
Smith, L., “QR Barcodes Make History on Global TV,” 3 pp. Found online at http://lindsaysmith.com/worlds-first-mobio-mini-telethon/, Oct. 22, 2010.
Yamanari, T., et al., "Electronic Invisible Code Display Unit for Group Work on Reminiscence Therapy," Proceedings of the International MultiConference of Engineers and Computer Scientists 2009, vol. 1, IMECS 2009, Mar. 2009, 6 pp. Retrieved from the Internet: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.145.6904&rep=rep1&type=pdf.
Yang, C., et al., “Embedded Digital Information Integrated by Video-on-Demand System,” Proceedings of the Fourth International Conference on Networked Computing and Advanced Information Management, IEEE Computer Society, 2008, 6 pages.
First Office Action including Search Report from the State Intellectual Property Office for CN Patent Appln. No. 201280014034.5 dated Apr. 5, 2016, all pages.
First Office Action and Search Report from the State Intellectual Property Office (SIPO) for CN 201180056249.9 dated Feb. 3, 2016, all pages.
Notice of Decision to Grant for KR 10-2013-7024307 dated Apr. 14, 2017, 2 pages.
Second Office Action issued by State Intellectual Property Office (SIPO) for CN 201180056249.9 dated Feb. 4, 2017, all pages.
Office Action and Search Report from the State Intellectual Property Office for CN Pat. Appln. No. 201180066584.7 dated Jul. 10, 2015, 12 pages.
Second Office Action issued by State Intellectual Property Office for CN Pat. Appln. No. 201180066584.7 dated Jan. 11, 2016, 5 pages.
Office Action and Search Report for ROC (Taiwan) Patent Application No. 10014870 dated May 7, 2014, 9 pages.
Office Action for Korean Patent Application No. 10-2013-7020207 dated Dec. 21, 2016, all pages.
Decision to Grant for Korean Patent Application No. 10-2013-7020207 dated Mar. 9, 2017, all pages.
Office Action and Search Report for ROC (Taiwan) Patent Application No. 100149344 dated Jan. 23, 2015, 8 pages.
Search Report for Patent Application ROC (Taiwan) Patent Application No. 100149344 dated Oct. 28, 2015, 1 page.
Office Action and Search Report for ROC (Taiwan) Patent Application No. 101106313, all pages.
Office Action and Search Report for ROC (Taiwan) Patent Application No. 100142966 dated May 27, 2014, 6 pages.
Office Action for European Patent Application No. 12707435.9 dated Mar. 12, 2015, 6 pages.
Office Action for European Patent Application No. 12719817.4 dated Jun. 23, 2014, 5 pages.
First Office Action for CN 201280010873 dated Mar. 2, 2016, all pages. (no art cited).
Notice of Decision to Grant for CN 201280010873 dated Mar. 25, 2016, all pages. (not in English).
Notice of Allowance for Canadian Application 2,822,214 dated Nov. 28, 2016, 1 page.
U.S. Appl. No. 14/179,336, filed Feb. 12, 2014, Non-Final Office Action dated May 22, 2014, 14 pages.
U.S. Appl. No. 14/179,336, filed Feb. 12, 2014 Final Office Action dated Dec. 1, 2014, 30 pages.
U.S. Appl. No. 14/179,336, filed Feb. 12, 2014 Notice of Allowance dated Feb. 18, 2015, 15 pages.
U.S. Appl. No. 13/302,717, filed Nov. 22, 2011 Non Final Rejection dated Jun. 16, 2016, all pages.
U.S. Appl. No. 13/302,717, filed Nov. 22, 2011 Non Final Rejection dated Dec. 2, 2015, 27 pages.
U.S. Appl. No. 13/302,717, filed Nov. 22, 2011 Final Rejection dated May 8, 2015, 44 pages.
U.S. Appl. No. 13/302,717, filed Nov. 22, 2011 Non-Final Rejection dated Dec. 18, 2014, 71 pages.
U.S. Appl. No. 12/958,073, filed Dec. 1, 2010, Office Action dated Aug. 31, 2012, 12 pages.
U.S. Appl. No. 12/958,073, filed Dec. 1, 2010, Notice of Allowance dated Jan. 17, 2013, 17 pages.
U.S. Appl. No. 12/961,369, filed Dec. 6, 2010, Non-Final Office Action dated Mar. 9, 2012, 17 pages.
U.S. Appl. No. 12/964,478, filed Dec. 9, 2010, Non-Final Office Action dated Mar. 26, 2013, 19 pages.
U.S. Appl. No. 12/964,478, filed Dec. 9, 2010, Final Office Action dated Sep. 16, 2013, 12 pages.
U.S. Appl. No. 12/971,349, filed Dec. 17, 2010, Office Action dated Nov. 10, 2011, 9 pages.
U.S. Appl. No. 12/971,349, filed Dec. 17, 2010, Final Office Action dated Jan. 20, 2012, 10 pages.
U.S. Appl. No. 12/961,369, filed Dec. 6, 2010, Notice of Allowance dated Jul. 16, 2014, 15 pages.
U.S. Appl. No. 12/961,369, filed Dec. 6, 2010, Final Rejection dated Oct. 30, 2012, 17 pages.
U.S. Appl. No. 12/961,369, filed Dec. 6, 2010, Non-Final Office Action dated Mar. 25, 2013, 17 pages.
U.S. Appl. No. 12/961,369, filed Dec. 6, 2010, Non-Final Office Action dated Jul. 12, 2013, 22 pages.
U.S. Appl. No. 12/961,369, filed Dec. 6, 2010, Non-Final Office Action dated Feb. 13, 2014, 21 pages.
U.S. Appl. No. 12/971,349, filed Dec. 17, 2010, Notice of Allowance dated Oct. 2, 2013, 24 pages.
U.S. Appl. No. 12/971,349, filed Dec. 17, 2010, Final Rejection dated Oct. 24, 2012, 11 pages.
U.S. Appl. No. 12/971,349, filed Dec. 17, 2010, Office Action dated Jul. 16, 2012, 11 pages.
U.S. Appl. No. 12/981,244, filed Dec. 29, 2010, Office Action dated Dec. 21, 2012, 23 pages.
U.S. Appl. No. 12/981,244, filed Dec. 29, 2010, Final Office Action dated Oct. 30, 2013, 10 pages.
U.S. Appl. No. 12/981,244, filed Dec. 29, 2010, Notice of Allowance dated Mar. 25, 2014, 17 pages.
U.S. Appl. No. 12/984,385, filed Jan. 4, 2011, Notice of Allowance dated Nov. 28, 2012, 11 pages.
U.S. Appl. No. 12/984,385, filed Jan. 4, 2011, Office Action dated Jul. 12, 2012, 16 pages.
U.S. Appl. No. 12/986,721, filed Jan. 7, 2011, Office Action dated Mar. 16, 2012, 6 pages.
U.S. Appl. No. 12/986,721, filed Jan. 7, 2011, Notice of Allowance dated Jun. 21, 2012, 7 pages.
U.S. Appl. No. 12/953,227, filed Nov. 23, 2010, Final Office Action dated May 24, 2013, 17 pages.
U.S. Appl. No. 12/953,227, filed Nov. 23, 2010, Office Action dated Oct. 7, 2012, 31 pages.
U.S. Appl. No. 12/953,227, filed Nov. 23, 2010, Non-Final Office Action dated Mar. 24, 2015, 39 pages.
U.S. Appl. No. 12/953,227, filed Nov. 23, 2010, Final Office Action dated Nov. 6, 2015, 26 pages.
U.S. Appl. No. 12/953,227, filed Nov. 23, 2010, Notice of Allowance dated May 9, 2017, all pages.
U.S. Appl. No. 13/015,382, filed Jan. 27, 2011, Office Action dated Nov. 13, 2012, 7 pages.
U.S. Appl. No. 13/015,382, filed Jan. 27, 2011, Notice of Allowance dated Feb. 22, 2013, 12 pages.
U.S. Appl. No. 13/016,483, filed Jan. 28, 2011 Office Action dated Nov. 2, 2012, 18 pages.
U.S. Appl. No. 13/016,483, filed Jan. 28, 2011 Final Office Action dated Jun. 27, 2013, 13 pages.
U.S. Appl. No. 13/016,483, filed Jan. 28, 2011 Non-Final Office Action dated Nov. 3, 2014, 33 pages.
U.S. Appl. No. 13/016,483, filed Jan. 28, 2011 Final Office Action dated May 13, 2015, 34 pages.
U.S. Appl. No. 13/016,483, filed Jan. 28, 2011 Non-Final Office Action dated Dec. 14, 2015, 27 pages.
U.S. Appl. No. 13/016,483, filed Jan. 28, 2011 Final Office Action dated Jul. 5, 2016, all pages.
U.S. Appl. No. 13/035,474, filed Feb. 25, 2011 Non Final Rejection dated Feb. 17, 2015, 57 pages.
U.S. Appl. No. 12/953,273, filed Nov. 23, 2010, Notice of Allowance, dated Oct. 18, 2012, 11 pages.
U.S. Appl. No. 12/965,645, filed Dec. 10, 2010, Non-Final Office Action, dated Jul. 19, 2013, 20 pages.
U.S. Appl. No. 12/965,645, filed Dec. 10, 2010, Final Office Action, dated Mar. 18, 2014, 24 pages.
U.S. Appl. No. 12/965,645, filed Dec. 10, 2010, Notice of Allowance, dated Jun. 20, 2014, 35 pages.
U.S. Appl. No. 12/973,431, filed Dec. 20, 2010, Non-Final Rejection dated May 15, 2013, 30 pages.
U.S. Appl. No. 12/973,431, filed Dec. 20, 2010, Final Office Action dated Aug. 27, 2013, 11 pages.
U.S. Appl. No. 12/973,431, filed Dec. 20, 2010 Non-Final Rejection dated Dec. 19, 2014, 30 pages.
U.S. Appl. No. 12/973,431, filed Dec. 20, 2010 Notice of Allowance dated May 28, 2015, 20 pages.
U.S. Appl. No. 13/007,317, filed Jan. 14, 2011, Office Action dated Dec. 19, 2012, 29 pages.
U.S. Appl. No. 13/010,557, filed Jan. 20, 2011, Final Rejection dated Jan. 16, 2014, 17 pages.
U.S. Appl. No. 13/010,557, filed Jan. 20, 2011, Non-Final Rejection dated Aug. 5, 2013, 17 pages.
U.S. Appl. No. 13/014,591, Notice of Allowance dated May 24, 2013, 32 pages.
U.S. Appl. No. 13/020,678, filed Feb. 3, 2011, Office Action dated Jul. 30, 2012, 15 pages.
U.S. Appl. No. 13/020,678, filed Feb. 3, 2011, Notice of Allowance dated Jan. 3, 2013, 13 pages.
U.S. Appl. No. 13/007,317, Notice of Allowance dated May 13, 2013, 16 pages.
U.S. Appl. No. 13/028,030, filed Feb. 15, 2011, Office Action dated Jan. 11, 2013, 14 pages.
U.S. Appl. No. 13/028,030, filed Feb. 15, 2011, Final Office Action dated Jul. 11, 2014, 43 pages.
U.S. Appl. No. 13/028,030, filed Feb. 15, 2011, Non-Final Office Action dated Feb. 6, 2015, 56 pages.
U.S. Appl. No. 13/028,030, filed Feb. 15, 2011, Final Office Action dated Jul. 17, 2015, 63 pages.
U.S. Appl. No. 13/031,115, Notice of Allowance dated Apr. 16, 2013, 24 pages.
U.S. Appl. No. 13/034,482, filed Feb. 24, 2011 Notice of Allowance dated Aug. 29, 2014, 45 pages.
U.S. Appl. No. 13/034,482, filed Feb. 24, 2011, Final Office Action dated Apr. 25, 2013, 19 pages.
U.S. Appl. No. 13/034,482, filed Feb. 24, 2011, Office Action dated Oct. 19, 2012, 11 pages.
U.S. Appl. No. 13/035,474, filed Feb. 25, 2011, Office Action dated Oct. 30, 2012, 11 pages.
U.S. Appl. No. 13/035,474, filed Feb. 25, 2011, Final Rejection dated Mar. 29, 2013, 20 pages.
U.S. Appl. No. 13/035,474, filed Feb. 25, 2011, Non Final Rejection dated Mar. 6, 2014, 20 pages.
U.S. Appl. No. 13/035,474, filed Feb. 25, 2011 Final Rejection dated Aug. 27, 2014, 38 pages.
U.S. Appl. No. 13/035,474, filed Feb. 25, 2011 Non Final Rejection dated Sep. 11, 2015, 65 pages.
U.S. Appl. No. 12/960,285, filed Dec. 3, 2010 Non-Final Office Action dated May 14, 2015, 21 pages.
U.S. Appl. No. 12/960,285, filed Dec. 3, 2010 Final Office Action dated Dec. 3, 2014, 19 pages.
U.S. Appl. No. 12/960,285, filed Dec. 3, 2010, Non-Final Office Action dated Jun. 6, 2014, 19 pages.
U.S. Appl. No. 12/960,285, filed Dec. 3, 2010, Final Office Action dated Apr. 18, 2013, 14 pages.
U.S. Appl. No. 12/960,285, filed Dec. 3, 2010, Non-Final Office Action dated Dec. 6, 2012, 17 pages.
U.S. Appl. No. 12/960,285, filed Dec. 3, 2010, Notice of Allowance dated Nov. 18, 2015, 31 pages.
U.S. Appl. No. 13/006,270, filed Jan. 13, 2011, Non-Final Office Action dated Oct. 8, 2013, 20 pages.
U.S. Appl. No. 13/006,270, filed Jan. 13, 2011, Final Office Action dated May 9, 2014, 41 pages.
U.S. Appl. No. 13/006,270, filed Jan. 13, 2011 Non-Final Office Action dated Sep. 12, 2014, 41 pages.
U.S. Appl. No. 13/006,270, filed Jan. 13, 2011, Final Office Action dated Mar. 23, 2014, 51 pages.
U.S. Appl. No. 13/028,030, filed Feb. 15, 2011 Non-Final Office Action dated Dec. 17, 2013, 60 pages.
U.S. Appl. No. 13/035,525, filed Feb. 25, 2011, Office Action dated Jul. 18, 2012, 15 pages.
U.S. Appl. No. 13/035,525, filed Feb. 25, 2011, Final Office Action dated Jan. 31, 2013, 26 pages.
U.S. Appl. No. 13/035,525, filed Feb. 25, 2011, Non-Final Office Action dated May 15, 2013, 15 pages.
U.S. Appl. No. 13/035,525, filed Feb. 25, 2011, Final Office Action dated Sep. 12, 2013, 21 pages.
U.S. Appl. No. 13/037,302, filed Feb. 28, 2011 Office Action dated Mar. 1, 2013, 20 pages.
U.S. Appl. No. 13/037,302, filed Feb. 28, 2011, Final Office Action dated Oct. 16, 2013, 28 pages.
U.S. Appl. No. 13/037,302, filed Feb. 28, 2011, Final Office Action dated May 4, 2015, 54 pages.
U.S. Appl. No. 13/037,302, filed Feb. 28, 2011, Non-Final Office Action dated Jan. 12, 2016, 62 pages.
U.S. Appl. No. 13/037,302, filed Feb. 28, 2011, Final Office Action dated Jul. 12, 2016, all pages.
U.S. Appl. No. 13/037,302, filed Feb. 28, 2011, Notice of Allowance dated Feb. 16, 2017, all pages.
U.S. Appl. No. 13/037,312, filed Feb. 28, 2011, Office Action dated Aug. 15, 2012, 9 pages.
U.S. Appl. No. 13/037,312, filed Feb. 28, 2011, Notice of Allowance dated Jun. 13, 2013, 10 pages.
U.S. Appl. No. 13/037,312, filed Feb. 28, 2011, Final Office Action dated Feb. 28, 2013, 18 pages.
U.S. Appl. No. 13/037,316, filed Feb. 28, 2011, Office Action dated Jan. 30, 2013, 21 pages.
U.S. Appl. No. 13/037,316, filed Feb. 28, 2011, Final Office Action dated Aug. 28, 2013, 13 pages.
U.S. Appl. No. 13/037,333, filed Feb. 28, 2011, Notice of Allowance dated Jan. 18, 2013, 27 pages.
U.S. Appl. No. 13/192,287, filed Jul. 27, 2011, Notice of Allowance dated Dec. 14, 2015, 14 pages.
U.S. Appl. No. 13/192,287, filed Jul. 27, 2011, Final Office Action dated Jan. 28, 2014, 18 pages.
U.S. Appl. No. 13/192,287, filed Jul. 27, 2011, Non Final Office Action dated Jun. 13, 2013, 22 pages.
U.S. Appl. No. 13/673,480, filed Nov. 9, 2012, Office Action dated Jan. 16, 2013, 27 pages.
U.S. Appl. No. 13/673,480, filed Nov. 9, 2012 Final Office Action dated Sep. 9, 2013, 10 pages.
U.S. Appl. No. 13/673,480, filed Nov. 9, 2012 Notice of Allowance dated Nov. 12, 2013, 16 pages.
U.S. Appl. No. 13/475,794, filed May 18, 2012 Non-Final Office Action dated Sep. 18, 2013, 19 pages.
U.S. Appl. No. 13/475,794, filed May 18, 2012 Non-Final Office Action dated Nov. 21, 2014, 33 pages.
U.S. Appl. No. 13/475,794, filed May 18, 2012 Final Office Action dated Jun. 1, 2015, 45 pages.
U.S. Appl. No. 13/475,794, filed May 18, 2012 Non Final Office Action dated Jul. 29, 2016, all pages.
U.S. Appl. No. 13/475,794, filed May 18, 2012 Notice of Allowance dated Jan. 5, 2017, all pages.
U.S. Appl. No. 13/864,474, filed Apr. 17, 2013 Non Final Office Action dated Aug. 11, 2015, 59 pages.
U.S. Appl. No. 13/864,474, filed Apr. 17, 2013 Final Office Action dated Nov. 20, 2015, all pages.
U.S. Appl. No. 13/864,474, filed Apr. 17, 2013 Non Final Office Action dated Mar. 23, 2016, all pages.
U.S. Appl. No. 13/864,474, filed Apr. 17, 2013 Notice of Allowance dated Feb. 16, 2017, all pages.
U.S. Appl. No. 13/968,611, filed Aug. 16, 2013, Notice of Allowance dated May 2, 2014, 40 pages.
U.S. Appl. No. 13/968,611, filed Aug. 16, 2013, Non-Final Office Action dated Jan. 17, 2014, 21 pages.
U.S. Appl. No. 14/852,787, filed Sep. 14, 2015, Non-Final Office Action dated Sep. 14, 2016, all pages.
U.S. Appl. No. 14/852,787, filed Sep. 14, 2015, Final Office Action dated Jan. 13, 2017, all pages.
Third Office Action from State Intellectual Property Office for CN Appln. No. CN 201280013891.3 dated Dec. 30, 2016, all pages.
U.S. Appl. No. 14/852,787, filed Sep. 14, 2015, Non-Final Office Action dated Sep. 3, 2017, all pages.
Office Action for CA 2,818,757 dated Jul. 12, 2017, all pages.
Office Action for CA 2,823,636 dated Jan. 24, 2017, all pages.
Office Action for CA 2,825,414 dated Apr. 3, 2017, all pages.
Office Action for CA 2,825,414 dated Nov. 1, 2017, all pages.
First Examination Report for EP Appln No. 12716110.7 dated Aug. 2, 2017, all pages.
First Office Action for CA Appln No. 2828447 dated Sep. 7, 2017, all pages.
U.S. Appl. No. 15/595,621, filed May 15, 2017 Final Rejection dated Mar. 8, 2018, all pages.
U.S. Appl. No. 14/852,787, filed Sep. 14, 2015, Notice of Allowance dated Feb. 21, 2018, all pages.
U.S. Appl. No. 15/639,871, filed Jun. 30, 2017, Notice of Allowance dated Mar. 5, 2018, all pages.
Office Action for CA 2,818,757 dated Jun. 13, 2018, all pages.
Examination Report for Indian Appln No. 7734/CHENP/2013 dated Jun. 25, 2018, all pages.
Decision of Rejection for Korean Patent Appln. No. 10-2013-7015610 dated Jun. 27, 2018, all pages.
Office Action for EP 11843423 dated May 22, 2018, all pages.
Office Action for CA 2819146 dated Jan. 15, 2018, all pages.
Office Action for EP 11 846 858.6 dated Feb. 9, 2018, all pages.
Office Action for EP 12707418.5, dated Feb. 28, 2018, all pages.
Office Action for EP 11846486 dated Jan. 5, 2018, all pages.
Office Action for CA 2828447, all pages.
Related Publications (1)
Number: 20180007415 A1; Date: Jan. 2018; Country: US
Provisional Applications (1)
Number: 61/417,076; Date: Nov. 2010; Country: US
Continuations (1)
Parent: 13/302,717; Date: Nov. 2011; Country: US
Child: 15/655,266; Country: US