The present disclosure relates generally to an interface for television programming. More particularly, the present disclosure relates to QR codes presented on a display for scanning, where the codes have accessibility content stored therein.
In one embodiment, a method of presenting a matrix code for providing accessibility content may include receiving, at a content receiver, a signal carrying accessibility content, generating, at the content receiver, a matrix code from the accessibility content, and transmitting the matrix code as part of a content presentation to a presentation device for display, wherein the content presentation includes at least one of audio and visual content and the accessibility content includes an alternative form of at least one of the audio and the visual content for allowing a user to more fully comprehend the content presentation.
In variations of this embodiment, the signal carrying the accessibility content may include the content presentation. The content receiver may transmit the matrix code in response to a user indication. Furthermore, the content receiver may transmit the matrix code in a user-indicated configuration. The user-indicated configuration may include a size of the matrix code on the presentation device and/or a position of the matrix code on the presentation device.
In further variations of this embodiment, the matrix code may be displayed on the presentation device continuously throughout the content presentation, or intermittently throughout the content presentation. More than one matrix code may be displayed on the presentation device throughout the content presentation. The accessibility content may include visual information regarding the content presentation. In this manner, the visual information may be provided in a format for use with a Braille writer. Alternatively or additionally, the accessibility content may include audio information regarding the content presentation. Furthermore, the matrix code may define a two-dimensional pattern that embodies data. Thus, the matrix code may be a QR code.
In another embodiment, a method of generating a matrix code representing accessibility content may include receiving, at a content receiver, a signal carrying data relating to a content presentation and to the accessibility content; and generating, at the content receiver, a matrix code from the data relating to the accessibility content, wherein the content presentation includes at least one of audio and visual content and the accessibility content includes an alternative form of at least one of the audio and the visual content for allowing a user to more fully comprehend the content presentation.
In variations of this embodiment, the matrix code and the content presentation may be displayed simultaneously on a presentation device. The presentation device may be a television. Furthermore, the content receiver may be a connected and integral part of the presentation device.
In further embodiments, a method of generating a matrix code having accessibility content for allowing a user to more fully comprehend a content presentation having at least one of audio and visual content may include receiving, at a content receiver, a signal carrying data relating to the accessibility content and generating, at the content receiver, a matrix code from the data relating to the accessibility content and configured in a format for use with a Braille writer, wherein the accessibility content includes an alternative form of at least one of the audio and the visual content of the content presentation.
In still further embodiments, a system may include an input that receives a broadcast signal carrying accessibility content, a processor that receives the broadcast signal and that generates a matrix code that includes the accessibility content, and an output that transmits the matrix code to a presentation device for display.
In still other embodiments, a system may include an optical device configured to capture a matrix code presented with an audiovisual presentation, a processor in communication with the optical device and configured to receive the matrix code from the optical device and transform the matrix code into accessibility data readable by an accessibility device, and an accessibility device interface component in communication with the accessibility device and the processor and configured to output the accessibility data to the accessibility device. The accessibility device may provide accessibility information to a user based on the received accessibility data.
It is to be understood that both the foregoing general description and the following detailed description are for purposes of example and explanation and do not necessarily limit the present disclosure. The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate subject matter of the disclosure. Together, the descriptions and the drawings serve to explain the principles of the disclosure.
The description that follows includes sample systems and methods that embody various elements of the present disclosure. However, it should be understood that the described disclosure may be practiced in a variety of forms in addition to those described herein.
Audiovisual programming generally involves transmitting signals carrying audiovisual content to a receiver. Generally, the audiovisual content includes one or more streams of data supporting several viewable programs at any given time. As such, the signals may be processed and the audiovisual content may be selectively displayed on a presentation device by, for example, selecting a particular channel. It should be appreciated that selecting a channel does not necessarily require tuning to a particular frequency but may instead involve selecting a different group of data streams (typically identified by packet identifiers, or PIDs) for processing and/or display.
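Purely as an illustration of stream selection by packet identifier, and not as part of the disclosure, the following Python sketch filters MPEG-2 transport-stream packets by PID; the 188-byte packet layout is standard, while the file name and the chosen PID value are assumptions.

```python
# Minimal sketch: select packets belonging to one elementary stream (PID)
# from an MPEG-2 transport stream. Assumes 188-byte packets and a known PID.

TS_PACKET_SIZE = 188
SYNC_BYTE = 0x47

def packets_for_pid(ts_bytes: bytes, wanted_pid: int):
    """Yield raw transport packets whose 13-bit PID matches wanted_pid."""
    for offset in range(0, len(ts_bytes) - TS_PACKET_SIZE + 1, TS_PACKET_SIZE):
        packet = ts_bytes[offset:offset + TS_PACKET_SIZE]
        if packet[0] != SYNC_BYTE:
            continue  # out of sync; a real demultiplexer would resynchronize
        pid = ((packet[1] & 0x1F) << 8) | packet[2]
        if pid == wanted_pid:
            yield packet

# Example usage (file name and PID 0x101 are arbitrary, assumed values):
# with open("broadcast.ts", "rb") as f:
#     selected = list(packets_for_pid(f.read(), 0x101))
```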
In addition to audiovisual content, the streams of data may include accessibility content relating to one or more of the audiovisual signals. In one embodiment, accessibility content may refer to any information, whether presented in an audio, visual, or tactile form, that allows persons having one or more impairments, disabilities, infirmities, language barriers, or any other condition to more fully comprehend the audiovisual content provided through the presentation device. That is, the accessibility content may include an alternative form of the visual aspect of the audiovisual content, an alternative form of the audio aspect of the audiovisual content, or some combination of alternative forms of both the visual and audio aspects of the content. Existing accessibility content, which is currently supported on many presentation devices, includes closed captioning, wherein the audio aspect of a content presentation is provided in an alternative textual form displayed on a portion of the presentation device corresponding to the audiovisual stream, in order to allow hearing-impaired persons to read content that is otherwise presented as audio. Furthermore, bilingual content may be available in some systems, such that the audio is provided in two languages, e.g., English and Spanish, to allow the user to select a desired language.
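The disclosure does not fix a payload format for accessibility content; the following sketch shows one hypothetical way a receiver or reader might model a unit of such content (every field name here is an assumption for illustration only).

```python
from dataclasses import dataclass

@dataclass
class AccessibilityContent:
    """Hypothetical record for one unit of accessibility content."""
    program_id: str            # program the content relates to (assumed identifier)
    start_time_s: float        # point in the presentation the description covers
    kind: str                  # e.g. "scene_description", "caption", "translation"
    language: str              # e.g. "en", "es"
    text: str                  # alternative textual form of audio or visual content
    braille_ready: bool = False  # True if text is already formatted for a Braille writer

example = AccessibilityContent(
    program_id="channel-7-evening-news",
    start_time_s=125.0,
    kind="scene_description",
    language="en",
    text="The anchor stands beside a weather map showing a storm front.",
)
```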
The present disclosure relates to systems and methods for receiving accessibility content from a television provider as well as methods associated with presenting and/or receiving the accessibility content. In some embodiments, the system includes a satellite configured to receive transmissions and instructions from a satellite provider. The methods, in some embodiments, involve producing a matrix code, such as a QR code, and associating the matrix code with audiovisual content. Specifically, the matrix code may contain data relating to accessibility content for association with the audiovisual content. A user may scan the code using any device capable of reading (scanning) the matrix code and converting the code into transmittable accessibility content. The transmittable accessibility content may be transmitted to an accessibility device, for receipt by the user. In this manner, a QR code may be produced and associated with a television program. The QR code may be viewable on the presentation device while the television program is being displayed. Thus, the accessibility content may be presented concurrently with the television program to which such information relates, and it may be concurrently received by the reading device (reader), and thereafter transmitted to the accessibility device wherein the user may receive the accessibility content.
In one embodiment, the accessibility content may be visual information related to the television program (for example, descriptions of scenes, descriptions of characters, descriptions of actions, and other descriptions of the visual content of the television program) translated into Braille, the reading device may be an optical scanner, and the accessibility device may be a Braille writer. In this embodiment, the matrix code may contain accessibility content relating to the visual content of a television program which would not be viewable by persons having visual impairments. For example, the accessibility content may include a narration of a television program allowing a person with visual impairments to “hear” the television program, or the narration may be translated into Braille such that the person may use their sense of touch with a Braille writer to read the visual aspects of the program.
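The disclosure does not specify how narration text would be rendered as Braille; the sketch below shows a minimal, uncontracted (Grade 1) letter-by-letter translation into Unicode Braille cells, ignoring capitalization, numerals, punctuation, and contractions that a real translator would handle.

```python
# Minimal sketch: translate plain text to uncontracted (Grade 1) Unicode Braille.
# Dots 1-6 map to bits 0x01-0x20 of the Unicode Braille block starting at U+2800.

_DOT_BITS = {1: 0x01, 2: 0x02, 3: 0x04, 4: 0x08, 5: 0x10, 6: 0x20}

# Standard Braille alphabet: 'a'-'j' form the first decade, 'k'-'t' add dot 3,
# and 'u', 'v', 'x', 'y', 'z' add dots 3 and 6 ('w' is the historical exception).
_FIRST_DECADE = {
    "a": (1,), "b": (1, 2), "c": (1, 4), "d": (1, 4, 5), "e": (1, 5),
    "f": (1, 2, 4), "g": (1, 2, 4, 5), "h": (1, 2, 5), "i": (2, 4), "j": (2, 4, 5),
}

def _cell(dots):
    return chr(0x2800 + sum(_DOT_BITS[d] for d in dots))

_LETTERS = {}
for letter, dots in _FIRST_DECADE.items():
    _LETTERS[letter] = _cell(dots)                        # a-j
    _LETTERS[chr(ord(letter) + 10)] = _cell(dots + (3,))  # k-t add dot 3
for target, base in zip("uvxyz", "abcde"):
    _LETTERS[target] = _cell(_FIRST_DECADE[base] + (3, 6))  # u, v, x, y, z
_LETTERS["w"] = _cell((2, 4, 5, 6))                       # 'w' is irregular

def to_braille(text: str) -> str:
    """Translate letters to Braille cells; keep spaces, drop everything else."""
    cells = []
    for ch in text.lower():
        if ch in _LETTERS:
            cells.append(_LETTERS[ch])
        elif ch == " ":
            cells.append("\u2800")  # blank cell for a space
    return "".join(cells)

# to_braille("scene change") -> "⠎⠉⠑⠝⠑⠀⠉⠓⠁⠝⠛⠑"
```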
In this or another embodiment, the device that reads the matrix code and the device that provides the accessibility content to the user may be, or may be a part of, the same device. Thus, a single device may be provided to read the accessibility content from the displayed matrix code on the presentation device, and transmit such accessibility content to the user. Other embodiments and variations of those mentioned will be described below.
Referring now to
The receiver 102 may be configured to receive a signal carrying a broadcast television program, a program guide, a menu, a movie or other audiovisual content. The receiver 102 may also be configured to receive associated metadata or other data information carried by the signal. The receiver 102 may further be configured for transmitting the content to the presentation device 104 for viewing, listening, or both. As such, the receiver 102 may be in the form of a set top box 116, other television receiver box (such as a cable box, network-enabled box, and the like) or a satellite system as shown including a dish 112, cabling 114 leading to the set top box 116, and cabling 118 leading to the presentation device 104, for example. Other examples of a receiver 102 may include an antenna system employing an analog or digital antenna connected by cabling leading either to a television receiver or directly to the presentation device 104. Still other examples may include a cable system including an incoming cable leading directly to a presentation device 104 or to a presentation device via a set top box.
In some embodiments, the receiver 102 may be configured to convert, configure, or otherwise modify the content prior to transmitting it to the presentation device for viewing. The receiver may further be configured for storing and displaying audiovisual content. The receiver may thus be in the form of a computer-type device having one or more processing units 120, one or more inputs 122, one or more outputs 124, and one or more computer readable storage media 126 (which may take the form of, but is not limited to: a magnetic storage medium; optical storage medium; magneto-optical storage medium; read only memory; random access memory; erasable programmable memory; flash memory; non-transitory storage media; and so on).
In some embodiments, these computer-type elements may be incorporated into a set top box 116 for receiving a broadcast, program guide information, audio and video streams, other audiovisual content, data, or other information. The set top box 116 may receive the information through the one or more inputs 122, process or store the incoming information, and selectively output information to the presentation device 104 for viewing and interaction by a viewer. For example, the viewer may select which television channel he would like to watch, select from time-shifted television programs stored in the storage medium, or select movies from a video-on-demand menu. In another example, the viewer may navigate an electronic program guide or other series of menus, which may be output to the presentation device 104. Instructions executed by the processor 120 may be stored in the storage medium 126 or received through the one or more inputs 122 or both. The set top box 116 may include a remote control 128 for remote interaction with the viewer. In still other embodiments, some or all of these aspects of the receiver 102 may be connected and integral with the presentation device 104 described below. For example, a presentation device 104 in the form of a television may include some or all of these elements.
The presentation device 104 may include one or more inputs 130 for receiving information from the receiver and an electronic device 132 for receiving information from the one or more inputs 130 and transmitting the information to a display screen, speaker, or other output 134. The presentation device 104 may be a television, computer monitor, or other device for presenting a viewer with visual and/or audio stimuli.
The reader 106 may be a remote device configured for optically scanning information from the presentation device 104. The reader 106, like the receiver 102 described above, may also be a computer-type device having one or more inputs, a processor, a computer readable storage medium, and one or more outputs. One of the inputs of the reader 106 may include an optical receiver configured for receiving and recording light patterns. The optical receiver may be a digital optical receiver similar to that found in digital cameras and some mobile phones. In some embodiments, the reader 106 may be in the form of a dedicated optical scanner, a personal digital assistant (PDA), a portable computing device, a tablet computer, a smartphone and the like. The reader 106 may receive image input, for example, from the optical receiver and the processor may access processing instructions for the image from the computer readable storage medium. That is, in some embodiments, the reader 106 may have decoding software stored in the storage medium for decoding matrix codes. In some embodiments, the software may include an “auto run” feature so that decoding software automatically executes when a matrix code is scanned and recognized. The processor may process the image produced by the optical receiver and may follow additional instructions produced by the processing of the image. In at least one embodiment, the processor receives an image captured by the optical receiver, such as an image including a matrix code, and transforms the matrix code into accessibility data readable by an accessibility device, such as a Braille writer.
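The decoding software is not tied to any particular library; as one hedged sketch, an open-source decoder such as pyzbar (together with Pillow, both assumed to be installed) can recover the payload from a captured image.

```python
# Sketch of the reader's decode step. Assumes the third-party pyzbar and
# Pillow packages; the image file name is a placeholder.

from typing import Optional

from PIL import Image
from pyzbar.pyzbar import decode

def decode_matrix_code(image_path: str) -> Optional[str]:
    """Return the text payload of the first QR code found in the image, if any."""
    for result in decode(Image.open(image_path)):
        if result.type == "QRCODE":
            return result.data.decode("utf-8", errors="replace")
    return None

# payload = decode_matrix_code("captured_frame.png")
```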
The reader 106 may have data transmitting capabilities, such as, for example, by hardwired connection, a wireless network, or another electronic data connection. Thus, the reader 106 may be capable of transmitting the decoded information from the matrix code, which may be accessibility content, to an accessibility device 108. In at least one embodiment, the reader 106 includes an accessibility device interface component configured to output data to the accessibility device 108. For example, the accessibility device interface component may output Braille data for presentation by the accessibility device 108. The accessibility device 108 may be configured to receive accessibility content transmitted from the reader 106, and thereafter provide a user with the accessibility content. In one embodiment, the reader 106 and the accessibility device 108 may be the same device, or may be a part of the same device, such that the procedure of transmitting the data representing the accessibility content is not necessary. For example, the accessibility device 108 may be provided with a camera or other image-capture device, as well as appropriate software to recognize, decode and/or process a captured matrix code. Thus, in this embodiment, the same device may read the matrix-based code, and directly provide the accessibility content encoded therein to the user.
Having described a system 100 upon which the current methods may be performed, reference is now made to
A viewer of the television program may scan the matrix code in the display with a reader, for example, and the reader may decode the accessibility content and transmit it to an accessibility device, such as a Braille writer. In this fashion, a user with a visual impairment may be able to read the output of the Braille writer as the program is shown, thereby allowing the user to enjoy the program to a greater degree than if he were just listening to the audio portion of the program.
Before discussing the operations performed in the method, additional information is provided with regard to the programming content, the matrix codes included therein, and the information in the matrix codes. The programming content may be in the form of live television, pre-recorded television, movies, on-demand programming, or any other form of audiovisual content which may be provided from a content provider. In general, such content is produced by third parties, such as a television or movie production studio, and is then transmitted to the television signal provider for simultaneous or eventual broadcast or transmission, via the transmission media described above, to the user's display device.
The programming content may include a matrix code, such as a QR code, that is presented to the user for viewing and/or capturing, such as scanning with a reader 106. The matrix code may be visible throughout the programming or it may appear for a portion thereof. The matrix code may be provided on the entire display device (i.e., in place of the visual portion of the programming), or it may be provided on only a portion thereof. In this manner, a person with a visual impairment may choose to have the matrix code displayed on the entire display device if there is no one else watching the programming; however, if the visually impaired person is watching the programming with sighted persons, then the matrix code may be displayed in only a portion of the display device to allow the sighted persons to view the visual portion of the programming content. Moreover, the user may choose not to have the matrix code displayed on the display device at all.
As the accessibility content may be provided in the transmission signal along with the audiovisual content, the user may use the set-top box 116 or other like device to select the configuration (size, positioning, etc.) of the matrix code on the display device, including selecting whether or not the code will be present at all. The set-top box may execute the user's instructions by providing the display device with both the user's selected programming and the user's selected accessibility content. The receiver may thus have a menu stored in the computer readable storage medium that is accessible by a key pad on the set top box or via a remote control. The user may interact with the set top box using the menu and the set top box may filter or otherwise reconfigure the incoming signal to display the audiovisual content and data as selected by the user.
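How the set top box stores and applies these preferences is left open; the following sketch (all names and the size fractions are hypothetical) turns a menu selection into a pixel rectangle for the on-screen code.

```python
# Hypothetical sketch: map a menu selection (size + corner) to an on-screen
# rectangle for the matrix-code overlay. Screen size and fractions are assumed.

from dataclasses import dataclass

@dataclass
class CodeOverlayPrefs:
    enabled: bool = True
    size: str = "small"          # "small", "medium", or "full"
    corner: str = "bottom_right" # ignored when size is "full"

_SIZE_FRACTION = {"small": 0.25, "medium": 0.5, "full": 1.0}

def overlay_rect(prefs: CodeOverlayPrefs, screen_w: int, screen_h: int):
    """Return (x, y, w, h) of the overlay, or None when the code is turned off."""
    if not prefs.enabled:
        return None
    side = int(min(screen_w, screen_h) * _SIZE_FRACTION[prefs.size])
    x = 0 if "left" in prefs.corner else screen_w - side
    y = 0 if "top" in prefs.corner else screen_h - side
    return (x, y, side, side)

# overlay_rect(CodeOverlayPrefs(size="medium"), 1920, 1080) -> (1380, 540, 540, 540)
```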
An exemplary matrix code 140, as depicted in
The matrix codes 140 in the present embodiment may store accessibility data representing accessibility content. Accessibility data, in one embodiment, may relate to any type of information or content that would allow a person with an impairment to more fully sense, or otherwise receive, and enjoy the programming. As previously discussed, accessibility data may include a textual description of the visual programming content. In this manner, a person having a visual impairment may be able to receive such visual information via a Braille writer or other medium, in lieu of actually seeing the content. Greater or lesser levels of accessibility content are possible. For example, the content may simply describe the general scene of the programming, for example, every time the scene changes. In other examples, the content may fully describe certain or all scenes, backgrounds, actions, and other information occurring in the program. In some embodiments, the described accessibility content may be in the form of a narration of greater or lesser detail that allows a person with visual impairments to understand what the visual content of the audiovisual content is showing. As such, the user may hear the audio content and associate the audio content with the visual content.
Other types of accessibility content are possible. For example, the matrix code may be configured to provide textual information to a person with a hearing impairment. In this manner, the textual information could be displayed to the user on a device that is connected to (or is a part of) the reader 106 to allow the user to read the audio content. In a further example, the matrix code may be configured to provide textual or audio information in a language other than the broadcast language. In this manner, the textual or audio information could be provided to the user on a device that is connected to (or is a part of) the reader 106 to allow the user to read or listen to the audio content in another language.
Matrix codes having accessibility content stored therein may be provided at any interval and in any number throughout the course of the selected audiovisual content. For example, a single matrix code may be displayed at the beginning of the program. Alternatively, numerous codes may be displayed at regular intervals throughout the program, e.g., every minute, 30 seconds, 15 seconds, etc. In still other embodiments, the matrix codes may be spaced throughout the program at the beginning or end of each scene of the program. It is generally anticipated that a greater frequency of code presentation will correspond with a higher level of accessibility content being provided, though this may not be the case. In all cases, the set-top box 116 or other like device may be responsible for receiving the accessibility content from the signal, and presenting the content in the form of a matrix code on the display device.
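As an assumed illustration of these spacing options, a simple scheduler might compute the presentation times at which codes are shown, either at a fixed interval or at supplied scene boundaries.

```python
# Sketch: produce the presentation times (in seconds) at which matrix codes
# are shown, either at a fixed interval or at supplied scene boundaries.

from typing import Iterable, List, Optional

def code_display_times(duration_s: float,
                       interval_s: Optional[float] = None,
                       scene_starts_s: Optional[Iterable[float]] = None) -> List[float]:
    if scene_starts_s is not None:          # one code per scene change
        return sorted(t for t in scene_starts_s if 0 <= t <= duration_s)
    if interval_s:                          # regular intervals, e.g. every 30 s
        times, t = [], 0.0
        while t <= duration_s:
            times.append(t)
            t += interval_s
        return times
    return [0.0]                            # default: a single code at the start

# code_display_times(120, interval_s=30) -> [0.0, 30.0, 60.0, 90.0, 120.0]
```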
The accessibility content may be supplied by the content provider. For example, a producer of a movie may develop, or have developed, accessibility content for its movie. At the time of developing the accessibility content, a matrix code 140 and any associated features may be included in the movie. The matrix code 140 may be included in a data stream associated with the program or it may be included in a secondary visual stream. When a network or other program-providing entity assembles programming, it may include the accessibility content in its programming. Accordingly, the receiver in the system above may receive the accessibility content together with the program or other audiovisual content being provided by the television signal provider and may selectively display portions of the signal.
In another embodiment, the accessibility content may be supplied by the television signal provider. At the time of receiving the content from the content provider, the television signal provider may develop and embed the accessibility content along with the regular content signal and, as above, may embed the content as a data stream or a secondary visual stream. Thus, as above, the receiver in the system above may receive the accessibility content together with the program or other audiovisual content being provided by the television signal provider and may selectively display portions of the signal.
With this background regarding programming content, matrix codes, and the data stored therein, the method 250 (
Turning now to the embodiment shown in
Having received a signal carrying programming content with possibly additional accessibility content (252), the receiver 102 may search the signal for any matrix codes or otherwise available accessibility content which may be present (254). In other embodiments, the receiver 102 may be pre-configured or selectively configured via a menu to receive accessibility content from a particular stream of the transmitted signal. As such, the search step (254) may be omitted, as shown in
Depending on the nature of the accessibility content, the receiver 102 may generate a matrix code 140 in a format for graphical display. Generating a matrix code 140 may be done in several manners, depending on how the matrix code data is received into the receiver 102. In some embodiments, the matrix codes 140 may be received in a visual stream apart from the audiovisual content and the receiver 102 may display the visual stream together with the audiovisual content. The relationship of the matrix code visual stream to the audiovisual content may be selected by the user via a menu, and the receiver 102 may compile the audiovisual content and matrix code visual stream to suitably display the matrix visual stream together with the audiovisual content (256a). This may include replacing the visual aspect of the audiovisual content with the matrix code visual stream or it may include coordination between the two. In some other embodiments, matrix codes may be present in a data stream and may be received and read or decoded by the receiver 102 (256b). The receiver 102 may then generate matrix codes 140 for visual display and embed them in the audiovisual content or display them along with or in place of the visual aspect of the audiovisual content. In still other embodiments, the receiver 102 may receive a data stream including accessibility content and may develop matrix codes 140 by grouping portions of the accessibility content, generating a matrix code or codes 140, and associating the code or codes 140 with the audiovisual content (256c). Any of the above embodiments or combinations thereof may be included in the procedures (256a, 256b, 256c) of generating a matrix code 140 in a format for display. It will be appreciated that, in certain embodiments, two or more procedures (256a, 256b, 256c) may be performed, depending on how the matrix code data is received.
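One possible realization of procedure (256c), grouping accessibility text and producing codes, is sketched below using the open-source qrcode package; the chunk size and the library choice are assumptions rather than part of the disclosure.

```python
# Sketch of procedure (256c): split accessibility text into chunks that fit
# comfortably in a QR code and render one code image per chunk.
# Assumes the third-party "qrcode" and "Pillow" packages are installed.

import qrcode

MAX_CHARS_PER_CODE = 500  # assumed; well under the capacity of larger QR versions

def accessibility_text_to_codes(text: str):
    """Return a list of PIL images, one QR code per chunk of text."""
    chunks = [text[i:i + MAX_CHARS_PER_CODE]
              for i in range(0, len(text), MAX_CHARS_PER_CODE)]
    images = []
    for chunk in chunks:
        qr = qrcode.QRCode(error_correction=qrcode.constants.ERROR_CORRECT_M,
                           box_size=8, border=4)
        qr.add_data(chunk)
        qr.make(fit=True)  # choose the smallest QR version that fits the chunk
        images.append(qr.make_image(fill_color="black", back_color="white"))
    return images

# codes = accessibility_text_to_codes("A storm front moves across the map ...")
# codes[0].save("accessibility_code_0.png")
```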
Having received a signal (252), possibly searched for accessibility content (254), and generated a matrix code for display (256), the receiver may then transmit the matrix code 140 (258) to the display device to be displayed (260). The transmission may correspond to the generated matrix code 140. That is, for example, where the accessibility content is replacing a portion of the audiovisual content, the transmission may include the accessibility content in the form of one or more matrix codes 140 and the portion of the audiovisual content not replaced.
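As an assumed illustration of transmitting a code together with the portion of the picture it does not replace, a frame may be composited with the code image at a chosen rectangle; Pillow is assumed here, whereas a real receiver would composite in its video pipeline rather than on still images.

```python
# Sketch: paste a matrix-code image onto a video frame at a selected rectangle.

from PIL import Image

def composite_code(frame: Image.Image, code_img: Image.Image, rect) -> Image.Image:
    """Return a copy of the frame with the code scaled into rect = (x, y, w, h)."""
    x, y, w, h = rect
    out = frame.copy()
    out.paste(code_img.resize((w, h)).convert(frame.mode), (x, y))
    return out

# frame = Image.open("frame.png")          # placeholder file names
# code = Image.open("accessibility_code_0.png")
# frame_with_code = composite_code(frame, code, (1380, 540, 540, 540))
```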
Once a program including matrix codes 140 has been output to a presentation device 104, a viewer may scan a matrix code when it is presented on the presentation device 104 (262). The matrix code 140 may have accessibility data, representing accessibility content, stored therein. The viewer may direct the optical receiver portion of a reader 106 toward the presentation device 104 when the presentation device 104 is displaying a matrix code 140. The viewer may then actuate the optical receiver by, for example, depressing a shutter button. In other embodiments, the optical receiver is in an “always on” configuration, i.e., it may capture the matrix code without input from the user. The reader 106 may thus capture an image of the matrix code 140. In some embodiments, the viewer may zoom, focus, or otherwise direct the reader toward the portion of the presentation device displaying the matrix code. In some embodiments, the reader 106 may be in a stationary position relative to the presentation device 104 and may be focused on all or a respective portion of the presentation device 104 such that it may capture matrix codes 140 as they are available and without the need for focusing, directing, or otherwise positioning the reader 106.
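A reader operating in such an “always on” configuration might poll its camera and decode continuously; the sketch below uses OpenCV's built-in QR detector (an assumption, since the disclosure names no decoder) and skips payloads it has already handled.

```python
# Sketch of an "always on" reader: poll the camera, decode any QR code in view,
# and hand new payloads off for processing. Assumes OpenCV (cv2) is installed.

import cv2

def watch_for_codes(on_payload, camera_index: int = 0):
    detector = cv2.QRCodeDetector()
    seen = set()                      # avoid re-processing a code shown repeatedly
    cap = cv2.VideoCapture(camera_index)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            payload, points, _ = detector.detectAndDecode(frame)
            if payload and payload not in seen:
                seen.add(payload)
                on_payload(payload)   # e.g. translate and send to a Braille writer
    finally:
        cap.release()

# watch_for_codes(lambda text: print("decoded:", text))
```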
As mentioned above, the reader 106 may include an auto run feature causing the reader 106 to begin the decoding process when a matrix code 140 has been captured. In other cases, the viewer may deliberately select software resident on the reader 106 and direct the software to decode the captured image of the matrix code 140. The software may decode the image thus producing the accessibility content. The accessibility content may be transmitted to an accessibility device, such as a Braille writer, or the reader 106 may be a part of the accessibility device, in which case the accessibility device 108 may have the accessibility content directly available to it. The accessibility device 108 may then provide the accessibility content to the user, in a manner depending on the type of device, as is known in the art (264).
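The way the accessibility device interface component delivers data is device specific and not fixed by the disclosure; as a deliberately hypothetical sketch, decoded text could be translated and written to a serial-attached Braille device using pySerial, with the port name, baud rate, and the device's expected encoding all assumed.

```python
# Hypothetical sketch of the accessibility device interface component:
# translate decoded text and write it to a serial-attached Braille device.
# The port name, baud rate, and the device's expected byte format are assumed.

import serial  # pySerial

def send_to_braille_device(text: str, port: str = "/dev/ttyUSB0", baud: int = 9600):
    braille = to_braille(text)                 # from the translation sketch above
    with serial.Serial(port, baud, timeout=1) as dev:
        dev.write(braille.encode("utf-8"))     # real devices may expect another encoding
        dev.flush()

# send_to_braille_device(payload)  # payload as produced by the reader sketches
```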
The overall process of the method for generating a matrix code 250 may be advantageous for several reasons. For example, the process may allow a television service provider to make accessibility content available to a user to enhance the user's ability to experience the program, particularly when the user has a visual, audio, or other impairment. At the same time, the user may enjoy the programming with other persons who do not have impairments, because the matrix code may be displayed on only a portion of the display device, and the accessibility content may be provided directly to the individual user (as opposed to completely altering the audio or visual presentation from the display device). The user may have the option to turn on or off the matrix code display at any time, and to configure the code size and display positioning to suit the user's needs.
In the present disclosure, the methods disclosed may be implemented as sets of instructions or software readable by a device. Further, it is understood that the specific order or hierarchy of operations in the methods disclosed are examples of sample approaches. In other embodiments, the specific order or hierarchy of operations in the method can be rearranged while remaining within the disclosed subject matter. The accompanying method claims present elements of the various operations in a sample order, and are not necessarily meant to be limited to the specific order or hierarchy presented.
The described disclosure may be provided as a computer program product, or software, that may include a non-transitory machine-readable medium having stored thereon instructions, which may be used to program a computer system (or other electronic devices) to perform a process according to the present disclosure. A non-transitory machine-readable medium includes any mechanism for storing information in a form (e.g., software, processing application) readable by a machine (e.g., a computer). The non-transitory machine-readable medium may take the form of, but is not limited to, a magnetic storage medium (e.g., floppy diskette, video cassette, and so on); optical storage medium (e.g., CD-ROM); magneto-optical storage medium; read only memory (ROM); random access memory (RAM); erasable programmable memory (e.g., EPROM and EEPROM); flash memory; and so on.
It is believed that the present disclosure and many of its attendant advantages will be understood by the foregoing description, and it will be apparent that various changes may be made in the form, construction and arrangement of the components without departing from the disclosed subject matter or without sacrificing all of its material advantages. The form described is merely explanatory, and it is the intention of the following claims to encompass and include such changes.
While the present disclosure has been described with reference to various embodiments, it will be understood that these embodiments are illustrative and that the scope of the disclosure is not limited to them. Many variations, modifications, additions, and improvements are possible. More generally, embodiments in accordance with the present disclosure have been described in the context of particular embodiments. Functionality may be separated or combined in blocks differently in various embodiments of the disclosure or described with different terminology. These and other variations, modifications, additions, and improvements may fall within the scope of the disclosure as defined in the claims that follow.