A method, computer readable storage device, and apparatus for determining a condition under which a sign has been viewed includes receiving a uniform resource locator from a user endpoint device, where the uniform resource locator is embedded in a machine-readable code displayed on the sign, and extracting information from the uniform resource locator, where the information describes the condition under which the sign was viewed by a user of the user endpoint device.
A method, computer readable storage device, and apparatus for transmitting information describing a condition under which a sign is currently viewable includes generating the information describing the condition under which the sign is currently viewable and inserting the information in a uniform resource locator, where the uniform resource locator is embedded in a machine-readable code displayed on the sign.
The teaching of the present disclosure can be readily understood by considering the following detailed description in conjunction with the accompanying drawings, in which:
To facilitate understanding, identical reference numerals have been used, where possible, to designate identical elements that are common to the figures.
The present disclosure relates generally to digital signage and relates more particularly to machine-readable codes for inclusion in digital signage. In one embodiment, the present disclosure describes a method, computer-readable storage device and apparatus for using dynamic machine-readable codes in digital signage. By “dynamic,” it is meant that certain properties of the machine-readable codes change, e.g., with time or location. In one particular embodiment, a uniform resource locator (URL) embedded in a machine-readable code allows one to determine a condition (e.g., the time and/or place) under which the code was scanned by a user endpoint device, without having to extract data from the user endpoint device. From this, a condition under which the sign was viewed can be inferred. In further embodiments, additional contextually-relevant information may be embedded in the code.
In one embodiment, the server 108 and the database 110 are deployed in a communications network 101. For example, the communications network 101 may be any type of communications network, such as, for example, a traditional circuit switched network (e.g., a public switched telephone network (PSTN)) or a packet network such as an Internet Protocol (IP) network (e.g., an IP Multimedia Subsystem (IMS) network), an asynchronous transfer mode (ATM) network, a wireless network, a cellular network (e.g., 2G, 3G, 4G/LTE and the like), and the like. It should be noted that an IP network is broadly defined as a network that uses Internet Protocol to exchange data packets.
In one embodiment, the communications network 101 may include a core network. The core network may include the application server (AS) 108 and the database (DB) 110. The AS 108 may be deployed as a hardware device embodied as a general purpose computer (e.g., the general purpose computer 400 as illustrated in FIG. 4).
In one embodiment, the communications network 101 may include one or more access networks (e.g., a cellular network, a wireless network, a wireless-fidelity (Wi-Fi) network, a PSTN network, an IP network, and the like) that are not shown to simplify FIG. 1.
The sign 102 comprises any visible medium on which information may be displayed, such as a static billboard, a mobile billboard, a changeable copy board, an electronic message center, a poster, or the like. In one particular embodiment, the sign 102 is a digital sign, which allows the content displayed on the sign to be easily changed. In a still further embodiment, the digital sign 102 includes a microprocessor that is capable of inserting data into a URL embedded in a machine-readable code displayed on the sign. Thus, the digital sign 102 may be implemented as a general purpose computer as illustrated in FIG. 4.
The sign 102 includes a machine-readable code 104. In one embodiment, the machine-readable code 104 is any type of optical machine-readable representation of data, such as a one-dimensional (i.e., linear) barcode, a two-dimensional (i.e., matrix) barcode, a quick response (QR) code, or the like. As described in greater detail below, the machine-readable code 104 embeds a URL (or other character string that constitutes a reference to a resource) that allows one to determine a condition (e.g., the time and/or place) under which the machine-readable code was scanned by the user endpoint device 106. The microprocessor in the sign 102 may insert various identifying data, such as a location of the sign 102 and/or a given time of day, into the URL. The machine-readable code 104 may additionally embed other contextually relevant data.
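By way of illustration, the following minimal sketch (in Python) shows one way a sign's microprocessor might construct such a URL; the landing-page host and the query-parameter names are assumptions made for the example, not details required by the present disclosure.

```python
# Minimal sketch of sign-side URL construction. The host name and the
# "sign"/"t" query-parameter names are illustrative assumptions.
import time
from urllib.parse import urlencode

def build_sign_url(sign_id: str, interval_seconds: int = 10) -> str:
    # Quantize the current time so the embedded value changes only
    # every few seconds, giving an approximate viewing time.
    coarse_time = int(time.time()) // interval_seconds * interval_seconds
    query = urlencode({"sign": sign_id, "t": coarse_time})
    return f"https://example.com/scan?{query}"

# The returned string would then be rendered as a QR code or other
# machine-readable code 104 on the sign.
print(build_sign_url("SIGN-0042"))
```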
The user endpoint device 106 may be any type of endpoint device such as a desktop computer or a mobile endpoint device such as a cellular telephone, a smart phone, a tablet computer, a laptop computer, a netbook, an ultrabook, a portable media device (e.g., an MP3 player), a gaming console, a portable gaming device, and the like. In one embodiment, the user endpoint device 106 has at least one sensor integrated therein, such as an imaging sensor (e.g., a still or video camera) or a scanner for reading machine-readable codes (e.g., a barcode scanner).
In one embodiment, the server 108 is an application server and may comprise a general purpose computer as illustrated in FIG. 4.
The database 110 may store information that allows the server 108 to decode data in the URL embedded in the machine-readable code 104. For example, the database 110 may store cryptographic keys that the server 108 shares with various signs, as discussed in greater detail below. Alternatively, the database 110 may track and store nonce strings that the server 108 distributes to different signs.
It should be noted that the system 100 has been simplified. For example, the system 100 may include various network elements (not shown) such as border elements, routers, switches, policy servers, security devices, a content distribution network (CDN) and the like that enable communication among the various pieces of the system 100.
The method 200 begins in step 202. In step 204, the server 108 receives a URL from the user endpoint device 106. The URL is embedded in a machine-readable code 104 that was not necessarily captured by the user endpoint device 106 at the same time that the URL is received by the server 108. That is, the user of the user endpoint device 106 may have captured an image of the machine-readable code 104, but waited to launch the embedded URL until sometime after the image was captured. For instance, the user may have captured an image of a machine-readable code 104 displayed on a roadside billboard as the user was traveling and unable to review more detailed information.
In step 206, the server 108 extracts from the URL information that describes a condition under which the sign was viewed by the user. In one embodiment, the information includes a location of the sign and/or a time at which the sign was viewed by the user of the user endpoint device 106. The location may comprise simply a unique identifier associated with the sign (e.g., a serial number), or it may comprise the actual physical location of the sign (e.g., global positioning system coordinates). The time may comprise the exact or approximate time at which the user of the user endpoint device 106 captured the image of the machine-readable code. In one embodiment, the time is only approximate because the time identifier embedded in the URL is updated at periodic intervals (e.g., every few seconds). Thus, the time may comprise a range of times between which the user of the user endpoint device 106 captured the image of the machine-readable code.
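A minimal server-side sketch of this extraction, continuing the assumed "sign" and "t" parameters from the earlier example, might resolve the received URL into a sign identifier and a range of possible capture times.

```python
# Sketch of step 206: recover the sign identifier and the approximate
# viewing time from the received URL. Parameter names are assumptions.
from urllib.parse import parse_qs, urlparse

def extract_view_condition(url: str, interval_seconds: int = 10):
    params = parse_qs(urlparse(url).query)
    sign_id = params["sign"][0]
    coarse_time = int(params["t"][0])
    # The image was captured somewhere within one update interval.
    return sign_id, (coarse_time, coarse_time + interval_seconds)

sign_id, time_range = extract_view_condition(
    "https://example.com/scan?sign=SIGN-0042&t=1700000000")
```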
The server 108 and the microprocessor of the digital sign 102 agree on specifically how the information is represented in the URL. In one embodiment, the representation is completely transparent (e.g., the time and/or location are depicted “in the clear” in the URL). However, this approach may leave the system 100 vulnerable to fraud. Moreover, a low-resolution machine-readable code, legible from a distance, might still be able to store only a limited number of characters (e.g., fifty). Thus, in one embodiment, the information is sent “in the clear,” but protected by a message authentication code (MAC). However, in an alternative embodiment, an opaque (e.g., encoded or otherwise masked) representation of the information is employed.
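One way to realize the MAC-protected variant, sketched below with Python's standard hmac module and an assumed per-sign shared secret, is to append a truncated HMAC tag over the clear parameters; the server recomputes the tag and discards URLs whose tags do not verify. The truncation length is a trade-off between URL size and forgery resistance.

```python
# Sketch of "in the clear" parameters protected by a MAC.
# The shared secret and the "mac" parameter name are assumptions.
import hashlib
import hmac

SHARED_SECRET = b"per-sign secret provisioned out of band"

def mac_tag(params: str, length: int = 8) -> str:
    # A truncated HMAC keeps the URL short enough for a low-resolution code.
    digest = hmac.new(SHARED_SECRET, params.encode(), hashlib.sha256)
    return digest.hexdigest()[:length]

def protect(params: str) -> str:
    return f"{params}&mac={mac_tag(params)}"           # sign side

def verify(params: str, tag: str) -> bool:
    return hmac.compare_digest(mac_tag(params), tag)   # server side

protected = protect("sign=SIGN-0042&t=1700000000")
```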
In an alternative embodiment, the information extracted in step 206 is a sequence of pseudo-random nonce strings. The strings are sent by the server 108 to a plurality of signs (microprocessors), each sign receiving a different string or set of strings. In this case, the server 108 consults the database 110 in optional step 208 (illustrated in phantom). The database 110 maintains a table of which strings were sent when and to which signs (microprocessors). Thus, this table may be referenced by the server 108 in step 208 for time and/or location (or other) data associated with the received URL.
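This nonce-based variant can be sketched as a simple table lookup, with an in-memory dictionary standing in for the database 110; the structure of the table is an assumption for illustration.

```python
# Sketch of optional step 208: resolve a pseudo-random nonce string back
# to the sign and time at which it was distributed. The dictionary here
# stands in for database 110.
import secrets
import time

nonce_table = {}  # nonce string -> (sign identifier, time of issue)

def issue_nonce(sign_id: str) -> str:
    nonce = secrets.token_urlsafe(8)
    nonce_table[nonce] = (sign_id, int(time.time()))
    return nonce  # sent by server 108 to the sign's microprocessor

def resolve_nonce(nonce: str):
    # Returns None for unknown strings, which may indicate fraud.
    return nonce_table.get(nonce)

issued = issue_nonce("SIGN-0042")
print(resolve_nonce(issued))
```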
In another alternative embodiment, the information extracted in step 206 is an identifier (e.g., a serial number) and a string of encrypted data. In this case, the server 108 extracts the identifier and consults the database 110 in optional step 210 (illustrated in phantom) for a shared cryptographic key (e.g., an advanced encryption standard key) corresponding to the identifier. In optional step 212 (illustrated in phantom), the server 108 uses the shared cryptographic key to decrypt the encrypted data, which includes time and/or location (or other) data associated with the received URL. Although not illustrated, this embodiment may additionally employ the use of a separately keyed MAC.
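A sketch of this encrypted variant appears below. It uses the third-party cryptography package's AES-GCM primitive as one possible realization of a shared advanced encryption standard key; the key table, payload layout, and nonce handling are assumptions made for illustration.

```python
# Sketch of optional steps 210-212: look up the shared AES key by the
# sign's identifier, then decrypt the time/location payload.
# Requires the third-party "cryptography" package.
import base64
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key_table = {"SIGN-0042": AESGCM.generate_key(bit_length=128)}  # stands in for DB 110

def decrypt_payload(sign_id: str, token: str) -> bytes:
    key = key_table[sign_id]                    # step 210: shared-key lookup
    raw = base64.urlsafe_b64decode(token)
    nonce, ciphertext = raw[:12], raw[12:]      # 96-bit nonce prepended by the sign
    return AESGCM(key).decrypt(nonce, ciphertext, None)  # step 212

# Round trip for illustration only; in practice the sign produces the token.
nonce = os.urandom(12)
ciphertext = AESGCM(key_table["SIGN-0042"]).encrypt(
    nonce, b"t=1700000000;loc=40.7128,-74.0060", None)
token = base64.urlsafe_b64encode(nonce + ciphertext).decode()
print(decrypt_payload("SIGN-0042", token))
```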
The method 200 ends in step 214.
Although the method 200 describes operations in which descriptive information is sent either "in the clear" or in an encoded form, it will be appreciated that a hybrid of these approaches could also be employed. For instance, the server 108 may periodically send the microprocessor fresh nonces and/or fresh key material, but the microprocessor may use its local clock to time-stamp the data that is inserted into the URL.
Moreover, although the method 200 references a server 108 that both functions as a web server (e.g., receives the URL) and manages the signs (e.g., distributes data for use in constructing the URL), this does not necessarily have to be the case. For instance, a separate web server that makes no use of the descriptive information may be employed; in that case, the information may be extracted from the web server's logs for offline analysis.
The method 300 begins in step 302. In step 304, the hardware processor generates descriptive information relating to the sign 102 (e.g., information describing a condition under which the sign is currently viewable). In one embodiment, the descriptive information includes a location of the sign and/or a current time. The location may comprise simply a unique identifier associated with the sign (e.g., a serial number), or it may comprise the actual physical location of the sign (e.g., global positioning system coordinates). The time may comprise the exact or approximate time. In one embodiment, the time is only approximate because the time identifier is updated at periodic intervals (e.g., every few seconds). Thus, information generated in step 304 may comprise an update to identifying information that was previously generated.
In one embodiment, the descriptive information comprises a unique sequence of pseudo-random nonce strings sent by the server 108 to the hardware processor. As discussed above, the server 108 is able to derive from these strings information such as a time and/or location associated with the sign 102.
In optional step 306 (illustrated in phantom), the hardware processor encrypts at least some of the descriptive information using a cryptographic key (e.g., an advanced encryption standard key) that is shared with the server 108. The encrypted data is then appended to an identifier that corresponds to the cryptographic key.
In step 308, the hardware processor inserts descriptive information (which may or may not be encoded in some fashion as described above) into a URL associated with the machine-readable code 104 that is displayed on the sign 102.
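Putting steps 304 through 308 together, the sign-side counterpart might look like the following sketch, again with an assumed host, parameter names, and payload layout, and again using the cryptography package's AES-GCM primitive for the optional encryption step.

```python
# Sketch of method 300 on the sign's microprocessor: generate the
# descriptive information (step 304), optionally encrypt it (step 306),
# and insert it into the URL behind machine-readable code 104 (step 308).
import base64
import os
import time
from urllib.parse import urlencode
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

SIGN_ID = "SIGN-0042"
SHARED_KEY = AESGCM.generate_key(bit_length=128)  # in practice provisioned by server 108

def generate_descriptive_info(interval_seconds: int = 10) -> bytes:
    coarse_time = int(time.time()) // interval_seconds * interval_seconds
    return f"t={coarse_time};loc=40.7128,-74.0060".encode()          # step 304

def encrypt_info(info: bytes) -> str:
    nonce = os.urandom(12)                                            # fresh nonce per update
    ciphertext = AESGCM(SHARED_KEY).encrypt(nonce, info, None)
    return base64.urlsafe_b64encode(nonce + ciphertext).decode()      # step 306

def build_url() -> str:
    token = encrypt_info(generate_descriptive_info())
    return "https://example.com/scan?" + urlencode({"id": SIGN_ID, "d": token})  # step 308

print(build_url())  # this string is re-encoded into the displayed code 104
```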
The method 300 then ends in step 310.
It should be noted that the present disclosure can be implemented in software and/or in a combination of software and hardware, e.g., using application specific integrated circuits (ASIC), a programmable logic array (PLA), including a field-programmable gate array (FPGA), or a state machine deployed on a hardware device, a general purpose computer or any other hardware equivalents, e.g., computer readable instructions pertaining to the method(s) discussed above can be used to configure a hardware processor to perform the steps, functions and/or operations of the above disclosed methods. In one embodiment, instructions and data for the present module or process 405 for determining a condition under which a sign has been viewed (e.g., a software program comprising computer-executable instructions) can be loaded into memory 404 and executed by hardware processor element 402 to implement the steps, functions or operations as discussed above in connection with the exemplary methods 200 and 300. Furthermore, when a hardware processor executes instructions to perform “operations”, this could include the hardware processor performing the operations directly and/or facilitating, directing, or cooperating with another hardware device or component (e.g., a co-processor and the like) to perform the operations.
The processor executing the computer readable or software instructions relating to the above described method(s) can be perceived as a programmed processor or a specialized processor. As such, the present module 405 for determining a condition under which a sign has been viewed (including associated data structures) of the present disclosure can be stored on a tangible or physical (broadly non-transitory) computer-readable storage device or medium, e.g., volatile memory, non-volatile memory, ROM memory, RAM memory, magnetic or optical drive, device or diskette and the like. More specifically, the computer-readable storage device may comprise any physical devices that provide the ability to store information such as data and/or instructions to be accessed by a processor or a computing device such as a computer or an application server.
While various embodiments have been described above, it should be understood that they have been presented by way of example only, and not limitation. Thus, the breadth and scope of a preferred embodiment should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents.
This application is a continuation of U.S. patent application Ser. No. 14/092,839, filed Nov. 27, 2013, now U.S. Pat. No. 10,325,282, which is herein incorporated by reference in its entirety.

Hundreds of billions of dollars are spent on advertising each year in the United States alone; as such, the ability to gauge the effectiveness of advertising is critical. However, signage, which is a particularly popular form of advertising, is problematic in this regard. Although signage clearly is visible to many people, it is difficult to determine exactly how many people have viewed it, or how influential the signage is on the behavior of the viewers.