Computers and computing devices are finding their way into more and more aspects of daily life. For example, computing devices are found both inside the home (e.g., personal computers, media devices, communication devices, etc.) and outside the home (e.g., bank computers, supermarket checkout computers, computers in retail stores, computer billboards, computing devices relating to providing commercial services, computing devices in cars, etc.). Most of these computing devices have mechanisms that allow them to interact with humans and/or the environment at some level. Aspects of the way that computing devices interact with humans are sometimes referred to as a “user experience.” For example, a human's satisfaction with a computing device interaction (or sequence of computing device interactions) may be based, at least in part, on the richness and/or productivity of the user experience. In addition, various aspects of the environment (including the physical environment) in which the computing device operates to interact with humans may play a role in shaping the user experience.
The technology described herein facilitates the electronic presentation of information (e.g., information that is more traditionally associated with posters, brochures, and product signage) to one or more users within an environment. Electronic presentation makes it possible for the information to be presented interactively. The technology includes a display component (e.g., public display screen) that displays or otherwise presents content to users within its vicinity. In addition, aspects of the presented content or additional information related to the presented content can be streamed to a user's personal device (e.g., PDA or smart cell phone). Aspects of the technology may include a user detection component that can be used to detect the presence of a user in a specified vicinity of the display and, optionally, a content selection component that can be used to identify targeted/customized content to present to users.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
Providing a comfortable and aesthetically pleasing environment is important in many contexts, including commercial contexts, civic contexts, educational contexts, etc. For example, in commercial and/or corporate contexts, enhancements in wireless networks and employee mobility may allow customers, clients, and employees to interact in more comfortable lounge-like settings without the need to be tethered to desks or cubicles, while still maintaining communication abilities.
One way to facilitate such an environment is through the use of display technologies, such as streaming interactive media that provides information more traditionally associated with posters, brochures, and product signage. For example, such display technologies can be used to replace posters and large-scale printed graphics in a variety of environments. The display technologies may have interactive aspects. For example, the display technologies can react to changes in the surrounding environment (e.g., the approach of a user) and/or stream content to a user's personal device, where the user can interact with aspects of that content.
The following description provides specific examples of techniques that can be used in association with one or more computing devices to increase the richness and productivity of user experiences. While the description provides some examples in the context of a bank branch, the techniques described herein are not limited to banking contexts and, rather, can be applied in any type of environment associated with computing devices, including environments associated with other commercial activities besides banking, home environments, environments at sporting events, retail environments, manufacturing environments, workplace environments, customer service environments, entertainment environments, science or research environments, educational environments, transportation environments, etc. Depending on the environment, increasing the richness and productivity of user experiences in accordance with some embodiments may improve customer retention, increase the value of individual customer relationships, reduce costs, result in higher sales, drive sales to new customers, and provide many other personal and/or commercial benefits.
I. Sample Environment
In general, any of the computing devices described herein may include a central processing unit, memory, input devices (e.g., keyboard and pointing devices), output devices (e.g., display devices), and storage devices (e.g., disk drives). The memory and storage devices are computer-readable media that may contain instructions that implement the system. In addition, the data structures and message structures may be stored or transmitted via a data transmission medium, such as a signal on a communication link. Various communication links may be used, such as the Internet, a local area network, a wide area network, a point-to-point dial-up connection, a cell phone network, and so on.
Embodiments may be implemented in various operating environments that include personal computers, server computers, hand-held or laptop devices, multiprocessor systems, microprocessor-based systems, programmable consumer electronics, digital cameras, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and so on. The computer systems may be cell phones, personal digital assistants, smart phones, personal computers, programmable consumer electronics, digital cameras, and so on.
Embodiments may be described in the general context of computer-executable instructions, such as program modules, executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, and so on that perform particular tasks or implement particular abstract data types. Typically, the functionality of the program modules may be combined or distributed as desired in various embodiments.
The display 102 may include a CPU 108 to perform processing, a memory 110, a content storage component 112, a content selection component 114, a streaming module 116, an audio/video component 118, a network module 120, a connectivity port 122, a display screen 124 (e.g., LCD, plasma, projector screen, etc.), and audio features 126. For example, the user 106 may consume presented video content via the display screen and/or audio features and then receive a stream of select content at his or her personal device 104. Accordingly, like the display 102, the personal device 104 may include a connectivity port 130 and a streaming module 134, as well as a user interface 132, a CPU 136, I/O features 138, memory 140, etc.
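By way of a non-limiting illustration only, the component arrangement described above might be modeled in software roughly as follows; the class names, fields, and methods in this sketch are assumptions introduced for clarity and do not correspond to any particular implementation of the display 102 or the personal device 104.

```python
from dataclasses import dataclass, field

# Illustrative models of the display 102 and personal device 104 described above.
# Field names loosely mirror the components listed in the text and are assumptions.

@dataclass
class Display:
    content_storage: dict = field(default_factory=dict)    # content storage component 112
    screen_type: str = "LCD"                                # display screen 124 (LCD, plasma, etc.)
    audio_enabled: bool = True                              # audio features 126
    paired_devices: list = field(default_factory=list)      # devices connected via connectivity port 122

    def present(self, content_id: str) -> None:
        """Present stored content via the display screen and audio features."""
        print(f"Presenting {content_id} on a {self.screen_type} screen")

@dataclass
class PersonalDevice:
    device_id: str
    received_streams: list = field(default_factory=list)    # content received via streaming module 134

    def receive_stream(self, content: dict) -> None:
        """Accept a stream of select content sent by the display."""
        self.received_streams.append(content)

# Example flow: the user consumes content at the display and receives a related stream.
display = Display()
device = PersonalDevice(device_id="user-phone-1")
display.present("vacation_advertisement")
device.receive_stream({"title": "vacation_advertisement", "detail_level": "expanded"})
```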
The display 102 may include and/or communicate with a user presence detection/recognition node 128, which identifies users (known or unknown) and provides information allowing the display 102 to behave in response to the presence of users within its environment. For example, based on information provided by the user presence detection/recognition node 128, the display 102 may wake up from a sleep mode when a user enters into the vicinity of the display 102. Similarly, the display 102 may present information that is specific to a user, based on the identity and/or preferences of the user being known. Various technologies may be used to implement aspects of the user presence detection/recognition node 128.
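As one non-limiting sketch of this behavior, a display might react to presence events roughly as follows; the event fields and method names are assumptions made for illustration.

```python
# Illustrative handler for presence events reported by the user presence
# detection/recognition node 128. The event dictionary fields are assumptions.

class PresenceAwareDisplay:
    def __init__(self) -> None:
        self.asleep = True

    def on_presence_event(self, event: dict) -> None:
        # Wake from sleep mode when a user enters the vicinity of the display.
        if event.get("user_in_vicinity") and self.asleep:
            self.asleep = False
            print("Display waking from sleep mode")
        # If the node identified a known user, present user-specific content.
        user_id = event.get("user_id")
        if user_id is not None:
            print(f"Presenting content tailored to user {user_id}")

display = PresenceAwareDisplay()
display.on_presence_event({"user_in_vicinity": True})                     # wakes the display
display.on_presence_event({"user_in_vicinity": True, "user_id": "u123"})  # tailored content
```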
In some embodiments, the user presence detection/recognition node 128 communicates, e.g., via a network 142, with a remote content server 144 that has access to both a user profile database 146 (which stores user profile information for known users) and a content database 148. Accordingly, based on identifying a known user (e.g., a user having profile information stored in the user profile database 146), the remote content server 144 may serve user-specific content for presentation at the display 102. The user profile database 146 may also store information about a user's response (e.g., favorable, unfavorable, ignored, etc.) to information presented at the display 102. Even if the exact identity of the user is not known, the remote content server 144 may be configured to use information about unknown users to serve specific content. This information may include information about the number of users approaching the display (e.g., whether it is a single user or a group of users, a couple, a family, an adult and a child, etc.), information about the recent past locations of the user or users, etc. For example, if the user presence detection/recognition node 128 detects that a couple is approaching the display 102, the remote content server 144 may use this information to serve display content that is intended for display to a couple (e.g., an advertisement about a vacation to a romantic getaway). Alternatively, if it is likely that a family is approaching, the remote content server 144 may use this information to serve content that is intended for display to a family (e.g., an advertisement about a vacation to Disneyland). In another example, if the user presence detection/recognition node 128 is tracking the location of a user within the environment and can ascertain that the user has performed certain activities based on his or her route through the environment, the remote content server 144 may use this information to serve appropriate content (e.g., if the user just came from a cash machine, the user may be interested in viewing advertisements for financial products).
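A rough sketch of how the remote content server 144 might map such cues to served content follows; the rule set, cue names, and content identifiers are illustrative assumptions based only on the examples given above.

```python
from typing import List, Optional

# Illustrative content-selection rules for the remote content server 144,
# based on the examples above. Cue names and content identifiers are assumptions.

def select_content(group_type: Optional[str], recent_locations: List[str]) -> str:
    if group_type == "couple":
        return "ad_romantic_getaway_vacation"
    if group_type == "family":
        return "ad_disneyland_vacation"
    if "cash_machine" in recent_locations:
        return "ad_financial_products"
    return "ad_general_content"

print(select_content("couple", []))            # -> ad_romantic_getaway_vacation
print(select_content(None, ["cash_machine"]))  # -> ad_financial_products
```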
Sample details of the user presence detection/recognition node 128 are described below.
In some embodiments, the customer identification component may interface with one or more devices or technologies to allow the interactive display technologies to determine the identity of users (e.g., customers in a retail setting). Examples of such devices/technologies include RFID 202; personal device identification technologies 204 (e.g., based on a unique signal transmitted by the personal device); card readers 206 (e.g., configured to read magnetic strips on personal identification cards); bar code scanners 208 (e.g., configured to read bar codes on a card or other item); DNA analysis technologies 210 (e.g., configured to determine identity based on available DNA samples from skin, hair, etc.); graphonomy technology 212 (e.g., configured to determine identity based on handwriting or signatures); fingerprint/thumbprint analysis technology 214; facial analysis technology 216; hand geometry analysis technology 218; retinal/iris scan analysis technology 220; voice analysis technology 222; etc.
Many of these technologies/devices function based on having a user register and/or voluntarily provide initial information (e.g., name, biometric information, affiliations, etc.) so that a user profile can be generated. In this way, the user can be identified as soon as the user's presence is subsequently detected within the environment (e.g., by collecting information for each user who enters the environment and then matching this information to find specific user profiles). However, such an initial registration process may not be needed in all cases to generate a user profile. For example, a user profile for an unnamed new user may be initially generated and updated based on collecting available biometric (or other information) for that user, assigning a unique identifier to the user (e.g., an ID number), mapping the unique identifier to the available biometric (or other information), and then subsequently tracking the user's activities within the environment.
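For illustration only, bootstrapping a profile for an unnamed user in the manner described above might look like the following; the identifier scheme and profile fields are assumptions.

```python
import uuid

# Illustrative bootstrapping of a profile for an unnamed user: collect available
# biometric (or other) information, assign a unique identifier, map the identifier
# to that information, and track the user's subsequent activities.

def create_anonymous_profile(collected_info: dict) -> dict:
    return {
        "user_id": uuid.uuid4().hex,       # unique identifier assigned to the user
        "collected_info": collected_info,  # e.g., facial geometry, voice characteristics
        "activity_log": [],                # updated as the user moves through the environment
    }

profile = create_anonymous_profile({"facial_signature": "sample-signature"})
profile["activity_log"].append("entered_lobby")
print(profile["user_id"])
```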
Tracking the user's location and activities within the environment may further inform what type of content is selected for display to that user, as well as provide more basic information about when a particular user is approaching a display. For example, if a bank customer is approaching a display after having recently made a large deposit into her savings account using an ATM, it may make sense to display content associated with an offer for a new investment opportunity that the customer may be interested in, given her recent deposit.
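As a minimal, non-limiting sketch of this kind of activity-driven selection (using the bank example above), the logic might resemble the following; the activity labels and offer identifiers are assumptions.

```python
# Illustrative rule tying a recently tracked activity to selected content,
# per the bank example above. Activity labels and offer names are assumptions.

def content_for_recent_activity(activity_log: list) -> str:
    last = activity_log[-1] if activity_log else None
    if last and last.get("type") == "deposit" and last.get("size") == "large":
        return "offer_new_investment_opportunity"
    return "content_general"

log = [{"type": "deposit", "size": "large", "account": "savings"}]
print(content_for_recent_activity(log))  # -> offer_new_investment_opportunity
```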
II. Sample Display Technologies
In one example, the display 302 presents streaming content (e.g., advertisements or other informational content) to users within its vicinity.
In another example, the display 302 is configured so that displayed content streams can be split into multiple channels, allowing users to view content on their own devices and/or take content away with them, much like a take-home brochure.
In some embodiments, the content streamed to the user device 304 is a subset of the displayed content (e.g., a single user-selected screen). Alternatively, the streamed content is an expanded version of the displayed content, which, for example, allows the user to take home more detailed information than what is initially displayed. For example, a displayed advertisement for a restaurant may, when streamed to the user's device, provide a detailed “menu view.” In another example, the streamed content allows a user to purchase a product or service from his or her personal device and/or learn more details about select products or services. For example, when a user streams information related to the “Ready for that vacation?” advertisement shown on the display 302 to his or her personal device 304, the streamed information may include options to view details about different available vacation packages, select a desired vacation package, and even make reservations using an interface provided in association with the personal device 304.
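The difference between streaming a subset of the displayed content and streaming an expanded "take-home" version might be sketched as follows; the content structure and function names are illustrative assumptions.

```python
# Illustrative split between streaming a subset of the displayed content
# (e.g., a single user-selected screen) and an expanded version
# (e.g., a detailed "menu view"). The content structure is an assumption.

displayed_ad = {
    "headline": "Ready for that vacation?",
    "screens": ["overview", "pricing", "booking"],
    "details": {"packages": ["beach", "ski", "city"], "reservations_enabled": True},
}

def stream_subset(content: dict, selected_screen: str) -> dict:
    """Stream only the single screen the user selected."""
    return {"headline": content["headline"], "screen": selected_screen}

def stream_expanded(content: dict) -> dict:
    """Stream the displayed content plus the more detailed take-home information."""
    return dict(content)

print(stream_subset(displayed_ad, "pricing"))
print(stream_expanded(displayed_ad)["details"]["packages"])  # browse vacation packages on the device
```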
In addition to allowing the user to interact with aspects of the displayed content (e.g., select from multiple options, play a game, provide personal information, request more information, etc.) at his or her own personal device 304, the display technologies may also facilitate allowing the personal device 304 to provide information back to the display 302 after the user interacts with aspects of the content. For example, the display 302 may stream aspects of a game to be played on the personal device 304. When the user has completed a game, information from the completed game may be exported back to the display 302 so that the display 302 can publicly present the user's score (or other information associated with the user interaction).
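One non-limiting way to sketch the round trip in the game example above is shown below; the class and method names are assumptions.

```python
# Illustrative round trip for the game example above: the display streams a game
# to the personal device, and the device exports the completed result so the
# display can publicly present the user's score. All names are assumptions.

class PublicDisplay:
    def show_public_score(self, player: str, score: int) -> None:
        print(f"{player} just scored {score}!")

class StreamedGame:
    def __init__(self, display: PublicDisplay, player: str) -> None:
        self.display = display
        self.player = player
        self.score = 0

    def play(self, points: int) -> None:
        self.score += points

    def finish(self) -> None:
        # Export the result back to the display for public presentation.
        self.display.show_public_score(self.player, self.score)

game = StreamedGame(PublicDisplay(), player="guest")
game.play(150)
game.finish()
```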
As discussed in more detail above, the displays may provide interactivity based at least in part on information from the user presence detection/recognition node 128 (e.g., waking from a sleep mode or presenting user-specific content when a user approaches).
Providing interactivity may also involve allowing users to interact with the displays using their own devices (e.g., to leverage multi-cast support). For example, the display may be configured to interact with an application on a user device so that the application can, at least to some extent, control the behavior of the display. To illustrate, the user may be able to flip through screens on the display by using controls on his or her mobile device, make selections of options presented on the display, etc.
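A simple, illustrative control exchange of this kind might look like the following; the message format and action names are assumptions rather than a defined interface.

```python
# Illustrative control messages by which an application on the user's device
# drives the public display (flipping screens, making selections). The message
# format and action names are assumptions.

class ControllableDisplay:
    def __init__(self, screens: list) -> None:
        self.screens = screens
        self.current = 0

    def handle_message(self, message: dict) -> None:
        if message.get("action") == "next_screen":
            self.current = (self.current + 1) % len(self.screens)
        elif message.get("action") == "select_option":
            print(f"Option selected from the user's device: {message.get('option')}")
        print(f"Now showing: {self.screens[self.current]}")

display = ControllableDisplay(["welcome", "vacation_ad", "current_rates"])
display.handle_message({"action": "next_screen"})                              # flipped from the mobile device
display.handle_message({"action": "select_option", "option": "current_rates"})
```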
III. Representative Flows
At block 403, the routine 400 presents content to the user; the content may include audio content, images, movies, or other visual content, or a combination of content using different media. In some embodiments, visual content presentation abilities may be based on display technologies such as those associated with flat panel displays (e.g., liquid crystal displays (LCDs), plasma display panels (PDPs), organic light emitting diodes (OLEDs), field emission displays (FEDs), etc.), active matrix displays, cathode ray tubes (CRTs), vacuum fluorescent displays (VFDs), 3D displays, electronic paper, microdisplays, projection displays, etc.
At block 404, the routine 400 streams content to a device associated with the user. This content may be interactive content (e.g., content that provides user selectable options) or may be static (e.g., purely informational). With interactive content, the user can interact with the content on his or her device, which in turn may (or may not) affect the content on the display. The routine 400 then ends.
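For illustration only, blocks 403 and 404 of the routine 400 might be sketched as follows; the classes and content fields are assumptions standing in for the display and the user's device.

```python
# Illustrative sketch of routine 400, blocks 403-404: present content at the
# display, then stream related content to the user's device. All names are assumptions.

class Display:
    def present(self, content: str) -> None:
        print(f"Display presenting: {content}")

class UserDevice:
    def receive_stream(self, content: str) -> None:
        print(f"Device received stream: {content}")

def routine_400(display: Display, device: UserDevice, content: dict) -> None:
    display.present(content["display_version"])       # block 403: present content to the user
    device.receive_stream(content["device_version"])  # block 404: stream content to the user's device

routine_400(Display(), UserDevice(),
            {"display_version": "vacation_ad_full", "device_version": "vacation_ad_menu_view"})
```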
At block 502, the received information is presented on the user device. For example, the user device may present a small version of an advertisement that was initially presented on the larger display. At block 503, the routine 500 responds to user interaction with the information presented on the user device. For example, in the case of the advertisement, the user may have the option to view details about aspects of the advertisement using the I/O features of the user device. In another example, the user plays a take-away mini game. At block 504, if appropriate, the routine 500 streams interaction results back to the display. For example, in the case of the mini game, the user's game results may be streamed back to the display so that they can be presented on the display after the user has completed the game. In another example, the playing of the game itself may be presented on the display so that other patrons in the area can view the game play. The routine 500 then ends.
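Similarly, blocks 502 through 504 of the routine 500 might be sketched as follows; the interaction model and method names are assumptions.

```python
# Illustrative sketch of routine 500, blocks 502-504: present the received
# information on the user device, respond to user interaction, and, where
# appropriate, stream interaction results back to the display. Names are assumptions.

class UserDevice:
    def show(self, content: str) -> None:
        print(f"Device showing: {content}")                    # block 502

    def wait_for_interaction(self) -> dict:
        # Stand-in for real I/O, e.g., the user completes a take-away mini game.
        return {"type": "mini_game_result", "score": 4200}     # block 503

class PublicDisplay:
    def present_interaction_result(self, result: dict) -> None:
        print(f"Display posting for other patrons: {result}")  # block 504

def routine_500(device: UserDevice, display: PublicDisplay, received: str) -> None:
    device.show(received)
    result = device.wait_for_interaction()
    if result is not None:
        display.present_interaction_result(result)

routine_500(UserDevice(), PublicDisplay(), "mini_game_stream")
```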
From the foregoing, it will be appreciated that specific embodiments have been described herein for purposes of illustration, but that various modifications may be made without deviating from the scope of the invention. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
This application claims priority to U.S. application No. 60/703,548, filed Jul. 29, 2005, entitled “Device/Human Interactions, such as in the Context-Aware Environments,” which is herein incorporated by reference.