Systems and methods for communication between a reactive video system and a mobile communication device

Information

  • Patent Grant
  • Patent Number
    8,098,277
  • Date Filed
    Monday, December 4, 2006
  • Date Issued
    Tuesday, January 17, 2012
Abstract
Systems and methods are provided for communication between a reactive video engine and a mobile communication device. In a system according to one embodiment, a reactive video engine is coupled to a communication interface. The communication interface is configured to facilitate communication between the reactive video engine and a mobile communication device.
Description
BACKGROUND OF THE INVENTION

1. Field of the Invention


The present invention relates generally to communication with a reactive video system, and more particularly to systems and methods for communication between a reactive video system and a mobile communication device.


2. Description of the Related Art


Conventionally, human interaction with video display systems has required users to employ devices such as hand-held remote controllers, keyboards, mice, and/or joystick controllers to control the video display systems. Recently, a new type of system for human interaction with video displays has been developed, called a reactive video system. A reactive video system allows real-time, interactive, and unencumbered human interaction with the images generated by the system. In such a reactive video system, the locations and motions of humans or other physical objects are captured as data using a computer vision system. The captured data may then be analyzed using software and/or hardware systems to generate images on the reactive video display, which may include a projector, television display screen, liquid crystal display screen, or other video display medium that is part of the reactive video system.


While existing reactive video systems using computer vision methods capture and analyze data representing human and/or object interactions, they do not provide for easy and/or private communications with a user. For example, it is not possible for the user to send a text message, an image or a video to a reactive video system. Furthermore, existing reactive video systems do not identify their users, and they do not individualize (i.e., personalize) the images they generate in response to specific users.


SUMMARY OF THE INVENTION

Systems and methods are provided for communication between a reactive video engine and a mobile communication device. In a system according to one embodiment, a reactive video engine is coupled to a communication interface. The communication interface is configured to facilitate communication between the reactive video engine and a mobile communication device.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 illustrates an exemplary environment for communication between a reactive video engine and a mobile communication device.



FIG. 2 illustrates a block diagram of an exemplary reactive video engine.



FIG. 3 illustrates a flow diagram of an exemplary process for communication between a mobile communication device and a reactive video engine.



FIG. 4 illustrates a flow diagram of an exemplary process for generating an image based on individualized content.



FIG. 5 illustrates a flow diagram of an exemplary process for generating an image based on an individualized communication.



FIG. 6 illustrates a flow diagram of an exemplary process for providing information to a mobile communication device based on body motions using a reactive video engine.





DETAILED DESCRIPTION

Various embodiments include systems and methods for communication between a reactive video engine and a mobile communication device. Applications for reactive video engines are widely varied, and include advertising, video games, information browsing, product customization, branding, in-store marketing, and other commercial and non-commercial uses. Reactive video engines can be deployed in public and/or private spaces, such as malls, transit centers, retail stores, movie theaters, sports arenas, restaurants, clubs, bars, schools, and private homes.



FIG. 1 illustrates an exemplary environment for communication between a reactive video engine 110 and a mobile communication device 130 via network 120 and/or communication interface 140. A user can interact with the reactive video engine 110 using the mobile communication device 130 and/or by using the user's body motions. The reactive video engine 110 can generate images based on the user's interactions, and the mobile communication device 130 can detect the images. In various embodiments, the mobile communication device 130 includes a camera to detect the images. The images can comprise a barcode containing information that can be decoded by the mobile communication device 130 and/or the images can be configured to be recognized using software associated with the mobile communication device 130. The reactive video engine 110 and the mobile communication device 130 can communicate with each other using the communication interface 140 and/or the network 120.


A mobile communication device 130 comprises a device with communication functionality such as a mobile phone, a mobile computer, a personal digital assistant (PDA), an active or passive radio frequency identification (RFID) device, a hand-held video game system, and so forth. The network 120 may be any network including, but not limited to, a telephone network or a computer network (e.g., the Internet, a local area network, a wide area network, and/or a wireless network such as a cellular network, a Wi-Fi network according to Institute of Electrical and Electronics Engineers (IEEE) standards 802.11a/b/g or other such wireless local area network standards, or a WiMAX network according to IEEE standard 802.16 or another such wireless broadband standard).


The reactive video engine 110 can generate an image that is detected by the mobile communication device 130, and/or can generate an image of a text wherein at least part of the text can be entered by a user into the mobile communication device 130.


The communication interface 140 may comprise any device that enables direct communications between the mobile communication device 130 and the reactive video engine 110. Furthermore, the communication interface 140 may be used in conjunction with the network 120 to provide communication between the reactive video engine 110 and the mobile communication device 130. In various embodiments, the communication interface 140 may be used when short latency periods are desired for communications. For example, the communication interface 140 may provide communication using Wi-Fi according to IEEE standards 802.11a/b/g or other such wireless local area network standards, WiMAX according to IEEE standard 802.16 or another such wireless broadband standard, the Bluetooth® wireless technology standard IEEE 802.15.1 for wireless personal area networks (PANs) or another personal area network standard, or a wireless infrared standard such as IrDA® promulgated by the Infrared Data Association. Furthermore, in various embodiments, the communication interface 140 may be configured to receive the unique electromagnetic signals emitted, for example, by mobile phones, and to excite and/or receive signals from RFID devices.



FIG. 2 illustrates a block diagram of an exemplary reactive video engine, such as the reactive video engine 110 discussed with reference to FIG. 1. The reactive video engine 110 comprises a reactive video display module 200, a mobile communication interface module 210, a communication module 220, an analysis module 230, an information filter module 240, and a storage medium 250. Although the reactive video engine 110 is described as comprising various components, such as the reactive video display module 200 and the mobile communication interface module 210, the reactive video engine 110 may comprise fewer or more components and still fall within the scope of various embodiments.


The reactive video display module 200 may comprise a camera, a video projector and/or another display medium, and/or a computer system configured to capture information in an interactive area. According to various embodiments, for example, the reactive video display module 200 may comprise any of the interactive video display systems (reactive video systems) disclosed in U.S. patent application Ser. No. 10/160,217, filed May 28, 2002, entitled “Interactive Video Display System,” now U.S. Pat. No. 7,259,747, U.S. patent application Ser. No. 10/946,263, filed Sep. 20, 2004, entitled “Self-Contained Interactive Video Display System,” U.S. patent application Ser. No. 10/974,044, filed Oct. 25, 2004, entitled “Method and System for Processing Captured Image Information in an Interactive Video Display System,” now U.S. Pat. No. 7,536,032, U.S. patent application Ser. No. 10/946,084, filed Sep. 20, 2004, entitled “Self-Contained Interactive Video Display System,” U.S. patent application Ser. No. 10/946,414, filed Sep. 20, 2004, entitled “Interactive Video Window Display System,” now U.S. Pat. No. 7,710,391 and U.S. patent application Ser. No. 11/083,851, filed Mar. 18, 2005, entitled “Interactive video display system,” now U.S. Pat. No. 7,170,492. The entireties of the above patents and patent applications are incorporated herein by reference.


The mobile communication interface module 210 can facilitate communication between the communication interface 140 described with reference to FIG. 1 and the reactive video engine 110. The communication module 220 may also facilitate communication between the network 120 described with reference to FIG. 1 and the reactive video engine 110. In various embodiments, the communication interface 140, the mobile communication interface module 210 and/or the communication module 220 may detect the presence of the mobile communication device 130. This detection may occur passively, such as when the mobile communication device 130 emits signals that are detected within a detection zone of the reactive video engine 110, or the detection may be based on a user-initiated communication sent from the mobile communication device 130 and received by the reactive video engine 110. Furthermore, the reactive video engine 110 can also send information to the mobile communication device 130 using the mobile communication interface module 210 and/or the communication module 220 via the communication interface 140 and/or the network 120. In exemplary embodiments, the mobile communication interface module 210 can facilitate communication between a first mobile communication device, such as the mobile communication device 130, and one or more other mobile communication devices.


As discussed herein, for example, the reactive video display module 200 can generate an image of a one-dimensional or a two-dimensional barcode, which can be detected and analyzed using a camera and a software application associated with the mobile communication device 130. In various embodiments, the barcode can encode information that enables the mobile communication device 130 and/or the user to contact and communicate with the reactive video engine 110. In another example, the reactive video display module 200 can generate an image representing a unique picture, an icon, a web page Uniform Resource Locator (URL), a text string, an email address, a Common Short Code (CSC) and a keyword for use via Short Message Service (SMS), an instant message (IM) address, etc., which the user can optionally detect using the mobile communication device 130 or enter into the mobile communication device 130 as text. The images can represent specific information that can be used to access other information or a service. The images may be accessed immediately, at a later time, or when the user is away from the reactive video engine 110.
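As an illustrative sketch only (not part of the claimed subject matter), the example below shows one way such a two-dimensional barcode could encode contact details for a reactive video engine so that a phone camera can decode them. It assumes the third-party Python qrcode package, and the payload field names are invented for the example.

```python
# Hypothetical sketch: encode engine contact details into a 2-D barcode image.
# Assumes the third-party "qrcode" package; payload fields are illustrative.
import json
import qrcode

def make_contact_barcode(engine_id: str, sms_short_code: str, keyword: str,
                         out_path: str = "engine_contact.png") -> str:
    payload = json.dumps({
        "engine_id": engine_id,            # which reactive video engine to contact
        "sms_short_code": sms_short_code,  # CSC the user would text
        "keyword": keyword,                # keyword identifying this display/session
    })
    img = qrcode.make(payload)             # build the barcode image
    img.save(out_path)                     # ready to be shown by the display module
    return out_path
```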


In various embodiments, the reactive video engine 110 can receive an individualized communication from the mobile communication device 130. An individualized communication may comprise any communication or information specific to a particular user or a particular mobile communication device, such as the mobile communication device 130.


In one embodiment, the individualized communication may include the user's log-in identification. For example, the reactive video display module 200 may generate an image that depicts a CSC and a keyword. In one instance, the mobile communication device 130 may comprise a mobile telephone, and the user may enter the CSC and keyword into the mobile telephone so that SMS and/or Multimedia Message Service (MMS) messages are received by the reactive video engine 110. The SMS and/or MMS message may provide the individualized communication establishing the user's identity, the reactive video engine 110 at which the user is located, and/or other information.
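One possible shape of such an inbound SMS, sketched below as an assumption rather than the patent's actual protocol: the message body carries the displayed keyword plus an optional session ID, and the sender's phone number serves as the individualized identity.

```python
# Hypothetical sketch: parse an inbound log-in SMS ("<KEYWORD> [SESSION_ID]").
from dataclasses import dataclass
from typing import Optional

@dataclass
class LoginMessage:
    phone_number: str           # identifies the user / mobile communication device
    keyword: str                # keyword displayed by the reactive video engine
    session_id: Optional[str]   # session ID shown on screen, if any

def parse_login_sms(sender: str, body: str) -> Optional[LoginMessage]:
    parts = body.strip().split()
    if not parts:
        return None
    keyword = parts[0].upper()
    session_id = parts[1].upper() if len(parts) > 1 else None
    return LoginMessage(phone_number=sender, keyword=keyword, session_id=session_id)

# parse_login_sms("+15551234567", "play ab12")
#   -> LoginMessage(phone_number='+15551234567', keyword='PLAY', session_id='AB12')
```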


The reactive video engine 110 can also send information to the mobile communication device 130. The reactive video engine 110 can send information in response to a previous communication received from the mobile communication device 130 or, for example, after allowing a user to use the user's body or appendages to enter the user's phone number, an instant message name, an email address, and so forth, directly into the reactive video engine 110 via body motions on a virtual keypad. The reactive video engine 110 can then communicate with the mobile communication device 130 and send the information from the user using the network 120 or the communication interface 140. According to exemplary embodiments, the SMS and/or MMS messages described herein may allow the reactive video engine 110 to facilitate sending an advertising coupon or other information to the mobile communication device 130.
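A minimal sketch of one way such a virtual keypad could work, offered as an assumption rather than the patent's implementation: the tracked hand position is hit-tested against a grid of projected keys, and the matched digits accumulate into the phone number or address.

```python
# Hypothetical sketch: map a tracked hand position to a projected keypad key.
KEY_ROWS = ["123", "456", "789", "*0#"]   # layout of the on-screen keypad

def key_at(x: float, y: float, pad_x: float, pad_y: float,
           key_w: float, key_h: float):
    """x, y: hand position in screen coordinates; pad_x, pad_y: keypad origin."""
    col = int((x - pad_x) // key_w)
    row = int((y - pad_y) // key_h)
    if 0 <= row < len(KEY_ROWS) and 0 <= col < len(KEY_ROWS[row]):
        return KEY_ROWS[row][col]
    return None                            # hand is outside the keypad area
```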


A variety of different types of information may be encoded into the communication by the reactive video engine 110 and/or the mobile communication device 130. For example, the identity of the reactive video engine 110 being accessed by the user, the time and date of the user's access, and/or particular choices or selections made by the user while interacting with the reactive video engine 110 may be communicated. In an exemplary game application, the communication can include the user's performance in the game and whether the user won a particular sweepstakes or contest. In various embodiments, the communication may provide a full record or description of the user's interaction with the reactive video engine 110. In other embodiments, the communication may be a unique code that allows the interaction to be identified and/or retrieved from a storage medium, such as the storage medium 250.


In various embodiments, the reactive video engine 110 can display a unique session identifier (session ID), which can encode the identity of the reactive video engine 110, the time, and other information. A user can use the session ID to log in by using the mobile communication device 130 to communicate with the reactive video engine 110. Multiple users may be identified by providing a different session ID to each user, for example, by having one user stand in one region of the display screen while logging in, and having another user stand in a different region of the display screen while logging in. In other embodiments, the session ID can be automatically sent to the mobile communication devices 130 of all users on or near the reactive video engine 110.
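As a sketch of one possible design (an assumption, not the patent's encoding), a short display-friendly session ID could be derived from the engine identity, the current time, and the screen region the user occupies, and then used as a lookup key into the storage medium rather than being decoded directly:

```python
# Hypothetical sketch: derive a short session ID; the engine would store the
# (engine_id, region, time) tuple under this ID rather than decode it back.
import hashlib
import time

ALPHABET = "ABCDEFGHJKLMNPQRSTUVWXYZ23456789"   # avoids ambiguous characters

def make_session_id(engine_id: str, region: str, length: int = 6) -> str:
    seed = f"{engine_id}|{region}|{int(time.time())}".encode()
    digest = hashlib.sha256(seed).digest()
    return "".join(ALPHABET[b % len(ALPHABET)] for b in digest[:length])

# e.g. make_session_id("lobby-display-3", "left-region") -> something like 'K7QPM2'
```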


The log-in process can take many forms. For example, the session ID may consist of a short text string, which the user can type into the mobile communication device 130 and send as an SMS message to the reactive video engine 110. In another example, a log-in may consist of the user's mobile phone number, which is sent as part of the SMS message, and may be used to identify the user. In another example, the session ID may be encoded as text, image, or a barcode readable by a camera and software on the mobile communication device 130. Thus, the user can point the mobile communication device 130 at the text, image, or barcode, and the application on the mobile communication device 130 can automatically extract the session ID.


In another embodiment, the log-in process may be initiated by the reactive video engine 110. The reactive video engine 110 can send out a signal that can be picked up by the mobile communication device 130 of the user. If the mobile communication device 130 is an RFID device or contains a radio frequency transmitter, the reactive video engine 110 can use a group of receivers or a directional receiver to triangulate or otherwise disambiguate the position of the mobile communication device 130, and thus unambiguously identify each particular user within a group of users. In some embodiments, session IDs may be unnecessary and the reactive video engine 110 can simply send out a generic signal to trigger each mobile communication device 130 to broadcast the user identification (user ID) associated with that mobile communication device 130. In other embodiments, the reactive video engine 110 need not send out a signal, for example, when the mobile communication device 130 continuously sends out a unique signal which can act as a user ID.
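One common way to perform such triangulation, sketched here purely as an assumption, is least-squares trilateration from three or more receivers whose range estimates come from, for example, received signal strength:

```python
# Hypothetical sketch: least-squares trilateration of a transmitter position.
import numpy as np

def locate_transmitter(receivers: np.ndarray, distances: np.ndarray) -> np.ndarray:
    """receivers: (n, 2) receiver x/y positions; distances: (n,) range estimates."""
    x1, y1 = receivers[0]
    d1 = distances[0]
    a = 2.0 * (receivers[1:] - receivers[0])                      # (n-1, 2)
    b = (np.sum(receivers[1:] ** 2, axis=1) - (x1 ** 2 + y1 ** 2)
         + d1 ** 2 - distances[1:] ** 2)                          # (n-1,)
    position, *_ = np.linalg.lstsq(a, b, rcond=None)
    return position   # estimated (x, y) of the mobile communication device

# Receivers at three corners of a 4 m x 3 m interactive area:
# locate_transmitter(np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 3.0]]),
#                    np.array([2.5, 2.7, 2.2]))
```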


The storage medium 250 can include, but is not limited to, random access memory (RAM) and/or read-only memory (ROM). A storage device, such as a hard drive or a flash drive, can comprise the storage medium 250. A database may be contained within the storage medium 250 and can provide individualized content to the reactive video engine 110. The individualized content may comprise any communication or information specific to the particular user or the mobile communication device 130 that is stored in the storage medium 250. In exemplary embodiments, the storage medium 250 stores the communication between the reactive video engine 110 and the mobile communication device 130. In various embodiments, a full or partial record of the communication can be stored, along with the unique code identifying the interaction. According to some embodiments, access to the storage medium 250 allows the reactive video engine 110, the user, or a manager responsible for the operation and maintenance of the reactive video engine 110 to use the unique code to access the record of the communication. The storage medium 250 may be located at the physical location of the reactive video display module 200 or at another location (not shown).
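As an illustrative assumption about what such a record store could look like, the sketch below keeps interaction records in a SQLite table keyed by the unique code; the schema and helper names are invented for the example.

```python
# Hypothetical sketch: an interaction-record store keyed by the unique code.
import sqlite3
import time

def open_store(path: str = "reactive_engine.db") -> sqlite3.Connection:
    conn = sqlite3.connect(path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS interactions (
               code      TEXT PRIMARY KEY,  -- unique code identifying the interaction
               user_id   TEXT,              -- phone number, user ID, RFID tag, ...
               engine_id TEXT,              -- which reactive video engine was used
               recorded  REAL,              -- UNIX timestamp of the interaction
               record    TEXT               -- full or partial record of the session
           )"""
    )
    return conn

def save_interaction(conn: sqlite3.Connection, code: str, user_id: str,
                     engine_id: str, record: str) -> None:
    conn.execute("INSERT OR REPLACE INTO interactions VALUES (?, ?, ?, ?, ?)",
                 (code, user_id, engine_id, time.time(), record))
    conn.commit()

def load_interaction(conn: sqlite3.Connection, code: str):
    return conn.execute("SELECT * FROM interactions WHERE code = ?",
                        (code,)).fetchone()   # None if the code is unknown
```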


Users may create accounts in which the user's name, preferences, and/or statistics are linked to a unique identifier, such as a user ID that is stored in the storage medium 250. The unique identifier can take many forms, including a cell phone number, an identification code stored on the mobile communication device 130, a password that can be typed into the mobile communication device 130, or an RFID device encoded with a specific frequency. Links between unique identifiers and personal information, for example, the user's name, preferences, and/or statistics, can be stored in a database, such as the storage medium 250.


In exemplary embodiments, the analysis module 230 analyzes and/or reviews the communication received from the mobile communication device 130. The communication (images, text, videos, sounds, etc.) submitted by the user can be analyzed before the reactive video engine 110 generates one or more images, videos, and/or sounds based at least in part on the communication. The communication can then be incorporated into the content used to generate the one or more images. For example, an image of the user's face may be submitted and generated as an image on the display screen used by the reactive video engine 110. The reactive video display module 200 can generate arbitrary graphical content, and the image of the face can be used like any other image on the display screen. For example, the image of the user's face may be shown on the head of a user-controlled video game character, or as the head of a mannequin in a clothing/dressing application. In applications where the communication is a user-submitted image of a person, for example, the analysis module 230 can include face-finding and/or facial-feature-finding functionality that can extract and/or otherwise change the image of the user's face.
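As a sketch of the kind of face-finding such an analysis module could perform (an assumption using OpenCV's bundled Haar cascade, not the patent's method), a submitted photo can be reduced to a cropped face region before it is composited onto a character:

```python
# Hypothetical sketch: crop the largest detected face from a submitted photo.
import cv2

def extract_face(image_path: str):
    image = cv2.imread(image_path)
    if image is None:
        return None
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])   # largest detection
    return image[y:y + h, x:x + w]                       # cropped face (BGR array)
```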


The use of individualized content can be persistent. The reactive video engine 110 can implement log-in procedures. For example, when a user uses the mobile communication device 130 to send a log-in communication to a reactive video engine 110, the storage medium 250 can provide access to the information that the user has submitted in the past. The information may be stored locally or in a central database. Thus, for example, if a user logs in to play a game, the user's logo, character, and other previously submitted information can be available to the reactive video engine 110. In various embodiments, user-submitted communications can be stored in a central database and tagged with a user's identification, a timestamp of the submission, and the location of the submission, allowing a network of reactive displays to have persistent access to the communications.


Since the reactive video engine 110 can be in public venues, the information filter module 240 enables the filtering (information screening) of an individualized (personalized) communication before a public display via the reactive video engine 110. The information filter module 240 reviews the individualized communication and provides a reviewed individualized communication to the reactive video engine 110.


In various embodiments, an individualized communication (e.g., images, text, videos, sounds, etc.) is reviewed by the information filter module 240 before being made public by the reactive video display module 200. The reactive video engine 110 can generate an image of a text based at least in part on the individualized communication and/or a non-text image based at least in part on the individualized communication.


The information filter module 240 can filter out inappropriate material, which may include racist, offensive, pornographic, copyrighted, trademarked, or otherwise unacceptable content. The information filter module 240 may use known automated filtering techniques that are in common use for web content, image recognition techniques, and/or other techniques. In addition, the information filter module 240 can use a manual technique, in which a person reviews and approves each user-submitted communication prior to public display. The information filter module 240 can also use a combination of automated and manual filtering techniques.
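A minimal sketch of such a combined filter, assuming an invented blocklist and review queue rather than any particular filtering product: automated checks reject obvious matches, and everything else is held for a human reviewer before public display.

```python
# Hypothetical sketch: automated blocklist check plus a manual-review queue.
from dataclasses import dataclass, field
from typing import List

BLOCKLIST = {"exampleslur", "examplebrand"}   # placeholder terms only

@dataclass
class InformationFilter:
    pending_manual_review: List[str] = field(default_factory=list)

    def review_text(self, text: str) -> bool:
        """Return True only once the text is cleared for public display."""
        words = {w.strip(".,!?").lower() for w in text.split()}
        if words & BLOCKLIST:
            return False                          # automated rejection
        self.pending_manual_review.append(text)   # hold for a human reviewer
        return False

    def approve(self, text: str) -> bool:
        """A human operator approves a queued submission for display."""
        if text in self.pending_manual_review:
            self.pending_manual_review.remove(text)
            return True
        return False
```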


The reactive video engine 110 described with reference to FIGS. 1 and 2 can be used to implement a variety of exemplary applications, some examples of which are described herein.


For example, the reactive video engine 110 can provide a coupon to the user with the mobile communication device 130. The coupon can be either unique or non-unique to a particular user. A unique coupon code can be provided via a CSC and a keyword (or a similar method) and displayed using the reactive video display module 200. The user can enter the coupon code in the mobile communication device 130 and thereby send an SMS message to the reactive video engine 110 or to another system (not shown), which then can send a coupon back via SMS to the user's mobile communication device 130 (e.g., a mobile phone associated with the user). In various embodiments, the coupon may be in the form of an SMS message, image, or other message. The reactive video engine 110 or another system (not shown) can provide an online database, such as the storage medium 250, which can store detailed information about the user's interaction linked to the unique coupon code. Alternatively, the coupon may be encoded uniquely via a text string, image, barcode, or any of the other communication techniques described herein. To redeem the coupon at a store, the user can present the mobile communication device 130, with the stored coupon, to an employee of the store. If necessary, the employee may then verify the validity of the coupon by accessing the database provided by the storage medium 250. The coupon can also be redeemed online. The user may type a coupon code or URL into a website of an online store, and the store can automatically check the validity of the coupon against the database provided, for example, by the storage medium 250.
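Purely as an illustration of the coupon-code round trip (an assumption; the in-memory store stands in for the storage medium 250), a unique code can be issued per interaction and accepted only once at redemption:

```python
# Hypothetical sketch: issue a unique coupon code and validate it on redemption.
import secrets

COUPONS = {}   # in-memory stand-in for a coupon database (storage medium 250)

def issue_coupon(user_id: str, offer: str) -> str:
    code = secrets.token_hex(4).upper()     # e.g. '9F3A21BC', unique per issue
    COUPONS[code] = f"{user_id}:{offer}"    # link the code to the interaction
    return code                             # delivered to the phone, e.g. via SMS

def redeem_coupon(code: str) -> bool:
    return COUPONS.pop(code, None) is not None   # valid only once
```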


According to another embodiment, the user may be randomly selected while the user interacts with the reactive video engine 110, and the user is provided with a coupon code as described herein. For identification purposes, a video camera associated with the reactive video engine 110 can take an image of the user so that the user can be distinguished from other nearby people who saw the coupon code as well.


According to another embodiment, the reactive video engine 110 can allow the user to vote or to take a survey. To prevent users from voting more than once, the user can send the user's votes and/or answers through the mobile communication device 130, allowing the reactive video engine 110 to uniquely identify the user using, for example, a mobile phone number.


According to another embodiment, the reactive video engine 110 can allow the user to view and to vote on a series of images, text strings, videos, or other content. For example, the reactive video engine 110 can provide images of buttons, allowing the user to express opinions by touching parts of the image. Each button pressed, or otherwise selected, is counted as a vote or a rating for the content shown. Data from the voting may be aggregated at the reactive video engine 110 and/or shared over a network with a plurality of the reactive video engines 110. A central database may keep track of vote counts, and the user may submit images, text, videos and/or other content using the mobile communication device 130 to the reactive video engine 110 for incorporation into the list of images, text strings, videos, or other content being voted upon. The submissions may be sent to the reactive video engine 110 or to a network of the reactive video engines 110. A user who submits content that receives a specific number of votes can win a prize.
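Sketched below as an assumption about the bookkeeping (not the patent's design), a tally can record one vote per user ID per content item, which also supports the once-per-user voting described above:

```python
# Hypothetical sketch: per-item vote tally keyed by user ID (one vote per user).
from collections import defaultdict

class VoteTally:
    def __init__(self):
        self.votes = defaultdict(set)        # content_id -> set of user IDs

    def cast(self, content_id: str, user_id: str) -> bool:
        """Record a vote; returns False if this user already voted for the item."""
        if user_id in self.votes[content_id]:
            return False
        self.votes[content_id].add(user_id)
        return True

    def counts(self) -> dict:
        return {cid: len(users) for cid, users in self.votes.items()}
```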


In another embodiment, a player of a game using the reactive video engine 110 can be provided with a unique coupon code using the mobile communication device 130, as described herein, upon completion of the game. The resulting coupon may be based on the player's performance in the game. The player may win a coupon representing a special offer or a free item as a result of winning the game or achieving a particular score, for example.


In another embodiment, the reactive video engine 110 can allow users to make a set of choices. For example, users may use the reactive video engine 110 to customize a particular product. The reactive video engine 110 can provide the user with a code that either indicates the choices the user made, or a unique identifier that can be used to access a database, such as storage medium 250, that stores the user's choices. The user can use the reactive video engine 110 as a creative tool to create an image or other content. At a later time, the user can use the code or unique identifier to access the image or content the user previously created. The code may be received using the mobile communication device 130.


In another embodiment, the user can purchase an item the user views, designs, and/or customizes using the reactive video engine 110. The reactive video engine 110 can provide the user with a code that identifies the item, allowing the user to enter confidential financial information into their mobile communication device 130 to complete the purchase.


In another embodiment, the mobile communication device 130 may be used as a video game controller when the reactive video engine 110 provides a game. For example, a user playing a magic-themed game can use a combination of hand gestures and buttons on the mobile communication device 130 to cast a spell, for example, by using one particular button on the mobile communication device 130 to select a spell and the body motions of the user's arm to aim it. The mobile communication device 130 can also direct the movements of virtual objects or characters generated by the reactive video engine 110.


In another embodiment, games played using the reactive video engine 110 can include games in which each user's performance history is tracked. Using the mobile communication device 130, the user could have a unique virtual character and/or set of abilities associated with the user, and the reactive video engine 110 can individualize (personalize) the appearance of the unique virtual character based on preferences selected by the user. In the case of multiple users, the reactive video engine 110 can individualize (personalize) a section of the display for each user. For example, the image of a distinctive “aura” may be displayed around each user's body. If the user returns to a game previously played, the game can continue where it left off when last played. Using the mobile communication device 130, the user can receive a code representing a promotion or prize for visiting a particular set of the reactive video engines 110. Furthermore, using the mobile communication device 130, a user can submit text, photos, or other information to the reactive video engine 110 such that the text, photos, etc. appear in the game.



FIG. 3 illustrates a flow diagram of an exemplary process for communication between a mobile communication device and a reactive video engine, such as the mobile communication device 130 and the reactive video engine 110. At step 310, a communication is received from the mobile communication device. As discussed herein, receiving the communication may include the use of the communication interface 140, the mobile communication interface module 210, and/or the communication module 220 and detecting the presence of the mobile communication device 130. This detection may occur passively, such as when the mobile communication device 130 emits signals that are detected within a detection zone of the reactive video engine 110, or the detection may be based on a user-initiated communication sent from the mobile communication device 130 and received by the reactive video engine 110. The user-initiated communication may be facilitated by images and/or signals emitted by the reactive video engine 110 or the communication interface 140.


At step 320, the communication is analyzed. As discussed herein, the analysis module 230 analyzes and/or reviews the communication received from the mobile communication device 130. The communication, which may include images, text, videos, sound, etc., is analyzed before the reactive video display module 200 generates one or more images based at least in part on the communication. Operations may be performed to convert the communication from one format into another format needed by the reactive video engine 110. In various embodiments, where the communication is an image of a person, for example, step 320 may include functionality provided by the analysis module 230, such as face-finding and/or facial-feature-finding functionality that can extract and/or otherwise change the image of the user's face.


At step 330, one or more images are generated using the reactive video display module 200 such that the image is based at least in part on the communication received in step 310. For example, an icon or graphic symbol may be utilized as an image on the display screen used by the reactive video engine 110. In various embodiments, the one or more images generated at step 330 may be configured to be detected by the mobile communication device 130. For example, the reactive video display module 200 can generate an image of a one-dimensional or a two-dimensional barcode, which can be detected or captured and analyzed using a camera and a software application associated with the mobile communication device 130. In other embodiments, the one or more images generated at step 330 may include an image of a text wherein at least part of the text can be entered by the user into the mobile communication device 130. For example, the reactive video display module 200 can generate an image representing a web page Uniform Resource Locator (URL), a text string, an email address, an instant message (IM) address, etc., which the user can optionally detect using the mobile communication device 130, or which the user can enter into the mobile communication device 130 as text.



FIG. 4 illustrates a flow diagram of an exemplary process for generating an image based on individualized content. Individualized content may comprise any communication or information specific to a particular user or the mobile communication device 130. The individualized content may be stored, for example, using the storage medium 250. In various embodiments, individualized content may include a session ID, user ID, text, image, video, and/or a sound.


At step 410, a communication is received from a mobile communication device, such as mobile communication device 130. Receiving the communication may include the use of the communication interface 140, the mobile communication interface module 210 and/or the communication module 220 and detecting the presence of the mobile communication device 130. This detection may occur passively, such as when the mobile communication device 130 emits signals that are detected within a detection zone of the reactive video engine 110, or the detection may be based on a user-initiated communication sent from the mobile communication device 130 and received by the reactive video engine 110, as discussed herein. The user-initiated communication may be facilitated by images and/or signals emitted by the reactive video engine 110 and/or the communication interface 140.


At step 420, the communication is analyzed. As discussed herein, the analysis module 230 analyzes and/or reviews the communication received from the mobile communication device 130. The communication, which may include images, text, videos, sound, etc., is analyzed before the reactive video display module 200 generates one or more images based at least in part on the communication. As discussed herein, the communication may be converted from one format into another format that may be needed by the reactive video engine 110. In exemplary embodiments where the communication is an image of a person, for example, step 420 may include functionality provided by the analysis module 230, such as face-finding and/or facial-feature-finding functionality that can extract and/or otherwise change the image of the user's face.


At step 430, a database contained within the storage medium 250 is queried for individualized content. As discussed herein, individualized content may be a session ID, user ID, image, video, and/or a sound that was previously stored using the storage medium 250. For example, an image of a user's face can be previously stored, and the query performed at step 430 can retrieve the stored image. In another example, a database contained within the storage medium 250 can be queried for information that the user submitted in the past, such as a log-in communication discussed herein. Thus, for example, if the user logs in to play a game, the user's logo, character, and/or other information previously submitted can be available to the reactive video engine 110. In various embodiments, the user's communications can be stored in a central database and tagged with the user's identification, a timestamp of the submission, and/or the location of the submission, allowing one or more reactive video engines 110 to have persistent access to the stored communications.


At step 440, an image is generated using the reactive video display module 200 such that the image is based at least in part on the individualized content. For example, the reactive video display module 200 can generate arbitrary graphical content, and a previously stored image of a face can be used like any other image displayed on the display screen of the reactive video engine 110. For example, the image of the user's face may be shown on the head of a user-controlled video game character, or as the head of a mannequin in a clothing/dressing application.



FIG. 5 illustrates a flow diagram of an exemplary process for generating an image based on an individualized communication. At step 510, a communication is received from a mobile communication device, as is described with reference to step 310 in FIG. 3. At step 520, the communication is analyzed, as is described with reference to step 320 in FIG. 3.


At step 530, the reactive video engine 110 determines if the communication received from the mobile communication device 130 includes an individualized communication. An individualized communication may comprise any communication or information specific to a particular user or to the mobile communication device 130. If the communication is not individualized, at step 540 an image is generated using the reactive video display module 200 such that the image is based at least in part on the communication received in step 510. Step 540 may generate an image using the process described with reference to step 330 in FIG. 3.


Step 550 is performed if the reactive video engine 110 determines that the communication received from the mobile communication device is an individualized communication. At step 550, the individualized communication is reviewed using an information filter module, such as the information filter module 240. The individualized communication is reviewed using a filter and/or an information screen, as described herein. Step 550 ensures that a user-submitted communication (e.g., images, text, sounds, or videos) is reviewed by the information filter module 240 before being made public by the reactive video display module 200. The review performed by the information filter module 240 may be configured to filter out inappropriate material such as, for example, racist, offensive, pornographic, copyrighted, trademarked, or otherwise unacceptable content. The review performed by the information filter module 240 may use known automated filtering techniques that are in common use for web content, image recognition techniques, and/or other techniques. In addition, the information filter module 240 can use a manual technique, in which a person reviews and approves each user-submitted communication prior to public display. The information filter module 240 can also use a combination of automated and manual filtering techniques.


At step 560, the reactive video display module 200 generates an image of a text based at least in part on the individualized communication and/or a non-text image based at least in part on the individualized communication. For example, the image may include specific information about the user, such as images, text, sounds, or videos submitted by the user. As discussed herein, the image generated at step 560 may comprise an image of a text wherein at least part of the text can be entered by a user into the mobile communication device 130. For example, the reactive video display module 200 can generate an image representing a web page Uniform Resource Locator (URL), a text string, an email address, an instant message (IM) address, etc., which the user can optionally store using the mobile communication device 130, or which the user can enter into the mobile communication device 130 as text. As discussed herein, a non-text image generated at step 560 may comprise an image of a one-dimensional or a two-dimensional barcode, which can be detected and analyzed using the mobile communication device 130. For example, when the mobile communication device 130 is a mobile phone with a camera, the camera and a software application in the mobile phone can detect the image generated at step 560.



FIG. 6 illustrates a flow diagram of an exemplary process for providing information to a mobile communication device based on body motions using a reactive video engine, such as the reactive video engine 110. At step 610, the user can interact with the reactive video engine 110 by using the body motions of the user. As described herein, the reactive video engine 110 allows human interaction with images generated by the reactive video engine 110. The body motions of the user, including the location and the motions of the user and of other physical objects controlled by the user, are captured as data. In one exemplary embodiment, the user can interact with images of buttons generated by the reactive video engine 110, expressing the user's opinions by touching parts of the image. In another exemplary embodiment, a user playing a game can use the user's hands, other appendages, or gestures that interact with the reactive video engine 110 to direct the playing of the game.


At step 620, the reactive video engine 110 provides information to the mobile communication device based on the user's body motions. For example, based on the body motions of a user, the user may request information, win a game, or otherwise interact with the reactive video engine 110. As a result, the reactive video engine 110 can provide the mobile communication device 130 with the requested information, a coupon code, an SMS or MMS message, or other information.


In exemplary embodiments, the information may comprise an image, an electromagnetic signal, and/or may be individualized to the user. As discussed herein, when the information comprises an image, the image can represent a web page Uniform Resource Locator (URL), a text string, an email address, an instant message (IM) address, etc., which the user can enter into the mobile communication device 130 as text. Furthermore, a non-text image can be an image of a one-dimensional or a two-dimensional barcode, which can be detected (captured) and analyzed using a camera and software application in the mobile communication device 130. When the information comprises an electromagnetic signal, the information may comprise any of the electromagnetic signals described with reference to FIG. 1 (e.g., Wi-Fi, WiMAX, Bluetooth®, IrDA®, an RFID signal). When the information is individualized to the user, the information may comprise individualized information, that is, any communication or information specific to a particular user or to the mobile communication device 130, as described herein with reference to FIGS. 1, 2 and 5.


While various embodiments have been described above, it should be understood that they have been presented by way of example only, and not limitation. For example, any of the elements associated with the reactive video engine 110 may employ any of the desired functionality set forth herein above. Thus, the breadth and scope of any particular embodiment should not be limited by any of the above-described exemplary embodiments.

Claims
  • 1. A system comprising: a camera configured to capture video data of a space occupied by a user having a mobile communication device that is operable by the user;a display configured to present an interactive image to the user; anda processor configured to determine an identity of the user based on information transmitted by the mobile communication device;identify an image of at least a face of the user; andupdate the interactive image on the display to include the face of the user on a user-controlled character.
  • 2. The system of claim 1, wherein the user may complete a purchase of a product, the product represented in the interactive image.
  • 3. The system of claim 1, wherein the interactive image comprises a barcode.
  • 4. The system of claim 1, wherein the interactive image comprises objects recognizable to the mobile communication device.
  • 5. The system of claim 1, wherein at least some of the information transmitted by the mobile communication device is manually entered into the mobile communication device by the user.
  • 6. The system of claim 1, wherein the system is further configured to initiate communication with the mobile communication device.
  • 7. The system of claim 1, wherein the user may choose to accept the communication.
  • 8. A method comprising: capturing video data corresponding to an interaction between a user of a mobile communication device and a virtual object generated by an interactive video computing system;determining an identity of the user based on the video data or information transmitted by the mobile communication device;identifying individualized content associated with the determined identity; andpresenting an interactive image on a display that is viewable by the user in accordance with the individualized content.
  • 9. The method of claim 8, further comprising completing a purchase of a product, the product represented in the interactive image.
  • 10. The method of claim 8, wherein the interactive image comprises a barcode.
  • 11. The method of claim 8, wherein the interactive image comprises objects recognizable to the mobile communication device, the mobile communication device including a camera.
  • 12. The method of claim 8, wherein at least some of the information transmitted by the mobile communication device is manually entered into the mobile communication device by the user.
  • 13. The method of claim 8, further comprising transmitting information to the mobile communication device.
  • 14. A hardware computer readable storage medium having embodied thereon a program, the program being executable by a processor of a computing system in order to cause the computing system to perform operations comprising: capturing video data including a space occupied by a user of a mobile communication device;determining an identity of the user based on the video data or information transmitted by the mobile communication device;identifying individualized content associated with the determined identity; andpresenting an interactive image that is customized at least partly based on the individualized content on a display.
  • 15. The hardware computer readable storage medium of claim 14, wherein the operations further comprise completing a purchase of a product, the product represented in the interactive image.
  • 16. The hardware computer readable storage medium of claim 14, wherein the interactive image comprises a barcode.
  • 17. The hardware computer readable storage medium of claim 14, wherein the interactive image comprises objects recognizable to the mobile communication device.
  • 18. The hardware computer readable storage medium of claim 14, wherein at least some of the information transmitted by the mobile communication device is manually entered into the mobile communication device by the user.
  • 19. The hardware computer readable storage medium of claim 14, wherein the operations further comprise transmitting information to the mobile communication device.
  • 20. The hardware computer readable medium of claim 19, wherein the user may choose to accept the transmitted information.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims the priority benefit of U.S. provisional application Ser. No. 60/741,557 filed on Dec. 2, 2005 and entitled “Deliverance of Personalized Information and Interaction with Mobile Communications Devices on Reactive Displays,” and of U.S. provisional application Ser. No. 60/817,278 filed on Jun. 28, 2006 and entitled “Using MCDs to Upload Information to an Interactive Display System,” which are hereby incorporated by reference.

US Referenced Citations (212)
Number Name Date Kind
2917980 Grube et al. Dec 1959 A
3068754 Benjamin et al. Dec 1962 A
3763468 Ovshinsky et al. Oct 1973 A
4053208 Kato et al. Oct 1977 A
4275395 Dewey et al. Jun 1981 A
4573191 Kidode et al. Feb 1986 A
4725863 Dumbreck et al. Feb 1988 A
4843568 Krueger et al. Jun 1989 A
4887898 Halliburton et al. Dec 1989 A
4948371 Hall Aug 1990 A
5001558 Burley et al. Mar 1991 A
5138304 Bronson Aug 1992 A
5151718 Nelson Sep 1992 A
5239373 Tang et al. Aug 1993 A
5276609 Durlach Jan 1994 A
5319496 Jewell et al. Jun 1994 A
5325472 Horiuchi et al. Jun 1994 A
5325473 Monroe et al. Jun 1994 A
5426474 Rubtsov et al. Jun 1995 A
5436639 Arai et al. Jul 1995 A
5442252 Golz Aug 1995 A
5454043 Freeman Sep 1995 A
5497269 Gal Mar 1996 A
5510828 Lutterbach et al. Apr 1996 A
5526182 Jewell et al. Jun 1996 A
5528263 Platzker et al. Jun 1996 A
5528297 Seegert et al. Jun 1996 A
5534917 MacDougall Jul 1996 A
5548694 Gibson Aug 1996 A
5591972 Noble et al. Jan 1997 A
5594469 Freeman et al. Jan 1997 A
5633691 Vogeley et al. May 1997 A
5703637 Miyazaki et al. Dec 1997 A
5808784 Ando et al. Sep 1998 A
5861881 Freeman et al. Jan 1999 A
5882204 Iannazo et al. Mar 1999 A
5923380 Yang et al. Jul 1999 A
5923475 Kurtz et al. Jul 1999 A
5953152 Hewlett Sep 1999 A
5969754 Zeman Oct 1999 A
5978136 Ogawa et al. Nov 1999 A
5982352 Pryor Nov 1999 A
6008800 Pryor Dec 1999 A
6058397 Barrus et al. May 2000 A
6075895 Qiao et al. Jun 2000 A
6084979 Kanade et al. Jul 2000 A
6088612 Blair Jul 2000 A
6097369 Wambach Aug 2000 A
6106119 Edwards Aug 2000 A
6118888 Chino et al. Sep 2000 A
6125198 Onda Sep 2000 A
6166744 Jaszlics et al. Dec 2000 A
6176782 Lyons et al. Jan 2001 B1
6191773 Maruno et al. Feb 2001 B1
6198487 Fortenbery et al. Mar 2001 B1
6198844 Nomura Mar 2001 B1
6263339 Hirsch Jul 2001 B1
6292171 Fu et al. Sep 2001 B1
6308565 French et al. Oct 2001 B1
6323895 Sata Nov 2001 B1
6333735 Anvekar Dec 2001 B1
6335977 Kage Jan 2002 B1
6339748 Hiramatsu Jan 2002 B1
6349301 Mitchell et al. Feb 2002 B1
6353428 Maggioni et al. Mar 2002 B1
6359612 Peter et al. Mar 2002 B1
6388657 Natoli May 2002 B1
6400374 Lanier Jun 2002 B2
6407870 Hurevich et al. Jun 2002 B1
6414672 Rekimoto et al. Jul 2002 B2
6445815 Sato Sep 2002 B1
6454419 Kitazawa Sep 2002 B2
6480267 Yanagi et al. Nov 2002 B2
6491396 Karasawa et al. Dec 2002 B2
6501515 Iwamura Dec 2002 B1
6522312 Ohshima et al. Feb 2003 B2
6545706 Edwards et al. Apr 2003 B1
6552760 Gotoh et al. Apr 2003 B1
6598978 Hasegawa Jul 2003 B2
6607275 Cimini et al. Aug 2003 B1
6611241 Firester Aug 2003 B1
6654734 Mani et al. Nov 2003 B1
6658150 Tsuji et al. Dec 2003 B2
6661918 Gordon et al. Dec 2003 B1
6677969 Hongo Jan 2004 B1
6707054 Ray Mar 2004 B2
6707444 Hendriks et al. Mar 2004 B1
6712476 Ito et al. Mar 2004 B1
6720949 Pryor et al. Apr 2004 B1
6732929 Good et al. May 2004 B2
6747666 Utterback Jun 2004 B2
6752720 Clapper et al. Jun 2004 B1
6754370 Hall-Holt et al. Jun 2004 B1
6791700 Omura et al. Sep 2004 B2
6826727 Mohr et al. Nov 2004 B1
6831664 Marmaropoulos et al. Dec 2004 B2
6871982 Holman et al. Mar 2005 B2
6877882 Haven et al. Apr 2005 B1
6912313 Li Jun 2005 B2
6965693 Kondo et al. Nov 2005 B1
6975360 Slatter Dec 2005 B2
6999600 Venetianer et al. Feb 2006 B2
7015894 Morohoshi Mar 2006 B2
7042440 Pryor May 2006 B2
7054068 Yoshida et al. May 2006 B2
7058204 Hildreth et al. Jun 2006 B2
7068274 Welch et al. Jun 2006 B2
7069516 Rekimoto Jun 2006 B2
7088508 Ebina et al. Aug 2006 B2
7149262 Nayar et al. Dec 2006 B1
7158676 Rainsford Jan 2007 B1
7170492 Bell Jan 2007 B2
7190832 Frost et al. Mar 2007 B2
7193608 Stuerzlinger Mar 2007 B2
7227526 Hildreth et al. Jun 2007 B2
7259747 Bell Aug 2007 B2
7262874 Suzuki Aug 2007 B2
7289130 Satoh et al. Oct 2007 B1
7330584 Weiguo et al. Feb 2008 B2
7339521 Scheidemann et al. Mar 2008 B2
7348963 Bell Mar 2008 B2
7379563 Shamaie May 2008 B2
7382897 Brown et al. Jun 2008 B2
7394459 Bathiche et al. Jul 2008 B2
7428542 Fink et al. Sep 2008 B1
7432917 Wilson et al. Oct 2008 B2
7536032 Bell May 2009 B2
7559841 Hashimoto Jul 2009 B2
7576727 Bell Aug 2009 B2
7598942 Underkoffler et al. Oct 2009 B2
7619824 Poulsen Nov 2009 B2
7665041 Wilson et al. Feb 2010 B2
7710391 Bell et al. May 2010 B2
7737636 Li et al. Jun 2010 B2
RE41685 Feldman et al. Sep 2010 E
7809167 Bell Oct 2010 B2
7834846 Bell Nov 2010 B1
20010012001 Rekimoto et al. Aug 2001 A1
20010033675 Maurer et al. Oct 2001 A1
20020006583 Michiels et al. Jan 2002 A1
20020032697 French et al. Mar 2002 A1
20020041327 Hildreth et al. Apr 2002 A1
20020064382 Hildreth et al. May 2002 A1
20020081032 Chen et al. Jun 2002 A1
20020103617 Uchiyama et al. Aug 2002 A1
20020105623 Pinhanez Aug 2002 A1
20020130839 Wallace et al. Sep 2002 A1
20020140633 Rafii et al. Oct 2002 A1
20020140682 Brown et al. Oct 2002 A1
20020178440 Agnihotri et al. Nov 2002 A1
20020186221 Bell Dec 2002 A1
20030032484 Ohshima et al. Feb 2003 A1
20030076293 Mattsson Apr 2003 A1
20030091724 Mizoguchi May 2003 A1
20030093784 Dimitrova et al. May 2003 A1
20030098819 Sukthankar et al. May 2003 A1
20030103030 Wu Jun 2003 A1
20030113018 Nefian et al. Jun 2003 A1
20030122839 Matraszek et al. Jul 2003 A1
20030137494 Tulbert Jul 2003 A1
20030161502 Morihara et al. Aug 2003 A1
20030178549 Ray Sep 2003 A1
20040005924 Watabe et al. Jan 2004 A1
20040015783 Lennon et al. Jan 2004 A1
20040046736 Pryor et al. Mar 2004 A1
20040046744 Rafii et al. Mar 2004 A1
20040073541 Lindblad et al. Apr 2004 A1
20040091110 Barkans May 2004 A1
20040095768 Watanabe et al. May 2004 A1
20040183775 Bell Sep 2004 A1
20050088407 Bell et al. Apr 2005 A1
20050089194 Bell Apr 2005 A1
20050104506 Youh et al. May 2005 A1
20050110964 Bell et al. May 2005 A1
20050122308 Bell et al. Jun 2005 A1
20050132266 Ambrosino et al. Jun 2005 A1
20050147282 Fujii Jul 2005 A1
20050162381 Bell et al. Jul 2005 A1
20050185828 Semba et al. Aug 2005 A1
20050195598 Dancs et al. Sep 2005 A1
20050265587 Schneider Dec 2005 A1
20060010400 Dehlin et al. Jan 2006 A1
20060031786 Hillis et al. Feb 2006 A1
20060132432 Bell Jun 2006 A1
20060139314 Bell Jun 2006 A1
20060168515 Dorsett, Jr. et al. Jul 2006 A1
20060184993 Goldthwaite et al. Aug 2006 A1
20060187545 Doi Aug 2006 A1
20060227099 Han et al. Oct 2006 A1
20060242145 Krishnamurthy et al. Oct 2006 A1
20060256382 Matraszek et al. Nov 2006 A1
20060258397 Kaplan et al. Nov 2006 A1
20060294247 Hinckley et al. Dec 2006 A1
20070285419 Givon Dec 2007 A1
20080040692 Sunday et al. Feb 2008 A1
20080062123 Bell Mar 2008 A1
20080090484 Lee et al. Apr 2008 A1
20080150890 Bell et al. Jun 2008 A1
20080150913 Bell et al. Jun 2008 A1
20080170776 Albertson et al. Jul 2008 A1
20080245952 Troxell et al. Oct 2008 A1
20080252596 Bell et al. Oct 2008 A1
20090027337 Hildreth Jan 2009 A1
20090077504 Bell et al. Mar 2009 A1
20090102788 Nishida et al. Apr 2009 A1
20090225196 Bell et al. Sep 2009 A1
20090235295 Bell et al. Sep 2009 A1
20090251685 Bell et al. Oct 2009 A1
20100026624 Bell et al. Feb 2010 A1
20100039500 Bell et al. Feb 2010 A1
20100060722 Bell Mar 2010 A1
20100121866 Bell et al. May 2010 A1
Foreign Referenced Citations (28)
Number Date Country
0055366 Jul 1982 EP
0626636 Nov 1994 EP
0913790 May 1999 EP
1 689 172 Jun 2002 EP
57094672 Jun 1982 JP
2000-105583 Apr 2000 JP
2002-014997 Jan 2002 JP
2002-092023 Mar 2002 JP
2002-171507 Jun 2002 JP
2003-517642 May 2003 JP
2003-271084 Sep 2003 JP
2003-0058894 Jul 2003 KR
WO 9838533 Sep 1998 WO
WO 0016562 Mar 2000 WO
WO 2001063916 Aug 2001 WO
WO 0201537 Jan 2002 WO
WO 2002100094 Dec 2002 WO
WO 2004055776 Jul 2004 WO
WO 2004097741 Nov 2004 WO
WO 2005041578 May 2005 WO
WO 2005041579 May 2005 WO
WO 2005057398 Jun 2005 WO
WO 2005057399 Jun 2005 WO
WO 2005057921 Jun 2005 WO
WO 2005091651 Sep 2005 WO
WO 2007019443 Feb 2007 WO
WO 2008124820 Oct 2008 WO
WO 2009035705 Mar 2009 WO
Provisional Applications (2)
Number Date Country
60817278 Jun 2006 US
60741557 Dec 2005 US