Embodiments pertain to displaying, within an interactive display application, a representation of an interaction between generic interaction devices. Some embodiments relate to interactions between two or more generic interaction devices, and to interpreting interactions of those devices on an interactive display.
Many existing systems incorporate an interactive display to capture human/machine interaction, with such human/machine interaction used to control or drive a displayed or virtual application. Systems range in functionality from simple objects that allow humans to interact with an interactive television/video screen display (e.g., children's interactive products made by toy manufacturers) to complex devices that capture a user's interaction through motion capture or in association with movement of auxiliary devices (e.g., Microsoft Kinect, LeapMotion, Nintendo Wii videogame systems). However, existing systems provide limited mechanisms for real-world object-to-object interaction, and rely on a single detection mechanism, such as a video camera or IR sensors, to perceive activity and movement among humans and real-world objects.
The following description and drawings sufficiently illustrate specific embodiments to enable those skilled in the art to practice them. Other embodiments may incorporate structural, logical, electrical, process, and other changes. Portions and features of some embodiments may be included in, or substituted for, those of other embodiments. Embodiments set forth in the claims encompass all available equivalents of those claims.
Some of the embodiments discussed herein describe an interactive display, an interactive system, and at least two generic interaction devices. The interactive system may process relative location information for the generic interaction devices, and the interactive system may cause the interactive display to depict interactions between generic interaction devices. This may allow for individual interactions between a physical generic interaction device and an interactive display. This may also allow for other interactions, between two or more generic interaction devices, to be interpreted by an interactive display.
This system may be advantageous in applications where one or more users are learning how to manipulate one or more objects. For example, such a system could be used to teach users how to manipulate medical devices, how to play musical instruments, or how to perform a ballroom dance. In some embodiments, the system could also be used to teach young children how to manipulate simple educational blocks to learn the alphabet or math, or to perform basic rearrangement of blocks, rings, or towers. In some embodiments, the system may also be used to teach various physical phenomena, such as the operation of radio waves, magnets, or aerodynamics. For example, movement of generic interaction devices may cause electromagnetic field lines or aerodynamic airflow lines to be displayed. In other embodiments, the relative location of generic interaction devices may be used to measure or configure the location of various physical objects. For example, generic interaction devices may be used to measure the cable length required for various electronic components, to guide the placement and aiming of each speaker in a set of surround sound speakers, or to guide the placement of furniture, artwork, or electronic components in a room.
In some embodiments, a system allows a user to manipulate generic interaction devices in relation to each other to perform actions on an interactive display. The educational examples mentioned above may be used in an interactive environment. For example, a single interactive environment may be used to teach a user how to play an instrument, and then may be used in a score-based video game based on the accuracy of playing the instrument. In other embodiments, the generic interaction devices may be used to control various actions within a virtual environment. For example, the generic interaction devices may be elements of a toy gun that must be assembled before use. In other embodiments, the generic interaction devices may be used to interact with a remote user, such as in an interactive teaching or an interactive healthcare context. For example, generic interaction devices may be various simple medical devices, and a healthcare provider may remotely guide a user through an interactive physical examination.
In some embodiments, the system may supplement existing controller technology. Various existing interactive systems use line-of-sight 2-D positioning, such as the Wii's infrared sensor or the video cameras used by the Microsoft Kinect or PlayStation Eye. Generic interaction devices may offer non-line-of-sight input to augment such line-of-sight systems, thereby allowing a user to manipulate virtual objects without requiring a direct line-of-sight to a controller sensor, or providing continuous movement data during periods where line-of-sight is temporarily unavailable. For example, a dance may require a user to turn his or her back to a line-of-sight controller sensor, a user may manipulate a virtual object behind his or her back, or dance or hand-to-hand combat may require movement or virtual object manipulation while another user is blocking the direct line-of-sight to a controller sensor. Generic interaction devices may also provide a depth input to augment the inherently 2-D input of line-of-sight systems. For example, an exercise or dance move may require information about relative and absolute location and motion inputs in a direction toward or away from an infrared sensor or video camera.
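By way of a non-limiting illustration, the following Python sketch shows one way such sensor fusion might be organized; the function name, tuple layout, and fallback rule are hypothetical and are not part of any described embodiment:

    def fuse_position(camera_xy, inter_device_range, last_fused):
        """Combine a 2-D line-of-sight fix with a non-line-of-sight range.

        camera_xy: (x, y) from the camera/IR sensor, or None when occluded.
        inter_device_range: device-to-device distance from RF ranging.
        last_fused: the previous (x, y, range) estimate.
        """
        if camera_xy is None:
            # Line of sight lost: hold the last 2-D fix, but keep updating
            # the range so movement remains observable during the dropout.
            x, y, _ = last_fused
            return (x, y, inter_device_range)
        x, y = camera_xy
        return (x, y, inter_device_range)

In this sketch, the third coordinate supplies the depth cue that a purely 2-D line-of-sight sensor cannot provide on its own.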
The interaction between the generic interaction devices 102 and 104 may be depicted on the interactive display 108 using corresponding generic interaction device virtual objects or avatars 116 and 118. For example, when the generic interaction devices 102 and 104 have been moved closer together, the generic interaction device virtual objects or avatars 116 and 118 depicted on the interactive display 108 may be moved in a corresponding direction (closer together). In another example, movement of the generic interaction devices 102 and 104 in one direction may cause the virtual objects or avatars 116 and 118 to be moved in the opposite direction. In some embodiments, the interactive console 106 and the interactive display 108 may be separate, such as a computer and computer screen or a video game system and a television. In other embodiments, the interactive console 106 and the interactive display 108 may be housed and operable within a single device, such as a tablet computer, laptop computer, input-connected dongle, smart phone, or smart TV.
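By way of a non-limiting illustration, a mapping from physical device displacements to avatar displacements might look like the following Python sketch; the scale factor and the invert flag are hypothetical stand-ins for application-specific behavior:

    def update_avatars(delta_a, delta_b, invert=False, scale=100.0):
        """Map physical displacements of devices 102 and 104 to on-screen
        displacements of avatars 116 and 118.

        delta_a, delta_b: (dx, dy) physical movement of each device.
        invert: mirror the motion, as in the opposite-direction example.
        scale: hypothetical pixels-per-unit conversion factor.
        """
        sign = -1.0 if invert else 1.0
        move_a = (sign * scale * delta_a[0], sign * scale * delta_a[1])
        move_b = (sign * scale * delta_b[0], sign * scale * delta_b[1])
        return move_a, move_b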
The generic interaction devices 102 and 104 may include relative location detection components for detecting relative location 110 information between the respective devices. For example, the relative location detection components may detect that the generic interaction devices 102 and 104 have been moved closer together, and the relative location 110 information may reflect that increase in proximity. The generic interaction devices 102 and 104 may include passive absolute location detection components to enable the interactive console 106 to detect absolute location information. For example, the passive absolute location detection components may include infrared (IR) lights, markers, and reflectors that may be observed 120 by a camera 122. The camera 122 may be integrated into the interactive display 108 (such as a camera located within a television housing), integrated into the interactive console 106, or attached as a peripheral to the interactive display 108 or the interactive console 106 (such as through a universal serial bus connection, an HDMI connection, a connection with a connected dongle, and the like).
The camera 122 may detect absolute location information by tracking IR light reflections among the generic interaction devices 102 and 104, by tracking the shape or color of the generic interaction devices 102 and 104, by tracking user movements of the generic interaction devices 102 and 104, or by other similar mechanisms. The generic interaction devices 102 and 104 may include absolute location detection components for detecting absolute location information. For example, the absolute location detection components may include an IR camera in one or both of the generic interaction devices 102 and 104, where the IR camera is used to detect one or more external IR reference points.
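By way of a non-limiting illustration, locating an IR marker in a camera frame can be as simple as thresholding and averaging bright pixels, as in the following Python sketch; the frame format and threshold value are assumptions for illustration only:

    def track_ir_marker(frame, threshold=200):
        """Locate a bright IR reflection in one grayscale frame.

        frame: 2-D sequence of pixel intensities (0-255) from camera 122.
        Returns the (column, row) centroid of above-threshold pixels,
        or None when no marker is visible.
        """
        col_sum, row_sum, count = 0, 0, 0
        for row_idx, row in enumerate(frame):
            for col_idx, value in enumerate(row):
                if value >= threshold:
                    col_sum += col_idx
                    row_sum += row_idx
                    count += 1
        if count == 0:
            return None
        return (col_sum / count, row_sum / count)

Tracking this centroid from frame to frame yields the absolute 2-D location of each device within the camera's field of view.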
In one embodiment, the master interaction device 202 includes hardware or software functionality not included in the slave interaction device 204. For example, the master interaction device 202 may include active location detection hardware, and the slave interaction device 204 may include passive location detection hardware. In other embodiments, the master interaction device 202 and a slave interaction device 204 may include identical hardware (e.g., components), but may perform different functions or roles. For example, the master interaction device 202 and the slave interaction device 204 may both include communications hardware, and after one of the interaction devices is designated as the master interaction device 202, that device may perform all communication with an interactive console 206 (e.g., personal computer, video game system) or an interactive display 208.
The generic interaction devices 202 and 204 may wirelessly detect or determine information regarding their relative location 210, and the master interaction device 202 may transmit 212 that relative location information 210 to the interactive console 206. The interactive console 206 may receive and interpret the relative location information 210 in the context of an interactive display application and transmit 214 a visual display of the interpretation of the relative location information 210 to the interactive display 208. The interaction between the generic interaction devices 202 and 204 may be depicted on the interactive display 208 using corresponding generic interaction device virtual objects or avatars 216 and 218. For example, the generic interaction devices 202 and 204 may detect that they have been moved closer together, and the relative location information 210 may reflect that increase in proximity.
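By way of a non-limiting illustration, the master device's relay of relative location information 210 might be packaged as in the following Python sketch; the message fields and the console_send transport callable are hypothetical, with any RF transport (e.g., Bluetooth or Wi-Fi) standing behind it:

    import json

    def master_report(relative_location, console_send):
        """Relay relative location information 210 from the master
        interaction device 202 to the interactive console 206.

        relative_location: dict assumed to hold at least "distance_m".
        console_send: transport callable supplied by the RF stack.
        """
        message = {
            "type": "relative_location",
            "distance_m": relative_location["distance_m"],
            "bearing_deg": relative_location.get("bearing_deg"),
        }
        console_send(json.dumps(message).encode("utf-8"))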
The generic interaction devices 202 and 204 may detect or determine information regarding their relative location 210 using one or a combination of active or passive relative location detection components 222 and 224. The relative location detection components 222 and 224 may actively send and receive information to and from each other to detect relative location information 210, such as by using a received signal strength indicator (RSSI) in Bluetooth or other measurements available in the operation of RF protocols. The first relative location detection component 222 may include a passive device, such as an RFID chip, and the second relative location detection component 224 may actively detect the proximity of the RFID chip. The relative location detection components 222 and 224 may include a combination of active and passive components, and may switch between using active or passive components to conserve power, to increase accuracy, or to improve system performance. The relative location detection components 222 and 224 may use sonic or optical ranging, or may use sonic or optical communication for ranging (e.g., IrDA communication). The relative location detection components 222 and 224 may include inertial sensors (e.g., accelerometers, gyroscopes) to detect acceleration, rotation, or orientation information relative to gravity. Other non-proximity information from these components may be used for feedback, processing, or changes either at the generic interaction devices 202 and 204 or in the interactive display 208. Further, the generic interaction devices 202 and 204 may discern location and orientation information with respect to each other through a localization scheme enabled by user interaction with, or automated processing by, the interactive display 208.
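By way of a non-limiting illustration, an RSSI reading is commonly converted to a distance estimate with the log-distance path-loss model, as in the following Python sketch; the calibration constants shown are typical defaults, not measured values for any particular device:

    def rssi_to_distance(rssi_dbm, rssi_at_1m=-59.0, path_loss_exp=2.0):
        """Estimate the separation between two devices from Bluetooth RSSI.

        rssi_dbm: measured signal strength in dBm.
        rssi_at_1m: calibrated RSSI at one meter (hypothetical default).
        path_loss_exp: ~2.0 in free space, larger in cluttered rooms.
        Returns an estimated distance in meters.
        """
        return 10.0 ** ((rssi_at_1m - rssi_dbm) / (10.0 * path_loss_exp))

For example, a reading of -79 dBm with these defaults yields an estimated separation of about 10 meters.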
In addition to the relative location detection components 222 and 224, the generic interaction devices 202 and 204 may include passive or active absolute location detection components. For example, a camera 230 may observe an IR light on each of the generic interaction devices 202 and 204 and detect 226 the absolute location of the master interaction device 202 and detect 228 the absolute location of the slave interaction device 204.
The generic interaction devices 202 and 204 may include interactive communication components 232 and 234. The interactive communication components 232 and 234 may be RF components (e.g., Bluetooth, ANT, ZigBee, or Wi-Fi). The interactive communication components 232 and 234 may be external to the generic interaction devices 202 and 204, such as is depicted in
In some embodiments, in addition to causing an action on the interactive display 208, the generic interaction devices 202 and 204 may interact with each other. The generic interaction devices 202 and 204 may include sensory feedback components that may indicate when the two generic interaction devices 202 and 204 have been arranged or are being manipulated in a specific manner. The sensory feedback components may include lights 242 and 244, vibration components 246 and 248, speakers 250 and 252, or other electromagnetic or electromechanical components. The sensory feedback components may provide binary feedback, where the light, sound, or vibration is either on or off. For example, a toy gun may include a light or simulated clicking sound to indicate a toy gun ammo clip has been correctly inserted, two cubes may vibrate briefly to indicate they have been placed together in the correct orientation, or user-worn generic interaction devices may vibrate briefly upon performing a dance move correctly. The sensory feedback components may provide varying levels of feedback, where the light, sound, or vibration may be increased or decreased in intensity. For example, the intensity of the light, sound, or vibration may increase as the user moves the generic interaction devices 202 and 204 in a desired direction. The sensory feedback components may also alter the motion of the generic interaction devices 202 and 204. For example, a solenoid may shift the balance of the master interaction device 202 to indicate that the user is manipulating it incorrectly. In another example, based on the orientation or proximity of two cubes, the generic interaction devices 202 and 204 may activate an electromagnetic component to attract one another to indicate that the user is manipulating the generic interaction devices 202 and 204 correctly.
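By way of a non-limiting illustration, varying-intensity feedback might scale with the user's progress toward a target arrangement, as in the following Python sketch; the linear ramp and distance-based progress measure are assumptions for illustration:

    def feedback_intensity(current_dist, start_dist, target_dist):
        """Scale light, sound, or vibration intensity from 0.0 (off) to
        1.0 (full) as the devices approach a target separation.

        current_dist: present separation of devices 202 and 204.
        start_dist: separation when the exercise began.
        target_dist: desired separation.
        """
        if start_dist == target_dist:
            return 1.0  # already at the target arrangement
        progress = (start_dist - current_dist) / (start_dist - target_dist)
        return max(0.0, min(1.0, progress))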
The generic interaction devices 202 and 204 may include input components 254 and 256. The input components 254 and 256 may receive touch-sensitive input (e.g., computer trackpad, capacitive touchscreen, resistive touchscreen), which may enable touchscreen inputs such as swiping, pinching, or expanding. The input components 254 and 256 may receive conventional controller input, such as from a keyboard, interactive environment buttons, joystick input, or optical mouse input. The input components 254 and 256 may receive other inputs, such as environmental readings (e.g., temperature, atmospheric pressure) or mechanical readings (e.g., compression or distortion of the generic interaction device). The input components 254 and 256 may be used in the absolute positioning of the generic interaction devices 202 and 204, such as by receiving externally provided ranging information or input video of external reference points. Each of these input components may be used separately or in combination to cause interaction between the virtual objects on the interactive display. For example, a touch-sensitive input in combination with the repositioning of the generic interaction devices 202 and 204 may change the virtual object(s) differently than a simple repositioning of the generic interaction devices 202 and 204. The input components may also provide inputs used to change the shape, geometry, or other visible properties of any displayed virtual objects on the interactive display.
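By way of a non-limiting illustration, combining a touch gesture with device repositioning might be interpreted as in the following Python sketch; the gesture names and action rules are hypothetical:

    def interpret_input(gesture, displacement):
        """Map a touch gesture plus device repositioning to a
        virtual-object action on the interactive display.

        gesture: "pinch", "expand", or None (no touch input).
        displacement: signed magnitude of the devices' repositioning.
        """
        if gesture == "pinch":
            return {"action": "shrink", "amount": abs(displacement)}
        if gesture == "expand":
            return {"action": "grow", "amount": abs(displacement)}
        # A bare repositioning simply translates the virtual object.
        return {"action": "translate", "amount": displacement}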
Once the relative location information (e.g., 110 or 210) has been detected (operation 302), the system interactive method 300 may send the relative location information (e.g., 110 or 210) to an interactive console (e.g., the interactive console 206 of
The master device interactive method 400 may send the relative location information (e.g., 110 or 210) to an interactive console (e.g., 206) (operation 408). The master device interactive method 400 may then receive a response from the interactive console (e.g., 206) (operation 410), where the response is based on the relative location information (e.g., 110 or 210). Using the response from the interactive console (e.g., 206), the master device interactive method 400 may provide sensory feedback at the generic interaction devices (e.g., 102, 104 or 202, 204) (operation 412).
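By way of a non-limiting illustration, one pass of the master device interactive method 400 might be structured as in the following Python sketch; the four callables are hypothetical stand-ins for the detection, transport, and feedback hardware, and only operations 408-412 are named in the description above:

    def master_device_pass(detect, send, receive, feedback):
        """Run one detect/send/receive/feedback cycle of method 400."""
        relative_location = detect()   # detect relative location information
        send(relative_location)        # operation 408: send to the console
        response = receive()           # operation 410: console's response
        feedback(response)             # operation 412: sensory feedback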
The master interaction device 502 may include a master relative location determination component 512, and the slave interaction device 504 may include a slave relative location determination component 522. The relative location determination components 512 and 522 may interact with each other to detect relative location information, or may operate independently to detect relative location information. The relative location determination components 512 and 522 may actively send and receive information to and from each other to detect relative location information, such as by using a received signal strength indicator (RSSI) in Bluetooth or another RF protocol. The master relative location determination component 512 may include a passive device, such as an RFID chip, and the slave relative location determination component 522 may actively detect the proximity of the RFID chip. The relative location determination components 512 and 522 may include a combination of active and passive components, and may switch between using active or passive components to conserve power, to increase accuracy, or to improve system performance. The relative location determination components 512 and 522 may use sonic or optical ranging, or may use sonic or optical communication for ranging (e.g., IrDA communication). The relative location determination components 512 and 522 may include inertial sensors (e.g., accelerometers, gyroscopes) to detect acceleration, rotation, or orientation information relative to gravity.
The master interaction device 502 may include a master sensory feedback component 514, and the slave interaction device 504 may include a slave sensory feedback component 524. These sensory feedback components 514 and 524 may include various feedback implementations, such as lights, speakers, vibration components, or electromagnetic components to indicate when the generic interaction devices 502 and 504 have been arranged or are being manipulated in a specific manner. The sensory feedback components 514 and 524 may provide binary feedback, where the light, sound, or vibration is either on or off. The sensory feedback components 514 and 524 may provide varying levels of feedback, where the light, sound, or vibration may be increased or decreased in intensity. The sensory feedback components 514 and 524 may include electromagnetic or other motion-based feedback, such as a solenoid that shifts the balance of the generic interaction devices 502 and 504, or an electromagnet that causes the generic interaction devices 502 and 504 to repel or attract one another.
The master interaction device 502 may include a master input component 516, and the slave interaction device 504 may include a slave input component 526. The master and slave input components 516 and 526 may receive input from external sources, or may include various components to measure or observe external information. The master and slave input components 516 and 526 may receive conventional controller input, such as from a keyboard, interactive environment buttons, joystick input, or optical mouse input. The master and slave input components 516 and 526 may receive touch-sensitive input (e.g., computer trackpad, capacitive touchscreen, resistive touchscreen), which may enable touchscreen inputs such as swiping, pinching, or expanding. The master and slave input components 516 and 526 may receive other inputs, such as environmental readings (e.g., temperature, atmospheric pressure) or mechanical readings (e.g., compression or distortion of the generic interaction devices 502 and 504). The master and slave input components 516 and 526 may receive other input to provide for absolute positioning of the master and slave interaction devices 502 and 504, such as externally provided ranging information or input video of external reference points. For example, an external device may provide a distance-sensitive RF beacon, or an infrared (IR) light might provide an external reference point to indicate the direction of the display.
The master interaction device 502 may include a master interactive system communication component 518, and the slave interaction device 504 may include a slave interactive system communication component 528. The interactive system communication components 518 and 528 may communicate directly with each other 530, or may communicate 532 and 534 with a generic interaction device communication component 542 within the interactive display system 506. Though
The generic interaction device communication component 542 may be external to the interactive display system 506, such as is depicted in
The example generic interaction device 600 includes a processor 602 (e.g., a central processing unit (CPU), a graphics processing unit (GPU) or both), a main memory 604 and a static memory 606, which communicate with each other via an interconnect 608 (e.g., a link, a bus, etc.). The generic interaction device 600 may further include a display device 610 to provide visual feedback, such as one or more LED lights or an LCD display. The generic interaction device 600 may further include an input device 612 (e.g., a button or alphanumeric keyboard), and a user interface (UI) navigation device 614 (e.g., an integrated touchpad). In one embodiment, the display device 610, input device 612 and UI navigation device 614 are a touch screen display. The generic interaction device 600 may additionally include mass storage 616 (e.g., a drive unit), a signal generation device 618 (e.g., a speaker), an output controller 632, battery power management 634, and a network interface device 620 (which may include or operably communicate with one or more antennas 630, transceivers, or other wireless communications hardware), and one or more sensors 628, such as a GPS sensor, compass, location sensor, accelerometer, or other sensor.
The mass storage 616 includes a machine-readable medium 622 on which is stored one or more sets of data structures and instructions 624 (e.g., software) embodying or utilized by any one or more of the methodologies or functions described herein. The instructions 624 may also reside, completely or at least partially, within the main memory 604, static memory 606, and/or within the processor 602 during execution thereof by the generic interaction device 600, with the main memory 604, static memory 606, and the processor 602 constituting machine-readable media.
While the machine-readable medium 622 is illustrated in an example embodiment to be a single medium, the term “machine-readable medium” may include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more instructions 624. The term “machine-readable medium” shall also be taken to include any tangible medium that is capable of storing, encoding or carrying instructions 624 for execution by the generic interaction device 600 and that cause the generic interaction device to perform any one or more of the methodologies of the present disclosure or that is capable of storing, encoding or carrying data structures utilized by or associated with such instructions 624. The term “machine-readable medium” shall, accordingly, be taken to include, but not be limited to, solid-state memories, optical media, and magnetic media. Specific examples of machine-readable media 622 include non-volatile memory, including, by way of example, semiconductor memory devices (e.g., Electrically Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM)) and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.
The instructions 624 may further be transmitted or received over a communications network 626 using a transmission medium via the network interface device 620 utilizing any one of a number of well-known transfer protocols (e.g., HTTP). Examples of communication networks 626 include a local area network (LAN), a wide area network (WAN), the Internet, mobile telephone networks, plain old telephone service (POTS) networks, and wireless data networks (e.g., Wi-Fi, 3G, 4G LTE/LTE-A, or WiMAX networks). The term “transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding, or carrying instructions 624 for execution by the generic interaction device 600, and includes digital or analog communications signals or other intangible medium to facilitate communication of such software.
Embodiments may be implemented in connection with wired and wireless networks, across a variety of digital and analog media. Although some of the previously described techniques and configurations were provided with reference to implementations of consumer electronic devices with wired or physically coupled digital signal connections, these techniques and configurations may also be applicable to display of content from wireless digital sources across a variety of local area wireless multimedia networks and network content access using WLANs, WWANs, and wireless communication standards. Further, the previously described techniques and configurations are not limited to input sources provided from a direct analog or digital signal, but may be applied or used with any number of multimedia streaming applications and protocols to provide display content over an input link.
Embodiments may be implemented in one or a combination of hardware, firmware, and software. Embodiments may also be implemented as instructions stored on a machine-readable storage device, which may be read and executed by at least one processor to perform the operations described herein. A machine-readable storage device may include any non-transitory mechanism for storing information in a form readable by a device (e.g., a computer or other processor-driven display device). For example, a machine-readable storage device may include read-only memory (ROM), random-access memory (RAM), magnetic disk storage media, optical storage media, flash-memory devices, and other storage devices and media. In some embodiments, display devices such as televisions, A/V receivers, set-top boxes, and media players may include one or more processors and may be configured with instructions stored on such machine-readable storage devices.