The capabilities of computing devices have continuously expanded to offer ever greater functionality and convenience. From personal computers integrated with monitors to wearable computers, computing devices have progressed toward integrated devices. Each such integrated computing device presents a unique set of problems that must be overcome to provide a truly integrated and natural computing experience.
Often, users of computing devices need to share information with other users and collaborate on common information. Various information sharing services allow users to exchange information between their computing devices. Many such services provide some form of visualization for this information exchange.
The technology, roughly described, includes an integrated processing and projection device which can rest on a supporting surface and provides interactivity between users. The interactivity is provided in a display area projected by the device on the supporting surface. The integrated processing and projection device includes a processor and a projector designed to provide a display on the supporting surface of the device. Various sensors enable object and gesture detection in the display area. An interactive service, provided using the device or a network-connected host, enables users of companion processing devices to interact in the display area of the integrated processing and projection device using the companion devices, via an interface in the display provided by the projector. Users without companion devices can interact with users of companion devices using an interface provided in the display area.
An integrated processing system includes a display projector provided in a housing adapted to be supported by a surface. The display projector is adapted to display an interface in a display area on the supporting surface. The system includes an RGB camera and an infrared emitter and detector, wherein the RGB camera and the infrared detector each have a field of view encompassing a detection area including at least the display area. The system includes a communication interface receiving input and providing output to associated devices. The system includes a processor and memory having code operable to instruct the processor to receive input from one or more associated devices via the communication interface, the input comprising at least data to be shared in the display area, and provide an output in the display area of the data to be shared based on input instructions from the one or more associated devices.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
Technology is presented wherein an integrated processing and projection device adapted to rest on a supporting surface provides interactive applications, alone or in conjunction with associated processing devices, in a projected display area on the supporting surface. In one aspect, interaction between multiple users is enabled by an integrated processing and projection device, or by a hosted service designed to enable interaction in conjunction with an integrated processing and projection device. The integrated processing and projection device includes a processor and a projector designed to provide a display on the supporting surface of the device. Various sensors enable object and gesture detection in the display area. An interactive service, provided using the device or a network-connected host, enables users of companion processing devices to interact in the display area of the integrated processing and projection device using the companion devices, via an interface in the display provided by the projector. Users without companion devices can interact with users of companion devices using an interface provided in the display area.
As illustrated in
Housing 106 includes a lid 102 having mounted therein a rotatable mirror 110. Lid 102 is supported by arms 112, 113 which can raise and lower lid 102 as illustrated in
As illustrated in
A second embodiment of device 100 is illustrated in
With reference to
The system memory 222 includes computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 223 and random access memory (RAM) 231. A basic input/output system (BIOS) 224, containing the basic routines that help to transfer information between elements within device 100, such as during start-up, is typically stored in ROM 223. RAM 231 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 259. By way of example, and not limitation,
Object detection component 226 includes instructions for enabling the processing units 259 to detect both passive and active objects in the object detection area 122. Gesture recognition component 227 allows detection of user hand and object gestures within the detection area 122. Depth data processing component 228 allows for the depth image data provided by capture device 322 to be utilized in conjunction with the RGB image data and the IR detector data to determine any of the objects or gestures described herein. Interaction service component 229a provides a communication path to allow users with other processing devices to communicate with the device 100 and/or the device 100 to communicate with an interactive service system (illustrated in
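A minimal, hypothetical sketch of how these components might be organized in software follows; the class and method names are illustrative assumptions and are not defined by this description:

```python
# Hypothetical sketch of the software components held in system memory.
# Names are illustrative assumptions keyed to reference numerals 226-229a.

class ObjectDetector:            # object detection component 226
    def detect(self, rgb_frame, ir_frame, depth_frame):
        """Return passive and active objects found in detection area 122."""

class GestureRecognizer:         # gesture recognition component 227
    def recognize(self, frames):
        """Return hand or object gestures performed in detection area 122."""

class DepthDataProcessor:        # depth data processing component 228
    def fuse(self, depth_frame, rgb_frame, ir_frame):
        """Combine depth, RGB, and IR data to support object/gesture determination."""

class InteractionServiceClient:  # interaction service component 229a
    def connect(self, host_address):
        """Open a communication path to companion devices or a hosted interaction service."""
```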
Optionally, an interaction application 260 may be provided to implement the functions of
Device 100 may also include other removable/non-removable, volatile/nonvolatile computer storage media. By way of example only,
The computer 241 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 251. The remote computer 251 may be a personal computer, a server, a router, a network PC, a peer device or other common network node. The logical connections depicted include a local area network (LAN) and a wide area network (WAN) 245, but may also include other networks. Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets and the Internet. When used in a LAN networking environment, the computer 241 is connected to the LAN/WAN 245 through a network interface or adapter 237.
The RGB camera 160 and IR detector 150 may be coupled to a video interface 232 which processes their input before it is provided to the processing units 259. A graphics processor 231 may be utilized to offload rendering tasks from the processing units 259. The IR emitter 150 operates under the control of the processing units 259. Projector 170 is coupled to the video interface 232 to output content to the display area 120. Video interface 232 operates in conjunction with user input interface 236 to interpret input gestures and controls from a user which may be provided in the display area 120.
A user may enter commands and information into the device 100 through conventional input devices, but optimally a user interface is provided by the projector 170 into the display area 120 when input is utilized by any of the applications operating on or in conjunction with device 100.
A capture device 322 may optionally be provided in one embodiment as shown in
In time-of-flight analysis, the IR light component 324 of the capture device 322 may emit an infrared light onto the capture area and may then use sensors to detect the backscattered light from the surface of one or more objects in the capture area using, for example, the 3-D camera 326 and/or the RGB camera 328. In some embodiments, pulsed infrared light may be used such that the time between an outgoing light pulse and a corresponding incoming light pulse may be measured and used to determine a physical distance from the capture device 322 to a particular location on the one or more objects in the capture area. Additionally, the phase of the outgoing light wave may be compared to the phase of the incoming light wave to determine a phase shift. The phase shift may then be used to determine a physical distance from the capture device to a particular location associated with the one or more objects.
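As an illustration of the time-of-flight principle just described, the round-trip time of a light pulse, or the phase shift measured at a known modulation frequency, may be converted to distance roughly as follows; this is a sketch only, and the function names and parameters are assumptions rather than part of this description:

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_pulse(round_trip_seconds: float) -> float:
    # The pulse travels to the object and back, so halve the round trip.
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

def distance_from_phase(phase_shift_rad: float, modulation_hz: float) -> float:
    # The phase shift of the modulated IR wave maps to a round-trip distance,
    # unambiguous within one modulation wavelength.
    round_trip = (phase_shift_rad / (2.0 * math.pi)) * (SPEED_OF_LIGHT / modulation_hz)
    return round_trip / 2.0
```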
In another example, the capture device 322 may use structured light to capture depth information. In such an analysis, patterned light (i.e., light displayed as a known pattern such as a grid pattern or a stripe pattern) may be projected onto the capture area via, for example, the IR light component 324. Upon striking the surface of one or more objects (or targets) in the capture area, the pattern may become deformed in response. Such a deformation of the pattern may be captured by, for example, the 3-D camera 326 and/or the RGB camera 328 and analyzed to determine a physical distance from the capture device to a particular location on the one or more objects. Capture device 322 may include optics for producing collimated light. In some embodiments, a laser projector may be used to create a structured light pattern. The light projector may include a laser, laser diode, and/or LED.
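The deformation analysis described above amounts to triangulation: the lateral displacement (disparity) between where a pattern element is projected and where the camera observes it determines depth. A minimal sketch, assuming a rectified projector-camera pair with a known baseline and focal length (both assumptions for illustration only):

```python
def depth_from_disparity(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    # Standard triangulation: a larger disparity indicates a closer surface.
    if disparity_px <= 0:
        raise ValueError("pattern element not matched")
    return focal_px * baseline_m / disparity_px
```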
The capture device 322 may include a processor 332 that may be in communication with the image camera component 331. The processor 332 may include a standardized processor, a specialized processor, a microprocessor, or the like. The processor 332 may execute instructions that may include instructions for receiving and analyzing images. It is to be understood that at least some image analysis and/or target analysis and tracking operations may be executed by processors contained within one or more capture devices such as capture device 322.
The capture device 322 may include a memory 334 that may store the instructions that may be executed by the processor 332, images or frames of images captured by the 3-D camera or RGB camera, filters or profiles, or any other suitable information, images, or the like. As depicted, the memory 334 may be a separate component in communication with the image capture component 331 and the processor 332. In another embodiment, the memory 334 may be integrated into the processor 332 and/or the image capture component 331.
The capture device 322 may be in communication with the device 100 via a communication link. The communication link may be a wired connection including, for example, a USB connection, a FireWire connection, an Ethernet cable connection, or the like, and/or a wireless connection such as a wireless 802.11b, g, a, or n connection.
The cameras 326, 328 and capture device 322 may define additional input devices for the device 100 that connect via user input interface 236. In addition, device 100 may incorporate a microphone 243 and speakers 244 coupled to an audio interface 233.
As noted above, an interaction service may allow users and their associated companion processing devices to interact with the device 100 and in the display area 120, providing a common interaction zone for both users of companion devices and users interacting with the device 100 directly.
Service host 602 may include, for example, a user login service 608, an object library 618, applications 620, and interaction service 229b. Service database 612 may include user account records 610 having a user-associated object library and user data 614 and a friends or associates list 616. The service host 602 may be utilized to provide applications running on either the processing devices 600 or the integrated processing and projection device 100 with the means to communicate objects from user data 614, or from the individual devices 600 and 100 and their respective applications, to each other and to a common display environment such as the display area 120 of the integrated processing and projection device 100. Users of processing devices 600 and projection device 100 may utilize any of a variety of applications to share information in a common display area 120, or between individual associated processing devices, based on permissions defined and stored in the user account records 610. The login service 608 ensures that each user of the collaboration services and the user's associated device is authorized and authenticated. Object library 618 may provide information to the integrated processing and projection device 100 regarding real objects which may be placed in the display area 120, as well as the identity of associated processing devices of users of service host 602. Interaction service 229b utilizes the identities of users and associated processing devices to enable information sharing between the processing devices 600 and the integrated processing and projection device 100. The interaction service can utilize the sensors described above on the integrated processing and projection device 100 to identify companion processing devices which are proximate to the integrated processing and projection device 100, thereby enabling communication and interaction between the respective devices.
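One hypothetical way to picture the records maintained in service database 612 is the following data model sketch; the field and function names are illustrative assumptions rather than elements of this description:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class UserAccountRecord:                                  # user account records 610
    user_id: str
    object_library: Dict[str, dict] = field(default_factory=dict)    # object library and user data 614
    user_data: Dict[str, bytes] = field(default_factory=dict)        # documents, notes, and similar items
    friends: List[str] = field(default_factory=list)                  # friends or associates list 616
    permissions: Dict[str, List[str]] = field(default_factory=dict)   # per-associate sharing permissions

def may_share(sender: UserAccountRecord, recipient_id: str, data_type: str) -> bool:
    # Sharing is allowed only if the recipient is on the friends list and the
    # permission set stored for that recipient includes the data type.
    return recipient_id in sender.friends and data_type in sender.permissions.get(recipient_id, [])
```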
Applications 620 may comprise executable instructions for any of the processing devices 600 and the integrated processing and projection device 100, which may be utilized by the devices 100 and 600 to participate in the interaction service 229b as described in the examples herein.
User interaction history and permissions may be stored in the object library and user data 614 and the friends list 616. For example, where a user has identified certain physical objects in the display area and preferences regarding such objects, this information is stored in the object library and user data 614. The object library and user data 614 may also include user-specific data that a user wishes to use in the interaction service. For example, data such as documents and notes may be retained securely in the user data 614 and accessed by the user when interacting with the collaboration service. The friends and associates list 616 can store permissions for the types of information which may be made available through various applications to users of respective processing devices 600 and device 100.
As noted herein, device 100 may utilize sensors including IR detector 150 and camera 160 (and optionally, capture device 322) to identify objects in a detection area 122. Device 100 may also identify other devices and users via communication between those devices and device 100, or by using the above sensors.
Other types of interactive applications need not use service 229 or proximate associated processing devices.
If a user does not have an associated processing device, then an interaction application 260 and the interaction service 229 may still allow the user to participate in interactions using an interface in the display area 120. If a user does have an associated processing device, a determination is made at 1212 as to whether or not the device has an interactive service enabled application which interfaces with the integrated device service. The applications on an associated processing device can access the device 100 and projected area 120, as described above, via service 229 using an application programming interface. If the associated processing device is accessing the integrated device service 229 at step 1212, then for each application accessing the interaction service, device 100 will respond per application instructions to render and identify objects in the display area and the detection area 122.
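Such an application programming interface might, for illustration only, expose operations of roughly the following shape; none of these names are defined by this description:

```python
class InteractionServiceAPI:
    """Hypothetical client-side surface of interaction service 229 used by a companion-device application."""

    def share_object(self, user_id: str, payload: bytes, position: tuple) -> str:
        """Ask device 100 to render a display object in display area 120; returns an object identifier."""

    def on_gesture(self, callback) -> None:
        """Register a callback invoked when a gesture in detection area 122 targets this application's objects."""

    def withdraw_object(self, object_id: str) -> None:
        """Remove a previously shared object from the display area."""
```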
If no associated processing device is present for a given user at 1210, then the system will monitor the detection area 122 at step 1218. If an object or gesture is performed or placed in the detection area 122, the object or gesture is identified at 1220, and at 1224 a determination is made as to whether or not the object or gesture is an interaction with the interaction application. If so, then feedback regarding the interaction may be projected into the display area 120 at 1226. If the gesture is not a user interaction with the interaction application at 1224, the method waits for the next interaction at 1218. If there is no object or gesture in the detection area, step 1218 loops until such an action or object occurs.
It should be understood that for each device and each user accessing an interaction application, steps 1214 and steps 1218-1226 may operate concurrently and in parallel, so that both users with associated processing devices and those without may utilize an interaction application at the same time.
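For a user without an associated processing device, the monitoring loop of steps 1218-1226 might be sketched as follows; the function names are hypothetical:

```python
def monitor_detection_area(sensors, projector, app):
    # Hypothetical sketch of steps 1218-1226 for a user without an associated device.
    while True:
        event = sensors.wait_for_object_or_gesture()   # step 1218: loop until an object or gesture occurs
        identified = sensors.identify(event)           # step 1220: identify the object or gesture
        if app.is_interaction(identified):             # step 1224: is this an interaction with the application?
            projector.render_feedback(identified)      # step 1226: project feedback into display area 120
        # otherwise simply wait for the next interaction (back to step 1218)
```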
In an interaction application, an exemplary action may be, at 1412, to receive an object for display in the display area. If a display action is determined at 1412, then the object is accessed at 1414 from a data store associated with the user providing the object, the object is displayed at 1416, and the balance of the display area is updated at 1450. Finally, the service communicates the action to the application to update the application state at 1455. Another exemplary action may be to remove an object from the display area 120 at 1418. If this action is detected at 1418, then the object is removed at 1420 and the balance of the display in the display area 120 is rendered at 1450 based on the trigger, object position, and application settings. Yet another possible action at 1422 may be to transfer an object from one user to another, either to a user's associated device or to a user data store. If, at 1422, the action is to transfer an object, the permissions may be checked at 1424, and if the permissions pass, then the object can be transferred from one user data store to another at 1426. Again, the display is rendered at 1450 in accordance with the trigger, object position, and application settings. In all cases the application state is updated at 1455 and the method returns to step 1218 to await another action.
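The exemplary actions of steps 1412-1426 can be read as a dispatch of roughly the following form; this is a sketch only, and the names are assumptions rather than part of this description:

```python
def handle_action(action, data_stores, display, app, permitted):
    # Hypothetical dispatch over the exemplary actions of steps 1412-1426.
    if action.kind == "display":                               # step 1412: share an object for display
        obj = data_stores[action.owner][action.object_id]      # step 1414: fetch from the providing user's store
        display.show(obj)                                       # step 1416: render it in the display area
    elif action.kind == "remove":                               # step 1418: remove an object
        display.remove(action.object_id)                        # step 1420
    elif action.kind == "transfer":                             # step 1422: transfer an object between users
        if permitted(action.sender, action.recipient):          # step 1424: permission check
            obj = data_stores[action.sender].pop(action.object_id)
            data_stores[action.recipient][action.object_id] = obj   # step 1426: move to the other user's store
    display.refresh()                                           # step 1450: re-render the balance of the display
    app.update_state(action)                                    # step 1455: report the action back to the application
```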
Mobile device 1600 includes one or more processors 1612 and memory 1610. Memory 1610 includes applications 1630 and non-volatile storage 1640. Memory 1610 may comprise any of a variety of memory storage media types, including non-volatile and volatile memory. A mobile device operating system handles the different operations of the mobile device 1600 and may contain user interfaces for operations, such as placing and receiving phone calls, text messaging, checking voicemail, and the like. The applications 1630 can be any assortment of programs, such as a camera application for photos and/or videos, an address book, a calendar application, a media player, an internet browser, games, an alarm application, and other applications. The non-volatile storage component 1640 in memory 1610 may contain data such as music, photos, contact data, scheduling data, and other files.
The one or more processors 1612 are in communication with a see-through display 1609. The see-through display 1609 may display one or more virtual objects associated with a real-world environment. The one or more processors 1612 also communicate with RF transmitter/receiver 1606 which in turn is coupled to an antenna 1602, with infrared transmitter/receiver 1608, with global positioning service (GPS) receiver 1665, and with movement/orientation sensor 1614 which may include an accelerometer and/or magnetometer. RF transmitter/receiver 1606 may enable wireless communication via various wireless technology standards such as Bluetooth® or the IEEE 802.11 standards. Accelerometers have been incorporated into mobile devices to enable applications such as intelligent user interface applications that let users input commands through gestures, and orientation applications which can automatically change the display from portrait to landscape when the mobile device is rotated. An accelerometer can be provided, e.g., by a micro-electromechanical system (MEMS), which is a tiny mechanical device (of micrometer dimensions) built onto a semiconductor chip. Acceleration direction, as well as orientation, vibration, and shock, can be sensed. The one or more processors 1612 further communicate with a ringer/vibrator 1616, a user interface keypad/screen 1618, a speaker 1620, a microphone 1622, a camera 1624, a light sensor 1626, and a temperature sensor 1628. The user interface keypad/screen may include a touch-sensitive screen display.
The one or more processors 1612 control transmission and reception of wireless signals. During a transmission mode, the one or more processors 1612 provide voice signals from microphone 1622, or other data signals, to the RF transmitter/receiver 1606. The transmitter/receiver 1606 transmits the signals through the antenna 1602. The ringer/vibrator 1616 is used to signal an incoming call, text message, calendar reminder, alarm clock reminder, or other notification to the user. During a receiving mode, the RF transmitter/receiver 1606 receives a voice signal or data signal from a remote station through the antenna 1602. A received voice signal is provided to the speaker 1620 while other received data signals are processed appropriately.
Additionally, a physical connector 1688 may be used to connect the mobile device 1600 to an external power source, such as an AC adapter or powered docking station, in order to recharge battery 1604. The physical connector 1688 may also be used as a data connection to an external computing device. The data connection allows for operations such as synchronizing mobile device data with the computing data on another device.
The disclosed technology is operational with numerous other general purpose or special purpose computing system environments or configurations. Examples of well-known computing systems, environments, and/or configurations that may be suitable for use with the technology include, but are not limited to, personal computers, server computers, hand-held or laptop devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like.
The disclosed technology may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, software and program modules as described herein include routines, programs, objects, components, data structures, and other types of structures that perform particular tasks or implement particular abstract data types. Hardware or combinations of hardware and software may be substituted for software modules as described herein.
For purposes of this document, reference in the specification to “an embodiment,” “one embodiment,” “some embodiments,” or “another embodiment” may be used to describe different embodiments and do not necessarily refer to the same embodiment.
For purposes of this document, the term “set” of objects refers to a “set” of one or more of the objects.
For purposes of this document, the term “based on” may be read as “based at least in part on.”
For purposes of this document, without additional context, use of numerical terms such as a “first” object, a “second” object, and a “third” object may not imply an ordering of objects, but may instead be used for identification purposes to identify different objects.
In one aspect, the technology includes an interactive integrated processing system, comprising: a display projector in a housing, the display projector adapted to display an interface in a display area on a supporting surface; an RGB camera; an infrared emitter and infrared detector, wherein the RGB camera and the infrared detector each have a field of view, each field of view encompassing a detection area including at least the display area; a communication interface; and a processor and memory including code operable to instruct the processor to receive input from one or more associated devices via the communication interface, the input comprising at least data to be shared in the display area, and provide an output in the display area of the data to be shared based on input instructions from the one or more associated devices.
Additional aspects of the technology include any of the foregoing embodiments wherein the code is operable to instruct the processor to receive input manipulating data in the display area via the detection area.
Another aspect of the technology includes any of the aforementioned embodiments wherein the code is operable to detect one or more user gestures manipulating data in the display area, at least one gesture manipulating data projected as a display object in the display area.
Additional aspects of the technology include any of the foregoing embodiments wherein the code is operable to instruct the processor to control the display projector to render an interface in the display area, the interface configured to manipulate the data shared in the display area.
Additional aspects of the technology include any of the foregoing embodiments wherein the code is operable to identify one or more users proximate to the device and associate the data shared in the display area with the one or more users.
Additional aspects of the technology include any of the foregoing embodiments wherein the code is operable to receive input comprising a transfer data input and to transfer the transfer data from one user data store to another user data store.
Additional aspects of the technology include any of the foregoing embodiments wherein the code is operable to provide an interaction service configured to identify the one or more associated devices and associate the one or more associated devices with an identified user of the device.
Additional aspects of the technology include any of the foregoing embodiments wherein the code is operable to receive the input from the one or more associated devices via a host processing device, the input from the host processing device including an identification of a user associated with each of the one or more processing devices.
Another aspect of the technology includes a computer implemented method facilitating interaction between multiple users in a projection area. The method includes rendering a display area on a supporting surface using an interaction device having a projector provided in a housing on the supporting surface; detecting one or more inputs in the display area utilizing sensors provided in the housing, each of the sensors having a field of view defining a detection area including at least the display area; receiving input to an interactive service via a communication interface provided in the housing, the input adapted to share information in the display area, the input received from a companion processing device associated with a user; and rendering an output in the display area responsive to the input, the output including one or more display objects representing interaction activity between at least the companion processing device and the interaction device.
Additional aspects of the technology include any of the foregoing embodiments wherein the detecting includes receiving input comprising a user gesture manipulating data using the one or more display objects in the display area via the detection area.
Additional aspects of the technology include any of the foregoing embodiments wherein the detecting includes detecting one or more user gestures adapted to transfer data from one user data store to another user data store.
Additional aspects of the technology include any of the foregoing embodiments wherein the rendering an output includes displaying a shared display object provided by a user from at least one companion processing device.
Additional aspects of the technology include any of the foregoing embodiments wherein the method further includes rendering a control interface in the display area, the interface configured to manipulate data shared in the display area.
Additional aspects of the technology include any of the foregoing embodiments further including identifying one or more users proximate to the device and associating data shared in the display area with the one or more users.
Additional aspects of the technology include any of the foregoing embodiments further including receiving input via the detection area comprising a transfer data input between users and transferring the transfer data from one user data store to another user data store.
Additional aspects of the technology include any of the foregoing embodiments wherein said receiving includes receiving the input from the one or more companion processing devices via a host processing device, the input from the host processing device including an identification of a user associated with each of the one or more companion processing devices.
Another aspect of the technology is an apparatus, comprising: a housing adapted to be supported on a surface; a processor in the housing; a projector in the housing, the projector configured to render a display area on the surface; a first type of image sensor and a second type of image sensor in the housing, each image sensor having a field of view of at least the display area; and a memory in the housing, the memory including code instructing the processor to provide an interaction service to receive input from at least a first user and a second user, each user having at least an associated data store, the code operable to instruct the processor to receive input to manipulate objects in the data store via the display area on the surface.
Additional aspects of the technology include any of the foregoing embodiments wherein the code is operable to receive data from one of the associated data stores and to render in the display area a display object representing the data.
Additional aspects of the technology include any of the foregoing embodiments wherein the code is operable to receive input comprising a gesture manipulating the display object to manipulate data relative to the one of the associated data stores.
Additional aspects of the technology include any of the foregoing embodiments wherein the code is operable to transfer data between associated data stores responsive to the gesture.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.