Many mobile devices have a small display screen or no display screen, which limits the complexity of the interface they can present. To overcome the limited display size, some mobile devices link to a desktop or laptop computer that has a larger display. These mobile devices then use the electronic device having the larger display as the user interface. However, using the desktop or laptop computer as the interface to the electronic device can decrease the intuitive nature and ease of use of the user interface.
The figures depict implementations/embodiments of the invention and not the invention itself. Some embodiments are described, by way of example, with respect to the following Figures.
The drawings referred to in this Brief Description should not be understood as being drawn to scale unless specifically noted.
For simplicity and illustrative purposes, the principles of the embodiments are described by referring mainly to examples thereof. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the embodiments. It will be apparent, however, to one of ordinary skill in the art, that the embodiments may be practiced without limitation to these specific details. Also, different embodiments may be used together. In some instances, well known methods and structures have not been described in detail so as not to unnecessarily obscure the description of the embodiments.
The present invention describes a method and system capable of interacting with an interfacing device positioned behind a display screen.
In the embodiments described, the display 112 of the display system 110 provides a larger display screen than the display of the interfacing device 120. In fact in some cases (say for the keyboard example described with respect to
One benefit of the present invention is that the actions selected and the resulting content presented on the expanded display output are controlled by interactions with the interfacing device itself. This is in contrast to some systems where the user controls the output to the expanded screen of the computing device using the interfaces of the computing device itself rather than by directly manipulating or interacting with the interfacing device 120.
In contrast, the present invention allows the user to hold and manipulate the interfacing device 120. This provides a very natural, intuitive way of interacting with the device while still providing an expanded display for the user to interact with. For example, say an interfacing device such as a handheld mobile device is positioned behind an expanded transparent display screen 112 which shows several photographs on the display screen, positioned to the right of and in front of the interfacing device 120. If the interfacing device includes an arrow key, the user could simply hold the interfacing device and use the arrow key on the device to point to a specific photograph on the expanded screen 112 to interact with. This is in contrast to, for example, the user interacting with a mouse of the PC to move to the arrow key on a visual representation of the interfacing device and clicking on the arrow to move to the picture they wish to select.
In one embodiment, the content displayed on the display screen 112 is an overlaid image. The display system 100 creates an “overlaid” image on the display screen 112, where the overlaid image is an image generated on the display screen between the user's viewpoint and the object 120 behind the screen that it is “overlaid” on. Details regarding how the overlaid image is generated are described in greater detail in the patent application having the title “An Augmented Reality Display System and Method of Display” filed on Oct. 22, 2010, having Serial Number PCT/US2010/053860. The overlaid image generated is dependent upon the user's viewpoint. Thus, the position of the overlaid image with respect to the object behind the display screen stays consistent even as the user moves their head and/or the object behind the display screen.
In one embodiment, the overlaid image is created by the display controller component 130 responsive to the viewpoint of the user and the position of the display screen.
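To make the viewpoint dependence concrete, the following sketch places an overlay at the point where the line of sight from the user's viewpoint to the tracked object crosses the screen plane. This is only an illustrative sketch, not the claimed implementation; the coordinate convention (screen in the plane z = 0, viewpoint in front of the screen, object behind it) and the function name are assumptions.

```python
# Minimal sketch: place an overlaid image so it lies on the line of sight
# between the user's viewpoint and the object behind the transparent screen.
# Assumes screen coordinates with the display screen in the plane z = 0,
# the user in front of the screen (z > 0), and the object behind it (z < 0).

def overlay_position(viewpoint, obj_position):
    """Return the (x, y) point on the screen where the overlay should be drawn."""
    vx, vy, vz = viewpoint
    ox, oy, oz = obj_position
    if vz == oz:
        raise ValueError("viewpoint and object must be on opposite sides of the screen")
    # Parameter where the viewpoint-to-object line crosses the plane z = 0.
    t = vz / (vz - oz)
    return (vx + t * (ox - vx), vy + t * (oy - vy))

# Example: user 60 cm in front of the screen, device 20 cm behind it.
print(overlay_position((0.0, 10.0, 60.0), (5.0, 0.0, -20.0)))  # -> (3.75, 2.5)
```

As the user's head or the device moves, recomputing this crossing point keeps the overlay registered to the object behind the screen.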
In addition, the display system 100 shown in
In addition, the display system also can include a display generation component 126, wherein based on data 128a from the viewpoint assessment component 116 and data 130a from the object tracking component 124, the display generation component 126 creates content for display on the display screen 112. The display controller component 130 outputs data 134 from at least the display generation component 126 to the display screen 112. Data (128a, 130a) output from the viewpoint assessment component 116 and the object tracking component 124 is used by the display generation component to generate an image on the display screen that overlays or augments objects placed behind the screen.
The display system includes an interaction tracking component 192. In the embodiment shown in
In one embodiment, the interaction tracking component 192 includes a predefined list of device interactions 194 and the resulting outputs on the display (display modifications 195). For example, pressing the delete key on the interfacing device might be one possible interaction. The result (the display modification) in this instance might be that the highlighted item is deleted or removed from the display screen. Information about the possible interactions 194 by the interfacing device 120 and the display modification 195 by the display 110 that results from the interaction 194 is stored and used by the interaction tracking component 192 and the display generation component 126 to generate a display.
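One simple way to realize the pairing of predefined interactions 194 with display modifications 195 is a lookup table consulted when an interaction is recognized, as in the sketch below. The interaction names and handler functions are hypothetical and only illustrate the idea.

```python
# Hypothetical lookup table pairing predefined device interactions (194)
# with the display modifications (195) they trigger.
def delete_highlighted_item(display_state):
    display_state["items"] = [i for i in display_state["items"] if not i.get("highlighted")]

def show_menu(display_state):
    display_state["menu_visible"] = True

DISPLAY_MODIFICATIONS = {
    "delete_key_pressed": delete_highlighted_item,
    "device_placed_behind_screen": show_menu,
}

def handle_interaction(interaction, display_state):
    """Apply the predefined display modification for a recognized interaction."""
    handler = DISPLAY_MODIFICATIONS.get(interaction)
    if handler is not None:
        handler(display_state)
    return display_state

state = {"items": [{"name": "photo1", "highlighted": True}, {"name": "photo2"}]}
print(handle_interaction("delete_key_pressed", state))  # highlighted item removed
```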
In the embodiment shown in
The type of display modification can be dependent upon the interaction by the interfacing device and in some cases is additionally dependent upon the type of interfacing device and the type of display used in the display system. Information about the type of display is stored in the display recognition component 197. Information about the type of device is stored in the device recognition component 196. In one embodiment, this information can be used by the display generation component 126 to determine the type of output displayed. For example, the display generation component might choose to output larger print on a menu on a display type that had very low resolution as compared to a display type that had very high resolution.
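As a concrete illustration of this resolution-dependent choice, the sketch below picks a larger menu font when the recognized display has a low pixel count. The threshold and font sizes are arbitrary assumptions made only for illustration.

```python
# Hypothetical sketch: scale menu text to the recognized display type.
def menu_font_size(display_width_px, display_height_px):
    """Return a font size in points, larger for low-resolution displays."""
    pixel_count = display_width_px * display_height_px
    if pixel_count < 1_000_000:      # assumed "low resolution" threshold
        return 24                    # larger print so the menu stays legible
    return 14                        # normal print for high-resolution displays

print(menu_font_size(800, 600))     # low-resolution display  -> 24
print(menu_font_size(2560, 1440))   # high-resolution display -> 14
```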
As previously stated, the interaction tracking component 192 includes a predefined list of device interactions 194 that result in the display screen being modified. Although not limited to these examples, some examples of user interactions with the interfacing device that could result in a display modification include: pushing a button on the interfacing device, scrolling a jog wheel on the interfacing device, moving the cursor of the interfacing device, the act of putting the interfacing device behind the transparent display screen, performing a recognizable gesture in the vicinity of the interfacing device, and physically manipulating the interfacing device (e.g., shaking the interfacing device, turning the interfacing device upside down, etc.).
In one embodiment, the user interaction with an interfacing device 120 is sensed by sensors in the vicinity of the display system (such as the viewpoint assessment sensors 140a-b or object tracking sensors 148a-b). In an alternative embodiment, whether user interaction has occurred can be communicated electronically from the interfacing device to the display system. For example, consider the case where the user pushes a button on the interfacing device 120. In one embodiment, the object tracking sensors behind the display screen could sense when the user's fingers come into contact with a button on the display of the interfacing device. The sensor data could be sent to the interaction tracking component 192. In another embodiment, the interfacing device 120 is in wireless communication with the interaction tracking component 192, and when a predefined button on the interfacing device is pressed, a signal is transmitted to the interaction tracking component. Based on the signal information transmitted, the display system 100 can determine that an interaction has occurred.
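The electronically communicated case could be as simple as a small message sent from the interfacing device and parsed by the interaction tracking component 192. The message format, field names, and encoding below are assumptions made purely for illustration.

```python
import json

# Hypothetical message an interfacing device might transmit wirelessly
# when a predefined button is pressed.
def encode_button_press(device_id, button_name):
    return json.dumps({"device": device_id, "event": "button_press", "button": button_name})

def decode_interaction(message):
    """Parse a received message so the display system can determine that an interaction occurred."""
    payload = json.loads(message)
    if payload.get("event") == "button_press":
        return ("button_press", payload["device"], payload["button"])
    return None

msg = encode_button_press("phone-01", "delete")
print(decode_interaction(msg))   # -> ('button_press', 'phone-01', 'delete')
```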
In one embodiment, for a particular device, a predefined interaction 194 with the interfacing device 120 results in a predefined display modification when the interaction criteria 198 are met. Referring to
In one case, ultrasound, visual, or infrared technologies may be used for tracking position. For determining the orientation of the device, the device could include an inbuilt accelerometer. Alternatively, the interfacing device 120 could include a magnet that could be detected by magnetometers incorporated into the display (or vice versa). Alternatively, the device could have visibly recognizable markings on its exterior or augmented reality (AR) codes that enable recovery of orientation from cameras located on the display. The interfacing device could also include a camera. Devices 120 that incorporate a camera could recover their position and orientation by recognizing IR beacons on the display screen or even fiducial patterns presented on the display.
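For the accelerometer option, a device's tilt relative to gravity can be recovered from a static accelerometer reading using the standard pitch and roll formulas, as sketched below. The axis convention and the assumption of a stationary device are illustrative simplifications, not part of the described system.

```python
import math

# Sketch: recover device tilt (pitch and roll, in degrees) from a static
# accelerometer reading, i.e. from the measured gravity vector.
def tilt_from_accelerometer(ax, ay, az):
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# Device lying flat and face up: gravity along +z only.
print(tilt_from_accelerometer(0.0, 0.0, 9.81))   # -> (0.0, 0.0)
```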
As previously stated, a predefined interaction with the interfacing device results in a predefined display modification 195 when the interaction criteria 198 are met. Although not limited to these examples, some examples of modifications to the display based on interactions by the interfacing device meeting the interaction criteria would be: the output of a menu on the display screen, the output of an overlaid image that augments or changes the functionality of the device behind the display screen, the appearance or removal of files from the display screen, etc.
Referring to
Because the output or content displayed on the display screen 112 is dependent upon the controlling interactions, the interfacing device in effect has an expanded display that is capable of providing the user expanded content to interact with. The expanded content is generated and controlled at least in part by whether interaction with the interfacing device meets the predefined interaction criteria 198. If the interaction or manipulation of the device 120 meets the predefined interaction criteria, the content being displayed on the display screen 112 (the expanded screen) is modified.
An example of one possible user interaction is described with respect to
In one embodiment, the image or content on the display screen 112 has a spatial relationship to the interfacing device 120 on the display screen 112. For example, in the embodiment shown in
In one embodiment, the content on the display screen 112 stays static and the interfacing device is moved behind the screen to select content.
In one embodiment, to select a particular photo, the interfacing device 120 should meet the interaction criteria 198 (e.g., be sensed within a predefined distance of the photo and with 50% overlap of the display screens). In one example, buttons on the device could be used to indicate the selection. In another example, the back surface 158 of the transparent screen is a touch sensitive surface, and selection of a particular item or photo can be made simply by touching the back of the display screen.
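The selection test in this example (proximity to the photo plus at least 50% overlap) could be checked roughly as in the sketch below. Representing screen regions as axis-aligned rectangles and the particular distance threshold are assumptions made only for illustration.

```python
# Sketch of the selection test described above: the tracked device display
# rectangle must overlap the photo's screen region by at least 50% and be
# within a predefined distance of the photo. Rectangles are (x, y, w, h).
def rect_overlap_area(a, b):
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    w = max(0.0, min(ax + aw, bx + bw) - max(ax, bx))
    h = max(0.0, min(ay + ah, by + bh) - max(ay, by))
    return w * h

def meets_selection_criteria(device_rect, photo_rect, distance_mm, max_distance_mm=100.0):
    device_area = device_rect[2] * device_rect[3]
    overlap_fraction = rect_overlap_area(device_rect, photo_rect) / device_area
    return distance_mm <= max_distance_mm and overlap_fraction >= 0.5

print(meets_selection_criteria((0, 0, 40, 60), (10, 10, 80, 80), distance_mm=50.0))  # -> True
```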
In one embodiment, the predefined interactions 194 by the interfacing device are coordinated so that the content displayed on the display 204 of the interfacing device 120 is coordinated with the content displayed on the display 112 of the display system 100. This coordination can be more easily seen and described, for example, with respect to
Referring to
Referring to
In this embodiment, interaction with the interfacing device includes placing the keyboard (the interfacing device) behind the display screen. When the keyboard is sensed (interaction criteria met), the display output is modified by adding an image of a reassignment label 410 that supports alternative key functions.
One possible example of an interfacing device with no display would be an electronic music player that stores and plays music. The music player could randomly reassign the order of the stored songs for playback. However, the user might find it desirable to provide a designated order in which the songs are played. In this case, placing the electronic music storage device behind the transparent screen (the interaction) would result in a menu popping up (the display modification). In one example, at least a subset of the available songs would be displayed by album cover on the transparent display screen. The user could select the order of the songs by interacting with the menu or, alternatively, by selecting songs using the electronic song storage device as a selection means.
Although in one embodiment, a menu could appear on the display screen that was planar with the display screen surface, in the embodiment shown in
In one embodiment, the user twists the ring structure 120 to control the position of the circular menu 510. When the user comes to a defined position on the circular menu, the user can select that item (for example, 520a). In one example, selection by the user of a particular item 520a-n results in the opening of a submenu. Based on the position of the ring, the circular menu 510 offers different alternative selections to the user.
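The mapping from the ring's twist angle to a highlighted item on the circular menu 510 can be pictured as dividing the circle into equal sectors, one per item, as in the sketch below. The item list and angle convention are hypothetical.

```python
# Sketch: map the ring's rotation angle to one of n evenly spaced items
# on the circular menu 510.
def menu_index_from_rotation(angle_degrees, item_count):
    """Return the index of the menu item selected at the given ring angle."""
    sector = 360.0 / item_count
    return int((angle_degrees % 360.0) // sector)

items = ["photos", "music", "contacts", "settings"]          # hypothetical items 520a-n
print(items[menu_index_from_rotation(200.0, len(items))])    # -> 'contacts'
```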
Some or all of the operations set forth in the method 600 may be contained as utilities, programs or subprograms, in any desired computer accessible medium. In addition, the method 600 may be embodied by computer programs, which may exist in a variety of forms, both active and inactive. For example, they may exist as software program(s) comprised of program instructions in source code, object code, executable code or other formats. Any of the above may be embodied on a computer readable medium, which includes storage devices and signals, in compressed or uncompressed form.
The computing apparatus 700 includes one or more processor(s) 702 that may implement or execute some or all of the steps described in the method 600. Commands and data from the processor 702 are communicated over a communication bus 704. The computing apparatus 700 also includes a main memory 706, such as a random access memory (RAM), where the program code for the processor 702 may be executed during runtime, and a secondary memory 708. The secondary memory 708 includes, for example, one or more hard drives 710 and/or a removable storage drive 712, representing a removable flash memory card, etc., where a copy of the program code for the method 600 may be stored. The removable storage drive 712 reads from and/or writes to a removable storage unit 714 in a well-known manner.
Exemplary computer readable storage devices that may be used to implement the present invention include but are not limited to conventional computer system RAM, ROM, EPROM, EEPROM and magnetic or optical disks or tapes. Concrete examples of the foregoing include distribution of the programs on a CD ROM or via Internet download. In a sense, the Internet itself is a computer readable medium. The same is true of computer networks in general. It is therefore to be understood that any interfacing device and/or system capable of executing the functions of the above-described embodiments is encompassed by the present invention.
Although shown stored on main memory 706, any of the memory components described (706, 708, 714) may also store an operating system 730, such as Mac OS, MS Windows, Unix, or Linux; network applications 732; and a display controller component 130. The operating system 730 may be multi-participant, multiprocessing, multitasking, multithreading, real-time and the like. The operating system 730 may also perform basic tasks such as recognizing input from input devices, such as a keyboard or a keypad; sending output to the display 720; controlling peripheral devices, such as disk drives, printers, and image capture devices; and managing traffic on the one or more buses 704. The network applications 732 include various components for establishing and maintaining network connections, such as software for implementing communication protocols including TCP/IP, HTTP, Ethernet, USB, and FireWire.
The computing apparatus 700 may also include input devices 716, such as a keyboard, a keypad, functional keys, etc., a pointing device, such as a tracking ball, cursors, etc., and a display(s) 720, such as the screen display 110 shown for example in
The processor(s) 702 may communicate over a network, for instance, a cellular network, the Internet, a LAN, etc., through one or more network interfaces 724, such as a Local Area Network (LAN), a wireless 802.11x LAN, a 3G mobile WAN or a WiMax WAN. In addition, an interface 726 may be used to receive an image or sequence of images from imaging components 728, such as the image capture device.
The foregoing description, for purposes of explanation, used specific nomenclature to provide a thorough understanding of the invention. However, it will be apparent to one skilled in the art that the specific details are not required in order to practice the invention. The foregoing descriptions of specific embodiments of the present invention are presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Obviously, many modifications and variations are possible in view of the above teachings. The embodiments are shown and described in order to best explain the principles of the invention and its practical applications, to thereby enable others skilled in the art to best utilize the invention and various embodiments with various modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the following claims and their equivalents:
This case is a continuation-in-part of the case entitled “An Augmented Reality Display System and Method of Display” filed on Oct. 22, 2010, having Serial Number PCT/US2010/053860, which is hereby incorporated by reference in its entirety.
Relation | Number | Date | Country
---|---|---|---
Parent | PCT/US2010/053860 | Oct 2010 | US
Child | 12915311 | | US