Many consumer electronics devices now include a touch screen disposed on one surface of the device. The touch screen acts as an output device that displays image, video, and/or graphical information, and acts as an input touch interface device for receiving touch control inputs from a user. A touch screen (or touch panel, or touch panel display) may detect the presence and location of a touch within the area of the display, where the touch may include a touching of the display with a body part (e.g., a finger) or with certain objects (e.g., a stylus). Touch screens typically enable the user to interact directly with what is being displayed, rather than indirectly with a cursor controlled by a mouse or touchpad. Touch screens have become widespread in various types of consumer electronic devices, including, for example, cellular radiotelephones, personal digital assistants (PDAs), and hand-held gaming devices. A factor limiting the usefulness of touch screens is the limited surface area that may actually be used. In particular, touch screens used with hand-held and/or mobile devices have very limited surface areas in which touch input may be received and output data may be displayed.
Virtual keyboards, or projected user interfaces (UIs), are recent innovations in device technology that attempt to increase the size of the UI relative to, for example, the small size of a touch screen. With virtual keyboards, or projected UIs, the device includes a projector that projects an image of the UI on a surface adjacent to the device, enabling a larger output display for use by the user.
In one exemplary embodiment, a method may include projecting a user interface (UI) in a projection area adjacent to a device to generate a projected UI, and identifying an occluding object in the projection area of the projected UI. The method may further include adapting the projected UI based on identification of the occluding object in the projection area, where adapting the projected UI comprises altering the projected UI to mask the occluding object or adapting a portion of graphics of the UI projected on or near the occluding object.
Additionally, altering the projected UI to mask the occluding object may include removing, from the user interface, graphics that would be projected onto the occluding object.
Additionally, adapting the projected UI may include projecting graphics associated with the projected UI onto the occluding object.
Additionally, adapting the projected UI may further include projecting information related to use of the projected UI onto the occluding object.
Additionally, projecting information related to use of the projected UI may include projecting information related to use of a tool palette of the projected UI onto the occluding object.
Additionally, the method may further include determining a projection mode associated with the projected UI, where determining a projection mode comprises one or more of: determining a context of use of the projected UI, determining user interaction with the projected UI or the device, or determining one or more gestures of the user in the projection area.
Additionally, the one or more gestures may include at least one of pointing a finger of a hand of the user, making a circular motion with a finger of the hand of the user, wagging a finger of the hand of the user, or clutching the hand of the user.
Additionally, adapting the projected UI may further be based on the determined projection mode associated with the projected UI.
Additionally, the device may include a hand-held electronic device.
In another exemplary embodiment, a device may include an image generation unit configured to generate an image of a user interface (UI), and a UI projector configured to project the image in a projection area adjacent to the device to generate a projected UI. The device may further include a camera configured to generate an image of the projection area, and an image processing unit configured to process the generated image to identify an occluding object in the projection area. The device may also include a UI control unit configured to adapt the projected UI based on identification of an occluding object in the projection area.
Additionally, the UI control unit, when adapting the projected UI, may be configured to alter the projected UI to mask the occluding object.
Additionally, the UI control unit, when adapting the projected UI, may be configured to adapt a portion of graphics of the projected UI on or near the occluding object.
Additionally, when adapting a portion of graphics of the projected UI, the UI control unit may be configured to control the image generation unit and UI projector to project graphics onto the occluding object.
Additionally, when adapting a portion of graphics of the projected UI, the UI control unit may be configured to control the image generation unit and UI projector to project information related to use of the UI onto the occluding object.
Additionally, the occluding object in the projection area may include a hand of a user of the device.
Additionally, the device may include one of a cellular radiotelephone, a satellite navigation device, a smart phone, a Personal Communications System (PCS) terminal, a personal digital assistant (PDA), a gaming device, a media player device, a tablet computer, or a digital camera.
Additionally, the device may include a hand-held electronic device.
Additionally, the UI control unit may be further configured to determine a projection mode associated with the projected UI based on a context of use of the projected UI, user interaction with the projected UI or the device, or one or more gestures of the user in the projection area.
Additionally, the one or more gestures may include at least one of pointing a finger of a hand of the user, making a circular motion with a finger of the hand of the user, wagging a finger of the hand of the user, or clutching the hand of the user.
Additionally, the UI control unit may be configured to adapt the projected UI further based on the determined projection mode associated with the projected UI.
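By way of a non-limiting illustration of the exemplary method and device summarized above, the projection, identification, and adaptation steps can be pictured as a simple software loop. The sketch below is only an assumed example: the function names, the darkness-threshold "segmentation," and the synthetic camera frame are placeholders introduced here for clarity, not features of any embodiment.

```python
import numpy as np

def project(ui_image):
    """Stand-in for handing a UI image to a projector."""
    print("projecting frame of shape", ui_image.shape)

def identify_occluding_object(camera_frame):
    """Return a boolean mask marking pixels judged to belong to an occluding
    object; a simple darkness threshold stands in for real segmentation."""
    return camera_frame < 80

def adapt_projected_ui(ui_image, occluder_mask):
    """Adapt the UI by masking (blanking) graphics that would fall on the object."""
    adapted = ui_image.copy()
    adapted[occluder_mask] = 0
    return adapted

# One iteration of the projection loop, using synthetic data.
ui = np.full((480, 640), 255, dtype=np.uint8)                  # plain white UI canvas
frame = np.random.randint(0, 256, ui.shape, dtype=np.uint8)    # synthetic camera frame
project(adapt_projected_ui(ui, identify_occluding_object(frame)))
```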
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate one or more embodiments described herein and, together with the description, explain these embodiments. In the drawings:
The following detailed description refers to the accompanying drawings. The same reference numbers in different drawings may identify the same or similar elements. Also, the following detailed description does not limit the invention.
As further shown in
When interacting with UI 110 projected on projection surface 115, a user's hand will occasionally occlude the projection. Sometimes this may be acceptable, such as when a hand accidentally passes through the projected area, but at other times it can be distracting. For example, if a user is interacting with projected UI 110, the UI image on the occluding hand can make it difficult to see the position, shape and gestures of the hand and how it relates to the underlying UI. Exemplary embodiments described herein enable the context of use of device 100 or UI 110, a user's hand gestures, and/or overt user UI interaction to trigger an appropriate adaptation of a part of a UI image projected on an occluding object that is placed within the projection area of projected UI 110.
Device 100 is depicted in
In a second example 300 shown in
In another example 400 shown in
Touch panel 640 may be integrated with, and/or overlaid on, a display to form a touch screen or a panel-enabled display that may function as a user input interface (i.e., a UI that can be used when the projected UI is turned off). For example, in one implementation, touch panel 640 may include a near field-sensitive (e.g., capacitive), acoustically-sensitive (e.g., surface acoustic wave), photo-sensitive (e.g., infrared), and/or any other type of touch panel that allows a display to be used as an input device. In another implementation, touch panel 640 may include multiple touch-sensitive technologies. Generally, touch panel 640 may include any kind of technology that provides the ability to identify the occurrence of a touch upon touch panel 640. The display associated with touch panel 640 may include a device that can display signals generated by device 100 as text or images on a screen (e.g., a liquid crystal display (LCD), cathode ray tube (CRT) display, organic light-emitting diode (OLED) display, surface-conduction electron-emitter display (SED), plasma display, field emission display (FED), bistable display, etc.). In certain implementations, the display may provide a high-resolution, active-matrix presentation suitable for the wide variety of applications and features associated with typical devices. The display may provide visual information to the user and serve, in conjunction with touch panel 640, as a user interface to detect user input when projected UI 110 is turned off (or may be used in conjunction with projected UI 110). In some embodiments, device 100 may include only projected UI 110 as a user input interface, and may not include touch panel 640.
Camera 125 may include a digital camera for capturing digital images of the projection area of projected UI 110. Image processing unit 700 may receive digital images from camera 125 and may apply image processing techniques to, for example, identify an occluding object in the projection area of projected UI 110. Image processing unit 700 may also apply image processing techniques to digital images from camera 125 to identify one or more gestures when the occluding object is a hand of a user of device 100. UI control unit 710 may receive data from image processing unit 700 and may control the generation of projected UI 110 by UI image generation unit 720 based on the data from image processing unit 700. UI control unit 710 may control the adaptation of portions of the graphics of projected UI 110 based on a selected projection mode. UI image generation unit 720 may generate an image of the UI to be projected by UI projector 105. The generated image may include all icons and other graphical elements that are to be displayed on projected UI 110. UI projector 105 may include optical mechanisms for projecting the UI image(s) generated by UI image generation unit 720 onto projection surface 115 to produce projected UI 110 with which the user of device 100 may interact.
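Purely as an illustrative software analogy (not a description of any actual implementation), the cooperation among camera 125, image processing unit 700, UI control unit 710, UI image generation unit 720, and UI projector 105 might be organized as in the following sketch; the class and method names, and the trivial threshold segmentation, are assumptions introduced for this example.

```python
import numpy as np

class ImageProcessingUnit:
    """Analogous to image processing unit 700: locates the occluding object."""
    def occluder_mask(self, frame):
        return frame < 80              # placeholder for real object segmentation

class UIImageGenerationUnit:
    """Analogous to UI image generation unit 720: renders the UI image."""
    def generate(self, shape=(480, 640)):
        return np.full(shape, 255, dtype=np.uint8)

class UIControlUnit:
    """Analogous to UI control unit 710: adapts the UI per the occluder mask."""
    def adapt(self, ui, occluder):
        adapted = ui.copy()
        adapted[occluder] = 0          # mask graphics under the occluding object
        return adapted

class UIProjector:
    """Analogous to UI projector 105: here, it only reports what it would project."""
    def project(self, image):
        print("projecting image;", int((image == 0).sum()), "pixels masked")

# Data flow: camera frame -> image processing -> UI control -> projector.
camera_frame = np.random.randint(0, 256, (480, 640), dtype=np.uint8)  # stand-in for camera 125
mask = ImageProcessingUnit().occluder_mask(camera_frame)
ui = UIImageGenerationUnit().generate()
UIProjector().project(UIControlUnit().adapt(ui, mask))
```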
The exemplary process may include determining a projection mode of projected UI 110 (block 810). The projection mode of projected UI 110 may be determined based on various factors, including, for example, a determined context of use of the projected UI, one or more gestures of the user in the projection area of projected UI 110, and/or explicit user interaction with the UI or with device 100. The projection mode of projected UI 110 may be determined by UI control unit 710.
User gesture(s) may be determined (block 920). The user of device 100 may perform certain hand gestures in the projection area of projected UI 110. Such gestures may include, for example, pointing with a finger of the user's hand, making a circular motion with a finger of the user's hand, wagging a finger of the user's hand, clutching the user's hand, etc. Other types of user gestures, however, may be used. The projection mode may be selected based on the context of use (i.e., determined in block 900), the user interaction with the UI or with device 100 (i.e., determined in block 910), and/or user gestures (i.e., determined in block 920) (block 930). The projection mode selected may include, for example, a “project normally” mode in which the UI is projected onto the occluding object, a “mask occluding object” mode in which the projected UI in the vicinity of the occluding object is masked, and/or an “adapt UI graphics” mode in which graphics on or near the occluding object are altered.
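As one hypothetical illustration of the selection described in blocks 900-930, a projection mode might be chosen from context of use, explicit user interaction, and a detected gesture roughly as sketched below; the particular gesture-to-mode mapping and the precedence order are assumptions made for this example only.

```python
from typing import Optional

MODES = ("project_normally", "mask_occluding_object", "adapt_ui_graphics")

def select_projection_mode(context: str,
                           user_choice: Optional[str],
                           gesture: Optional[str]) -> str:
    """Pick a projection mode from context of use, explicit user interaction,
    and a detected hand gesture (assumed precedence: explicit choice first)."""
    if user_choice in MODES:
        return user_choice                   # explicit interaction with the UI or device wins
    if gesture == "wag_finger":
        return "mask_occluding_object"       # e.g., wagging clears graphics from the hand
    if gesture in ("point_finger", "circle_finger"):
        return "adapt_ui_graphics"           # e.g., pointing invites helper info on the hand
    if context == "drawing_application":
        return "adapt_ui_graphics"           # e.g., project tool palette information
    return "project_normally"

print(select_projection_mode("drawing_application", None, "point_finger"))  # adapt_ui_graphics
print(select_projection_mode("web_browsing", None, None))                   # project_normally
```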
Returning to
The projection of projected UI 110 on the occluding object may be adapted based on the mode determined in block 810 (block 830). UI control unit 710 may control the adaptation of the projection of projected UI 110.
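A minimal sketch of how the adaptation of block 830 could be applied to the image handed to the projector follows; the pixel-level operations, the synthetic rectangular "hand" region, and the bright "tool palette hint" drawn onto the occluding object are illustrative assumptions, not part of any embodiment.

```python
import numpy as np

def adapt_for_projection(ui, occluder, mode):
    """Adapt the UI image in the region covered by the occluding object."""
    adapted = ui.copy()
    if mode == "mask_occluding_object":
        adapted[occluder] = 0                   # project black so no graphics land on the object
    elif mode == "adapt_ui_graphics":
        adapted[occluder] = 40                  # dim the region covered by the object ...
        rows, cols = np.nonzero(occluder)
        if rows.size:                           # ... and draw an assumed "tool palette hint" strip
            r, c = int(rows.mean()), int(cols.mean())
            adapted[max(r - 2, 0):r + 3, max(c - 20, 0):c + 21] = 255
    return adapted                              # "project_normally": image left unchanged

ui = np.full((480, 640), 200, dtype=np.uint8)   # mid-gray UI canvas
occluder = np.zeros(ui.shape, dtype=bool)
occluder[200:280, 300:420] = True               # synthetic hand-shaped occluding region
out = adapt_for_projection(ui, occluder, "adapt_ui_graphics")
print("hint pixels drawn:", int((out == 255).sum()))
```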
Returning to
Implementations described herein provide mechanisms for adapting portions of a projected UI on or near occluding objects in the projection area of the projected UI. The portions of the projected UI on or near the occluding objects may be adapted to suit the task or tasks being performed by the user on the projected UI.
The foregoing description of the embodiments described herein provides illustration and description, but is not intended to be exhaustive or to limit the invention to the precise form disclosed. Modifications and variations are possible in light of the above teachings or may be acquired from practice of the invention. For example, while a series of blocks has been described with respect to
Certain features described herein may be implemented as “logic” or as a “unit” that performs one or more functions. This logic or unit may include hardware (such as one or more processors, microprocessors, application-specific integrated circuits, or field-programmable gate arrays), software, or a combination of hardware and software.
The term “comprises” or “comprising” as used herein, including the claims, specifies the presence of stated features, integers, steps, or components, but does not preclude the presence or addition of one or more other features, integers, steps, components, or groups thereof.
No element, act, or instruction used in the description of the present application should be construed as critical or essential to the invention unless explicitly described as such. Also, as used herein, the article “a” is intended to include one or more items. Further, the phrase “based on,” as used herein is intended to mean “based, at least in part, on” unless explicitly stated otherwise.
Filing Document | Filing Date | Country | Kind | 371(c) Date
---|---|---|---|---
PCT/IB10/53730 | 8/18/2010 | WO | 00 | 9/26/2011