VIRTUAL USER INTERFACE METHOD AND SYSTEM THEREOF

Abstract
A virtual user interface (VUI) is provided. The VUI (120) can include a touchless sensing unit (110) for identifying and tracking at least one object in a touchless sensory field, a processor (130) communicatively coupled to the sensing unit for capturing a movement of the object within the touchless sensory field, and a driver (132) for converting the movement to a coordinate object (133). In one aspect, the VUI can implement an applications program interface (API) (134) for receiving the coordinate object and providing it to the VUI. An object movement within the sensory field of the VUI can activate user components in a user interface (UI) (150).
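By way of illustration only, the data path described above, from the sensing unit (110) through the processor (130) and driver (132) to the applications program interface (134), might be sketched in code as follows. This is a minimal, hypothetical Java sketch and not part of the disclosure; the class and method names (CoordinateObject, TouchlessDriver, handleMovement, onCoordinate) and the push-detection heuristic are assumptions introduced for illustration.

    // Hypothetical sketch: the driver (132) converts a tracked finger
    // movement into a coordinate object (133) and delivers it to the
    // UI (150) through an applications program interface (134).
    final class CoordinateObject {
        enum Action { POSITION, PUSH, RELEASE, HOLD, SLIDE }

        final double x, y, z;    // absolute location in the sensory field
        final double velocity;   // finger speed, in field units per second
        final Action action;     // touchless action inferred by the driver

        CoordinateObject(double x, double y, double z,
                         double velocity, Action action) {
            this.x = x; this.y = y; this.z = z;
            this.velocity = velocity; this.action = action;
        }
    }

    final class TouchlessDriver {
        // Callback standing in for the applications program interface (134).
        interface VuiApi { void onCoordinate(CoordinateObject c); }

        private final VuiApi api;

        TouchlessDriver(VuiApi api) { this.api = api; }

        // Invoked by the processor (130) for each tracked position update.
        void handleMovement(double x, double y, double z, double velocity) {
            // Assumed heuristic: a fast approach close to the sensing plane
            // (small z) is reported as a push; otherwise the update is a
            // plain positioning action.
            CoordinateObject.Action action = (z < 0.1 && velocity > 0.5)
                    ? CoordinateObject.Action.PUSH
                    : CoordinateObject.Action.POSITION;
            api.onCoordinate(new CoordinateObject(x, y, z, velocity, action));
        }
    }

In the same spirit, a production driver could expose the coordinate object through an existing tablet, touchpad, touchscreen, stylus, or mouse driver interface, as the claims below contemplate.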
Description

BRIEF DESCRIPTION OF THE DRAWINGS

The features of the present invention, which are believed to be novel, are set forth with particularity in the appended claims. The invention, together with further objects and advantages thereof, may best be understood by reference to the following description, taken in conjunction with the accompanying drawings, in the several figures of which like reference numerals identify like elements, and in which:



FIG. 1 illustrates a touchless virtual user interface (VUI) in accordance with an embodiment of the inventive arrangements;



FIG. 2 illustrates an exemplary VUI application in accordance with an embodiment of the inventive arrangements;



FIG. 3 depicts one embodiment of a VUI suitable for use with a mobile device in accordance with an embodiment of the inventive arrangements; and



FIG. 4 depicts one embodiment of a VUI suitable for use with a head set in accordance with an embodiment of the inventive arrangements.


Claims
  • 1. A virtual user interface (VUI) generated by a computing device having computer instructions executing from a computer-readable storage memory that maps virtual components in a touchless sensory field to user components in a user interface (UI) managed by the computing device, and translates touchless finger actions applied to the virtual components to actions on the user components.
  • 2. The virtual user interface of claim 1, wherein the VUI comprises: a touchless sensing unit that generates the touchless sensory field; a processor communicatively coupled to the touchless sensing unit that identifies and tracks a finger movement within the touchless sensory field; and a driver for converting the finger movement to a coordinate object that controls at least a portion of the UI, wherein the touchless sensing unit is at least one among an ultrasonic sensing device, an optical sensing unit, a camera system, a micro-electromechanical system (MEMS), a laser system, and an infrared unit.
  • 3. The virtual user interface of claim 2, wherein the coordinate object identifies at least one of a positioning action, a push action, a release action, a hold action, and a sliding action of a finger producing the finger movement in the touchless sensory field.
  • 4. The virtual user interface of claim 2, wherein the coordinate object includes at least one among an absolute location, a relative difference, a velocity, a length of time, and an acceleration of a finger producing the finger movement in the touchless sensory field.
  • 5. The virtual user interface of claim 1, further comprising an applications program interface (API) implemented by the computing device for controlling at least a portion of the UI in accordance with touchless finger movements in the VUI.
  • 6. The virtual user interface of claim 2, wherein the driver is at least one among a tablet driver, a touchpad driver, a touchscreen driver, a stylus driver, and a mouse driver.
  • 7. The virtual user interface of claim 2, wherein the UI generates at least one of a visual behavior or an audible behavior in response to the finger movement acquiring and controlling a virtual component in the VUI.
  • 8. A method, comprising a touchless sensing unit supplying a coordinate object to a computing device for controlling at least a portion of a user interface (UI) managed by the computing device, the coordinate object created by: detecting a touchless finger action applied to at least one virtual component in a touchless sensory field; and converting the finger action to the coordinate object for controlling at least a portion of the UI.
  • 9. The method of claim 8, wherein the coordinate object identifies at least one among an up movement, down movement, left movement, right movement, clockwise movement, and counterclockwise movement of a finger producing the finger action for controlling at least a portion of the UI.
  • 10. The method of claim 8, wherein the converting further includes translating a coordinate space of the touchless sensory field to a coordinate space of the UI, and wherein the coordinate object identifies at least one among an absolute location, a relative difference, a velocity, and an acceleration of a finger producing the finger action in the touchless sensory field.
  • 11. The method of claim 8, wherein the coordinate object identifies at least one of a touchless push action, a touchless release action, a touchless hold action, and a touchless slide action of a finger on at least one virtual component.
  • 12. The method of claim 8, wherein the UI generates at least one of a visual indicator or an audible indicator in response to a finger action applied to a virtual component in the touchless sensory field that corresponds to a user component in the UI.
  • 13. The method of claim 8, wherein the coordinate object is a web component comprising at least one among an HTML object, an XML object, a Java object, a C++ class object, a .NET object, and a Java Servlet.
  • 14. A communication device for presenting a touchless virtual user interface (VUI), the communication device having a controlling element that receives a coordinate object from a touchless sensing unit and controls at least a portion of a user interface (UI) using the coordinate object, wherein the controlling element controls at least one user component in the UI in accordance with touchless finger movements applied to at least one virtual component in the VUI.
  • 15. The communication device of claim 14, wherein the coordinate object identifies at least one among an absolute location, a relative difference, a velocity, a length of time, and an acceleration of a finger producing the touchless finger movements for controlling at least a portion of the UI.
  • 16. The communication device of claim 14, wherein the coordinate object identifies at least one among a positioning action, a push action, a release action, a hold action, and a sliding action for controlling at least a portion of the UI.
  • 17. The communication device of claim 14, wherein the coordinate object identifies at least one among an up movement, down movement, left movement, right movement, clockwise movement, and counterclockwise movement of a finger producing the touchless finger movements for controlling at least a portion of the UI.
  • 18. The communication device of claim 14, wherein the UI exposes at least one property option for adjusting a sensitivity of the VUI.
  • 19. The communication device of claim 14, wherein the controlling element generates at least one of a visual indicator or an audio indicator of a user component in the UI in response to a touchless finger action applied to a virtual component in the VUI.
  • 20. The communication device of claim 14, wherein the controlling element correlates a position of a finger in the VUI with at least one graphical component in a graphical user interface (GUI).
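As a reading aid for the coordinate-space translation recited in claim 10, the following hypothetical Java sketch linearly maps an absolute finger location in the touchless sensory field to a pixel location in the UI. The field bounds, the class name CoordinateMapper, and the clamping behavior are assumptions introduced for illustration, not limitations of the claims.

    // Hypothetical translation from sensory-field coordinates to UI pixels.
    // Assumes the field spans [0, FIELD_W] x [0, FIELD_H] in field units
    // and the UI spans uiWidth x uiHeight pixels.
    final class CoordinateMapper {
        private static final double FIELD_W = 100.0;  // assumed field width
        private static final double FIELD_H = 100.0;  // assumed field height

        private final int uiWidth;
        private final int uiHeight;

        CoordinateMapper(int uiWidth, int uiHeight) {
            this.uiWidth = uiWidth;
            this.uiHeight = uiHeight;
        }

        // Linearly scales an absolute field location to a UI pixel location,
        // clamping so that out-of-field readings never leave the UI.
        int[] toUi(double fieldX, double fieldY) {
            int px = (int) Math.round(fieldX / FIELD_W * (uiWidth - 1));
            int py = (int) Math.round(fieldY / FIELD_H * (uiHeight - 1));
            px = Math.max(0, Math.min(uiWidth - 1, px));
            py = Math.max(0, Math.min(uiHeight - 1, py));
            return new int[] { px, py };
        }
    }

A relative-difference mode, also recited in claim 10, could instead accumulate deltas between successive finger locations rather than scaling absolute ones, much as a conventional mouse driver does.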
Provisional Applications (1)
Number      Date      Country
60/781,179  Mar 2006  US