TOUCHLESS TABLET METHOD AND SYSTEM THEREOF

Abstract
A system (100) and method for a touchless tablet that produces a touchless sensory field over a form (111). The touchless tablet includes a touchless sensing unit (110) for identifying a finger action above the form, and a controller (130) communicatively coupled to the sensing unit for associating the finger action with at least one form component on the form. The touchless tablet identifies a selection of a form component (122) based on a location and action of the finger above the form. A display (140) connected to the touchless tablet can expose a graphical application, wherein a touchless selection of a form component corresponds to a selection of a graphical component on the graphical application.
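
By way of illustration only, the association the abstract describes between a sensed finger location and a form component reduces, at its simplest, to a point-in-region lookup over the form's registered components. The following minimal Python sketch assumes rectangular components registered in form coordinates; the class and function names are hypothetical and not drawn from the disclosure.

```python
from dataclasses import dataclass
from typing import Iterable, Optional

@dataclass
class FormComponent:
    """A rectangular region of the form, e.g. a checkbox or an entry field."""
    name: str
    x0: float
    y0: float
    x1: float
    y1: float

def component_under_finger(components: Iterable[FormComponent],
                           x: float, y: float) -> Optional[FormComponent]:
    """Map a sensed finger location, projected onto the form plane,
    to the form component beneath it, if any."""
    for c in components:
        if c.x0 <= x <= c.x1 and c.y0 <= y <= c.y1:
            return c
    return None
```
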
Description

BRIEF DESCRIPTION OF THE DRAWINGS

The features of the present invention, which are believed to be novel, are set forth with particularity in the appended claims. The invention, together with further objects and advantages thereof, may best be understood by reference to the following description, taken in conjunction with the accompanying drawings, in the several figures of which like reference numerals identify like elements, and in which:

FIG. 1 illustrates a first embodiment of a touchless tablet in accordance with the inventive arrangements;

FIG. 2 illustrates a second embodiment of a touchless tablet in accordance with the inventive arrangements;

FIG. 3 shows a form for use with the touchless tablet in accordance with the inventive arrangements;

FIG. 4 illustrates a first embodiment of a touchless screen in accordance with the inventive arrangements;

FIG. 5 illustrates a second embodiment of a touchless screen in accordance with the inventive arrangements; and

FIG. 6 illustrates a virtual touchscreen in accordance with the inventive arrangements.

Claims
  • 1. A touchless tablet comprising: a sensing unit for identifying a touchless finger action above a form; and a controller communicatively coupled to said sensing unit, wherein said controller associates said finger action with at least one form component on said form, wherein said touchless tablet identifies a selection of said form component based on a location and an action of a finger above said form.
  • 2. The touchless tablet of claim 1, further comprising a projector for presenting a visual layout of graphical components corresponding to said form components.
  • 3. The touchless tablet of claim 2, further comprising a camera that captures an image of said form, and said controller identifies form components in said image and creates graphical components on a display corresponding to said form components.
  • 4. The touchless tablet of claim 1, wherein said sensing unit further comprises: a detector, wherein said detector identifies a position of said finger in a sensory field above the form; and a timer cooperatively connected to said detector, wherein said timer determines a length of time said finger is at a position over a form component.
  • 5. The touchless tablet of claim 4, wherein said detector includes a microphone for capturing a sound of a finger tap on a surface presenting said form component, such that said detector affirms said position of said finger when said sound of said finger tap is detected.
  • 6. The touchless tablet of claim 1, wherein said touchless tablet is a portion of a frame positioned on at least one side of said form to allow a user an unobstructed view of said form, wherein said frame contains at least one sensory element for detecting a finger movement within said frame.
  • 7. The touchless tablet of claim 1, wherein said finger action includes at least one of a touchless depressing action, a touchless release action, a touchless hold action, and a touchless dragging action.
  • 8. The touchless tablet of claim 1, further comprising a display communicatively coupled to said touchless tablet for exposing a graphical application, wherein said selection of said form component within a sensory field above said form corresponds to a selection of a graphical component on said graphical application.
  • 9. The touchless tablet of claim 1, wherein said selection performs an action on said form component that produces a response from a graphical component in a graphical application, wherein said graphical component corresponds to said form component.
  • 10. The touchless tablet of claim 1, wherein said sensing unit comprises: an array of ultrasonic transducers positioned on at least one side of said form for producing a sensing field over said form and for determining a first and a second coordinate of said finger within said sensing field; at least one off-axis element cooperatively connected to said sensing unit for providing a third dimension of sensory measurement for determining a third coordinate of said finger within said sensing field; and a processor for determining a three-dimensional coordinate and action of said finger within said sensing field.
  • 11. A virtual screen comprising: a sensing unit containing an arrangement of sensing elements for generating a touchless sensory field; a processor communicatively coupled to said sensing unit for determining a finger location and a finger action within said touchless sensory field; and a controller communicatively coupled to said processor for controlling a graphical application according to said finger location and action.
  • 12. The virtual screen of claim 11, wherein said sensing unit comprises at least one among a camera element, an infrared element, an optical element, and an acoustic transducer.
  • 13. The virtual screen of claim 11, wherein said finger action includes at least one of a finger push action, a finger hold action, a finger release action, and a finger slide action.
  • 14. The virtual screen of claim 11, further comprising a display that receives commands for controlling said graphical application on the display.
  • 15. The virtual screen of claim 11, further comprising a communication unit for transmitting said finger action and location using at least one among a USB, a Bluetooth, and a ZigBee communication link.
  • 16. The virtual screen of claim 11, further comprising a communication device that receives said finger location and action and adjusts at least one user interface control of the communication device.
  • 17. A method of navigation in a virtual screen comprising: creating a three-dimensional (3D) sensory space; estimating a finger position within said 3D sensory space; determining one of a forward or retracting finger movement associated with said finger position; and controlling a user interface according to said finger movement.
  • 18. The method of claim 17, wherein said determining comprises: identifying a first depth of a finger during said forward or retracting finger movement within said 3D sensory space; identifying a second depth of said finger during said forward or retracting finger movement within said 3D sensory space; creating a vector from said first depth and said second depth; predicting a destination of said finger on a graphical portion of said user interface from said vector; and adjusting a region of focus within said graphical portion based on said destination.
  • 19. The method of claim 17, wherein a distant depth of said 3D sensory space corresponds to a broad region of focus and a close depth of said 3D sensory space corresponds to a narrow region of focus, such that said region of focus broadens when the user's finger is farther from a sensing unit producing said 3D sensory space, and narrows when the user's finger is closer to said sensing unit.
  • 20. The method of claim 18, further comprising presenting said destination and said region of focus on said graphical portion of said user interface to provide visual feedback.
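
Claims 4 and 5 recite a detector paired with a timer, optionally augmented by a microphone that affirms a selection from the sound of a finger tap. For illustration only, the dwell-and-tap logic can be sketched as follows; the class, its parameters, and the default dwell time are assumptions, not values from the disclosure.

```python
import time

class DwellDetector:
    """Affirms a touchless selection when the finger dwells over one form
    component for dwell_s seconds (claim 4), or immediately when a finger-tap
    sound is detected (claim 5). The caller supplies the component currently
    under the finger and the tap flag."""

    def __init__(self, dwell_s=0.8):
        self.dwell_s = dwell_s
        self._component = None   # component the finger is currently over
        self._since = 0.0        # when the finger arrived over it

    def update(self, component, tap_heard=False, now=None):
        """Return the selected component when a selection is affirmed, else None."""
        now = time.monotonic() if now is None else now
        if component is not self._component:   # finger moved to a new region
            self._component, self._since = component, now
        if component is None:
            return None
        if tap_heard or (now - self._since) >= self.dwell_s:
            self._since = now                  # re-arm for a repeat selection
            return component
        return None
```
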
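Claim 10 recites an array of ultrasonic transducers plus an off-axis element feeding a processor that resolves a three-dimensional finger coordinate. One standard way to realize such a processor is pulse-echo ranging followed by least-squares trilateration; the sketch below assumes at least four transducers at known positions and uses NumPy. It shows the generic technique, not the patented algorithm.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 C; calibrated in practice

def finger_position(sensor_positions, echo_times):
    """Estimate a 3D finger coordinate from pulse-echo round-trip times.

    sensor_positions: (N, 3) array of transducer coordinates, N >= 4.
    echo_times: (N,) round-trip times in seconds at each transducer.
    Linearizes the range equations |x - p_i| = d_i and solves by least squares.
    """
    p = np.asarray(sensor_positions, dtype=float)
    d = SPEED_OF_SOUND * np.asarray(echo_times, dtype=float) / 2.0  # one-way ranges
    # Subtracting the first equation from the rest cancels the |x|^2 term,
    # leaving a linear system A x = b in the unknown position x.
    A = 2.0 * (p[1:] - p[0])
    b = (d[0] ** 2 - d[1:] ** 2) + np.sum(p[1:] ** 2, axis=1) - np.sum(p[0] ** 2)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x
```

In such an arrangement, placing one element off-axis, as claim 10's third-dimension measurement suggests, is what keeps the system well conditioned in the coordinate normal to the form.
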
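Claims 17 through 19 describe estimating two successive finger depths, forming a vector, predicting a destination on the user interface, and broadening or narrowing the region of focus with distance. A minimal sketch of that geometry follows, assuming the display lies in the plane z = 0; the function names and pixel radii are illustrative defaults.

```python
import numpy as np

def predict_destination(pos_earlier, pos_later, plane_z=0.0):
    """Extrapolate the finger's motion vector to the display plane z = plane_z.

    pos_earlier, pos_later: successive 3D finger positions (claim 18's two
    depths). Returns the predicted (x, y) destination, or None while the
    finger is retracting or moving parallel to the plane.
    """
    p1, p2 = np.asarray(pos_earlier, float), np.asarray(pos_later, float)
    v = p2 - p1                 # claim 18's vector between the two depths
    if abs(v[2]) < 1e-9:
        return None             # no depth component to extrapolate
    t = (plane_z - p2[2]) / v[2]
    if t < 0:
        return None             # retracting: destination lies behind the finger
    hit = p2 + t * v
    return float(hit[0]), float(hit[1])

def focus_radius(depth, max_depth, broad_px=120.0, narrow_px=10.0):
    """Claim 19's depth-to-focus mapping: a broad region of focus when the
    finger is far from the sensing unit, narrowing as it approaches."""
    frac = float(np.clip(depth / max_depth, 0.0, 1.0))
    return narrow_px + frac * (broad_px - narrow_px)
```
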
Provisional Applications (1)

Number       Date      Country
60/781,177   Mar 2006  US