METHOD AND DEVICE FOR THREE-DIMENSIONAL SENSING

Abstract
An apparatus (100) and method are provided that identify and track a relative location and movement of an object in a three-dimensional space. The sensing unit includes a processor (122) for communicating coordinate information of the object within the three-dimensional space. The method includes emitting a pulse from a first transducer (101), estimating a time of flight from a reflection signal received by a second transducer (102), and determining a location and relative movement of the object from the time of flight measurements. The sensing unit can provide touchless control via touchless finger depression actions, finger slide actions, finger release actions, and finger hold actions.
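As a rough illustration of the time-of-flight principle the abstract describes (a sketch, not the patented implementation — the function name and the nominal speed of sound are assumptions), a pulse travels from the emitter to the object and back to a co-located receiver, so the one-way distance is half the acoustic path:

```python
SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at 20 °C (assumed)

def echo_distance(time_of_flight_s: float) -> float:
    """Round-trip echo: the pulse travels to the object and back,
    so the one-way distance is half the total acoustic path length."""
    return SPEED_OF_SOUND_M_S * time_of_flight_s / 2.0

# A 2 ms echo delay corresponds to an object roughly 0.343 m away.
distance_m = echo_distance(0.002)
```

With separated emitter and receiver, as in the claims, the measured delay instead traces an ellipsoid of possible object positions with the two transducers at its foci.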
Description

BRIEF DESCRIPTION OF THE DRAWINGS

The features of the present invention, which are believed to be novel, are set forth with particularity in the appended claims. The invention, together with further objects and advantages thereof, may best be understood by reference to the following description, taken in conjunction with the accompanying drawings, in the several figures of which like reference numerals identify like elements, and in which:

FIG. 1 illustrates a first configuration of a sensing unit in accordance with an embodiment of the inventive arrangements;

FIG. 2 illustrates a second configuration of the sensing unit in accordance with an embodiment of the inventive arrangements;

FIG. 3 illustrates a third configuration of the sensing unit in accordance with an embodiment of the inventive arrangements;

FIG. 4 depicts a sensing unit coupled to a display in accordance with an embodiment of the inventive arrangements;

FIG. 5 illustrates a schematic of the sensing unit in accordance with an embodiment of the inventive arrangements;

FIG. 6 depicts a sensing unit coupled to a keyboard in accordance with an embodiment of the inventive arrangements;

FIG. 7 depicts a sensing unit integrated within a mobile device in accordance with an embodiment of the inventive arrangements; and

FIG. 8 depicts a sensing unit coupled to a laptop in accordance with an embodiment of the inventive arrangements.

Claims
  • 1. An apparatus comprising: a touchless sensing unit, wherein said touchless sensing unit identifies and tracks a relative location and movement of an object in a three-dimensional space; and a controller communicatively coupled to said touchless sensing unit for communicating coordinate information of said object within said three-dimensional space.
  • 2. The apparatus of claim 1, wherein a member of said sensing unit has at least one sensing element, wherein a first sensing element, a second sensing element, and a third sensing element are spaced at approximately 120 degrees from one another with respect to a center location.
  • 3. The apparatus of claim 1, wherein a member of said sensing unit has at least one sensing element, wherein a first and a second member are spaced at approximately 90 degrees, the second and a third member are spaced at approximately 90 degrees, and the third and the first member are spaced at approximately 180 degrees from one another.
  • 4. The apparatus of claim 1, further comprising a center sensing element positioned at a center location that is approximately equidistant from at least one other sensing element.
  • 5. The apparatus of claim 2, wherein a sensing element is at least one of an acoustic microphone, an acoustic speaker, an ultrasonic receiver, an ultrasonic emitter, a combination acoustic microphone and speaker, an omni-directional ultrasonic transducer, a directional ultrasonic transducer, a micro-electromechanical (MEMS) microphone, an imaging element, an infra-red element, and a camera element.
  • 6. A method for identifying and tracking a relative location and movement of an object in a three-dimensional space comprising: emitting a pulse from a first transducer; estimating a time of flight between when said pulse was transmitted from said first transducer and when a reflection of said pulse off an object in said three-dimensional space was received by a plurality of transducers; estimating a differential time of flight between a first reflected signal and a second reflected signal; determining a location and relative displacement of said object from said time of flight measurement and said differential time of flight measurement; wherein the object is within a spherical radius of 12 inches from the first transducer and the plurality of transducers.
  • 7. The method of claim 6, further comprising: converting said location and relative movement to a coordinate object; and conveying said coordinate object to a program for controlling a user interface.
  • 8. The method of claim 7, further comprising wirelessly transmitting said coordinate object to a communication device.
  • 9. The method of claim 6, further comprising: tracking a finger within said three-dimensional space; detecting a finger action within said three-dimensional space; and performing a user interface command in response to said finger action.
  • 10. The method of claim 9, further comprising at least one of: determining a length of time a finger stays at a current location; and determining if a finger moves away and returns to the current location within a time frame.
  • 11. The method of claim 9, further comprising detecting a touchless finger depression action, a touchless finger slide action, a touchless finger release action, and a touchless finger hold action.
  • 12. The method of claim 6, further comprising: from a first transducer, emitting a first signal from a first direction at a first time; from at least one pair of transducers, detecting a first and second reflection of said signal off said finger from said first direction; calculating a first and second phase differential from said signal for each said pair; determining a location of said finger from time of flight (TOF) measurements calculated at said pair of transducers; and updating said location using said phase differentials from each said pair.
  • 13. The method of claim 12, further comprising: repeating the steps of emitting, detecting, and determining said TOF and said phase differentials for multiple directions at multiple times for generating a plurality of finger locations and relative displacements; correlating said plurality of finger locations for determining a finger position having a highest likelihood of producing said reflections from said multiple directions; and updating said finger locations using said relative displacements through a combinational weighting of said locations and said differentials.
  • 14. The method of claim 6, further comprising: calculating a first TOF and a first phase differential between a signal emitted at a first time and a reflected signal detected at a first sensor; calculating a second TOF and a second phase differential between said signal emitted at said first time and a reflected signal detected at a second sensor; and calculating a third TOF and a third phase differential between said signal emitted at said first time and a reflected signal detected at a third sensor.
  • 15. The method of claim 14, further comprising: calculating a first estimate of a location of said object from said first TOF; calculating a second estimate of a location of said object from said second TOF; calculating a third estimate of a location of said object from said third TOF; creating at least three complex surfaces as a function of said first, second, and third estimate of a location; identifying an intersection of said at least three complex surfaces corresponding to a coarse location of said object; and applying said first, second, and third phase differential to said location for updating said coarse location of said object to a fine location of said object.
  • 16. The method of claim 14, further comprising: estimating a coarse location of said object using an estimate of said object's location from said TOFs when said object is at a far distance from said sensing unit; updating said coarse location using a relative displacement of said object using said phase differences when said object is at a close distance from said sensing unit; and employing a combinational weighting of location and relative displacement as said object moves between said far distance and said close distance.
  • 17. The method of claim 16, wherein a phase differential corresponds to a relative displacement of said object and a TOF corresponds to a location of said object, such that a base coordinate of said object can be referenced at a location when said object is at a far distance from said sensing unit, and said base coordinate is updated using said relative displacement as said object moves closer to said sensing unit.
  • 18. The method of claim 17, wherein said base coordinate is updated at one of a faster or slower time interval in accordance with the proximity of the object to the sensing unit.
  • 19. A mobile device having a sensing unit comprising: an array of sensing elements aligned along said mobile device to identify and track touchless finger movements; at least one sensing element within a boundary of a display of the mobile device; and a controller communicatively coupled to said array and said at least one sensing element for conveying touchless controls to a user interface of said mobile device in response to said finger movements.
  • 20. The mobile device of claim 19, wherein said at least one sensing element emits a direct pulse to said array to estimate an angle of the display of said mobile device.
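Claims 14–17 outline a coarse-to-fine scheme: intersect the range surfaces implied by three TOFs to obtain a coarse location, then refine it with phase-differential displacements, weighting the two as the object nears the array. The sketch below simplifies the "complex surfaces" of claim 15 to spheres (co-located emit/receive per sensor) and uses an assumed planar sensor geometry; the function names, the linear blending rule, and the near/far thresholds are illustrative, not taken from the patent:

```python
import math

def trilaterate(d, i, j, r1, r2, r3):
    """Coarse location from three TOF-derived ranges, as the
    intersection of three spheres (a simplification of claim 15).

    Assumed sensor geometry for this sketch:
      sensor 1 at (0, 0, 0), sensor 2 at (d, 0, 0), sensor 3 at (i, j, 0).
    r1, r2, r3 are the ranges from each sensor to the object.
    Returns the solution with z >= 0, i.e. the object in front of the array.
    """
    x = (r1**2 - r2**2 + d**2) / (2 * d)
    y = (r1**2 - r3**2 + i**2 + j**2 - 2 * i * x) / (2 * j)
    z = math.sqrt(max(r1**2 - x**2 - y**2, 0.0))
    return (x, y, z)

def refine(coarse, displacement, range_to_array, near=0.05, far=0.30):
    """One plausible reading of claim 16's combinational weighting:
    trust the phase-derived relative displacement fully when the object
    is near the array, not at all when it is far, and blend linearly
    in between. The near/far distances (meters) are assumptions."""
    w = min(max((far - range_to_array) / (far - near), 0.0), 1.0)
    return tuple(c + w * dx for c, dx in zip(coarse, displacement))
```

For example, with sensors 10 cm apart (`d = 0.1`, third sensor at `(0.05, 0.1, 0)`), an object at `(0.05, 0.05, 0.1)` yields ranges `sqrt(0.015)`, `sqrt(0.015)`, and `sqrt(0.0125)` meters, and `trilaterate` recovers the object position; `refine` then nudges that coarse fix by the phase-derived displacement.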
Provisional Applications (1)
Number Date Country
60779868 Mar 2006 US