Claims
- 1. For use with a system that acquires at least one of (x,y,z) distance and brightness measurements from a target source using independent sensors, a method of improving distance measurements comprising the following steps:
(a) causing said sensors to acquire measurement data at a repetition rate higher than required for operation of said system; (b) combining output signals from said sensors to obtain statistical average output signals; wherein random noise in said system is reduced proportionally to the square root of the number of said output signals averaged, and accuracy of said distance measurements is enhanced.
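A minimal Python sketch of the oversample-and-average step in claim 1. The Gaussian noise model, the true distance, and the sample counts are illustrative assumptions, not taken from the specification; the point is only that averaging N readings shrinks random noise by roughly the square root of N:

```python
import random
import statistics

def sample_distance(true_d=5.0, noise=0.05):
    # one noisy sensor reading; Gaussian noise is an illustrative assumption
    return random.gauss(true_d, noise)

def averaged_distance(n):
    # oversample n readings per output (higher repetition rate), then average
    return statistics.fmean(sample_distance() for _ in range(n))

random.seed(42)
spread_single = statistics.stdev(sample_distance() for _ in range(2000))
spread_avg16 = statistics.stdev(averaged_distance(16) for _ in range(2000))
# averaging 16 samples shrinks the spread by roughly sqrt(16) = 4x
print(spread_single, spread_avg16)
```

Running this shows the averaged outputs clustering about four times more tightly than the raw readings, consistent with the square-root relation recited in the claim.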
- 2. The method of claim 1, wherein said sensors are formed in an array, and step (b) includes forming statistical running averages of outputs from chosen ones of said sensors based upon at least one of (i) neighboring proximity of said sensors, and (ii) successive proximity in time of sampled said outputs.
- 3. The method of claim 2, wherein step (b) excludes ones of said outputs if said ones do not meet at least one threshold associated with (i) distance measurement, (ii) brightness measurement, (iii) velocity measurement, and (iv) estimated shape contour of said target source.
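Claims 2 and 3 combine a running average with threshold-based exclusion of implausible outputs. A sketch under assumed thresholds (the jump limit and brightness floor are hypothetical values; the claims only require that some threshold on distance, brightness, velocity, or shape contour be applied):

```python
def running_average(samples, max_jump=0.5, min_brightness=10.0):
    # incrementally average (distance, brightness) readings, excluding any
    # reading that is too dim or that jumps too far from the running mean
    mean, count = None, 0
    for dist, bright in samples:
        if bright < min_brightness:
            continue          # fails the brightness threshold
        if mean is not None and abs(dist - mean) > max_jump:
            continue          # fails the distance-consistency threshold
        count += 1
        mean = dist if count == 1 else mean + (dist - mean) / count
    return mean

readings = [(5.0, 50), (5.1, 55), (9.9, 60), (4.9, 5), (5.05, 52)]
result = running_average(readings)
print(result)  # the 9.9 outlier and the dim (4.9, 5) reading are excluded
```

The incremental-mean update avoids storing the accepted samples, which matches the "running average" language of claim 2.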
- 4. The method of claim 3, wherein said sensors are formed in an array, and step (b) includes forming statistical running averages of outputs from chosen ones of said sensors based upon at least one of (i) neighboring proximity of said sensors, and (ii) successive proximity in time of sampled said outputs.
- 5. The method of claim 4, wherein step (b) excludes ones of said outputs if said ones do not meet at least one threshold associated with (i) distance measurement, (ii) brightness measurement, (iii) velocity measurement, and (iv) estimated shape contour of said target source.
- 6. The method of claim 2, wherein said system acquires data from said sensors at a nominal frame rate, and step (b) includes at least one of (i) averaging said outputs acquired from a same frame, and (ii) averaging said outputs acquired from at least two adjacent frames.
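The two averaging modes of claim 6 can be sketched as follows, treating a frame as a simple list of per-sensor distance values (a simplification; real frames would be 2-D arrays):

```python
def temporal_average(frames):
    # mode (ii): average each sensor's output across adjacent frames
    n = len(frames)
    return [sum(f[i] for f in frames) / n for i in range(len(frames[0]))]

def spatial_average(frame):
    # mode (i): average each sensor with its immediate neighbors
    # within the same frame (neighboring proximity, claim 2)
    out = []
    for i in range(len(frame)):
        window = frame[max(0, i - 1):i + 2]
        out.append(sum(window) / len(window))
    return out

frame_a = [5.0, 5.2, 4.8]
frame_b = [5.2, 5.0, 5.0]
t = temporal_average([frame_a, frame_b])
s = spatial_average(frame_a)
print(t, s)
```

Temporal averaging preserves spatial detail at the cost of motion blur across frames; spatial averaging trades lateral resolution for noise reduction within a single frame.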
- 7. The method of claim 1, wherein said system includes a reference target having at least one region of pre-calibrated reflectivity, and further including (c) correcting said statistical average output signals made at step (b) according to detecting signals from said reference target.
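One plausible reading of the correction in claim 7: the reference target's pre-calibrated region has a known value, so any discrepancy in its averaged reading is treated as a systematic offset and removed. The function name, the offset model, and all numbers below are hypothetical:

```python
def correct_with_reference(averaged, measured_ref, known_ref):
    # the reference patch has a pre-calibrated (known) value; the
    # difference from its measured value is taken as a systematic
    # offset and subtracted from every averaged output (assumed model)
    offset = measured_ref - known_ref
    return [d - offset for d in averaged]

result = correct_with_reference([5.03, 6.03], measured_ref=1.03, known_ref=1.00)
print(result)
```

A multiplicative gain correction would be equally consistent with the claim language; which model applies would depend on the system's dominant error source.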
- 8. For use with a system that acquires at least one of (x,y,z) distance and brightness measurements using energy transmitted from an emitter at a first location on a plane, said energy reflecting at least in part from a target source and being detected by independent sensors defining a sensor array on said plane but spaced apart from said first location, a method of improving distance measurements comprising the following steps:
(a) defining a spherical coordinate for each sensor in said array, and constructing a look-up table containing spherical coordinates for each said sensor; (b) defining a spatial coordinate of said emitter; (c) for each sensor <i,j>, calculating constants k_ij and h_ij as follows: k_ij = C_x^2 + C_y^2 + C_z^2, and h_ij = 2(p_ij + C_x cos(a_ij)sin(b_ij) + C_y sin(a_ij)sin(b_ij) + C_z cos(b_ij)), wherein sensor p has spherical coordinate (p_ij, −a_ij, −b_ij) and has Cartesian coordinate (p_x, p_y, p_z) = (p_ij cos(−a_ij)sin(−b_ij), p_ij sin(−a_ij)sin(−b_ij), p_ij cos(−b_ij)); (d) constructing a look-up table containing said calculated values of k_ij and h_ij; (e) identifying sensors <i,j> that actually detect energy reflected from said target source; (f) for each sensor <i,j> identified at step (e), calculating r_A according to r_A = ((2d − p_ij)^2 − k_ij)/(4d − h_ij), wherein in a spherical coordinate system point A is representable as (r_A, a_ij, b_ij), using values of k_ij and h_ij from step (d); (g) calculating roundtrip distance 2d from said target source to sensor <i,j>; and (h) calculating the actual coordinate of said target source detected at sensor <i,j> according to A_x = r_A cos(a_ij)sin(b_ij), A_y = r_A sin(a_ij)sin(b_ij), and A_z = r_A cos(b_ij).
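The per-sensor arithmetic of claim 8 can be sketched directly from its formulas. The function names and the numeric inputs below are hypothetical; the code only mechanizes the stated expressions for k_ij, h_ij, r_A, and the spherical-to-Cartesian conversion:

```python
import math

def precompute_constants(p, a, b, emitter):
    # steps (c)-(d): per-sensor constants k_ij and h_ij;
    # emitter = (Cx, Cy, Cz) is its spatial coordinate from step (b)
    cx, cy, cz = emitter
    k = cx**2 + cy**2 + cz**2
    h = 2 * (p + cx * math.cos(a) * math.sin(b)
               + cy * math.sin(a) * math.sin(b)
               + cz * math.cos(b))
    return k, h

def target_coordinate(two_d, p, a, b, k, h):
    # steps (f) and (h): r_A = ((2d - p_ij)^2 - k_ij) / (4d - h_ij),
    # then convert (r_A, a_ij, b_ij) from spherical to Cartesian
    r_a = ((two_d - p)**2 - k) / (2 * two_d - h)   # note 4d = 2 * (2d)
    return (r_a * math.cos(a) * math.sin(b),
            r_a * math.sin(a) * math.sin(b),
            r_a * math.cos(b))

k, h = precompute_constants(p=0.10, a=0.20, b=0.30, emitter=(0.05, 0.0, 0.0))
A = target_coordinate(two_d=4.0, p=0.10, a=0.20, b=0.30, k=k, h=h)
print(k, h, A)
```

Since k_ij and h_ij depend only on the fixed sensor and emitter geometry, precomputing them into a look-up table (step (d)) leaves just a couple of multiplications and a division per detection at run time.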
- 9. For use with a video imaging system that can encode Z values, a method of encoding said Z values as part of YIQ encoding, the method comprising the following steps:
(a) converting an RGB value for each sensor to an RGB matrix and converting said RGB matrix to a YIQ matrix; (b) partitioning said YIQ matrix into Y, I, and Q planes; (c) Fourier transforming the I and Q dimensions of said YIQ matrix; (d) allocating segments of said I and Q dimensions in a frequency domain to store Z-values, wherein said segments correspond to frequencies that, if eliminated from a reverse transformation, would not substantially alter color perception by a human viewer; (e) locating segments having at least one characteristic selected from the group consisting of (i) said segments are not used, and (ii) said segments fall below a predetermined threshold of visibility, said segments being sufficiently large to store Z-values for all sensors; (f) encoding Z(X,Y) coordinates of each sensor using said segments; (g) adjusting amplitude coefficients of said segments; (h) transforming I″Q″ from the frequency domain to the time domain, and appending Y thereto to create a YI″Q″ matrix; and (i) transforming from said YI″Q″ matrix to an R″G″B″ matrix.
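A reduced Python sketch of the embedding in claim 9, using the standard NTSC RGB-to-YIQ matrix and a 1-D discrete Fourier transform over a single 8-pixel I row (the claim operates on full 2-D planes; the choice of bins 3-5 as the "segments" is a hypothetical simplification of the perceptual criterion in steps (d)-(e)):

```python
import cmath

def rgb_to_yiq(r, g, b):
    # standard NTSC RGB -> YIQ conversion (step (a))
    return (0.299*r + 0.587*g + 0.114*b,
            0.596*r - 0.274*g - 0.322*b,
            0.211*r - 0.523*g + 0.312*b)

def dft(xs):
    n = len(xs)
    return [sum(x * cmath.exp(-2j * cmath.pi * k * m / n)
                for m, x in enumerate(xs)) for k in range(n)]

def idft(coeffs):
    n = len(coeffs)
    return [sum(c * cmath.exp(2j * cmath.pi * k * m / n)
                for k, c in enumerate(coeffs)).real / n for m in range(n)]

# One 8-pixel row of the I plane from a flat color patch (steps (b)-(c)).
i_row = [rgb_to_yiq(120, 60, 30)[1]] * 8
spectrum = dft(i_row)

# Steps (d)-(f): stash Z values in high-frequency bins; bin 3 is
# mirrored into bin 5 to keep the reconstructed row real-valued,
# and bin 4 is the (real) Nyquist bin.
z1, z2 = 0.5, 0.25
spectrum[3] = complex(z1, 0.0)
spectrum[5] = spectrum[3].conjugate()
spectrum[4] = complex(z2, 0.0)
i_row_embedded = idft(spectrum)   # step (h): back to the time domain
print(i_row_embedded[:3])
```

Transforming the embedded row back to the frequency domain returns the stored Z values from the chosen bins, which is the recovery path of claim 13.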
- 10. The method of claim 9, wherein, if said YIQ matrix comprises N-bit values per sensor, step (b) includes logically dissecting said matrix into three matrices, each having (N/3)-bit values per sensor.
- 11. The method of claim 9, wherein step (f) includes using Huffman encoding.
- 12. The method of claim 9, wherein step (g) includes using a just-noticeable-differences technique.
- 13. The method of claim 9, wherein recovery of said Z values is carried out according to the following steps:
(a) converting R′G′B′ back to YI′Q′; (b) converting I′Q′ to a frequency domain matrix; and (c) extracting Z values from said matrix resulting from step (b).
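The recovery side of claim 13 can be sketched on its own: transform the received I′ row back to the frequency domain and read the bins that were used as storage segments. The bin positions and the abbreviated forward pass below are hypothetical, chosen only to make the example self-checking:

```python
import cmath

def dft(xs):
    n = len(xs)
    return [sum(x * cmath.exp(-2j * cmath.pi * k * m / n)
                for m, x in enumerate(xs)) for k in range(n)]

def idft(coeffs):
    n = len(coeffs)
    return [sum(c * cmath.exp(2j * cmath.pi * k * m / n)
                for k, c in enumerate(coeffs)).real / n for m in range(n)]

# Abbreviated forward pass: two Z values embedded in bins 3-5 of an
# 8-sample I' row, bin 3 mirrored into bin 5 to keep the row real.
spectrum = [complex(0.0, 0.0)] * 8
spectrum[0] = complex(360.0, 0.0)     # DC term of a flat I row (assumed)
spectrum[3] = complex(0.5, 0.0)
spectrum[5] = spectrum[3].conjugate()
spectrum[4] = complex(0.25, 0.0)
i_prime = idft(spectrum)

# Steps (b)-(c) of claim 13: back to the frequency domain, read the bins.
recovered = dft(i_prime)
z_values = [recovered[3].real, recovered[4].real]
print(z_values)
```

Because the forward path only wrote real coefficients into conjugate-symmetric bins, the round trip returns the stored Z values to within floating-point error.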
- 14. A computer-readable storage medium wherein is located a computer program that causes a computer sub-system having at least a processor unit to control a system that acquires at least one of (x,y,z) distance and brightness measurements from a target source using independent sensors to enhance performance of said system by:
(a) causing said sensors to acquire measurement data at a repetition rate higher than required for operation of said system; (b) combining output signals from said sensors to obtain statistical average output signals; wherein random noise in said system is reduced proportionally to the square root of the number of said output signals averaged, and accuracy of said distance measurements is enhanced.
- 15. The storage medium of claim 14, wherein sensors in said system are formed in an array, and wherein execution of said computer program forms running averages of outputs from chosen ones of said sensors based upon neighboring proximity of said sensors.
- 16. The storage medium of claim 14, wherein sensors in said system are formed in an array, and wherein execution of said computer program excludes ones of said outputs if said ones do not meet at least one threshold associated with (i) distance measurement, (ii) brightness measurement, (iii) velocity measurement, and (iv) estimated shape contour of said target source.
- 17. The storage medium of claim 14, wherein said system acquires data from said sensors at a nominal frame rate, and execution of said computer program causes at least one of (i) averaging said outputs acquired from a same frame, and (ii) averaging said outputs acquired from at least two adjacent frames.
- 18. A computer-readable storage medium wherein is located a computer program that causes a computer sub-system having at least a processor unit to improve distance measurements in a system that acquires at least one of (x,y,z) distance and brightness measurements using energy transmitted from an emitter at a first location on a plane, said energy reflecting at least in part from a target source and being detected by independent sensors defining a sensor array on said plane but spaced apart from said first location, by carrying out the following steps:
(a) defining a spherical coordinate for each sensor in said array, and constructing a look-up table containing spherical coordinates for each said sensor; (b) defining a spatial coordinate of said emitter; (c) for each sensor <i,j>, calculating constants k_ij and h_ij as follows: k_ij = C_x^2 + C_y^2 + C_z^2, and h_ij = 2(p_ij + C_x cos(a_ij)sin(b_ij) + C_y sin(a_ij)sin(b_ij) + C_z cos(b_ij)), wherein sensor p has spherical coordinate (p_ij, −a_ij, −b_ij) and has Cartesian coordinate (p_x, p_y, p_z) = (p_ij cos(−a_ij)sin(−b_ij), p_ij sin(−a_ij)sin(−b_ij), p_ij cos(−b_ij)); (d) constructing a look-up table containing said calculated values of k_ij and h_ij; (e) identifying sensors <i,j> that actually detect energy reflected from said target source; (f) for each sensor <i,j> identified at step (e), calculating r_A according to r_A = ((2d − p_ij)^2 − k_ij)/(4d − h_ij), wherein in a spherical coordinate system point A is representable as (r_A, a_ij, b_ij), using values of k_ij and h_ij from step (d); (g) calculating roundtrip distance 2d from said target source to sensor <i,j>; and (h) calculating the actual coordinate of said target source detected at sensor <i,j> according to A_x = r_A cos(a_ij)sin(b_ij), A_y = r_A sin(a_ij)sin(b_ij), and A_z = r_A cos(b_ij).
- 19. A computer-readable storage medium wherein is located a computer program that causes a computer sub-system having a processor unit for use with a video imaging system that can encode Z values to encode Z values as part of YIQ encoding by:
(a) converting an RGB value for each sensor to an RGB matrix and converting said RGB matrix to a YIQ matrix; (b) partitioning said YIQ matrix into Y, I, and Q planes; (c) Fourier transforming the I and Q dimensions of said YIQ matrix; (d) allocating segments of said I and Q dimensions in a frequency domain to store Z-values, wherein said segments correspond to frequencies that, if eliminated from a reverse transformation, would not substantially alter color perception by a human viewer; (e) locating segments having at least one characteristic selected from the group consisting of (i) said segments are not used, and (ii) said segments fall below a predetermined threshold of visibility, said segments being sufficiently large to store Z-values for all sensors; (f) encoding Z(X,Y) coordinates of each sensor using said segments; (g) adjusting amplitude coefficients of said segments; (h) transforming I″Q″ from the frequency domain to the time domain, and appending Y thereto to create a YI″Q″ matrix; and (i) transforming from said YI″Q″ matrix to an R″G″B″ matrix.
- 20. The storage medium of claim 19, wherein, if said YIQ matrix comprises N-bit values per sensor, execution of said computer program logically dissects said matrix into three matrices, each having (N/3)-bit values per sensor.
RELATION TO PREVIOUSLY FILED APPLICATION
[0001] Priority is claimed from U.S. provisional patent application serial No. 60/157,659, filed on Oct. 5, 1999 and entitled “Software Algorithms and Applications for Direct 3D Sensing”, Abbas Rafii and Cyrus Bamji, applicants. The present application is a continuation-in-part of co-pending U.S. patent application Ser. No. 09/401,059, filed on Sep. 22, 1999 and entitled “CMOS-COMPATIBLE THREE-DIMENSIONAL IMAGE SENSOR IC”, Cyrus Bamji, applicant, and of co-pending U.S. patent application Ser. No. 09/502,499, filed on Feb. 11, 2000 and entitled “METHOD AND APPARATUS FOR ENTERING DATA USING A VIRTUAL INPUT DEVICE”, Abbas Rafii, Cyrus Bamji, and Nazim Kareemi, applicants. Each said co-pending application is assigned to the assignee herein.
Provisional Applications (1)

| Number | Date | Country |
| --- | --- | --- |
| 60157659 | Oct 1999 | US |
Divisions (3)

| | Number | Date | Country |
| --- | --- | --- | --- |
| Parent | 09684368 | Oct 2000 | US |
| Child | 10013069 | Dec 2001 | US |
| Parent | 09401059 | Sep 1999 | US |
| Child | 10013069 | Dec 2001 | US |
| Parent | 09502499 | Feb 2000 | US |
| Child | 10013069 | Dec 2001 | US |