Claims
- 1. An imaging device comprising:
a plurality of linear imaging arrays and image formation optics that provide fields of view corresponding to the plurality of linear imaging arrays; at least one illumination module that produces planar light illumination that substantially overlaps the fields of view corresponding to the plurality of linear imaging arrays; and image processing circuitry that analyzes pixel data values of a plurality of composite 2-D images, each derived from sequential image capture operations of a corresponding one of the linear imaging arrays, to derive velocity data that represents an estimated velocity of the imaging device with respect to at least one target object disposed in said fields of view.
- 2. The imaging device of claim 1, wherein said image processing circuitry produces a first image of portions of said target object(s), said first image having substantially constant aspect ratio, utilizing image transformation operations that are based upon the velocity data or camera control operations that are based upon the velocity data.
- 3. The imaging device of claim 2, further comprising a controller, operably coupled to said at least one illumination module, that controls at least one time period of illumination or at least one power level of illumination such that said first image has a substantially uniform white level.
- 4. The imaging device of claim 1, wherein said linear imaging arrays are spaced apart along the intended direction of object motion, and line rate of said linear imaging arrays is substantially constant over said sequential image capture operations.
- 5. The imaging device of claim 1, wherein said planar light illumination overfills said fields of view over a range of working distances of the imaging device during said sequential image capture operations.
- 6. The imaging device of claim 1, wherein said planar light illumination is produced from at least one source of coherent illumination.
- 7. The imaging device of claim 6, wherein said at least one source comprises at least one visible laser diode.
- 8. The imaging device of claim 7, wherein said at least one source comprises a plurality of visible laser diodes.
- 9. The imaging device of claim 1, wherein said planar light illumination is produced from at least one source of incoherent illumination.
- 10. The imaging device of claim 9, wherein said at least one source comprises at least one light emitting diode.
- 11. The imaging device of claim 10, wherein said at least one source comprises a plurality of light emitting diodes.
- 12. The imaging device of claim 1, wherein said planar light illumination is produced by at least one planar light illumination module comprising an illumination source, at least one focusing lens element, and at least one cylindrical lens element integrated into a modular housing.
- 13. The imaging device of claim 12, wherein said illumination source comprises at least one source of coherent illumination.
- 14. The imaging device of claim 13, wherein said at least one source comprises at least one visible laser diode.
- 15. The imaging device of claim 14, wherein said at least one source comprises a plurality of visible laser diodes.
- 16. The imaging device of claim 12, wherein said illumination source comprises at least one source of incoherent illumination.
- 17. The imaging device of claim 16, wherein said at least one source comprises at least one light emitting diode.
- 18. The imaging device of claim 17, wherein said at least one source comprises a plurality of light emitting diodes.
- 19. The imaging device of claim 12, wherein said planar light illumination is produced by planar light illumination arrays disposed on opposite sides of said plurality of linear imaging arrays, each planar light illumination array comprising at least one planar light illumination module.
- 20. The imaging device of claim 12, wherein said planar light illumination is produced by multiple planar light illumination modules that are spaced apart and oriented on an optical bench in a manner that produces a composite beam of planar light illumination with substantial uniform intensity distribution over a range of working distances of the imaging device.
- 21. The imaging device of claim 1, wherein said image processing circuitry derives said velocity data based on spatial offset of corresponding features in said plurality of composite 2-D images.
- 22. The imaging device of claim 21, wherein features in a given composite 2-D image are derived from the number of edges in each row of the given composite 2-D image.
- 23. The imaging device of claim 22, wherein features in a given composite 2-D image are derived from local extrema in the derivative of edge count over the rows of the given composite 2-D image.
- 24. The imaging device of claim 23, wherein said local extrema are selected from the group consisting of: local maximum edge count values, local minimum edge count values, rising points of inflection and falling points of inflection.
- 25. The imaging device of claim 23, wherein each feature comprises a row identifier for a given local extremum.
- 26. The imaging device of claim 21, wherein features in a given composite 2-D image are derived from statistical analysis of the pixel data values in each row of the given composite 2-D image.
- 27. The imaging device of claim 26, wherein said statistical analysis identifies the largest bin of the pixel data values in each row of the given composite 2-D image.
- 28. The imaging device of claim 26, wherein a largest bin identifier for each row provides a characteristic value for the corresponding row, and features in the given composite 2-D image are derived from the characteristic values over rows of the given composite 2-D image.
- 29. The imaging device of claim 28, wherein each feature is a row identifier for a row where the characteristic values cross an average characteristic value.
- 30. The imaging device of claim 21, wherein said features in a given composite 2-D image comprise a set of row identifiers in the given composite 2-D image, and wherein spatial offset of corresponding features is derived from row offset between corresponding features.
- 31. The imaging device of claim 21, wherein corresponding features are matched by identifying a list of potential matching assignments, wherein each potential matching assignment satisfies a set of predetermined constraints.
- 32. The imaging device of claim 31, wherein the constraints belonging to said set of predetermined constraints are selected from the group consisting of: corresponding features must be of the same type; row offset between corresponding features must have a predetermined sign; row offset between corresponding features must be less than a predetermined tolerance value; and difference between characteristic values of corresponding features must be less than a maximum tolerance value.
- 33. The imaging device of claim 31, wherein, in the event that multiple potential matching assignments can be made, said image processing circuitry identifies a best matching assignment that minimizes the difference between the row offset for a matching feature pair and the average row offset.
- 34. The imaging device of claim 31, wherein said image processing circuitry builds a list of vertices corresponding to potential matching feature pairs, defines a cost function for edges between vertices, and identifies the shortest path that traverses the vertices and edges over the list of vertices.
- 35. The imaging device of claim 34, wherein said cost function is based upon a parameter selected from the group consisting of: spatial offset between vertices; a similarity metric for each vertex (such as difference in characteristic values from which the features of the vertex are derived); and weights assigned to different portions of the function.
- 36. The imaging device of claim 1, wherein said image processing circuitry derives said velocity data by correlation of pixel data values for the rows of the plurality of composite 2-D images.
- 37. The imaging device of claim 36, wherein velocity data for a given row in one composite 2-D image is based on correlation of pixel data values for the given row with pixel data values of a plurality of rows in another composite 2-D image.
- 38. The imaging device of claim 1, wherein said velocity data is compensated for variations in height of said target object(s).
- 39. The imaging device of claim 38, wherein compensation of said velocity data is based on an estimate of a tilt angle of the target object(s) over the rows of a composite 2-D image, wherein said tilt angle is derived from one or more height measurements.
- 40. The imaging device of claim 39, wherein said one or more height measurements utilize a geometry-based triangulation-type range finding technique.
- 41. The imaging device of claim 40, wherein said one or more height measurements are derived from: i) a laser light source oriented at a predetermined angle with respect to the field of view of an imaging sensor; and ii) offset of the laser light in pixel space of the imaging sensor.
- 42. The imaging device of claim 41, wherein said laser light source comprises a VLD and said imaging sensor comprises a linear imaging sensor disposed orthogonal to said plurality of linear imaging arrays.
- 43. The imaging device of claim 39, wherein said one or more height measurements utilize a time-of-flight-type range finding technique.
- 44. The imaging device of claim 43, wherein said one or more height measurements are derived from phase difference between a modulated EM beam and a phase reference signal.
- 45. The imaging device of claim 2, further comprising:
another linear imaging array that cooperates with said image formation optics to provide a field of view corresponding to said another linear imaging array; wherein said at least one illumination module produces planar light illumination that substantially overlaps the field of view corresponding to said another linear imaging array; and a controller that calculates and updates a variable line rate of said another linear imaging array based upon said velocity data such that said another linear imaging array produces said first image having substantially constant aspect ratio.
- 46. The imaging device of claim 2, wherein said image processing circuitry transforms a select one of said plurality of composite 2-D images utilizing local compression, expansion, and copy operations based upon said velocity data to produce said first image having substantially constant aspect ratio.
- 47. The imaging device of claim 2, wherein said first image is subject to an image sharpening routine to produce a resultant image having substantially constant aspect ratio that is stored in memory for subsequent processing.
- 48. The imaging device of claim 47, wherein said image processing circuitry performs image-based bar code detection operations that analyze the resultant image to read bar code labels therein.
- 49. The imaging device of claim 48, further comprising data communication circuitry that communicates bar code data produced by said bar code detection operations to an external host system over a communication link therebetween.
- 50. The imaging device of claim 2, wherein said image processing circuitry performs image-based bar code detection operations that analyze said first image to read bar code labels therein.
- 51. The imaging device of claim 50, wherein said image-based bar code detection operations also analyze a select one of the composite 2-D images to read bar code labels therein.
- 52. The imaging device of claim 50, further comprising data communication circuitry that communicates bar code data produced by said bar code detection operations to an external host system over a communication link therebetween.
- 53. The imaging device of claim 2, wherein said first image (or an image derived by subjecting said first image to an image sharpening routine) is output to a display device for display.
- 54. The imaging device of claim 2, wherein said first image (or an image derived by subjecting said first image to an image sharpening routine) is output to an external host system over a communication link therebetween.
- 55. The imaging device of claim 54, wherein said external host system performs optical character recognition on the received image.
- 56. The imaging device of claim 1, wherein said velocity data is provided to a controller that updates the operating parameters of an illumination source based upon said velocity data to compensate for changes in the relative object velocity as estimated by said velocity data.
- 57. The imaging device of claim 56, wherein said operating parameters are selected from the group consisting of illumination power parameters and duty cycle parameters.
- 58. The imaging device of claim 1, wherein said velocity data is provided to a controller that updates the operating parameters of a camera subsystem based upon said velocity data to compensate for changes in the relative object velocity as estimated by said velocity data.
- 59. The imaging device of claim 58, wherein said operating parameters are selected from the group consisting of: orientation parameters, focus parameters, zoom parameters, and exposure parameters.
- 60. The imaging device of claim 1, wherein said plurality of linear imaging arrays, image formation optics, at least one illumination module, and image processing circuitry are embodied within a hand-holdable housing that is moved by hand movement past a target object, to thereby realize a handheld imaging device.
- 61. The imaging device of claim 1, wherein said plurality of linear imaging arrays, image formation optics, at least one illumination module, and image processing circuitry are embodied within a housing that is stationary while a user moves a target object past the stationary housing, to thereby realize a presentation imaging device.
- 62. The imaging device of claim 1, wherein said plurality of linear imaging arrays, image formation optics, at least one illumination module, and image processing circuitry are embodied within at least one housing that is stationary and positioned over a moving package, to thereby realize an industrial imaging device.
- 63. The imaging device of claim 1, wherein said at least one illumination module includes at least one source of coherent illumination, said imaging device further comprising at least one despeckling mechanism for reducing speckle noise in the images captured by the plurality of linear imaging arrays.
- 64. The imaging device of claim 63, wherein said despeckling mechanism is provided by elongated imaging elements in the plurality of linear imaging arrays.
- 65. The imaging device of claim 63, wherein said despeckling mechanism is provided by image formation optics having the lowest possible F/# that does not increase aberrations to the point of blurring the optical signal received thereby.
- 66. The imaging device of claim 63, wherein said despeckling mechanism is provided by a plurality of visible laser diodes that contribute to said substantially planar light illumination.
- 67. The imaging device of claim 63, wherein said despeckling mechanism is provided by illumination control circuitry that modulates the power level of illumination produced by said at least one source of coherent illumination during each photo-integration time period of the linear imaging arrays.
- 68. The imaging device of claim 2, said image processing circuitry carrying out image-based horizontal jitter estimation and compensation operations, which estimate the horizontal jitter of the imaging device relative to the target object over the image capture operations from which the first image is derived and transform the first image utilizing shift operations that are based upon such estimated horizontal jitter to produce a second image of portions of the target object(s) which compensates for horizontal jitter distortion that would otherwise result therefrom.
- 69. The imaging device of claim 45, wherein said image processing circuitry analyzes pixel data values derived from output of said plurality of linear imaging arrays to derive jitter data that estimates motion along a direction transverse to the intended direction of motion; and image buffer circuitry, operably coupled to said another linear imaging array and said image processing circuitry, for transforming a row of pixel data values derived from output of said another linear image array utilizing shift operations that are based upon said jitter data, to produce a second image having substantially constant aspect ratio, and thereby compensating for motion along the direction transverse to the intended direction of motion.
- 70. The imaging device of claim 69, wherein said image processing circuitry performs correlation operations of pixel data values for row pairs derived from the output of two different linear imaging arrays to derive said jitter data.
- 71. The imaging device of claim 2, wherein said image processing circuitry performs image transformation operations on the plurality of composite 2-D images based upon the velocity data to derive a corresponding plurality of reconstructed images with substantially constant aspect ratio, analyzes said plurality of reconstructed images to derive jitter data that estimates motion along a direction transverse to the intended direction of motion, and transforms each given row of pixel data values of a select one of the reconstructed images utilizing logical shift operations that are based upon said jitter data to produce a second image with substantially constant aspect ratio, and thereby compensate for motion along the direction transverse to the intended direction of motion.
- 72. The imaging device of claim 71, wherein said jitter data is derived from correlation operations of pixel data values for matching row pairs in the plurality of reconstructed images.
- 73. The imaging device of claim 71, wherein said jitter data is derived from analysis of a difference image, which comprises a pixel by pixel difference between two reconstructed images.
- 74. The imaging device of claim 68, wherein said second image is subject to an image sharpening routine to produce a resultant image having substantially constant aspect ratio that is stored in memory for subsequent processing.
- 75. The imaging device of claim 74, wherein said image processing circuitry performs image-based bar code detection operations that analyze the resultant image to read bar code labels therein.
- 76. The imaging device of claim 75, further comprising data communication circuitry that communicates bar code data produced by said bar code detection operations to an external host system over a communication link therebetween.
- 77. The imaging device of claim 68, wherein said second image (or an image derived by subjecting said second image to an image sharpening routine) is output to a display device for display.
- 78. The imaging device of claim 68, wherein said second image (or an image derived by subjecting said second image to an image sharpening routine) is output to an external host system over a communication link therebetween.
- 79. The imaging device of claim 78, wherein said external host system performs optical character recognition on the received image.
- 80. The imaging device of claim 1, wherein said image processing circuitry comprises a programmable image processing computer.
- 81. The imaging device of claim 80, wherein said programmable image processing computer comprises at least one digital signal processing engine and associated memory.
- 82. The imaging device of claim 1, wherein said image processing circuitry comprises a programmable image processing computer and dedicated circuitry.
- 83. The imaging device of claim 82, wherein said dedicated circuitry is embodied in one of: at least one FPGA, at least one CPLD and at least one ASIC.
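The sketches below illustrate, under stated assumptions, one way the velocity-estimation, range-finding, and compensation operations recited in the claims above could be realized. They are minimal Python/NumPy sketches rather than implementations of the claimed device; every function name, threshold, and parameter value is illustrative and does not appear in the claims.

For claims 22-25, a minimal sketch of per-row edge counting and detection of local maxima and minima of the edge-count profile (the inflection-point feature types of claim 24 are omitted for brevity, and the edge threshold is an assumed value):

```python
import numpy as np

def edge_count_per_row(image, threshold=16):
    """Count edges in each row: adjacent-pixel differences exceeding `threshold`."""
    diffs = np.abs(np.diff(image.astype(np.int32), axis=1))
    return (diffs > threshold).sum(axis=1)

def extremum_features(edge_counts):
    """Return (row_identifier, feature_type) pairs at local maxima and minima
    of the edge-count profile (two of the feature types listed in claim 24)."""
    features = []
    for r in range(1, len(edge_counts) - 1):
        if edge_counts[r] > edge_counts[r - 1] and edge_counts[r] >= edge_counts[r + 1]:
            features.append((r, "local_max"))
        elif edge_counts[r] < edge_counts[r - 1] and edge_counts[r] <= edge_counts[r + 1]:
            features.append((r, "local_min"))
    return features

# Example usage on synthetic 8-bit data.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    composite = rng.integers(0, 256, size=(64, 128), dtype=np.uint8)
    print(extremum_features(edge_count_per_row(composite))[:5])
```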
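For claims 26-29, a sketch of the largest-bin statistical analysis: each row's characteristic value is the index of its most populated histogram bin, and features are the rows where that value crosses its average. The bin count is an assumed value:

```python
import numpy as np

def largest_bin_per_row(image, n_bins=16):
    """Characteristic value per row: index of the most populated histogram bin
    of that row's 8-bit pixel values."""
    bin_width = 256 // n_bins
    binned = np.minimum(image // bin_width, n_bins - 1)
    return np.array([np.bincount(row, minlength=n_bins).argmax() for row in binned])

def average_crossing_features(characteristic):
    """Row identifiers where the characteristic value crosses its average (claim 29)."""
    above = characteristic > characteristic.mean()
    return (np.nonzero(above[1:] != above[:-1])[0] + 1).tolist()
```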
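For claims 31-33, a sketch of constraint-based matching of features between two composite 2-D images, with ambiguities resolved by keeping the candidate whose row offset is closest to the average row offset. The tolerance values are assumptions:

```python
def candidate_matches(features_a, features_b, max_row_offset=40, max_value_diff=3):
    """Features are (row, type, value) tuples; returns (feature_a, feature_b)
    pairs that satisfy every constraint in the list (claim 32)."""
    pairs = []
    for fa in features_a:
        for fb in features_b:
            offset = fb[0] - fa[0]
            if (fa[1] == fb[1]                        # same feature type
                    and offset >= 0                   # offset has the expected sign
                    and offset <= max_row_offset      # offset within tolerance
                    and abs(fa[2] - fb[2]) <= max_value_diff):  # similar values
                pairs.append((fa, fb))
    return pairs

def resolve_ambiguities(pairs):
    """Where a feature has several candidates, keep the pair whose row offset
    is closest to the average row offset over all candidate pairs (claim 33)."""
    if not pairs:
        return []
    mean_offset = sum(fb[0] - fa[0] for fa, fb in pairs) / len(pairs)
    best = {}
    for fa, fb in pairs:
        cost = abs((fb[0] - fa[0]) - mean_offset)
        if fa not in best or cost < best[fa][1]:
            best[fa] = ((fa, fb), cost)
    return [pair for pair, _ in best.values()]
```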
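For claims 34-35, one possible reading of the shortest-path formulation, sketched as a minimum-cost alignment over ordered feature rows: each candidate match is a vertex, edge costs depend on the spatial (row) offset, and skipping a feature incurs a gap cost. The expected offset, weight, and gap cost are assumptions:

```python
def shortest_path_matching(rows_a, rows_b, expected_offset=0.0,
                           offset_weight=1.0, gap_cost=5.0):
    """rows_a, rows_b: sorted row identifiers of features in two composite
    2-D images. Returns matched (row_a, row_b) pairs on the minimum-cost path."""
    n, m = len(rows_a), len(rows_b)
    INF = float("inf")
    dist = [[INF] * (m + 1) for _ in range(n + 1)]   # cheapest cost to each vertex
    step = [[None] * (m + 1) for _ in range(n + 1)]  # edge taken to reach it
    dist[0][0] = 0.0
    for i in range(n + 1):
        for j in range(m + 1):
            if dist[i][j] == INF:
                continue
            if i < n and j < m:                      # "match" edge
                cost = offset_weight * abs((rows_b[j] - rows_a[i]) - expected_offset)
                if dist[i][j] + cost < dist[i + 1][j + 1]:
                    dist[i + 1][j + 1] = dist[i][j] + cost
                    step[i + 1][j + 1] = "match"
            if i < n and dist[i][j] + gap_cost < dist[i + 1][j]:   # skip a feature of A
                dist[i + 1][j] = dist[i][j] + gap_cost
                step[i + 1][j] = "skip_a"
            if j < m and dist[i][j] + gap_cost < dist[i][j + 1]:   # skip a feature of B
                dist[i][j + 1] = dist[i][j] + gap_cost
                step[i][j + 1] = "skip_b"
    # Walk back along the shortest path, collecting the matched pairs.
    pairs, i, j = [], n, m
    while i > 0 or j > 0:
        move = step[i][j]
        if move == "match":
            i, j = i - 1, j - 1
            pairs.append((rows_a[i], rows_b[j]))
        elif move == "skip_a":
            i -= 1
        else:
            j -= 1
    return list(reversed(pairs))
```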
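For claims 36-37, a sketch of row-correlation velocity estimation: each row of one composite 2-D image is correlated against a window of rows in the other, and the row offset of the best match, together with an assumed array spacing and line period, yields a per-row velocity estimate:

```python
import numpy as np

def best_matching_row(row, other_image, center, search=8):
    """Row index in `other_image`, within +/- `search` of `center`, whose
    normalized correlation with `row` is highest."""
    lo = max(0, center - search)
    hi = min(other_image.shape[0], center + search + 1)
    norm_row = (row - row.mean()) / (row.std() + 1e-9)
    best_idx, best_score = center, -np.inf
    for i in range(lo, hi):
        cand = other_image[i].astype(np.float64)
        cand = (cand - cand.mean()) / (cand.std() + 1e-9)
        score = float(np.dot(norm_row, cand))
        if score > best_score:
            best_idx, best_score = i, score
    return best_idx

def velocity_per_row(image_a, image_b, array_spacing_mm=1.0, line_period_s=1e-3):
    """Velocity per row: spacing between the two linear arrays divided by the
    time offset implied by the matching-row offset (clamped to one line period)."""
    velocities = []
    for r in range(image_a.shape[0]):
        offset = best_matching_row(image_a[r].astype(np.float64), image_b, r) - r
        dt = max(abs(offset), 1) * line_period_s
        velocities.append(array_spacing_mm / dt)
    return np.array(velocities)
```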
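For claims 39-44, sketches of the two range-finding techniques and the tilt-angle compensation of the velocity data. The optical geometry constants and modulation frequency are assumptions:

```python
import math

SPEED_OF_LIGHT_M_S = 299_792_458.0

def triangulated_height_mm(pixel_offset, focal_length_mm=8.0, pixel_pitch_mm=0.007,
                           baseline_mm=50.0, laser_angle_deg=30.0):
    """Range from the offset of the laser spot in pixel space of the imaging
    sensor (claims 40-41): intersect the laser plane with the viewing ray."""
    view_angle = math.atan2(pixel_offset * pixel_pitch_mm, focal_length_mm)
    return baseline_mm / (math.tan(math.radians(laser_angle_deg)) + math.tan(view_angle))

def tof_range_m(phase_difference_rad, modulation_freq_hz=10e6):
    """Range from the phase difference between a modulated EM beam and a phase
    reference signal (claim 44): d = c * delta_phi / (4 * pi * f_mod)."""
    return SPEED_OF_LIGHT_M_S * phase_difference_rad / (4 * math.pi * modulation_freq_hz)

def tilt_compensated_velocity(velocity, height_first_mm, height_last_mm, scan_length_mm):
    """Scale the velocity estimate by the cosine of the tilt angle implied by
    two height measurements taken over the scanned length (claim 39)."""
    tilt = math.atan2(height_last_mm - height_first_mm, scan_length_mm)
    return velocity * math.cos(tilt)
```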
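For claims 45-46, sketches of two ways a substantially constant aspect ratio can be obtained from the velocity data: transforming a composite 2-D image with local compression, expansion, and copy operations, or updating the variable line rate of another linear imaging array. The nominal velocity and rounding policy are assumptions:

```python
import numpy as np

def reconstruct_constant_aspect(image, row_velocities, nominal_velocity):
    """Emit round(v / v_nominal) copies of each captured row: 0 copies where the
    object moved slowly (local compression), 1 where motion was nominal, and
    several where it moved fast (local expansion by row copying)."""
    out_rows = []
    for row, v in zip(image, row_velocities):
        out_rows.extend([row] * int(round(v / nominal_velocity)))
    return np.vstack(out_rows) if out_rows else np.empty((0, image.shape[1]), image.dtype)

def updated_line_rate(current_line_rate_hz, estimated_velocity, nominal_velocity):
    """Variable line-rate update in the spirit of claim 45: scan faster when the
    object moves faster, so each captured line spans a constant object distance."""
    return current_line_rate_hz * (estimated_velocity / nominal_velocity)
```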
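For claims 69-73, a sketch of horizontal jitter estimation by correlating matching row pairs and compensating with per-row shift operations. The search range is an assumption:

```python
import numpy as np

def row_jitter(row_a, row_b, max_shift=10):
    """Horizontal shift of `row_b` relative to `row_a` (in pixels) that maximizes
    their overlap correlation, searched over +/- `max_shift` pixels."""
    a = row_a.astype(np.float64)
    b = row_b.astype(np.float64)
    best_shift, best_score = 0, -np.inf
    for s in range(-max_shift, max_shift + 1):
        if s >= 0:
            seg_a, seg_b = a[s:], b[:len(b) - s]
        else:
            seg_a, seg_b = a[:len(a) + s], b[-s:]
        score = float(np.dot(seg_a - seg_a.mean(), seg_b - seg_b.mean()))
        if score > best_score:
            best_shift, best_score = s, score
    return best_shift

def compensate_jitter(image, jitter_per_row):
    """Shift each row by the negative of its estimated jitter, producing an image
    compensated for motion transverse to the intended scan direction."""
    out = np.empty_like(image)
    for r in range(image.shape[0]):
        out[r] = np.roll(image[r], -int(jitter_per_row[r]))
    return out
```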
Priority Claims (1)

Number | Date | Country | Kind
--- | --- | --- | ---
PCT/US01/44011 | Nov 2001 | WO |
CROSS-REFERENCE TO RELATED U.S. APPLICATION
[0001] This application is related to copending Application No. PCT/US01/44011, filed Nov. 21, 2001 and published by WIPO as WO 02/43195 A2, said application being commonly owned by Assignee, Metrologic Instruments, Inc., of Blackwood, N.J., and incorporated herein by reference in its entirety as if fully set forth herein.