The present invention relates generally to imaging systems and methods of imaging and, more particularly, to such systems and methods that can be utilized to acquire images of objects hidden behind visibly opaque obstructions.
A variety of conventional systems are available for obtaining images through visibly opaque materials. For example, X-ray systems have been utilized to acquire images of objects that are hidden from visual inspection by visibly opaque materials (e.g., anatomical structures or objects within luggage). X-ray systems, however, have many disadvantages. By way of example, such systems can be expensive and bulky, and can utilize ionizing radiation that may pose health hazards to humans. Moreover, X-ray systems typically detect a beam that has been transmitted through a target sample, thus requiring access to both sides of the target.
Ultrasound imaging systems, in turn, require the presence of a continuous, high quality acoustic transmission path between a transducer and a “hidden” object of interest. In many cases, however, such acoustic transmission paths may not be available.
Millimeter-wave imaging systems have recently been developed for security screening applications. Such conventional millimeter-wave systems are, however, complex, costly, and bulky.
Accordingly, there is a need for enhanced imaging systems and associated image acquisition methods for obtaining images of objects behind visibly opaque obstructions, e.g., images of interiors of walls/floors/ceilings, boxes, suitcases, and the like. There is also a need for such imaging systems that are field portable. Further, there is a need for such systems and methods that can be utilized for screening luggage and other containers for hazardous substances, e.g., explosive materials and devices.
In one embodiment, the invention provides a method of locating hidden objects. The method includes transmitting microwaves toward an object behind a surface, and detecting feedback from the interaction of the microwaves and the object, using a handheld imaging tool. The method also includes tracking movement of the handheld imaging tool along the surface by a tracking device of the handheld imaging tool. In a first operation mode of the handheld imaging tool, the method includes displaying a grid on a display supported by the handheld imaging tool. The grid is representative of an area to be scanned by the handheld imaging tool. The method also includes filling in the grid with generated images as the handheld imaging tool moves along the surface. The generated images are representative of space behind the surface and indicate at least one of the location, size, and depth of the object.
In one embodiment, in place of or in addition to the first operation mode, the method includes generating an image on a display supported by the handheld imaging tool. The image includes a representation of the object. The method also includes storing the image in a memory with an accompanying tag, retrieving the image using the tag, and displaying the image on the display.
In another embodiment, the invention provides a handheld imaging tool for locating hidden objects. The handheld imaging tool includes a transmitting module operable to transmit microwaves toward an object behind a surface and a detecting module operable to detect feedback from the interaction of the microwaves and the object. The handheld imaging tool further includes a tracking module operable to track movement of the handheld imaging tool along the surface and a display supported by the handheld imaging tool. The handheld imaging tool also includes an imaging module coupled to the detecting module, the tracking module, and the display. The imaging module has a first operation mode in which the imaging module is operable to render a grid on the display. The grid is representative of an area to be scanned by the handheld imaging tool. The imaging module is further operable to fill in the grid with generated images based on data from the detecting module and the tracking module as the handheld imaging tool moves along the surface. The generated images are representative of space behind the surface and indicate at least one of the location, size, and depth of the object.
In another embodiment, in place of or in addition to the first operation mode, the imaging module is further operable to generate an image on a display supported by the handheld imaging tool. The image includes a representation of the object. The imaging module is also operable to store the image in a memory with an accompanying tag, retrieve the image using the tag, and display the image on the display.
In another embodiment, the invention provides a handheld imaging tool for locating hidden objects. The handheld imaging tool includes a body and a handle portion. The body includes a horn assembly with an emitting horn and a receiving horn. The emitting horn is operable to transmit microwaves toward an object behind a surface and the receiving horn is operable to receive feedback from the interaction of the microwaves and the object. The body also includes a tracking module to track movement of the handheld imaging tool along the surface and an imaging module to generate images. The images are generated based on data from the tracking module and feedback received by the receiving horn and displayed on a display of the body as the handheld imaging tool moves along the surface. The generated images are representative of space behind the surface and indicate at least one of the location, size, and depth of the object. The handle portion supports the body of the handheld imaging tool and includes a trigger actuator and a thumb actuator. The trigger actuator and thumb actuator are operable to at least partially control the display.
Other aspects of the invention will become apparent by consideration of the detailed description and accompanying drawings.
Before any embodiments of the invention are explained in detail, it is to be understood that the invention is not limited in its application to the details of construction and the arrangement of components set forth in the following description or illustrated in the following drawings. The invention is capable of other embodiments and of being practiced or of being carried out in various ways.
In this embodiment and some that follow, without any loss of generality, the functioning of the imagers according to the teachings of the invention is discussed by considering the acquisition of images within a depth of a wall (or other obstruction) that is opaque to visible radiation. Such imagers can, however, also be utilized to acquire images of other objects and behind a multitude of different, non-wall surfaces. For example, the imaging systems of the invention can be utilized to image objects within containers.
The source 12 and the detector 14 are disposed on opposite sides of a beam splitter 23 such that the propagation axis 20 associated with the source and the detection axis 24 associated with the detector typically intersect at an approximately 90-degree angle. The beam splitter 23 is perpendicular to a plane formed by the propagation and the detection axes and is oriented such that a normal to its surface bisects the angle between those axes, e.g., it typically forms a 45-degree angle with each of those axes. The radiation emitted by the source passes through the beam splitter to be directed by other optical components onto a region of interest, as discussed below.
By way of example, the beam splitter 23 can be implemented as a polarizing beam splitter having a polarization axis that is preferably oriented either parallel or perpendicular to a plane defined by the propagation and detection axes. In some embodiments, a so-called wire grid polarizer (WGP) is employed, which can be made, e.g., of a one-dimensional array or grid of very fine parallel electrically conductive elements disposed upon a suitable transparent base material or, e.g., by a grid of fine parallel wires strung on a frame. By way of example,
Referring again to
In some embodiments of the invention, the BFL includes four steps each providing a ¼-wave phase delay. By way of example,
Referring again to
In some embodiments, the QWP 36 can be implemented as a grooved dielectric plate, such as that schematically depicted in
In this exemplary embodiment, the QWP 36 is disposed perpendicular to the propagation axis of radiation from the source 12 with its fast axis preferably oriented at +/−45 degrees from the plane of polarization of the outgoing radiation. As is well known in the art, linearly polarized radiation passing through a QWP oriented in this manner emerges from the QWP as substantially circularly polarized.
The imager 10 further includes a scan mechanism 38 coupled to the lens 32 for rotating the lens about its rotation axis (herein also referred to as the lens's physical axis). The lens is preferably disposed relative to the source such that its rotation axis is substantially coincident with the propagation axis of the outgoing radiation. As noted above, an optical axis 40 of the lens is displaced from its rotation axis by a predetermined distance, e.g., by about ½ of the radius of the lens. The optical axis of the lens can be parallel to its rotation axis, or alternatively, it can intersect the rotation axis at the emitting aperture of the source. As discussed in more detail below, the rotation of the lens 32 allows scanning the radiation at the focal plane of the lens over a path in an object plane.
A variety of scanning mechanisms can be utilized in the practice of the invention. For example, referring to
In this exemplary embodiment, various components of the imager, such as those discussed above, are disposed in a portable, preferably handheld housing 44. An optional window 46 (e.g., formed of a material transparent at the operating wavelength) is coupled to the housing 44 through which the radiation generated by the source can be transmitted to illuminate interior portions of the wall, as discussed further below. In other embodiments, no window is provided.
In operation, the lens 32 directs radiation generated by the source 12, after its passage through the beam splitter 23, via the QWP 36 and the window 46 into the interior of a wall (or other obstruction, or a region behind such an obstruction) to illuminate portions thereof, such as the hidden object 48. Preferably, the lens 32 forms an image of the source so as to create an illuminating focused point (e.g., an area of maximal radiation intensity) at a distance from the lens that is less than infinity and more than one focal length of the lens. In many embodiments, the radiation from the imager is focused onto an object plane (e.g., object plane 50) within the wall, and the radiation returning from that object plane is detected and analyzed to form an image thereof, as discussed in more detail below. In general, the object plane 50 has an axial extension (a depth) corresponding to the axial extension of the focal volume, as schematically illustrated by hidden object 48, which represents a portion of the object plane.
In this exemplary embodiment, the lens 32 is placed at a distance from the source substantially equal to twice its focal length, thereby forming an image of the source at a distance of approximately two focal lengths from the lens. Accordingly, the image is displaced radially from the rotation axis by twice the displacement of the lens's optical axis from the rotation axis. As shown schematically in
In some embodiments, it is preferable to operate the imager with a small tilt angle (e.g., approximately 7 degrees) between a scanning plane (e.g., a plane perpendicular to the lens's rotation axis) and a translation plane (i.e., the plane over which the imager is translated to build up an image of an area). For example, as shown schematically in
Referring again to
As noted above, the combined rotation of the lens and the translation of the imager over the wall surface results in illuminating various locations within the interior of the wall. As the illuminating radiation impinges on an object that is not transparent to the radiation, e.g., a metal pipe and/or electrical wiring within the wall, at least a portion of the radiation is reflected or scattered. In the frequency range of about 1 to about 2000 GHz, most objects specularly reflect, rather than scatter, the radiation. Hence, at least some of the radiation incident on such objects within the wall is reflected back towards the imager, e.g., depending on the “look angle” of the illumination and a normal to the reflecting surface at the point of illumination. The lens collects this back-propagating radiation (or at least a portion thereof), after its passage through the QWP 36, and directs that radiation, as a converging radiation beam, to the beam splitter 23. As is known in the art, the passage of the returning radiation, which is circularly polarized (or at least substantially circularly polarized, as the reflection of the incident radiation may have caused some change in the polarization), through the QWP results in conversion of its polarization to linear polarization with a polarization axis normal to that of the linearly polarized radiation generated by the source. As such, the beam splitter directs this linearly polarized back-propagating radiation to the detector 14. In this embodiment, the detector 14 operates in heterodyne mode; that is, it mixes the returning radiation with radiation from a local oscillator 62 to generate an intermediate frequency (IF) electrical signal whose strength is proportional to the intensity of the returning radiation and whose lower frequency can be more readily processed by electronic circuitry. A variety of detectors and local oscillators can be employed. For example, in some embodiments, a receive diode of a Gunnplexer can be employed as the detector. In some other embodiments, a portion of the transmit oscillator power can act as a local oscillator for the receiver. In such a case, a single oscillator can be used for microwave emission as well as detection.
The detector 14 transmits the electrical signal generated in response to detection of the returning radiation to a digital data processor 64. The digital data processor is also in communication with the scan position sensor 42 and the location sensors 60 to receive information regarding, respectively, the angular rotation of the lens (herein also referred to as A(t)) and the location of the imager on the wall surface (herein also referred to as P1(t) and P2(t), where P1(t) denotes the information from the first location sensor 60 and P2(t) denotes the information from the second location sensor 60). The digital data processor employs the received data to map the time-varying detection signal to a plurality of respective locations on the object plane from which the back-propagating radiation originates. More specifically, the electrical signal, herein also referred to as I(t), is typically a time-varying signal whose strength at any instant is proportional to the intensity of the returning radiation detected by the detector 14 at that time. The intensity is related to the reflecting properties of the object that is at that time at the location where the illuminating radiation is directed. It will be understood by those familiar with the art of scanning image sensing that signal I(t) varies as a function of time because the lens is scanning the radiation in time over the object space. That is,
I(t)=I[x(t),y(t)],
where x(t) and y(t) define the instantaneous position of the illuminating radiation in the object plane. In the remaining equations, the time dependence is dropped for convenience.
Digital data processor 64 transforms/inverts and combines the measurements A, P1(x,y), and P2(x,y) to generate x and y. In the exemplary embodiment of
Similarly, the Xh, Yh, and θz coordinates of the imager housing can be calculated from P1 and P2, where it is understood each of P1 and P2 comprises an x- and a y-measurement (x1, y1; x2, y2). For example, if the location sensor P1 is selected as the housing reference point, then
Xh=x1, Yh=y1, and θz=tan−1[(y2−y1)/(x2−x1)]−θ0,
where θ0 is the initial angle relative to the x-axis passing through P1 of the line connecting P1 with P2.
Finally, the position of the imaging spot can be calculated by adding the following three vectors: (1) a vector representing the rigid body displacement of the housing, (2) the position of the axis of rotation relative to the housing reference point, and (3) the displacement of the image point due to the angular rotation of the lens. More specifically, x and y can be obtained by employing the following relations:
x=x1+D[cos θz cos θ0−sin θz sin θ0]+d cos A, and
y=y1+D[cos θz sin θ0+sin θz cos θ0]+d sin A,
where D is the distance between P1 and the axis of rotation, and d is the radial displacement of the imaging spot from the lens's rotation axis.
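For illustration only, the following minimal Python sketch implements the above mapping from sensor readings to imaging-spot coordinates. The function name and the computation of θz from the two sensor readings are assumptions consistent with the relations above, not code from the invention:

```python
import math

def spot_position(x1, y1, x2, y2, A, theta0, D, d):
    """Map sensor readings to imaging-spot coordinates (x, y).

    x1, y1 / x2, y2: readings of location sensors P1 and P2
    A:      angular rotation of the lens (scan position sensor)
    theta0: initial angle of the P1-P2 line, radians
    D:      distance from P1 to the lens rotation axis
    d:      radial displacement of the imaging spot
    """
    # Rigid-body rotation of the housing about the Z-axis,
    # inferred from the current orientation of the P1-P2 line.
    theta_z = math.atan2(y2 - y1, x2 - x1) - theta0
    x = x1 + D * (math.cos(theta_z) * math.cos(theta0)
                  - math.sin(theta_z) * math.sin(theta0)) + d * math.cos(A)
    y = y1 + D * (math.cos(theta_z) * math.sin(theta0)
                  + math.sin(theta_z) * math.cos(theta0)) + d * math.sin(A)
    return x, y
```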
The processor 64 is also adapted to generate image position drive signals suitable for application to an image display 66. The image position drive signals cause the display of a plurality of image points, each having a brightness that corresponds to the detected signal strength from a respective coordinate point in the object plane. In operation, a complete image is built up on the display 66 as the imager's housing is moved over the wall surface (or in proximity thereof) while the imager's lens is rotated to scan the beam over an interior swath of the wall.
In some embodiments, relative locations of the pixels in an image obtained by combined rotation of the lens and translation of the imager are determined by acquiring a plurality of partially overlapping image frames, and tracking one or more pixels in the overlap regions. By way of example,
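As a hedged illustration of one way such frame registration could be implemented (the patent does not specify a particular tracking algorithm), the shift between two partially overlapping frames can be estimated from the peak of their cross-correlation; all names below are hypothetical:

```python
import numpy as np

def frame_offset(prev_frame, next_frame):
    """Estimate the (dx, dy) shift between two equally sized,
    partially overlapping frames via FFT-based cross-correlation."""
    corr = np.fft.ifft2(np.fft.fft2(prev_frame)
                        * np.conj(np.fft.fft2(next_frame))).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Map wrapped indices to signed shifts.
    h, w = prev_frame.shape
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return dx, dy
```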
By way of example and only for illustrative purposes,
Although in the above embodiment, the processor 64 and the display 66 are housed within a single housing with the other components of the imager, in other embodiments, the processor and/or the display can be disposed in a separate housing. In embodiments in which the processor/display are disposed in a separate enclosure, one or more communications channels can be provided to allow the processor and/or display to communicate with one or more imager components disposed in another enclosure. In some cases, the communications channels employ wireless channels that utilize known wireless protocols.
The implementation of an imager according to the teachings of the invention is not limited to the embodiment described above. In fact, such an imager can be implemented in a variety of different ways. For example,
Referring again to
The lens 92 is rotated about a rotation axis (illustrated as RA) by a scan mechanism 108, such as those discussed above in connection with the previous embodiment. Similar to the previous embodiment, an optical axis (OA) of the lens 92 is displaced relative to its rotation axis by a selected distance, e.g., about one-half the lens's radius. The rotation axis is generally centered on the emitting aperture of the transmit/receive unit 78 and parallel to the general direction of propagation of the radiation (parallel to the central ray of a cone-like bundle of rays). The optical axis can be parallel to the rotation axis, or may form a non-zero angle with the rotation axis so as to intersect that axis at the emitting aperture of the transmit/receive unit. The rotation of the lens causes the image of the source, generated by the lens, to scan a selected path (e.g., a generally circular path) over an object plane, in a manner similar to that discussed above in connection with the previous embodiment.
In some embodiments, the emitting aperture of the transmit/receive unit 78, the lens 92, and the image of the emitting aperture are preferably disposed in a confocal configuration. That is, the illuminating radiation is focused onto a small region in a plane of interest (e.g., the object plane), and the reflected (or scattered) radiation reaching the detector (the transmit/receive module in this case) is limited to those rays that originate from the illuminated region. In some embodiments, such a confocal imaging system is employed to reject stray light by utilizing, for example, two strategies: (1) illuminating a single point (small area) at any given time with a focused beam such that the focused intensity drops off rapidly at axial locations away from the plane of focus (e.g., in front of or behind that plane), and (2) utilizing a blocking or pinhole aperture, or a point detector, in a conjugate receiver plane so that light reflected (or scattered) from regions other than the illuminated object region is blocked from reaching the detector.
With continued reference to
The output electrical signal is communicated, e.g., via a communication channel 100, to an electronic processor 102 (e.g., a digital data processor), disposed in an electronic processing and display module (EPDM) 104. While in this embodiment the EPDM is contained in a separate housing, in other embodiments, it can be integrated with the head 76 within a single housing. The processor 102 includes a signal processing module that is adapted to convert the output signal generated by the transmit/receive unit 78 into image strength drive signals suitable for application to an image display 106.
In addition to communicating with the detector, the processor 102 is also electrically coupled, e.g., via a communications channel 112, to a scan position sensor 110 that can sense the position of the scan mechanism, and thereby that of the lens 92, relative to a predetermined reference position. A variety of scan position sensors, such as those discussed above, can be employed. The position sensor communicates the information regarding the position of the lens to the processor.
Similar to the previous embodiment, the imager 74 further includes a body location-determining subsystem 114 for determining the rigid body location of the head 76 on a surface (e.g., wall surface) over which it is moved to build up an image of a region behind the surface. The subsystem 114 can be in optical and/or mechanical communication with a surface over which the imager is translated. Typically, the subsystem 114 estimates the location and orientation of the head 76 via three parameters “Xh”, “Yh” and “θz”, where X, Y and Z denote orthogonal Cartesian coordinates. The X and Y denote coordinates in a plane (e.g., a planar surface of a wall over which the head is translated) and θz denotes an angle about the Z-axis that is perpendicular to that plane. By way of example, the origin of the coordinates can be established as the location and orientation of the imager upon its initial placement on the plane. This can be done automatically or by a user-issued command (which can also be employed to reset the location of the origin, if desired). The location-determining subsystem can then determine subsequent locations and orientations of the imager relative to the origin. A number of location-determining subsystems can be utilized. For example, in some embodiments, the subsystem can comprise two computer-mouse sensing mechanisms, separated by a known base line. Alternatively, the subsystem can be implemented by employing a plurality of inertial sensors.
The location-determining subsystem 114 transmits signals indicative of the location of the imager's head to the processor 102, e.g., via a communications channel 116. The processor utilizes these signals, together with those transmitted by the lens position sensor, to generate a set of image point coordinates in the coordinate space of the object region. The processor further correlates these image coordinates to the time-variation of the signal received from the detector to generate a reflectance image of the illuminated portion. In addition, the processor derives image position drive signals, based on the image coordinates and intensity of reflected signals originating from those coordinates, for application to the display 106. The image drive signals cause the display to present an image in which the brightness of an image point corresponds to the intensity of the detected reflected radiation originating from a coordinate point (e.g., an area or voxel in vicinity of that point) mapped to that image point.
In some embodiments, the frequency of the radiation generated by the source (e.g., the above transmit/receive unit 78) is modulated by a control signal. For example, in the above Gunnplexer 80 (
In some embodiments, the imager can provide an image of a two-dimensional area while the imager (e.g., imager housing) remains stationary (i.e., without the need to physically move the imager). One such exemplary embodiment 118 is shown in
A scan mechanism 126 scans the radiation, which is directed by the lens to the region 124, over a plurality of locations in that region. The lens and the scan mechanism can be configured to produce a plurality of radiation scan patterns to cover (illuminate) at least a portion, e.g., an object plane 124a, within the region 124. The scan mechanism typically moves the radiation within a plane (e.g., a plane perpendicular to the lens's optical axis) so as to generate a desired radiation scan pattern. By way of example,
By way of example,
With reference to
Further, the processor generates a plurality of image drive signals for application to a display 136 for displaying the calculated image. In this embodiment, the processor and the display are disposed in separate enclosures with communication channels coupling the processor to the transmit/receive unit as well as the lens position sensor. In other embodiments, the various components of the imager can be housed in a single, preferably handheld, enclosure.
In some embodiments, an imager according to the teachings of the invention is capable of acquiring images of a plurality of object planes located at different axial locations (e.g., at different depths within an obstruction, such as a wall). For example,
In some embodiments, both transmit/receive unit 142 and lens are axially translated, while preferably maintaining the separation between the transmit/receive unit and the lens, to focus the radiation on planes at different axial locations.
The radiation reflected from each object plane can be detected by the transmit/receive unit, which generates an electrical signal in response to such detection and transmits the signal to a processor 150 for analysis. The imager further includes at least one lens position sensor 152 coupled to the rotational scanner and the focus-drive mechanism for determining the axial position as well as the rotational orientation of the lens (in some embodiments, the functionality of the lens position sensor 152 can be provided by two separate sensors, one for determining the lens's axial position and the other for determining the lens's rotational orientation). By way of example, the lens position sensor 152 can be implemented as a shaft encoder. The lens position sensor 152 transmits the information regarding the lens's axial position and rotational orientation to the processor 150. The processor employs this information to temporally correlate the detection signal generated by the detector to different object planes, and for each object plane, to a plurality of coordinate positions in that plane. In this manner, the processor can build up a plurality of images, each corresponding to a different depth within the object region. The processor can further generate image drive signals for application to a display 154 for displaying these images, e.g., as a three-dimensional image. In some cases, the processor can cause the display to present selected ones of these images, or present them in a selected sequence, or in any other desired manner.
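Purely as a sketch of the bookkeeping involved (the names and array sizes below are illustrative assumptions), the processor's correlation of detector samples with the lens's axial position and rotation can be viewed as accumulating samples into a stack of depth images:

```python
import numpy as np

N_PLANES, N_Y, N_X = 8, 128, 128   # illustrative resolution
volume = np.zeros((N_PLANES, N_Y, N_X))
counts = np.zeros_like(volume)

def accumulate(sample, plane_idx, y_idx, x_idx):
    """Add one detector sample to the image of its object plane.

    plane_idx comes from the lens's axial position; y_idx and x_idx
    come from its rotational orientation and the housing location,
    as in the 2-D mapping described earlier.
    """
    volume[plane_idx, y_idx, x_idx] += sample
    counts[plane_idx, y_idx, x_idx] += 1

# After a scan, each plane's image is the mean of its samples:
# images = volume / np.maximum(counts, 1)
```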
Although a transmit/receive unit is employed in the imager 138, in other embodiments, a separate source and detector can be employed to generate and detect the radiation, for example, in a manner shown in the above imager 10 (
In another embodiment, the longitudinal chromatic aberration of the lens can be employed to focus radiation from a source at a plurality of different depths (e.g., onto a plurality of object planes located at different axial distances from the lens). For example, the frequency of the radiation generated by a source can be varied (tuned) such that the chromatic aberration exhibited by the lens results in focusing different frequencies at different axial locations from the lens.
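To make the idea concrete under an added assumption: for a diffractive (zone-plate-type) lens, the focal length is f = r1^2/λ, where r1 is the first-zone radius, so the focus moves linearly with the tuned frequency. The radius and frequency values below are illustrative only:

```python
C = 299_792_458.0  # speed of light, m/s

def focal_length_m(first_zone_radius_m, freq_hz):
    # For a zone-plate-style diffractive lens, f = r1**2 / wavelength,
    # so tuning the source frequency shifts the focal plane axially.
    wavelength_m = C / freq_hz
    return first_zone_radius_m ** 2 / wavelength_m

# Sweeping an assumed 22-26 GHz source moves the focus over a
# range of depths:
for f_ghz in (22.0, 24.0, 26.0):
    print(f_ghz, "GHz ->", round(focal_length_m(0.04, f_ghz * 1e9), 3), "m")
```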
By way of example,
In the exemplary imager 172, the lens 176 is disposed relative to an emitting aperture of a radiation source 182 at a distance equal to one focal length. The lens 176 converts an expanding cone of radiation generated by the source into a generally collimated radiation beam, and directs the collimated beam in a predetermined off-axis direction, as shown schematically in
Preferably, the separation between the lenses 176 and 178 is substantially equal to the focal length of the fixed lens 178. In such a case, the fixed lens 178 forms the image 184 with an imaging cone of radiation whose chief ray is parallel to the optical axis. When the lens 178 is axially moved, the separation between the two lenses can deviate from this preferred value, although in some embodiments, both lenses can be moved so as to provide a depth of scan of the radiation while maintaining the separation between the lenses substantially equal to the preferred value.
A prototype imager made based on the above teachings of the invention is discussed in the following example for further illustration of various aspects of the invention. It should, however, be understood that this is intended only for illustrative purposes, and not for indicating optimal performance of imagers according to the teachings of the invention, or to suggest that the specific arrangement of the various optical components and other design parameters utilized in the prototype are in any way meant to limit the scope of the invention.
The prototype imaging system based on the teachings of the invention was fabricated by utilizing a Gunn oscillator operating at a frequency of 24.15 GHz (a wavelength of about 12.4 mm) as the emitting source. The Gunn oscillator was coupled to a 10 dB feedhorn, with an exit aperture having dimensions of 15 by 11 mm, so as to output a cone of linearly polarized radiation at a power of 5 mW with an angular spread of +/−57 degrees.
After passage through a 45-degree wire grid polarizer (composed of 30 gauge wires with 0.8 mm center spacing disposed on an Acrylic frame), the radiation from the oscillator was focused to a focal point by an F/0.6 quarter-wave focusing lens, formed of Low Density Polyethylene (LDPE). The lens was configured to image the radiation at a focal spot approximately 100 mm off the lens's optical axis. The distance of the source from the lens (about 125 mm) was substantially equal to that of the image from the lens, thus resulting in a magnification of about 1.
A 32 mm thick birefringent quarter-wave plate, composed of an array of 2.5 mm wide slots cut into an LDPE sheet, was placed between the lens and the focal point. The slots of the quarter-wave plate were oriented at 45 degrees relative to the polarization axis of the incident beam, thus converting the beam's linear polarization to circular polarization. Upon reflection from an object at the focal point and a second passage through the quarter-wave plate, the beam's circular polarization was converted back to linear polarization, albeit with a 90-degree rotation relative to the polarization axis of the incident beam. The back-propagating radiation was then transmitted by the wire grid polarizer to a second Gunn oscillator having an integrated mixer (receiver). The optical system effectively operated in a confocal mode, where the diffracted radiation spot served to illuminate an object and radiation reflected (or scattered) from the object was focused back through a small aperture (the feedhorn entrance) to the mixer detector.
The emitter and receiver Gunn oscillators were tuned to have a frequency mismatch of approximately 2 MHz. This frequency mismatch causes radiation reflected by an object at the focal point and relayed to the receiver to generate a 2 MHz beat frequency signal. The beat frequency was amplified, high-pass filtered (frequency cutoff was about 500 Hz to eliminate frequencies below any expected beat frequency), low-pass filtered (to eliminate high frequency noise above any expected beat frequency), and rectified. The rectified signal was, in turn, fed to a computer data acquisition system.
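A minimal sketch of this signal-conditioning chain, assuming a digitized receiver output (the sample rate and low-pass cutoff are assumptions; the text specifies only the ~500 Hz high-pass cutoff and the 2 MHz beat):

```python
import numpy as np
from scipy.signal import butter, sosfilt

FS = 10e6   # assumed ADC sample rate, Hz

# High-pass above 500 Hz and low-pass above the expected 2 MHz beat
# (an assumed 3 MHz cutoff), as described in the text.
SOS_HP = butter(4, 500, 'highpass', fs=FS, output='sos')
SOS_LP = butter(4, 3e6, 'lowpass', fs=FS, output='sos')

def condition(raw_samples):
    """Band-limit the mixer output around the beat frequency and
    rectify it for the data acquisition system."""
    filtered = sosfilt(SOS_LP, sosfilt(SOS_HP, raw_samples))
    return np.abs(filtered)   # full-wave rectification
```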
By rotating the lens at 300 revolutions-per-minute (rpm), a circularly scanned “probe spot” was generated. A magnet and a Hall effect sensor were utilized to measure the rotational position of the lens. Object imaging was accomplished by moving objects transversely through the scanning focused spot. A sheet of gypsum wallboard having a thickness of about ⅝ inch (15.9 mm) was placed between the lens and the focal plane of the probe spot. Radiation passing through the wallboard interacted with various test objects (e.g., wires, pipes, human skin, etc.). A software program utilized the rotational position of the lens to determine the Cartesian coordinates of locations on the focal plane from which the detected reflected radiation originated. This information was utilized, in a manner discussed in detail above, to construct images of objects that were moved transversely through the scanned field.
As noted above, the above prototype was discussed only for illustrative purposes. The particular selections and arrangements of the optical components (e.g., source, lens and receiver) were made only by way of example. Alternative components and arrangements can also be utilized. For example, sources operating at other wavelengths can be employed.
A second prototype imager is discussed in the following example for further illustration of various aspects of the invention. It should, however, be understood that this is intended only for illustrative purposes, and not for indicating optimal performance of imagers according to the teachings of the invention, or to suggest that the specific arrangement of the various optical components and other design parameters utilized in the prototype are in any way meant to limit the scope of the invention.
The housing 252 includes a display screen 266 for displaying images and a power button 264 for turning the imager 250 on and off. The display screen 266 is rotatable up to approximately 90 degrees, from a closed orientation to an extended orientation, about a connection axis 268. In other embodiments, the display screen 266 is in a fixed position or is able to rotate about multiple axes. In one embodiment, the display 266 is a 4.3 inch liquid crystal display (“LCD”) with 480RGB×272 dot pixel resolution and a 24-bit parallel RGB and serial peripheral interface (e.g., LCD model 43WQW1 HZ0 by Seiko Instruments, Inc.). The housing 252 also includes tracking wheels 274 for measuring the movement of the imager 250 along a wall surface 56.
A portion of the emitted radiation is eventually reflected back off of the hidden object 48 towards the RF board 300. The returning radiation, which has an electric field vector perpendicular to the fold mirror 306, is reflected off the fold mirror 306 into the receiver horn 304 along the receiving axis 320. The receiving axis 320 intersects the fold mirror 306 at approximately a 45-degree angle.
The emitter horn 302 and receiver horn 304 are optimized for transmitting electromagnetic radiation over short distances (e.g., several inches) as opposed to long distances (e.g., several miles). In one embodiment, the horns have phase centers in the E-plane and H-plane that originate from a common point and maintain that point as the emission angle changes from the throat of the emitter and receiver horns 302 and 304.
Reflected electromagnetic radiation is received through receiver horn 304 at a receiving antenna (not shown). The receiving antenna converts the electromagnetic radiation to electric signals along input 334. The signals are subject to a delay along delay line 336, and then enter mixer 338. The mixer 338 also receives a portion of the VCO 328 output signal from the coupler 330 as described above. The mixer then mixes the VCO output signal with the received signal, and the result is sent along IF outputs 314 and 316 to video amplifiers (not shown). From the mixed signal it receives via IF outputs 314 and 316, the processor 64 is then able to detect beat notes. The beat note frequency and intensity indicate the amount, if any, of the emitted radiation reflected off of a hidden object 48. In some embodiments, the IF outputs are amplified, either before or after reaching the processor 64.
In one embodiment, the meniscus lens includes an anti-reflective layer of low density polyethylene (“LDPE”) approximately 2.1 mm thick on one or both surfaces. The antireflective material prevents electromagnetic radiation passing through the meniscus lens, in either direction, from reflecting away and degrading the radiation signal strength. In other embodiments, the meniscus lens 308 uses different, yet similarly functioning, antireflective materials and thickness levels. In other embodiments, a different type and/or thickness of antireflective material is placed on each side of the meniscus lens 308. In other embodiments, only one side of the meniscus lens 308 has a layer of antireflective material.
In one embodiment, the base material of the antenna 310 includes a material with a high dielectric constant. An exemplary high dielectric constant is five (5) and is associated with materials such as Remafin or EccoStockHiK. A high dielectric constant material enables a more compact antenna. Similar to the meniscus lens 308, antireflective layers are disposed on each side of the antenna 310. The antireflective layers are chosen to match the dielectric constant of the antenna 310 and thereby minimize reflections. Generally, as the dielectric constant of a material increases, more electromagnetic radiation reflects off of its surface. Thus, to achieve a given level of reflection suppression, a base material with a high dielectric constant requires a more effective antireflective layer than a base material with a lower dielectric constant. An exemplary antireflective material that matches the dielectric constant of the base material of the antenna 310 is a quarter-wavelength layer of LDPE. In other embodiments, a microstructured layer is used for antireflection.
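The quarter-wave matching condition makes these numbers checkable. An ideal single matching layer has an index equal to the geometric mean of the indices of air and the substrate, and a thickness of a quarter wavelength inside the layer; a short sketch, assuming the 24.15 GHz operating frequency of the first prototype:

```python
import math

FREQ = 24.15e9               # assumed operating frequency, Hz
LAMBDA0 = 3e8 / FREQ         # free-space wavelength, ~12.4 mm

def matching_layer(eps_substrate):
    """Ideal quarter-wave antireflective layer between air and a
    substrate of relative permittivity eps_substrate."""
    n_ar = math.sqrt(1.0 * math.sqrt(eps_substrate))  # geometric mean
    thickness_m = LAMBDA0 / (4 * n_ar)
    return n_ar, thickness_m

n_ar, t = matching_layer(5.0)                    # antenna base, eps = 5
print(round(n_ar, 2), round(t * 1e3, 2), "mm")   # ~1.5 and ~2.1 mm
```

The result (index about 1.5, thickness about 2.1 mm) is consistent with the roughly 2.1 mm LDPE layer described for the meniscus lens 308.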
The antenna 310 is similar in function to a diffractive optical element, but is operable at microwave frequencies. For illustrative purposes, the antenna 310 can be compared with a Fresnel-style lens that has been “compressed” or “flattened.” The antenna 310 includes multiple concentric circles 371 and partial circles 372 that center on an optical axis 374 offset a distance 373 from the rotation axis 370. The electromagnetic radiation received from the meniscus lens 308 is redirected by the antenna 310 as the optical axis 374 rotates about the rotation axis 370. As the antenna 310 rotates, it redirects electromagnetic radiation received to form a pattern similar to the pattern of
The circles 371 and partial circles 372 are broken up into five zones 377a-e. Each zone 377a-e includes a combination of eight circles 371 and partial circles 372, also referred to as sub-zones. In some embodiments, more or fewer than five zones are used.
In one embodiment, each sub-zone has one of eight approximate thickness levels measured from the bottom antireflective layer 375a to the top antireflective layer 375b. The sub-zone nearest the optical axis is the thickest, and the thickness of each sub-zone farther from the optical axis 374 decreases. For instance, sub-zone 379a is the thickest of zone 377a, and each sub-zone 379b-h decreases in thickness. In one embodiment, the difference in thickness between sub-zones ranges from approximately 1.25 mm to 2.30 mm. Additionally, the thickness of sub-zones is approximately equal across zones 377a-e. For instance, sub-zone 379a of each zone 377a-e is approximately equal; sub-zone 379b of zone 377a is approximately equal to sub-zone 379b of zone 377e; and so on. As is shown in
The width of each sub-zone in the radial direction varies. However, the widths of sub-zones nearer to the optical axis 374 are generally larger than the widths of sub-zones farther from the optical axis 374. Likewise, the widths of zones nearer to the optical axis 374 are generally larger than the widths of zones farther from the optical axis 374. In one embodiment, the smallest sub-zone width is roughly 1.3 mm.
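The ~1.25 mm thickness step noted above can be reproduced from the eight-level phase quantization: each sub-zone level should add one-eighth of a wave of phase delay, which fixes the thickness step for a given material. A sketch, assuming the ε = 5 base material and the 24.15 GHz operating frequency of the first prototype:

```python
import math

FREQ = 24.15e9               # assumed operating frequency, Hz
LAMBDA0 = 3e8 / FREQ         # free-space wavelength, ~12.4 mm
N_LEVELS = 8                 # eight sub-zone thickness levels

def step_thickness_m(eps):
    """Thickness step giving a 1/8-wave phase delay per level."""
    n = math.sqrt(eps)
    return LAMBDA0 / (N_LEVELS * (n - 1))

print(round(step_thickness_m(5.0) * 1e3, 2), "mm")   # ~1.25 mm
```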
The antenna 310 can be formed, for instance, by using injection molding or machining techniques. The contour shape of each zone 377a-e reduces the mass and thickness necessary to achieve the desired antenna 310 function. In one embodiment, the antenna 310 is formed with an offset distance 373 of 34.99 mm and creates a scan area of 100 mm. The scan area includes the area of a circle created by the rotation of the focal point of the radiation caused by one full rotation of the antenna 310. See, for instance, the circles pattern of
In one embodiment, the QWP 312 has a thickness of approximately 13 mm, including two antireflective layers at 1.55 mm and 2.8 mm, a web layer at 2.37 mm, and a base material layer at 6.2 mm. The QWP 312 includes an antireflection layer on each of its front side 380 and back side 382. The front side 380 antireflection layer includes a combination of polyethylene and solid polyethylene. The back side 382 antireflection layer is thicker than the front side 380 and includes a microstructured polyethylene. The web layer of high dielectric constant may be placed between an antireflective layer and the base material layer to hold the structure of the QWP 312 together. The base material can include, for instance, Remafin or EccoStockHiK, with a dielectric constant of five (5). The QWP 312 base material is formed with a series of grooves 386 approximately 6.2 mm deep and 1.55 mm wide. Between each groove 386 is a ridge 384. The overall diameter of the QWP 312 is approximately 125 mm.
The amount of data that can be scanned and displayed by the imager 250 is proportional to the speed at which the antenna rotates. For instance, if the imager 250 rotates the antenna at 10,000 rotations-per-minute (“RPM”), the imager 250 will be able to receive and display data faster than if the antenna 310 rotates at 1,000 RPM. In one embodiment, the imager 250 rotates the antenna 310 at a rate of approximately 3,000 RPM. However, other embodiments of the invention can rotate the antenna 310 anywhere from a few hundred RPM to approximately 10,000 RPM. One limiting factor at high RPM, e.g., at some level above 10,000 RPM, is that the signal-to-noise ratio of received radiation reflected off of the hidden object 48 may eventually decrease to the point of inoperability. In other words, the antenna may rotate so fast that the processor 64 cannot interpret the radiation received at the RF board 300 to generate accurate images on the display 266.
The encoding wheel assembly 421 is shown in
In one embodiment, three encoder wheels 420 are positioned to contact a surface against which the imager 250 is placed. In one embodiment, the three encoder wheels 420 are positioned in a Y formation as shown in
The emitter horn 302 is coupled to the emitter 450 so as to facilitate coupling of the radiation generated by the emitter 450 into free space (e.g., by providing a better impedance match) for propagation towards the hidden object 48. In this embodiment, the emitter 450, in conjunction with the emitter horn 302, generates a diverging cone of radiation beam 319 disposed about the emitting axis 318, which is also referred to as the rotation axis 370. The receiver horn 304 is coupled to the receiver 452 to facilitate coupling of radiation into the receiver 452. In general, the combination of the receiver 452 and receiver horn 304 is capable of receiving radiation beams disposed about the receiving axis 320 with a given angular distribution that depends at least in part on the receiver horn 304 geometry.
The emitter 450 is coupled to the processor 64. The processor 64 is configured to output signals to cause the emitter 450 to start and stop emitting radiation. The processor 64 is coupled to the receiver 452 via IF outputs 314 and 316. The processor 64 receives signals along IF outputs 314 and 316 indicative of the received radiation reflected off of the hidden object 48. The processor 64, in turn, translates the signals received along IF outputs 314 and 316 into image data, which it outputs to the display 266 along connection 458. The display 266 translates the image data into an image on the screen of the display 266.
The position and rotation information detected by the motor position sensor 460, along with the tracking information provided by the encoder wheels 420, are used by the processor 64 to determine the location of the imager 250 when it emits and receives radiation. The processor 64 is configured to associate the receiver data provided along IF outputs 314 and 316 with the imager 250 location data to determine the location of the hidden object 48 and generate an image for the display 266. Exemplary calculations used to determine the location of the hidden object 48 and generate an image for the display 266 are described in greater detail above.
In one embodiment, the distance between the emitter 450 and the hidden object 48 is approximately 174 mm and the distance between the emitter 450 and the wall surface 56 is approximately 117 mm. In one embodiment, the emitter 450 is separated from the meniscus lens 308 by approximately 71 mm; the meniscus lens 308 is separated from the antenna 310 by approximately 1 mm; the antenna 310 is separated from the QWP 312 by approximately 2 mm; and the QWP 312 is separated from the window 46 by approximately 1 mm.
In step 512, the radiation continues to the rotating antenna 310, where the radiation is redirected to the focal point of the antenna 310. The redirected radiation passes through the QWP 312 in step 514. The QWP 312 receives linearly polarized electromagnetic radiation from the antenna 310 and outputs circularly polarized electromagnetic radiation towards the hidden object 48. In step 516, the circularly polarized electromagnetic radiation from the QWP 312 reaches the hidden object 48.
Returning to
After set-up in step 540, the process 530 proceeds to an explore mode step 542, where the imager 250 enters into the explore mode. The explore mode is further described in conjunction with
If the imager 250 has been used before, in step 544, the imager 250 determines if a password has been set. If no password has been set, the imager 250 enters the explore mode in step 542. If a password has been set, a password screen is displayed on display 266 in step 546 and a password is received. The password can be entered by a user, for instance, using the trackpad 260 or other input buttons. In some embodiments, a user or login name is used in conjunction with a password. The imager 250 may save and associate each user's settings with a particular login name. After the user enters the password, the imager 250 begins the explore mode in step 542.
In some embodiments, the process 530 cannot proceed beyond step 546 until a correct password has been entered. In other embodiments, the process 530 proceeds to explore mode in step 542 even without a correct password, but the imager 250 will operate in a safe mode. In the safe mode, certain features are not available to the user. For instance, in the safe mode, the user is prevented from deleting or modifying saved files. In some embodiments, the process 530 proceeds to a main menu screen or a different imager mode after the password steps of 544 and 546.
In step 556, upon the imager 250 detecting receipt of a user request for a snapshot, the imager 250 proceeds to step 559. The request for a snapshot can be made by, for instance, the user inputting the request via input buttons such as the trackpad 260, the user releasing the trigger 262, or the user pulling the imager 250 away from the wall surface 56 while image data is being displayed in the explore mode. In step 559, the imager 250 freezes the current image on the display 266. The user can then use the trackpad 260 or other input buttons to select the refine mode in step 560 or to save the image in step 562. If the user selects refine mode in step 560, the imager proceeds to step 564 to enter the refine mode (see
If the main menu is selected in step 558, the imager 250 displays the main menu on display 266 in step 568. The main menu allows a user to select different imager modes or return to the set-up step 540 of
After entering the quadrant mode process 580, the user places the imager 250 against a wall surface 56 and depresses the trigger 262 to start imaging. In step 584, the imager 250 displays an empty grid on the display 266. In step 586, the grid 581 is evaluated to determine if it is full. If the grid 581 is not full (which is initially true), the imager 250 obtains image data using the process 500 of
Upon filling the grid 581, the imager 250 proceeds to step 592. In step 592, the imager 250 continues to display the entire grid area with the generated images until determining that a user has selected to refine the image in the refine mode (steps 594 and 596), save the image (steps 598 and 600), or return to the main menu (steps 602 and 604).
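The quadrant-mode loop of steps 584-592 can be summarized in sketch form; the display, tracker, and scanner interfaces below are hypothetical stand-ins for the imager's actual components:

```python
def quadrant_mode(display, tracker, scanner, rows=4, cols=4):
    """Fill an on-screen grid with images as the tool is moved."""
    grid = [[None] * cols for _ in range(rows)]
    display.show_grid(grid)                    # step 584: empty grid
    while any(cell is None for row in grid for cell in row):  # step 586
        r, c = tracker.current_cell()          # from the encoder wheels
        grid[r][c] = scanner.acquire_image()   # image-data process
        display.update_cell(r, c, grid[r][c])
    return grid   # step 592: full grid stays on screen
```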
Upon determining the location of the imager 250, the display 266 highlights a section of the grid 581 to indicate the location to the user (step 616). The user may reposition the imager 250, if necessary, to a section of interest on the wall surface 56. Upon repositioning, the imager 250 updates the display 266 to highlight the current location of the imager 250 in step 616. Thereafter, the imager 250 obtains new image data for the particular section at a higher scan rate in step 618. In step 620, the image on the display 266 is updated with the new image data from step 618.
The imager 250 will continue to display the image until the user selects to further refine the image, save the image, or return to the main menu. If the imager 250 determines that the user selected to further refine the image in step 622, the imager 250 returns to step 614. If the imager 250 determines that the user selected to save the retrieved image in step 624, the imager 250 proceeds to save the image in step 626. If the imager 250 determines that the user selected the main menu in step 628, the process 610 proceeds to the main menu in step 630.
If the user selects a GPS tag in step 664, the imager 250 retrieves current GPS data from a GPS module (see additional detection module 702 of
The memory (not shown) of imager 250 may be any memory suitable for storing and retrieving image files. In some embodiments, the memory includes a hard disk permanently residing within the imager 250. In other embodiments, a removable drive may be swapped in and out of the imager 250.
In some embodiments, to retrieve a saved image from memory, thumbnail images are displayed on display 266. In some embodiments, when a thumbnail is highlighted or selected using the trackpad 260, the tag is displayed or, in the case of audio files, is played. In some embodiments, any associated text tag is displayed below or alongside the associated thumbnail. In some embodiments, a text-only directory is displayed showing all the tags arranged, for instance, in alphabetical order, by date created, or by another sorting technique.
Post-Processing
In some embodiments, the imager 250 is capable of processing the generated image (“post processing”) to identify objects in the image and to contrast the identified objects with the background of the image to render the objects easier for the user to see. For instance, particular pixels of the generated image may be identified as corresponding to the hidden object 48. After being identified, the pixels may be brightened or darkened to enhance their visibility on the image. In other embodiments, a group of pixels identified as corresponding to the hidden object 48 may have an outline or border displayed around the group of pixels.
In some embodiments, identification algorithms are used to compare the shape of the group of pixels to a database of shapes. Upon identifying a match between the group of pixels and a shape in the database, the imager 250 may apply a specific highlighting technique to the group of pixels (e.g., display the group of pixels as blue to indicate a water pipe). In some embodiments, upon matching the group of pixels, a text bubble or other graphical identifier is added to the image and associated with the group of pixels, for instance, by an arrow.
In some embodiments, in addition to the shape of an identified group of pixels, the values of each pixel or sub-groups of pixels with similar values are used to identify the hidden object 48. For instance, a metallic object such as a pipe will generally reflect more radiation than a plastic or wood object. The imager 250 will, in turn, receive more reflected radiation and a stronger signal will be received by the processor 64. Thus, the imager 250 may identify the material making up the hidden object 48 based on the strength of the signal received. The imager 250 may use the identified material of the group or sub-group of pixels to better determine the type of object (e.g., wood, pipe, or wire) or merely to add post processing highlights based on the material type.
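As a hedged sketch of such material inference (the thresholds are invented placeholders that would require calibration):

```python
def classify_material(signal_strength):
    """Guess the material class from normalized reflection strength:
    metal reflects most, wood least (placeholder thresholds)."""
    if signal_strength > 0.8:
        return "metal"        # e.g., pipe or wire
    if signal_strength > 0.4:
        return "plastic"      # e.g., PVC pipe
    return "wood"             # e.g., stud

def highlight_color(group_pixel_strengths):
    """Pick a post-processing highlight color for a pixel group."""
    mean_strength = sum(group_pixel_strengths) / len(group_pixel_strengths)
    return {"metal": "blue", "plastic": "green",
            "wood": "brown"}[classify_material(mean_strength)]
```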
In some embodiments, a Radon-transform-type post-processing technique is used to identify objects within an image.
To perform the post-processing, the imager 250 determines the sum of each column and row of pixels, as indicated in
The benefits of the outlining or bordering functionality may be more apparent in an image with more pixels than image 670. In image 670, the pixels making up the outlines of objects 676a-c would cover as much area as the objects 676a-c themselves. For instance, object 676b is only one column wide. An outline around object 676b would be one column wide on each side of object 676b. Thus, the outline of object 676b would actually cover more than twice the area of the object 676b. In other embodiments, particularly those with more pixels per image, the ratio of outline pixel area to object pixel area is decreased such that the outline appears more as a thin line around an object.
In
In other embodiments, the highlighting mask 678 has pixels not corresponding to object locations in image 670 set equal to zero to cancel out noise. In some embodiments, the processor 64 is also configured to detect groups of low-valued pixels within detected objects that, in reality, indicate that the single detected object is actually two or more objects or that the single detected object 676c is smaller than initially determined. For instance, in image 670, object 676c is within one column and includes two 1 values, three 0 values, and two 3 values from top to bottom. The processor 64 may detect that the object 676c is only located at the two 3 values, and, in turn, highlight and outline the object 676c appropriately.
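A compact sketch of this row/column-sum detection and masking, assuming a small grayscale array like image 670 (the function name and threshold are illustrative):

```python
import numpy as np

def detect_and_mask(image, threshold=1):
    """Mark pixels at intersections of rows and columns whose sums
    exceed a threshold, and zero out all other pixels as noise."""
    row_hit = image.sum(axis=1) > threshold   # sum of each row
    col_hit = image.sum(axis=0) > threshold   # sum of each column
    mask = np.outer(row_hit, col_hit)         # candidate object pixels
    highlighted = np.where(mask, image, 0)    # noise set to zero
    return mask, highlighted
```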
The small 10×10 pixel array with black and white images is used to simplify the explanation of the post processing techniques of
Additional Detection Technology
In some embodiments, additional detection technology is incorporated into the imager 250 to form imager 700.
The additional detection technology can include, for example, one or more of a capacitance detector, a thermographic camera, a sonic measuring device, a laser distance measuring device, a depth measurement device using ionizing radiation, a non-contact moisture meter, a GPS device, and a fiberscope camera.
In one embodiment, the additional detection module 702 includes a capacitance detector that uses an internal capacitor plate to detect changes in the dielectric constant of the wall surface 56 as the user moves the imager 700 over the wall surface 56. If the dielectric constant changes by a predetermined amount, the capacitance detector indicates that a dense object, such as a stud, is behind the wall. The imager 700 combines the image data from the RF board 300 with the capacitance data from the capacitance detector to determine whether an object is behind the wall surface 56. For instance, if both the image data from the RF board 300 and the capacitance detector indicate that an object is behind the wall surface 56, the display 266 will indicate that the object is present. However, if only one of the capacitance detector and the RF board 300 indicates an object behind the wall surface 56, the imager 700 may 1) indicate that no object is behind the wall surface 56, 2) indicate that an object is behind the wall surface 56, or 3) delay a determination and alert the user that further inspection is needed to determine whether an object is behind the wall surface 56. In other embodiments, the capacitance detector data is displayed on the display 266 as a separate color or with other identifying information such that the user can distinguish between image data based on the RF board 300 and image data based on the capacitance detector.
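The combination logic can be summarized as a small decision rule; the policy names below are hypothetical labels for the three options the text describes:

```python
def object_present(rf_detects, cap_detects, policy="defer"):
    """Fuse RF-board and capacitance detections into one indication."""
    if rf_detects and cap_detects:
        return True                 # both agree: object is present
    if not rf_detects and not cap_detects:
        return False                # both agree: nothing there
    # The sensors disagree: apply one of the three described policies.
    if policy == "reject":
        return False                # 1) indicate no object
    if policy == "accept":
        return True                 # 2) indicate an object
    return None                     # 3) defer and request inspection
```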
In one embodiment, the additional detection module 702 includes a live wire detecting sensor. The live wire detecting sensor may include, for instance, an antenna to respond to electrical or magnetic fields surrounding live wires (e.g., 120 V or 220 V wires common in residential homes and commercial buildings). The live wire detecting sensor is configured to output an analog or digital signal to indicate to the processor 64 that a live wire is near the imager 700. The imager 700 may incorporate the live wire detecting sensor data with the image data from the RF board 300 similar to the method of incorporating capacitance data described above.
In one embodiment, the additional detection module 702 includes a thermographic camera, also referred to as a forward-looking infrared ("FLIR") camera or an infrared camera. The infrared camera detects radiation with wavelengths between 0.7 and 300 μm. In some embodiments, the user toggles the imager 700 between displaying the infrared image generated from the infrared camera data and the image generated from the RF board 300 data. In some embodiments, the imager 700 overlays the image generated from the RF board 300 data on the infrared image on the display 266.
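The overlay mode might be implemented as a simple alpha blend of the two frames. This is only a sketch under that assumption; the patent does not describe how the images are combined, and the co-registered same-shape arrays and blend factor are placeholders.

```python
# Hypothetical overlay sketch: alpha-blend the RF-derived image onto the
# infrared frame for display.
import numpy as np

def overlay(ir_frame: np.ndarray, rf_image: np.ndarray,
            alpha: float = 0.4) -> np.ndarray:
    """Weight the RF layer by alpha and the infrared frame by (1 - alpha)."""
    blended = (1.0 - alpha) * ir_frame.astype(float) + alpha * rf_image.astype(float)
    return blended.astype(ir_frame.dtype)
```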
In some embodiments, the additional detection module 702 includes a laser distance measurer or a sonic distance measurer. The distance measurers are used in conjunction with or in place of the encoder wheels 420 to determine the location of the imager 700 and to estimate the size of the wall surface 56. For instance, the distance measurers can detect the distance between the imager 700 and the surrounding walls in the x and y directions. The imager 700 can assume a rectangular wall surface 56 or can measure distances to obstructions in additional directions to increase the accuracy. When the imager 700 is moved across the wall surface 56, the processor 64 uses the changing distance information from the distance measurers to track the movement of the imager 700.
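For a rectangular wall, this tracking reduces to simple arithmetic on the measured ranges. The sketch below treats the corner formed by two adjoining surfaces as the origin; the function and parameter names are illustrative assumptions.

```python
# Hypothetical tracking sketch: derive position and motion on a rectangular
# wall surface from laser/sonic ranges to adjoining surfaces.
def position(range_to_left_wall_m: float, range_to_ceiling_m: float) -> tuple:
    """Place the origin at the corner of the left wall and the ceiling."""
    return (range_to_left_wall_m, range_to_ceiling_m)

def displacement(prev_pos: tuple, curr_pos: tuple) -> tuple:
    """Motion between readings is the change in the measured ranges."""
    return (curr_pos[0] - prev_pos[0], curr_pos[1] - prev_pos[1])

def wall_size(left_m: float, right_m: float, up_m: float, down_m: float) -> tuple:
    """Estimate wall width and height by summing opposing ranges."""
    return (left_m + right_m, up_m + down_m)
```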
In addition, the distance measurers enable determining the location of the imager 700 for purposes of refining images in step 614 of the refine mode process 610. For instance, if the imager 700 is pulled away from the wall while the user selects an image to refine, the imager 700 will lose its position because the encoder wheels, no longer in contact with the wall surface 56, will not rotate as the imager 700 moves. However, even after the imager 700 is pulled away and then placed back against the wall surface 56, the distance measurers can provide location information to reorient the imager 700. The distance measurers do not rely on constant contact with the wall surface 56 to track location; rather, they rely on stationary references such as a ceiling or another surface adjoining the wall surface 56. Furthermore, inertial sensors may be incorporated into the imager 700 to track its movement without contacting the wall surface 56. Example inertial sensors include accelerometers and gyroscopes, which, in some instances, are of the micro-electro-mechanical systems ("MEMS") variety. Finally, an optical tracking system similar to those used in optical computer mice can be used in the imager 700. Within certain distances, the optical tracking system enables tracking even when the imager 700 is pulled away from the wall.
In other embodiments, the additional detection module 702 includes an emitter and detector of ionizing radiation. First, a metal object is detected using the RF board 300 data. The user is then prompted to have the imager 700 determine depth information regarding the metal object. In some embodiments, the imager 700 automatically determines depth information regarding a detected metal object without prompting the user. After the imager 700 is aligned, for instance, using methods similar to those used during the refine mode process 610, the emitter sends ionizing radiation toward the metal object. A portion of the ionizing radiation reflects back toward the ionizing radiation detector. The detector interprets the reflected ionizing radiation to determine more specific information regarding the depth of the metal object. The determined depth information is then displayed to the user on the display 266. The ionizing radiation used is, for instance, short-wavelength X-rays or gamma rays.
In some embodiments, the additional detection module 702 includes a non-contact moisture meter. The non-contact moisture meter relies on the dielectric properties of a hidden object. For instance, if the object material is known (e.g., wood, metal, or plastic), and the measured dielectric properties differ from those expected for the known material, the non-contact moisture meter determines that water is present. The determination that water is present can be displayed to the user on the display 266 by overlaying a blue color on the appropriate pixels or using another highlighting technique. The object material may be determined, for instance, by analyzing the RF board 300 data as described above or by user input.
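A minimal sketch of this check follows, assuming rough nominal dielectric constants for common dry materials. The nominal values and tolerance below are placeholders for illustration, not data from the patent.

```python
# Hypothetical moisture check: compare the measured dielectric constant
# against a nominal value for the known dry material. Water's much higher
# dielectric constant (~80) pulls a wet reading upward.
NOMINAL_DIELECTRIC = {"wood": 2.0, "plastic": 3.0, "drywall": 2.5}  # placeholder values

def moisture_detected(material: str, measured_k: float,
                      tolerance: float = 0.5) -> bool:
    """Flag moisture when the reading exceeds the dry nominal by the tolerance."""
    expected = NOMINAL_DIELECTRIC[material]
    return measured_k > expected + tolerance

print(moisture_detected("wood", measured_k=4.2))   # -> True
```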
In some embodiments, the additional detection module 702 includes a GPS device. The GPS device is used, for instance, to tag images as described above with reference to
In some embodiments, the additional detection module 702 includes a fiberscope camera or connection means to connect the fiberscope camera to the imager 700. The fiberscope camera includes a flexible fiber optic bundle with a lens on an inspecting end. In some embodiments, the lens end includes a light source to emit light in the area to be inspected. The other end of the flexible fiber optic bundle permanently or removably connects to the imager 700. The light waves reflecting off of objects within view of the lens are received through the lens and transmitted along the fiber optic bundle to the imager 700. The light waves are converted to digital signals by the imager 700 and forwarded on to the processor 64. The processor 64 receives the digital representations of the incoming light waves and generates images for the display 266. Thus, a user can use the fiberscope to view objects in hard-to-reach locations, such as behind walls, by feeding the fiberscope through a small hole.
In some embodiments, the imager 700 also includes a wireless communications module 704. The wireless communications module 704 includes hardware, software, or a combination thereof, to facilitate wireless communication between the imager 700 and one or more external devices (not shown). The external devices include, for instance, a personal computer, a laptop, another imager 700, a cell phone, or other device with a processor and memory.
Using the wireless communications module 704, the imager 700 can wirelessly transmit and receive image data to and from an external device. For instance, in one embodiment, the wireless communications module 704 includes Bluetooth™ technology. The Bluetooth-enabled imager 700 is configured to wirelessly transmit a generated image to a cell phone. Thereafter, the cell phone can be used to transmit the image via email to another location, such as a personal computer or a remote server. In other embodiments, the wireless communications module 704 includes WiFi™ technology, and the imager 700 can wirelessly transmit or receive a generated image via email without the intermediate step of using a cell phone.
Telescoping Housing
In some embodiments, the imager 250 includes a telescoping housing. The telescoping housing enables the imager 250 to increase or decrease the distance between the RF board 300 and the wall surface 56. In turn, the imager 250 is able to detect objects at different depths behind the wall surface 56 because the focal point of the emitted radiation is at a different depth. In one embodiment, the encoder wheels extend away from the housing to increase the distance between the wall surface and the RF board 300, and retract to reduce the distance between the wall surface and the RF board 300.
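If the focal length is fixed relative to the RF board 300, the relationship is simple geometry: the focal depth behind the wall is the focal length minus the board-to-wall standoff. The sketch below illustrates this under that assumption; the patent states the effect only qualitatively, and the names and numbers are placeholders.

```python
# Hypothetical geometry sketch: extending the telescoping housing increases
# the standoff and moves the focal point to a shallower depth behind the
# wall; retracting it deepens the focal point.
def focal_depth_behind_wall(focal_length_cm: float, standoff_cm: float) -> float:
    """Depth behind the wall surface at which the emitted radiation is focused."""
    return focal_length_cm - standoff_cm

print(focal_depth_behind_wall(focal_length_cm=10.0, standoff_cm=4.0))  # 6.0 cm deep
print(focal_depth_behind_wall(focal_length_cm=10.0, standoff_cm=7.0))  # 3.0 cm deep
```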
In other embodiments, the housing includes a first section and a second section. The components shown in
In some embodiments, the telescoping technique is performed electronically, for instance, by a user toggling input buttons and interacting with the display 266 to extend or retract the encoder wheels 420, or increase or decrease the distance between the first and second sections of the housing. In other embodiments, the telescoping technique is performed manually by the user. The various telescoping housing configurations are similarly applicable to embodiments of the imager 700.
It should be understood that various modifications can be made to the above illustrative embodiments without departing from the scope of the invention. For example, a variety of different lenses can be utilized. The lenses can be fabricated, e.g., as zone plates, parallel metal plates, or dielectric materials. Further, the optics can be designed as confocal, near-confocal, telecentric, or dark field. The scanning of the radiation can be one- or two-dimensional (radial, tangential, raster, or a combination thereof). The camera body location-determining subsystem can be internal or external to the camera body. Further, the location sensing technology can be mechanical, optical, RF, or any other suitable mode.
Thus, the invention provides, among other things, an imager that generates images based on emitting electromagnetic radiation and receiving reflected electromagnetic radiation to detect objects. The invention also provides, among other things, methods of operating an imager, organizing generated images, and processing image data to generate images. Various features and advantages of the invention are set forth in the following claims.
This application is a continuation-in-part of U.S. application Ser. No. 11/858,413, filed Sep. 20, 2007, now U.S. Pat. No. 7,679,546, which claims the benefit of U.S. Provisional Application No. 60/826,358, filed on Sep. 20, 2006. This application is also a continuation-in-part of U.S. application Ser. No. 11/353,882, filed Feb. 14, 2006, now U.S. Pat. No. 7,626,400, which claims the benefit of U.S. Provisional Application No. 60/653,228, filed on Feb. 15, 2005. The entire contents of all of the above applications are herein incorporated by reference.