This application relates generally to imaging devices and methods, including those that employ millimeter-wave radar sensors and target image reconstruction techniques.
Millimeter-wave (mmW) imaging using conventional techniques (e.g., synthetic aperture radar or SAR, inverse SAR or ISAR) requires information on the distance, motion profile, and, for some applications, orientation of the mmW reader relative to a target in order to perform radar data processing and image reconstruction from backscattered mmW signals. This requirement constrains imaging applications to configurations in which distance and orientation are fixed and known in advance, and such configurations are not robust to changes in these parameters. For example, the reconstruction of SAR images depends on knowing the distance and orientation of the target relative to the imager. When using SAR to image planet Earth from space, for example, this distance is known and predictable. However, when using SAR to image a chipless RFID (radio frequency identification) tag or a security-flagged backpack, for example, the operator may not be able to maintain an exact distance while sweeping the imager across the target.
Embodiments are directed to a millimeter-wave (mmW) imaging device comprising a mmW radar sensor, an auxiliary sensor arrangement, and at least one processor. The mmW radar sensor comprises a transmitter coupled to an antenna arrangement configured to transmit mmW radiation at a target, and a receiver coupled to the antenna arrangement or a separate antenna arrangement configured to receive backscatter radiation from the target. The auxiliary sensor arrangement is configured to detect one or both of relative motion and relative position between the imaging device and the target. A processor of the mmW imaging device is configured to receive data respectively produced by the mmW radar sensor and the auxiliary sensor arrangement during relative movement between the imaging device and the target. The processor or a remote processor is configured to reconstruct a mmW radar image of the target using the received data.
Embodiments are directed to a mmW imaging device comprising a mmW radar sensor, an auxiliary sensor arrangement, and at least one processor. The mmW radar sensor comprises a transmitter coupled to an antenna arrangement configured to transmit mmW radiation at a chipless RFID tag, and a receiver coupled to the antenna arrangement or a separate antenna arrangement configured to receive backscatter radiation from the chipless RFID tag. The auxiliary sensor arrangement is configured to detect one or both of relative motion and relative position between the imaging device and the chipless RFID tag. A processor of the mmW imaging device is configured to receive data respectively produced by the mmW radar sensor and the auxiliary sensor arrangement during relative movement between the imaging device and the chipless RFID tag. The processor or a remote processor is configured to reconstruct a mmW radar image of the chipless RFID tag using the received data and to decode data encoded in the chipless RFID tag.
Embodiments are directed to a method implemented by a mmW imaging device. The method comprises transmitting mmW radiation at a target during relative movement between the imaging device and the target, and producing backscatter radiation data in response to receiving backscatter radiation from the target. The method also comprises producing one or both of motion data and position data during relative movement between the imaging device and the target. The method further comprises reconstructing a mmW radar image of the target using the backscatter radiation data and one or both of the motion data and the position data. In some embodiments, the target comprises a chipless RFID tag and the method comprises decoding data encoded in the RFID tag.
The above summary is not intended to describe each disclosed embodiment or every implementation of the present disclosure. The figures and the detailed description below more particularly exemplify illustrative embodiments.
Throughout the specification reference is made to the appended drawings wherein:
The figures are not necessarily to scale. Like numbers used in the figures refer to like components. However, it will be understood that the use of a number to refer to a component in a given figure is not intended to limit the component in another figure labeled with the same number.
The capabilities and limitations of traditional imaging are well known, and only briefly described here for the sake of completeness. In “traditional” (i.e., non-radar) imaging, the object of interest is illuminated fully by a coherent (e.g., planar or spherical) wavefront, and the scattered (i.e., reflected or transmitted) wavefront is captured by the aperture of a physical lens. This received wavefront is spatially modulated (via refraction in a dielectric lens or diffraction in a meta-material lens) to compensate for the phase distortion in the scattered wave that is suffered in propagating from the object to the lens. With the phase information thus “corrected,” the resulting optical wave in the image plane of the lens conveys the spatial structure of the object. This wave is then converted to electrical form (i.e., a digital image) via photo-detection. Advances in two-dimensional semiconductor focal-plane arrays as well as refractive and diffractive optics have enabled a steady improvement in the efficiency and the quality of this kind of imaging. However, the image resolution—a key performance metric—achievable through this approach is ultimately and fundamentally limited by the size of the physical lens aperture as well as by the size and the pixel density of the focal-plane array.
In radar imaging, one firstly combines the source and the detector of radiation into a single device: an antenna that transmits the incident wave and subsequently detects the backscattered wave at the same spatial location. It is noted that the extension of this to multiple-input multiple-output radar imaging systems is relatively recent. Secondly, arranging the antenna beam axis to be slanted at an angle between 0 and 90 degrees with respect to the normal to the nominal object plane allows one to form a mapping between the “down-range” or “cross-track” points in the object plane and time delays or frequency shifts in the received electrical signal, depending respectively on whether the transmitted signal is a narrow pulse or a chirped waveform. This is the basis of “range imaging.” Finally, moving the antenna in a prescribed fashion in the “along-track” (a.k.a. azimuthal or “cross-range”) direction relative to the object allows one to sample the backscattered wave in space, similar to what a lens does simultaneously. In this way, an effective aperture can be synthesized for capturing a larger portion of the backscattered wavefront than is possible by using a real lens of necessarily small size. This is the basis of high-resolution “cross-range imaging” in synthetic-aperture radar (SAR). The radar measurements are then digitally processed and assembled into an image. The obvious penalty paid for a much longer image acquisition time is often justified by the much finer image resolution achievable via SAR.
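For illustration, these mappings follow standard radar relations that are not specific to any embodiment described herein. For a scatterer at slant range $R$, the round-trip delay and the down-range resolution for a transmitted bandwidth $B$ are

$$ t = \frac{2R}{c}, \qquad \Delta R = \frac{c}{2B}, $$

where $c$ is the speed of light; a 4 GHz chirp, for example, yields a down-range resolution of about 3.75 cm.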
The phase modulation that is performed by a lens in traditional imaging is performed by a computational algorithm in SAR imaging. In the most popular forms of SAR (strip-map, spotlight, circular), the motion of the radar platform is so simple as to result in an analytically expressible mapping between the platform positions and the (cross-range) image coordinates. Consequently, the phase modulation processing step can be expressed as a (sampled) Fourier transform, for which there are well-known and highly efficient computational implementations. Advances in digital computing have therefore led to increasingly efficient SAR processing codes over the decades. As the above description makes clear, however, the quality of the resulting SAR image depends critically on the accuracy of the relative phase information between measurements. Specifically, the relative position and orientation of the radar platform (including the slant angle) for all measurements must be known, to within a fraction of a wavelength of the radiation, in the three-dimensional object space in order for the SAR computational algorithm to properly reconstruct an image of the object. This requirement is the motivation behind this disclosure.
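To make the wavelength-fraction requirement concrete, the two-way phase of a measurement at slant range $R$ is

$$ \phi = -\frac{4\pi R}{\lambda}, $$

so a platform-position error $\delta R$ produces a phase error $\delta\phi = 4\pi\,\delta R/\lambda$. At 77 GHz ($\lambda \approx 3.9$ mm), keeping $\delta\phi$ below $\pi/4$ requires knowing $R$ to better than $\lambda/16$, roughly 0.24 mm. This is a representative calculation, not a tolerance recited by the embodiments.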
Embodiments of the disclosure are directed to imaging systems and methods that use a radar imaging sensor and one or more non-radar sensors to extract information that is used in the processing of radar data and image reconstruction. Embodiments are directed to imaging systems and methods that use a relative-target-to-radar motion profile, which includes one or both of range information and orientation information, in addition to radar data to reconstruct an image of a target. The relative-target-to-radar motion profile can be generated by one or more motion sensors, one or more position sensors, or a combination of motion and position sensors. Various types of targets are contemplated, including various objects, structures, and materials. Targets detected by an imaging system of the present disclosure include those that are separated from the imaging system by a barrier (e.g., a wall, container, luggage, clothing) which is substantially non-transmissive to visible light. Various types of information-containing targets are contemplated, including RFID tags (e.g., chipless RFID tags), in which case the imaging system includes a decoder for decoding data encoded in the information-containing targets. The imaging system may include a user interface comprising a display for displaying a reconstructed image of a target.
Embodiments are directed to a handheld, portable mmW imaging device that includes a mmW radar sensor and one or more non-radar sensors to extract information that is used in the processing of radar data and image reconstruction. The mmW imaging device comprises one or more processors configured to fuse data from the mmW radar sensor and one or more non-radar sensors to reconstruct a mmW radar image. The one or more processors are configured to combine data from one or more non-radar sensors to generate one or more of a target range, motion profile data, and orientation profile data that is used in the radar data processing and image reconstruction implemented by the one or more processors of the mmW imaging device. In some embodiments, the one or more processors are components of the handheld, portable mmW imaging device. In other embodiments, the processors used in the radar data processing and image reconstruction include at least one processor of the handheld, portable mmW imaging device and at least one remote processor, such as a processor of an external device or system or a cloud processor. For example, a processor of the handheld, portable mmW imaging device can be configured to preprocess data from the mmW radar sensor and one or more non-radar sensors, and a remote processor of an external system or device or a cloud processor can be configured to perform image reconstruction using the preprocessed data.
According to various embodiments, the mmW imaging device 100 is configured to transmit and receive RF signals in the range from about 24 GHz to about 300 GHz (e.g., the EHF or Extremely High Frequency range). Radio waves in this spectrum have wavelengths from about 12.5 millimeters down to about 1 millimeter. As such, radiation in this frequency band is referred to as millimeter waves. The mmW imaging device 100 can implement millimeter-wave imaging for detection of objects, for example, as well as the range, velocity, and angle of these objects. Due to the use of RF signals with short wavelengths, the mmW imaging device 100 can provide high resolution. The RF signals generated by the mmW imaging device 100 are able to penetrate various materials such as plastic, drywall, and clothing, and are largely unaffected by environmental conditions such as rain, fog, dust, and snow. The mmW imaging device 100 can be configured to be highly directional, such as by forming a compact beam with high angular accuracy. In some embodiments, the beam produced by the mmW imaging device 100 can be focused and steered using standard techniques including mechanical rotation and programmable phased array configurations. Although embodiments are directed to mmW imaging in this disclosure, it is understood that the imaging devices and methodologies disclosed herein can be implemented using RF signals that fall outside of the EHF range (e.g., the microwave band or terahertz band).
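The quoted wavelengths follow directly from $\lambda = c/f$:

$$ \lambda_{24\,\mathrm{GHz}} = \frac{3\times 10^{8}\ \mathrm{m/s}}{24\times 10^{9}\ \mathrm{Hz}} \approx 12.5\ \mathrm{mm}, \qquad \lambda_{300\,\mathrm{GHz}} \approx 1.0\ \mathrm{mm}. $$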
The mmW imaging device 100 shown in
As is further shown in
According to some embodiments, the processor 140 of the mmW imaging device 100 can be configured to preprocess data from the mmW radar sensor 120 and one or more non-radar sensors of the auxiliary sensor arrangement 130, and the remote processor 160 can be configured to fuse the preprocessed data received from the mmW imaging device 100 and reconstruct an image of the target 102 using the fused data. The mmW imaging device 100 can communicate the preprocessed data to the remote processor 160 via a communication device 150 (e.g., a wired or wireless communication interface). For example, the communication device 150 can be configured to establish a connection with the remote processor via public and/or private communication infrastructure (e.g., one or more of a local area network, a wide area network, the Internet).
In various embodiments, the auxiliary sensor arrangement 130 includes one or more motion sensors 134. The one or more motion sensors 134 are configured to produce data indicative of motion of the mmW imaging device 100 during imaging of the target 102. For example, the auxiliary sensor arrangement 130 can include one or more of an accelerometer arrangement, a gyro arrangement, a magnetometer arrangement, an inertial measurement unit (IMU), and a camera.
In various embodiments, the auxiliary sensor arrangement 130 includes one or more position sensors 132. The position sensor(s) 132 can be configured to produce data indicative of relative distance (range) between the mmW imaging device 100 and the target 102. For example, the position sensor(s) 132 of the auxiliary sensor arrangement 130 can include one or more of a stereoscopic camera arrangement, a laser sensor, and an ultrasonic sensor. In some embodiments, the processor 140 can be configured to produce data indicative of relative distance between the mmW imaging device 100 and the target 102 using known techniques (e.g., stretch processing, range compression) applied to the backscatter radiation data. In such embodiments, the mmW imaging device 100 serves as a position sensor, and a separate position sensor 132 need not be included. The relative distance between the mmW imaging device 100 and the target 102 provides the phase delta between the different azimuth measurements used to form the synthetic aperture in accordance with various embodiments.
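As a minimal sketch of how auxiliary range data can supply this phase information (a single carrier frequency is assumed, and all function and variable names are hypothetical):

    import numpy as np

    C = 3.0e8  # speed of light, m/s

    def two_way_phase(range_m: np.ndarray, freq_hz: float) -> np.ndarray:
        """Two-way propagation phase, in radians, for each measured
        imager-to-target range; range errors much smaller than the
        wavelength are needed for coherent combination."""
        wavelength = C / freq_hz
        return 4.0 * np.pi * range_m / wavelength

    # Example: ranges reported by a laser or ultrasonic sensor at
    # 100 azimuth positions during a hand sweep
    ranges = np.linspace(0.300, 0.310, 100)        # meters, hypothetical
    deltas = np.diff(two_way_phase(ranges, 77e9))  # phase deltas between
                                                   # adjacent measurements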
In some embodiments, the auxiliary sensor arrangement 130 can include one or more sensors configured to produce data indicative of orientation of the mmW imaging device 100 relative to the target 102. For example, an orientation sensor of the auxiliary sensor arrangement 130 can include a camera or any of the sensors described above as position sensors.
In accordance with various embodiments, the mmW imaging device 100 includes at least one motion sensor 134. In accordance with some embodiments, the mmW imaging device 100 includes at least one position sensor 132. In accordance with further embodiments, the mmW imaging device 100 includes at least one motion sensor 134 and at least one position sensor 132. In accordance with various embodiments, the mmW imaging device 100 includes at least one motion sensor 134, at least one position sensor 132, and at least one orientation sensor. In some embodiments, position and/or motion can be determined by the mmW imaging device 100 through triangulation of a set of sensors situated in the environment (e.g., a room), exclusive of or in addition to one or more sensors of the auxiliary sensor arrangement 130.
The processor 140 can incorporate or be coupled to a sensor processor 146 configured to process signals or data received from the auxiliary sensor arrangement 130. The processor 140 and/or the sensor processor 146 can be implemented as or include one or more of a multi-core processor, a digital signal processor (DSP), a microprocessor, a programmable controller, a general-purpose computer, a special-purpose computer, a hardware controller, a software controller, a combined hardware and software device, such as a programmable logic controller, and a programmable logic device (e.g., FPGA, ASIC). The processor 140 and/or the sensor processor 146 can include or be operatively coupled to memory, such as RAM, SRAM, ROM, or flash memory. The processor 140 and/or sensor processor 146 can also be operatively coupled to a mass storage device, such as a solid-state drive (SSD).
The transmitter 122 and the antenna arrangement 126 are configured to transmit mmW radiation from a continuous-wave source (for example, an FMCW or stepped-frequency source) or an impulse source at a target 102. The receiver 124 and the antenna arrangement 126 are configured to receive backscatter radiation from the target 102. Backscatter radiation refers to the portion of the transmitted mmW radiation that the target 102 redirects directly back towards the antenna arrangement 126. Backscattering is the process by which backscatter is formed. The scattering cross section in the direction toward the mmW imaging device 100 is called the backscattering cross section (the usual notation is the symbol sigma, σ). The backscattering cross section is a measure of the reflective strength of the target 102. The normalized measure of the radar return from a distributed target 102 is called the backscatter coefficient (or sigma nought, σ0), which is the average radar cross-section of a set of objects defining the target 102 per unit area. If the signal formed by backscatter is undesired, it is called clutter. Other portions of the transmitted mmW radiation may be reflected and scattered away from the mmW imaging device 100 or absorbed.
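Expressed as an equation, the backscatter coefficient normalizes the average backscattering cross-section $\langle\sigma\rangle$ of the objects defining the distributed target 102 by the illuminated area $A$:

$$ \sigma^{0} = \frac{\langle \sigma \rangle}{A}. $$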
Backscatter radiation data produced by the mmW radar sensor 120 is communicated to the processor 140. In some embodiments, the processor 140 incorporates or is coupled to a radar data processor 144. The radar data processor 144 is configured to process the backscatter radiation data using known techniques that depend on the transmitted signal.
As is further shown in
The mmW imaging device 100 includes a housing 110 configured to contain and/or support the various components shown in
Referring now to
Increased resolution is achieved by synthesizing a larger aperture than the real antenna aperture by moving the antenna a certain length along the azimuth direction, which is perpendicular to the range direction, and conducting multiple measurements. The data from these multiple measurements, in particular the variations in phase with frequency and with antenna position, contain sufficient information to allow reconstruction of the positions of scatterers (e.g., targets) relative to the antenna motion path at high resolution, as if all the measurements were taken simultaneously with a single larger synthetic aperture. The phase information for these multiple measurements is a function of range, of the motion profile in more complex scenarios, and potentially of object orientation for more complex targets. Thus, this information is used in addition to the radar data to enable reconstruction of the target image.
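Under the standard far-field approximation, the cross-range (azimuth) resolution achievable with a synthesized aperture of length $L$ at range $R$ and wavelength $\lambda$ is approximately

$$ \delta_{cr} \approx \frac{\lambda R}{2L}, $$

so that, purely as an illustration, a 10 cm sweep at 77 GHz ($\lambda \approx 3.9$ mm) and 30 cm standoff gives $\delta_{cr} \approx 5.8$ mm.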
Existing synthetic aperture and related radar imaging techniques require that the relative motion between the antenna and the target be one-dimensional and linear. Such a configuration provides a straightforward relationship between combined phase data and scatterer position. However, such linear motion is not required in accordance with embodiments of the disclosure. Knowledge of the distance and/or relative motion between the radar and the target or scene is sufficient to enable reconstruction of the target image for arbitrary relative motion paths. The use of one or more non-radar sensors (e.g., auxiliary sensors such as motion, position, and/or orientation sensors) of the mmW imaging device 100 provides the necessary auxiliary information. The auxiliary sensors can provide information on the relative radar-target motion profile, which is fed into the radar image reconstruction algorithms in order to generate the radar image. In addition to the motion profile, for some applications such as chipless RFID, the auxiliary sensors can also register and provide information about the orientation of the target tag.
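One way such a reconstruction can be carried out for an arbitrary motion path is time-domain backprojection, which consumes the measured antenna positions directly. The sketch below is illustrative only and is not the claimed implementation: it assumes range-compressed complex echoes, antenna positions reported by the auxiliary sensors, a target plane at z = 0, and hypothetical names throughout.

    import numpy as np

    C = 3.0e8  # speed of light, m/s

    def backproject(echoes, fast_time, positions, grid_x, grid_y, freq_hz):
        """Time-domain backprojection over an arbitrary measured antenna path.

        echoes:    (n_pulses, n_samples) complex range-compressed data
        fast_time: (n_samples,) fast-time axis, seconds
        positions: (n_pulses, 3) antenna positions from the auxiliary sensors
        grid_x/y:  1-D image-plane coordinates (target plane at z = 0)
        """
        wavelength = C / freq_hz
        gx, gy = np.meshgrid(grid_x, grid_y)
        image = np.zeros(gx.shape, dtype=complex)
        for echo, pos in zip(echoes, positions):
            # Two-way distance from this antenna position to every pixel
            dist = np.sqrt((gx - pos[0])**2 + (gy - pos[1])**2 + pos[2]**2)
            delay = 2.0 * dist / C
            # Sample the echo at each pixel's delay, then remove the
            # carrier phase so contributions add coherently
            sample = (np.interp(delay, fast_time, echo.real)
                      + 1j * np.interp(delay, fast_time, echo.imag))
            image += sample * np.exp(1j * 4.0 * np.pi * dist / wavelength)
        return np.abs(image)

Because each pulse is focused using its own measured position, no assumption of a straight, uniform sweep is needed; per the phase relation given earlier, it is the accuracy of the measured positions that limits focus quality.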
It is understood that the mmW imaging device 100 can be useful in a static mode of operation in which there is no relative movement between the mmW imaging device 100 and the target 102. This can be accomplished, for example, by achieving a large antenna aperture through the application of multiple antennas for transmission (Tx) and reception (Rx), in which the total aperture size, and thus the resolution, will be a function of the number of Tx antennas multiplied by the number of Rx antennas.
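In such a MIMO configuration, the familiar virtual-array relation applies:

$$ N_{\mathrm{virtual}} = N_{Tx} \times N_{Rx}, $$

so that, for example, 3 transmit and 4 receive antennas behave like a 12-element aperture without mechanical motion.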
In the context of a scanning mode of operation, the processor 140 is configured to receive data respectively produced by the mmW radar sensor 120 and the auxiliary sensor arrangement 130 during relative movement between the mmW imaging device 100 and the target 102. As is shown in
For example, the processor 140 is configured to receive a plurality of samples of data produced by the mmW radar sensor 120 and the auxiliary sensor arrangement 130 during relative movement between the mmW imaging device 100 and the target 102. For instance, a hand-held mmW imaging device 100 may be moved across a structural wall by a user over a scan length of 10 cm, during which 100 or more samples of backscatter radiation data and auxiliary sensor arrangement data are captured. Each of the samples comprises backscatter radiation data 121 and auxiliary sensor arrangement data 133 and/or 135. The auxiliary sensor arrangement data 133 and/or 135 can include data indicative of relative distance and/or motion between the mmW imaging device 100 and the target 102. In some embodiments, the auxiliary sensor arrangement data can also include data indicative of orientation between the mmW imaging device 100 and the target 102. It is noted that, in some embodiments, data indicative of relative distance between the mmW imaging device 100 and the target 102 can be generated by the processor 140 from the backscatter radiation data 121 using known techniques (e.g., stretch processing, range compression). The processor 140 is configured to reconstruct a two-dimensional (2-D) mmW radar image of the target 102 using the plurality of samples.
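Purely as an illustrative sketch of how each such sample might be organized for fusion (all field names are hypothetical):

    from dataclasses import dataclass
    import numpy as np

    @dataclass
    class ScanSample:
        """One capture during a sweep: radar echo plus auxiliary data."""
        echo: np.ndarray         # complex backscatter data at this position
        position: np.ndarray     # (x, y, z) imager position, meters
        orientation: np.ndarray  # (roll, pitch, yaw), radians
        timestamp: float         # seconds since start of sweep

    # A 10 cm hand sweep might yield 100+ such samples, which together
    # form the inputs to the backprojection sketch given earlier.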
According to some embodiments, one or more of the motion sensors 134 (e.g., camera, gyro, IMU) can serve as an orientation sensor(s). For example, a camera can serve as a motion sensor 134 and/or an orientation sensor. When used as an orientation sensor, a camera can capture images of a target 102 as the orientation of the target 102 changes during relative movement between the mmW imaging device 100 and the target 102. The target 102 may include an optical feature 104 (e.g., a geometric pattern) which is readily captured by the camera. Changes in the orientation of the optical feature 104 across a series of image samples captured by the camera correspond to changes in the orientation of the mmW imaging device 100. The magnitude and direction of these changes in the orientation of the optical feature 104 are calculated by the processor 140. The one or more motion sensors 134 can be configured to measure linear acceleration, exclusive of or in addition to the rotational rate of the mmW imaging device 100. The one or more motion sensors 134 can be configured to measure any combination of roll, pitch, heading, position, and velocity of the mmW imaging device 100.
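As one hedged illustration of such camera-based orientation sensing, the in-plane rotation of a high-contrast geometric feature can be estimated per frame and differenced across frames. The sketch below uses OpenCV and is illustrative only; a fielded system would typically track a known fiducial pattern.

    import cv2
    import numpy as np

    def feature_angle(gray_frame: np.ndarray) -> float:
        """Estimate the in-plane rotation (degrees) of a dark geometric
        feature on a light background via its minimum-area bounding
        rectangle. Expects an 8-bit grayscale frame. Sketch only."""
        _, mask = cv2.threshold(gray_frame, 0, 255,
                                cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        largest = max(contours, key=cv2.contourArea)
        (_cx, _cy), (_w, _h), angle = cv2.minAreaRect(largest)
        return angle

    # Orientation change between two frames of the optical feature:
    # delta_deg = feature_angle(frame_k) - feature_angle(frame_0)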
In some embodiments, and as shown in
The mmW imaging device 100 includes an auxiliary sensor arrangement 130 which can be configured to provide one or any combination of imaging device-to-target motion data 326, imaging device-to-target distance data 322, and imaging device-to-target orientation data 324. The auxiliary sensor arrangement 130 can include one or any combination of motion, position, and orientation sensors.
The auxiliary sensor arrangement 130 can include one or more motion sensors configured to produce imaging device motion data 326. The auxiliary sensor arrangement 130 can include one or more accelerometers 312, one or more gyros 314, and/or one or more magnetometers 318. According to some embodiments, the auxiliary sensor arrangement 130 includes an IMU 316 configured to measure at least the linear acceleration of the mmW imaging device 100. In addition to measuring linear acceleration, the IMU 316 can be configured to measure the rotational rate of the mmW imaging device 100. In some implementations, the IMU 316 can be configured to measure the linear acceleration, the rotational rate, and the heading of the mmW imaging device 100. For example, the IMU 316 can include one or more of an accelerometer, a gyroscope, and a magnetometer for each of three orthogonal axes. Some implementations of the IMU 316 can provide roll, pitch, heading, position, and velocity of the mmW imaging device 100. Suitable IMUs are available from SBG Systems (e.g., Ellipse, Ekinox, and Apogee series IMUs) and Aceinna (e.g., OpenIMU models 330B, 300ZI, 300RI, and 381ZA). A camera 302 can also serve as a motion sensor.
The auxiliary sensor arrangement 130 can include one or more position sensors configured to produce imaging device-to-target distance data 322. The auxiliary sensor arrangement 130 can include one or more of a stereoscopic camera 304, a laser sensor 306, and an ultrasonic sensor 308. Suitable stereoscopic cameras 304 are available from DUO (e.g., models DDK, MLX, MC, and M ultra-compact stereoscopic imaging sensors). The laser sensor 306 can be implemented as a laser distance sensor and/or laser displacement sensor, for example. Suitable laser sensors 306 are available from Micro-Epsilon (e.g., optoNCDT laser sensors). Suitable ultrasonic sensors 308 are available from Texas Instruments (e.g., PGA460, TDC1000).
The mmW imaging device 100 shown in
In the embodiment shown in
The transmitter of the mmW radar sensor 120 is configured to transmit mmW radiation at the chipless RFID tag 102. The receiver of the mmW radar sensor 120 is configured to receive backscatter radiation from the chipless RFID tag 102. The processor 140 is configured to receive data respectively produced by the mmW radar sensor 120 and the auxiliary sensor arrangement 130 during relative movement between the mmW imaging device 100 and the chipless RFID tag 102. The processor 140 (or a remote processor) is also configured to reconstruct a 2-D mmW radar image 150 of the chipless RFID tag 102 using the received data. The representative 2-D mmW radar image 150 in
Although reference is made herein to the accompanying set of drawings that form part of this disclosure, one of at least ordinary skill in the art will appreciate that various adaptations and modifications of the embodiments described herein are within, or do not depart from, the scope of this disclosure. For example, aspects of the embodiments described herein may be combined in a variety of ways with each other. Therefore, it is to be understood that, within the scope of the appended claims, the claimed invention may be practiced other than as explicitly described herein.
All references and publications cited herein are expressly incorporated herein by reference in their entirety into this disclosure, except to the extent they may directly contradict this disclosure. Unless otherwise indicated, all numbers expressing feature sizes, amounts, and physical properties used in the specification and claims may be understood as being modified either by the term “exactly” or “about.” Accordingly, unless indicated to the contrary, the numerical parameters set forth in the foregoing specification and attached claims are approximations that can vary depending upon the desired properties sought to be obtained by those skilled in the art utilizing the teachings disclosed herein or, for example, within typical ranges of experimental error.
The recitation of numerical ranges by endpoints includes all numbers subsumed within that range (e.g. 1 to 5 includes 1, 1.5, 2, 2.75, 3, 3.80, 4, and 5) and any range within that range. Herein, the terms “up to” or “no greater than” a number (e.g., up to 50) includes the number (e.g., 50), and the term “no less than” a number (e.g., no less than 5) includes the number (e.g., 5).
The terms “coupled” or “connected” refer to elements being attached to each other either directly (in direct contact with each other) or indirectly (having one or more elements between and attaching the two elements). Either term may be modified by “operatively” and “operably,” which may be used interchangeably, to describe that the coupling or connection is configured to allow the components to interact to carry out at least some functionality (for example, a radio chip may be operatively coupled to an antenna element to provide a radio frequency electromagnetic signal for wireless communication).
Terms related to orientation, such as “top,” “bottom,” “side,” and “end,” are used to describe relative positions of components and are not meant to limit the orientation of the embodiments contemplated. For example, an embodiment described as having a “top” and “bottom” also encompasses embodiments thereof rotated in various directions unless the content clearly dictates otherwise.
Reference to “one embodiment,” “an embodiment,” “certain embodiments,” or “some embodiments,” etc., means that a particular feature, configuration, composition, or characteristic described in connection with the embodiment is included in at least one embodiment of the disclosure. Thus, the appearances of such phrases in various places throughout are not necessarily referring to the same embodiment of the disclosure. Furthermore, the particular features, configurations, compositions, or characteristics may be combined in any suitable manner in one or more embodiments.
The words “preferred” and “preferably” refer to embodiments of the disclosure that may afford certain benefits, under certain circumstances. However, other embodiments may also be preferred, under the same or other circumstances. Furthermore, the recitation of one or more preferred embodiments does not imply that other embodiments are not useful and is not intended to exclude other embodiments from the scope of the disclosure.
As used in this specification and the appended claims, the singular forms “a,” “an,” and “the” encompass embodiments having plural referents, unless the content clearly dictates otherwise. As used in this specification and the appended claims, the term “or” is generally employed in its sense including “and/or” unless the content clearly dictates otherwise.
As used herein, “have,” “having,” “include,” “including,” “comprise,” “comprising” or the like are used in their open-ended sense, and generally mean “including, but not limited to.” It will be understood that “consisting essentially of,” “consisting of,” and the like are subsumed in “comprising,” and the like. The term “and/or” means one or all of the listed elements or a combination of at least two of the listed elements.
The phrases “at least one of,” “comprises at least one of,” and “one or more of” followed by a list refers to any one of the items in the list and any combination of two or more items in the list.