Embodiments described herein relate generally to a processing system, a processing method, and a storage medium storing a program for processing an image.
There is a radar system that irradiates a walking person with an electromagnetic wave, receives the electromagnetic wave reflected by the person, and generates an image of the person. In this system, a large number of transmitting antennas and a large number of receiving antennas are arranged on both sides of a walking path of a person. A plurality of transmitting antennas and a plurality of receiving antennas are arranged on a straight line. The receiving antennas are arranged at equal intervals of a first interval. The transmitting antennas are arranged at equal intervals of a second interval that is an integral multiple of the first interval. The transmitting antennas transmit electromagnetic waves in a time division manner. An electromagnetic wave is transmitted from one transmitting antenna, the electromagnetic wave reflected by the person is received by all the receiving antennas, and thus one image is generated. Then, electromagnetic waves are sequentially transmitted from the other transmitting antennas, the electromagnetic waves reflected by the person are received by all the receiving antennas, and thus images are sequentially generated. As a result, a plurality of images, the number of which is equal to the number of transmitting antennas, is generated. The images are superposed (or simply added to each other) to generate one image of the person at a certain position.
Then, electromagnetic waves are sequentially transmitted from the transmitting antennas, the electromagnetic waves are received by all the receiving antennas, a plurality of images are generated, and the images are superposed. In this manner, one image of the walking person is generated at another position. This is sequentially repeated to generate a large number of images of the person at a large number of different positions during walking. When the large number of images at the large number of different positions are superposed, one image based on the reflected waves from the entire circumference of the person is generated.
In this system, a high definition image can be obtained by increasing the number of antennas.
However, if the number of antennas increases, the cost increases, the size of the device increases, and it may be difficult to arrange the antennas on both sides of a passage or the like. Furthermore, in order to superpose images, it is necessary to align pixels of the respective images, and the alignment lengthens the processing time.
Various embodiments will be described hereinafter with reference to the accompanying drawings.
The disclosure is merely an example and is not limited by the contents described in the embodiments below. Modifications easily conceivable by a person of ordinary skill in the art naturally come within the scope of the disclosure. In order to make the description clearer, the sizes, shapes, and the like of the respective parts may be changed from an accurate representation and illustrated schematically in the drawings. Constituent elements corresponding to each other in a plurality of drawings are denoted by like reference numerals, and their detailed descriptions may be omitted unless necessary.
In general, according to one embodiment, a processing system includes a concatenating unit configured to form a concatenated image by concatenating m images based on received signals acquired by two or more radars, and a determination unit configured to determine whether or not there is an object, based on the concatenated image. “m” is an integer equal to or more than 2.
An application example of the processing system according to the embodiment is a security system. The processing system is installed, for example, in a facility such as an airport, a station, a shopping mall, a concert hall, or an exhibition hall. The processing system determines whether a person who uses the facility possesses a dangerous article. The person is irradiated with an electromagnetic wave, and the electromagnetic wave reflected by the person is received. An image is generated based on the received signal. It is possible to determine whether there is a dangerous article, based on this image.
Radar modules 20a, 20c, and 20e are arranged on one side wall 16 of a passage 12 on which a pedestrian 10 walks. Radar modules 20b, 20d, and 20f are arranged on the other side wall 14 of the passage 12. The side wall 16 and the side wall 14 are flat surfaces parallel to each other. The longitudinal direction of the passage 12 is called an X-direction. The width direction of the passage 12 is called a Y-direction. The height direction of the side walls 14 and 16 is called a Z-direction.
It is not essential that the radar modules 20a to 20f are arranged on both side walls. At least two radar modules, for example, the radar modules 20a and 20c, may be arranged only on one side wall. The number of radar modules arranged on one side wall is not limited to three. In a case where the radar modules are arranged on both side walls, at least one radar module, for example, the radar modules 20a and 20b, may be arranged on each side wall.
In addition, the place on which the radar modules 20a to 20f are arranged is not limited to the side wall. The radar modules 20a to 20f may be arranged on, for example, a ceiling, a floor surface, a gate, or the like of a facility. Furthermore, the radar modules 20a to 20f may be arranged not on a fixed object but on a movable panel or the like. The movable panel may be installed around a person.
A camera 8 is attached to at least one of the side walls 14 and 16. The camera 8 captures an image of the pedestrian 10 on the passage 12. The position of the pedestrian 10 is determined based on the captured image of the camera 8. Instead of the camera 8, a sensor capable of detecting the pedestrian 10 on the passage 12 may be used. Alternatively, the position of the pedestrian 10 may be determined by the radar modules 20a to 20f without providing the camera 8.
The radar modules 20a, 20c, and 20e are one-dimensionally arranged on a straight line along a traveling direction of the pedestrian 10, that is, the X-direction. The radar modules 20b, 20d, and 20f are one-dimensionally arranged on a straight line along the X-direction. The positions of the radar modules 20a, 20c, and 20e in the height direction, that is, the Z-direction are the same as the positions of the radar modules 20b, 20d, and 20f in the Z-direction.
The position of the radar module 20a in the X-direction is the same as the position of the radar module 20b in the X-direction. The position of the radar module 20c in the X-direction is the same as the position of the radar module 20d in the X-direction. The position of the radar module 20e in the X-direction is the same as the position of the radar module 20f in the X-direction.
The radar module 20a includes three transmitting antennas Tx1, Tx2, and Tx3 arranged on a straight line along the X-direction, and four receiving antennas Rx1, Rx2, Rx3, and Rx4. The four receiving antennas Rx1 to Rx4 are arranged at equal intervals of an interval d. d is a half wavelength. The three transmitting antennas Tx1 to Tx3 are arranged at equal intervals of an interval 4d. An interval between the receiving antenna Rx4 and the transmitting antenna Tx1 is 2d. Examples of the receiving antennas Rx1 to Rx4 and the transmitting antennas Tx1 to Tx3 are microstrip antennas (also called patch antennas).
Note that the four receiving antennas Rx1 to Rx4 may be arranged at equal intervals of an interval 4d, and the three transmitting antennas Tx1 to Tx3 may be arranged at equal intervals of an interval d.
Furthermore, the antenna configuration, the number of antennas, and the antenna arrangement in the above description and the following description are merely examples. Other antenna configurations, other numbers of antennas, or other antenna arrangements may be used. For example, the transmitting antennas Tx1 to Tx3 and the receiving antennas Rx1 to Rx4 may be one-dimensionally arranged in the Z-direction, or may be two-dimensionally arranged in the X-Z plane.
The receiving antennas Rx1 to Rx4 receive the electromagnetic waves transmitted from the transmitting antennas Tx1 to Tx3, and generate received signals of the electromagnetic waves from the three transmitting antennas Tx1 to Tx3 to form a virtual array antenna (also called a MIMO (Multi-Input and Multi-Output) array antenna). The virtual array antenna includes virtual antennas V1 to V12 arranged at equal intervals d on a straight line along the X-direction. By using the MIMO array antenna, it is possible to increase the number of antennas in the X-direction, also to increase an aperture length, and to improve the angular resolution.
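As an illustration of how the twelve virtual antenna positions follow from the antenna arrangement described above, a minimal sketch in Python is shown below. The half-wavelength interval d is used as the unit of length, and the virtual element position is taken as the sum of the positions of the paired transmitting and receiving antennas, the convention that reproduces the interval d stated above; the script is illustrative only and not part of the embodiment.

```python
import numpy as np

d = 1.0  # half-wavelength interval, used here as the unit of length

# Receiving antennas Rx1 to Rx4 at interval d, a gap of 2d between Rx4 and Tx1,
# and transmitting antennas Tx1 to Tx3 at interval 4d, all along the X-direction.
rx = np.array([0.0, 1.0, 2.0, 3.0]) * d
tx = rx[-1] + 2.0 * d + np.array([0.0, 4.0, 8.0]) * d   # 5d, 9d, 13d

# Virtual (MIMO) array: one element per Tx/Rx pair, placed at tx + rx.
# The absolute offset is irrelevant; only the relative spacing matters.
virtual = np.sort((tx[:, None] + rx[None, :]).ravel())

print(virtual)            # twelve positions: 5d, 6d, ..., 16d
print(np.diff(virtual))   # equal intervals of d
```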
The radar module 20a may include a plurality of transmitting antennas and a plurality of receiving antennas arranged two-dimensionally in the X-direction and the Z-direction.
The irradiation scheduler 34 supplies, to the radar modules 20a to 20f, a timing signal for controlling irradiation timings of an electromagnetic wave of the radar modules 20a to 20f. The camera 8 is connected to the irradiation scheduler 34. Based on the captured image of the camera 8, the irradiation scheduler 34 detects that the pedestrian 10 has entered or left an inspection area of the radar modules 20a to 20f. The inspection area is an area in which the radar modules 20a to 20f can receive an electromagnetic wave that has been transmitted by the radar modules 20a to 20f and reflected by the pedestrian 10. The irradiation scheduler 34 may detect the position and the walking speed of the pedestrian 10 based on the captured image of the camera 8. The irradiation scheduler 34 supplies the timing signal for controlling the irradiation timings of the electromagnetic wave of the radar modules 20a to 20f to the radar modules 20a to 20f based on the detection result.
The image generation units 22a to 22f are connected to an image concatenating unit 24. The image concatenating unit 24 generates one image by concatenating a plurality of images generated by the image generation units 22a to 22f. The image concatenating unit 24 arranges the images generated by the image generation units 22a to 22f in different regions of one image. The image concatenating unit 24 is connected to a determination unit 26.
The determination unit 26 determines whether or not there is a predetermined object, based on the one image generated by the image concatenating unit 24. The predetermined object is a dangerous article that is not permitted to be possessed in this area. The dangerous article is metal such as a handgun or a knife, or powder such as an explosive. The predetermined object may include powder such as narcotics or an illegal article such as a bar.
The determination unit 26 is formed, for example, by a convolutional neural network. The convolutional neural network learns a determination result of possession of a dangerous article by using a large number of pieces of image data as teacher data. When the image of the pedestrian 10 is input to the determination unit 26, the determination unit 26 outputs a determination result as to whether or not the person in the image possesses a dangerous article. The convolutional neural network may be trained to determine not only whether or not the person possesses a dangerous article but also whether or not the person possesses each type of dangerous article.
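As a rough sketch of the kind of convolutional neural network by which the determination unit 26 may be formed, the following Python (PyTorch) snippet builds a small two-class classifier. The layer configuration, the single-channel 128×128 input size, and the two output classes are illustrative assumptions; the embodiment does not specify a particular network structure.

```python
import torch
import torch.nn as nn

# Hypothetical input: a single-channel concatenated image resized to 128x128.
# Output: two classes (dangerous article present / not present).
model = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(32, 2),
)

x = torch.randn(1, 1, 128, 128)       # dummy concatenated image (batch of one)
logits = model(x)                     # shape (1, 2)
prediction = logits.argmax(dim=1)     # 0: no dangerous article, 1: dangerous article
print(prediction)
```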
The determination unit 26 is connected to a display device 28. The display device 28 displays the determination result by the determination unit 26 and generates an alarm sound. A monitoring person can take necessary countermeasures by looking at the determination result or listening to the alarm sound.
The image generation units 22a to 22f, the image concatenating unit 24, and the determination unit 26 may be formed by hardware, or may be formed by a computer that functions as the image generation units 22a to 22f, the image concatenating unit 24, and the determination unit 26 by executing software.
The components related to the processing of the processing system according to the embodiment, that is, the image generation units 22a to 22f, the image concatenating unit 24, and the determination unit 26 may be formed by a cloud system.
The radar modules 20a to 20f have a radar function of a linear frequency modulated continuous wave (L-FMCW) type in which the frequency of an electromagnetic wave linearly increases with the lapse of time.
The reference signal generation unit 32 includes a synthesizer 46 and a clock generator 47. The synthesizer 46 generates an L-FMCW signal (called a chirp signal below).
The radar module 20a includes the transmitting antennas Tx1 to Tx3, a transmitting circuit 42, the receiving antennas Rx1 to Rx4, and a receiving circuit 44.
The transmitting circuit 42 includes amplifiers 481, 482, and 483. The chirp signal generated by the synthesizer 46 is supplied to the amplifiers 481, 482, and 483. The amplifiers 481, 482, and 483 are connected to the transmitting antennas Tx1, Tx2, and Tx3, respectively.
The receiving circuit 44 includes amplifiers 521, 522, 523, and 524, mixers 541, 542, 543, and 544, low-pass filters (LPF) 561, 562, 563, and 564, A/D converters (ADC) 581, 582, 583, and 584, and fast Fourier transform circuits (FFT) 601, 602, 603, and 604.
The receiving antennas Rx1, Rx2, Rx3, and Rx4 are connected to the amplifiers 521, 522, 523, and 524, respectively. The output signals of the amplifiers 521, 522, 523, and 524 are input to first input terminals of the mixers 541, 542, 543, and 544, respectively. The chirp signal generated by the synthesizer 46 is input to second input terminals of the mixers 541, 542, 543, and 544. Each of the mixers 541, 542, 543, and 544 multiplies the received signal by the chirp signal to generate an intermediate frequency (IF) signal.
The IF signals output from the mixers 541, 542, 543, and 544 are supplied to the FFTs 601, 602, 603, and 604 via the LPFs 561, 562, 563, and 564 and the ADCs 581, 582, 583, and 584, respectively. Each of the FFTs 601, 602, 603, and 604 detects the intensity of an electromagnetic wave received by each of the receiving antennas Rx1, Rx2, Rx3, and Rx4.
The clock signal generated by the clock generator 47 is supplied to the transmitting circuits 42 and the receiving circuits 44 of all the radar modules 20a to 20f. Operation timings of the transmitting circuits 42 and the receiving circuits 44 of all the radar modules 20a to 20f are controlled by the clock signal. Therefore, the transmitting circuits 42 and the receiving circuits 44 of all the radar modules 20a to 20f execute a transmission operation and a reception operation in synchronization.
A transmitted signal St(t) of the chirp signal is represented by Equation 1.
St(t)=cos[2π(fc×t+γt²/2)] Equation 1
The chirp rate γ is represented by Equation 2.
γ=fb/Tb Equation 2
A reflected wave from an object away from the radar modules 20a to 20f by a distance R is observed with a delay of Δt=2R/c from a transmitting timing. A letter “c” indicates the speed of light. The received signal Sr(t) is represented by Equation 3 if the reflection intensity of the object is set as “a”.
Sr(t)=a×cos[2πfc(t−Δt)+πγ(t−Δt)²] Equation 3
As illustrated in the drawing, the IF signal z(t) obtained by mixing the received signal with the chirp signal is represented by Equation 4.
z(t)=a×cos(2πΔtγt) Equation 4
The reflection intensity in the frequency domain can be calculated by performing, in the FFTs 601 to 604, an FFT of the time-domain IF signal z(t) represented by Equation 4. Thus, the amplitude at each point in the frequency domain, which is the result of the FFT of the IF signal, corresponds to the reflection intensity for each distance from the radar modules 20a to 20f. The frequency fif of the IF signal and the distance R have the relationship of Equation 5.
fif=Δtγ=2Rγ/c Equation 5
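A numerical sketch of Equations 4 and 5 in Python is shown below: an IF signal for a single reflector is synthesized, transformed by an FFT, and the peak frequency is converted back to a distance. The chirp bandwidth fb, the chirp duration Tb, the sampling rate, and the target distance are assumed values for illustration and are not taken from the embodiment.

```python
import numpy as np

c = 3.0e8                  # speed of light [m/s]
fb, Tb = 1.0e9, 100e-6     # assumed sweep bandwidth [Hz] and chirp duration [s]
gamma = fb / Tb            # chirp rate (Equation 2)

R_true = 1.5               # assumed distance to the reflecting object [m]
a = 1.0                    # reflection intensity
delta_t = 2 * R_true / c   # round-trip delay

fs = 2.0e6                 # assumed sampling rate of the ADC [Hz]
t = np.arange(0, Tb, 1 / fs)
z = a * np.cos(2 * np.pi * gamma * delta_t * t)   # IF signal (Equation 4)

spectrum = np.abs(np.fft.rfft(z))
freqs = np.fft.rfftfreq(len(z), 1 / fs)
f_if = freqs[np.argmax(spectrum)]                 # detected IF frequency

R_est = f_if * c / (2 * gamma)                    # Equation 5 solved for R
print(f"true R = {R_true} m, estimated R = {R_est:.2f} m")
```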
As the electromagnetic wave used in the embodiment, an electromagnetic wave having a wavelength of 1 mm to 30 mm may be used. An electromagnetic wave having a wavelength of 1 mm to 10 mm is called a millimeter wave. An electromagnetic wave having a wavelength of 10 mm to 100 mm is called a microwave. Further, an electromagnetic wave having a wavelength of 100 micrometers to 1 millimeter, called a terahertz wave, may be used.
The electromagnetic wave is reflected by the skin of the pedestrian 10. The electromagnetic wave is also reflected by metal such as a handgun or a knife. The reflectance of metal is higher than the reflectance of skin. The intensity of the reflected wave of metal is higher than the intensity of the reflected wave of skin. The electromagnetic wave is absorbed by a powder such as an explosive. The reflectance of the powder is lower than the reflectance of the skin. The intensity of the reflected wave is determined by the type of substance at a point where the electromagnetic wave is reflected, such as skin, metal, or powder. Therefore, it is possible to detect the type of substance at a reflection point from the intensity of the reflected wave.
In the first transmission for the radar module 20a, the irradiation scheduler 34 causes the radar module 20a to transmit an electromagnetic wave via the transmitting antenna Tx1. All the receiving antennas Rx1 to Rx4 of the radar modules 20a to 20f receive reflected waves substantially simultaneously. In the second transmission, the irradiation scheduler 34 causes the radar module 20a to transmit an electromagnetic wave via the transmitting antenna Tx2.
All the receiving antennas Rx1 to Rx4 of the radar modules 20a to 20f receive reflected waves substantially simultaneously. In the third transmission, the irradiation scheduler 34 causes the radar module 20a to transmit an electromagnetic wave via the transmitting antenna Tx3. All the receiving antennas Rx1 to Rx4 of the radar modules 20a to 20f receive reflected waves substantially simultaneously.
When the third transmission of the radar module 20a completes, the image generation units 22a to 22f generate one image from each of the received signals of the radar modules 20a to 20f. An image based on a received signal corresponding to a combination of a transmitting module and a receiving module in which the transmitting module is the radar module 20a and the receiving module is the radar module 20a is called an A-A image. That is, when the third transmission completes, the image generation units 22a to 22f generate an A-A image, an A-B image, an A-C image, an A-D image, an A-E image, and an A-F image, respectively. The image generation units 22a to 22f supply the A-A image, the A-B image, the A-C image, the A-D image, the A-E image, and the A-F image to the image concatenating unit 24. The image concatenating unit 24 includes a first memory that stores images, and writes the six supplied images into the first memory.
Thereafter, the irradiation scheduler 34 causes each of the other radar modules 20b to 20f to transmit an electromagnetic wave three times.
When the third transmission of the radar module 20f completes, the image generation units 22a to 22f generate an F-A image, an F-B image, an F-C image, an F-D image, an F-E image, and an F-F image from the received signals of the radar modules 20a to 20f. The image generation units 22a to 22f supply the F-A image, the F-B image, the F-C image, the F-D image, the F-E image, and the F-F image to the image concatenating unit 24. The image concatenating unit 24 writes the six supplied images into the first memory. When the transmission from the radar modules 20a to 20f completes, the first memory of the image concatenating unit 24 stores thirty six images from the A-A image to the F-F image. The image concatenating unit 24 concatenates thirty six images to form one image. “Concatenating” means that thirty six images are arranged in different regions of one image.
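A minimal sketch, in Python, of the concatenation of the thirty six images is shown below. The 6×6 grid layout, the image size, and the use of NumPy arrays are assumptions for illustration; the embodiment only requires that the thirty six images be arranged in different regions of one image.

```python
import numpy as np

# images[(i, j)]: the image generated when radar module i transmits and radar
# module j receives (i, j = 0..5 corresponding to the modules 20a..20f).
# Random data stands in for the actual images, which all share one shape.
H, W = 64, 64
images = {(i, j): np.random.rand(H, W) for i in range(6) for j in range(6)}

# Assumed 6x6 layout: row i holds the images transmitted by module i,
# column j the images received by module j (A-A at the upper left, F-F at
# the lower right).
rows = [np.hstack([images[(i, j)] for j in range(6)]) for i in range(6)]
concatenated = np.vstack(rows)

print(concatenated.shape)   # (384, 384): one image containing 36 regions
```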
The determination unit 26 learns a determination result of possession of a dangerous article by using a large number of concatenated images as teacher data. The teacher data is data related to the concatenated image of thirty six images for the pedestrian 10 located in the inspection area. The teacher data is independent of the position of the pedestrian 10 in the inspection area.
During a period in which the pedestrian 10 passes through the inspection area, the radar modules 20a to 20f can transmit an electromagnetic wave a plurality of times. During the period in which the pedestrian 10 passes through the inspection area, the image concatenating unit 24 can generate a plurality of concatenated images in which the positions of the pedestrian 10 in the inspection area are different from each other.
The image concatenating unit 24 includes a second memory. The image concatenating unit 24 divides an area of the second memory for storing an image into, for example, 1×8 middle regions. The image concatenating unit 24 writes the first round image into the second memory such that the first round image is arranged in, for example, the middle region at the left end of one image. Similarly, the image concatenating unit 24 writes the second to eighth round images into the second memory such that the second to eighth round images are arranged in, for example, the second to eighth middle regions from the left end of one area. The image concatenating unit 24 supplies an expanded concatenated image obtained by concatenating eight concatenated images to each other, to the determination unit 26.
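A sketch of forming the expanded concatenated image from the eight round images, under the 1×8 layout described above, might look as follows in Python; the image size is an assumption.

```python
import numpy as np

# Eight round images (concatenated images captured at eight pedestrian
# positions), all of the same assumed shape.
round_images = [np.random.rand(384, 384) for _ in range(8)]

# 1x8 layout: the k-th round image occupies the k-th middle region from the
# left, which amounts to horizontal concatenation.
expanded = np.hstack(round_images)
print(expanded.shape)       # (384, 3072)
```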
In a case where the determination unit 26 performs a dangerous article determination based on the expanded concatenated image, the determination unit 26 has learned the determination result of possession of the dangerous article by using a large number of the expanded concatenated images as teacher data. The positions of the eight concatenated images (included in the expanded concatenated image as the teacher data) in the inspection area are defined by boundaries for substantially equally dividing the inspection area into seven pieces in the X-direction.
The irradiation scheduler 34 calculates the walking speed of the pedestrian 10 based on the captured image of the camera 8. The irradiation scheduler 34 determines image capturing timings of the second to eighth round images from the walking speed and the length of the inspection area in the X-direction. The irradiation scheduler 34 causes the radar modules 20a to 20f to transmit an electromagnetic wave at the image capturing timings of the second to eighth round images, respectively. As a result, the image concatenating unit 24 can generate the concatenated image at the same position in the inspection area as the position of the teacher data in the inspection area.
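A sketch of the timing calculation in Python is shown below. The inspection area length and the walking speed are assumed values; the first round is taken to be captured at the entry of the inspection area, and the second to eighth rounds at the boundaries that substantially equally divide the area into seven pieces in the X-direction.

```python
# Assumed values for illustration only.
area_length_x = 3.5   # length of the inspection area in the X-direction [m]
walking_speed = 1.2   # walking speed of the pedestrian 10 estimated from the camera 8 [m/s]

# Capture positions x_k = k * area_length_x / 7 (k = 0..7): eight positions
# whose boundaries substantially equally divide the area into seven pieces.
# The capture timing of each round, measured from entry, is x_k / walking_speed.
for k in range(8):
    x_k = k * area_length_x / 7
    t_k = x_k / walking_speed
    print(f"round {k + 1}: position {x_k:.2f} m, {t_k:.2f} s after entry")
```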
The determination unit 26 can capture a change in the characteristics of the image of the pedestrian 10 over a longer time by performing the dangerous article determination using the expanded concatenated image, and thus can perform a more accurate dangerous article determination.
In the above description, since the irradiation scheduler 34 controls the transmission timings of the radar modules 20a to 20f, only the first to eighth round images are supplied to the image concatenating unit 24. However, the walking speed of the pedestrian 10 may fluctuate, and it may not be possible to obtain the timings at which the inspection area is substantially equally divided into seven pieces in the X-direction. In such a case, the irradiation scheduler 34 may cause the radar modules 20a to 20f to transmit an electromagnetic wave at a constant period, and the image concatenating unit 24 may generate a number of concatenated images equal to or more than eight. The image concatenating unit 24 may then generate the expanded concatenated image by selecting eight concatenated images from the generated concatenated images and concatenating the selected concatenated images.
The expanded concatenated image includes concatenated images generated at different positions of the pedestrian 10 in the inspection area. Depending on the position of the pedestrian 10, any of the radar modules 20a to 20f may not be able to receive the reflected wave contributing to image creation, or may not be able to receive the reflected wave at all.
Therefore, in a case where the irradiation scheduler 34 detects that the pedestrian 10 is located at the position illustrated in
Therefore, in a case where the irradiation scheduler 34 detects that the pedestrian 10 is located at the position illustrated in
Therefore, in a case where the irradiation scheduler 34 detects that the pedestrian 10 is located at the position illustrated in
When a plurality of images based on the received signals of the radar modules 20a to 20f are concatenated to form one image, alignment of pixels is not necessary. Thus, the transmitting/receiving direction of the radar modules 20a to 20f can be changed. In a case where a plurality of images is superposed to form one image, alignment of pixels is necessary. Thus, it is not possible to freely change the transmitting/receiving direction of the radar modules 20a to 20f.
According to the first embodiment, a plurality of images based on received signals corresponding to combinations each formed by one transmitting antenna among a plurality of transmitting antennas included in at least two radar modules and one receiving antenna among a plurality of receiving antennas included in at least two radar modules are concatenated to form one concatenated image. It is determined whether or not there is an object, based on the concatenated image. Therefore, a processing system capable of determining whether or not there is an object in a short time at low cost is provided.
A plurality of concatenated images generated at different times may be concatenated to generate the expanded concatenated image. When the determination is made based on the expanded concatenated image, it is more accurately determined whether or not there is an object.
Depending on the position of the object, there may be a combination of a transmitting antenna and a receiving antenna that does not contribute to image generation. An image based on a received signal corresponding to such a combination is removed from the concatenation target. Since the concatenated image does not include an image that does not contribute to the object determination, the size of the concatenated image is suppressed to the minimum necessary.
When images based on the received signals of the radar modules 20a to 20f are concatenated to form one image, alignment of pixels is not necessary. Thus, the transmitting/receiving direction of the radar modules 20a to 20f can be freely changed. Therefore, it is possible to reduce the number of combinations of transmitting antennas and receiving antennas that do not contribute to image generation.
In the second embodiment, nine radar modules 20a, 20c, 20e, 20g, 20i, 20k, 20m, 20o, and 20q are two-dimensionally arranged on the side wall 16. Nine radar modules 20b, 20d, 20f, 20h, 20j, 20l, 20n, 20p, and 20r are two-dimensionally arranged on the side wall 14. Each of the radar modules 20a to 20r includes three transmitting antennas Tx1 to Tx3 and four receiving antennas Rx1 to Rx4 which are arranged one-dimensionally in an X-direction, as illustrated in
An example of an electrical configuration of the processing system according to the second embodiment is similar to the electrical configuration in
Note that, in a case where the receiving antennas and the transmitting antennas are two-dimensionally arranged in each of the radar modules 20a to 20r, the image generation unit 22 can generate a plurality of three-dimensional slice images shifted in the depth direction, the horizontal direction, and a vertical direction (elevation angle direction), based on the received signals of the two-dimensionally arranged radar modules 20a to 20r. The elevation angle corresponds to a Z-direction. The depth direction is a direction perpendicular to the vertical direction and the horizontal direction. The image generation unit 22 may acquire not only the slice images related to the azimuth angle, the elevation angle, and the distance but also a slice image related to speed information by signal processing in the Doppler (speed) dimension.
Outputs of the eighteen image generation units 22 are connected to the image concatenating unit 24. The image concatenating unit 24 concatenates the output images of the eighteen image generation units 22 to generate one concatenated image. In a case where the receiving antennas and the transmitting antennas are one-dimensionally arranged in each of the radar modules 20a to 20r, the image concatenating unit 24 can generate a distance-azimuth (elevation) angle map. In a case where the receiving antennas and the transmitting antennas are two-dimensionally arranged in each of the radar modules 20a to 20r, the image concatenating unit 24 can generate a distance-elevation angle map and an azimuth angle-elevation angle map in addition to the distance-azimuth angle map. Furthermore, in a case where a slice image related to speed information is acquired, a map in which an azimuth angle, an elevation angle, a distance, and a speed are associated with each other can also be generated.
The image concatenating unit 24 includes a third memory. The image concatenating unit 24 divides one area of the third memory for storing an image into, for example, 3×6 (=18) middle regions in accordance with the number of the radar modules 20a to 20r. The image concatenating unit 24 writes the A image group generated in a case where the radar module 20a is the transmitting module, into the third memory such that the A image group is arranged in, for example, the upper left middle region of one image. Similarly, the image concatenating unit 24 writes the image groups generated in a case where each of the other radar modules 20b to 20r is the transmitting module, into the third memory such that the image groups are arranged in predetermined middle regions of the one image. For example, the image concatenating unit 24 writes an R image group generated in a case where the radar module 20r is the transmitting module, into the third memory such that the R image group is arranged in, for example, the lower right middle region of one image.
According to the second embodiment, a plurality of three-dimensional images based on received signals corresponding to combinations each formed by one transmitting antenna among a plurality of transmitting antennas included in at least two radar modules two-dimensionally arranged and one receiving antenna among a plurality of receiving antennas included in at least two radar modules are concatenated to form one concatenated image. It is determined whether or not there is an object, based on the concatenated image. Therefore, a processing system capable of determining whether or not there is an object in a short time at low cost is provided.
Also in the second embodiment, as described in relation to the first embodiment with reference to
Furthermore, also in the second embodiment, as described in relation to the first embodiment with reference to
In a case where a plurality of images is superposed to form one image, alignment of pixels is necessary. Thus, it is not possible to freely change the directions of the radar modules 20a to 20r. In order to align the pixels of a three-dimensional image, it is necessary to align the resolutions of the azimuth angle, the elevation angle, and the distance of the three-dimensional image. In this case, it is not possible to freely set the directions of the radar modules 20a to 20r.
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel embodiments described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions, and changes in the form of the embodiments described herein may be made without departing from the spirit of the inventions. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the inventions.
Number | Date | Country | Kind |
---|---|---|---|
2022-044128 | Mar 2022 | JP | national |
This application is a Continuation Application of PCT Application No. PCT/JP2022/030739, filed Aug. 12, 2022 and based upon and claiming the benefit of priority from Japanese Patent Application No. 2022-044128, filed Mar. 18, 2022, the entire contents of all of which are incorporated herein by reference.
 | Number | Date | Country
---|---|---|---
Parent | PCT/JP2022/030739 | Aug 2022 | US
Child | 18457361 | | US