The present disclosure relates to a measurement device.
There is known a measurement device that measures a distance to a target by emitting laser light from a light emitting unit and exposing an exposure unit to the light reflected back from the target. In Patent Literature 1, noise due to light other than the reflected light is reduced by operating the imaging sensor near the position where the reflected light arrives.
Patent Literature 1: JP2019-191126A
For example, when a measurement target is a vehicle, the light reflected by the target may be reflected again by a road surface at a place where the road surface is wet due to rain or the like, or at a place where the road surface is smoothly paved, such as a parking lot. Accurate measurement may not be performed due to influence of this road surface reflection. In Patent Literature 1, influence of light other than the reflected light can be reduced; however, when the positions of the light source and the imaging sensor are misaligned, the arrival position of the reflected light shifts depending on the distance to the target. Therefore, there is a possibility of exposure to light reflected from the road surface.
An object of the present disclosure is to reduce influence of road surface reflection.
According to an aspect of the present disclosure for achieving the above object, there is provided a measurement device including: a light emitting unit configured to emit light to each of a plurality of measurement areas at different heights; an exposure unit having a plurality of exposure areas respectively corresponding to the measurement areas at different heights, and exposed to reflected light from the measurement areas by a plurality of light receiving elements respectively provided for the exposure areas; and a distance calculation unit configured to calculate a distance to a target in the measurement areas based on an exposure result of the exposure unit.
According to the present disclosure, influence of road surface reflection can be reduced.
Hereinafter, an embodiment of the present disclosure will be described with reference to the drawings.
The measurement device 1 includes the light emitting unit 10, the imaging unit 20, and the control unit 30.
The light emitting unit 10 emits (projects) light to a space to be imaged. The light emitting unit 10 controls light emission according to an instruction from the control unit 30. The light emitting unit 10 includes a light source 12 and a projecting optical system (not shown) that emits light generated by the light source 12.
The light source 12 includes light emitting elements 122 (to be described later). The light source 12 emits pulsed laser light under the control of the control unit 30. Hereinafter, this pulsed light is also referred to as a light emission pulse. The light source 12 will be described in detail later.
The imaging unit 20 (indirect ToF camera) performs imaging based on exposure to light reflected by a target of distance measurement. The imaging unit 20 includes an imaging sensor 22 and an exposure optical system (not shown) that guides incident (exposure) light to the imaging sensor 22. The imaging unit 20 corresponds to an “exposure unit”.
The imaging sensor 22 images an object to be imaged according to an instruction from the control unit 30 and outputs image data obtained by imaging to an image acquisition unit 34 of the control unit 30. A value (pixel data) of each pixel constituting the image data indicates a signal value corresponding to an exposure amount. The imaging sensor 22 will be described in detail later.
The control unit 30 controls the measurement device 1. The control unit 30 is implemented by hardware elements and circuits such as a memory and a CPU, and implements predetermined functions by the CPU executing a program stored in the memory.
The timing control unit 32 controls a light emission timing of the light emitting unit 10 and an exposure timing of the imaging unit 20. The light emission timing and the exposure timing will be described later.
The image acquisition unit 34 acquires image data from the imaging sensor 22 of the imaging unit 20. The image acquisition unit 34 includes a memory (not shown) that stores the acquired image data.
The time calculation unit 36 calculates an arrival time (time of flight of light: ToF) from when the light source 12 of the light emitting unit 10 emits light until the reflected light reaches the imaging sensor 22 of the imaging unit 20.
The distance calculation unit 38 calculates a distance based on an exposure result of the imaging sensor 22 and the arrival time of the light. As will be described later, a distance image can be acquired by calculating a distance for each pixel.
The control unit 30 (timing control unit 32) causes the imaging sensor 22 of the imaging unit 20 to be exposed to the reflected light after a time Tdelay from emission of the light emission pulse. An exposure period is set based on the delay time Tdelay and an exposure width Gw. The exposure period is a period in which an exposure level is a high level (H level).
The time Tdelay is a time (delay time) from the emission of the light emission pulse to a start of the exposure period. The delay time Tdelay is set according to a distance to a measurement target region. That is, by setting a short time from when the light emitting unit 10 emits the light emission pulse until the imaging sensor 22 starts exposure, an image of a target (object that reflects light) in a short distance region can be acquired. Conversely, by setting a long time from when the light emitting unit 10 emits the light emission pulse until the imaging sensor 22 starts exposure, an image of a target in a long distance region can be acquired.
The exposure width Gw is a width of the exposure period (that is, a period from a start of the exposure to an end of the exposure). The width of the exposure period defines a length of the measurement target region in a measurement direction. Therefore, the smaller the exposure width Gw is, the higher a distance resolution becomes.
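The relationship between the delay time Tdelay, the exposure width Gw, and the measured region can be sketched as follows. This is a minimal Python illustration; the function name and the example timing values are ours, not from the disclosure:

```python
C = 299_792_458.0  # speed of light [m/s]

def region_bounds(t_delay, gw):
    """Near and far distance bounds [m] of the region measured by an
    exposure gate that opens t_delay seconds after the light emission
    pulse and stays open for gw seconds. Light travels the distance
    twice (out and back), hence the division by 2."""
    near = C * t_delay / 2.0
    far = C * (t_delay + gw) / 2.0
    return near, far

# Illustrative values: a 100 ns delay with a 50 ns gate covers
# roughly 15 m to 22.5 m. Halving Gw halves the region length,
# which is why a smaller Gw gives a higher distance resolution.
near, far = region_bounds(100e-9, 50e-9)
```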
The light emission and the exposure are repeated a plurality of times at a cycle Tp.
In an image obtained for each region, a target (object that reflects light) present in the region is imaged. The image for each region may be referred to as a “range image”. A value (image data) of each pixel constituting the image indicates a signal value corresponding to an exposure amount.
When there is only one measurable region per light emission, it takes time to acquire image data on a large number of regions, and the measurement time becomes long (it is difficult to increase the frame rate, i.e., frames per second (FPS)). Therefore, in the present embodiment, a plurality of exposure periods are set per light emission, and a plurality of regions are measured per light emission. For this purpose, a multi-tap (specifically, four-tap) CMOS image sensor is used as the imaging sensor 22. However, the imaging sensor 22 is not limited to the multi-tap CMOS image sensor.
The light receiving element PD is an element (for example, a photodiode) that generates charges corresponding to an exposure amount.
The signal reading unit RU1 includes a storage unit CS1, a transistor G1, a reset transistor RT1, a source follower transistor SF1, and a selection transistor SL1.
The storage unit CS1 stores the charges generated by the light receiving element PD, and includes a storage capacitor C1 and a floating diffusion FD1.
The transistor G1 is provided between the light receiving element PD and the storage unit CS1. The transistor G1 is turned on in a predetermined exposure period (for example, an exposure period 1 to be described later) and supplies the charges generated by the light receiving element PD to the storage unit CS1 based on an instruction of the timing control unit 32 of the control unit 30. Similarly, transistors G2 to G4 supply the charges generated by the light receiving element PD to the storage units CS2 to CS4, respectively, based on instructions from the timing control unit 32. That is, the transistors G1 to G4 distribute the charges generated by the light receiving element PD to the storage units CS1 to CS4 according to the exposure period.
The charges are repeatedly stored in each storage unit according to the number n of times of repetition. The charges stored in each storage unit correspond to an amount of light to which the light receiving element PD is exposed in each exposure period.
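How the transistors G1 to G4 distribute charges to the storage units CS1 to CS4 can be modeled as the overlap between the reflected pulse and each exposure gate. This is a sketch under the assumption of an ideal rectangular pulse and rectangular gates; all timing values are illustrative:

```python
def overlap(a0, a1, b0, b1):
    """Length of the overlap between intervals [a0, a1) and [b0, b1)."""
    return max(0.0, min(a1, b1) - max(a0, b0))

def distribute(tx, lw, gates):
    """Fraction of the reflected pulse energy routed to each tap.
    The pulse arrives at time tx with width lw; `gates` is a list of
    (start, end) exposure periods during which G1..G4 are on."""
    return [overlap(tx, tx + lw, g0, g1) / lw for (g0, g1) in gates]

# Four back-to-back 50 ns gates starting 100 ns after light emission.
gates = [(100e-9 + i * 50e-9, 100e-9 + (i + 1) * 50e-9) for i in range(4)]
# A pulse arriving at 170 ns straddles the exposure periods 2 and 3:
# 60% of the charge goes to CS2 and 40% to CS3 (S1 and S4 stay zero).
shares = distribute(170e-9, 50e-9, gates)
```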
When the selection transistor SL1 of the signal reading unit RU1 is selected, the signal output unit SO1 outputs a signal value corresponding to charges stored in the storage unit CS1. As shown in the drawing, the signal output unit SO1 includes an amplification circuit ZF1 that amplifies an output of the signal reading unit RU1 and an A/D conversion circuit HK1 that converts an output (analog signal) of the amplification circuit ZF1 into a digital signal. The signal output unit SO1 converts the charges (exposure amount in the exposure period) stored in the storage unit CS1 into a signal value (digital signal) corresponding to the charges and outputs the signal value to the image acquisition unit 34 of the control unit 30. The signal value (digital signal) based on the charges stored in the storage unit CS1 is a signal value according to an exposure amount in the exposure period.
With such an imaging sensor 22, four regions can be measured in one imaging operation. That is, four range images are obtained in one imaging operation. A range image obtained in one imaging operation (here, four are obtained) may be referred to as a "subframe". A plurality of regions (here, four regions) measured in one imaging operation may be referred to as a "zone".
First, the control unit 30 (timing control unit 32) causes the light emitting unit 10 to emit light at the cycle Tp.
First, images of a zone 1 (regions 1 to 4) are acquired. That is, the timing control unit 32 causes the imaging sensor 22 of the imaging unit 20 to perform exposure for each pixel of an image in the exposure periods 1 to 4.
The timing control unit 32 causes the exposure to be repeatedly performed for each cycle Tp, and stores the charges in the storage units CS1 to CS4.
The image acquisition unit 34 acquires signal values corresponding to charges stored in the storage units CS1 to CS4 via the signal output units SO1 to SO4. Then, the control unit 30 writes image data on acquired range images (subframes) of the regions 1 to 4 into an image memory of the image acquisition unit 34.
Next, similarly, the control unit 30 acquires images of the zone 2 (regions 5 to 8). Then, the control unit 30 writes image data on range images (subframes) of the regions 5 to 8 into the image memory of the image acquisition unit 34. The delay time Tdelay from a light emission timing in the zone 2 (regions 5 to 8) is set to be longer than that in the case of the zone 1 (regions 1 to 4). As described above, the number of times of repetition (the number of times of charge accumulation) is set to increase as the measurement target region becomes farther away. Although the light emission pulses (light emission timings) differ between the region 4 and the region 5, the exposure periods of the region 4 and the region 5 are consecutive with the light emission pulse as a reference. In this way, consecutive exposure periods are not limited to exposure periods having the same light emission timing, and may include exposure periods having different light emission timings.
By performing the above operation up to a region N, images up to the region N (images of all the regions) are acquired.
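The zone-by-zone acquisition described above can be summarized in sketch form. The repetition counts and the relation between zone index and delay time are illustrative assumptions, since the text states only that the repetition count grows with distance:

```python
def acquire_plan(n_regions, lw, taps=4):
    """Plan the acquisition of all regions. Each imaging operation
    covers `taps` consecutive regions (one zone); the exposure delay
    and the number of light-emission repetitions grow with distance."""
    plan = []
    for zone, first in enumerate(range(0, n_regions, taps)):
        plan.append({
            "zone": zone + 1,
            "t_delay": first * lw,          # gate 1 opens at the zone start
            "n_repeat": 100 * (zone + 1),   # illustrative accumulation count
            "regions": list(range(first + 1, min(first + taps, n_regions) + 1)),
        })
    return plan

plan = acquire_plan(8, 50e-9)  # zones 1 and 2 cover regions 1-4 and 5-8
```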
Here, a case where the reflected light is used for exposure in the exposure period 2 and the exposure period 3 will be described (the reflected light may arrive in periods different from the exposure period 2 and the exposure period 3; for example, the reflected light may be used for exposure in the exposure period 1 and the exposure period 2).
In the exposure 1, a region (region 1) defined by a delay time T1 and the exposure width Gw (=Lw) with respect to a start of light emission of the light emission pulse is measured. In this example, the reflected light does not arrive in the exposure period 1, so no charges are stored in the storage unit CS1 for the pixel 221 of the imaging sensor 22 (a signal value S1 acquired in the exposure period 1 is zero).
The exposure period 2 corresponding to a region (region 2) next to the region 1 is set in the exposure 2. The pixel 221 of the imaging sensor 22 acquires a signal value S2 corresponding to an exposure amount of the reflected light in the exposure period 2.
The exposure period 3 corresponding to a region (region 3) next to the region 2 is set in the exposure 3. The pixel 221 of the imaging sensor 22 acquires a signal value S3 corresponding to an exposure amount of the reflected light in the exposure period 3.
An exposure period 4 corresponding to a region (region 4) next to the region 3 is set in the exposure 4. In this example, the reflected light does not arrive in the exposure period 4, so no charges are stored in the storage unit CS4 for the pixel 221 of the imaging sensor 22 (a signal value S4 acquired in the exposure period 4 is zero).
The image acquisition unit 34 of the control unit 30 acquires the signal values S1 to S4 (signal values according to the charges stored in the storage units CS1 to CS4) of the pixels 221 from the imaging sensor 22. Accordingly, the image acquisition unit 34 acquires the image data on the regions 1 to 4. Similarly, the image acquisition unit 34 acquires the images up to the region N (images of all the regions).
The time calculation unit 36 of the control unit 30 calculates an arrival time Tx of the reflected light. Specifically, first, the time calculation unit 36 specifies, from the signal values S1 to S4 (signal values S1 to SN), the signal values of exposure to the reflected light. For example, the time calculation unit 36 specifies the pair of signal values that correspond to two consecutive exposure periods and have the largest combined exposure amount. When the signal value corresponding to an exposure period i in which the reflected light starts exposure is Si, the two signal values Si and Si+1 are specified. Here, the signal values S2 and S3 correspond to the signal values Si and Si+1 of exposure to the reflected light.
The time calculation unit 36 uses the signal values Si and Si+1 to calculate the flight time (hereinafter also referred to as an arrival time) Tx of the light by the following Equation (1).
The distance L to the target is calculated based on the arrival time Tx. Since the light travels twice the distance L during the arrival time Tx, where Co is the speed of light, L=(Co×Tx)/2 . . . (2).
The distance calculation unit 38 calculates the distance L to the target by Equation (2). In particular, the distance calculation unit 38 according to the present embodiment calculates the distance L to the target in a measurement area for each measurement area to be described later.
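Equation (1) itself is not reproduced in this text. A sketch of the calculation, assuming the common two-gate linear interpolation for pulsed indirect ToF (the split of charge between the two consecutive gates locates the pulse edge), together with Equation (2), might look like this in Python; the function names and example values are ours:

```python
C = 299_792_458.0  # speed of light Co [m/s]

def arrival_time(signals, t_starts, gw):
    """Estimate the arrival time Tx of the reflected pulse.
    `signals` are the per-gate values S1..SN and `t_starts` the start
    times of the corresponding exposure periods. The pair of
    consecutive gates with the largest combined signal is taken as
    (Si, Si+1); the fraction Si+1/(Si+Si+1) then gives the offset of
    the pulse within gate i (an assumed interpolation, since
    Equation (1) is not shown)."""
    i = max(range(len(signals) - 1), key=lambda k: signals[k] + signals[k + 1])
    si, sj = signals[i], signals[i + 1]
    return t_starts[i] + gw * sj / (si + sj)

def distance(tx):
    """Equation (2): light covers 2L during Tx, so L = (Co * Tx) / 2."""
    return C * tx / 2.0

# S2 and S3 carry the reflected light, as in the example in the text.
tx = arrival_time([0.0, 30.0, 20.0, 0.0],
                  [100e-9, 150e-9, 200e-9, 250e-9], 50e-9)
L = distance(tx)  # about 25.5 m for Tx = 170 ns
```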
The light source 12 includes a light source substrate 121 and a plurality of light emitting elements 122.
The light source substrate 121 is a plate-shaped member on which the light emitting elements 122 are arranged. The light source substrate 121 has a surface facing the measurement direction, and the plurality of light emitting elements 122 are arranged on the surface.
The light emitting element 122 is an element that emits light when supplied with a drive current. Similarly to the pixels 221 of the imaging sensor 22, the plurality of light emitting elements 122 are two-dimensionally arranged on the surface of the light source substrate 121. More specifically, the plurality of light emitting elements 122 are arranged to be divided into five areas (light emitting areas 1 to 5) in an upper-lower direction so as to correspond to a plurality of (five in the present embodiment) measurement areas (to be described later) at different heights in the vertical direction. In other words, the light source 12 (light emitting unit 10) according to the present embodiment has five light emitting areas respectively corresponding to five measurement areas at different heights.
For example, a light emitting area 1 corresponds to the lowest measurement area 1 among the five measurement areas. A light emitting area 2 corresponds to a measurement area 2 above the measurement area 1. Similarly, light emitting areas 3 to 5 correspond to measurement areas 3 to 5, respectively.
In each light emitting area, the light emitting elements 122 are arranged in the horizontal direction (left-right direction). In the present embodiment, the control unit 30 causes the light emitting elements 122 to emit light on an area-by-area basis, with the horizontally arranged light emitting elements 122 of each area emitting light together. Accordingly, the light source 12 (light emitting unit 10) can emit light to the measurement area corresponding to each light emitting area.
However, the method of emitting light to each measurement area is not limited to the above-described method. For example, a method (scanning method) may be used in which the emission position of a row of light emitting elements arranged in the horizontal direction is moved in the height direction (vertical direction) according to switching of the measurement area.
The imaging sensor 22 of the imaging unit 20 has five exposure areas (exposure areas 1 to 5) respectively corresponding to the five measurement areas at different heights.
As shown in the drawing, an arrival range of the reflected light is wider than the exposure area. Accordingly, the reflected light can also be used for exposure in the pixel 221 (light receiving element PD) at an end of the exposure area.
If the light source and the imaging sensor were arranged at different heights, the arrival position of the reflected light on the imaging sensor would shift depending on the distance to the target, and the corresponding exposure area might not be exposed.
In the present embodiment, the light source 12 and the imaging sensor 22 are disposed at the same height. More specifically, a light emitting area of the light source 12 and the exposure area of the imaging sensor 22 corresponding to the light emitting area are disposed at the same height. Accordingly, the correspondence relationship among the light emitting area, a measurement area, and the exposure area is not influenced by the distance to a target. That is, regardless of the distance to the target, the reflected light of light emitted from a light emitting area of the light source 12 (light that reaches the target and is reflected thereby) can be used for exposure in the corresponding exposure area of the imaging sensor 22.
First, a comparative example will be described.
In the comparative example, the light source irradiates a measurement target (here, a vehicle) with light by full light emission, and a camera (imaging sensor) is exposed to the reflected light by overall surface exposure.
For example, when a road surface has a high reflectance, such as a place where a road surface is wet due to rain or the like or a place where a road surface is smoothly paved in a parking lot or the like, as shown in the drawing, the reflected light from the target (vehicle) may be further reflected by the road surface, and the camera (imaging sensor) may be exposed to the reflected light (reflected light via the road surface).
In this case, a mirror image of the vehicle appears due to the reflected light via the road surface, and thus accurate measurement may not be performed. That is, a pixel (light receiving element) that should not actually be exposed to the reflected light may be exposed to the light reflected from the road surface, and charges may be stored in the storage unit for the pixel. Accordingly, an image showing the vehicle at a position on the road surface may be generated in a distance image.
Therefore, influence of such road surface reflection is reduced in the present embodiment.
For example, when the measurement target is the measurement area 1, among the five light emitting areas 1 to 5 of the light emitting unit 10 (light source 12), the control unit 30 causes only the light emitting elements 122 in the light emitting area 1 corresponding to the measurement area 1 to emit light, and does not cause the light emitting elements 122 in the light emitting areas 2 to 5 to emit light. That is, the light emitting unit 10 irradiates only the measurement area 1 with light, and does not irradiate the other measurement areas (measurement areas 2 to 5) with light. In the case of the drawing, since the vehicle is located in the measurement areas 3 to 5, the vehicle is not irradiated with light (only the road surface is irradiated with light).
Among the five exposure areas 1 to 5 of the imaging unit 20, the control unit 30 causes only the pixels 221 (light receiving elements PD) of the exposure area 1 corresponding to the measurement area 1 to be exposed and does not cause the light receiving elements PD of the exposure areas 2 to 5 to be exposed (or among the five exposure areas 1 to 5 of the imaging unit 20, the control unit 30 acquires signals of only the pixels 221 of the exposure area 1 corresponding to the measurement area 1 and does not acquire signals of the pixels 221 of the exposure areas 2 to 5). That is, the imaging unit 20 images only the measurement area 1 and does not image other measurement areas (measurement areas 2 to 5).
Accordingly, when measuring the measurement area 1 (or the measurement area 2), since the light emitting unit 10 does not irradiate the measurement areas 3 to 5 (areas where the vehicle is located) with light, there is no light directed from the vehicle toward the road surface in the first place, unlike in the comparative example.
When measuring the measurement area 3 (or the measurement areas 4 and 5), even if reflected light via the road surface occurs as in the comparative example, this light arrives at a lower exposure area that is neither exposed nor read, and therefore does not influence the measurement.
Therefore, in the measurement method according to the present embodiment, the influence of road surface reflection can be reduced.
Next, an operation of the measurement device 1 according to the present embodiment will be described.
The measurement device 1 according to the present embodiment performs light emission and exposure at different timings for each area.
First, the control unit 30 (timing control unit 32) causes the light emitting area 1 corresponding to the measurement area 1 to emit a light emission pulse having a pulse width Lw.
Next, after 4Lw has elapsed from when the light emitting area 1 starts to emit light, the control unit 30 (timing control unit 32) causes the light emitting area 3 corresponding to the measurement area 3 to emit a light emission pulse having the pulse width Lw. Accordingly, the light emitting elements 122 in the light emitting area 3 emit light.
The control unit 30 (timing control unit 32) causes the exposure area 1 of the imaging sensor 22 to be exposed after a predetermined time (a time corresponding to the region to be measured) has elapsed since the light emitting area 1 emitted the light emission pulse.
At this time, the control unit 30 first turns on the transistor G1 for each pixel 221 in the exposure area 1 of the imaging sensor 22 (TAP1 is set to the H level), and stores charges generated by the light receiving element PD in the storage unit CS1. Thereafter, the control unit 30 similarly turns on the transistors G2 to G4 in sequence, and stores the charges generated by the light receiving element PD in the storage units CS2 to CS4.
After exposure in the exposure area 1 is completed (and after a predetermined time elapses since the light emitting elements 122 in the light emitting area 3 emit light), the control unit 30 causes the exposure area 3 to be exposed to reflected light of the light emitted from the light emitting area 3. The exposure operation is the same as that for the exposure area 1, and description thereof will be omitted.
Similarly, the control unit 30 (timing control unit 32) performs exposure in an order of the exposure area 5→the exposure area 2→the exposure area 4. Based on the exposure result, the time calculation unit 36 calculates the arrival time Tx of the reflected light in each measurement area by Equation (1). The distance calculation unit 38 calculates the distance L to the target in each measurement area by Equation (2).
If the measurement area 4 is to be measured after the measurement area 3 is measured, the reflected light of the light emission pulse emitted before the measurement area 3 is measured may be used for exposure after measurement of the measurement area 4 is started. This is because a range including the measurement area 3 is irradiated with light (the measurement area 4 is also irradiated with light) when measuring the measurement area 3. In this way, when two measurement areas before and after switching are adjacent to each other, the two measurement areas may be influenced by light emission pulses emitted to the adjacent areas.
In contrast, in the present embodiment, the control unit 30 performs control such that two measurement areas before and after switching are not adjacent to each other. For example, the control unit 30 causes the measurement area 5 to be measured after the measurement area 3 is measured. Accordingly, it is possible to reduce influence of adjacent measurement areas and improve measurement accuracy.
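The non-adjacent switching order can be generated by visiting the odd-numbered areas first and then the even-numbered ones. This small Python sketch reproduces the embodiment's order of 1 → 3 → 5 → 2 → 4:

```python
def non_adjacent_order(areas):
    """Return a measurement order in which no two consecutively
    measured areas are vertically adjacent: take every other area
    first, then the ones that were skipped."""
    return areas[::2] + areas[1::2]

order = non_adjacent_order([1, 2, 3, 4, 5])  # -> [1, 3, 5, 2, 4]
# No consecutive pair differs by 1, so no area is measured
# immediately after its vertical neighbor.
```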
Next, a reading operation of the imaging sensor 22 will be described.
A reading period is a period in which a signal value according to the charges stored in the storage units CS1 to CS4 of each pixel 221 is output. In the imaging sensor 22 according to the present embodiment, the charges stored in the storage units CS1 to CS4 are A/D-converted by the A/D conversion circuits HK1 to HK4 through the amplification circuits ZF1 to ZF4, respectively. A/D conversion is performed for each horizontal line of the plurality of two-dimensionally arranged pixels 221. For example, the horizontal lines are A/D-converted and read in order from the upper side to the lower side or from the lower side to the upper side. During this reading period, no new imaging can be performed.
When the overall surface exposure is performed as in the comparative example, it takes time to read all horizontal lines (all pixels) of the imaging sensor 22 (a time equivalent to five times the reading period of one exposure area).
In contrast, the imaging sensor 22 according to the present embodiment performs exposure and reading for each of the five exposure areas. Therefore, the number of pixels read at one time is 1/5 of that in the case of the overall surface exposure, so one reading period (output time) can be shortened.
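The readout saving can be illustrated with a toy line-count model. The line count and per-line conversion time below are assumed figures, not from the disclosure:

```python
def readout_time(n_lines, t_line):
    """Time to A/D-convert and read n_lines horizontal lines,
    one line at a time (t_line seconds per line)."""
    return n_lines * t_line

# Assumed: a 500-line sensor split into 5 equal exposure areas,
# 10 us per horizontal line.
t_full = readout_time(500, 10e-6)       # overall surface exposure
t_area = readout_time(500 // 5, 10e-6)  # one exposure area: 1/5 of the lines
```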
In the embodiment described above, the measurement areas are not adjacent before and after the switching of the measurement area. However, the present disclosure is not limited thereto, and the measurement areas may be adjacent to each other before and after the switching of the measurement area.
In this example, the control unit 30 causes the light source 12 to emit light in an order of the light emitting area 1→the light emitting area 2→the light emitting area 3→the light emitting area 4→the light emitting area 5. That is, the control unit 30 sequentially switches a measurement area to be irradiated with light such that the measurement area goes from the lower side to the upper side. Accordingly, the control unit 30 switches an area of the imaging sensor 22 that is exposed to the reflected light in an order of the exposure area 1→the exposure area 2→the exposure area 3→the exposure area 4→the exposure area 5.
In this case, an interval between light emission timings of the light emitting areas is preferably longer than that in the case where the measurement areas are not adjacent to each other, so that reflected light of a light emission pulse emitted to the preceding adjacent area does not affect the exposure.
If measurement is performed by switching the measurement area in an order from the upper side to the lower side, when switching from an area (for example, the measurement area 3) where a target (here, a vehicle) is present to an area (for example, the measurement area 4) where no target is present, the measurement may be influenced by reflected light from a road surface. By performing the measurement from the lower side to the upper side, such influence can be reduced.
In this way, even when the measurement areas are adjacent to each other before and after the switching, influence of road surface reflection can be reduced as compared with the comparative example. In this case, by switching the measurement area from the lower side to the upper side, it is possible to further reduce influence of the reflected light from the road surface. In this modification, while outputting (reading) an exposure result in a certain exposure area, it is also possible to perform imaging in another exposure area, and thus it is possible to improve a frame rate as compared with the comparative example.
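The frame-rate benefit of overlapping the readout of one exposure area with the exposure of the next can be sketched with a simple pipeline model; the timing figures are illustrative assumptions:

```python
def total_time(n_areas, t_expose, t_read, pipelined):
    """Total time to cover n_areas. Without pipelining, exposure and
    reading alternate serially. With pipelining, the reading of area k
    overlaps the exposure of area k+1, so after the first exposure the
    sequence is paced by the longer of the two phases."""
    if not pipelined:
        return n_areas * (t_expose + t_read)
    return t_expose + (n_areas - 1) * max(t_expose, t_read) + t_read

t_serial = total_time(5, 1e-3, 1e-3, pipelined=False)  # 10 ms
t_pipe = total_time(5, 1e-3, 1e-3, pipelined=True)     # 6 ms
```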
In the embodiment described above, the measurement is performed in all the measurement areas (measurement areas 1 to 5), but for example, when measuring a distant target, only a specific area may be measured by narrowing down the measurement areas.
For example, measurement may be performed only in the measurement area 3 among the measurement areas 1 to 5. In this case, light emission and exposure for the other measurement areas are unnecessary, so that a frame rate can be improved.
The measurement device 1 according to the present embodiment has been described above. The measurement device 1 includes the light emitting unit 10, the imaging unit 20, and the distance calculation unit 38 of the control unit 30. The light emitting unit 10 can emit light to the measurement areas 1 to 5 at different heights. The imaging unit 20 has the exposure areas 1 to 5 corresponding to the measurement areas 1 to 5 at different heights. The imaging unit 20 is exposed to reflected light from the measurement area using the plurality of light receiving elements PD provided for each of the exposure areas 1 to 5. The distance calculation unit 38 calculates the distance L to a target in the measurement area based on an exposure result of the imaging unit 20. Accordingly, influence of road surface reflection can be reduced.
The light source 12 of the light emitting unit 10 has light emitting areas 1 to 5, and emits light onto a measurement area corresponding to each light emitting area. Accordingly, the measurement area can be switched.
The light source 12 of the light emitting unit 10 and the imaging sensor 22 of the imaging unit 20 are arranged at the same height. Accordingly, influence of a distance to a measurement target can be reduced.
The control unit 30 sequentially switches the measurement areas 1 to 5 to be irradiated with light from the light source 12, and sequentially switches the exposure areas 1 to 5 of the imaging sensor 22 to be exposed to the reflected light. Accordingly, exposure can be performed in the exposure area corresponding to the measurement area.
The control unit 30 sequentially switches the measurement area to be irradiated with light such that two measurement areas before and after the switching are not adjacent to each other (such that a measurement area before the switching and a measurement area after the switching are not adjacent to each other), and sequentially switches the exposure area to be exposed to the reflected light such that two exposure areas before and after the switching are not adjacent to each other (such that an exposure area before the switching and an exposure area after the switching are not adjacent to each other). Accordingly, it is possible to reduce influence of the adjacent measurement areas and to increase measurement accuracy.
In the modification, the control unit 30 sequentially switches the measurement area to be irradiated with light from the lower side to the upper side, and sequentially switches the exposure area to be exposed to the reflected light accordingly. Accordingly, influence of road surface reflection can be reduced as compared with the comparative example (where no switching is performed) or a case where the measurement area is switched from the upper side to the lower side.
When measuring a distant target, the control unit 30 causes measurement to be performed only in a specific area (for example, the measurement area 3) among all the measurement areas. Accordingly, a frame rate can be improved.
While outputting an exposure result in a certain exposure area, the imaging unit 20 can be exposed to the reflected light in another exposure area. Accordingly, a frame rate can be improved.
The embodiments described above are intended to facilitate understanding of the present disclosure, and are not to be construed as limiting the present disclosure. In addition, it is needless to say that the present disclosure can be changed or improved without departing from the inventive concept thereof, and equivalents thereof are included in the present disclosure.
The present application claims the priority based on Japanese patent application No. 2022-004991 filed on Jan. 17, 2022, and all the contents described in the Japanese patent application are incorporated.
Priority claim: Japanese Patent Application No. 2022-004991, filed Jan. 17, 2022 (JP, national).
International filing: PCT/JP2022/043367, filed Nov. 24, 2022 (WO).