This patent application is based on and claims priority pursuant to 35 U.S.C. § 119(a) to Japanese Patent Application No. 2020-217320, filed on Dec. 25, 2020, and Japanese Patent Application No. 2021-207439, filed on Dec. 21, 2021, in the Japan Patent Office, the entire disclosures of which are hereby incorporated by reference herein.
The present disclosure relates to an object detector, a sensing apparatus, and a mobile object.
In the related art, an optical scanner that deflects a light beam to scan an irradiation region, an object detector that emits a laser beam to detect the presence or absence of an object and a distance to the object, and a sensing apparatus using the object detector are known. Such a device or apparatus may be referred to as a light detection and ranging (LiDAR) device or a laser imaging detection and ranging (LiDAR) device.
The LiDAR detects, for example, the presence or absence of an object in front of a running vehicle (mobile object) and a distance to the object. Specifically, the LiDAR irradiates an object with laser beams emitted from a light source and detects, with a light detector, light reflected or scattered (at least one of reflected or scattered) from the object, to detect the presence or absence of an object and a distance to the object in a desired range.
An object detector includes: an optical scanner including: a light source unit configured to emit a light beam; and a light deflector configured to deflect the light beam emitted from the light source unit to scan an object in a detection region with the deflected light beam; a light receiving optical element configured to collect the light beam returned from the object through at least one of reflection or scatter on the object in the detection region in response to scanning with the light beam deflected by the light deflector; and a light receiving element configured to receive the light beam collected by the light receiving optical element, the light receiving element including multiple light receiving pixels arranged in a first direction at different positions on a plane perpendicular to the optical axis.
A more complete appreciation of the disclosure and many of the attendant advantages and features thereof can be readily obtained and understood from the following detailed description with reference to the accompanying drawings, wherein:
The accompanying drawings are intended to depict embodiments of the present invention and should not be interpreted to limit the scope thereof. The accompanying drawings are not to be considered as drawn to scale unless explicitly noted. Also, identical or similar reference numerals designate identical or similar components throughout the several views.
In describing embodiments illustrated in the drawings, specific terminology is employed for the sake of clarity. However, the disclosure of this specification is not intended to be limited to the specific terminology so selected and it is to be understood that each specific element includes all technical equivalents that have a similar function, operate in a similar manner, and achieve a similar result.
Referring now to the drawings, embodiments of the present disclosure are described below. As used herein, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise.
As a result of diligent research on conventional scanning mechanisms, the inventors of the present invention have found that detection accuracy may decrease because the light deflector moves in advance during the time it takes light to make a round trip to an object at the speed of light. Such an advance movement of the light deflector increases with a higher scanning speed of the scanning mechanism or the light deflector.
The round trip time t for light to travel from the light deflector to an object and back is obtained by the equation below.
t=2L/c
where L is the distance to the object, and c is the speed of light. For example, when L is 300 m, t is 2 μs. In a MEMS mirror with a resonance frequency of 2,000 Hz and an amplitude of 25° (i.e., a mechanical deflection angle of ±25°), the angular velocity of scanning becomes maximum at the center of the scan. When t is 2 μs, the advance movement in the scanning direction reaches, for example, 0.6° in terms of the deflection angle of the MEMS mirror, which is equivalent to 1.3° of the optical angle of view (the optical angle is double the mechanical deflection angle). As described above, when the round trip from light emission to light reception takes 2 μs, the advance movement of the MEMS mirror (the light deflector) is 1.3° in the scanning direction between the light projecting and the light receiving. If the angles of view for the projecting light and the receiving light are not set appropriately, the accuracy of object detection decreases (in an extreme case, an object cannot be detected), because light cannot be received due to an insufficient angle of view.
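The numerical relations above can be checked with a short sketch (Python is used purely for illustration; the values 2,000 Hz, 25°, and 300 m are the example figures from the text):

```python
import math

C = 299_792_458.0  # speed of light in m/s

def round_trip_time(L):
    """t = 2L/c: time for light to reach an object at distance L and return."""
    return 2.0 * L / C

def mems_advance_deg(t, freq_hz, amplitude_deg):
    """Mechanical advance of a resonant MEMS mirror during time t, evaluated
    at the scan center where the sinusoidal angular velocity (2*pi*f*A) peaks."""
    omega_max = 2.0 * math.pi * freq_hz * amplitude_deg  # deg/s at scan center
    return omega_max * t

t = round_trip_time(300.0)                 # about 2 microseconds for L = 300 m
mech = mems_advance_deg(t, 2000.0, 25.0)   # about 0.6 deg mechanical advance
optical = 2.0 * mech                       # about 1.3 deg of optical angle of view
```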
According to the present disclosure, an object detector, a sensing apparatus incorporating the object detector, and a mobile object incorporating the sensing apparatus achieve highly accurate object detection.
An optical scanner 1 according to the present embodiment is described in detail with reference to
The optical scanner 1 includes a light source 10, a light projecting (throwing) optical element 20, and a light deflector 30 (scanning mirror, optical deflector).
The light source 10 emits a light beam diverging at a predetermined angle. The light source 10 is, for example, a semiconductor laser (laser diode (LD)). Alternatively, the light source 10 may be a vertical cavity surface emitting laser (VCSEL) or a light emitting diode (LED). As described above, there is a latitude in configurations of the light source 10, and various designs (modifications) are possible.
The light projecting optical element 20 shapes a light beam emitted from the light source 10. Specifically, the light projecting optical element 20 shapes a light beam incident thereon while diverging at a predetermined angle into substantially parallel light. The light projecting optical element 20 is, for example, a coaxial aspherical lens that collimates a divergent light beam into substantially parallel light. In
The light source 10 and the light projecting optical element 20 constitute a light source unit 11 that emits a light beam.
The light deflector 30 has a deflection surface 31 that deflects the light beam emitted from the light projecting optical element 20. Specifically, the light deflector 30 deflects, in the X-direction, the light beam emitted by the light source 10 and the light projecting optical element 20 to scan a predetermined scanning range on an X-Z plane including the X-direction and the Z-direction. The scanning range of the light deflector 30 is set, for example, by changing the angle of the deflection surface 31 by vibration or rotation. In
When an object is present in the region scanned with the light beam deflected by the light deflector 30, the light beam returns from the object through at least one of reflection or scatter on the object. The returned light beam passes through the light receiving optical element 40 via the light deflector 30 and reaches the light receiving element 50. In other words, the light receiving element 50 receives the light beam that is deflected by the light deflector 30 to scan the object and returned through at least one of reflection or scatter on the object in the detection region, via the light deflector 30 and the light receiving optical element 40. The drive substrate 60 controls the driving of the light source 10 and the light receiving element 50.
The light receiving optical element 40 is, for example, a lens system, a mirror system, or another of various configurations that collect light onto the light receiving element 50, but the configuration of the light receiving optical element 40 is not limited thereto. There is a latitude in the configuration of the light receiving optical element 40.
The light receiving element 50 is, for example, a photodiode (PD), an avalanche photodiode (APD), a single photon avalanche diode (SPAD), which is an APD operated in Geiger mode, or a CMOS imaging element having a time-of-flight (TOF) calculation function for each pixel. Such a CMOS imaging element is also referred to as a TOF sensor.
The waveform processing circuit 70 applies predetermined waveform processing to the light beam received by the light receiving element 50 and outputs a detection signal. The time measuring circuit 80 measures the time from the timing (a light emission time) at which a light beam is emitted from the light source 10 to the timing (an object detection time) at which the object is detected by the light receiving element 50, based on the detection signal output from the waveform processing circuit 70, and outputs a result of the time measurement. The measurement controller 90 detects information on the presence of an object in the detection region based on the result of the time measurement (i.e., a period of time from the light beam emission by the light source 10 to the object detection by the light receiving element 50) input from the time measuring circuit 80. The measurement controller 90 works as an object detecting unit 21 (i.e., processing circuitry) that detects the light beam returned through at least one of reflection or scatter on the object in the detection region and determines the timing (an object detection time) at which the object is detected by the light receiving element 50. The object detecting unit 21 (i.e., processing circuitry) calculates an amount of change in the deflection angle of the light deflector 30 from the distribution of the amount of light received by each light receiving pixel 51 (see
where fr is the focal length of the light receiving optical element 40, and d is the size of the light receiving element 50 in the scanning direction. Thus, the angle of view θr for light receiving of the light receiving element 50 is defined by the focal length fr of the light receiving optical element 40 and the size d of the light receiving element 50 in the scanning direction.
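As a sketch of this relation, the receiving angle of view can be computed with the common paraxial form θr = 2·arctan(d/(2·fr)); the exact expression used in the embodiment is not reproduced in the text, so this form is an assumption:

```python
import math

def receiving_angle_of_view(fr, d):
    """Angle of view (radians) of a receiving element of size d placed behind
    a receiving lens of focal length fr, using the paraxial relation
    theta_r = 2 * atan(d / (2 * fr)).  fr and d share the same unit."""
    return 2.0 * math.atan(d / (2.0 * fr))

# Example: a 1 mm element behind a 50 mm lens covers about 1.15 degrees.
theta_deg = math.degrees(receiving_angle_of_view(50.0, 1.0))
```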
A shape of the light receiving element 50 (a light receiving surface of the light receiving element 50) may be, for example, a square, a rectangle, a circle, or an ellipse. In a case where a shape of the light receiving element 50 (a light receiving surface of the light receiving element 50) is a square, the length of one side of the square or a half thereof is the size d of the light receiving element 50 in the scanning direction. In a case where a shape of the light receiving element 50 (a light receiving surface of the light receiving element 50) is a rectangle, the length of the longer side or the shorter side of the rectangle or a half thereof is the size d of the light receiving element 50 in the scanning direction. In a case where a shape of the light receiving element 50 (a light receiving surface of the light receiving element 50) is a circle, the diameter or radius of the circle is the size d of the light receiving element 50 in the scanning direction. In a case where a shape of the light receiving element 50 (the light receiving surface of the light receiving element 50) has an elliptical shape, the major axis or the minor axis of the elliptical shape or a half thereof is the size d of the light receiving element 50 in the scanning direction.
When the light source 10 having a higher output power is used to detect an object at a longer distance, the output power is generally increased by increasing the area (e.g., an area defined by the size dt in
In
For example, when the round trip time it takes for light to travel to and from the object at the speed of light is short relative to the scanning speed of the light deflector 30, and when the angle of view θr of the light receiving element 50 is equal to or larger than the spread angle θt of the projecting light beam in the detection region (i.e., θr ≥ θt), the light receiving element 50 can efficiently receive the light returned from the object through at least one of reflection or scatter on the object without losing some amount of light. In order to facilitate adjustment of the optical system or to increase robustness against error factors such as aging, the angle of view for light receiving is usually set larger by an amount corresponding to the adjustment residual error or change.
The present inventors have found through studies that, since the light deflector moves in advance while the light beam makes a round trip to the object at the speed of light, the detection accuracy may deteriorate, depending on the scanning speed of the scanning mechanism.
As described above, t is equal to 2L/c (i.e., t=2L/c) as illustrated in
The inventors of the present invention have focused on the phenomenon in which the angle of the light deflector 30 changes during the round trip time of light, and conceived the present invention. In the present embodiment, in order to effectively utilize this phenomenon, optical system parameters such as the pixel arrangement of the light receiving element 50 and the focal length are devised so that the amount of the horizontal shift of the reflected light beam can be estimated from the amount of light received in the shift detection region of the light receiving element 50.
In the present embodiment, multiple light receiving pixels 51 in the light receiving element 50 are arranged along a predetermined direction in the light receiving element 50. Specifically, multiple light receiving pixels 51 are arranged side by side at different positions in a first direction (i.e., Z-direction) on a plane orthogonal to the propagating direction of the center ray of the light beam. With this arrangement, since multiple light receiving pixels 51 are arranged side by side along the direction (e.g., the Z-direction as the first direction, or the scanning direction of the light deflector 30) of the shift of the light beam caused by the high-speed movement of the light deflector 30, any of the multiple light receiving pixels 51 reliably receives the light beam. In addition, the amount of the horizontal shift of the light beam reflected by the light deflector 30 is estimated from the amount of received light (i.e., the amount of light received by each light receiving pixel 51) in the shift detection region. As a result, the accuracy of distance measurement to the object improves while the phenomenon in which the angle of the light deflector 30 changes during the round trip time of light is utilized.
Arrangements of the light receiving pixel 51 are described with reference to
In
An amount of the shift of the reflected light beam in the Z-direction is referred to as an amount of the horizontal shift. In the present embodiment, the maximum value of the amount of the horizontal shift appears as a change in the amount of light incident on the light receiving pixels 51 in the shift detection region, and the angle change of the scanning mirror is estimated from the change in the amount of light. The round trip time t of light to the object is the period from the light beam emission by the light source 10 to the reception, by the light receiving element 50, of the light beam reflected by the object. The round trip time is obtained from the angle change of the light deflector 30 and the scanning speed (driving frequency). As a result, the object distance L (= ct/2) to be detected is estimated. In addition, the accuracy of distance measurement improves by complementing the distance, for example, by averaging multiple data, together with the information on the measured distance based on the original TOF principle.
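The distance estimation described above can be sketched as follows, assuming the instantaneous angular velocity of the light deflector 30 at the moment of emission is known (the function name is illustrative):

```python
C = 299_792_458.0  # speed of light in m/s

def distance_from_angle_change(delta_theta, omega):
    """Estimate the object distance L from the change delta_theta of the
    deflector angle during the light round trip, given the deflector's
    angular velocity omega at emission:
    t = delta_theta / omega, then L = c * t / 2 (TOF relation)."""
    t = delta_theta / omega
    return C * t / 2.0

# Round-trip consistency check: an object at 150 m with omega = 1000 rad/s
# produces an angle change of omega * 2L/c, which maps back to 150 m.
L_true = 150.0
delta = 1000.0 * 2.0 * L_true / C
L_est = distance_from_angle_change(delta, 1000.0)
```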
A specific arrangement of the light receiving element 50 includes, as a main region, a pixel array formed by multiple light receiving pixels 51 arranged along the Y-direction (i.e., the pixel array in the main region), and, as a shift detection region, a pixel array formed by multiple light receiving pixels 51 arranged along the Y-direction and shifted from the main region in the Z-direction (i.e., the pixel array in the shift detection region). Under a scanning condition in which the amount of horizontal shift is within an error with respect to the pixel size, a pixel on which the center of the light beam is incident is referred to as a main region (main region pixel), and a pixel on which the center of the light beam is not incident under that scanning condition, or on which a peripheral ray of the light beam is incident, is referred to as a shift detection region (shift detection region pixel). The direction of the horizontal shift of the reflected light beam is reversed between the forward and return strokes of the MEMS mirror oscillation (i.e., directional switch). Because of this directional switch, the pixel arrays of the shift detection region are arranged symmetrically on both sides of the pixel array in the main region along the Z-direction.
More specifically, multiple light receiving pixels 51 are arranged in a grid pattern along the Z-direction, which is a first direction, and the Y-direction, which is a second direction intersecting the first direction, in a plane orthogonal to the propagating direction of the center ray of the light beam. As described above, the first direction (Z-direction) coincides with the shift direction of the light beam caused by the high-speed movement of the light deflector 30. The second direction (Y-direction) is orthogonal to the first direction. Multiple light receiving pixels 51 are arranged in the grid pattern on the Y-Z plane.
Each light receiving pixel 51 has a shape of a rectangle elongated in the Z-direction on the Y-Z plane. The shape of the light receiving pixel 51 is not limited thereto and can be changed as appropriate. For example, the shape of the light receiving pixel 51 may be a rectangle elongated in the Y-direction.
As illustrated in
Among multiple light receiving pixels 51 arranged side by side in the first direction (Z-direction), two adjacent light receiving pixels 51 are referred to as a first light receiving pixel 51a and a second light receiving pixel 51b. In
Db<d1+d2+p
Preferably, the width of the reflected light beam is smaller than the sum of the size of two light receiving pixels 51 adjacent to each other (i.e., d1 and d2) and the distance between them (i.e., p). Thus, any of the light receiving pixels 51 can receive the light beam without waste.
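The condition Db < d1 + d2 + p can be expressed as a simple check (a sketch; the variable names follow the text):

```python
def beam_fits_two_pixels(Db, d1, d2, p):
    """True when the reflected beam width Db is smaller than the widths of
    two adjacent receiving pixels plus the pitch between them, so that the
    beam is received without waste (Db < d1 + d2 + p)."""
    return Db < d1 + d2 + p

# Example in millimeters: a 0.30 mm beam fits two 0.15 mm pixels at 0.05 mm pitch.
fits = beam_fits_two_pixels(0.30, 0.15, 0.15, 0.05)
```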
In addition, suitable conditions differ depending on the reference position at which the light receiving element 50 receives the reflected light beam, and the number and arrangement of the pixel arrays 52 in the light receiving element 50 involve different conditions depending on the presence or absence of a main region line (the pixel array in the main region) on which the center of the light beam is incident.
There are conditions for efficiently detecting a change in the amount of light from the arrangement of the pixel arrays 52 with respect to the spread angle of the reflected light beam. When the reflected light beam is formed to be narrower than the size d of the light receiving pixel 51, which corresponds to the minimum spread angle of the reflected light beam, and is horizontally shifted on the pixel, there are positions where the shift direction or the accurate shift position cannot be detected. The relation between the pixel size d in the tangential direction and the projecting light beam is illustrated in
When a MEMS mirror is used as the light deflector 30, the angle changes sinusoidally, and the angle change δθ during a predetermined time t is not constant. The angular velocity of scanning by the light deflector 30 is sinusoidal (i.e., sinusoidal angular velocity of scanning). For example, the angle change δθ at a certain time t′ is given by a mathematical expression below,
where c is the speed of light, L is a maximum distance at which an object is detected, ωs is the angular velocity of scanning, and A is a maximum mechanical angle of deflection.
In the MEMS mirror, the scanning speed becomes maximum at the scanning center position. When focusing on the portion where the angle becomes 0, the angle change δθ at t = 2L/c becomes maximum during a period from −t/2 to t/2, and thus a mathematical expression below holds.
When the shift of the optical angle of view is considered, the shift becomes four times in total: the shift of the angle of view is optically doubled and further doubled by the back-and-forth of the scanning of the MEMS mirror. Thus, the angle change δθ is given by a mathematical expression below.
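The angle change described above can be sketched numerically; the factor of 4 combines the optical doubling with the back-and-forth doubling, and the sketch evaluates the scan center where the angular velocity is maximum (an illustrative assumption, not the exact expression referenced in the figures):

```python
import math

C = 299_792_458.0  # speed of light in m/s

def sinusoidal_angle_change(A_deg, omega_s, L):
    """Maximum optical angle-of-view shift for a resonant MEMS mirror during
    the round trip t = 2L/c, evaluated at the scan center where the angular
    velocity A * omega_s is largest.  The mechanical change A * omega_s * t
    is doubled optically and doubled again for the scan back-and-forth."""
    t = 2.0 * L / C
    return 4.0 * A_deg * omega_s * t

# Example: 2 kHz resonance (omega_s = 2*pi*f), A = 25 deg, L = 300 m.
delta_theta = sinusoidal_angle_change(25.0, 2.0 * math.pi * 2000.0, 300.0)
```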
The number of pixel arrays 52 is represented as N = 2m (m is a natural number, e.g., m = 1, 2, 3, . . . ). When the sum of the spread angle θt of the projecting light beam and the angle change δθ of the scanning mirror is larger than the sum of the width of all the pixel arrays 52 of the light receiving element 50 (i.e., the width N×d of the multiple light receiving pixels 51 in the first direction) and the pitches between all the pixels (i.e., (N−1)×p), which corresponds to the maximum spread angle of the reflected light beam, there are positions where the shift direction or the accurate shift position cannot be detected when the light beam is horizontally shifted on the light receiving pixels 51, as in the minimum case. As illustrated in
When the above equation is modified and arranged, a mathematical expression below is derived.
where ωs is a sinusoidal angular velocity of scanning, A is a maximum mechanical deflection angle, L is a maximum detection distance, fr is a focal length of the light receiving optical element, d is a width of each of the light receiving pixels, p is a pitch between the light receiving pixels, c is the speed of light, θt is a spread angle of projecting light, and N is the number of pixel arrays arranged along a direction in which horizontal deviation occurs in the light receiving element.
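A sketch of the geometric comparison behind this condition, assuming small angles so that an angle θ corresponds to a width fr·θ on the light receiving element (a hypothetical small-angle form; the exact inequality is given in the expression referenced above):

```python
def spread_within_pixel_arrays(theta_t, delta_theta, N, d, p, fr):
    """Check that the projected beam spread theta_t plus the mirror angle
    change delta_theta (both in radians), converted to a width on the
    element by the small-angle relation w = fr * angle, stays within the
    total width of the N pixel arrays and the (N - 1) pitches between
    them: fr * (theta_t + delta_theta) <= N*d + (N-1)*p."""
    beam_width_on_element = fr * (theta_t + delta_theta)
    total_width = N * d + (N - 1) * p
    return beam_width_on_element <= total_width

# Example in meters: fr = 50 mm, pixels 0.3 mm wide at 0.05 mm pitch, N = 2.
ok = spread_within_pixel_arrays(0.008, 0.002, 2, 0.3e-3, 0.05e-3, 50e-3)
```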
As illustrated in
When the number of pixel arrays 52 is odd instead of even, there is also a condition for efficiently detecting a change in the amount of light from the arrangement of the pixel arrays with respect to the spread angle of the reflected light beam. When the reflected light beam is formed to be narrower than the sum of the size d of the light receiving pixel 51 (i.e., the size in the direction of the horizontal shift) and the pitches p to the light receiving pixels on both sides, which corresponds to the minimum spread angle of the reflected light beam, and is horizontally shifted on the pixel, there are positions where the shift direction or the accurate shift position cannot be detected. As illustrated in
The number of pixel arrays 52 is represented as N=2m+1 (m is a natural number, e.g., m=1, 2, 3, . . . ). As illustrated in
When these expressions are modified and arranged, a mathematical expression below is derived.
where ωs is a sinusoidal angular velocity of scanning, A is a maximum mechanical deflection angle, L is a maximum detection distance, fr is a focal length of the light receiving optical element, d is a width of each of the light receiving pixels, p is a pitch between the light receiving pixels, c is the speed of light, θt is a spread angle of projecting light, and N is the number of pixel arrays arranged along a direction in which horizontal deviation occurs in the light receiving element.
As illustrated in
In
As illustrated in
As illustrated in
ΔS = ΔZ × h.
At this time, the focal length fr is given by a mathematical expression below.
where h is the height of the light receiving pixel 51, W is the power density of the reflected light beam on the light receiving pixel 51, and the relational expression between the change ΔP = W × ΔS in the amount of received light in the sweep area and the detection limit Plimit of the light receiving pixel 51 (pixel array 52) in the detection region is rearranged. The height h of the light receiving pixel 51 is defined in the direction perpendicular to the first direction.
By designing the light receiving optical system to satisfy the focal length fr, a horizontal shift signal in the light receiving pixel in the detection region can be detected.
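The detectability criterion described above can be sketched as follows (symbols follow the text; the rearranged form for fr itself is in the expression referenced above and is not reproduced):

```python
def horizontal_shift_detectable(W, delta_z, h, P_limit):
    """True when the change in received light power over the swept area
    exceeds the detection limit of the pixel:
    delta_P = W * delta_S >= P_limit, with sweep area delta_S = delta_z * h."""
    delta_S = delta_z * h        # area swept by the shifted beam on the pixel
    return W * delta_S >= P_limit

# Example: 1 kW/m^2 over a 0.1 mm shift on a 1 mm pixel height gives 0.1 mW,
# which exceeds a 10 uW detection limit.
detectable = horizontal_shift_detectable(1.0e3, 1.0e-4, 1.0e-3, 1.0e-5)
```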
In the present embodiment, a MEMS mirror is used as the light deflector 30, but the light deflector 30 is not limited thereto. In one or more examples, a polygon mirror may be used as the light deflector 30. The polygon mirror has a feature of constant angular velocity (i.e., constant angular velocity of scanning). Thus, for a given angular velocity ωu, the angle of the polygon mirror changes by t×ωu = 2L/c×ωu during the time t = 2L/c required for light to travel to and from an object at a distance L. For this reason, while the light travels from the light beam emission by the light source 10 to an object at a distance L and back, the angle of the polygon mirror advances by 2L/c×ωu. Considering that the incident angle of the projected/received light beam on the polygon mirror changes by this angle and the light beam reflected by the polygon mirror is guided to the light receiving element 50, the angular shift of the projected/received light beam is twice this angle. When the polygon mirror is used, and the number of the pixel arrays 52 is an even number (i.e., the main region on which the center of the light beam is incident does not exist), the spread angle θt of the projecting light beam satisfies a mathematical expression below.
where ωu is a constant angular velocity of scanning, L is a maximum detection distance, fr is a focal length of the light receiving optical element, d is a width of each of the light receiving pixels, p is a pitch between the light receiving pixels, c is the speed of light, θt is a spread angle of projecting light, and N is the number of pixel arrays arranged along a direction in which horizontal deviation occurs in the light receiving element.
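The constant-angular-velocity geometry described above can be sketched as:

```python
C = 299_792_458.0  # speed of light in m/s

def polygon_beam_shift(omega_u, L):
    """Angular shift (radians) of the projected/received beam for a polygon
    mirror rotating at constant angular velocity omega_u: the mirror
    advances by omega_u * 2L/c during the round trip, and the direction of
    the reflected beam shifts by twice the mirror advance."""
    mirror_advance = omega_u * 2.0 * L / C
    return 2.0 * mirror_advance

# Example: omega_u = 1000 rad/s and L = 150 m give about 2 mrad of beam shift.
shift = polygon_beam_shift(1000.0, 150.0)
```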
When the number of pixel arrays 52 is odd (i.e., the main region on which the center of the light beam is incident exists), the spread angle of the projecting light beam θt satisfies a mathematical expression below.
where ωu is a constant angular velocity of scanning, L is a maximum detection distance, fr is a focal length of the light receiving optical element, d is a width of each of the light receiving pixels, p is a pitch between the light receiving pixels, c is the speed of light, θt is a spread angle of projecting light, and N is the number of pixel arrays arranged along a direction in which horizontal deviation occurs in the light receiving element.
Also in the case of the polygon mirror, similarly to the MEMS mirror, when the reference position that the reflected light beam reaches on the light receiving element 50 is shifted to a pixel array 52 other than the center, such as onto an adjacent pixel array 52, the conditional expressions for the spread angle of the projecting light beam for the even and odd cases are switched. The conditions described above (e.g., for the even number and the odd number) are appropriately changed according to the reference position at which the light receiving element 50 receives the light beam.
When a polygon mirror is used as the light deflector 30, the focal length fr is given by the conditional expression below.
To eliminate the need for such preparations, in the present modification in
As described above, since the light receiving element is configured as hardware to flexibly cope with various situations, different light receiving elements 50 do not have to be prepared for each measurement distance. According to the measurement distance, the number of pixel arrays to be used may be changed; pixel arrays to be used may be selected; pixel arrays may be thinned out to secure a pitch between the light receiving pixels 51; the width of a pixel array may be changed by summing the information of multiple pixel arrays as a single array; or the information on the horizontal shift may be complemented.
In such a case, the size of each light receiving pixel 51 can be smaller than that of a single line-shaped pixel of larger size for detecting the amount of light, although the total amount of light to be captured does not change. As a result, information is read faster than in measurement with one pixel array. In addition, the peak position of the Gaussian beam of the reflected light can be estimated with high accuracy by processing the pieces of information on the light amounts of multiple pixel arrays 52 together. As a result, the estimation accuracy of the amount of the horizontal shift improves. Further, as a secondary effect, the scanning speed of the light deflector 30 can be monitored by processing the position of the pixel array 52 that received light together with information on the light reception time.
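One simple way to estimate the beam peak from the per-array light amounts is an intensity-weighted centroid (a hypothetical estimator; the text only states that the arrays' light amounts are processed together):

```python
def gaussian_peak_position(positions, amounts):
    """Estimate the beam center from the light amounts received by several
    pixel arrays using an intensity-weighted centroid.  positions are the
    array centers along the shift (Z) direction; amounts are the received
    light amounts of the corresponding arrays."""
    total = sum(amounts)
    return sum(z * a for z, a in zip(positions, amounts)) / total

# A beam straddling arrays at z = 0 and z = 1 with equal amounts peaks at 0.5.
center = gaussian_peak_position([0.0, 1.0, 2.0], [1.0, 1.0, 0.0])
```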
In the present embodiment, the conditions are: a line-shaped beam elongated in the Y-direction is used; the light deflector 30 performs one-axis scanning; and the line-shaped beam is scanned in one dimension. However, the conditions are not limited thereto, and there is a latitude in the conditions. For example, in two-dimensional scanning of a beam spot using a scanning mirror (i.e., optical deflector) of two-axis scanning, each pixel of a two-dimensional array of the light receiving element 50 may be handled instead of a pixel array.
When the measurement distance is 10 m, the amount of the horizontal shift is approximately 20 μm, and when the measurement distance is 200 m, the amount of the horizontal shift is approximately 400 μm. In such a case, if the maximum measurement distance of the system design is adjusted to a measurement distance of 10 m, a distance of 200 m cannot be measured, and if the maximum measurement distance of the system design is adjusted to a measurement distance of 200 m, the measurement accuracy decreases when the measurement distance is 10 m. As in the modification of
Examples of application for the object detection device according to the present embodiment are described with reference to
The sensing apparatus 3 is mounted on, for example, a vehicle 6. Specifically, the sensing apparatus 3 is mounted in the vicinity of a bumper or a rearview mirror of the vehicle. In
The monitor 5 obtains information including at least one of the presence or absence of an object, a movement direction of the object, and a movement speed of the object based on an output of the object detector 2. The monitor 5 performs processing such as determination of the shape and size of the object, calculation of position information on the object, calculation of movement information, and recognition of the type of the object based on the output from the object detector 2. The monitor 5 controls traveling of the vehicle based on at least one of the position information and the movement information of the object. For example, when the monitor 5 detects an object, which is an obstacle, in front of the vehicle, the monitor 5 applies automatic braking by an automatic driving technique, issues an alarm, turns the steering wheel, or issues an alarm to prompt pressing of the brake pedal. The mobile object is not limited to the vehicle 6. The sensing apparatus 3 may be mounted on an aircraft or a vessel. The sensing apparatus 3 may also be mounted on a mobile object such as a drone or a robot that automatically moves without a driver.
According to the present embodiments described above, since multiple light receiving pixels 51 are arranged side by side along the shift direction of the light beam caused by the high-speed movement of the light deflector 30, the sensing apparatus 3 detects an object with high accuracy.
Besides the present embodiments and the modified examples described above, the embodiments and modified examples described above may be combined wholly or partially as other embodiments.
In addition, the present embodiment is not limited to embodiments and modifications described above, and various changes, substitutions, and modifications may be made without departing from the spirit of the technical idea. Furthermore, if the technical idea can be realized in another manner by technical advancement or another derivative technology, the technical idea may be implemented by using the method. The claims cover all embodiments that can be included within the scope of the technical idea.
As described above, the present embodiments of the present invention achieve highly accurate object detection and are particularly useful for an object detector and a sensing apparatus.
The above-described embodiments are illustrative and do not limit the present invention. Thus, numerous additional modifications and variations are possible in light of the above teachings. For example, elements and/or features of different illustrative embodiments may be combined with each other and/or substituted for each other within the scope of the present invention.
Each of the functions of the described embodiments may be implemented by one or more processing circuits or circuitry. Processing circuitry includes a programmed processor, as a processor includes circuitry. A processing circuit also includes devices such as an application specific integrated circuit (ASIC), a digital signal processor (DSP), a field programmable gate array (FPGA), and conventional circuit components arranged to perform the recited functions.
Number | Date | Country | Kind |
---|---|---|---
2020-217320 | Dec 2020 | JP | national |
2021-207439 | Dec 2021 | JP | national |