This patent application is based on and claims priority pursuant to 35 U.S.C. § 119(a) to Japanese Patent Application No. 2021-103874, filed on Jun. 23, 2021, in the Japan Patent Office, the entire disclosure of which is hereby incorporated by reference herein.
The present disclosure relates to a distance measurement apparatus and a distance measurement system.
As an example of ranging (or distance measurement), a time-of-flight (TOF) method is known that includes emitting light for ranging toward an object, obtaining a time difference between the time of light emission and the time of reception of light reflected from the object, and calculating a distance to the object using the time difference. In the TOF method, an image sensor for infrared light receives the light for ranging reflected from an object after light having an intensity modulated according to a predetermined irradiation pattern is emitted to the object. A time difference between the light emitting time and the light receiving time of the irradiation pattern is detected for each pixel to calculate a distance to the object. The calculated distance values are collected in a bitmap form for the respective pixels and stored as a distance image. Such a distance image generation apparatus (a distance measurement apparatus) is referred to as a TOF camera.
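As a minimal illustration of the direct time-of-flight relation described above (illustrative only and not part of the claimed apparatus; the function and variable names are hypothetical):

```python
# Illustrative sketch of the direct time-of-flight calculation: the light
# travels to the object and back, so the one-way distance is half the
# round-trip distance given by (speed of light) x (time difference).

C = 299_792_458.0  # speed of light in m/s

def tof_distance(emit_time_s: float, receive_time_s: float) -> float:
    """Distance to the object from emission/reception timestamps (seconds)."""
    time_of_flight = receive_time_s - emit_time_s
    return C * time_of_flight / 2.0

# A pulse received 66.7 ns after emission corresponds to roughly 10 m.
print(tof_distance(0.0, 66.7e-9))  # ~10.0
```

In practice the time difference is detected per pixel, and the resulting values are collected into the distance image described above.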
An embodiment provides a distance measurement apparatus including multiple phototransmitters configured to emit light beams to a range to be measured; multiple photosensors each configured to receive a light beam reflected from an object within the range to be measured; and circuitry configured to calculate a distance to the object based on times of light emission of each of the phototransmitters and times of light reception of each of the photosensors. The photosensors outnumber the phototransmitters.
Another embodiment provides a distance measurement system including a projector including multiple phototransmitters configured to emit light beams to a range to be measured; a photosensor device including multiple photosensors each configured to receive a light beam reflected from an object within the range to be measured; and circuitry configured to calculate a distance to the object based on time of light emission of each of the phototransmitters and time of light reception of each of the photosensors. The photosensors outnumber the phototransmitters.
A more complete appreciation of the disclosure and many of the attendant advantages and features thereof can be readily obtained and understood from the following detailed description with reference to the accompanying drawings, wherein:
The accompanying drawings are intended to depict embodiments of the present invention and should not be interpreted to limit the scope thereof. The accompanying drawings are not to be considered as drawn to scale unless explicitly noted. Also, identical or similar reference numerals designate identical or similar components throughout the several views.
In describing embodiments illustrated in the drawings, specific terminology is employed for the sake of clarity. However, the disclosure of this specification is not intended to be limited to the specific terminology so selected and it is to be understood that each specific element includes all technical equivalents that have a similar function, operate in a similar manner, and achieve a similar result.
Referring now to the drawings, embodiments of the present disclosure are described below. As used herein, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise.
Embodiments of the present disclosure use more photosensors than phototransmitters to achieve higher intensities of light emitted from the phototransmitters, longer focal lengths of the photosensors, and higher distance measurement accuracy within a limited apparatus size.
Hereinafter, embodiments of a distance measurement apparatus and a distance measurement system will be described in detail with reference to the accompanying drawings.
As illustrated in
The VCSEL projector unit 21 emits distance-measuring light (e.g., infrared light) toward an object to be measured to obtain a distance to the object. The VCSEL projector unit 21 includes two VCSEL projector units 21: VCSEL projector units 21F and 21B (refer to
The TOF photosensor unit 61 receives light scattered (scattering light) and reflected from the object to which the distance-measuring light has been emitted from the VCSEL projector unit 21, so as to obtain three-dimensional point group data.
The CMOS photosensor unit 30 acquires a two-dimensional image using a CMOS sensor 33 (see
The substrates (the CMOS substrate 35, the VCSEL substrates 22F and 22B, and the main substrate 41) are substrates for driving or controlling the VCSEL projector unit 21, the TOF photosensor unit 61, and the CMOS photosensor unit 30. The substrates (the CMOS substrate 35, the VCSEL substrates 22F and 22B, and the main substrate 41) are connected to each of the VCSEL projector unit 21, the TOF photosensor unit 61, and the CMOS photosensor unit 30 via cables, flexible printed circuits (FPCs), and flexible flat cables (FFCs).
The fan 38 is provided inside the image-capturing device 100 and generates forced convection to cool the inside of the image-capturing device 100.
The following describes the arrangement of multiple substrates (the CMOS substrate 35, the VCSEL substrates 22F and 22B, and the main substrate 41) included in the image-capturing device 100.
As illustrated in
This configuration allows a total of four substrates (the VCSEL substrates 22F and 22B, the CMOS substrate 35, and the main substrate 41) to be housed within the image-capturing device 100 without wasting space inside the image-capturing device 100. This further enables downsizing of the four substrates (the VCSEL substrates 22F and 22B, the CMOS substrate 35, and the main substrate 41) in the arrangement direction of the substrates (i.e., along the Z-axis). Further, the arrangement of the four substrates (i.e., the VCSEL substrates 22F and 22B, the CMOS substrate 35, and the main substrate 41 arranged in parallel to each other) allows generation of air flows in the longitudinal direction of the substrates without hampering natural convective flows inside the image-capturing device 100 or forced convective flows by the fan 38, so as to reduce the occurrence of temperature deviations inside the image-capturing device 100. Further, since the efficiency of heat exhaust by inflow and outflow of air through a vent hole provided in a cover described later, and of heat radiation (heat transfer) from the cover to the external air, is improved, the occurrence of a temperature rise inside the image-capturing device 100 is reduced.
As illustrated in
The shutter button 62 allows a user operation to determine the image-capturing timing of the CMOS photosensor unit 30.
The operation switch unit 17 allows a user operation to switch between ON and OFF of the image-capturing device 100 and switch an operation mode.
Further, as illustrated in
As illustrated in
The inner wall 10 that is a part of the covers of the image-capturing device 100 is described below.
As illustrated in
Next, the battery cases 68a and 68b will be described.
As illustrated in
Such a configuration facilitates replacement of the batteries 18a and 18b because, in the image-capturing device 100, the batteries 18a and 18b are attachable and detachable along the Y-axis by removing the battery covers 15a and 15b in
The image-capturing device 100 may be driven by using a power cord. The power cord is preferably detachably attachable. Such a configuration eliminates the need for built-in batteries in the body of the image-capturing device 100 and thus achieves a reduction in the weight of the image-capturing device 100 while allowing longer image-capturing times.
In such a configuration using a power cord to drive the image-capturing device 100, an insertion port for the power cord is preferably disposed at a lower portion closer to the baseplate 16 than the shutter button 62 (downstream of the shutter button 62 in the −X-direction). With this arrangement, light beams are blocked by the fingers of the user pressing the shutter button 62 before they can be blocked by the power cord, and thus the dead spot due to the power cord does not exceed the dead spot due to the fingers of the user.
The following describes the shutter button 62.
As illustrated in
As illustrated in
Next, the arrangement of the shutter button 62 will be described.
As illustrated in
Specifically, in the X-Z cross-sectional plane in
The shutter button 62 is disposed between the TOF photosensor unit 61 and the batteries 18a and 18b. Such an arrangement of the shutter button 62 between the TOF photosensor unit 61 and the batteries 18a and 18b induces the user to hold the portion (a portion near the center of gravity) in which the batteries 18a and 18b are housed, which prevents camera shake while reducing the user's fatigue.
Instead of determining the shooting timing by pressing and releasing the shutter button 62 on the rear cover 12 of the image-capturing device 100, a remote operation may be performed in a wired or wireless manner. Such a remotely operable configuration prevents shake of the image-capturing device 100 held by the hand of the user more effectively.
In the image-capturing device 100, the LED element 65 and the opening 64 may be connected by, for example, a light guide plate or an optical fiber. This arrangement increases the utilization efficiency of light in the image-capturing device 100. In some embodiments, the image-capturing device 100 includes a lens system and a diffusion plate in the opening 64 to increase the viewability of images formed by light emitted from the LED element 65.
Next, the VCSEL projector unit 21 will be described.
As illustrated in
The VCSEL package 24 is a light source including a VCSEL as a light emitting point 25 (a light source). The lens cell 26 houses the VCSEL optical system 23 composed of multiple lenses.
The VCSEL package 24 is soldered to each of the VCSEL substrates 22 (the VCSEL substrates 22F and 22B). In the following description, the VCSEL substrates 22F and 22B are collectively referred to as a VCSEL substrate 22. Further, the VCSEL optical system 23 (i.e., the lens cell 26) is fixed to the VCSEL substrate 22 using a screw or bonded to the VCSEL substrate 22 so as to be aligned with the light emitting point 25 with a predetermined accuracy.
The VCSEL substrate 22 (the VCSEL substrates 22F and 22B) is mounted with a drive circuit for driving the VCSEL package 24. The drive circuit of the VCSEL substrate 22 generates heat because a large current flows in order to increase the intensity of light emitted from the VCSEL package 24 serving as a light source. To handle such generated heat, the VCSEL substrate 22 is mounted with a drive circuit having a large allowable current in order to reduce heat generation of the drive circuit, and is also mounted with a dissipator such as a heatsink. To mount such components on the VCSEL substrate 22, the VCSEL substrate 22 is larger than the other substrates, including the CMOS substrate 35. Such a configuration prevents the VCSEL substrate 22 from excessively heating up, and thus allows the VCSEL projector unit 21 to emit a higher intensity laser beam in the image-capturing device 100.
As illustrated in
The VCSEL optical system 23 serves as a fish-eye lens. The VCSEL optical systems 23 of the multiple phototransmitters (i.e., the VCSEL projector units 21F and 21B) emit light beams to a full-spherical (4π steradians) range to be measured in a situation where the emitted light beams are not blocked by any of the surrounding components.
The range to be measured refers to a range actually irradiated with light beams emitted from the multiple phototransmitters (the two VCSEL projector units 21F and 21B), within the full-spherical range of 4π steradians with the image-capturing device 100 as the center. As described above, the range to be measured is equal to the full-spherical range of 4π steradians in a situation where the light beams emitted from the VCSEL projector units 21F and 21B are not blocked by any of the components surrounding the VCSEL projector units 21F and 21B. When the emitted light beams are partially blocked by any of the components surrounding the VCSEL projector units 21F and 21B, the range to be measured is a range, within the full-spherical range, actually irradiated with light rays that have not been blocked by any of the components surrounding the phototransmitters 21.
In the image-capturing device 100, the VCSEL projector units 21F and 21B are fixed by fastening screws 66a (see
This configuration reduces or eliminates variations in the amount of light blocked by, for example, the front cover 11 and the rear cover 12 after emitted from the VCSEL projector units 21F and 21B, due to processing errors (dimension errors and shape errors) and also due to assembly variations of components of the image-capturing device 100. This reduction or elimination of variations in the amount of light blocked by any component further prevents poor illumination distribution of light emitted from the image-capturing device 100.
The VCSEL substrates 22F and 22B are arranged in parallel to each other as illustrated in
Next, the CMOS photosensor unit 30 will be described.
As illustrated in
The CMOS optical system 31 includes multiple lenses, and a prism. The CMOS sensor 33 is mounted on the CMOS sensor substrate 32. The CMOS optical system 31 and the CMOS sensor substrate 32 are integrally held by the lens holder 34. The CMOS sensor substrate 32 is, for example, bonded to the holder 34a of the lens holder 34.
In the image-capturing device 100, light emitted from an external illumination source or the VCSEL projector unit 21 and scattered and reflected from an external object enters the CMOS optical system 31 and reaches the CMOS sensor 33 as indicated by arrow S in
As illustrated in
As illustrated in
The CMOS optical system 31 serves as a fish-eye lens. The CMOS optical systems 31 of the CMOS photosensor units 30R and 30L receive light beams scattered and reflected from the full-spherical range of 4π steradians in a situation where light beams emitted from the multiple phototransmitters are not blocked by any of the surrounding components.
Next, the arrangement of the CMOS photosensor unit 30 will be described.
In the image-capturing device 100 as illustrated in
In the image-capturing device 100 as illustrated in
Next, the TOF photosensor unit 61 will be described.
As illustrated in
Each TOF sensor substrate 74 is mounted with a TOF sensor 76.
The relay board 77 serves as a relay between the TOF sensor substrates 74 and the main substrate 41.
The TOF optical systems 71A, 71B, 71C, 71D and the TOF sensor substrates 74 are integrally held by the holder 78.
Notably, the TOF sensor substrate 74 and the relay board 77 are connected by a cable such as an FPC, and the relay board 77 and the main substrate 41 are also connected by a cable such as an FPC.
In the image-capturing device 100 as illustrated in
In the image-capturing device 100, light scattered and reflected from an external object after being emitted from the VCSEL projector unit 21 to the external object reaches the TOF sensor 76 mounted on the TOF sensor substrate 74 through the TOF optical system 71.
As illustrated in
The TOF optical system 71A vertically arranged has an angle of view (in the vertical direction; in an X-Y cross-sectional plane and an X-Z cross-sectional plane) of 65 degrees. The TOF optical system 71B horizontally arranged (with its incidence plane facing the right in
The TOF optical system 71 (the TOF optical systems 71A, 71B, 71C, and 71D) has an angle of view smaller than the angle of view of the VCSEL optical system 23 (i.e., each of the photosensors has an angle of view smaller than the light-emitting range of each of the phototransmitters). With such a smaller angle of view, the TOF optical system 71 has a focal length longer than the focal length of the VCSEL optical system 23. Such a TOF optical system 71 receives a larger amount of light per unit angle of view, which corresponds to the amount of light per unit pixel received by the TOF sensor 76. This configuration increases the amount of light received by the TOF optical system 71 even when the light has been reflected from a distant object or a low-reflective object.
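The relation between a smaller angle of view and a longer focal length can be checked with the standard thin-lens approximation f = (h/2)/tan(θ/2), where h is the sensor width and θ is the full angle of view; the sensor size and angles below are hypothetical illustrations, not values from the embodiment.

```python
import math

def focal_length_mm(sensor_width_mm: float, angle_of_view_deg: float) -> float:
    """Focal length needed to cover a given full angle of view (thin-lens model)."""
    half_angle = math.radians(angle_of_view_deg) / 2.0
    return (sensor_width_mm / 2.0) / math.tan(half_angle)

# For the same (hypothetical) 5 mm sensor, a narrower angle of view yields a
# longer focal length, as the description above explains.
narrow = focal_length_mm(5.0, 65.0)    # e.g., one narrow-angle TOF optic
wide = focal_length_mm(5.0, 120.0)     # a wide-angle projector-like optic
print(narrow > wide)  # True
```

The longer focal length in turn concentrates more light per pixel for a given aperture, which is the benefit noted in the paragraph above.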
Next, the relative position between the VCSEL projector unit 21 and the TOF photosensor unit 61 is described.
In
Further, in the image-capturing device according to the comparative example in
In the image-capturing device 100 as illustrated in
Next, the angle of view of the VCSEL optical system 23 will be described.
The image-capturing device 100 as illustrated in
As illustrated in
The upper light-receivable range of the TOF photosensor unit 61 covers a hemispherical range: the light-receivable range of the TOF optical system 71A on the top of the TOF photosensor unit 61; and the light-receivable ranges (an angle of view θtof2) of the horizontally-arranged TOF optical systems 71B, 71C, and 71D. Thus, the image-capturing device 100 is capable of capturing an image of the entire image-capturing range of the upper hemisphere. The image-capturing range of the CMOS photosensor unit 30 refers to an overlapping area between the light-emitting range of the VCSEL projector unit 21 and the light-receivable range of the CMOS photosensor unit 30.
Next, a hardware configuration of the image-capturing device 100 will be described. The following describes the hardware configuration for measuring a distance between the image-capturing device 100 and the object.
The distance-measurement control unit 230, which is built in the cover, is connected to the VCSEL projector unit 21, the TOF photosensor unit 61, and the CMOS photosensor unit 30. The distance-measurement control unit 230 includes a central processing unit (CPU) 231, a read only memory (ROM) 232, a random access memory (RAM) 233, a solid state drive (SSD) 234, a light-source drive circuit 235, a sensor interface (I/F) 236, an input-output I/F 237, and an RGB sensor I/F 240. These components are electrically connected to each other via a system bus 242.
The CPU 231 loads a program and data into the RAM 233 from a storage device, such as the ROM 232 or the SSD 234, and executes processing to provide the control or functions (described later) of the entirety of the distance-measurement control unit 230. Some or all of these functions of the CPU 231 may be implemented by an electronic circuit such as an application specific integrated circuit (ASIC) or a field-programmable gate array (FPGA).
The ROM 232 is a non-volatile semiconductor memory (storage device) that holds a program and data even when the power is turned off. The ROM 232 stores programs and data of, for example, a basic input/output system (BIOS) and an operating system (OS) that are executed when the image-capturing device 100 is activated.
The RAM 233 is a volatile semiconductor memory (storage device) that temporarily holds a program and data.
The SSD 234 is a nonvolatile memory that stores programs for executing processing by the distance-measurement control unit 230 and various types of information. Note that the SSD 234 may be replaced with a hard disk drive (HDD).
The light-source drive circuit 235 is an electric circuit that is electrically connected to the VCSEL projector unit 21 and outputs, to the VCSEL projector unit 21, a drive signal such as a drive voltage in response to a control signal input from the CPU 231. More specifically, in response to the control signal from the CPU 231, the light-source drive circuit 235 drives multiple light emitters included in the VCSEL projector unit 21 to emit light. The drive signal may use a rectangular wave, a sine wave, or a voltage waveform having a predetermined waveform. The light-source drive circuit 235 changes the frequency of the voltage waveform to modulate the frequency of the drive signal.
The sensor I/F 236 is an interface that is electrically connected to the TOF photosensor unit 61 and receives a phase signal output from the TOF photosensor unit 61. The input-output I/F 237 is an interface for connecting with an external device such as a personal computer (PC).
The RGB sensor I/F 240 is an interface that is electrically connected to the CMOS photosensor unit 30 and receives an RGB signal output from the CMOS photosensor unit 30.
The distance-measurement control unit 230 includes a light-emitting control unit 238, a light-receiving processing unit 239, and an RGB image processing unit 241. The distance-measurement control unit 230 controls the light emission of the VCSEL projector unit 21 via the light-emitting control unit 238, the light reception of the TOF photosensor unit 61 via the light-receiving processing unit 239, and the light reception of the CMOS photosensor unit 30 via the RGB image processing unit 241 in a synchronous manner.
The light-emitting control unit 238 includes at least a drive-signal output unit 238a that implements a function of the image-capturing device 100.
The drive-signal output unit 238a outputs a drive signal to the VCSEL projector units 21 and causes the VCSEL projector units 21 to simultaneously emit light beams. Further, the drive-signal output unit 238a outputs a drive signal having a predetermined voltage waveform at a predetermined light-emission frequency to temporally modulate (temporally control) the light emission of the VCSEL projector units 21. In the present embodiment, the drive-signal output unit 238a outputs a drive signal having a rectangular wave or a sine wave at a frequency on the order of megahertz to the VCSEL projector units 21 at a predetermined timing. This is only one example.
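A rectangular drive waveform at a megahertz-order modulation frequency, of the kind the drive-signal output unit described above produces, can be sketched as follows; the specific frequency and sample rate are hypothetical values chosen for illustration.

```python
# Sketch of a 50%-duty rectangular drive waveform such as the one described
# above. A real drive circuit generates this in hardware; this sketch only
# illustrates the shape of the modulation.

def rectangular_wave(freq_hz: float, sample_rate_hz: float, n_samples: int):
    """Return n_samples of a 50%-duty rectangular wave (values 0 or 1)."""
    samples = []
    for i in range(n_samples):
        t = i / sample_rate_hz
        phase = (t * freq_hz) % 1.0   # position within the current period
        samples.append(1 if phase < 0.5 else 0)
    return samples

# 10 MHz modulation sampled at 80 MHz: 8 samples per period, 4 high then 4 low.
print(rectangular_wave(10e6, 80e6, 8))  # [1, 1, 1, 1, 0, 0, 0, 0]
```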
The light-receiving processing unit 239 includes at least a phase-signal input unit 239a, a distance-image acquisition unit 239b, a storage unit 239c, and a distance-image combining unit 239d as functions of the image-capturing device 100.
The phase-signal input unit 239a is implemented by the sensor I/F 236 to receive a phase signal output from the TOF photosensor unit 61. The phase-signal input unit 239a serves to receive a phase signal for each of the two-dimensionally arranged pixels in the TOF photosensor unit 61. Further, the phase-signal input unit 239a outputs the received phase signal to the distance-image acquisition unit 239b. In the present embodiment, the TOF photosensor unit 61 is connected to the phase-signal input unit 239a. In this case, the phase-signal input unit 239a outputs four phase signals for the TOF optical systems 71A, 71B, 71C, and 71D.
The distance-image acquisition unit 239b acquires distance-image data for the distance between the image-capturing device 100 and the object, based on the phase signal for each pixel from the TOF photosensor unit 61, input from the phase-signal input unit 239a. Herein, the distance image refers to an image in which pieces of distance data acquired on a pixel-by-pixel basis are two-dimensionally arranged at the corresponding pixel positions. For example, the distance image is an image generated with luminance data converted from the distance data. The distance-image acquisition unit 239b outputs the acquired four pieces of distance-image data to the storage unit 239c.
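As an illustration of how a per-pixel phase signal maps to distance in an intensity-modulated (indirect) TOF scheme such as the one described, the sketch below applies the standard relation d = c·Δφ/(4π·f_mod); the modulation frequency and image layout are hypothetical and do not reproduce the embodiment's actual processing.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def phase_to_distance(phase_rad: float, mod_freq_hz: float) -> float:
    """Per-pixel distance from the measured phase shift of the modulated light.

    d = c * phase / (4 * pi * f_mod); unambiguous up to c / (2 * f_mod).
    """
    return C * phase_rad / (4.0 * math.pi * mod_freq_hz)

def distance_image(phase_image, mod_freq_hz: float):
    """Convert a 2-D array of phase values into a distance image (same layout)."""
    return [[phase_to_distance(p, mod_freq_hz) for p in row] for row in phase_image]

# At 10 MHz modulation, a phase shift of pi radians corresponds to half the
# ~15 m ambiguity range, i.e. about 7.5 m.
print(phase_to_distance(math.pi, 10e6))
```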
The storage unit 239c is implemented by, for example, the RAM 233, and temporarily stores the distance image (four pieces of distance image data) received from the distance-image acquisition unit 239b.
The distance-image combining unit 239d reads the four pieces of distance image data temporarily stored in the storage unit 239c, and combines the pieces of distance image data to generate one spherical distance image data.
The distance-image combining unit 239d is implemented by the CPU 231 executing a control program. However, no limitation is intended thereby. In some embodiments, a part or the entirety of the distance-image combining unit 239d may be implemented by dedicated hardware designed to execute similar functions, for example, semiconductor integrated circuits such as an application specific integrated circuit (ASIC), a digital signal processor (DSP), and a field programmable gate array (FPGA), or typical circuit modules.
The RGB image processing unit 241 includes an RGB image input unit 241a, an RGB image storage unit 241b, and an RGB image combining unit 241c.
The RGB image input unit 241a receives an RGB image output from the CMOS photosensor unit 30. For example, the RGB image input unit 241a serves to receive, from the CMOS photosensor unit 30, an RGB signal for each of the two-dimensionally arranged pixels in the CMOS photosensor unit 30. The RGB image input unit 241a outputs the received RGB image (signal) to the RGB image storage unit 241b. In the present embodiment, since the two CMOS photosensor units 30R and 30L are connected to the RGB image processing unit 241, two RGB images are output from the RGB image input unit 241a. The RGB image input unit 241a is implemented by the RGB sensor I/F 240, for example.
The RGB image storage unit 241b is implemented by, for example, the RAM 233, and temporarily stores the RGB image input from the RGB image input unit 241a.
The RGB image combining unit 241c reads two pieces of RGB image data temporarily stored in the RGB image storage unit 241b, and combines the two pieces of RGB image data to generate a single spherical RGB image (image data). The RGB image combining unit 241c is implemented by the CPU 231 executing a control program. However, no limitation is intended thereby. In some embodiments, a part or the entirety of the RGB image combining unit 241c may be implemented by dedicated hardware designed to execute similar functions, for example, semiconductor integrated circuits such as an ASIC, a DSP, and an FPGA, or typical circuit modules.
In the above description, the configuration in
In the present embodiment described above, the photosensors (the TOF optical systems 71A, 71B, 71C, and 71D) outnumber the phototransmitters (the VCSEL projector units 21F and 21B) to cover the range to be measured. This configuration allows upsizing of the light source of each phototransmitter to emit higher-intensity light. This enables reception of a sufficient amount of light even when the light is reflected from a distant object or a low-reflective object, and thus achieves higher measurement accuracy. Further, in the present embodiment, the photosensors outnumber the phototransmitters to cover the range to be measured, which allows each photosensor to have a smaller angle of view to receive light. This configuration allows each photosensor to have a longer focal length and a larger F-number lens, and thus achieves an increase in the accuracy of measurement of a distance to a distant object. In short, the configuration according to an embodiment of the present disclosure uses more photosensors than phototransmitters and achieves higher intensities of light emitted from the phototransmitters, a longer focal length of the photosensors, and higher distance measurement accuracy within a limited apparatus size.
A second embodiment is described below.
The second embodiment differs from the first embodiment in that the TOF photosensor unit is divided into two upper and lower parts, and a VCSEL projector unit and a CMOS photosensor unit are disposed between the two parts. In the following description of the second embodiment, the description of the same portions as those of the first embodiment will be omitted, and portions different from those of the first embodiment will be described.
In the image-capturing device 100 as illustrated in
Further, in the image-capturing device 100 in
In the image-capturing device 200 as illustrated in
In the image-capturing device 200, light scattered and reflected from an external object after being emitted from the VCSEL optical systems 23F and 23B to the object reaches the TOF sensors on the TOF sensor substrates 74A to 74D through the TOF optical systems 71A to 71D.
In the image-capturing device 200 in
In the image-capturing device 200 according to the present embodiment, only the TOF optical system 71A is disposed on the upper side of the VCSEL optical system 23F in the second stage. This configuration allows the relation that D2 is equal to E2 (D2=E2) in
In the image-capturing device 200 in
The image-capturing device 200 allows a light-receivable range of the TOF photosensor unit to have an angle of view of 90 degrees at the maximum in the Z-X cross-sectional plane because of the lower light-receivable range (of the angle of view Ωtof3) of the TOF optical system 71B. Thus, the image-capturing device 200 is capable of capturing an image of the entire image-capturing range of the lower hemisphere by incorporating the light-receivable ranges of the TOF optical systems 71C and 71D.
The present embodiment allows capturing of an image over the entire image-capturing range. In this case, the range to be measured (or the image-capturing range) is a full-spherical range to which the phototransmitters (23F, 23B) emit the light beams.
In each of the above-described embodiments, the distance measurement apparatus (the image-capturing devices 100, 200) includes VCSEL projector units 21 (21F and 21B) as multiple phototransmitters, TOF optical systems 71 (71A, 71B, 71C, 71D) as multiple photosensors, CMOS photosensor units 30 (30R and 30L) as multiple imagers, and a distance-measurement control unit 230 that performs distance-measurement calculation, which are formed as a single integrated unit. However, the distance-measurement apparatus is not limited to such a single integrated unit.
Alternatively, as illustrated in
Note that the embodiments described above are preferred example embodiments of the present invention, and various applications and modifications may be made without departing from the scope of the invention.
The above-described embodiments are illustrative and do not limit the present invention. Thus, numerous additional modifications and variations are possible in light of the above teachings. For example, elements and/or features of different illustrative embodiments may be combined with each other and/or substituted for each other within the scope of the present invention.
Any one of the above-described operations may be performed in various other ways, for example, in an order different from the one described above.
Each of the functions of the described embodiments may be implemented by one or more processing circuits or circuitry. Processing circuitry includes a programmed processor, as a processor includes circuitry. A processing circuit also includes devices such as an application specific integrated circuit (ASIC), a digital signal processor (DSP), a field programmable gate array (FPGA), and conventional circuit components arranged to perform the recited functions.
Number | Date | Country | Kind |
---|---|---|---
2021-103874 | Jun 2021 | JP | national |