Embodiments of the present disclosure relate to a photoelectric conversion element, a reading device, and an image processing apparatus.
In recent years, awareness of document security has increased. In particular, there is a growing need to ensure the originality of documents and to determine their authenticity.
PTL 1 discloses an invisible information reading technique for ensuring the originality of documents, determining their authenticity, and preventing forgery. Specifically, invisible information (e.g., infrared (IR) information) is embedded in a document and read with invisible light (e.g., infrared light) to ensure the originality of the document, determine its authenticity, and prevent forgery.
PTL 2 discloses a red, green, and blue (RGB)+IR simultaneous reading technique of reading an RGB image and an IR image at the same time without reducing productivity, using a four-line image sensor configuration in which IR pixels are added to ordinary RGB pixels.
However, the conventional RGB+IR simultaneous reading does not take noise resistance into consideration and therefore has difficulty reading the RGB image and the IR image at the same time with a good signal-to-noise ratio (S/N). This is mainly because no consideration is given to handling the charges accumulated in the IR pixels.
In light of the above-described problems, it is a general object of the present invention to provide a photoelectric conversion element, a reading device, and an image processing apparatus capable of reading a visible image and an invisible image at the same time while preventing a decrease in S/N.
In order to solve the above-described problems and achieve the object, there is provided a photoelectric conversion element as described in appended claims. Advantageous embodiments are defined by the dependent claims. Advantageously, the photoelectric conversion element includes a first pixel array including first light-receiving sections arranged in a direction and a second pixel array including second light-receiving sections arranged in the direction. Each of the first light-receiving sections includes a first pixel configured to receive at least light having a first wavelength inside a visible spectrum and a first pixel circuit configured to transmit a signal from the first pixel to a subsequent stage. Each of the second light-receiving sections includes a second pixel configured to receive at least light having a second wavelength outside the visible spectrum and a second pixel circuit configured to transmit a signal from the second pixel to the subsequent stage. The second pixel circuit is provided in a vicinity of the second pixel.
The present invention enables simultaneous reading of a visible image and an invisible image while preventing a decrease in S/N in a low-sensitivity invisible region that is particularly difficult to deal with.
The accompanying drawings are intended to depict example embodiments of the present invention and should not be interpreted to limit the scope thereof. The accompanying drawings are not to be considered as drawn to scale unless explicitly noted. Also, identical or similar reference numerals designate identical or similar components throughout the several views.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the present invention. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. In describing embodiments illustrated in the drawings, specific terminology is employed for the sake of clarity. However, the disclosure of this specification is not intended to be limited to the specific terminology so selected and it is to be understood that each specific element includes all technical equivalents that have a similar function, operate in a similar manner, and achieve a similar result. Referring now to the drawings, wherein like reference numerals designate identical or corresponding parts throughout the several views, embodiments of the present disclosure are described in detail below.
The image forming apparatus 100 includes an image reading unit 101 serving as a reading device, an automatic document feeder (ADF) 102 atop the image reading unit 101, and an image forming unit 103 below the image reading unit 101. In order to describe an internal configuration of the image forming unit 103,
The ADF 102 is a document supporter that positions, at a reading position, a document or an original including an image to be read. The ADF 102 automatically feeds the document placed on a table to the reading position. The image reading unit 101 reads the document fed by the ADF 102 at the predetermined reading position. The image reading unit 101 includes a platen (or an exposure glass) as an upper surface of the image reading unit 101. The platen serves as a document supporter on which a document is placed. The image reading unit 101 reads the document on the platen, that is, at the reading position. Specifically, the image reading unit 101 is a scanner that includes a light source, an optical system, and a complementary metal oxide semiconductor (CMOS) image sensor inside. In the image reading unit 101, the light source illuminates the document. Reflected light from the document passes through the optical system and strikes the image sensor, which reads the reflected light. Thus, the image reading unit 101 reads the image or image data of the document.
The image forming unit 103 forms an image according to the image data read by the image reading unit 101. The image forming unit 103 includes a manual feed roller pair 104 through which a recording medium is manually inserted and a recording medium supply unit 107 that supplies a recording medium. The recording medium supply unit 107 includes an assembly that sends out recording media one by one from vertically-aligned input trays 107a. The recording medium thus supplied is sent to a secondary transfer belt 112 via a registration roller pair 108.
A secondary transfer device 114 transfers a toner image from an intermediate transfer belt 113 onto the recording medium conveyed on the secondary transfer belt 112.
The image forming unit 103 also includes an optical writing device 109, an image forming unit 105 employing a tandem system, the intermediate transfer belt 113, and the secondary transfer belt 112. Specifically, in an image forming process, the image forming unit 105 renders a latent image written by the optical writing device 109 visible as a toner image and forms the toner image on the intermediate transfer belt 113.
More specifically, the image forming unit 105 includes four rotatable, drum-shaped photoconductors to form yellow (Y), magenta (M), cyan (C), and black (K) toner images on the four photoconductors, respectively. Each of the four photoconductors is surrounded by various pieces of image forming equipment 106 including a charging roller, a developing device, a primary transfer roller, a cleaner unit, and a neutralizer. The pieces of image forming equipment 106 function around each of the four photoconductors to form a toner image on the corresponding photoconductor and transfer the toner image onto the intermediate transfer belt 113. Specifically, the primary transfer rollers transfer the toner images from the respective photoconductors onto the intermediate transfer belt 113. As a consequence, a composite toner image is formed on the intermediate transfer belt 113.
The intermediate transfer belt 113 is entrained around a drive roller and a driven roller and disposed so as to pass through primary transfer nips between the four photoconductors and the respective primary transfer rollers. As the intermediate transfer belt 113 rotates, the composite toner image constructed of the toner images primary-transferred onto the intermediate transfer belt 113 is conveyed to the secondary transfer device 114. The secondary transfer device 114 secondarily transfers the composite toner image onto the recording medium on the secondary transfer belt 112. As the secondary transfer belt 112 rotates, the recording medium is conveyed to a fixing device 110. The fixing device 110 fixes the composite toner image as a color image onto the recording medium. Finally, the recording medium is discharged onto an output tray disposed outside a housing of the image forming apparatus 100. Note that, in the case of duplex printing, a reverse assembly 111 reverses the front and back sides of the recording medium and sends out the reversed recording medium onto the secondary transfer belt 112.
Note that the image forming unit 103 is not limited to an electrophotographic image forming unit that forms an image by electrophotography as described above. Alternatively, the image forming unit 103 may be an inkjet image forming unit that forms an image in an inkjet printing system.
Now, a detailed description is given of the image reading unit 101 included in the image forming apparatus 100 described above.
In a reading operation, the image reading unit 101 emits light upward from the light source 2 while moving the first carriage 6 and the second carriage 7 from home positions, respectively, in a sub-scanning direction A. The first carriage 6 and the second carriage 7 cause reflected light from a document 12 to be imaged on the image sensor 9 via the lens unit 8.
When the power is turned on, the image reading unit 101 reads reflected light from the reference white plate 13 and sets a reference. Specifically, the image reading unit 101 moves the first carriage 6 directly below the reference white plate 13, turns on the light source 2, and causes the reflected light from the reference white plate 13 to be imaged on the image sensor 9, thereby performing a gain adjustment.
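The gain adjustment against the reference white plate can be sketched as follows. This is a minimal illustration only, not the disclosed implementation; the function name and the target level of 0.9 are assumptions.

```python
import numpy as np

def adjust_gain(white_ref_line: np.ndarray, target_level: float = 0.9) -> float:
    """Derive an analog gain so that the brightest reference-white pixel
    reaches a target fraction of full scale (hypothetical routine)."""
    peak = float(np.max(white_ref_line))
    if peak <= 0.0:
        raise ValueError("no signal from reference white plate")
    # Amplify so the brightest pixel lands at the target level.
    return target_level / peak

# Example: a white-plate reading (0.0-1.0 scale) peaking at 0.45
gain = adjust_gain(np.array([0.40, 0.45, 0.43, 0.44]))
print(gain)  # 2.0: doubling the signal brings the peak to 0.9
```

In practice, the derived gain would be written to the analog front end before document reading begins.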
Now, a detailed description is given of the light source 2 included in the image reading unit 101 described above.
Note that, in reading the visible and invisible images, either the visible image information or the invisible image information may be selectively read in the end. Therefore, in the present embodiment, an emission wavelength of the light source 2 is switched between visible and invisible wavelengths or spectra. A control unit 23 (illustrated in
As described above, by switching between the visible light source 2a (white) and the invisible light source 2b (IR), the visible and invisible images are readable with a simple configuration.
Note that, in the present embodiment, the visible light source 2a (white) and the invisible light source 2b (IR) are arranged alternately within one light, for example. Alternatively, the visible light source 2a (white) and the invisible light source 2b (IR) may be separately arranged as two lights. Even in a case in which the visible light source 2a (white) and the invisible light source 2b (IR) are arranged within one light, the configuration is not necessarily limited to the aforementioned arrangement provided that the light source 2 is configured to illuminate a subject. For example, the visible light source 2a (white) and the invisible light source 2b (IR) may be arranged in a plurality of rows.
Now, a detailed description is given of the image sensor 9 included in the image reading unit 101 described above.
The image sensor 9 of the present embodiment has a configuration in which the RGB pixels, each configured by a PD 92, are distinguished from each other simply by single-color filters of R, G, and B, respectively, without IR cut filters. Therefore,
Note that, although the RGB reading is described in the present embodiment, the embodiments are not limited to the RGB reading. Alternatively, the embodiments are applicable to cyan, magenta, and yellow (CMY) reading or orange, green, and violet (OGV) reading. In addition, pixel arrays of the present embodiment are not limited to full-color pixel arrays. The pixel arrays may be simply G pixel arrays or monochrome pixel arrays without color filters, provided that the pixel arrays receive visible light.
By contrast, in the image sensor 9 of the present embodiment, RGB pixels as the PDs 92 have a single-layer structure mounted simply with RGB single color filters 91R, 91G, and 91B, respectively, as illustrated in
Color filters are generally applied by spin coating. As illustrated in
As described above, the RGB pixels retain a photosensitivity to IR because they are covered simply by the single color filters 91R, 91G, and 91B, without IR cut filters. Accordingly, the single-layer structure with the color filters 91R, 91G, and 91B saves cost.
Note that, although IR cut filters are typically added to remove IR components mixed in RGB pixels, the embodiments easily attain the same advantageous effect as the IR cut filters by removing the IR components that are mixed in the RGB pixels with an image processing unit 25 (illustrated in
The signal amount of the IR pixel is a fraction of that of the RGB pixels. This is because, as illustrated in
In addition, transferring charges over a long distance makes the charges susceptible to external noise. Such external noise has a greater impact on the IR pixel than on the RGB pixels.
As described above, the signal of the IR pixel is more susceptible to signal attenuation and external noise than the signals of the RGB pixels, so the S/N may be unfavorably decreased. To prevent the decrease in S/N, the signal (i.e., charge) output from a pixel needs careful handling. This is particularly true for a reduction optical system sensor, whose pixel size is reduced to about 1/10 of the pixel size of a contact image sensor.
As illustrated in
Specifically, the R-pixel array 90R includes multiple R-pixel light-receiving sections 94R serving as third light-receiving sections arranged in a row along the main scanning direction X at a constant pitch. The R-pixel array 90R receives red light, serving as light having a third wavelength inside a visible spectrum, from the light source 2. The R-pixel light-receiving section 94R includes the PD 92 as the R pixel serving as a third pixel and a pixel circuit (PIX_BLK) 93 serving as a third pixel circuit that performs charge-voltage conversion.
Note that, in the present embodiment, an area in which the PD 92 is located is hereinafter referred to as a pixel area (PIX); whereas an area in which the pixel circuit (PIX_BLK) 93 is located is hereinafter referred to as a non-pixel area (Non-PIX).
The G-pixel array 90G includes multiple G-pixel light-receiving sections 94G serving as first light-receiving sections arranged in a row along the main scanning direction X at a constant pitch. The G-pixel array 90G receives green light, serving as light having a first wavelength inside the visible spectrum, from the light source 2. The G-pixel light-receiving section 94G includes the PD 92 as the G pixel serving as a first pixel and the pixel circuit (PIX_BLK) 93 serving as a first pixel circuit that performs the charge-voltage conversion.
The B-pixel array 90B has multiple B-pixel light-receiving sections 94B serving as fourth light-receiving sections arranged in a row along the main scanning direction X at a constant pitch. The B-pixel array 90B receives blue light, serving as light having a fourth wavelength inside the visible spectrum, from the light source 2. The B-pixel light-receiving section 94B includes the PD 92 as the B pixel serving as a fourth pixel and the pixel circuit (PIX_BLK) 93 serving as a fourth pixel circuit that performs the charge-voltage conversion.
The IR-pixel array 90IR includes multiple IR-pixel light-receiving sections 94IR serving as second light-receiving sections arranged in a row along the main scanning direction X at a constant pitch. The IR-pixel array 90IR receives IR light, serving as light having a second wavelength outside the visible spectrum, from the light source 2. The IR-pixel light-receiving section 94IR includes the PD 92 as the IR pixel serving as a second pixel and the pixel circuit (PIX_BLK) 93 serving as a second pixel circuit that performs the charge-voltage conversion.
Note that the R-pixel array 90R, the G-pixel array 90G, the B-pixel array 90B, and the IR-pixel array 90IR are distinguished from each other simply by the color filters as described above. The R-pixel array 90R, the G-pixel array 90G, the B-pixel array 90B, and the IR-pixel array 90IR have identical PDs 92 and identical circuit portions such as the pixel circuits (PIX_BLK) 93. Therefore, the image sensor 9 can be regarded as including a continuous pattern of four pixel arrays.
In the present embodiment, as illustrated in
In particular, in the IR-pixel array 90IR, the pixel circuit (PIX_BLK) 93 is arranged adjacent to the pixel area (PIX) including the PD 92 as the IR pixel. The “vicinity” herein refers to a distance of, e.g., several pixel widths. When the IR pixel and the pixel circuit (PIX_BLK) 93 are arranged within such a distance, the transfer path is sufficiently short to prevent the signal attenuation and the noise superimposition.
As illustrated in
In addition, in the present embodiment, the IR pixel is described as an example of invisible light pixels. Since a general-purpose silicon semiconductor can be used for the IR region, the image sensor 9 can be configured at low cost. However, the invisible light pixel is not limited to the IR pixel. The advantageous effect of the embodiments can be attained with another invisible light pixel having a low sensitivity to visible light, such as an ultraviolet (UV) pixel.
As illustrated in
As described above, in a case in which the image processing unit 25 (illustrated in
As described above with reference to
Different from the arrangement illustrated in
Note that the shield lines 80 can be easily implemented provided that the shield lines 80 are low impedance lines. Although
By contrast,
As described above, the light source 2 is configured as a visible/IR light source. The light source driving unit 24 drives the light source 2.
The signal processing unit 21 includes a gain controller (i.e., amplifier), an offset controller, and an analog-to-digital (A/D) converter. The signal processing unit 21 executes gain control, offset control, and A/D conversion on image signals (RGB) output from the image sensor 9.
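As a rough model of that chain, the gain is applied first, then the offset, and the result is quantized by the A/D converter. The following sketch uses assumed parameters (10-bit depth, example gain and offset values; none of these come from the disclosure):

```python
import numpy as np

def analog_front_end(signal, gain=2.0, offset=-0.1, bits=10):
    """Apply gain control, then offset control, then A/D conversion.

    signal: analog sample(s) on a 0.0-1.0 full-scale range.
    """
    v = np.asarray(signal, dtype=float) * gain + offset
    v = np.clip(v, 0.0, 1.0)          # stay within the converter's input range
    full_scale = (1 << bits) - 1      # 1023 codes for a 10-bit ADC
    return np.round(v * full_scale).astype(np.uint16)

codes = analog_front_end([0.0, 0.25, 0.5])
print(codes)  # dark level clips to 0; the mid-scale inputs map to 409 and 921
```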
The control unit 23 selectively controls a visible image mode and an IR image mode. The control unit 23 controls settings of the light source driving unit 24, the image sensor 9, the signal processing unit 21, and the SD correcting unit 22. The control unit 23 serves as reading control means that selectively controls a first reading operation and a second reading operation.
In the first reading operation, shading correction is executed, by use of first reference data, on data obtained by reading a subject in a visible light region. In the second reading operation, the shading correction is executed, by use of second reference data, on data obtained by reading the subject in an invisible light region.
The SD correcting unit 22 includes a line memory and executes the shading correction. Specifically, in the shading correction, the SD correcting unit 22 normalizes the read data against the reference white plate 13, thereby correcting main-scanning distributions such as per-pixel sensitivity variation of the image sensor 9 and unevenness in light amount.
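That normalization is essentially a per-pixel division by the white reference. A minimal sketch follows, assuming an optional black reference and an 8-bit output target (both are assumptions; the disclosure specifies neither):

```python
import numpy as np

def shading_correct(raw, white_ref, black_ref=None, target=255.0):
    """Normalize one scanned line by the reference-white line so that
    per-pixel sensitivity variation and illumination unevenness cancel."""
    raw = np.asarray(raw, dtype=float)
    white = np.asarray(white_ref, dtype=float)
    black = np.zeros_like(white) if black_ref is None else np.asarray(black_ref, dtype=float)
    denom = np.maximum(white - black, 1e-6)  # guard against division by zero
    return np.clip((raw - black) / denom * target, 0.0, target)

# Pixels reading 100 and 150 under a white reference of 200 map to
# 127.5 and 191.25 on the 0-255 output scale.
print(shading_correct([100, 150], [200, 200]))
```

For the first reading operation, the white reference would be captured under visible illumination (first reference data), and for the second, under invisible illumination (second reference data).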
The image processing unit 25 executes various types of image processing. For example, the image processing unit 25 includes an IR component removing unit 26 serving as an invisible component removing unit. The IR component removing unit 26 removes the IR component (serving as a second wavelength component) that is mixed in each of the RGB pixels, by use of the signal of an IR pixel. Such a configuration prevents degradation of the color reproduction of the visible image (i.e., RGB image) and also prevents a decrease in S/N. With such a configuration, a high-quality invisible image (i.e., IR image) is obtained.
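The operation of the IR component removing unit 26 can be sketched as a per-channel subtraction of the scaled IR signal. The mixing coefficients below are purely illustrative assumptions; in practice they would be calibrated for the actual sensor:

```python
import numpy as np

# Hypothetical calibrated fractions of the IR signal leaking into R, G, B.
IR_MIX = np.array([0.20, 0.15, 0.10])

def remove_ir_component(rgb, ir):
    """Subtract the scaled IR-pixel signal from each visible channel.

    rgb: array with a trailing axis of 3 (R, G, B); ir: matching IR readings.
    """
    rgb = np.asarray(rgb, dtype=float)
    ir = np.asarray(ir, dtype=float)[..., None]   # broadcast over channels
    return np.clip(rgb - ir * IR_MIX, 0.0, None)  # no negative pixel values

# An IR reading of 50 removes 10, 7.5, and 5 from R, G, and B, respectively.
print(remove_ir_component([120.0, 110.0, 100.0], 50.0))
```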
As described above, according to the present embodiment, the pixel circuit (PIX_BLK) 93 provided adjacent to the IR pixel minimizes the transfer distance of the charge, which is a signal resulting from the photoelectric conversion, rendering a long-distance transfer unnecessary. Accordingly, the present embodiment prevents the attenuation of the IR pixel signal and the superimposition of external noise. Thus, the present embodiment enables simultaneous reading of an RGB image and an IR image while preventing a decrease in S/N in a low-sensitive, infrared (i.e., invisible) region that is particularly difficult to deal with.
In the present embodiment, the pixel circuit (PIX_BLK) 93 is arranged adjacent to the pixel area of the IR pixel. Similarly, the pixel circuit (PIX_BLK) 93 is arranged adjacent to the pixel area of each of the RGB pixels. Accordingly, the present embodiment enables invisible reading (i.e., reading of the IR image) and full-color reading (i.e., reading of the RGB images) at the same time while preventing the decrease in S/N.
In addition, since the pixel circuit (PIX_BLK) 93 is arranged adjacent to the pixel area for each pixel, the charge transfer distance is minimized for each pixel. Accordingly, the decrease in S/N is prevented over all pixels.
A description is now given of a second embodiment.
In the first embodiment, the correction is facilitated by matching the IR characteristics of RGB pixels and the IR characteristics of IR pixels. The second embodiment is different from the first embodiment in that the structure of the pixel circuit (PIX_BLK) 93 of the IR pixel is different from the structure of the pixel circuit (PIX_BLK) 93 of the RGB pixels. A redundant description of identical features in the first and second embodiments is herein omitted; whereas a description is now given of features of the second embodiment different from the features of the first embodiment.
To address such a situation, in the present embodiment, the physical structure (e.g., size and location) and configuration of the pixel circuit (PIX_BLK) 93 of the IR pixel are different from the physical structure (e.g., size and location) and configuration of the pixel circuit (PIX_BLK) 93 of the RGB pixels as illustrated in
Here, it is known that, because of the wavelength dependence of the quantum sensitivity of silicon in the depth direction, shorter wavelengths are more likely to be absorbed (i.e., photoelectrically converted) near the silicon surface, whereas longer wavelengths are absorbed at deeper positions in the silicon. That is, infrared light is photoelectrically converted at a deeper position in the silicon than RGB light; in other words, the light receiving sensitivity to infrared light is higher at that deeper position.
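This depth dependence follows the Beer-Lambert law, I(x) = I_0 exp(-αx), where the absorption coefficient α of silicon drops steeply as the wavelength grows. The sketch below illustrates the resulting 1/e penetration depths; the α values are rough order-of-magnitude figures assumed purely for illustration, not data from the disclosure:

```python
# Approximate absorption coefficients of silicon (1/cm); rough
# order-of-magnitude values assumed purely for illustration.
ALPHA_PER_CM = {
    "blue_450nm": 2.5e4,
    "green_530nm": 8.0e3,
    "red_630nm": 3.0e3,
    "ir_850nm": 5.0e2,
}

def penetration_depth_um(alpha_per_cm: float) -> float:
    """1/e penetration depth in micrometers: depth = 1 / alpha."""
    return 1.0 / alpha_per_cm * 1e4  # convert cm to um

for band, alpha in ALPHA_PER_CM.items():
    print(f"{band}: {penetration_depth_um(alpha):.2f} um")
# Blue is absorbed within roughly half a micron of the surface, whereas
# IR around 850 nm penetrates tens of microns, which is why the IR
# photodiode benefits from a deeper position in the silicon.
```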
Therefore, in the image sensor 9A illustrated in
Relatedly, in the image sensor 9A illustrated in
As described above, according to the present embodiment, the PD 92 and the pixel circuit (PIX_BLK) 93 of the IR pixel are arranged at a position deeper than the positions of the PDs 92 and the pixel circuits (PIX_BLK) 93 of the RGB pixels. Such a configuration prevents a decrease in the light receiving sensitivity in the invisible region.
A description is now given of a third embodiment.
The third embodiment is different from the first and second embodiments in that dummy pixel arrays are respectively arranged above and below the R-pixel array 90R, the G-pixel array 90G, the B-pixel array 90B, and the IR-pixel array 90IR. A redundant description of identical features in the first to third embodiments is herein omitted; whereas a description is now given of features of the third embodiment different from the features of the first and second embodiments.
In the semiconductor process, it is generally known that characteristics are more likely to change at the end portions of a continuous pattern than at its other portions. This is because, during manufacturing, the semiconductor process is affected by the peripheral pattern (or design), and the end portions of a continuous pattern become the boundary of the pattern. In the case of the image sensor 9 described above, the R-pixel array 90R or the IR-pixel array 90IR is a pattern boundary in the configuration illustrated in
To address such a situation, in the image sensor 9B of the present embodiment, a dummy pixel array 90dummy that imitates a pixel array and a pixel circuit is additionally arranged at each end portion of a sensing area including at least the IR-pixel array 90IR. In the example illustrated in
Note that, since the continuity of the circuit pattern is significant, any color filter may be used for the dummy pixel arrays 90dummy illustrated in
As described above, according to the present embodiment, the characteristics between colors are equalized regardless of the visible or invisible region.
A description is now given of a fourth embodiment.
The fourth embodiment is different from the first to third embodiments in that the IR pixel is arranged away from the RGB pixels. A redundant description of identical features in the first to fourth embodiments is herein omitted; whereas a description is now given of features of the fourth embodiment different from the features of the first to third embodiments.
The crosstalk between pixel signals, that is, the electrical crosstalk is mentioned above with reference to
To address such a situation, the image sensor 9C of the present embodiment has a configuration in which the IR pixel is distanced from the RGB pixels as illustrated in
As described above, the present embodiment reduces the impact of charge crosstalk between pixels, particularly from the IR pixel to the RGB pixels.
Note that, in the example illustrated in
A description is now given of a fifth embodiment.
The fifth embodiment is different from the first to fourth embodiments in that a plurality of AD converters (ADC) are provided at a subsequent stage and in the vicinity of the pixel circuit (PIX_BLK) 93. A redundant description of identical features in the first to fifth embodiments is herein omitted; whereas a description is now given of features of the fifth embodiment different from the features of the first to fourth embodiments.
As illustrated in
Note that the “vicinity” refers to a distance at which the signals are transferrable within a predetermined period of time. For example, the difference in distance from the individual pixels (i.e., PDs 92) to the ADC 70 that processes their signals is not excessively large (e.g., not more than two orders of magnitude). That is, in the image sensor 9D, the ADC 70 is arranged in the vicinity of the pixels (i.e., PDs 92) and the pixel circuits (PIX_BLK) 93 to extremely shorten the analog path.
The image sensor 9D further includes a low-voltage differential signaling (LVDS) interface 71 and a timing generator (TG) 72. The TG 72 supplies a control signal to each block and controls the operation of the entire image sensor 9D.
The image sensor 9D according to the present embodiment performs A/D conversion with the ADC 70 in the same chip and transmits image data to a subsequent stage with the LVDS 71.
As described above, according to the present embodiment, the ADC 70 is arranged in the vicinity of the pixels (i.e., PDs 92) and the pixel circuits (PIX_BLK) 93 to perform the A/D conversion in the same chip. Although an IR pixel is added, such a configuration allows an increase in operation speed and generation of a high-quality image with good S/N.
Note that, although the pixel circuits (PIX_BLK) 93 are connected to the ADC 70 in
Note that the image processing apparatus of the embodiments has been described as applied to an MFP having at least two of copying, printing, scanning, and facsimile functions. Alternatively, the image processing apparatus of the embodiments may be applied to, e.g., a copier, a printer, a scanner, or a facsimile machine.
In addition, the reading device or the image processing apparatus of the embodiments has been described as applied to, but not limited to, an MFP. For example, the reading device or the image processing apparatus of the embodiments may be applied to applications in various fields, such as inspection in a factory automation (FA) field.
The reading device or the image processing apparatus of the embodiments may be applied to a bill scanner that is used to discriminate bills (banknotes) and prevent forgery.
The above-described embodiments are illustrative and do not limit the present invention. Thus, numerous additional modifications and variations are possible in light of the above teachings. For example, elements and/or features of different illustrative embodiments may be combined with each other and/or substituted for each other within the scope of the present invention.
This patent application is based on and claims priority to Japanese Patent Application No. 2019-180197, filed on Sep. 30, 2019, in the Japan Patent Office, the entire disclosure of which is hereby incorporated by reference herein.
The present application is a continuation of U.S. application Ser. No. 17/640,830, filed Mar. 7, 2022, which is based on PCT/IB2020/058694 filed on Sep. 18, 2020, and claims priority to JP 2019-180197, filed on Sep. 30, 2019, the entire contents of each are incorporated herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
4862286 | Suda | Aug 1989 | A |
5453611 | Oozu et al. | Sep 1995 | A |
5731880 | Takaragi et al. | Mar 1998 | A |
6094281 | Nakai | Jul 2000 | A |
6486974 | Nakai | Nov 2002 | B1 |
20030044190 | Nakayama | Mar 2003 | A1 |
20060054787 | Olsen | Mar 2006 | A1 |
20100289885 | Lu | Nov 2010 | A1 |
20120113486 | Masuda | May 2012 | A1 |
20140028804 | Usuda | Jan 2014 | A1 |
20140204427 | Nakazawa | Jul 2014 | A1 |
20140204432 | Hashimoto et al. | Jul 2014 | A1 |
20140211273 | Konno et al. | Jul 2014 | A1 |
20140368893 | Nakazawa et al. | Dec 2014 | A1 |
20150098117 | Marumoto et al. | Apr 2015 | A1 |
20150116794 | Nakazawa | Apr 2015 | A1 |
20150163378 | Konno | Jun 2015 | A1 |
20150222790 | Asaba et al. | Aug 2015 | A1 |
20150304517 | Nakazawa et al. | Oct 2015 | A1 |
20160003673 | Hashimoto et al. | Jan 2016 | A1 |
20160006961 | Asaba et al. | Jan 2016 | A1 |
20160088179 | Nakazawa | Mar 2016 | A1 |
20160112660 | Nakazawa et al. | Apr 2016 | A1 |
20160119495 | Konno et al. | Apr 2016 | A1 |
20160173719 | Hashimoto et al. | Jun 2016 | A1 |
20160268330 | Nakazawa | Sep 2016 | A1 |
20160293903 | Kanda et al. | Oct 2016 | A1 |
20160295138 | Asaba et al. | Oct 2016 | A1 |
20160373604 | Hashimoto et al. | Dec 2016 | A1 |
20170019567 | Konno et al. | Jan 2017 | A1 |
20170126923 | Natori et al. | May 2017 | A1 |
20170163838 | Nakazawa | Jun 2017 | A1 |
20170170225 | Asaba | Jun 2017 | A1 |
20170187920 | Suzuki | Jun 2017 | A1 |
20170201700 | Hashimoto et al. | Jul 2017 | A1 |
20170295298 | Ozaki et al. | Oct 2017 | A1 |
20170302821 | Sasa et al. | Oct 2017 | A1 |
20170324883 | Konno | Nov 2017 | A1 |
20180148150 | Hiroki et al. | May 2018 | A1 |
20180175096 | Inoue et al. | Jun 2018 | A1 |
20180213124 | Yokohama et al. | Jul 2018 | A1 |
20180261642 | Asaba | Sep 2018 | A1 |
20180278791 | Sano | Sep 2018 | A1 |
20190166317 | Tanaka | May 2019 | A1 |
20190208149 | Asaba et al. | Jul 2019 | A1 |
20190268496 | Nakazawa | Aug 2019 | A1 |
20190289163 | Hashimoto et al. | Sep 2019 | A1 |
20200053229 | Hashimoto et al. | Feb 2020 | A1 |
20200053230 | Nakazawa et al. | Feb 2020 | A1 |
20200053233 | Nakazawa et al. | Feb 2020 | A1 |
20200410271 | Nakazawa et al. | Dec 2020 | A1 |
20200412904 | Ohmiya et al. | Dec 2020 | A1 |
20210021729 | Hashimoto et al. | Jan 2021 | A1 |
20210144322 | Wang | May 2021 | A1 |
Number | Date | Country |
---|---|---|
109981940 | Jul 2019 | CN |
0605898 | Jul 1994 | EP |
2998995 | Mar 2016 | EP |
H06-217079 | Aug 1994 | JP |
H06-217127 | Aug 1994 | JP |
2005-143134 | Jun 2005 | JP |
2014-039205 | Feb 2014 | JP |
2014039205 | Feb 2014 | JP |
2015-041679 | Mar 2015 | JP |
2015-082648 | Apr 2015 | JP |
2015-216599 | Dec 2015 | JP |
2016158151 | Sep 2016 | JP |
2017-085501 | May 2017 | JP |
2017192092 | Oct 2017 | JP |
2017-200020 | Nov 2017 | JP |
2017-208496 | Nov 2017 | JP |
2017208496 | Nov 2017 | JP |
2018011328 | Jan 2018 | JP |
2018101995 | Jun 2018 | JP |
Entry |
---|
Office Action issued Oct. 31, 2023 in Chinese Patent Application No. 202080068172.6, 8 pages. |
Office Action issued Sep. 14, 2023 in European Patent Application No. 20 780 797.5, 7 pages. |
Japanese Office Action issued Apr. 11, 2023 in corresponding Japanese Patent Application No. 2019-180197, 6 pages. |
International Search Report issued on Oct. 23, 2020 in PCT/IB2020/058694 filed on Sep. 18, 2020, 9 pages. |
Notification to Grant Patent Right for Invention mailed Apr. 23, 2024, in Chinese Application No. 202080068172.6, 6 pages. (with translation). |
Number | Date | Country | |
---|---|---|---|
20230353693 A1 | Nov 2023 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 17640830 | US | |
Child | 18220298 | US |