This application claims benefit of priority under 35 U.S.C. § 119 to Korean Patent Application No. 10-2019-0025716 filed on Mar. 6, 2019 in the Korean Intellectual Property Office, the disclosure of which is incorporated herein by reference in its entirety.
Some example embodiments of the inventive concepts relate to an image sensor and an imaging device.
Image sensors include semiconductor-based sensors that receive light to generate an electrical signal. An image sensor may include a pixel array, having a plurality of pixels, a logic circuit configured to drive the pixel array to generate an image, and the like. Recently, research into imaging devices combining a light source, outputting an optical signal within a specific wavelength band, with an image sensor has been actively conducted. In the case of an imaging device using a light source, it may be difficult to precisely detect an optical signal emitted from a light source and reflected from a subject in an environment in which the intensity of ambient light, such as sunlight, is high. Accordingly, performance may be deteriorated.
Some example embodiments of the inventive concepts provide an image sensor and imaging device which may efficiently generate charges in response to an optical signal within a specific wavelength band, in which sunlight intensity is low, to precisely recognize a subject even in an environment in which sunlight intensity is high.
According to some example embodiments of the inventive concepts, an image sensor may include a semiconductor substrate including a plurality of pixel regions, a first surface, and a second surface opposing the first surface, a plurality of transistors adjacent to the first surface of the semiconductor substrate in each of the plurality of pixel regions, a microlens on the second surface of the semiconductor substrate, and a plurality of conductive patterns in contact with the semiconductor substrate and closer to the second surface of the semiconductor substrate than to the first surface of the semiconductor substrate in each of the plurality of pixel regions.
According to some example embodiments of the inventive concepts, an image sensor may include a semiconductor substrate including a plurality of pixel regions, each of the plurality of pixel regions including at least one respective photodiode, a plurality of microlenses on the semiconductor substrate, each of the plurality of pixel regions including a respective microlens among the plurality of microlenses, and a plurality of conductive patterns in contact with the semiconductor substrate, each respective conductive pattern among the plurality of conductive patterns being between the at least one respective photodiode and the respective microlens.
According to some example embodiments of the inventive concepts, an imaging device may include a light source configured to output an optical signal within a determined wavelength band, and an image sensor configured to generate a pixel signal in response to receiving the optical signal reflected by an object, the image sensor including, a semiconductor substrate including a plurality of pixel regions, a plurality of microlenses on a surface of the semiconductor substrate on which the optical signal reflected by the object is incident, and a plurality of conductive patterns on the surface of the semiconductor substrate and in contact with the semiconductor substrate.
The above and other aspects, features, and advantages of the present disclosure will be more clearly understood from the following detailed description, taken in conjunction with the accompanying drawings, in which:
Hereinafter, some example embodiments of the inventive concepts will be described with reference to the accompanying drawings.
Referring to
The optical signal, output by the light source 2, may be reflected by an object 4, and the sensor unit 3 may receive the optical signal reflected by the object 4 as a received optical signal. The sensor unit 3 may include a pixel array, having pixels generating (e.g., configured to generate) an electrical signal in response to the received optical signal, a controller configured to generate an image using the electrical signal generated by the pixel array, and/or the like. As an example, the image generated by the controller may be a depth map including distance information of the object 4 and/or a surrounding environment of the object 4.
In an example embodiment, the sensor unit 3 may provide a function to generate a depth map as well as a proximity sensing function to sense existence of the object 4 proximate to the imaging device 1, a distance measuring function to calculate a distance between the object 4 and the imaging device 1, and/or the like. When the sensor unit 3 precisely detects the received optical signal output by the light source 2 and reflected from the object 4, the above-mentioned functions may be implemented more precisely. However, the sensor unit 3 may not precisely detect a received optical signal in an environment having severe interference from ambient light, such as sunlight. Accordingly, performance of the imaging device 1 may be degraded.
Sunlight is distributed over a wide wavelength band and generally has relatively low intensity in infrared bands. Most of the sunlight may be absorbed (e.g., absorbed or attenuated) by moisture in the atmosphere in a specific wavelength band. Thus, the light source 2 may be implemented using a light emitting element configured to output an optical signal of the specific wavelength band to improve performance of the imaging device 1. In an example embodiment, the specific wavelength band may be a short-wavelength infrared (SWIR) band around 1400 nm.
However, when the light source 2 outputs an optical signal of the specific wavelength band (e.g., the SWIR band), a silicon-based image sensor may be unable or less able to detect the optical signal of the specific wavelength band. In order to address this issue, an image sensor may be implemented with a semiconductor material other than silicon (Si). However, process difficulty may be increased as compared to the silicon-based image sensor, resulting in increased manufacturing costs.
Example embodiments provide a silicon-based image sensor which may detect the optical signal of the specific wavelength band (e.g., the SWIR band). Accordingly, the light source 2 may be used to output the optical signal of the specific wavelength band, in which most sunlight is absorbed by moisture in the atmosphere, and the imaging device 1 may be implemented to stably operate irrespective of the intensity of the sunlight.
Referring to
The pixel array 30 may include a plurality of pixels PX arranged in an array format along a plurality of rows and a plurality of columns. Each of the plurality of pixels PX may include a photodiode configured to generate charges in response to an optical signal incident from an object 60, a pixel circuit configured to generate an electrical signal corresponding to the charge generated by the photodiode, and/or the like. As an example, a pixel circuit may include a floating diffusion, a transfer transistor, a reset transistor, a drive transistor, a select transistor, and/or the like. Each of the pixels PX may have a configuration varying depending on example embodiments. As an example, each of the pixels PX may include an organic photodiode containing an organic material, unlike a silicon photodiode, or may be implemented as a digital pixel. When each of the pixels PX is implemented as a digital pixel, each of the pixels PX may include a comparator, a counter configured to convert an output of the comparator into a digital signal and transmit the digital signal, and/or the like.
The controller 20 may include a plurality of circuits for controlling the pixel array 30. As an example, the controller 20 may include a row driver 21, a readout circuit 22, a column driver 23, a control logic 24, and/or the like. The row driver 21 may drive the pixel array 30 in units of rows. For example, the row driver 21 may generate a transfer control signal for controlling a transfer transistor of a pixel circuit, a reset control signal controlling a reset transistor, a select control signal controlling a select transistor, and/or the like.
The readout circuit 22 may include a correlated double sampler (CDS), an analog-to-digital converter (ADC), and/or the like. The CDS may be connected to pixels PX, included in a row selected by a row select signal provided by the row driver 21, through column lines and may perform correlated double sampling to detect a reset voltage and a pixel voltage. The ADC may convert the reset voltage and the pixel voltage detected by the CDS into a digital signal, and may transmit the digital signal to the column driver 23.
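The correlated double sampling performed by the readout circuit 22 amounts to subtracting two samples taken through the same signal path, so per-pixel offsets and reset noise common to both samples cancel in the difference. The following sketch is illustrative only (the function names and the ideal ADC model are assumptions, not part of the disclosure):

```python
def correlated_double_sample(reset_voltage, pixel_voltage):
    """Return the offset-free signal amplitude.

    In a typical pixel the voltage falls from the reset level as charge
    accumulates, so the signal is reset level minus pixel level; offsets
    common to both samples cancel in the subtraction.
    """
    return reset_voltage - pixel_voltage


def adc(value, vref=1.0, bits=10):
    """Quantize a voltage in [0, vref] to an unsigned code (ideal ADC model)."""
    max_code = (1 << bits) - 1
    code = round(value / vref * max_code)
    return max(0, min(max_code, code))
```

In practice the subtraction may be done in the analog domain before conversion or digitally after it; the sketch above only shows the arithmetic.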
The column driver 23 may include a latch and/or a buffer circuit configured to temporarily store a digital signal, an amplifier circuit, and/or the like, and may process a digital signal received from the readout circuit 22. The row driver 21, the readout circuit 22, and/or the column driver 23 may be controlled by the control logic 24. The control logic 24 may include a timing controller configured to control operating timing, an image signal processor configured to process image data, and/or the like.
The control logic 24 may process data output by the readout circuit 22 and the column driver 23 to generate image data. As an example, the image data may include a depth map. The control logic 24 may calculate a distance between the object 60 and the imaging device 10, and/or may recognize whether the object 60 is proximate to the imaging device 10 depending on an operating mode of the imaging device 10, by using the data output by the readout circuit 22 and the column driver 23.
The imaging device 10 may include the light source 50 outputting (e.g., configured to output) an optical signal to the object 60 to generate a depth map. The light source 50 may include at least one light emitting element. As an example, the light source 50 may include a semiconductor chip in which a plurality of semiconductor light emitting elements are arranged in an array format. The light source 50 may be operated by the light source driver 40. The light source driver 40 may be controlled by the controller 20.
In an example embodiment, the light source driver 40 may generate a determined pulse signal to drive the light source 50. The light source driver 40 may determine a cycle, a duty ratio, duration, and/or the like of the pulse signal in response to a control command of the controller 20. As an example, the controller 20 may synchronize at least one signal, among signals input to the pixel array 30, with a pulse signal input to the light source 50. In an example embodiment, a signal synchronized with the pulse signal input to the light source 50 may be at least one signal among signals input to the pixel array 30 by the row driver 21. According to some example embodiments, operations described herein as being performed by the controller 20, the row driver 21, the readout circuit 22, the column driver 23, the light source driver 40, the CDS, and/or the ADC may be performed by processing circuitry. The term ‘processing circuitry,’ as used in the present disclosure, may refer to, for example, hardware including logic circuits; a hardware/software combination such as a processor executing software; or a combination thereof. For example, the processing circuitry more specifically may include, but is not limited to, a central processing unit (CPU), an arithmetic logic unit (ALU), a digital signal processor, a microcomputer, a field programmable gate array (FPGA), a System-on-Chip (SoC), a programmable logic unit, a microprocessor, an application-specific integrated circuit (ASIC), etc.
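To illustrate how the cycle, duty ratio, and duration parameters describe a rectangular drive pulse of the kind the light source driver 40 may generate, the sketch below samples such a pulse. The function and its parameters are hypothetical and serve only to make the three parameters concrete:

```python
def pulse_samples(freq_hz, duty, duration_s, sample_rate_hz):
    """Sample an idealized rectangular drive pulse.

    The output is 1 while the fractional position within each period is
    below the duty ratio, and 0 otherwise; duration_s sets how long the
    pulse train lasts.
    """
    n = int(duration_s * sample_rate_hz)
    samples = []
    for i in range(n):
        t = i / sample_rate_hz
        fraction_of_period = (t * freq_hz) % 1.0
        samples.append(1 if fraction_of_period < duty else 0)
    return samples
```

A real driver would of course generate the waveform in hardware; the sketch only relates the parameters to the resulting on/off pattern.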
The light source 50 may output an optical signal within a specific wavelength band. In an example embodiment, the specific wavelength band may be an SWIR wavelength band around 1400 nm. Each of the pixels PX may include conductive patterns in direct contact with a semiconductor substrate to detect an optical signal of an SWIR wavelength band reflected from the object 60. As an example, a Schottky barrier may be formed on boundaries between the semiconductor substrate and the conductive patterns, and charges may be generated by the optical signal of the SWIR wavelength band.
Referring to
Each of the plurality of pixels PX may include a photodiode, configured to generate charges in response to an optical signal received by the pixel array 30, and a pixel circuit configured to output an electrical signal using the charges generated by the photodiode. The pixel circuit may include a floating diffusion configured to accumulate the charges generated by the photodiode, a transfer transistor configured to connect the floating diffusion and the photodiode to each other, a reset transistor configured to reset the floating diffusion, a drive transistor configured to amplify a voltage of the floating diffusion, a select transistor configured to connect the drive transistor to one of the column lines COL1[1] to COL1[n] and COL2[1] to COL2[n], and/or the like.
When an optical signal output from a light source is reflected from an object and is incident on the pixel array 30, a photodiode of each of the pixels PX may generate charges in response to the incident optical signal. The optical signal output from the light source, and the received optical signal reflected from the object and incident on the pixel array 30, may have a determined phase difference. In an example embodiment, an imaging device may determine a distance between the imaging device and the object, and/or may sense proximity of the object using the phase difference, and/or may generate a depth map.
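The relationship between the measured phase difference and the distance is the standard continuous-wave time-of-flight formula: the round-trip delay is the phase divided by the angular modulation frequency, and the one-way distance is half the round trip. A minimal sketch under that assumption (the function name is illustrative, not from the source):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s


def distance_from_phase(phase_rad, mod_freq_hz):
    """Distance implied by the phase shift between the emitted and
    received signals in a continuous-wave time-of-flight system.

    Round-trip delay t = phase / (2*pi*f); one-way distance = c*t/2,
    giving d = c * phase / (4*pi*f).
    """
    import math
    return C * phase_rad / (4.0 * math.pi * mod_freq_hz)
```

Note the result is unambiguous only within half a modulation wavelength; longer ranges alias, which is why practical systems pick the modulation frequency to match the intended range.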
The imaging device may obtain an electrical signal corresponding to the generated charge for an integration time through a plurality of first column lines COL1[1] to COL1[n] and a plurality of second column lines COL2[1] to COL2[n] connected to the plurality of pixels PX in a single frame. As an example, a transfer transistor, included in each of the plurality of pixels PX, may be turned on and/or turned off using a transfer control signal having a phase difference of 180 degrees to obtain an electrical signal through the first column lines COL1[1] to COL1[n] and/or the second column lines COL2[1] to COL2[n].
Referring to
The first pixel circuit PC1 may include a first transfer transistor TX1 connected to the photodiode PD, a first floating diffusion FD1 configured to accumulate charges generated by the photodiode PD, and a plurality of first circuit elements RX1, DX1, and SX1. The plurality of first circuit elements RX1, DX1, and SX1 may include a first reset transistor RX1, a first drive transistor DX1, and a first select transistor SX1. The second pixel circuit PC2 may have a structure similar to a structure of the first pixel circuit PC1. For example, the second pixel circuit PC2 may include a second transfer transistor TX2, a second floating diffusion FD2, a second reset transistor RX2, a second drive transistor DX2, and a second select transistor SX2. Control signals TG1, RG1, and SEL1 for controlling the first transfer transistor TX1, the first reset transistor RX1, and the first select transistor SX1, respectively, may be input by a row driver of an imaging device (e.g., from the row driver 21). Likewise, control signals TG2, RG2, and SEL2 for controlling the second transfer transistor TX2, the second reset transistor RX2, and the second select transistor SX2, respectively, may be input by the row driver of the imaging device (e.g., from the row driver 21).
When the first reset transistor RX1 is turned on, a voltage of the first floating diffusion FD1 may be reset to a power supply voltage VDD. The first select transistor SX1 may be turned on, allowing the first sampling circuit SA1 to detect a first reset voltage. For a first exposure time, corresponding to a period from a first time at which the first reset transistor RX1 is turned off to a second time at which the first transfer transistor TX1 is turned on, the photodiode PD may be exposed to light to generate charges.
When the first transfer transistor TX1 is turned on, charges of the photodiode PD may be accumulated on the first floating diffusion FD1 and the first sampling circuit SA1 may detect a first pixel voltage in response to turn-on of the first select transistor SX1. A first analog-to-digital converter may convert a difference between the first reset voltage and the first pixel voltage into first raw data DATA1 in a digital format.
An operation of the second pixel circuit PC2 may be similar to an operation of the first pixel circuit PC1. However, the second transfer transistor TX2 may not be turned on simultaneously or contemporaneously with the first transfer transistor TX1. Accordingly, a second pixel voltage, output through the second column line COL2 by the second pixel circuit PC2, may correspond to a charge generated by exposing the photodiode PD to light for a second exposure time different from the first exposure time. A second analog-to-digital converter may convert a difference between a second reset voltage and a second pixel voltage into second raw data DATA2.
In an example embodiment, an imaging device may operate in a global shutter manner. As an example, after first and second reset transistors RX1 and RX2, included in respective pixels included in the imaging device, are all turned on to reset the pixels PX, photodiodes of the pixels PX may be exposed for a determined exposure time to generate charges. A length of time of exposure of the photodiode PD to light may vary depending on an operating mode of the imaging device. For the time of exposure, the first transfer transistor TX1 and the second transfer transistor TX2 may be alternately turned on and turned off and may be controlled in synchronization with a driving signal of a light source actually operating in the imaging device.
As an example, for a first time, a first transfer control signal TG1, input to the first transfer transistor TX1, may have the same phase or a similar phase as the driving signal of the light source and a second transfer control signal TG2, input to the second transfer transistor TX2, may have a phase difference of 180 degrees with the driving signal of the light source. For a second time following the first time, the first transfer control signal TG1 may have a phase difference of 90 degrees with the driving signal of the light source and the second transfer control signal TG2 may have a phase difference of 270 degrees with the driving signal of the light source. An image sensor may recognize an object and/or may determine a distance to the object by using first raw data DATA1 and second raw data DATA2, obtained for the first time, and first raw data DATA1 and second raw data DATA2 obtained for the second time. As an example, each of the first and second times may be a frame period of the image sensor.
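The 0/90/180/270-degree sampling scheme described above is commonly demodulated with a four-phase arctangent: the two opposite-phase pairs are differenced, which cancels ambient offset, and the phase follows from the ratio. The demodulation model below is a conventional textbook form, stated here as an assumption rather than quoted from the source:

```python
import math


def phase_from_samples(q0, q90, q180, q270):
    """Recover the emitted-vs-received phase shift from four charge
    samples taken with transfer gates at 0/90/180/270 degrees.

    For an ideal sinusoidal correlation Q(theta) = A*cos(phi - theta) + B,
    q0 - q180 = 2A*cos(phi) and q90 - q270 = 2A*sin(phi), so the constant
    ambient term B cancels; atan2 keeps the correct quadrant.
    """
    return math.atan2(q90 - q270, q0 - q180) % (2 * math.pi)
```

The recovered phase can then be converted to distance with the continuous-wave time-of-flight relation d = c·phi/(4·pi·f).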
Referring to
The photodiode PD may be connected to a power supply node, outputting (e.g., configured to output) a power supply voltage VDD, through an overflow transistor OX. The overflow transistor OX may be turned on and turned off by an overflow control signal OG to prevent or reduce saturation of the photodiode PD.
Compared with the example illustrated in
In the example illustrated in
The first pixel circuit PC1 may transfer the charges, stored in the first storage transistor SXX1, to a first floating diffusion FD1 to generate first raw data DATA1. The second pixel circuit PC2 may transfer the charges, stored in the second storage transistor SXX2, to a second floating diffusion FD2 to generate second raw data DATA2. A method of measuring a distance to an object and/or recognizing the object using the first raw data DATA1 and the second raw data DATA2 may be the same as or similar to that described with reference to
In an example embodiment, an image sensor 100 may include a plurality of pixels PX arranged in an array format. Referring to
A photodiode 105 may be disposed in a semiconductor substrate 101, and each of the plurality of pixels PX may include at least one photodiode 105. Although
The semiconductor substrate 101 may have a first surface, and a second surface, opposing the first surface. In the example illustrated in
The optical region 110, formed on the second surface, may receive externally introduced light. The optical region 110 may include a microlens 111, conductive patterns 113, an optical insulating layer 115, and/or the like. The microlens 111 may be disposed in each of the pixels PX to refract the externally introduced light. The optical insulating layer 115 may include a filter and/or a planarized layer.
The conductive patterns 113 may be formed of at least one of a metal material, a metal silicide material, and/or a transparent conductive material and may be in direct contact with the semiconductor substrate 101. The conductive patterns 113 may be disposed between the microlens 111 and the photodiode 105, and may be disposed closer to the second surface than to the first surface of the semiconductor substrate 101.
The conductive patterns 113 may be in direct contact with the semiconductor substrate 101 on the second surface. A Schottky barrier may be formed on boundaries between the conductive patterns 113 and the semiconductor substrate 101, and charges may be generated in the semiconductor substrate 101 by an infrared optical signal (e.g., a signal of a wavelength band of 1350 nm to 1450 nm). In this case, electrons may be generated as main charge carriers in the semiconductor substrate 101. Accordingly, an imaging device may be implemented to recognize an object and/or to measure a distance to the object by coupling a light source, configured to output an optical signal of an SWIR wavelength band in which most sunlight is absorbed by moisture in the atmosphere, to an image sensor 100. In addition, performance of the imaging device may be improved.
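Internal photoemission over a Schottky barrier detects photons whose energy is at or above the barrier height, which sets a cutoff wavelength of roughly hc/φB; for a response extending to about 1400 nm the barrier must therefore be below roughly 0.89 eV. A back-of-the-envelope sketch under that assumption (the barrier values are illustrative; actual barrier heights depend on the particular metal or silicide and the silicon doping):

```python
HC_EV_NM = 1239.84  # Planck constant times speed of light, in eV*nm


def cutoff_wavelength_nm(barrier_ev):
    """Longest wavelength detectable by internal photoemission over a
    Schottky barrier of the given height, from photon energy >= barrier.
    """
    return HC_EV_NM / barrier_ev
```

For example, a barrier near 0.886 eV would give a cutoff near 1400 nm, consistent with the SWIR band discussed above, whereas silicon's 1.12 eV band gap alone cuts off near 1100 nm.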
Referring to
Referring to
Referring to
In the example illustrated in
Referring to
In the example illustrated in
Referring to
In the example illustrated in
Referring to
When the pixel circuit region 320 is formed, the semiconductor substrate 301 may be turned over to expose the second surface and a polishing process may be selectively performed. As illustrated in
Referring to
Referring to
Referring to
When the pixel circuit region 420 is formed, the semiconductor substrate 401 may be turned over to expose the second surface, and a polishing process may be selectively performed. As illustrated in
Referring to
Referring to
Referring to
An optical region 510 may include a microlens 511, conductive patterns 513, an optical insulating layer 515, and/or the like. In the example illustrated in
Referring to
In the example illustrated in
Referring to
In the example illustrated in
Referring to
When the pixel circuit region 520 is formed, the semiconductor substrate 501 may be turned over to expose the second surface and a polishing process may selectively be performed. As illustrated in
Referring to
Referring to
Referring to
The optical region 610 may include a microlens 611, conductive patterns 613, an intermediate insulating layer 614, an optical insulating layer 615, and/or the like. The microlens 611, the conductive patterns 613, and/or the optical insulating layer 615 may be the same as or similar to those in any of the above-described examples. As an example, the conductive patterns 613 may be formed of a metal material, a metal silicide material, a transparent conductive material, and/or the like.
The intermediate insulating layer 614 may be disposed between the semiconductor substrate 601 and the conductive patterns 613 and may be formed of a material, through which electrons may pass due to a tunneling effect, for example, a silicon oxide, and/or the like. According to some embodiments, the intermediate insulating layer 614 may be formed of a dielectric material containing fluorine and/or hydrogen.
The intermediate insulating layer 614 may have a thickness smaller than a thickness of each of the conductive patterns 613 and/or a thickness of the optical insulating layer 615. For example, the intermediate insulating layer 614 may have a thickness of 3 nm or less. In an example embodiment, the intermediate insulating layer 614 may have a thickness smaller than or equal to ⅕ of a thickness of each of the conductive patterns 613.
Referring to
The impurity region 707 may be doped with impurities of a first conductivity type, and may be formed adjacent to conductive patterns 713. As an example, when electrons are generated as main charge carriers, the impurity region 707 may be doped with P-type impurities, and the impurity region 707 may be formed to significantly reduce generation of dark current.
Unlike the image sensor 700 according to the example illustrated in
Referring to
In an example embodiment, the conductive patterns 813 may include a first conductive layer 813A, a second conductive layer 813B, a pattern insulating layer 813C disposed therebetween, and/or the like. According to some embodiments, one or more pattern insulating layers 813C may be omitted or added, and a conductive layer may be further added in addition to the first conductive layer 813A and the second conductive layer 813B. The second conductive layer 813B may be formed of a material different from a material of the first conductive layer 813A, and may be formed closer to the microlens 811 than to the first conductive layer 813A. In an example embodiment, the first conductive layer 813A may be formed of a metal silicide material, and/or the second conductive layer 813B may be formed of a metal material.
An image sensor 800A according to the example illustrated in
Referring to
Referring to
A computer device 1000 according to the example illustrated in
The processing circuitry 1040 may perform specific operations, commands, tasks, and/or the like. For example, the processing circuitry 1040 may be a central processing unit (CPU), a microprocessor unit (MCU), or a system-on-chip (SoC) and may communicate with the display 1010, the image sensor 1020, and the memory device 1030, as well as with other units connected to the port 1050, through a bus 1060.
The memory 1030 may be a storage medium configured to store data and/or multimedia data used for operating the computer device 1000. The memory 1030 may include a volatile memory, such as random access memory (RAM), and/or a nonvolatile memory, such as flash memory. In addition, the memory 1030 may include at least one of a solid state drive (SSD), a hard disc drive (HDD), and/or an optical drive (ODD) as a storage unit. The image sensor 1020 may be employed in the computer device 1000 in the form of various embodiments described with reference to
As described above, an image sensor according to example embodiments may include a semiconductor substrate providing a plurality of pixel regions, and conductive patterns may be formed on one surface of the semiconductor substrate to be in direct contact with the semiconductor substrate. A wavelength band of an optical signal, in which charges may be generated in the semiconductor substrate, may be determined by the conductive patterns. Accordingly, an image sensor, configured to sense an optical signal of a wavelength band in which intensity of sunlight is low, and an imaging device using the image sensor, may be implemented. Moreover, stable operations of the image sensor and the imaging device may be secured irrespective of the intensity of sunlight.
Spatially relative terms, such as “on,” “covering,” and/or the like, may be used herein for ease of description to describe one element or feature's relationship to another element(s) or feature(s) as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements described as “on” other elements or features would then be oriented “under” the other elements or features. Thus, the term “on” can encompass both an orientation of above and below. The device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly.
It will be understood that when an element is referred to as being “connected” or “coupled” to another element, it can be directly connected or coupled to the other element or intervening elements may be present. As used herein the term “and/or” includes any and all combinations of one or more of the associated listed items.
Some example embodiments are described herein with reference to cross-sectional illustrations that are schematic illustrations of idealized example embodiments (and intermediate structures) of some example embodiments. As such, variations from the shapes of the illustrations as a result, for example, of manufacturing techniques and/or tolerances, are to be expected. Thus, some example embodiments should not be construed as limited to the particular shapes of regions illustrated herein but are to include deviations in shapes that result, for example, from manufacturing.
The various advantages and effects of the inventive concepts are not limited to the above description, and may be more easily understood in the course of describing some example embodiments of the inventive concepts.
While some example embodiments have been shown and described above, it will be apparent to those skilled in the art that modifications and variations could be made without departing from the scope of the inventive concepts as defined by the appended claims.
Number | Date | Country | Kind
---|---|---|---
10-2019-0025716 | Mar 2019 | KR | national
Number | Date | Country
---|---|---
20200286942 A1 | Sep 2020 | US