The present disclosure relates generally to digital pixel sensors. More specifically, but not by way of limitation, this disclosure relates to a digital pixel sensor that uses a single-input comparator for quantization.
A typical image sensor includes an array of pixel cells. Each pixel cell may include a photodiode to sense light by converting photons into electric charge (e.g., electrons or holes). The electric charge generated by the array of photodiodes can then be quantized by an analog-to-digital converter (ADC) into digital values to generate a digital image. The digital image may be transmitted from the sensor to another system for use by the other system. Examples of the other system may include a viewing system for viewing the digital image, a processing system for interpreting the digital image, or a compilation system for compiling a set of digital images.
One example of the present disclosure relates to an image sensing apparatus comprising a pixel cell including a photodiode configured to generate an electric charge in response to light. The image sensing apparatus also includes a quantizer circuit coupled to the pixel cell. The quantizer circuit includes a charge storage device configured to generate a voltage based on the electric charge, the charge storage device being coupled between the pixel cell and a comparator. The quantizer circuit also includes the comparator, wherein the comparator is a single-input comparator configured to switch from a first output state to a second output state in response to a ramp signal being equivalent to the voltage of the charge storage device. The quantizer circuit also includes a memory switch coupled to an output of the single-input comparator, the memory switch being configured to cause a counter value from a digital counter to be stored in a digital memory in response to the single-input comparator switching from the first output state to the second output state. The counter value can serve as a digital pixel value corresponding to the voltage. The image sensing apparatus also includes a ramp generator configured to transmit the ramp signal to a node positioned between the pixel cell and the charge storage device for switching the comparator from the first output state to the second output state.
Another example of the present disclosure relates to a method performed by an image sensor apparatus that includes a pixel cell coupled to a quantizer circuit, the quantizer circuit including a single-input comparator and a charge storage device coupled between the pixel cell and the single-input comparator. The method includes resetting the single-input comparator to a trip voltage level. The method includes generating a voltage at the charge storage device based on a difference between the trip voltage level and an electric charge output by the pixel cell. The method includes transmitting a ramp signal to a node positioned between the pixel cell and the charge storage device. The method includes switching the single-input comparator from a first output state to a second output state in response to the ramp signal being equivalent to the voltage at the charge storage device. The method includes, based on switching the single-input comparator from the first output state to the second output state, transmitting an output voltage from the single-input comparator to a memory switch. And the method includes storing a counter value from a digital counter to a digital memory in response to the memory switch receiving the output voltage from the single-input comparator, the counter value serving as a digital pixel value associated with the pixel cell.
Yet another example of the present disclosure relates to an artificial reality system that includes a display device for outputting an artificial reality environment, and an image sensor. The image sensor comprises a pixel array configured to generate a digital image, wherein the pixel array includes pixel cells with photodiodes configured to generate electric charges in response to light. The image sensor also includes a quantizer circuit coupled to the pixel array. The quantizer circuit includes a charge storage device configured to generate a voltage based on the electric charge, the charge storage device being coupled between the pixel cell and a comparator. The quantizer circuit also includes the comparator, wherein the comparator is a single-input comparator configured to switch from a first output state to a second output state in response to a ramp signal being equivalent to the voltage of the charge storage device. The quantizer circuit also includes a memory switch coupled to an output of the single-input comparator, the memory switch being configured to cause a counter value from a digital counter to be stored in a digital memory in response to the single-input comparator switching from the first output state to the second output state. The counter value can serve as a digital pixel value corresponding to the voltage. The image sensor can also include a ramp generator configured to transmit the ramp signal to a node positioned between the pixel cell and the charge storage device for switching the comparator from the first output state to the second output state. The artificial reality system further includes a host processor coupled to the image sensor and the display device, the host processor being configured to generate artificial reality content for display on the display device based on the digital image generated using the image sensor.
These illustrative examples are mentioned not to limit or define the scope of this disclosure, but rather to provide examples to aid understanding thereof. Illustrative examples are discussed in the Detailed Description, which provides further description. Advantages offered by various examples may be further understood by examining this specification.
Illustrative embodiments are described with reference to the following figures.
The figures depict some examples of the present disclosure for purposes of illustration only. One skilled in the art will readily recognize from the following description that alternative examples of the structures and methods illustrated may be employed without departing from the principles, or benefits touted, of this disclosure.
In the appended figures, similar components and/or features may have the same reference label. Further, various components of the same type may be distinguished by following the reference label by a dash and a second label that distinguishes among the similar components. If only the first reference label is used in the specification, the description is applicable to any one of the similar components having the same first reference label irrespective of the second reference label.
In the following description, for the purposes of explanation, specific details are set forth in order to provide a thorough understanding of certain inventive embodiments. However, it will be apparent that various embodiments may be practiced without these specific details. The figures and description are not intended to be restrictive.
A digital image sensor (“image sensor”) typically includes an array of pixel cells coupled to one or more quantizer circuits (“quantizers”). Each pixel cell includes a photodiode to sense incident light by converting photons into electric charge. The electric charge generated by the photodiode of each pixel cell is then supplied to a quantizer to convert the electric charge into digital values. The digital values are stored in a memory and used to generate a digital image. The digital image may be used in various wearable applications, such as object recognition and tracking, location tracking, augmented reality (AR), virtual reality (VR), etc.
A conventional quantizer of a digital image sensor can be a complex arrangement of components, such as a differential comparator and a series of transistors. A differential comparator is a comparator that receives two input voltages at two input pins, compares the two input voltages to one another, and switches between output states based on a difference between the two input voltages. This complex arrangement of components can have a relatively large footprint that takes up a relatively large amount of space on a substrate of the image sensor. Additionally, the differential comparator's performance can degrade as its supply voltage is reduced. The large size and performance degradation associated with conventional quantizers can be problematic in the context of digital image sensors, for which smaller physical sizes and lower supply voltages are key goals as image sensors continue to be integrated into smaller devices.
Some examples of the present disclosure can overcome one or more of the abovementioned problems by providing a quantizer for a digital image sensor that may have a smaller footprint than a conventional quantizer and that may perform better at lower supply voltages than a conventional quantizer, thereby allowing the digital image sensor to be integrated into smaller devices than may be possible using a conventional quantizer.
As one particular example, a quantizer of the present disclosure can include a first switch, such as a transistor. The quantizer can also include a charge storage device, such as a capacitor. The first switch is positioned between a pixel cell and the charge storage device for electrically coupling and decoupling the pixel cell and the charge storage device. In the quantizer, the charge storage device is coupled to a single-input comparator. A single-input comparator is a type of integrated circuit component that has only one input pin and may have one or more output pins. As used herein, the “input pin” of a comparator refers to an integrated circuit pin configured to receive an input voltage (Vin_comp) for comparison to a reference voltage (Vref) to control an output voltage at an output pin. An input pin is different from the other types of pins on a comparator, such as a power pin (Vs), a ground pin (GND), an output pin (OUT), and control pins such as a reset pin (RST).
The single-input comparator can be reset by connecting its output pin to its input pin. When the comparator is reset, it eventually reaches a steady state called a “trip point.” The input/output voltage of the comparator at the trip point is an intrinsic characteristic of the comparator that is determined by its internal circuitry (e.g., its transistors) and the supply voltage. Thus, the trip point (Vtrip) is not controlled by the input voltage on the input pin of the comparator. The trip point serves as the threshold voltage of the comparator against which the input voltage at the input pin is compared, unlike other types of comparators that may include multiple input pins for receiving multiple input voltages and comparing the input voltages to one another. Thus, the trip point serves as the reference voltage (Vref) in a single-input comparator.
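For purposes of illustration only, the reset behavior described above can be modeled with a short Python sketch that finds the self-consistent operating point of a generic high-gain inverting stage whose output is tied back to its input. The sigmoid transfer curve, gain, and 1.1 V supply below are assumptions chosen for readability, not the transistor-level behavior of the disclosed comparator.

```python
# Toy model of resetting a single-input comparator: with the output fed back
# to the input, the stage settles where input voltage equals output voltage.
# The transfer curve and numeric values are illustrative assumptions only.

import math

VDD = 1.1  # assumed supply voltage (volts)

def inverting_stage(v_in: float, gain: float = 40.0, v_mid: float = 0.55) -> float:
    """Assumed high-gain inverting transfer curve saturating between 0 and VDD."""
    return VDD / (1.0 + math.exp(gain * (v_in - v_mid)))

# Find the voltage at which output equals input (the trip point) by bisection.
lo, hi = 0.0, VDD
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if inverting_stage(mid) > mid:
        lo = mid
    else:
        hi = mid

v_trip = 0.5 * (lo + hi)
print(f"Vtrip = {v_trip:.3f} V")  # set by the stage and supply, not by any input signal
```

In the actual circuit, the feedback through the reset switch drives this settling directly; the numerical search above only illustrates that the settled value is a property of the comparator and its supply voltage.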
The quantizer also includes a second switch operable to couple a ramp generator to the quantizer. For example, the second switch can couple the ramp generator to, and decouple the ramp generator from, a node of the quantizer located between the pixel cell and the charge storage device. When the second switch is closed, the ramp generator can supply a ramp signal to the single-input comparator. The ramp signal is an electric signal that has a linearly increasing voltage or a linearly decreasing voltage, depending on the implementation.
The quantizer can be coupled to a controller, which can operate the quantizer as follows. The controller can operate a reset switch to reset the single-input comparator to its trip point. While the comparator is operating at its trip point, the controller can close the first switch to connect the pixel cell to the charge storage device. Because the comparator is operating at Vtrip, the charge storage device will charge to a voltage level that is the difference between Vtrip and the input voltage (Vin) to the quantizer circuit from the pixel cell. This voltage level can be referred to as a "differential voltage." Storing this differential voltage in the charge storage device can be referred to as "sampling" the differential voltage. The sampled differential voltage is then supplied as input to the single-input comparator for comparison to Vtrip. If the differential voltage is less than Vtrip, which will usually be the case, the comparator will remain in a first output state.
Next, the controller can open the first switch and close the second switch to connect the ramp generator to the single-input comparator. One example of the ramp generator can include a ramp voltage generator. The ramp generator supplies a ramp signal that sweeps a voltage range, causing the input voltage to the comparator to change as the ramp signal's voltage changes. When the ramp signal's voltage crosses the trip point of the comparator, the comparator's output will flip from the first output state to a second output state. Flipping from the first output state to the second output state causes the counter value of a digital counter synchronized with the ramp signal to be latched into digital memory cells. The stored counter value can serve as the digital pixel value corresponding to the electric charge output by the pixel cell. In this way, the electric charge provided by the pixel cell is converted into a digital pixel value using the quantizer.
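The quantization sequence above can be restated, for purposes of illustration only, as a short behavioral sketch in Python. It is a simplified numerical model rather than the circuit itself: the `quantize` function is hypothetical, and the trip point, ramp span, and counter resolution are assumed example values.

```python
# Behavioral sketch of the quantization sequence: sample (Vtrip - Vin) on C1,
# sweep a down ramp, and latch the counter value when the comparator flips.
# All names and numeric values are illustrative assumptions.

V_TRIP = 0.55                     # assumed trip point of the single-input comparator
N_STEPS = 1024                    # assumed counter range (10-bit)
RAMP_HIGH, RAMP_LOW = 1.0, 0.0    # assumed down-ramp span (volts)

def quantize(v_in: float) -> int:
    """Return the counter value latched when the comparator changes state."""
    sampled = V_TRIP - v_in                      # differential voltage held on C1
    for count in range(N_STEPS):                 # counter synchronized with the ramp
        v_ramp = RAMP_HIGH - (RAMP_HIGH - RAMP_LOW) * count / (N_STEPS - 1)
        v2 = v_ramp + sampled                    # comparator input tracks the ramp
        if v2 <= V_TRIP:                         # flips when Vramp crosses Vin
            return count                         # value stored in digital memory
    return N_STEPS - 1                           # ramp ended without a flip

print(quantize(0.25), quantize(0.75))            # larger Vin flips earlier on a down ramp
```

In a physical implementation, the memory switch latches the counter value on the comparator's output edge; the early return in the sketch plays that role.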
As described above, the quantizer described herein can have a smaller footprint than a conventional quantizer because the single-input comparator can include less internal circuitry and therefore have a smaller overall footprint than the operational transconductance amplifiers (OTAs) (e.g., differential comparators) used in conventional quantizers. And while the quantizer described herein includes the first and second switches to select which of the pixel cell and the ramp signal is coupled to the comparator, the space consumed by the first and second switches is relatively negligible in the quantizer's overall footprint. A voltage protection circuit can also be included in the quantizer to achieve a higher input range than may be possible with conventional quantizers that rely on OTAs, as will be explained in greater detail below. Additional inverter stages or positive feedback stages can be added for further amplification and load driving, which may allow for increased gain as compared to OTAs. Since the single-input comparator may only consume significant current near the trip point where the comparator flips, the amount of current consumed by the single-input comparator during other times can be very small, helping to save power.
The above introduction is provided merely as an example, not to limit or define the limits of the present subject matter. Various other examples are described herein and variations of such examples would be understood by one of skill in the art. Advantages offered by various examples may be further understood by examining this specification and/or by practicing one or more examples of the claimed subject matter.
Photodiode 102 may include, for example, a P-N diode, a P-I-N diode, or a pinned diode. Photodiode 102 can generate and accumulate charge upon receiving light within an exposure period, and the quantity of charge generated within the exposure period can be proportional to the intensity of the light. In some examples, the exposure period can be defined based on the timing of the AB signal.
The image sensor 100 can be coupled to an image processor 109 configured to perform one or more processing operations on the digital pixel values generated by the array of pixels to generate an output digital image 110. Examples of the image processing operations can include filtering, feature extraction, cropping, etc. The digital image 110 may then be transmitted to another system, such as a viewing system for viewing the digital image, a processing system for interpreting the digital image, or a compilation system for compiling a set of digital images.
One example of the quantizer 107 is shown in
In the example shown in
In the quantizer 107, a first switch (SW1) is coupled between the pixel cell and the charge storage device. The first switch is operable to couple the pixel cell to the charge storage device and decouple the pixel cell from the charge storage device based on control signals from a controller 208. A second switch (SW2) is coupled to a circuit node (N1) located between the first switch and the charge storage device. The second switch is operable to couple a ramp generator to the node and decouple the ramp generator from the node based on control signals from the controller 208. A reset switch (RST) is coupled between the output and the input of the comparator. The reset switch is operable to reset the comparator to a trip point (a trip voltage level) based on control signals from the controller 208. Examples of the switches can include transistors or relays.
In some examples, the quantizer 107 can include an output capacitor (C2) coupled between the output of the comparator and ground. The output capacitor may be a band-limiting capacitor that can reduce comparator noise. In some examples, the output capacitor can be implemented as a dedicated capacitor or as a parasitic capacitor, depending on the capacitor size needed to achieve a target noise level.
In some examples, the quantizer 107 can include output logic 210 coupled to the output of the comparator to modify the digital signal output by the comparator. Examples of the output logic 210 can include an inverter or an amplifier, such as a positive feedback stage. The output logic 210 can provide for further amplification and load driving, which may allow for increased gain as compared to conventional quantizers.
In some examples, the quantizer 107 can include a voltage protection circuit 212. For example, the voltage protection circuit 212 can be coupled to a node (N2) located between the charge storage device and the single-input comparator. The voltage protection circuit 212 can limit the voltage at node N2, which is equal to the input voltage to the comparator, to a predefined voltage range. This can prevent too much voltage from being supplied to the comparator and thereby prevent the comparator from malfunctioning or breaking.
Some examples of the voltage protection circuit 212 are shown in
Still referring to
As shown in
Referring to
Next, the controller 208 can transmit a second control signal to operate the second switch to electrically connect the ramp generator 214 to the single-input comparator. The ramp generator 214 supplies a ramp signal that sweeps a voltage range, causing the voltage at node N2 to change as the ramp signal's voltage changes. The voltage at node N2 can be referred to as V2. When V2 crosses Vtrip, the comparator's output will flip from the first output state to a second output state, causing the output voltage from the comparator to change. The output voltage from the comparator is equal to the voltage at node N3, which can be referred to as V3. Once the comparator switches output states, V3 may be HIGH or LOW, depending on the implementation. The output voltage may then be supplied to the additional output logic 210 to produce a final output signal, represented in
In some examples, the input range of the quantizer 107 can be derived as follows. For purposes of discussion, the following variables will be used: Vin_h and Vin_l are the upper and lower bounds of the input voltage (Vin) from the pixel cell; VADR is the input signal range (Vin_h−Vin_l); Vramp_h and Vramp_l are the upper and lower bounds of the ramp signal; Vtrip is the trip point of the comparator; and VDD is the supply voltage.
In some cases, the ramp signal may exhibit some nonlinearity near both Vin_h and Vin_l. To avoid such a nonlinear region, the ramp signal range can be selected to exceed the input signal range with a margin on both the high end and the low end. For example:
Vramp_h=Vin_h+VADR/8 (1)
Vramp_l=Vin_l−VADR/8 (2)
During the reset phase in which the comparator is reset to its trip point, the voltage at node N2 (V2) will be set to Vtrip. When the first switch is turned off, Vin−Vtrip will be sampled on C1. When the second switch is turned on and node N1 is connected to Vramp, V2 can change as Vramp changes. The upper and lower bounds of V2 can be:
V2_max=Vramp_h−(Vin_l−Vtrip) (3)
V2_min=Vramp_l−(Vin_h−Vtrip) (4)
where V2_max should be less than the supply voltage and V2_min should be above ground (i.e., electrical ground). As one specific example, if V2_max≤1.1 V and V2_min≥0 V are requirements for the quantizer 107, then:
V2_max−V2_min=Vramp_h−Vramp_l+Vin_h−Vin_l (5)
After plugging equations (1) and (2) into equation (5), the result is:
V2_max−V2_min=2*(Vin_h−Vin_l)+VADR/4=2.25*VADR≤1.1V (6)
This can result in VADR≤0.49 V.
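As a numeric check of equations (1) through (6), the short script below plugs in the example requirements V2_max≤1.1 V and V2_min≥0 V. The value of Vin_l is an arbitrary illustrative assumption, and Vtrip is chosen here so that V2_min lands exactly at 0 V; the script reproduces the VADR≤0.49 V result stated above.

```python
# Numeric check of equations (1)-(6) for the example requirements
# V2_max <= 1.1 V and V2_min >= 0 V. Vin_l is an arbitrary illustrative value;
# Vtrip is chosen so that V2_min lands exactly at 0 V.

V2_MAX_LIMIT = 1.1                     # example requirement from the text (volts)

vadr = V2_MAX_LIMIT / 2.25             # eq. (6): 2.25 * VADR <= 1.1 V -> ~0.489 V
vin_l = 0.2                            # assumed lower bound of the pixel output
vin_h = vin_l + vadr                   # VADR is the input signal range Vin_h - Vin_l
vtrip = 1.125 * vadr                   # places V2_min exactly at 0 V

vramp_h = vin_h + vadr / 8             # eq. (1)
vramp_l = vin_l - vadr / 8             # eq. (2)
v2_max = vramp_h - (vin_l - vtrip)     # eq. (3) -> 1.1 V
v2_min = vramp_l - (vin_h - vtrip)     # eq. (4) -> 0.0 V

print(f"VADR = {vadr:.3f} V, V2 range = [{v2_min:.3f} V, {v2_max:.3f} V]")
```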
In the example above, the upper bound of VADR assumes there is no additional voltage protection circuit 212 at node N2. By including the voltage protection circuit 212, there is a discharge path when the voltage exceeds a predefined limit. But because node N2 needs to stay floating to preserve the sampled differential voltage on C1 for the quantizer 107 to function properly, the added discharge path should only be engaged after the comparator finishes making its comparison, as signified by the comparator flipping between output states. The comparator flips between output states when V2=Vtrip or Vramp=Vin. So, when the ramp signal is a down ramp, V2 can be discharged when Vramp<Vin. When the ramp signal is an up ramp, V2 can be discharged when Vramp>Vin. This is illustrated in
The timing and amount of discharging performed by the voltage protection circuit 212 can be dictated by its configuration. Referring to
In the down ramp case, a lower Vtrip may be preferred because this leads to a larger VADR. This relationship can be explained by the equation VADR≤(VDD−Vtrip)/1.125. For example, if VDD=1.1V and Vtrip=0.3V, VADR≤0.71 V can be achieved. This is greater than VADR≤0.49 V when the voltage protection circuit 212 is not used, as described above. For similar reasons, a higher Vtrip may be preferred when an up ramp is used because this leads to a larger VADR.
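For comparison, the two bounds can be evaluated side by side under the example figures used above (VDD=1.1 V, Vtrip=0.3 V, down ramp); this is only a restatement of the arithmetic, with the down-ramp relationship VADR≤(VDD−Vtrip)/1.125 taken from the text.

```python
# Achievable input range with and without the voltage protection circuit,
# using the example figures from the text (VDD = 1.1 V, Vtrip = 0.3 V, down ramp).

VDD, VTRIP = 1.1, 0.3

vadr_without_protection = VDD / 2.25             # ~0.49 V, from eq. (6)
vadr_with_protection = (VDD - VTRIP) / 1.125     # ~0.71 V, down-ramp case

print(f"without protection: VADR <= {vadr_without_protection:.2f} V")
print(f"with protection:    VADR <= {vadr_with_protection:.2f} V")
```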
Some digital image sensors may implement correlated double sampling (CDS) to reduce noise. Correlated double sampling is a method by which two samples are taken during a pixel read-out cycle. One sample is taken when the pixel is in a reset state and another sample is taken when the charge has been transferred to the read-out node. The two values are then used as differential signals in further stages, such as a quantization stage. Correlated double sampling can generally be implemented in two ways. The first way involves determining a difference between the two analog signals output from the pixel and quantizing the difference. This is referred to as analog CDS, since the subtraction happens in the analog domain. The second way involves quantizing the two analog signals output from the pixel into digital values and then subtracting the digital values from one another. This is referred to as digital CDS, since the subtraction happens in the digital domain. To implement analog CDS with the quantizer 107, a capacitor divider can be provided at the input node of the comparator, as described in U.S. patent application Ser. No. 17/072,840. To implement digital CDS with the quantizer 107, the timing sequence shown in
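As a rough sketch of the digital CDS arithmetic only (the timing sequence itself is shown in the referenced figure), the snippet below quantizes a reset sample and a signal sample with a generic uniform ADC model and subtracts the resulting digital codes. The `adc` function and the sample values are assumptions for illustration, not the quantizer 107.

```python
# Digital CDS in miniature: both the reset sample and the signal sample are
# quantized, and the subtraction happens on the digital codes.
# The uniform ADC model and sample values below are illustrative assumptions.

def adc(v: float, v_min: float = 0.0, v_max: float = 1.1, bits: int = 10) -> int:
    """Idealized uniform quantizer used only for illustration."""
    code = round((v - v_min) / (v_max - v_min) * (2 ** bits - 1))
    return max(0, min(2 ** bits - 1, code))

reset_sample = 0.90    # pixel output in the reset state (assumed value)
signal_sample = 0.55   # pixel output after charge transfer (assumed value)

digital_cds_value = adc(reset_sample) - adc(signal_sample)
print(digital_cds_value)   # digital pixel value with reset offset removed
```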
In some examples, a digital image sensor that includes the quantizer 107 can be implemented using a stacked (e.g., layered) arrangement of substrates. One example of this configuration is shown in
In some examples, the primary semiconductor substrate 902 and one or more secondary semiconductor substrates 906 can form a stack along a vertical direction (e.g., represented by z-axis), with vertical interconnects 904 and 908 to provide electrical connection among the substrates. Such arrangements can reduce the routing distance of the electrical connections between the pixel cell array, the processing circuits, and the controller, which can increase the speed of transmission of data (especially pixel data) from the pixel cell array to the other components, and reduce the power required for the transmission.
To further reduce pixel size, the quantizer 107 may be shared among a group of pixel cells in some examples.
As shown, a block of four pixel cells may share a block-level quantization circuit 1220, which can include a block-level ADC (e.g., quantizer 107) and a block-level memory 1216, via a multiplexor. Each pixel cell can take turns accessing quantization circuit 1220 to quantize the charge. For example, pixel cells 101a0-a3 share quantization circuit 1220a0, pixel cells 101a4-a7 share quantization circuit 1220a1, pixel cells 101b0-b3 share quantization circuit 1220b0, and pixel cells 101b4-b7 share quantization circuit 1220b1. In other examples, each pixel cell may have its own dedicated quantization circuit.
Image sensor 100 further includes other circuits, such as a counter 1240 and a digital-to-analog converter (DAC) 1242. Counter 1240 may be similar to the digital counter 216. The counter 1240 can be configured as a digital ramp circuit to supply count values to memory 1216. The count values can also be supplied to DAC 1242 to generate an analog ramp, which can be supplied to the quantizer 107 to perform the quantization operation. Thus, the counter 1240 and the DAC 1242 may collectively serve as a ramp generator in some examples.
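A minimal sketch of the counter and DAC acting together as a ramp generator is shown below; the idealized DAC mapping, the 10-bit resolution, and the full-scale voltage are illustrative assumptions rather than characteristics of the DAC 1242.

```python
# Counter + DAC as a ramp generator: the counter values form the digital ramp,
# and an idealized DAC maps each value onto the analog ramp.
# Resolution and full-scale voltage are assumed for illustration.

N_BITS = 10
V_FULL_SCALE = 1.1   # assumed DAC full-scale output (volts)

def dac(count: int) -> float:
    """Idealized DAC: counter value -> analog ramp voltage."""
    return V_FULL_SCALE * count / (2 ** N_BITS - 1)

digital_ramp = list(range(2 ** N_BITS))         # counter values sent to the memories
analog_ramp = [dac(c) for c in digital_ramp]    # analog ramp sent to the quantizers

print(digital_ramp[:3], [round(v, 4) for v in analog_ramp[:3]])
```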
Image sensor 100 further includes a buffer network 1230 containing buffers 1230a-d. The buffers 1230a-d can distribute the digital ramp signals representing the counter values, and the analog ramp signal, to processing circuits 1214 of different blocks of pixel cells, such that at any given time each processing circuit 1214 receives the same analog ramp voltage and the same digital counter value. This may help ensure that any difference in the digital values output by different pixel cells is due to differences in the intensity of light received by the pixel cells, not due to mismatches in the digital ramp signals/counter values and analog ramp signals received by the pixel cells.
The image data from image sensor 100 can be transmitted to a host processor to support different applications, such as applications for identifying and tracking objects in images or for performing depth sensing of objects in images. Examples of the host processor can include a Field-Programmable Gate Array (FPGA), an application-specific integrated circuit (ASIC), a microprocessor, a microcontroller, or a combination of these. In some examples, the host processor can be part of an artificial reality system that can make use of the images generated by the image sensor 100. Artificial reality is a form of reality that has been adjusted in some manner before presentation to a user, which may include, e.g., a virtual reality (VR), an augmented reality (AR), a mixed reality (MR), a hybrid reality, or some combination and/or derivatives thereof. Artificial reality content may include completely generated content or generated content combined with captured (e.g., real-world) content. The artificial reality content may include video, audio, haptic feedback, or some combination thereof, any of which may be presented in a single channel or in multiple channels (such as stereo video that produces a three-dimensional effect to the viewer). Additionally, in some embodiments, artificial reality may also be associated with applications, products, accessories, services, or some combination thereof, that are used to, e.g., create content in an artificial reality and/or are otherwise used in (e.g., perform activities in) an artificial reality. The artificial reality system that provides the artificial reality content may be implemented on various platforms, including a head-mounted display (HMD) connected to a host computer system, a standalone HMD, a mobile device or computing system, or any other hardware platform capable of providing artificial reality content to one or more viewers. The artificial reality system can include a display device, such as a liquid crystal display (LCD) or light-emitting diode (LED) display, for displaying the augmented reality content to a user.
In block 1302, the image sensor 100 transmits a reset control signal to a reset switch (e.g., RST) of a quantizer 107. A controller 208 that is part of, or coupled to, the quantizer 107 can transmit the reset control signal.
In block 1304, the image sensor 100 resets a single-input comparator (e.g., Comp) of the quantizer 107 to a trip level in response to the reset control signal.
In block 1306, the image sensor 100 transmits a first control signal to a first switch (e.g., SW1) of the quantizer 107. The controller 208 can transmit the first control signal. The first control signal can enable a pixel cell to be electrically connected to a charge storage device (e.g., C1) of the quantizer 107.
In block 1308, the image sensor 100 samples a voltage difference onto the charge storage device of the quantizer 107. The voltage difference can be a difference between an input voltage (e.g., Vin) from the pixel cell and the trip level.
In block 1310, the image sensor 100 transmits a second control signal to a second switch (e.g., SW2) of the quantizer 107. The controller 208 can transmit the second control signal. The second control signal can enable a ramp generator to be electrically connected to the quantizer 107.
In block 1312, the image sensor 100 provides a ramp signal from the ramp generator to the quantizer 107.
In block 1314, the image sensor 100 switches the single-input comparator from a first output state to a second output state based on the ramp signal.
In block 1316, the image sensor 100 generates an output voltage (e.g., V3) at the single-input comparator when the comparator is in the second output state. That is, the single-input comparator can generate the output voltage when it is in the second output state. The output voltage can then be transmitted from the single-input comparator.
In block 1318, the image sensor 100 amplifies the output voltage using an amplification circuit (e.g., output logic 210) to produce an amplified digital signal. In some examples, the image sensor 100 may apply other techniques to smooth, filter, or otherwise modify the output voltage from the comparator using output logic 210.
In block 1320, the image sensor 100 stores a counter value 202 of a digital counter 216 in a digital memory 204. The counter value 202 can serve as a digital pixel value, which can represent the electric charge output by the pixel cell. The counter value 202 can be stored in the digital memory 204 in response to a memory switch (SW_MEM) changing states. The memory switch can change states based on the output voltage from the single-input comparator (e.g., the amplified digital signal).
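The latch in block 1320 can be pictured with a small runnable sketch: the counter value is captured at the step where the comparator output changes state. The waveforms below are synthetic samples chosen for illustration only.

```python
# Block 1320 in miniature: the memory switch latches the counter value on the
# edge where the comparator output changes state. The waveforms are synthetic.

comparator_output = [0, 0, 0, 0, 1, 1, 1]   # flips to the second output state at step 4
counter_values = [0, 1, 2, 3, 4, 5, 6]      # digital counter synchronized with the ramp

digital_memory = None
previous = comparator_output[0]
for count, out in zip(counter_values, comparator_output):
    if out != previous:              # memory switch (SW_MEM) changes state
        digital_memory = count       # counter value stored as the digital pixel value
        break
    previous = out

print(digital_memory)                # -> 4
```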
In block 1322, the image sensor 100 discharges an input voltage (e.g., V2) at the comparator using a voltage protection circuit 212 coupled to the quantizer 107. The voltage protection circuit 212 may be part of, or electrically coupled to, the quantizer 107. Discharging the input voltage can involve dissipating at least some of the input voltage to ground.
In block 1402, a manufacturer provides a first switch (e.g., SW1), a second switch (e.g., SW2), a reset switch (e.g., RST), a charge storage device (e.g., C1), and/or a comparator (e.g., Comp) on one or more substrates for use in a quantizer 107 of the image sensor 100. This may involve attaching some or all of these components to the one or more substrates and/or fabricating some or all of these components on the one or more substrates through various manufacturing processes. The one or more substrates may be part of a printed circuit board.
In block 1404, the manufacturer electrically couples a pixel cell of the image sensor 100 to the first switch. Electrically coupling two components together may involve electrically connecting the two components together with wires or traces. The traces may be printed traces formed from copper or another conductive material.
In block 1406, the manufacturer electrically couples the first switch to the charge storage device. The manufacturer can couple these two components together such that the first switch is positioned between the pixel cell and the charge storage device. For example, if the first switch has two leads, one lead may be electrically coupled to the pixel cell and the other lead may be electrically coupled to the charge storage device. This configuration may allow the first switch to transmit voltage from the pixel cell to the charge storage device when the first switch is in a closed state, and may prevent the first switch from transmitting voltage from the pixel cell to the charge storage device when the first switch is in an open state.
In block 1408, the manufacturer electrically couples the charge storage device to the comparator. For example, the manufacturer can electrically couple the charge storage device to the input pin of the comparator. The manufacturer can couple these two components together such that the charge storage device is positioned between the first switch and the comparator. For example, if the charge storage device has two leads, one lead may be electrically coupled to the first switch and the other lead may be electrically coupled to the comparator.
In block 1410, the manufacturer electrically couples the reset switch between the output and the input of the comparator. For example, if the reset switch has two leads, one lead may be electrically coupled to the output pin of the comparator and the other lead may be electrically coupled to the input pin of the comparator. This configuration may allow the reset switch to transmit voltage from the comparator's output to the comparator's input when the reset switch is in a closed state, causing the comparator to reset to a trip state.
In block 1412, the manufacturer can electrically couple the second switch to a node (e.g., N2) of the quantizer 107. The node may be positioned between the charge storage device and the comparator.
In block 1414, the manufacturer electrically couples a voltage protection circuit to the second switch. For example, if the second switch has two leads, one lead may be electrically coupled to the node and the other lead may be electrically coupled to the voltage protection circuit. This configuration may allow the second switch to transmit voltage between the node and the voltage protection circuit when the second switch is in a closed state, and may prevent the second switch from transmitting voltage from the node to the voltage protection circuit when the second switch is in an open state. The other side of the voltage protection circuit can be electrically coupled to ground or a power supply, depending on the implementation.
In block 1416, the manufacturer electrically couples output logic 210 to the comparator. For example, the manufacturer can electrically couple the output logic 210 to the output pin of the comparator, so that output logic 210 operates on an output voltage at the output pin of the comparator.
In block 1418, the manufacturer electrically couples a capacitor (e.g., C2) between the comparator and ground. For example, the manufacturer can electrically couple the capacitor between the output pin of the comparator and ground.
In block 1420, the manufacturer electrically couples a controller 208 of the image sensor 100 to the reset switch, the first switch, and/or the second switch.
Some portions of this description describe the embodiments of the disclosure in terms of algorithms and symbolic representations of operations on information. These algorithmic descriptions and representations are commonly used by those skilled in the data processing arts to convey the substance of their work effectively to others skilled in the art. These operations, while described functionally, computationally, or logically, are understood to be implemented by computer programs or equivalent electrical circuits, microcode, or the like. Furthermore, it has also proven convenient at times, to refer to these arrangements of operations as modules, without loss of generality. The described operations and their associated modules may be embodied in software, firmware, and/or hardware.
Steps, operations, or processes described may be performed or implemented with one or more hardware or software modules, alone or in combination with other devices. In some embodiments, a software module is implemented with a computer program product comprising a computer-readable medium containing computer program code, which can be executed by a computer processor for performing any or all of the steps, operations, or processes described.
Embodiments of the disclosure may also relate to an apparatus for performing the operations described. The apparatus may be specially constructed for the required purposes, and/or it may include a general-purpose computing device selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a non-transitory, tangible computer-readable storage medium, or any type of media suitable for storing electronic instructions, which may be coupled to a computer system bus. Furthermore, any computing systems referred to in the specification may include a single processor or may be architectures employing multiple processor designs for increased computing capability.
Embodiments of the disclosure may also relate to a product that is produced by a computing process described herein. Such a product may include information resulting from a computing process, where the information is stored on a non-transitory, tangible computer-readable storage medium and may include any embodiment of a computer program product or other data combination described herein.
The language used in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the disclosure be limited not by this detailed description, but rather by any claims that issue on an application based hereon. Accordingly, the disclosure of the embodiments is intended to be illustrative, but not limiting, of the scope of the disclosure, which is set forth in the following claims.
This application claims priority to U.S. provisional patent application Ser. No. 63/131,563, filed Dec. 29, 2020, entitled, “DIGITAL PIXEL SENSOR USING SINGLE-INPUT COMPARATOR,” which is hereby expressly incorporated by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
4596977 | Bauman et al. | Jun 1986 | A |
5053771 | McDermott | Oct 1991 | A |
5650643 | Konuma | Jul 1997 | A |
5844512 | Gorin et al. | Dec 1998 | A |
5963369 | Steinthal et al. | Oct 1999 | A |
6181822 | Miller et al. | Jan 2001 | B1 |
6384905 | Barrows | May 2002 | B1 |
6522395 | Bamji et al. | Feb 2003 | B1 |
6529241 | Clark | Mar 2003 | B1 |
6864817 | Salvi et al. | Mar 2005 | B1 |
6963369 | Olding | Nov 2005 | B1 |
7326903 | Ackland | Feb 2008 | B2 |
7362365 | Reyneri et al. | Apr 2008 | B1 |
7659772 | Nomura et al. | Feb 2010 | B2 |
7659925 | Krymski | Feb 2010 | B2 |
7719589 | Turchetta et al. | May 2010 | B2 |
7880779 | Storm | Feb 2011 | B2 |
7956914 | Xu | Jun 2011 | B2 |
8134623 | Purcell et al. | Mar 2012 | B2 |
8144227 | Kobayashi | Mar 2012 | B2 |
8369458 | Wong et al. | Feb 2013 | B2 |
8426793 | Barrows | Apr 2013 | B1 |
8754798 | Lin | Jun 2014 | B2 |
8773562 | Fan | Jul 2014 | B1 |
8779346 | Fowler et al. | Jul 2014 | B2 |
8946610 | Iwabuchi et al. | Feb 2015 | B2 |
9001251 | Smith et al. | Apr 2015 | B2 |
9094629 | Ishibashi | Jul 2015 | B2 |
9185273 | Beck et al. | Nov 2015 | B2 |
9274151 | Lee et al. | Mar 2016 | B2 |
9282264 | Park et al. | Mar 2016 | B2 |
9332200 | Hseih et al. | May 2016 | B1 |
9343497 | Cho | May 2016 | B2 |
9363454 | Ito et al. | Jun 2016 | B2 |
9478579 | Dai et al. | Oct 2016 | B2 |
9497396 | Choi | Nov 2016 | B2 |
9531990 | Wilkins et al. | Dec 2016 | B1 |
9800260 | Banerjee | Oct 2017 | B1 |
9819885 | Furukawa et al. | Nov 2017 | B2 |
9832370 | Cho et al. | Nov 2017 | B2 |
9909922 | Schweickert et al. | Mar 2018 | B2 |
9948316 | Yun et al. | Apr 2018 | B1 |
9955091 | Dai et al. | Apr 2018 | B1 |
9967496 | Ayers et al. | May 2018 | B2 |
10003759 | Fan | Jun 2018 | B2 |
10015416 | Borthakur et al. | Jul 2018 | B2 |
10090342 | Gambino et al. | Oct 2018 | B1 |
10096631 | Ishizu | Oct 2018 | B2 |
10154221 | Ogino et al. | Dec 2018 | B2 |
10157951 | Kim et al. | Dec 2018 | B2 |
10321081 | Watanabe et al. | Jun 2019 | B2 |
10345447 | Hicks | Jul 2019 | B1 |
10419701 | Liu | Sep 2019 | B2 |
10574925 | Otaka | Feb 2020 | B2 |
10594974 | Ivarsson et al. | Mar 2020 | B2 |
10598546 | Liu | Mar 2020 | B2 |
10608101 | Liu | Mar 2020 | B2 |
10686996 | Liu | Jun 2020 | B2 |
10726627 | Liu | Jul 2020 | B2 |
10750097 | Liu | Aug 2020 | B2 |
10764526 | Liu et al. | Sep 2020 | B1 |
10804926 | Gao et al. | Oct 2020 | B2 |
10812742 | Chen et al. | Oct 2020 | B2 |
10825854 | Liu | Nov 2020 | B2 |
10834344 | Chen et al. | Nov 2020 | B2 |
10897586 | Liu et al. | Jan 2021 | B2 |
10903260 | Chen et al. | Jan 2021 | B2 |
10917589 | Liu | Feb 2021 | B2 |
10951849 | Liu | Mar 2021 | B2 |
10969273 | Berkovich et al. | Apr 2021 | B2 |
11004881 | Liu et al. | May 2021 | B2 |
11057581 | Liu | Jul 2021 | B2 |
11089210 | Berkovich et al. | Aug 2021 | B2 |
11595598 | Liu et al. | Feb 2023 | B2 |
11595602 | Gao et al. | Feb 2023 | B2 |
11729525 | Liu | Aug 2023 | B2 |
20020067303 | Lee et al. | Jun 2002 | A1 |
20020113886 | Hynecek | Aug 2002 | A1 |
20020118289 | Choi | Aug 2002 | A1 |
20030001080 | Kummaraguntla et al. | Jan 2003 | A1 |
20030020100 | Guidash | Jan 2003 | A1 |
20030049925 | Layman et al. | Mar 2003 | A1 |
20040090200 | Youm | May 2004 | A1 |
20040095495 | Inokuma et al. | May 2004 | A1 |
20040118994 | Mizuno | Jun 2004 | A1 |
20040227495 | Egan | Nov 2004 | A1 |
20040251483 | Ko et al. | Dec 2004 | A1 |
20050046715 | Lim et al. | Mar 2005 | A1 |
20050057389 | Krymski | Mar 2005 | A1 |
20050104983 | Raynor | May 2005 | A1 |
20050206414 | Cottin et al. | Sep 2005 | A1 |
20050231624 | Muramatsu | Oct 2005 | A1 |
20050237380 | Kakii et al. | Oct 2005 | A1 |
20050280727 | Sato et al. | Dec 2005 | A1 |
20060023109 | Mabuchi et al. | Feb 2006 | A1 |
20060146159 | Farrier | Jul 2006 | A1 |
20060157759 | Okita et al. | Jul 2006 | A1 |
20060158541 | Ichikawa | Jul 2006 | A1 |
20070013983 | Kitamura et al. | Jan 2007 | A1 |
20070076109 | Krymski | Apr 2007 | A1 |
20070076481 | Tennant | Apr 2007 | A1 |
20070092244 | Pertsel et al. | Apr 2007 | A1 |
20070102740 | Ellis-Monaghan et al. | May 2007 | A1 |
20070131991 | Sugawa | Jun 2007 | A1 |
20070208526 | Staudt et al. | Sep 2007 | A1 |
20070216564 | Koseki | Sep 2007 | A1 |
20070222881 | Mentzer | Sep 2007 | A1 |
20080001065 | Ackland | Jan 2008 | A1 |
20080007731 | Botchway et al. | Jan 2008 | A1 |
20080042888 | Danesh | Feb 2008 | A1 |
20080068478 | Watanabe | Mar 2008 | A1 |
20080088014 | Adkisson et al. | Apr 2008 | A1 |
20080191791 | Nomura et al. | Aug 2008 | A1 |
20080226170 | Sonoda | Sep 2008 | A1 |
20080226183 | Lei et al. | Sep 2008 | A1 |
20080266434 | Sugawa et al. | Oct 2008 | A1 |
20090002528 | Manabe et al. | Jan 2009 | A1 |
20090033588 | Kajita et al. | Feb 2009 | A1 |
20090040364 | Rubner | Feb 2009 | A1 |
20090091645 | Trimeche et al. | Apr 2009 | A1 |
20090128640 | Yumiki | May 2009 | A1 |
20090140305 | Sugawa | Jun 2009 | A1 |
20090219266 | Lim et al. | Sep 2009 | A1 |
20090224139 | Buettgen et al. | Sep 2009 | A1 |
20090237536 | Purcell et al. | Sep 2009 | A1 |
20090244346 | Funaki | Oct 2009 | A1 |
20090245637 | Barman et al. | Oct 2009 | A1 |
20090256735 | Bogaerts | Oct 2009 | A1 |
20090261235 | Lahav et al. | Oct 2009 | A1 |
20090321615 | Sugiyama et al. | Dec 2009 | A1 |
20100013969 | Ui | Jan 2010 | A1 |
20100140732 | Eminoglu et al. | Jun 2010 | A1 |
20100194956 | Yuan et al. | Aug 2010 | A1 |
20100232227 | Lee | Sep 2010 | A1 |
20100276572 | Iwabuchi et al. | Nov 2010 | A1 |
20110049589 | Chuang et al. | Mar 2011 | A1 |
20110122304 | Sedelnikov | May 2011 | A1 |
20110149116 | Kim | Jun 2011 | A1 |
20110155892 | Neter et al. | Jun 2011 | A1 |
20110254986 | Nishimura et al. | Oct 2011 | A1 |
20120016817 | Smith et al. | Jan 2012 | A1 |
20120039548 | Wang | Feb 2012 | A1 |
20120068051 | Ahn et al. | Mar 2012 | A1 |
20120092677 | Suehira et al. | Apr 2012 | A1 |
20120105475 | Tseng | May 2012 | A1 |
20120105668 | Velarde et al. | May 2012 | A1 |
20120113119 | Massie | May 2012 | A1 |
20120127284 | Bar-Zeev et al. | May 2012 | A1 |
20120133807 | Wu et al. | May 2012 | A1 |
20120138775 | Cheon et al. | Jun 2012 | A1 |
20120153123 | Mao et al. | Jun 2012 | A1 |
20120188420 | Black et al. | Jul 2012 | A1 |
20120200499 | Osterhout et al. | Aug 2012 | A1 |
20120205520 | Hsieh et al. | Aug 2012 | A1 |
20120212465 | White et al. | Aug 2012 | A1 |
20120241591 | Wan et al. | Sep 2012 | A1 |
20120262616 | Sa et al. | Oct 2012 | A1 |
20120267511 | Kozlowski | Oct 2012 | A1 |
20120273654 | Hynecek et al. | Nov 2012 | A1 |
20120305751 | Kusuda | Dec 2012 | A1 |
20130020466 | Ayers et al. | Jan 2013 | A1 |
20130056809 | Mao et al. | Mar 2013 | A1 |
20130057742 | Nakamura et al. | Mar 2013 | A1 |
20130068929 | Solhusvik et al. | Mar 2013 | A1 |
20130069787 | Petrou | Mar 2013 | A1 |
20130082313 | Manabe | Apr 2013 | A1 |
20130113969 | Manabe et al. | May 2013 | A1 |
20130126710 | Kondo | May 2013 | A1 |
20130141619 | Lim et al. | Jun 2013 | A1 |
20130187027 | Qiao et al. | Jul 2013 | A1 |
20130207219 | Ahn | Aug 2013 | A1 |
20130214127 | Ohya et al. | Aug 2013 | A1 |
20130214371 | Asatsuma et al. | Aug 2013 | A1 |
20130218728 | Hashop et al. | Aug 2013 | A1 |
20130221194 | Manabe | Aug 2013 | A1 |
20130229543 | Hashimoto et al. | Sep 2013 | A1 |
20130229560 | Kondo | Sep 2013 | A1 |
20130234029 | Bikumandla | Sep 2013 | A1 |
20130293752 | Peng et al. | Nov 2013 | A1 |
20130299674 | Fowler et al. | Nov 2013 | A1 |
20140021574 | Egawa | Jan 2014 | A1 |
20140042299 | Wan et al. | Feb 2014 | A1 |
20140042582 | Kondo | Feb 2014 | A1 |
20140070974 | Park et al. | Mar 2014 | A1 |
20140078336 | Beck et al. | Mar 2014 | A1 |
20140085523 | Hynecek | Mar 2014 | A1 |
20140176770 | Kondo | Jun 2014 | A1 |
20140211052 | Choi | Jul 2014 | A1 |
20140232890 | Yoo et al. | Aug 2014 | A1 |
20140247382 | Moldovan et al. | Sep 2014 | A1 |
20140306276 | Yamaguchi | Oct 2014 | A1 |
20140368687 | Yu et al. | Dec 2014 | A1 |
20150070544 | Smith et al. | Mar 2015 | A1 |
20150077611 | Yamashita et al. | Mar 2015 | A1 |
20150083895 | Hashimoto et al. | Mar 2015 | A1 |
20150085134 | Novotny et al. | Mar 2015 | A1 |
20150090863 | Mansoorian et al. | Apr 2015 | A1 |
20150172574 | Honda et al. | Jun 2015 | A1 |
20150179696 | Kurokawa et al. | Jun 2015 | A1 |
20150189209 | Yang et al. | Jul 2015 | A1 |
20150201142 | Smith et al. | Jul 2015 | A1 |
20150208009 | Oh et al. | Jul 2015 | A1 |
20150229859 | Guidash et al. | Aug 2015 | A1 |
20150237274 | Yang et al. | Aug 2015 | A1 |
20150279884 | Kusumoto | Oct 2015 | A1 |
20150281613 | Vogelsang et al. | Oct 2015 | A1 |
20150287766 | Kim et al. | Oct 2015 | A1 |
20150309311 | Cho | Oct 2015 | A1 |
20150309316 | Osterhout et al. | Oct 2015 | A1 |
20150312461 | Kim et al. | Oct 2015 | A1 |
20150312502 | Borremans | Oct 2015 | A1 |
20150312557 | Kim | Oct 2015 | A1 |
20150350582 | Korobov et al. | Dec 2015 | A1 |
20150358569 | Egawa | Dec 2015 | A1 |
20150358571 | Dominguez Castro et al. | Dec 2015 | A1 |
20150358593 | Sato | Dec 2015 | A1 |
20150381907 | Boettiger et al. | Dec 2015 | A1 |
20150381911 | Shen et al. | Dec 2015 | A1 |
20160011422 | Thurber et al. | Jan 2016 | A1 |
20160018645 | Haddick et al. | Jan 2016 | A1 |
20160021302 | Cho et al. | Jan 2016 | A1 |
20160028974 | Guidash | Jan 2016 | A1 |
20160028980 | Kameyama et al. | Jan 2016 | A1 |
20160037111 | Dai et al. | Feb 2016 | A1 |
20160078614 | Ryu et al. | Mar 2016 | A1 |
20160088253 | Tezuka | Mar 2016 | A1 |
20160100113 | Oh et al. | Apr 2016 | A1 |
20160100115 | Kusano | Apr 2016 | A1 |
20160111457 | Sekine | Apr 2016 | A1 |
20160112626 | Shimada | Apr 2016 | A1 |
20160118992 | Milkov | Apr 2016 | A1 |
20160165160 | Hseih et al. | Jun 2016 | A1 |
20160197117 | Nakata et al. | Jul 2016 | A1 |
20160204150 | Oh et al. | Jul 2016 | A1 |
20160210785 | Balachandreswaran et al. | Jul 2016 | A1 |
20160240570 | Barna et al. | Aug 2016 | A1 |
20160249004 | Saeki et al. | Aug 2016 | A1 |
20160255293 | Gesset | Sep 2016 | A1 |
20160277010 | Park et al. | Sep 2016 | A1 |
20160307945 | Madurawe | Oct 2016 | A1 |
20160307949 | Madurawe | Oct 2016 | A1 |
20160309140 | Wang | Oct 2016 | A1 |
20160337605 | Ito | Nov 2016 | A1 |
20160353045 | Kawahito et al. | Dec 2016 | A1 |
20160360127 | Dierickx et al. | Dec 2016 | A1 |
20170013215 | McCarten | Jan 2017 | A1 |
20170039906 | Jepsen | Feb 2017 | A1 |
20170041571 | Tyrrell et al. | Feb 2017 | A1 |
20170053962 | Oh et al. | Feb 2017 | A1 |
20170059399 | Suh et al. | Mar 2017 | A1 |
20170062501 | Velichko et al. | Mar 2017 | A1 |
20170069363 | Baker | Mar 2017 | A1 |
20170070691 | Nishikido | Mar 2017 | A1 |
20170099422 | Goma et al. | Apr 2017 | A1 |
20170099446 | Cremers et al. | Apr 2017 | A1 |
20170104021 | Park et al. | Apr 2017 | A1 |
20170104946 | Hong | Apr 2017 | A1 |
20170111600 | Wang et al. | Apr 2017 | A1 |
20170141147 | Raynor | May 2017 | A1 |
20170154909 | Ishizu | Jun 2017 | A1 |
20170170223 | Hynecek et al. | Jun 2017 | A1 |
20170195602 | Iwabuchi et al. | Jul 2017 | A1 |
20170201693 | Sugizaki et al. | Jul 2017 | A1 |
20170207268 | Kurokawa | Jul 2017 | A1 |
20170228345 | Gupta et al. | Aug 2017 | A1 |
20170251151 | Hicks | Aug 2017 | A1 |
20170270664 | Hoogi et al. | Sep 2017 | A1 |
20170272667 | Hynecek | Sep 2017 | A1 |
20170272768 | Tall et al. | Sep 2017 | A1 |
20170280031 | Price et al. | Sep 2017 | A1 |
20170293799 | Skogo et al. | Oct 2017 | A1 |
20170310910 | Smith et al. | Oct 2017 | A1 |
20170318250 | Sakakibara et al. | Nov 2017 | A1 |
20170324917 | Mlinar et al. | Nov 2017 | A1 |
20170338262 | Hirata | Nov 2017 | A1 |
20170339327 | Koshkin et al. | Nov 2017 | A1 |
20170346579 | Barghi | Nov 2017 | A1 |
20170350755 | Geurts | Dec 2017 | A1 |
20170359497 | Mandelli et al. | Dec 2017 | A1 |
20170359521 | Kim | Dec 2017 | A1 |
20170366766 | Geurts et al. | Dec 2017 | A1 |
20180019269 | Klipstein | Jan 2018 | A1 |
20180077368 | Suzuki | Mar 2018 | A1 |
20180115725 | Zhang et al. | Apr 2018 | A1 |
20180136471 | Miller et al. | May 2018 | A1 |
20180143701 | Suh et al. | May 2018 | A1 |
20180152650 | Sakakibara et al. | May 2018 | A1 |
20180167575 | Watanabe et al. | Jun 2018 | A1 |
20180175083 | Takahashi | Jun 2018 | A1 |
20180176545 | Aflaki Beni | Jun 2018 | A1 |
20180204867 | Kim et al. | Jul 2018 | A1 |
20180220093 | Murao et al. | Aug 2018 | A1 |
20180224658 | Teller | Aug 2018 | A1 |
20180227516 | Mo et al. | Aug 2018 | A1 |
20180241953 | Johnson | Aug 2018 | A1 |
20180270436 | Ivarsson et al. | Sep 2018 | A1 |
20180276841 | Krishnaswamy et al. | Sep 2018 | A1 |
20180376046 | Liu | Dec 2018 | A1 |
20180376090 | Liu | Dec 2018 | A1 |
20190035154 | Liu | Jan 2019 | A1 |
20190046044 | Tzvieli et al. | Feb 2019 | A1 |
20190052788 | Liu | Feb 2019 | A1 |
20190052821 | Berner et al. | Feb 2019 | A1 |
20190056264 | Liu | Feb 2019 | A1 |
20190057995 | Liu | Feb 2019 | A1 |
20190058058 | Liu | Feb 2019 | A1 |
20190098232 | Mori et al. | Mar 2019 | A1 |
20190104263 | Ochiai et al. | Apr 2019 | A1 |
20190104265 | Totsuka et al. | Apr 2019 | A1 |
20190110039 | Linde et al. | Apr 2019 | A1 |
20190115931 | Hurwitz | Apr 2019 | A1 |
20190123088 | Kwon | Apr 2019 | A1 |
20190124285 | Otaka | Apr 2019 | A1 |
20190141270 | Otaka et al. | May 2019 | A1 |
20190149751 | Wise | May 2019 | A1 |
20190157330 | Sato et al. | May 2019 | A1 |
20190172227 | Kasahara | Jun 2019 | A1 |
20190172868 | Chen et al. | Jun 2019 | A1 |
20190191116 | Madurawe | Jun 2019 | A1 |
20190246036 | Wu et al. | Aug 2019 | A1 |
20190253650 | Kim | Aug 2019 | A1 |
20190327439 | Chen et al. | Oct 2019 | A1 |
20190331914 | Lee et al. | Oct 2019 | A1 |
20190335151 | Rivard et al. | Oct 2019 | A1 |
20190348460 | Chen et al. | Nov 2019 | A1 |
20190355782 | Do et al. | Nov 2019 | A1 |
20190363118 | Berkovich et al. | Nov 2019 | A1 |
20190371845 | Chen et al. | Dec 2019 | A1 |
20190376845 | Liu et al. | Dec 2019 | A1 |
20190379388 | Gao et al. | Dec 2019 | A1 |
20190379827 | Berkovich et al. | Dec 2019 | A1 |
20190379846 | Chen et al. | Dec 2019 | A1 |
20200007800 | Berkovich et al. | Jan 2020 | A1 |
20200053299 | Zhang et al. | Feb 2020 | A1 |
20200059589 | Liu et al. | Feb 2020 | A1 |
20200068189 | Chen et al. | Feb 2020 | A1 |
20200186731 | Chen et al. | Jun 2020 | A1 |
20200195875 | Berkovich et al. | Jun 2020 | A1 |
20200217714 | Liu | Jul 2020 | A1 |
20200228745 | Otaka | Jul 2020 | A1 |
20200374475 | Fukuoka et al. | Nov 2020 | A1 |
20210026796 | Graif et al. | Jan 2021 | A1 |
20210099659 | Miyauchi et al. | Apr 2021 | A1 |
20210185264 | Wong et al. | Jun 2021 | A1 |
20210227159 | Sambonsugi | Jul 2021 | A1 |
20210368124 | Berkovich et al. | Nov 2021 | A1 |
20230080288 | Berkovich et al. | Mar 2023 | A1 |
20230092325 | Tsai et al. | Mar 2023 | A1 |
20230239582 | Berkovich et al. | Jul 2023 | A1 |
Number | Date | Country |
---|---|---|
1490878 | Apr 2004 | CN |
1728397 | Feb 2006 | CN |
1812506 | Aug 2006 | CN |
101753866 | Jun 2010 | CN |
103002228 | Mar 2013 | CN |
103207716 | Jul 2013 | CN |
104125418 | Oct 2014 | CN |
104204904 | Dec 2014 | CN |
104469195 | Mar 2015 | CN |
104704812 | Jun 2015 | CN |
104733485 | Jun 2015 | CN |
104754255 | Jul 2015 | CN |
204633945 | Sep 2015 | CN |
105144699 | Dec 2015 | CN |
105529342 | Apr 2016 | CN |
105706439 | Jun 2016 | CN |
205666884 | Oct 2016 | CN |
106255978 | Dec 2016 | CN |
106791504 | May 2017 | CN |
107852473 | Mar 2018 | CN |
ZL201810821296 | Jun 2022 | CN |
202016105510 | Oct 2016 | DE |
0675345 | Oct 1995 | EP |
1681856 | Jul 2006 | EP |
1732134 | Dec 2006 | EP |
1746820 | Jan 2007 | EP |
1788802 | May 2007 | EP |
2037505 | Mar 2009 | EP |
2063630 | May 2009 | EP |
2538664 | Dec 2012 | EP |
2804074 | Nov 2014 | EP |
2833619 | Feb 2015 | EP |
3032822 | Jun 2016 | EP |
3229457 | Oct 2017 | EP |
3258683 | Dec 2017 | EP |
3425352 | Jan 2019 | EP |
3439039 | Feb 2019 | EP |
3744085 | Dec 2020 | EP |
H08195906 | Jul 1996 | JP |
2001008101 | Jan 2001 | JP |
2002199292 | Jul 2002 | JP |
2003319262 | Nov 2003 | JP |
2005328493 | Nov 2005 | JP |
2006197382 | Jul 2006 | JP |
2006203736 | Aug 2006 | JP |
2007074447 | Mar 2007 | JP |
2011216966 | Oct 2011 | JP |
2012054495 | Mar 2012 | JP |
2012054876 | Mar 2012 | JP |
2012095349 | May 2012 | JP |
2013009087 | Jan 2013 | JP |
2013055581 | Mar 2013 | JP |
2013172203 | Sep 2013 | JP |
2013225774 | Oct 2013 | JP |
2014107596 | Jun 2014 | JP |
2014165733 | Sep 2014 | JP |
2014236183 | Dec 2014 | JP |
2015065524 | Apr 2015 | JP |
2015126043 | Jul 2015 | JP |
2015530855 | Oct 2015 | JP |
2015211259 | Nov 2015 | JP |
2016092661 | May 2016 | JP |
2016513942 | May 2016 | JP |
2017509251 | Mar 2017 | JP |
100574959 | Apr 2006 | KR |
20080019652 | Mar 2008 | KR |
20090023549 | Mar 2009 | KR |
20110050351 | May 2011 | KR |
20110134941 | Dec 2011 | KR |
20120058337 | Jun 2012 | KR |
20120117953 | Oct 2012 | KR |
20150095841 | Aug 2015 | KR |
20160008267 | Jan 2016 | KR |
20160008287 | Jan 2016 | KR |
201448184 | Dec 2014 | TW |
201719874 | Jun 2017 | TW |
201728161 | Aug 2017 | TW |
I624694 | May 2018 | TW |
WO-2006124592 | Nov 2006 | WO |
WO-2006129762 | Dec 2006 | WO |
WO-2010117462 | Oct 2010 | WO |
WO-2013099723 | Jul 2013 | WO |
WO-2014055391 | Apr 2014 | WO |
2014144391 | Sep 2014 | WO |
WO-2015135836 | Sep 2015 | WO |
2015182390 | Dec 2015 | WO |
2016014860 | Jan 2016 | WO |
WO-2016095057 | Jun 2016 | WO |
2016194653 | Dec 2016 | WO |
WO-2017003477 | Jan 2017 | WO |
WO-2017013806 | Jan 2017 | WO |
WO-2017047010 | Mar 2017 | WO |
WO-2017058488 | Apr 2017 | WO |
WO-2017069706 | Apr 2017 | WO |
WO-2017169446 | Oct 2017 | WO |
WO-2017169882 | Oct 2017 | WO |
WO-2019018084 | Jan 2019 | WO |
WO-2019111528 | Jun 2019 | WO |
WO-2019145578 | Aug 2019 | WO |
WO-2019168929 | Sep 2019 | WO |
Entry |
---|
Advisory Action dated Apr. 7, 2020 for U.S. Appl. No. 15/801,216, filed Nov. 1, 2019, 3 Pages. |
Advisory Action dated Oct. 8, 2020 for U.S. Appl. No. 16/210,748, filed Dec. 5, 2018, 4 Pages. |
Advisory Action dated Oct. 1, 2021 for U.S. Appl. No. 17/083,920, filed Oct. 29, 2020, 4 pages. |
Advisory Action dated Oct. 23, 2019 for U.S. Appl. No. 15/668,241, filed Aug. 3, 2017, 5 Pages. |
Amir M.F., et al., “3-D Stacked Image Sensor With Deep Neural Network Computation,” IEEE Sensors Journal, IEEE Service Center, New York, NY, US, May 15, 2018, vol. 18 (10), pp. 4187-4199, XP011681876. |
Cho K., et al., “A Low Power Dual CDS for a Column-Parallel CMOS Image Sensor,” Journal of Semiconductor Technology and Science, Dec. 30, 2012, vol. 12 (4), pp. 388-396. |
Chuxi L., et al., “A Memristor-Based Processing-in-Memory Architecture for Deep Convolutional Neural Networks Approximate Computation,” Journal of Computer Research and Development, Jun. 30, 2017, vol. 54 (6), pp. 1367-1380. |
Communication Pursuant to Article 94(3) dated Dec. 23, 2021 for European Application No. 19744961.4, filed Jun. 28, 2019, 8 pages. |
Communication Pursuant to Article 94(3) dated Jan. 5, 2022 for European Application No. 19740456.9, filed Jun. 27, 2019, 12 pages. |
Corrected Notice of Allowability dated Feb. 3, 2021 for U.S. Appl. No. 16/566,583, filed Sep. 10, 2019, 2 Pages. |
Corrected Notice of Allowability dated Apr. 9, 2021 for U.S. Appl. No. 16/255,528, filed Jan. 23, 2019, 5 Pages. |
Corrected Notice of Allowability dated Dec. 11, 2020 for U.S. Appl. No. 16/566,583, filed Sep. 10, 2019, 2 Pages. |
Corrected Notice of Allowability dated Jul. 26, 2021 for U.S. Appl. No. 16/707,988, filed Dec. 9, 2019, 2 Pages. |
Corrected Notice of Allowability dated Apr. 28, 2020 for U.S. Appl. No. 15/876,061, filed Jan. 19, 2018, 2 Pages. |
Corrected Notice of Allowance dated Mar. 7, 2022 for U.S. Appl. No. 17/150,925, filed Jan. 15, 2021, 2 Pages. |
Corrected Notice of Allowance dated Mar. 29, 2022 for U.S. Appl. No. 17/150,925, filed Jan. 15, 2021, 2 Pages. |
Extended European Search Report for European Application No. 18179838.0, dated May 24, 2019, 17 Pages. |
Extended European Search Report for European Application No. 18179846.3, dated Dec. 7, 2018, 10 Pages. |
Extended European Search Report for European Application No. 18179851.3, dated Dec. 7, 2018, 8 Pages. |
Extended European Search Report for European Application No. 18188684.7, dated Jan. 16, 2019, 10 Pages. |
Extended European Search Report for European Application No. 18188962.7, dated Oct. 23, 2018, 8 Pages. |
Extended European Search Report for European Application No. 18188968.4, dated Oct. 23, 2018, 8 Pages. |
Extended European Search Report for European Application No. 18189100.3, dated Oct. 9, 2018, 8 Pages. |
Extended European Search Report for European Application No. 19743908.6, dated Sep. 30, 2020, 9 Pages. |
Final Office Action dated Dec. 3, 2021 for U.S. Appl. No. 17/072,840, filed Oct. 16, 2020, 23 pages. |
Final Office Action dated Jul. 7, 2020 for U.S. Appl. No. 16/210,748, filed Dec. 5, 2018, 11 Pages. |
Final Office Action dated Jul. 12, 2021 for U.S. Appl. No. 16/435,451, filed Jun. 7, 2019, 13 Pages. |
Final Office Action dated Apr. 15, 2020 for U.S. Appl. No. 16/566,583, filed Sep. 10, 2019, 24 Pages. |
Final Office Action dated Jun. 17, 2019 for U.S. Appl. No. 15/668,241, filed Aug. 3, 2017, 19 Pages. |
Final Office Action dated Oct. 18, 2021 for U.S. Appl. No. 16/716,050, filed Dec. 16, 2019, 18 Pages. |
Final Office Action dated Oct. 21, 2021 for U.S. Appl. No. 16/421,441, filed May 23, 2019, 23 Pages. |
Final Office Action dated Dec. 26, 2019 for U.S. Appl. No. 15/801,216, filed Nov. 1, 2017, 5 Pages. |
Final Office Action dated Feb. 27, 2020 for U.S. Appl. No. 16/177,971, filed Nov. 1, 2018, 9 Pages. |
Final Office Action dated Jan. 27, 2021 for U.S. Appl. No. 16/255,528, filed Jan. 23, 2019, 31 Pages. |
Final Office Action dated Jul. 28, 2021 for U.S. Appl. No. 17/083,920, filed Oct. 29, 2020, 19 Pages. |
Final Office Action dated Apr. 29, 2020 for U.S. Appl. No. 15/719,345, filed Sep. 28, 2017, 14 Pages. |
International Preliminary Report on Patentability for International Application No. PCT/US2018/039350, dated Jan. 9, 2020, 10 Pages. |
International Preliminary Report on Patentability for International Application No. PCT/US2020/044807, dated Feb. 17, 2022, 10 pages. |
International Search Report and Written Opinion for International Application No. PCT/US2018/039350, dated Nov. 15, 2018, 11 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2018/039352, dated Oct. 26, 2018, 8 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2018/039431, dated Nov. 7, 2018, 12 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2018/045661, dated Nov. 30, 2018, 11 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2018/045666, dated Dec. 3, 2018, 13 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2018/045673, dated Dec. 4, 2018, 13 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2018/046131, dated Dec. 3, 2018, 10 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2018/064181, dated Mar. 29, 2019, 10 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/014044, dated May 8, 2019, 9 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/014904, dated Aug. 5, 2019, 7 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/019756, dated Jun. 13, 2019, 11 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/019765, dated Jun. 14, 2019, 9 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/025170, dated Jul. 9, 2019, 11 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/027727, dated Jun. 27, 2019, 10 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/027729, dated Jun. 27, 2019, 9 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/031521, dated Jul. 11, 2019, 8 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/034007, dated Oct. 28, 2019, 18 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/035724, dated Sep. 10, 2019, 11 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/036484, dated Sep. 19, 2019, 9 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/036492, dated Sep. 25, 2019, 8 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/036536, dated Sep. 26, 2019, 13 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/036575, dated Sep. 30, 2019, 15 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/039410, dated Sep. 30, 2019, 10 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/039758, dated Oct. 11, 2019, 12 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/047156, dated Oct. 23, 2019, 9 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/048241, dated Jan. 28, 2020, 16 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/049756, dated Dec. 16, 2019, 8 pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/059754, dated Mar. 24, 2020, 11 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/065430, dated Mar. 6, 2020, 11 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/066805, dated Mar. 6, 2020, 9 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/066831, dated Feb. 27, 2020, 11 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2020/044807, dated Sep. 30, 2020, 12 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2020/058097, dated Feb. 12, 2021, 09 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2020/059636, dated Feb. 11, 2021, 18 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2021/031201, dated Aug. 2, 2021, 13 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2021/033321, dated Sep. 6, 2021, 11 pages. |
International Search Report and Written Opinion for International Application No. PCT/US2021/041775, dated Nov. 29, 2021, 14 pages. |
International Search Report and Written Opinion for International Application No. PCT/US2021/054327, dated Feb. 14, 2022, 8 pages. |
International Search Report and Written Opinion for International Application No. PCT/US2021/057966, dated Feb. 22, 2022, 15 pages. |
International Search Report and Written Opinion for International Application No. PCT/US2021/065174, dated Mar. 28, 2022, 10 pages. |
Kavusi S., et al., “Quantitative Study of High-Dynamic-Range Image Sensor Architecture,” Proceedings of Society of Photo-Optical Instrumentation Engineers, Jun. 7, 2004, vol. 5301, 12 Pages, XP055186908. |
Millet L., et al., “A 5500-Frames/s 85-GOPS/W 3-D Stacked BSI Vision Chip Based on Parallel In-Focal-Plane Acquisition and Processing,” IEEE Journal of Solid-State Circuits, USA, Apr. 1, 2019, vol. 54 (4), pp. 1096-1105, XP011716786. |
Non-Final Office Action dated Feb. 1, 2021 for U.S. Appl. No. 16/435,451, filed Jun. 7, 2019, 14 Pages. |
Non-Final Office Action dated Jan. 1, 2021 for U.S. Appl. No. 16/715,792, filed Dec. 16, 2019, 15 Pages. |
Non-Final Office Action dated May 1, 2019 for U.S. Appl. No. 15/927,896, filed Mar. 21, 2018, 10 Pages. |
Non-Final Office Action dated May 1, 2020 for U.S. Appl. No. 16/384,720, filed Apr. 15, 2019, 6 Pages. |
Non-Final Office Action dated Oct. 1, 2019 for U.S. Appl. No. 16/286,355, filed Feb. 26, 2019, 6 Pages. |
Non-Final Office Action dated Oct. 1, 2019 for U.S. Appl. No. 16/566,583, filed Sep. 10, 2019, 10 Pages. |
Non-Final Office Action dated Jul. 2, 2021 for U.S. Appl. No. 16/820,594, filed Mar. 16, 2020, 8 Pages. |
Non-Final Office Action dated Mar. 2, 2022 for U.S. Appl. No. 17/127,670, filed Dec. 18, 2020, 18 pages. |
Non-Final Office Action dated Sep. 2, 2021 for U.S. Appl. No. 16/910,844, filed Jun. 24, 2020, 7 Pages. |
Non-Final Office Action dated Dec. 4, 2020 for U.S. Appl. No. 16/436,137, filed Jun. 10, 2019, 12 Pages. |
Non-Final Office Action dated Mar. 4, 2020 for U.S. Appl. No. 16/436,049, filed Jun. 10, 2019, 9 Pages. |
Non-Final Office Action dated May 7, 2021 for U.S. Appl. No. 16/421,441, filed May 23, 2019, 17 Pages. |
Non-Final Office Action dated Jun. 8, 2021 for U.S. Appl. No. 17/072,840, filed Oct. 16, 2020, 7 Pages. |
Non-Final Office Action dated Jul. 10, 2019 for U.S. Appl. No. 15/861,588, filed Jan. 3, 2018, 11 Pages. |
Non-Final Office Action dated Jul. 10, 2020 for U.S. Appl. No. 16/255,528, filed Jan. 23, 2019, 27 Pages. |
Non-Final Office Action dated Apr. 13, 2022 for U.S. Appl. No. 16/820,594, filed Mar. 16, 2020, 7 pages. |
Non-Final Office Action dated May 14, 2021 for U.S. Appl. No. 16/716,050, filed Dec. 16, 2019, 16 Pages. |
Non-Final Office Action dated Mar. 15, 2021 for U.S. Appl. No. 16/896,130, filed Jun. 8, 2020, 16 Pages. |
Non-Final Office Action dated Sep. 18, 2019 for U.S. Appl. No. 15/876,061, filed Jan. 19, 2018, 23 Pages. |
Non-Final Office Action dated Apr. 21, 2021 for U.S. Appl. No. 16/453,538, filed Jun. 26, 2019, 16 Pages. |
Non-Final Office Action dated Apr. 21, 2021 for U.S. Appl. No. 17/083,920, filed Oct. 29, 2020, 17 Pages. |
Non-Final Office Action dated Dec. 21, 2018 for U.S. Appl. No. 15/668,241, filed Aug. 3, 2017, 16 Pages. |
Non-Final Office Action dated Oct. 21, 2021 for U.S. Appl. No. 17/083,920, filed Oct. 29, 2020, 19 Pages. |
Non-Final Office Action dated Jul. 22, 2020 for U.S. Appl. No. 16/249,420, filed Jan. 16, 2019, 9 Pages. |
Non-Final Office Action dated Jul. 22, 2020 for U.S. Appl. No. 16/369,763, filed Mar. 29, 2019, 15 Pages. |
Non-Final Office Action dated Sep. 22, 2020 for U.S. Appl. No. 16/707,988, filed Dec. 9, 2019, 15 Pages. |
Non-Final Office Action dated Nov. 23, 2018 for U.S. Appl. No. 15/847,517, filed Dec. 19, 2017, 21 Pages. |
Non-Final Office Action dated Jul. 25, 2019 for U.S. Appl. No. 15/909,162, filed Mar. 1, 2018, 20 Pages. |
Non-Final Office Action dated Nov. 25, 2019 for U.S. Appl. No. 15/719,345, filed Sep. 28, 2017, 14 Pages. |
Non-Final Office Action dated Sep. 25, 2019 for U.S. Appl. No. 16/177,971, filed Nov. 1, 2018, 9 Pages. |
Non-Final Office Action dated Apr. 27, 2021 for U.S. Appl. No. 16/829,249, filed Mar. 25, 2020, 9 Pages. |
Non-Final Office Action dated Jul. 27, 2020 for U.S. Appl. No. 16/566,583, filed Sep. 10, 2019, 11 Pages. |
Non-Final Office Action dated Jun. 27, 2019 for U.S. Appl. No. 15/801,216, filed Nov. 1, 2017, 13 Pages. |
Non-Final Office Action dated Mar. 28, 2022 for U.S. Appl. No. 17/072,840, filed Oct. 16, 2020, 8 Pages. |
Non-Final Office Action dated Aug. 29, 2019 for U.S. Appl. No. 15/983,391, filed May 18, 2018, 12 Pages. |
Non-Final Office Action dated Jan. 30, 2020 for U.S. Appl. No. 16/431,693, filed Jun. 4, 2019, 6 Pages. |
Non-Final Office Action dated Jun. 30, 2020 for U.S. Appl. No. 16/436,049, filed Jun. 10, 2019, 11 Pages. |
Non-Final Office Action dated Jan. 31, 2020 for U.S. Appl. No. 16/210,748, filed Dec. 5, 2018, 11 Pages. |
Notice of Allowability dated May 6, 2020 for U.S. Appl. No. 15/876,061, filed Jan. 19, 2018, 2 Pages. |
Notice of Allowance dated Apr. 1, 2021 for U.S. Appl. No. 16/255,528, filed Jan. 23, 2019, 7 Pages. |
Notice of Allowance dated May 1, 2019 for U.S. Appl. No. 15/847,517, filed Dec. 19, 2017, 11 Pages. |
Notice of Allowance dated Mar. 2, 2022 for U.S. Appl. No. 16/453,538, filed Jun. 26, 2019, 8 pages. |
Notice of Allowance dated Nov. 2, 2021 for U.S. Appl. No. 16/453,538, filed Jun. 26, 2019, 8 Pages. |
Notice of Allowance dated Nov. 3, 2020 for U.S. Appl. No. 16/566,583, filed Sep. 10, 2019, 11 Pages. |
Notice of Allowance dated Sep. 3, 2020 for U.S. Appl. No. 15/719,345, filed Sep. 28, 2017, 12 Pages. |
Notice of Allowance dated Feb. 4, 2020 for U.S. Appl. No. 15/876,061, filed Jan. 19, 2018, 13 Pages. |
Notice of Allowance dated Jun. 4, 2020 for U.S. Appl. No. 16/286,355, filed Feb. 26, 2019, 7 Pages. |
Notice of Allowance dated Jul. 5, 2022 for U.S. Appl. No. 16/899,908, filed Jun. 12, 2020, 10 pages. |
Notice of Allowance dated Mar. 5, 2020 for U.S. Appl. No. 15/668,241, filed Aug. 3, 2017, 8 Pages. |
Notice of Allowance dated May 5, 2021 for U.S. Appl. No. 16/707,988, filed Dec. 9, 2019, 14 Pages. |
Notice of Allowance dated Jan. 7, 2022 for U.S. Appl. No. 16/899,908, filed Jun. 12, 2020, 10 pages. |
Notice of Allowance dated Mar. 7, 2022 for U.S. Appl. No. 16/421,441, filed May 23, 2019, 18 pages. |
Notice of Allowance dated Apr. 8, 2020 for U.S. Appl. No. 15/983,391, filed May 18, 2018, 8 Pages. |
Notice of Allowance dated Dec. 8, 2021 for U.S. Appl. No. 16/829,249, filed Mar. 25, 2020, 6 pages. |
Notice of Allowance dated Jul. 8, 2021 for U.S. Appl. No. 17/150,925, filed Jan. 15, 2021, 10 Pages. |
Notice of Allowance dated Jul. 9, 2020 for U.S. Appl. No. 16/454,787, filed Jun. 27, 2019, 9 Pages. |
Notice of Allowance dated Sep. 9, 2020 for U.S. Appl. No. 16/454,787, filed Jun. 27, 2019, 9 Pages. |
Notice of Allowance dated Aug. 10, 2022 for U.S. Appl. No. 16/896,130, filed Jun. 8, 2020, 8 pages. |
Notice of Allowance dated Jun. 11, 2020 for U.S. Appl. No. 16/382,015, filed Apr. 11, 2019, 11 Pages. |
Notice of Allowance dated Mar. 11, 2022 for U.S. Appl. No. 16/716,050, filed Dec. 16, 2019, 13 pages. |
Notice of Allowance dated Aug. 12, 2020 for U.S. Appl. No. 15/719,345, filed Sep. 28, 2017, 10 Pages. |
Notice of Allowance dated Feb. 12, 2020 for U.S. Appl. No. 16/286,355, filed Feb. 26, 2019, 7 Pages. |
Notice of Allowance dated Jul. 13, 2021 for U.S. Appl. No. 16/896,130, filed Jun. 8, 2020, 8 Pages. |
Notice of Allowance dated Feb. 14, 2022 for U.S. Appl. No. 16/715,792, filed Dec. 16, 2019, 9 pages. |
Notice of Allowance dated Oct. 14, 2020 for U.S. Appl. No. 16/384,720, filed Apr. 15, 2019, 8 Pages. |
Notice of Allowance dated Oct. 15, 2020 for U.S. Appl. No. 16/544,136, filed Aug. 19, 2019, 11 Pages. |
Notice of Allowance dated Apr. 16, 2021 for U.S. Appl. No. 16/715,792, filed Dec. 16, 2019, 10 Pages. |
Notice of Allowance dated Feb. 16, 2022 for U.S. Appl. No. 17/150,925, filed Jan. 15, 2021, 9 pages. |
Notice of Allowance dated Sep. 16, 2020 for U.S. Appl. No. 16/435,449, filed Jun. 7, 2019, 7 Pages. |
Notice of Allowance dated Nov. 17, 2021 for U.S. Appl. No. 16/899,908, filed Jun. 12, 2020, 7 Pages. |
Notice of Allowance dated Sep. 17, 2021 for U.S. Appl. No. 16/899,908, filed Jun. 12, 2020, 11 Pages. |
Notice of Allowance dated Mar. 18, 2020 for U.S. Appl. No. 15/909,162, filed Mar. 1, 2018, 9 Pages. |
Notice of Allowance dated Nov. 18, 2020 for U.S. Appl. No. 16/249,420, filed Jan. 16, 2019, 8 Pages. |
Notice of Allowance dated Oct. 18, 2019 for U.S. Appl. No. 15/983,379, filed May 18, 2018, 9 Pages. |
Notice of Allowance dated Apr. 19, 2022 for U.S. Appl. No. 16/899,908, filed Jun. 12, 2020, 10 pages. |
Notice of Allowance dated Dec. 21, 2021 for U.S. Appl. No. 16/550,851, filed Aug. 26, 2019, 10 pages. |
Notice of Allowance dated Oct. 21, 2020 for U.S. Appl. No. 16/436,049, filed Jun. 10, 2019, 8 Pages. |
Notice of Allowance dated Apr. 22, 2020 for U.S. Appl. No. 16/454,787, filed Jun. 27, 2019, 10 Pages. |
Notice of Allowance dated Aug. 22, 2022 for U.S. Appl. No. 16/435,451, filed Jun. 7, 2019, 08 pages. |
Notice of Allowance dated Dec. 22, 2021 for U.S. Appl. No. 16/910,844, filed Jun. 24, 2020, 7 pages. |
Notice of Allowance dated Feb. 22, 2022 for U.S. Appl. No. 17/083,920, filed Oct. 29, 2020, 10 pages. |
Notice of Allowance dated Jan. 22, 2021 for U.S. Appl. No. 16/369,763, filed Mar. 29, 2019, 8 Pages. |
Notice of Allowance dated Nov. 22, 2021 for U.S. Appl. No. 16/820,594, filed Mar. 16, 2020, 18 pages. |
Notice of Allowance dated Nov. 22, 2021 for U.S. Appl. No. 16/820,594, filed Mar. 16, 2020, 8 pages. |
Notice of Allowance dated Sep. 22, 2022 for U.S. Appl. No. 17/150,925, filed Jan. 15, 2021, 9 pages. |
Notice of Allowance dated Aug. 23, 2022 for U.S. Appl. No. 16/896,130, filed Jun. 8, 2020, 2 pages. |
Notice of Allowance dated Jun. 23, 2020 for U.S. Appl. No. 15/801,216, filed Nov. 1, 2017, 5 Pages. |
Notice of Allowance dated Apr. 24, 2020 for U.S. Appl. No. 16/177,971, filed Nov. 1, 2018, 6 Pages. |
Notice of Allowance dated Jun. 24, 2020 for U.S. Appl. No. 16/431,693, filed Jun. 4, 2019, 7 Pages. |
Notice of Allowance dated Nov. 24, 2021 for U.S. Appl. No. 16/910,844, filed Jun. 24, 2020, 8 pages. |
Notice of Allowance dated Aug. 25, 2021 for U.S. Appl. No. 16/715,792, filed Dec. 16, 2019, 9 Pages. |
Notice of Allowance dated Oct. 25, 2021 for U.S. Appl. No. 16/435,451, filed Jun. 7, 2019, 8 Pages. |
Notice of Allowance dated Aug. 26, 2020 for U.S. Appl. No. 16/384,720, filed Apr. 15, 2019, 8 Pages. |
Notice of Allowance dated Nov. 26, 2019 for U.S. Appl. No. 15/861,588, filed Jan. 3, 2018, 9 Pages. |
Notice of Allowance dated Oct. 26, 2021 for U.S. Appl. No. 16/896,130, filed Jun. 8, 2020, 8 Pages. |
Notice of Allowance dated Apr. 27, 2022 for U.S. Appl. No. 16/896,130, filed Jun. 8, 2020, 08 pages. |
Notice of Allowance dated Jul. 27, 2020 for U.S. Appl. No. 16/435,449, filed Jun. 7, 2019, 8 Pages. |
Notice of Allowance dated Apr. 28, 2022 for U.S. Appl. No. 16/435,451, filed Jun. 7, 2019, 09 pages. |
Notice of Allowance dated Jun. 29, 2020 for U.S. Appl. No. 15/668,241, filed Aug. 3, 2017, 8 Pages. |
Notice of Allowance dated Aug. 30, 2021 for U.S. Appl. No. 16/829,249, filed Mar. 25, 2020, 8 pages. |
Notice of Allowance dated Jun. 8, 2022 for U.S. Appl. No. 17/150,925, filed Jan. 15, 2021, 10 pages. |
Notice of Reason for Rejection dated Nov. 16, 2021 for Japanese Application No. 2019-571699, filed Jun. 25, 2018, 13 pages. |
Notification of the First Office Action dated Oct. 28, 2021 for Chinese Application No. 2019800218483, filed Jan. 24, 2019, 17 pages. |
Office Action for European Application No. 18179851.3, dated May 19, 2022, 7 pages. |
Office Action dated Jul. 3, 2020 for Chinese Application No. 201810821296, filed Jul. 24, 2018, 17 Pages. |
Office Action dated Jul. 5, 2022 for Korean Application No. 10-2020-7002533, filed Jun. 25, 2018, 13 pages. |
Office Action dated Jul. 7, 2021 for European Application No. 19723902.3, filed Apr. 1, 2019, 3 Pages. |
Office Action dated Jul. 7, 2021 for European Application No. 19737299.8, filed Jun. 11, 2019, 5 Pages. |
Office Action dated Mar. 9, 2021 for Chinese Application No. 201810821296, filed Jul. 24, 2018, 10 Pages. |
Office Action dated Jul. 12, 2022 for Japanese Application No. 2019-571699, filed Jun. 25, 2018, 5 pages. |
Office Action dated Aug. 14, 2019 for European Application No. 18188968.4, filed Aug. 14, 2018, 5 Pages. |
Office Action dated Dec. 14, 2021 for Japanese Application No. 2019571598, filed Jun. 26, 2018, 12 pages. |
Office Action dated Mar. 15, 2022 for Japanese Patent Application No. 2020505830, filed on Aug. 9, 2018, 12 pages. |
Office Action dated Mar. 17, 2022 for Taiwan Application No. 20180124384, 26 pages. |
Office Action dated May 18, 2022 for Taiwan Application No. 108122878, 24 pages. |
Office Action dated Jul. 19, 2022 for Japanese Application No. 2019571598, filed Jun. 26, 2018, 10 pages. |
Office Action dated Nov. 26, 2019 for European Application No. 18188684.7, filed Aug. 13, 2018, 9 Pages. |
Office Action dated Sep. 26, 2022 for Korean Patent Application No. 10-2020-7002496, filed on Jun. 26, 2018, 17 pages. |
Office Action dated Aug. 28, 2019 for European Application No. 18188962.7, filed Aug. 14, 2018, 6 Pages. |
Office Action dated Jun. 28, 2020 for Chinese Application No. 201810821296, filed Jul. 24, 2018, 2 Pages. |
Office Action dated Mar. 29, 2022 for Japanese Patent Application No. 2020520431, filed on Jun. 25, 2018, 10 pages. |
Office Action dated Aug. 30, 2022 for Japanese Patent Application No. 2020505830, filed on Aug. 9, 2018, 5 pages. |
Office Action dated Sep. 30, 2021 for Taiwan Application No. 107124385, 17 Pages. |
Partial European Search Report for European Application No. 18179838.0, dated Dec. 5, 2018, 13 Pages. |
Partial International Search Report and Provisional Opinion for International Application No. PCT/US2021/041775, dated Oct. 8, 2021, 12 pages. |
Restriction Requirement dated Feb. 2, 2021 for U.S. Appl. No. 16/716,050, filed Dec. 16, 2019, 7 Pages. |
Sebastian A., et al., “Memory Devices and Applications for In-memory Computing,” Nature Nanotechnology, Nature Publication Group, Inc, London, Mar. 30, 2020, vol. 15 (7), pp. 529-544, XP037194929. |
Shi C., et al., “A 1000fps Vision Chip Based on a Dynamically Reconfigurable Hybrid Architecture Comprising a PE Array and Self-Organizing Map Neural Network,” International Solid-State Circuits Conference, Session 7, Image Sensors, Feb. 10, 2014, pp. 128-130, XP055826878. |
Snoeij M.F., et al., “A low Power Column-Parallel 12-bit ADC for CMOS Imagers,” XP007908033, Jun. 1, 2005, pp. 169-172. |
Snoeij M.F., et al., “A Low-Power Column-Parallel 12-bit ADC for CMOS Imagers,” Harvestimaging [online], Proceedings IEEE Workshop on Charge-Coupled Devices and Advanced Image Sensors (CCDS & AIS), Published date: Jun. 1, 2005, pp. 169-172, XP007908033, Retrieved from the Internet: URL: http://www.harvestimaging.com/pubdocs/084_2005_june_workshop.pdf. |
Supplemental Notice of Allowability dated Apr. 29, 2020 for U.S. Appl. No. 15/668,241, filed Aug. 3, 2017, 5 Pages. |
Tanner S., et al., “Low-Power Digital Image Sensor for Still Picture Image Acquisition,” Visual Communications and Image Processing, San Jose, Jan. 22, 2001, vol. 4306, pp. 358-365, XP008014232. |
Xu C., et al., “A New Digital-Pixel Architecture for CMOS Image Sensor with Pixel-Level ADC and Pulse Width Modulation using a 0.18 um CMOS Technology,” Electron Devices and Solid-State Circuits, IEEE Conference on Kowloon, Hong Kong, Piscataway, NJ, USA, Dec. 16, 2003, pp. 265-268, XP010695857. |
Notice of Allowance dated Oct. 21, 2022 for U.S. Appl. No. 16/899,908, filed Jun. 12, 2020, 10 pages. |
Office Action dated Sep. 29, 2022 for Taiwan Application No. 108122878, filed Jun. 28, 2019, 9 pages. |
Corrected Notice of Allowability dated Jan. 9, 2023 for U.S. Appl. No. 17/150,925, filed Jan. 15, 2021, 8 pages. |
Corrected Notice of Allowance dated Aug. 9, 2023 for U.S. Appl. No. 17/072,840, filed Oct. 16, 2020, 3 pages. |
Final Office Action dated Dec. 2, 2022 for U.S. Appl. No. 17/072,840, filed Oct. 16, 2020, 9 pages. |
International Preliminary Report on Patentability for International Application No. PCT/US2021/054327, dated Apr. 20, 2023, 7 pages. |
International Preliminary Report on Patentability for International Application No. PCT/US2021/057966, dated May 19, 2023, 12 pages. |
International Preliminary Report on Patentability for International Application No. PCT/US2021/065174, dated Jul. 13, 2023, 9 pages. |
Notice of Allowance dated Jun. 1, 2023 for U.S. Appl. No. 16/899,908, filed Jun. 12, 2020, 9 pages. |
Notice of Allowance dated Mar. 1, 2023 for U.S. Appl. No. 17/242,152, filed Apr. 27, 2021, 9 pages. |
Notice of Allowance dated Nov. 1, 2023 for U.S. Appl. No. 16/820,594, filed Mar. 16, 2020, 5 pages. |
Notice of Allowance dated Oct. 2, 2023 for U.S. Appl. No. 16/435,451, filed Jun. 7, 2019, 10 pages. |
Notice of Allowance dated Oct. 4, 2023 for U.S. Appl. No. 17/242,152, filed Apr. 27, 2021, 9 pages. |
Notice of Allowance dated Dec. 6, 2022 for U.S. Appl. No. 16/896,130, filed Jun. 8, 2020, 8 pages. |
Notice of Allowance dated Apr. 7, 2021 for U.S. Appl. No. 16/436,137, filed Jun. 10, 2019, 9 pages. |
Notice of Allowance dated Dec. 7, 2023 for U.S. Appl. No. 17/496,712, filed Oct. 7, 2021, 12 pages. |
Notice of Allowance dated Jul. 7, 2023 for U.S. Appl. No. 16/896,130, filed Jun. 8, 2020, 8 pages. |
Notice of Allowance dated Dec. 9, 2022 for U.S. Appl. No. 16/435,451, filed Jun. 7, 2019, 8 pages. |
Notice of Allowance dated Feb. 10, 2023 for U.S. Appl. No. 16/899,908, filed Jun. 12, 2020, 9 pages. |
Notice of Allowance dated Apr. 13, 2023 for U.S. Appl. No. 17/072,840, filed Oct. 16, 2020, 6 pages. |
Notice of Allowance dated Dec. 13, 2022 for U.S. Appl. No. 16/820,594, filed Mar. 16, 2020, 5 pages. |
Notice of Allowance dated Sep. 13, 2023 for U.S. Appl. No. 16/899,908, filed Jun. 12, 2020, 9 pages. |
Notice of Allowance dated Jun. 16, 2023 for U.S. Appl. No. 17/242,152, filed Apr. 27, 2021, 9 pages. |
Notice of Allowance dated Mar. 17, 2023 for U.S. Appl. No. 16/896,130, filed Jun. 8, 2020, 8 pages. |
Notice of Allowance dated Aug. 18, 2023 for U.S. Appl. No. 17/496,712, filed Oct. 7, 2021, 12 pages. |
Notice of Allowance dated Jul. 19, 2023 for U.S. Appl. No. 16/820,594, filed Mar. 16, 2020, 5 pages. |
Notice of Allowance dated Nov. 21, 2022 for U.S. Appl. No. 17/242,152, filed Apr. 27, 2021, 10 pages. |
Notice of Allowance dated Dec. 22, 2022 for U.S. Appl. No. 17/496,712, filed Oct. 7, 2021, 13 pages. |
Notice of Allowance dated Jun. 22, 2023 for U.S. Appl. No. 16/435,451, filed Jun. 7, 2019, 10 pages. |
Notice of Allowance dated Apr. 24, 2023 for U.S. Appl. No. 17/496,712, filed Oct. 7, 2021, 12 pages. |
Notice of Allowance dated Oct. 26, 2023 for U.S. Appl. No. 16/896,130, filed Jun. 8, 2020, 8 pages. |
Notice of Allowance dated Mar. 27, 2023 for U.S. Appl. No. 16/820,594, filed Mar. 16, 2020, 5 pages. |
Notice of Allowance dated Oct. 27, 2023 for U.S. Appl. No. 17/072,840, filed Oct. 16, 2020, 10 pages. |
Notice of Allowance dated Jul. 31, 2023 for U.S. Appl. No. 17/072,840, filed Oct. 16, 2020, 6 pages. |
Office Action dated Dec. 1, 2023 for Korean Application No. 10-2021-7000855, filed Jun. 11, 2019, 5 pages. |
Office Action dated Nov. 2, 2022 for Taiwan Application No. 107128759, filed Aug. 17, 2018, 16 pages. |
Office Action dated Jul. 4, 2023 for Korean Application No. 10-2020-7002533, filed Jun. 25, 2018, 3 pages. |
Office Action dated Dec. 1, 2022 for Korean Application No. 10-2020-7002306, filed Jun. 25, 2018, 13 pages. |
Office Action dated Jun. 1, 2023 for Korean Application No. 10-2020-7002306, filed Jun. 25, 2018, 3 pages. |
Office Action dated Nov. 1, 2022 for Japanese Patent Application No. 2020-520431, filed on Jun. 25, 2018, 11 pages. |
Office Action dated Mar. 10, 2023 for Chinese Application No. 201880053600.0, filed Jun. 25, 2018, 10 pages. |
Office Action dated Feb. 15, 2023 for Chinese Application No. 201980049477.X, filed Jun. 11, 2019, 19 pages. |
Office Action dated Nov. 15, 2022 for Taiwan Application No. 108120143, filed Jun. 11, 2019, 8 pages. |
Office Action dated Mar. 16, 2023 for Korean Patent Application No. 10-2020-7002496, filed on Jun. 26, 2018, 3 pages. |
Office Action dated Nov. 23, 2023 for Chinese Application No. 201880053600.0, filed Jun. 25, 2018, 4 pages. |
Office Action dated Jul. 4, 2023 for Japanese Application No. 2019571598, filed Jun. 26, 2018, 34 pages. |
Office Action dated Jan. 5, 2023 for Chinese Application No. 201980043907.7, filed Jun. 28, 2019, 14 pages. |
Office Action dated Feb. 7, 2023 for Japanese Application No. 2019-571699, filed Jun. 25, 2018, 5 pages. |
Office Action dated May 9, 2023 for Japanese Patent Application No. 2020-520431, filed on Jun. 25, 2018, 6 pages. |
Office Action dated May 9, 2023 for Japanese Patent Application No. 2020-563959, filed on Nov. 12, 2020, 5 pages. |
Number | Date | Country | |
---|---|---|---|
20220210360 A1 | Jun 2022 | US |
Number | Date | Country | |
---|---|---|---|
63131563 | Dec 2020 | US |