The present invention relates generally to an image sensing device, and in particular to an image sensing device and an image sensing method thereof.
In recent years, demand from the self-driving car industry has grown rapidly. For self-driving cars, an image sensor that detects real-time road conditions is an essential component. The dynamic vision sensor (DVS) is a mainstream image sensor for detecting real-time road conditions because it records images in units of events. This event-based sensing brings machine autonomy closer to reality, making the DVS suitable for vision-based high-speed applications in the field of autonomous vehicles.
However, dynamic vision sensing technology relies on short exposure periods, so in low-brightness environments the image recognition algorithms of self-driving cars, which must recognize scenes and static objects, face underexposure. That is, correct scene detection cannot be performed because the frames lack sufficient detail, which increases the risk of accidents. Therefore, how to simultaneously obtain high-quality static information and high-frame-rate dynamic information in a frame, especially in a low-brightness environment, is an urgent problem for researchers to solve.
Therefore, the present invention is proposed to address the above-mentioned deficiency.
A primary objective of the present invention is to provide an image sensing device, which has an image sensing array and an image processing circuit. The image sensing array is used to obtain an initial frame, the initial frame includes a plurality of sub-frames, and the sub-frames respectively include a plurality of sensing signals, wherein the sensing signals include static sensing signals and dynamic sensing signals, the static sensing signals are generated by exposing at a first frame rate for a first exposure period, the dynamic sensing signals are generated by exposing at a second frame rate for a second exposure period, the first exposure period is greater than the second exposure period, and the second frame rate is greater than the first frame rate. The image processing circuit is used for analyzing the initial frame to perform a dynamic event detection processing on the changes over time of the sensing signals with the same frame rate in the sub-frames, and to fuse the static sensing signals and the dynamic sensing signals into a main frame based on the detection results of the sub-array regions. Thereby, the image sensing device according to the present invention can, for different sub-array regions and according to the result of the dynamic event detection processing, output the static sensing signal and the dynamic sensing signal separately or fuse them in a specific ratio to form the main frame, so as to realize high-definition dynamic images under a low light source.
In order to achieve the foregoing objective, the present invention provides an image sensing device, comprising: an image sensing array, including a plurality of sub-array regions used to obtain a plurality of sensing signals having different exposure periods, wherein in a main frame period, the sensing signals include at least one static sensing signal and at least one dynamic sensing signal, the number of the at least one static sensing signal and the at least one dynamic sensing signal can be any different positive integers, the at least one static sensing signal is generated at a first frame rate for a first exposure period, and the at least one dynamic sensing signal is generated at a second frame rate for a second exposure period; and an image processing circuit, coupled to the image sensing array for analyzing the at least one static sensing signal and the at least one dynamic sensing signal, outputting sub-frames of the sensing signals having the same frame rate in the sub-array region, and fusing the sub-frames each having a different frame rate by a specific ratio to generate a main frame.
In a preferred embodiment of the image sensing device of the present invention, the first exposure period is greater than the second exposure period, and the second frame rate is greater than the first frame rate.
In a preferred embodiment of the image sensing device of the present invention, the image sensing device further comprises: a buffering circuit, coupled to the image processing circuit, wherein the sub-frames include at least one static sub-frame and at least one dynamic sub-frame, and the buffering circuit is used to store the static sub-frames and the dynamic sub-frames.
In a preferred embodiment of the image sensing device of the present invention, the image processing circuit performs a dynamic event detection processing on the sub-frames, which is used to determine whether an object in the sub-frames is a dynamic event; if not, the image processing circuit outputs the static sub-frames among the sub-frames, and if so, the image processing circuit outputs the dynamic sub-frames among the sub-frames.
In a preferred embodiment of the image sensing device of the present invention, the image processing circuit fuses the static sub-frames and the dynamic sub-frames into the main frame according to the results of the dynamic event detection processing.
In a preferred embodiment of the image sensing device of the present invention, the image sensing array includes a plurality of sensing units, and each of the sub-array regions includes a plurality of the sensing units.
In a preferred embodiment of the image sensing device of the present invention, each of the sensing units includes: a photodiode; a transmission circuit, coupled to the photodiode; and a reset circuit, coupled to the photodiode, wherein the reset circuit is used to receive a reset signal, the transmission circuit is used to receive a readout signal, the reset circuit resets the charge in the photodiode according to the reset signal, and the transmission circuit converts the charge accumulated in the photodiode into the sensing signal based on the readout signal.
In a preferred embodiment of the image sensing device of the present invention, the sub-array regions include a static sub-array region and a dynamic sub-array region, the static sub-array region is used to generate the at least one static sensing signal, and the dynamic sub-array region is used to generate the at least one dynamic sensing signal.
In a preferred embodiment of the image sensing device of the present invention, the sensing units in one static sub-array region are arranged to form a Bayer pattern array, the sensing units in one dynamic sub-array region are arranged to form a Bayer pattern array, the two Bayer pattern arrays are spaced adjacent to each other in a row to form a two-Bayer-pattern-array unit, and eight independent readout signal control lines are arranged in the two-Bayer-pattern-array unit.
In a preferred embodiment of the image sensing device of the present invention, the static sub-array region includes a plurality of static sensing units, the reset circuits of the static sensing units all receive a static reset signal, and the transmission circuits of the static sensing units all receive a static readout signal.
In a preferred embodiment of the image sensing device of the present invention, the static reset signal includes a plurality of static reset timings, the static reset timings respectively reset the charge stored in the static sensing units, and the time difference between the static readout signal and the static reset timing received by each of the static sensing units is the first exposure period.
In a preferred embodiment of the image sensing device of the present invention, the dynamic sub-array region includes a plurality of dynamic sensing units, the reset circuits of the dynamic sensing units all receive a dynamic reset signal, and the transmission circuits of the dynamic sensing units all receive a dynamic readout signal.
In a preferred embodiment of the image sensing device of the present invention, the dynamic reset signal includes a plurality of dynamic reset timings, the dynamic reset timings respectively reset the charge stored in the dynamic sensing units, and the time difference between the dynamic readout signal and the dynamic reset timing received by each of the dynamic sensing units is the second exposure period.
In a preferred embodiment of the image sensing device of the present invention, the image sensing array further includes a plurality of filters arranged on the sensing units, and the filters include at least one of visible light filters, infrared filters, and ultraviolet filters.
In a preferred embodiment of the image sensing device of the present invention, the sensing units further include a control circuit coupled to the transmission circuit and the reset circuit, and the control circuit is used to generate the readout signal and the reset signal.
In a preferred embodiment of the image sensing device of the present invention, the control circuit includes at least one static exposure control circuit and at least one dynamic exposure control circuit, the static exposure control circuit is used to generate at least one static reset signal and at least one static readout signal, and the dynamic exposure control circuit is used to generate at least one dynamic reset signal and at least one dynamic readout signal.
In a preferred embodiment of the image sensing device of the present invention, the static exposure control circuit and the dynamic exposure control circuit respectively include a first address decoder and a second address decoder, which are used to respectively generate a first frame rate and a second frame rate that are asynchronous with each other.
Further, in order to achieve the foregoing objective, the present invention provides an image sensing method, comprising: generating, during a main frame period, at least one static sensing signal and at least one dynamic sensing signal, wherein the number of the at least one static sensing signal and the at least one dynamic sensing signal can be any different positive integers, the at least one static sensing signal is generated at a first frame rate for a first exposure period, and the at least one dynamic sensing signal is generated at a second frame rate for a second exposure period; and analyzing the at least one static sensing signal and the at least one dynamic sensing signal to output sub-frames of the at least one static sensing signal and the at least one dynamic sensing signal having the same frame rate, and performing fusion on the sub-frames having different frame rates to generate a main frame.
In conclusion, the image sensing device and the image sensing method of the present invention provide the readout signal and the reset signal through independent exposure control circuits, so that the static sub-array region and the dynamic sub-array region can be processed in one image capturing operation according to different frame rates and different exposure periods, so as to obtain the sensing signals of an initial frame. In other words, the image sensing device according to the present invention can respectively generate a static sensing signal and a dynamic sensing signal in the same sub-frame of the initial frame, where the static sensing signal is exposed at a first frame rate for a first exposure period, the dynamic sensing signal is exposed at a second frame rate for a second exposure period, the first exposure period is greater than the second exposure period, and the second frame rate is greater than the first frame rate. Therefore, the image sensing device of the present invention can output the static sensing signal and the dynamic sensing signal separately or in a specific ratio by judging whether the same sub-frame in the initial frame is a dynamic event, and fuse the static sensing signal and the dynamic sensing signal into the main frame according to the detection result, so as to achieve high-definition dynamic images under low light.
In order to make those skilled in the art understand the objectives, characteristics and effects of the present invention, the present invention is described in detail below by the following specific embodiments, and in conjunction with the attached drawings.
The inventive concept will be explained more fully hereinafter with reference to the accompanying drawings, in which exemplary embodiments of the inventive concept are shown. Advantages and features of the inventive concept and methods for achieving the same will be apparent from the following exemplary embodiments, which are set forth in more detail with reference to the accompanying drawings. However, it should be noted that the present inventive concept is not limited to the following exemplary embodiments, but may be implemented in various forms. Accordingly, the exemplary embodiments are provided merely to disclose the inventive concept and to familiarize those skilled in the art with the category of the inventive concept. In the drawings, exemplary embodiments of the inventive concept are not limited to the specific examples provided herein, and features may be exaggerated for clarity.
The terminology used herein is used to describe particular embodiments only, and is not intended to limit the present invention. As used herein, the singular terms “a” and “the” are intended to include the plural forms as well, unless the context clearly dictates otherwise. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items. It will be understood that when an element is referred to as being “connected” or “coupled” to another element, it can be directly connected or coupled to the other element or intervening elements may be present.
Similarly, it will be understood that when an element (e.g., a layer, region, or substrate) is referred to as being “on” another element, it can be directly on the other element or intervening elements may be present. In contrast, the term “directly” means that no intervening elements are present. It should be further understood that when the terms “comprising” and “including” are used herein, they indicate the presence of the stated features, steps, operations, elements, and/or components, but do not exclude the presence or addition of one or more other features, steps, operations, elements, components, and/or groups thereof.
Furthermore, exemplary embodiments in the detailed description are set forth in cross-section illustrations that are idealized exemplary illustrations of the present inventive concept. Accordingly, the shapes in the exemplary figures may be modified according to manufacturing techniques and/or tolerable errors. Therefore, the exemplary embodiments of the present inventive concept are not limited to the specific shapes shown in the exemplary figures, but may include other shapes that may be produced according to the manufacturing process. The regions illustrated in the figures have general characteristics and are used to illustrate the specific shapes of elements. Therefore, they should not be construed as limiting the scope of the inventive concept.
It will also be understood that, although the terms “first,” “second,” “third,” etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. Thus, a first element in some embodiments could be termed a second element in other embodiments without departing from the teachings of the present inventive concept. Exemplary embodiments of aspects of the present inventive concept illustrated and described herein include their complementary counterparts. Throughout this specification, the same reference numbers or the same designators refer to the same elements.
Furthermore, example embodiments are described herein with reference to cross-sectional and/or planar views, which are illustrations of idealized example illustrations. Accordingly, deviations from the shapes shown, for example, caused by manufacturing techniques and/or tolerances, are expected. Accordingly, the exemplary embodiments should not be considered limited to the shapes of the regions shown herein, but are intended to include deviations in shapes resulting from, for example, manufacturing. Thus, the regions illustrated in the figures are schematic and their shapes are not intended to illustrate the actual shape of a region of a device and are not intended to limit the scope of example embodiments.
Please refer to
Specifically, as shown in
Specifically, as shown in
Specifically, in some embodiments, the image sensing array 11 may be a CMOS image sensor (CIS) or a charge-coupled device (CCD). In some embodiments, the image processing circuit 12 can be an image signal processor (ISP), a central processing unit (CPU), a microprocessor, a digital signal processor (DSP), a programmable logic controller (PLC), an application-specific integrated circuit (ASIC), a system on chip (SoC), or other similar components, or a combination of the above components. Moreover, in some embodiments, the image sensing device 100 may further include a memory. The memory can be used to store the frame, sensing signal, pixel data, image analysis software or computing software, etc., described in the various embodiments of the present invention, and the present invention is not limited thereto.
Please refer to
Step S1: obtain the initial frame 200 by the image sensing array 11, the initial frame 200 includes a plurality of sub-frames 21, and the sub-frames 21 respectively include a plurality of the sensing signals 22, wherein the sensing signals 22 include the static sensing signal 221 and the dynamic sensing signal 222, the static sensing signal 221 is generated by exposing at the first frame rate for the first exposure period T1, and the dynamic sensing signal 222 is generated by exposing at the second frame rate for the second exposure period T2, the first exposure period T1 is greater than the second exposure period T2, and the second frame rate is greater than the first frame rate.
Step S2: analyze the initial frame 200 by the image processing circuit 12, so as to perform a dynamic event detection processing on the changes over time of the sensing signals 22 with the same frame rate in the sub-frames 21, and generate a main frame by fusion of the static sensing signal 221 and the dynamic sensing signal 222 based on the detection results of the sub-frames 21.
It should be further explained that the method for the image processing circuit 12 to perform fusion on the sub-frames 21 to form the main frame according to the present invention may include but is not limited to a frame synthesis algorithm and a frame synthesis circuit. The algorithm can be, for example, multiplying the static sensing signal 221 and the dynamic sensing signal 222 by different gain values and summing them up to form a composite value. In addition, when the sub-frames are fused, gain adjustment or compensation can be performed according to the frame rate and exposure period of each sub-frame. For example, the adjustment is made according to the ratio between different exposure periods, such as setting the product of the exposure period and the gain to be a constant value, but the present invention is not limited thereto.
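By way of a non-limiting illustration only, the following minimal Python sketch shows one possible reading of the gain-weighted fusion just described, assuming the constant-product rule (exposure period multiplied by gain equals a constant) mentioned above; the function and parameter names (fuse_sub_frames, static_gain, k) are hypothetical and do not describe the actual frame synthesis circuit or algorithm of the invention.

```python
import numpy as np

def fuse_sub_frames(static_frame, dynamic_frame, t1, t2, k=1.0):
    """Illustrative fusion of one static and one dynamic sub-frame.

    static_frame, dynamic_frame: 2-D arrays of pixel values.
    t1, t2: first (static) and second (dynamic) exposure periods.
    k: assumed constant so that (exposure period) * (gain) = k per sub-frame.
    """
    static_gain = k / t1           # longer exposure -> smaller gain
    dynamic_gain = k / t2          # shorter exposure -> larger gain
    # Multiply each sub-frame by its gain and sum them into a composite value.
    return static_gain * static_frame + dynamic_gain * dynamic_frame

# Hypothetical usage: T1 = 1/30 s static exposure, T2 = 1/240 s dynamic exposure.
static = np.full((4, 4), 120.0)
dynamic = np.full((4, 4), 15.0)
main_frame_region = fuse_sub_frames(static, dynamic, t1=1/30, t2=1/240)
```

In this sketch the gains compensate for the exposure difference before summation; any other ratio or normalization could equally be used, as the paragraph above notes.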
In addition, it can be understood that since the first exposure period T1 is greater than the second exposure period T2, while the static sensing signal 221 is being exposed, the image sensing array 11 can sense and generate a plurality of dynamic sensing signals 222 and generate a plurality of dynamic sub-frames through the image processing circuit 12. That is, if the time for exposing, reading, and outputting the static sub-frames is regarded as a unit of time, then multiple cycles of exposing, reading, and outputting the dynamic sub-frames can be completed within that unit of time, but the present invention is not limited thereto. In addition, in the present invention, the time required for the image sensing array 11 to sense the static sensing signal 221 and the dynamic sensing signal 222 to produce data for generating an image can be understood as a main frame period; according to requirements, within one main frame period, the image sensing array 11 can sense a plurality of static sensing signals 221 and a plurality of dynamic sensing signals 222.
It is worth mentioning that the image sensing array 11 of the image sensing device 100 according to the present invention may include a plurality of sensing units. In some embodiments, the sensing units may perform a global shutter exposure operation during the first exposure period T1 and the second exposure period T2, so as to avoid the Jello effect. That is, the sensing diodes of all sensing units on the image sensing device 100 are exposed simultaneously. In some other embodiments, the sensing units may perform a rolling readout operation during the first exposure period T1 and the second exposure period T2, but the present invention is not limited thereto.
Hereinafter, referring to the drawings, an embodiment of the first embodiment of the image sensing device 100 of the present invention will be described.
Referring to
Specifically, as shown in
Specifically, as shown in
Specifically, as shown in
Specifically, the sub-array region 111 according to the first embodiment of the present invention may include a plurality of sensing units 300, wherein the sub-array regions 111 are used in one imaging operation according to different frame rates and different exposure periods to obtain the sensing signals 22 of the initial frame 200 shown in
Specifically, as shown in
Specifically, as shown in
Please refer to
Specifically, in this embodiment, the reset circuits 33 of the first dynamic sensing unit 421, the second dynamic sensing unit 422, the third dynamic sensing unit 423, and the fourth dynamic sensing unit 424 of the present invention all receive the dynamic reset signal RST1, and the dynamic reset signal RST1 includes four dynamic reset timings RST1_0-RST1_3. In addition, the transmission circuit 32 of the first dynamic sensing unit 421 receives the first dynamic readout signal D-TX0, the transmission circuit 32 of the second dynamic sensing unit 422 receives the second dynamic readout signal D-TX1, the transmission circuit 32 of the third dynamic sensing unit 423 receives the third dynamic readout signal D-TX2, and the transmission circuit 32 of the fourth dynamic sensing unit 424 receives the fourth dynamic readout signal D-TX3. The dynamic reset timings RST1_0-RST1_3 are respectively used to reset the charge stored in the first dynamic sensing unit 421, the second dynamic sensing unit 422, the third dynamic sensing unit 423, and the fourth dynamic sensing unit 424, so as to prevent the sensing signal generated by the sensing unit 300 from failing to yield correct digital pixel values during analog-to-digital conversion. It should be noted that there is another time difference between the dynamic reset timings RST1_0-RST1_3 and the dynamic readout signals D-TX0-D-TX3, respectively. More specifically, as shown in
In this way, since there are different time differences between the static reset signal RST0 and the static readout signals TX0-TX3 and between the dynamic reset signal RST1 and the dynamic readout signals D-TX0-D-TX3, the sensing units of the image sensing device 100 in the present embodiment can obtain the sensing signal 22 in each sub-frame 21 of the initial frame 200 according to different frame rates and different integration times (or exposure periods). In other words, the image sensing device 100 according to the first embodiment of the present invention can respectively generate the static sensing signal 221 and the dynamic sensing signal 222 in the same sub-frame 21 of the initial frame 200, where the static sensing signal 221 is exposed at the first frame rate for the first exposure period T1, the dynamic sensing signal 222 is exposed at the second frame rate for the second exposure period T2, the first exposure period T1 is greater than the second exposure period T2, and the second frame rate is greater than the first frame rate. Therefore, the image sensing device 100 of the present embodiment can judge whether the same sub-frame 21 in the initial frame 200 is a dynamic event, so as to output the static sensing signal 221 and the dynamic sensing signal 222 separately, or to multiply the static sensing signal 221 and the dynamic sensing signal 222 by a specific ratio before outputting, and generate the main frame by fusion of the static sensing signal 221 and the dynamic sensing signal 222 based on the detection results of the sub-frames 21. Accordingly, the fused main frame has a high signal-to-noise ratio (SNR) under low light sources and realizes high-definition dynamic images without motion blur, which is equivalent to enhancing the sensitivity of the sensor.
In the following description, only the part of controlling the exposure period and the output frame rate will be described in detail. In order to balance the sensitivity of dynamic sensing and the quality of still images, the embodiment of the present invention enables the image sensing array 11 to simultaneously generate image data with various exposure values. For example, the static exposure control circuit 131 is coupled to the static sub-array region 41, and transmits the readout signals TX0-TX3 to control the first exposure period T1 of each of the first static sensing unit 411, the second static sensing unit 412, the third static sensing unit 413, and the fourth static sensing unit 414 in each static sub-array region 41. The dynamic exposure control circuit 132 is coupled to the dynamic sub-array region 42, and transmits the dynamic readout signals D-TX0-D-TX3 to control the second exposure period T2 of each of the first dynamic sensing unit 421, the second dynamic sensing unit 422, the third dynamic sensing unit 423, and the fourth dynamic sensing unit 424. Since different sensing units in the image sensing array 11 have different exposure periods, the speed of outputting image data (i.e., the frame rate) may also be different. Generally, dynamic detection requires a higher frame rate, while recording static images has relatively relaxed requirements on the frame rate but requires a higher SNR. Therefore, the static exposure control circuit 131 of this embodiment can control the static sensing units to output the static sensing signal 221 at a first frame rate, and the dynamic exposure control circuit 132 can control the dynamic sensing units to output the dynamic sensing signal 222 at a second frame rate, where the second frame rate is higher than the first frame rate. It should be noted that the static exposure control circuit 131 and the dynamic exposure control circuit 132 can respectively have a first address decoder and a second address decoder to generate the first frame rate and the second frame rate, respectively, which are asynchronous. For example, the control circuit 13 can control the image sensing array 11 to detect dynamic image changes at a speed of 30 frames per second (fps) while simultaneously recording static image data at a speed of 1 fps. After the static sensing signal 221 and the dynamic sensing signal 222 are stored in the frame buffer, the fusion of the static sensing signal 221 and the dynamic sensing signal 222 can be performed through an appropriate algorithm to obtain a 30 fps high-SNR image. The frame rate ratio of the static exposure control circuit 131 and the dynamic exposure control circuit 132 may be any ratio depending on the actual requirements. For example, the two frame rates may be in a ratio of M:N, where M and N are positive integers.
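As a non-limiting sketch only, the following Python snippet illustrates the asynchronous frame-rate relationship described above, assuming the 1 fps static / 30 fps dynamic example given in the paragraph; the scheduler structure and the names (main_frame_stream, read_static, read_dynamic, fuse) are hypothetical and are not a description of the actual control circuit 13 or frame buffer.

```python
STATIC_FPS = 1     # first frame rate (static sub-frames, long exposure, high SNR)
DYNAMIC_FPS = 30   # second frame rate (dynamic sub-frames, short exposure)
RATIO = DYNAMIC_FPS // STATIC_FPS   # dynamic sub-frames captured per static sub-frame

def main_frame_stream(read_static, read_dynamic, fuse, seconds=1):
    """Yield fused main frames at the dynamic rate (illustrative only).

    read_static(): returns the latest static sub-frame, refreshed once per second.
    read_dynamic(): returns the next dynamic sub-frame, refreshed 30 times per second.
    fuse(s, d): combines the two sub-frames, e.g. by a gain-weighted sum.
    """
    for _ in range(seconds):
        static_sub_frame = read_static()          # one static sub-frame per main frame period
        for _ in range(RATIO):
            dynamic_sub_frame = read_dynamic()    # many dynamic sub-frames in the same period
            yield fuse(static_sub_frame, dynamic_sub_frame)  # 30 fps fused output
```

Under this assumed schedule, each static sub-frame is reused across 30 fused outputs, which corresponds to the 30 fps high-SNR image mentioned above; other M:N ratios would simply change RATIO.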
Please refer to
Step S21: the image processing circuit 12 performs a change detection on the changes over time of the light intensities of the sensing signals 22 with the same frame rate in the same sub-frame 21, and determines whether the change falls within a light-varying interval.
Step S22: when the change is within the light-varying interval, the image processing circuit 12 outputs the dynamic sensing signal 222 for the sub-frame 21, or multiplies the static sensing signal 221 and the dynamic sensing signal 222 by a specific ratio and outputs the result.
Step S23: when the change is not within the light-varying interval, the image processing circuit 12 outputs the static sensing signal 221 for the sub-frame 21, or multiplies the static sensing signal 221 and the dynamic sensing signal 222 by another specific ratio and outputs the result. An illustrative sketch of this decision process is given below.
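Purely as an illustrative sketch of steps S21-S23, the following Python snippet expresses the per-sub-frame selection logic; the threshold-based test for the light-varying interval and all names (is_light_varying, select_output, threshold, alpha) are assumptions introduced only for clarity and do not limit how the image processing circuit 12 actually performs the dynamic event detection processing.

```python
import numpy as np

def is_light_varying(intensity_series, threshold):
    """Hypothetical test: the sub-frame is 'light-varying' when the frame-to-frame
    change of same-frame-rate sensing signals exceeds an assumed threshold."""
    return np.max(np.abs(np.diff(intensity_series))) > threshold

def select_output(static_signal, dynamic_signal, intensity_series,
                  threshold=10.0, alpha=0.8):
    """Sketch of steps S21-S23 for one sub-frame (sub-array region)."""
    if is_light_varying(intensity_series, threshold):        # Step S21
        # Step S22: dynamic event detected -> favor the dynamic sensing signal.
        return alpha * dynamic_signal + (1 - alpha) * static_signal
    # Step S23: no dynamic event -> favor the static sensing signal.
    return alpha * static_signal + (1 - alpha) * dynamic_signal
```

The weights alpha and (1 - alpha) stand in for the "specific ratio" and "another specific ratio" of steps S22 and S23; setting alpha to 1 corresponds to outputting only one of the two signals.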
It is worth mentioning that, as shown in
It is worth mentioning that, as shown in
In conclusion, the image sensing device 100 according to the first embodiment of the present invention can use the static sub-array region 41 and the dynamic sub-array region 42 according to different frame rates and different exposure periods in a single image capturing operation to obtain the initial frame 200, and by analyzing whether the sub-frame 21 in the initial frame 200 is a dynamic event, the static sensing signal 221 and the dynamic sensing signal 222 are respectively outputted or multiplied and outputted in a specific ratio.
Other examples of the image sensing device 100 are provided below, so that those skilled in the art of the present invention can more clearly understand possible variations. Components denoted by the same reference numerals as in the above embodiment are substantially the same as those described above with reference to
Please refer to
In addition, according to another embodiment of the present invention, when the sensing unit 300 is performing readout, a first analog gain and a second analog gain may also be respectively applied to the first static sensing unit 411, the second static sensing unit 412, the third static sensing unit 413, and the fourth static sensing unit 414 in the static sub-array region 41, and the first dynamic sensing unit 421, the second dynamic sensing unit 422, the third dynamic sensing unit 423, and the fourth dynamic sensing unit 424 in the dynamic sub-array region 42. The ratio of the first analog gain to the second analog gain may be an integer ratio, for example, M:N, where M and N are positive integers, and the ratio of the first analog gain to the second analog gain may be, for example, the ratio of the second exposure period to the first exposure period, that is, the product of the first analog gain and the first exposure period is equal to the product of the second analog gain and the second exposure period. But the present invention is not limited thereto.
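As a minimal numeric illustration of the relation just stated (the product of the first analog gain and the first exposure period equals the product of the second analog gain and the second exposure period), the short Python check below uses assumed exposure periods and an assumed reference gain; the specific values are hypothetical and carry no limiting meaning.

```python
# Assumed example values for illustration only.
T1 = 1 / 30    # first (static) exposure period
T2 = 1 / 240   # second (dynamic) exposure period

first_analog_gain = 1.0                               # assumed gain for the static units
second_analog_gain = first_analog_gain * T1 / T2      # = 8.0, i.e. a 1:8 gain ratio

# The gain ratio equals the inverse ratio of the exposure periods,
# so the gain-exposure products match.
assert abs(first_analog_gain * T1 - second_analog_gain * T2) < 1e-12
```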
The above description is to illustrate the implementation of the present invention by means of specific examples. Those skilled in the art can easily understand other advantages and effects of the present invention from the contents disclosed in this specification.
Although the present invention has been described with reference to the preferred embodiments thereof, it is apparent to those skilled in the art that a variety of modifications and changes may be made without departing from the scope of the present invention which is intended to be defined by the appended claims.
This application claims priority of U.S. provisional application No. 63/345,920, filed on May 26, 2022, the content of which is incorporated herein in its entirety by reference.