This application claims priority under 35 U.S.C. § 119 to Korean Patent Application No. 10-2019-0065663, filed on Jun. 4, 2019 in the Korean Intellectual Property Office, the disclosure of which is incorporated by reference herein in its entirety.
Exemplary embodiments of the present inventive concept relate to an image sensor and an image processing system, and more specifically, to an image sensor and an image processing system which perform an electronic image stabilization (EIS) processing operation in consideration of a processing result of optical image stabilization (OIS).
A camera module including an image sensor may be mounted on devices such as mobile phones, drones, digital cameras, wearable cameras, or devices in the automotive field. When an image is acquired using the image sensor, if the camera module including the image sensor or the device including the camera module moves, the acquired image deteriorates. In particular, when the camera module is mounted on a device subject to considerable motion, the deterioration of the acquired image due to shaking becomes worse. Therefore, it may be necessary to execute a process of correcting the image acquired by the image sensor to compensate for the motion of the camera module or the device.
For example, an optical image stabilization (OIS) or an electronic image stabilization (EIS) may be performed. OIS senses the motion and shaking through a gyro sensor and physically adjusts a position of the lens to change a path of the optical signal. EIS performs correction according to motion on an image to be output from the image sensor, e.g., an image converted into an electrical signal.
However, since the result of OIS that changes the physical position of the lens is not reflected in the EIS processing operation, artifacts may occur due to duplicated correction.
According to an exemplary embodiment of the present inventive concept, an image sensor includes a video acquiring circuit configured to convert light incident in a first direction through a lens into an electrical signal to acquire image data, a clock signal generating circuit configured to generate and output a clock signal on current time information, a stabilization data generating circuit configured to generate stabilization data for processing the image data, image time information on a time point at which the image data is acquired, and stabilization time information on a time point at which the stabilization data is generated, on the basis of the clock signal, and an output circuit configured to output data received from the video acquiring circuit and the stabilization data generating circuit. The stabilization data generating circuit generates first stabilization data on a position of the lens and first stabilization time information on a time point at which the first stabilization data is generated, and the output circuit outputs the image data, the image time information, the first stabilization data, and the first stabilization time information.
According to an exemplary embodiment of the present inventive concept, an image sensor includes a video acquiring circuit including a plurality of pixels arranged in a matrix form, and configured to acquire image data obtained by converting light incident through a lens into an electrical signal using a target pixel group of at least some of the plurality of pixels, where the video acquiring circuit exposes pixels of a first row of the target pixel group in a first section, performs a read operation of the pixels of the first row in a second section, exposes pixels of a last row of the target pixel group in a third section, and performs a read operation on the pixels of the last row in a fourth section, a clock signal generating circuit configured to generate and output a clock signal on current time information, a stabilization data generating circuit configured to generate first stabilization data on positions of the lens in the first and third sections, and generate image time information on a time at which the image data is acquired and first stabilization time information on a time at which the first stabilization data is generated, on the basis of the clock signal, and an output circuit configured to output data received from the video acquiring circuit and the stabilization data generating circuit.
According to an exemplary embodiment of the present inventive concept, an image processing system includes an image sensor configured to generate image data acquired by converting light incident through a lens into an electrical signal, image time information on a time at which the image data is acquired, stabilization data on a position of the lens, and stabilization time information on a time at which the stabilization data is generated, an image data processing circuit configured to perform an electronic image stabilization (EIS) processing operation on the basis of the image data, the image time information, the stabilization data, and the stabilization time information, and output image processing data generated on the basis of the EIS processing operation, and a display circuit configured to receive the image processing data and display an image on the basis of the image processing data.
The above and other aspects and features of the present inventive concept will become more apparent by describing in detail exemplary embodiments thereof with reference to the attached drawings.
Exemplary embodiments of the present inventive concept provide an image sensor which reads out a pixel array having a pixel structure including a plurality of photoelectric conversion elements sharing a floating diffusion area with one another.
Exemplary embodiments of the present inventive concept will be described more fully hereinafter with reference to the accompanying drawings. Like reference numerals may refer to like elements throughout this application.
Hereinafter, an image sensor and an image processing system according to exemplary embodiments of the present inventive concept will be described with reference to
Referring to
The image sensor 100 may sense an object 20 imaged through the lens 400 under the control of the image data processing unit 200. The image sensor 100 may include a video (or image) acquiring unit 110, a stabilization data generating unit 120, a clock signal generating unit 130, a control register block 140, a buffer 150, a synchronizer 160, and a row decoder 170.
The video acquiring unit 110 may convert light, which is incident through the lens 400, into an electrical signal to acquire image data DT_IMG on the object 20. According to an exemplary embodiment of the present inventive concept, the video acquiring unit 110 may include a pixel array 111, a correlated double sampling block (CDS) 113, and an analog digital converter (ADC) 115.
The pixel array 111 may include a plurality of optical detection elements, such as photodiodes and pinned photodiodes. The pixel array 111 may detect light using the plurality of optical detection elements and may convert the detected light into an electrical signal to generate an image signal. According to an exemplary embodiment of the present inventive concept, the pixel array 111 may have a plurality of pixels arranged in a matrix form. Each of the plurality of pixels may be a red pixel for converting light of a red spectral region into an electrical signal, a green pixel for converting light of a green spectral region into an electrical signal, or a blue pixel for converting light of a blue spectral region into an electrical signal. Additionally, respective color filter arrays for transmitting light of a specific spectral region may be arranged on top of each of the plurality of pixels constituting the pixel array 111.
The row decoder 170 may drive the pixel array 111 row by row. For example, the row decoder 170 may generate a row selection signal. According to an exemplary embodiment of the present inventive concept, the pixel array 111 outputs a reset signal and a video signal from a row selected by the row selection signal, which is provided from the row decoder 170, to the correlated double sampling block 113, and the correlated double sampling block 113 may perform correlated double sampling on the reset signal and the video signal to generate a correlated double sampled signal.
The analog digital converter 115 may compare a ramp signal, which is generated on the basis of the clock signal generated by the clock signal generating unit 130, with the correlated double sampled signal, count a comparison result signal, and output the image data DT_IMG to the buffer 150.
The stabilization data generating unit 120 may generate image time information TS_IMG and a stabilization data group DT_STB used for stabilization processing of the image data DT_IMG that is output from the video acquiring unit 110. According to an exemplary embodiment of the present inventive concept, the stabilization data generating unit 120 may receive stabilization data (e.g., DT_LP of
According to an exemplary embodiment of the present inventive concept, the stabilization data generating unit 120 may include an internal buffer, in which it may temporarily store the extracted stabilization data DT_LP′, the generated image time information TS_IMG, and the stabilization time information TS_LP before outputting them to the synchronizer 160.
The clock signal generating unit 130 may output the clock signal Clk to each of the row decoder 170, the correlated double sampling block 113, the analog digital converter 115, and the stabilization data generating unit 120 to control the above-described operation. According to an exemplary embodiment of the present inventive concept, the clock signal Clk that is output by the clock signal generating unit 130 may include information on the current time.
The control register block 140 may output control signals to each of the clock signal generating unit 130, the stabilization data generating unit 120, the buffer 150, and the synchronizer 160 to control the above-described operation. According to an exemplary embodiment of the present inventive concept, the control register block 140 may operate under the control of the image data processing unit 200.
The buffer 150 may buffer the image data DT_IMG that is output from the video acquiring unit 110. According to an exemplary embodiment of the present inventive concept, the buffer 150 may sense and amplify the image data DT_IMG and output it.
The synchronizer 160 receives the image data DT_IMG output from the buffer 150 and the image time signal TS_IMG and the stabilization data group DT_STB output from the stabilization data generating unit 120, and may synchronize the image data DT_IMG, the image time signal TS_IMG, and the stabilization data group DT_STB to output image sensor output data DT_ISOUT. Therefore, the image data DT_IMG, the image time information TS_IMG, the stabilization data DT_LP′, and the stabilization time information TS_LP may be output to the image data processing unit 200.
The image sensor 100 may further include an output unit. The output unit may include at least one port. According to an exemplary embodiment of the present inventive concept, the image sensor output data DT_ISOUT, e.g., the image data DT_IMG, the image time information TS_IMG, and the stabilization data group DT_STB may be output through the at least one port.
The image data processing unit 200 controls the image sensor 100, performs a stabilization processing operation on the image sensor output data DT_ISOUT sensed and output by the image sensor 100, and may output the processed image data.
According to an exemplary embodiment of the present inventive concept, the image data processing unit 200 may control the control register block 140 of the image sensor 100. Although the image data processing unit 200 may control the image sensor 100, e.g., the control register block 140, using an inter-integrated circuit (I2C), the present inventive concept is not limited thereto.
The image data processing unit 200 receives an input of the image sensor output data DT_ISOUT, which is an output signal of the image sensor 100, performs processing operations such as a stabilization processing operation, and may output the corrected data DT_DSP of
Although the image data processing unit 200 is shown as a configuration independent of the image sensor 100 in
According to an exemplary embodiment of the present inventive concept, the image data processing unit 200 performs the stabilization processing operation of the image data DT_IMG, on the basis of the image sensor output data DT_ISOUT that is output from the image sensor 100. The stabilization processing operation of the image data DT_IMG and other post-processing operations are performed, and the corrected data DT_DSP of
The display unit 300 may output an image on the basis of the corrected data DT_DSP of
Referring to
Referring to
According to an exemplary embodiment of the present inventive concept, the lens position data DT_LP may be data utilized to perform an auto focusing (AF) operation of the lens 400. In this case, the lens position data DT_LP may include information on the position of the lens 400 in an incident direction of light.
According to an exemplary embodiment of the present inventive concept, the lens position data DT_LP may be data used to perform an optical image stabilization (OIS) operation of the lens 400. In this case, the lens position data DT_LP may include information on the position of the lens 400 in a direction substantially perpendicular to the incident direction of light.
The stabilization data generating unit 120_0 may output the generated stabilization data group DT_STB0 and the image time signal TS_IMG. The stabilization data group DT_STB0 may include the lens position data DT_LP′ and the lens position time information TS_LP.
Referring to
When a section from a first time point T11 to a fourth time point T14 is an exposure section of the pixel for acquisition of a first frame (Frame0), the stabilization data generating unit 120_0 extracts the data acquired in the section from the first time point T11 to the fourth time point T14 from the lens position data DT_LP and determines the extracted data as the lens position data DT_LP′.
Similarly, when a section from a seventh time point T21 to a tenth time point T24 is an exposure section of the pixel for acquisition of a second frame (Frame1), the stabilization data generating unit 120_0 may extract data acquired in a section from the seventh time point T21 to the tenth time point T24 from the lens position data DT_LP and may determine the extracted data as the lens position data DT_LP′.
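The windowing performed by the stabilization data generating unit above can be sketched as follows. The list-of-pairs representation, the illustrative sample values, and the helper name `extract_window` are assumptions for illustration only, not part of the disclosed hardware.

```python
def extract_window(samples, t_start, t_end):
    """Keep only the (timestamp, lens position) samples captured while
    the frame's pixels were being exposed, i.e. in [t_start, t_end]."""
    return [(t, pos) for (t, pos) in samples if t_start <= t <= t_end]

# Lens position samples DT_LP as (timestamp, position) pairs.
dt_lp = [(0.9, 10), (1.1, 12), (2.5, 15), (4.0, 14), (4.2, 11)]

# If the exposure section of Frame0 runs from T11 = 1.0 to T14 = 4.0,
# only the three samples inside that section become DT_LP'.
dt_lp_prime = extract_window(dt_lp, 1.0, 4.0)
assert dt_lp_prime == [(1.1, 12), (2.5, 15), (4.0, 14)]
```

The same windowing, applied with the second frame's exposure section, would select the samples between T21 and T24.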
Referring to
According to an exemplary embodiment of the present inventive concept, the buffer 150 may receive the image data DT_IMG from the video acquiring unit 110, and may receive the stabilization data group DT_STB0 and the image time signal TS_IMG from the stabilization data generating unit 120_0. The synchronizer 160 receives the image data DT_IMG, the image time signal TS_IMG, and the stabilization data group DT_STB0 from the buffer 150, and may synchronize the received data to output the image sensor output data DT_ISOUT. The stabilization data group DT_STB0 may include the lens position data DT_LP′ and the lens position time information TS_LP.
According to an exemplary embodiment of the present inventive concept, the image sensor 100 may output the image data DT_IMG, the image time signal TS_IMG, the lens position data DT_LP′, and the lens position time information TS_LP to the image data processing unit 200 through one port.
According to an exemplary embodiment of the present inventive concept, by generating the time information TS_IMG and TS_LP on the image data DT_IMG and the stabilization data DT_LP′, and by performing the stabilization processing operation of the image data DT_IMG using the same, the stabilization processing operation may use stabilization data that exactly matches the image data DT_IMG, and thus, the stabilization processing of the image data DT_IMG may be performed more accurately. In addition, by performing an electronic image stabilization (EIS) operation in consideration of the position information of the lens 400, it is possible to prevent artifacts caused by duplicating corrections, such as AF processing and OIS processing, that have already been applied.
Referring to
According to an exemplary embodiment of the present inventive concept, the AF data DT_AF is data used to perform an AF operation of the lens 400, and may be data on the position of the lens 400 in the incident direction of light. According to an exemplary embodiment of the present inventive concept, the OIS data DT_OIS is data used to perform an OIS operation of the lens 400, and may be data on the position of the lens 400 in the direction substantially perpendicular to the incident direction of light.
The stabilization data generating unit 120_0 may output the generated stabilization data group DT_STB1 and the image time signal TS_IMG. The stabilization data group DT_STB1 may include the AF data DT_AF′, the AF time information TS_AF, the OIS data DT_OIS′, and the OIS time information TS_OIS.
Referring to
In other words, the AF data DT_AF acquired in the section from the first time point T11 to the fourth time point T14 (e.g., the first frame Frame0), which is the exposure section of the pixels, may be determined as the AF data DT_AF′, and similarly, the OIS data DT_OIS acquired in the section from the first time point T11 to the fourth time point T14 may be determined as the OIS data DT_OIS′.
Referring to
According to an exemplary embodiment of the present inventive concept, the buffer 150 receives the image data DT_IMG from the video acquiring unit 110, and may receive the stabilization data group DT_STB1 and the image time signal TS_IMG from the stabilization data generating unit 120_0. The synchronizer 160 receives the image data DT_IMG, the image time signal TS_IMG, and the stabilization data group DT_STB1 from the buffer 150, and may synchronize the received data to output the image sensor output data DT_ISOUT. The stabilization data group DT_STB1 may include the AF data DT_AF′, the AF time information TS_AF, the OIS data DT_OIS′, and the OIS time information TS_OIS.
According to an exemplary embodiment of the present inventive concept, the image sensor 100 may output the image data DT_IMG, the image time signal TS_IMG, the AF data DT_AF′, the AF time information TS_AF, the OIS data DT_OIS′, and the OIS time information TS_OIS to the image data processing unit 200 through one port.
Referring to
The lens position data generating module 121 may extract the AF data DT_AF′ for the stabilization processing operation of the image data DT_IMG from the AF data DT_AF. In addition, the lens position data generating module 121 may extract OIS data DT_OIS′ for the stabilization processing operation of the image data DT_IMG from the OIS data DT_OIS. The motion data generating module 123 may extract motion data DT_MT′ for the stabilization processing operation of the image data DT_IMG from the motion data DT_MT.
The lens position data generating module 121 may generate the AF time information TS_AF on the AF data DT_AF′ and the OIS time information TS_OIS on the OIS data DT_OIS′ on the basis of the clock signal Clk. The motion data generating module 123 may generate motion time information TS_MT on the motion data DT_MT′ on the basis of the clock signal Clk.
The stabilization data generating unit 120_1 may output the generated stabilization data group DT_STB1, the stabilization data group DT_STB2, and the image time signal TS_IMG. The stabilization data group DT_STB1 may include the AF data DT_AF′, the AF time information TS_AF, the OIS data DT_OIS′, and the OIS time information TS_OIS. The stabilization data group DT_STB2 may include the motion data DT_MT′ and the motion time information TS_MT.
Referring to
In other words, the AF data DT_AF acquired in the section from the first time point T11 to the fourth time point T14, which is the exposure section of the pixel, may be determined as the AF data DT_AF′, the OIS data DT_OIS acquired in the section from the first time point T11 to the fourth time point T14 may be determined as the OIS data DT_OIS′, and similarly, the motion data DT_MT acquired in the section from the first time point T11 to the fourth time point T14 may be determined as the motion data DT_MT′.
Referring to
According to an exemplary embodiment of the present inventive concept, the buffer 150 may receive the image data DT_IMG from the video acquiring unit 110, and may receive the stabilization data group DT_STB1, the stabilization data group DT_STB2, and the image time signal TS_IMG from the stabilization data generating unit 120_1. The synchronizer 160 may receive the image data DT_IMG, the image time signal TS_IMG, the stabilization data group DT_STB1, and the stabilization data group DT_STB2 from the buffer 150, and may synchronize the received data to output the image sensor output data DT_ISOUT2. The stabilization data group DT_STB1 may include the AF data DT_AF′, the AF time information TS_AF, the OIS data DT_OIS′, and the OIS time information TS_OIS, and the stabilization data group DT_STB2 may include the motion data DT_MT′ and the motion time information TS_MT.
According to an exemplary embodiment of the present inventive concept, the image sensor 100 may output the image data DT_IMG, the image time signal TS_IMG, the AF data DT_AF′, the AF time information TS_AF, the OIS data DT_OIS′, the OIS time information TS_OIS, the motion data DT_MT′, and the motion time information TS_MT to the image data processing unit 200 through one port.
Referring to
First, the process of acquiring the first frame among the image data DT_IMG is as follows.
At the first time point T11, pixels of a first row among the plurality of pixels start to be exposed. In other words, the shutter enters an open state. The exposure of the pixels of the first row is performed from the first time point T11 to a third time point T13. In other words, the exposure of the pixels of the first row ends at the third time point T13, and the shutter enters a closed state. Thereafter, a read operation on the pixels of the first row is performed in the section from the third time point T13 to a fifth time point T15. In other words, the read operation on the pixels of the first row starts at the third time point T13 and ends at the fifth time point T15.
Pixels of a last row of the plurality of pixels start to be exposed at a second time point T12, and the exposure of the pixels of the last row is performed from the second time point T12 to a fourth time point T14. Thereafter, the read operation on the pixels of the last row is performed in the section from the fourth time point T14 to the sixth time point T16. In other words, the read operation on the pixels of the last row starts at the fourth time point T14, and the read operation on the pixels of the last row ends at the sixth time point T16.
The operation of the pixels is controlled by a control signal (for example, the clock signal Clk of
The process of acquiring the second frame from the image data DT_IMG is similar to the process of acquiring the first frame. In other words, the pixels of the first row are exposed in the section between the seventh time point T21 and a ninth time point T23, and the read operation on the pixels of the first row is performed in the section between the ninth time point T23 and an eleventh time point T25. Further, the pixels of the last row are exposed in the section between an eighth time point T22 and a tenth time point T24, and the read operation on the pixels of the last row is performed in the section between the tenth time point T24 and the twelfth time point T26.
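Because rows of a rolling-shutter frame are exposed and read out at a fixed offset from one another, the exposure timing of any intermediate row can be interpolated from the transmitted time stamps. The sketch below assumes a uniform row-to-row delay; the function name, the 0-indexed row numbering, and the example values are illustrative, not part of the disclosure.

```python
def row_exposure_start(row, n_rows, t11, t13, t14):
    """Interpolate the exposure-start time of a given row (0-indexed)
    of a rolling-shutter frame, assuming rows are read out at a
    constant rate: the row-to-row delay is (T14 - T13) / (n_rows - 1),
    and row 0 starts exposing at T11."""
    row_delay = (t14 - t13) / (n_rows - 1)
    return t11 + row * row_delay

# With T11 = 1.0, T13 = 3.0, T14 = 4.0 and 5 rows, the last row
# starts exposing (T14 - T13) = 1.0 after the first row.
assert row_exposure_start(0, 5, 1.0, 3.0, 4.0) == 1.0
assert row_exposure_start(4, 5, 1.0, 3.0, 4.0) == 2.0
```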
According to an exemplary embodiment of the present inventive concept, time information on five time points per frame may be generated in the form of time stamps.
In the case of the first frame, for example, first time information TS_0 includes information on the first time point T11 at which the exposure of the pixels of the first row starts, second time information TS_1 includes information on the third time point T13 at which the exposure of the pixels of the first row ends and the read operation starts, and third time information TS_2 includes information on the fifth time point T15 at which the read operation on the pixels of the first row ends. Fourth time information TS_3 includes information on the fourth time point T14 at which the exposure of the pixels of the last row ends and the read operation starts, and fifth time information TS_4 includes information on the sixth time point T16 at which the read operation on the pixels of the last row ends.
Similarly, in the case of the second frame, sixth time information TS_5 including information on the seventh time point T21 at which the exposure of the pixels of the first row starts, seventh time information TS_6 including information on the ninth time point T23 at which the exposure of the pixels of the first row ends and the read operation starts, and eighth time information TS_7 including information on the eleventh time point T25 at which the read operation on the pixels of the first row ends are generated. Additionally, ninth time information TS_8 including information on the tenth time point T24 at which the exposure of the pixels of the last row ends and the read operation starts, and tenth time information TS_9 including information on the twelfth time point T26 at which the read operation on the pixels of the last row ends are generated.
According to an exemplary embodiment of the present inventive concept, the time information used for the stabilization processing operation of the image data DT_IMG may include information on an exposure start time point, an exposure end time point (or a read operation start time point), and a read operation end time point of pixels of the first row, and an exposure end time point (or a read operation start time point) and a read operation end time point of pixels of the last row. The exposure start time point of the pixels of the last row may be calculated using the exposure start time point and the exposure end time point of the pixels of the first row, and the exposure end time point of the pixels of the last row. For example, since a difference between the exposure start time point T11 of the pixels of the first row of the first frame and the exposure start time point T12 of the pixels of the last row is the same as a difference between the exposure end time point T13 of the pixels of the first row and the exposure end time point T14 of the pixels of the last row, the exposure start time point T12 of the pixels of the last row may be calculated according to Formula 1 below.
T12=T11+(T14−T13) [Formula 1]
As a result, by reducing the amount of time information required for the stabilization processing operation of the image data DT_IMG, the workload of the image processing system 10 can be minimized.
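Formula 1 above can be checked with a small sketch; the time values chosen here are illustrative.

```python
def last_row_exposure_start(t11, t13, t14):
    """T12 = T11 + (T14 - T13): the last row starts exposing later
    than the first row by exactly the first-to-last readout offset,
    so T12 need not be transmitted as a separate time stamp."""
    return t11 + (t14 - t13)

# With T11 = 1.0, T13 = 3.0, and T14 = 4.0, T12 = 2.0.
assert last_row_exposure_start(1.0, 3.0, 4.0) == 2.0
```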
The time information described above may be generated for the image data DT_IMG and the stabilization data (e.g., DT_LP of
Referring to
The input module 210 may receive the input of the image sensor output data DT_ISOUT that is output from the image sensor 100. The image sensor output data DT_ISOUT may include the image data DT_IMG, the image time information TS_IMG, and the stabilization data group DT_STB, and the stabilization data group DT_STB may include stabilization data (e.g., DT_LP′ of
The comparing module 230 may compare the received stabilization data group DT_STB and the image time information TS_IMG to output comparison output data DT_COMP. For example, when the stabilization data group DT_STB includes the lens position data DT_LP′ and the lens position time information TS_LP, the image time information TS_IMG and the lens position time information TS_LP may be compared and output. As another example, when the stabilization data group DT_STB includes the motion data DT_MT′ and the motion time information TS_MT, the image time information TS_IMG and the motion time information TS_MT may be compared and output.
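One simple model of the comparison described above is a nearest-neighbor match between the image time information and the stabilization time information; the function below is an illustrative sketch under that assumption, not the disclosed circuit.

```python
def nearest_sample(ts_img, samples):
    """Return the stabilization sample whose time stamp is closest to
    the image time stamp ts_img; each sample is a (timestamp, value)
    pair, e.g. a lens position or a motion measurement."""
    return min(samples, key=lambda s: abs(s[0] - ts_img))

# Lens position samples as (timestamp, position) pairs.
lp_samples = [(1.0, 10), (2.0, 14), (3.0, 12)]
assert nearest_sample(2.2, lp_samples) == (2.0, 14)
```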
The stabilization vector extracting module 250 may generate and output stabilization vector data DT_STBV, using the comparison output data DT_COMP of the comparing module 230. For example, when the stabilization data group DT_STB includes the lens position data DT_LP′ and the lens position time information TS_LP, since an auto focusing (AF) processing or optical image stabilization (OIS) processing operation has already been performed in the acquisition process of the image data DT_IMG, the stabilization vector data DT_STBV may be generated in consideration of this. As a result, it is possible to prevent artifacts that would otherwise occur when the EIS processing operation duplicates a correction that the AF processing or OIS processing operation has already applied.
As another example, when the stabilization data group DT_STB includes the motion data DT_MT′ and the motion time information TS_MT, the stabilization vector data DT_STBV may be generated and output in consideration of information on the motion of the object 20.
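A minimal sketch of why the lens position data matters when extracting the stabilization vector is shown below. It assumes, purely for illustration, that both the measured motion and the OIS correction are expressed as 2-D pixel shifts; in practice the mapping from lens displacement to image shift depends on the optics.

```python
def stabilization_vector(measured_shift, ois_shift):
    """Residual correction for EIS: the total image shift implied by
    the motion data minus the shift already cancelled optically by
    OIS. Subtracting the applied OIS correction prevents the same
    motion from being corrected twice."""
    return (measured_shift[0] - ois_shift[0],
            measured_shift[1] - ois_shift[1])

# The motion data implies the frame shifted by (5, -3) pixels, and
# OIS already compensated (4, -2) of that shift optically, so EIS
# only needs to correct the remaining (1, -1).
assert stabilization_vector((5, -3), (4, -2)) == (1, -1)
```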
After this, the stabilization processing module 270 may perform the stabilization processing operation of the image data DT_IMG to output corrected data DT_DSP, on the basis of the image data DT_IMG received from the input module 210 and the stabilization vector data DT_STBV received from the stabilization vector extracting module 250. In other words, it is possible to correct the image data DT_IMG and output the corrected data DT_DSP to the display unit 300.
According to an exemplary embodiment of the present inventive concept, the comparing module 230, the stabilization vector extracting module 250, and the stabilization processing module 270 may be implemented in software. In other words, the image data processing unit 200 may be implemented in one application processor AP, and the application processor may function as the comparing module 230, the stabilization vector extracting module 250, and the stabilization processing module 270 via software. Alternatively, elements of the image data processing unit 200 may be implemented in hardware, e.g., as circuits.
Referring to
In operation S120, the clock signal generating unit 130 may generate the clock signal Clk on the current time information and may output the clock signal Clk to the video acquiring unit 110 and the stabilization data generating unit 120.
In operation S130, the stabilization data generating unit 120 may generate the lens position data DT_LP′ on the position of the lens 400. According to an exemplary embodiment of the present inventive concept, the lens position data DT_LP′ may include information on the position of the lens 400 in the incident direction of light, and may include result information from performing the AF operation of the lens 400. According to an exemplary embodiment of the present inventive concept, the lens position data DT_LP′ may include information on the position of the lens 400 in the direction perpendicular to the incident direction of light, and may include result information from performing the OIS operation of the lens 400.
In operation S140, the stabilization data generating unit 120 may generate the image time information TS_IMG and the lens position time information TS_LP on the basis of the clock signal Clk received from the clock signal generating unit 130. According to an exemplary embodiment of the present inventive concept, when the lens position data DT_LP′ includes result information from the AF operation of the lens 400, the lens position time information TS_LP may be the AF time information TS_AF on the AF data DT_AF. According to an exemplary embodiment of the present inventive concept, when the lens position data DT_LP′ includes the result information from the OIS operation of the lens 400, the lens position time information TS_LP may be the OIS time information TS_OIS on the OIS data DT_OIS.
In operation S150, the image sensor 100 may output the image data DT_IMG, the image time information TS_IMG, and the stabilization data group DT_STB. The stabilization data group DT_STB may include the lens position data DT_LP′ and the lens position time information TS_LP. According to an exemplary embodiment of the present inventive concept, the buffer 150 receives the image data DT_IMG from the video acquiring unit 110 and outputs it to the synchronizer 160, and the stabilization data generating unit 120 outputs the image time information TS_IMG and the stabilization data group DT_STB to the synchronizer 160. The synchronizer 160 may then synchronize the received image data DT_IMG, the image time information TS_IMG, and the stabilization data group DT_STB, and may output them to the image data processing unit 200.
According to an exemplary embodiment of the present inventive concept, the buffer 150 may receive the image data DT_IMG from the video acquiring unit 110, and may receive the image time information TS_IMG and the stabilization data group DT_STB from the stabilization data generating unit 120. Thereafter, the buffer 150 may output the received image data DT_IMG, the image time information TS_IMG, and the stabilization data group DT_STB to the synchronizer 160, and the synchronizer 160 may synchronize the received data and output it to the image data processing unit 200.
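The synchronization performed by the synchronizer 160 can be illustrated with a nearest-timestamp pairing policy, shown below as a Python sketch. The disclosure does not specify the matching policy; nearest-timestamp pairing and the function name `synchronize` are illustrative assumptions.

```python
def synchronize(image_frames, stab_records):
    """Pair each image frame with the stabilization record whose timestamp is
    closest to the frame's acquisition time. image_frames is a list of
    (ts_img, dt_img) pairs; stab_records is a list of (ts, data) pairs, both
    timestamped from the same clock signal Clk."""
    out = []
    for ts_img, dt_img in image_frames:
        # Nearest-timestamp match: an assumed policy, not the disclosed one.
        best = min(stab_records, key=lambda rec: abs(rec[0] - ts_img))
        out.append((dt_img, ts_img, best))
    return out
```

The shared clock signal is what makes this pairing meaningful: both timestamp streams refer to the same time base.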
Referring to
In operation S151, the image sensor 100 may output the image data DT_IMG, the image time information TS_IMG, and the stabilization data group DT_STB, using one port. According to an exemplary embodiment of the present inventive concept, the stabilization data group DT_STB may include the lens position data DT_LP′ and the lens position time information TS_LP on the lens position data DT_LP′.
According to an exemplary embodiment of the present inventive concept, the output port may be a mobile industry processor interface (MIPI) port. In other words, the image sensor output data DT_IS_OUT may be output from the image sensor 100 to the image data processing unit 200 through MIPI. In this case, the image data DT_IMG, the image time information TS_IMG, and the stabilization data group DT_STB included in the image sensor output data DT_IS_OUT may be output in an embedded data format or a general data format of MIPI.
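Single-port output can be illustrated by serializing the time information and stabilization data as a small header preceding the image payload, so that everything travels over one interface. The byte layout below is purely illustrative and does not reproduce the actual MIPI CSI-2 embedded-data packet format; the pack/unpack helper names are hypothetical.

```python
import struct

# Illustrative header: uint32 image timestamp, two int32 lens-shift components,
# uint32 lens-position timestamp (16 bytes total), followed by the raw image bytes.
_HDR_FMT = "<IiiI"
_HDR_SIZE = struct.calcsize(_HDR_FMT)

def pack_sensor_output(ts_img: int, dt_lp: tuple, ts_lp: int, image: bytes) -> bytes:
    """Serialize timestamps and stabilization data ahead of the image payload,
    mimicking single-port output of the image sensor output data."""
    header = struct.pack(_HDR_FMT, ts_img, dt_lp[0], dt_lp[1], ts_lp)
    return header + image

def unpack_sensor_output(blob: bytes):
    """Recover the fields on the receiving (image data processing) side."""
    ts_img, x, y, ts_lp = struct.unpack_from(_HDR_FMT, blob)
    return ts_img, (x, y), ts_lp, blob[_HDR_SIZE:]
```

A real implementation would instead place this metadata in MIPI CSI-2 embedded data lines, as the paragraph above notes.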
Referring to
The stabilization data generating unit 120 may generate first stabilization data on the position of the lens 400 (e.g., the lens position data DT_LP′) in operation S131, and may generate second stabilization data on the motion of the object 20 (e.g., the motion data DT_MT′) in operation S133.
In operation S141, the stabilization data generating unit 120 may generate and output the image time information TS_IMG, first stabilization time information (e.g., the lens position time information TS_LP) and second stabilization time information (e.g., the motion time information TS_MT) on the basis of the clock signal Clk.
In operation S153, the image sensor may output the image data DT_IMG, the image time information TS_IMG, a first stabilization data group, and a second stabilization data group. The first stabilization data group includes the first stabilization data and the first stabilization time information. Additionally, the second stabilization data group includes the second stabilization data and the second stabilization time information.
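The grouping of operation S153 can be sketched as a simple container type: the output bundles the image data and image time information together with a first stabilization data group (lens position) and a second stabilization data group (motion), each pairing stabilization data with its time information. The dataclass names below are hypothetical.

```python
from dataclasses import dataclass
from typing import Any

@dataclass
class StabilizationGroup:
    data: Any  # stabilization data, e.g., DT_LP' or DT_MT'
    ts: int    # stabilization time information, e.g., TS_LP or TS_MT

@dataclass
class SensorOutput:
    dt_img: Any                       # image data DT_IMG
    ts_img: int                       # image time information TS_IMG
    first_group: StabilizationGroup   # lens position group (DT_LP', TS_LP)
    second_group: StabilizationGroup  # motion group (DT_MT', TS_MT)
```

Keeping each stabilization datum paired with its own timestamp lets the downstream processing stage relate each correction input to the moment it was measured.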
In operation S260, the image data processing unit 200 may receive the image sensor output data DT_IS_OUT from the image sensor 100, and may perform the stabilization processing operation on the image data DT_IMG on the basis thereof. The image sensor output data DT_IS_OUT includes the image data DT_IMG, the image time information TS_IMG, and the stabilization data group DT_STB; here, the stabilization data group DT_STB may include stabilization data and stabilization time information.
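The benefit of passing the OIS result to the EIS stage can be illustrated with a simplified linear model: the EIS stage applies only the residual motion, i.e., the sensed motion minus the displacement already compensated optically by the lens shift, so the same motion is not corrected twice. The model, gain parameter, and function name below are illustrative assumptions, not the claimed method.

```python
def eis_residual_correction(motion_vec, lens_shift, ois_gain=1.0):
    """Residual correction for the EIS stage: subtract the motion component
    already compensated by the OIS lens shift (scaled by an assumed gain),
    avoiding the duplicated correction described in the background."""
    return (motion_vec[0] - ois_gain * lens_shift[0],
            motion_vec[1] - ois_gain * lens_shift[1])
```

With the stabilization time information available, the processing unit can also select the lens shift measured closest in time to each frame before computing this residual.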
The process described with reference to
In operation S270, the image data processing unit 200 outputs image processing data obtained by processing the image data DT_IMG, such as by the stabilization processing.
In operation S280, the display unit 300 receives the image processing data and displays an image on the basis of the image processing data.
While the present inventive concept has been shown and described with reference to exemplary embodiments thereof, it will be apparent to those of ordinary skill in the art that various modifications in form and details may be made thereto without departing from the spirit and scope of the present inventive concept as set forth by the appended claims.
Number | Date | Country | Kind |
---|---|---|---|
10-2019-0065663 | Jun 2019 | KR | national |