1. Technical Field
The present invention is related to data encoding, more specifically to data encoding executed in an imaging device.
2. Description of the Related Art
By mounting a small or thin imaging device on a small or thin portable terminal, such as a portable phone or a PDA (personal digital assistant), the portable terminal can also function as an imaging device. Thanks to this development, the portable terminal, such as the portable phone, can send not only audio information but also visual information. The imaging device has also been mounted on portable terminals such as the MP3 player, besides the portable phone and PDA. As a result, a variety of portable terminals can now function as imaging devices, capturing an external image and retaining the image as electronic data.
Generally, the imaging device uses a solid state imaging device such as a CCD (charge-coupled device) image sensor or a CMOS (complementary metal-oxide semiconductor) image sensor.
As shown in the accompanying drawings, the conventional imaging device comprises an image sensor 110, an image signal processor 120, a back-end chip 130, a baseband chip 140 and a display unit 150.
The image sensor 110 has a Bayer pattern and outputs an electrical signal, corresponding to the amount of light inputted through a lens, per unit pixel.
The image signal processor 120 converts raw data inputted from the image sensor 110 to a YUV value and outputs the converted YUV value to the back-end chip. Based on the fact that the human eye reacts more sensitively to luminance than to chrominance, the YUV method divides a color into a Y component, which is luminance, and U and V components, which are chrominance. Since the human eye is more sensitive to errors in the Y component, more bits are coded in the Y component than in the U and V components. A typical Y:U:V ratio is 4:2:2.
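By way of illustration only, the following Python sketch (with hypothetical array names, not part of any described circuit) shows the idea of 4:2:2 subsampling: every luminance sample is kept while the horizontal resolution of the chrominance samples is halved.

```python
import numpy as np

def subsample_yuv422(y, u, v):
    """Keep every Y sample; keep every second U and V sample along each row
    (4:2:2 subsampling), since the eye is less sensitive to chrominance."""
    return y, u[:, ::2], v[:, ::2]

# Example with a hypothetical 4x4 frame: the chroma planes become 4x2.
y = np.arange(16).reshape(4, 4)
u = np.full((4, 4), 128)
v = np.full((4, 4), 128)
y2, u2, v2 = subsample_yuv422(y, u, v)
print(y2.shape, u2.shape, v2.shape)   # (4, 4) (4, 2) (4, 2)
```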
By sequentially storing the converted YUV value in FIFO, the image signal processor 120 allows the back-end chip 130 to receive corresponding information.
The back-end chip 130 converts the inputted YUV value to JPEG or BMP data through a predetermined encoding method and stores the encoded data in a memory, or decodes the encoded image, stored in the memory, to display it on the display unit 150. The back-end chip 130 can also enlarge, reduce or rotate the image. Of course, it is possible, as shown in
The baseband chip 140 controls the general operation of the imaging device. For example, once a command to capture an image is received from a user through a key input unit (not shown), the baseband chip 140 can make the back-end chip 130 generate encoded data corresponding to the inputted external image by sending an image generation command to the back-end chip 130.
The display unit 150 displays the decoded data, provided by the control of the back-end chip 130 or the baseband chip 140.
As illustrated in the accompanying drawings, in a step represented by 210, the image data is divided into blocks of a predetermined size (e.g. 8×8) and a DCT (discrete cosine transform) is applied to each block to generate DCT coefficients.
Then, in a step represented by 220, a quantizer quantizes the DCT coefficients of each block by applying a weighted value according to its effect on visual perception. A table of these weighted values is called a "quantization table." A quantization table entry takes a small value near the DC component and a large value at high frequencies, keeping the data loss low near the DC component and compressing the data more heavily at high frequencies.
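Purely as an illustration, the following sketch quantizes a block of DCT coefficients with a simplified quantization table whose entries grow from the DC position toward the high-frequency corner; the table values are stand-ins, not the standard JPEG table.

```python
import numpy as np

def quantize_block(dct_block, q_table):
    """Divide each DCT coefficient by its table entry and round, so that
    coefficients near the DC position lose little precision while
    high-frequency coefficients are compressed more heavily."""
    return np.round(dct_block / q_table).astype(int)

# Simplified 8x8 quantization table: small values near DC (top-left),
# larger values toward the high-frequency corner (bottom-right).
i, j = np.meshgrid(np.arange(8), np.arange(8), indexing="ij")
q_table = 1 + 2 * (i + j)

dct_block = (np.random.randn(8, 8) * 50).round()   # stand-in DCT coefficients
print(quantize_block(dct_block, q_table))
```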
Then, in a step represented by 230, the final compressed data is generated by an entropy encoder, which is a lossless coder.
The data encoded through the above steps is stored in a memory. The back-end chip decodes the data loaded in the memory and displays the data on the display unit 150.
Signal types during the steps of sequentially inputting and processing (for example, decoding) the data stored in the memory are shown in the accompanying drawings.
As shown in the accompanying drawings, valid data and invalid data (e.g. dummy data) are mixed in the data column provided to the back-end chip 130, so the back-end chip 130 also scans and processes the invalid data.
As a result, the back-end chip 130 of the conventional imaging device consumed unnecessary electric power by carrying out an unnecessary operation.
Moreover, as shown in the accompanying drawings, the input of a following frame can start from the image sensor 110 before the processing of the preceding frame is completed.
In this case, the back-end chip 130 sometimes has to process not only the frame that is currently being processed but also the next frame, and thus cannot complete the input and/or processing of correct data.
In order to solve the problems described above, the present invention provides a method of transferring encoded data and an imaging device executing the method thereof that can increase the process efficiency and reduce power consumption of the back-end chip.
The present invention also provides a method of transferring encoded data and an imaging device executing the method thereof that can increase the process efficiency and process speed of the back-end chip by having valid data, forming an image, concentrated in the front part of an outputted data column.
The present invention also provides a method of transferring encoded data and an imaging device executing the method thereof that can make the hardware design and control easier by using a general interface structure when the image signal processor provides encoded data to the back-end chip.
The present invention also provides a method of transferring encoded data and an imaging device executing the method thereof that can perform a smooth encoding operation by allowing the image signal processor to determine, in accordance with the encoding speed, whether the inputted frame is to be encoded.
Other objects of the present invention will become apparent through the preferred embodiments described below.
To achieve the above objects, an aspect of the present invention features an image signal processor and/or an imaging device having the image signal processor.
According to an embodiment of the present invention, the image signal processor of the imaging device has an encoding unit, which generates encoded image data by encoding, in accordance with a predetermined encoding method, image data corresponding to an electrical signal inputted from the image sensor, and a data output unit, which transfers the encoded image data, inputted sequentially from the encoding unit, for each frame to a receiving part in accordance with a predetermined basis. The receiving part is a back-end chip or a baseband chip. The predetermined basis allows a series of data to be outputted at a certain interval for a certain duration, and the series of data comprise valid data, followed by dummy data, of the encoded image data.
The encoding unit can notify the data output unit, at every interval, of the amount of encoded image data or valid data such that the data output unit can determine an output amount of the dummy data.
In case information for starting to input a following frame is inputted from the image sensor or the encoding unit while a preceding frame is processed by the encoding unit, the data output unit can input into the image sensor or the encoding unit a skip command to have the following frame skip the process.
The predetermined encoding method can be one of a JPEG encoding method, a BMP encoding method, an MPEG encoding method, and a TV-out method.
The image signal processor can further comprise a clock generator.
The data output unit can output a clock signal to the receiving part only in a section to which valid data is delivered.
The data output unit can further output a vertical synchronous signal (V_sync) and a valid data enable signal to the receiving part.
The data output unit can comprise a V_sync generator, which generates and outputs the vertical synchronous signal of high or low state in accordance with a vertical synchronous signal control command, an H_sync generator, which generates and outputs the valid data enable signal of high or low state in accordance with a valid data enable control command, a delay unit, which outputs in accordance with a data output control command a series of data for a certain duration, and a transmission control unit, which generates and outputs the vertical synchronous signal control command, the valid data enable control command, and the data output control command. The series of data comprise valid data and dummy data, and valid data of the encoded image data are outputted first, followed by dummy data for a remaining duration.
The certain duration can be a length of time for which the valid data enable signal is continuously outputted in a high state.
The valid data enable signal can be interpreted as a write enable signal in the receiving part.
The transmission control unit can determine, by using header information and tail information of the encoded image data stored in the delay unit, whether encoding of the preceding frame is completed.
In case input start information of the following frame is inputted while the preceding frame is being processed, the transmission control unit can control to maintain the current state if the vertical synchronous signal outputted by the V_sync generator is in a low state.
According to another embodiment of the present invention, the image signal processor of the imaging device comprises a V_sync generator, which generates and outputs a vertical synchronous signal of high or low state in accordance with a vertical synchronous signal control command, an H_sync generator, which generates and outputs a valid data enable signal of high or low state in accordance with a valid data enable control command, a delay unit, which outputs in accordance with a data output control command a series of data for a certain duration, and a transmission control unit, which generates and outputs the vertical synchronous signal control command, the valid data enable control command, and the data output control command. The series of data can comprise valid data and dummy data, and valid data of the encoded image data can be outputted first, followed by dummy data for a remaining duration.
According to another embodiment of the present invention, the imaging device, comprising an image sensor, an image signal processor, a back-end chip, and a baseband chip, comprises an encoding unit, which generates encoded image data by encoding, in accordance with a predetermined encoding method, image data corresponding to an electrical signal inputted from the image sensor; and a data output unit, which transfers the encoded image data, inputted sequentially from the encoding unit, for each frame to a receiving part in accordance with a predetermined basis. The receiving part is a back-end chip or a baseband chip. The predetermined basis can allow a series of data to be outputted at a certain interval for a certain duration, and the series of data can comprise valid data, followed by dummy data, of the encoded image data.
In order to achieve the above objects, another aspect of the present invention features a method of processing an image signal executed in an image signal processor and/or a recorded medium recording a program for executing the method thereof.
According to an embodiment of the present invention, the method of processing the image signal, executed in the image signal processor of the imaging device comprising the image sensor, comprises (a) extracting only valid data from image data encoded and sequentially inputted by an encoding unit, and sequentially outputting the valid data to a receiving part, and (b) in case the outputting of the valid data finishes before the end of a predetermined duration, outputting dummy data to the receiving part for the remaining time of the predetermined duration. The receiving part is a back-end chip or a baseband chip.
The steps (a)-(b) can be repeated for one frame at every predetermined interval.
In case information for starting to input a following frame is inputted from the image sensor while a preceding frame is processed, the encoding process of the following frame can be controlled to be skipped.
Completion of encoding the preceding frame can be determined by using header information and tail information of the inputted encoded image data.
The predetermined duration can be a length of time for which the valid data enable signal is continuously outputted in a high state.
The valid data enable signal can be interpreted as a write enable signal in the receiving part.
The above objects, features and advantages will become more apparent through the below description with reference to the accompanying drawings.
Since there can be a variety of permutations and embodiments of the present invention, certain embodiments will be illustrated and described with reference to the accompanying drawings. This, however, is by no means intended to restrict the present invention to the certain embodiments, and the present invention shall be construed as including all permutations, equivalents and substitutes covered by the spirit and scope of the present invention. Throughout the drawings, similar elements are given similar reference numerals. Throughout the description of the present invention, when it is determined that a detailed description of a related known technology would obscure the point of the present invention, the pertinent detailed description will be omitted.
Terms such as "first" and "second" can be used in describing various elements, but the above elements shall not be restricted to the above terms. The above terms are used only to distinguish one element from the other. For instance, the first element can be named the second element, and vice versa, without departing from the scope of claims of the present invention. The term "and/or" shall include the combination of a plurality of listed items or any of the plurality of listed items.
When one element is described as being "connected" or "accessed" to another element, it shall be construed as possibly being connected or accessed to the other element directly, but also as possibly having another element in between. On the other hand, if one element is described as being "directly connected" or "directly accessed" to another element, it shall be construed that there is no other element in between.
The terms used in the description are intended to describe certain embodiments only, and shall by no means restrict the present invention. Unless clearly used otherwise, expressions in the singular number include a plural meaning. In the present description, an expression such as “comprising” or “consisting of” is intended to designate a characteristic, a number, a step, an operation, an element, a part or combinations thereof, and shall not be construed to preclude any presence or possibility of one or more other characteristics, numbers, steps, operations, elements, parts or combinations thereof.
Unless otherwise defined, all terms, including technical terms and scientific terms, used herein have the same meaning as how they are generally understood by those of ordinary skill in the art to which the invention pertains. Any term that is defined in a general dictionary shall be construed to have the same meaning in the context of the relevant art, and, unless otherwise defined explicitly, shall not be interpreted to have an idealistic or excessively formalistic meaning.
Hereinafter, preferred embodiments will be described in detail with reference to the accompanying drawings. Identical or corresponding elements will be given the same reference numerals, regardless of the figure number, and any redundant description of the identical or corresponding elements will not be repeated.
In describing the embodiments of the present invention, the process operation of the image signal processor, which is the core subject of the invention, will be described. However, it shall be evident that the scope of the present invention is by no means restricted by what is described herein.
As shown in the accompanying drawings, the imaging device in accordance with an embodiment of the present invention comprises an image sensor 110, an image signal processor 400, a back-end chip 405, a baseband chip 140 and a display unit 150.
The image signal processor 400 comprises a pre-process unit 410, a JPEG encoder 420 and a data output unit 430. The image signal processor 400 can of course further comprise a clock generator for internal operation.
The pre-process unit 410 performs pre-process steps in preparation for the processing by the JPEG encoder 420. The pre-process unit 410 can receive raw data, in the form of an electrical signal, from the image sensor 110, process the raw data for each frame per line, and then transfer the processed raw data to the JPEG encoder 420.
The pre-process steps can comprise at least one of the steps consisting of color space transformation, filtering and color subsampling.
The color space transformation transforms an RGB color space to a YUV (or YIQ) color space. This is to reduce the amount of information without a perceptible difference in picture quality.
The filtering is a step of smoothing the image using a low-pass filter in order to increase the compression ratio.
The color subsampling subsamples the chrominance signal components by using all of the Y values, some of the U and V values, and none of the remaining values.
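As an illustration of the color space transformation step only, the following sketch uses BT.601-style weights as an assumption; the pre-process unit 410 is not limited to these values.

```python
def rgb_to_yuv(r, g, b):
    """Transform one RGB sample to YUV using BT.601-style weights
    (shown only as one common convention)."""
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luminance
    u = 0.492 * (b - y)                     # blue-difference chrominance
    v = 0.877 * (r - y)                     # red-difference chrominance
    return y, u, v

print(rgb_to_yuv(255, 0, 0))   # a pure red sample
```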
The JPEG encoder 420 compresses the pre-processed raw data, as in the method described earlier, and generates JPEG encoded data. The JPEG encoder 420 can comprise an input memory for temporarily storing the processed raw data inputted from the pre-process unit 410 in order to divide the raw data into predetermined block units (e.g. 8×8) for encoding. The JPEG encoder 420 can further comprise an output memory, which temporarily stores the JPEG encoded data prior to outputting the JPEG encoded data to the data output unit 430. The output memory can be, for example, a FIFO. In other words, the image signal processor 400 of the present invention can also encode image data, unlike the conventional image signal processor 120. In addition, the JPEG encoder 420 (or the output memory) can provide status information (e.g. on the amount of stored JPEG encoded data) to a transmission control unit 550, described below.
The data output unit 430 transfers the JPEG encoded data, generated by the JPEG encoder 420, to the back-end chip 405 (or a camera control processor, hereinafter referred to as "back-end chip" 405).
When transferring the JPEG encoded data to the back-end chip 405, the data output unit 430 outputs the data at every predetermined interval, and the size of the total outputted data (i.e. JPEG encoded valid data (i.e. JPEG encoded data actually forming an image) and/or dummy data) coincides with a predetermined line size. Invalid data mentioned in this description refers to what is described in, for example, the JPEG standard as data that is not valid (i.e. data not actually forming an image), and is sometimes expressed as 0x00.
For example, in case the back-end chip 405 recognizes that all JPEG encoded data for one frame has been received when data of a 640×480 frame size is accumulated, the data output unit 430 sequentially outputs the valid data, among the JPEG encoded data inputted from the JPEG encoder 420, and dummy data until data amounting to the line size of 640 is outputted.
The dummy data is added only to fill up the data until the line size of 640 is reached in case the valid data outputted by the data output unit 430 falls short of the line size of 640. This is because the back-end chip 405 may not recognize the data if the data is smaller than the line size.
This will be sequentially repeated 480 times, which is the column size, at predetermined time intervals.
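By way of illustration only, the following sketch models this output rule under the assumption of a hypothetical byte queue fed by the encoder: for each of the 480 line periods, whatever valid JPEG bytes are available are output first, and dummy bytes (0x00) fill the remainder of the 640-byte line.

```python
from collections import deque

LINE_SIZE = 640   # bytes output per line period
NUM_LINES = 480   # number of line periods per frame
DUMMY = 0x00      # dummy (invalid) byte used as padding

def output_one_line(valid_queue: deque) -> bytes:
    """Emit exactly LINE_SIZE bytes: whatever valid encoded bytes are
    available first, then dummy padding so the receiver always sees a
    full line."""
    line = bytearray()
    while valid_queue and len(line) < LINE_SIZE:
        line.append(valid_queue.popleft())
    line.extend([DUMMY] * (LINE_SIZE - len(line)))
    return bytes(line)

# Stand-in encoded data: 300 valid bytes; the remaining lines of the
# 640x480 frame are filled entirely with dummy bytes.
encoded = deque([0x5A] * 300)
frame = [output_one_line(encoded) for _ in range(NUM_LINES)]
print(sum(b != DUMMY for line in frame for b in line))   # 300 valid bytes forwarded
```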
If the V_sync_I signal, which notifies the input of the following frame (e.g. the (k+1)th inputted frame, hereinafter referred to as the "(k+1)th frame," where k is a natural number), is inputted from the image sensor 110 although the JPEG encoder 420 has not finished encoding a particular frame (e.g. the kth inputted frame, hereinafter referred to as the "kth frame"), the data output unit 430 controls a V_sync generator 520, described below, such that the output of the V_sync signal corresponding to the following frame is skipped.
The input of a new frame can be detected by various methods, including, for example, detecting a rising edge or falling edge of the V_sync signal, but the case of detecting the rising edge will be described here.
In other words, if the V_sync generator 520 is outputting a low state of the V_sync signal (i.e. no new frame is being inputted) to the back-end chip 405, the data output unit 430 can control the V_sync generator 520 to maintain the current state (refer to V_sync2 illustrated with dotted lines in the accompanying drawings).
Of course, it is possible in this case that the data output unit 430 sends to the image sensor 110, the pre-process unit 410 or the JPEG encoder 420 a V_sync_skip signal for skipping the output and/or processing of the (k+1)th frame corresponding to the V_sync_I signal.
Here, the image sensor 110, the pre-process unit 410 or the JPEG encoder 420 must be realized in advance to carry out a predetermined operation when the V_sync_skip signal is received from the data output unit 430. The method of designing and realizing the above elements shall be easily understood through the present description by anyone skilled in the art, and hence will not be further described.
For example, in case the image sensor 110 receives the V_sync_skip signal, it is possible to designate that the raw data of the frame corresponding to the V_sync_I signal is not sent to the pre-process unit 410. If the pre-process unit 410 receives the V_sync_skip signal, it is possible to designate that the processing of the raw data of the frame corresponding to the V_sync_I signal is skipped or that the processed raw data is not sent to the JPEG encoder 420. Likewise, if the JPEG encoder 420 receives the V_sync_skip signal, it is possible to designate that the processed raw data of the frame corresponding to the V_sync_I signal is not encoded or that the processed raw data received from the pre-process unit 410 is not stored in the input memory.
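By way of illustration only, the following sketch models the skip decision with hypothetical class and method names: when the start of frame k+1 is signaled while frame k is still being encoded, a V_sync_skip is issued instead of processing the new frame.

```python
class SkipController:
    """Illustrative model of the frame-skip decision made by the data
    output unit; names are hypothetical."""

    def __init__(self):
        self.encoding_busy = False   # True while the preceding frame is still being encoded

    def on_encode_start(self):
        self.encoding_busy = True

    def on_encode_done(self):
        self.encoding_busy = False

    def on_v_sync_i(self, frame_index):
        """Called when the image sensor signals the start of a new frame."""
        if self.encoding_busy:
            # Preceding frame not finished: issue V_sync_skip so the sensor,
            # pre-process unit and encoder skip the new frame entirely.
            return f"V_sync_skip issued for frame {frame_index}"
        return f"frame {frame_index} accepted for encoding"

ctrl = SkipController()
ctrl.on_encode_start()                  # frame k is still being encoded
print(ctrl.on_v_sync_i(frame_index=2))  # frame k+1 arrives too early -> skipped
ctrl.on_encode_done()
print(ctrl.on_v_sync_i(frame_index=3))  # next frame is accepted
```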
Through the above steps, although the raw data corresponding to a plurality of frames (referred to as #1, #2, #3 and #4 herein in accordance with the order of input) are sequentially inputted from the image sensor 110, only the encoded image data for the frames corresponding to #1, #3 and #4 may be inputted to the back-end chip 405 by the operation or control of the data output unit 430.
If a command to, for example, capture a picture is received from the baseband chip 140, which controls the general operation of the portable terminal, the back-end chip 405 receives the picture-quality-improved JPEG encoded data inputted from the image signal processor 400, stores it in the memory, and then decodes and displays the data on the display unit 150, or the baseband chip 140 reads and processes the data.
The detailed structure of the data output unit 430 is illustrated in the accompanying drawings.
Referring to the accompanying drawings, the data output unit 430 comprises an AND gate 510, a V_sync generator 520, an H_sync generator 530, a delay unit 540 and a transmission control unit 550.
The AND gate 510 outputs a clock signal (P_CLK) to the back-end chip 405 only when a signal is inputted to every one of its inputs. That is, by receiving the clock signal from a clock generator (not shown), disposed in the image signal processor 400, and receiving a clock control signal from the transmission control unit 550, the AND gate 510 outputs the clock signal to the back-end chip 405 only when the clock control signal instructs the output of the clock signal. The clock control signal can be a high signal or a low signal, which can be recognized as a P_CLK enable signal or a P_CLK disable signal, respectively.
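By way of illustration only, the gating can be modeled as a logical AND of the internal clock and the clock control signal; the actual AND gate 510 is a hardware element, and the function name below is hypothetical.

```python
def gated_p_clk(internal_clock: bool, clock_control: bool) -> bool:
    """P_CLK reaches the back-end chip only while the transmission
    control unit asserts the clock control (enable) signal."""
    return internal_clock and clock_control

# The clock is forwarded only during the enabled sections.
for tick, enable in enumerate([False, True, True, False]):
    print(tick, gated_p_clk(True, enable))
```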
The V_sync generator 520 generates and outputs the vertical synchronous signal (V_sync) for indicating a valid section, under the control of the transmission control unit 550. The V_sync generator 520 outputs a high state of the V_sync signal from when an output command of the V_sync signal is inputted by the transmission control unit 550 until an output termination command of the V_sync signal is inputted. It shall be evident to anyone skilled in the art that the vertical synchronous signal indicates the start of input of each frame.
The H_sync generator 530 generates and outputs a valid data enable signal (H_REF) by the control of the transmission control unit 550 (i.e. until an output termination command of H_REF is inputted after an output command of H_REF is inputted). The high section of the valid data enable signal coincides with the output section of data (i.e. valid data and/or dummy data) outputted in real time by the delay unit 540 to correspond to the predetermined line size, and is determined by the duration for which the amount of data corresponding to the predetermined line size is outputted.
In case the size of a frame is determined to be n×m, the duration for which the H_REF signal is maintained in a high state will be the duration for which the data in the size of n (i.e. valid data+dummy data) is outputted, and there will be a total of m output sections of the H_REF signal in the high state for one frame. This is because the back-end chip 405 recognizes that all JPEG encoded data are inputted for one frame only if data in the size of n×m are accumulated in the memory.
The delay unit 540 sequentially outputs valid data of the JPEG encoded data, inputted from the JPEG encoder 420, during the data output section (i.e. H_REF is outputted in a high state). The delay unit 540 can comprise, for example, a register for delaying the data inputted from the JPEG encoder 420 for a predetermined duration (e.g. 2-3 clocks) before outputting the data. It shall be evident, without further description, to those of ordinary skill in the art that the transmission control unit 550 can determine whether the JPEG encoded data stored temporarily in the delay unit is valid data.
If there is no more valid data to transmit while H_REF is still in the high state (i.e. JPEG encoded data is not inputted from the output memory of the JPEG encoder 420), the dummy data are outputted for the rest of the time during which H_REF is maintained in the high state.
The dummy data can be generated in real time in the delay unit 540 by a dummy data generation command, which is generated in real time by the transmission control unit 550, or can be pre-generated or pre-determined.
As shown in the accompanying drawings, for each line the delay unit 540 outputs the valid data first, followed by the dummy data for the remainder of the section in which the H_REF signal is maintained in the high state.
By outputting as described above, the valid data of the data stored in the memory of the back-end chip 405 can be placed in the front part of each line although the amount of valid data is different for each line (refer to the accompanying drawings).
This can improve the process efficiency because the scanning speed of valid data can be increased when the back-end chip 405 processes the decoding per line.
The transmission control unit 550 determines the duration for which, and the number of times, the H_REF signal is to be maintained in a high state from the operation starting point of the imaging device or the data output unit 430. The duration and the number can be set by the user or determined by default to correspond to the line size and the number of columns recognized as one frame.
The transmission control unit 550 controls the output of the clock control signal, the V_sync generator 520, the H_sync generator 530 and the delay unit 540, in accordance with the determined duration and number, to control the output state of each signal (i.e. P_CLK, H_REF, V_sync and data).
The transmission control unit 550 can recognize the information on the start and end of the JPEG encoding by capturing a "START MARKER" and a "STOP MARKER" from the header and tail of the JPEG encoded data that the delay unit 540 sequentially receives from the JPEG encoder 420 and temporarily stores for outputting valid data. Through this, it becomes possible to recognize whether one frame has been completely encoded by the JPEG encoder 420.
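By way of illustration only, the following sketch assumes that the JPEG SOI marker (0xFF 0xD8) serves as the "START MARKER" and the EOI marker (0xFF 0xD9) as the "STOP MARKER"; the function name is hypothetical.

```python
SOI = b"\xff\xd8"   # JPEG start-of-image marker, assumed here as the START MARKER
EOI = b"\xff\xd9"   # JPEG end-of-image marker, assumed here as the STOP MARKER

def frame_encoding_complete(buffered: bytes) -> bool:
    """True once both the header (SOI) and the tail (EOI) of one
    JPEG-encoded frame have passed through the buffer."""
    start = buffered.find(SOI)
    return start != -1 and buffered.find(EOI, start + 2) != -1

print(frame_encoding_complete(b"\xff\xd8" + b"\x00" * 16))                 # False: tail not yet seen
print(frame_encoding_complete(b"\xff\xd8" + b"\x00" * 16 + b"\xff\xd9"))   # True: frame complete
```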
Using the status information inputted from the JPEG encoder 420 (or the output memory), the transmission control unit 550 can transmit a dummy data output command to the delay unit 540 to have the dummy data outputted from a certain point (i.e. when the transmission of the valid data is completed).
Of course, it is also possible to place before the delay unit 540 a multiplexer (MUX), through which the JPEG encoded data and the dummy data are passed, so that the delay unit 540 receives and outputs the JPEG encoded data and the dummy data. In this case, if the transmission control unit 550, which has recognized in advance the amount of inputted JPEG encoded data (or valid data) using the status information, inputs a dummy data output command to the multiplexer at a certain point, the MUX can then have pre-designated dummy data inputted to the delay unit 540.
If the V_sync_I signal, which indicates the input of the (k+1)th frame, is inputted from the image sensor 110 although the JPEG encoding of the kth frame is not finished, the transmission control unit 550 controls the V_sync generator 520, as described earlier, to skip the output of the V_sync signal. In other words, if the V_sync generator 520 is currently outputting a low state of the V_sync signal to the back-end chip 405, the V_sync generator 520 will be controlled to maintain the current state (refer to V_sync2 illustrated with dotted lines in the accompanying drawings).
Then, as described earlier in detail, the transmission control unit 550 can control the output and processing (e.g. JPEG encoding) of data of the following frame corresponding to the V_sync_I signal to be skipped by transmitting the V_sync_skip signal to the image sensor 110, the pre-process unit 410 or the JPEG encoder 420.
This is because the following element does not have to carry out any unnecessary process if data corresponding to the V_sync_I signal is not inputted from the preceding element (e.g. the image sensor 110 that received the V_sync_skip signal does not output raw data corresponding to the V_sync_I signal), or because the following element can delete the inputted data (e.g. the JPEG encoder 420 that received the V_sync_skip signal does not encode, but deletes, the processed raw data received from the pre-process unit 410 in accordance with the V_sync_I signal). Using this method, each element of the image signal processor 400 carries out its predetermined function but does not process the following frame unnecessarily, reducing unnecessary power consumption and limiting the reduction in process efficiency.
The signal types inputted to the back-end chip 405 by the control of the transmission control unit 550 are shown in the accompanying drawings.
As shown in the accompanying drawings, the V_sync signal is outputted in a high state during the valid section of one frame, and the H_REF signal is outputted in a high state at every predetermined interval within that section.
The sections in which the H_REF signal is outputted in the high state coincide with the output sections of the valid data (which is followed by the dummy data (i.e. PAD)). In other words, the output of the valid data starts from the rising edge of the H_REF signal and terminates at the falling edge of the H_REF signal. Of course, if there is no more valid data at a certain point, dummy data will be outputted from that point to the falling edge.
Moreover, if the speed at which the JPEG encoder 420 encodes the image of the kth frame, inputted from the image sensor 110, is slow (e.g. V_sync_I, indicating the start of input of a new frame, is inputted while one frame is being encoded), the data output unit 430 allows the JPEG encoding to be completed by having the V_sync signal for the following frame maintained low (i.e. the dotted sections of V_sync2, shown in the accompanying drawings).
The conventional back-end chip 405 is embodied to receive the YUV/Bayer format of data, and uses the P_CLK, V_sync, H_REF and DATA signals as the interface for receiving these data.
Considering this, the image signal processor 400 of the present invention is embodied to use the same interface as the conventional image signal processor.
Therefore, it shall be evident that the back-end chip 405 of the present invention can be port-matched even though the back-end chip 405 is embodied through the conventional method of designing a back-end chip.
For example, if the operation of a typical back-end chip 405 is initialized by an interrupt on the rising edge of the V_sync signal, interfacing between the chips remains possible in the present invention, since the conventional interface structure is identically applied and the corresponding signal is inputted to the back-end chip 405 in the same manner as the conventional V_sync signal.
Likewise, considering that the typical back-end chip 405 must generate the V_sync rising interrupt and that the valid data enable signal (H_REF) is used as a write enable signal of the memory when data is received from the image signal processor 400, the power consumption of the back-end chip 405 can be reduced by using the signal output method of the present invention.
Hitherto, although the image signal processor 400 using the JPEG encoding method has been described, it shall be evident that the same data transmission method can be used for other encoding methods, such as the BMP encoding method, MPEG (MPEG 1/2/4 and MPEG-4 AVC) encoding and TV-out method.
As described above, the present invention can increase the process efficiency and reduce power consumption of the back-end chip.
The present invention can also increase the process efficiency and process speed of the back-end chip by having valid data, forming an image, concentrated in the front part of an outputted data column.
Moreover, the present invention can make the hardware design and control easier by using a general interface structure when the image signal processor provides encoded data to the back-end chip.
Furthermore, the present invention enables a smooth encoding operation by allowing the image signal processor to determine, in accordance with the encoding speed, whether the inputted frame is to be encoded.
The drawings and detailed description are only examples of the present invention, serve only for describing the present invention and by no means limit or restrict the spirit and scope of the present invention. Thus, any person of ordinary skill in the art shall understand that a large number of permutations and other equivalent embodiments are possible. The true scope of the present invention must be defined only by the spirit of the appended claims.
Number | Date | Country | Kind |
---|---|---|---
10-2005-0104612 | Nov 2005 | KR | national |
This application claims foreign priority benefits under 35 U.S.C. § 119(a)-(d) to PCT/KR2006/004454, filed Oct. 30, 2006, which is hereby incorporated by reference in its entirety.
Filing Document | Filing Date | Country | Kind | 371c Date |
---|---|---|---|---
PCT/KR2006/004454 | 10/30/2006 | WO | 00 | 5/1/2008 |