This application claims priority from Korean Patent Application No. 10-2012-0095404 filed on Aug. 30, 2012, the disclosure of which is hereby incorporated by reference in its entirety.
Exemplary embodiments relate to a method of processing an image. More particularly, exemplary embodiments relate to a method of processing a multi-view image using a plurality of video codec modules and apparatuses, e.g., a system-on-chip (SoC), and an image processing system including the same for executing the method.
Multi-view coding is a three-dimensional (3D) image processing technique by which images shot by two or more cameras are geometrically corrected and spatially mixed to provide users multiple points of view. It is also called 3D video coding.
In the related art, video source data or video streams having multiple points of view are processed using a single video codec module. After first data of a first view is processed, first data of a second view is processed. Only after the first data of all views, i.e., the first and second views, has been processed are second data of the first view and second data of the second view sequentially processed. In other words, when data of multiple views is processed using a single video codec module, the first data of the respective views are all processed sequentially, and only then are the second data of the respective views processed sequentially.
In the related art, when data of two views are processed using a single video codec module that can process 60 frames per second, 30 frames are processed for each view. Since the processing performance for each view decreases by half, problems may occur.
For example, in the related art, when an input source of 60 frames per second is input to an encoder for each view, the total amount of data to be processed is 120 frames per second. Accordingly, the input source cannot be processed by a module that processes 60 frames per second. To overcome this problem, the input source is downscaled to 30 frames per second for each view, so that the frame rate is decreased. Similarly, when an input data stream of 60 frames per second is input to a decoder for each view, the total amount of data to be processed is 120 frames per second. As with the encoder, the input data stream cannot be processed in real time by the module that processes 60 frames per second. However, unlike with the encoder, the amount of input data cannot be downscaled by half at the decoder. In this case, the decoder of the related art processes 60 frames per second and displays images two times slower than the original speed.
According to an aspect of the exemplary embodiments, there is provided a method of processing a multi-view image using an image processing apparatus including a first video codec module and a second video codec module. The method includes the first video codec module processing a first frame of a first image signal and sending sync information to a host, and the second video codec module processing a first frame of a second image signal with reference to the processed data of the first frame of the first image signal. Here, a time at which the second video codec module starts the processing of the first frame of the second image signal is determined based on the sync information, and the first image signal and the second image signal are processed in parallel by the first video codec module and the second video codec module.
The first image signal may be provided from a first image source and the second image signal may be provided from a second image source, which is different from the first image source.
Alternatively, the first and second image signals may be provided from a single image source.
The method may further include the first video codec module processing an i-th frame of the first image signal, with reference to the processed data of at least one previous frame of the i-th frame, and sending the sync information to the host; and the second video codec module processing an i-th frame of the second image signal, with reference to the processed data of the i-th frame of the first image signal according to control of the host, where “i” is an integer of at least 2.
The sync information may be frame sync information.
According to another aspect of the exemplary embodiments, there is provided a method of processing a multi-view image using an image processing apparatus including a first video codec module and a second video codec module. The method includes the first video codec module generating sync information every time data of a predetermined unit has been processed in a first frame of a first image signal; the second video codec module determining whether a reference block in the first frame of the first image signal has been processed according to the sync information; and the second video codec module processing a first frame of a second image signal with reference to the processed data of the reference block.
The predetermined unit may be a row, and the sync information may be row sync information.
The method may further include the first video codec module processing an i-th frame of the first image signal, with reference to the processed data of at least one previous frame of the i-th frame, and sending the sync information to the second video codec module every time data of each row in the i-th frame is processed; the second video codec module determining whether a reference block in the i-th frame of the first image signal has been processed according to the sync information; and the second video codec module processing an i-th frame of the second image signal, with reference to the processed data of the reference block in the i-th frame, where “i” is an integer of at least 2.
Alternatively, the predetermined unit may be a block, and the sync information may be stored in a bitmap memory.
At this time, the method may further include the first video codec module processing an i-th frame of the first image signal with reference to the processed data of at least one previous frame of the i-th frame, and setting a corresponding bit in the bitmap memory every time data of each block in the i-th frame is processed; the second video codec module reading a value from the bitmap memory and determining whether a reference block in the i-th frame of the first image signal has been processed according to the value read from the bitmap memory; the second video codec module processing an i-th frame of the second image signal, with reference to the processed data of the reference block; and combining the processed data of the i-th frame of the first image signal with the processed data of the i-th frame of the second image signal into a multi-view image, where “i” is an integer of at least 2.
According to another aspect of the exemplary embodiments, there is provided a multi-view image processing apparatus including a first video codec module which is configured to output first image processed data as a result of processing a first image signal provided from a first image source, and to generate sync information at each predetermined time and a second video codec module which is configured to output second image processed data as a result of processing a second image signal provided from a second image source, using part of the output first image processed data according to the sync information. The first image processed data and the second image processed data are combined into a multi-view image.
The first image signal and the second image signal may include a plurality of frames, and the sync information may be generated every time the first video codec module processes each of the frames of the first image signal.
Alternatively, the first image signal and the second image signal may include a plurality of frames. Each of the frames may include a plurality of rows. The first video codec module may include a first sync transceiver which is configured to generate the sync information every time data of a row in each frame of the first image signal is processed. The second video codec module may include a second sync transceiver which is configured to receive the sync information from the first video codec module.
As another alternative, each of the frames may include a plurality of blocks. The second sync transceiver of the second video codec module may determine whether a reference block of the first image signal, which is referred to when a block of the second image signal is processed, has been processed using the sync information.
Each of the first and second video codec modules may include at least one of an encoder which is configured to encode an input signal, and a decoder which is configured to decode the input signal.
The first video codec module may send the sync information to a host, and the second video codec module may receive the sync information from the host. Alternatively, the first video codec module may include a first sync transceiver which is configured to transmit the sync information to the second video codec module, and the second video codec module may include a second sync transceiver which is configured to receive the sync information from the first video codec module.
Alternatively, the first video codec module may store the sync information in memory, and the second video codec module may read the sync information from the memory.
The first video codec module and the second video codec module may be implemented together in a single hardware module.
The first and second video codec modules may have a same specification (e.g., a same hardware specification).
According to another aspect of the exemplary embodiments, there is provided a method of processing a multi-view image using an image processing apparatus including a first video codec module and a second video codec module. The method includes the first video codec module sequentially receiving and processing a plurality of frames of a first image signal; the second video codec module sequentially receiving and processing a plurality of frames of a second image signal; and combining processed data of each frame of the first image signal and processed data of a corresponding frame of the second image signal into a multi-view image. The second video codec module may process each frame of the second image signal using at least part of the processed data of a corresponding frame of the first image signal according to sync information generated by the first video codec module.
The sync information may be generated every time the first video codec module processes each of the frames of the first image signal.
The frames included in each of the first and second image signals may include a plurality of rows. The sync information may be generated every time the first video codec module processes data of a row in each of the frames of the first image signal.
The method may further include the second video codec module determining whether a reference block in a first frame of the first image signal has been processed according to the sync information.
The frames included in each of the first and second image signals may include a plurality of blocks, and the sync information may be generated every time the first video codec module processes data of a block in each of the frames of the first image signal.
The method may further include storing the sync information, which includes bitmap data indicating whether the data of the block in each frame of the first image signal has been processed, in memory; and the second video codec module reading the bitmap data from the memory.
The method may further include the second video codec module determining whether the data of the block in each frame of the first image signal has been processed according to the bitmap data.
According to another aspect of the exemplary embodiments, there is provided a method of processing a multi-view image using an image processing apparatus including a first video codec module and a second video codec module. The method includes the first video codec module processing data of each block in a first frame of a first image signal and setting a bit in a bitmap memory every time the data of a block is processed; the second video codec module reading the bit from the bitmap memory and determining whether a reference block in the first frame of the first image signal has been processed according to the bit read from the bitmap memory; and the second video codec module processing a first frame of a second image signal with reference to processed data of the reference block in the first frame of the first image signal.
According to another aspect of the exemplary embodiments, there is provided a codec module for processing a multi-view image signal. The codec module includes a first video codec module which is configured to process a first image signal in the multi-view image, and output first image processing data and sync information; and a second video codec module which is configured to process a second image signal in the multi-view image, and output second image processing data. The second video codec module may process the second image signal using part of the first image processing data according to the sync information output from the first video codec module. The first video codec module and the second video codec module may perform parallel processing of the multi-view image signal.
The above and other features and advantages of the exemplary embodiments will become more apparent from the following detailed description of exemplary embodiments, taken in conjunction with the attached drawings.
Exemplary embodiments will be described more fully hereinafter with reference to the accompanying drawings, in which embodiments are shown. Exemplary embodiments may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the exemplary embodiments to those skilled in the art. In the drawings, the size and relative sizes of layers and regions may be exaggerated for clarity. Like numbers refer to like elements throughout.
It will be understood that when an element is referred to as being “connected” or “coupled” to another element, it can be directly connected or coupled to the other element or intervening elements may be present. In contrast, when an element is referred to as being “directly connected” or “directly coupled” to another element, there are no intervening elements present. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items and may be abbreviated as “/”.
It will be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first signal could be termed a second signal, and, similarly, a second signal could be termed a first signal without departing from the teachings of the disclosure.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the exemplary embodiments. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” or “includes” and/or “including” when used in this specification, specify the presence of stated features, regions, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, regions, integers, steps, operations, elements, components, and/or groups thereof.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and/or the present application, and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
The image processing apparatus 10 may include a central processing unit (CPU) 110, a codec module 115, a display controller 140, a read-only memory (ROM) 150, an embedded memory 170, a memory controller 160, an interface module 180, and a bus 190. However, components of the image processing apparatus 10 may be added or subtracted according to different embodiments. In other words, the image processing apparatus 10 may not include some of the illustrated components.
The CPU 110 may process or execute programs and/or data stored in the memory 150, 170, or 20. The CPU 110 may be implemented by a multi-core processor. The multi-core processor is a single computing component with two or more independent actual processors (referred to as cores). Each of the processors may read and execute program instructions. The multi-core processor can drive a plurality of accelerators at a time. Therefore, a data processing system including the multi-core processor may perform multi-acceleration.
The codec module 115 is a module for processing a multi-view image signal. It may include a first video codec module 120 and a second video codec module 130.
The first video codec module 120 may encode or decode a first image signal in the multi-view image signal. The second video codec module 130 may encode or decode a second image signal in the multi-view image signal. Although only two video codec modules 120 and 130 are illustrated, the codec module 115 may include three or more video codec modules.
As described above, a plurality of video codec modules are provided to perform parallel processing of the multi-view image signal in the current embodiments. The structure and the operations of the first and second video codec modules 120 and 130 will be described later.
The ROM 150 may store permanent programs and/or data. The ROM 150 may be implemented by erasable programmable ROM (EPROM), or electrically erasable programmable ROM (EEPROM).
The embedded memory 170 is a memory embedded in the image processing apparatus 10 implemented as a SoC. The embedded memory 170 may store programs, data, or instructions. The embedded memory 170 may store image signals to be processed by the first and second video codec modules 120 and 130, i.e., data input to the first and second video codec modules 120 and 130. The embedded memory 170 may also store image signals that have been processed by the first and second video codec modules 120 and 130, i.e., data output from the first and second video codec modules 120 and 130. The embedded memory 170 may be implemented by volatile memory and/or non-volatile memory.
The memory controller 160 is used to interface with the external memory device 20. The memory controller 160 controls the overall operation of the external memory device 20, and controls the data communication between a master device and the external memory device 20. The master device may be a device such as the CPU 110 or the display controller 140.
The external memory device 20 is a storage device for storing data and may store an operating system (OS) and various kinds of programs and data. The external memory device 20 may be implemented by DRAM, but the exemplary embodiments are not restricted thereto. The external memory device 20 may be implemented by non-volatile memory, such as flash memory, phase-change RAM (PRAM), magnetoresistive RAM (MRAM), resistive RAM (ReRAM), or ferroelectric RAM (FeRAM).
The external memory device 20 may store image signals to be processed by the first and second video codec modules 120 and 130, i.e., data input to the first and second video codec modules 120 and 130. The external memory device 20 may also store image signals that have been processed by the first and second video codec modules 120 and 130, i.e., data output from the first and second video codec modules 120 and 130. The components of the image processing apparatus 10 may communicate with one another through a system bus 190.
The display device 30 may display multi-view image signals. The display device 30 may be a liquid crystal display (LCD) device, but the exemplary embodiments are not restricted thereto. In other embodiments, the display device 30 may be a light emitting diode (LED) display device, an organic LED (OLED) display device, or one of other types of display devices.
The display controller 140 controls the operations of the display device 30. The camera module 40 is a module that can convert an optical image into an electrical image. Although not shown in detail, the camera module 40 may include at least two cameras, e.g., first and second cameras. The first camera may generate a first image signal, corresponding to a first view in a multi-view image, and the second camera may generate a second image signal, corresponding to a second view in the multi-view image.
The first video codec module 120 processes the first image signal in the multi-view image and outputs first image processing data. The first video codec module 120 also outputs sync information Sync_f. The first image signal is an image signal of the first view, for example, an image signal shot by the first camera. When the first image signal is a signal to be encoded, the encoder 121 encodes the first image signal and outputs a result. When the first image signal is a signal to be decoded, the decoder 122 decodes the first image signal and outputs a result. The sync information Sync_f may be frame sync information generated every time processing (e.g., encoding or decoding) of a frame of the first image signal is completed.
The second video codec module 130 processes the second image signal in the multi-view image and outputs second image processing data. At this time, the second video codec module 130 may process the second image signal using part of the first image processing data according to the sync information Sync_f output from the first video codec module 120. The second image signal is an image signal of the second view, for example, an image signal shot by the second camera.
An image processing apparatus 10a is an example of the image processing apparatus 10 described above.
An image signal of a first view view-0 is referred to as a first image signal. The first image signal may be input to the first video codec module 120 at a predetermined rate. The rate may be expressed in frames per unit time, e.g., frames per second (fps). For instance, the first image signal may be input at a rate of 60 fps. An image signal of a second view view-1 is referred to as a second image signal. The second image signal may be input to the second video codec module 130 at the same rate (e.g., 60 fps) as the first image signal.
The first and second image signals may be signals that will be encoded or decoded. For instance, when each of the first and second image signals is generated by and input from a camera, the first and second image signals may be encoded by the encoders 121 and 131 of the respective first and second video codec modules 120 and 130, and stored in the memory 170 or 20.
The first video codec module 120 may sequentially receive and process a plurality of frames I11 through I19 of the first image signal, and may generate the sync information Sync_f every time processing of a frame is completed. When the first video codec module 120 processes a current frame, i.e., an i-th frame of the first image signal, it may refer to processed data of at least one of previous frames, e.g., (i-1)-th through (i-16)-th frames.
When the first image signal is a signal to be encoded, the encoder 121 of the first video codec module 120 sequentially encodes the first through ninth frames I11 through I19 of the first image signal, and outputs encoded data O11 through O19. The firmware 123 of the first video codec module 120 provides the host 110 with the sync information Sync_f every time each of the frames I11 through I19 is completely encoded.
When the first image signal is a signal to be decoded, the decoder 122 of the first video codec module 120 sequentially decodes the first through ninth frames I11 through I19 of the first image signal and outputs decoded data O11 through O19. The firmware 123 of the first video codec module 120 provides the host 110 with the sync information Sync_f every time each of the frames I11 through I19 is completely decoded.
The host 110 may control the operation of the second video codec module 130 according to the sync information Sync_f.
The second video codec module 130 sequentially receives and processes a plurality of frames I21 through I29 of the second image signal. When the second video codec module 130 processes each frame of the second image signal, it refers to the processed data of a corresponding frame of the first image signal. Accordingly, the second video codec module 130 waits for the corresponding frame of the first image signal to be completely processed.
When the sync information Sync_f is generated by the first video codec module 120 after a frame of the first image signal is completely processed by the first video codec module 120, the second video codec module 130 processes a corresponding frame of the second image signal with reference to processed data of the frame of the first image signal in response to the sync information Sync_f. For instance, the second video codec module 130 processes a first frame I21 of the second image signal with reference to the processed data O11 of the first frame I11 of the first image signal.
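As an illustration only (not part of the disclosed apparatus), the frame-level handshake described above can be sketched in Python as follows. The names process_frame, first_codec, and second_codec are hypothetical stand-ins for the codec operations, and a threading.Event per frame plays the role of the sync information Sync_f relayed by the host.

    import threading

    NUM_FRAMES = 9
    frame_done = [threading.Event() for _ in range(NUM_FRAMES)]  # one Sync_f per frame
    processed_view0 = [None] * NUM_FRAMES  # O11 .. O19
    processed_view1 = [None] * NUM_FRAMES  # O21 .. O29

    def process_frame(frame, reference=None):
        # Hypothetical stand-in for encoding or decoding one frame,
        # optionally referring to processed data of the other view.
        return ("processed", frame, reference)

    def first_codec(frames_view0):
        for i, frame in enumerate(frames_view0):
            processed_view0[i] = process_frame(frame)  # process i-th frame of view 0
            frame_done[i].set()                        # emit Sync_f for the i-th frame

    def second_codec(frames_view1):
        for i, frame in enumerate(frames_view1):
            frame_done[i].wait()                       # wait for Sync_f of the i-th frame
            # process the i-th frame of view 1 using the i-th frame of view 0
            processed_view1[i] = process_frame(frame, reference=processed_view0[i])

    t0 = threading.Thread(target=first_codec, args=(list(range(NUM_FRAMES)),))
    t1 = threading.Thread(target=second_codec, args=(list(range(NUM_FRAMES)),))
    t0.start(); t1.start(); t0.join(); t1.join()

In this sketch, while the first worker processes the (i+1)-th frame, the second worker processes the i-th frame, so the two views are processed in parallel with an initial delay of one frame.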
When the second image signal is encoded, the encoder 131 of the second video codec module 130 may sequentially encode first through ninth frames I21 through I29 of the second image signal with reference to the encoded data O11 through O19 of the respective frames I11 through I19 of the first image signal, and output encoded data O21 through O29. When the second image signal is decoded, the decoder 132 of the second video codec module 130 may sequentially decode the first through ninth frames I21 through I29 of the second image signal with reference to the decoded data O11 through O19 of the respective frames I11 through I19 of the first image signal, and output decoded data O21 through O29.
Accordingly, an initial delay occurs from a time when the first frames I11 and I21 of the respective first and second image signals are input to a time when the first frames I11 and I21 are completely processed.
When the second video codec module 130 processes a current frame, i.e., an i-th frame of the second image signal, the second video codec module 130 may refer to processed data of at least one of previous frames of the second image signal as well as the processed data of the first image signal. The maximum number of previous frames that can be referred to may be 16, but the number is not restricted.
The processed data O11 through O19 of the first image signal and the processed data O21 through O29 of the second image signal may be stored in the memory 170 or 20.
The first video codec module 210 processes a first image signal view-0 in a multi-view signal, and outputs first image processed data O11 through O16. The first video codec module 210 also outputs sync information Sync_r. The sync information Sync_r is output for each row of a frame. Accordingly, the sync information Sync_r may be called row sync information. The first video codec module 210 may also output the frame sync information Sync_f. The first image signal view-0 includes a plurality of frames I11 through I16 input sequentially. Each of the frames I11 through I16 includes a plurality of rows. The first video codec module 210 may output the sync information Sync_r to the second video codec module 220 every time the first video codec module 210 processes data of a row in a frame.
The second video codec module 220 processes a second image signal view-1 in the multi-view signal, and outputs second image processed data O21 through O26. At this time, the second video codec module 220 may process the second image signal view-1 using part of the first image processed data O11 through O16, according to the sync information Sync_r output from the first video codec module 210. The first video codec module 210 includes an encoder 211, a decoder 212, and a sync transceiver 214, and the second video codec module 220 includes an encoder 221, a decoder 222, and a sync transceiver 224.
The sync transceiver 214 generates the sync information Sync_r per row. Every time the encoder 211 of the first video codec module 210 encodes data of a row in a frame of the first image signal view-0, the encoder 211 notifies the sync transceiver 214 of the encoding, and the sync transceiver 214 outputs the sync information Sync_r in response. Likewise, every time the decoder 212 of the first video codec module 210 decodes data of a row in a frame of the first image signal view-0, the decoder 212 notifies the sync transceiver 214 of the decoding, and the sync transceiver 214 outputs the sync information Sync_r in response.
The sync transceiver 224 receives the sync information Sync_r from the first video codec module 210. Using the sync information Sync_r, the sync transceiver 224 determines whether a reference block of the first image signal view-0, which is referred to when a block of the second image signal view-1 is processed, has been completely processed. When it is determined that the reference block of the first image signal has been completely processed, the sync transceiver 224 may output a control signal for starting processing of a corresponding block of the second image signal to the encoder 221 or the decoder 222 of the second video codec module 220.
The sync transceivers 214 and 224 may perform both transmission and reception of the sync information Sync_r. For instance, the sync transceivers 214 and 224 may have both sync information transmission and reception functions, but may enable only one function (i.e., the sync information transmission or the sync information reception) when necessary.
A procedure for processing a multi-view image signal will now be described in detail.
An image signal of a first view is referred to as a first image signal view-0. The first image signal view-0 may be input to the first video codec module 210 at a predetermined rate. The rate may be expressed in fps. For instance, the first image signal view-0 may be input at a rate of 60 fps. An image signal of a second view is referred to as a second image signal view-1. The second image signal view-1 may be input to the second video codec module 220 at the same rate (e.g., 60 fps) as the first image signal view-0.
The first video codec module 210 may sequentially receive and process a plurality of frames I11 through I16 of the first image signal view-0, and may generate the sync information Sync_r every time processing of data of a row in a frame is completed.
A frame is a single picture and includes a plurality of pixels. For instance, there are 1000×1000 pixels in a frame shot by a 1-megapixel camera. Like typical image processing devices, an image processing apparatus according to some embodiments may process an image signal in units of macro blocks formed by grouping the pixels (e.g., 1000×1000 pixels) into N×M pixels. Here, N and M may be the same integer of at least 2. For instance, N×M may be 8×8, 16×16, or 32×32, but is not restricted thereto. In other words, each macro block includes N×M pixels.
When a frame including 1000×1000 pixels is divided into macro blocks including 8×8 pixels (i.e., 64 pixels), the frame is divided into 125×125 macro blocks. An image processing apparatus according to other embodiments may process an image signal in units of tiles, each formed by grouping I×J macro blocks, where I and J are each an integer of at least 1. When a frame including 125×125 macro blocks is divided into tiles including 5×25 (i.e., 125) macro blocks, the frame includes 25×5 tiles.
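The arithmetic of the example above can be checked with a short sketch; the variable names below are hypothetical.

    frame_w = frame_h = 1000              # pixels per frame (1-megapixel example)
    mb_w = mb_h = 8                       # N x M pixels per macro block
    mbs_across = frame_w // mb_w          # 125 macro blocks per row
    mbs_down = frame_h // mb_h            # 125 macro blocks per column

    tile_w, tile_h = 5, 25                # I x J macro blocks per tile
    tiles_across = mbs_across // tile_w   # 25 tiles per row
    tiles_down = mbs_down // tile_h       # 5 tiles per column

    assert (mbs_across, mbs_down) == (125, 125)
    assert (tiles_across, tiles_down) == (25, 5)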
The second video codec module 220 sequentially receives and processes a plurality of frames I21 through I26 of the second image signal view-1. When the second video codec module 220 processes each frame of the second image signal view-1, it refers to part of the processed data of a corresponding frame of the first image signal view-0. Accordingly, the second video codec module 220 waits for a macro block to be referred to in the first image signal view-0 to be completely processed.
When the sync information Sync_r is generated by the sync transceiver 214 of the first video codec module 210 after data of a row in a frame of the first image signal view-0 is completely processed by the first video codec module 210, the sync transceiver 224 of the second video codec module 220 determines whether data of a row including a macro block to be referred to has been completely processed.
For instance, it is assumed that to process a (1,1) macro block in each frame of the second image signal view-1, a (1,1) macro block in the frame of the first image signal view-0 and its neighboring blocks (e.g., (1,2), (1,3), (2,1), (2,2), (2,3), (3,1), (3,2) and (3,3) macro blocks) in the frame of the first image signal view-0 are referred to. In this case, to process the (1,1) macro block in the first frame of the second image signal view-1, it is necessary that data of the last row (e.g., the 30th row) crossing the (3,1), (3,2) and (3,3) macro blocks in the first frame of the first image signal view-0 is completely processed.
Accordingly, when the data of the 30th row in the first frame of the first image signal view-0 is completely processed, the second video codec module 220 starts processing the (1,1) macro block in the first frame of the second image signal view-1 in response to the sync information Sync_r output from the first video codec module 210.
It is also assumed that to process a (3,1) macro block in each frame of the second image signal view-1, a (3,1) macro block in the frame of the first image signal view-0 and its neighboring blocks (e.g., (1,1), (1,2), (1,3), (2,1), (2,2), (2,3), (3,2), (3,3), (4,1), (4,2), (4,3), (5,1), (5,2) and (5,3) macro blocks) in the frame of the first image signal view-0 are referred to. In this case, to process the (3,1) macro block in the first frame of the second image signal view-1, it is necessary that data of the last row (e.g., the 50th row) crossing the (5,1), (5,2) and (5,3) macro blocks in the first frame of the first image signal view-0 is completely processed.
Accordingly, when the data of the 50th row in the first frame of the first image signal view-0 is completely processed, the second video codec module 220 starts processing the (3,1) macro block in the first frame of the second image signal view-1 in response to the sync information Sync_r output from the first video codec module 210.
The sync transceiver 224 may determine up to which row data has been processed by counting the number of times the sync information Sync_r is received. A count value of the sync transceiver 224 may be reset to an initial value (e.g., 0) at each frame. A maximum search range to be referred to for the processing of the second image signal view-1 may be different depending on codec standards used in processing image signals.
The maximum search range is known to the second video codec module 220. The second video codec module 220 detects the macro blocks to be referred to for the processing of each macro block based on the maximum search range. Therefore, the second video codec module 220 processes each macro block of the second image signal view-1 with reference to the processed data of the reference macro blocks of the first image signal view-0.
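A minimal sketch of this row-based gating follows, assuming a hypothetical counter that is incremented once per received Sync_r and a search range expressed in macro-block rows; the helper names and the figure of 10 pixel rows per macro-block row (implied by the 30th-row and 50th-row examples above) are illustrative only.

    import threading

    ROWS_PER_MB_ROW = 10            # pixel rows per macro-block row (example above)
    rows_done = 0                   # number of Sync_r notifications in this frame
    cond = threading.Condition()

    def on_sync_r():
        # Called by the receiving sync transceiver for each row sync received;
        # the counter is reset to 0 at the start of each frame.
        global rows_done
        with cond:
            rows_done += 1
            cond.notify_all()

    def wait_for_reference(mb_row, search_range):
        # Last macro-block row that may be referenced, and the last pixel row in it.
        last_row_needed = (mb_row + search_range) * ROWS_PER_MB_ROW
        with cond:
            while rows_done < last_row_needed:
                cond.wait()

    # With a search range of 2 macro-block rows, the (1,1) macro block waits for
    # row 30 and the (3,1) macro block waits for row 50, as in the examples above.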
Accordingly, in these embodiments, the second video codec module 220 does not need to wait for an entire frame of the first image signal view-0 to be processed; it may start processing a frame of the second image signal view-1 as soon as the rows including the reference macro blocks have been processed.
The first image processed data O11 through O16 of the first image signal view-0 and the second image processed data O21 through O26 of the second image signal view-1 may be stored in the memory 170 or 20.
The first video codec module 310 includes an encoder 311, a decoder 312, firmware 313, and a sync controller 314. The second video codec module 320 includes an encoder 321, a decoder 322, firmware 323, and a sync controller 324.
The first and second video codec modules 310 and 320 are similar to the first and second video codec modules 210 and 220 described above, except that the sync controllers 314 and 324 exchange sync information through a bitmap memory 330 instead of through sync transceivers.
The bitmap memory 330 stores information (hereinafter, referred to as block processing information) indicating whether each macro block in a frame of the first image signal view-0 has been processed.
An image signal of a first view is referred to as the first image signal view-0. The first image signal view-0 may be input to the first video codec module 310 at a predetermined rate. The rate may be expressed in fps. For instance, the first image signal view-0 may be input at a rate of 60 fps. An image signal of a second view is referred to as the second image signal view-1. The second image signal view-1 may be input to the second video codec module 320 at the same rate (e.g., 60 fps) as the first image signal view-0.
The first video codec module 310 sequentially receives and processes a plurality of frames I11 through I16 of the first image signal view-0. The sync controller 314 of the first video codec module 310 sets a bit in the bitmap memory 330 every time a macro block in a frame of the first image signal view-0 is processed. For instance, when a (1,1) macro block in a first frame is processed in a state where each of the bits in the bitmap memory 330 has been reset to “0” (stage S1), the sync controller 314 sets the bit corresponding to the (1,1) macro block to “1” (stage S2).
The sync controller 324 of the second video codec module 320 reads bit values from the bitmap memory 330. For instance, the sync controller 324 of the second video codec module 320 may periodically read bit values from the bitmap memory 330.
The second video codec module 320 sequentially receives and processes a plurality of frames I21 through I26 of the second image signal view-1. When the second video codec module 320 processes each frame of the second image signal view-1, it refers to part of the processed data of a corresponding frame of the first image signal view-0. Accordingly, the second video codec module 320 waits for a macro block to be referred to, i.e., a reference macro block, in the first image signal view-0 to be completely processed. Whether the reference macro block has been processed is recognized by the sync controller 324 of the second video codec module 320, which reads data from the bitmap memory 330.
For instance, it is assumed that to process a (1,1) macro block in each frame of the second image signal view-1, a (1,1) macro block in the frame of the first image signal view-0 and its neighboring blocks (e.g., (1,2), (1,3), (2,1), (2,2), (2,3), (3,1), (3,2) and (3,3) macro blocks) in the frame of the first image signal view-0 are referred to. In this case, the sync controller 324 of the second video codec module 320 reads bit values from the bitmap memory 330 and determines whether the bits respectively corresponding to the (1,1) macro block and its neighboring blocks (e.g., (1,2), (1,3), (2,1), (2,2), (2,3), (3,1), (3,2) and (3,3) macro blocks) have been set to “1”.
When all bits corresponding to respective reference macro blocks have been set to “1”, the second video codec module 320 starts processing the (1,1) macro block in the first frame of the second image signal view-1 with reference to processed data of the reference macro blocks in the first frame of the first image signal view-0.
Also, it is assumed that to process a (3,1) macro block in each frame of the second image signal view-1, a (3,1) macro block in the frame of the first image signal view-0 and its neighboring blocks (e.g., (1,1), (1,2), (1,3), (2,1), (2,2), (2,3), (3,2), (3,3), (4,1), (4,2), (4,3), (5,1), (5,2) and (5,3) macro blocks) in the frame of the first image signal view-0 are referred to. In this case, the sync controller 324 of the second video codec module 320 reads bit values from the bitmap memory 330 and determines whether bits respectively corresponding to the reference macro block, i.e., (3,1) macro block and its neighboring blocks (e.g., (1,1), (1,2), (1,3), (2,1), (2,2), (2,3), (3,2), (3,3), (4,1), (4,2), (4,3), (5,1), (5,2) and (5,3) macro blocks) have been set to “1”.
When all bits corresponding to the respective reference macro blocks have been set to “1”, the second video codec module 320 starts processing the (3,1) macro block in the first frame of the second image signal view-1 with reference to processed data of the reference macro blocks in the first frame of the first image signal view-0.
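The bitmap bookkeeping can be modeled with the short sketch below. It is illustrative only: the bitmap is represented as a set of (row, column) coordinates, and reference_blocks is a hypothetical stand-in for the codec's search-range logic.

    class BitmapMemory:
        # Toy model of the bitmap memory 330: one bit per macro block.
        def __init__(self):
            self.done = set()                  # coordinates whose bit is "1"

        def set_bit(self, row, col):           # producer side (sync controller 314)
            self.done.add((row, col))

        def all_set(self, blocks):             # consumer side (sync controller 324)
            return all(b in self.done for b in blocks)

    def reference_blocks(row, col, search_range, max_row, max_col):
        # Hypothetical neighborhood: every macro block within the search range,
        # clamped to the frame boundary.
        return [(r, c)
                for r in range(max(1, row - search_range), min(max_row, row + search_range) + 1)
                for c in range(max(1, col - search_range), min(max_col, col + search_range) + 1)]

    bitmap = BitmapMemory()
    bitmap.set_bit(1, 1)                       # stage S2: (1,1) macro block processed
    refs = reference_blocks(1, 1, 2, max_row=125, max_col=125)
    ready = bitmap.all_set(refs)               # process (1,1) of view 1 only when True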
Processed data of each frame of the first image signal view-0 and processed data of each frame of the second image signal view-1 may be stored in the memory 170 or 20.
The first video codec module 120 processes an i-th frame (e.g., a first frame) of a first image signal in operation S110, where “i” is an integer of at least 1. When processing of the i-th frame is completed, the first video codec module 120 may send the sync information Sync_f to the second video codec module 130 in operation S112. The first video codec module 120 may also store processed data of the i-th frame in memory in operation S114. Then, the first video codec module 120 starts processing a subsequent frame in operations S116 and S110. In detail, “i” is increased by 1 in operation S116 and the i-th frame (e.g., a second frame) is processed in operation S110.
The second video codec module 130 processes an i-th frame (e.g., a first frame) of a second image signal in response to the sync information Sync_f in operation S120. The sync information Sync_f generated by the first video codec module 120 may be directly sent to the second video codec module 130, or it may be sent to the host 110 so that the host 110 controls the second video codec module 130 according to the sync information Sync_f.
Therefore, the processing of the second frame by the first video codec module 120 and processing of the first frame by the second video codec module 130 are performed at the same time.
In this manner, the first video codec module 120 processes the first image signal up to the last frame and outputs the sync information Sync_f every time each of the frames of the first image signal is processed, and the second video codec module 130 processes each frame of the second image signal according to the sync information Sync_f. The second video codec module 130 may also store the processed data of the i-th frame in memory in operation S122.
The processed data of each frame of the first image signal and the processed data of each frame of the second image signal, which are stored in memory, may be transmitted through a network, or read and combined with each other by the host 110 or the display controller 140.
The first video codec module 310 processes a j-th macro block in an i-th frame (e.g., a first frame) of a first image signal in operation S210, where “j” is an integer of at least 1.
When processing of the j-th macro block in the i-th frame is completed, the first video codec module 310 sets a bit corresponding to the j-th macro block in the bitmap memory 330 in operation S212. For instance, when processing of the first macro block in the first frame is completed, a corresponding bit in the bitmap memory 330 may be set to “1”, as in stage S2 described above.
The first video codec module 310 repeats increasing “j” by 1 and processing a subsequent macro block in operations S214 and S216 until the i-th frame (e.g. the first frame) of the first image signal is completely processed. In other words, operations S210, S212, S214, and S216 are repeated until the i-th frame (e.g., the first frame) of the first image signal is completely processed. The first video codec module 310 may store processed data of the i-th frame in memory in operation S218.
The second video codec module 320 reads values from the bitmap memory 330 periodically or non-periodically in operation S220. The second video codec module 320 determines whether a reference macro block for a k-th macro block in the i-th frame of a second image signal has been completely processed using the values read from the bitmap memory 330 in operation S222, where “k” is an integer of at least 1. Before operation S220, “k” may be initialized to “1”. Here, the reference macro block is a macro block of the first image signal that the second video codec module 320 needs to refer to in order to process the k-th macro block of the i-th frame of the second image signal.
When it is determined that the reference macro block in the i-th frame of the first image signal has been processed in operation S222, the second video codec module 320 processes the k-th macro block in the i-th frame of the second image signal using processed data of the reference macro block of the first image signal in operation S224. However, when it is determined that the reference macro block in the i-th frame of the first image signal has not been processed in operation S222, the method goes back to operation S220 of reading values from the bitmap memory 330, and the second video codec module 320 waits for the reference macro block for the k-th macro block in the i-th frame to be completely processed.
The second video codec module 320 repeats increasing “k” by 1 and processing a subsequent macro block in operations S226 and S228 until the i-th frame (e.g. the first frame) of the second image signal is completely processed. In other words, operations S220, S222, S224, S226, and S228 are repeated until the i-th frame (e.g., the first frame) of the second image signal is completely processed. The second video codec module 320 may store processed data of the i-th frame in memory in operation S230.
In this manner, the first video codec module 310 processes the first image signal up to the last frame, and sets a corresponding bit in the bitmap memory 330 every time a macro block in each frame is processed. The second video codec module 320 periodically or non-periodically reads values from the bitmap memory 330, and processes the second image signal up to the last frame with reference to the processed data of reference macro blocks of the first image signal.
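Read end to end, operations S210 through S230 can be sketched as the producer and consumer loops below. This is an illustrative reading only: process_macro_block is a hypothetical stand-in, BitmapMemory and reference_blocks are as in the earlier sketch, and the two loops are intended to run on separate threads (or hardware modules) so that the consumer's polling eventually succeeds.

    def process_macro_block(block, refs=None):
        # Hypothetical stand-in for encoding or decoding one macro block.
        return ("processed", block, refs)

    def first_module(frame_blocks, bitmap):
        # Operations S210-S216: process each macro block, then set its bit.
        for coords, block in frame_blocks:
            process_macro_block(block)                   # S210
            bitmap.set_bit(*coords)                      # S212
        # S218: store the processed data of the frame in memory (omitted here)

    def second_module(frame_blocks, bitmap, search_range, max_row, max_col):
        # Operations S220-S228: poll the bitmap, then process each macro block.
        for coords, block in frame_blocks:
            refs = reference_blocks(*coords, search_range, max_row, max_col)
            while not bitmap.all_set(refs):              # S220-S222: references ready?
                pass                                     # poll the bitmap again
            process_macro_block(block, refs)             # S224
        # S230: store the processed data of the frame in memory (omitted here)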
The host 110 combines the processed data of each frame of the first image signal with the processed data of a corresponding frame of the second image signal to display a multi-view image.
The structure of first and second video codec modules 120d and 130d is similar to the structure of the first and second video codec modules 120 and 130 described above; thus, only differences will be described.
While each of the first and second video codec modules 120 and 130 includes both an encoder and a decoder, each of the first and second video codec modules 120d and 130d includes only an encoder.
The structure of first and second video codec modules 120e and 130e is also similar to the structure of the first and second video codec modules 120 and 130 described above; thus, only differences will be described.
While each of the first and second video codec modules 120 and 130 includes both an encoder and a decoder, each of the first and second video codec modules 120e and 130e includes only a decoder.
The structure of first and second video codec modules 210f and 220f is similar to the structure of the first and second video codec modules 210 and 220 described above, except that each of the video codec modules 210f and 220f includes only one of an encoder and a decoder.
The structure of first and second video codec modules 310h and 320h is similar to the structure of the first and second video codec modules 310 and 320 described above, except that each of the video codec modules 310h and 320h includes only one of an encoder and a decoder.
As described above, according to some embodiments, the codec module 115 may include at least two video codec modules with encoding and decoding functions to process a multi-view image, or include at least two video codec modules with only the encoding function or only the decoding function.
The structure of first and second video codec modules 120′ and 130′ is similar to the structure of the first and second video codec modules 120 and 130 described above; thus, only differences will be described.
Firmware 123′ of the first video codec module 120′ may transmit a sync signal Sync to the host 110, while firmware 133′ of the second video codec module 130′ receives the sync signal Sync from the host 110. At this time, the sync signal Sync may be the row sync information Sync_r described above.
For instance, the firmware 123′ may send the sync signal Sync to the host 110 every time the encoder 121 or the decoder 122 of the first video codec module 120′ processes data of a row or a macro block in each frame of the first image signal view-0.
The structure of first and second video codec modules 210′ and 220′ is similar to the structure of the first and second video codec modules 210 and 220 described above.
The structure of first and second video codec modules 310′ and 320′ is similar to the structure of the first and second video codec modules 310 and 320 described above.
In other embodiments, an encoder or a decoder may be omitted from each of the first and second video codec modules with the structures described above.
In the exemplary embodiments, first and second video codec modules may have the same specification (e.g., the same hardware specification).
The image processing system 400 includes the processor 100, a power source 410, a storage device 420, a memory 430, I/O ports 440, an expansion card 450, a network device 460, and a display 470. The image processing system 400 may further include a camera module 480.
The processor 100 corresponds to the image processing apparatus according to some embodiments.
The processor 100 may control the operation of at least one of the elements 410 through 480. The power source 410 may supply an operating voltage to at least one of the elements 100, and 420 through 480. The storage device 420 may be implemented by a hard disk drive (HDD) or a solid state drive (SSD).
The memory 430 may be implemented by a volatile or non-volatile memory. The memory 430 may correspond to the memory device 330 described above.
The I/O ports 440 are ports that receive data transmitted to the image processing system 400, or transmit data from the image processing system 400 to an external device. For instance, the I/O ports 440 may include a port connecting with a pointing device, such as a computer mouse, a port connecting with a printer, and a port connecting with a USB drive.
The expansion card 450 may be implemented as a secure digital (SD) card or a multimedia card (MMC). The expansion card 450 may be a subscriber identity module (SIM) card or a universal SIM (USIM) card.
The network device 460 enables the image processing system 400 to be connected with a wired or wireless network. The display 470 displays data output from the storage device 420, the memory 430, the I/O ports 440, the expansion card 450, or the network device 460.
The camera module 480 converts optical images into electrical images. Accordingly, the electrical images output from the camera module 480 may be stored in the storage device 420, the memory 430, or the expansion card 450. Also, the electrical images output from the camera module 480 may be displayed through the display 470.
The exemplary embodiments can also be embodied as computer-readable codes on a computer-readable medium. The computer-readable recording medium is any data storage device that can store data as a program, which can thereafter be read by a computer system. Examples of the computer-readable recording medium include read-only memory (ROM), random-access memory (RAM), CD-ROMs, magnetic tapes, floppy disks, and optical data storage devices.
The computer-readable recording medium can also be distributed over network coupled computer systems so that the computer-readable code is stored and executed in a distributed fashion. Also, functional programs, codes, and code segments to accomplish the exemplary embodiments can be easily construed by programmers.
As described above, according to some embodiments, parallel processing can be applied to multi-view image data by using multiple cores, i.e., at least two video codec modules, without performance deterioration.
While exemplary embodiments have been particularly shown and described, it will be understood by those of ordinary skill in the art that various changes in forms and details may be made therein without departing from the spirit and scope of the exemplary embodiments as defined by the following claims.