The embodiment discussed herein is directed to an encoding device for realizing image encoding and a decoding device therefor.
BACKGROUND
High definition televisions (HDTVs) that can display high definition images are now in widespread use, and high-efficiency transmission of signals of a high sampling rate (signals of high definition images for displaying on the HDTV or the like) has been demanded. For this reason, Japanese Laid-open Patent Publication No. 08-46961 discloses a method of efficiently transmitting signals of a high sampling rate by dividing an image into several subimages and performing parallel processing thereon.
Here, an image segmentation encoding/decoding system disclosed in Japanese Laid-open Patent Publication No. 08-46961 (hereinafter, “encoding/decoding system”) is explained.
The image input device 11 is a device that takes an image and outputs the taken image as an input image to the image encoding device 12. The image encoding device 12 divides the input image into several subimages, encodes these subimages, and outputs them to the image decoding device 13.
The image decoding device 13 is a device that decodes the encoded subimages and generates an output image by combining the decoded subimages. The image displaying device 14 is a device that displays the output image generated by the image decoding device on a display or the like.
The image segmentation control unit 12a is a processing unit that divides the input image into several subimages and outputs the divided subimages to the encoding processing units 12b and 12c. The encoding processing units 12b and 12c are processing units that perform an encoding process (such as DPCM coding) onto the input subimages and output the encoded data, which is the encoded subimages, to the multiplex control unit 12d.
The DPCM coding executed by the encoding processing units 12b and 12c is now explained.
In a natural image or the like, a strong correlation is often established between adjacent pixels. Thus, in the DPCM coding, adjacent pixels are incorporated as predictive reference pixels, and a predictive pixel is calculated from an average value or the like of the predictive reference pixels. A difference between the predictive pixel and the encoding-target pixel is obtained, and the difference value is encoded. Because the difference value is typically small, encoding it requires less information than encoding the encoding-target pixel itself.
In the DPCM coding executed by the encoding processing units 12b and 12c, a quantizing process is performed onto the difference value, and a code is assigned to the quantized value that is determined.
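By way of illustration, the DPCM coding described above may be sketched as follows. This is a minimal sketch, not the implementation of the encoding processing units 12b and 12c: the previously reconstructed pixel serves as the predictive pixel, and the initial predictor value of 128 and the quantizing step of 4 are illustrative assumptions.

```python
def dpcm_encode(line, step=4):
    """Encode one line of 8-bit pixels with simple DPCM.

    Each pixel is predicted from the previously reconstructed pixel,
    the prediction difference is quantized, and the quantized value
    is what would subsequently be entropy-coded and transmitted.
    """
    codes = []
    prediction = 128  # assumed initial predictor value
    for pixel in line:
        diff = pixel - prediction
        q = round(diff / step)            # quantize the difference value
        codes.append(q)
        prediction += q * step            # track the decoder's reconstruction
        prediction = max(0, min(255, prediction))
    return codes

def dpcm_decode(codes, step=4):
    """Invert dpcm_encode using the same predictor and quantizing step."""
    line = []
    prediction = 128
    for q in codes:
        prediction += q * step
        prediction = max(0, min(255, prediction))
        line.append(prediction)
    return line
```

Because the encoder predicts from the reconstructed value rather than the original, the quantizing error does not accumulate: each decoded pixel differs from the original by at most half a quantizing step.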
As illustrated in
If the quantization is divided into smaller steps, quantizing errors can be reduced and image degradation can be suppressed, but the amount of information is increased because the number of bits of codes that are assigned to large difference values is increased.
In
The separation control unit 13a is a processing unit that receives the multiplexed encoded data through the transmission channel 15 and separates the multiplexed encoded data that is received into items of encoded data corresponding to the subimages. The separation control unit 13a outputs the separated encoded data items to the decoding processing units 13b and 13c.
The decoding processing units 13b and 13c are processing units that perform decoding processes onto the encoded data that is input, and output the decoded subimages to the image connection control unit 13d. The image connection control unit 13d is a processing unit that acquires the subimages from the decoding processing units 13b and 13c, generates the original image by combining the acquired subimages, and displays the generated image as an output image on a display or the like.
As discussed above, in the conventional encoding/decoding system, an input image is divided and processed in parallel by multiple cores (the encoding processing units 12b and 12c), so that an image of a size that cannot be processed by a single encoding core can be divided and encoded, and highly efficient transmission of signals of a high sampling rate can thereby be realized.
Japanese Laid-open Patent Publication No. 09-275559 discloses a technology of obtaining original image data consisting of N×N pixel blocks, quantizing the original image data, and performing scanning by reversing the scanning direction for each line in a block so that distortion of the block can be reduced. Japanese Laid-open Patent Publication No. 05-63988 discloses a technology of suppressing ringing effects in the edge periphery that are typically caused in the transform coding technology by use of an efficient DPCM method that eliminates the correlation of pixel values of a block.
However, according to the above conventional technologies, when the encoded subimages are decoded and combined, there is a difference in the image quality between the boundaries of the combined subimages, making the joint noticeable.
In other words, as indicated in
Moreover, when controlling the information amount for each subimage, the data amount is reduced on the bottom side of a subimage to keep the encoded data within a certain amount, because the subimage is encoded sequentially from the top left corner. As a result, the image quality of the lower side of the decoded subimage may be degraded relative to the upper side (see
In other words, as illustrated in
More specifically, even when the encoded data is kept within a certain amount, it is important to reduce the difference in image quality at the joint of adjacent subimages and to improve the quality of the image obtained after connecting the subimages.
The present invention has been conceived in light of the above, and its object is to provide an encoding device and a decoding device that can reduce the difference in image quality at the joint of adjacent subimages and improve the quality of an image obtained after connecting subimages even when the encoded data is kept within a certain amount.
According to an aspect of an embodiment of the invention, an encoding/decoding system includes an encoding device that encodes an image; and a decoding device that decodes the image. The encoding device includes an image dividing unit that divides an encoding-target image into a plurality of subimages; an encoding executing unit that acquires the subimages divided by the image dividing unit and executes the encoding on the subimages in a direction moving away from a boundary of the subimages that are acquired; and an output unit that multiplexes and outputs the subimages encoded by the encoding executing unit. The decoding device includes a separating unit that separates the subimages that are multiplexed when acquiring the subimages that are multiplexed; and a decoding executing unit that acquires the subimages that are separated by the separating unit and executes the decoding on the subimages in a direction moving away from the boundary of the subimages that are acquired.
The object and advantages of the embodiment will be realized and attained by means of the elements and combinations particularly pointed out in the claims.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are not restrictive of the embodiment, as claimed.
Preferred embodiments of the present invention will be explained with reference to accompanying drawings. The invention is not limited to the embodiments.
The overview and features of an encoding device (encoding/decoding system) according to an embodiment are explained first.
In an example illustrated in
More specifically, the subimage 20 is subjected to the encoding from the boundary to the left. When the left end is reached, the encoding is shifted to one line below the current line and performed again sequentially from the boundary to the left. Moreover, the subimage 21 is subjected to the encoding from the boundary to the right. When the right end is reached, the encoding is shifted to one line below the current line and performed again sequentially from the boundary to the right.
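The scanning order described above, in which each line of a subimage is encoded starting at the shared boundary and moving away from it, can be sketched as follows. The function name and the 'left'/'right' parameter are hypothetical, introduced only for illustration:

```python
def scan_order(width, height, boundary_side):
    """Yield (row, col) pairs in the order a subimage is encoded,
    each line starting at the shared boundary and moving away from it.

    boundary_side: 'right' for a subimage whose shared boundary is its
    right edge (scanned right-to-left, as for the subimage 20),
    'left' for the opposite case (as for the subimage 21).
    """
    for row in range(height):
        if boundary_side == 'right':
            cols = range(width - 1, -1, -1)  # boundary column first
        else:
            cols = range(width)
        for col in cols:
            yield (row, col)
```

With this ordering, the pixels nearest the boundary are always encoded first on every line, so the gradual quality degradation inherent in DPCM accumulates away from the joint rather than toward it.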
In an example illustrated in
More specifically, the subimage 22 is subjected to the encoding horizontally from the left side of the bottom end (boundary) to the right side. When the right end is reached, the encoding is shifted to one line above the current line and performed again from the left end to the right end. Moreover, the subimage 23 is subjected to the encoding horizontally from the left side of the top end (boundary) to the right side. When the right end is reached, the encoding is shifted to one line below the current line and performed again from the left end to the right end.
In an example illustrated in
More specifically, the encoding device performs the encoding onto the subimage 24 from the boundary to the left. When the left end is reached, the encoding device shifts to one line below the current line and performs the encoding again from the boundary to the left. Moreover, the encoding device performs the encoding onto the subimage 25 from the left and right boundaries toward the center, and when the center is reached, the encoding device shifts to one line below the current line and performs the encoding again from the left and right boundaries toward the center.
The encoding device performs the encoding onto the subimage 26 from the boundary to the right. When the right end is reached, the encoding device shifts to one line below the current line and performs the encoding again from the boundary to the right.
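The encoding of the subimage 25 from both the left and right boundaries toward the center can be sketched as a column-visiting order for one line. Alternating between the two sides is one possible serialization of the two directions; the helper below is hypothetical and not part of the embodiment:

```python
def center_scan(width):
    """Column order for a line encoded from the left and right
    boundaries toward the center, alternating between the sides."""
    order, lo, hi = [], 0, width - 1
    while lo <= hi:
        order.append(lo)        # next column from the left boundary
        if lo != hi:
            order.append(hi)    # next column from the right boundary
        lo += 1
        hi -= 1
    return order
```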
In an example illustrated in
More specifically, the encoding device performs the encoding onto the subimage 27 horizontally from the left side of the bottom end (boundary) to the right side. When the right end is reached, the encoding device shifts to one line above the current line and performs the encoding again from the left end to the right end.
Moreover, the encoding device performs the encoding onto the subimage 28 horizontally from the left side of the top end (boundary) to the right side. When the right end is reached, the encoding device shifts to one line below the current line and performs the encoding again from the left end to the right end. Furthermore, the encoding device performs the encoding from the left side of the bottom end (boundary) to the right side. When the right end is reached, the encoding device shifts to one line above the current line and performs the encoding again from the left end to the right end. The encoding of the subimage 28 in the vertical direction starts from the boundaries of the two sides (top end and bottom end) and is performed toward the center of the subimage 28.
The encoding device performs the encoding onto the subimage 29 from the left side of the top end (boundary). When the right end is reached, the encoding device shifts to one line below the current line and performs the encoding from the left end to the right end.
In an example of
When performing the encoding to the subimage 31 having three boundaries, the encoding device executes the encoding from the left end (boundary) of the bottom end (boundary) toward the center, and when the center is reached, the encoding device shifts to one line above the current line and performs the encoding again from the left boundary toward the center. In addition, the encoding device executes the encoding from the right end (boundary) of the bottom end (boundary) toward the center, and when the center is reached, the encoding device shifts to one line above the current line and performs the encoding again from the right boundary toward the center.
When performing the encoding to the subimage 32 having two boundaries, the encoding device executes the encoding horizontally from the left end (boundary) of the bottom end (boundary) to the right side. When the right end is reached, the encoding device shifts to one line above the current line and performs the encoding again from the left end to the right end.
When performing the encoding to the subimage 33 having three boundaries, the encoding device executes the encoding horizontally from the right end (boundary) of the top end (boundary) to the left side. When the left end is reached, the encoding device shifts to one line below the current line and performs the encoding again from the right end to the left end. In addition, the encoding device executes the encoding horizontally from the right end (boundary) of the bottom end (boundary) to the left side. When the left end is reached, the encoding device shifts to one line above the current line and performs the encoding again from the right end to the left end.
When performing the encoding to the subimage 34 having four boundaries, the encoding device executes the encoding from the left end (boundary) and the right end (boundary) of the top end (boundary) toward the center, and when the center is reached, the encoding device shifts to one line below the current line and performs the encoding again from the left boundary and the right boundary toward the center. In addition, the encoding device executes the encoding from the left end (boundary) and the right end (boundary) of the bottom end (boundary) toward the center, and when the center is reached, the encoding device shifts to one line above the current line and performs the encoding again from the left boundary and the right boundary toward the center.
When performing the encoding onto the subimage 35 having three boundaries, the encoding device executes the encoding horizontally from the left end (boundary) of the top end (boundary) to the right side. When the right end is reached, the encoding device shifts to one line below the current line and performs the encoding again from the left end to the right end. In addition, the encoding device executes the encoding horizontally from the left end (boundary) of the bottom end (boundary) to the right side. When the right end is reached, the encoding device shifts to one line above the current line and performs the encoding again from the left end to the right end.
When performing the encoding onto the subimage 36 having two boundaries, the encoding device executes the encoding horizontally from the right end (boundary) of the top end (boundary) to the left side. When the left end is reached, the encoding device shifts to one line below the current line and performs the encoding again from the right end to the left end.
When performing the encoding onto the subimage 37 having three boundaries, the encoding device executes the encoding from the left end (boundary) of the top end (boundary) toward the center, and when the center is reached, the encoding device shifts to one line below the current line and performs the encoding again from the left boundary toward the center. In addition, the encoding device executes the encoding from the right end (boundary) of the top end (boundary) toward the center, and when the center is reached, the encoding device shifts to one line below the current line and performs the encoding again from the right boundary toward the center.
When performing the encoding onto the subimage 38 having two boundaries, the encoding device executes the encoding horizontally from the left end (boundary) of the top end (boundary) to the right side. When the right end is reached, the encoding device shifts to one line below the current line and performs the encoding again from the left end to the right end.
In addition to the processes explained with reference to
In this manner, the encoding device according to the present embodiment divides the encoding-target image into several subimages and encodes the subimages in a direction moving away from the boundaries of the divided subimages, as explained with reference to
Furthermore, the encoding device according to the present embodiment does not encode pixels in the vicinity of the boundaries of the subimages, and changes the quantizing steps in accordance with the distance from the boundaries, as explained with reference to
Next, the structure of an encoding/decoding system according to the present embodiment is explained.
In
The image encoding device 100 includes an image segmentation control unit 110, encoding direction control units 120 and 130, encoding processing units 140 and 150, and a multiplex control unit 160. The image encoding device 100 includes encoding direction control units other than the encoding direction control units 120 and 130, and encoding processing units other than encoding processing units 140 and 150, although those are not illustrated in the drawing.
The image segmentation control unit 110 is a processing unit that divides the input image into several subimages and outputs the divided subimages to the encoding direction control units 120 and 130. Furthermore, the image segmentation control unit 110 outputs image segmentation information to the encoding direction control units 120 and 130 and the multiplex control unit 160.
Here, the image segmentation information is information related to a method of dividing an input image and the like. For example, when dividing an input image as illustrated in
The encoding direction control unit 120 acquires the subimage and the image segmentation information, determines the encoding direction based on the image segmentation information (see
In addition, the encoding direction control unit 120 includes a memory that stores the subimage data so that the data can be supplied in an encoding direction that differs from the order in which the subimage data is input from the image segmentation control unit 110. The explanation for the encoding direction control unit 130 is the same as that for the encoding direction control unit 120, and thus the explanation is omitted (the subimage data is output to the encoding processing unit 150 after the encoding direction is determined).
The encoding processing unit 140 is a processing unit that executes an encoding process onto the subimage input from the encoding direction control unit 120 and outputs the encoded data obtained by encoding the subimage to the multiplex control unit 160. Furthermore, the encoding processing unit 140 does not perform the encoding process onto pixels in the vicinity of the boundary of the subimage data (within a predetermined number of pixels from the boundary) but outputs the data of these pixels to the multiplex control unit 160 as PCM signals.
Moreover, the encoding processing unit 140 changes the quantizing steps in accordance with the distance from the boundary when executing the encoding. The encoding processing unit 140 stores therein distances from the boundary in association with quantizing steps corresponding to the distances. The explanation for the encoding processing unit 150 is the same as that for the encoding processing unit 140, and thus the explanation is omitted.
The multiplex control unit 160 is a device that acquires the encoded data from the encoding processing units 140 and 150 (or the PCM signals that are not encoded), generates multiplex data by multiplexing the acquired encoded data, and outputs the multiplex data to the image decoding device 200.
The multiplex control unit 160 creates a header including the image segmentation information and positional information of each subimage (coordinates of pixels included in the subimage), adds the created header to the multiplex data, and outputs it to the transmission channel 15. The multiplex control unit 160 includes a memory so that it can accommodate the varying timings at which the encoded data is input.
The image decoding device 200 includes a separation control unit 210, decoding processing units 220 and 230, and an image connection control unit 240. The image decoding device 200 includes decoding processing units other than the decoding processing units 220 and 230, although they are not illustrated in the drawing.
The separation control unit 210 is a processing unit that separates the multiplex data received through the transmission channel 15 into items of encoded data corresponding to the multiple subimages. Moreover, the separation control unit 210 extracts the positional information and the image segmentation information included in the header of the multiplex data, and outputs the positional information to the decoding processing unit 220 and the image segmentation information to the image connection control unit 240.
The decoding processing unit 220 is a processing unit that decodes encoded data in accordance with the positional information, selects the decoded data or the PCM data, and outputs it to the image connection control unit 240. In other words, the decoding processing unit 220 selects the PCM data for pixels within a predetermined number of pixels from the boundary of the subimage, and selects the decoded data for pixels that are the predetermined number of pixels or more away from the boundary. The explanation for the decoding processing unit 230 is the same as that for the decoding processing unit 220, and thus the explanation is omitted.
Furthermore, when decoding the encoded data, the decoding processing unit 220 executes the decoding on the encoded data in a direction moving away from the boundary of the subimage.
The image connection control unit 240 is a processing unit that, when receiving the subimage data from the decoding processing units 220 and 230, stores the subimage data in the frame memory and connects it in accordance with the image segmentation information to generate an output image (image data before the segmentation). The image connection control unit 240 outputs the generated output image to the image displaying device (not illustrated).
Next, the structures of the encoding direction control unit 120 and the encoding processing unit 140 illustrated in
The direction control unit 121 is a processing unit that acquires the subimage data (data of the encoding-target subimage) and the image segmentation information, determines the encoding direction based on the image segmentation information, and outputs the subimage data in the encoding direction sequentially to the encoding processing unit 140. (In other words, the direction control unit 121 determines the position of the boundary of the subimage data based on the image segmentation information and outputs the subimage data in order of the direction moving away from the boundary (see
Otherwise, the direction control unit 121 temporarily stores the subimage data in the frame memory, and then reads out the subimage data of the encoding start position and outputs it to the encoding processing unit 140. The direction control unit 121 also outputs the positional information of pixels included in the subimage data to the quantizing unit 141, the inverse quantizing unit 142, and the selecting unit 146. The frame memory 122 is a storage unit that stores therein the subimage data.
The quantizing unit 141 is a processing unit that changes the quantizing steps and quantizes uncompressed data, based on the positional information. Here, the uncompressed data is data obtained from a difference between the subimage data output by the direction control unit 121 and prediction data output by the predicting unit 144.
The quantizing unit 141 stores therein a quantizing step table in which distances from the boundary and quantizing steps are associated with each other, determines a quantizing step for the uncompressed data by comparing the quantizing step table and the positional information, and quantizes the uncompressed data in accordance with the quantizing step obtained as a result of the determination. The quantizing unit 141 outputs the quantized uncompressed data to the inverse quantizing unit 142 and the encoding unit 145.
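A quantizing step table of the kind the quantizing unit 141 holds might look like the following sketch. The distance thresholds and step values are illustrative assumptions, since the embodiment does not specify concrete values; the only property carried over from the description is that the step grows with the distance from the boundary:

```python
# Hypothetical quantizing step table: (distance limit, quantizing step).
# Small steps near the boundary, larger steps further away.
QUANT_TABLE = [(2, 1), (8, 2), (16, 4)]
DEFAULT_STEP = 8  # assumed step beyond the last table entry

def step_for_distance(distance):
    """Return the quantizing step for a pixel at the given distance
    from the subimage boundary."""
    for limit, step in QUANT_TABLE:
        if distance < limit:
            return step
    return DEFAULT_STEP

def quantize(diff, distance):
    """Quantize a DPCM difference value using the distance-dependent step."""
    return round(diff / step_for_distance(distance))
```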
The inverse quantizing unit 142 is a processing unit that changes the quantizing steps based on the positional information and executes inverse quantization on the quantized uncompressed data. The inverse quantizing unit 142 holds the above quantizing table, determines a quantizing step by comparing the quantizing table and the positional information, and executes the inverse quantization in accordance with the quantizing step obtained as a result of the determination.
The inverse quantizing unit 142 outputs the data that has been subjected to the inverse quantization to the line memory 143. The line memory 143 is a storage unit that stores therein data obtained by adding the data output by the inverse quantizing unit 142 and the prediction data output by the predicting unit 144. The data stored in the line memory 143 corresponds to data of pixels adjacent to the encoding-target pixels (adjacent pixel data).
The predicting unit 144 is a processing unit that reads the data stored in the line memory 143 (adjacent pixel data) and outputs the read-out data as prediction data. The encoding unit 145 is a processing unit that acquires the quantized uncompressed data that is output by the quantizing unit 141 and sequentially encodes the acquired data. The encoding unit 145 outputs the data that is encoded (encoded data) to the selecting unit 146.
The selecting unit 146 acquires the positional information, the subimage data that is not encoded (PCM signal), and the encoded data, selects either one of the PCM signal and the encoded data in accordance with the positional information, and outputs the selected data as image data to the multiplex control unit 160.
Based on the image segmentation information (the selecting unit 146 also acquires the image segmentation information) and the positional information, the selecting unit 146 selects the PCM signal when the pixel is positioned within a predetermined distance from the boundary (i.e., in the vicinity of the boundary), and selects the encoded data when the pixel is positioned further than the predetermined distance from the boundary (i.e., not in the vicinity of the boundary).
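The selection performed by the selecting unit 146 can be sketched as follows; the width of the uncompressed border (PCM_MARGIN) is an assumed value, as the embodiment leaves the predetermined distance unspecified:

```python
PCM_MARGIN = 4  # assumed width, in pixels, of the uncompressed border

def select_output(distance_from_boundary, pcm_value, encoded_value):
    """Mimic the selecting unit: pass pixels near the boundary through
    uncompressed as PCM, and use the encoded data otherwise."""
    if distance_from_boundary < PCM_MARGIN:
        return ('pcm', pcm_value)
    return ('encoded', encoded_value)
```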
Next, the structure of the multiplex control unit 160 illustrated in
The header generating unit 161 is a processing unit that acquires the positional information and the image segmentation information and generates header information from the acquired data. The header generating unit 161 outputs the generated header information to the multiplexing unit 164.
The arbitrating unit 162 is a processing unit that acquires the image segmentation information, also acquires image data from the encoding devices that are arranged in parallel, and outputs the image data to the multiplexing unit 164 based on the image segmentation information. When receiving multiple items of image data at a time, the arbitrating unit 162 stores these items of image data in the memory 163, and then outputs an item of image data corresponding to the positional information and the image segmentation information included in the header information to the multiplexing unit 164. The memory 163 is a storage unit that stores therein the image data.
The multiplexing unit 164 is a processing unit that, when receiving the header information and the image data (the encoded subimage data or the PCM signal), generates multiplex data by multiplexing the header information and the image data and outputs the generated multiplex data to the image decoding device 200.
For example, when the input image is divided as illustrated in
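The multiplexing performed by the multiplexing unit 164 (a header carrying the segmentation and positional information followed by the per-subimage payloads) might be serialized as in the following sketch. The length-prefixed layout and the JSON header encoding are illustrative assumptions, not the embodiment's format:

```python
import json
import struct

def multiplex(segmentation_info, subimage_payloads):
    """Sketch of the multiplexing unit: serialize a header holding the
    segmentation/positional information, then append each subimage's
    payload prefixed with its byte length."""
    header = json.dumps(segmentation_info).encode()
    out = struct.pack('>I', len(header)) + header
    for payload in subimage_payloads:
        out += struct.pack('>I', len(payload)) + payload
    return out

def demultiplex(data):
    """Inverse of multiplex: recover the header and the payload list,
    as the separation control unit 210 would."""
    n = struct.unpack('>I', data[:4])[0]
    header = json.loads(data[4:4 + n].decode())
    payloads, pos = [], 4 + n
    while pos < len(data):
        m = struct.unpack('>I', data[pos:pos + 4])[0]
        payloads.append(data[pos + 4:pos + 4 + m])
        pos += 4 + m
    return header, payloads
```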
Next, the structure of the decoding processing unit 220 illustrated in
The decoding unit 221 is a processing unit that acquires the encoded data from the separation control unit 210 and decodes the acquired encoded data. The decoding unit 221 outputs the data that is decoded (decoded data) to the inverse quantizing unit 222.
The inverse quantizing unit 222 is a processing unit that acquires the positional information and the decoded data, changes the quantizing steps based on the positional information, and executes inverse quantization onto the decoded data. The inverse quantizing unit 222 holds the above quantizing table, determines a quantizing step by comparing the quantizing table and the positional information, and executes the inverse quantization in accordance with the quantizing step obtained as a result of the determination.
The inverse quantizing unit 222 outputs the data that is subjected to the inverse quantization to the selecting unit 225 and the line memory 223. The line memory 223 is a storage unit that stores therein data obtained by adding the data output by the inverse quantizing unit 222 and the data output by the predicting unit 224. The data stored in the line memory 223 corresponds to data of pixels adjacent to the decoding-target pixels (adjacent pixel data).
The predicting unit 224 is a processing unit that reads the data stored in the line memory 223 (adjacent pixel data) and outputs the read-out data as prediction data. The prediction data output by the predicting unit 224 is added to the data that has been subjected to the inverse quantization to reconstruct the image data before encoding, and the resulting image data is input to the selecting unit 225.
The selecting unit 225 is a processing unit that acquires the positional information, the subimage data that is not encoded (PCM signal), and the decoded image data, selects either one of the PCM signal and the decoded image data in accordance with the positional information, and outputs the selected data as image data to the image connection control unit 240.
Based on the image segmentation information (the selecting unit 225 also acquires the image segmentation information) and the positional information, the selecting unit 225 selects the PCM signal when the pixel is positioned within a predetermined distance from the boundary (i.e., in the vicinity of the boundary), and selects the decoded image data when the pixel is positioned further than the predetermined distance from the boundary (i.e., not in the vicinity of the boundary).
Next, the structure of the image connection control unit 240 indicated in
The connection control unit 241 is a processing unit that acquires the image segmentation information and also acquires the image data (from the decoding processing units 220 arranged in parallel), connects items of image data based on the image segmentation information, and thereby generates output image data. The connection control unit 241 stores the items of image data in the frame memory 242. The connection control unit 241 connects the items of image data based on the image segmentation information when writing the image data into or reading it from the frame memory.
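The connection performed by the connection control unit 241 can be sketched as placing each decoded subimage into the output frame at the top-left coordinates given by the positional information. This is a simplified sketch using nested lists in place of the frame memory 242:

```python
def connect(subimages, positions, width, height):
    """Place decoded subimages into the output frame according to the
    positional information (top-left coordinates per subimage)."""
    frame = [[0] * width for _ in range(height)]
    for tile, (top, left) in zip(subimages, positions):
        for r, row in enumerate(tile):
            for c, value in enumerate(row):
                frame[top + r][left + c] = value
    return frame
```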
The encoding starts from the lower right corner of a subimage A, the lower left corner of a subimage B, the upper right corner of a subimage C, and the upper left corner of a subimage D. The pixels in the vicinity of the boundary are transmitted as PCM signals, without being compressed. Furthermore, small quantizing steps are adopted in the vicinity of the boundary in the encoding, and larger quantizing steps are adopted as the distance from the boundary increases.
Next, the processing procedure of the image encoding device 100 according to the present embodiment is explained.
Then, the image encoding device 100 executes the encoding process on each subimage (step S103). At step S103, the image encoding device 100 executes the encoding in a direction moving away from the boundary of the subimages. Moreover, the image encoding device 100 transmits pixels in the vicinity of the boundary as PCM signals, without compressing them. In addition, smaller quantizing steps are used in the vicinity of the boundary, while larger quantizing steps are used as the distance from the boundary increases.
Thereafter, the image encoding device 100 multiplexes the encoded image data (step S104), and outputs the multiplexed data to the image decoding device 200 (step S105).
In this manner, the image encoding device 100 executes the encoding in a direction moving away from the boundary of the subimages, and thus image degradation can be avoided at the boundary.
Next, the processing procedure of the image decoding device 200 according to the present embodiment is explained.
Then, the image decoding device 200 executes the decoding process on each image (step S203). At step S203, the image decoding device 200 executes the decoding in the direction moving away from the boundary of the images. Further, the image decoding device 200 selects PCM signals for the pixels in the vicinity of the boundary (while selecting decoded image data for pixels that are not in the vicinity of the boundary). In addition, smaller quantizing steps are used in the vicinity of the boundary, and larger quantizing steps are used as the distance from the boundary increases.
Thereafter, the image decoding device 200 connects the decoded image data (subimages) (step S204), and outputs the generated output image data (step S205).
In this manner, the image decoding device 200 executes the decoding in a direction moving away from the boundary of the images, and selects PCM signals for the pixels in the vicinity of the boundary (while selecting the decoded image data for the pixels that are not in the vicinity of the boundary). In addition, smaller quantizing steps are used in the vicinity of the boundary, while larger quantizing steps are used as the distance from the boundary increases. Hence, the image data that is output after the connection can be prevented from being degraded.
As described above, the encoding/decoding system according to the present embodiment divides an encoding-target image into several subimages when encoding the image, and performs the encoding on the subimages in a direction moving away from the boundaries of the divided subimages. Thus, even when DPCM coding, in which the image quality is degraded gradually in the encoding order, is performed to improve the transmission efficiency, a difference in image quality at the boundary of the subimages can be reduced, and the image degradation can be avoided.
In addition, the encoding/decoding system according to the present embodiment transmits the pixels in the vicinity of the boundary of the subimages as uncompressed data, without encoding them. Thus, the image quality of the boundary area is prevented from being degraded, and the boundary created when connecting the subimages becomes less noticeable.
In addition, the encoding/decoding system according to the present embodiment changes the quantizing steps used for encoding in accordance with the distance from the boundary, so that an amount of data greater than a predetermined value can be assigned to the vicinity of the boundary by using small quantizing steps. Thus, the degradation of an image is suppressed, and the boundary becomes less noticeable.
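The distance-dependent quantization can be sketched as follows. The concrete step schedule (1, 2, 4, 8) is an assumed example; the source specifies only that steps are small near the boundary and larger farther away.

```python
# Sketch of distance-dependent quantization: finer steps (more data) are
# spent near the boundary, coarser steps farther away. The 1/2/4/8
# schedule is an assumption made for this illustration.
def quantizing_step(distance_to_boundary):
    """Step doubles with distance, capped at 8 in this example."""
    return 1 << min(distance_to_boundary, 3)

def quantize(diff, distance_to_boundary):
    step = quantizing_step(distance_to_boundary)
    return diff // step   # the decoder multiplies this index back by `step`
```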
Among the above-explained processes according to the present embodiment, all or part of the processes that are described as being automatically performed can be manually performed, or all or part of the processes that are described as being manually performed can be performed automatically with a known method. Besides these processes, the processing procedure, the controlling procedure, specific names, and information including various kinds of data and parameters that are mentioned in the above explanation and drawings can be arbitrarily changed unless otherwise stated.
In addition, the structural components of the image encoding device 100 and the image decoding device 200 illustrated in
Then, an encoding program 68b that performs the same function as that of the above image encoding device 100 is stored in the HDD 68. The CPU 67 reads and executes the encoding program 68b to start an encoding process 67a. This encoding process 67a corresponds to the image segmentation control unit 110, the encoding direction control units 120 and 130, the encoding processing units 140 and 150, and the multiplex control unit 160 that are illustrated in
Moreover, various kinds of data used for the encoding process are stored in the HDD 68. The CPU 67 reads various kinds of data 68a stored in the HDD 68, stores it in the RAM 63, performs the encoding by use of various kinds of data 63a stored in the RAM 63, and outputs the encoded data to the image decoding device.
Then, a decoding program 78b that performs the same function as that of the above image decoding device 200 is stored in the HDD 78. The CPU 77 reads and executes the decoding program 78b to start a decoding process 77a. This decoding process 77a corresponds to the separation control unit 210, the decoding processing units 220 and 230, and the image connection control unit 240 illustrated in
Moreover, various kinds of data used for the decoding process are stored in the HDD 78. The CPU 77 reads data 78a stored in the HDD 78, stores it in the RAM 73, performs the decoding by use of data 73a stored in the RAM 73, and outputs the decoded data to the monitor 72.
The encoding program 68b and the decoding program 78b indicated in
According to an embodiment of the present invention, the encoding device divides an encoding-target image into several subimages, and encodes each divided subimage in a direction moving away from the boundary of the subimage. Thus, even when DPCM coding, with which image quality is degraded gradually in accordance with the encoding order, is performed to improve the transmission efficiency, a difference in image quality at the boundaries of the subimages can be reduced, and image degradation can be suppressed.
Furthermore, according to an embodiment of the present invention, the encoding device transmits pixels near the boundaries of the subimages as uncompressed data without encoding them, and thus prevents the image from being degraded at the boundaries so that the joint of the connected subimages can be made less noticeable.
Still further, according to an embodiment of the present invention, the encoding device changes quantizing steps that are used in encoding in accordance with a distance from the boundary, and allocates a larger amount of data than a predetermined value to the vicinity of the boundary by using smaller quantizing steps. Thus, degradation of image quality can be suppressed, and the joint can be made less noticeable.
Still further, according to an embodiment of the present invention, the encoding device adds segmentation information indicating an image segmenting method and positional information of pixels of an encoded subimage when outputting the subimage. Thus, a process of decoding the encoded image data can be efficiently executed.
Still further, according to an embodiment of the present invention, when acquiring multiple subimages that constitute an image, the decoding device separates the acquired subimages and executes decoding in a direction moving away from the boundary of the image. Thus, the image data output after being combined can be prevented from being degraded.
All examples and conditional language recited herein are intended for pedagogical purposes to aid the reader in understanding the invention and the concepts contributed by the inventor to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions, nor does the organization of such examples in the specification relate to a showing of the superiority and inferiority of the invention. Although the embodiments of the present invention have been described in detail, it should be understood that the various changes, substitutions, and alterations could be made hereto without departing from the spirit and scope of the invention.
This application is a continuation of International Application No. PCT/JP2007/072030, filed on Nov. 13, 2007, the entire contents of which are incorporated herein by reference.
| | Number | Date | Country |
|---|---|---|---|
| Parent | PCT/JP2007/072030 | Nov 2007 | US |
| Child | 12662911 | | US |