The present invention relates to a technology for interpolating a pixel value of a certain pixel from a pixel value of a pixel positioned near the certain pixel.
Japanese Patent Application Laid-open No. H6-186526 and Japanese Patent Application Laid-open No. 2000-137443 disclose a display device that can simultaneously display two screens on one liquid crystal display (LCD). Such a display device can be used, for example, to display differing screens to a person sitting in the driver's seat and to a person sitting in the passenger seat. Japanese Patent Application Laid-open No. H11-331876 and Japanese Patent Application Laid-open No. H9-46622 disclose a display device that can simultaneously display two types of images on the same screen.
When image data from a plurality of video image sources are displayed in one display unit, a resolution conversion (for example, a resolution conversion in which the horizontal resolution of the image data of each video image is halved for a “two-screen display”) is required to be performed on each piece of image data. However, if pixels are merely culled in the resolution conversion, an unrecognizable video image may be obtained depending on the display content of the video image.
Japanese Patent Application Laid-open No. 2004-104368 discloses a technology for solving the above-described problem. Specifically, an average or a weighted average of a plurality of pieces of pixel data from a periphery of a position at which image data are interpolated is calculated, and interpolation data are created from the calculated average or the weighted average.
However, in the technology disclosed in Japanese Patent Application Laid-open No. 2004-104368, the interpolation data are uniformly created. In other words, the interpolation data are uniformly created from the average or the weighted average of the pixel data from the periphery of the position at which the image data are interpolated. This method therefore results in creation of interpolation data in which pixels characterizing an image are ambiguously processed. As a result, there is a problem in this method that the image quality can significantly degrade when the resolution conversion of the image data is performed.
For example, even when the luminance levels of adjacent pixels are divergent, because the interpolation data are created by averaging pixels with their peripheral pixels, the luminance level of a pixel characterizing the image in the original image is smoothed by the luminance levels of the peripheral pixels. Therefore, interpolation data are created in which the pixels characterizing the image are ambiguously processed. As a result, there is a problem in that the image quality significantly degrades when the resolution conversion is performed on the image data.
The present invention has been achieved to at least solve the above-described issues (problems) of the conventional art. An object of the present invention is to provide an image interpolation device and a display device that can suppress image quality degradation accompanying a resolution conversion of image data.
To solve the above-described issues and achieve the object, based on pixels positioned in an interpolation subject area and peripheral pixels of the pixels, the image interpolation device and the display device of the present invention calculate feature quantities of the pixels and determine pixel values of interpolation pixels depending on whether the pixels are characteristic of an image.
The image interpolation device and the display device of the present invention effectively suppress the image quality degradation accompanying the resolution conversion of the image data and maintain the characteristics of the original image.
Exemplary embodiments of the present invention will be described below with reference to the drawings. However, the technical scope of the present invention is not limited to the embodiments and extends to the invention described within the scope of claims and inventions equivalent thereto.
The schematic diagram in
The display unit 7 that is supplied with the display data 6 from the display controlling unit 5 includes a liquid crystal panel or the like. The liquid crystal panel includes a parallax barrier, described hereafter. Half of all pixels in a lateral direction in the display unit 7 are used to display the first display image 8, based on the first image source 1. The remaining half of the pixels are used to display the second display image 9, based on the second image source 2. The observer 10 positioned to the left side of the display unit 7 can only see the pixels corresponding to the first display image 8. The observer 10 cannot effectively see the second display image 9, because the second display image 9 is blocked by the parallax barrier formed on a surface of the display unit 7. At the same time, the observer 11 positioned to the right side of the display unit 7 can only see the pixels corresponding to the second display image 9. The observer 11 cannot effectively see the first display image 8, because the first display image 8 is blocked by the parallax barrier. Configurations disclosed in, for example, Japanese Patent Application Laid-open No. H10-123462 and Japanese Patent Application Laid-open No. H11-84131 can be applied with regard to the parallax barrier.
According to this configuration, differing information and contents can be provided to the user on the left and the user on the right using a single screen. If the first image source and the second image source are the same, the user on the left and the user on the right can view the same image, as in a conventional display device.
The display unit 7 of the multi-view display device in
The observer 11 in
Each pixel in the liquid crystal panel 100 is divided into pixels used for a left-side (passenger seat side) display and pixels used for a right-side (driver's seat side) display and is display-controlled. The pixels used for the left-side (passenger seat side) display are blocked by the parallax barrier 108 from being displayed to the right side (driver's seat side). The pixels used for the left-side (passenger seat side) display can be viewed from the left side (passenger seat side). Pixels used for the right-side (driver's seat side) display are blocked by the parallax barrier 108 from being displayed to the left side (passenger seat side). The pixels used for the right-side (driver's seat side) display can be viewed from the right side (driver's seat side). As a result, differing displays can be provided to the driver's seat side and the passenger seat side. In other words, map information for navigation can be provided to the driver while, at the same time, a movie from a DVD or the like is shown to the passenger. If the configurations of the parallax barrier 108 and each pixel in the liquid crystal panel are changed, a configuration is possible in which differing images are displayed in multiple directions, such as three directions. The parallax barrier itself can include an electronically-drivable liquid crystal shutter or the like, and the viewing angles can be changed.
Based on composite data of the first image data and the second image data, or on individual first image data and second image data, the sub-pixels transmit, for example, first pixel data (for left-side image display) to a data line 115 and a data line 117 and second pixel data (for right-side image display) to a data line 116 and a data line 118. As a result, a first image data group displaying a first image and a second image data group displaying a second image are formed.
The display unit 7 includes the touch panel 124, the liquid crystal panel 100, and the backlight 101. As described above, the liquid crystal panel 100 in the display unit 7 can effectively simultaneously display the image viewed from the driver's seat side that is a first viewing direction and the image viewed from the passenger seat side that is a second viewing direction. The display unit 7 can also use a flat-panel display other than the liquid crystal display, such as an organic electroluminescent (EL) display panel, a plasma display panel, or a cold cathode flat-panel display.
The control unit 200 respectively distributes images and sounds from various sources (the CD/MD playback unit 201, the radio receiving unit 202, the TV receiving unit 203, the DVD playback unit 204, the HD playback unit 205, and the navigation unit 206) using the distribution circuit 207. The images are distributed to the first image adjustment circuit 208 and the second image adjustment circuit 209. The sounds are distributed to the sound adjustment circuit 210. The first image adjustment circuit 208 and the second image adjustment circuit 209 adjust luminosity, tone, contrast, and the like. Each adjusted image is displayed in the display unit 7 through the image outputting unit 211. The sound adjustment circuit 210 adjusts distribution to each speaker, volume, and sound. The adjusted sound is outputted from the speaker 16.
The image outputting unit 211 includes, for example, the first write circuit 226, the second write circuit 227, the VRAM 228, and the display panel driving unit 11, as shown in
An example of the various sources shown in
The navigation unit 206 includes a map information storing unit storing map information used for navigation. The navigation unit 206 can obtain information from the VICS information receiving unit 212 and the GPS information receiving unit 213. The navigation unit 206 can create an image for a navigation operation and display the image. The TV receiving unit 203 receives analog TV broadcast waves and digital TV broadcast waves from an antenna, via the selector 214.
The control unit 200 controls the distribution circuit 207 and the various sources. The control unit 200 allows display for two selected sources or one selected source. The control unit 200 also allows the display unit 7 to display an operation menu display used to control the various sources. As shown in
The user can control the various sources using the touch panel 124 mounted on the front surface of the display unit 7 and switches provided in the periphery of the display unit 7. Alternatively, the user can perform input operations such as speech recognition and selection operations using the operating unit 215. The user can also perform the input operations or the selection operations using the remote control 217, via the remote control transmitting and receiving unit 216. In adherence to operations of the touch panel 124 and the operating unit 215, the control unit 200 performs control of the various sources and the like. The control unit 200 is configured to allow control of respective volumes of a plurality of speakers 16 provided within the vehicle, as shown in
The memory 218 includes, for example, the first screen RAM 233, the second screen RAM 234, the image quality setting information storing unit 235, and the counter-environment adjustment holding unit 236, as shown in
Images from, for example, a camera 220 for rear-monitoring that is connected to the external audio/video inputting unit 219 can be displayed in the display unit 7. Aside from the camera 220 for rear-monitoring, a video camera, a game console, and the like can be connected to the external audio/video inputting unit 219.
The control unit 200 can change settings for normal positions of outputted images and sounds, and the like, based on information detected by the brightness detecting unit 221 (for example, light switches and optical sensors in the vehicle) and the passenger detecting unit 222 (for example, pressure sensors provided in the seats).
Reference number 223 indicates a rear display unit provided for a backseat of the vehicle. Either the same image as that displayed in the display unit 7, or one of the image for the driver's seat and the image for the passenger seat, can be displayed in the rear display unit 223 via the image outputting unit 211.
The control unit 200 displays a toll display and the like from the ETC on-board device 250. The control unit 200 can control the communication unit 225 for wirelessly connecting a mobile phone and the like and perform display related to the wireless connection.
Next, an image interpolation process performed in the display device will be described. In the schematic diagram in
An example in which the image interpolation device of the present invention is mounted on a vehicle is described below in detail, with reference to the accompanying drawings. Hereafter, after an overview and the characteristics of the image interpolation device of the present invention are described, an image interpolation device of a first example will be described. Lastly, various variation examples (a second example) will be described.
Overview and Characteristics
First, the overview and the characteristics of the image interpolation device of the present invention will be described.
The AV unit 320 is a DVD player that reads video signals stored on a DVD disc (not shown) and outputs the signals to the image interpolation device 310. Specifically, the AV unit 320 issues a display request for DVD video images, based on an instruction from the passenger in the vehicle, and outputs image data of the DVD video images to the image interpolation device 310. The AV unit 320 is not limited to the DVD player and can include features for compact disc, hard disk, radio, television, and the like.
The navigation unit 330 is a device that performs route guidance, based on planned route information set in advance and positional information of an own vehicle. Specifically, the navigation unit 330 creates a “navigation” video image, based on the planned route information of the own vehicle set by the passenger of the vehicle (for example, the driver) and positional information transmitted from an artificial satellite. The positional information is obtained by a GPS receiver. The navigation unit 330 outputs image data of the created “navigation” video image to the image interpolation device 310.
When the AV unit 320 and the navigation unit 330 are mounted on the vehicle in this way, a display unit 317 in the image interpolation device 310 displays the DVD video images outputted from the AV unit 320 and the navigation video images outputted from the navigation unit 330. According to the first example, the resolution of the display unit 317 is 800×480. The resolution of the image data of the DVD video image is 800×480, and the resolution of the image data of the navigation video image is also 800×480.
If the display unit 317 receives display requests from both the AV unit 320 and the navigation unit 330, two 800×480 images are required to be displayed in the display unit 317, which itself has a resolution of only 800×480.
Therefore, when display is performed using a two-screen display configuration or a two-perspective display configuration so that respective images do not overlap, a ½ horizontal resolution conversion is required to be performed on the image data of the DVD video images and the image data of the navigation video images to be displayed.
As shown in
As shown in
According to the first example, the image interpolation process is performed only on the image data of the navigation video image. The processing subject of the image interpolation process is limited to the navigation video image for the following reason. Pixel defects occur in characters, symbols, and the like as a result of the resolution conversion being performed on the image data of the navigation video image. Therefore, a situation in which the contents of the video image become unrecognizable because of the pixel defects tends to occur easily. However, it goes without saying that, as another example, the image interpolation process can be performed on both the DVD video image v1 and the navigation video image v2.
A main characteristic of the image interpolation device 310 of the present invention is the image interpolation process. In the image interpolation process, with regards to pixels positioned in an interpolation subject area and peripheral pixels of the pixels, feature quantities of the pixels are calculated. Based on the calculated feature quantities of the pixels positioned in the interpolation subject area, pixel values of interpolation pixels are determined. As a result of the image interpolation process, the image quality degradation accompanying the resolution conversion of the image data can be suppressed.
The main characteristic will be described in detail. As shown in
A feature quantity is an indicator of the degree of divergence in pixel values when a pixel positioned in the interpolation subject area is compared with the other pixels positioned in the interpolation subject area and the peripheral pixels. Specifically, the feature quantity is calculated as the absolute value of the difference between the pixel value of a focused pixel among the pixels positioned in the interpolation subject area and the mean of the pixel values of the pixels positioned in the interpolation subject area and of the peripheral pixels. For example, a large feature quantity indicates that the pixel changes significantly compared to the peripheral pixels (in other words, a pixel characterizing the image). A small feature quantity indicates that the pixel changes little from the peripheral pixels.
When described using the example in
The image interpolation device 310 similarly calculates the feature quantities of a “Pixel 3” and a “Pixel 4” positioned in an interpolation subject area B, the feature quantities of a “Pixel 5” and a “Pixel 6” positioned in an interpolation subject area C, and the feature quantities of a “Pixel m” and a “Pixel n” positioned in an interpolation subject area N.
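The feature quantity calculation described above can be sketched as follows in Python. This is an illustrative model only; the function name, the use of scalar luminance values, and the example values are assumptions for illustration, not part of the invention.

```python
def feature_quantities(area, peripheral):
    """For each pixel value in the interpolation subject area, return the
    absolute difference between that value and the mean of all pixel
    values in the area plus the peripheral pixels."""
    pool = list(area) + list(peripheral)
    mean = sum(pool) / len(pool)
    return [abs(p - mean) for p in area]

# Interpolation subject area A with "Pixel 1" = P1 and "Pixel 2" = P2,
# and one peripheral pixel P3 (values are hypothetical):
# feature of Pixel 1 = |P1 - (P1 + P2 + P3) / 3|
print(feature_quantities([200, 80], [80]))  # [80.0, 40.0]
```

A large first entry marks "Pixel 1" as divergent from its surroundings, that is, as a pixel characterizing the image.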
Next, the image interpolation device 310 determines the pixel values of the interpolation pixels based on the pixels positioned in the interpolation subject area, to preferentially use the pixel value of the pixel characterizing an image in an original image as the pixel value of the interpolation pixels. Specifically, the image interpolation device 310 determines the pixel value of the pixel having a feature quantity that exceeds a threshold, among the pixels positioned in the interpolation subject area, to be the pixel value of the interpolation pixels.
By the image interpolation device 310 determining the pixel value of the pixel exceeding the threshold, that is, a pixel acceptable for copying as a pixel characterizing the image in the original image, to be the pixel value of the interpolation pixels in this way, the pixel value of the pixel characterizing the image in the original image can be preferentially used as the pixel value of the interpolation pixels. The image quality degradation accompanying the resolution conversion of the image data can be suppressed. Furthermore, because the pixel value of the pixel characterizing the image in the original image is used without processes such as averaging and weight-averaging being performed, the appearance of the original image (in other words, the original image prior to the resolution conversion) can be easily maintained.
For example, when the feature quantity |P1−(P1+P2+P3)/3| of the “Pixel 1” is equal to or greater than a threshold “THRESH”, as in the interpolation subject area A, the pixel value of the interpolation pixels (in other words, the interpolation subject area A) is determined to be the pixel value “P1” of the “Pixel 1”. When the feature quantities of both the “Pixel 1” and the “Pixel 2” are equal to or greater than the threshold, the pixel value of the pixel having the larger feature quantity is preferably used as the pixel value of the interpolation pixels.
When both the feature quantity of the “Pixel 5” and the feature quantity of the “Pixel 6” are less than the threshold “THRESH”, the pixel value “P5” of the “Pixel 5” and the pixel value “P6” of the “Pixel 6” positioned in the interpolation subject area C are respectively determined to be the pixel values of the interpolation pixels (in other words, the interpolation subject area C).
Therefore, unlike the above-described conventional art, interpolation data in which the pixels characterizing the image are ambiguously processed are not created through uniform averaging or weight-averaging of the image data from the periphery of the position at which the image data are to be interpolated. Rather, because the pixel value of the pixel exceeding the threshold, that is, a pixel acceptable for copying as a pixel characterizing the image in the original image, is determined to be the pixel value of the interpolation pixels, the pixel value of the pixel characterizing the image in the original image can be preferentially used as the pixel value of the interpolation pixels. The image quality degradation accompanying the resolution conversion of the image data can be suppressed, as in the above-described main characteristic.
Furthermore, because the pixel value of the pixel characterizing the image in the original image is used without processes such as averaging and weight-averaging being performed, the appearance of the original image (in other words, the original image prior to the resolution conversion) can be easily maintained.
Next, the image interpolation device according to the first example will be described. Here, after a configuration of the image interpolation device according to the first example is described, procedures of the various processes of the image interpolation device will be described.
Configuration of the Image Interpolation Device
The image data inputting unit 311 is a processing unit that inputs the image data outputted from the AV unit 320 and/or the navigation unit 330 to the feature quantity calculating unit 313, based on an image data input instruction from the image data input controlling unit 312. According to the first example, an example is given in which the DVD video image is inputted from the AV unit 320 (the resolution of the image is 800×480) and the navigation video image is inputted from the navigation unit 330 (the resolution of the image is similarly 800×480).
The image data input controlling unit 312 is a processing unit that controls a number of input systems of the image data inputted from the image data inputting unit 311 to the feature quantity calculating unit 313, depending on the display requests from the AV unit 320 and/or the navigation unit 330.
For example, when the display request for the DVD video image is received from the AV unit 320, the image data input controlling unit 312 instructs the image data inputting unit 311 to input the image data of the DVD video image v1. When the display request for the navigation video image is received from the navigation unit 330, the image data input controlling unit 312 instructs the image data inputting unit 311 to input the image data of the navigation video image v2. When the display request for the DVD video image and the display request for the navigation video image are received from the AV unit 320 and the navigation unit 330, the image data input controlling unit 312 instructs the image data inputting unit 311 to input the image data of both the DVD video image v1 and the navigation video image v2.
The feature quantity calculating unit 313 is a processing unit that, with regards to the pixels positioned in the interpolation subject area and the peripheral pixels of the pixels, calculates the feature quantities of the pixels based on the image data inputted from the image data inputting unit 311. Specifically, as shown in
The feature quantity calculating unit 313 similarly calculates the feature quantities of the “Pixel 3” and the “Pixel 4” positioned in the interpolation subject area B, the feature quantities of the “Pixel 5” and the “Pixel 6” positioned in the interpolation subject area C, and the feature quantities of the “Pixel m” and the “Pixel n” positioned in the interpolation subject area N. For the purpose described above, only the image data of the navigation video image is subject to the processes performed by the feature quantity calculating unit 313 and the image interpolation processing unit 314. However, the feature quantity calculating unit 313 and the image interpolation processing unit 314 can perform the image interpolation process on both the DVD video image v1 and the navigation video image v2.
The image interpolation processing unit 314 determines the pixel values of the interpolation pixels based on the feature quantities of the pixels positioned in the interpolation subject area. Specifically, the image interpolation processing unit 314 extracts the pixel with the largest feature quantity calculated by the feature quantity calculating unit 313 from among the pixels positioned in the interpolation subject area. If the feature quantity of the extracted pixel is equal to or greater than the threshold, the image interpolation processing unit 314 determines the pixel value of that pixel to be the pixel value of the interpolation pixels.
In terms of the example of the interpolation subject area A shown in
When the feature quantity of the extracted “Pixel 4” is less than the threshold “THRESH”, as in the interpolation subject area B, the pixel values “P3” and “P4” of the “Pixel 3” and the “Pixel 4” positioned in the interpolation subject area B are respectively determined to be the pixel values of the interpolation pixels (in other words, the interpolation subject area B).
Through the extraction of the pixel with the largest feature quantity from among the pixels positioned in the interpolation subject area in this way, the pixel having the highest probability of characterizing the image in the original image can be extracted from within the interpolation subject area. The pixel value of the pixel characterizing the image in the original image can be preferentially used as the pixel value of the interpolation pixels.
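The determination performed by the image interpolation processing unit 314 can be modeled as follows. This is a self-contained sketch under the same illustrative assumptions as before: scalar pixel values and a hypothetical threshold, with the function name chosen for illustration.

```python
def interpolate_area(area, peripheral, thresh):
    """Determine the pixel values written into the interpolation subject
    area: extract the pixel with the largest feature quantity and, if
    that quantity is equal to or greater than the threshold, copy its
    value into the whole area; otherwise keep each pixel's own value."""
    pool = list(area) + list(peripheral)
    mean = sum(pool) / len(pool)
    feats = [abs(p - mean) for p in area]
    best = max(range(len(area)), key=lambda i: feats[i])
    if feats[best] >= thresh:
        return [area[best]] * len(area)  # characteristic pixel is copied
    return list(area)                    # low divergence: keep values as-is

print(interpolate_area([200, 80], [80], thresh=50))   # [200, 200]
print(interpolate_area([100, 110], [105], thresh=50)) # [100, 110]
```

In the first call, the divergent pixel value 200 dominates the area; in the second, no feature quantity reaches the threshold, so the area is left unchanged.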
The resolution conversion processing unit 315 is a processing unit that performs the resolution conversion on a plurality of pieces of image data of which the interpolation pixels have been interpolated by the image interpolation processing unit 314. For example, when the image data of the DVD video image v1 and the image data of the navigation video image v2 inputted from the image interpolation processing unit 314 are displayed in the display unit 7 using the two-perspective display configuration, the RGB digital signals are aligned in a dot array such as that shown in
Therefore, the resolution conversion processing unit 315 performs the ½ horizontal resolution conversion in which “G” of the odd-numbered dots in the image data of the DVD video image v1 is culled and, additionally, “R” and “B” of the even-numbered dots are culled.
At the same time, the resolution conversion processing unit 315 performs the ½ horizontal resolution conversion in which “R” and “B” of the odd-numbered dots in the image data of the navigation video image v2 are culled and, additionally, “G” of the even-numbered dots is culled.
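The complementary sub-pixel culling of the two conversions can be sketched as follows. This is a simplified model assuming each dot is an (R, G, B) tuple and 1-based dot numbering as in the text; merging the surviving sub-pixels of the two images into one display line is an illustration, not the exact panel-driver behaviour.

```python
def cull_and_merge(dvd_line, nav_line):
    """Half horizontal resolution conversion by sub-pixel culling.
    DVD image v1: G of odd dots and R/B of even dots are culled.
    Navigation image v2: R/B of odd dots and G of even dots are culled.
    The surviving sub-pixels of both images fill each display dot."""
    merged = []
    for i, ((r1, g1, b1), (r2, g2, b2)) in enumerate(
            zip(dvd_line, nav_line), start=1):
        if i % 2 == 1:            # odd dot: R, B from v1, G from v2
            merged.append((r1, g2, b1))
        else:                     # even dot: G from v1, R, B from v2
            merged.append((r2, g1, b2))
    return merged

print(cull_and_merge([(1, 2, 3), (4, 5, 6)], [(7, 8, 9), (10, 11, 12)]))
# [(1, 8, 3), (10, 5, 12)]
```

Each source thus contributes exactly half of the sub-pixels per line, which is what makes the two-perspective display possible on a single panel.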
By the resolution conversion being performed on the image data of which the interpolation pixels have been interpolated by the image interpolation process in this way, the resolution conversion can be performed while suppressing the image quality degradation of the image data.
The display control processing unit 316 is a processing unit that performs control to realign the image data to which the resolution conversion has been performed by the resolution conversion processing unit 315 to a predetermined display configuration (the two-perspective display configuration in the first example) and display the realigned image data. Specifically, the display control processing unit 316 performs a realignment processing for realigning the RGB digital signals of the DVD video image v1 and the navigation video image v2 to which the resolution conversion has been performed by the resolution conversion processing unit 315 to the dot array shown in
In the example shown in
Compared to the pixel data shown in
By control being performed in this way so that the image data including at least one piece of image data to which the resolution conversion has been performed are realigned to the predetermined configuration and displayed, the image data can be displayed in various configurations in one display unit without a new configuration being provided.
Procedures of Various Processes
Next, procedures of the various processes of the image interpolation device according to the first example will be described.
When the display request for the DVD video image and the display request for the navigation video image are received from the AV unit 320 and the navigation device 330 (Step S601; Yes), the image data inputting unit 311 inputs the image data to the feature quantity calculating unit 313 for each input system of the DVD video image v1 and the navigation video image v2 (Step S602).
Then, the feature quantity calculating unit 313 successively calculates the feature quantities of the pixels positioned within the interpolation subject area, based on the image data inputted from the image data inputting unit 311 (Step S603). Next, the image interpolation processing unit 314 extracts the pixel with the largest feature quantity calculated by the feature quantity calculating unit 313 from among the pixels positioned in the interpolation subject area (Step S604).
When the feature quantity of the extracted pixel is equal to or greater than the threshold (Step S605; Yes), the image interpolation processing unit 314 determines the pixel value of that pixel to be the pixel value of the interpolation pixels (Step S606). On the other hand, when the feature quantity of the extracted pixel is less than the threshold (Step S605; No), the pixel value of each pixel positioned in the interpolation subject area is respectively determined to be the pixel values of the interpolation pixels (Step S607).
Then, when the pixel values of the interpolation pixels are determined for all interpolation subject areas (Step S608; Yes), the image interpolation processing unit 314 creates the image data in which the pixel value of each interpolation pixel is reflected (Step S609). When the pixel values of the interpolation pixels are not yet determined for all interpolation subject areas (Step S608; No), the processes from Step S603 to Step S607 are repeated until the pixel values of the interpolation pixels are determined for all interpolation subject areas.
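The loop of Steps S603 to S607 over all interpolation subject areas can be sketched for one line of pixels as follows. The two-pixel area width, the one-pixel peripheral window on each side, and the threshold value are assumptions chosen for illustration.

```python
def interpolate_line(line, area_w=2, thresh=30):
    """Split a line of pixel values into interpolation subject areas of
    `area_w` pixels; for each area, calculate the feature quantities
    against the mean of the area and its peripheral pixels (Step S603),
    extract the pixel with the largest feature quantity (Step S604),
    and determine the interpolation pixel values (Steps S605 to S607)."""
    out = []
    for start in range(0, len(line), area_w):
        area = line[start:start + area_w]
        peripheral = (line[max(0, start - 1):start]
                      + line[start + area_w:start + area_w + 1])
        pool = area + peripheral
        mean = sum(pool) / len(pool)
        feats = [abs(p - mean) for p in area]
        best = max(range(len(area)), key=lambda i: feats[i])
        if feats[best] >= thresh:        # Step S605: Yes -> Step S606
            out.extend([area[best]] * len(area))
        else:                            # Step S605: No  -> Step S607
            out.extend(area)
    return out

print(interpolate_line([200, 80, 80, 80]))  # [200, 200, 80, 80]
```

The divergent pixel in the first area is preserved by copying, while the flat second area passes through unchanged.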
Next, the resolution conversion processing unit 315 respectively performs the ½ horizontal resolution conversion process on the image data of the navigation video image and the image data of the DVD video image, of which the interpolation pixels have been interpolated by the image interpolation processing unit 314 (Step S610).
Then, the display control processing unit 316 realigns the image data of the navigation video image and the image data of the DVD video image to which the ½ resolution conversion has been performed by the resolution conversion processing unit 315 to the predetermined display configuration and displays the image data of the navigation video image and the image data of the DVD video image (Step S611).
Lastly, when the navigation video image, the DVD video image, or both are completed (Step S612; Yes), the process is completed. When neither the navigation video image nor the DVD video image is completed (Step S612; No), the processes from Step S602 to Step S611 are repeated.
As described above, in the image interpolation device 310 according to the first example, the pixel with the largest feature quantity is extracted from the pixels positioned in the interpolation subject area. When the feature quantity of the extracted pixel is equal to or greater than the threshold, the pixel value of that pixel is determined to be the pixel value of the interpolation pixels. Therefore, the pixel having the highest probability of characterizing the image in the original image is extracted from the interpolation subject area, and the pixel value of the extracted pixel can be preferentially used as the pixel value of the interpolation pixels. The image quality degradation accompanying the resolution conversion of the image data can be more effectively suppressed.
An example of the present invention has been described above. However, in addition to the first example described above, the present invention can be carried out in various other forms within the technical scope described in the claims.
For example, in the first example, the image interpolation process according to the present invention is performed when a plurality of display requests (in other words, the display requests for the DVD video image and the navigation video image) are received. However, the present invention is not limited thereto. The present invention can be applied regardless of whether there is a single display request or a plurality of display requests. In particular, an even higher effect can be achieved by applying the image interpolation process according to the present invention to image data requiring a resolution conversion (for example, a resolution conversion for a relatively small display unit, such as that of a mobile phone), even when there is only a single display request.
In the first example, the following example has been described: the pixel with the largest feature quantity calculated by the feature quantity calculating unit 313 is extracted from among the pixels positioned in the interpolation subject area, and when the feature quantity of the extracted pixel is less than the threshold, the pixel value of each pixel positioned in the interpolation subject area is respectively determined to be the pixel value of the interpolation pixels. However, the present invention is not limited thereto. When the feature quantity of the extracted pixel is less than the threshold, the average of the pixel values of the pixels positioned in the interpolation subject area can instead be determined to be the pixel value of the interpolation pixels.
In terms of the example in
When the pixel with the largest feature quantity calculated by the feature quantity calculating unit 313 is extracted from among the pixels positioned in the interpolation subject area and the feature quantity of the extracted pixel is less than the threshold, the average of the pixel values of the pixels positioned in the interpolation subject area is determined to be the pixel value of the interpolation pixels. As a result, a large-scale image interpolation process can be performed when the luminance levels of adjacent pixels are not divergent, and the image quality degradation accompanying the resolution conversion of the image data can be effectively suppressed.
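This variant can be sketched as a single decision: use the strongly characterizing pixel when one exists, otherwise smooth with the area average. A minimal sketch under the same assumptions as before (precomputed feature quantities, illustrative names):

```python
def interpolation_value_avg(area_pixels, feature_quantities, threshold):
    """Variant behavior: when no pixel's feature quantity reaches the
    threshold, the average of the area's pixel values is used as the
    pixel value of the interpolation pixels.
    """
    # Extract the pixel with the largest feature quantity.
    idx = max(range(len(area_pixels)), key=lambda i: feature_quantities[i])
    if feature_quantities[idx] >= threshold:
        # Preserve the characterizing pixel as-is.
        return area_pixels[idx]
    # Luminance levels are not divergent: smooth with the average.
    return sum(area_pixels) / len(area_pixels)


# With threshold 20, no feature quantity qualifies, so the average is used:
print(interpolation_value_avg([10, 200, 30], [1, 9, 2], 20))
```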
In the present invention, the difference in the pixel values between the pixels positioned in the interpolation subject area is calculated. When the absolute value of the difference in the pixel values between the pixels positioned in the interpolation subject area is equal to or greater than the threshold, the average of the pixel values of the pixels positioned in the interpolation subject area can be determined to be the pixel value of the interpolation pixels.
For example, in terms of the example in
When the difference in the pixel values between the pixels positioned in the interpolation subject area is calculated and the absolute value of that difference is equal to or greater than the threshold, the average of the pixel values of the pixels positioned in the interpolation subject area is determined to be the pixel value of the interpolation pixels. As a result, a large difference in luminance levels occurring locally can be smoothed, and the image quality degradation accompanying the resolution conversion of the image data can be effectively suppressed.
In the first example, an example in which the video image signals inputted to the image interpolation device 310 are composite signals (RGB format) has been described. However, the present invention is not limited thereto. The present invention can be similarly applied even when video image signals of another format, such as the YC format, are inputted.
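The interpolation itself operates on per-pixel values, so only the value being compared changes with the signal format. For instance, a luminance-based feature quantity can be obtained from RGB input with the standard ITU-R BT.601 luma weights; this conversion is well-known prior art and is not part of the patent:

```python
def rgb_to_y(r, g, b):
    """Standard ITU-R BT.601 luma (the Y of a YC-format signal)
    computed from 8-bit RGB components."""
    return 0.299 * r + 0.587 * g + 0.114 * b


# White maps to full-scale luma:
print(rgb_to_y(255, 255, 255))
```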
Of the processes described in the example, all or some of the processes that have been described as being performed automatically can be performed manually. Conversely, all or some of the processes that have been described as being performed manually can be performed automatically by a known method. In addition, information including processing procedures, control procedures, specific names, and various data and parameters (for example, resolutions and resolution conversion rates) indicated within the text above and shown within the diagrams can be arbitrarily changed unless otherwise noted.
The respective constituent elements of each device shown in the diagrams are functional concepts and are not necessarily required to be physically configured as shown. In other words, specific configurations of distribution and integration of each device are not limited to those shown in the diagrams. Depending on various loads, usage conditions, and the like, all or some of the devices can be functionally or physically distributed or integrated in arbitrary units. Furthermore, all or an arbitrary number of the respective processing functions performed in each device can be realized by a CPU and a program analyzed and executed by the CPU. Alternatively, the processing functions can be realized as hardware using wired logic.
The present example is described using, as examples, the two-screen display configuration displaying two screens on a single display and the two-perspective display configuration outputting two differing video images in two directions. However, a multi-screen display configuration displaying three or more screens and a multi-direction display configuration outputting differing video images in three or more directions can also be used.
The present example is described using a device mounted on a vehicle as an example. However, use of the present invention is not limited thereto. For example, the present invention can also be applied to a display device other than one used in a vehicle, such as a display device for household use.
As described above, the image interpolation device and the display device of the present invention are effective for interpolating images. In particular, the present invention is suitable for a resolution conversion that maintains the characteristics of the original image.
Number | Date | Country | Kind |
---|---|---|---|
2004-316906 | Oct 2004 | JP | national |
2005-265690 | Sep 2005 | JP | national |
Filing Document | Filing Date | Country | Kind | 371c Date |
---|---|---|---|---|
PCT/JP05/19182 | 10/19/2005 | WO | 6/13/2007 |