The present disclosure relates to an image processing apparatus, an image processing method, and a program, and particularly relates to an image processing apparatus, an image processing method, and a program that are capable of correcting a three-dimensional image viewable with the naked eye with high accuracy.
A viewing system that uses a projector array to allow a viewer to view a three-dimensional image with the naked eye realizes such viewing by projecting, for each projector, a plurality of images of different viewpoints in units of pixel columns, and further diffusing the projected images of each viewpoint at a predetermined diffusion angle in the horizontal direction.
Incidentally, in a naked-eye viewing system using a projector array, increasing the number of projectors increases the number of projectable images, which makes it possible to raise the resolution of the three-dimensional image to be viewed.
On the other hand, however, increasing the number of projectors also increases the complexity and cost of the device.
Thus, it is conceivable to configure the viewing system with a small number of projectors without reducing resolution; however, realizing naked-eye viewing of a three-dimensional image with a small number of projectors requires increasing the diffusion angle of the diffusion plate used in the system.
However, when the diffusion angle of the diffusion plate is increased, images (multi-viewpoint images) are mixed between a plurality of projectors, and moreover, there is also optical deterioration due to a lens modulation transfer function (MTF) (imaging performance of a lens expressed by an MTF curve) of the projectors. Thus, blurring or crosstalk occurs in a three-dimensional image to be viewed.
Therefore, there has been proposed a signal processing technology for individually eliminating blurring and crosstalk by capturing an image of blurring or crosstalk by an imaging device such as a camera, and by applying, on the basis of a result of capturing the image, correction corresponding to the blurring or the crosstalk to an image to be projected in advance (see Patent Documents 1 to 3).
However, in a case where technologies of Patent Documents 1 and 2 are applied, an inverse filter is designed to individually correct deterioration such as blurring and crosstalk at a time of projection. Thus, when an amount of blurring increases to some extent, the blurring cannot be appropriately corrected, and artifacts and uncorrected blurring may occur at the time of projection due to excessive correction.
Furthermore, in a case where a technology of Patent Document 3 is applied, the calculation for obtaining an inverse filter coefficient may take time to converge, or may fail to converge when the number of projectors increases.
As a result, even when the technologies of Patent Documents 1 to 3 are applied, there is a limit to amounts of blurring and crosstalk that can be corrected, and even when the technologies of Patent Documents 1 to 3 are used in combination, there is a limit to correction that can be appropriately applied.
The present disclosure has been made in view of such a situation, and particularly corrects a three-dimensional image viewable with the naked eye with high accuracy by integrally and simultaneously correcting deterioration due to mixing of images between a plurality of projectors and optical deterioration due to a lens MTF.
An image processing apparatus according to one aspect of the present disclosure includes: a projection unit that projects a multi-viewpoint image; and an image generation unit that generates the multi-viewpoint image by integrally and simultaneously applying correction to optical deterioration and correction to crosstalk deterioration.
An image processing method and a program according to one aspect of the present disclosure correspond to the image processing apparatus according to one aspect of the present disclosure.
In one aspect of the present disclosure, a multi-viewpoint image is projected, and the multi-viewpoint image is generated by integrally and simultaneously applying correction to optical deterioration and correction to crosstalk deterioration.
Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. Note that, in the present specification and drawings, components having substantially the same functional configuration are denoted by the same reference signs, and redundant description thereof is omitted.
Hereinafter, a mode for carrying out the present technology will be described. The description will be made in the following order.
1. Preferred Embodiment
2. Application Example 1
3. Application Example 2
4. Example of Executing Processing by Software
The present disclosure makes it possible to achieve high resolution of a three-dimensional image by integrally and simultaneously correcting crosstalk deterioration due to mixing of images between a plurality of projectors and optical deterioration due to a lens MTF.
The image processing unit 11 includes an image generation unit 31, projection units 32-1 to 32-n, a screen 33, a diffusion plate 34, an imaging unit 35, and a correction unit 36.
The image generation unit 31 generates viewpoint images P1 to Pn to be respectively projected by the projection units 32-1 to 32-n from (a group of) multi-viewpoint images PM1 serving as input.
Furthermore, the image generation unit 31 applies correction to the generated (group of) multi-viewpoint images PM1 by inverse functions (inverse filters) supplied from the correction unit 36 such that the (group of) output images PM2, which are projected on the screen 33 including a mirror, reflected, diffused via the diffusion plate 34, and then viewed, match the (group of) input images PM1.
Moreover, the image generation unit 31 outputs the multi-viewpoint images P1 to Pn corrected by the inverse functions (inverse filters) to the projection units 32-1 to 32-n, respectively.
The projection units 32-1 to 32-n include, for example, projectors, and respectively project the multi-viewpoint images P1 to Pn on the screen 33 as the (group of) output images PM2.
Note that, in a case where it is not particularly necessary to distinguish the projection units 32-1 to 32-n from each other and the multi-viewpoint images P1 to Pn from each other, the projection units 32-1 to 32-n and the multi-viewpoint images P1 to Pn are simply referred to as the projection units 32 and the multi-viewpoint images P, and other configurations are also referred to in a similar manner.
The diffusion plate 34, which includes an anisotropic diffusion plate, is provided in the front stage of the screen 33 and diffuses images in a predetermined diffusion distribution in units of pixel columns of the multi-viewpoint images P1 to Pn; by the viewer viewing the diffused images, naked-eye viewing of a three-dimensional image is realized.
More specifically, each of the multi-viewpoint images P1 to Pn includes images of different viewpoints in a unit of one or a plurality of pixel columns, and when each of the plurality of multi-viewpoint images P1 to Pn is viewed by a viewer from a predetermined viewing direction, an image of a pixel column corresponding to each viewing direction is viewed. Thus, viewing of a three-dimensional image is realized.
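As a minimal sketch of this pixel-column interleaving, each projector image can be assembled by taking columns from different viewpoint images; the column-to-viewpoint mapping below is a hypothetical placeholder rather than the exact assignment used in the disclosure.

```python
import numpy as np

def assemble_projector_image(viewpoint_images, projector_index, n_projectors):
    """Assemble one projector image by interleaving pixel columns taken from
    different viewpoint images (hypothetical column-to-viewpoint mapping)."""
    n_views, height, width = viewpoint_images.shape[:3]
    out = np.empty_like(viewpoint_images[0])
    for col in range(width):
        # Which viewpoint this column shows depends on the column position
        # and on which projector draws it (illustrative assignment).
        view = (col * n_projectors + projector_index) % n_views
        out[:, col] = viewpoint_images[view, :, col]
    return out
```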
The imaging unit 35 is provided at a position corresponding to a viewing position of a viewer, captures images to be viewed by the viewer, and outputs the captured images to the correction unit 36.
The correction unit 36 generates inverse functions (filters) for correcting the (group of) output images PM2, which are images captured by the imaging unit 35, to be the same as the (group of) input images PM1, and outputs the inverse functions (filters) to the image generation unit 31.
<Principle of Viewing Three-Dimensional Image>
Here, a principle of viewing a three-dimensional image will be described.
The projection units 32-1 to 32-n of the image processing unit 11 are arranged in a horizontal direction.
Here, in order to simplify the description, for example, a case is considered in which six projection units 32-11, 32-12, 32-21, 32-22, 32-31, and 32-32 are arranged in the horizontal direction.
Each of the projection units 32 forms images of different viewpoints in units of one or a plurality of pixel columns in the horizontal direction, and projects the images on the screen 33 as a multi-viewpoint image.
Here, only optical paths of images at pixel positions (pixel columns) Psc1 and Psc2 at end portions on the screen 33 among the images projected by each of the projection units 32-11, 32-12, 32-21, 32-22, 32-31, and 32-32 will be described.
That is, the optical path of the image at the pixel position Psc1 of the images projected by the projection unit 32-11 is an optical path r11 represented by a solid line, and the optical path of the image at the pixel position Psc1 of the images projected by the projection unit 32-12 is an optical path r12 represented by a dotted line.
Furthermore, the optical paths of the images at the pixel position Psc1 of the images projected by the projection units 32-21 and 32-22 are an optical path r21-1 represented by a two-dot chain line and r22-1 represented by a one-dot chain line, respectively.
Moreover, the optical path of the image at the pixel position Psc2 of the images projected by the projection unit 32-31 is an optical path r31 represented by a two-dot chain line, and the optical path of the image at the pixel position Psc2 of the images projected by the projection unit 32-32 is an optical path r32 represented by a one-dot chain line.
Furthermore, the optical paths of the images at the pixel position Psc2 of the images projected by the projection units 32-21 and 32-22 are an optical path r21-2 represented by a solid line and r22-2 represented by a dotted line, respectively.
The viewer H1 views the images of the optical paths r22-1 to r32 at a viewpoint V1 as a left eye, and views the images of the optical paths r21-1 to r31 at a viewpoint V2 as a right eye.
Furthermore, the viewer Hn views the images of the optical paths r12 to r22-2 at a viewpoint Vn−1 as a left eye, and views the images of the optical paths r11 to r21-2 at a viewpoint Vn as a right eye.
That is, viewing of a three-dimensional image is realized by the viewers H1 and Hn viewing images in different viewing directions with the right and left eyes.
<Regarding Correction of Multi-Viewpoint Image>
Here, in describing correction of a multi-viewpoint image, a relationship between an image projected on the screen 33 by each of the projection units 32 and an image projected on the screen 33 and further reflected by the screen 33 to be actually viewed will be described.
Consider a case where an image P(k) projected on the screen 33 by a projection unit 32-k is viewed in a viewing zone Z, as indicated by dotted lines in the drawing.
At this time, when an angle formed between the position on the image P(k) facing a center position Vc, which is the center position of the viewing zone Z, and a position on the viewing zone Z corresponding to a viewing direction is defined as an angle θ, the position in the viewing direction on the viewing zone Z is assumed to be represented by tan θ.
Thus, a relationship between the pixel column at the horizontal position i on the image P(k) projected on the screen 33 and a pixel column viewed at tan θ, which is a horizontal position on the viewing zone Z, is as indicated by dotted lines in the drawing.
That is, the horizontal position i of a pixel column on the image P(k) projected by the projection unit 32-k and tan θ, which is the horizontal position on the viewing zone Z, have a relationship represented by a straight line Lk.
Thus, for example, a case is considered in which the projection unit 32-(k−1) adjacent to the projection unit 32-k projects an image P(k−1) on the screen 33.
At this time, a horizontal position i of a pixel column on the image P(k−1) projected by the projection unit 32-(k−1) and tan θ, which is the horizontal position on the viewing zone Z, have a relationship represented by a straight line Lk−1 indicated by a right-downward one-dot chain line in the drawing.
Similarly, for example, a case is considered in which the projection unit 32-(k+1) adjacent to the projection unit 32-k on the opposite side projects an image P(k+1) on the screen 33.
At this time, a horizontal position i of a pixel column on the image P(k+1) projected by the projection unit 32-(k+1) and tan θ, which is the horizontal position on the viewing zone Z, have a relationship represented by a straight line Lk+1 indicated by a right-downward solid line in the drawing.
In view of the above, when the plurality of projection units 32-1 to 32-n is arranged in the horizontal direction, the relationships between the horizontal position i of a pixel column on each projected image and tan θ, which is the horizontal position on the viewing zone Z, are represented by a group of straight lines corresponding to the respective projection units 32-1 to 32-n.
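These straight lines can be illustrated by a simple linear model; the slope and the per-unit offset below are illustrative placeholders, not values given in the disclosure.

```python
def tan_theta_viewed(i, k, slope=-0.002, offset_per_unit=0.05):
    """Viewing direction tan(theta) at which pixel column i of the image
    projected by projection unit 32-k is seen. Each projection unit
    contributes one right-downward straight line L_k in the (i, tan(theta))
    plane, shifted by a per-unit offset (illustrative parameters)."""
    return slope * i + k * offset_per_unit
```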
In a case where the projection units 32-1 to 32-n are arranged in this manner, the images projected by the respective projection units 32 are superimposed on the screen 33, and the pixel columns that are viewed change according to the position on the viewing zone Z.
At this time, at the center position Vc, discrete pixel columns Pc−4 to Pc+3 are viewed.
Here, the pixel columns Pc−4 to Pc+3 are pixel columns projected on the screen 33 by the projection units 32-(k−4) to 32-(k+3), respectively, as viewed from the center position Vc.
Thus, in a case where the pixel column of an image on the screen 33 facing the center position Vc is defined as, for example, a pixel column Pt between the pixel column Pc−1 and the pixel column Pc, the image of the pixel column Pt cannot be viewed from the center position Vc, and viewing it requires moving to a position Vc′.
Note that, when moving to the position Vc′, the pixel columns Pc−4 to Pc+3, which are discretely viewable at the center position Vc, can no longer be viewed.
Therefore, in the present disclosure, the diffusion plate 34 is provided in the front stage of the screen 33 to enable the images of the pixel columns, which are projected on the screen 33 discretely in the horizontal direction, to be viewed as continuous images in the horizontal direction.
That is, when the diffusion plate 34 is provided in the front stage of the screen 33, the image of each pixel column is diffused in a predetermined diffusion distribution D in the horizontal direction.
Note that a downward convex waveform in the horizontal direction in the drawing represents the diffusion distribution D of the image of each pixel column.
The diffusion plate 34 diffuses the images of each pixel column at a predetermined diffusion angle, so that the images are diffused in the diffusion distribution D having a peak of diffusion intensity at the viewing position where the images are discretely viewed when the diffusion plate 34 is not provided.
That is, in a case where the diffusion plate 34 is not provided, as illustrated in the upper part of the drawing, the images of the respective pixel columns are viewed only discretely at the corresponding viewing positions.
On the other hand, in a case where the diffusion plate 34 is provided, as illustrated in the lower part of the drawing, the image of each pixel column is diffused in the diffusion distribution D, so that the images are also viewed at viewing positions between the discrete viewing positions.
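A minimal sketch of this diffusion, with a Gaussian standing in for the diffusion distribution D (the actual distribution is a property of the diffusion plate and would be measured):

```python
import numpy as np

def viewed_intensity(tan_theta, column_directions, column_values, sigma=0.01):
    """Value seen from viewing direction tan_theta when each pixel column,
    originally visible only at its own discrete direction, is spread by the
    diffusion plate. A Gaussian of width sigma stands in for the diffusion
    distribution D (illustrative)."""
    d = np.asarray(column_directions)
    v = np.asarray(column_values)
    w = np.exp(-0.5 * ((tan_theta - d) / sigma) ** 2)  # diffusion weights
    return float(np.sum(w * v) / np.sum(w))
```

At a direction between two discrete peaks, the weights of the two nearest columns dominate, which is exactly the mixing described next.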
Thus, for example, at the position of the pixel column Pt, the diffused images of the adjacent pixel columns Pc−1 and Pc are viewed from the center position Vc.
As a result, the images viewed from the center position Vc can be viewed as images in which pixel columns are continuously arranged in the horizontal direction.
However, in this case, since the images of the pixel columns Pc and Pc−1 are diffused, the image at the pixel column Pt is viewed from the center position Vc in a state where both images are mixed; when images of not only nearby pixel columns but also distant pixel columns are mixed in this way, blurring caused by crosstalk (crosstalk deterioration) occurs.
Furthermore, in the projection unit 32, an image is projected via a lens, and blurring (optical deterioration) occurs in the projected image due to an influence of a lens MTF (lens performance expressed by an MTF curve).
Therefore, the projected image needs to be corrected for blurring caused by the crosstalk and blurring caused by the lens MTF.
<Blurring Caused by Crosstalk and Blurring Caused by Lens MTF>
Blurring caused by crosstalk (crosstalk deterioration) and blurring caused by a lens MTF (optical deterioration) will be described.
Note that, here, a case is considered in which the images contributing to the pixel column Pt are projected by the projection units 32-(k−1), 32-k, and 32-(k+1), whose relationships with the viewing direction are represented by the straight lines Lk−1, Lk, and Lk+1.
Furthermore, here, blurring caused by crosstalk and blurring caused by a lens MTF that occur at the pixel column Pt when viewed from the center position Vc will be considered.
In this case, in the image of the pixel column Pt, blurring represented by a deterioration function Fs occurs due to crosstalk caused by diffusion by the diffusion plate 34.
Furthermore, in the images of the pixel column Pk+1 and the surrounding pixel columns Pk+1_1 to Pk+1_4 on the straight line Lk+1, blurring represented by a deterioration function FL-(k+1) according to a lens MTF of the projection unit 32-(k+1) occurs.
Similarly, in the images of the pixel column Pc and the surrounding pixel columns Pk_1 to Pk_4 on the straight line Lk, blurring represented by a deterioration function FL-k according to a lens MTF of the projection unit 32-k occurs.
Moreover, in the images of the pixel column Pk−1 and the surrounding pixel columns Pk−1_1 to Pk−1_4 on the straight line Lk−1, blurring represented by a deterioration function FL-(k−1) according to a lens MTF of the projection unit 32-(k−1) occurs.
As a result, the image of the pixel column Pt is viewed in a state where blurring occurs by combining blurring caused by the crosstalk by the diffusion plate 34 (hereinafter, also referred to as blurring caused by the crosstalk or crosstalk deterioration) and blurring caused by the lens MTF of each of the projection units 32-(k+1) to 32-(k−1) (hereinafter, also referred to as blurring caused by the lens MTF or optical deterioration).
Here, as a method of correcting blurring caused by the crosstalk (crosstalk deterioration) and blurring caused by the lens MTF (optical deterioration), an example in which the blurring caused by the crosstalk (crosstalk deterioration) and the blurring caused by the lens MTF (optical deterioration) are corrected independently from each other will be described.
Here, as illustrated in the drawing, correction in directions of an arrow Zk+1 based on the deterioration function FL-(k+1) of the lens MTF of the projection unit 32-(k+1) is applied to pixels of the pixel column Pk+1 on the straight line Lk+1 by using the surrounding pixel columns Pk+1_1 to Pk+1_4.
Similarly, correction in directions of an arrow Zk in the drawing based on the deterioration function FL-k of the lens MTF of the projection unit 32-k is applied to pixels of the pixel column Pk on the straight line Lk by using the surrounding pixel columns Pk_1 to Pk_4.
Moreover, correction in directions of an arrow Zk−1 in the drawing based on the deterioration function FL-(k−1) of the lens MTF of the projection unit 32-(k−1) is applied to the pixel column Pk−1 on the straight line Lk−1 by using the surrounding pixel columns Pk−1_1 to Pk−1_4.
As a result, correction based on the lens MTF is applied to each pixel of the pixel columns Pk−1, Pk, and Pk+1, which are at the same horizontal position on the image as the pixel column Pt.
Next, the pixels of the pixel column Pt are corrected in directions of an arrow Zc in the drawing based on the deterioration function Fs, by using the pixel columns Pk−1, Pk, and Pk+1 on the straight lines Lk−1, Lk, and Lk+1.
As a result, for each pixel in the pixel column Pt, correction is applied to the blurring caused by the lens MTF of each of the projection units 32-(k−1), 32-k, and 32-(k+1) and to the blurring caused by the crosstalk among them.
However, for example, although the pixel column Pk_3, which is closest to the pixel column Pt, can be assumed to have the highest correlation with the pixel column Pt, this closeness is not exploited by the corrections described above, which are applied separately along the individual straight lines.
For this reason, when the pixels of the pixel column Pt are corrected, presence or absence of correlation according to a distance in a two-dimensional space is not considered. Thus, although the blurring caused by the crosstalk and the blurring caused by the lens MTF are corrected, it cannot be said that the correction is optimal.
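For concreteness, the independent correction criticized here can be sketched as two separate one-dimensional deconvolutions: one along each projector image for the lens MTF, and one across projectors for the crosstalk. The kernels and the regularization constant are assumed inputs, not values from the disclosure.

```python
import numpy as np

def deconv_1d(signal, kernel, eps=1e-2):
    """Regularized 1-D frequency-domain deconvolution along the last axis."""
    K = np.fft.fft(kernel, n=signal.shape[-1])
    S = np.fft.fft(signal, axis=-1)
    return np.real(np.fft.ifft(S * np.conj(K) / (np.abs(K) ** 2 + eps), axis=-1))

def correct_independently(stack, lens_kernels, crosstalk_kernel):
    """Two-step correction: the lens-MTF blur is inverted per projector
    along the horizontal axis, then the crosstalk is inverted along the
    projector axis. Each step ignores the other, and neither considers
    correlation according to distance in a two-dimensional space.

    stack: array of shape (n_projectors, height, width).
    """
    # Step 1: per-projector lens-MTF correction.
    step1 = np.stack([deconv_1d(img, k) for img, k in zip(stack, lens_kernels)])
    # Step 2: crosstalk correction across projectors at each pixel.
    tmp = np.moveaxis(step1, 0, -1)          # (height, width, n_projectors)
    return np.moveaxis(deconv_1d(tmp, crosstalk_kernel), -1, 0)
```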
Thus, the correction unit 36 of the present disclosure generates inverse functions (inverse filters) for integrally and simultaneously correcting the blurring caused by the crosstalk (crosstalk deterioration) and the blurring caused by the lens MTF (optical deterioration) and outputs the inverse functions (inverse filters) to the image generation unit 31. Then, the image generation unit 31 uses the inverse functions (inverse filters) to correct generated multi-viewpoint images, outputs the corrected multi-viewpoint images to the projection units 32-1 to 32-n, and causes the projection units 32-1 to 32-n to project the corrected multi-viewpoint images.
For example, by such inverse functions, correction is applied to the pixels of the pixel column Pt integrally and simultaneously by using the surrounding pixel columns on the straight lines Lk−1, Lk, and Lk+1 according to their positions in a two-dimensional space.
Here, the inverse functions for applying correction are inverse functions (inverse filters) obtained on the basis of a transfer function (crosstalk deterioration transfer function) representing a generation model of the blurring caused by the crosstalk, and a transfer function (optical deterioration transfer function) representing a generation model of the blurring caused by the lens MTF.
More specifically, a relationship between an input image and an output image that is projected without being corrected is expressed by the following Equation (1).
Y = D·M(X)   (1)
Here, X is the input image, Y is the output image, D is the transfer function representing the generation model of the blurring caused by the crosstalk, and M is the transfer function representing the generation model of the blurring caused by the lens MTF.
The correction unit 36 obtains, in advance, the transfer function D representing the generation model of the blurring caused by the crosstalk, as a function corresponding to the diffusion distribution of images in units of pixel columns by the diffusion plate 34, by, for example, causing the projection unit 32 to project a known test pattern on the screen 33, capturing an image by the imaging unit 35 via the diffusion plate 34, and comparing the captured test pattern with the known test pattern.
Similarly, the correction unit 36 obtains, in advance, the transfer function M representing the generation model of the blurring caused by the lens MTF by, for example, causing the projection unit 32 to project a known test pattern on the screen 33, capturing an image by the imaging unit 35 via the diffusion plate 34, and comparing the captured test pattern with the known test pattern. Alternatively, the transfer function M may be obtained on the basis of data of the lens MTF individually preset for each of the projection units 32.
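A minimal sketch of such an identification step, assuming the captured image has already been aligned to the known pattern; regularized spectral division is one common way to estimate a frequency response:

```python
import numpy as np

def estimate_transfer_function(known_pattern, captured_pattern, eps=1e-3):
    """Estimate the frequency response H of a deterioration process from a
    known test pattern X and its captured appearance Y, assuming the model
    Y = H * X plus noise (regularized division; a sketch, not necessarily
    the disclosure's exact procedure)."""
    X = np.fft.fft2(known_pattern)
    Y = np.fft.fft2(captured_pattern)
    return Y * np.conj(X) / (np.abs(X) ** 2 + eps)
```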
Then, by obtaining inverse functions (inverse filters) on the basis of the transfer functions D and M and multiplying the input image by the inverse functions (inverse filters), the correction unit 36 corrects the output image that is projected on the screen 33, diffused by the diffusion plate 34, and then viewed, as expressed by the following Equation (2).
Y′ = D·M(D−1·M−1(X))   (2)
Here, Y′ is the corrected output image, D−1 is the inverse function of the transfer function representing the generation model of the blurring caused by the crosstalk, and M−1 is the inverse function of the transfer function representing the generation model of the blurring caused by the lens MTF.
Thus, (D−1·M−1(X)) serving as the inverse functions (inverse filters) makes it possible to integrally and simultaneously correct the blurring caused by the crosstalk and the blurring caused by the lens MTF.
That is, the correction unit 36 obtains (D−1·M−1(X)) serving as the inverse functions (inverse filters) by the method described above, and supplies (D−1·M−1(X)) to the image generation unit 31.
When the image generation unit 31 generates the images P1 to Pn on the basis of the (group of) input images PM1, the image generation unit 31 multiplies each of the images P1 to Pn by (D−1·M−1(X)) serving as the inverse functions (inverse filters), and outputs the corrected images P1 to Pn to the projection units 32-1 to 32-n, respectively.
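In the frequency domain, Equations (1) and (2) can be sketched as follows; D_freq and M_freq are the frequency responses of D and M (for example, estimated as above), and the Wiener-style regularization constant lam is an assumption rather than part of the disclosure.

```python
import numpy as np

def combined_inverse_filter(D_freq, M_freq, lam=1e-2):
    """Joint inverse of the composite deterioration D*M of Equation (1),
    inverted as one whole rather than factor by factor."""
    H = D_freq * M_freq                          # composite model
    return np.conj(H) / (np.abs(H) ** 2 + lam)   # regularized (D*M)^-1

def precorrect(image, inv_filter):
    """Multiply an input image X by the inverse filter so that projection
    and diffusion approximately reproduce X (Equation (2))."""
    return np.real(np.fft.ifft2(np.fft.fft2(image) * inv_filter))
```

Because the composite response is inverted as one whole, both deteriorations are accounted for at every spatial frequency at once instead of being corrected one after the other.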
By this processing, since the blurring caused by the crosstalk (crosstalk deterioration) and the blurring caused by the lens MTF (optical deterioration) are integrally and simultaneously corrected, correction is appropriately applied to the surrounding pixel columns according to a spatial position of a pixel column to be corrected, and it becomes possible to correct a three-dimensional image to be viewed with high accuracy.
As a result, even when the image processing unit 11 has a configuration in which the number of projection units 32 is small, a diffusion angle by the diffusion plate 34 is set wide, and crosstalk easily occurs, it is possible to realize viewing of a high-definition three-dimensional image.
Note that, by adjusting a constraint term of each of D−1 and M−1 in (D−1·M−1(X)) serving as the inverse functions (inverse filters), adjustment may be performed so as to preferentially correct one of the blurring caused by the crosstalk (crosstalk deterioration) and the blurring caused by the lens MTF (optical deterioration).
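One way to realize such a preference, sketched under the same frequency-domain assumptions as above, is to give each factor its own constraint term; the weights below are illustrative.

```python
import numpy as np

def weighted_inverse_filter(D_freq, M_freq, lam_d=1e-2, lam_m=1e-2):
    """Inverse filter with separate constraint terms for D^-1 and M^-1:
    lowering lam_d corrects the crosstalk more aggressively, while lowering
    lam_m prioritizes the lens-MTF blur (illustrative weighting)."""
    inv_d = np.conj(D_freq) / (np.abs(D_freq) ** 2 + lam_d)
    inv_m = np.conj(M_freq) / (np.abs(M_freq) ** 2 + lam_m)
    return inv_d * inv_m  # still applied as a single combined filter
```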
<Display Processing>
Next, display processing by the image processing unit 11 described above will be described with reference to a flowchart.
In Step S11, the correction unit 36 sets an unprocessed projection unit 32 among the projection units 32-1 to 32-n as a projection unit to be processed, and acquires and stores an amount of crosstalk on the screen 33 of the projection unit 32 to be processed as information regarding an amount of blurring caused by crosstalk.
More specifically, for example, the image generation unit 31 generates a test pattern, and causes the projection unit 32 to be processed to project the test pattern on the screen 33, and the imaging unit 35 captures an image of the test pattern projected on the screen 33 via the diffusion plate 34, and outputs the captured image of the test pattern to the correction unit 36.
Then, the correction unit 36 measures a diffusion distribution on the basis of comparison between a known test pattern and the captured image of the test pattern, and specifies the amount of crosstalk from the diffusion distribution.
Note that the correction unit 36 may acquire, in advance, a design value or an amount of crosstalk that is measured by another measurement instrument.
In Step S12, the correction unit 36 acquires and stores an amount of blurring of the projection unit 32 to be processed as information regarding an amount of blurring caused by a lens MTF.
More specifically, for example, the image generation unit 31 generates a test pattern, and causes the projection unit 32 to be processed to project the test pattern on the screen 33, and the imaging unit 35 captures an image of the test pattern projected on the screen 33, and outputs the captured test pattern to the correction unit 36.
The correction unit 36 specifies the amount of blurring related to the lens MTF on the basis of comparison between a known test pattern and the captured image of the test pattern.
Note that the correction unit 36 may acquire, in advance, a design value or an amount of blurring related to the lens MTF that is measured by another measurement instrument.
In Step S13, the correction unit 36 determines whether or not there is an unprocessed projection unit 32, and in a case where there is an unprocessed projection unit 32, the processing returns to Step S11.
That is, the processing of Steps S11 to S13 is repeated until the information regarding the amount of crosstalk (the amount of blurring caused by the crosstalk) and the information regarding the amount of blurring caused by the lens MTF that are related to all the projection units 32 are acquired.
Then, in a case where it is determined in Step S13 that the information regarding the amount of crosstalk (the amount of blurring caused by the crosstalk) and the information regarding the amount of blurring caused by the lens MTF related to all the projection units 32 have been acquired, the processing proceeds to Step S14.
In Step S14, the correction unit 36 sets inverse functions (inverse filters) including optimization of a distribution of pixels on the basis of the information regarding the amount of crosstalk (the amount of blurring caused by the crosstalk) and the information regarding the amount of blurring caused by the lens MTF that are related to all the projection units 32, and supplies the inverse functions (inverse filters) to the image generation unit 31.
That is, as described above, the correction unit 36 obtains, as the inverse functions (inverse filters), (D−1·M−1(X)) based on the transfer functions D and M.
In Step S15, the image generation unit 31 reads input images to generate images P1 to Pn, and multiplies each of the images P1 to Pn by the inverse functions (inverse filters), so that the blurring caused by the crosstalk and the blurring caused by the lens MTF are integrally and simultaneously corrected.
Then, the image generation unit 31 outputs the images P1 to Pn in which the blurring caused by the crosstalk and the blurring caused by the lens MTF are integrally and simultaneously corrected to the projection units 32-1 to 32-n, respectively.
In Step S16, the projection units 32-1 to 32-n respectively project, in a superimposed manner, the images P1 to Pn in which the blurring caused by the crosstalk and the blurring caused by the lens MTF are integrally and simultaneously corrected on the screen 33.
By the series of processing described above, the images P1 to Pn in which the blurring caused by the crosstalk (crosstalk deterioration) and the blurring caused by the lens MTF (optical deterioration) are integrally, collectively, and simultaneously corrected are projected on the screen 33 as multi-viewpoint images in a superimposed manner. As a result, a user who views the images via the diffusion plate 34 can view, with the naked eye, a three-dimensional image from which the blurring caused by the crosstalk and the blurring caused by the lens MTF have been removed with high accuracy.
Note that the processing of Steps S11 to S14 may be performed offline in advance so that the inverse functions (inverse filters) are obtained in advance.
In this case, when the multi-viewpoint images are displayed in a superimposed manner, it is only necessary to perform the processing of Steps S15 and S16.
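The overall flow of Steps S11 to S16 can be sketched as follows; every object and method name is a hypothetical stand-in for the units described above, with the measurement steps separated so that they can run offline.

```python
def measure_offline(projectors, correction_unit):
    """Steps S11 to S14: measure the crosstalk and lens-MTF blur of every
    projection unit, then build the combined inverse filters."""
    measurements = [correction_unit.measure(p) for p in projectors]  # S11-S13
    return correction_unit.build_inverse_filters(measurements)       # S14

def display(projectors, image_generation_unit, inv_filters, inputs):
    """Steps S15 and S16: generate, integrally correct, and project."""
    images = image_generation_unit.generate(inputs)                  # S15
    for projector, image, inv in zip(projectors, images, inv_filters):
        projector.project(image_generation_unit.apply(inv, image))   # S16
```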
Furthermore, an example has been described above in which the image processing unit 11 includes the projection units 32 including projectors, the screen 33, and the diffusion plate 34 including an anisotropic diffusion plate; however, other configurations may be adopted.
For example, the projection units 32 and the screen 33 may include a liquid crystal display (LCD) or an organic light emitting diode (OLED), and the diffusion plate 34 may include a lenticular lens or a parallax barrier.
Furthermore, an example has been described in which the correction unit 36 generates inverse functions (inverse filters) used for correction from a transfer function representing a generation model of blurring caused by crosstalk and a transfer function representing a generation model of blurring caused by a lens MTF, and the image generation unit 31 corrects multi-viewpoint images by applying the inverse filters.
However, the image generation unit 31 may instead directly apply, to pixels, optimization processing similar to the correction using the inverse filters, thereby achieving similar correction.
<Case Where Error Due to Inverse Functions Occurs>
An example has been described above in which blurring caused by crosstalk and blurring caused by a lens MTF are integrally, collectively, and simultaneously corrected by obtaining inverse functions (inverse filters) and multiplying an input image by the inverse functions (inverse filters). However, when the input image is multiplied by the obtained inverse functions (inverse filters), some pixel values of the input image may be saturated, and an error may occur in the image.
In such a case, an image may be generated by linear interpolation by using an image of a viewpoint where no error has occurred.
That is, for example, an example of generating multi-viewpoint images in a range of viewpoint positions V11 to V12 will be considered.
It is assumed that, when a viewpoint position is continuously changed in the range of the viewpoint positions V11 to V12, images P101 to P105 are viewed in order according to the viewpoint position.
That is, it is assumed that, when the image P101 is viewed at the viewpoint position V11 and the image P105 is viewed at the viewpoint position V12, the images P102 to P104 are viewed at the corresponding viewpoint positions obtained by dividing a distance between the viewpoint position V11 and the viewpoint position V12 into four equal parts.
In a case where input images are multiplied by inverse functions (inverse filters) to obtain the images P101 to P105, pixel values may be saturated in some of the images, and an error may occur.
In such a case, when the input images are multiplied by the inverse functions (inverse filters), a failure occurs in the generated images.
Thus, in a case where an error occurs, when the images P101 and P105 viewed at the viewpoint positions V11 and V12 are obtained, the images therebetween may be generated so as to be mixed according to the viewpoint positions.
That is, the image P122 is generated by mixing the images P121 and P125 at both ends at a ratio of 3:1 according to its viewpoint position.
Similarly, the image P123 is generated by mixing the images P121 and P125 at a ratio of 1:1.
Moreover, the image P124 is generated by mixing the images P121 and P125 at a ratio of 1:3.
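A minimal sketch of this fallback, assuming equally spaced viewpoints between the two error-free end images (the array names are hypothetical stand-ins):

```python
import numpy as np

def interpolate_viewpoints(img_first, img_last, n_between):
    """Generate intermediate viewpoint images by linearly mixing the two
    error-free end images according to viewpoint position; with equal
    spacing this yields the 3:1, 1:1, and 1:3 mixes described above."""
    steps = (j / (n_between + 1) for j in range(1, n_between + 1))
    return [(1.0 - t) * img_first + t * img_last for t in steps]

# Hypothetical end images standing in for P121 and P125:
P121 = np.zeros((480, 640))
P125 = np.ones((480, 640))
P122, P123, P124 = interpolate_viewpoints(P121, P125, 3)
```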
In a case where the viewpoint is fixed, such mixing is conspicuous; however, a motion parallax, that is, smoothness in a case of moving the viewpoint, is secured.
That is, since such a motion parallax, which is a human visual characteristic, is secured, when the viewpoint position is continuously changed with respect to the images P121 to P125, the mixing is not conspicuous, and the images are viewed as natural images.
<Display Processing in Case where Error Due to Inverse Functions Occurs>
Next, display processing in a case where an error due to inverse functions occurs will be described with reference to a flowchart.
That is, in Step S36, the image generation unit 31 determines, for example, whether or not an error indicating occurrence of a failure in the images, such as saturation of pixel values, has occurred in the images P1 to Pn generated by using the inverse functions (inverse filters).
In a case where it is determined in Step S36 that the error has occurred, the processing proceeds to Step S37.
In Step S37, as described above, the image generation unit 31 generates the image of the viewpoint position where the error has occurred by linear interpolation using the images of the viewpoint positions where no error has occurred.
By this processing, in a case where the error has occurred, the images of the viewpoint positions where no error has occurred are used to generate, by interpolation, the image of the viewpoint position where the error has occurred.
As a result, by using the inverse functions (inverse filters), it becomes possible to integrally, collectively, and simultaneously correct the blurring caused by the crosstalk (crosstalk deterioration) and the blurring caused by the lens MTF (optical deterioration), and even when an error has occurred by using the inverse functions (inverse filters), it becomes possible to obtain an image without a failure by generating the image by interpolation.
An example has been described above in which multi-viewpoint images are projected by the image processing unit 11 to realize viewing of a three-dimensional image; however, the multi-viewpoint images may also be generated such that different two-dimensional images are viewable depending on the viewpoint position.
That is, for example, multi-viewpoint images may be generated such that a different two-dimensional image is viewed in each of a plurality of viewpoint position ranges.
Then, multi-viewpoint images that enable viewing of an image Pa in a viewpoint position range Lpa, and viewing of different images in other viewpoint position ranges, are generated and projected.
Also in the example in which different two-dimensional images are viewable by changing the viewpoint position in this manner, as described above, by integrally, collectively, and simultaneously correcting blurring caused by crosstalk (crosstalk deterioration) and blurring caused by a lens MTF (optical deterioration), it is possible to appropriately correct the blurring caused by the crosstalk (crosstalk deterioration) and the blurring caused by the lens MTF (optical deterioration).
Incidentally, the series of processing described above can be executed by hardware or by software. In a case where the series of processing is executed by software, programs constituting the software are installed from a recording medium in a computer which is built in dedicated hardware, a general-purpose personal computer, for example, in which various programs can be installed for execution of various functions, or the like.
In the computer, a central processing unit (CPU) 1001, a read only memory (ROM) 1002, and a random access memory (RAM) 1003 are connected to one another via a bus 1004, and an input/output interface 1005 is further connected to the bus 1004.
The input/output interface 1005 is connected to an input unit 1006 including input devices such as a keyboard and a mouse, with which a user inputs an operation command, an output unit 1007 that outputs a processing operation screen and an image of a processing result to a display device, a storage unit 1008 including a hard disk drive that stores programs and various types of data, and a communication unit 1009 that includes a local area network (LAN) adapter and executes communication processing via a network represented by the Internet. Furthermore, the input/output interface 1005 is connected to a drive 1010 that reads and writes data from/in a removable storage medium 1011 such as a magnetic disk (including a flexible disk), an optical disk (including a compact disc-read only memory (CD-ROM) and a digital versatile disc (DVD)), a magneto-optical disk (including a mini disc (MD)), or a semiconductor memory.
The CPU 1001 executes various types of processing according to programs stored in the ROM 1002 or programs read from the removable storage medium 1011 such as the magnetic disk, the optical disk, the magneto-optical disk, or the semiconductor memory, installed in the storage unit 1008, and loaded from the storage unit 1008 to the RAM 1003. In the RAM 1003, for example, data necessary for the CPU 1001 to execute various types of processing is also stored if necessary.
In the computer configured as described above, the series of processing described above is performed by, for example, the CPU 1001 loading the programs stored in the storage unit 1008 to the RAM 1003 via the input/output interface 1005 and the bus 1004 to execute the programs.
The programs executed by the computer (CPU 1001) can be provided by being recorded on the removable storage medium 1011 serving as a package medium or the like, for example. Furthermore, the programs can be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
In the computer, the programs can be installed in the storage unit 1008 via the input/output interface 1005 by mounting the removable storage medium 1011 on the drive 1010. Furthermore, the programs can be received by the communication unit 1009 via a wired or wireless transmission medium, and can be installed in the storage unit 1008. Alternatively, the programs can be installed in advance in the ROM 1002 or the storage unit 1008.
Note that the programs executed by the computer may be programs in which a series of processing is performed in time series in the order described in the present specification or may be programs in which the processing is performed in parallel or at a necessary timing, such as when a call is made.
Note that the CPU 1001 implements the functions of the image generation unit 31 and the correction unit 36 described above.
Furthermore, in the present specification, a system means a set of a plurality of components (devices, modules (parts), and the like), and it does not matter whether or not all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected to one another via a network, and one device including a plurality of modules housed in one housing, are both systems.
Note that embodiments of the present disclosure are not limited to the embodiments described above, and various modifications can be made without departing from the gist of the present disclosure.
For example, the present disclosure can have a configuration of cloud computing in which one function is shared and processed in cooperation by a plurality of devices via a network.
Furthermore, the steps described in the flowcharts described above can be executed by one device, or can be shared and executed by a plurality of devices.
Moreover, in a case where a plurality of types of processing is included in one step, the plurality of types of processing included in the one step can be executed by one device, or can be shared and executed by a plurality of devices.
Note that the present disclosure can also have the following configurations.
<1> An image processing apparatus including:
a projection unit that projects a multi-viewpoint image; and
an image generation unit that generates the multi-viewpoint image by integrally and simultaneously applying correction to optical deterioration and correction to crosstalk deterioration.
<2> The image processing apparatus according to <1>, in which
the image generation unit generates the multi-viewpoint image by applying, to an input image, correction filters that integrally and simultaneously apply the correction to the optical deterioration and the correction to the crosstalk deterioration.
<3> The image processing apparatus according to <2>, further including
a correction unit that sets, as the correction filters, inverse filters including inverse functions of an optical deterioration transfer function representing a model that causes optical deterioration in the input image and a crosstalk deterioration transfer function representing a model that causes crosstalk deterioration in the input image.
<4> The image processing apparatus according to <3>, in which
the optical deterioration transfer function is set on the basis of an optical characteristic based on a modulation transfer function (MTF) curve of a lens used when the projection unit includes a projector.
<5> The image processing apparatus according to <3>, in which
the crosstalk deterioration transfer function is set on the basis of a diffusion distribution by a diffusion plate that diffuses the multi-viewpoint image projected by the projection unit in a unit of a pixel column.
<6> The image processing apparatus according to <5>, in which
the projection unit includes a projector, and the diffusion plate includes an anisotropic diffusion plate.
<7> The image processing apparatus according to <5>, in which
the projection unit includes a liquid crystal display (LCD) or an organic light emitting diode (OLED), and the diffusion plate includes a lenticular lens or a parallax barrier.
<8> The image processing apparatus according to <3>, in which
the correction unit adjusts constraint terms in the inverse functions, and sets the correction filters that preferentially correct one of the correction to the optical deterioration and the correction to the crosstalk deterioration.
<9> The image processing apparatus according to <2>, in which when an error occurs in the multi-viewpoint image due to correction using the correction filters, the image generation unit generates a multi-viewpoint image corresponding to the multi-viewpoint image in which the error occurs by linear interpolation by using a multi-viewpoint image in which the error does not occur.
<10> The image processing apparatus according to <9>, in which
the multi-viewpoint image in which an error occurs due to correction using the correction filters includes an image including a pixel having a pixel value saturated.
<11> The image processing apparatus according to any one of <1> to <10>, in which
the multi-viewpoint image includes a multi-viewpoint image that enables viewing of a three-dimensional image according to a viewing position.
<12> The image processing apparatus according to any one of <1> to <10>, in which
the multi-viewpoint image includes a multi-viewpoint image that enables viewing of a two-dimensional image according to a viewing position.
<13> An image processing method including:
image generation processing of generating a multi-viewpoint image projected by a projection unit by integrally and simultaneously applying correction to optical deterioration and correction to crosstalk deterioration.
<14> A program that causes a computer to function as:
a projection unit that projects a multi-viewpoint image; and
an image generation unit that generates the multi-viewpoint image by integrally and simultaneously applying correction to optical deterioration and correction to crosstalk deterioration.
Number | Date | Country | Kind |
---|---|---|---|
2019-068493 | Mar 2019 | JP | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2020/011584 | 3/17/2020 | WO | 00 |