This application is based upon and claims the benefit of priority from Japanese Patent Application No. 2015-070389, filed Mar. 30, 2015, the entire contents of which are incorporated herein by reference.
Embodiments described herein relate generally to an information processing device and a projection apparatus.
Image projection apparatus may include a projector that projects an image onto a projection plane. It is sometimes difficult for users to decide where to install an image projection apparatus so as to project an image appropriately with minimal distortion. Several techniques have been developed to assist users with this task. For example, a technique has been known that evaluates the reproduction level of an image by comparing a desired image to be projected onto the projection plane with the actual image on the projection plane picked up with an imaging means such as a camera or other imager. However, it is sometimes difficult to evaluate an appropriate level of an image observed from the position of the imaging means, which may be a point or area distant from the projection plane.
Each of the embodiments will now be described in detail with reference to the accompanying drawings.
Note that the figures are conceptual pattern diagrams, and the relationships between thicknesses and widths and ratios of size of each part are not necessarily represented to scale. Moreover, the size and ratio of components that appear in multiple figures are not necessarily to scale, or the same in each figure.
According to one embodiment, an information processing device connected to a projection unit comprises a shape acquiring unit, an observation point acquiring unit, an evaluating unit, and a generating unit. The shape acquiring unit is configured to acquire shape information regarding a projection plane onto which an image is projected by the projection unit and projection point information regarding a position of the projection unit. The observation point acquiring unit is configured to acquire observation point information regarding an observation point to observe the projection plane. The evaluating unit is configured to evaluate an appropriate level of the projection plane based on the shape information, the projection point information, and the observation point information. The generating unit is configured to generate auxiliary information data based on the appropriate level.
According to one embodiment, a projection apparatus comprises a projection unit and the information processing device.
The image projection apparatus 100 comprises an information processing device 200, a projection unit 110, a distance sensor 111, and an input unit 112. The information processing device 200 comprises an evaluating unit 203 and a generating unit 204. The information processing device 200 further comprises a shape acquiring unit 201 and an observation point acquiring unit 202.
The information processing device 200 may be, for example, an IC (integrated circuit) such as an ASIC (Application Specific Integrated Circuit) or FPGA (Field Programmable Gate Array), or an electric circuit such as a CPU (Central Processing Unit) or an MPU (Micro Processing Unit). The information processing device 200 may be implemented, in whole or in part, using an IC such as an LSI (Large Scale Integration) or IC chip set. The IC may be a special-purpose processor or general-purpose processor. Some or all of the blocks in
The information processing device 200 may be in a housing with the projection unit 110, the distance sensor 111, and the input unit 112, or in a housing separated from another housing in which the projection unit 110, the distance sensor 111, and the input unit 112 are located. The information processing device 200 may be connected to the projection unit 110 directly or indirectly. The data from the information processing device 200 may be, for example, transmitted to the projection unit 110 by wire or by a wireless interface.
The image projection apparatus 100 is set so that the projection unit 110 faces a projection plane. The projection plane is a plane of an object that faces the image projection apparatus 100.
Input image data 101 is converted to output image data by the generating unit 204, and the output image is projected onto the projection plane by the projection unit 110. A process for selecting an appropriate projection plane onto which an output image is projected will now be described.
The distance sensor 111 measures a distance between the distance sensor 111 and the projection plane and transmits distance information 209 acquired during the measurement step to the shape acquiring unit 201.
The shape acquiring unit 201 acquires information regarding a three dimensional shape of the projection plane (shape information 205) and information regarding a position of the image projection apparatus 100 (projection point information 210) from the distance information 209. The shape acquiring unit 201 transmits the shape information 205 and the projection point information 210 to the evaluating unit 203.
The shape information 205 may be, for example, information regarding asperity of a projection plane or three dimensional coordinates of points on the projection plane. The projection point information 210 may be, for example, information regarding relative positions between the projection plane and the image projection apparatus 100. The projection point information 210 may be, for example, information regarding a coordinate of the position of the image projection apparatus 100 within a coordinate system.
The observation point acquiring unit 202 may be connected to the input unit 112. The observation point acquiring unit 202 acquires information regarding an observation point for observing the projection plane (observation point information 206) input by the user via the input unit 112. The observation point acquiring unit 202 transmits the observation point information 206 to the evaluating unit 203. The observation point information 206 may be, for example, information regarding relative positions between the image projection apparatus 100 and the observation point or between the projection plane and the observation point. The observation point information 206 may be, for example, information regarding a coordinate of the position of the observation point in a coordinate system.
The input image data 101 is transmitted to the evaluating unit 203. In one embodiment, the numbers of pixels along the horizontal and vertical directions of the input image data 101 are included in the input image data 101 and transmitted to the evaluating unit 203.
The evaluating unit 203 evaluates whether the projection plane is appropriate. For example, the evaluating unit may evaluate whether one or more parameters of the projection plane meet an appropriate level 207. The appropriate level may be evaluated, for example, based on at least one of (i) a positional shift between the output image and the image estimated to be observed from the observation point in the case where the output image is projected onto the projection plane (an estimated image), and (ii) deficit information (such as information regarding image distortion and/or deficit). The evaluating unit 203 transmits the appropriate level 207 to the generating unit 204.
The estimated image is acquired with the shape information 205, the projection point information 210, the observation point information 206, and the input image data 101 by the evaluating unit 203. That is, the estimated image is the image acquired by estimating the image to be observed from the observation point (an observation image) in the case the output image is projected onto the projection plane.
A positional shift may be, for example, a gap between the position of a point in the estimated image and the position of the corresponding point in the output image. A deficit means that there is no point in the estimated image which corresponds to a point in the output image. The smaller the positional shift and deficit in the estimated image are, the higher the appropriate level 207 is.
The generating unit 204 generates auxiliary information data 208 for informing the user of the appropriate level 207. For example, the appropriate level 207 may be included in the auxiliary information data 208. The auxiliary information data 208 may include, for example, a predetermined region in the projection plane and the appropriate level 207 in the predetermined region.
Processing of the information processing device 200 will now be described.
The distance sensor 111 and the shape acquiring unit 201 will now be described.
The object 350 has a projection plane 351. The distance sensor 111 measures the distance between the distance sensor 111 itself and the projection plane 351 and transmits information regarding the distance (distance information 209) to the shape acquiring unit 201. The shape acquiring unit 201 acquires the shape information 205 of the projection plane 351 and the projection point information 210 based on the distance information 209 and transmits the shape information 205 and the projection point information 210 to the evaluating unit 203.
The distance sensor 111 may comprise a projection unit 111a and a receiving unit 111b. The projection unit 111a and the receiving unit 111b may be at almost the same height from the ground. For example, a line segment C1 connecting the center 111c of the projection unit 111a and the center 111d of the receiving unit 111b may be almost parallel to the bottom face 111e of the distance sensor 111. Therefore, the line segment C1 is horizontal in the case where the distance sensor 111 is provided on a horizontal plane.
The projection unit 111a may, for example, project infrared light which has a random pattern onto the projection plane 351. The receiving unit 111b may receive a part of the infrared light reflected by the projection plane 351.
The case where the pattern of infrared light projected by the projection unit 111a and the pattern of infrared light received by the receiving unit 111b are two dimensional images will now be described. The coordinates of a point in the receiving image corresponding to the coordinates (xp, yp) of a pixel in the projection image is defined as (xc, yc). The shape acquiring unit 201 may find coordinates (Xs, Ys, Zs) of a three dimensional shape of the projection plane 351 by finding the coordinates (xc, yc) of a pixel in the receiving image corresponding to the coordinates (xp, yp) of a pixel in the projection image as follows:
In
An example of positional relationship between the projection unit 111a, the receiving unit 111b, and the object 350 is described in
The distance D between the distance sensor 111 and the object 350 is defined by relation (2).
The coordinates (Xs, Ys, Zs) of a three dimensional shape of the projection plane 351 are defined by relations
where yc is a y-coordinate of the pixel in the receiving image.
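Relations (1) through (5) are not reproduced above. The following is a minimal sketch of the kind of triangulation they describe, assuming a rectified pinhole model in which the projection unit 111a and the receiving unit 111b are separated by a horizontal baseline b and share a focal length f expressed in pixels; the function name and the exact model are illustrative assumptions, not the relations used by the embodiment.

```python
import numpy as np

def triangulate_point(xp, yp, xc, yc, f, b):
    """Estimate the 3D coordinates (Xs, Ys, Zs) of a point on the projection
    plane 351 from a correspondence between the projected pixel (xp, yp) and
    the received pixel (xc, yc).

    Assumed model: projector 111a and receiver 111b are rectified along the
    horizontal baseline b, share focal length f (in pixels), and the
    correspondence differs only in the x coordinate (the disparity)."""
    disparity = xp - xc
    if abs(disparity) < 1e-6:
        return None                    # point at (near-)infinite distance
    Zs = f * b / disparity             # depth from disparity
    Xs = Zs * xc / f                   # back-project the received pixel
    Ys = Zs * yc / f
    return np.array([Xs, Ys, Zs])
```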
The light that the projection unit 111a projects and the light the receiving unit 111b receives may be visible light. Alternatively, the light may be invisible light, such as infrared (IR) or ultraviolet (UV) light.
An image unit may be used as a means to measure distance instead of the distance sensor 111. In this case, the projection unit 110 projects light in a predetermined pattern onto the object 350. The image unit may take an image of the object 350. The distance information between the image unit and the object 350 may be acquired based on the corresponding relation between the image taken by the image unit and the pattern projected onto the object 350 by the projection unit 110.
A plurality of image units may be used instead of the distance sensor 111. In this case, the image projection apparatus 100 may acquire the distance between the image units and the object 350 based on the corresponding relationship between pixels in a plurality of images taken by the plurality of image units. For example, the distance between the object 350 and the line connecting two image units may be acquired.
The method by which the distance is measured by the distance sensor 111 is not limited to the above-described method involving the projection of light which has a predetermined pattern. For example, the distance sensor 111 may project light modulated in a pulse shape onto the projection plane 351, receive the light reflected by the projection plane 351, and measure the distance between the distance sensor 111 and the projection plane 351 based on the phase difference between the projected light and the received light. In this way, the distance sensor 111 is an example of a means to acquire the three dimensional shape of the projection plane 351 onto which an image is projected by the projection unit 110. However, means to acquire the three dimensional shape of the projection plane 351 are not limited to the means described above.
For example, the three dimensional shape of the projection plane 351 may be determined with reference to the position of the receiving unit 111b of the distance sensor 111 as an origin. Therefore, the coordinates of a projection point (the projection unit 110) may be set based on the distance between the receiving unit 111b of the distance sensor 111 and the projection unit 110. For example, in the case where the receiving unit 111b is located at distances px in the X direction, py in the Y direction, and pz in the Z direction from the projection unit 110, the coordinates of the projection point (Xp, Yp, Zp) may be defined as (px, py, pz). The shape acquiring unit 201 acquires the shape information 205 of the projection plane and transmits the shape information 205 to the evaluating unit 203 with the projection point information 210.
The observation point acquiring unit 202 acquires the observation point information 206 and transmits the observation point information 206 to the evaluating unit 203.
For example, a coordinate system, which has a Z direction parallel to the projection direction of the projection unit 110, an X direction perpendicular to the Z direction, and a Y direction perpendicular to both the X and Z directions, may be displayed on the display 113 before a user inputs a position of an observation point. A user may input a position of an observation point in the coordinate system with the operating unit 114. After inputting, the position of the observation point in the coordinate system may be displayed on the display 113.
The observation point information 206 may include information regarding an area that includes the observation point instead of information regarding the position of the observation point. The area of observation points may be an area defined based on a reference observation point. By inputting a reference observation point and a length, an area of observation points centered on the reference observation point with a radius of that length may be defined. The area of observation points may be displayed on the display 113 instead of the position of the reference observation point.
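As one illustration of how such an area of observation points might be turned into the L discrete observation points used later in relation (19), the following sketch samples candidate points inside a sphere centered on the reference observation point; the uniform rejection sampling is an assumption made only for this example.

```python
import numpy as np

def sample_observation_area(reference_point, radius, count, seed=0):
    """Generate `count` candidate observation points inside a sphere centered
    on the reference observation point (an assumed sampling scheme)."""
    rng = np.random.default_rng(seed)
    points = []
    while len(points) < count:
        offset = rng.uniform(-radius, radius, size=3)
        if np.linalg.norm(offset) <= radius:      # keep points inside the sphere
            points.append(np.asarray(reference_point, dtype=float) + offset)
    return np.stack(points)
```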
In some embodiments, the input unit 112 may be separate from a housing in which the information processing device 200 is located. The input unit 112 may be, for example, implemented on a tablet device or personal computer (PC), or other type of remote control.
For example, the input unit 112 may comprise a touch panel or touchscreen. By touching the touch panel, the coordinate system displayed on the touch panel may be rotated. A user may set a position of an observation point, or an observation area, by touching the corresponding area of the touch panel. In the case where an area of observation points is set, the observation point acquiring unit 202 may use the center of the area or the center of gravity of the area as a reference observation point.
The evaluating unit 203 acquires the appropriate level 207 and transmits the appropriate level 207 to the generating unit 204 (S202). An exemplary method of acquiring the appropriate level 207 will be described with reference to
The evaluating unit 203 finds a point on the projection plane 351 and the pixel in the output image that corresponds to the point on the projection plane in the case where the output image is projected onto the projection plane by the projection unit 110. The pixel in the output image may be, for example, represented in a coordinate system defined with reference to the output image.
In
A coordinate mp of a pixel in the output image is defined as (xp, yp). The three dimensional coordinate M of a point on the projection plane 351 corresponding to the pixel is defined as (Xs, Ys, Zs) in the case where the output image is projected onto the projection plane 351. The relationship between the coordinate mp of a pixel in the output image and the three dimensional coordinate M of the point on the projection plane 351 is as follows:
{tilde over (m)}p=Pp·{tilde over (M)}   (6)
where {tilde over (m)}p is a homogeneous coordinate of mp and {tilde over (M)} is a homogeneous coordinate of M.
Pp represents a perspective projection matrix for an image projected onto the projection plane 351 by the projection unit 110. That is, the evaluating unit 203 performs the perspective projection transform in the process of finding the coordinates of the pixel in the output image corresponding to the point of the three dimensional coordinates on the projection plane 351. The perspective projection matrix Pp is represented by an internal parameter Ap and external parameters Rp and tp.
Pp=Ap·[Rp, tp]   (7)
The internal parameter Ap may indicate, for example, characteristics of the projection unit 110. For example, the internal parameter Ap may be defined based on the focal length of a lens of the projection unit 110 and the position (the coordinates) of the center of a display element relative to the light axis of the lens of the projection unit 110.
The external parameters Rp and tp may indicate, for example, the position of the projection unit 110 and the attitude of the projection unit 110. For example, the external parameters Rp and tp may be defined based on the position of the projection unit 110 relative to an origin set arbitrarily in the three dimensional space and the direction in which an image is projected by the projection unit 110.
The internal parameter Ap and the external parameters Rp, and tp will be described in more detail.
In
In
y′=y (9)
Therefore, the coordinate (cx, cy) of the center 131a of the display element 131 relative to the light axis 121a of the lens 121 is (0, 0). That is, cx=0 and cy=0. On the other hand, in

y′=y+cy   (10)
The y-coordinate cy of the center 131a of the display element 131 relative to the light axis 121a of the lens 121 is not 0. That is, cy may be a value other than 0 that depends on the positions of the light axis 121a of the lens 121 and the center 131a of the display element 131. In the case where the light axis 121a of the lens 121 and the center 131a of the display element 131 are misaligned along the x direction, the x-coordinate cx of the center 131a of the display element 131 may likewise be a value other than 0. The internal parameter Ap is represented as follows:
where fx and fy in relation (11) represent the focal lengths of the lens 121 in pixel units. As described in relation (11), the internal parameter Ap is a parameter defined based on the coordinate (cx, cy) of the center 131a of the display element 131 and the focal lengths fx and fy of the lens 121 in pixel units.
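Relation (11) itself is not reproduced above. The sketch below builds the conventional 3×3 internal parameter matrix that this description corresponds to; the zero-skew form is an assumption for illustration.

```python
import numpy as np

def intrinsic_matrix(fx, fy, cx, cy):
    """Internal parameter Ap: focal lengths fx, fy in pixel units and the
    offset (cx, cy) of the center 131a of the display element 131 from the
    light axis 121a of the lens 121 (zero skew assumed)."""
    return np.array([[fx, 0.0, cx],
                     [0.0, fy, cy],
                     [0.0, 0.0, 1.0]])
```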
The external parameter tp represents a parallel translation of the projection point. The matrix for the external parameter tp is a parallel translation matrix t which has the position of the projection point as the origin. The parallel translation matrix t is represented as follows:
The external parameter Rp represents a projection direction of the projection unit 110. The matrix for the external parameter Rp is a rotation matrix R that transforms coordinates so that the vector V representing the projection direction points in the Z direction. The rotation matrix R is represented as follows:
Therefore, the matrix of external parameters [Rp, tp] transforms a coordinate on the projection plane 351 represented in the world coordinate system into a coordinate in a coordinate system which has the projection position as its origin and the projection direction as its Z axis. The rotation matrix Rx to rotate a coordinate (x, y, z) by α degrees around the X axis is represented as follows:
The rotation matrix Ry to rotate a coordinate (x, y, z) β degrees around a Y axis is represented as follows:
The rotation matrix Rz to rotate a coordinate (x, y, z) by γ degrees around the Z axis is represented as follows:
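Relations (12) through (16) are not reproduced above. The following sketch shows the elementary rotation matrices and how the perspective projection matrix Pp=Ap·[Rp, tp] of relation (7) can be assembled and applied as in relation (6); the composition order R=Rz·Ry·Rx and the use of radians are assumptions made only for this illustration.

```python
import numpy as np

def rot_x(alpha):
    """Rotation by alpha (radians) around the X axis."""
    c, s = np.cos(alpha), np.sin(alpha)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(beta):
    """Rotation by beta (radians) around the Y axis."""
    c, s = np.cos(beta), np.sin(beta)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(gamma):
    """Rotation by gamma (radians) around the Z axis."""
    c, s = np.cos(gamma), np.sin(gamma)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def projection_matrix(A, R, t):
    """Assemble the 3x4 perspective projection matrix P = A·[R, t] (relation (7))."""
    return A @ np.hstack([R, np.asarray(t, dtype=float).reshape(3, 1)])

def project(P, M):
    """Relation (6): map a 3D point M = (Xs, Ys, Zs) on the projection plane 351
    to a pixel coordinate via homogeneous coordinates."""
    m = P @ np.append(np.asarray(M, dtype=float), 1.0)
    return m[:2] / m[2]
```

For example, R = rot_z(gamma) @ rot_y(beta) @ rot_x(alpha) would give one possible rotation matrix R for the external parameters under the assumed composition order.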
As described with respect to
A method to find a coordinate of a pixel in the observation image corresponding to a three dimensional coordinate of a point on the projection plane 351 will now be described. A coordinate me of a pixel in the observation image is defined as (xe, ye). A three dimensional coordinate M on the projection plane 351 corresponding to the coordinate me is defined as (Xs, Ys, Zs). The relation between the coordinate me and the three dimensional coordinate M is represented as follows:
{tilde over (m)}e≅Pe·{tilde over (M)}   (17)
The matrix Pe in relation (17) is a perspective projection matrix for the image on the projection plane 351 observed from the position of the observation point acquired by the observation point acquiring unit 202. That is, the evaluating unit 203 performs the perspective projection transform in the process of finding the coordinate of the pixel in the observation image corresponding to the point of the three dimensional coordinates on the projection plane 351.
In this embodiment, the perspective projection Pe is defined based on an internal parameter and external parameters of the image unit because the projection plane 351 is observed from the observation point with an image unit such as a camera. The internal parameter and the external parameters are similar to the internal parameter and the external parameters of the perspective projection Pp and may be acquired by preliminary calibration.
The corresponding relation between a coordinate (xp, yp) of a pixel in the input image and a coordinate (xe, ye) of a pixel in the observation image is acquired by finding, with relations (6) and (17), the coordinate (xp, yp) of the pixel in the input image corresponding to a point on the projection plane 351 and the coordinate (xe, ye) of the pixel in the observation image corresponding to the same point. In
Therefore the relation between the output image and the observation image is represented with the relations (6) and (17). That is, the estimated image is acquired with reference to the output image and relations (6) and (17).
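The following sketch tabulates that correspondence, assuming the shape information 205 is available as a set of three dimensional points on the projection plane 351 and that Pp and Pe are the 3×4 perspective projection matrices of relations (6) and (17); the names and data layout are illustrative assumptions.

```python
import numpy as np

def correspondence_table(surface_points, Pp, Pe):
    """Pair each output-image pixel with the estimated-image pixel it maps to.

    For every 3D point M = (Xs, Ys, Zs) on the projection plane 351, relation
    (6) gives the output-image pixel mp and relation (17) gives the pixel me
    observed from the observation point."""
    def to_pixel(P, M):
        m = P @ np.append(np.asarray(M, dtype=float), 1.0)   # homogeneous coordinate
        return m[:2] / m[2]
    return [(to_pixel(Pp, M), to_pixel(Pe, M)) for M in surface_points]
```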
The case where the relation between a coordinate (xp, yp) of a pixel in the input image and a coordinate (xe, ye) of a pixel in the observation image is not acquired will be described with
The case where a relationship between the positions of the projection plane, the observation point and the projection point are illustrated in
When the projection plane 503 is observed from the observation point 502, the projection area 505 which is a part of the projection area 504 is in a shadow of the part in the projection area 504 which is closer to the observation point. The projection area 505 is an area between the convex portion and concave portion.
An image projected onto the point (Xs, Ys, Zs) in the projection area 505 is not observed from the observation point 502. Therefore the pixel in the observation image corresponding to the coordinate (xp, yp) of the pixel in the output image is not found. That is, there is a deficit in the image observed from the observation point 502.
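One possible way to detect such a deficit is a depth-buffer test from the observation point, sketched below under the assumption that the projection plane is represented as a dense set of 3D points whose observation-image pixels have already been computed with relation (17); a point hidden behind a nearer point that falls in the same pixel cell is flagged as not observable.

```python
import numpy as np

def deficit_flags(surface_points, observer_pixels, observation_point, cell=1.0):
    """Flag points on the projection plane that are not visible from the
    observation point 502. Each point is quantized to a pixel cell of the
    observation image; if a nearer point occupies the same cell, the farther
    point is treated as a deficit."""
    best_depth = {}
    depths, cells = [], []
    for M, me in zip(surface_points, observer_pixels):
        key = tuple(np.floor(np.asarray(me) / cell).astype(int))
        depth = float(np.linalg.norm(np.asarray(M) - np.asarray(observation_point)))
        depths.append(depth)
        cells.append(key)
        if key not in best_depth or depth < best_depth[key]:
            best_depth[key] = depth
    return [d > best_depth[k] + 1e-6 for d, k in zip(depths, cells)]
```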
The case where the coordinate (xp, yp) of the pixel in the output image is projected onto the point (Xs, Ys, Zs) which is included in the projection area 508 is illustrated in
For example, when the projection plane 506 includes a reflective or mirrored area, the distance sensor 111 may not be able to measure the distance between the distance sensor 111 and the mirrored area in the projection plane 506, since the light incident on the mirrored area is reflected toward the projection unit 111a rather than toward the receiving unit 111b. As another example, when there is an area which has a significantly low reflection ratio in the projection plane 506, sufficient light may not reach the receiving unit 111b because the low reflection area may not reflect sufficient light from the projection unit 111a back to the distance sensor. As described above, it may be difficult to measure the distance between the distance sensor 111 and an area which has a significantly low reflection ratio.
In
The evaluating unit 203 calculates shift amounts d for all corresponding pairs between the pixels (xp, yp) in the output image and the pixels (xe, ye) in the estimated image with relation (18).
d=(xp−xe)²+(yp−ye)²   (18)
The total of the shift amounts d for an observation point is defined as error D. In the case where an area of observation points is used for the observation point information 206, the error D(i) for each observation point in the area is represented by relation (19), where i is an index for each of the observation points in the area (1≦i≦L), L is the number of points in the area of observation points and may be chosen arbitrarily, j is an index for each of the pixels in the output image (1≦j≦N), and N is the number of pixels in the output image. i, j, L, and N are whole numbers.
In the case where there is no pixel (xe, ye) in the estimated image corresponding to the (xp, yp) in the output image, the shift amount d may be the predetermined maximum value dmax for the shift amount d. In the case of
The maximum value (or, alternatively, the minimum or the average) among the errors D(i) for each of the observation points in the observation area may be used as a representative error Dout for the area; relation (20) shows the maximum case.
Dout=max(D(i))   (20)
The evaluating unit 203 may, for example, calculate an appropriate level Rlv (the appropriate level 207), which represents how appropriate the projection plane is for projection of an image, with relation (21) based on the error D(i) for the area of observation points and a reference error Dth, which is a predetermined number representing the acceptable amount of error.
For example, the closer the appropriate level Rlv gets to 1, the more appropriate the projection plane is for projection. In the case where the maximum value among the errors D(i) for each of the observation points in the area of observation point is chosen as a representative error Dout for the area of observation point, the appropriate level Rlv may be the minimum value. In the case where the minimum value among the errors D(i) for each of the observation points in the area of observation point is chosen as a representative error Dout for the area of observation point, the appropriate level Rlv may be the maximum value. In the case where the average among the errors D(i) for each of the observation points in the area of observation point is chosen as a representative error Dout for the area of observation point, the appropriate level Rlv is the average. The evaluating unit 203 transmits the appropriate level Rlv to the generating unit 204.
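Relations (19) and (21) are not reproduced above. The sketch below follows their description: the shift amount d of relation (18) is summed per observation point, reduced over the area with max, min, or mean, and converted to an appropriate level. The form Rlv = 1 − Dout/Dth is an assumption for illustration, and dmax is used when no corresponding pixel exists.

```python
def shift_amount(mp, me, d_max):
    """Relation (18); d_max is returned when there is no corresponding pixel
    (a deficit) in the estimated image."""
    if me is None:
        return d_max
    (xp, yp), (xe, ye) = mp, me
    return (xp - xe) ** 2 + (yp - ye) ** 2

def appropriate_level(pairs_per_observation_point, d_max, d_th, reduce=max):
    """pairs_per_observation_point[i] is the list of (mp, me) pairs for
    observation point i. D(i) totals the shift amounts (relation (19)); the
    representative error Dout reduces over the area (relation (20) uses max,
    but min or the average may be chosen instead); Rlv = 1 - Dout/Dth is the
    assumed form of relation (21)."""
    errors = [sum(shift_amount(mp, me, d_max) for mp, me in pairs)
              for pairs in pairs_per_observation_point]
    return 1.0 - reduce(errors) / d_th
```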
Other methods may be used to calculate the appropriate level 207. For example, the appropriate level 207 may represent the ratio of the area which includes neither a deficit nor a positional shift to the total area of the observation image. For example, in the case where the observation image includes neither a deficit nor a positional shift, the appropriate level 207 may be 100 percent. The appropriate level 207 may also be represented as a rank.
The generating unit 204 generates the auxiliary information data 208 to be provided for a user based on the appropriate level Rlv (S203).
The projection unit 110 projects the output image including the auxiliary information onto the projection plane (S204).
Examples of the output image including the auxiliary information are illustrated in
The auxiliary information need not be a part of the image information. In the case where the image projection apparatus 100 comprises a sound generating unit, the sound generating unit may generate sound based on the appropriate level 207 included in the auxiliary information transmitted from the generating unit 204.
After the projection plane is decided as described above, the input image data 101 is input to the generating unit 204, transmitted to the projection unit 110, and an output image is generated by the projection unit 110. This output image does not include the auxiliary information. The output image is projected onto the projection plane by the projection unit 110. The projected image on the projection plane includes less deficit and positional shift and is the same as or similar to the output image when the projected image is observed from the observation point.
By this embodiment, it is easy to determine whether the projection plane is appropriate for a projection to an arbitrary observation point. By this embodiment, it is easy for a user to determine whether the projection setting, which includes the position of the projection unit 110, direction of the projection unit, projection plane, and the position of the observation point, is appropriate or should be changed.
The second embodiment will now be described. In this embodiment, the appropriate levels 207 are generated for small regions in the projection plane. The evaluating unit 203 evaluates the appropriate levels 207 for the small regions and transmits the appropriate levels 207 to the generating unit 204. Each of the appropriate levels 207 indicates how appropriate the corresponding small region in the projection plane onto which the output image is projected is. Each of the appropriate levels 207 may instead indicate whether the region in the projection plane is appropriate or not. Details of the evaluating unit 203 and the generating unit 204 will now be described.
An example of output image including a plurality of small regions is illustrated in
In the case where there is no pixel (xe, ye) in the estimated image corresponding to the pixel (xp, yp) in the output image, the shift amount d may be the predetermined maximum value dmax for the shift amount d. In the case where the observation point information 206 encompasses an observation area, the errors D(i, j) are calculated for each of the plurality of observation points i and each small region j. The maximum value among the calculated errors D(i, j) may be the representative error Dout(j) for the area of observation points.
Dout(j)=max(D(i, j))   (22)
The error D for the small region j (or Dout (j)) may be acquired in this way.
The evaluating unit 203 evaluates the appropriate level Rlv(j) (the appropriate level 207), which indicates how appropriate the projection plane is for projecting an image, with relation (23) based on the representative error for each small region (or Dout(j)) and a reference error Dth. The reference error Dth is a predetermined number representing the acceptable amount of error. The appropriate level Rlv(j) is calculated for each small region as follows:
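Relation (23) itself is not reproduced above. A per-region sketch of the same calculation, assuming the form Rlv(j) = 1 − Dout(j)/Dth as in the earlier sketch, might look like this:

```python
def appropriate_level_per_region(errors, d_th):
    """errors[i][j] is D(i, j), the error for observation point i and small
    region j. Dout(j) is the maximum over i (relation (22)); the per-region
    level Rlv(j) = 1 - Dout(j)/Dth is the assumed form of relation (23)."""
    num_regions = len(errors[0])
    levels = []
    for j in range(num_regions):
        d_out_j = max(row[j] for row in errors)
        levels.append(1.0 - d_out_j / d_th)
    return levels
```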
The evaluating unit 203 transmits the plurality of appropriate levels Rlv(j) to the generating unit 204.
The generating unit 204 generates the auxiliary information data 208 based on the appropriate levels Rlv(j). The projection unit 110 may, for example, generate the output image including an image based on the input image data 101 and the auxiliary information data 208.
Examples of the output image of this embodiment are illustrated in
In the case where a region in the projection plane has a relatively high appropriate level, the auxiliary information may include information to indicate that a direction to the region is an appropriate projection direction, as shown in
By this embodiment, it is easy to determine whether the projection plane is appropriate for a projection to an arbitrary observation point. By this embodiment, it is easy for a user to determine whether the projection setting, which includes the position of the projection unit 110, direction of the projection unit, projection plane, and the position of the observation point, is appropriate or should be changed.
The third embodiment will now be described. In this embodiment, the image projection apparatus 100 further includes a correction unit 301 to generate correction image data 302 based on the shape information 205, the observation point information 206, and the projection point information 210. The correction image data 302 is a corrected version of the input image data 101. The evaluating unit 203 evaluates the appropriate level 207 for the correction image data 302 to be projected onto the projection plane by the projection unit 110.
Processing of the correction unit 301 will be described first. Generally, in the case where an image projected by the projection unit 110 is observed, as long as the projection point and the observation point are not the same, the observation image gets distorted relative to the output image. The correction unit 301 corrects the input image data 101 and generates the correction image data 302 so that the observation image does not get distorted relative to the output image.
The distortion of an image on the projection plane will be described with
As described in
{tilde over (m)}c1≅PC·PP−1·{tilde over (m)}P   (24)
As described in
{tilde over (m)}c2≅PP·PC−1·{tilde over (m)}P   (25)
It is found that the distortion of the second image relative to the first image and the distortion of the fourth image relative to the third image are inverses of each other; that is, the fourth image is a predistorted image. Therefore, as represented in relation (26), in the case where the fourth image is projected from the position of the projection unit 110 and observed from the position of the image apparatus 370, the distortion is cancelled. That is, the distortion in the observation image is suppressed.
{tilde over (m)}c1≅PC·PP−1·(PP·PC−1·{tilde over (m)}P)≅{tilde over (m)}P   (26)
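A minimal sketch of such a pre-distortion is given below. It assumes that the per-pixel correspondence between output-image pixels and observer-image pixels has already been tabulated (for example as in the correspondence sketch above) and that the observer image and the input image share the same pixel grid; nearest-neighbor resampling is used only for brevity.

```python
import numpy as np

def predistort(input_image, correspondence):
    """Build the correction image: for each output-image pixel (xp, yp), look up
    the observer-image pixel (xe, ye) it maps to and copy the input-image value
    from there, so that the projected result appears undistorted from the
    reference observation point (the spirit of relations (24)-(26))."""
    h, w = input_image.shape[:2]
    correction = np.zeros_like(input_image)
    for (xp, yp), (xe, ye) in correspondence:
        xo, yo = int(round(xp)), int(round(yp))
        xi, yi = int(round(xe)), int(round(ye))
        if 0 <= xo < w and 0 <= yo < h and 0 <= xi < w and 0 <= yi < h:
            correction[yo, xo] = input_image[yi, xi]
    return correction
```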
The information processing device 200 of this embodiment will be described based on the relations represented above.
The image projection apparatus 100 in this embodiment projects a correction image, that is, an image in which distortion is suppressed in the case where the image is observed from the reference observation point. The correction unit 301 generates the correction image data 302 from the input image data 101 and transmits the correction image data 302 to the projection unit 110 (S212).
The evaluating unit 203 evaluates a positional shift of pixels and a deficit (information regarding a distortion and/or deficit) in the observation image relative to the output image and calculates the appropriate level 207 of the projection plane. The correction image is shaped so as not to have a positional shift of pixels and not to have a deficit relative to the input image in the case where the correction image is observed from the reference observation point.
As illustrated in
The evaluating unit 203 further evaluates a positional shift of pixels and a deficit (information of a distortion and/or a deficit) in the observation image relative to the output image in the case where the observation image is observed from an observation point other than the reference observation point.
In the case where the observation point information 206 acquired by the observation point acquiring unit 202 includes only the reference observation point, the appropriate level 207 is calculated based on first evaluation information, which indicates information of a distortion and a deficit for the reference observation point. In the case where the observation point information 206 is an observation area, the appropriate level 207 is calculated based on evaluation information which indicates information of a distortion and a deficit for each of the observation points in the area.
In this embodiment, the information of a distortion and a deficit for an observation point other than the reference observation point is acquired by estimating the distortion and deficit in the observation image relative to the output image that would be caused in the virtual case where the output image is projected from the reference observation position and the projected output image is observed from the observation point. Based on this, the information of a distortion and a deficit is estimated with relation (22) for a plurality of observation points other than the reference observation point. Among the acquired information of a distortion and a deficit, the maximum value is defined as distortion information D2.
The evaluating unit 203 finds the sum of deficit information D1 and distortion information D2 as an amount of deficit and distortion Dout and calculates the appropriate level Rlv with relation (23) and the amount of deficit and distortion Dout (S213). In the case where the observation point information 206 includes information of a single observation point, the amount of deficit and distortion Dout may be the deficit information D1. In the case where the observation point information 206 includes an observation area, the amount of deficit and distortion Dout may be the distortion information D2. In this case, it is supposed that there is no deficit in the observation image observed from the reference observation point.
As described above, the evaluating unit 203 transfers the appropriate level 207 to the generation unit 204. The generation unit 204 generates the auxiliary information data 208 as described in first and second embodiments and transfers the auxiliary information data 208 to the projection unit 110 (S214). The projection unit 110 projects the auxiliary information onto the projection plane. The projection unit 110 may project the auxiliary information with the correction image onto the projection plane (S215).
By this embodiment, it is easy to determine whether the projection plane is appropriate for a projection to an arbitrary observation point. By this embodiment, it is easy for a user to determine whether the projection setting, which includes the position of the projection unit 110, direction of the projection unit, projection plane, and the position of the observation point, is appropriate or should be changed.
The fourth embodiment will now be described. In particular, the differences between the first embodiment and this embodiment will be described. In this embodiment, the image projection apparatus 100 further comprises an image unit 401 to image the projection plane, and the evaluating unit 203 evaluates the appropriate level 207 based not only on the information of deficit and distortion but also on optical information of the projection plane.
In
The details of the image unit 401 and the evaluating unit 203 will now be described with reference to
The evaluating unit 203 finds the information of deficit and distortion and the optical information of the projection plane based on the shape information 205, the observation point information 206, the projection point information 210, and image information 402. The evaluating unit 203 finds the representative error Dout of the information of deficit and distortion in the same manner as described above with respect to the first embodiment, with relations (18) through (20). The evaluating unit 203 also finds the relation between the coordinate mc of a pixel in the image taken by the image unit 401 and the three dimensional coordinate on the projection plane:
{tilde over (m)}C≅PC·{tilde over (M)}   (27)
where Pc is a perspective projection matrix.
In this embodiment, calibration of the image unit 401 is done and the internal parameter, the external parameters, and the perspective projection matrix Pc are found before taking an image. The coordinate (xp, yp) of a pixel mp corresponding to the coordinate (xc, yc) of a pixel mc is acquired with relations (6) and (27), and the pixel value C(xc, yc) of the pixel mc and the pixel value P(xp, yp) of the pixel mp are obtained. The difference between the pixel value C(xc, yc) of the pixel mc and the pixel value P(xp, yp) of the pixel mp is defined as C (=|C(xc, yc)−P(xp, yp)|). The differences C for all of the pixels in the input image are calculated. The sum of the differences C is defined as the error of the optical information Cout.
The error of the optical information Cout indicates the sum of the differences between pixel values in the pattern image and the corresponding pixel values in the image taken by the image unit 401. Therefore, in the case where the color of the projection plane is white, the value of the error of the optical information Cout is likely to be small. In the case where the color of the projection plane is not white or has a pattern, the error of the optical information Cout is likely to be large. The evaluating unit 203 calculates the optically appropriate level RlvC based on the optical information Cout. The value Cth is a predetermined acceptable upper value of the optical error.
RlvC=1−Cout/Cth (28)
The appropriate level Rlv defined in relation (21) is referred to here as the appropriate level of shape RlvD. The final appropriate level Rlv (the final appropriate level 207) of the projection plane is calculated by blending the optically appropriate level RlvC and the appropriate level of shape RlvD with a weight α, as represented in relation (29). The final appropriate level Rlv is calculated by adjusting the weight α based on the degree of influence on the image by the shape information and by the optical information.
Rlv=α·RlvC+(1−α)·RlvD (29)
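The following sketch combines relations (28) and (29), assuming the pixel-value pairs between the taken image and the pattern image have already been collected as described above.

```python
def blended_appropriate_level(pixel_value_pairs, c_th, rlv_d, alpha):
    """pixel_value_pairs: iterable of (C(xc, yc), P(xp, yp)) pairs between the
    taken image and the pattern image. Cout is their summed absolute
    difference; RlvC = 1 - Cout/Cth (relation (28)); the final level blends the
    optical level and the shape level with weight alpha (relation (29))."""
    c_out = sum(abs(c - p) for c, p in pixel_value_pairs)
    rlv_c = 1.0 - c_out / c_th
    return alpha * rlv_c + (1.0 - alpha) * rlv_d
```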
The evaluating unit 203 transmits the final appropriate level Rlv as the appropriate level 207 to the generating unit 204. The generating unit 204 generates the auxiliary information data 208 in the way described in the first through third embodiments and transmits the auxiliary information data 208 to the projection unit 110. The projection unit 110 projects an image based on the auxiliary information. The projection unit 110 may project an image based on the auxiliary information including the correction image.
By this embodiment, it is easy to determine whether the projection plane is appropriate for a projection to an arbitrary observation point. By this embodiment, it is easy for a user to determine whether the projection setting, which includes the position of the projection unit 110, direction of the projection unit, projection plane, and the position of the observation point, is appropriate or should be changed.
Each of the embodiments was described with specific examples. However, this disclosure is not limited to these specific examples. For example, one of ordinary skill in the art will understand that this disclosure may be implemented using available variations in the specific configuration of each element.
One of ordinary skill in the art will also understand that this disclosure may be implemented using combinations of two or more elements from the specific examples.
One of ordinary skill in the art will also understand that this disclosure may be implemented using other optical devices and image display apparatuses.
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the disclosure. Indeed, the embodiments described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the embodiments described herein may be made without departing from the spirit of the disclosure. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the disclosure.