Embodiments of the invention generally relate to an image display device and an image processing device.
There is an image display device that includes a projector and a reflector (a combiner). The projector includes a displayer, which displays an image, and an optical part, which includes optical elements such as multiple lenses; the projector projects the image displayed on the displayer, and the reflector reflects the projected image toward an eye of a viewer. In such an image display device, optical distortion and/or partial loss of the image viewed by the viewer may occur depending on the arrangement of the projector and/or the pupil position of the viewer. It is therefore desirable for the viewer to be able to easily adjust the display to an easily-viewable state in which such optical distortion and the like are suppressed.
According to an embodiment of the invention, an image display device that includes an optical part and a controller is provided. First input image data and second input image data are input to the controller. The controller causes the optical part to emit a first light based on first corrected image data of the first input image data having been corrected. The controller causes the optical part to emit a second light when receiving a signal employing first correction information of a relationship between the first input image data and the first corrected image data, wherein the second light is based on second corrected image data of the second input image data corrected based on the first correction information.
According to another embodiment of the invention, an image processing device including a controller is provided. First input image data and second input image data are input to the controller. The controller outputs first corrected image data of the first input image data having been corrected. The controller outputs second corrected image data when receiving a signal employing first correction information. The first correction information is of a relationship between the first input image data and the first corrected image data. The second corrected image data is of the second input image data corrected based on the first correction information.
Various embodiments will be described hereinafter with reference to the accompanying drawings.
The drawings are schematic or conceptual; and the relationships between the thicknesses and widths of portions, the proportions of sizes between portions, etc., are not necessarily the same as the actual values thereof. Further, the dimensions and/or the proportions may be illustrated differently between the drawings, even for identical portions.
In the drawings and the specification of the application, components similar to those described in regard to a drawing thereinabove are marked with like reference numerals, and a detailed description is omitted as appropriate.
As illustrated in
The circuit part 140 is connected by a wired or wireless method to an external storage medium, network, etc., and receives image information. For example, as illustrated in
The circuit part 140 includes a corrector 141 and an adjuster 142 (referring to
The displayer 110 includes multiple pixels 110e. The multiple pixels 110e are arranged in a plane. The displayer 110 emits an image light L1 including image information. The displayer 110 is a display that displays an image. The light that includes the image information is emitted toward the optical part 120. The display is, for example, a liquid crystal display, an organic EL display, a liquid crystal on silicon (LCOS) display, etc. However, the embodiment is not limited thereto.
The optical part 120 is provided between the displayer 110 and the reflector 130 in the optical path of the image light L1 emitted from the multiple pixels 110e of the displayer 110. The optical part 120 includes at least one optical element. The optical part 120 projects the image light L1 that is incident. The optical element can include a lens, a prism, a mirror, etc. For example, the optical part 120 changes the travel direction of at least a part of the image light L1. Thus, the projector 125 (the optical part 120) emits the image light including the image information toward the reflector 130. In the case where multiple optical elements are used, the multiple optical elements need not be disposed on a straight line. The emission direction of the image light emitted by the projector 125 with respect to the reflector 130 is adjustable.
The reflector 130 reflects at least a part of the image light L1 passing through the optical part 120. For example, the reflector 130 reflects the light passing through the optical part 120 toward a pupil 150 of a viewer 60 (a user) of the image display device. The light that is reflected by the reflector 130 forms an image 170 (an observed image) as a virtual image when viewed from the pupil 150. Thus, the viewer 60 can view the image.
The reflector 130 transmits a part of the light incident on the reflector 130 from the external environment. Thereby, the viewer 60 can view the external environment through the reflector 130. The reflector 130 is provided along a first surface 11p. For example, multiple fine reflective surfaces are disposed in parallel in the first surface 11p and used as the reflector 130. The first surface 11p may be a plane or a curved surface. Each of the reflective surfaces is, for example, a half mirror and reflects at least a part of the incident light. Each of the reflective surfaces is tilted with respect to the first surface 11p; and a level difference is formed between the reflective surfaces. The angles between the first surface 11p and the reflective surfaces are determined by the positional relationship between the assumed pupil 150 and the optical axis of the optical part 120. Thereby, for example, the reflection angle of the light can be adjusted. The reflector 130 has a Fresnel configuration formed of the multiple reflective surfaces and the multiple level differences.
However, in the embodiment, the reflector 130 is not limited to such a half mirror. A normal half mirror may be used as the reflector 130; and another member in which the reflection angle can be adjusted may be used similarly. Also, although an example is described in which a half mirror having equal reflectance and transmittance is used, the embodiment is not limited to equal fractions. The material included in the reflective surface may be any material as long as the material transmits a part of the light and reflects a part of the light.
In the example, the image is displayed as a virtual image. However, the image may be displayed as a real image by separating the reflector 130 from the pupil 150.
In the example, the image 170 is displayed at the front of the pupil 150. However, the image may be displayed at the edge of the visual field of the viewer 60 such as an image 171. Thereby, the visual field of the viewer 60 is not obstructed.
In the example illustrated in
The image display device 101 further includes eyeglasses lenses 160. In the example, the holder 320 further includes nose pads 321 and a bridge 322. The bridge 322 connects one eyeglasses lens 160 and the other eyeglasses lens 160. The rim of the eyeglasses lens 160 (the frame holding the eyeglasses lens 160), etc., may be provided as necessary. Although a configuration similar to normal corrective eyeglasses is described in the application, the embodiment may also have a configuration in which the left and right lenses are formed as one body.
The eyeglasses lens 160 (the reflector 130) is held by the holder 320. For example, similarly to a normal eyeglasses frame, the angle between the holder 320 and the eyeglasses lens 160 may be changeable.
For example, the relative arrangement of the nose pad 321 and the eyeglasses lens 160 is fixed. The reflector 130 is included in the eyeglasses lens 160 (provided as one body with the eyeglasses lens 160). In other words, a combiner integrated-type eyeglasses lens 160 is used; and the relative positional relationship of the reflector 130 and the eyeglasses lens 160 is fixed.
The eyeglasses lens 160 has a first surface 161 and a second surface 162. The second surface 162 is separated from the first surface 161. The reflector 130 is provided between the first surface 161 and the second surface 162. The position of the reflector 130 is not limited to that recited above; for example, a configuration in which the reflector 130 is disposed on the second surface 162 may be used.
A binocular head mounted display (HMD) that uses two image display devices 101 is illustrated in
In the example, one circuit part 140 is provided for one image display device 101. In the case where two image display devices 101 are used, the circuit parts 140 may be integrated where possible.
When using the image display device 101, the viewer 60 places the nose pads 321 on the nose and places one end 320e of the holder 320 on an ear. Thus, the position of the holder 320 and the relative position of the eyeglasses lens 160 (and the reflector 130) are regulated according to the positions of the nose and the ears of the viewer 60. When using the image display device 101, the relative arrangement of the reflector 130 with respect to the holder 320 is substantially fixed. The position of the pupil 150 with respect to the reflector 130 moves according to the eyeball movement.
The relative arrangement of the displayer 110 and the optical part 120 is fixed inside the projector 125 of
The projector 125 of
In
At the startup of the image display device 101 or when a prescribed input is input to the circuit part 140 of the image display device 101, the adjuster 142 performs processing that causes the viewer 60 to select a correction coefficient (a correction table). Memory 144 included in the adjuster 142 stores the selected correction coefficient. When the image display device 101 displays the image, the adjuster 142 outputs the correction coefficient to the corrector 141. As illustrated in
Subsequently, the displayer 110 displays the corrected image that is input and emits an image light toward the optical part 120. The optical part 120 emits, toward the reflector 130, corrected light in which the travel direction of at least a part of the light rays included in the incident image light is corrected. The reflector 130 reflects a part of the incident light; and the reflected light forms an image as the observed image when viewed from the pupil 150. Thus, in the embodiment, because the displayer 110 displays the corrected image, an easily-viewable observed image in which the optical distortion and/or partial loss are suppressed is obtained. Details of the correction processing are described below.
Partial loss, distortion, and color breakup of the image viewed by the viewer will now be described with reference to
The image display device 109 according to the reference example includes a reflector 130b and a projector 125b. A configuration similar to the reflector 130 is applicable to the reflector 130b; and a configuration similar to the projector 125 is applicable to the projector 125b. The image display device 109 differs from the image display device 101 according to the embodiment in that the circuit part 140 is not included. In other words, the correction processing of the object image described above is not performed in the image display device 109.
An image 170b (an observed image) is formed of the light emitted from the projector 125b. For example, in the image display device, the optical design is performed on the premise that the pupil of the viewer is positioned within a certain range.
In such an eyeglasses-type image display device, the position of the reflector 130 with respect to the position of the eye is determined according to the arrangement of the ears, the nose, and the eyes of the viewer 60. For example, when the viewer 60 changes, the position of the reflector 130 with respect to the eye changes. Therefore, when the viewer 60 changes, there are cases where the position of the image viewed by the viewer 60 changes; and the image is not displayed at the appropriate position. In the case where the pupil 150 is inside the eye range 180, the viewer can view the entire observed image 170b. However, in the case where the pupil 150 is outside the eye range 180, a partial loss of the screen occurs.
The light that includes image information and is emitted from the displayer travels toward the pupil via optical elements such as lenses, half mirrors, etc., included in the optical part and/or the reflector 130b. For example, aberration occurs each time the light passes through an optical element or each time the light is reflected by an optical element. Therefore, degradation such as optical distortion, color breakup, etc., occurs in the observed image 170b.
Optical distortion is an aberration that occurs when the light emitted from the displayer passes through an optical element. The optical distortion appears when the image formed by the light after passing through the optical element loses its resemblance to the image of the light emitted from the displayer.
In the display of a color image, color breakup is an aberration that occurs due to the difference between the wavelengths of the light. The size of the image formed by the light after passing through the optical element depends on the wavelength of the light. For example, light of shorter wavelengths is refracted more easily by a lens, etc.; and the eye range easily becomes narrow. For example, the center position of the image depends on the wavelength of the light and therefore differs by color.
As illustrated in
Thus, distortion occurs in the displayed image according to the relative arrangement of the pupil and the projector. Further, partial screen loss occurs in the case where the pupil exists outside the eye range.
Conversely, in the embodiment, the circuit part 140 performs the processing of causing the viewer 60 to select the correction coefficient. Then, the observed image is displayed by using the corrected image generated based on the selected correction coefficient. Thereby, the display can be adjusted to an easily-viewable state for each viewer.
Details of the correction processing will now be described.
The distortion differs between the colors in the case where the object image is, for example, an image represented using three primary colors such as red, green, and blue. In such a case, the corrected image is generated using three correction coefficients, one set for each color.
When setting the correction coefficients, for example, the observed image is imaged using a camera. The correction coefficients can be determined from the correspondences between each pixel of the observed image and each pixel of the corrected image. In the case where the object image and the corrected image are images represented using three primary colors such as red, green, and blue, the correspondences of the pixels are stored for each color.
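As a rough illustration of this correspondence-based correction, the sketch below applies a per-color correction table (one lookup table per primary color) to an object image to produce a corrected image. The array shapes, the nearest-neighbor resampling, and the function name are assumptions made for illustration and are not taken from the embodiment.

```python
import numpy as np

def apply_correction(object_img, maps):
    """Generate a corrected image from an object image using per-color
    correction tables.

    object_img : (H, W, 3) float array, the RGB object image.
    maps       : dict mapping channel index -> (H, W, 2) array giving, for
                 every pixel of the corrected image, the (y, x) source
                 coordinates in the object image.  One table per color,
                 because the distortion differs between the colors.
    """
    h, w, _ = object_img.shape
    corrected = np.zeros_like(object_img)
    for channel, table in maps.items():
        ys = np.clip(np.round(table[..., 0]).astype(int), 0, h - 1)
        xs = np.clip(np.round(table[..., 1]).astype(int), 0, w - 1)
        # Nearest-neighbor resampling for brevity; bilinear interpolation
        # could be used instead for smoother results.
        corrected[..., channel] = object_img[ys, xs, channel]
    return corrected
```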
The observed image that is viewed by the viewer 60 includes multiple color images corresponding to multiple color components. For example, the observed image is the superimposition of a first color image corresponding to the first color pixels, a second color image corresponding to the second color pixels, and a third color image corresponding to the third color pixels.
As described above, the correspondence between the observed image and the corrected image is different between the colors due to the effect of the color breakup. For example, as illustrated in
The aspect ratio a:1 is stored in the correction coefficients. For example, the correction coefficients include information relating to an overlapping region Sa (the product region) inside the viewed image where the multiple color images (the first to third color images) overlap each other. Specifically, the correction coefficients include the information of the ratio (a:1) of the length along the lateral direction of the overlapping region Sa and the length along the vertical direction of the overlapping region Sa. Thereby, for example, the aspect ratio of the object image can be maintained when correcting the distortion.
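The following sketch shows one way such an overlapping region Sa and its lateral-to-vertical ratio a:1 could be computed from per-color displayable-region masks. The boolean-mask representation and the bounding-box measurement are assumptions made for illustration, not a description of the embodiment's actual procedure.

```python
import numpy as np

def overlap_region_and_aspect(masks):
    """Compute the overlapping region Sa of the per-color displayable
    regions and its lateral : vertical ratio (a : 1).

    masks : list of (H, W) boolean arrays, one per color, where True marks
            the pixels of the observed image that the color can reach.
    """
    sa = np.logical_and.reduce(masks)      # product region Sa (all colors overlap)
    ys, xs = np.nonzero(sa)                # assumes Sa is non-empty
    height = ys.max() - ys.min() + 1       # length along the vertical direction
    width = xs.max() - xs.min() + 1        # length along the lateral direction
    a = width / height                     # aspect ratio a : 1
    return sa, a
```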
By the processing recited below, the correspondences between the coordinate positions in the corrected image and the positions in the normalized coordinates are obtained for each of the three colors of RGB.
Other than the color-displayable region, the regions where the three colors cannot be represented simultaneously may be defined as regions where a single color or two colors are displayable. The correction coefficients may include information of regions where a single color or two colors are displayable. In other words, for example, the correction coefficients may further include information relating to a non-overlapping region Sb inside the viewed image where multiple color images (the first color image and the second color image) do not overlap each other.
For example, in the case of red, other than the color-displayable region, the information of a region Rd where the display of a single color is possible may be stored in the correction coefficient as shown in
The correction coefficients may include parameters of curve representations. Thereby, the correspondence between the pixels of the corrected image and the pixels of the observed image can be calculated.
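As a hedged example of such a curve representation, the sketch below uses a second-order polynomial per axis to map a corrected-image pixel to an observed-image position, and fits the parameters by least squares from measured correspondences (for example, obtained by imaging the observed image with a camera). The polynomial order and the fitting procedure are assumptions, not a statement of the embodiment's actual parameterization.

```python
import numpy as np

def polynomial_map(x, y, coeffs):
    """Evaluate a second-order polynomial correspondence between a pixel
    (x, y) of the corrected image and a position (u, v) in the observed
    image.  coeffs is a (2, 6) array; each row holds the parameters of
    u = c0 + c1*x + c2*y + c3*x*y + c4*x**2 + c5*y**2 (row 0 for u,
    row 1 for v)."""
    basis = np.array([1.0, x, y, x * y, x * x, y * y])
    u, v = coeffs @ basis
    return u, v

def fit_polynomial_map(src_pts, dst_pts):
    """Fit the six parameters per axis by least squares from N measured
    correspondences.  src_pts, dst_pts : (N, 2) arrays of matching
    positions in the corrected and observed images."""
    x, y = src_pts[:, 0], src_pts[:, 1]
    basis = np.stack([np.ones_like(x), x, y, x * y, x * x, y * y], axis=1)
    coeffs, *_ = np.linalg.lstsq(basis, dst_pts, rcond=None)
    return coeffs.T          # shape (2, 6), usable with polynomial_map
```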
In step S130, the adjuster 142 generates, based on the correction coefficients, a display image that is based on a first pattern (a first test pattern). For example, the image that is generated is output as image data from a processor 143 to the displayer 110 as illustrated in
For example, the circuit part 140 (the memory 144 of the adjuster 142) pre-stores the multiple correction coefficients (e.g., a first correction coefficient (first correction information C1) and a second correction coefficient (second correction information C2)). The number of correction coefficients may be any number of two or more. The appropriate correction coefficients are different according to the pupil position of the viewer 60 and/or the arrangement of the projector. Therefore, each of the multiple correction coefficients stored in the memory 144 may be determined to correspond to the positional relationship between the projector 125 and the reflector 130. Or, each of the multiple correction coefficients stored in the memory 144 may be determined to correspond to the positional relationship between the projector 125 and the pupil of the viewer 60. Each of the multiple correction coefficients may be determined for each individual entity of the image display device 101 by considering the individual differences (the manufacturing fluctuation, etc.) of the image display device 101. The memory 144 may store one correction coefficient that is calculated from the multiple correction coefficients and used as a reference.
When step S130 is first performed, the adjuster 142 selects one of the multiple correction coefficients stored in the memory 144 and generates the display image (the corrected image) based on the selected correction coefficient. The selected correction coefficient may be the initial or final correction coefficient in the registration order or may be a prescribed correction coefficient to be used as the reference.
The information of the position of the projector 125 or the information of the pupil position of the viewer may be acquired as a reference for selecting one of the stored correction coefficients.
For example, in the case where a mechanical mechanism that adjusts the position of the projector 125 in stages is included in the position controller 126 and a scale, etc., indicating the position of the projector 125 is provided, the viewer 60 is caused to input information corresponding to the value of the scale to the circuit part 140. The correction coefficient may be selected based on this information. Or, the position controller 126 may be caused to sense the position of the projector 125. Any sensor such as a camera, a potentiometer, etc., can be used to sense the position of the projector 125. The correction coefficient may be selected based on the sensed position information.
For example, the image display device 101 further includes a sensor 182 that senses the pupil position (referring to
Projector position information may be used to estimate the pupil position of the viewer 60. For example, in an eyeglasses-type display device, the relative arrangement of the reflector 130 and the optical part 120 changes according to the arrangement of the ears, the nose, the eyes, etc., of the viewer 60. Therefore, it is also possible to roughly estimate the pupil position of the viewer 60 based on the information of the relative arrangement of the optical part 120 and the reflector 130.
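A minimal sketch of such a selection, assuming the stored correction coefficients are keyed by a two-dimensional reference position (projector position or estimated pupil position) and the nearest one is chosen. The data layout and the nearest-neighbor rule are illustrative assumptions.

```python
import numpy as np

def select_correction(stored, sensed_position):
    """Select the stored correction coefficient whose reference position is
    closest to the sensed position.

    stored          : list of (position, coefficient) pairs; position is a
                      length-2 array-like in the same coordinate system as
                      sensed_position.
    sensed_position : length-2 array-like from the scale value, the position
                      controller 126, or the pupil sensor 182.
    """
    positions = np.array([p for p, _ in stored], dtype=float)
    distances = np.linalg.norm(positions - np.asarray(sensed_position, dtype=float), axis=1)
    return stored[int(np.argmin(distances))][1]
```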
For example, in the case where step S130 is first executed, the adjuster 142 generates a first display image (first corrected image data Cm1) by correcting a first image (first input image data Pm1) including a first pattern based on the first correction coefficient.
The first correction coefficient is one of the multiple correction coefficients stored in the memory 144 or a correction coefficient calculated from the multiple correction coefficients stored in the memory 144. Then, the circuit part 140 (the controller 14) causes the projector 125 to emit the light (a light Ld1) including the image information of the generated first display image as the image light.
In other words, the adjuster 142 transmits, to the projector 125, the corrected image data (the first corrected image data Cm1) of the input image data (the first input image data Pm1) for the test having been corrected. The projector 125 emits the first light (the light Ld1) based on the first corrected image data Cm1.
Thereby, the first display image (the first corrected image data Cm1) is displayed to the viewer 60. The first image before the correction may be stored in the memory 144, etc., or may be input from the outside.
The viewer 60 can input, to the circuit part 140, a signal selecting the first correction coefficient (the first display image) based on the displayed image. In step S140, the circuit part 140 receives the selection of the correction coefficient from the viewer 60. In other words, for example, the circuit part 140 receives a signal Sig1 employing the first correction information of the relationship between the first input image data and the first corrected image data. Step S160 is executed in the case where the circuit part 140 receives the signal selecting the first correction coefficient (the signal employing the first correction information). Step S150 is executed in the case where the circuit part 140 does not receive the signal selecting the first correction coefficient in step S140.
One example of a method by which the circuit part 140 receives the selection of a correction coefficient is a method in which the viewer 60 inputs, to the image display device 101, information indicating whether or not the correction coefficient is selected (e.g., a code indicating the end of step S140, etc.). For example, software (an application) is installed in a computer and/or a portable terminal; and the information is input from the viewer 60 to the circuit part 140 via the computer, the portable terminal, etc.
In step S150, the circuit part 140 switches the correction coefficient that is currently used to another correction coefficient. Then, the circuit part 140 again executes step S130 using the switched correction coefficient.
One example of a method by which the circuit part 140 switches the correction coefficient is a method in which the viewer 60 inputs, to the image display device, information instructing a switch of the correction coefficient. For example, software (an application) is installed in a computer and/or a portable terminal; and the information is input from the viewer 60 to the circuit part 140 via the computer, the portable terminal, etc.
The information acquired in the first execution of step S130 (e.g., the information of the position of the projector 125, the information of the pupil position of the viewer 60, etc.) may be used to switch the correction coefficient in step S150. The correction coefficient that is selected or generated in step S150 is used in the second and subsequent executions of step S130.
When switching the correction coefficient used to generate the display image, for example, “a: the correction corresponding to the arrangement of the projector,” “b: the correction corresponding to the pupil position,” or “c: the correction corresponding to the individual difference of the image display device” is performed. The correction coefficient may be switched by combining at least two of the three of a to c.
“a: the correction corresponding to the arrangement of the projector” can be performed in the case where a correction coefficient is stored for each arrangement of the projector 125. In such a case, for example, the correction coefficient that corresponds to the arrangement of the projector 125 is generated by a linear interpolation method such as bilinear interpolation from the multiple correction coefficients stored in the memory 144, etc. For example, a single-axis bar or the like is displayed; and the viewer 60 inputs, to the image display device, an input value indicating the position of the projector 125. The adjuster 142 generates the correction coefficient by a linear interpolation method corresponding to the input value. Or, the viewer 60 may input, to the image display device, information indicating the rotational position of the projector 125. Or, the memory 144 may switch the stored multiple correction coefficients in order. In the case where a correction coefficient is stored for each position of the pupil of the viewer 60, “b: the correction corresponding to the pupil position” can be performed. In such a case, the correction coefficient can be switched by processing similar to the case of “a: the correction corresponding to the arrangement of the projector.” In the case where a correction coefficient is stored for each individual entity of the image display device, “c: the correction corresponding to the individual difference of the image display device” is performed. In such a case, the multiple correction coefficients stored in the memory 144 are switched in order.
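The sketch below illustrates the bilinear-interpolation variant mentioned above, assuming the correction tables are stored on a regular grid of projector positions. The grid layout and array shapes are assumptions made for illustration.

```python
import numpy as np

def interpolate_table(tables, positions, query):
    """Generate a correction table for an arbitrary projector position by
    bilinear interpolation of the tables stored at the four surrounding
    grid positions.

    tables    : (Ny, Nx, H, W, 2) array; tables[j, i] is the table stored
                for grid position (xs[i], ys[j]).
    positions : (xs, ys), each a sorted 1-D array of grid coordinates.
    query     : (qx, qy), the projector position input by the viewer or
                sensed by the position controller 126.
    """
    xs, ys = positions
    qx, qy = query
    i = int(np.clip(np.searchsorted(xs, qx) - 1, 0, len(xs) - 2))
    j = int(np.clip(np.searchsorted(ys, qy) - 1, 0, len(ys) - 2))
    tx = (qx - xs[i]) / (xs[i + 1] - xs[i])
    ty = (qy - ys[j]) / (ys[j + 1] - ys[j])
    # Weighted sum of the four neighboring tables (bilinear interpolation).
    return ((1 - tx) * (1 - ty) * tables[j, i]
            + tx * (1 - ty) * tables[j, i + 1]
            + (1 - tx) * ty * tables[j + 1, i]
            + tx * ty * tables[j + 1, i + 1])
```

For a single-axis bar, the same idea reduces to ordinary linear interpolation along one coordinate.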
For example, in step S150, the circuit part 140 switches the first correction coefficient described above to the second correction coefficient. The second correction coefficient is one of the multiple correction coefficients stored in the memory 144 or a correction coefficient calculated from the multiple correction coefficients stored in the memory 144. Subsequently, in step S130, the circuit part 140 generates a second display image (third corrected image data Cm3) by correcting the first image (the first input image data Pm1) described above based on the second correction coefficient. Then, the circuit part 140 causes the projector 125 to emit the light (a light Ld2) including the image information of the generated second display image as the image light.
In other words, the adjuster 142 transmits, to the projector 125, the corrected image data (the third corrected image data Cm3) of the input image data (the first input image data Pm1) for the test having been corrected. The projector 125 emits the third light (the light Ld2) based on the third corrected image data Cm3. The third corrected image data Cm3 is different from the first corrected image data Cm1.
Thereby, the second display image (the third corrected image data Cm3) is displayed to the viewer 60. The viewer 60 can input, to the circuit part 140, the signal of selecting the second correction coefficient (the second display image) based on the displayed image. In other words, for example, the circuit part 140 can receive a signal Sig2 employing the second correction information C2 of the relationship between the first input image data Pm1 and the third corrected image data Cm3.
Thus, the viewer 60 is caused to select one of the display images (the correction coefficients) by repeating steps S130 to S150. For example, the viewer 60 selects a display image in which the distortion, etc., are corrected appropriately.
The circuit part 140 stores first information relating to the relationship between the selected one of the display images and the first image before the correction. The first information includes, for example, the correction coefficient used to generate the selected display image.
In step S160, the memory 144 stores the first information. For example, in the case where the first display image is selected, the memory 144 stores the first correction coefficient.
Thus, the viewer 60 can easily adjust the display to an easily-viewable state.
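A minimal sketch of the selection loop of steps S130 to S160 follows, assuming the candidate correction coefficients, the display of the corrected test image, and the viewer's acceptance signal are available as simple callables; the function names are hypothetical.

```python
def select_coefficient(coefficients, show_test_image, viewer_accepts, memory):
    """Sketch of steps S130-S160: display the test pattern corrected with
    one coefficient, switch to the next coefficient until the viewer selects
    one, then store the selection (the first information).

    coefficients    : list of candidate correction coefficients.
    show_test_image : callable that corrects the first image with the given
                      coefficient and displays it (step S130).
    viewer_accepts  : callable returning True when the viewer inputs the
                      signal selecting the displayed image (step S140).
    memory          : dict standing in for the memory 144.
    """
    index = 0
    while True:
        show_test_image(coefficients[index])                     # step S130
        if viewer_accepts():                                     # step S140
            memory["first_information"] = coefficients[index]    # step S160
            return coefficients[index]
        index = (index + 1) % len(coefficients)                  # step S150: switch
```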
Subsequently, when the image display device 101 performs the display, the corrector 141 generates the corrected image in which the object image (the input data) is corrected based on the first information (the selected correction coefficient); and light that includes the image information of the corrected image is emitted from the projector 125.
For example, in step S140, the image display device 101 performs a display based on the first correction information C1 in the case where the viewer 60 selects the first display image and the circuit part 140 receives the signal Sig1 employing the first correction information C1 of the relationship between the first input image data Pm1 and the first corrected image data Cm1. In such a case, the circuit part 140 transmits, to the projector 125, corrected image data (second corrected image data Cm2) of input data (second input image data Pm2) corrected based on the first correction information C1. The projector 125 emits a second light (a light Ld3) based on the second corrected image data.
In the example illustrated in
The first element r1 is positioned on the left side (e.g., the left edge) of the first image M1. In other words, inside the first image M1 illustrated in
The color of the first element r1 and the color of the second element r2 each are represented by two or more primary colors of the primary colors used in the displayer 110. For example, in the case where the three colors of red, green, and blue are used in the displayer 110, the color of each rectangle can be set to magenta in which red and blue are combined (superimposed).
The primary colors that are used in the displayer 110 are the colors of the light emitted by the subpixels included in the pixels 110e of the displayer 110. For example, each of the pixels 110e includes a first subpixel that emits light of the first color (e.g., red), a second subpixel that emits light of the second color (e.g., green), and a third subpixel that emits light of the third color (e.g., blue). A color image is displayed by superimposing the light of the three colors.
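For illustration, the sketch below generates an image containing a first pattern of this kind: two rectangles extending in the vertical direction, placed near the left and right edges, and colored magenta (red and blue superimposed). The image size and rectangle dimensions are arbitrary placeholder values, not values from the embodiment.

```python
import numpy as np

def first_pattern(height=480, width=640, bar_width=40, margin=20):
    """Generate an image containing the first pattern P1: the first element
    r1 near the left edge and the second element r2 near the right edge,
    both extending in the first direction D1 (vertical) and arranged in the
    second direction D2 (lateral), colored magenta (red + blue, no green)."""
    img = np.zeros((height, width, 3), dtype=np.uint8)
    magenta = (255, 0, 255)
    img[:, margin:margin + bar_width] = magenta                        # first element r1
    img[:, width - margin - bar_width:width - margin] = magenta        # second element r2
    return img
```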
As described in reference to
As described above, the color breakup and/or the distortion of the observed image are different according to the pupil position of the viewer 60 and/or the arrangement of the projector 125. Therefore, a difference between the correction results occurs according to the correction coefficient. In other words, for example, the shape of the image s1 illustrated in FIG. 9B is different from the shape of the image s2 illustrated in
Thus, the circuit part 140 corrects the input image for each color component. In other words, for example, the first correction information C1 (the first correction coefficient) includes first color correction information Ca that corrects the component of the first color, and second color correction information Cb that corrects the component of the second color. The first color correction information Ca is different from the second color correction information Cb. The first image M1 is a superimposition of the image of the first color and the image of the second color. That is, the first input image data Pm1 of the first image M1 includes first color input image data Pc1 of the image of the first color and second color input image data Pc2 of the image of the second color. Also, the first corrected image data Cm1 of the first display image MD1 includes first color corrected image data Cc1 relating to the first color and second color corrected image data Cc2 relating to the second color. The first color corrected image data is data in which the first color input image data is corrected using the first color correction information; and the second color corrected image data is data in which the second color input image data is corrected using the second color correction information. In other words, the first color correction information Ca is of the relationship between the first color input image data Pc1 and the first color corrected image data Cc1; and the second color correction information Cb is of the relationship between the second color input image data Pc2 and the second color corrected image data Cc2.
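A compact sketch of the relationship described above, assuming the per-color correction information Ca and Cb are available as callables that map a color plane of the input image data to the corresponding corrected color plane; the two-channel representation is an illustrative simplification.

```python
import numpy as np

def correct_per_color(first_input, Ca, Cb):
    """Split the first input image data Pm1 into the first color input image
    data Pc1 and the second color input image data Pc2, correct each with its
    own color correction information (Ca and Cb differ from each other), and
    recombine the results Cc1 and Cc2 into the first corrected image data Cm1."""
    Pc1, Pc2 = first_input[..., 0], first_input[..., 1]   # per-color planes
    Cc1 = Ca(Pc1)            # first color corrected image data
    Cc2 = Cb(Pc2)            # second color corrected image data
    return np.stack([Cc1, Cc2], axis=-1)                  # Cm1
```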
As described above, the image display device 101 according to the embodiment displays the multiple display images (the first and second display images MD1, MD2, etc.), which exhibit mutually-different color breakup, etc., and thereby performs the processing of causing the viewer 60 to select the correction coefficient. Thereby, the viewer 60 can easily adjust the display to an easily-viewable state.
For example, a method for suppressing the color breakup and/or the distortion by adjusting the optical design of the reflector 130 and/or the optical part 120 also may be considered. However, in the case where the image light is projected from the side of the viewer 60 as in the image display device 101, the degrees of freedom of the optical design easily become limited. Therefore, an easily-viewable display may not be obtained only by the adjustment of the optical design. Also, there are cases where it is difficult to adjust the optical design for each user. Conversely, in the embodiment, the viewer 60 is caused to select a correction coefficient; and a corrected image is generated using the selected correction coefficient. Thereby, the viewer 60 easily obtains an easily-viewable display.
In the case where the image light is projected from the side of the viewer 60 as in the image display device 101, the distortion on the left side and the distortion on the right side may be different in the observed image viewed by the viewer 60. Conversely, the first pattern P1 illustrated in
The distortion at the edge part of the observed image easily becomes larger than the distortion at the central part of the observed image. Therefore, it is desirable for the first element r1 and the second element r2 to be proximal to the edge parts of the first image M1. In other words, the distance between the first element r1 and the side e1 is shorter than the distance between the first element r1 and a center c1. Also, the distance between the second element r2 and the side e2 is shorter than the distance between the second element r2 and the center c1. Thereby, the adjustment of the display can be performed easily.
It is desirable for the first pattern (the first direction D1 in which the rectangles extend and the second direction D2 in which the two rectangles are arranged) to be determined based on the relative arrangement of the projector 125 and the reflector 130. For example, in the case where the image light travels along the X-Y plane as in
The shape of the first pattern may be changed according to the relative arrangement of the projector 125 and the reflector 130. This will now be described with reference to
In the example, the plane that includes the incident direction DL1 and the reflection direction DL2 is the Z-Y plane.
In such a case, in the virtual image displayed to the viewer 60, the distortion in the vertical direction is larger than the distortion in the horizontal direction. At this time, a first pattern such as that illustrated in
Thus, in the virtual image when viewed by the viewer 60, it is desirable for the first direction D1 to be a direction perpendicular to a plane including the incident direction and the reflection direction of the reflector 130 for the image light (the light ray at the center of the luminous flux). In the virtual image when viewed by the viewer 60, it is desirable for the second direction D2 to be a direction parallel to a plane including the incident direction and the reflection direction of the reflector 130 for the image light. Thereby, the distortion and/or the color breakup can be confirmed easily; and the adjustment of the display can be performed easily.
The first patterns shown in
Similarly to the image display device 101 according to the first embodiment, the image display device according to the second embodiment includes the controller 14 (the circuit part 140), the projector 125, the reflector 130, etc. The processing of the circuit part 140 of the second embodiment is different from that of the first embodiment. Otherwise, a description similar to that of the first embodiment is applicable to the second embodiment.
For example, third input image data Pm3 that includes information of a second pattern (a second test pattern) is input to the circuit part 140.
In step S110, the circuit part 140 causes the optical part 120 to emit a light Ld4 including image information n1 of a display image (a third display image MD3) based on the third input image data. Also, for example, the light includes information n2 of an instruction image instructing the viewer to use the third display image MD3 to change at least one of the relative arrangement of the optical part 120 and the reflector 130, the relative arrangement of the optical part 120 and the eye of the viewer 60, or the relative arrangement of the reflector 130 and the eye of the viewer 60. For example, the instruction is provided to the viewer by text or an illustration.
The second pattern P2 may be a pattern indicating the four corners of the third display image MD3 as illustrated in
The second patterns P2 shown in
The viewer 60 can perform the adjustment of the partial screen loss (the adjustment of the eye range) while referring to the observed image displayed in step S110. In step S120, the circuit part 140 detects the end of step S110. In the case where the circuit part 140 detects the end of step S110 in step S120, the circuit part 140 executes the processing of step S130. One example of a method by which the circuit part 140 detects the end of step S110 is a method in which the viewer 60 inputs, to the image display device, information indicating the end of step S110 (e.g., a code indicating the end of step S110, etc.). For example, software (an application) is installed in a computer and/or a portable terminal; and the information is input from the viewer 60 to the circuit part 140 via the computer, the portable terminal, etc.
As described in reference to
For example, the positional relationship between the projector 125 and the pupil 150 changes according to individual differences such as the shape of the head of the viewer 60, etc. Therefore, partial screen loss may occur when the user of the image display device changes. Conversely, according to the embodiment, the partial loss can be suppressed by the adjustment of the position of the projector 125 by executing step S110.
As described above, in an image display device such as that of the embodiment, not only partial screen loss but also color breakup and distortion occur easily. For example, there are cases where an easily-viewable display state is not obtained by only displaying the screen for the adjustment and causing the user to adjust the partial screen loss. Conversely, according to the embodiment, the color breakup and the distortion can be suppressed by steps S130 to S150.
The color breakup and the distortion are dependent on the position of the projector 125 with respect to the reflector 130 and the pupil position of the viewer 60. Therefore, it is desirable to first adjust the partial screen loss by adjusting the position of the projector 125 with respect to the reflector 130, and subsequently adjust the color breakup and the distortion. In other words, it is desirable for the second processing (step S110) to be performed before the first processing (steps S130 to S150). Thereby, the viewer 60 can easily adjust the display to an easily-viewable state.
For example, in a method of a reference example, the color breakup and/or the distortion are not considered when displaying the adjustment image and causing the user to adjust the partial screen loss. In such a case, there are cases where the visibility decreases due to the color breakup and the distortion of the adjustment image; and the viewer 60 cannot adjust the partial screen loss sufficiently. Conversely, in the embodiment, for example, as in
Similarly to the image display device 101 according to the first embodiment, the image display device 102 according to the embodiment includes the displayer 110, the optical part 120, the reflector 130, etc. These are similar to those of the first embodiment and the second embodiment. The processing of the circuit part 140 of the third embodiment is different from that of the first embodiment or the second embodiment.
As illustrated in
As illustrated in
In the embodiment, the memory 144 can further store the user information. The memory 144 associates and stores the first information (the correction coefficient selected by the user in steps S130 to S150 described above) and the user information of the user. When the user information is input to the adjuster 142 from the outside, the adjuster 142 outputs the correction coefficient to the corrector 141 based on the user information.
In step S170, the processor 143 acquires the user information input from the outside by the user.
In step S180, the processor 143 determines whether or not the acquired user information matches pre-registered user information. In other words, in the case where the user information input in step S170 matches the user information already stored in the memory 144, the processor 143 performs processing B; and in the case of no match, the processor 143 performs processing A.
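A minimal sketch of the branch between processing A and processing B follows, assuming the memory 144 is modeled as a dictionary keyed by user information and the two processings are available as callables; all names here are hypothetical.

```python
def handle_user(user_information, memory, processing_a, processing_b):
    """Sketch of steps S170/S180: look up the user information in the memory
    144; if stored first information (a correction coefficient) is associated
    with it, run processing B with that coefficient, otherwise run processing
    A (the selection procedure) and store the result (step S161).

    memory : dict mapping user information -> first information.
    """
    if user_information in memory:                    # step S180: match
        first_information = memory[user_information]  # step S190
        processing_b(first_information)               # steps S200, S210, S140
    else:                                             # step S180: no match
        first_information = processing_a()            # steps S130-S150, S140
        memory[user_information] = first_information  # step S161
```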
Similarly to the description of the first embodiment, the processor 143 receives the signal selecting the correction coefficient in step S140. Subsequently, step S161 is executed. In step S161, the memory 144 associates and stores the user information input in step S170 and the first information (the correction coefficient used to generate the selected display image).
The processing B includes steps S190, S200, S210, and S140.
In step S190, the processor 143 reads the first information (the correction coefficient) associated with the user information input in step S170 from the memory 144.
In step S200, the processor 143 generates a corrected image (a fourth display image) based on the first information.
The fourth display image includes a third test pattern. The third test pattern has a feature similar to the second pattern P2 described in reference to
In step S210, the processor 143 causes the projector 125 to emit light including the image information of the fourth display image MD4.
In the case where the fourth display image includes an image indicating the outer perimeter, the viewer 60 can suppress the partial screen loss by modifying the position of the projector 125 while viewing the virtual image of the fourth display image. Here, the correction coefficient (the first information) that is used to generate the fourth display image is selected to suppress the color breakup and the distortion in the state in which the partial screen loss is avoided by the processing A. Therefore, by modifying the projector 125 to the appropriate position, an easily-viewable display can be obtained in which the partial screen loss, the color breakup, and the distortion are suppressed.
In the case where the fourth display image includes the image based on the first pattern P1, the viewer 60 again selects the correction coefficient by a method similar to the description relating to steps S130, S140, and S150 described above. In step S140 in the example of
For example, the memory 144 stores the user information associated with the first correction information (the first correction coefficient). At this time, in the case where the user information associated with the first correction information is input in step S170, the processor 143 performs the processing B as a result of the determination in step S180. In step S190 of the processing B, the processor 143 reads the first correction information. Then, in step S200, the processor 143 generates the second corrected image data from the input image (the second input image data) by using the first correction information. In other words, in the example, the second corrected image data is data of the fourth display image MD4 described above. Subsequently, in step S210, the processor 143 causes the projector 125 to emit the second light including the information of the second corrected image data. Thereby, the viewer 60 can adjust the display by using the displayed image (the virtual image).
For the image display devices according to the first to third embodiments, the block diagrams of
(Position Controller 126)
Examples of the position controller 126 used in the image display devices according to the first to third embodiments will now be described with reference to
In
In the example, a long hole 31 is provided along the optical axis of the optical part 120 in the position controller 126a. A movable shaft 51 is provided in the projector 125. The movable shaft 51 is fixed to the projector 125. The movable shaft 51 passes through the long hole 31 and can move by sliding through the long hole 31. Thereby, the position of the projector 125 can be adjusted.
The optical path of a light L2 emitted from one end of the projector 125 and the optical path of a light L3 emitted from another end of the projector 125 are shown in
Conversely, as in
A position controller 126b is used as an example of the position controller 126 in
For example, the reflector 130 is provided along the first surface 11p. The arrangement of the projector 125 is changeable in a direction Dx along the first surface 11p. For example, the direction Dx is parallel to a plane including the incident direction (e.g., DL1) and the reflection direction (DL2) at the reflector 130 of the light emitted from the projector 125 (the displayer 110). In the example, the direction Dx is parallel to the X-axis direction. The relative arrangement of the projector 125 and the reflector 130 is changeable in the left/right direction of the viewer 60.
A long hole 32 is provided along the X-axis direction in the position controller 126b. A movable shaft 52 is fixed to the projector 125, passes through the long hole 32, and can move by sliding through the long hole 32. Thereby, the position of the projector 125 can be adjusted in the left/right direction of the viewer 60.
In the example as illustrated in
For example, the reflector 130 is provided along the first surface 11p. The arrangement of the projector 125 is changeable in a direction Dz along the first surface 11p. The direction Dz is a direction perpendicular to the first direction Dx described in reference to
A long hole 33 is provided along the Z-axis direction in the position controller 126c. A movable shaft 53 is fixed to the projector 125, passes through the long hole 33, and can move by sliding through the long hole 33. Thereby, the position of the projector 125 can be adjusted in the vertical direction of the viewer 60.
In the example as illustrated in
For example, the optical part 120 has an optical axis 120a. The angle between the optical axis 120a and the first surface 11p is changeable by the position controller 126d. In other words, an incident direction DLa on the reflector 130 of the image light L1 including the image information is changeable by the position controller 126d.
In the example, the position controller 126d includes a rotation shaft 54. The projector 125 is held by the rotation shaft 54. The projector 125 can be rotated about the rotation shaft 54. For example, the projector 125 can be rotated in the X-Y plane.
Thus, an incident direction DLa and a reflection direction DLb of the image light L1 at the reflector 130 can be adjusted by rotating the projector 125. Thereby, the direction in which the image is viewed can be adjusted.
For example, there is a method of a reference example in which the position of the virtual image is adjusted by modifying the position of the reflector 130 without modifying the arrangement of the projector 125. However, in an eyeglasses-type image display device, the relative arrangement of the eyeglasses frame and the eyeglasses lens (the reflector 130) is substantially fixed during use. Therefore, as described above, the relative arrangement of the pupil 150 of the viewer 60 and the reflector 130 is substantially fixed; and there are cases where it is difficult to modify the position of the reflector 130. Conversely, in the embodiment, the arrangement of the projector 125 is modified by the position controller 126. Thereby, for example, the degrees of freedom of the adjustment of the relative arrangement of the reflector 130 and the projector 125 increase.
The mechanisms of the position controller 126 described above are examples; and the embodiment includes any configuration in which the position of the projector can be adjusted similarly. Further, multiple ones of the mechanisms of the position controller 126 described above may be combined. For example, a position controller 126e illustrated in
As illustrated in
For example, the circuit part 140 is connected to an external storage medium or network via the interface 42 and acquires the object image (the input image). The connection to the outside may include a wired or wireless method. The user information, the information input by the viewer 60 in steps S120, S140, and S150, etc., may be input to the circuit part 140 via the interface 42.
For example, a program 45 that processes the acquired object image is stored in the memory 44. For example, the object image is appropriately corrected based on the program 45; and an appropriate display is performed in the displayer 110 thereby. The program 45 may be provided in a state of being pre-stored in the memory 44, or may be provided via a network or a storage medium such as CD-ROM, etc., and appropriately installed.
The image information may be stored in the memory 44. For example, the information of the first image M1, the first pattern P1, the second pattern P2, etc., may be stored in the memory 44. For example, a part of the memory 44 corresponds to the memory 144 that stores the multiple correction coefficients.
The circuit part 140 may include a sensor 46. The sensor 46 may include, for example, any sensor such as a camera, a microphone, a position sensor, an acceleration sensor, etc. For example, the image that is displayed by the displayer 110 is modified appropriately based on the information obtained from the sensor 46. Thereby, the convenience and ease of viewing of the image display device can be improved. The position information relating to the relative arrangement of the projector 125 and the reflector 130 may be sensed by the sensor 46.
The information obtained from the sensor 46, the image information, etc., are processed in the processing circuit 43 based on the program 45. Thus, the obtained image information is input from the circuit part 140 to the displayer 110; and the display is performed by the image display device. For example, a part of the processing circuit 43 corresponds to the corrector 141 and the processor 143; and the processing of the adjuster 142 and the corrector 141 is performed in the processing circuit 43 based on the program 45.
The example illustrated in
A part of each block or each entire block of the circuit part 140 may include an integrated circuit such as LSI (Large Scale Integration), etc., or an IC (Integrated Circuit) chipset. Each block may include an individual circuit; or a circuit in which some or all of the blocks are integrated may be used. The blocks may be provided as one body; or some blocks may be provided separately. Also, for each block, a part of the block may be provided separately. The integration is not limited to LSI; and a dedicated circuit or a general-purpose processor may be used.
The embodiment differs from the first embodiment in that the aspect ratio of the observed image is corrected using the correction coefficient.
In the embodiment, the first image includes a third pattern (the third test pattern).
A fourth pattern P4 that is included in the first display image MD1 of
According to the embodiments, an image display device and an image processing device can be provided in which the viewer can adjust the display to an easily-viewable state.
In the specification of the application, “perpendicular” and “parallel” refer to not only strictly perpendicular and strictly parallel but also include, for example, the fluctuation due to manufacturing processes, etc. It is sufficient to be substantially perpendicular and substantially parallel.
Hereinabove, embodiments of the invention are described with reference to specific examples. However, the invention is not limited to these specific examples. For example, one skilled in the art may similarly practice the invention by appropriately selecting specific configurations of components such as the projector, the reflector, the circuit part, etc., from known art; and such practice is within the scope of the invention to the extent that similar effects can be obtained.
Further, any two or more components of the specific examples may be combined within the extent of technical feasibility and are included in the scope of the invention to the extent that the purport of the invention is included.
Moreover, all image display devices and image processing devices practicable by an appropriate design modification by one skilled in the art based on the image display devices and image processing devices described above as embodiments of the invention also are within the scope of the invention to the extent that the spirit of the invention is included.
Various other variations and modifications can be conceived by those skilled in the art within the spirit of the invention, and it is understood that such variations and modifications are also encompassed within the scope of the invention.
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel embodiments described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the embodiments described herein may be made without departing from the spirit of the inventions. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the invention.
This is a continuation application of International Application PCT/JP2016/087766, filed on Dec. 19, 2016. This application also claims priority to Japanese Application No. 2016-015597, filed on Jan. 29, 2016. The entire contents of each are incorporated herein by reference.