A known technique is described in, for example, Patent Literature 1.
In one embodiment of the present disclosure, an aerial image display device includes a display that displays an image with traveling image light, an imaging optical system including one or more optical elements to receive the traveling image light as incident light, and a drive that changes a positional relationship between an object focal point of the imaging optical system and the display relative to each other. The drive switches between a first positioning and a second positioning. The first positioning is positioning with the display located closer to the imaging optical system than the object focal point of the imaging optical system to display a virtual image in air. The second positioning is positioning with the display located farther from the imaging optical system than the object focal point of the imaging optical system to display a real image in air.
The objects, features, and advantages of the present disclosure will become more apparent from the following detailed description and the drawings.
The structure that forms the basis of an aerial image display device according to one or more embodiments of the present disclosure will now be described.
A known display device described in Patent Literature 1 forms, for example, an aerial image from light emitted from a display using an optical element such as a retroreflector.
For example, to display an image both as a real image and as a virtual image in the air, a display that displays a real image in the air and a display that displays a virtual image in the air may both be used. Such a structure thus includes two displays.
One or more aspects of the present disclosure are directed to an aerial image display device that can display a real image and a virtual image in the air.
One or more embodiments of the present disclosure will now be described with reference to the drawings. The drawings used herein are schematic and are not drawn to scale relative to the actual size of each component.
The display 10 displays an image using traveling image light. The controller 40 changes the image to be displayed on the display 10. The display 10 may be a transmissive display or a self-luminous display. For example, the transmissive display may be a liquid crystal display. For example, the self-luminous display may be a display including a self-luminous element such as a light-emitting diode (LED) element, an organic electroluminescent (OEL) element, an organic LED (OLED) element, or a semiconductor laser diode (LD) element.
The imaging optical system 20 receives image light as incident light. The imaging optical system 20 includes one or more optical elements. The imaging optical system 20 may be a reflective optical system including an optical element that is a light-reflecting member, such as a reflective mirror. The imaging optical system 20 may be a catadioptric optical system including an optical element that is a light-reflecting member, such as a reflective mirror, and a member that refracts light, such as a lens. The imaging optical system 20 may be a reflective or catadioptric optical system that is a coaxial optical system. The imaging optical system 20 may be a reflective or catadioptric optical system that is a non-coaxial optical system. The non-coaxial optical system may be, for example, a decentered optical system, or may be an off-axial optical system.
The imaging optical system 20 uses the display 10 as a light source. The imaging optical system 20 has an object space containing the display 10 and an image space opposite to the object space. The imaging optical system 20 has an object focal point as a focal point in the object space, and an image focal point as a focal point in the image space. In the present embodiment, the imaging optical system 20 includes the optical elements arranged to define each of the object focal point and the image focal point at a single point.
When the display 10 is located closer to the imaging optical system 20 than the object focal point f of the imaging optical system 20, a virtual image V corresponding to the image light is displayed in the air in the image space as an image corresponding to the display 10. This relative positioning of the display 10 and the imaging optical system 20, with the display 10 closer to the imaging optical system 20 than the object focal point f to display a virtual image V in the air, is referred to as a first positioning. When the display 10 is located farther from the imaging optical system 20 than the object focal point f of the imaging optical system 20, a real image R corresponding to the image light is displayed in the air in the image space as an image corresponding to the display 10. This relative positioning of the display 10 and the imaging optical system 20, with the display 10 farther from the imaging optical system 20 than the object focal point f to display a real image R in the air, is referred to as a second positioning. In other words, the first positioning is the positional relationship between the display 10 and the object focal point f of the imaging optical system 20 that displays a virtual image V in the air, and the second positioning is the positional relationship that displays a real image R in the air. The first positioning and the second positioning may also be referred to as first relative positioning and second relative positioning as appropriate.
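As a rough illustration only, suppose that the imaging optical system 20 can be approximated by a single converging element of focal length f, with the display 10 at distance s_o from that element; this idealization is not stated in the embodiments and is used here only to show why the two positionings yield a virtual image and a real image. The Gaussian imaging relation gives

\frac{1}{s_o} + \frac{1}{s_i} = \frac{1}{f}
\qquad\Longrightarrow\qquad
s_i = \frac{s_o f}{s_o - f}.

With f = 200 mm, for example, the first positioning with s_o = 150 mm (s_o < f) gives s_i = -600 mm, that is, a virtual image V on the display side of the element, whereas the second positioning with s_o = 250 mm (s_o > f) gives s_i = +1000 mm, that is, a real image R formed in the air on the viewer side.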
The drive 30 changes the positional relationship between the focal point of the imaging optical system 20 and the display 10. The drive 30 changes the positions of the display 10 and the imaging optical system 20 relative to each other. To change their relative positions, the drive 30 changes, for example, the position of the display 10, the position of at least one of the optical elements in the imaging optical system 20, or both. The drive 30 may change, for example, the positions of all the optical elements in the imaging optical system 20 to change their positions relative to the display 10. The drive 30 may change, for example, the positions of one or more optical elements in the imaging optical system 20 to change the positional relationship between the focal point of the imaging optical system 20 and the display 10.
The controller 40 controls the display 10, and may be, for example, a processor. The controller 40 may include one or more processors. The processors may include a general-purpose processor that reads a specific program to perform a specific function, and a processor dedicated to specific processing. The dedicated processor may include an application-specific integrated circuit (ASIC). The processors may include a programmable logic device (PLD). The PLD may include a field-programmable gate array (FPGA). The controller 40 may be either a system on a chip (SoC) or a system in a package (SiP) in which one or more processors cooperate with other components. The controller 40 may control the operation of the drive 30. The controller 40 may function as a controller for controlling the operation of the drive 30. The controller for controlling the operation of the drive 30 may function as a motor driver that controls a drive motor. A controller other than the controller 40 may control the operation of the drive 30. The controller other than the controller 40 may be an integrated circuit (IC) for motor control.
The controller 40 may perform control to enlarge the image (also referred to as a virtual-use image) to be displayed on the display 10 for displaying a virtual image V in the air relative to the image (also referred to as a real-use image) to be displayed on the display 10 for displaying a real image R in the air. In this case, the virtual image V is formed behind the curved mirror 21 as viewed from the user. Thus, the virtual image V is located farther from the user than the real image R displayed in the air, and is less viewable to the user than the real image R. In addition, viewing the virtual image V through the semi-transmissive curved mirror 21 or a windshield 25 may further lower its viewability. Enlarging the virtual-use image relative to the real-use image reduces the likelihood of the virtual image V being less viewable to the user than the real image R.
The controller 40 may perform control to cause the luminance of the image to be displayed on the display 10 for displaying a virtual image V in the air to be higher than the luminance of the image to be displayed on the display 10 for displaying a real image R in the air. This also reduces the likelihood of the virtual image V being less viewable to the user than the real image R. The luminance of the virtual-use image may be, but not limited to, more than one time and about ten times or less the luminance of the real-use image.
The controller 40 may perform control to cause the contrast of the image to be displayed on the display 10 for displaying a virtual image V in the air to be higher than the contrast of the image to be displayed on the display 10 for displaying a real image R in the air. This also reduces the likelihood of the virtual image V being less viewable to the user than the real image R. The contrast of the virtual-use image may be, but not limited to, more than one time and about twice or less the contrast of the real-use image.
The controller 40 may perform control to cause the frame frequency of the image to be displayed on the display 10 for displaying a virtual image V in the air to be higher than the frame frequency of the image to be displayed on the display 10 for displaying a real image R in the air. This also reduces the likelihood of the virtual image V being less viewable to the user than the real image R. The frame frequency of the virtual-use image may be, but not limited to, more than one time and about eight times or less the frame frequency of the real-use image. For example, when the frame frequency of the real-use image is 30 Hz, the frame frequency of the virtual-use image may be higher than 30 Hz and about 240 Hz or lower.
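A minimal sketch of this per-mode image preparation, assuming the frame is held as rows of luminance values in [0, 1] and using example factors chosen within the ranges stated above (the function and parameter names are hypothetical), is given below.

# Hypothetical per-mode display parameters; the specific factors are examples
# within the ranges stated above, relative to the real-use image.
MODE_PARAMS = {
    "real":    {"scale": 1.0, "luminance_gain": 1.0, "contrast_gain": 1.0, "frame_hz": 30},
    "virtual": {"scale": 1.3, "luminance_gain": 3.0, "contrast_gain": 1.5, "frame_hz": 120},
}

def prepare_frame(frame, mode):
    """Apply the per-mode adjustments to one frame (rows of luminance values in [0, 1]).
    Returns the adjusted frame together with the scale factor and frame frequency,
    which the actual display pipeline is assumed to apply."""
    p = MODE_PARAMS[mode]
    # Raise luminance for the virtual-use image so it stays viewable behind the
    # semi-transmissive curved mirror 21 or the windshield 25.
    out = [[min(1.0, v * p["luminance_gain"]) for v in row] for row in frame]
    # Stretch contrast about the mid level (0.5) by the contrast gain.
    out = [[min(1.0, max(0.0, 0.5 + (v - 0.5) * p["contrast_gain"])) for v in row] for row in out]
    return out, p["scale"], p["frame_hz"]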
In the present embodiment, the drive 30 can control shifts from the first positioning to the second positioning and from the second positioning to the first positioning by, for example, changing the spatial position of the display 10. The movements of the display 10 in the present embodiment are translational. In the present embodiment, the position of the curved mirror 21 may be fixed. The drive 30 may have any structure that allows the display 10 to move from one of the first spatial position corresponding to the first positioning and the second spatial position corresponding to the second positioning to the other. Examples of such a structure include an electric slider and an electric cylinder controlled by, for example, a servo motor. The drive 30 may stop the display 10 at only the two positions, namely the first spatial position and the second spatial position, or may stop the display 10 at any spatial position between the first spatial position and the second spatial position as well.
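A minimal sketch of this drive control, assuming a servo-driven electric slider with a simple move_to interface and example stop positions in millimeters (all names and values below are assumptions for illustration), follows.

class StubSlider:
    # Stand-in for a servo-controlled electric slider or electric cylinder (interface assumed).
    def move_to(self, position_mm):
        print(f"slider moving to {position_mm:.1f} mm")

class Drive:
    """Moves the display 10 along the optical axis between two stop positions."""
    FIRST_POSITION_MM = 0.0    # display closer to the optics than the object focal point f: virtual image V
    SECOND_POSITION_MM = 30.0  # display farther from the optics than the object focal point f: real image R

    def __init__(self, slider):
        self.slider = slider

    def go_to(self, positioning):
        # Only the two stop positions are shown; intermediate stops are also possible.
        target = self.FIRST_POSITION_MM if positioning == "first" else self.SECOND_POSITION_MM
        self.slider.move_to(target)

drive = Drive(StubSlider())
drive.go_to("first")   # positioning for the virtual image V
drive.go_to("second")  # positioning for the real image R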
In one or more embodiments of the present disclosure, the first spatial position is a spatial position corresponding to the first positioning, and the second spatial position is a spatial position corresponding to the second positioning. The first spatial position and the second spatial position may indicate the spatial positions of one or more components, which may vary between embodiments. For example, the first spatial position and the second spatial position may indicate the spatial positions of the display 10, the imaging optical system 20, and one or more optical elements in the imaging optical system 20.
The user of the aerial image display device 1 can view a virtual image V when the display 10 and the imaging optical system 20 are at the first positioning, and can view a real image R when the display 10 and the imaging optical system 20 are at the second positioning.
The user of the aerial image display device 1 can view a virtual image V when the display 10 is at a spatial position corresponding to the first positioning, and can view a real image R when the display 10 is at a spatial position corresponding to the second positioning. In the present embodiment, the aerial image display device 1 can thus display a real image R and a virtual image V in the air as the drive 30 changes the positions of the display 10 and the optical element relative to each other.
The user of the aerial image display device 1 can view a virtual image V when the curved mirror 21 is at the first spatial position, and can view a real image R when the curved mirror 21 is at the second spatial position. In the present embodiment, the aerial image display device 1 can thus display a real image R and a virtual image V in the air as the drive 30 changes the positions of the display 10 and the optical element relative to each other.
The relative positioning of an optical element in the imaging optical system 20 and the display 10 can be changed by the drive 30 moving the optical element (curved mirror 21) or the display 10 to shift between the first positioning and the second positioning. The direction in which the drive 30 moves the display 10 or the optical element may be a direction including a component parallel to the optical axis of the optical element that is closest to the display 10 on the optical path of the imaging optical system 20. In the present embodiment described above, the optical element closest to the display 10 on the optical path is the curved mirror 21. Such movement allows a small space to be used for moving the display 10 or an optical element. For example, aligning the direction of movement of the display 10 or the optical element with the optical axis minimizes the space used for moving the display 10 or the optical element.
A real image R and a virtual image V viewable to the user may be the same or different images. For different images, the controller 40 displays an image to be viewed by the user as a virtual image V when the display 10 and the optical element are at the first positioning, and displays an image to be viewed by the user as a real image R when the display 10 and the optical element are at the second positioning. Thus, the user receives different information from the real image R and the virtual image V displayed in the air.
The real image R is an inverted image. The virtual image V is an erect image. The controller 40 switches, for example, the up-down orientation of the image to be displayed on the display 10 between the orientation for the first positioning and the orientation for the second positioning. The controller 40 reads, for example, an image with a different up-down orientation from a storage and causes the image to be displayed on the display 10. Thus, when the display is switched between the real image R and the virtual image V, the user can view the image in the correct up-down orientation in the air.
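A minimal sketch of this orientation handling, assuming the source image is held as a list of pixel rows ordered from top to bottom (a representation chosen only for illustration), follows.

def oriented_source_image(image_rows, positioning):
    # The real image R formed in the air is inverted, so the source image is
    # pre-flipped up-down for the second positioning; the virtual image V is
    # erect, so the image for the first positioning is used as-is.
    if positioning == "second":
        return image_rows[::-1]
    return image_rows

# Example: a 3-row image whose rows are labeled "top", "middle", "bottom".
print(oriented_source_image(["top", "middle", "bottom"], "second"))  # ['bottom', 'middle', 'top']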
When a displayed image is distorted, the display 10 may display an image with correction to reduce such distortion caused by an optical element. The distortion may be corrected using, for example, predefined distortion correction tables. Distortion correction tables to be used may be defined as appropriate for the optical elements in the imaging optical system 20. With the real image R being an inverted image and the virtual image V being an erect image, the distortion correction tables may include a real image correction table and a virtual image correction table. The controller 40 may switch, for example, the distortion correction table between the distortion correction table for the first positioning and the distortion correction table for the second positioning. The controller 40 may use, for example, different distortion correction tables for the first positioning and for the second positioning. Thus, the aerial image display device 1 switching between a real image R and a virtual image V can provide images corrected for distortion to the user.
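A minimal sketch of such table-based correction, assuming each table maps an output pixel to the source pixel to be sampled (the table format and names are assumptions for illustration), follows.

def apply_distortion_table(image, table):
    # image: list of rows of pixel values; table: same size, each entry a (src_y, src_x)
    # pair precomputed for the optical elements at the first or the second positioning.
    height, width = len(image), len(image[0])
    corrected = [[0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            src_y, src_x = table[y][x]
            if 0 <= src_y < height and 0 <= src_x < width:
                corrected[y][x] = image[src_y][src_x]
    return corrected

def corrected_frame(image, positioning, tables):
    # tables: {"first": virtual-image correction table, "second": real-image correction table}
    return apply_distortion_table(image, tables[positioning])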
The user of the aerial image display device 1 can view a virtual image V when the curved mirror 21 is at the first spatial position, and can view a real image R when the curved mirror 21 is at the second spatial position. In the present embodiment, the aerial image display device 1 can thus display a real image R and a virtual image V in the air as the drive 30 changes the positions of the display 10 and the optical element relative to each other.
The user of the aerial image display device 1 can view a virtual image V when the display 10 is at the first spatial position, and can view a real image R when the display 10 is at the second spatial position. In the present embodiment, the aerial image display device 1 can thus display a real image R and a virtual image V in the air as the drive 30 changes the positions of the display 10 and the optical elements relative to one another.
The user of the aerial image display device 1 can view a virtual image V when the plane mirror 22 is at the first spatial position, and can view a real image R when the plane mirror 22 is at the second spatial position. In the present embodiment, the aerial image display device 1 can thus display a real image R and a virtual image V in the air as the drive 30 changes the positions of the display 10 and the optical elements relative to one another.
The user of the aerial image display device 1 can view a virtual image V when the display 10 is at the first spatial position, and can view a real image R when the display 10 is at the second spatial position. In the present embodiment, the aerial image display device 1 can thus display a real image R and a virtual image V in the air as the drive 30 changes the positions of the display 10 and the optical elements relative to one another.
The user of the aerial image display device 1 can view a virtual image V when the curved mirror 23 is at the first spatial position, and can view a real image R when the curved mirror 23 is at the second spatial position. In the present embodiment, the aerial image display device 1 can thus display a real image R and a virtual image V in the air as the drive 30 changes the positions of the display 10 and the optical elements relative to one another.
The user of the aerial image display device 1 can view a virtual image V when the display 10 is at the first spatial position, and can view a real image R when the display 10 is at the second spatial position. In the present embodiment, the aerial image display device 1 can thus display a real image R and a virtual image V in the air as the drive 30 changes the positions of the display 10 and the optical elements relative to one another.
The user of the aerial image display device 1 can view a virtual image V when the lens 24 is at the first spatial position, and can view a real image R when the lens 24 is at the second spatial position. In the present embodiment, the aerial image display device 1 can thus display a real image R and a virtual image V in the air as the drive 30 changes the positions of the display 10 and the optical elements relative to one another.
When the imaging optical system 20A is a catadioptric optical system, as in the present embodiment, the lens 24 may be replaced with another lens instead of being moved. For example, two lenses 24 with different lens characteristics may be retained, and the drive 30 may move the two lenses 24 to switch between them. Replacing the lens 24 changes the position of the object focal point f of the imaging optical system 20A. This switches between the first positioning in which the display 10 is located closer to the imaging optical system 20A than the object focal point f and the second positioning in which the display 10 is located farther from the imaging optical system 20A than the object focal point f.
In one or more embodiments of the present disclosure, examples of the movable body include a vehicle, a vessel, and an aircraft. Examples of the vehicle include an automobile, an industrial vehicle, a railroad vehicle, a community vehicle, and a fixed-wing aircraft traveling on a runway. Examples of the automobile include a passenger vehicle, a truck, a bus, a motorcycle, and a trolley bus. Examples of the industrial vehicle include an industrial vehicle for agriculture and an industrial vehicle for construction. Examples of the industrial vehicle include a forklift and a golf cart. Examples of the industrial vehicle for agriculture include a tractor, a cultivator, a transplanter, a binder, a combine, and a lawn mower. Examples of the industrial vehicle for construction include a bulldozer, a scraper, a power shovel, a crane vehicle, a dump truck, and a road roller. Examples of the vehicle include a human-powered vehicle. The classification of the vehicle is not limited to the above examples. Examples of the automobile include an industrial vehicle traveling on a road. One type of vehicle may fall within multiple classes. Examples of the vessel include a jet ski, a boat, and a tanker. Examples of the aircraft include a fixed-wing aircraft and a rotary-wing aircraft. In the example described below, the movable body includes a windshield 25. The movable body may be any of the above examples including a combiner in place of the windshield 25.
The camera 50 is installed on the movable body. The camera 50 captures an image of a space expected to include, for example, the face, or the upper body, of the driver or a user of the movable body. The camera 50 may be installed at any position inside or outside the movable body. The camera 50 may be installed, for example, inside or on a dashboard in the movable body. The camera 50 may be installed, for example, inside another device such as an air duct.
The camera 50 may be an infrared camera that receives infrared light and generates images. The camera 50 may function both as an infrared camera and a visible light camera. The camera 50 may include, for example, a charge-coupled device (CCD) image sensor or a complementary metal-oxide-semiconductor (CMOS) image sensor.
The camera 50 outputs a captured image to the controller 40. The camera 50 may output a captured image to the controller 40 through wired communication or wireless communication. The wired communication may include, for example, communication using a controller area network (CAN). The controller 40 may detect the position of eyes 5 of the user based on the captured image output from the camera 50. The controller 40 changes the image to be displayed on the display 10 based on the detected position of the eyes 5 of the user.
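As one possible way to implement this detection (the disclosure does not specify a method), the controller 40 could run OpenCV's pre-trained Haar eye cascade on each frame from the camera 50; the sketch below returns the mean position of the detected eyes in image coordinates.

import cv2

# Pre-trained Haar cascade for eye detection shipped with OpenCV (one possible detector).
_eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

def detect_eye_position(frame_bgr):
    """Return the mean (x, y) position of the detected eyes in pixels, or None if
    no eyes are found. frame_bgr is a BGR image from the camera 50."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    eyes = _eye_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(eyes) == 0:
        return None
    centers = [(x + w / 2.0, y + h / 2.0) for (x, y, w, h) in eyes]
    mean_x = sum(c[0] for c in centers) / len(centers)
    mean_y = sum(c[1] for c in centers) / len(centers)
    return mean_x, mean_y  # the controller 40 can adjust the displayed image based on this position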
In the present embodiment, the imaging optical system 20B in the aerial image display device 1B includes optical elements that are a curved mirror 21 and a windshield 25. The image light emitted from the display 10 is reflected from the curved mirror (concave mirror) 21, reaches the windshield 25, and is then reflected from the windshield 25 to reach the eyes 5 of the user. The user can thus view an aerial image.
In the present embodiment, the drive 30 can control shifts from the first positioning to the second positioning and from the second positioning to the first positioning by, for example, changing the spatial position of the display 10. In the present embodiment, the drive 30 can move the display 10 to the first spatial position and to the second spatial position. The movements of the display 10 in the present embodiment are translational. In the present embodiment, the position of the curved mirror 21 may be fixed. The drive 30 may have any structure that allows the display 10 to move from one of the first spatial position and the second spatial position to the other. Examples of such a structure include an electric slider and an electric cylinder. The drive 30 may stop the display 10 at only the two positions, namely the first spatial position and the second spatial position, or may stop the display 10 at any position between the first spatial position and the second spatial position as well.
The user of the aerial image display device 1B can view a virtual image V with the image light reflected from the windshield 25 when the display 10 is at the first spatial position, and can view a real image R with the image light reflected from the windshield 25 when the display 10 is at the second spatial position. In the present embodiment, the aerial image display device 1B can thus display a real image R and a virtual image V in the air as the drive 30 changes the positions of the display 10 and the optical elements relative to one another.
The controller 40 may control the drive 30 based on the position of the eyes 5 of the user. The controller 40 may move the display 10 based on the position of the eyes 5 of the user to switch the aerial display between a real image R and a virtual image V. The controller 40 may also control the drive 30 based on, for example, the operating state (e.g., stopped or traveling) of the movable body. The controller 40 may move the display 10 based on the operating state of the movable body to switch the aerial display between a real image R and a virtual image V.
In a variation of the tenth embodiment, the display 10 may be fixed, and the spatial position of the curved mirror 21 may be shifted between the first spatial position and the second spatial position to switch the aerial display between a real image R and a virtual image V.
Upon switching from a real image R to a virtual image V or from a virtual image V to a real image R, the eyes of the user may fail to respond to the switch immediately. The user may take time before viewing the new image, or may feel discomfort from visually induced motion sickness. The controller 40 may thus cause, for example, the display 10 to display a black image when switching the image to be displayed on the display 10 between an image for display as a real image R and an image for display as a virtual image V. Displaying a black image reduces the likelihood of the viewability being lowered, and thus reduces discomfort.
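A minimal sketch of this switching sequence, assuming hypothetical display and drive objects with show, black_image, and go_to methods and an arbitrary blanking time (none of these are specified in the disclosure), follows.

import time

def switch_aerial_image(display, drive, new_positioning, new_image, blank_seconds=0.3):
    """Blank the display with a black image while the drive shifts between the
    first and second positionings, then show the new image. The interfaces and
    the 0.3 s blanking time are assumptions for illustration."""
    display.show(display.black_image())  # hide the transition from the user
    drive.go_to(new_positioning)         # shift between the positionings
    time.sleep(blank_seconds)            # allow the user's eyes to adapt
    display.show(new_image)              # image oriented and corrected for the new positioning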
The aerial image display device 1B may also be implemented as in the other embodiments described below. The camera 50 may capture an image of the user to obtain an image of the pupils of the eyes 5 of the user. The controller 40 may perform control to enlarge the image to be displayed on the display 10 when the pupils enlarge. Enlarged pupils indicate that the user is gazing at the image or a part of the image. Enlarging the image to be displayed on the display 10 allows the user to view the image or the part of the image more easily. This allows the user aboard a transport vehicle to avoid unsafe situations more easily. The ratio of enlargement of the image may be, but is not limited to, more than one time and about three times or less. A part of the image being gazed at by the user may be detected, and the part of the image being gazed at by the user may be enlarged.
The camera 50 may capture an image of the user to obtain an image of the pupils of the eyes 5 of the user. The controller 40 may perform control to increase the luminance of the image to be displayed on the display 10 when the pupils enlarge. This produces the same or similar advantageous effects as described above, allowing the user aboard a transport vehicle to avoid unsafe situations more easily. The luminance of the image may be increased by a factor of, but not limited to, more than one and about ten or less. A part of the image being gazed at by the user may be detected, and the luminance may be increased in the part of the image being gazed at by the user.
The camera 50 may capture an image of the user to obtain an image of the pupils of the eyes 5 of the user. The controller 40 may perform control to increase the contrast of the image to be displayed on the display 10 when the pupils enlarge. This produces the same or similar advantageous effects as described above, allowing the user aboard a transport vehicle to avoid unsafe situations more easily. The contrast of the image may be increased by a factor of, but not limited to, more than one and about two or less. A part of the image being gazed at by the user may be detected and the contrast may be increased in the part of the image being gazed at by the user.
The camera 50 may capture an image of the user to obtain an image of the pupils of the eyes 5 of the user. The controller 40 may perform control to increase the frame frequency of the image to be displayed on the display 10 when the pupils enlarge. This produces the same or similar advantageous effects as described above, allowing the user aboard a transport vehicle to avoid unsafe situations more easily. The frame frequency of the image may be increased by a factor of, but not limited to, more than one and about eight or less.
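A minimal sketch combining the pupil-based controls described above, assuming a measured pupil diameter, a per-user baseline, and example boost factors chosen within the stated ranges (the threshold and factors are assumptions), follows.

def adjust_for_pupil(params, pupil_diameter_mm, baseline_diameter_mm, threshold=1.2):
    """Return a copy of the display parameters, boosted when the measured pupil
    diameter exceeds the baseline by the threshold factor (an assumed criterion
    for 'the pupils enlarge')."""
    if pupil_diameter_mm <= baseline_diameter_mm * threshold:
        return params
    boosted = dict(params)
    boosted["scale"] *= 1.5                                  # enlargement: more than 1x, up to about 3x
    boosted["luminance_gain"] *= 2.0                         # luminance: more than 1x, up to about 10x
    boosted["contrast_gain"] *= 1.3                          # contrast: more than 1x, up to about 2x
    boosted["frame_hz"] = min(240, boosted["frame_hz"] * 2)  # frame frequency: more than 1x, up to about 8x
    return boosted

# Example with the real-use parameters from the earlier sketch.
print(adjust_for_pupil({"scale": 1.0, "luminance_gain": 1.0, "contrast_gain": 1.0, "frame_hz": 30}, 6.0, 4.0))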
The aerial image display device according to one or more embodiments of the present disclosure may be implemented in forms 1 to 15 described below.
In one or more embodiments of the present disclosure, the terms first, second, and others are identifiers for distinguishing the components. The identifiers of the components distinguished with first, second, and others in the present disclosure are interchangeable. For example, the first reflective element is interchangeable with the second reflective element. The identifiers are to be interchanged together. The components for which the identifiers are interchanged are also to be distinguished from one another. The identifiers may be eliminated. The components without such identifiers can be distinguished with reference numerals. The identifiers such as first and second in the present disclosure alone should not be used to determine the order of components or to suggest the existence of an identifier with a smaller number.
The aerial image display device according to one or more embodiments of the present disclosure can display a real image and a virtual image in the air.
The present disclosure may be embodied in various forms without departing from the spirit or the main features of the present disclosure. The embodiments described above are thus merely illustrative in all respects. The scope of the present disclosure is defined not by the description given above but by the claims. Any variations and alterations contained in the claims fall within the scope of the present disclosure.
Number | Date | Country | Kind
2021-135174 | Aug 2021 | JP | national

Filing Document | Filing Date | Country | Kind
PCT/JP2022/030110 | 8/5/2022 | WO |