The present invention relates to an image control apparatus, a display apparatus, a movable body, and an image control method.
Development of a head-up display (HUD) installed in a movable body such as a vehicle, a ship, an aircraft, or an industrial robot is in progress. The HUD presents information directly in the occupant's field of view, and provides the occupant (driver, passenger, etc.) with various kinds of information including images. In the HUD, the generated light image is reflected in a direction toward the occupant by the windshield or a combiner, etc., and is displayed as if the image exists at a virtual image position in front of the occupant's line of sight. As the displayed image, there is known an image for displaying an arc-shaped or trapezoidal-shaped line segment that has an overall shape protruding upward (see, for example, Patent Literature 1).
PTL 1: WO2014/174575
The arc-shaped or trapezoidal-shaped line segment protruding upward is output such that the width in the left-right direction of the arc or trapezoid, as viewed from the driver, represents the vehicle width on the road. This line segment display functions as a driving support tool for a driver who is unaccustomed to judging the vehicle width; however, driving related information other than the vehicle width is not presented. Furthermore, how to display the driving related information to the occupant with good visibility has not been sufficiently considered.
Among the pieces of driving related information, speed related information is of particular interest to the occupant; however, this speed related information changes from moment to moment depending on the situation. There is a need for an image display method that enables the occupant to easily recognize the speed related information.
An object of the present invention is to provide an image control technology that enables an occupant of a movable body to recognize the difference between the present traveling speed and other speed related information, with good visibility.
An aspect of the present invention provides an image control apparatus installed in a movable body, the image control apparatus including a controller configured to generate data of a display image in which a present moving speed of the movable body is indicated together with other speed related information, and change a display mode related to a difference between the present moving speed and a speed indicated by the other speed related information, upon detecting that a predetermined condition is satisfied.
According to the present disclosure, an occupant of a movable body is able to recognize the difference between the present traveling speed and other speed related information, with good visibility.
The display apparatus 1 is installed, for example, on or in the dashboard of the automobile 300, and projects a light image onto a predetermined projection area 311 of a windshield 310 in front of the occupant P (driver, passenger, etc.).
The display apparatus 1 includes an optical apparatus 10 and a control apparatus 20. The control apparatus 20 mainly generates image data of an image to be projected onto the windshield 310 and controls the display. The optical apparatus 10 projects an image formed based on the generated image data onto the projection area 311 of the windshield 310. The configuration of the optical apparatus 10 is not directly related to the present invention, and thus the detailed configuration is not illustrated. For example, as described later, the optical apparatus 10 may include a laser light source, a light scanning device that two-dimensionally scans the laser light output from the laser light source over a screen, and a projection optical system (for example, a concave mirror, etc.) for projecting the image light of an intermediate image, formed on the screen, onto the projection area 311 of the windshield 310. By projecting the image light onto the projection area 311, the driver is made to visually recognize a virtual image. Note that, instead of the laser light source, the screen, and the light scanning device, a light emitting diode (LED) or the like may be used as the light source, and a liquid crystal element or a Digital Micromirror Device (DMD) element may be used as an image forming unit.
The projection area 311 of the windshield 310 is formed of a transmission/reflection member that reflects part of the incident light and transmits the rest. The light image formed on the screen is projected by the projection optical system included in the optical apparatus 10, is reflected by the projection area 311, and travels toward the occupant P. When the reflected light enters the pupils of the occupant P along the light paths indicated by the broken lines, the occupant P visually recognizes the image projected on the projection area 311 of the windshield 310. At this time, the occupant P perceives the light image as if it entered his or her pupils from a virtual image position I, along the light paths indicated by the dotted lines. The displayed image is thus recognized as if it existed at the virtual image position I.
Although a camera 5 is installed in the automobile 300 in addition to the display apparatus 1, the camera 5 is not essential. The camera 5 captures, for example, an image of the environment in front of or beside the automobile 300. The camera 5 may measure the inter-vehicle distance between the automobile 300 and a preceding vehicle in conjunction with an ACC mode. From the image acquired by the camera 5, speed related information such as a speed sign may be extracted and used for the highlighted display control according to the embodiment. Details of the highlighted display control based on the speed related information will be described later.
The control apparatus 20 includes a field-programmable gate array (FPGA) 201, a central processing unit (CPU) 202, a read-only memory (ROM) 203, a random access memory (RAM) 204, an interface (hereinafter referred to as “I/F”) 205, a bus line 206, an LD driver 207, a MEMS controller 208, and a solid state drive (SSD) 209 as an auxiliary storage device. Furthermore, a detachable recording medium 211 may be included.
The FPGA 201 controls the operation of the LD driver 207 and the MEMS controller 208. The LD driver 207 generates and outputs a drive signal for driving the LD 101 under the control of the FPGA 201. The drive signal controls the light emission timing of each of the laser elements that emit light of R, G, and B. The MEMS controller 208 generates and outputs a MEMS control signal under the control of the FPGA 201, and controls the scan angle and scan timing of the MEMS 102. Instead of the FPGA 201, another logic device such as a programmable logic device (PLD) may be used.
The CPU 202 controls the overall image data processing of the display apparatus 1. The ROM 203 stores various programs including programs executed by the CPU 202 to control each function of the display apparatus 1. The RAM 204 is used as a work area of the CPU 202.
The I/F 205 is an interface for communicating with an external controller, etc., and is connected to, for example, a vehicle navigation device and various sensor devices via a Controller Area Network (CAN) of the automobile 300. The camera 5 for capturing the traveling environment of the vehicle through the windshield 310 may be connected to the I/F 205.
The display apparatus 1 can read and write information in the recording medium 211 via the I/F 205. An image processing program for implementing the processing in the display apparatus 1 may be provided by the recording medium 211. In this case, the image processing program is installed in the SSD 209 from the recording medium 211 via the I/F 205. The installation of the image processing program is not necessarily performed with the recording medium 211, and may be downloaded from another computer via a network. The SSD 209 stores the installed image processing program and also stores necessary files and data.
Examples of the recording medium 211 include portable recording media such as a flexible disk, a Compact Disk Read-Only Memory (CD-ROM), a digital versatile disc (DVD), a secure digital (SD) memory card, and a Universal Serial Bus (USB) memory. Furthermore, as the auxiliary storage device, a Hard Disk Drive (HDD) or a flash memory, etc., may be used instead of the SSD 209. The auxiliary storage device such as the SSD 209 and the recording medium 211 are both computer-readable recording media.
The display apparatus 1 is connected to electronic devices such as a vehicle navigation device 400, a sensor group 500, and the camera 5 via the I/F 205 and a CAN. The display apparatus 1 acquires external information from these electronic devices and uses the information to determine whether to perform highlighting (highlighted display). The vehicle navigation device 400 holds navigation information such as a road map, global positioning system (GPS) information, speed limit areas, traffic regulation information, the speed limit of each road, and the like, and generates a route navigation image according to the user's input operation. The image control unit 250 uses at least a part of the navigation information held by the vehicle navigation device 400 to determine whether highlighting is necessary, the timing of performing highlighting, and the like.
The sensor group 500 includes an acceleration sensor, a gyro sensor, a laser radar device, a weather sensor, a brightness sensor, and the like, and detects information pertaining to the automobile 300 such as the behavior, the state, the surrounding state, and the distance to a vehicle traveling ahead, etc. The information obtained by the sensor group 500 is supplied to the image control unit 250, and at least a part of the sensor information is used for the determination of highlighting.
The camera 5 is a monocular camera, a stereo camera, an omnidirectional camera or the like, and detects the condition of the traveling path, a vehicle ahead, a bicycle, a person, a sign, and the like. The information acquired by the camera 5 is supplied to the image control unit 250, and at least a part of the camera information is used for the determination of highlighting.
The image data generating unit 820 includes a data adjusting unit 8210. The image data generating unit 820 generates data of an image to be projected on the projection area 311, and determines whether to generate image data used for highlighting, based on the information input from the information input unit 800. When it is determined that highlighting is necessary, the data adjusting unit 8210 adjusts the generated image data to be image data used for highlighting.
The image rendering unit 840 includes a control unit 8410 and controls the operation of the optical apparatus 10 according to the image data generated by the image data generating unit 820. The image rendering unit 840 may be implemented by the FPGA 201, the LD driver 207, and the MEMS controller 208. When the image data generating unit 820 generates data of a highlight image, the image rendering unit 840 forms a light image based on the data of the highlight image, and the formed light image is projected on the projection area 311 of the windshield 310. Specific examples of the highlight image will be described below.
In
In this example, numerical values are displayed only on some of the scale marks 35, that is, on scale marks near the present vehicle speed; the numerical values may be displayed at regular intervals (for example, every 20 km). Either one of the display of the numerical values of the scale marks 35, or the vehicle speed display by the characters 34, may be omitted. Furthermore, the scale marks 35 may be omitted.
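As a non-limiting sketch, the selection of which scale marks 35 receive numerical values (for example, the multiples of a fixed step near the present vehicle speed) can be expressed as follows. The function name, parameters, and default values here are illustrative assumptions, not part of the embodiment:

```python
def label_values(present_kmh, step=20, span=1, lo=0, hi=140):
    """Pick which scale marks get numeric labels: the multiples of
    `step` (e.g. every 20 km/h) within `span` steps of the present
    vehicle speed, clipped to the gauge range [lo, hi].
    Illustrative sketch only; names and defaults are assumptions."""
    center = round(present_kmh / step) * step
    return [v for v in range(center - span * step, center + span * step + 1, step)
            if lo <= v <= hi]
```

For a present speed of 67 km/h, the labeled marks would be 40, 60, and 80 km/h, matching the idea of labeling only the marks near the present vehicle speed.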
In addition to the pointer 32 indicating the present vehicle speed, an indicator 33 indicating a reference speed is displayed outside the speed gauge 31. The reference speed is speed information serving as a reference for traveling, such as a setting speed of the ACC, a speed limit of a road, a regulatory speed temporarily set by speed regulations, and the like. In this example, it is assumed that the set speed of the ACC mode is “70 km/h”.
When the position of the indicator 33 indicating the reference speed and the position of the pointer 32 indicating the present vehicle speed are close, it may be difficult for the occupant to recognize the difference instantaneously. Therefore, as illustrated in
In
The intervals between the scale marks 35 do not necessarily have to be symmetrically or equally expanded on both sides of the pointer 32 representing the present vehicle speed; for example, the intervals between the scale marks 35 may be unevenly expanded in the enlarged area E including at least one of the present vehicle speed and the reference speed (the ACC set speed in this example). The scale marks 35 may be omitted, and only the difference between the indicator 33 and the pointer 32 may be displayed in a highlighted manner.
As illustrated in
The switching from the display of the regular mode of
The reference speed is not limited to the ACC set speed. For example, the road speed limit may be used as the reference speed. When traveling on an expressway, the reference speed may be set to 100 km/h, and when traveling in an urban area, the reference speed may be set to 50 km/h. Such road information may be acquired from the vehicle navigation device 400, the camera 5, and the like. When certain conditions related to the vehicle speed are satisfied, or when there is a user input, the display of the scale marks 35 is controlled, and highlighting is performed so that the difference between the present vehicle speed and the reference speed can be easily viewed.
The display control can be performed, for example, by storing an object including the speed gauge 31, the pointer 32, the indicator 33, and the scale marks 35 in the ROM 203 in advance, and performing image adjustment to expand the intervals between the scale marks 35 in the area around the present vehicle speed and/or the reference speed at the timing of highlighting.
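The image adjustment that expands the intervals between the scale marks 35 in the enlarged area E can be sketched, in a non-limiting way, as a piecewise-linear mapping from speed to a normalized gauge position, where the segment around the present vehicle speed is given extra weight. The function name, parameters, and the particular mapping below are illustrative assumptions:

```python
def scale_position(speed_kmh, present_kmh, full_range=(0.0, 140.0),
                   zoom_half_width=10.0, zoom_factor=2.0):
    """Map a speed value to a normalized gauge position in [0, 1].

    Speeds within +/- zoom_half_width of the present speed get
    zoom_factor times the per-km/h spacing of the rest of the scale,
    so the area around the present speed (the enlarged area E) is
    expanded. Illustrative sketch; all names/values are assumptions."""
    lo, hi = full_range
    z_lo = max(lo, present_kmh - zoom_half_width)
    z_hi = min(hi, present_kmh + zoom_half_width)
    # Total weighted length: the zoomed segment counts zoom_factor per km/h.
    total = (z_lo - lo) + zoom_factor * (z_hi - z_lo) + (hi - z_hi)
    s = min(max(speed_kmh, lo), hi)
    if s <= z_lo:
        w = s - lo
    elif s <= z_hi:
        w = (z_lo - lo) + zoom_factor * (s - z_lo)
    else:
        w = (z_lo - lo) + zoom_factor * (z_hi - z_lo) + (s - z_hi)
    return w / total
```

With a present speed of 70 km/h, each km/h between 60 and 80 occupies twice the gauge distance of a km/h elsewhere, while the end points of the gauge remain fixed.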
The characters 34 indicating the present vehicle speed may not always be displayed. The display may be turned off after being displayed for a fixed time (for example, several seconds) after the speed change. Furthermore, the numerical values appended to the scale marks 35 may not always be displayed, and may be displayed at predetermined time intervals.
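The timed display of the characters 34 (visible for a fixed time after each speed change, then turned off) can be sketched as a small state holder, as follows. This is an illustrative assumption; a real implementation would be driven by the rendering clock of the image rendering unit 840:

```python
class TimedReadout:
    """Show the numeric speed readout for hold_s seconds after each
    change of the value, then hide it. Illustrative sketch only."""

    def __init__(self, hold_s=3.0):
        self.hold_s = hold_s       # e.g. "several seconds"
        self.last_value = None
        self.changed_at = None

    def visible(self, value, now_s):
        # Restart the hold timer whenever the displayed value changes.
        if value != self.last_value:
            self.last_value = value
            self.changed_at = now_s
        return (self.changed_at is not None
                and (now_s - self.changed_at) <= self.hold_s)
```

Called once per rendered frame with the current speed and a monotonic timestamp, the readout stays visible for the hold period after every speed change and is hidden otherwise.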
For example, when the present moving speed becomes 85 km/h and approaches the speed limit, and the difference from the speed limit decreases, the display mode is changed from the regular display to the highlighted display. For example, as illustrated in (b) of
For example, the indicator 33 may be displayed in an eye-catching color such as yellow or orange, or the numerical value of the reference speed may be surrounded by a colored circle so that the speed mark 36 is displayed like a road sign. By this display, the occupant can recognize the difference between the present vehicle speed and the reference speed with good visibility.
In (a) of
In (a) of
Also in
When the present vehicle speed approaches the reference speed and the difference decreases more than an allowable range, highlighted display is performed as illustrated in (b) of
In
Examples of highlighting the change in the difference in a manner that can be easily recognized, when the difference between predetermined speed related information and the present speed becomes smaller, are not limited to the examples described above. The length of the pointer 32A may be extended in the radial direction of the speed gauge 31A to make it easy to recognize that the pointer 32A is approaching the reference speed mark 36. Characters 34B indicating the speed may be highlighted. Examples of highlighting of the characters 34B include, but are not limited to, changing the color of the characters 34B, changing the thickness of the characters 34B, adding an underline or a frame, changing the display position, and the like. It is not necessary to perform all of the operations of (c) in
Note that in any of the examples illustrated in
In
In any of the examples illustrated in
The image control unit 250 determines whether to perform highlighting (step S12). Whether to perform highlighting is determined based on whether the present vehicle speed has come closer to the reference speed than an allowable range, whether the acceleration level of the present vehicle speed exceeds a predetermined level, or whether the user has input an instruction to perform highlighting. The determination of whether to perform highlighting (step S12) and the acquisition of the speed related information (step S11) may be performed in either order or in parallel.
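The determination of step S12 combines the three conditions above with a logical OR, which can be sketched as follows. The threshold values (10 km/h margin, 5 km/h/s acceleration limit) are illustrative assumptions, as the specification only refers to an "allowable range" and a "predetermined level":

```python
def should_highlight(present_kmh, reference_kmh, margin_kmh=10.0,
                     accel_kmh_s=0.0, accel_limit=5.0, user_request=False):
    """Step S12 sketch: highlight when the present speed approaches the
    reference speed beyond the allowable margin, when acceleration
    exceeds a predetermined level, or when the user requests it.
    Thresholds and names are illustrative assumptions."""
    near_reference = abs(reference_kmh - present_kmh) <= margin_kmh
    hard_accel = abs(accel_kmh_s) > accel_limit
    return user_request or near_reference or hard_accel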
When highlighting is to be performed (YES in step S12), the display image is adjusted to generate and output image data for highlighting the difference between the present vehicle speed and the reference speed (step S13). The highlighting may be a display method in which the intervals between the scale marks 35 accompanying the speed gauge 31 are partially expanded near the present vehicle speed, a display method in which the speed mark 36 is disposed on the line segment to facilitate the comparison with the present vehicle speed, or a display method in which information is highlighted with thick lines, a color bar, blinking, etc., when the present vehicle speed approaches the reference speed, as illustrated in
When highlighting is not to be performed, (NO in step S12), regular display is performed (step S14). Thereafter, it is determined whether the image display control has ended (step S15). The image display control is ended, for example, when traveling is finished and the engine is turned off. Until the image display control is ended, steps S11 to S15 are repeated.
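The overall flow of steps S11 to S15 can be sketched as the following control loop. The callback parameters are hypothetical stand-ins for the internals of the image control unit 250, introduced here only for illustration:

```python
def display_control_loop(get_speed_info, decide_highlight,
                         render_highlight, render_regular, is_finished):
    """Skeleton of steps S11-S15 as a loop over rendering cycles.
    All callbacks are hypothetical stand-ins, not the patent's API."""
    while True:
        info = get_speed_info()        # S11: acquire speed related information
        if decide_highlight(info):     # S12: highlighting needed?
            render_highlight(info)     # S13: highlighted display
        else:
            render_regular(info)       # S14: regular display
        if is_finished():              # S15: e.g. traveling finished, engine off
            break
```

The loop repeats steps S11 to S15 until the end condition (for example, the engine being turned off) is detected, matching the flow described above.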
By this display control method, the display mode is changed so that the difference between the present vehicle speed and the speed related information is highlighted, and the occupant can easily view the difference.
The highlighting step S13 may include the steps of generating data of a display image in which the present moving speed is displayed together with other speed related information, highlighting the difference between the present moving speed and the other speed related information, and scanning light for rendering the display image in which the difference is highlighted onto the predetermined projection area 311 of the automobile 300, to form a visible virtual image of the display image.
When the display control method is executed by a program, a program for display control may be stored in advance in the ROM 203 or the SSD 209, and the CPU 202 may read and execute the program. In this case, the CPU 202 executes at least the following processes.
(a) Generating data of a display image indicating the present moving speed of the movable body together with other speed related information.
(b) Changing the display mode related to the difference between the present moving speed and the speed indicated by other speed related information, in a case where a predetermined condition is satisfied.
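Processes (a) and (b) executed by the CPU 202 can be sketched, in a non-limiting way, as a function assembling the display image data. The dictionary field names are illustrative assumptions, not the patent's data format:

```python
def build_display_image(present_kmh, reference_kmh, condition_met):
    """Sketch of processes (a) and (b): assemble a description of the
    display image data. Field names are illustrative assumptions."""
    image = {
        "present_speed": present_kmh,       # (a) present moving speed
        "reference_speed": reference_kmh,   # (a) other speed related information
        "mode": "regular",
    }
    if condition_met:                       # (b) predetermined condition satisfied
        image["mode"] = "highlighted"
        image["difference"] = reference_kmh - present_kmh
    return image
```

When the predetermined condition is satisfied, the display mode related to the difference changes from the regular display to the highlighted display, as in process (b).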
The present invention is not limited to the embodiments described above. For example, as the optical apparatus 10, a panel method may be adopted instead of the laser scanning method. As the panel method, an image forming device such as a liquid crystal panel, a DMD panel, or a Vacuum Fluorescent Display (VFD), etc., may be used.
The projection area 311 of the windshield 310 may be provided with a combiner formed of a half-silvered mirror (half mirror, semitransparent mirror), a hologram, or the like. A light transmission/reflection type reflection film may be vapor-deposited on the surface of, or between the layers of, the windshield 310.
At least a part of each function of the display apparatus 1 may be implemented by cloud computing including one or more computers.
The image control apparatus, the display apparatus, the movable body, and the image control method are not limited to the specific embodiments described herein, and variations and modifications may be made without departing from the scope of the present invention.
The present application is based on and claims the benefit of priority of Japanese Priority Patent Application No. 2018-066210, filed on Mar. 29, 2018, and Japanese Priority Patent Application No. 2019-053305, filed on Mar. 20, 2019, the entire contents of which are hereby incorporated herein by reference.
Number | Date | Country | Kind
---|---|---|---
2018-066210 | Mar. 29, 2018 | JP | national
2019-053305 | Mar. 20, 2019 | JP | national

Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2019/013251 | Mar. 27, 2019 | WO | 00