The present disclosure relates to a technology for projecting an image from a vehicle onto, for example, a road surface.
Various technologies for controlling light emitted from vehicles have been proposed to improve the safety of the vehicles. The technologies include a headlamp (headlight) system that controls an emission direction of light of a headlamp in consideration of a road shape ahead of a vehicle, and a driving assistance device that projects images such as graphics and symbols onto a road surface ahead of a vehicle (for example, Patent Documents 1 and 2 below).
The aforementioned technologies can alert a driver to the existence of a feature requiring caution (hereinafter referred to as a “caution-required feature”) when driving a vehicle. The technologies, however, do not allow the driver to intuitively understand a solid shape of the caution-required feature. A technology that allows the driver to easily understand a solid shape of a caution-required feature is particularly desired in a situation where the driver has difficulty in visually recognizing the caution-required feature, for example, at night or in bad weather.
The present invention has been conceived to solve the problem, and has an object of providing a projection control apparatus that can alert a driver of a vehicle to the existence of a caution-required feature while allowing the driver to intuitively understand a solid shape of the caution-required feature.
A projection control apparatus according to the present disclosure includes: a caution-required feature detector to detect a caution-required feature that is a feature with a solid shape that impacts on traveling of a subject vehicle; an optical illusion image generating unit to generate an optical illusion image that recalls the solid shape of the caution-required feature; an alert determination unit to determine whether alerting the caution-required feature is necessary; and a projection controller to cause a projector to project the optical illusion image at a position corresponding to a position of the caution-required feature determined as requiring the alert.
The present disclosure can alert a driver of a vehicle to the existence of a caution-required feature while allowing the driver to intuitively understand a solid shape of the caution-required feature.
The objects, features, aspects, and advantages of the present disclosure will become more apparent from the following detailed description and the accompanying drawings.
The projection control system according to Embodiment 1 includes a projection control apparatus 10, and a peripheral detection device 21 and a projector 22 that are connected to the projection control apparatus 10 as illustrated in
The peripheral detection device 21 is an in-vehicle device with a function of detecting a feature existing around a subject vehicle, and can include, for example, sensors such as an ultrasound sensor, a millimeter wave radar, light detection and ranging (LiDAR), and an image analyzer that detects a feature from images captured around the subject vehicle by cameras (including an infrared camera). The peripheral detection device 21 can detect, for example, a position (a distance and a direction from the subject vehicle) and a shape of the detected feature. The peripheral detection device 21 may also have a function of detecting information on an environment around the subject vehicle (for example, brightness, temperature, and weather) besides the function of detecting a feature.
The projector 22 is an in-vehicle device that illuminates an image onto, for example, a road surface around the subject vehicle. The projector 22 may be a dedicated device for projecting an image, or a device that projects an image using laser light emitted by a headlamp.
The projection control apparatus 10 is an in-vehicle apparatus that controls the projector 22 based on a result of a feature detected by the peripheral detection device 21. The projection control apparatus 10 is not necessarily installed in the subject vehicle, and may be implemented by an application program to be executed by a mobile device that can be brought into a vehicle, such as a mobile phone, a smartphone, and a portable navigation device (PND). A part of functions of the projection control apparatus 10 may be implemented by a server that is installed outside the subject vehicle and can communicate with the projection control apparatus 10.
As illustrated in
The caution-required feature detector 11 detects a caution-required feature around the subject vehicle (a feature with a solid shape that impacts on the traveling of the subject vehicle), based on the result of the feature detected by the peripheral detection device 21. Assumed caution-required features include structures on a road through which the subject vehicle is traveling, for example, a median strip, a curb, a sidewalk, and a gutter. Each of the structures has a step (a height or a depth) with a certain dimension or more, with respect to the road.
For example, road surfaces in particular states may be caution-required features. The road surfaces include a road surface on which snow is accumulated, an icy road surface, a road surface with puddles, a gravel road surface, a road surface with a rut, a road surface with a slit, a collapsed road surface, and a cracked road surface. For example, objects not installed on roads may also be caution-required features. The objects include an on-street parking vehicle and a pole of a road construction site.
The optical illusion image generating unit 12 generates an optical illusion image that recalls a solid shape of the caution-required feature detected by the caution-required feature detector 11. The optical illusion image generated by the optical illusion image generating unit 12 is an image to be perceived as if a feature existed through optical illusion when viewed from the driver of the subject vehicle. The optical illusion image generating unit 12 can generate the optical illusion image by, for example, a rendering method that conforms to one-point perspective with respect to positions of eyes (an eyepoint) of the driver.
Furthermore, the optical illusion image is not necessarily an image that duplicates an actual solid shape of a caution-required feature but may be any image that recalls a rough solid shape (for example, a protruding shape or a depressed shape) of the caution-required feature. For example, when a caution-required feature has a shape protruding from a road surface, such as a median strip, a curb, a sidewalk, or a pole of a road construction site, the optical illusion image generating unit 12 should generate an optical illusion image to be perceived as a protruding shape. Furthermore, when a caution-required feature has a shape depressed from a road surface, such as a gutter or a collapsed road surface, the optical illusion image generating unit 12 should generate an optical illusion image to be perceived as a depressed shape. When a caution-required feature is a road surface on which snow is accumulated, an icy road surface, a road surface with puddles, a gravel road surface, a road surface with a rut, a road surface with a slit, a collapsed road surface, or a cracked road surface, the optical illusion image generating unit 12 should generate an optical illusion image with an arbitrary shape that recalls, for example, the accumulated snow, the icy surface, the puddles, the gravel, the rut, the slit, the collapse, and the crack, respectively. Furthermore, the shape of the optical illusion image may be different from the actual shape. When the caution-required feature detector 11 can detect an actual solid shape of the caution-required feature, the optical illusion image generating unit 12 may generate an optical illusion image that simulates the actual solid shape.
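The one-point perspective rendering described above can be sketched as follows: each vertex of the virtual solid shape is mapped to the point on the road surface at which the driver's line of sight through that vertex meets the road. This is only an illustrative sketch; the function name and coordinate convention are assumptions not stated in the text.

```python
# Hedged sketch of the anamorphic (one-point perspective) rendering.
# Coordinates are (x, y, z) in a vehicle frame, z being height above
# the road surface; the helper name `ground_point` is hypothetical.

def ground_point(eye, virtual_point):
    """Project a virtual 3D point onto the road surface (z = 0) along
    the line of sight from the driver's eyepoint."""
    ex, ey, ez = eye
    px, py, pz = virtual_point
    if ez <= pz:
        raise ValueError("eyepoint must be above the virtual point")
    # Parameter t at which the ray eye -> virtual_point reaches z = 0.
    t = ez / (ez - pz)
    return (ex + t * (px - ex), ey + t * (py - ey))
```

Rendering every vertex of the virtual shape this way yields a flat image that, viewed from the driver's eyepoint, is perceived as the intended protruding or depressed shape.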
The alert determination unit 13 determines whether alerting the caution-required feature detected by the caution-required feature detector 11 is necessary. Specifically, the alert determination unit 13 determines whether alerting the driver of the subject vehicle to the existence of the caution-required feature is necessary. Hereinafter, the caution-required feature determined by the alert determination unit 13 as requiring an alert may be referred to as an “alert target feature”.
There are various criteria for determining whether alerting a caution-required feature is necessary, that is, criteria for determining whether the caution-required feature is an alert target feature. Some examples of the criteria will be described hereinafter.
For example, the urgency of a caution-required feature can be a criterion for determining whether alerting the caution-required feature is necessary. Since a caution-required feature at a position closer to the subject vehicle has higher urgency, the alert determination unit 13 may determine a caution-required feature whose distance from the subject vehicle is smaller than or equal to a predefined threshold (for example, 100 m) as an alert target feature. Furthermore, the alert determination unit 13 may calculate, from a traveling speed of the subject vehicle, a time at which the subject vehicle will arrive at a caution-required feature, and determine a caution-required feature with the calculated arrival time less than or equal to a predefined threshold (for example, seven seconds) as an alert target feature.
The threshold need not be a fixed value. The threshold may be increased as a traveling speed of the subject vehicle is higher. The threshold may be increased as a braking distance of the subject vehicle is longer. The braking distance of the subject vehicle depends on a traveling speed of the subject vehicle and a coefficient of friction of a road surface. Furthermore, the threshold may be changed according to a type (an attribute) of a caution-required feature. For example, the threshold may be increased for a caution-required feature having difficulty in being visually determined, such as an icy road surface, so that the caution-required feature is alerted earlier.
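The urgency criterion above can be sketched as follows. Only the 100 m and seven-second examples come from the text; the linear scaling of the time threshold with speed and the reference speed are assumptions that merely illustrate the stated direction of change.

```python
def arrival_time_threshold(speed_mps, base_s=7.0, ref_speed_mps=15.0):
    """Hypothetical speed-dependent time threshold: the threshold grows
    as the traveling speed is higher, as described in the text."""
    return base_s * max(1.0, speed_mps / ref_speed_mps)

def is_alert_target(distance_m, speed_mps, dist_threshold_m=100.0):
    """A caution-required feature within the distance threshold, or
    reachable within the (speed-dependent) arrival-time threshold, is
    treated as an alert target feature."""
    if distance_m <= dist_threshold_m:
        return True
    if speed_mps > 0:
        return distance_m / speed_mps <= arrival_time_threshold(speed_mps)
    return False
```

An analogous adjustment could use a braking-distance estimate or the feature type in place of the speed ratio.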
Furthermore, when the subject vehicle implements automated driving, the urgency of a caution-required feature is low for the driver. This is because the subject vehicle automatically avoids the caution-required feature. Thus, the alert determination unit 13 may determine whether alerting the caution-required feature is necessary, based on an automated driving level implemented by the subject vehicle (an automated driving level defined by the Society of Automotive Engineers (SAE)). For example, when the subject vehicle is operated at level 4 or 5 of the automated driving, the driver need not monitor a behavior of the subject vehicle or a situation around the subject vehicle. Thus, the alert determination unit 13 may determine that alerting the caution-required feature is unnecessary. When the subject vehicle is operated at level 3 of the automated driving, whether alerting the caution-required feature is necessary may be defined according to a specification of an automated driving apparatus, or may be set by the driver.
Even when the subject vehicle is operated at a level lower than level 3 of the automated driving, while a lane keeping function is active, the alert determination unit 13 may determine that alerting a caution-required feature outside the lane through which the subject vehicle is traveling is unnecessary. Similarly, even when the subject vehicle is operated at a level lower than level 3 of the automated driving, while automatic braking or speed control suitable for a situation ahead of the subject vehicle is active, the alert determination unit 13 may exclude road surfaces from the alert target features.
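The automated-driving-level criterion above can be sketched as follows (an illustrative fragment; the function name and the level 3 policy flag are assumptions, the level semantics follow SAE J3016 as cited in the text):

```python
def alert_needed_at_level(sae_level, level3_alert=False):
    """Hedged sketch of the SAE automated-driving-level criterion.

    At SAE levels 4 and 5 the driver need not monitor the vehicle, so
    alerting is unnecessary. At level 3 the behavior is configurable
    (defined by the automated driving apparatus specification or set by
    the driver). Below level 3 the usual alert determination applies.
    """
    if sae_level >= 4:
        return False
    if sae_level == 3:
        return level3_alert
    return True
```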
For example, the recognition difficulty of a caution-required feature can be a criterion for determining whether alerting the caution-required feature is necessary. For example, since the driver has difficulty in recognizing a caution-required feature a part or the entirety of which is hidden by, for example, accumulated snow, mud, or a roadside tree and cannot be seen by the driver, the necessity of determining the caution-required feature as an alert target is high. Thus, the alert determination unit 13 may determine, as an alert target feature, a caution-required feature of which a certain rate (for example, 50%) or more cannot be visually recognized from the subject vehicle.
Furthermore, since the driver has difficulty in recognizing a caution-required feature in a situation where the surroundings of the subject vehicle are dark, the necessity of determining the caution-required feature as an alert target feature is high in such a situation. Thus, the alert determination unit 13 may determine that alerting the caution-required feature is necessary when the brightness around the subject vehicle is lower than or equal to a predefined threshold. Furthermore, when a headlamp or a width indicator (small lamp) of the subject vehicle is lit, it is highly probable that the surroundings of the subject vehicle are dark. Thus, the alert determination unit 13 may determine that alerting the caution-required feature is necessary when the headlamp or the width indicator of the subject vehicle is lit.
A weather condition around the subject vehicle can be a criterion for determining whether alerting a caution-required feature is necessary. For example, since the driver has difficulty in recognizing a caution-required feature in bad weather causing low visibility, such as rain, snow, or fog, the alert determination unit 13 may determine that alerting a caution-required feature is necessary in bad weather. Furthermore, the alert determination unit 13 may determine that alerting a caution-required feature is necessary when determining that a road surface highly probably becomes icy, from an outside-air temperature or a humidity. The alert determination unit 13 may obtain information on a weather condition from, for example, a temperature sensor, a humidity sensor, or a rain sensor, or by communicating with a weather information distribution system.
Furthermore, the degree of the driver's recognition of a caution-required feature can be a criterion for determining whether alerting the caution-required feature is necessary. For example, a driver sensing device (not illustrated) of the subject vehicle determines the degree of the driver's recognition of a caution-required feature from the movement of the line of sight of the driver. Then, the alert determination unit 13 may determine whether alerting the caution-required feature is necessary, based on the determination result. For example, the alert determination unit 13 may determine, as an alert target feature, a caution-required feature for which the degree of the driver's recognition is lower than or equal to a predefined threshold.
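The determination criteria discussed above can be combined as in the following sketch. Only the 50% occlusion rate appears in the text; the remaining thresholds, their units, and the function name are hypothetical.

```python
def needs_alert(occluded_rate, ambient_brightness, driver_recognition,
                bad_weather=False, lamps_lit=False,
                occlusion_th=0.5, brightness_th=50.0, recognition_th=0.5):
    """Illustrative combination of the alert determination criteria:
    occlusion of the feature, darkness around the subject vehicle
    (or lit headlamp / width indicator), bad weather, and a low degree
    of driver recognition each make an alert necessary."""
    if occluded_rate >= occlusion_th:
        return True
    if ambient_brightness <= brightness_th or lamps_lit:
        return True
    if bad_weather:
        return True
    if driver_recognition <= recognition_th:
        return True
    return False
```

In practice each criterion could also be weighted or gated by the urgency and automated-driving-level checks described earlier.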
The projection controller 14 controls the projector 22, and causes the projector 22 to display, on a road surface, etc., an image for alerting the existence of a caution-required feature (an alert target feature) determined by the alert determination unit 13 as requiring the alert. Specifically, the projection controller 14 causes the projector 22 to project the optical illusion image generated by the optical illusion image generating unit 12 at a position corresponding to the position of the alert target feature to alert the alert target feature. Here, the “position corresponding to the position of the alert target feature” is not limited to the position identical to the position of the alert target feature but should be a position at which the position of the alert target feature can be understood by projection of the optical illusion image. Specifically, examples of the “position corresponding to the position of the alert target feature” include the proximity of the alert target feature, such as the periphery of the alert target feature and a position adjacent to the alert target feature.
Here, specific example operations of the projection control system according to Embodiment 1 will be described. Assume that, for example, the subject vehicle on which the projection control system is mounted is traveling through a road as illustrated in
In a situation of
The optical illusion image generating unit 12 generates an optical illusion image that recalls the solid shape of the median strip that is an alert target feature. Since the median strip protrudes from a road surface, the optical illusion image generating unit 12 generates, for example, an optical illusion image of a protruding shape as illustrated in
The alert determination unit 13 determines whether the median strip is an alert target feature, that is, whether alerting the median strip is necessary. For example, when determining that alerting a caution-required feature whose distance from the subject vehicle is smaller than or equal to 100 m is necessary, the alert determination unit 13 determines a portion of the median strip up to 100 m from the subject vehicle as an alert target feature.
The projection controller 14 controls the projector 22, and causes the projector 22 to project the optical illusion image generated by the optical illusion image generating unit 12 at a position corresponding to the position of the median strip that is an alert target feature. For example, when the projection controller 14 projects the optical illusion image in
As such, the projection control system according to Embodiment 1 can alert the driver of the subject vehicle to the existence of a caution-required feature while allowing the driver to intuitively understand a solid shape of the caution-required feature.
Although
Although
As described above, the caution-required feature sometimes has a depressed shape, for example, a gutter or a collapsed road surface. For example, when a median strip exists to the right of a road through which the subject vehicle is traveling and a gutter exists to the left of the road as illustrated in
The virtual height (or depth) H of the optical illusion image may be changed according to the height of the caution-required feature. In other words, as a caution-required feature is higher, a value of the virtual height H of the optical illusion image for alerting the caution-required feature may be increased.
As described above, a road surface of a road in a particular state may be a caution-required feature. When a road surface that is a caution-required feature has become an alert target feature, the projector 22 may project an optical illusion image onto the entire road surface of a lane through which the subject vehicle is traveling as illustrated in
Furthermore, after the driver recognizes a state of a road surface, unless the state of the road surface is changed, the necessity of alerting the driver to the state of the road surface is low. Thus, the projector 22 may project the optical illusion image of the road surface, for example, only in a certain range in the vicinity of a road-surface change location as illustrated in
When a falling object exists on a road surface, it is preferable, in view of maintaining safety, to assign a higher priority to making the driver recognize the existence of the falling object than to making the driver recognize a state of the road surface. Thus, when the caution-required feature detector 11 detects a falling object on the road surface, it is preferable that the projection of an optical illusion image of the road surface be stopped, or that an optical illusion image for alerting the falling object be projected.
Next, operations of the projection control apparatus 10 will be described with reference to a flowchart in
When its own operation mode is set to the optical illusion image projection mode (YES in Step S101), the caution-required feature detector 11 obtains a detection result of a feature from the peripheral detection device 21 (Step S102), and checks whether a feature has been detected around the subject vehicle (Step S103). When the feature has been detected around the subject vehicle (YES in Step S103), the caution-required feature detector 11 determines whether the feature is a caution-required feature, based on a shape of the feature (Step S104).
When the feature is a caution-required feature (YES in Step S104), the alert determination unit 13 determines whether alerting the caution-required feature is necessary (Step S105). In other words, the alert determination unit 13 determines whether the caution-required feature is an alert target feature in Step S105.
When the alert determination unit 13 determines the caution-required feature as an alert target feature (YES in Step S105), the optical illusion image generating unit 12 generates an optical illusion image that recalls a solid shape of the alert target feature (Step S106). Then, the projection controller 14 controls the projector 22, and causes the projector 22 to project the optical illusion image generated by the optical illusion image generating unit 12 at a position corresponding to the position of the alert target feature to alert the alert target feature (Step S107). When NO is determined in any one of Steps S103, S104, and S105, processes in Steps S106 and S107 are not performed.
After the aforementioned processes, the processes return to Step S101. In other words, the projection control apparatus 10 repeatedly executes the aforementioned operations.
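One iteration of the flow in Steps S101 to S107 can be sketched as follows, with the processing stages of the respective units passed in as callables (all names are hypothetical; the actual apparatus runs this flow repeatedly, as stated above):

```python
def projection_loop_step(mode_on, detect, is_caution, alert_needed,
                         generate_image, project):
    """One pass of the flowchart; returns True when an optical illusion
    image was projected."""
    if not mode_on:                      # Step S101: projection mode ON?
        return False
    feature = detect()                   # Steps S102-S103: obtain/check feature
    if feature is None:
        return False
    if not is_caution(feature):          # Step S104: caution-required feature?
        return False
    if not alert_needed(feature):        # Step S105: alert target feature?
        return False
    image = generate_image(feature)      # Step S106: generate illusion image
    project(image, feature)              # Step S107: project at the feature
    return True
```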
Here, the operation mode in Step S101 (ON or OFF of the optical illusion image projection mode) may be switched by the user or automatically by the projection control apparatus 10. Examples of a method for the user to switch the operation mode of the projection control apparatus 10 include a method using a hardware key such as a physical switch, and a method using a software key such as a graphical user interface (GUI).
When the projection control apparatus 10 automatically switches the operation mode, the optical illusion image projection mode may be turned ON, for example, when a headlamp or a width indicator (small lamp) of the subject vehicle is lit. Furthermore, the optical illusion image projection mode may be turned ON when the brightness around the subject vehicle or around an alert target feature is lower than or equal to a certain value. The brightness around the subject vehicle or the alert target feature may be measured by an illuminance sensor of the subject vehicle, or estimated from the lightness of an image captured by a camera functioning as the caution-required feature detector 11.
Furthermore, the projection control apparatus 10 may automatically switch the operation mode, based on a behavior of the subject vehicle. For example, when the caution-required feature detector 11 detects an inferior road surface condition of a road through which the subject vehicle is traveling, and a sensor of the subject vehicle detects behaviors such as a drift and a skid of the subject vehicle, the optical illusion image projection mode may be turned ON.
[Modifications]
When a predefined condition is satisfied, the projection controller 14 may change a display mode of an optical illusion image. For example, when the degree of the driver's recognition of an alert target feature is lower than a predefined threshold or when a rate of a part of an alert target feature that cannot be visually recognized from the subject vehicle is higher than a predefined threshold, the projection controller 14 may highlight an optical illusion image for alerting the alert target feature. Furthermore, as the degree of the driver's recognition of an alert target feature is lower or as the rate of the part of the alert target feature that cannot be visually recognized from the subject vehicle is higher, the projection controller 14 may increase a degree of the highlight. The highlight mode may be any mode, for example, brightness enhancement display, blinking, enlarged display, or animation display.
Furthermore, the optical illusion image generating unit 12 may deform an optical illusion image to be generated, according to various conditions. For example, the optical illusion image generating unit 12 may change an optical illusion image according to a height of the eyepoint of the driver (a height of a position of the seat of the subject vehicle). Specifically, when the eyepoint of the driver is higher, the virtual height H or the virtual depth D of the optical illusion image should be increased. Furthermore, the virtual height H or the virtual depth D of the optical illusion image may be reduced when the traveling speed of the subject vehicle is low, whereas the virtual height H or the virtual depth D of the optical illusion image may be increased when the traveling speed of the subject vehicle is high.
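The deformation rule above can be sketched as follows. The text only states the direction of the change (a higher eyepoint or a higher speed increases the virtual height H or depth D); the linear scaling and the reference values below are assumptions for illustration.

```python
def virtual_height(base_h, eye_height_m, speed_mps,
                   ref_eye_m=1.2, ref_speed_mps=15.0):
    """Hypothetical scaling of the virtual height H (or depth D) of the
    optical illusion image with the driver's eyepoint height and the
    traveling speed of the subject vehicle."""
    return base_h * (eye_height_m / ref_eye_m) * (speed_mps / ref_speed_mps)
```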
[Example Hardware Configuration]
When the processing circuit 50 is dedicated hardware, the processing circuit 50 is, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or any combination of these. Functions of the constituent elements of the projection control apparatus 10 may be implemented by separate processing circuits, or collectively implemented by a single processing circuit.
Here, examples of the memory 52 may include non-volatile or volatile semiconductor memories such as a random-access memory (RAM), a read-only memory (ROM), a flash memory, an erasable programmable read-only memory (EPROM), and an electrically erasable programmable read-only memory (EEPROM); a hard disk drive (HDD); a magnetic disc; a flexible disk; an optical disk; a compact disk; a mini disk; a Digital Versatile Disc (DVD) and a drive device thereof; and any storage medium to be used in the future.
What is described above is the configuration for implementing the functions of the constituent elements of the projection control apparatus 10 using either hardware or software, etc. However, the configuration is not limited to this. A part of the constituent elements of the projection control apparatus 10 may be implemented by dedicated hardware, and another part of the constituent elements may be implemented by software, etc. For example, the processing circuit 50 functioning as the dedicated hardware can implement the functions of the part of the constituent elements, and the processing circuit 50 functioning as the processor 51 can implement the functions of the other part of the constituent elements by reading and executing a program stored in the memory 52.
As described above, the projection control apparatus 10 can implement each of the functions by hardware, software, etc., or any combinations of these.
The positioning device 23 calculates a position and a traveling direction of the subject vehicle, based on positioning signals obtained from the global navigation satellite system (GNSS) and information obtained from sensors (for example, a speed sensor, an acceleration sensor, and an azimuth sensor) of the subject vehicle. The positioning device 23 can identify the position of the subject vehicle with accuracy high enough to identify a lane through which the subject vehicle is traveling, specifically, on the order of several tens of centimeters.
The map data storage 24 is a storage medium in which map data is stored. The map data includes information on positions and shapes of lanes of roads, and positions and shapes (including solid shapes) of features around the roads. The map data stored in the map data storage 24 is high-definition map data in which the positions and the shapes of the lanes and the features are described with accuracy on the order of several tens of centimeters. The map data stored in the map data storage 24 may be used for a map matching process for correcting the position of the subject vehicle which has been calculated by the positioning device 23.
The basic configuration and operations of the projection control apparatus 10 according to Embodiment 2 are identical to those according to Embodiment 1. However, the caution-required feature detector 11 according to Embodiment 2 detects a caution-required feature around the subject vehicle, based on information on the position of the subject vehicle which has been calculated by the positioning device 23, and information on positions and solid shapes of features around the subject vehicle which is included in the map data stored in the map data storage 24.
In Step S111, the caution-required feature detector 11 obtains position information of the subject vehicle from the positioning device 23. In Step S112, the caution-required feature detector 11 searches the map data stored in the map data storage 24 for a feature around the subject vehicle.
When the caution-required feature detector 11 detects the feature around the subject vehicle in Step S112 (YES in Step S112), the caution-required feature detector 11 determines whether the feature is a caution-required feature, based on the information on a shape of the feature included in the map data (Step S104).
Since the processes of the other steps are basically the same as those described with reference to
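The map-data-based detection of Steps S111 and S112 can be sketched as follows (illustrative only; the planar coordinates, the record layout of the map features, and the search radius are assumptions):

```python
import math

def features_near(vehicle_pos, map_features, radius_m=100.0):
    """Steps S111-S112 sketch: given the vehicle position calculated by
    the positioning device and feature records from the map data
    storage, return the features located around the subject vehicle."""
    vx, vy = vehicle_pos
    return [f for f in map_features
            if math.hypot(f["x"] - vx, f["y"] - vy) <= radius_m]
```

Each returned record would then be classified as a caution-required feature or not, based on the shape information included in the map data (Step S104).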
[Modifications]
Although
Inclusion of the peripheral detection device 21 in the projection control system according to Embodiment 2 further enables the peripheral detection device 21 to detect a dividing line of a lane. The projection control system can correct the position of the subject vehicle which has been calculated by the positioning device 23, based on the detected dividing line of the lane, and improve the accuracy of calculating the position of the subject vehicle.
The caution-required feature information storage 15 stores caution-required feature information including position information on a caution-required feature. The caution-required feature information may include not only the position information on the caution-required feature but also information on the type and the shape of the caution-required feature. Storing information on the caution-required feature in the caution-required feature information storage 15 means “registering” the information.
The setting device 25 is a device that registers a caution-required feature in the caution-required feature information storage 15. Specifically, the caution-required feature information storage 15 obtains the caution-required feature information from the setting device 25, and stores the caution-required feature information. The setting device 25 is a user device of the projection control apparatus 10, such as a mobile device or a smartphone, or a device of an infrastructure (an infrastructure device) such as a security camera. The setting device 25 may be any device that can be authenticated by the projection control apparatus 10.
There is no constraint on methods for the setting device 25 to obtain the caution-required feature information and register the caution-required feature information in the caution-required feature information storage 15. For example, when the setting device 25 is a user device, the setting device 25 may extract and obtain the caution-required feature information from a map data storage that is not illustrated, according to an operation of the user, and register the obtained information in the caution-required feature information storage 15. The setting device 25 may include a human-machine interface (HMI) that can select a region or a type of a caution-required feature on which the setting device 25 obtains information.
Furthermore, the setting device 25 may download the caution-required feature information from an external server with a database of caution-required features, according to an operation of the user, and register the obtained information in the caution-required feature information storage 15. The setting device 25 may include the HMI that can select a region or a type of a caution-required feature to be downloaded. Furthermore, the setting device 25 may have functions of detecting an update of the database, automatically downloading the latest caution-required feature information, and updating information registered in the caution-required feature information storage 15. Furthermore, the setting device 25 may automatically download information on a caution-required feature occurring due to, for example, the impact of weather, and register the information as information on a temporary caution-required feature in the caution-required feature information storage 15.
Furthermore, the user may be allowed to operate the setting device 25 to register the user's own information on a caution-required feature in the caution-required feature information storage 15. For example, when the user transmits, to the projection control apparatus 10, an image of a feature captured by a smartphone serving as the setting device 25, and the projection control apparatus 10 analyzes the image and determines that the feature is a caution-required feature, the feature may be registered as a caution-required feature in the caution-required feature information storage 15.
When the setting device 25 is an infrastructure device such as a security camera, the projection control apparatus 10 obtains information on a feature determined by the infrastructure device as a caution-required feature, and registers the feature in the caution-required feature information storage 15.
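As a minimal illustration of the registration flow described above, the caution-required feature information storage 15 could be sketched as follows. The class names, data fields, and simplified authentication step are assumptions for illustration only and are not part of the disclosure:

```python
from dataclasses import dataclass

@dataclass
class CautionRequiredFeature:
    feature_id: str
    feature_type: str        # e.g. "road_shoulder", "gutter", "obstacle"
    latitude: float
    longitude: float
    temporary: bool = False  # e.g. a feature arising from weather conditions

class CautionRequiredFeatureStorage:
    """Hypothetical sketch of the caution-required feature information storage 15."""

    def __init__(self):
        self._features = {}
        self._authenticated_devices = set()

    def authenticate(self, device_id: str) -> None:
        # Stand-in for a real authentication exchange with the setting device 25.
        self._authenticated_devices.add(device_id)

    def register(self, device_id: str, feature: CautionRequiredFeature) -> None:
        # Only a setting device authenticated by the projection control
        # apparatus 10 may register or update caution-required features.
        if device_id not in self._authenticated_devices:
            raise PermissionError("setting device is not authenticated")
        self._features[feature.feature_id] = feature
```

In this sketch, a user device, an infrastructure device, and an external-server updater would all register features through the same authenticated `register` call.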
The basic configuration and operations of the projection control apparatus 10 according to Embodiment 3 are identical to those according to Embodiment 1. The caution-required feature detector 11 according to Embodiment 3 detects a caution-required feature around the subject vehicle, based on information on the position of the subject vehicle which has been calculated by the positioning device 23 and the caution-required feature information stored in the caution-required feature information storage 15.
In Step S121, the caution-required feature detector 11 obtains position information of the subject vehicle from the positioning device 23. In Step S122, the caution-required feature detector 11 searches for a registered caution-required feature located around the subject vehicle, based on the position information of the subject vehicle and the caution-required feature information stored in the caution-required feature information storage 15.
When the caution-required feature detector 11 detects a registered caution-required feature located around the subject vehicle in Step S122 (YES in Step S122), the processes proceed to Step S105. In Step S105, the alert determination unit 13 determines whether alerting the caution-required feature is necessary. When the caution-required feature detector 11 does not detect a registered caution-required feature located around the subject vehicle in Step S122 (NO in Step S122), the processes return to Step S101. Since the processes of the other steps are basically the same as those in Embodiment 1, the description thereof is omitted.
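The search in Step S122 can be illustrated as a simple radius search around the subject vehicle position. The search radius, field names, and equirectangular distance approximation below are assumptions for illustration, not part of the disclosure:

```python
import math
from dataclasses import dataclass

@dataclass
class Feature:
    feature_id: str
    latitude: float
    longitude: float

def features_near(features, veh_lat, veh_lon, radius_m=200.0):
    """Step S122 sketch: return registered caution-required features
    within radius_m of the subject vehicle position (Step S121 output)."""
    hits = []
    for f in features:
        # Equirectangular approximation, adequate over short distances.
        dlat = math.radians(f.latitude - veh_lat)
        dlon = math.radians(f.longitude - veh_lon) * math.cos(math.radians(veh_lat))
        if 6371000.0 * math.hypot(dlat, dlon) <= radius_m:
            hits.append(f)
    return hits
```

A non-empty result would correspond to the YES branch of Step S122; an empty result to the NO branch.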
[Modifications]
The alert determination unit 13 may unconditionally determine a caution-required feature registered in the caution-required feature information storage 15 as an alert target feature.
The alert target type storage 16 stores information on an alert target type that is a type of a feature to be alerted. Storing information on an alert target type in the alert target type storage 16 means “registering” the information.
The alert determination unit 13 determines only a caution-required feature of an alert target type registered in the alert target type storage 16 as an alert target. In other words, the alert determination unit 13 excludes caution-required features of non-alert target types from the alert targets.
The setting device 25 is a device that registers an alert target type in the alert target type storage 16. Specifically, the alert target type storage 16 obtains information on an alert target type from the setting device 25 and stores the information. The setting device 25 is a user device of the projection control apparatus 10, such as a mobile terminal or a smartphone, or an infrastructure device such as a security camera. The setting device 25 may be any device that can be authenticated by the projection control apparatus 10.
There is no constraint on the method by which the setting device 25 obtains information on an alert target type and registers the information in the alert target type storage 16. For example, when the setting device 25 is a user device, the user may operate the setting device 25 to select which types of caution-required features should be alerted. Then, the setting device 25 may register the types selected by the user in the alert target type storage 16. For example, a road shoulder, a median strip, a gutter, a road-surface condition change location, or an obstacle may be selected as an alert target type.
When the setting device 25 is an infrastructure device such as a security camera, the projection control apparatus 10 obtains information on a type of a feature determined by the infrastructure device as an alert target feature, and registers the information in the alert target type storage 16.
The basic configuration and operations of the projection control apparatus 10 according to Embodiment 4 are identical to those according to Embodiment 1. The alert determination unit 13 determines only a caution-required feature of the alert target type registered in the alert target type storage 16 as an alert target as described above.
Step S131 is executed when the caution-required feature detector 11 detects a caution-required feature (when YES is determined in Step S104). In Step S131, the alert determination unit 13 checks whether the type of the caution-required feature is an alert target type. When the type of the caution-required feature is an alert target type (YES in Step S131), the processes proceed to Step S105. When the type of the caution-required feature is not an alert target type (NO in Step S131), the processes return to Step S101. Since the processes of the other steps are basically the same as those in Embodiment 1, the description thereof is omitted.
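The check in Step S131 amounts to a set-membership test against the types registered in the alert target type storage 16. A sketch, with illustrative type names that are assumptions rather than part of the disclosure, is:

```python
# Illustrative alert target types registered in the alert target type storage 16.
ALERT_TARGET_TYPES = {"road_shoulder", "median_strip", "gutter"}

def is_alert_target(feature_type: str, alert_target_types=ALERT_TARGET_TYPES) -> bool:
    """Step S131 sketch: proceed to Step S105 (alert determination)
    only when the detected feature's type is an alert target type."""
    return feature_type in alert_target_types
```

A feature of a type not in the set (for example, one the user chose not to register) would take the NO branch and return the flow to Step S101.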
[Modifications]
For example, when the alert target type storage 16 is applied to the projection control system according to Embodiment 3, the projection control apparatus 10 includes both the caution-required feature information storage 15 and the alert target type storage 16. In this case, the alert determination unit 13 may determine both a caution-required feature registered in the caution-required feature information storage 15 and a caution-required feature of a type registered in the alert target type storage 16 as alert target features.
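This combined rule, under which a detected feature is an alert target when it is either individually registered (storage 15) or of a registered type (storage 16), can be sketched as follows; the data shapes and identifiers are assumptions for illustration:

```python
def determine_alert_targets(detected, registered_ids, alert_target_types):
    """Combined-rule sketch: a detected feature is an alert target if its
    identifier is registered in storage 15 OR its type is registered in
    storage 16."""
    return [f for f in detected
            if f["id"] in registered_ids or f["type"] in alert_target_types]

# Usage example with illustrative data:
detected = [
    {"id": "f1", "type": "obstacle"},       # individually registered
    {"id": "f2", "type": "gutter"},         # type is registered
    {"id": "f3", "type": "sound_barrier"},  # neither -> not an alert target
]
targets = determine_alert_targets(detected,
                                  registered_ids={"f1"},
                                  alert_target_types={"gutter"})
```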
The embodiments can be freely combined with each other, and each embodiment can be appropriately modified or omitted.
The foregoing description is in all aspects illustrative, and numerous modifications and variations that have not yet been exemplified can be devised.
10 projection control apparatus, 11 caution-required feature detector, 12 optical illusion image generating unit, 13 alert determination unit, 14 projection controller, 15 caution-required feature information storage, 16 alert target type storage, 21 peripheral detection device, 22 projector, 23 positioning device, 24 map data storage, 25 setting device.
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2021/020787 | 6/1/2021 | WO |