The present invention relates to an operation support device and an operation support method.
As a technology for supporting a driver's operation of an on-board device, there is a technology of detecting the line of sight of the driver and using the line of sight as a command for device operation. For example, Patent Document 1 discloses a system as its fourth embodiment. In the system, the line of sight of a driver is detected, and operation of a device that the driver sees is performed with a remote operation device. Specifically, in the system of Patent Document 1, a vehicle interior area including the line of sight direction of a user is identified, the identified area is compared with disposition information of on-board devices, and an on-board device disposed in the line of sight direction of the user is identified.
Patent Document 1: Japanese Patent Application Laid-Open No. 2016-110424
In the technology of Patent Document 1, an on-board device is identified based only on the line of sight. However, the line of sight of a driver is affected by, for example, vibration of the vehicle and the limited time available for gazing while driving. Thus, there has been a problem in that the accuracy of identification is not always high.
In light of the problem described above, the present invention has an object to provide a technology for accurately identifying a device that a driver desires to operate out of on-board devices, and supporting operation of the device.
An operation support device of the present invention includes a line of sight direction acquisition unit, a characteristic behavior acquisition unit, a device identification unit, and a display controller. The line of sight direction acquisition unit is configured to acquire a line of sight direction of a driver of a vehicle. The characteristic behavior acquisition unit is configured to acquire a characteristic behavior being a characteristic behavior of the driver other than a line of sight. The device identification unit is configured to identify at least one on-board device out of a plurality of on-board devices mounted in the vehicle as an operation target device that the driver desires to operate, based on the line of sight direction and the characteristic behavior. The display controller is configured to cause a display device mounted in the vehicle to display an operation screen of the operation target device.
According to the operation support device of the present invention configured as described above, a device that a driver desires to operate can be accurately identified out of the on-board devices, and operation of the device can be supported.
These and other objects, features, aspects and advantages of the present invention will become more apparent from the following detailed description of the present invention when taken in conjunction with the accompanying drawings.
<A-1. Configuration>
In
The operation support device 101 includes a line of sight direction acquisition unit 11, a characteristic behavior acquisition unit 12, a device identification unit 13, and a display controller 14. A display device 21 is an on-board display. Examples of the display device 21 include a display in an instrument panel, a head-up display (abbreviated as HUD), and a meter display. One or more display devices 21 may be provided.
The line of sight direction acquisition unit 11 acquires a line of sight direction of a driver of a vehicle.
The characteristic behavior acquisition unit 12 acquires a characteristic behavior, which is a characteristic behavior of the driver other than the line of sight. Examples of the characteristic behavior include the driver's finger pointing, a gesture, speaking, a facial motion, a change in facial expression, or a combination of these.
The device identification unit 13 identifies at least one on-board device 22 out of a plurality of on-board devices 22 mounted in the vehicle as an operation target device that the driver desires to operate, based on the line of sight direction and the characteristic behavior.
The display controller 14 causes the display device 21 mounted in the vehicle to display an operation screen of the operation target device. The operation screen is a screen directly or indirectly used for operation of the operation target device. Examples of the operation screen include a screen for displaying an operation menu, a screen for displaying an operation tutorial, and a function explanatory screen of the operation target device.
<A-2. Operation>
First, the line of sight direction acquisition unit 11 acquires a line of sight direction of a driver of a vehicle (Step S101). Next, the characteristic behavior acquisition unit 12 acquires a characteristic behavior (Step S102). Next, the device identification unit 13 identifies at least one on-board device 22 out of a plurality of on-board devices 22 as an operation target device, based on the line of sight direction and the characteristic behavior (Step S103). Next, the display controller 14 causes the display device 21 mounted in the vehicle to display an operation screen of the operation target device (Step S104). This ends the operation of the operation support device 101.
In the flow of
<A-3. Effect>
An operation support device 101 of the first embodiment includes a line of sight direction acquisition unit 11, a characteristic behavior acquisition unit 12, a device identification unit 13, and a display controller 14. The line of sight direction acquisition unit 11 is configured to acquire a line of sight direction of a driver of a vehicle. The characteristic behavior acquisition unit 12 is configured to acquire a characteristic behavior being a characteristic behavior of the driver other than a line of sight. The device identification unit 13 is configured to identify at least one on-board device 22 out of a plurality of on-board devices 22 mounted in the vehicle as an operation target device that the driver desires to operate, based on the line of sight direction and the characteristic behavior. The display controller 14 is configured to cause a display device 21 mounted in the vehicle to display an operation screen of the operation target device. Therefore, according to the operation support device 101 of the first embodiment, the operation target device can be accurately identified, based not only on the line of sight direction but also on the characteristic behavior. Further, the operation screen can be used for driver's operation by causing the operation screen of the operation target device to be displayed on the display device 21.
An operation support method of the first embodiment includes the following steps. One step is to acquire a line of sight direction of a driver of a vehicle. One step is to acquire a characteristic behavior being a characteristic behavior of the driver other than a line of sight. One step is to identify at least one on-board device 22 out of a plurality of on-board devices 22 mounted in the vehicle as an operation target device that the driver desires to operate, based on the line of sight direction and the characteristic behavior. One step is to cause a display device 21 mounted in the vehicle to display an operation screen of the operation target device. Therefore, according to the operation support method of the first embodiment, the operation target device can be accurately identified, based not only on the line of sight direction but also on the characteristic behavior. Further, the operation screen can be used for driver's operation by causing the operation screen of the operation target device to be displayed on the display device 21.
<B-1. Configuration>
The line of sight detector 23, the characteristic behavior detector 24, and the operation device 25 are mounted in the vehicle.
The operation device 25 is a device for operating an operation screen of the on-board device 22 displayed on the display device 21. Examples of the operation device 25 include a touch pad and a joystick.
The line of sight detector 23 includes a camera, for example. The line of sight detector 23 detects a line of sight direction of a driver, based on an image of a face of the driver captured by the camera. The line of sight direction acquisition unit 11 acquires the line of sight direction of the driver from the line of sight detector 23, and outputs the line of sight direction to the device identification unit 13.
The characteristic behavior detector 24 detects a characteristic behavior of the driver. As illustrated in
The fingertip direction detector 24A includes a camera that captures a vehicle interior, for example. The fingertip direction detector 24A detects a finger pointing behavior of the driver as a characteristic behavior, based on an image of a finger of the driver captured by the camera. If the driver performs finger pointing, the fingertip direction detector 24A detects a fingertip direction.
The voice input device 24B includes a microphone mounted in a vehicle interior, for example. The voice input device 24B acquires speaking of the driver through the microphone. In the voice input device 24B, specific keywords are registered. If the speaking voice of the driver includes a specific keyword, the voice input device 24B detects the speaking as a characteristic behavior.
The gesture detector 24C includes a camera that captures a vehicle interior, for example. The gesture detector 24C acquires an image of the driver captured by the camera. In the gesture detector 24C, specific gestures are registered. If a motion of the driver corresponds to a specific gesture, the gesture detector 24C detects the gesture as a characteristic behavior.
The face direction detector 24D includes a camera that captures a vehicle interior, for example. The face direction detector 24D detects a face direction of the driver, based on an image of the driver captured by the camera. For example, if the face of the driver is continuously directed in one direction for a certain period of time, or if the face direction suddenly moves and then stays in a certain direction, the face direction detector 24D detects the face direction in such a case as a characteristic behavior.
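The dwell-based detection described above can be sketched as follows. This is a minimal illustration, not the patented method: the sample format, the tolerance angle, and the duration threshold are all assumptions introduced here.

```python
def detect_face_dwell(samples, min_duration=1.5, tolerance_deg=5.0):
    """samples: time-ordered list of (timestamp_s, face_angle_deg).

    Returns the face direction if the face stayed within tolerance_deg of
    one direction for at least min_duration seconds, otherwise None.
    """
    if not samples:
        return None
    start_t, start_angle = samples[0]
    for t, angle in samples:
        if abs(angle - start_angle) > tolerance_deg:
            # Direction changed: restart the dwell timer at this sample.
            start_t, start_angle = t, angle
        elif t - start_t >= min_duration:
            return angle  # face held in one direction long enough
    return None

print(detect_face_dwell([(0.0, 10.0), (0.8, 11.0), (1.6, 10.5)]))  # held steady
print(detect_face_dwell([(0.0, 10.0), (1.0, 40.0), (2.0, 10.0)]))  # kept moving
```

The same skeleton could also serve the "suddenly moves and then stays" case by additionally checking the angular velocity just before the dwell begins.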
Note that the characteristic behavior detector 24 only needs to include at least any of the fingertip direction detector 24A, the voice input device 24B, the gesture detector 24C, and the face direction detector 24D.
The characteristic behavior acquisition unit 12 acquires the characteristic behavior from the characteristic behavior detector 24, and outputs the characteristic behavior to the device identification unit 13.
The device identification unit 13 acquires the line of sight direction from the line of sight direction acquisition unit 11 and the characteristic behavior from the characteristic behavior acquisition unit 12, and identifies an operation target device that the driver desires to operate out of the on-board devices 22, based on the line of sight direction and the characteristic behavior. If the device identification unit 13 cannot uniquely identify an operation target device, the device identification unit 13 need not identify only one operation target device, and may identify a plurality of on-board devices 22 that may each be the true operation target device. Details of the processing of identifying an operation target device performed by the device identification unit 13 will be described later in &lt;B-2&gt;.
Examples of the on-board device 22 include a navigation device, an air conditioner, and an audio device. In
In
The display controller 14 creates an operation screen of the operation target device, and causes the display device 21 to display the operation screen.
In
<B-2. Processing of Identifying Operation Target Device>
Next, operation of the operation support device 102 will be described, according to the flow of
First, the line of sight direction acquisition unit 11 acquires a line of sight direction of a driver, and the characteristic behavior acquisition unit 12 acquires a characteristic behavior of the driver (Step S201). Here, the acquired line of sight direction of the driver need not necessarily be a line of sight direction at the same time point as the time point when the characteristic behavior is performed, and may be a line of sight direction within a predetermined period of time preceding or following the time point when the characteristic behavior is performed.
Next, the device identification unit 13 performs processing of identifying a candidate for an operation target device based on the line of sight direction (Step S202). When the driver attempts to operate a specific on-board device 22, the driver sees the on-board device 22 or displayed information of the on-board device 22. For example, when displayed information of a navigation device is displayed on a separate display device, the driver sees that displayed information when the driver attempts to operate the navigation device. Accordingly, when the on-board device 22 or displayed information of the on-board device 22 is a line of sight target of the driver, it is likely that the on-board device 22 is the operation target device.
Therefore, when the on-board device 22 or displayed information of the on-board device 22 overlaps the line of sight direction of the driver, the device identification unit 13 identifies the on-board device 22 as a candidate for an operation target device. The device identification unit 13 stores disposition information indicating where in the vehicle interior the on-board devices 22 are mounted. Based on the disposition information, the device identification unit 13 determines whether or not the line of sight direction of the driver overlaps an on-board device 22. Further, the device identification unit 13 acquires, as needed, from the display controller 14, display disposition information indicating on which display device 21 and where in the display device 21 the displayed information of the on-board devices 22 is displayed. Based on the display disposition information, the device identification unit 13 determines whether or not the line of sight direction of the driver overlaps displayed information of an on-board device 22. If neither any on-board device 22 nor displayed information of any on-board device 22 overlaps the line of sight direction of the driver, the device identification unit 13 cannot identify a candidate for an operation target device.
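The gaze-based candidate identification of Step S202 might be sketched as follows. The device names, the one-dimensional angular disposition table, and the overlap test are illustrative assumptions standing in for the stored disposition information; a real implementation would test overlap in two or three dimensions.

```python
# Disposition information (assumed): angular region (min_deg, max_deg) that
# each on-board device 22 occupies in the driver's field of view.
DISPOSITION = {
    "navigation": (-10.0, 10.0),
    "air_conditioner": (12.0, 25.0),
    "audio": (-25.0, -12.0),
}

def candidates_from_gaze(gaze_angle_deg):
    """Return on-board devices whose region overlaps the gaze direction.

    An empty list corresponds to the case where no candidate can be
    identified in Step S202.
    """
    return [
        device
        for device, (lo, hi) in DISPOSITION.items()
        if lo <= gaze_angle_deg <= hi
    ]

print(candidates_from_gaze(5.0))   # gaze falls on the navigation device
print(candidates_from_gaze(30.0))  # no device overlaps: no candidate
```

The display disposition information acquired from the display controller 14 could be handled the same way, by adding entries for the screen regions where each device's displayed information currently appears.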
Next, the device identification unit 13 performs processing of identifying a candidate for an operation target device based on the characteristic behavior (Step S203). Specifically, if the characteristic behavior is a finger pointing behavior, the device identification unit 13 identifies an on-board device 22 whose device itself or displayed information of the device overlaps the finger pointing direction of the driver as a candidate for an operation target device. Further, if the characteristic behavior is speaking, the device identification unit 13 identifies an on-board device 22 whose device is associated with a keyword included in the speaking as a candidate for an operation target device. For example, if the driver speaks “I want to turn down the volume”, an audio device associated with the keyword “volume” is identified as a candidate for an operation target device. Further, if the characteristic behavior is a gesture of the driver, the device identification unit 13 identifies an on-board device 22 associated with the gesture as a candidate for an operation target device. Further, if the characteristic behavior is a face direction, the device identification unit 13 identifies an on-board device 22 whose device itself or displayed information of the device overlaps the face direction as a candidate for an operation target device.
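For the speaking case of Step S203, the association between registered keywords and on-board devices might look like the following sketch. The keyword table is an illustrative assumption; only the "volume" to audio device association comes from the example in the text.

```python
# Assumed keyword registration: each specific keyword is associated with an
# on-board device 22 in advance.
KEYWORD_TO_DEVICE = {
    "volume": "audio",
    "route": "navigation",
    "temperature": "air_conditioner",
}

def candidates_from_speech(utterance):
    """Return on-board devices associated with keywords found in the utterance."""
    return sorted({dev for kw, dev in KEYWORD_TO_DEVICE.items() if kw in utterance})

print(candidates_from_speech("I want to turn down the volume"))
```

Gesture-based candidates could use an analogous gesture-to-device table, while finger pointing and face direction would reuse the same geometric overlap test as the gaze-based step.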
Next, the device identification unit 13 determines whether or not the candidate for an operation target device identified in Step S202 and the candidate for an operation target device identified in Step S203 match (Step S204). If a candidate for an operation target device is not identified in Step S202, in Step S203, or in both, or if the two candidates do not match, the determination in Step S204 is No. In this case, the operation support device 102 ends the processing without identifying an operation target device.
On the other hand, if the candidate for an operation target device identified in Step S202 and the candidate for an operation target device identified in Step S203 match, the device identification unit 13 identifies the matching candidate as an operation target device (Step S205). Then, the display controller 14 creates an operation screen of the operation target device, and causes the display device 21 to display the operation screen (Step S206). This ends the processing of the operation support device 102.
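The matching decision of Steps S204 and S205 reduces to a set intersection, sketched below. Candidate names are illustrative, and returning `None` stands in for ending the processing without identifying an operation target device.

```python
def identify_operation_target(gaze_candidates, behavior_candidates):
    """Identify the operation target device(s) only when the gaze-based and
    behavior-based candidates match (Steps S204/S205).

    Returns the matching candidate(s), or None when the candidates do not
    match or either set is empty (the No branch of Step S204).
    """
    matched = set(gaze_candidates) & set(behavior_candidates)
    return sorted(matched) if matched else None

print(identify_operation_target(["navigation", "audio"], ["audio"]))  # match
print(identify_operation_target(["navigation"], ["air_conditioner"]))  # no match
```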
The flow of
Next, the device identification unit 13 calculates operation target probability X1 of each on-board device 22, based on the line of sight direction (Step S302). In Step S302, if the line of sight direction covers and overlaps a plurality of on-board devices 22, an on-board device 22 having a larger overlapping degree is calculated to have a higher operation target probability. Further, if the line of sight direction does not overlap any on-board device 22, an on-board device 22 located closer to the line of sight direction is calculated to have a higher operation target probability. The above description concerns calculation of the operation target probability based on the relationship between the device itself of an on-board device 22 and the line of sight direction; the same calculation applies to the relationship between displayed information of an on-board device 22 and the line of sight direction.
Next, the device identification unit 13 calculates operation target probability X2 of each on-board device 22, based on the characteristic behavior (Step S303).
Then, for each on-board device 22, the device identification unit 13 combines the operation target probability X1 based on the line of sight direction and the operation target probability X2 based on the characteristic behavior to calculate an overall operation target probability X (Step S304). For example, an average value of X1 and X2 may be used as the operation target probability X.
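Step S304 might be sketched as a per-device weighted average. The equal weighting shown here corresponds to the simple average mentioned in the text; any other weighting would be an equally valid design choice.

```python
def combine_probabilities(x1, x2, weight=0.5):
    """Combine the gaze-based probability x1 and the behavior-based
    probability x2 into the overall operation target probability X.

    x1, x2: dicts mapping each on-board device to a probability.
    weight: contribution of x1 (0.5 gives the simple average).
    """
    return {dev: weight * x1[dev] + (1 - weight) * x2[dev] for dev in x1}

x = combine_probabilities({"navigation": 0.9, "audio": 0.3},
                          {"navigation": 0.7, "audio": 0.5})
print(x)  # navigation around 0.8, audio around 0.4
```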
Then, the device identification unit 13 identifies an operation target device, based on the operation target probability X (Step S305). A detailed flow of this step is illustrated in
In the flow of
If the maximum value of the operation target probability X is less than a, the device identification unit 13 determines whether or not the maximum value of the operation target probability X is equal to or larger than b (Step S3053). Here, a>b. If the maximum value of the operation target probability X is less than b, the device identification unit 13 ends Step S305 without identifying an operation target device.
If the maximum value of the operation target probability X is equal to or larger than b, the device identification unit 13 determines whether or not the second highest operation target probability X of an on-board device 22 is equal to or larger than c (Step S3054). Here, a>b>c. If the second highest operation target probability X of an on-board device 22 is less than c, the device identification unit 13 ends Step S305 without identifying an operation target device.
If the second highest operation target probability X of an on-board device 22 is equal to or larger than c, the device identification unit 13 identifies two on-board devices 22 having the two highest operation target probabilities X in descending order as operation target devices (Step S3055). This ends Step S305.
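The threshold logic of Step S305 can be sketched as follows, assuming (from the surrounding steps) that a device whose probability reaches the highest threshold a is identified alone. The concrete values of a, b, and c are hypothetical; the document only requires a &gt; b &gt; c.

```python
def identify_by_probability(probs, a=0.8, b=0.5, c=0.3):
    """Sketch of Step S305 (requires at least two devices in probs).

    Returns the identified operation target device(s), or an empty list when
    no operation target device can be identified.
    """
    ranked = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)
    (top_dev, top_x), (second_dev, second_x) = ranked[0], ranked[1]
    if top_x >= a:
        return [top_dev]                 # single confident identification
    if top_x >= b and second_x >= c:
        return [top_dev, second_dev]     # Step S3055: two candidates kept
    return []                            # end without identifying

print(identify_by_probability({"navigation": 0.9, "audio": 0.4}))
print(identify_by_probability({"navigation": 0.6, "audio": 0.35}))
print(identify_by_probability({"navigation": 0.4, "audio": 0.1}))
```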
The description now returns to the flow of
<B-3. Display of Operation Screen>
As illustrated in Step S3055 of
In
If a plurality of display devices 21 are present, the display controller 14 selects one display device 21, and causes the selected display device 21 to display operation screen(s).
For example, the display controller 14 can cause a display device 21 located closest to the line of sight direction of the driver that is used when the device identification unit 13 identifies an operation target device to display operation screen(s). With this configuration, the driver can visually recognize the operation screen(s) without significantly moving the line of sight direction when the driver selects an operation target device from the on-board devices 22.
Alternatively, the display controller 14 may classify the on-board devices 22 into a device related to controlled traveling, a device related to the body, and a device related to information, and may cause a display device 21 suited for the classification of operation target devices to display operation screen(s). The device related to controlled traveling refers to a device that performs control related to traveling of a vehicle, such as auto-cruise and auto-braking. The device related to the body refers to a device that performs control related to the body of a vehicle, such as a window and a door. The device related to information refers to a device that provides information to a passenger of a vehicle, such as a navigation device and an audio device.
Then, for example, the display controller 14 may cause the HUD to display operation screen(s) of the device related to controlled traveling, and cause the CID to display operation screen(s) of the device related to the body and the device related to information. With this configuration, the driver can safely perform operation on operation screen(s) of the device related to controlled traveling that is related to traveling of a vehicle while the driver drives the vehicle.
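The classification-based routing described above might be expressed as two lookup tables. The device names, the classification entries, and the display assignments below are illustrative assumptions following the HUD/CID example in the text.

```python
# Assumed classification of on-board devices 22.
CLASSIFICATION = {
    "auto_cruise": "traveling",
    "auto_braking": "traveling",
    "window": "body",
    "door": "body",
    "navigation": "information",
    "audio": "information",
}

# Assumed assignment of a display device 21 to each classification.
DISPLAY_FOR_CLASS = {"traveling": "HUD", "body": "CID", "information": "CID"}

def display_for_device(device):
    """Select the display device 21 suited to the operation target device."""
    return DISPLAY_FOR_CLASS[CLASSIFICATION[device]]

print(display_for_device("auto_cruise"))  # traveling-related device -> HUD
print(display_for_device("navigation"))   # information-related device -> CID
```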
<B-4. Operation Device>
Next, operation of the operation screen will be described. The driver can operate an operation screen displayed on the display device 21 by using the operation device 25. The operation receiver 15 acquires operation information of the operation device 25, and outputs the operation information to the display controller 14 and the operation target device controller 16.
Based on the operation information of the operation device 25 acquired from the operation receiver 15, the display controller 14 updates the operation screen and causes the display device 21 to display the operation screen.
Based on the operation information of the operation device 25 acquired from the operation receiver 15, the operation target device controller 16 performs control of an operation target device.
Here, it is determined in advance that the operation device 25 for operating the operation screen 43 of the navigation device is the dial 25B. In this case, the operation target device controller 16 lights up the dial 25B. With this configuration, the driver can easily notice the operation device 25 used for operation of the operation screen 43.
Further, in addition to lighting up the dial 25B, the display controller 14 may light up the operation screen 43 of the navigation device. With this configuration, the driver can more easily notice the operation device 25 used for operation of the operation screen 43.
Further, the same light up color may be used for the dial 25B and the operation screen 43, so that the driver can more easily notice the operation device 25 used for operation of the operation screen 43.
Further, the display controller 14 may change light up colors depending on the type of the operation screen, such as by using blue for an operation screen of the navigation device and red for an operation screen of the audio device, so that the driver can easily know operation details of the operation screen.
Further, the display controller 14 may vary the light up display depending on the type of the operation device 25, such as by performing a light up display in which light repeatedly moves around the dial 25B when performing light up display for the dial 25B, and by causing the button 25C to flicker when performing light up display for the button 25C. With this configuration, the driver can easily know how to operate the operation device 25.
<B-5. Examples of Characteristic Behavior of Driver>
Next, examples of characteristic behavior of the driver are illustrated.
As described in the above examples, the operation support device 102 of the second embodiment identifies an operation target device based on both the line of sight direction and the characteristic behavior, and can therefore identify an operation target device accurately. If a plurality of on-board devices 22 are disposed adjacent to each other, the line of sight direction may overlap a plurality of on-board devices 22. Further, the line of sight direction may move across a plurality of on-board devices 22 due to sway of the vehicle. In such cases, it is difficult to identify an operation target device based only on the line of sight direction. However, an operation target device can be accurately identified by using the characteristic behavior to compensate for the accuracy of identification.
<B-6. Effect>
According to the operation support device 102 of the second embodiment, the device identification unit 13 identifies at least one on-board device out of the plurality of on-board devices 22 as a candidate for the operation target device, based on the line of sight direction, and identifies at least one on-board device 22 out of the plurality of on-board devices 22 as a candidate for the operation target device, based on the characteristic behavior. If the candidate for the operation target device identified based on the line of sight direction and the candidate for the operation target device identified based on the characteristic behavior match, the device identification unit 13 identifies the candidate for the operation target device as the operation target device. Therefore, according to the operation support device 102, the operation target device can be accurately identified.
In addition to the configuration of the operation support device 101 of the first embodiment, the operation support device 102 further includes an operation receiver 15, and an operation target device controller 16. The operation receiver 15 is configured to acquire operation information of a plurality of operation devices 25 mounted in the vehicle and operated by the driver. The operation target device controller 16 is configured to control the operation target device, based on the operation information. The display controller 14 changes the operation screen of the operation target device, based on the operation information. Therefore, according to the operation support device 102, the operation screen of the operation target device can be displayed, and the operation screen can be used for driver's operation.
Further, in the operation support device 102, the device identification unit 13 identifies the operation target device, based on an overlapping degree between the line of sight direction of the driver and the on-board device. Therefore, the driver can cause the display device 21 to display an operation screen of a device by seeing the device that the driver desires to operate.
Further, in the operation support device 102, the display controller 14 causes the display device 21 to display respective pieces of displayed information of a plurality of operable devices. The device identification unit 13 identifies the operation target device, based on an overlapping degree between the displayed information and the line of sight direction of the driver. Therefore, the driver can cause the display device 21 to display an operation screen of a device by seeing displayed information of the device that the driver desires to operate on a display screen of the display device 21.
Further, in the operation support device 102, the operation target device controller 16 lights up the operation device 25 used to change the operation screen. Therefore, according to the operation support device 102, the operation device 25 used for operation of the operation screen 43 can be easily notified to the driver.
Further, in the operation support device 102, the device identification unit 13 calculates operation desired probability, based on the line of sight direction and the characteristic behavior, and identifies the operation target device, based on the operation desired probability. The operation desired probability is the probability that the driver desires to operate each of the plurality of on-board devices 22. If a plurality of the operation target devices are present, the display controller 14 displays the operation screen of the operation target device having a higher operation desired probability more conspicuously. Therefore, according to the operation support device 102, the operation screen of the on-board device 22 that the driver may desire to operate can be appropriately displayed.
Further, in the operation support device 102, the display controller 14 causes the display device 21 to display an operation screen of the on-board device 22 that is located adjacent to the operation target device and that is not the operation target device to be less conspicuous than an operation screen of the operation target device. Therefore, an operation screen of the operation target device that the driver is more likely to desire to operate can be displayed noticeable to the driver while the operation screen of the on-board device 22 that the driver may desire to operate is displayed.
<C-1. Configuration>
In the second embodiment, the characteristic behavior is used to compensate for the line of sight direction in identifying an operation target device. In contrast, in a third embodiment, the characteristic behavior is used to determine the timing of identifying an operation target device based on the line of sight direction.
A configuration of an operation support device 103 according to the third embodiment is as illustrated in
<C-2. Operation>
In the flow of
Next, the device identification unit 13 calculates operation target probability of each on-board device 22, based on the line of sight direction within a predetermined period of time, e.g., 2 seconds, from the time point when the characteristic behavior is performed (Step S402). A method of calculating the operation target probability based on the line of sight direction is as described in the second embodiment.
Next, the device identification unit 13 identifies an operation target device, based on the operation target probability of each on-board device 22 (Step S403). Here, the device identification unit 13 identifies one or more on-board devices having the highest operation target probabilities in descending order as operation target device(s). Details of an identification method are as described in the flow of
Next, the display controller 14 creates operation screen(s) of the operation target device(s), and causes the display device 21 to display the operation screen(s) (Step S404). This step is the same as Step S206 of
In this manner, in the third embodiment, the characteristic behavior is not a direct element for identifying an operation target device, but is used to determine a timing for identifying an operation target device, based on the line of sight direction. In the example described above, when the driver sees a certain on-board device 22 within a certain period of time (e.g., 2 seconds) following the time point when the characteristic behavior is performed, the on-board device 22 is identified as an operation target device.
However, the line of sight direction used to calculate operation target probability need not be a line of sight direction within a predetermined period of time following the time point when a characteristic behavior is performed, and may be a line of sight direction within a predetermined period of time preceding the time point when a characteristic behavior is performed. In this case, when a characteristic behavior is performed after the driver sees a certain on-board device 22 for a certain period of time (e.g., 2 seconds), the on-board device 22 is identified as an operation target device.
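The flow of Steps S402 and S403 described above can be sketched in code. The following is a minimal illustrative sketch, not the claimed implementation: the class name, method names, and the way gaze samples are buffered are all assumptions introduced for illustration, and the operation target probability is simply taken as the fraction of gaze samples within the window that fall on each on-board device 22. Both the preceding-window and following-window variants described above are supported.

```python
from collections import deque

WINDOW_S = 2.0  # predetermined period of time (e.g., 2 seconds)

class DeviceIdentifier:
    """Hypothetical sketch of the third embodiment's identification flow."""

    def __init__(self, devices):
        self.devices = devices   # names of the on-board devices 22
        self.gaze_log = deque()  # (timestamp, device_or_None) gaze samples

    def record_gaze(self, timestamp, device):
        """Record which on-board device (if any) the line of sight falls on."""
        self.gaze_log.append((timestamp, device))

    def on_characteristic_behavior(self, behavior_time, preceding=True):
        """Identify an operation target device from the line of sight
        direction within the window preceding (or following) the time
        point when the characteristic behavior is performed."""
        if preceding:
            lo, hi = behavior_time - WINDOW_S, behavior_time
        else:
            lo, hi = behavior_time, behavior_time + WINDOW_S
        samples = [d for (t, d) in self.gaze_log if lo <= t <= hi]
        if not samples:
            return None
        # Operation target probability of each device (Step S402):
        # fraction of in-window gaze samples falling on that device.
        probs = {dev: samples.count(dev) / len(samples) for dev in self.devices}
        # Identify the device with the highest probability (Step S403).
        best = max(probs, key=probs.get)
        return best if probs[best] > 0 else None
```

For example, if the driver's gaze falls mostly on an audio device during the two seconds before a characteristic behavior, `on_characteristic_behavior` returns that device; if no gaze samples exist in the window, no operation target device is identified.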
Further,
<C-3. Effect>
In the operation support device 103 of the third embodiment, the device identification unit 13 identifies the operation target device, based on the line of sight direction of the driver within a predetermined period of time preceding or following a time point when the characteristic behavior is performed. In this manner, the operation support device 103 can accurately identify an operation target device by using the line of sight direction at a timing determined based on the characteristic behavior.
<D-1. Configuration>
Similarly to the embodiments described above, the device identification unit 13 identifies an operation target device, based on a line of sight direction and a characteristic behavior of a driver. At the time of identifying an operation target device, the device identification unit 13 considers an overlapping degree between an on-board device 22 or displayed information of the on-board device 22 and the line of sight direction of the driver. Typically, an on-board device 22 that overlaps the line of sight direction of the driver is identified as an operation target device.
Although particular reference is not made in the second and third embodiments, the line of sight direction of the driver used to identify the operation target device described above is not an instantaneous line of sight direction, but is a line of sight direction during a certain continuous period of time. This “certain continuous period of time” is referred to as a “gaze detected period of time”.
In the fourth embodiment, the device identification unit 13 variably sets the gaze detected period of time, depending on presence or absence of traveling of a vehicle, a type of a traveling road, and a condition of a nearby vehicle, for example.
The surrounding condition detector 26 includes a camera, a radar, or the like mounted in the vehicle, and detects a traveling condition of a nearby vehicle. The nearby vehicle refers to a vehicle that travels around the subject vehicle. Examples of the traveling condition of a nearby vehicle include a traveling speed of the nearby vehicle, and a distance between the nearby vehicle and the subject vehicle.
The vehicle sensor 27 is a sensor that detects a condition of the vehicle in which it is mounted; for example, a vehicle speed sensor is included in the vehicle sensor 27. The device identification unit 13 can determine whether the vehicle is traveling or stopped, based on information detected by the vehicle sensor 27.
For example, the road condition detector 28 calculates the position of the vehicle by using signals of the Global Positioning System (GPS) or the like, and refers to map information to detect the type of the road on which the vehicle is currently traveling. The type of the traveling road indicates, for example, whether the road is a general road or a freeway.
Information from the surrounding condition detector 26, the vehicle sensor 27, and the road condition detector 28 indicates how much time the driver can spend gazing at an on-board device 22. For example, the driver concentrates more on driving while the vehicle is traveling than while it is stopped, and thus cannot gaze at an on-board device 22 for a long period of time. Further, the concentration load of driving is considered to be larger on a freeway than on a general road, so the driver cannot gaze at an on-board device 22 for a long period of time. Further, when a nearby vehicle is traveling around the subject vehicle, the driving load is considered to be larger than when there is no nearby vehicle, and thus the driver cannot gaze at an on-board device 22 for a long period of time. In view of such conditions, for example, the device identification unit 13 sets the gaze detected period of time to 500 ms or more and 1500 ms or less when the vehicle is traveling on a general road, and to 2000 ms or more and 3000 ms or less when the vehicle is stopped.
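The variable setting of the gaze detected period of time described above can be sketched as follows. This is an illustrative sketch only: the values for the general-road and stopped cases mirror the ranges given in the text (500 ms to 1500 ms while traveling on a general road, 2000 ms to 3000 ms while stopped), whereas the freeway and nearby-vehicle values are assumptions chosen merely to be shorter under a higher driving load, and the function name and parameters are not taken from the embodiment.

```python
def gaze_detected_period_ms(is_traveling, road_type="general",
                            nearby_vehicle=False):
    """Return a gaze detected period of time in milliseconds, chosen
    from the vehicle's traveling state, road type, and surroundings."""
    if not is_traveling:
        # Stopped: 2000 ms or more and 3000 ms or less per the text.
        return 2500
    # Traveling on a general road: 500 ms or more and 1500 ms or less.
    period = 1000
    if road_type == "freeway":
        # Assumed value: shorter, since the concentration load is larger.
        period = 700
    if nearby_vehicle:
        # Assumed value: shorten further, but not below 500 ms.
        period = max(500, period - 200)
    return period
```

In this sketch, a longer period while the vehicle is stopped tolerates slower, deliberate gazing, while shorter periods under higher driving load let the device identification unit 13 react to brief glances.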
With this configuration, erroneous detection of an operation target device is less liable to occur, and the driver is not required to gaze at an on-board device 22 for longer than necessary.
<D-2. Effect>
In the operation support device 104 of the fourth embodiment, the device identification unit 13 identifies the operation target device, based on the line of sight direction during a gaze detected period of time. The gaze detected period of time varies depending on at least one of presence or absence of traveling of the vehicle, the type of the traveling road, and a condition of a nearby vehicle traveling around the vehicle. Therefore, the operation support device 104 takes driving safety into consideration, and erroneous detection of an operation target device is less liable to occur.
The line of sight direction acquisition unit 11, the characteristic behavior acquisition unit 12, the device identification unit 13, the display controller 14, the operation receiver 15, and the operation target device controller 16 in the operation support devices 101, 102, 103, and 104 described above are implemented by a processing circuit 81 illustrated in
If the processing circuit 81 is dedicated hardware, examples of the processing circuit 81 include a single circuit, a composite circuit, a programmed processor, a processor for parallel programming, an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or a combination of these. The function of each part of the line of sight direction acquisition unit 11 etc. may be implemented by a plurality of processing circuits 81, or the functions of individual parts may be collectively implemented by one processing circuit.
If the processing circuit 81 is a processor, the functions of the line of sight direction acquisition unit 11 etc. are implemented by a combination with software etc. (software, firmware, or software and firmware). The software etc. are described as a program, and are stored in memory. As illustrated in
In the above, a configuration in which each function of the line of sight direction acquisition unit 11 etc. is implemented by any one of hardware and software etc. is described. However, the configuration is not limited to the above, and a configuration in which a part of the line of sight direction acquisition unit 11 etc. is implemented by dedicated hardware and another part is implemented by software etc. may be adopted. For example, the function of the device identification unit 13 may be implemented by a processing circuit as dedicated hardware. The function of other parts may be implemented by the processing circuit 81 as the processor 82 reading out and executing the program stored in the memory 83.
In this manner, the processing circuit may implement each of the above-mentioned functions by hardware, software etc., or a combination of these.
Further, in the above, the operation support devices 101, 102, 103, and 104 are described as devices mounted in a vehicle. However, the operation support devices 101, 102, 103, and 104 may also be used in a system constructed by appropriately combining a device mounted in a vehicle, a portable navigation device (PND), a communication terminal (e.g., a mobile terminal such as a mobile phone, a smartphone, or a tablet), a function of an application installed in these devices, and a server, for example. In this case, each function or each component of the operation support devices 101, 102, 103, and 104 described above may be distributed among the devices that construct the system, or may be centralized in any one of the devices.
Note that, in the present invention, each embodiment can be freely combined, and each embodiment can be modified or omitted as appropriate, within the scope of the invention.
While the invention has been shown and described in detail, the foregoing description is in all aspects illustrative and not restrictive. It is therefore understood that numerous unillustrated modifications and variations can be devised without departing from the scope of the invention.
11 line of sight direction acquisition unit, 12 characteristic behavior acquisition unit, 13 device identification unit, 14 display controller, 15 operation receiver, 16 operation target device controller, 21 display device, 21A meter display, 21B HUD, 21C CID, 21D front passenger seat display, 22 on-board device, 23 line of sight detector, 24 characteristic behavior detector, 24A fingertip direction detector, 24B voice input device, 24C gesture detector, 24D face direction detector, 25 operation device, 25A touch pad, 25B dial, 25C button, 26 surrounding condition detector, 27 vehicle sensor, 81 processing circuit, 82 processor, 83 memory, 101, 102, 103, 104 operation support device
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2017/026448 | 7/21/2017 | WO | 00 |