This application is based on and claims the benefit of priority of Japanese Patent Application No. 2007-19396 filed on Jan. 30, 2007, the disclosure of which is incorporated herein by reference.
The present disclosure generally relates to an information provision apparatus for use in a vehicle.
In recent years, a technique for relieving the uneasiness of a driver caught in traffic congestion, in which traffic remains slow and/or stopped even though a traffic light has turned green, is provided, for example, in Japanese patent document JP-A-2005-222241. That is, the disclosure in the above patent document illustrates a configuration that relieves the uneasiness of the driver who does not know why traffic is slow or stopped.
In the above disclosure, a head vehicle in the traffic congestion captures an image of the area ahead and transmits the image to the rear vehicles in the traffic congestion, thereby enabling the rear vehicles to view the situation of the congestion. In this manner, the situation of the congestion, such as a gridlocked intersection condition, can be viewed in the rear vehicles, thereby relieving the uneasiness of the drivers in the rear vehicles.
In this case, the situation presented as an image eases the drivers in the rear vehicles into waiting until the cause of the congestion is resolved, because they can recall similar situations. However, when the image of the intersection does not explain the cause of the congestion, the uneasiness of the driver is not resolved. Further, repeated provision of the intersection image may bore the driver, thereby leading to dissatisfaction of the driver.
In view of the above and other problems, the present invention provides an apparatus that prevents the driver from feeling uneasy during a temporary stop of the vehicle in traffic congestion or the like, by feeding the driver with information.
The apparatus of the present invention includes a stop detection unit capable of detecting a temporary stop of a vehicle, a display unit capable of displaying various information to be viewed by an occupant of the vehicle, and a control unit capable of serially feeding the display unit with different information in an ever-changing manner while the vehicle is temporarily stopped. The apparatus of the present invention thus provides information in different categories for the driver of the vehicle when the vehicle stops due to traffic congestion, so that the driver is diverted and is eased from the discomfort. Further, feeding the information in an ever-changing manner, that is, in a manner that avoids successive provision of information in the same category, prevents the driver from paying too much attention to the information on the display unit, thereby enabling the driver to resume the driving operation without lingering distraction from the provided information.
Other objects, features and advantages of the present invention will become more apparent from the following detailed description made with reference to the accompanying drawings, in which:
One embodiment of the present invention is explained as follows with reference to the drawings. First,
The external information sensing system 2 consists of an external camera 9 and in-vehicle outside communications equipment 10. The external camera 9 is installed inside or outside the vehicle compartment of the subject vehicle for capturing an image of the area ahead of the subject vehicle. The in-vehicle outside communications equipment 10 is equipment for road-to-vehicle and/or vehicle-to-vehicle communications, and acquires, from road-side infrastructure, traffic light information that includes the signal states (that is, a red light (stop signal), a green light (proceed signal), and a yellow light (caution signal)) of a road-side device installed at an intersection ahead of the subject vehicle, the duration of each signal state, the position (longitude and latitude) of the road-side device, and the like.
In addition, the in-vehicle outside communications equipment 10 acquires traffic congestion information from a traffic information center. The traffic congestion information includes the position (longitude and latitude) of a congestion section occurring ahead of the subject vehicle, the length of the congestion section, an expected passage time of the congestion section, and the like. Furthermore, the in-vehicle outside communications equipment 10 transmits and receives various information to and from the information center outside the subject vehicle, and various information can also be transmitted and received between surrounding vehicles around the subject vehicle.
The inside information sensing system 3 mainly consists of an internal camera 11 and an eye/speech analysis device (i.e., a driver condition detector in conceptual language) 12. The internal camera 11 is installed at an appointed position in the compartment of the subject vehicle for capturing the driver's face image. The eye/speech analysis device 12 estimates the eye direction of the vehicle driver (using an XYZ coordinate system having the sitting position of the vehicle driver as the origin, with the X-axis aligned with the vehicle's front-rear direction, the Y-axis with the lateral direction, and the Z-axis with the vertical direction), the driving state of the vehicle driver, and the atmosphere in the compartment of the subject vehicle, based on the analysis (a voice recognition result) of the voices of the vehicle driver and passengers from a voice-input device (not illustrated) as well as the driver's face image input from the camera 11.
The driving state of the driver is represented, in this case, as an awakening degree, feelings of the vehicle driver, a degree of carelessness, and the like. The awakening degree of the vehicle driver is measured and estimated based on the number and the speed of blinks of the vehicle driver derived from the face image of the vehicle driver. The feelings of the driver are estimated from the face image (the degree of opening of the lips and eyes) of the vehicle driver, the volume of the voice, the words (e.g., vocabulary), and the tone (i.e., pitch) of the driver's voice. In addition, the carelessness of the vehicle driver is estimated from the degree of change of the driver's eye direction.
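As one concrete illustration, the blink-based estimation of the awakening degree described above can be sketched as follows. The thresholds and the scoring scheme are assumptions introduced here for illustration; the disclosure itself does not specify them.

```python
from dataclasses import dataclass


@dataclass
class BlinkStats:
    blinks_per_minute: float      # blink count derived from the face image
    mean_blink_duration_s: float  # average eyelid-closure time (blink speed)


def estimate_awakening_degree(stats: BlinkStats) -> float:
    """Return an awakening degree in [0.0, 1.0]; 1.0 means fully alert.

    Frequent and slow (long-duration) blinks are treated as signs of
    drowsiness. The thresholds below are illustrative assumptions.
    """
    score = 1.0
    if stats.blinks_per_minute > 20:       # frequent blinking
        score -= 0.3
    if stats.mean_blink_duration_s > 0.4:  # slow, long blinks
        score -= 0.4
    return max(0.0, min(1.0, score))
```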
In addition, because the atmosphere in the compartment is affected by the number of people in the vehicle, the number of people in the subject vehicle is considered. For example, when the driver is the only occupant of the subject vehicle, the atmosphere (e.g., quietness, strain, liveliness, and the like) in the compartment of the vehicle is estimated from the driving state of the vehicle driver. On the other hand, when the driver is accompanied by other occupants in the subject vehicle, the atmosphere (quietness, strain, liveliness) in the vehicle is estimated from the presence or absence of conversation, laughing or angry voices in the conversation, the words (e.g., vocabulary) of the occupants, and the like.
The vehicle information sensing system 4 consists of a steering angle sensor 13, a brake pedal manipulation variable detection sensor (a brake sensor in the following) 14, an accelerator pedal manipulation variable detection sensor (an accelerator sensor in the following) 15, and a navigation system 16. The steering angle sensor 13 is a sensor detecting the steering angle of the steering wheel of the subject vehicle (i.e., the angle of steer from the neutral position of the steering wheel, which is defined as the steering position when the vehicle is traveling straight). The brake sensor 14 is a sensor detecting the quantity of pressing of the brake pedal by the vehicle driver, and the accelerator sensor 15 is a sensor detecting the quantity of pressing of the accelerator pedal by the vehicle driver. The detection signals output from these sensors 13, 14, and 15 are provided to the arithmetic unit 5.
The navigation system 16 consists of a well-known vehicle position detector, an operation switch group, a map data storage unit, a display unit, and the like (not illustrated), and outputs, under an instruction from the arithmetic unit 5, various information processed therein and information stored in the map data storage unit to the arithmetic unit 5.
The personal information accumulation device 6 accumulates personal information such as general information (full name, age, occupation, sex, marital status, presence of children, and the like) as well as information about a personal hobby/idea. More practically, the information on the personal hobby/idea is classified into categories such as music, sports, movies, news, television programs, shopping, restaurants, leisure activities, theme parks, and the like, and lists the contents that suit the preferred hobby/idea of the driver in each of the categories (for example, in a "music" category, a genre (classic, jazz, popular songs, or the like), a title, an artist name, and the like). The personal information accumulation device 6 outputs the personal information, comprising the general information and the information about the hobby/idea, in response to an instruction from the arithmetic unit 5. In addition, the personal information in the personal information accumulation device 6 is updated by the arithmetic unit 5 appropriately in a timely manner, and is accumulated cumulatively.
Furthermore, the personal information accumulation device 6 contains a database that accumulates the driving operation (e.g., operation of the steering wheel, the brake, the accelerator, and the like) of the vehicle driver detected by the vehicle information sensing system 4. The update, addition, and the like of the data in the database are performed by the arithmetic unit 5 as appropriate.
The display unit 7 is formed by, for example, a liquid crystal display, and is disposed at a position viewable by the vehicle driver and other occupants of the vehicle. The voice-output device 8 includes an amplifier (not illustrated) and a speaker installed at an appropriate position in the vehicle compartment.
The arithmetic unit 5 includes functional blocks of a halt calculation department (i.e., a stop detector in conceptual language) 17, a scene presumption department 18, a state estimation department (i.e., a driver's condition detector, a condition determiner in conceptual language) 19, an information filtering department 20, an individual adaptation department 21, an information processing department 22, a display control department 23 and a sound control department 24.
The halt calculation department 17 calculates a stop time (i.e., a halt in the following) when the subject vehicle stops due to stop factors such as a stop signal (a red light) of a road-side device (a traffic signal) ahead of the subject vehicle, a stop of the front vehicle due to traffic congestion caused by concentration of traffic, and the like. When the subject vehicle stops due to the above-described stop factor as shown in
The halt Xs is calculated by different calculation methods depending on whether the subject vehicle stops due to the stop signal of the road-side device, or due to the stop of the front vehicle in the traffic congestion ahead of the subject vehicle. The calculation methods of the halt Xs are explained in the following.
When the subject vehicle is at the head of the traffic stopped at the stop signal of the road-side device, the true stop signal time is identical to the halt Xs. However, as shown in
Therefore, for handling this kind of situation, a required departure time per vehicle is set in advance, and the required departure time is included in the halt Xs together with the true stop signal time in the calculation. In addition, when there are plural vehicles between the road-side device and the subject vehicle as shown in
The distance L between the stop signal of the road-side device and the subject vehicle can be calculated, in this case, from the current position (latitude/longitude) of the subject vehicle acquired from the navigation system 16 and the position (latitude/longitude) of the road-side device included in the traffic light information acquired through the in-vehicle outside communications equipment 10. The position of the road-side device may also be grasped by referring to the map data of the navigation system 16, and the distance L may be acquired from the position of the road-side device and the current position of the subject vehicle. Further, by using communication between vehicles, the number of vehicles in front of the subject vehicle toward the road-side device may be directly acquired from the other vehicles through the in-vehicle outside communications equipment 10.
In addition, the distance L between the head stopped vehicle and the subject vehicle may be acquired from outside the subject vehicle through the in-vehicle outside communications equipment 10.
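The halt calculation for the stop-signal case described above can be sketched as follows. This is a minimal sketch under assumed values: the per-vehicle required departure time and the vehicle spacing used to infer the vehicle count from the distance L are illustrative, not values specified in the disclosure.

```python
def estimate_halt_seconds(remaining_red_s: float,
                          vehicles_ahead: int,
                          departure_time_per_vehicle_s: float = 2.0) -> float:
    """Estimate the halt Xs when stopped at a stop signal.

    remaining_red_s: remaining stop-signal time, from the traffic light
        information acquired via the in-vehicle outside communications.
    vehicles_ahead: number of vehicles between the road-side device and
        the subject vehicle.
    departure_time_per_vehicle_s: preset required departure time per
        vehicle (illustrative default).
    """
    return remaining_red_s + vehicles_ahead * departure_time_per_vehicle_s


def vehicles_from_distance(distance_l_m: float, spacing_m: float = 6.0) -> int:
    """Roughly infer the number of vehicles ahead from the distance L,
    assuming a fixed spacing per stopped vehicle (illustrative)."""
    return int(distance_l_m // spacing_m)
```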
In addition, the scene presumption department 18 estimates the condition/circumstance of the driver of the subject vehicle from the driving state of the vehicle driver estimated by the eye/speech analysis device 12 as well as the estimation result of the atmosphere in the compartment of the subject vehicle. That is, for example, when there is no co-occupant in the vehicle, the scene presumption department 18 estimates the condition of the driver who is driving alone as a concentrated driving condition, a careless driving condition, a tense driving condition, or the like, based on the number of occupants in the subject vehicle and the atmosphere in the compartment (quietness, strain, liveliness, or the like). Further, when there is a co-occupant in the vehicle, the driver's condition is estimated as a co-occupant affected driving condition, such as being involved in a conversation or listening to music with the co-occupant, having the co-occupant in a sleeping condition, or the like.
The state estimation department 19 is equipped with a first function that determines whether the driving operation (a steering operation, a brake operation, an accelerator operation, or the like) of the vehicle driver detected by the vehicle information sensing system 4 deviates substantially from the driving operation of the vehicle driver accumulated in the database of the personal information accumulation device 6 (i.e., from the normal driving operation), for the purpose of determining (estimating) whether the driver's current emotional condition is different from the usual condition (i.e., the normal condition). The determination by the first function is employed as one of the predetermined criteria for control determination.
Furthermore, the state estimation department 19 is equipped with a second function that reads the feelings (state) of the vehicle driver from biological information such as the voice, expression, head movement, eye direction, and the like of the vehicle driver acquired by the eye/speech analysis device 12, for the purpose of determining (i.e., estimating) whether the driver's current condition is different from the normal condition. The second function is also employed as one of the predetermined criteria for control determination.
The information filtering department 20 acquires information that can be offered in the subject vehicle (offer-able information) from the outside information center through the in-vehicle outside communications equipment 10, and the acquired information is stored in an information server (not illustrated). The information filtering department 20 filters and extracts, from the offer-able information stored in the server, the information that is in accordance with the personal information (the general information as well as the information about a hobby/idea as stated above) input from the individual adaptation department 21, that is, the preferred information of the driver. The offer-able information stored in the information server is accompanied by identification information for use in the information extraction, and the information filtering department 20 extracts the offer-able information by checking agreement between the identification information and the personal information.
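The matching of identification information against the personal information can be sketched as follows. The data layout (a category plus a tag set per item, and preferred contents per hobby/idea category) is an assumption for illustration, not a structure specified in the disclosure.

```python
def filter_offerable(offerable: list[dict], hobby: dict[str, set[str]]) -> list[dict]:
    """Extract offer-able items whose identification information agrees
    with the driver's hobby/idea categories and preferred contents.

    offerable: items such as {"category": "music", "tags": {"jazz"}}.
    hobby: preferred contents per category, e.g. {"music": {"jazz"}};
           an empty set means any content in that category is preferred.
    """
    extracted = []
    for item in offerable:
        preferred = hobby.get(item["category"])
        if preferred is None:
            continue  # category is not among the driver's hobbies/ideas
        if not preferred or item.get("tags", set()) & preferred:
            extracted.append(item)
    return extracted
```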
Then, the information processing department 22 selects (acquires), from the offer-able information extracted by the information filtering department 20 (i.e., the preferred information of the vehicle driver), the information that is suitable for the condition of the driver of the subject vehicle estimated by the scene presumption department 18. Further, the selected information is provided to the driver who has come to a temporary stop at a stop signal, in traffic congestion, or the like, thereby easing the driver by guiding the driver's attention to information that is offered in association with the stop factor causing the driver's uncomfortable feeling.
The information processing department 22 processes the selected information described above, by employing a technique such as natural language processing, to have a playback time that is substantially the same as the halt Xs calculated by the halt calculation department 17. In other words, the content of the offer-able information is processed (i.e., edited) so that the information provision concludes within the halt Xs after the start of the information provision at the time of the stop of the subject vehicle, or at the determination of the stop of the subject vehicle. Details of the editing process of the information by the information processing department 22 are described later.
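One simple way to make the total provision fit within the halt Xs is a greedy selection over candidate contents; this is a sketch under assumed field names, not the editing procedure the disclosure specifies.

```python
def fit_to_halt(contents: list[dict], halt_xs: float) -> list[dict]:
    """Greedily pick contents so that the total playback time does not
    exceed the halt Xs, letting the provision conclude before the
    subject vehicle is expected to start.

    contents: candidate items such as {"id": "A", "playback_s": 40.0}.
    """
    selected: list[dict] = []
    total = 0.0
    for item in contents:
        if total + item["playback_s"] <= halt_xs:
            selected.append(item)
            total += item["playback_s"]
    return selected
```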
Then, by providing the contents of the edited information to the driver of the subject vehicle, the driver is enabled to look through the contents of the provided information before starting the subject vehicle. Therefore, the driver's attention is not left in a lingering condition, partially attracted to the provided information, after the start of the vehicle. As a result, the driver's attention is easily focused on the driving operation, that is, concentration on the driving operation is enabled. The information provision apparatus 1 of the present embodiment achieves safety of the driving operation after the start of the vehicle in the above-described manner.
In addition, the appearance and arrangement of the information content are adjusted for the viewer (i.e., for an individual driver) by the display control department 23 after the editing of the information by the information processing department 22. Similarly, the sound control department 24 adjusts the edited information mentioned above for ease of hearing by the individual driver before outputting it as a sound.
Processing control of the offer-able information is explained with reference to
In step S30, the condition of the vehicle driver is estimated by the scene presumption department 18, and then, in step S40, the condition (feelings) of the vehicle driver is estimated by the state estimation department 19. In step S50, the personal information of the vehicle driver is acquired by the individual adaptation department 21, and then, in step S60, the offer-able information that agrees with the hobby/idea of the vehicle driver is extracted by the information filtering department 20. Then, in step S70, the offer-able information extracted in step S60 is further selected and edited by the information processing department 22 for provision to the driver and the co-occupants. In this case, the information content is determined by first classifying the content to be provided into categories of long time information and short time information. More practically, as the long time information, the information type (e.g., a television program, a music piece, a DVD track, and the like) is determined together with the information content to be provided, and plural information contents are prepared. Similarly, as the short time information, the information type (e.g., a commercial film, a four-frame cartoon, a game, and the like) is determined together with the information content to be provided, and plural information contents are prepared.
The prepared long time information A-F and short time information a-f, together with the last information contents Z, Y, X to be provided, are shown in
Then, in step S80, whether the situation allows the provision of the long time information is determined by the information processing department 22 based on the driver's condition as well as the feelings of the driver, the driving operation, and the like. When it is determined that the situation allows the provision of the long time information (YES in step S80), the process proceeds to step S90. In step S90, plural pieces of randomly ordered information including the long time information, such as (a, B, C, b) as exemplarily shown in
On the other hand, when it is determined that the situation does not allow the provision of the long time information (NO in step S80), the process proceeds to step S100. In step S100, plural pieces of randomly ordered information including short time information only, such as (a, c, d, f, b) as exemplarily shown in
Then, in step S110, the last information contents to be provided such as Z in
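The sequencing of steps S80 through S110 can be sketched as follows. The item structure and the same-category skip rule are assumptions for illustration, reflecting the stated aim of avoiding successive provision of information in the same category.

```python
import random


def build_sequence(short_info, long_info, last_info, allow_long, rng=None):
    """Assemble a provision sequence in the spirit of steps S80-S110:
    include long time information only when the driver's condition
    allows it (step S80), order the items randomly (steps S90/S100)
    while skipping an item that would repeat the preceding category,
    and append the last information that redirects the driver's
    attention to the driving operation (step S110)."""
    rng = rng or random.Random()
    pool = list(short_info) + (list(long_info) if allow_long else [])
    rng.shuffle(pool)
    sequence = []
    for item in pool:
        if sequence and sequence[-1]["category"] == item["category"]:
            continue  # avoid successive provision of the same category
        sequence.append(item)
    sequence.append(last_info)
    return sequence
```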
The information content is explained in the following. First, information registered by the driver while the vehicle is traveling is, for example, prepared and provided. In addition, information inferred from the presence of occupants, the number of occupants, the day of the week, and the time of day (for example, by using Bayesian inference), such as a music piece being listened to, a TV/DVD program being watched, a previously visited shop, a tourist spot, and the like, is prepared and provided. Further, for example, Internet registration information such as gourmet information, travel information, book information, and comic information is prepared and provided.
Furthermore, information about other people downloaded from the Internet, such as stand-up comedy by young performers and the like, is prepared and provided. Furthermore, information about other people who have watched the same music clip and/or the same television program is prepared and provided. In addition, local information such as shop information, tourist spot information, local artist information, local radio information, and the like is prepared and provided.
The short time information is information that can be displayed on one screen of the display unit 7 (e.g., information concluded within one screen, or information whose contents can be understood from one screen), that is, information that can be provided in a short time and whose contents can be understood in a short time. The short time information is mainly formed as a still image, or as a moving picture if it can be played in a short time.
The long time information is a moving picture, a slide show, scrollable information, and the like, that is, information that takes a longer time to be provided. The last information is information that turns the driver's attention to the driving operation of the vehicle. The last information is, for example, a message and/or an image notifying the driver that the vehicle has to be started in a short time, as shown in
When plural information contents are provided in the manner shown in
In addition, the information on the display unit 7 is dimmed as time elapses when plural information contents are displayed (refer to
Furthermore, in the present embodiment, the plural information contents are joined in a fade-out/fade-in manner (refer to
In the above-described manner, the traffic congestion is detected based on the vehicle speed, the distance from the front vehicle, or the like, and the plural information contents, such as a four-frame cartoon, news, and the like, are provided smoothly one by one in a fade-in/fade-out manner during the temporary stop due to the traffic congestion, so that the driver is eased away from unsafe driving. Then, at the end of the information provision, for the purpose of notifying the driver that the vehicle has to be started in a short time, the screen is turned suddenly to a local commercial film of the driver's preference, news of the day, or the like. That is, the content is turned to one completely different from what has been provided, to remind the driver that the vehicle will start in a short time and to encourage the driver to prepare for the start of the vehicle.
The provision of the information that includes the long time information is explained with reference to
First, for a driver who loves sightseeing, tourist information on the popular destination Beijing is acquired from a travel agency through the Internet, and the Beijing tourist information is displayed on the display unit 7 as shown in
Then, a commercial film of a nearby brand-name store is displayed on the display unit 7 based on the map information as shown in
Then, on the display unit 7, the pictures taken on a previous tour and stored on the hard disk drive (HDD) are displayed in order as a slide show as shown in
In addition, the last information displayed on the display unit 7 may be the message "RESUME DRIVING" as shown in
Furthermore, the last information may be a movie that repeats the same scene or the like. Furthermore, the last information may be an output of a voice message such as "The signal changes in a short time." or "Start the vehicle right away."
Furthermore, although in the above embodiment the road-side device is assumed to be the traffic signal, the road-side device may be, for example, a radio beacon, an optical beacon, or the like that is communicable. In addition, communication methods such as dedicated short range communication (DSRC) and a wireless LAN, as well as other types of communication, may also be employed.
Such changes are regarded within a scope of the present invention.
Number | Date | Country
---|---|---
A-10-082653 | Mar 1998 | JP
A-2002-107156 | Apr 2002 | JP
A-2002-243472 | Aug 2002 | JP
A-2004-355055 | Dec 2004 | JP
A-2005-121382 | May 2005 | JP
A-2005-222241 | Aug 2005 | JP
A-2005-352619 | Dec 2005 | JP
A-2006-284314 | Oct 2006 | JP
Number | Date | Country
---|---|---
20080284615 A1 | Nov 2008 | US