This application claims priority to Japanese Patent Application No. 2020-094756 filed on May 29, 2020, incorporated herein by reference in its entirety.
The present disclosure relates to a server device, a control device, a vehicle, and an operation method for an information processing system.
Technologies that support the prevention of dangerous driving of vehicles are known. For example, Japanese Unexamined Patent Application Publication No. 2015-219736 (JP 2015-219736 A) discloses a system that predicts dangerous driving based on the position of a vehicle and encourages the driver to drive safely.
There is room for improvement in techniques for determining dangerous driving and other improper driving.
In the following, a server device and the like that can improve the determination of improper driving will be disclosed.
A server device according to the present disclosure includes a communication unit and a control unit that sends and receives information to and from another device via the communication unit. The control unit sends, to a terminal device, a captured image of the surroundings of a vehicle or the inside of a vehicle cabin captured when a traveling mode of the vehicle is different from a past traveling mode, such that the terminal device outputs the captured image.
A control device for a vehicle according to the present disclosure includes a communication unit and a control unit that sends and receives information to and from another device via the communication unit. The control unit sends, to a server device, a captured image of the surroundings of the vehicle or the inside of a vehicle cabin captured when a traveling mode of the vehicle is different from a past traveling mode, such that the server device sends the captured image to a terminal device.
In an operation method for an information processing system according to the present disclosure, the information processing system including a server device and a vehicle that send and receive information to and from each other, the vehicle sends, to the server device, a captured image of the surroundings of the vehicle or the inside of a vehicle cabin captured when a traveling mode of the vehicle is different from a past traveling mode, and the server device sends the captured image to a terminal device such that the terminal device outputs the captured image.
With the server device and the like according to the present disclosure, it is possible to improve the determination of improper driving.
Features, advantages, and technical and industrial significance of exemplary embodiments of the disclosure will be described below with reference to the accompanying drawings, in which like signs denote like elements, and wherein:
Hereinafter, embodiments will be described.
A driver of the vehicle 13 has his/her own driving behavior tendency. The driving behavior tendency is of interest, for example, when assessing automobile insurance, or when evaluating the performance of a driver in the case where the vehicle 13 is a taxi vehicle. Specifically, it is required to determine whether the driving behavior tendency is oriented toward safe and proper driving (hereinafter collectively referred to as proper driving) or dangerous or improper driving (hereinafter collectively referred to as improper driving). Here, since the driving behavior tendency is reflected in a traveling mode of the vehicle 13, it is possible to determine the driving behavior tendency of the driver using information regarding the traveling mode of the vehicle 13. However, if a driver who normally practices proper driving happens to exhibit improper driving for some reason, it is necessary to identify the factor that induced such driving in order to understand the driver's original tendency. The information processing system 1 in the present embodiment supports the identification of the factor that induced the improper driving, and thereby contributes to the improvement of the determination of improper driving.
In the present embodiment, the vehicle 13 sends, to the server device 10, a captured image of the surroundings of the vehicle 13 captured when the traveling mode of the vehicle 13 is different from the past traveling mode. The traveling mode includes, for example, acceleration/deceleration, a rate of change in steering angle over time, a selected route, and the like. The traveling mode corresponds to a driver's driving behavior such as depression/release of an accelerator pedal, steering, braking operation, and route selection. The past traveling mode corresponds to the normal driving behavior of the driver of the vehicle 13 and reflects the tendency of the driver. Therefore, when the traveling mode of the vehicle 13 is different from the past traveling mode, it is highly probable that the driver has performed a driving behavior different from his/her original tendency for some reason. The server device 10 sends the captured image of the surroundings of the vehicle 13 to the terminal device 12 such that the terminal device 12 outputs the captured image. Therefore, for example, when the driver or the person in charge of assessing the automobile insurance verifies the captured image output by the terminal device 12, it is possible to identify the factor that induced the driver's improper driving. Examples of the factor that induces improper driving such as sudden braking or sudden steering include a road obstacle or a pedestrian jumping out, which must be avoided by sudden braking or sudden steering. Further, examples of the factor that induces improper driving such as sudden acceleration include being chased by another vehicle from behind, road rage, and the like, which must be avoided by sudden acceleration, or coercion and intimidation by a passenger when the vehicle 13 is a taxi vehicle. Further, examples of the factor that induces improper driving such as adopting a route different from the usual route include congestion and road closure on the normal route.
As described above, according to the information processing system 1, it is possible to improve the determination of improper driving.
The communication unit 20 has one or more communication modules corresponding to a wired or wireless LAN standard for connecting to the network 11. In the present embodiment, the server device 10 is connected to the network 11 via the communication unit 20, and performs information communication with another device via the network 11.
The storage unit 21 has, for example, a semiconductor memory, a magnetic memory, an optical memory, or the like. The storage unit 21 functions as, for example, a main storage device, an auxiliary storage device, or a cache memory. The storage unit 21 stores information, a control/processing program, and the like used for the operation of the server device 10.
The control unit 22 has, for example, one or more general-purpose processors such as a central processing unit (CPU), or one or more dedicated processors specialized for a specific process. Alternatively, the control unit 22 may have one or more dedicated circuits such as a field-programmable gate array (FPGA) and an application specific integrated circuit (ASIC). The control unit 22 comprehensively controls the operation of the server device 10 by operating according to the control/processing program or operating according to an operation procedure implemented as a circuit. Then, the control unit 22 sends and receives various information to and from the terminal device 12 and the vehicle 13 via the communication unit 20, and performs the operation according to the present embodiment.
The input/output unit 30 has an input interface that detects the user's input and sends the input information to the control unit 33. The input interface is any input interface including, for example, a physical key, a capacitive key, a touch screen integrated with a panel display, various pointing devices, a microphone that accepts voice input, a camera that captures an image or an image code, and the like. Further, the input/output unit 30 has an output interface that outputs, to the user, information generated by the control unit 33 or received from another device. The output interface is any output interface including, for example, an external or built-in display that outputs information as an image/video, a speaker that outputs information as audio, or a connection interface for an external output device.
The communication unit 31 has a communication module corresponding to a wired or wireless LAN standard, a module corresponding to mobile communication standards such as fourth generation (4G) and fifth generation (5G), and the like. The terminal device 12 is connected to the network 11 through the communication unit 31 via a nearby router device or a mobile communication base station, and performs information communication with other devices via the network 11.
The storage unit 32 has, for example, a semiconductor memory, a magnetic memory, an optical memory, or the like. The storage unit 32 functions as, for example, a main storage device, an auxiliary storage device, or a cache memory. The storage unit 32 stores information, a control/processing program, and the like used for the operation of the terminal device 12.
The control unit 33 has, for example, one or more general-purpose processors such as a CPU or a micro processing unit (MPU), or one or more dedicated processors specialized for a specific process. Alternatively, the control unit 33 may have one or more dedicated circuits such as an FPGA and an ASIC. The control unit 33 comprehensively controls the operation of the terminal device 12 by operating according to the control/processing program or operating according to an operation procedure implemented as a circuit. Then, the control unit 33 sends and receives various information to and from the server device 10 and the like via the communication unit 31, and performs the operation according to the present embodiment.
The communication unit 41 includes one or more communication interfaces. The communication interface is, for example, an interface compatible with mobile communication standards such as long term evolution (LTE), 4G, or 5G. The communication unit 41 receives the information used for the operation of the control device 40, and sends the information obtained through the operation of the control device 40. The control device 40 is connected to the network 11 through the communication unit 41 via a mobile communication base station, and performs information communication with other devices via the network 11.
The positioning unit 42 includes one or more global navigation satellite system (GNSS) receivers. The GNSS includes, for example, at least one of a global positioning system (GPS), a quasi-zenith satellite system (QZSS), a global navigation satellite system (GLONASS), and Galileo. The positioning unit 42 acquires position information of the vehicle 13.
The input/output unit 43 includes one or more input interfaces and one or more output interfaces. The input interface is, for example, a physical key, a capacitive key, a pointing device, a touch screen integrated with a display, or a microphone that accepts voice input. The output interface is, for example, a display or a speaker. The display is, for example, a liquid crystal display (LCD) or an organic electro luminescence (EL) display. The input/output unit 43 accepts an operation of inputting information used for the operation of the control device 40, sends the input information to the control unit 47, and outputs information obtained through the operation of the control device 40.
The imaging unit 44 includes one or more cameras and a control circuit therefor, which are provided at positions that enable imaging of the surroundings of the vehicle 13 or the inside of the vehicle cabin. The camera of the imaging unit 44 may be a monocular camera or a stereo camera. The imaging unit 44 images the surroundings of the vehicle 13 or the inside of the vehicle cabin at predetermined time intervals, and sends the captured images to the control unit 47. Further, the captured images may be associated with information on audio around the vehicle 13 or inside the vehicle cabin, which is acquired from the input interface of the input/output unit 43.
The detection unit 45 has sensors for detecting the motion state of the vehicle 13. The sensors include, for example, sensors that detect a vehicle speed, an acceleration, a steering angle, a tilt, a braking operation, and the like of the vehicle 13. The detection unit 45 detects information indicating the motion state of the vehicle 13 by the sensors and sends the information to the control unit 47.
The storage unit 46 includes one or more semiconductor memories, one or more magnetic memories, one or more optical memories, or a combination of at least two of them. The semiconductor memory is, for example, a random access memory (RAM) or a read only memory (ROM). The RAM is, for example, a static RAM (SRAM) or a dynamic RAM (DRAM). The ROM is, for example, an electrically erasable ROM (EEPROM). The storage unit 46 functions as, for example, a main storage device, an auxiliary storage device, or a cache memory. The storage unit 46 stores the information used for the operation of the control device 40 and the information obtained through the operation of the control device 40.
The control unit 47 has one or more general-purpose processors such as a CPU and an MPU, or one or more dedicated processors specialized for a specific process. Alternatively, the control unit 47 may have one or more dedicated circuits such as an FPGA and an ASIC. The control unit 47 comprehensively controls the operations of the control device 40 and the vehicle 13 by operating according to a control/processing program or operating according to an operation procedure implemented as a circuit. The control unit 47 sends and receives various information to and from the server device 10 via the communication unit 41, and performs the operation according to the present embodiment.
In step S500, the vehicle 13 sends the position information to the server device 10. The control unit 47 of the vehicle 13 sends the current position of the vehicle 13 acquired from the positioning unit 42 to the server device 10 via the communication unit 41. The control unit 22 of the server device 10 receives the position information via the communication unit 20 and stores it in the storage unit 21. Step S500 is repeated at predetermined time intervals (e.g., intervals of a few milliseconds to a few seconds).
In step S502, the vehicle 13 sends state information indicating the motion state of the vehicle 13 to the server device 10. The state information is, for example, information indicating the motion state such as the vehicle speed, the acceleration, and the steering angle of the vehicle 13 detected by the detection unit 45 of the vehicle 13. The control unit 47 of the vehicle 13 sends the state information to the server device 10 via the communication unit 41. The control unit 22 of the server device 10 receives the state information via the communication unit 20 and stores it in the storage unit 21.
In step S504, the vehicle 13 sends the captured image to the server device 10. The control unit 47 of the vehicle 13 sends the captured image of the surroundings of the vehicle 13 or the inside of the vehicle cabin, which is captured by the imaging unit 44, to the server device 10 via the communication unit 41. The control unit 22 of the server device 10 receives the captured image via the communication unit 20. Step S504 is repeated at predetermined time intervals (e.g., intervals of a few milliseconds to a few seconds).
In step S506, the server device 10 determines whether to send the captured image, the position information, and the like to the terminal device 12, based on the traveling mode of the vehicle 13. For example, the control unit 22 of the server device 10 compares the traveling mode corresponding to the position information and the state information of the vehicle 13 with the past traveling mode corresponding to the past position information and the past state information stored in the storage unit 21. Then, the control unit 22 determines whether the current traveling mode is different from the past traveling mode. When the current traveling mode is different from the past traveling mode, the control unit 22 determines to send the captured image and the position information to the terminal device 12. Here, the details of step S506 will be described with reference to
In step S700, the control unit 22 determines whether there is a speed violation. For example, the control unit 22 compares the past vehicle speed stored in the storage unit 21 with the acquired current vehicle speed of the vehicle 13. When the current vehicle speed shows an abnormal value, the control unit 22 determines that there is a speed violation. Any predetermined criterion can be used to determine the abnormal value. For example, a standard is set for the magnitude of the deviation from the past average value or median value. When the magnitude of the deviation exceeds the standard, the value can be determined to be abnormal, and when the magnitude of the deviation is equal to or less than the standard, the value can be determined to be normal. The magnitude of the deviation may be expressed as an absolute value or as a standard score. The abnormal value is determined in the same manner in the following description. When the control unit 22 determines that there is a speed violation (Yes in step S700), that is, when the traveling mode of the vehicle 13 is different from the past traveling mode, the process proceeds to step S708, and the control unit 22 determines to send the captured image received from the vehicle 13 to the terminal device 12. When the control unit 22 does not determine that there is a speed violation (No in step S700), the control unit 22 proceeds to step S702.
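The abnormal-value test described above, in which the deviation of a current reading from the past average is compared against a set standard, could be sketched as follows. This is an illustrative sketch only: the function name, the sample data, and the threshold are assumptions, not part of the disclosure.

```python
import statistics

def is_abnormal(current: float, past_values: list[float], standard: float) -> bool:
    """Return True when the deviation of `current` from the past average
    exceeds the standard, i.e. the reading counts as an abnormal value."""
    mean = statistics.mean(past_values)
    return abs(current - mean) > standard

# Example: past vehicle speeds cluster around 50 km/h.
past_speeds = [48.0, 52.0, 50.0, 49.0, 51.0]
print(is_abnormal(80.0, past_speeds, standard=15.0))  # True: speed violation suspected
print(is_abnormal(55.0, past_speeds, standard=15.0))  # False: within the standard
```

The same test can be reused for the acceleration and steering checks in steps S702 and S704, since the disclosure states that the abnormal value is determined in the same manner throughout.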
In step S702, the control unit 22 determines whether there is sudden acceleration or sudden braking. For example, the control unit 22 compares the absolute value of the past acceleration stored in the storage unit 21 with the absolute value of the acquired current acceleration of the vehicle 13. When the current absolute value indicates an abnormal value, the control unit 22 determines that there is sudden acceleration or sudden braking. When the control unit 22 determines that there is sudden acceleration or sudden braking (Yes in step S702), that is, when the traveling mode of the vehicle 13 is different from the past traveling mode, the process proceeds to step S708, and the control unit 22 determines to send the captured image received from the vehicle 13 to the terminal device 12. When the control unit 22 does not determine that there is sudden acceleration or sudden braking (No in step S702), the control unit 22 proceeds to step S704.
In step S704, the control unit 22 determines whether there is sudden steering. For example, the control unit 22 obtains the rate of change over time of the acquired current steering angle with respect to the latest steering angle stored in the storage unit 21. Then, the control unit 22 compares the rate of change over time of the current steering angle with the rate of change over time of the past steering angle for the same unit time. When the rate of change over time of the current steering angle shows an abnormal value, the control unit 22 determines that there is sudden steering. When the control unit 22 determines that there is sudden steering (Yes in step S704), that is, when the traveling mode of the vehicle 13 is different from the past traveling mode, the process proceeds to step S708, and the control unit 22 determines to send the captured image received from the vehicle 13 to the terminal device 12. When the control unit 22 does not determine that there is sudden steering (No in step S704), the control unit 22 proceeds to step S706.
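The sudden-steering check, computing the rate of change of the steering angle over a unit time and comparing it with past rates for the same unit time, could be sketched as follows. The names, the sample data, and the 10 deg/s margin are illustrative assumptions.

```python
def steering_rate(prev_angle_deg: float, curr_angle_deg: float, dt_s: float) -> float:
    """Absolute rate of change of the steering angle, in degrees per second."""
    return abs(curr_angle_deg - prev_angle_deg) / dt_s

# Past rates of change observed for the same unit time (illustrative values).
past_rates = [2.0, 3.0, 2.5, 3.5]
current_rate = steering_rate(5.0, 45.0, dt_s=1.0)   # 40 deg in 1 s
mean_rate = sum(past_rates) / len(past_rates)
print(current_rate - mean_rate > 10.0)  # True: sudden steering suspected
```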
In step S706, the control unit 22 determines whether a different route has been adopted. For example, the control unit 22 derives roads on which the vehicle 13 frequently travels from the history of transitions of the position information stored in the storage unit 21. For example, the control unit 22 derives the fact that the vehicle 13 frequently travels on main roads, bypasses, highways, and the like. Further, the control unit 22 derives the current traveling route of the vehicle 13 based on the transition of the acquired position information up to the latest position information. When the current route deviates from the past route, the control unit 22 determines that a different route has been adopted. For example, the control unit 22 determines that a different route has been adopted when the travel distance over which the current route deviates from the past route is larger than a predetermined reference distance, and does not determine that a different route has been adopted when the travel distance is equal to or less than the reference distance. Examples of the case where it is determined that a different route has been adopted include a case where the vehicle 13 frequently traveled on main roads, bypasses, highways, and the like in the past, but is now traveling through residential areas or narrow alleys. When the control unit 22 determines that a different route has been adopted (Yes in step S706), that is, when the traveling mode of the vehicle 13 is different from the past traveling mode, the process proceeds to step S708, and the control unit 22 determines to send the captured image received from the vehicle 13 to the terminal device 12. When the control unit 22 does not determine that a different route has been adopted (No in step S706), the control unit 22 ends the procedure of
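The route-deviation test, summing the travel distance over which the current route departs from the past route and comparing it with a reference distance, could be sketched as follows. The haversine distance, the 50 m tolerance, and the point-list representation of routes are illustrative assumptions; the disclosure only requires comparing a deviated travel distance against a reference distance.

```python
import math

def haversine_m(p, q):
    """Approximate great-circle distance in metres between (lat, lon) pairs."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371000 * math.asin(math.sqrt(a))

def deviated_distance_m(current_route, past_route, tolerance_m=50.0):
    """Sum the travel distance of current segments whose endpoints all lie
    farther than `tolerance_m` from every point of the past route."""
    def off_route(p):
        return all(haversine_m(p, q) > tolerance_m for q in past_route)
    total = 0.0
    for a, b in zip(current_route, current_route[1:]):
        if off_route(a) and off_route(b):
            total += haversine_m(a, b)
    return total

past_route = [(35.6000, 139.7000)]                      # a point on the usual route
current_route = [(35.6100, 139.7000), (35.6200, 139.7000)]
print(deviated_distance_m(current_route, past_route))   # roughly 1.1 km of deviation
```

A different route would then be determined when this value exceeds the predetermined reference distance.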
In step S700 or S706, the control unit 22 may perform the determination process in consideration of the environmental information. For example, in step S700, when the environmental information indicates that the vehicle 13 is traveling on a main road, the criterion for determining that there is an abnormal value is loosened at a predetermined ratio, and when the environmental information indicates that the vehicle 13 is traveling through a residential area, the criterion is tightened. This enables a suitable determination of a speed violation considering the environment. Further, in step S706, when the environmental information indicates congestion or road closure on the past route of the vehicle 13, the criterion for determining that there is an abnormal value is loosened, which enables a suitable determination of a different route considering the environment.
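The environment-dependent criterion described above can be sketched as a simple threshold adjustment. The base standard and the adjustment ratios below are assumptions chosen for illustration; the disclosure only specifies that the criterion is loosened at a predetermined ratio on main roads and tightened in residential areas.

```python
BASE_STANDARD = 15.0  # assumed base standard for the speed deviation, in km/h

def adjusted_standard(environment: str) -> float:
    """Loosen or tighten the abnormal-value criterion by environment."""
    if environment == "main_road":
        return BASE_STANDARD * 1.5   # relatively loose criterion
    if environment == "residential":
        return BASE_STANDARD * 0.5   # relatively strict criterion
    return BASE_STANDARD             # default criterion elsewhere

print(adjusted_standard("main_road"))    # 22.5
print(adjusted_standard("residential"))  # 7.5
```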
Further, in steps S700 to S706, the control unit 22 may perform the determination process in consideration of the position information or the time. For example, the control unit 22 compares the past traveling mode with the current traveling mode at the same position as the current position of the vehicle 13. Alternatively, the control unit 22 compares the past traveling mode with the current traveling mode at the same time of day as the current time. By doing so, it is possible to eliminate the peculiarity caused by the position or the time and determine the traveling mode different from the past traveling mode.
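The time-of-day filter mentioned above, restricting the past records used for comparison to those logged at the same time of day as the current time, could be sketched as follows. The record structure and field names are assumptions for illustration.

```python
from datetime import datetime

def records_for_same_hour(records, now: datetime):
    """Keep only past traveling-mode records logged in the same hour of day."""
    return [r for r in records if r["time"].hour == now.hour]

past = [
    {"time": datetime(2020, 5, 1, 8, 30), "speed": 40.0},   # morning record
    {"time": datetime(2020, 5, 2, 22, 10), "speed": 65.0},  # night record
]
# At 08:00, only the morning record is comparable.
print(records_for_same_hour(past, datetime(2020, 5, 29, 8, 0)))
```

An analogous filter on the position information (e.g. keeping records within some radius of the current position) would implement the same-position comparison.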
In
After the traveling of the vehicle 13 is completed, in step S510 the terminal device 12 outputs the captured image, the position information, the environmental information, and the traveling mode information of the vehicle 13. For example, step S510 is performed when the driving behavior tendency of the driver is evaluated. For example, in response to an operation input by the driver or the evaluator, the control unit 33 of the terminal device 12 outputs the captured image, the position information, the environmental information, and the traveling mode information through the input/output unit 30. For example, as shown in
The terminal device 12 presents to the driver or the evaluator the position information, the environmental information, and the captured image obtained when the vehicle 13 exhibits an unusual traveling mode, which indicates a high probability of improper driving. Thus, the driver or the evaluator can visually verify, from the captured image, whether there was a factor that induced improper driving. At the same time, the shapes, arrangement, and the like of the roads can be grasped from the position information, and the road environment, the traffic conditions, and the like can be grasped from the environmental information, which can be utilized for the verification and determination of improper driving.
In the procedure of
In step S900, before sending the captured image and the position information to the server device 10, the control device 40 of the vehicle 13 determines whether to send the captured image, the position information, and the like to the terminal device 12 via the server device 10, based on the traveling mode of the vehicle 13. For example, the control unit 47 of the control device 40 compares the traveling mode corresponding to the position information and the state information of the vehicle 13 with the past traveling mode corresponding to the past position information and the past state information stored in the storage unit 46. Then, the control unit 47 determines whether the current traveling mode is different from the past traveling mode. When the current traveling mode is different from the past traveling mode, the control unit 47 determines to send the captured image and the position information to the terminal device 12. When the control unit 47 determines to send the captured image and the position information to the terminal device 12, it sends the position information and the captured image to the server device 10. Further, for example, the control device 40 of the vehicle 13 may be configured so that the information shown in
As described above, according to the present embodiment, it is possible to improve the determination of improper driving.
In the above-described embodiment, the processing/control program defining the operation of the terminal device 12 and the control device 40 may be stored in the storage unit 21 of the server device 10 or a storage unit of another server device, and downloaded to each device via the network 11. Alternatively, the processing/control program may be stored in a portable, non-transitory recording/storage medium that can be read by each device, and may be read from the medium by each device.
Although the embodiments have been described above based on the drawings and examples, it should be noted that those skilled in the art can easily make various modifications and alterations thereto based on the present disclosure. It should be noted, therefore, that these modifications and alterations are within the scope of the present disclosure. For example, the functions included in each step, etc. can be rearranged so as not to be logically inconsistent, and a plurality of steps, etc. can be combined into one or divided.
Number | Date | Country | Kind
---|---|---|---
2020-094756 | May 2020 | JP | national