The disclosure of Japanese Patent Application No. 2018-234146 filed on Dec. 14, 2018, including the specification, drawings, and abstract, is incorporated herein by reference in its entirety.
The disclosure relates to an information processing system, a program, and an information processing method.
In the related art, a technology for providing information on road congestion has been known. For example, Japanese Unexamined Patent Application Publication No. 2014-228434 (JP 2014-228434 A) discloses a navigation device that acquires a changing point of a traffic volume of a road, acquires a traveling vehicle image obtained by imaging the road at the changing point, and displays the traveling vehicle image in a display mode in which a position of the changing point is identified.
In the technology disclosed in JP 2014-228434 A, each of a head position and a tail position of a congestion section indicated in congestion information from a VICS (registered trademark, which stands for "vehicle information and communication system") center is acquired as the changing point of the traffic volume on the road. However, the congestion information provided from the VICS center indicates a rough congestion section and congestion degree, and its accuracy is not always sufficient. Therefore, there is room for improvement in the technology for providing information on road congestion.
An object of the disclosure, made in consideration of the above circumstances, is to improve the technology for providing information on road congestion.
A first aspect of the disclosure relates to a system including a vehicle and a server configured to communicate with the vehicle. The vehicle acquires a moving image obtained by imaging an oncoming lane during traveling. The vehicle or the server determines at least one of a congestion section and a congestion degree of the oncoming lane based on the moving image. The server is configured to store at least one of the congestion section and the congestion degree of the oncoming lane, and provide information to a client by using the stored information.
A second aspect of the disclosure relates to a program. The program causes a vehicle that is communicable with a server to execute steps of acquiring a moving image obtained by imaging an oncoming lane during traveling, determining at least one of a congestion section and a congestion degree of the oncoming lane based on the moving image, and transmitting, to the server, the at least one of the congestion section and the congestion degree of the oncoming lane.
A third aspect of the disclosure relates to an information processing method executed by a system including a vehicle and a server that is communicable with the vehicle. The method includes acquiring, by the vehicle, a moving image obtained by imaging an oncoming lane during traveling, determining, by the vehicle or the server, at least one of a congestion section and a congestion degree of the oncoming lane based on the moving image, storing, by the server, at least one of the congestion section and the congestion degree of the oncoming lane, and providing, by the server, information to a client by using the stored information.
With the information processing system, the program, and the information processing method according to the aspects of the disclosure, the technology for providing information on road congestion is improved.
Features, advantages, and technical and industrial significance of exemplary embodiments of the disclosure will be described below with reference to the accompanying drawings, in which like numerals denote like elements, and wherein:
Hereinafter, an embodiment of the disclosure will be described.
Configuration of Information Processing System
An outline of an information processing system 1 according to an embodiment of the disclosure will be described with reference to
The outline of the embodiment will be first described, and the details thereof will be described below. The vehicle 10 includes, for example, an in-vehicle camera, and acquires a moving image obtained by imaging an oncoming lane during traveling. The vehicle 10 determines a congestion section and/or a congestion degree of the oncoming lane based on the moving image, and transmits the determination result and the like to the server 20. The server 20 collects information from one or more vehicles 10 to store a congestion section and/or a congestion degree for each lane. Then, the server 20 provides information to the client 31 by using the stored information.
As described above, according to the embodiment, the congestion section and/or the congestion degree of the oncoming lane is determined by using the moving image that the vehicle 10 has actually imaged during traveling. The congestion section and/or the congestion degree of the oncoming lane determined by using the actual moving image is highly accurate information that more closely matches the actual situation in the field, for example, compared to the rough congestion section and the congestion degree indicated in the road traffic information provided from the VICS. Therefore, since the accuracy of the information provided to the client 31 is improved, the technology for providing information regarding congestion on the road is improved.
Next, each configuration of the information processing system 1 will be described in detail.
Configuration of Vehicle
As shown in
The communication unit 11 includes a communication module connected to the network 30. The communication module is compatible with mobile communication standards such as 4th Generation (4G) and 5th Generation (5G), but is not limited thereto, and may be compatible with any communication standard. For example, an on-vehicle communication apparatus such as a data communication module (DCM) may function as the communication unit 11. In the embodiment, the vehicle 10 is connected to the network 30 through the communication unit 11.
The positioning unit 12 includes a receiver compatible with a satellite positioning system. The receiver is compatible with, for example, a global positioning system (GPS), but is not limited thereto, and may be compatible with any satellite positioning system. The positioning unit 12 includes, for example, a gyro sensor and a geomagnetic sensor. For example, a car navigation device may function as the positioning unit 12. In the embodiment, the vehicle 10 acquires the position of a host vehicle and a direction in which the host vehicle is facing by using the positioning unit 12.
The imaging unit 13 includes an in-vehicle camera that images a subject in the field of view and generates a moving image. The moving image includes a plurality of still images captured at a predetermined frame rate (for example, 30 fps). Hereinafter, each of the still images is also referred to as a frame. The in-vehicle camera may be a monocular camera or a stereo camera. The imaging unit 13 is included in the vehicle 10 such that the oncoming lane can be imaged during traveling. For example, an electronic apparatus having a camera function such as a drive recorder or a smartphone used by an occupant may function as the imaging unit 13. In the embodiment, the vehicle 10 uses the imaging unit 13 to acquire a moving image obtained by imaging the oncoming lane during traveling.
The storage unit 14 includes one or more memories. In the embodiment, the “memory” is, for example, a semiconductor memory, a magnetic memory, or an optical memory, but is not limited thereto. Each memory included in the storage unit 14 may function as, for example, a main storage device, an auxiliary storage device, or a cache memory. The storage unit 14 stores predetermined information used for the operation of the vehicle 10. For example, the storage unit 14 may store a system program, an application program, embedded software, road map information and the like. The road map information includes, for example, road link identification information, node identification information, and lane identification information. The information stored in the storage unit 14 may be updatable with, for example, information to be acquired from the network 30 through the communication unit 11.
The controller 15 includes one or more processors. In the embodiment, the “processor” is a general-purpose processor, a dedicated processor specialized for specific processing, or the like, but is not limited thereto. For example, an electronic control unit (ECU) mounted on the vehicle 10 may function as the controller 15. The controller 15 has a time measuring function for grasping the current time. The controller 15 controls the operation of the entire vehicle 10.
For example, the controller 15 uses the imaging unit 13 to acquire a moving image obtained by imaging the oncoming lane during traveling. In each frame of the moving image, for example, an oncoming vehicle B on the oncoming lane A can appear as shown in
In addition, the controller 15 determines at least one of a congestion section and a congestion degree of the oncoming lane based on the acquired moving image. Hereinafter, a method of determining a congestion section and a congestion degree will be specifically described.
In general, a vehicle in congestion has the characteristic of a relatively slow vehicle speed and a relatively short inter-vehicle distance to its following vehicle. Therefore, whether an individual vehicle is in congestion can be detected based on the vehicle speed and the inter-vehicle distance to the following vehicle. The controller 15 detects, from the moving image, oncoming vehicles on the oncoming lane each having a vehicle speed less than a reference speed and an inter-vehicle distance to the following vehicle less than a reference distance, as oncoming vehicles in congestion (congested oncoming vehicles). The reference speed and the reference distance may be determined in advance based on, for example, the results of experiments or simulations, or may be determined dynamically according to the type of road (for example, a general road or a highway), a speed limit, or the like.
It should be noted that any method using a moving image can be employed for detecting the vehicle speed and the inter-vehicle distance of the oncoming vehicle. For example, the controller 15 detects a stationary object, an oncoming vehicle and a vehicle following the oncoming vehicle on the moving image by image recognition. The stationary object is, for example, a streetlight, a roadside tree, a guardrail, a sign, or a signal light installed near the road, but is not limited thereto. Any image recognition algorithm, such as pattern matching, feature point extraction, or machine learning, can be employed for the detection of the stationary object and the oncoming vehicle. The controller 15 sets the position of the host vehicle as the origin and detects, from the moving image, position coordinates of the stationary object, position coordinates of the oncoming vehicle, and position coordinates of the following vehicle by, for example, three-dimensional restoration. The three-dimensional restoration can be performed, for example, using multi-viewpoint images obtained by a motion stereo method using a moving image from a monocular camera or a stereo method using a moving image from a stereo camera. The controller 15 detects the vehicle speed of the oncoming vehicle based on a temporal change in the difference between the position coordinates of the stationary object and the position coordinates of the oncoming vehicle. The controller 15 detects the inter-vehicle distance of the oncoming vehicle based on the difference between the position coordinates of the oncoming vehicle and the position coordinates of the following vehicle.
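The detection described above can be sketched as follows. This is a minimal illustration, not the disclosed implementation: the coordinate frame, helper names, and reference values are assumptions, and a real system would obtain the position coordinates from the three-dimensional restoration described above.

```python
import math

def _dist(p, q):
    """Euclidean distance between two 2-D points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def oncoming_speed_mps(landmark_t0, vehicle_t0, landmark_t1, vehicle_t1, dt_s):
    """Estimate the oncoming vehicle's speed from the temporal change in its
    offset from a stationary object. All coordinates are in the host-vehicle
    frame (origin at the host vehicle), so subtracting the stationary
    object's coordinates cancels the host vehicle's own motion."""
    offset_t0 = (vehicle_t0[0] - landmark_t0[0], vehicle_t0[1] - landmark_t0[1])
    offset_t1 = (vehicle_t1[0] - landmark_t1[0], vehicle_t1[1] - landmark_t1[1])
    return _dist(offset_t1, offset_t0) / dt_s

def inter_vehicle_distance_m(vehicle, follower):
    """Inter-vehicle distance as the difference between the oncoming
    vehicle's and its follower's position coordinates."""
    return _dist(vehicle, follower)

def is_congested(speed_mps, gap_m, ref_speed_mps=5.0, ref_gap_m=10.0):
    """An oncoming vehicle is treated as congested when both its speed and
    its gap to the following vehicle fall below the references (the default
    reference values here are placeholders, not values from the disclosure)."""
    return speed_mps < ref_speed_mps and gap_m < ref_gap_m
```

For example, if the host vehicle advances 5 m in one second while a landmark and an oncoming vehicle both appear 5 m closer per frame, the landmark subtraction attributes only the residual motion to the oncoming vehicle.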
The controller 15 determines the congestion section of the oncoming lane based on, among the frames included in the moving image, the imaging position (first imaging position) of the frame captured when the host vehicle passes by the first congested oncoming vehicle and the imaging position (second imaging position) of the frame captured when the host vehicle passes by the last congested oncoming vehicle. Specifically, the controller 15 determines the congestion section by regarding the first imaging position and the second imaging position as the head position and the tail position of the congestion section, respectively.
The controller 15 determines the congestion degree of the oncoming lane based on the vehicle speed of the congested oncoming vehicles. Specifically, the controller 15 determines that the congestion degree of the oncoming lane is higher as the vehicle speed of a congested oncoming vehicle, or the average vehicle speed of two or more congested oncoming vehicles, is slower. The congestion degree may be indicated by a grade (for example, "low", "medium", and "high") or by a numerical value.
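The two determinations above can be sketched as follows. The grade boundaries and the representation of an imaging position as a coordinate pair are illustrative assumptions; the disclosure does not specify them.

```python
def congestion_section(first_imaging_pos, second_imaging_pos):
    """The first imaging position (frame at the first congested oncoming
    vehicle) is regarded as the head of the congestion section, and the
    second imaging position (frame at the last one) as the tail."""
    return {"head": first_imaging_pos, "tail": second_imaging_pos}

def congestion_degree(speeds_kmh):
    """The slower the average speed of the congested oncoming vehicles,
    the higher the congestion degree; the thresholds are placeholders."""
    avg = sum(speeds_kmh) / len(speeds_kmh)
    if avg < 10.0:
        return "high"
    if avg < 20.0:
        return "medium"
    return "low"
```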
In addition, the controller 15 refers to the road map information stored in the storage unit 14 and specifies the lane identification information of the oncoming lane. The controller 15 transmits, to the server 20 through the communication unit 11, at least one of the lane identification information of the oncoming lane, a time slot to which the imaging time of the moving image belongs (for example, 12:10 to 12:20), and the congestion section and the congestion degree of the oncoming lane. The controller 15 may further transmit, to the server 20, the above-mentioned frames, that is, the frame (head image) captured when the host vehicle passes by the first congested oncoming vehicle and the frame (tail image) captured when it passes by the last congested oncoming vehicle.
Configuration of Server
As shown in
The server communication unit 21 includes a communication module connected to the network 30. The communication module is compatible with, for example, a wired local area network (LAN) standard, but is not limited thereto, and may be compatible with any communication standard. In the embodiment, the server 20 is connected to the network 30 through the server communication unit 21.
The server storage unit 22 includes one or more memories. Each memory included in the server storage unit 22 may function as, for example, a main storage device, an auxiliary storage device, or a cache memory. The server storage unit 22 stores predetermined information used for the operation of the server 20. For example, the server storage unit 22 may store a system program, an application program, a database, road map information and the like. The information stored in the server storage unit 22 may be updatable with, for example, information to be acquired from the network 30 through the server communication unit 21.
The server controller 23 includes one or more processors. The server controller 23 has a time measuring function for grasping the current time. The server controller 23 controls the operation of the entire server 20.
For example, the server controller 23 receives, from the vehicle 10 through the server communication unit 21, at least one of the lane identification information of the oncoming lane, a time slot to which the imaging time of the moving image belongs, and the congestion section and the congestion degree of the oncoming lane. The server controller 23 may further receive, from the vehicle 10, the head image and tail image described above. The server controller 23 stores the received information in the server storage unit 22. Here, the server controller 23 may collect information from a plurality of vehicles 10 and store (accumulate) the collected information in the server storage unit 22. For example, as shown in
Then, the server controller 23 provides information to the client 31 by using the information stored in the server storage unit 22. The provision of information may be performed, for example, in response to the request from the client 31 (for example, pull distribution), or may be automatically performed by the server controller 23 (for example, push distribution). The provision of information may be performed by a web application stored in the server storage unit 22. The provision of information may include providing the information stored in the server storage unit 22 as it is or after processing, or may include providing any information newly generated by using the information stored in the server storage unit 22.
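The server-side storage and provision could look like the following in-memory sketch, keyed by lane identification information and time slot. The class and method names are assumptions for illustration; an actual server 20 would use the database in the server storage unit 22.

```python
from collections import defaultdict

class CongestionStore:
    """Accumulates reports from vehicles 10, keyed by lane identification
    information and the time slot of the moving image's imaging time."""

    def __init__(self):
        self._records = defaultdict(list)

    def store(self, lane_id, time_slot, section=None, degree=None,
              head_image=None, tail_image=None):
        # Reports from a plurality of vehicles 10 accumulate under one key.
        self._records[(lane_id, time_slot)].append({
            "section": section, "degree": degree,
            "head_image": head_image, "tail_image": tail_image,
        })

    def provide(self, lane_id, time_slot):
        """Pull distribution: return the stored reports for the requested
        lane and time slot (push distribution would send them unprompted)."""
        return list(self._records.get((lane_id, time_slot), []))
```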
For example,
Operation Flow of Vehicle
An operation flow of the vehicle 10 will be described with reference to
Step S100: the controller 15 acquires a moving image obtained by imaging the oncoming lane during traveling, and the imaging time and imaging position of the moving image.
Step S101: the controller 15 determines the congestion section of the oncoming lane based on the moving image.
Step S102: the controller 15 determines the congestion degree of the oncoming lane based on the moving image.
Step S103: the controller 15 transmits, to the server 20 through the communication unit 11, the lane identification information of the oncoming lane, a time slot to which the imaging time of the moving image belongs, and the congestion section and the congestion degree of the oncoming lane. Here, the controller 15 may further transmit, to the server 20, the above-mentioned frames, that is, the frame (head image) captured when the host vehicle passes by the first congested oncoming vehicle and the frame (tail image) captured when it passes by the last congested oncoming vehicle.
Operation Flow of Server
An operation flow of the server 20 will be described with reference to
Step S200: the server controller 23 receives, from the vehicles 10 through the server communication unit 21, at least one of the lane identification information of the oncoming lane, a time slot to which the imaging time of the moving image belongs, and the congestion section and the congestion degree of the oncoming lane. The server controller 23 may further receive, from the vehicle 10, the head image and tail image described above.
Step S201: the server controller 23 stores the information received from the vehicles 10 in the server storage unit 22. Here, the server controller 23 may collect information from the vehicles 10 and store (accumulate) the collected information in the server storage unit 22.
Step S202: the server controller 23 provides information to the client 31 by using the information stored in the server storage unit 22.
As described above, in the information processing system 1 according to the embodiment, the vehicle 10 acquires a moving image obtained by imaging an oncoming lane during traveling and determines at least one of a congestion section and a congestion degree of the oncoming lane based on the moving image. The server 20 stores at least one of the congestion section and the congestion degree of the oncoming lane and provides information to the client 31 by using the stored information. The congestion section and/or the congestion degree of the oncoming lane, which is determined by using the moving image actually imaged by the vehicle 10 during traveling, is highly accurate information that more closely matches the actual situation in the field, for example, compared to the rough congestion section and the congestion degree indicated in the road traffic information provided from the VICS. Therefore, since the accuracy of the information provided to the client 31 is improved, the technology for providing information regarding congestion on the road is improved.
The disclosure has been described based on the drawings and the examples, but it is to be noted that those skilled in the art may easily make various modifications and changes based on this disclosure. Therefore, it is to be noted that these modifications and changes are included in the scope of the disclosure. For example, the functions and the like included in each unit, each step, or the like can be rearranged so as not to be logically contradictory, and a plurality of units, steps, or the like can be combined into one or divided.
For example, in the embodiment described above, some processing operations executed in the vehicle 10 may be executed in the server 20, and some processing operations executed in the server 20 may be executed in the vehicle 10. For example, the processing for determining a congestion section and a congestion degree may be executed by the server 20 instead of the vehicle 10.
Further, in the embodiment described above, the vehicle 10 may receive road traffic information indicating an estimated congestion section of the oncoming lane, for example, from the VICS center or the like via the network 30, and start imaging the oncoming lane when the estimated congestion section is reached. With such a configuration, the vehicle 10 does not need to image the oncoming lane at all times, and as a result, the processing burden on the vehicle 10 is reduced.
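The imaging trigger in this modification could be sketched as follows, assuming positions have been reduced to scalar distances along the road (a simplification of the actual map matching) and an assumed lead margin so imaging begins slightly before the section:

```python
def should_start_imaging(host_pos_m, section_start_m, section_end_m,
                         margin_m=100.0):
    """Start imaging when the host vehicle reaches the congestion section
    estimated in the received road traffic information; the margin is an
    assumed lead distance, not a value from the disclosure."""
    return section_start_m - margin_m <= host_pos_m <= section_end_m
```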
A general-purpose information processing device, such as a smartphone or a computer, can be configured to function as each unit provided in the vehicle 10 or the server 20 according to the embodiment described above. Specifically, a program in which the processing contents for realizing each function of the vehicle 10, the server 20, or the like according to the embodiment are described is stored in a memory of the information processing device, and a processor of the information processing device reads and executes the program. Therefore, the disclosure according to the embodiment can also be realized as a program that can be executed by the processor.
Number | Date | Country | Kind |
---|---|---|---|
JP2018-234146 | Dec 2018 | JP | national |
Number | Name | Date | Kind |
---|---|---|---|
20020128770 | Ooishi | Sep 2002 | A1 |
20100114465 | Kim | May 2010 | A1 |
20120033123 | Inoue | Feb 2012 | A1 |
20120209510 | Ikawa | Aug 2012 | A1 |
20120271544 | Hein | Oct 2012 | A1 |
20130124073 | Ren | May 2013 | A1 |
20160069703 | Nakano | Mar 2016 | A1 |
20160217333 | Ozawa | Jul 2016 | A1 |
20180181139 | Ishii | Jun 2018 | A1 |
20180196443 | Bai | Jul 2018 | A1 |
20180326996 | Fujisawa | Nov 2018 | A1 |
20190049253 | Kitamura | Feb 2019 | A1 |
20190186929 | Iwata | Jun 2019 | A1 |
20190186945 | Choi | Jun 2019 | A1 |
20190189004 | Suzuki | Jun 2019 | A1 |
20190219413 | Prakah-Asante | Jul 2019 | A1 |
20200019173 | Chen | Jan 2020 | A1 |
20200058219 | Hassani | Feb 2020 | A1 |
20200124423 | Jiang | Apr 2020 | A1 |
20200124435 | Edwards | Apr 2020 | A1 |
20200406753 | Hayashi | Dec 2020 | A1 |
Number | Date | Country |
---|---|---|
2014-228434 | Dec 2014 | JP |
Number | Date | Country
---|---|---
20200193810 A1 | Jun 2020 | US