This application claims the benefit of Japanese Patent Application No. 2022-173825, filed on Oct. 28, 2022, which is hereby incorporated by reference herein in its entirety.
The present disclosure relates to an information processing method and an information processing apparatus.
Conventionally, there have been technologies for notifying surrounding vehicles of information on the position and the predicted traveling course of a host vehicle through vehicle-to-vehicle communication so as to identify a risky vehicle (see, for example, Japanese Patent Application Laid-open No. 2008-90663). Further, there have been technologies for matching video from a video camera with data from a motion sensor to identify a person (see, for example, U.S. Patent Application Publication No. 2015/0085111). Further, there have been technologies for constantly collecting information on adjacent sensors to ascertain position information on each sensor in a wireless sensor network (see, for example, U.S. Pat. No. 9,351,124).
The present disclosure has an object of providing an information processing method and an information processing apparatus for suitably obtaining information to perform vehicle-to-vehicle communication.
An aspect of the present disclosure provides an information processing method including, by an information processing apparatus: receiving, from a first in-vehicle terminal installed in a first vehicle, first information relating to a second vehicle acquired using a sensor; specifying communication identification information on the second vehicle based on the first information and second information indicating a characteristic of a surrounding vehicle present around the first vehicle; and sending the communication identification information to the first in-vehicle terminal.
Further, an aspect of the present disclosure provides an information processing apparatus including a controller that executes: receiving, from a first in-vehicle terminal installed in a first vehicle, first information relating to a second vehicle acquired by using a sensor; specifying communication identification information on the second vehicle based on the first information and second information indicating a characteristic of a surrounding vehicle present around the first vehicle; and sending the communication identification information to the first in-vehicle terminal.
Further, an aspect of the present disclosure provides an information processing method including executing, by a first in-vehicle terminal installed in a first vehicle: sending first information relating to a second vehicle acquired by using a sensor; and when receiving communication identification information on the second vehicle from an information processing apparatus that has received the first information, communicating with a second in-vehicle terminal installed in the second vehicle using the communication identification information.
An aspect of the present disclosure may include at least one of an information processing apparatus, an information processing system, a program, and a recording medium on which the program is recorded, each of which has the same characteristics as those of the information processing method.
According to the present disclosure, it is possible to suitably obtain information for performing vehicle-to-vehicle communication.
An information processing method according to an embodiment includes the features described below.
The second information may include position information on the surrounding vehicle. The second information may include information on at least one of a shape, a type, a color, and matters written on a license plate of the surrounding vehicle.
Further, the sensor used to acquire the first information may be a sensor installed in the first vehicle. Alternatively, the sensor used to acquire the first information may be a sensor provided in a transportation infrastructure used by the first vehicle.
Further, the information processing apparatus may specify the communication identification information on the second vehicle based on the first information, the second information, and third information acquired by a base station.
Further, the communication identification information on the second vehicle may be used by the first in-vehicle terminal to communicate with a second in-vehicle terminal installed in the second vehicle.
Further, an information processing method according to the embodiment includes: sending, by a first in-vehicle terminal installed in a first vehicle, first information relating to a second vehicle acquired using a sensor; and communicating, by the first in-vehicle terminal, with a second in-vehicle terminal installed in the second vehicle using communication identification information on the second vehicle when receiving the communication identification information from an information processing apparatus that has received the first information.
The information processing method may further include: specifying, by the first in-vehicle terminal, the second vehicle based on information indicating a surrounding area of the first vehicle. Further, the first in-vehicle terminal may adjust an operation planned by the first vehicle through communication with the second in-vehicle terminal. Further, when adjusting the operation, the first in-vehicle terminal may adjust at least one of a timing at which the first vehicle performs the operation and a traveling speed of the first vehicle.
The first in-vehicle terminal may suspend or interrupt automatic driving assistance with respect to the first vehicle when not receiving the communication identification information. Further, the first in-vehicle terminal may decrease an automatic driving assistance level with respect to the first vehicle when not receiving the communication identification information.
Hereinafter, an information processing apparatus according to an embodiment will be described with reference to the drawings. The configuration of the embodiment is given only as an example, and the information processing apparatus is not limited to the configuration of the embodiment.
The network 1 is, for example, a 5G cellular network (5G core network: 5GC). However, the network 1 may be a network other than the 5GC, for example, a public communication network such as the Internet, a wide area network (WAN), or other communication networks. The network 1 may include a wireless network (wireless path) such as a wireless local area network (LAN), including Wi-Fi, or Bluetooth Low Energy (BLE).
A vehicle A is an example of a “first vehicle,” and is an automatic driving vehicle. However, the vehicle A may be a vehicle capable of being driven by a person. A vehicle B is an example of a vehicle (called a surrounding vehicle) present around the vehicle A. The surrounding vehicle is, for example, a vehicle present within a prescribed distance from the center of the vehicle A. Each of the vehicle A and the vehicle B has the in-vehicle terminal 4 (see the figure).
The network 1 serving as the 5GC includes, as entities present in the 5GC, an access and mobility management function (AMF) 11, a unified data management (UDM) 12, a network data analytics function (NWDAF) 13, and a gateway mobile location centre (GMLC) and location management function (LMF) 14.
The AMF 11 is a device that accommodates user equipment (UE: terminals) within its zone in the 5GC. The NWDAF 13 has the function of collecting and analyzing data from the respective NFs and operation, administration and maintenance (OAM) systems. The UDM 12 provides subscriber information, and acquires, registers, deletes, and changes the status of the UE. The GMLC is a gateway node that acquires the latitude and longitude of a mobile terminal from any positioning system, and exchanges information on the acquired latitude and longitude with the outside. The LMF is a network function (NF) responsible for communication and control relating to the location information service stipulated in the 5GC. The GMLC/LMF 14 may include either one of the GMLC and the LMF.
In addition, the network 1 includes a sensing function (SF) 2. The SF 2 receives sensing information (sensing data: an example of first information) relating to the vehicle B from the vehicle A. The SF 2 has the function of analyzing the shape, the distance, or the speed of a target object from the sensing information (the sensing information indicates the receiving results of various sensors, such as a wireless receiver, a radar, a LiDAR, and a camera). According to the sensing information, the SF 2 exchanges information with the AMF 11, the UDM 12, the NWDAF 13, and the GMLC/LMF 14, or with the external server 3 (an example of a second server) connected to the network 1, and determines communication ID information (an example of communication identification information) on the vehicle B that is a communication target. The communication ID information may include, for example, a subscription permanent identifier (SUPI), an IP address, and a MAC address. However, the communication ID information is not limited to the above information so long as the acquisition of the communication ID information enables the vehicle A to communicate with the vehicle B. The SF 2 may be implemented as a new NF or as a part of an existing NF such as the GMLC/LMF 14.
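As an illustration only, the communication ID information described above can be modeled as a small record in which any one identifier suffices to start vehicle-to-vehicle communication. The class and field names below are assumptions for the sketch, not terms of the specification.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class CommunicationId:
    """Communication ID information on a target vehicle (e.g., the vehicle B).

    Each field is optional because, per the description above, any one of the
    identifiers may be enough for the vehicle A to reach the vehicle B.
    """
    supi: Optional[str] = None         # subscription permanent identifier
    ip_address: Optional[str] = None
    mac_address: Optional[str] = None

    def is_usable(self) -> bool:
        # Usable as long as at least one identifier is known.
        return any((self.supi, self.ip_address, self.mac_address))
```

A record carrying only an IP address, for example, would still count as usable under this model.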
Each of the AMF 11, the UDM 12, the NWDAF 13, the GMLC/LMF 14, and the SF 2 is a function realized when one or two or more computers (information processing apparatuses) run a program.
At this time, the in-vehicle terminal 4A of the vehicle A notifies the in-vehicle terminal 4B of the vehicle B of a lane change request (<1> in the figure).
The above vehicle-to-vehicle communication is enabled when a communication ID relating to the vehicle B is known to the vehicle A (in-vehicle terminal 4A). The vehicle A has a sensor group 51 (see the figure).
The server device 20 is communicable with the in-vehicle terminals 4 via the base station 30 through its communicating function. The server device 20 may be connected to the network 1 in a wired or wireless fashion. The server device 20 may be a stationary terminal or a mobile terminal.
The server device 20 includes a processor 21 serving as a processing unit or a control unit (controller), a storage device 22, a communication interface 23 (communication IF 23), an input device 24, and a display 25, all of which are connected to one another via a bus 26.
The storage device 22 includes a main storage device and an auxiliary storage device. The main storage device is used as at least one of a program and data storage area, a program development area, a program work area, a communication data buffer area, and the like. The main storage device includes a random access memory (RAM) or a combination of a RAM and a read-only memory (ROM). The auxiliary storage device is used as a data and program storage area. A non-volatile storage medium is used as the auxiliary storage device. The non-volatile storage medium is, for example, a hard disk, a solid-state drive (SSD), a flash memory, an electrically-erasable programmable read-only memory (EEPROM), or the like. Further, the storage device 22 may include a drive device for a disk recording medium.
The communication IF 23 is a circuit that performs communication processing. The communication IF 23 is, for example, a network interface card (NIC). Further, the communication IF 23 may be a wireless communication circuit that performs wireless communication (such as 5G, wireless LAN (Wi-Fi), and BLE). Further, the communication IF 23 may be a combination of a circuit that performs wired communication processing and a wireless communication circuit.
The input device 24 includes a key, a button, a pointing device, a touch panel, or the like, and is used to input information. The display 25 is, for example, a liquid-crystal display or the like, and displays information and data.
The processor 21 runs various programs stored in the storage device 22 to perform various processing. For example, the processor 21 is operable as each of the AMF 11, the UDM 12, the NWDAF 13, the GMLC/LMF 14, the SF 2, and the external server 3.
The in-vehicle terminal 4 acquires sensing information (sensing data) from the sensor group 51 provided in a vehicle (for example, the vehicle A). The sensor group 51 may include, for example, at least one of a wireless receiver, a radar, a LiDAR, and a camera. The sensing information indicates, for example, the sensing results (the intensity of reflected waves) of a radar and a LiDAR, and the electric-field intensity of radio waves from the base station 30 received by a wireless receiver. Further, the sensing information may represent an image (including a moving image (video)) captured by a camera. The in-vehicle terminal 4 is capable of assisting automatic driving, including a lane change, by giving a control signal to an automatic driving mechanism provided in the vehicle A. Further, the in-vehicle terminal 4 is capable of increasing and decreasing an automatic driving level according to whether communication is possible.
The processors 21 and 41 are, for example, central processing units (CPUs). The CPUs are also called microprocessor units (MPUs). The processor 21 may have a single-processor configuration or a multi-processor configuration. Further, a single physical CPU connected through a single socket may have a multi-core configuration. The processors 21 and 41 may include computation devices having various circuit configurations such as a digital signal processor (DSP) and a graphics processing unit (GPU). Further, the processors 21 and 41 may have a configuration to cooperate with at least one of an integrated circuit (IC), other digital circuits, an analog circuit, and the like. The IC includes an LSI, an application-specific integrated circuit (ASIC), a programmable logic device (PLD), or the like. The PLD includes, for example, a field-programmable gate array (FPGA). The processors 21 and 41 also include, for example, those called a microcontroller unit (MCU), a system-on-a-chip (SoC), a system LSI, a chipset, or the like.
In step S001, the processor 21 operating as the SF 2 receives the sensing data relating to the vehicle B sent from the in-vehicle terminal 4A of the vehicle A.
In step S002, the processor 21 determines the type of the sensing data. When the type of the sensing data indicates the receiving results of various sensors (the electric-field intensity of radio waves or the intensity of reflected waves of a radar or a LiDAR), the processing proceeds to step S003. When the type of the sensing data indicates an image (including video), the processing proceeds to step S006.
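The branch in step S002 can be sketched as follows. The function names and the two handlers are hypothetical placeholders for the processing of steps S003 and S006, and the type labels are assumptions for the sketch.

```python
def handle_receiver_results(data: dict) -> str:
    # Placeholder for step S003: estimate the distance, direction,
    # or speed of the target vehicle within the network.
    return "analyze_locally"

def handle_image(data: dict) -> str:
    # Placeholder for step S006: forward the image to the external server.
    return "forward_to_external_server"

def dispatch_sensing_data(data: dict) -> str:
    """Route sensing data, as in step S002, according to its type."""
    kind = data.get("type")
    if kind in ("radio", "radar", "lidar"):
        return handle_receiver_results(data)
    if kind in ("image", "video"):
        return handle_image(data)
    raise ValueError(f"unknown sensing data type: {kind!r}")
```

Receiver-type data thus stays in the network-side path, while image data takes the external-server path.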
In step S003, the processor 21 determines the distance, the direction, the speed, or the like of the vehicle B from the sensing data.
In step S004, the processor 21 maps the distance, the direction, and the speed of the vehicle B determined from the sensing data with respect to a list of candidates for the vehicle B to determine the vehicle B. Here, as a method for determining the vehicle B, any of the following methods 1 to 5 or a combination of the methods is considered.
In step S005, the processor 21 maps (compares) the list of candidates for the vehicle B with the distance, the direction, and the speed of the vehicle B determined from the sensing data to determine the vehicle B. Then, the processing proceeds to step S007.
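One way to realize the mapping of steps S004 and S005 is a nearest-match search over the candidate list. The sketch below uses assumed field names and a simple absolute-error metric; it is not the method fixed by the embodiment, whose methods 1 to 5 are described separately.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Measurement:
    """State of the vehicle B as determined from the sensing data."""
    distance_m: float    # distance from the vehicle A
    bearing_deg: float   # direction seen from the vehicle A
    speed_mps: float

@dataclass
class Candidate(Measurement):
    """One surrounding-vehicle entry in the candidate list."""
    comm_id: str = ""    # e.g., a SUPI or IP address of the candidate

def match_vehicle_b(measured: Measurement,
                    candidates: list[Candidate],
                    max_error: float = 10.0) -> Optional[Candidate]:
    """Return the candidate whose state is closest to the measured state,
    or None when no candidate is within the allowed total error."""
    best: Optional[Candidate] = None
    best_err = max_error
    for c in candidates:
        err = (abs(c.distance_m - measured.distance_m)
               + abs(c.bearing_deg - measured.bearing_deg)
               + abs(c.speed_mps - measured.speed_mps))
        if err < best_err:
            best, best_err = c, err
    return best
```

Returning None models the case where the vehicle B cannot be determined, after which the vehicle-side fallback of step S107 would apply.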
When the processing proceeds to step S006, the processor 21 transfers the sensing data (image) received from the vehicle A to the external server 3 (second server), and acquires (receives) information on the vehicle B from the external server 3 (see <4> in the figure).
In step S007, the processor 21 sends communication ID information (such as a SUPI, an IP address, and a MAC address) on the vehicle B included in, for example, the list of candidates for the vehicle B to the vehicle A (see <6> in the figure).
In step S202, the processor 21 analyzes an image based on the image data, and extracts information on at least one of the shape, the color, the matters written on a license plate, and the type of the vehicle B.
In step S203, the processor 21 determines information on the vehicle B from vehicle information managed by the external server 3. In step S204, the processor 21 sends the information on the vehicle B to the SF 2. The information on the vehicle B includes communication ID information on the vehicle B.
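Steps S202 to S204 amount to a lookup of the attributes extracted from the image against the vehicle information managed by the external server 3. The dictionary-based table and field names below are hypothetical; a real server would hold this data in a database.

```python
from typing import Optional

# Hypothetical vehicle information table held by the external server 3:
# per-vehicle attributes mapped to communication ID information.
VEHICLE_DB = [
    {"plate": "ABC-1234", "color": "white", "type": "sedan",
     "comm_id": "supi-001"},
    {"plate": "XYZ-9876", "color": "blue", "type": "truck",
     "comm_id": "supi-002"},
]

def lookup_vehicle(attrs: dict) -> Optional[str]:
    """Return the communication ID of the first record matching every
    attribute extracted from the image (plate, color, type, ...)."""
    for record in VEHICLE_DB:
        if all(record.get(key) == value for key, value in attrs.items()):
            return record["comm_id"]
    return None
```

A partial attribute set (for example, only the license plate) is enough when it narrows the table to one record, which is consistent with extracting "at least one of" the attributes in step S202.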
In step S102, the processor 41 detects, through sensing with the sensor group 51, the presence of the vehicle B behind the vehicle A in the destination lane of the lane change, and makes a plan to negotiate the lane change with the vehicle B.
In step S103, the processor 41 sends sensing data relating to the vehicle B to the network 1 (5GC), and requests the provision of communication ID information on the vehicle B.
In step S104, the processor 41 determines whether the communication ID information on the vehicle B has been received (acquired) from the network 1 (SF 2). The processing proceeds to step S105 when it is determined that the communication ID information has been acquired. Otherwise, the processing proceeds to step S107.
In step S105, the processor 41 communicates with the in-vehicle terminal 4B of the vehicle B using the communication ID information on the vehicle B, and notifies the vehicle B of a lane change request.
In step S106, the processor 41 receives a reply indicating the permission of the lane change of the vehicle A from the in-vehicle terminal 4B, and gives a control signal for assisting automatic driving with respect to the lane change to the automatic driving mechanism 52 when confirming, using the sensor group 51, the deceleration of the vehicle B. The automatic driving mechanism 52 operates according to the control signal to change the lane of the vehicle A. Thus, the vehicle A is enabled to properly avoid the parked vehicle D as illustrated in the figure.
In step S107, the processor 41 decreases the automatic driving level (automatic driving assistance level) of the host vehicle (vehicle A) (for example, from level 3 to level 2). Alternatively, the processor 41 temporarily suspends or interrupts the planned automatic lane change operation.
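The vehicle-side decision of steps S104 to S107 can be sketched as the following function. The names and the level numbers are illustrative (the embodiment gives level 3 to level 2 only as an example), and a real terminal would act on the vehicle rather than return labels.

```python
from typing import Optional

def on_id_response(comm_id: Optional[str],
                   current_level: int) -> tuple[str, int]:
    """Decide the next action of the in-vehicle terminal 4A.

    Returns an (action, automatic-driving-level) pair. When the communication
    ID information is available, the terminal negotiates the lane change
    (step S105); otherwise it lowers the assistance level or suspends the
    planned operation (step S107).
    """
    if comm_id is not None:
        return ("notify_lane_change_request", current_level)
    if current_level > 2:
        # Decrease the automatic driving assistance level by one step.
        return ("continue_with_reduced_assistance", current_level - 1)
    # Already at a low level: suspend the planned lane change instead.
    return ("suspend_planned_lane_change", current_level)
```

As noted below, which fallback the terminal selects in step S107 is a settable condition; this sketch simply picks one ordering.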
The SF 2 may perform only one of the following: the determination of the vehicle B and the acquisition of its communication ID information through the exchange of information with the NFs (the AMF 11, the UDM 12, the NWDAF 13, and the GMLC/LMF 14) in the network 1; and the determination of the vehicle B and the acquisition of its communication ID information through the exchange of information with the external server 3. Further, the SF 2 may optionally acquire information from the base station 30. Further, the condition under which the processor 41 selects the decrease in the level or the suspension of the operation described in step S107 is settable as appropriate. One of the decrease in the level and the suspension of the operation may also be omitted. Further, in step S107, an operation planned by the vehicle A may be adjusted. That is, at least one of a timing at which the vehicle A performs the operation and the traveling speed of the vehicle A may be adjusted.
According to the embodiment, the server device 20 (information processing apparatus) operating as the SF 2 receives sensing data (first information) relating to the vehicle B (second vehicle) acquired using the sensor group 51 (sensor) from the in-vehicle terminal 4A (first in-vehicle terminal) installed in the vehicle A (first vehicle). The server device 20 specifies communication ID information (communication identification information) on the vehicle B (second vehicle) based on the sensing data (first information) and a list of terminals as candidates for the in-vehicle terminal 4B installed in the vehicle B (second information indicating the characteristics of surrounding vehicles present around the vehicle A (first vehicle)). Then, the server device 20 sends the communication ID information to the in-vehicle terminal 4A.
Further, the in-vehicle terminal 4A (first in-vehicle terminal) installed in the vehicle A (first vehicle) sends the sensing data (first information) relating to the vehicle B (second vehicle) acquired using the sensor group 51 (sensor). Further, when receiving the communication ID information on the vehicle B (second vehicle) from the server device 20 (information processing apparatus) operating as the SF 2 having received the sensing data (first information), the in-vehicle terminal 4A communicates with the in-vehicle terminal 4B (second in-vehicle terminal) installed in the vehicle B (second vehicle) using the communication ID information.
Accordingly, even when the communication ID information on the vehicle B is unknown at the time the vehicle A desires to communicate with the vehicle B, the vehicle A is enabled to suitably acquire the communication ID information corresponding to the vehicle B by sending the sensing data relating to the vehicle B to the network 1. Then, the in-vehicle terminal 4A of the vehicle A is enabled to properly perform an automatic driving operation including a lane change or the like through communication with the in-vehicle terminal 4B.
The above embodiment and the modified examples are given only as an example, and the present disclosure may be appropriately changed and performed without departing from its gist. Further, the processing or means described in the present disclosure may be freely combined together and performed unless any technological contradiction arises.
Further, the processing described as being performed by one device may be cooperatively performed by a plurality of devices. Alternatively, the processing described as being performed by different devices may be performed by one device. It is possible to flexibly change a hardware configuration (server configuration) to realize respective functions in a computer system.
The present disclosure is realizable in such a manner that a computer program implementing the functions described in the above embodiment is supplied to a computer, and that one or more processors provided in the computer read and perform the program. Such a computer program may be provided to a computer by a non-transitory computer-readable storage medium connectable to the system bus of the computer, or may be provided to the computer via a network. The non-transitory computer-readable storage medium includes, for example, any type of disk such as a magnetic disk (such as a floppy (TM) disk and a hard disk drive (HDD)) and an optical disk (such as a CD-ROM, a DVD, and a Blu-ray disc), and any type of medium for storing electronic instructions such as a read-only memory (ROM), a random access memory (RAM), an EPROM, an EEPROM, a magnetic card, a flash memory, and an optical card.
Number | Date | Country | Kind |
---|---|---|---|
2022-173825 | Oct 2022 | JP | national |