This application claims priority to Japanese Patent Application No. 2023-215097, filed on Dec. 20, 2023, the entire contents of which are incorporated herein by reference.
The present disclosure relates to a method of operating an information processing apparatus, an information processing apparatus, and a medium.
In the design phase of city blocks, technology is known for simulating traffic volume and the like with information processing apparatuses in order to study the possibility of traffic congestion and other issues in the city blocks. For example, Patent Literature (PTL) 1 discloses a system for simulating an operation plan for public transportation vehicles.
In the simulation of vehicle operation in a city block, a variety of situations can be assumed, such as the presence or absence of objects, e.g., pedestrians and other vehicles, in a travel route of a vehicle and the movement states of those objects. Simulating the vehicle operation for each different situation requires enormous amounts of information processing and time. Therefore, it is desirable to reduce processing time while ensuring accuracy in the simulation.
The present disclosure provides a method of operating an information processing apparatus and the like that can reduce processing time while ensuring accuracy in the simulation of vehicle operation in a city block.
A method of operating an information processing apparatus according to the present disclosure includes:
An information processing apparatus according to the present disclosure includes:
In a non-transitory computer readable medium storing a program according to the present disclosure, the program is configured to cause an information processing apparatus to execute operations, the operations including:
The method of operating an information processing apparatus and the like according to the present disclosure can reduce processing time while ensuring accuracy in the simulation of vehicle operation in a city block.
In the accompanying drawings:
An embodiment will be described below.
The server apparatus 10, which corresponds to an “information processing apparatus” according to the present embodiment, executes information processing for the simulation in city block design upon receiving instructions from the terminal apparatus 12. The simulation is, for example, a virtual simulation of the travel of a vehicle operated on streets in a city block of a so-called smart city or the like, under various situations. The vehicle is, for example, a commercial vehicle, such as a bus or a truck, the driving of which is automated at any level, such as one of Level 1 to Level 5 defined by the Society of Automotive Engineers (SAE). The various situations (hereinafter referred to as “surrounding situations”) include movement patterns of objects, such as pedestrians, bicycles, and other vehicles, that intersect a travel route of the vehicle, states of blind spots from the vehicle on the streets, and the like. In the simulation, the vehicle is virtually operated based on control operation that is executed according to the surrounding situations by a travel control program installed in the vehicle. In the present embodiment, the simulation is executed in two stages. In other words, a method of operating the server apparatus 10 includes a simple simulation process that virtually operates the vehicle in a plurality of pseudo surrounding situations using a mimic program, the mimic program executing, according to a part of the surrounding situations of the surrounding environment in which the vehicle travels, a part of the control operation performed by the travel control program for controlling the travel of the vehicle according to the surrounding situations, and a selection process that extracts, from the plurality of pseudo surrounding situations, one or more pseudo surrounding situations in which the vehicle and an object exhibit a predetermined state, as targets of a detailed simulation process by the travel control program. In the detailed simulation process, the control operation by the travel control program is executed in the pseudo surrounding situations (hereinafter referred to as “sample surrounding situations”) extracted in the selection process.
According to the present embodiment, the simple simulation process is executed by the mimic program, which simulates the control operation by the travel control program in a limited manner and hence has a smaller processing load than the travel control program. As a result of the simple simulation process, the sample surrounding situations, in which the vehicle and the object exhibit the predetermined state, e.g., the closest approach distance between the vehicle and the object is less than a reference distance, are extracted. Then, the detailed simulation process by the travel control program is executed in the sample surrounding situations. By executing the simple simulation process by the mimic program based on the travel control program, the sample surrounding situations can be extracted while accuracy in the simulation is maintained to some degree. By executing the detailed simulation process for the sample surrounding situations, the simulation can be executed with a smaller overall processing load than when detailed simulation is executed for all the pseudo surrounding situations. Therefore, it is possible to reduce processing time while ensuring accuracy in the simulation of vehicle operation in the city block.
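The following is a minimal sketch of the two-stage flow described above. The function and variable names (run_mimic, run_travel_control, the 2.0 m reference distance, and the data layout) are hypothetical and serve only as an illustration of the simple simulation, selection, and detailed simulation processes; they are not part of the present disclosure.

```python
# Sketch of the two-stage simulation flow (assumed names and data layout).

REFERENCE_DISTANCE_M = 2.0  # example threshold for the "predetermined state"

def two_stage_simulation(pseudo_situations, run_mimic, run_travel_control):
    """Run the simple simulation for every pseudo surrounding situation,
    extract the situations in which the closest approach falls below the
    reference distance, and run the detailed simulation only for those."""
    # Stage 1: simple simulation process by the mimic program.
    simple_results = {
        situation_id: run_mimic(situation)
        for situation_id, situation in pseudo_situations.items()
    }

    # Selection process: keep situations in which the vehicle and an object
    # come closer than the reference distance (the "predetermined state").
    sample_ids = [
        sid for sid, result in simple_results.items()
        if result["closest_approach_m"] < REFERENCE_DISTANCE_M
    ]

    # Stage 2: detailed simulation process by the travel control program,
    # executed only for the extracted sample surrounding situations.
    return {sid: run_travel_control(pseudo_situations[sid]) for sid in sample_ids}
```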
Next, configurations of the server apparatus 10 and the terminal apparatus 12 will be described.
The server apparatus 10 includes a communication interface 101, a memory 102, a controller 103, an input interface 105, and an output interface 106. When the server apparatus 10 is configured with two or more server computers, these components are appropriately arranged in the two or more server computers.
The communication interface 101 includes one or more interfaces for communication. The interface for communication is, for example, a LAN interface. The communication interface 101 receives information to be used for operations of the server apparatus 10 and transmits information obtained by the operations of the server apparatus 10. The server apparatus 10 is connected to the network 11 by the communication interface 101 and communicates information with the terminal apparatus 12 via the network 11.
The memory 102 includes, for example, one or more semiconductor memories, one or more magnetic memories, one or more optical memories, or a combination of at least two of these types, to function as main memory, auxiliary memory, or cache memory. The semiconductor memory is, for example, Random Access Memory (RAM) or Read Only Memory (ROM). The RAM is, for example, Static RAM (SRAM) or Dynamic RAM (DRAM). The ROM is, for example, Electrically Erasable Programmable ROM (EEPROM). The memory 102 stores information to be used for operations of the controller 103 and information obtained by the operations of the controller 103. The memory 102 stores, for example, a travel control program 104 and detailed simulation data 107 for the detailed simulation process, and a mimic program 108 and simple simulation data 109 for the simple simulation process.
Returning to
The functions of the server apparatus 10 are realized by a processor included in the controller 103 executing a control program. The control program is a program for causing the processor to function as the controller 103. Some or all of the functions of the server apparatus 10 may be realized by a dedicated circuit included in the controller 103. The control program may be stored on a non-transitory recording/storage medium readable by the controller 103, and be read from the medium by the controller 103.
The controller 103 executes the simple simulation process using the simple simulation data 109 by executing the mimic program 108. The controller 103 also simulates, by executing an emulation program, for example, an operating environment of a control apparatus such as an electronic control unit (ECU) in which the travel control program 104 is implemented in the vehicle. The controller 103 then executes the detailed simulation process using the detailed simulation data 107 by executing the travel control program 104 in the simulated operating environment.
The input interface 105 includes one or more interfaces for input. The interface for input is, for example, a physical key, a capacitive key, a pointing device, a touch screen integrally provided with a display, or a microphone that receives audio input. The input interface 105 accepts operations for inputting information to be used in the operations of the server apparatus 10 and transmits the inputted information to the controller 103.
The output interface 106 includes one or more interfaces for output. The interface for output is, for example, a display or a speaker. The display is, for example, a liquid crystal display (LCD) or an organic electro-luminescence (EL) display. The output interface 106 outputs information obtained by the operations of the server apparatus 10.
The terminal apparatus 12 includes a communication interface 121, a memory 122, a controller 123, an input interface 125, and an output interface 126.
The communication interface 121 includes a communication module compliant with a wired or wireless LAN standard, a module compliant with a mobile communication standard such as LTE, 4G, or 5G, or the like. The terminal apparatus 12 connects, by the communication interface 121, to the network 11 via a nearby router apparatus or mobile communication base station, and communicates information with the server apparatus 10 and the like over the network 11.
The memory 122 includes, for example, one or more semiconductor memories, one or more magnetic memories, one or more optical memories, or a combination of at least two of these types. The semiconductor memory is, for example, RAM or ROM. The RAM is, for example, SRAM or DRAM. The ROM is, for example, EEPROM. The memory 122 functions as, for example, a main memory, an auxiliary memory, or a cache memory. The memory 122 stores information to be used for operations of the controller 123 and information obtained by the operations of the controller 123.
The controller 123 has one or more general purpose processors, such as CPUs or micro processing units (MPUs), or one or more dedicated processors, such as GPUs, that are dedicated to specific processing. Alternatively, the controller 123 may have one or more dedicated circuits such as FPGAs or ASICs. The controller 123 is configured to perform overall control of operations of the terminal apparatus 12 by operating according to control/processing programs or operating according to operating procedures implemented in the form of circuits. The controller 123 then transmits and receives various types of information to and from the server apparatus 10 and the like via the communication interface 121, to execute operations according to the present embodiment.
The functions of the terminal apparatus 12 are realized by a processor included in the controller 123 executing a control program. The control program is a program for causing the processor to function as the controller 123. Some or all of the functions of the terminal apparatus 12 may be realized by a dedicated circuit included in the controller 123. The control program may be stored on a non-transitory recording/storage medium readable by the controller 123 and be read from the medium by the controller 123.
The input interface 125 includes one or more interfaces for input. The interface for input may include, for example, a physical key, a capacitive key, a pointing device, and a touch screen integrally provided with a display. The interface for input may also include a microphone that accepts audio input and a camera that captures images. The interface for input may further include a scanner, camera, or IC card reader that scans an image code. The input interface 125 accepts operations for inputting information to be used in the operations of the controller 123 and transmits the inputted information to the controller 123. The input interface 125 transmits, to the controller 123, the images captured by the camera.
The output interface 126 includes one or more interfaces for output. The interface for output may include, for example, a display and a speaker. The display is, for example, an LCD or an organic EL display. The output interface 126 outputs information obtained by the operations of the controller 123.
In step S30, the controller 103 acquires simple simulation data. The controller 103 reads and acquires the simple simulation data 109, which is stored in advance in the memory 102. Alternatively, the operator inputs any simple simulation data using the input interface 125 of the terminal apparatus 12, and the controller 123 of the terminal apparatus 12 transmits, by the communication interface 121, the input data to the server apparatus 10. The controller 103 of the server apparatus 10 receives, by the communication interface 101, the information transmitted from the terminal apparatus 12. The controller 103 thereby acquires the simple simulation data. Alternatively, the operator may input any simple simulation data using the input interface 105 of the server apparatus 10, and the controller 103 may acquire the input simple simulation data.
In step S31, the controller 103 executes the simple simulation process. The operator inputs instructions for execution of the simple simulation process using the input interface 125 of the terminal apparatus 12. The controller 123 of the terminal apparatus 12 transmits, by the communication interface 121, the input instructions to the server apparatus 10. The controller 103 of the server apparatus 10 receives, by the communication interface 101, the instructions transmitted from the terminal apparatus 12. The controller 103 thereby executes, according to the mimic program 108, the simple simulation process using the simple simulation data. Details on the simple simulation process are illustrated in
In step S401 in
In step S402, the controller 103 determines whether a sensor range has been determined. The sensor range is, for example, the field of view 53 of the camera. When the field of view 53 is set to a range corresponding to the position of the vehicle 51 in the city block 50, it can be determined that the sensor range has been determined (Yes in step S402), and the controller 103 proceeds to step S403. When it is determined that the sensor range has not been determined (No in step S402), the controller 103 proceeds to step S410, determines to maintain the vehicle speed of the vehicle 51, and ends the procedure in
In step S403, the controller 103 determines whether there is a hidden object. For example, since the field of view 53 of the vehicle 51 at positions P1, P3 and P5 does not include the obstacles 56-1 and 56-2, it is determined that there is no hidden object. Since the field of view 53 of the vehicle 51 at positions P2 and P4 includes the obstacles 56-1 and 56-2, respectively, it is determined that there is a hidden object. When it is determined that there is a hidden object (Yes in step S403), the controller 103 proceeds to step S404. When it is determined that there is no hidden object (No in step S403), the controller 103 proceeds to step S410, determines to maintain the vehicle speed of the vehicle 51, and ends the procedure in
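The following sketch illustrates the kind of geometric checks that steps S402 and S403 could involve, assuming a simplified two-dimensional model in which the field of view 53 is a circular sector and the obstacles 56-1 and 56-2 are points. All names, angles, and ranges below are hypothetical values for illustration only.

```python
import math

def in_field_of_view(vehicle_pos, heading_rad, obj_pos, fov_rad, range_m):
    """Return True when obj_pos lies inside a sector-shaped field of view."""
    dx, dy = obj_pos[0] - vehicle_pos[0], obj_pos[1] - vehicle_pos[1]
    if math.hypot(dx, dy) > range_m:
        return False
    bearing = math.atan2(dy, dx)
    # Smallest angular difference between the heading and the bearing.
    diff = abs((bearing - heading_rad + math.pi) % (2 * math.pi) - math.pi)
    return diff <= fov_rad / 2

def has_hidden_object(vehicle_pos, heading_rad, obstacles,
                      fov_rad=math.radians(90), range_m=30.0):
    """True when at least one obstacle falls inside the field of view,
    i.e. an object may be hidden behind it (Yes in step S403)."""
    return any(
        in_field_of_view(vehicle_pos, heading_rad, obstacle, fov_rad, range_m)
        for obstacle in obstacles
    )
```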
In step S404, the controller 103 starts decelerating the vehicle 51 by a static parameter. The static parameter is any value defined in advance in the mimic program 108, and corresponds to a deceleration rate for preliminary deceleration of the vehicle 51.
In step S405, the controller 103 determines a vehicle-to-vehicle deceleration rate. For example, as the vehicle 51 travels to the position P3, the other vehicle 54 that travels along a direction 54D on the street 57-2, which intersects the street 57-3, is captured in the field of view 53. The controller 103, for example, derives, based on a variation in the position of the other vehicle 54 with time, the time until the other vehicle 54 intersects the travel route 52, and determines the deceleration rate to avoid contact with the other vehicle 54. Alternatively, when the other vehicle 54 is not included in the field of view 53 or when the time until the other vehicle 54 intersects the travel route 52 is longer than an arbitrary criterion, the controller 103 may maintain the vehicle speed of the vehicle 51. The arbitrary criterion is set equal to or more than the time required for the vehicle 51 to reach a point at which the other vehicle 54 intersects the travel route 52.
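One way the vehicle-to-vehicle deceleration rate in step S405 might be derived (step S406 is analogous for the pedestrian 55) is sketched below, assuming a simple two-dimensional geometry in which the other object approaches a crossing line on the travel route 52. The function, parameter names, and clearance margin are hypothetical, and the criterion is simplified to the ego vehicle's own arrival time at the crossing point.

```python
def deceleration_to_avoid(ego_speed, dist_to_crossing_m, other_positions, dt,
                          crossing_line_y, clearance_s=2.0):
    """Return a deceleration rate [m/s^2]; 0.0 means maintain the vehicle speed."""
    if ego_speed <= 0.0 or len(other_positions) < 2:
        return 0.0

    # Estimate how fast the object closes on the crossing from two samples.
    (_, y0), (_, y1) = other_positions[-2], other_positions[-1]
    approach_speed = (y0 - y1) / dt  # positive while approaching the route
    if approach_speed <= 0.0:
        return 0.0  # the object is not approaching the travel route

    time_to_cross = (y1 - crossing_line_y) / approach_speed
    ego_time = dist_to_crossing_m / ego_speed  # arrival time at current speed

    # Simplified criterion: maintain speed when the object crosses only after
    # the ego vehicle has already reached (and passed) the crossing point.
    if time_to_cross > ego_time:
        return 0.0

    # Otherwise decelerate so the ego vehicle arrives only after the object
    # has crossed and cleared: distance covered by t_clear must not exceed
    # dist_to_crossing_m, i.e. v*t - 0.5*a*t^2 <= d  =>  a >= 2(v*t - d)/t^2.
    t_clear = time_to_cross + clearance_s
    return max(0.0, 2.0 * (ego_speed * t_clear - dist_to_crossing_m) / (t_clear ** 2))
```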
In step S406, the controller 103 determines a vehicle-to-pedestrian deceleration rate. For example, as the vehicle 51 travels to the position P5, the pedestrian 55 that moves along a direction 55D across the street 57-3 is captured in the field of view 53. The controller 103, for example, derives, based on a variation in the position of the pedestrian 55 with time, the time until the pedestrian 55 intersects the travel route 52, and determines the deceleration rate to avoid contact with the pedestrian 55. Alternatively, when the pedestrian 55 is not included in the field of view 53 or when the time until the pedestrian 55 intersects the travel route 52 is longer than an arbitrary criterion, the controller 103 may maintain the vehicle speed of the vehicle 51. The arbitrary criterion is set equal to or more than the time required for the vehicle 51 to reach a point at which the pedestrian 55 intersects the travel route 52.
In step S407, the controller 103 adjusts the deceleration rate. For example, the controller 103 adjusts the deceleration rate to a value equal to or less than a maximum deceleration rate that is possible under specifications of the vehicle 51.
In step S408, the controller 103 determines the vehicle speed. For example, the controller 103 determines the vehicle speed of the vehicle 51 after deceleration at the adjusted deceleration rate. Simulation in which the vehicle 51 travels at the decelerated vehicle speed is thereby executed. In such simple simulation, the controller 103 may generate computer graphics (CG) images representing the city block 50, the vehicle 51, the other vehicle 54, the pedestrian 55, the obstacles 56-1 and 56-2, and the like, and transmit the CG images to the terminal apparatus 12. Thus, the operator can view the CG images of the simple simulation with the terminal apparatus 12.
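The adjustment and speed determination in steps S407 and S408 can be illustrated as follows: the largest of the candidate deceleration rates (static, vehicle-to-vehicle, and vehicle-to-pedestrian) is clamped to the maximum possible under the vehicle specifications, and the speed for the next simulation step is derived from it. The names and the specification limit below are assumed values for illustration.

```python
MAX_DECELERATION_MPS2 = 6.0  # example limit under the vehicle specifications

def update_vehicle_speed(current_speed, candidate_decelerations, step_seconds):
    """Return the decelerated vehicle speed after one simulation step."""
    requested = max(candidate_decelerations, default=0.0)
    adjusted = min(requested, MAX_DECELERATION_MPS2)           # step S407
    return max(0.0, current_speed - adjusted * step_seconds)   # step S408
```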
In step S409, the controller 103 stores the closest approach distance between the vehicle 51 and the object. For example, when the vehicle 51 travels at the decelerated vehicle speed, the controller 103 derives the distance of closest approach between the vehicle 51 and the other vehicle 54 or pedestrian 55, and stores the distance of closest approach in the memory 102, together with identification information on the data for the simulation pattern 22. The closest approach includes a case in which the vehicle 51 makes contact with the other vehicle 54 or pedestrian 55.
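The bookkeeping in step S409 could, for instance, take the form below: the minimum distance between the vehicle 51 and each object over the simulated travel is derived and kept together with the identifier of the simulation pattern. The trajectory representation and names are assumptions for illustration; a distance of 0.0 covers the case in which the vehicle makes contact.

```python
import math

def closest_approach(vehicle_track, object_track):
    """Minimum distance between two trajectories sampled at the same steps."""
    return min(
        math.hypot(vx - ox, vy - oy)
        for (vx, vy), (ox, oy) in zip(vehicle_track, object_track)
    )

def store_closest_approach(results, pattern_id, vehicle_track, object_tracks):
    # Keep the smallest distance to any object for this simulation pattern.
    results[pattern_id] = min(
        closest_approach(vehicle_track, track) for track in object_tracks
    )
```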
Returning to
In step S33, the controller 103 executes the detailed simulation process. The operator inputs instructions for execution of the detailed simulation process using the input interface 125 of the terminal apparatus 12. The controller 123 of the terminal apparatus 12 transmits, by the communication interface 121, the input instructions to the server apparatus 10. The controller 103 of the server apparatus 10 receives, by the communication interface 101, the instructions transmitted from the terminal apparatus 12. The controller 103 thereby executes the detailed simulation process using the detailed simulation data 107 according to the travel control program 104. At this time, the controller 103 executes the detailed simulation using the data for the simulation patterns 21 extracted by the selection process. The controller 103 outputs signals, data, and the like similar to those output by the ECU, for example, according to the detailed simulation data 107, by emulating the implementation environment of the travel control program 104 in the vehicle and executing the travel control program 104. Furthermore, the controller 103 extracts data for one or more simulation patterns 21 in which the closest approach distance between the vehicle and the object is equal to or less than an arbitrary criterion. The criterion for the closest approach distance is a value arbitrarily determined in a range of tens of centimeters to two meters, for example, such that the probability of a contact accident is greater than a certain degree. In the detailed simulation process, the controller 103 may generate CG images that represent pseudo surrounding environments corresponding to the data for the simulation patterns 21 and operation of the vehicle 51 corresponding to output obtained by executing the travel control program 104, and transmit the CG images to the terminal apparatus 12. Thus, the operator can view the CG images of the detailed simulation with the terminal apparatus 12.
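One possible form of the filtering of detailed simulation results by the closest approach criterion described above is sketched below. The 2.0 m value is only an example within the stated range of tens of centimeters to two meters, and the data layout is an assumption for illustration.

```python
CONTACT_RISK_CRITERION_M = 2.0  # example value within the stated range

def extract_contact_risk_patterns(detailed_results):
    """detailed_results maps a simulation-pattern identifier to the closest
    approach distance [m] obtained by executing the travel control program."""
    return [
        pattern_id
        for pattern_id, closest_m in detailed_results.items()
        if closest_m <= CONTACT_RISK_CRITERION_M
    ]
```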
According to the present embodiment, by executing the simple simulation process by the mimic program based on the travel control program, the sample surrounding situations can be extracted while accuracy in the simulation is maintained to some degree. By executing the detailed simulation process for the sample surrounding situations, the simulation can be executed with a smaller overall processing load than when detailed simulation is executed for all the surrounding situations. Specifically, by excluding, from screening, phenomena that are outside a recognition range of the vehicle, it is possible to execute the detailed simulation concentrating on events that are caused by the difficulty of recognition from the vehicle due to the infrastructure of the city block. Therefore, it is possible to reduce processing time while ensuring accuracy in the simulation of the vehicle operation in the city block.
In the present embodiment, the simple simulation process, the selection process, and the detailed simulation process may be distributed and executed by two or more server computers. The present embodiment includes a case in which the above procedure described as the operations of the server apparatus 10 is executed by an information processing apparatus, such as a stand-alone PC, for example. Furthermore, an information processing apparatus such as a server computer or PC may be configured communicably with the ECU or equivalent control apparatus installed in the vehicle, and emulate the operating environment of the detailed simulation by the configuration with the ECU or the like.
While the embodiment has been described with reference to the drawings and examples, it should be noted that various modifications and revisions may be implemented by those skilled in the art based on the present disclosure. Accordingly, such modifications and revisions are included within the scope of the present disclosure. For example, functions or the like included in each means, each step, or the like can be rearranged without logical inconsistency, and a plurality of means, steps, or the like can be combined into one or divided.