The disclosure of Japanese Patent Application No. 2019-096229 filed on May 22, 2019 including the specification, drawings and abstract is incorporated herein by reference in its entirety.
The disclosure relates to an information processing device, an autonomous vehicle, an information processing method and a program.
In recent years, technologies relevant to vehicles that can perform autonomous traveling have been developed. For example, Japanese Patent Application Publication No. 2017-222271 describes an information providing device that detects the orientation of the face of an occupant at a driver's seat and, in the case of determining that the occupant at the driver's seat is executing a subtask, displays a screen corresponding to the subtask on a display disposed in a forward region including a region diagonally in front of the driver's seat. The subtask means an action such as watching a moving image including a movie and a TV program, listening to music, checking news, gaming, viewing a photograph or image data, using a social networking service (SNS), viewing a map, and reading a book.
The disclosure has an object to provide a more entertaining experience to a riding user in a vehicle that can perform autonomous traveling.
An information processing device according to a first aspect of the disclosure includes a control unit that executes: generating a traveling plan for an autonomous vehicle, the traveling plan including a traveling route and a traveling schedule; and generating a moving image to be displayed on a display device that is provided in a vehicle cabin of the autonomous vehicle. The control unit generates the traveling plan or the moving image, such that at least a part of a behavior of the autonomous vehicle when the autonomous vehicle travels in accordance with the traveling plan and at least a part of the moving image to be displayed on the display device during traveling are coordinated with each other.
In an information processing method according to a second aspect of the disclosure, a computer executes: generating a traveling plan for an autonomous vehicle, the traveling plan including a traveling route and a traveling schedule; and generating a moving image to be displayed on a display device that is provided in a vehicle cabin of the autonomous vehicle. The computer generates the traveling plan or the moving image, such that at least a part of a behavior of the autonomous vehicle when the autonomous vehicle travels in accordance with the traveling plan and at least a part of the moving image to be displayed on the display device during traveling are coordinated with each other.
A program according to a third aspect of the disclosure causes a computer to execute: generating a traveling plan for an autonomous vehicle, the traveling plan including a traveling route and a traveling schedule; and generating a moving image to be displayed on a display device that is provided in a vehicle cabin of the autonomous vehicle. The computer generates the traveling plan or the moving image, such that at least a part of a behavior of the autonomous vehicle when the autonomous vehicle travels in accordance with the traveling plan and at least a part of the moving image to be displayed on the display device during traveling are coordinated with each other.
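As a purely illustrative sketch of this coordination (not the claimed implementation), the following Python fragment derives scene-change events from a traveling plan; all class and function names are hypothetical and do not appear in the disclosure.

```python
# Hypothetical illustration only: none of these names come from the disclosure.
from dataclasses import dataclass, field

@dataclass
class TravelingPlan:
    route: list      # ordered spots on the traveling route, e.g. ["A", "B", "C"]
    schedule: dict   # spot -> estimated time of arrival, in seconds from departure

@dataclass
class MovingImage:
    events: list = field(default_factory=list)   # (time_s, scene directive)

def generate_coordinated_image(plan: TravelingPlan) -> MovingImage:
    """Derive moving-image events from the behavior implied by the plan,
    so that scene changes land at the moments the vehicle behaves."""
    image = MovingImage()
    for spot, eta in plan.schedule.items():
        image.events.append((eta, f"scene change matched to the behavior at {spot}"))
    return image

plan = TravelingPlan(route=["A", "B", "C"], schedule={"B": 120, "C": 300})
print(generate_coordinated_image(plan).events)
```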
With the disclosure, it is possible to provide a more entertaining experience to a riding user in a vehicle that can perform autonomous traveling.
Features, advantages, and technical and industrial significance of exemplary embodiments will be described below with reference to the accompanying drawings, in which like numerals denote like elements.
An information processing device according to a first embodiment of the disclosure generates a traveling plan for a vehicle (also referred to as an "autonomous vehicle") that can perform autonomous traveling, and generates data for a moving image that is watched within the vehicle by an occupant, that is, a user of the autonomous vehicle. The traveling plan includes a traveling route along which the autonomous vehicle travels, and a traveling schedule that includes estimated times of arrival at spots on the traveling route. Further, the information processing device generates the traveling plan or the moving image, such that at least a part of the behavior of the autonomous vehicle when the autonomous vehicle travels in accordance with the traveling plan and at least a part of the moving image to be generated are coordinated with each other.
In the interior of the autonomous vehicle, a display device is provided. Examples of the display device include a liquid crystal monitor or touch panel that is provided on an instrument panel or the like, a projector that projects a picture on a screen, a head-up display that projects a picture on a windshield or a combiner, and a head-mounted display (HMD). The user can watch the moving image displayed on the display device.
The moving image is a picture content such as a movie, a two-dimensional or three-dimensional computer graphics (CG) animation, or a virtual reality (VR) moving image. The information processing device according to the embodiment may draw the moving image based on a predetermined algorithm, or may generate a single moving image by joining at least parts of existing moving image data as materials. The information processing device may generate the moving image as a program in a programming environment suitable for image processing or animation output, for example, a programming environment such as Processing, and may play back the moving image by executing the program. In any case, the information processing device generates at least a part of the content of the moving image such that it is coordinated with at least a part of the behavior of the autonomous vehicle during the traveling of the autonomous vehicle. The behavior may be a change in the acceleration, deceleration, yaw rate, pitch rate or roll rate that is applied to the autonomous vehicle. For example, the information processing device may generate a moving image drawn such that a movement route on the moving image curves, or some kind of character acts, at the timing when the autonomous vehicle turns at an intersection on a traveling route to a previously set destination.
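For instance, the timing of such a turn can be derived from the geometry of the traveling route and the traveling schedule. A minimal sketch, assuming waypoints with planar coordinates and estimated arrival times (the names and the threshold are illustrative, not from the disclosure):

```python
import math

def heading(p, q):
    """Bearing of segment p->q in degrees, in map coordinates."""
    return math.degrees(math.atan2(q[1] - p[1], q[0] - p[0]))

def turn_events(waypoints, arrival_times_s, threshold_deg=30.0):
    """Find the timings at which the route bends sharply, so the on-screen
    movement route can be curved at the same moments."""
    events = []
    for i in range(1, len(waypoints) - 1):
        delta = heading(waypoints[i], waypoints[i + 1]) - heading(waypoints[i - 1], waypoints[i])
        delta = (delta + 180.0) % 360.0 - 180.0   # normalize to (-180, 180]
        if abs(delta) >= threshold_deg:
            events.append((arrival_times_s[i], "left" if delta > 0 else "right", abs(delta)))
    return events

route = [(0, 0), (100, 0), (100, 100), (200, 100)]   # a route with two 90-degree bends
times = [0, 30, 75, 110]                             # estimated arrivals (s)
print(turn_events(route, times))   # curve the virtual path at t=30 s and t=75 s
```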
For example, the content of the moving image may be a pseudo traveling moving image in which the surrounding landscape changes as a viewpoint (camera) corresponding to the occupant's eye (a so-called first-person viewpoint) moves within a virtual space resembling a real or imaginary place such as the sky, the seafloor or outer space. The moving image is not limited to the expression of a change in landscape associated with movement, that is, the so-called first-person viewpoint; it may express the action of some kind of character, or may be a moving image in which an abstract form, pattern, color or the like changes so as to be coordinated with the behavior of the vehicle. Further, not only the picture but also an acoustic effect that is output from an unillustrated speaker may be coordinated with the behavior of the vehicle.
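One conceivable way to coordinate such a first-person picture with the behavior is to drive the virtual camera with the measured longitudinal acceleration and yaw rate. The following sketch assumes such behavior samples are available; the structure and names are invented for illustration:

```python
import math

def update_camera(cam, accel_mps2, yaw_rate_dps, dt_s):
    """Advance a first-person camera so on-screen motion matches felt motion."""
    cam["speed"] = max(0.0, cam["speed"] + accel_mps2 * dt_s)  # speed up / slow down
    cam["heading_deg"] += yaw_rate_dps * dt_s                   # turn with the vehicle
    cam["x"] += cam["speed"] * dt_s * math.cos(math.radians(cam["heading_deg"]))
    cam["y"] += cam["speed"] * dt_s * math.sin(math.radians(cam["heading_deg"]))
    return cam

camera = {"x": 0.0, "y": 0.0, "speed": 10.0, "heading_deg": 0.0}
for _ in range(10):   # one second of gentle acceleration through a right-hand curve
    camera = update_camera(camera, accel_mps2=0.5, yaw_rate_dps=-9.0, dt_s=0.1)
print(camera)
```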
Further, the autonomous vehicle decides a traveling route from the current place to a destination that is input by the user, for example. A known method using the map information and the position information about the autonomous vehicle can be employed for the route search.
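As one example of such a known method, a textbook Dijkstra search over a road graph can be used (the graph and costs below are illustrative):

```python
import heapq

def shortest_route(graph, start, goal):
    """Textbook Dijkstra search; graph maps node -> [(neighbor, cost), ...]."""
    frontier = [(0, start, [start])]
    visited = set()
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for nxt, step in graph.get(node, []):
            if nxt not in visited:
                heapq.heappush(frontier, (cost + step, nxt, path + [nxt]))
    return None

roads = {"current": [("A", 5), ("B", 2)], "A": [("dest", 4)],
         "B": [("A", 2), ("dest", 9)]}
print(shortest_route(roads, "current", "dest"))  # (8, ['current', 'B', 'A', 'dest'])
```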
With the above information processing device, the user who rides in the autonomous vehicle and watches the moving image can enjoy a moving image whose content changes so as to be coordinated with the acceleration received during traveling. That is, the vehicle in which the user rides can provide a more entertaining experience to the user while traveling on a road.
During the traveling of the autonomous vehicle, the information processing device may alter the content of the moving image in response to the behavior of the autonomous vehicle, for example, when the autonomous vehicle decelerates or changes lanes depending on the congestion situation of a road, or when it deals with a disturbance such as the avoidance of an obstacle on a road. For example, in response to a disturbance that affects the behavior of the autonomous vehicle, the information processing device may draw, in real time, an obstacle or character that appears in the movement direction on the moving image, or a change in viewpoint for avoiding the obstacle or character.
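A hedged sketch of this real-time alteration, assuming the traveling control reports disturbance events in a simple dictionary form (all names and event kinds are invented for illustration):

```python
def on_disturbance(scene_events, now_s, disturbance):
    """Append real-time scene directives mirroring the reported disturbance."""
    if disturbance["kind"] == "obstacle avoidance":
        # draw an obstacle appearing in the movement direction, then dodge it
        scene_events.append((now_s, "spawn obstacle ahead on the movement route"))
        scene_events.append((now_s + 0.5, f"swerve viewpoint to the {disturbance['direction']}"))
    elif disturbance["kind"] == "sudden deceleration":
        scene_events.append((now_s, "a character ahead signals the viewer to stop"))
    return scene_events

events = on_disturbance([], now_s=42.0,
                        disturbance={"kind": "obstacle avoidance", "direction": "left"})
print(events)
```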
The traveling route may be decided in consideration of the production of the content of the moving image. For example, the user selects a genre or story of a moving image that the user hopes to watch, and a traveling route suitable for the selected genre or story is selected. For example, a route including a curve where a high acceleration is generated in the autonomous vehicle at a previously set important point of the content of the moving image, a route including an unpaved road where vibration is generated, or the like may be purposely selected. Further, the autonomous vehicle may separately acquire congestion information about the road, and a route that is not congested may be selected so that the acceleration, deceleration or steering angle of the autonomous vehicle can easily be altered for the important point of the content of the moving image.
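One conceivable (not prescribed) way to realize such a selection is to score candidate routes against the previously set important points of the content; the data structures below are illustrative only:

```python
def score_route(candidate, key_points, window_s=60):
    """Reward route features that line up in time with the story's important points."""
    score = 0
    for kp in key_points:                          # e.g. {"needs": "curve", "at_s": 300}
        for feat in candidate["features"]:         # e.g. {"kind": "curve", "at_s": 290}
            if feat["kind"] == kp["needs"] and abs(feat["at_s"] - kp["at_s"]) <= window_s:
                score += 1
    return score - candidate["extra_minutes"]      # penalize large detours

candidates = [
    {"name": "direct", "extra_minutes": 0, "features": []},
    {"name": "scenic", "extra_minutes": 1,
     "features": [{"kind": "curve", "at_s": 290}, {"kind": "unpaved", "at_s": 600}]},
]
story = [{"needs": "curve", "at_s": 300}, {"needs": "unpaved", "at_s": 630}]
print(max(candidates, key=lambda c: score_route(c, story))["name"])   # "scenic"
```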
A specific embodiment of the disclosure will be described below based on the drawings. Unless otherwise mentioned, the dimensions, materials, shapes, relative arrangements and the like of the constituent components described in the embodiment are not intended to limit the technical scope of the disclosure.
The communication unit 101 is a communication device for connecting the vehicle 100 to a network. The communication unit 101 can communicate with another server device and the like via the network, for example, using a mobile communication service such as 3rd Generation (3G) or Long Term Evolution (LTE). The vehicle 100 may acquire the map information and the congestion information about the road through the communication unit 101. The communication unit 101 may further include a communication device for performing inter-vehicle communication with another vehicle.
The storage unit 102 is a device in which information is stored in a transitory or non-transitory manner, and is constituted by a storage medium such as a magnetic disk or a flash memory. For example, the map information and the congestion information about the road are stored in the storage unit 102. Further, the destination to which the user goes, the traveling plan for the vehicle 100 that is generated by a later-described traveling plan generation unit 1063, and the moving image that is generated by a later-described moving image generation unit 1064 are stored in the storage unit 102.
The sensor 103 is a device for sensing the situation surrounding the vehicle 100. Specifically, the sensor 103 is configured to include a stereo camera, a laser scanner, a LIDAR, a radar and the like. Information that is relevant to the situation surrounding the vehicle 100 acquired by the sensor 103 is sent to the control unit 106. The position information acquisition unit 104 is a device that acquires the current position of the vehicle 100, and is specifically configured to include a global positioning system (GPS) receiver and the like. Information that is relevant to the current position of the vehicle 100 acquired by the position information acquisition unit 104 is sent to the control unit 106.
The control unit 106 has a function to perform arithmetic processing for controlling the vehicle 100. For example, the control unit 106 is constituted by a microcomputer. The control unit 106 includes an environment detection unit 1061, a traveling control unit 1062, the traveling plan generation unit 1063 and the moving image generation unit 1064, as functional modules. Each of the functional modules may be realized when a processor such as a CPU executes a program stored in a storage unit such as a ROM that is included in the control unit 106. Further, some or all of the functions may be realized by hardware circuits such as an ASIC or an FPGA.
The environment detection unit 1061 detects the environment surrounding the vehicle 100, based on the information acquired by the sensor 103. For example, the environment detection unit 1061 detects a physical body (including a human and an animal) such as another vehicle that exists in an area surrounding the vehicle 100. Further, the environment detection unit 1061 detects various objects necessary for the autonomous traveling of the vehicle 100, as exemplified by the number and positions of lanes on the road, the structure of the road, and road signs. Further, the environment detection unit 1061 may perform the tracking of the detected physical body. In this case, for example, the relative speed of the physical body may be evaluated from a difference between coordinates of the physical body detected in the previous step and the current coordinates of the physical body.
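As a concrete illustration of this finite-difference estimate, assuming two-dimensional coordinates in meters and a fixed sensing period (names are illustrative):

```python
def relative_velocity(prev_xy, curr_xy, dt_s):
    """Finite-difference velocity estimate for a tracked physical body."""
    return ((curr_xy[0] - prev_xy[0]) / dt_s,
            (curr_xy[1] - prev_xy[1]) / dt_s)

# a tracked body moved 1.5 m forward and 0.2 m sideways between steps 0.1 s apart
print(relative_velocity((10.0, 2.0), (11.5, 2.2), 0.1))   # approximately (15.0, 2.0) m/s
```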
The traveling control unit 1062 is a vehicle control device that controls the traveling of the vehicle 100 based on the traveling plan stored in the storage unit 102, the position information about the vehicle 100 acquired by the position information acquisition unit 104, and data about the surrounding environment detected by the environment detection unit 1061. For example, the traveling control unit 1062 causes the vehicle 100 to travel along the traveling route included in the traveling plan, in accordance with the traveling schedule included in the traveling plan. In the case where the environment detection unit 1061 detects a physical body with which the vehicle 100 may collide, the traveling control unit 1062 executes a collision avoidance control by which the vehicle 100 travels so as to avoid the collision with the physical body. As the method for the above autonomous traveling of the vehicle 100, a known method can be employed. Control information, that is, a command generated by the traveling control unit 1062 for controlling the traveling of the vehicle 100, is output to the drive unit 105, and is further output to the moving image generation unit 1064.
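The disclosure defers to known methods for the autonomous traveling itself; purely as an illustrative stand-in, a toy decision step of the kind such a control loop might take could look as follows (all names and thresholds are invented):

```python
def control_step(planned_speed_mps, current_speed_mps, obstacle_ahead):
    """Return a drive command following the plan unless avoidance overrides it."""
    if obstacle_ahead:
        return {"throttle": 0.0, "brake": 1.0}    # collision avoidance control
    if current_speed_mps < planned_speed_mps:
        return {"throttle": 0.3, "brake": 0.0}    # catch up with the traveling schedule
    return {"throttle": 0.0, "brake": 0.1}        # ease off toward the planned speed

# the resulting command would go to the drive unit 105 and the moving image generation unit 1064
print(control_step(planned_speed_mps=12.0, current_speed_mps=10.0, obstacle_ahead=False))
```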
The traveling plan generation unit 1063 generates the traveling plan for the vehicle 100, for example, using the map information and congestion information stored in the storage unit 102, the destination of the user, and the position information acquired from the position information acquisition unit 104. The traveling plan includes the traveling route and the traveling schedule. The traveling plan generation unit 1063 may decide the traveling route and the traveling schedule, in consideration of the production of the moving image.
The moving image generation unit 1064 generates the moving image to be displayed on the display device 107. The moving image generation unit 1064 generates the moving image, such that at least a part of the behavior of the vehicle 100 when the vehicle 100 travels in accordance with the traveling plan and at least a part of the moving image to be displayed on the display device 107 during the traveling are coordinated with each other.
The drive unit 105 is configured to include the motor that is the prime mover, and mechanisms (for example, an inverter, a brake and a steering mechanism) for the traveling of the vehicle 100. The drive unit 105 causes the vehicle 100 to travel based on the command generated by the traveling control unit 1062 for controlling the traveling of the vehicle 100. Thereby, the autonomous traveling of the vehicle 100 is realized.
The display device 107 is a picture output device provided in the vehicle cabin, such as a liquid crystal monitor or touch panel provided on an instrument panel or the like, a projector that projects a picture on a screen, a head-up display that projects a picture on a windshield or a combiner, or a head-mounted display (HMD). In the case of the touch panel, the display device 107 also functions as an input device that accepts an operation by the user.
Processing Flow
A process that is executed by the control unit 106 of the information processing device will be described below.
Then, the traveling plan generation unit 1063 generates the traveling plan, while the moving image generation unit 1064 generates the moving image.
Based on the traveling route and the traveling schedule, for example, the moving image generation unit 1064 can calculate the timing of a change in the behavior including the acceleration, deceleration, yaw rate, pitch rate or roll rate of the vehicle 100, the direction of the change, and the amount of the change, and can generate a moving image in which the content of the picture to be watched by the user changes so as to be coordinated with the timing, the direction and the amount of the change. For example, the moving image generation unit 1064 generates a moving image in which the content changes depending on a turning radius evaluated from the map information about an intersection and an estimated vehicle speed.
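For instance, the lateral acceleration that the occupant will feel in a turn can be estimated as v^2 / r from the estimated vehicle speed v and the turning radius r. A worked example with illustrative values:

```python
def lateral_acceleration(speed_mps, turning_radius_m):
    """a = v^2 / r, the centripetal acceleration felt through the turn."""
    return speed_mps ** 2 / turning_radius_m

v = 5.0    # estimated speed through the intersection: 5 m/s (18 km/h)
r = 10.0   # turning radius evaluated from the map information
print(f"{lateral_acceleration(v, r):.1f} m/s^2")   # 2.5 m/s^2 -> scale the on-screen curve
```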
For example, suppose that the road on a section of the traveling route between a spot B and a spot C is an unpaved road. It is predicted that the vehicle 100 traveling on the section will jolt more than on a paved road. That is, the vehicle 100 generates behaviors such as a vertical acceleration or deceleration and a short-period fluctuation in the pitch or the roll, and in a time period when the vehicle 100 is estimated to travel on the section, the moving image generation unit 1064 generates the moving image showing content in which a spaceship carrying the viewer sails through outer space so as to pass through a field of rocks.
During the traveling of the vehicle 100, the moving image generation unit 1064 determines whether a disturbance has occurred (S202).
In the case where it is determined in S202 that the disturbance has occurred (S202: YES), the moving image generation unit 1064 alters the moving image in response to the disturbance (S203).
After S203, or in the case where it is not determined in S202 that the disturbance has occurred (S202: NO), the moving image generation unit 1064 determines whether the generation and output of the moving image are ended.
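A minimal sketch of this playback loop with stand-in objects (none of the class or method names come from the disclosure; the step comments map to S202 and S203 above):

```python
class StubVehicle:
    """Stand-in for the traveling control side reporting disturbances."""
    def __init__(self):
        self.step = 0
    def poll_disturbance(self):
        self.step += 1
        # pretend one obstacle-avoidance event occurs at the third cycle
        return {"kind": "obstacle avoidance"} if self.step == 3 else None

class StubImage:
    """Stand-in for a generated moving image played back frame by frame."""
    def __init__(self, frames):
        self.frames = list(frames)
    def alter_for(self, disturbance):
        self.frames.insert(0, f"frame altered for {disturbance['kind']}")
    def next_frame(self):
        return self.frames.pop(0)
    def finished(self):
        return not self.frames

vehicle, image = StubVehicle(), StubImage(["f1", "f2", "f3", "f4"])
while True:
    disturbance = vehicle.poll_disturbance()   # S202: has a disturbance occurred?
    if disturbance is not None:
        image.alter_for(disturbance)           # S203: alter the moving image
    print(image.next_frame())
    if image.finished():                       # end of generation and output
        break
```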
Modification 1
The traveling plan generation unit 1063 of the vehicle 100 acquires the position information, and acquires the destination of the user.
Next, the traveling plan generation unit 1063 acquires information indicating the user's selection about the moving image.
Thereafter, the traveling plan generation unit 1063 generates the traveling plan, and the moving image generation unit 1064 generates the moving image.
With the modification, it is possible to select a traveling route that easily realizes the behavior of the vehicle 100 in accordance with a previously set scenario of the moving image.
Modification 2
In the modification, the traveling plan and the moving image are generated by the management server 200, and are sent to the vehicle 100 through the network N1. That is, the functions of the traveling plan generation unit 1063 and the moving image generation unit 1064 of the control unit 106 are realized by the management server 200.
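The disclosure does not specify the exchange format; purely as an illustration, the plan and moving-image data could be carried as JSON over the network N1 (the payload fields and handler shape below are invented):

```python
import json

def build_request(position, destination, selection):
    """Vehicle-side: package position, destination and the user's selection."""
    return json.dumps({"position": position, "destination": destination,
                       "moving_image_selection": selection})

def server_handle(request_json):
    """Server-side stand-in for the management server 200 generating both artifacts."""
    req = json.loads(request_json)
    plan = {"route": [req["position"], "B", req["destination"]],
            "schedule": {"B": 120}}
    image_events = [(120, "curve the on-screen path at spot B")]
    return json.dumps({"traveling_plan": plan, "moving_image": image_events})

reply = server_handle(build_request("A", "C", {"genre": "space"}))
print(json.loads(reply)["traveling_plan"]["route"])   # ['A', 'B', 'C']
```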
The above embodiment and modifications are merely examples, and the disclosure can be carried out while being appropriately modified without departing from the spirit of the disclosure. Further, the processes and means described in the disclosure can be carried out in any combination as long as no technical inconsistency arises.
A process described as being performed by a single device may be executed by a plurality of devices in cooperation. Conversely, a process described as being performed by different devices may be executed by a single device. In a computer system, the hardware configuration (server configuration) that realizes each function can be flexibly modified.
The disclosure can be realized also when a computer program in which the functions described in the above embodiment are implemented is supplied to a computer, and one or more processors included in the computer read and execute the computer program. The computer program may be provided to the computer through a non-transitory computer-readable storage medium that can be connected to a system bus of the computer, or may be provided to the computer through a network. Examples of the non-transitory computer-readable storage medium include an arbitrary type of disk such as a magnetic disk (a Floppy® disk, a hard disk drive (HDD) and the like) or an optical disk (a CD-ROM, a DVD, a Blu-ray disc and the like), a read only memory (ROM), a random access memory (RAM), an EPROM, an EEPROM, a magnetic card, a flash memory, an optical card, and an arbitrary type of medium suitable for storing electronic instructions.