This application claims the priority benefit of Taiwan application serial no. 110143485, filed on Nov. 23, 2021. The entirety of the above-mentioned patent application is hereby incorporated by reference herein and made a part of this specification.
The disclosure relates to an autonomous driving decision-making technology, and in particular to a method and an electronic apparatus for predicting a path based on an object interaction relationship.
With the vigorous development of science and technology, research on autonomous driving is thriving. Currently, an autonomous vehicle analyzes a large amount of information in real time to realize effective self-driving. For example, an autonomous vehicle needs to accurately analyze data such as map information or surrounding objects during operation. The analysis results of these data serve as the basis for controlling the driving of the autonomous vehicle, so that the decisions of the autonomous vehicle in an emergency resemble the behavior of a human driver.
However, the decision-making ability of autonomous driving directly affects the safety of the autonomous vehicle. Once a decision of autonomous driving is wrong, serious problems such as traffic accidents may occur. Therefore, improving the accuracy of decision-making in autonomous driving is an important issue for those skilled in the art.
The disclosure provides a method and an electronic apparatus for predicting a path based on an object interaction relationship, which improve the accuracy of predicting a trajectory of an object around a main vehicle.
A method for predicting a path based on an object interaction relationship of the disclosure is adapted for an electronic apparatus including a processor. The processor is configured to control a first vehicle. The method includes the following. A video including a plurality of image frames is received. Object recognition is performed on a certain image frame among the plurality of image frames to recognize at least one object in the certain image frame.
Preset interactive relationship information associated with the at least one object is obtained from an interactive relationship database based on the at least one object. A first trajectory for navigating the first vehicle is determined based on the preset interactive relationship information.
An electronic apparatus of the disclosure is adapted for controlling a first vehicle. The electronic apparatus includes a storage device and a processor. The storage device stores an interactive relationship database. The processor is coupled to the storage device, and the processor is configured to: receive a video including a plurality of image frames; perform object recognition on a certain image frame among the plurality of image frames to recognize at least one object in the certain image frame; obtain preset interactive relationship information associated with the at least one object from the interactive relationship database based on the at least one object; and determine a first trajectory for navigating the first vehicle based on the preset interactive relationship information.
Based on the above, in the method and the electronic apparatus for predicting a path based on an object interaction relationship provided by the embodiments of the disclosure, the predicted trajectory of the predicted object is generated based on the preset interactive relationship information between objects, and the predicted trajectory is used to determine the trajectory for navigating the main vehicle. By considering the preset interactive relationship between objects when generating the predicted trajectory of the predicted object, the disclosure reduces the trajectory prediction error of the objects around the main vehicle. Accordingly, the accuracy of predicting the trajectories of the objects around the main vehicle is improved, and the trajectory for navigating the main vehicle is accurately planned.
To provide a further understanding of the above features and advantages of the disclosure, embodiments accompanied with drawings are described below in detail.
The processor 110 is coupled to the storage device 120 and the input/output device 130. The processor 110 is, for example, a central processing unit (CPU), or another programmable general-purpose or special-purpose device such as a microprocessor, a digital signal processor (DSP), a programmable controller, an application specific integrated circuit (ASIC), a programmable logic controller (PLC), another similar device, or a combination of these devices. The processor 110 loads and executes the program stored in the storage device 120 to perform the method for predicting a path based on an object interaction relationship according to the embodiments of the disclosure.
The storage device 120 is, for example, any type of fixed or removable random access memory (RAM), read-only memory (ROM), flash memory, hard disk, a similar element, or a combination of the above elements. The storage device 120 is used to store the programs and data to be executed by the processor 110. In an embodiment, the storage device 120 stores an interactive relationship database 121 and an environment information database 122. In addition, the storage device 120 also stores, for example, a video received by the input/output device 130 from the image capturing apparatus 12.
The input/output device 130 is a wired or wireless transmission interface such as Universal Serial Bus (USB), RS232, Bluetooth (BT), or Wireless Fidelity (Wi-Fi). The input/output device 130 is used to receive a video provided by an image capturing apparatus such as a camera.
The image capturing apparatus 12 is used to capture an image in front of it. The image capturing apparatus 12 may be a camera that adopts a charge coupled device (CCD), a complementary metal-oxide semiconductor (CMOS) element, or another photosensitive element and lens. In this embodiment, the image capturing apparatus 12 may be disposed in a main vehicle (also known as a first vehicle) and arranged to capture a road image in front of the main vehicle. It is worth noting that the main vehicle is a vehicle controlled by the processor 110.
In an embodiment, the electronic apparatus 11 may include the above-mentioned image capturing apparatus. In this case, the input/output device 130 is a bus used to transmit data within the apparatus, and the video captured by the image capturing apparatus may be transmitted to the processor 110 for processing. However, the disclosure is not limited to the above architecture.
First, in step S202, the processor 110 may receive a video including a plurality of image frames. Specifically, the processor 110 receives the video including the plurality of image frames from the image capturing apparatus 12 by using the input/output device 130.
In step S204, the processor 110 may perform object recognition on a certain image frame among the plurality of image frames, so as to recognize at least one object in the certain image frame. In an embodiment, the processor 110, for example, performs an object detection and recognition algorithm on the certain image frame to recognize the object in the certain image frame. For example, the processor 110 extracts features in the certain image frame and recognizes the object by using a pre-established and trained object recognition model. The object recognition model is a machine learning model established through, for example, a convolutional neural network (CNN), a deep neural network (DNN), or another type of neural network combined with a classifier. The object recognition model learns from a large number of input images, and may extract the features in an image and classify these features to recognize the object corresponding to a specific object type. Those skilled in the art should know how to train an object recognition model capable of recognizing the object in the certain image frame.
For example,
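The recognition step can be illustrated with a minimal, hypothetical sketch in which a nearest-centroid classifier stands in for the trained object recognition model; the class names, the two-dimensional feature space, and all numeric values below are illustrative assumptions, not part of the disclosure.

```python
import math

# Hypothetical class centroids in a toy 2-D feature space, standing in
# for the learned parameters of the trained object recognition model.
CLASS_CENTROIDS = {
    "ball":         (0.9, 0.1),
    "traffic_cone": (0.2, 0.8),
    "vehicle":      (0.5, 0.5),
}

def recognize_object(feature_vector):
    """Classify one extracted feature vector by its nearest centroid."""
    return min(CLASS_CENTROIDS,
               key=lambda c: math.dist(feature_vector, CLASS_CENTROIDS[c]))

def recognize_frame(feature_vectors):
    """Recognize every detected object in a certain image frame."""
    return [recognize_object(v) for v in feature_vectors]
```

For instance, a frame yielding the feature vectors `(0.85, 0.15)` and `(0.25, 0.75)` would be recognized as a ball and a traffic cone under these assumed centroids.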
In step S206, the processor 110 may obtain preset interactive relationship information associated with the at least one object from the interactive relationship database 121 based on the at least one object. In this embodiment, the interactive relationship database may include preset interactive relationship information between a plurality of preset objects.
In an embodiment, the preset object may refer to a certain traffic object in the road image, and the preset interactive relationship information may refer to the object interactive relationship among a plurality of certain traffic objects. Taking the situation of an autonomous vehicle driving on the road as an example, the certain traffic object may be a traffic cone, a ball, a street tree, a vehicle, a construction sign, a person, etc. The disclosure is not limited thereto. In other words, the certain traffic object refers to an object that may appear on the road and may induce a driving behavior of a human driver.
In this embodiment, the object interactive relationships between certain traffic objects may be divided into two types. The first type of object interactive relationship records the object interactive relationship between an actual object and a virtual object; based on the first type, a virtual object and the trajectory for the virtual object may be predicted and generated from the detected actual object. The second type of object interactive relationship records the object interactive relationship between two actual objects; based on the second type, the trajectory of one actual object may be predicted from the other detected actual object. In other words, the first type involves a virtual object that has not appeared in the lane but is predicted to appear because of an actual object appearing in the lane, while the second type involves two actual objects that both appear in the lane.
The following explains situations that may occur in an actual lane. For example, the object interactive relationships may include the object interactive relationship between a ball and a person, and the object interactive relationship between a traffic cone/street tree/construction sign and a vehicle, etc. The disclosure is not limited thereto. In this embodiment, the object interactive relationship between the ball (the actual object) and the person (the virtual object) belongs to the first type. Generally, when a ball rolls into the lane, a child (person) chasing the ball may rush into the lane. Therefore, the interactive relationship database may store the object interactive relationship between the ball and the person as "after the ball is detected, a person following the same path as the ball appears after n seconds and moves for m seconds", where n and m are preset values. On the other hand, the object interactive relationship between the traffic cone/street tree/construction sign (the actual object) and the vehicle (the actual object) belongs to the second type. Generally, when a human driver sees an obstacle such as a traffic cone, a street tree, or a construction sign in the lane ahead, the driver turns to avoid the obstacle. Therefore, the interactive relationship database may store this object interactive relationship as "when the traffic cone/street tree/construction sign and the vehicle are detected, the driving speed of the vehicle slows down to k kilometers per hour so that the vehicle can switch lanes when it is j meters away from the traffic cone/street tree/construction sign", where j and k are preset values.
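The two relationship types described above can be sketched as a small lookup table; the field names, record layout, and the concrete values standing in for the presets n, m, j, and k are illustrative assumptions, not values fixed by the disclosure.

```python
# Minimal sketch of an interactive relationship database holding the
# two types of object interactive relationships described above.
INTERACTIVE_RELATIONSHIP_DB = {
    # First type: an actual object paired with a virtual object.
    "ball": {
        "type": 1,
        "predicted_object": "person",
        "appear_after_s": 2.0,    # n: seconds until the person appears
        "move_for_s": 3.0,        # m: seconds the person keeps moving
    },
    # Second type: an actual object paired with another actual object.
    "traffic_cone": {
        "type": 2,
        "predicted_object": "vehicle",
        "slow_to_kph": 30.0,      # k: reduced driving speed
        "react_distance_m": 50.0, # j: distance at which the vehicle reacts
    },
}

def lookup_relationship(recognized_object):
    """Step S206 sketch: fetch preset interactive relationship
    information associated with a recognized object, or None."""
    return INTERACTIVE_RELATIONSHIP_DB.get(recognized_object)
```

Under these assumptions, recognizing a ball yields a first-type record naming "person" as the predicted object, while an unlisted object yields no relationship information.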
It is worth noting that a driver may encounter other situations while driving a vehicle, so the disclosure is not limited to the above object interactive relationships. Those skilled in the art may design object interactive relationships between other certain traffic objects based on the teaching of the above exemplary embodiments.
In step S208, the processor 110 may determine a trajectory (also known as a first trajectory) for navigating the main vehicle based on the preset interactive relationship information. In this embodiment, the trajectory may include a path and the velocity at each trajectory point in the path. Specifically, the processor 110 may generate a predicted trajectory of a predicted object based on the preset interactive relationship information. Next, the processor 110 may determine the first trajectory of the main vehicle based on the predicted trajectory.
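The trajectory structure described above, a path together with the velocity at each trajectory point in the path, can be sketched as a simple container; the field layout is an illustrative assumption.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Trajectory:
    """A trajectory as described above: a path plus the velocity at
    each trajectory point (layout is an illustrative assumption)."""
    path: List[Tuple[float, float]]  # (x, y) trajectory points
    velocities: List[float]          # velocity at each trajectory point

    def point(self, i: int) -> Tuple[float, float, float]:
        """Return the i-th trajectory point with its velocity."""
        x, y = self.path[i]
        return x, y, self.velocities[i]
```

A first trajectory for the main vehicle would then pair each planned position with the speed the vehicle should hold at that position.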
In an embodiment, the processor 110 first determines whether the preset interactive relationship information includes the first type or the second type of object interactive relationship to generate a determination result. Next, the processor 110 may generate the predicted trajectory of the predicted object based on the determination result.
In this embodiment, in response to determining that the preset interactive relationship information includes the first type of object interactive relationship, the processor 110 may obtain, from the interactive relationship database 121, the preset object corresponding to the preset interactive relationship information associated with the object recognized in step S204, and use the preset object as the predicted object. As in the foregoing example, assuming that there is the first type of object interactive relationship between the ball and the person, the processor 110 may obtain the "person" from the interactive relationship database 121 as the predicted object based on the recognized ball. Next, the processor 110 may calculate the predicted trajectory of the predicted object based on the preset interactive relationship information and the trajectory of the recognized object.
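For the first-type relationship, the virtual object's trajectory can be derived from the actual object's trajectory as sketched below; the timing model (the person retraces the ball's path, delayed by n seconds and lasting m seconds) follows the example above, while the sampling format is an illustrative assumption.

```python
def predict_virtual_trajectory(actual_trajectory, appear_after_s, move_for_s):
    """First-type sketch: generate the virtual object's predicted
    trajectory from the detected actual object's trajectory.

    actual_trajectory: list of (t, x, y) samples for the actual object
    (e.g. the ball). The virtual object (e.g. the person) follows the
    same path, appearing appear_after_s (n) seconds later and moving
    for move_for_s (m) seconds.
    """
    start = actual_trajectory[0][0]
    return [
        (t + appear_after_s, x, y)
        for (t, x, y) in actual_trajectory
        if t - start <= move_for_s  # keep only the first m seconds
    ]
```

For a ball observed at t = 0, 1, and 4 seconds with n = 2 and m = 3, only the first three seconds of the path are retraced, shifted two seconds later in time.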
Referring to
In this embodiment, if the processor 110 determines that the preset interactive relationship information includes the second type of object interactive relationship, the processor 110 may adopt a predicted trajectory generation process different from that for the first type of object interactive relationship. Specifically, referring to
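The second-type behavior described earlier (a vehicle slowing to k kilometers per hour within j meters of an obstacle so it can switch lanes) can be sketched in one dimension along the lane; the 1-D geometry and parameter names are illustrative assumptions.

```python
def predict_vehicle_speed(vehicle_pos_m, vehicle_speed_kph,
                          obstacle_pos_m, slow_to_kph, react_distance_m):
    """Second-type sketch: predict another actual vehicle's speed from
    a detected obstacle (traffic cone / street tree / construction
    sign) ahead of it in the same lane.

    Once the vehicle is within react_distance_m (j) of the obstacle,
    its predicted speed drops to slow_to_kph (k) so it can switch
    lanes; otherwise its current speed is kept.
    """
    distance = obstacle_pos_m - vehicle_pos_m
    if 0 <= distance <= react_distance_m:
        return min(vehicle_speed_kph, slow_to_kph)
    return vehicle_speed_kph
```

Under these assumptions, a vehicle traveling at 60 km/h that is 40 m from a cone (with j = 50, k = 30) is predicted to slow to 30 km/h, while the same vehicle 200 m away keeps its speed.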
Referring to
Referring to
After the predicted trajectory of the predicted object other than the main vehicle is calculated, the processor 110 determines the first trajectory for navigating the main vehicle based on the predicted trajectory. In an embodiment, the processor 110 may calculate a predicted collision time between a generated predicted trajectory and the original target trajectory of the main vehicle, and adjust the original target trajectory of the main vehicle based on the predicted collision time to generate the first trajectory. For example, the processor 110 adjusts the driving velocity (for example, acceleration and deceleration) or the driving direction (for example, turning) of the main vehicle in the original target trajectory to generate the first trajectory. It is worth noting that the processor 110 may update the path included in the original target trajectory and the velocity at each trajectory point in the path based on the adjusted driving velocity or direction of the main vehicle to generate the first trajectory. In this way, by considering the preset interactive relationship between objects, the embodiment of the disclosure may accurately predict the trajectory of the object around the main vehicle, thereby accurately planning the trajectory for navigating the main vehicle.
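The collision-time calculation and trajectory adjustment described above can be sketched as follows, assuming both trajectories are sampled at shared timestamps; the distance threshold and slow-down factor are illustrative assumptions.

```python
import math

def predicted_collision_time(traj_a, traj_b, threshold_m=2.0):
    """Earliest shared timestamp at which two trajectories come within
    threshold_m of each other, or None if they never do.
    Each trajectory is a list of (t, x, y) samples."""
    b_by_t = {t: (x, y) for (t, x, y) in traj_b}
    for (t, x, y) in traj_a:
        if t in b_by_t:
            bx, by = b_by_t[t]
            if math.hypot(x - bx, y - by) <= threshold_m:
                return t
    return None

def adjust_target_trajectory(target, collision_t, slow_factor=0.5):
    """Adjust the main vehicle's original target trajectory based on
    the predicted collision time by reducing the driving velocity
    before the collision. Each point is (t, x, y, v)."""
    if collision_t is None:
        return target  # no predicted collision: keep the original
    return [
        (t, x, y, v * slow_factor if t < collision_t else v)
        for (t, x, y, v) in target
    ]
```

Here a predicted collision at t = 1 s halves the planned velocity at every earlier trajectory point, producing the adjusted first trajectory.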
Referring to
Referring to
In step S8023, the processor 110 may obtain lane geometry information from the environment information database 122 based on positioning data of the main vehicle. The environment information database 122 may store map information, and the map information may include road information and intersection information. The processor 110 may obtain the lane geometry information such as lane reduction and curves from the environment information database 122. Specifically, the electronic apparatus 11 of the embodiment may be further coupled to a positioning device (not shown). The positioning device is, for example, a Global Positioning System (GPS) device, which may receive the positioning data of the current position of the main vehicle, including longitude and latitude data.
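Step S8023 can be sketched as a lookup keyed by the main vehicle's quantized GPS position; the grid resolution, the stored entries, and the field names are illustrative assumptions, not the disclosed map format.

```python
# Hypothetical environment information database mapping a coarse
# (latitude, longitude) grid cell to lane geometry information.
ENVIRONMENT_INFO_DB = {
    (25.03, 121.56): {"lane_reduction": True,  "curve": False},
    (25.04, 121.56): {"lane_reduction": False, "curve": True},
}

def lane_geometry(latitude, longitude, resolution=0.01):
    """Step S8023 sketch: quantize the positioning data of the main
    vehicle to the map grid and return the stored lane geometry
    information (e.g. lane reduction, curves), or None."""
    key = (round(round(latitude / resolution) * resolution, 2),
           round(round(longitude / resolution) * resolution, 2))
    return ENVIRONMENT_INFO_DB.get(key)
```

For example, a GPS fix of (25.031, 121.558) falls into the (25.03, 121.56) cell and retrieves the lane-reduction entry under these assumptions.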
In step S803, the processor 110 may calculate the predicted trajectory of the predicted object based on at least one of the object feature value, the preset interactive relationship information, and the lane geometry information. Referring to
In step S804, the processor 110 may determine the first trajectory for navigating the main vehicle based on the predicted trajectory of the predicted object. The aforementioned embodiment may be referred to for the specific description of determining the first trajectory, which will not be repeated herein. After the first trajectory is determined, the processor 110 may control the movement of the main vehicle based on the first trajectory.
It is worth noting that each step in
In summary, in the method and the electronic apparatus for predicting a path based on an object interaction relationship provided by the embodiments of the disclosure, the predicted trajectory of the predicted object may be generated based on the preset interactive relationship information between objects, and the predicted trajectory is used to determine the trajectory for navigating the main vehicle. In this way, by considering the preset interactive relationship between objects, the disclosure may reduce the trajectory prediction error of the objects around the main vehicle, thereby improving the accuracy of predicting the trajectories of these surrounding objects. In addition, the disclosure may accurately calculate the predicted trajectory of the predicted object through the object feature values of the surrounding objects and the lane geometry information. Based on the above, the disclosure may accurately plan the trajectory for navigating the main vehicle by effectively predicting the impact of the surrounding objects on the main vehicle.
Although the disclosure has been disclosed in the above by way of embodiments, the embodiments are not intended to limit the disclosure. Those with ordinary knowledge in the technical field can make various changes and modifications without departing from the spirit and scope of the disclosure. Therefore, the protection scope of the disclosure is subject to the scope of the appended claims.
Number | Date | Country | Kind |
---|---|---|---|
110143485 | Nov 2021 | TW | national |