This application claims priority of Taiwan Patent Application No. 099144710, filed on Dec. 20, 2010, the entirety of which is incorporated by reference herein.
1. Technical Field
The disclosure relates to traffic computer systems, and more particularly to computer systems for real-time traffic situation awareness.
2. Description of the Related Art
Current driving guide systems provide real-time traffic information for drivers, such as information about car speeds on specific road sections, road sections under construction, and road sections where car accidents have occurred. Generally, the traffic information provided by ordinary driving guide systems is derived from public information released by the government. For example, the government may install vehicle detectors under road surfaces to count traffic flow and thereby obtain traffic information. The cost of installing vehicle detectors, however, is high, so the number of deployed detectors is limited due to economic considerations. The government may also install video cameras to monitor traffic situations on road sections. The video cameras, however, are only installed at road intersections and cannot provide complete, full-scale traffic information. An efficient and economical method for providing real-time traffic information to drivers is therefore required.
A driving recorder is an apparatus installed in a car to record video images while a user is driving the car. A global positioning system (GPS) can provide accurate positioning information for a car. If a car is equipped with a driving recorder and a GPS module, the real-time images provided by the driving recorder and the positioning data provided by the GPS module can be taken as a source from which real-time traffic information is derived. If the large amount of real-time image and positioning data generated by many cars is integrated and combined, useful real-time traffic information can be generated and provided to the drivers of those cars.
The disclosure provides a real-time traffic situation awareness system. In one embodiment, the real-time traffic situation awareness system receives driving data from a car, wherein the driving data comprises an image, GPS data, and gyroscope sensor data. The real-time traffic situation awareness system comprises an image processing unit, a feature extraction unit, a feature matrix database, a data grouping unit, and a situation awareness unit. The image processing unit processes the image to generate a processed image. The feature extraction unit generates a data point according to the processed image, the GPS data, and the gyroscope sensor data. The feature matrix database stores a plurality of feature matrixes of a plurality of data groups corresponding to a plurality of geographic areas. The data grouping unit searches the feature matrix database, according to the GPS data of the data point, for a plurality of feature groups of an optimal feature matrix corresponding to a geographic area near that of the data point. The situation awareness unit analyzes the feature groups according to a plurality of situation awareness rules to generate traffic information.
The disclosure also provides a real-time traffic situation awareness method. In one embodiment, a real-time traffic situation awareness system comprises an image processing unit, a feature extraction unit, a feature matrix database, a data grouping unit, and a situation awareness unit. First, driving information is received from a car, wherein the driving information comprises an image, GPS data, and gyroscope sensor data. The image is then processed with the image processing unit to generate a processed image. A data point is then generated with the feature extraction unit according to the processed image, the GPS data, and the gyroscope sensor data. A plurality of feature matrixes of a plurality of data groups corresponding to a plurality of geographic areas is stored in the feature matrix database. The feature matrix database is then searched with the data grouping unit, according to the GPS data of the data point, for a plurality of feature groups of an optimal feature matrix corresponding to a geographic area near that of the data point. The feature groups are then analyzed with the situation awareness unit according to a plurality of situation awareness rules to generate traffic information.
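Read as procedural code, the method above reduces to a short pipeline. The following Python sketch only illustrates the ordering of the described steps under assumed interfaces; the function and object names are hypothetical placeholders, not identifiers from the disclosure.

```python
# Hypothetical pipeline illustrating the described method steps; all names
# below are placeholders, not part of the disclosure.
def handle_driving_information(driving_information, image_processing_unit,
                               feature_extraction_unit, data_grouping_unit,
                               situation_awareness_unit):
    """Run one driving-information record through the described steps."""
    image = driving_information["image"]
    gps_data = driving_information["gps"]
    gyro_data = driving_information["gyro"]

    # Process the image into a processed image.
    processed_image = image_processing_unit.process(image)

    # Combine the processed image, GPS data, and gyroscope data into a data point.
    data_point = feature_extraction_unit.extract(processed_image, gps_data, gyro_data)

    # Search the feature matrix database for the feature groups of the feature
    # matrix whose geographic area is nearest the data point.
    feature_groups = data_grouping_unit.find_feature_groups(data_point)

    # Analyze the feature groups with the situation awareness rules.
    return situation_awareness_unit.analyze(feature_groups)
```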
The disclosure provides a guiding apparatus. In one embodiment, the guiding apparatus is installed in a car and comprises an image sensor, a GPS module, a gyroscope sensor, a wireless transceiver, a processor, a roadmap database, and a screen. The image sensor detects an image. The GPS module generates GPS data. The gyroscope sensor detects the 3-dimensional motion of the car to generate gyroscope sensor data comprising acceleration data and angular acceleration data of the car. The wireless transceiver is coupled to a wireless network and connects the guiding apparatus to a real-time traffic situation awareness system via the wireless network. The processor gathers the image, the GPS data, and the gyroscope sensor data to generate driving information, and directs the wireless transceiver to send the driving information to the real-time traffic situation awareness system. The roadmap database stores a roadmap. When the wireless transceiver receives traffic information from the real-time traffic situation awareness system, the processor generates guiding information according to the traffic information, and the screen shows the guiding information and the roadmap thereon.
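On the apparatus side, the gathering and transmission of driving information can be pictured with a small sketch. The following Python code is illustrative only: the field names, the JSON encoding, and the transceiver's send/receive interface are assumptions and not part of the disclosure.

```python
# Illustrative sketch of how a guiding apparatus might package one
# driving-information record; field names and the transport are assumed.
import json
import time


def build_driving_information(image_bytes, gps_fix, gyro_sample):
    """Gather the image, GPS data, and gyroscope sensor data into one record."""
    return {
        "timestamp": time.time(),
        "image": image_bytes.hex(),   # raw frame from the image sensor
        "gps": gps_fix,               # e.g. {"lat": ..., "lon": ..., "speed": ...}
        "gyro": gyro_sample,          # e.g. {"accel": [...], "angular_accel": [...]}
    }


def report_and_receive(transceiver, image_bytes, gps_fix, gyro_sample):
    """Send driving information and return any traffic information received."""
    payload = json.dumps(build_driving_information(image_bytes, gps_fix, gyro_sample))
    transceiver.send(payload)         # uplink to the awareness system (assumed interface)
    return transceiver.receive()      # downlink: traffic information for the screen
```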
A detailed description is given in the following embodiments with reference to the accompanying drawings.
The disclosure can be more fully understood by reading the subsequent detailed description and examples with references made to the accompanying drawings, wherein:
The following description is of the best-contemplated mode of carrying out the disclosure. This description is made for the purpose of illustrating the general principles of the disclosure and should not be taken in a limiting sense. The scope of the disclosure is best determined by reference to the appended claims.
The disclosure provides a real-time traffic situation awareness system. The real-time traffic situation awareness system analyzes a large amount of real-time image data to generate useful data points, and then compiles statistics on the data points via artificial-intelligence data learning to generate real-time traffic information. The real-time traffic information generated by the real-time traffic situation awareness system is sent back to driving guide systems installed in cars. The driving guide systems in the cars can then estimate the travel time required for road sections, determine road situations, and obtain other real-time information, such as which road sections are under construction, according to the real-time traffic information. In addition, the real-time information provided by the real-time traffic situation awareness system also comprises calibration information for a GPS apparatus to correct positioning data provided by a global positioning system (GPS), which may suffer signal loss or drift due to city obstacles such as tunnels and overpasses.
Referring to
Referring
The image processing unit 202 then analyzes the video image of the driving data to generate a processed image (step 302). In one embodiment, the image processing unit 202 processes the image of the driving data with a pattern recognition process to find road marks existing in the image, thereby generating the processed image. In one embodiment, the road marks comprise traffic lights, signboards, road signs, and buildings. For example, the traffic lights and the signboards can be identified in the image according to their colors and shapes. In addition, an object tracking technique is used to track the road marks across the image data. Furthermore, buildings are identified in the images according to edge detection and corner detection techniques.
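As one hedged illustration of how such a pattern-recognition step might look in code, the following Python sketch uses OpenCV to pick out red regions as candidate traffic-light lamps and to compute edge and corner cues; the color ranges, thresholds, and the dictionary returned as the "processed image" are assumptions, not the disclosed implementation.

```python
# Minimal sketch of a pattern-recognition step, assuming OpenCV; the color
# range, thresholds, and returned structure are illustrative assumptions.
import cv2
import numpy as np


def process_image(frame_bgr):
    """Find candidate road marks (traffic lights, edges, corners) in a frame."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)

    # Color/shape cue: red regions as candidate traffic-light lamps.
    red_mask = cv2.inRange(hsv, (0, 120, 120), (10, 255, 255))
    contours, _ = cv2.findContours(red_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    light_candidates = [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > 20]

    # Edge and corner cues: rough evidence of buildings and signboards.
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    corners = cv2.goodFeaturesToTrack(gray, maxCorners=200, qualityLevel=0.01, minDistance=10)

    return {
        "traffic_light_candidates": light_candidates,
        "edge_density": float(np.count_nonzero(edges)) / edges.size,
        "corner_count": 0 if corners is None else len(corners),
    }
```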
The feature extraction unit 204 then combines the processed image generated by the image processing unit 202 with the GPS data and the gyroscope sensor data of the driving data to generate a data point usable by the system 200, wherein the data point comprises information about the location, speed, acceleration, angular acceleration, and direction of the car and a corresponding timestamp. Referring to
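A data point of this kind might be represented in memory as follows; this Python sketch is for illustration only, and the field names and types are assumptions rather than the disclosed format.

```python
# One possible in-memory representation of the described "data point";
# field names and types are assumptions for illustration only.
from dataclasses import dataclass, field
from typing import Dict


@dataclass
class DataPoint:
    latitude: float                 # from GPS data
    longitude: float                # from GPS data
    speed: float                    # from GPS data (m/s)
    heading: float                  # direction of travel (degrees)
    acceleration: float             # from gyroscope sensor data
    angular_acceleration: float     # from gyroscope sensor data
    timestamp: float                # seconds since epoch
    road_mark_features: Dict[str, float] = field(default_factory=dict)  # from the processed image
```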
If the data point comprises available GPS data (step 304), the data point is sent to the feature selection unit 206 as a source for feature data learning (step 306). The feature selection unit 206 performs a data training process on the received data to generate matrixes for rapid calculation. The data training process comprises training on single road points and on traces of road sections. When a new data point is added to a database for data training, the feature selection unit 206 generates a weight for the new data point according to the timestamp of the new data point. When the data training process begins, the feature selection unit 206 classifies the new data point according to the GPS data of the new data point, gathers past data points neighboring the new data point from the training database, and performs the data training process on the new data point and the past data points to generate a classified data group. Referring to
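The classification of a new data point by geographic area and the gathering of neighboring past data points could be sketched as below; the grid granularity, the dictionary-based training store, and the helper names are assumptions for illustration, and the sketch reuses the hypothetical DataPoint fields from the earlier example.

```python
# Hypothetical classification of a new data point by geographic area and
# gathering of neighboring past data points; grid size and store are assumed.
from collections import defaultdict

GRID_DEG = 0.001  # assumed granularity, roughly 100 m cells

training_store = defaultdict(list)  # area key -> list of past data points


def area_key(lat, lon):
    """Map a GPS position to a coarse geographic-area key."""
    return (round(lat / GRID_DEG), round(lon / GRID_DEG))


def gather_neighbors(point):
    """Collect past data points in the point's cell and the 8 surrounding cells."""
    ki, kj = area_key(point.latitude, point.longitude)
    neighbors = []
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            neighbors.extend(training_store[(ki + di, kj + dj)])
    return neighbors


def add_training_point(point):
    """Classify the new data point into its area and form a classified data group."""
    group = gather_neighbors(point) + [point]
    training_store[area_key(point.latitude, point.longitude)].append(point)
    return group
```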
The feature selection unit 206 then generates weights for the data points according to the timestamps of the data points, and updates the data of the classified data group according to the weights. The earlier the timestamp of a data point is, the lower the weight of the data point is. The feature selection unit 206 then analyzes the data points of the classified data group to extract critical features, thereby reducing data dimensionality and increasing data processing speed. In one embodiment, the feature selection unit 206 performs a principal component analysis (PCA) on the data points of the classified data group to generate the critical features. Referring to
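The timestamp weighting and the principal component analysis can be sketched with NumPy as follows; the exponential half-life and the number of retained components are assumed values, and the returned mean and projection matrix stand in for one entry of a feature matrix.

```python
# Illustrative timestamp weighting and PCA for one classified data group;
# the decay constant and component count are assumptions.
import time
import numpy as np


def timestamp_weights(timestamps, half_life_s=1800.0):
    """Older data points receive lower weights (exponential decay)."""
    ages = time.time() - np.asarray(timestamps, dtype=float)
    return np.exp(-np.log(2.0) * ages / half_life_s)


def feature_matrix(samples, timestamps, n_components=3):
    """Return a weighted mean and an (n_features x n_components) projection matrix."""
    X = np.asarray(samples, dtype=float)      # rows: data points, columns: raw features
    w = timestamp_weights(timestamps)
    mean = np.average(X, axis=0, weights=w)
    cov = np.cov(X - mean, rowvar=False, aweights=w)
    eigvals, eigvecs = np.linalg.eigh(cov)    # eigenvalues in ascending order
    return mean, eigvecs[:, ::-1][:, :n_components]  # keep the top components
```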
Because the data points taken as an input to the traffic situation awareness system 200 are divided into a plurality of classified data groups according to the geographical areas of the data points, the traffic situation awareness system 200 can perform statistical processes such as data training, principal component analysis, and linear discriminant analysis on the classified data groups to derive the feature matrixes which are to be stored in the feature matrix database 212, wherein the feature matrixes respectively correspond to the classified data groups. When the feature extraction unit 204 generates a new data point, the data grouping unit 210 searches the feature matrix database 212 according to the geographical area of the new data point (step 311) to obtain a calculation matrix near the geographical area of the new data point, and then calculates a similar feature group of the new data point (step 312). If the data grouping unit 210 successfully finds a similar feature group corresponding to the new data point in the feature matrix database 212 (step 313), the situation awareness unit 214 then analyzes the statistics data of the similar feature group according to a plurality of situation awareness rules to obtain traffic information corresponding to the geographical area of the new data point (step 315).
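Steps 311 through 315 might then be sketched as a lookup, a projection, a nearest-group search, and a few rules; the database layout, the Euclidean distance measure, the congestion threshold, and the rule names are all illustrative assumptions, and area_key refers to the hypothetical helper in the earlier sketch.

```python
# Hypothetical lookup of the area's feature matrix, nearest-group search,
# and rule-based generation of traffic information; all values are assumed.
import numpy as np

feature_matrix_db = {}  # area key -> {"mean": ..., "matrix": ..., "groups": [...]}


def find_similar_group(point_vector, lat, lon):
    entry = feature_matrix_db.get(area_key(lat, lon))
    if entry is None or not entry["groups"]:
        return None                                    # no matrix or groups for this area
    projected = (np.asarray(point_vector) - entry["mean"]) @ entry["matrix"]
    return min(entry["groups"],
               key=lambda g: np.linalg.norm(projected - g["centroid"]))


def apply_awareness_rules(group):
    """Turn group statistics into traffic information."""
    info = {}
    if group["mean_speed_kmh"] < 20.0:
        info["congestion"] = True                      # assumed rule, not from the disclosure
    if group["mean_angular_accel"] > 1.0:
        info["possible_detour_or_lane_change"] = True  # assumed rule, not from the disclosure
    return info
```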
Referring to
Referring to
While the disclosure has been described by way of example and in terms of embodiments, it is to be understood that the disclosure is not limited thereto. To the contrary, it is intended to cover various modifications and similar arrangements (as would be apparent to those skilled in the art). Therefore, the scope of the appended claims should be accorded the broadest interpretation so as to encompass all such modifications and similar arrangements.
Number | Date | Country | Kind
---|---|---|---
099144710 | Dec. 20, 2010 | TW | national