The present invention relates to a driving assistance device, and more particularly to a three-dimensional driving navigation device.
With the advancement of science and technology, electronic products are used increasingly widely. For example, smart phones, personal digital assistants, navigation devices, and tablet computers have already become popular electronic products. During driving, a driver may use an electronic product to plan a navigation path according to a current position, so that the driver can travel to a destination with assistance and prompts, making driving smoother and more convenient.
However, existing navigation devices still fail to meet some driving requirements. For example, when a driver is unfamiliar with the route and has to follow another vehicle (for example, a friend or relative's vehicle), the driver often loses track of the other vehicle because of driving conditions (for example, traffic jams or high speed), resulting in trouble and danger during driving. Alternatively, when the driver wants to take the same shortcut as the friend, an existing navigation device cannot provide such guidance.
In view of the foregoing problems, in an embodiment, a three-dimensional driving navigation device is provided, applicable to a vehicle. The three-dimensional driving navigation device includes a lens group, a three-dimensional image synthesis module, a GPS module, a communications module, a map information database, a processing module, and a display module. The lens group includes a plurality of lenses, the lenses being respectively disposed at different positions around the vehicle, and respectively photographing a plurality of external images around the vehicle. The three-dimensional image synthesis module is connected to the lens group. The three-dimensional image synthesis module receives the external images and synthesizes the external images into a three-dimensional panoramic projection image. The GPS module detects and outputs a real-time vehicle position of the vehicle. The communications module receives a history driving trajectory transferred from outside. The map information database stores map information. The processing module is connected to the three-dimensional image synthesis module, the GPS module, the communications module, and the map information database. The processing module receives the real-time vehicle position, the history driving trajectory, the map information, and the three-dimensional panoramic projection image, and compares the real-time vehicle position, the map information, and the history driving trajectory to generate a driving path. The processing module superimposes a virtual guide image on a position corresponding to the driving path in the three-dimensional panoramic projection image. The display module is connected to the processing module. The display module displays the three-dimensional panoramic projection image and the virtual guide image.
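By way of a non-limiting illustration only, the interaction among these modules over one display frame may be summarized in the following Python-style sketch; every object and function name here (for example, synthesize, generate_driving_path) is a hypothetical placeholder and does not form part of the claimed device:

    # Illustrative sketch of the per-frame data flow among the modules described above.
    # All module objects and method names are hypothetical placeholders.
    def navigation_frame(lens_group, synthesis_module, gps_module, comms_module,
                         map_database, processing_module, display_module):
        external_images = [lens.capture() for lens in lens_group]       # lens group
        panorama_3d = synthesis_module.synthesize(external_images)      # 3-D panoramic projection image
        vehicle_position = gps_module.read_position()                   # real-time vehicle position
        history_trajectory = comms_module.receive_history_trajectory()  # transferred from outside
        map_info = map_database.load()                                   # stored map information

        driving_path = processing_module.generate_driving_path(
            vehicle_position, map_info, history_trajectory)
        frame = processing_module.superimpose_guide(panorama_3d, driving_path)
        display_module.show(frame)                                       # panorama plus virtual guide image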
In an embodiment, the virtual guide image may be a three-dimensional image, and a fixed distance is kept between the virtual guide image and the real-time vehicle position. Alternatively, the virtual guide image may be a trajectory line, and the virtual guide image is set along the driving path.
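As a non-limiting sketch of how a fixed distance between the virtual guide image and the real-time vehicle position might be maintained, the guide position can be obtained by advancing a fixed arc length along the driving path. The waypoint representation, the 30-metre lead distance, and the function name below are illustrative assumptions only:

    import math

    def place_guide_image(path, vehicle_pos, lead_distance=30.0):
        """Return a point on the driving path that stays a fixed distance ahead of the
        real-time vehicle position. 'path' is a list of (x, y) waypoints in metres."""
        # Find the waypoint closest to the vehicle.
        nearest = min(range(len(path)), key=lambda i: math.dist(path[i], vehicle_pos))
        # Walk forward along the path until the accumulated length reaches lead_distance.
        travelled = 0.0
        for i in range(nearest, len(path) - 1):
            step = math.dist(path[i], path[i + 1])
            if travelled + step >= lead_distance:
                t = (lead_distance - travelled) / step
                return (path[i][0] + t * (path[i + 1][0] - path[i][0]),
                        path[i][1] + t * (path[i + 1][1] - path[i][1]))
            travelled += step
        return path[-1]  # near the end of the path, the guide clamps to the last waypoint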
In an embodiment, the processing module outputs a departure signal when the real-time vehicle position departs from the driving path, so that the communications module receives a corrected driving trajectory transferred from outside. The processing module receives the corrected driving trajectory and compares the corrected driving trajectory with the map information to generate a corrected driving path. The processing module superimposes the virtual guide image on a position corresponding to the corrected driving path in the three-dimensional panoramic projection image. In an embodiment, the corrected driving path is a path for returning to the driving path, or the corrected driving path is another driving path.
In an embodiment, the three-dimensional driving navigation device may further include a navigation module, connected to the processing module. The processing module outputs a departure signal when the real-time vehicle position departs from the driving path. The navigation module receives the departure signal and correspondingly outputs a corrected navigation route for returning to the driving path. In an embodiment, the processing module further superimposes the virtual guide image on a position corresponding to the corrected navigation route in the three-dimensional panoramic projection image.
In an embodiment, the three-dimensional driving navigation device may further include an input module, connected to the communications module. The input module receives a search condition. The communications module outputs the search condition, and the history driving trajectory corresponds to the search condition.
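Purely as an illustration of how a search condition might be packaged for transmission by the communications module, the sketch below encodes it as a small JSON message; the field names (vehicle_id, start_time, end_time) are assumptions made for this example and are not prescribed by the embodiment:

    import json

    def build_trajectory_request(vehicle_id, start_time, end_time):
        """Package a search condition for the communications module to send out.
        The fields below merely identify whose trajectory is wanted and over what
        time window; they are illustrative assumptions only."""
        return json.dumps({
            "vehicle_id": vehicle_id,
            "start_time": start_time,
            "end_time": end_time,
        })

    # Example: request the history driving trajectory recorded by a hypothetical vehicle.
    request = build_trajectory_request("FRIEND-01", "08:00:00", "09:00:00")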
In an embodiment, the display module may be embedded in a windshield of the vehicle, and the three-dimensional panoramic projection image overlaps an external view in front of the vehicle.
In an embodiment, the processing module may further superimpose a direction indication image in the three-dimensional panoramic projection image according to the driving path.
In conclusion, by means of image processing and synthesis, the three-dimensional driving navigation device in the embodiments of the present invention establishes a three-dimensional panoramic projection image, receives a history driving trajectory of another vehicle to generate a driving path, and superimposes a virtual guide image on a position corresponding to the driving path in the three-dimensional panoramic projection image, to enable a driver to intuitively travel according to a driving route of the other vehicle.
The present invention will become more fully understood from the detailed description given herein below, which is for illustration only and thus is not limitative of the present invention, and wherein:
Referring to
As shown in
In addition, the front-view lens 10F, the rear-view lens 10B, the left-view lens 10L, and the right-view lens 10R may be specifically wide-angle lenses or fisheye lenses. The vehicle-body front-side image IF, the vehicle-body rear-side image IB, the vehicle-body left-side image IL, and the vehicle-body right-side image IR at least partially overlap each other. That is, the vehicle-body front-side image IF, the vehicle-body rear-side image IB, the vehicle-body left-side image IL, and the vehicle-body right-side image IR all partially overlap each other without any gap, so as to obtain a complete image around the vehicle 2. The lens group 10 outputs the vehicle-body front-side image IF, the vehicle-body rear-side image IB, the vehicle-body left-side image IL, and the vehicle-body right-side image IR.
The three-dimensional image synthesis module 20 may be specifically implemented by using a microcomputer, a processor, or a dedicated chip, and the three-dimensional image synthesis module 20 may be mounted on the vehicle 2. The three-dimensional image synthesis module 20 is connected to the front-view lens 10F, the rear-view lens 10B, the left-view lens 10L, and the right-view lens 10R. The three-dimensional image synthesis module 20 receives the vehicle-body front-side image IF, the vehicle-body rear-side image IB, the vehicle-body left-side image IL, and the vehicle-body right-side image IR. It may first combine these images into a planar panoramic image, then synthesize the planar panoramic image into a three-dimensional panoramic projection image Isurr by means of back projection, and output the three-dimensional panoramic projection image Isurr.
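One greatly simplified form that the back projection step could take is to map the stitched planar panorama onto a surface surrounding the vehicle. The sketch below projects the panorama onto a cylindrical surface using numpy; the surface shape, its dimensions, and the function name are illustrative assumptions rather than the specific method used by the embodiment:

    import numpy as np

    def back_project_panorama(panorama, radius=10.0, height=3.0, rings=32, sectors=128):
        """Map a planar 360-degree panorama onto a cylindrical surface around the vehicle.
        'panorama' is an H x W x 3 image whose columns span 0 to 360 degrees of azimuth.
        Returns the 3-D vertex positions and the colour sampled for each vertex."""
        h, w, _ = panorama.shape
        vertices, colors = [], []
        for r in range(rings):
            z = height * r / (rings - 1)              # the wall rises with each ring
            for s in range(sectors):
                theta = 2.0 * np.pi * s / sectors     # azimuth around the vehicle
                vertices.append((radius * np.cos(theta), radius * np.sin(theta), z))
                u = int(s / sectors * (w - 1))              # panorama column from azimuth
                v = int((1.0 - r / (rings - 1)) * (h - 1))  # panorama row from ring height
                colors.append(panorama[v, u])
        return np.array(vertices), np.array(colors)

    # Example with a placeholder panorama; a real device would first stitch the four lens images.
    dummy_panorama = np.zeros((180, 720, 3), dtype=np.uint8)
    verts, cols = back_project_panorama(dummy_panorama)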
Alternatively, referring to
The GPS module 30 may be specifically implemented by using a microcomputer, a processor, or a dedicated chip and mounted in the vehicle 2, so as to detect and output a real-time vehicle position (that is, the position of the vehicle 2) by using satellite signals. In some embodiments, the GPS module 30 may instead be located in a wearable device (for example, a watch or wristband) worn by a driver, or in an electronic product (for example, a smart phone or a tablet computer).
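As one possible illustration of how such a real-time vehicle position could be read out, the sketch below decodes latitude and longitude from a standard NMEA GGA sentence; the GPS module of the embodiment is not limited to this interface, and the sample sentence is fabricated for the example:

    def parse_gga(sentence):
        """Extract (latitude, longitude) in decimal degrees from an NMEA GGA sentence.
        Shown only to illustrate one way a real-time vehicle position might be obtained."""
        fields = sentence.split(",")
        if not fields[0].endswith("GGA"):
            raise ValueError("not a GGA sentence")
        lat = float(fields[2][:2]) + float(fields[2][2:]) / 60.0   # ddmm.mmmm -> degrees
        if fields[3] == "S":
            lat = -lat
        lon = float(fields[4][:3]) + float(fields[4][3:]) / 60.0   # dddmm.mmmm -> degrees
        if fields[5] == "W":
            lon = -lon
        return lat, lon

    # Example GGA sentence (checksum omitted for brevity).
    print(parse_gga("$GPGGA,064036,2503.7598,N,12133.6462,E,1,08,0.9,15.0,M,,,,"))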
The communications module 40 receives a history driving trajectory transferred from outside. In an embodiment, the history driving trajectory may be formed of a plurality of history driving positions (that is, previous GPS positions) of an external vehicle 4. As shown in
As shown in
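By way of a non-limiting illustration, the history driving trajectory received by the communications module 40 can be thought of as a list of timestamped GPS positions of the external vehicle 4. The sketch below decodes such a trajectory from a JSON payload; the record layout and field names are assumptions made only for this example:

    import json
    from dataclasses import dataclass

    @dataclass
    class HistoryPosition:
        """One previous GPS position of the external vehicle (illustrative record layout)."""
        timestamp: str
        latitude: float
        longitude: float

    def decode_history_trajectory(payload):
        """Decode a history driving trajectory received by the communications module."""
        return [HistoryPosition(p["t"], p["lat"], p["lon"]) for p in json.loads(payload)]

    # Example payload as it might arrive from a remote end.
    payload = '[{"t": "08:00:00", "lat": 25.0400, "lon": 121.5600}, {"t": "08:00:05", "lat": 25.0410, "lon": 121.5610}]'
    trajectory = decode_history_trajectory(payload)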
The map information database 50 stores map information. In an embodiment, the map information database 50 is located in a navigation device, and the map information is built into the map information database 50. Alternatively, the three-dimensional driving navigation device 1 downloads the map information in real time from a remote end by using the communications module 40 and saves the map information in the map information database 50.
The processing module 60 may be specifically implemented by using a microcomputer, a processor, or a dedicated chip and mounted in the vehicle 2. The processing module 60 is connected to the three-dimensional image synthesis module 20, the GPS module 30, the communications module 40, and the map information database 50. The processing module 60 receives the real-time vehicle position, the history driving trajectory, the map information, and the three-dimensional panoramic projection image Isurr. When the real-time vehicle position is located on the history driving trajectory, the processing module 60 compares the real-time vehicle position, the map information, and the history driving trajectory to generate a driving path. The processing module 60 superimposes a virtual guide image Vg on a position corresponding to the driving path in the three-dimensional panoramic projection image Isurr. In particular, after comparing the map information with the history driving trajectory, the processing module 60 may obtain the corresponding position of the history driving trajectory on the actual map, and then generate, according to the real-time vehicle position, a driving path with the real-time vehicle position as a starting point. The processing module 60 then superimposes the virtual guide image Vg on a road corresponding to the driving path in the three-dimensional panoramic projection image Isurr to form a three-dimensional navigation image In (referring to
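A minimal sketch of this path-generation step is given below, assuming the history driving trajectory has already been matched to the map information and converted into planar (x, y) waypoints; the 25-metre tolerance and the function name are illustrative assumptions:

    import math

    def generate_driving_path(history_trajectory, vehicle_pos, on_path_tolerance=25.0):
        """Generate a driving path whose starting point is the real-time vehicle position.
        'history_trajectory' is a list of (x, y) waypoints already matched to the map."""
        # Find the trajectory waypoint nearest to the vehicle.
        nearest = min(range(len(history_trajectory)),
                      key=lambda i: math.dist(history_trajectory[i], vehicle_pos))
        # The vehicle is treated as being on the trajectory only when close enough to it.
        if math.dist(history_trajectory[nearest], vehicle_pos) > on_path_tolerance:
            return None  # not on the history driving trajectory; no driving path is generated
        # The remaining trajectory, taken from the vehicle onward, becomes the driving path.
        return [vehicle_pos] + history_trajectory[nearest + 1:]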
The display module 70 is specifically a display screen disposed inside the vehicle 2 (for example, on a dashboard) and connected to the processing module 60. The display module 70 displays the three-dimensional panoramic projection image Isurr and the virtual guide image Vg (that is, the three-dimensional navigation image In). As shown in
Referring to both
In conclusion, by means of image processing and synthesis, the three-dimensional driving navigation device in the embodiments of the present invention establishes a three-dimensional panoramic projection image, receives a history driving trajectory of another vehicle to generate the driving path, and superimposes a virtual guide image on a position corresponding to the driving path in the three-dimensional panoramic projection image, so as to enable a driver to intuitively travel according to a driving route of the other vehicle.
Further, for example, when a driver is unfamiliar with the route and follows a friend's vehicle, if the driver loses track of the friend's vehicle because of a traffic jam or the excessively fast speed of the friend's vehicle, the three-dimensional driving navigation device 1 may receive a history driving trajectory of the friend's vehicle to generate the three-dimensional navigation image In, so that the driver can intuitively follow the friend's driving route to travel. Alternatively, if the driver wants to travel according to the friend's driving route (for example, because the friend's route enables the driver to reach the destination sooner or has smoother traffic), the three-dimensional driving navigation device 1 may likewise receive the history driving trajectory of the friend's vehicle to generate the three-dimensional navigation image In, so that the driver can intuitively follow the friend's driving route to travel.
In an embodiment, the processing module 60 further superimposes a direction indication image Vd in the three-dimensional panoramic projection image according to the driving path, so that the driver can know in advance the subsequent moving direction indicated by the virtual guide image Vg and react promptly, thereby further improving driving safety. As shown in
As shown in
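One way the subsequent moving direction could be derived from the driving path is to compare the bearings of successive path segments. The following sketch classifies the next manoeuvre as a left turn, right turn, or straight ahead; the 30-degree threshold and the planar-coordinate assumption are illustrative only:

    import math

    def bearing(p, q):
        """Bearing in degrees from point p to point q on a planar (x, y) path."""
        return math.degrees(math.atan2(q[1] - p[1], q[0] - p[0]))

    def next_direction_indication(path, turn_threshold=30.0):
        """Classify the upcoming manoeuvre along the driving path as 'left', 'right', or 'straight'."""
        for a, b, c in zip(path, path[1:], path[2:]):
            change = (bearing(b, c) - bearing(a, b) + 180.0) % 360.0 - 180.0
            if change > turn_threshold:
                return "left"    # counter-clockwise change of bearing
            if change < -turn_threshold:
                return "right"   # clockwise change of bearing
        return "straight"

    # Example: a path that bends to the left after its second segment.
    print(next_direction_indication([(0, 0), (10, 0), (20, 0), (25, 8)]))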
In an embodiment, the processing module 60 outputs a departure signal when the real-time vehicle position departs from the driving path, so that the communications module 40 receives a corrected driving trajectory transferred from outside. The processing module 60 receives the corrected driving trajectory and compares the corrected driving trajectory with the map information to generate a corrected driving path. The processing module 60 superimposes the virtual guide image Vg on a position corresponding to the corrected driving path in the three-dimensional panoramic projection image Isurr.
For example, if the vehicle 2 switches to another road segment because of some conditions (for example, the road is under construction or an accident occurs ahead), the processing module 60 may determine, based on the fact that the real-time vehicle position is not on the driving path, that the vehicle 2 has departed from the driving path, and output a departure signal to drive the communications module 40 to download and receive, from the remote server 5 or the Internet of Vehicles through the Internet, a corrected driving trajectory transferred by another external vehicle 4. The corrected driving trajectory is also formed of a plurality of history driving positions (that is, previous GPS positions) of the other external vehicle 4, and the real-time vehicle position is located on the corrected driving trajectory.
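One plausible way to decide that the real-time vehicle position has departed from the driving path is to measure its distance to every segment of the path and compare it against a threshold. The sketch below uses planar coordinates and a 30-metre threshold, both of which are assumptions for illustration only:

    import math

    def distance_to_segment(p, a, b):
        """Shortest distance from point p to the line segment a-b (planar coordinates)."""
        ax, ay = a; bx, by = b; px, py = p
        dx, dy = bx - ax, by - ay
        seg_len_sq = dx * dx + dy * dy
        if seg_len_sq == 0.0:
            return math.dist(p, a)
        t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len_sq))
        return math.dist(p, (ax + t * dx, ay + t * dy))

    def has_departed(vehicle_pos, driving_path, threshold=30.0):
        """Return True when the real-time vehicle position is farther than 'threshold'
        from every segment of the driving path."""
        return all(distance_to_segment(vehicle_pos, a, b) > threshold
                   for a, b in zip(driving_path, driving_path[1:]))

    # Example: a vehicle 50 metres away from a straight path is treated as having departed.
    print(has_departed((50, 50), [(0, 0), (100, 0)]))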
The processing module 60 receives the corrected driving trajectory and then compares it with the map information to generate the corrected driving path, and the processing module 60 further superimposes the virtual guide image Vg (for example, the virtual vehicle or trajectory line) on a position corresponding to the corrected driving path in the three-dimensional panoramic projection image Isurr, to enable the driver to intuitively follow the virtual guide image Vg when traveling on the corrected driving path.
In some embodiments, the corrected driving path may be a path for returning to the original driving path, so that the driver can continue to follow the virtual guide image Vg of the original driving path to travel. Alternatively, the corrected driving path may be another driving path. That is, the driver no longer travels according to the driving path corresponding to the original history driving trajectory, but instead, travels according to a new driving path.
As shown in
Although the present invention has been described in considerable detail with reference to certain preferred embodiments thereof, the foregoing disclosure is not intended to limit the scope of the invention. Persons having ordinary skill in the art may make various modifications and changes without departing from the scope and spirit of the invention. Therefore, the scope of the appended claims should not be limited to the description of the preferred embodiments given above.