This application claims the priority benefit of Taiwan application serial no. 104127060, filed on Aug. 19, 2015. The entirety of the above-mentioned patent application is hereby incorporated by reference herein and made a part of this specification.
Field of the Invention
The invention relates to a system and a control method thereof, and particularly relates to an augmented reality interactive system and a dynamic information interactive display method thereof.
Description of Related Art
Along with the development of technology and increasingly affluent lifestyles, transportation vehicles have become increasingly popular in ordinary families. However, as transportation vehicles (for example, cars, ships, airplanes, etc.) are used more and more frequently, the number of resulting traffic accidents also increases significantly. Taking a car as an example, in order to improve driving safety, a head-up display (HUD) system has become basic equipment of many vehicles. The HUD system may project driving information such as a vehicle speed, a fuel capacity, a mileage and distances to front and rear vehicles, etc., onto a front windshield of the vehicle, such that the driver may observe the driving information projected on the front windshield while paying attention to road conditions through the front windshield. Therefore, the driver is not distracted by looking down at a car dashboard during the driving process, so as to avoid a traffic accident.
However, in order to fit within the limited space of the vehicle, the HUD system generally adopts a miniaturized design, so that the HUD system can only project and display an image within a small, fixed region. Therefore, from the driver's point of view, the information displayed by a general HUD system cannot directly indicate the scenes and objects in the driving direction or the information between vehicles, so that it is still difficult for the driver to intuitively determine an actual road condition according to the projected display information.
Moreover, regarding an application of a general HUD system, the driver may adjust the settings and functions of the HUD system only by manually operating a computer input interface. In other words, it is difficult for the driver to directly control the HUD system while driving the vehicle, which limits the application range of the HUD system.
The invention is directed to an augmented reality interactive system and a dynamic information interactive display method thereof, which are capable of resolving the problem mentioned in the related art.
The invention provides an augmented reality interactive system, which is adapted to be disposed in a transportation vehicle, and the augmented reality interactive system includes a transparent display, a motion detection unit, and a processing unit. The transparent display has a display panel pervious to light. The display panel is adapted to serve as a windshield of the transportation vehicle, where the transparent display controls an image displayed on the display panel according to a display signal, so as to display interactive information on the display panel. The motion detection unit is configured to detect an operation motion of a user, so as to generate a control command. The processing unit is coupled to the transparent display and the motion detection unit, and is configured to receive the control command, so as to generate the corresponding display signal based on the operation motion for controlling an operation of the transparent display.
The invention provides a dynamic information interactive display method applied to a transportation vehicle, which includes following steps. Interactive information is displayed through a display panel pervious to light, where the display panel is adapted to serve as a windshield of the transportation vehicle, and an image displayed on the display panel is controlled by a display signal. An operation motion of a user is detected through a motion detection unit, so as to generate a control command. The control command is received through a processing unit, so as to generate the corresponding display signal based on the operation motion for controlling the display panel.
The invention provides an augmented reality interactive system, which is adapted to be disposed in a transportation vehicle, and the augmented reality interactive system includes a transparent substrate, a motion detection unit, and a processing unit. The transparent substrate is pervious to light and has a display function, where the transparent substrate is adapted to serve as a windshield of the transportation vehicle. The motion detection unit is configured to detect an operation motion, so as to generate a control command. The processing unit is coupled to the transparent substrate and the motion detection unit, and is configured to receive the control command, so as to control an operation of the transparent substrate based on the operation motion.
According to the above descriptions, the embodiments of the invention provide an augmented reality interactive system and a dynamic information interactive display method thereof, by which the interactive information can be displayed on the windshield, and the interactive information can be integrated with scenes and objects in front of the transportation vehicle to form an augmented reality image without requiring the driver to look down. In collaboration with extensible application programs, the driver is able to perform an interactive operation with the augmented reality image, so as to obtain more complete driving information and driving assistance to improve driving safety and operability.
In order to make the aforementioned and other features and advantages of the invention comprehensible, several exemplary embodiments accompanied with figures are described in detail below.
The accompanying drawings are included to provide a further understanding of the invention, and are incorporated in and constitute a part of this specification. The drawings illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention.
Reference will now be made in detail to the present preferred embodiments of the invention, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers are used in the drawings and the description to refer to the same or like parts.
The transparent display 110 has a display panel DP pervious to light and a driving portion (not shown) used for driving the display panel DP, where the display panel DP pervious to light is disposed in the transportation vehicle to serve as a windshield of the transportation vehicle, as shown in
In the present embodiment, the display panel DP is, for example, a side light-incident type liquid crystal display (LCD) panel driven according to a field-sequential-color method, a self-luminous active matrix organic light-emitting diode (AMOLED) panel made of a transparent material, an electrowetting display panel adopting a transparent ink and a hydrophobic layer material, or any other type of transparent substrate, which is not limited by the invention. In other words, as long as a driver can observe objects at one side of the display panel DP from the other side of the display panel DP (i.e. the display panel DP is pervious to light), and the display panel DP has an image display function, the display structure thereof is considered to comply with the transparent display 110 of the invention.
It should be noted that although in the embodiment of
The motion detection unit 120 is configured to detect an operation motion of the driver, and generates a corresponding control command CMD according to the operation motion. The operation motion can be at least one of a gesture motion, a voice control motion, an eye control motion and a brain wave control motion according to a design requirement. A hardware configuration of the motion detection unit 120 can be designed according to the selected operation type. For example, if the operation motion is a gesture motion or an eye control motion, the motion detection unit 120 is, for example, implemented by an image capturing device and a corresponding image processing circuit; if the operation motion is a voice control operation, the motion detection unit 120 is, for example, implemented by an audio capturing device and a corresponding audio processing circuit; and if the operation motion is a brain wave control motion, the motion detection unit 120 is, for example, implemented by a brain wave detection device and a corresponding signal processing circuit. Moreover, the motion detection unit 120 can be disposed near a driver seat (for example, disposed on a dashboard as shown in
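Purely by way of illustration, and not as an implementation of the claimed motion detection unit 120, the following Python sketch (all class, device and command names are hypothetical) shows how the selected operation type might be paired with a corresponding capture device and processing routine so that a recognized operation motion is translated into a control command CMD:

```python
# Illustrative sketch only; the device classes and command names are hypothetical.
from dataclasses import dataclass


@dataclass
class ControlCommand:
    name: str  # e.g. "OPEN", "LEFT_SHIFT", "RIGHT_SHIFT", "CLICK", "CLOSE"


class MotionDetectionUnit:
    def __init__(self, operation_type, capture_device, processor):
        # operation_type: "gesture", "voice", "eye" or "brain_wave"
        self.operation_type = operation_type
        self.capture_device = capture_device  # e.g. camera, microphone or brain wave sensor
        self.processor = processor            # matching image/audio/signal processing routine

    def detect(self):
        """Capture one sample and translate a recognized operation motion into a command."""
        raw_sample = self.capture_device.capture()     # image frame, audio clip or signal trace
        motion = self.processor.recognize(raw_sample)  # e.g. returns "swipe_left" or None
        if motion is None:
            return None                                # no predetermined command motion detected
        return ControlCommand(name=motion.upper())
```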
The vehicle dynamic detection unit 130 is configured to detect driving information DINF (for example, a vehicle speed, a driving path offset or a steering wheel turning direction, etc.) of the transportation vehicle and environment information EINF (a position and a distance of an obstacle in the driving direction, an environment light intensity and an environment temperature, etc.) around the transportation vehicle, and provides the detected driving information DINF and the environment information EINF to the processing unit 140. In the present embodiment, the actual hardware of the vehicle dynamic detection unit 130 can be correspondingly set according to a type of the required driving information DINF and the environment information EINF. For example, if the driving information DINF includes the vehicle speed, the vehicle offset and the steering wheel turning direction, the hardware of the vehicle dynamic detection unit 130 may include a vehicle computer originally installed on the transportation vehicle. If the environment information EINF includes the position and the distance of the obstacle in the driving direction, the environment light intensity and the environment temperature, the hardware of the vehicle dynamic detection unit 130 may include an object sensor (for example, an infrared sensor, an ultrasonic sensor, etc.), a light sensor and a temperature sensor, which is determined according to an actual design requirement of a designer, and is not limited by the invention.
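As a purely illustrative aid (the field names and units below are assumptions, not part of the disclosure), the driving information DINF and the environment information EINF described above could be grouped as follows:

```python
# Hypothetical grouping of the detected values; field names and units are assumptions.
from dataclasses import dataclass
from typing import Optional


@dataclass
class DrivingInfo:  # DINF
    speed_kmh: float          # vehicle speed
    path_offset_m: float      # lateral offset from the driving path
    steering_direction: str   # e.g. "left", "right" or "straight"


@dataclass
class EnvironmentInfo:  # EINF
    obstacle_bearing_deg: Optional[float]  # position of an obstacle in the driving direction
    obstacle_distance_m: Optional[float]   # distance to that obstacle, if any
    ambient_light_lux: float               # environment light intensity
    ambient_temp_c: float                  # environment temperature
```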
The processing unit 140 is the control core of the whole augmented reality interactive system 100, which is configured to control an operation of each unit in the augmented reality interactive system 100 and perform signal processing according to the control command CMD, the driving information DINF and the environment information EINF received from each of the units, so as to generate a corresponding display signal VDATA to control an operation of the transparent display 110. The processing unit 140 may implement an interactive control between the driver and the image displayed by the display panel DP according to the control command CMD, and may perform computation processing of an application program according to the driving information DINF and the environment information EINF, or make the display panel DP display auxiliary information related to the driving information DINF and the environment information EINF.
In an exemplary embodiment of the invention, the hardware configuration of the processing unit 140 can be implemented by a processor of the vehicle computer originally installed on the transportation vehicle, and the function of performing the signal processing according to the control command CMD, the driving information DINF and the environment information EINF to generate the corresponding display signal VDATA can be implemented by software. In another exemplary embodiment, the processing unit 140 can also be implemented by independent hardware, which is not limited by the invention.
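For illustration only (reusing the hypothetical field names from the sketch above, and not describing the claimed implementation), a processing routine of this kind might combine the control command CMD with the driving information DINF and the environment information EINF to produce the display signal VDATA as follows:

```python
# Illustrative only; the data shapes follow the hypothetical DrivingInfo/EnvironmentInfo above.
def generate_display_signal(cmd, dinf, einf, layout):
    """Combine CMD, DINF and EINF into a display signal VDATA (here a plain dict)."""
    # Auxiliary information derived from the vehicle and environment readings.
    auxiliary = {
        "speed": dinf.speed_kmh,
        "obstacle_warning": (einf.obstacle_distance_m is not None
                             and einf.obstacle_distance_m < 30.0),  # threshold is an assumption
    }
    # A received control command adjusts the window layout (e.g. maximize/minimize a window).
    if cmd is not None:
        layout = layout.apply(cmd)
    return {"layout": layout, "auxiliary": auxiliary}  # VDATA handed to the transparent display
```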
Then, the motion detection unit 120 detects an operation motion of the driver, and accordingly generates the control command CMD (step S320). To be specific, in step S320, after the motion detection unit 120 detects the operation motion of the driver, the motion detection unit 120 determines whether the detected operation motion complies with a predetermined command motion; if yes, the motion detection unit 120 generates the corresponding control command CMD, and if not, the motion detection unit 120 continues to detect the operation motion of the driver.
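As an illustration of step S320 only (the gesture vocabulary and helper names are assumptions), such a detect-and-match loop might look like the following:

```python
# Sketch of step S320; the gesture vocabulary and helper names are hypothetical.
PREDETERMINED_MOTIONS = {
    "open_palm": "OPEN",
    "swipe_left": "LEFT_SHIFT",
    "swipe_right": "RIGHT_SHIFT",
    "point": "CLICK",
    "fist": "CLOSE",
}


def wait_for_control_command(detect_motion):
    """Keep detecting until the operation motion complies with a predetermined command motion."""
    while True:
        motion = detect_motion()  # one recognized motion label per call, or None
        if motion in PREDETERMINED_MOTIONS:
            return PREDETERMINED_MOTIONS[motion]  # the corresponding control command CMD
        # otherwise: not a command motion, so continue detecting
```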
Then, the processing unit 140 receives the control command CMD generated by the motion detection unit 120, and generates the corresponding display signal VDATA based on the operation motion of the driver, so as to control the image display of the display panel DP (step S330).
In detail, under the system structure of the augmented reality interactive system 100, information can be displayed on the windshield of the transportation vehicle and integrated with the scenes and objects in front of the transportation vehicle, so as to implement a display application of augmented reality. In collaboration with different types of application programs, for example, GPS navigation, reverse display, a driving visual enhancement technology, etc., an interactive control between the driver and an augmented reality image (i.e. the interactive information combined with the scenes and objects in front of the transportation vehicle) can be implemented in a somatosensory control manner. Therefore, under the system structure of the invention, a plurality of interactive functions that facilitate vehicle driving can be developed.
For example, the interactive information can be designed as interactive information IMG shown in
The main region Rm can be used to display the currently executed application program window EPW, the function list FL for listing application programs or data folders, and other auxiliary information AINF related to the driving information DINF or the environment information EINF. In the present embodiment, a window position and a window size of the currently executed application program window EPW and the auxiliary information AINF in the main region Rm can be adjusted by the driver through the operation motion. In other words, from a system point of view, the processing unit 140 may generate the corresponding display signal VDATA according to the operation motion of the driver, so as to make the transparent display 110 adjust the window position and the window size of the currently executed application program on the display panel DP. For example, the driver may perform an operation motion to maximize the application program window EPW to occupy the full main region Rm, to set the application program window EPW to a center position, or to minimize the application program window EPW to the background program window BPW.
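Purely as an illustration (the state names are assumptions), the window adjustment described above could be modeled as a small state update driven by the control command:

```python
# Hypothetical window-state handling for the application program window EPW.
def adjust_window(state, cmd):
    """state: dict describing EPW; cmd: "MAXIMIZE", "CENTER" or "MINIMIZE"."""
    if cmd == "MAXIMIZE":
        state.update(region="main", size="full")        # occupy the full main region Rm
    elif cmd == "CENTER":
        state.update(region="main", position="center")  # set EPW to a center position
    elif cmd == "MINIMIZE":
        state.update(region="lower_edge", size="icon")  # become a background program window BPW
    return state


# Example: adjust_window({"region": "main", "size": "half"}, "MINIMIZE")
```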
The background program window BPW can be set to be displayed in the lower edge region Re2 of the interactive information IMG. In the present embodiment, the application program set to the background program is continually kept in a running state. Taking an application program of a navigation map shown in
Moreover, in an application of the embodiment, the system may operate based on a method similar to a simplex operation, such that only a single application program window EPW can be displayed in the main region Rm at the same time. Namely, while one application program is being executed, another application program cannot be executed. However, in the application example of the simplex operation, if the currently executed application program is minimized to a background program, another application program can be opened in the main region Rm. A plurality of the background program windows BPW can be displayed in the lower edge region Re2 at the same time. In other words, when an application program is executed and is not set to a background program, the processing unit 140 may prohibit execution of another application program. Conversely, when the currently executed application program is set to a background program, the processing unit 140 allows another application program to be executed. However, the invention is not limited thereto. In another application of the embodiment, the system may also operate based on a method similar to a multiplex operation, such that the processing unit 140 may simultaneously open a plurality of application programs and display the application program windows EPW of the application programs in the main region Rm at the same time.
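By way of illustration only (the names are hypothetical), the simplex execution policy described above might be sketched as follows:

```python
# Hypothetical simplex execution policy: only one foreground application window at a time.
class SimplexLauncher:
    def __init__(self):
        self.foreground = None  # currently executed application program (window EPW)
        self.background = []    # minimized background program windows (BPW), still running

    def launch(self, app):
        if self.foreground is not None:
            return False        # prohibit execution of another application program
        self.foreground = app
        return True

    def minimize_to_background(self):
        if self.foreground is not None:
            self.background.append(self.foreground)  # keep it running as a background program
            self.foreground = None                   # the main region Rm is free again
```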
It should be noted that each of the display portions (the permanent function column PFC, the function list FL, the application program window EPW, the auxiliary information AINF and the background program window BPW) in the interactive information IMG is presented in a display manner of a transparent window or a linear icon. Therefore, when the driver views the interactive information IMG on the display panel DP, the driver may simultaneously view the scenes and objects located at the other side of the display panel DP without those scenes and objects being shielded by the windows or the function list of the interactive information IMG.
Under such an application, the driver may obtain more complete driving information based on the augmented reality image composed of the information displayed on the display panel DP/windshield and the scenes and objects in front, without obstructing the driving field of vision, so as to improve driving safety.
It should be noted that the above example is only an exemplary embodiment applying the augmented reality interactive system 100 of the invention, and the invention is not limited thereto. Actually, the functions provided by the application programs can be independently extended and developed under the system structure of the invention according to requirements of the designer. For example, in another embodiment, the application program can also be a basic GPS navigation map or an application program providing a visual enhancement function at night.
Moreover, since the present application directly uses the display panel DP to serve as the windshield, the image can be displayed at any position of the display panel DP. In other words, under the system structure of the present application, it is easier, compared with a general projection type head-up display (HUD) system, to adjust the image displayed on the windshield in accordance with the scenes and objects in front of the transportation vehicle, so as to achieve a more closely integrated augmented reality application.
Embodiments of
Referring to
After the interactive information IMG displays the function list FL, the driver can further move the function option icons FICN in the function list FL through a shift gesture, as shown in
Referring to
Similarly, when the processing unit 140 receives the right shift command, the processing unit 140 shifts the display position of the function option icons FICN on the display panel DP rightwards by one step according to the right shift command. For example, the function option icon FICN located in the currently selected region CSR is shifted to the right side of the currently selected region CSR, and the function option icon FICN originally located at the left side of the currently selected region CSR is shifted to the currently selected region CSR.
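For illustration only (the list handling below is an assumption, not the claimed method), the left shift and right shift of the function option icons FICN around the fixed currently selected region CSR can be viewed as rotating a list by one step:

```python
# Hypothetical handling of the left/right shift commands for the function option icons FICN.
# The currently selected region CSR is a fixed position on the display panel; shifting the
# icons rotates the list so that a different icon comes to rest in the CSR.
def shift_icons(icons, cmd):
    if cmd == "RIGHT_SHIFT":
        return icons[-1:] + icons[:-1]  # the icon left of the CSR moves into the CSR
    if cmd == "LEFT_SHIFT":
        return icons[1:] + icons[:1]    # the icon right of the CSR moves into the CSR
    return icons


# Example: shift_icons(["Navigation", "Music", "Phone", "Camera"], "RIGHT_SHIFT")
# -> ["Camera", "Navigation", "Music", "Phone"]
```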
After the application program to be executed is selected, the driver may further select to execute the application program or the data folder corresponding to the function option icon FICN through a click gesture, as shown in
Referring to
It should be noted that the gesture applications of the aforementioned embodiments are only examples, which are not used to limit the application range of the invention. In other embodiments, a gesture setting of the open gesture, the shift gesture, the click gesture, etc. of the invention can all be self-defined as any gesture motion according to an actual requirement of the designer, which is not limited by the invention.
After the driver finishes using the functions of the application program and wants to close the application program, the driver may further close the application program or the data folder corresponding to the function option icon FICN through a close gesture, as shown in
Referring to
It should be noted that the close gesture/close command of the invention is not limited to the above application. In an exemplary embodiment, the processing unit 140 may close all of the running background programs according to the close command, so as to release a memory space. In other words, in the invention, the processing unit 140 may close the currently executed application program or data folder or close all of the background programs according to the close command.
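As a sketch only (reusing the hypothetical SimplexLauncher above), the two close behaviors described in this paragraph could be expressed as:

```python
# Hypothetical close handling, reusing the SimplexLauncher sketch above.
def handle_close(launcher, close_all_background=False):
    """Close the currently executed application program, or all background programs."""
    if close_all_background:
        launcher.background.clear()   # release the memory held by the background programs
    elif launcher.foreground is not None:
        launcher.foreground = None    # close the currently executed application or data folder
```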
In summary, the embodiments of the invention provide an augmented reality interactive system and a dynamic information interactive display method thereof, by which the interactive information can be displayed on the windshield, and the interactive information can be integrated with scenes and objects in front of the transportation vehicle to form an augmented reality image without shielding the driver's line of sight. In collaboration with extensible application programs, the driver is able to perform an interactive operation with the augmented reality image, so as to obtain more complete driving information and driving assistance to improve driving safety and operability.
It will be apparent to those skilled in the art that various modifications and variations can be made to the structure of the invention without departing from the scope or spirit of the invention. In view of the foregoing, it is intended that the invention cover modifications and variations of this invention provided they fall within the scope of the following claims and their equivalents.
Number | Date | Country | Kind
104127060 | Aug. 19, 2015 | TW | national