1. Technical Field
The present disclosure relates to monitoring systems, and particularly to a monitoring system that displays information about objectives, such as traffic obstacles, through a transparent display of a portable device.
2. Description of Related Art
Traffic accidents are often caused by driver inattention. For an emergency service vehicle such as a fire truck, an ambulance, or a police car, traffic accidents are liable to happen when the emergency service vehicle is moving at high speed to the location of an emergency. Although alarms such as sirens can be used, it is still difficult for pedestrians or the drivers of other vehicles near the emergency service vehicle to quickly take evasive action. In addition, traffic accidents are also liable to happen at night because of reduced visibility.
Therefore, there is room for improvement in the art.
Many aspects of the present disclosure can be better understood with reference to the drawings. The components in the drawing(s) are not necessarily drawn to scale, the emphasis instead being placed upon clearly illustrating the principles of the present disclosure. Moreover, in the drawing(s), like reference numerals designate corresponding parts throughout the several views.
The display unit 110 includes a transparent display 111. The transparent display 111 is a transparent portion of the display unit 110, such as a display panel, which allows a user 2000 to view a scene through the transparent display 111.
The camera unit 120 produces scene images Gs (not shown) of the scene which can be viewed through the transparent display 111 of the portable device 1000. In the illustrated embodiment, the camera unit 120 includes camera(s) producing the scene images Gs, such as still photographs or videos, wherein the camera unit 120 has night-vision functionality such that the scene images Gs can be produced in darkness and in a lighted environment. In other embodiments, the camera unit 120 can include a plurality of cameras producing the scene images Gs from different directions, thereby avoiding dead spots or blind spots.
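By way of a non-limiting illustration only, the following Python sketch shows one way such a camera unit could poll multiple cameras and collect one scene image from each; the use of the OpenCV library and the device indices are assumptions made for illustration and are not details taken from the disclosure.

```python
# A minimal sketch of the camera unit 120: poll several cameras so the
# combined scene images Gs cover different directions and avoid blind spots.
# Cameras with night-vision functionality would expose the same interface.
# OpenCV and the device indices here are illustrative assumptions.
import cv2

def capture_scene_images(device_indices=(0, 1)):
    """Return one frame per camera; an entry is None if capture failed."""
    frames = []
    for index in device_indices:
        capture = cv2.VideoCapture(index)
        ok, frame = capture.read()
        frames.append(frame if ok else None)
        capture.release()
    return frames

if __name__ == "__main__":
    scene_images = capture_scene_images()
    print(f"captured {sum(f is not None for f in scene_images)} frame(s)")
```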
The storage unit 130 is a device such as a random access memory, a non-volatile memory, or a hard disk drive for storing and retrieving digital information, which stores sample objective data Ds (not shown) including sample objective figures and objective conditions. Herein, "objective", when used as a noun, denotes an object, a movement, or a state (an objective condition) on or of the road which is significant to a driver; "objective data Do" (not shown) means statements or warnings relevant to each objective; and "sample objective data Ds" is the generic name of a pre-recorded collection of all such data. These definitions may be specifically extended hereafter. In the illustrated embodiment, the sample objective figures are figures of possible traffic obstacles such as vehicles, humans, animals, huge objects, suspicious objects, or potholes in the road. The objective conditions are the conditions under which the possible traffic obstacles may cause problems for the user 2000 of the portable device 1000. A possible traffic obstacle can correspond to one or more objective conditions when, for instance, the possible traffic obstacle is located in the middle of a road while the user 2000 is approaching, or the possible traffic obstacle is itself approaching the user 2000 at high speed. In other embodiments, the sample objective figures can be figures of other types of possible objectives, for example, particular and favorite objects of the user 2000.
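As a non-limiting illustration of how the sample objective data Ds might be organized, the following sketch pairs each sample objective figure with its objective conditions; all class and field names are hypothetical and chosen only to mirror the terminology of this section.

```python
# Illustrative data layout for the sample objective data Ds.
# Names are hypothetical; they simply mirror the terms used above.
from dataclasses import dataclass, field

@dataclass
class SampleObjective:
    name: str              # e.g. "vehicle", "pedestrian", "pothole"
    figure: bytes          # pre-recorded sample objective figure (image data)
    conditions: list[str]  # objective conditions, e.g. "in middle of road"

@dataclass
class SampleObjectiveData:
    """The pre-recorded collection Ds of all sample objectives."""
    samples: list[SampleObjective] = field(default_factory=list)

    def figures_of_type(self, name: str) -> list[bytes]:
        return [s.figure for s in self.samples if s.name == name]
```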
The control unit 140 receives the scene images Gs, and determines objective(s) 3000 according to the scene images Gs, for example by using the sample objective data Ds to analyze the scene images Gs by way of comparison.
In the illustrated embodiment, the objective data Do includes objective information data Di (not shown) and objective position data Dp (not shown). The control unit 140 produces the objective information data Di including information concerning the objective 3000, such as the name, the type, and/or the description of the objective 3000, according to, for example, the sample objective figure and the objective condition in the sample objective data Ds which correspond to the objective 3000. For instance, when the control unit 140 determines the objective 3000 to be a possible traffic obstacle in the middle of a road according to the sample objective figure and the objective condition corresponding to the objective 3000, the objective information data Di can include the description of the possible traffic obstacle. The pre-stored information concerning the objective 3000 can be stored in the storage unit 130, or be pre-stored in, and received from, a server connected to the monitoring system through a long distance wireless network, wherein the information can be, for example, augmented reality information received from the server, which is an augmented reality server.
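A minimal sketch of how the objective data Do could be assembled from the information data Di and the position data Dp follows; the structure and helper names are illustrative assumptions, not details taken from the disclosure.

```python
# Sketch of the control unit 140 producing objective data Do, split into
# information data Di and position data Dp as described above.
# The layout and names are assumptions made for illustration.
from dataclasses import dataclass

@dataclass
class ObjectiveData:
    info: str                  # Di: name/type/description of the objective
    position: tuple[int, int]  # Dp: where the virtual image appears

def produce_objective_data(kind: str, condition: str,
                           position: tuple[int, int]) -> ObjectiveData:
    """Build Do from the matched sample figure's type and condition."""
    description = f"{kind} ({condition})"  # e.g. "vehicle (in middle of road)"
    return ObjectiveData(info=description, position=position)

print(produce_objective_data("vehicle", "approaching at high speed", (320, 240)))
```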
The display unit 110 receives the objective data Do from the control unit 140. Objective information 1112 is displayed on the transparent display 111 according to the objective data Do, indicating the virtual image 1111 of the objective 3000 seen through the transparent display 111.
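The following sketch illustrates, under the same caveats, how the objective information 1112 might be placed beside the position given by the objective position data Dp so that it accompanies the virtual image 1111 rather than covering it; the renderer callback is a placeholder, since a real transparent display would supply its own drawing interface.

```python
# Sketch of the display step: the objective information 1112 is drawn at a
# position derived from the objective position data Dp so that it accompanies
# the virtual image 1111. The renderer and the offset are stand-ins.
def display_objective_info(info: str, position: tuple[int, int], draw_text):
    x, y = position                    # Dp: where the virtual image appears
    draw_text(info, (x + 10, y - 10))  # label beside, not over, the image

# Usage with a placeholder renderer that just prints:
display_objective_info("vehicle (approaching at high speed)", (320, 240),
                       draw_text=lambda text, pos: print(f"{pos}: {text}"))
```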
In addition to the camera unit 120, other types of sensors can be used to produce data for comparison with the sample objective data Ds, such that the control unit 140 can identify the objective 3000 to the user 2000 according to the data from the other sensors as well as the scene images Gs produced by the camera unit 120. For instance, microphones can be used to produce environmental voice data, such that the control unit 140 can identify and describe the objective 3000 audibly as well as by the scene images Gs. In addition to the display unit 110 which displays the objective information 1112, other types of devices can be used to provide objective information. For instance, a loudspeaker can be used to receive the objective data Do from the control unit 140 and produce audible warning(s) according to the objective data Do, thereby warning the user 2000 of the appearance of the objective 3000.
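As a rough illustration of such an audible warning path, the sketch below uses pyttsx3, a common offline text-to-speech library; its use here is an assumption made only for illustration, and any real loudspeaker interface could be substituted.

```python
# Illustrative audible warning through a loudspeaker. The choice of pyttsx3
# is an assumption; the disclosure does not specify a speech library.
import pyttsx3

def speak_warning(objective_info: str) -> None:
    engine = pyttsx3.init()
    engine.say(f"Warning: {objective_info} ahead")
    engine.runAndWait()

speak_warning("pedestrian in the middle of the road")
```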
In the illustrated embodiment, the display unit 210 includes a transparent display 211. The transparent display 211 is a transparent AMOLED display disposed on a frame of a glass portion 4100 of the portable device 4000. The camera unit 220 produces the scene images Gs of the scene which can be viewed through the transparent display 211 of the portable device 4000. The storage unit 230 stores the sample objective data Ds including sample objective figures and objective conditions. The control unit 240 receives the scene images Gs and determines the objective(s) 3000 according to the scene images Gs by using the sample objective data Ds to analyze the scene images Gs by way of comparison. The first wireless communication unit 250 communicates with the second wireless communication unit of the vehicle 5000 through a short distance wireless network 6000 implemented according to the BLUETOOTH telecommunication standard or other telecommunication standards such as near field communication (NFC).
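By way of illustration only, the sketch below sends serialized objective data over a BLUETOOTH RFCOMM socket, which CPython exposes on supported platforms; the device address and channel below are placeholders, and the JSON payload layout is an assumption.

```python
# Sketch of the first wireless communication unit 250 sending objective data
# over a short-distance link. Bluetooth RFCOMM sockets are available in
# CPython on supported platforms; address and channel are placeholders.
import json
import socket

def send_objective_data(info: str, position: tuple[int, int],
                        address: str = "00:00:00:00:00:00",
                        channel: int = 1) -> None:
    payload = json.dumps({"info": info, "position": position}).encode()
    with socket.socket(socket.AF_BLUETOOTH, socket.SOCK_STREAM,
                       socket.BTPROTO_RFCOMM) as link:
        link.connect((address, channel))
        link.sendall(payload)
```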
The movement identification unit 270 is disposed on the portable device 4000 to determine a movement (for example, an up, down, left, or right movement) of the portable device 4000. The movement identification unit 270 determines the movement according to the variation of a direction and an angle of the portable device 4000. In the illustrated embodiment, the movement identification unit 270 includes a direction identification unit determining a direction of the portable device 4000 and an angle identification unit determining an angle of the portable device 4000, wherein the direction identification unit may include an electronic compass and the angle identification unit may include a gravity sensor. The camera unit 220 moves according to the movement of the portable device 4000, thereby producing the scene images Gs corresponding to the viewing angle of the user 2000 through the transparent display 211 of the portable device 4000.
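A minimal sketch of the movement determination follows, classifying the movement from the variation in compass heading (direction) and gravity-sensor tilt (angle) as described above; the threshold value and function names are illustrative assumptions.

```python
# Sketch of the movement identification unit 270: classify the device's
# movement from the change in compass heading and gravity-sensor tilt.
# The 2-degree threshold is an illustrative assumption.
def identify_movement(prev_heading: float, heading: float,
                      prev_tilt: float, tilt: float,
                      threshold: float = 2.0) -> str:
    d_heading = heading - prev_heading  # from the electronic compass
    d_tilt = tilt - prev_tilt           # from the gravity sensor
    if abs(d_heading) >= abs(d_tilt):
        if d_heading > threshold:
            return "right"
        if d_heading < -threshold:
            return "left"
    else:
        if d_tilt > threshold:
            return "up"
        if d_tilt < -threshold:
            return "down"
    return "still"

print(identify_movement(90.0, 97.5, 0.0, 0.5))  # -> "right"
```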
A relative location compensation unit can be used to determine a difference between the relative location (for example, the relative distance and/or the relative direction) between the portable device 4000 and the objective 3000 and the relative location between the camera unit 220 and the objective 3000. The control unit 240 can compensate for the difference by, for instance, enabling the camera unit 220 to zoom in or re-orientate according to the difference, or by considering the difference when determining the position of the virtual image 1111 on the transparent display 211, thereby eliminating any inaccuracy between the display and the factual situation which is caused by the difference. The location of the portable device 4000 can be manually configured, or automatically detected by, for instance, using a detection device. In this case, the control unit 240 can compensate for a difference, determined by the relative location compensation unit, between the relative location between the user 2000 and the objective 3000 and the relative location between the camera unit 220 and the objective 3000.
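The compensation described above can be pictured as a simple vector difference, as in the following non-limiting sketch in which plain two-dimensional vectors stand in for the actual relative locations; all names are hypothetical.

```python
# Sketch of the relative location compensation: the difference between the
# device-to-objective vector and the camera-to-objective vector is applied
# as an offset when positioning the virtual image. Plain 2-D vectors are an
# illustrative simplification of the actual relative locations.
def compensation_offset(device_to_objective: tuple[float, float],
                        camera_to_objective: tuple[float, float]):
    """Difference between the two relative locations."""
    return (device_to_objective[0] - camera_to_objective[0],
            device_to_objective[1] - camera_to_objective[1])

def compensated_position(raw_position: tuple[float, float],
                         offset: tuple[float, float]):
    """Shift the virtual-image position by the computed difference."""
    return (raw_position[0] + offset[0], raw_position[1] + offset[1])

offset = compensation_offset((5.0, 2.0), (5.5, 1.5))
print(compensated_position((320.0, 240.0), offset))  # shifted label position
```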
In step S1110, the scene images Gs corresponding to a scene are produced. The objective 3000 is tracked when the objective 3000 moves; as the objective 3000 moves, the scene images Gs are produced corresponding to the movement of the objective 3000. In the illustrated embodiment, camera(s) with night-vision functionality are used to produce the scene images Gs. In addition, step S1110 is performed by the camera unit 120 disposed on the portable device 1000. In other embodiments, step S1110 can be performed by the camera unit 220 disposed on the vehicle 5000. The scene images Gs corresponding to the scene can also be produced according to a movement of the portable device 4000, wherein the movement of the portable device 4000 can be determined according to the variation of a direction and an angle of the portable device 4000.
In step S1120, the objective 3000 is determined according to the scene images Gs. The objective 3000 can be determined according to the scene images Gs by, for instance, using the sample objective data Ds including the sample objective figures and the objective conditions to analyze the scene images Gs. In the illustrated embodiment, the objective 3000 is determined by comparing the scene images Gs with the sample objective figures to recognize possible traffic obstacles, and by comparing the conditions of the possible traffic obstacles with the objective conditions.
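One conventional way to perform such a figure comparison is template matching; the sketch below uses OpenCV's matchTemplate for this purpose, with an illustrative match threshold. This is one possible realization of the comparison, not the required one.

```python
# Sketch of step S1120 via template matching: each sample objective figure
# is searched for in a scene image, and strong matches are treated as
# possible traffic obstacles. OpenCV and the 0.8 threshold are assumptions.
import cv2

def find_objectives(scene_image, sample_figures, threshold=0.8):
    """Return (name, location) for each sample figure found in the scene."""
    hits = []
    for name, figure in sample_figures.items():
        result = cv2.matchTemplate(scene_image, figure, cv2.TM_CCOEFF_NORMED)
        _, max_val, _, max_loc = cv2.minMaxLoc(result)
        if max_val >= threshold:
            hits.append((name, max_loc))
    return hits
```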
In step S1130, the objective data Do corresponding to the objective 3000 is produced. The objective data Do is produced to correspond to the movement of the objective 3000 when the objective 3000 moves. In the illustrated embodiment, the objective data Do includes the objective information data Di and the objective position data Dp. The objective information data Di includes the information concerning the objective 3000. The objective position data Dp corresponds to the virtual image 1111 of the objective 3000 seen through the transparent display 111, wherein the virtual image 1111 is viewed from a particular position P.
In step S1140, the objective data Do is transmitted to the portable device 1000 having the display unit 110. The display unit 110 includes the transparent display 111 allowing the user 2000 to view the scene through the transparent display 111, thereby enabling the transparent display 111 to display the objective information 1112 according to the objective data Do, wherein the objective information 1112 indicates the virtual image 1111 of the objective 3000 seen through the transparent display 111 by accompanying, labeling, or pointing to the virtual image 1111. In the illustrated embodiment, the transparent display 111 displays the objective information 1112 according to the objective information data Di in the objective data Do, while the objective information 1112 is displayed at position(s) of the transparent display 111 which correspond to the objective position data Dp in the objective data Do to accompany the virtual image 1111.
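Tying steps S1110 through S1140 together, the following non-limiting sketch runs the method as a loop over stand-in callbacks for the units described above; the callback names and the demo values are assumptions made only for illustration.

```python
# End-to-end sketch of steps S1110-S1140 with stand-in callbacks for the
# camera, control, and display units described in this section.
import time

def monitoring_loop(capture, detect, make_objective_data, display,
                    cycles: int = 1, interval_s: float = 0.1):
    for _ in range(cycles):
        for image in capture():                  # S1110: produce scene images Gs
            for name, location in detect(image): # S1120: determine objective 3000
                data = make_objective_data(name, location)  # S1130: produce Do
                display(data)                    # S1140: transmit Do for display
        time.sleep(interval_s)

# One-shot demo with stub callbacks:
monitoring_loop(
    capture=lambda: ["scene"],
    detect=lambda image: [("vehicle", (320, 240))],
    make_objective_data=lambda name, loc: {"info": name, "position": loc},
    display=print,
)
```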
The monitoring system is capable of displaying information as to objectives such as traffic obstacles through a transparent display on a portable device, thereby automatically informing a user about the appearance of the objectives. Camera(s) with night-vision functionality can be used to produce images of the objectives, thereby recognizing the objectives both in darkness and in a lighted environment.
While the disclosure has been described by way of example and in terms of preferred embodiments, the disclosure is not limited thereto. On the contrary, it is intended to cover various modifications and similar arrangements as would be apparent to those skilled in the art. Therefore, the scope of the appended claims should be accorded the broadest interpretation so as to encompass all such modifications and similar arrangements.
This application is a continuation-in-part of U.S. application Ser. No. 13/531,715 filed Jun. 25, 2012 by Cai et al., the entire disclosure of which is incorporated herein by reference.
Relation | Number | Date | Country
---|---|---|---
Parent | 13531715 | Jun 2012 | US
Child | 13568699 | | US