The subject matter herein generally relates to vehicle warning systems, and particularly, to a vehicle warning system capable of automatically turning on an alarm system of a vehicle and a related method.
A driver can decide to turn on the lights of a vehicle according to visibility. The light emitted by the lights not only increases the visibility of the driver, but also makes the vehicle more easily seen by others, such as the drivers of other vehicles or pedestrians. In addition, a loudspeaker can be turned on as an audible indicator for pedestrians, or to warn when the distance between the vehicle and obstacles or pedestrians is less than a safe distance.
Implementations of the present technology will now be described, by way of example only, with reference to the attached figures.
It will be appreciated that for simplicity and clarity of illustration, where appropriate, reference numerals have been repeated among the different figures to indicate corresponding or analogous elements. In addition, numerous specific details are set forth in order to provide a thorough understanding of the embodiments described herein. However, it will be understood by those of ordinary skill in the art that the embodiments described herein can be practiced without these specific details. In other instances, methods, procedures, and components have not been described in detail so as not to obscure the related relevant feature being described. Also, the description is not to be considered as limiting the scope of the embodiments described herein. The drawings are not necessarily to scale and the proportions of certain parts may be exaggerated to better illustrate details and features of the present disclosure.
Several definitions that apply throughout this disclosure will now be presented.
The term “coupled” is defined as connected, whether directly or indirectly through intervening components, and is not necessarily limited to physical connections. The connection can be such that the objects are permanently connected or releasably connected. The term “comprising,” when utilized, means “including, but not necessarily limited to”; it specifically indicates open-ended inclusion or membership in the so-described combination, group, series, and the like.
The present disclosure is described in relation to a vehicle warning system. The vehicle warning system includes a camera, a determining unit coupled to the camera, and an executing unit coupled to the determining unit. The camera is located in a device on a vehicle to obtain an image of a scene around the vehicle including perception of depth in relation to objects appearing in the scene (current 3D surroundings image). The determining unit compares data of the current 3D surroundings image with characteristic data of a 3D surroundings model and determines whether any pedestrians appear in the current 3D surroundings image. The executing unit turns on an alarm system of the vehicle device when pedestrians are apparent in the current 3D surroundings image.
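As an illustrative sketch only, the interaction of the three units described above can be modeled as two small functions. All names and data structures here are hypothetical assumptions for illustration, not the disclosed implementation:

```python
# Hypothetical sketch of the camera -> determining unit -> executing unit
# chain. Object labels and dictionary keys are illustrative assumptions.

def determine_pedestrians(current_image, model_characteristics):
    """Compare extracted object data of the current 3D surroundings image
    against the characteristic data of the 3D surroundings model and return
    the objects classified as pedestrians."""
    return [obj for obj in current_image["objects"]
            if obj["shape"] in model_characteristics["person_shapes"]]

def execute_alarm(pedestrians):
    """Turn on the alarm system when pedestrians appear in the image."""
    return "alarm on" if pedestrians else "alarm off"

# Usage: a depth image whose extracted objects include one person-like shape.
image = {"objects": [{"shape": "person", "distance_m": 4.2},
                     {"shape": "tree", "distance_m": 9.0}]}
model = {"person_shapes": {"person"}}
print(execute_alarm(determine_pedestrians(image, model)))  # prints "alarm on"
```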
The camera 10 can be arranged on the front of the vehicle device 200 and can capture images of the surroundings (surroundings images) of the vehicle. Images of the scene in front of the vehicle device 200 can be captured. Each captured surroundings image includes distance information indicating the distance between the camera 10 and each object in the field of view of the camera 10. In the embodiment, the camera 10 is a 3D image capturing device, such as a depth-sensing camera or a Time of Flight (TOF) camera. The surroundings image captured by the camera 10 can be used to control vehicle lights. For example, in
The camera 10 can include a model creating module 11 and an image obtaining module 13. The model creating module 11 is configured to create a 3D surroundings model based on images captured by the camera 10 and the distance between the camera 10 and each object apparent in the obtained surroundings image. In at least one embodiment, the 3D surroundings model can include a 3D special person module and a 3D special face module, and the 3D special person module and the 3D special face module can be stored in the storing unit 20.
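The model creation step can be illustrated, under stated assumptions, as back-projecting per-pixel distances into 3D points. The pinhole back-projection and the function name below are assumptions for illustration; the disclosure does not specify the reconstruction method:

```python
# Hypothetical sketch: turning an X-Y image grid plus per-pixel distances
# (as a TOF or depth-sensing camera provides) into 3D points. The simple
# pinhole back-projection is an assumption, not taken from the disclosure.

def to_3d_points(depth_map, focal_length=1.0):
    """depth_map maps (x, y) pixel coordinates to a distance Z.
    Returns (X, Y, Z) points via pinhole back-projection."""
    points = []
    for (x, y), z in depth_map.items():
        points.append((x * z / focal_length, y * z / focal_length, z))
    return points

# Three pixels: two at 2.0 m, one at 4.0 m.
depth = {(0, 0): 2.0, (1, 0): 2.0, (0, 1): 4.0}
pts = to_3d_points(depth)
print(pts)  # e.g. (1, 0) at 2.0 m maps to the 3D point (2.0, 0.0, 2.0)
```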
The image obtaining module 13 is configured to obtain a current 3D surroundings image captured by the camera 10. The current 3D surroundings image can include an X-Y coordinates image (see
The determining unit 30 is configured to receive the current 3D surroundings image and determine the appearance of a pedestrian according to the 3D surroundings model. For example,
In general, the word “module”, as used herein, refers to logic embodied in hardware or firmware, or to a collection of software instructions, written in a programming language. The software instructions in the modules may be embedded in firmware, such as in an erasable programmable read-only memory (EPROM) device. The modules described herein may be implemented as either software and/or hardware modules and may be stored in any type of computer-readable medium or other storage device.
Referring to
At block 1101, the image obtaining module 13 obtains an image of a scene around the vehicle including perception of depth in relation to objects appearing in the scene (current 3D surroundings image) captured by the camera 10 and sends the extracted object data of the current 3D surroundings image to the determining unit 30. The current 3D surroundings image can include an X-Y coordinates image (see
At block 1102, the determining unit 30 receives the extracted object data of the current 3D surroundings image and determines the appearance of a pedestrian. If yes, the process goes to block 1103; if not, the process returns to block 1101.
At block 1103, the determining unit 30 demarcates the location of the pedestrians in the current 3D surroundings image and compares the extracted object data of the current 3D surroundings image with the characteristic data of the 3D surroundings model in the storing unit 20.
At block 1104, the determining unit 30 determines the face directions in the current 3D surroundings image to send to the executing unit 40 and further determines whether the face direction is frontal to the vehicle device 200. If yes, the process goes to block 1105; if not, the process goes to block 1106.
At block 1105, the executing unit 40 turns on the lights of the vehicle device 200 and increases the flicker frequency of the lights as a warning.
At block 1106, the executing unit 40 turns on the lights and the speaker of the vehicle device 200 and increases the flicker frequency of the lights as a warning.
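The flow of blocks 1101 through 1106 can be sketched as a single decision function. The function name, object labels, and boolean frontal-face test below are illustrative assumptions, not the disclosed code:

```python
# Hypothetical sketch of the block 1101-1106 flow described above.

def process_frame(objects, frontal_face):
    """objects: extracted object labels from the current 3D surroundings
    image; frontal_face: whether a detected face is frontal to the vehicle.
    Returns the warning actions taken, mirroring blocks 1102-1106."""
    if "pedestrian" not in objects:  # block 1102: no pedestrian found,
        return []                    # return to block 1101 for a new frame
    if frontal_face:                 # block 1104 -> block 1105
        return ["lights on", "flicker frequency increased"]
    # block 1104 -> block 1106: pedestrian is not facing the vehicle,
    # so the speaker is also turned on
    return ["lights on", "speaker on", "flicker frequency increased"]

print(process_frame(["tree"], False))        # no pedestrian: no action
print(process_frame(["pedestrian"], True))   # frontal face: lights only
print(process_frame(["pedestrian"], False))  # lights plus speaker
```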
The embodiments shown and described above are only examples. Many details are often found in the art such as the other features of a vehicle warning system. Therefore, many such details are neither shown nor described. Even though numerous characteristics and advantages of the present technology have been set forth in the foregoing description, together with details of the structure and function of the present disclosure, the disclosure is illustrative only, and changes may be made in the detail, especially in matters of shape, size, and arrangement of the parts within the principles of the present disclosure, up to and including the full extent established by the broad general meaning of the terms used in the claims. It will therefore be appreciated that the embodiments described above may be modified within the scope of the claims.
Number | Date | Country | Kind
---|---|---|---
201510417842.1 | Jul 2015 | CN | national