Embodiments are generally related to data-processing methods and systems and processor-readable media. Embodiments are also related to visibility for automobile safety.
Visibility is essential for automobile safety. A major cause of vehicle accidents is reduced visibility due to bad weather conditions such as heavy rain, snow, and fog. There have been various efforts in hardware system development for improving visibility for automobiles, including highly sensitive cameras for visible/invisible light, technologies that project visible/invisible light, radar, and LIDAR. More recently, software-based methods have attracted increased attention.
The following summary is provided to facilitate an understanding of some of the innovative features unique to the disclosed embodiments and is not intended to be a full description. A full appreciation of the various aspects of the embodiments disclosed herein can be gained by taking the entire specification, claims, drawings, and abstract as a whole.
It is, therefore, one aspect of the disclosed embodiments to provide for methods and systems for improving driver visibility.
It is another aspect of the disclosed embodiments to provide for methods and systems for enhancing captured images by exploiting a priori knowledge about a scene and objects stored in a database.
The aforementioned aspects and other objectives and advantages can now be achieved as described herein. Methods and systems are disclosed for improving driver visibility during bad weather and/or poor lighting for objects such as road signs, road lines, road markings, etc. The disclosed approach can enhance the captured images by exploiting a priori knowledge about the scene and the objects that are stored in the database.
A processing unit can determine the vehicle location and orientation from a GPS and other location/orientation sensors (e.g., a magnetic sensor). The processing unit can download from a database a list of the stationary objects that are expected to be detectable at the current location and orientation. The processing unit also compares the scene captured from the camera with the one obtained from the database using the location and orientation information. The matched scenes indicate where the objects are expected to appear in the captured image. The object is then detected from the captured images at the expected location and orientation using various known technologies.
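The step of retrieving the objects expected to be detectable at the current location and orientation can be sketched as follows. This is an illustrative sketch only: the `objects_near` method and the planar map frame are assumptions not specified by the disclosure.

```python
import math

def expected_objects(db, location, orientation, fov_deg=60.0, max_range_m=200.0):
    """Return the stationary objects from the database that should be
    visible from the given vehicle location and heading.

    `db` is assumed to expose `objects_near(location, radius)` returning
    records with a `position` attribute (x, y) in the same planar map
    frame as `location` -- a hypothetical database API.  `orientation`
    is the vehicle heading in degrees within that frame.
    """
    visible = []
    for obj in db.objects_near(location, max_range_m):
        dx = obj.position[0] - location[0]
        dy = obj.position[1] - location[1]
        # Bearing from the vehicle to the object, in degrees.
        bearing = math.degrees(math.atan2(dy, dx)) % 360.0
        # Signed angular difference between bearing and heading, in (-180, 180].
        diff = (bearing - orientation + 180.0) % 360.0 - 180.0
        # Keep objects that fall inside the camera's field of view.
        if abs(diff) <= fov_deg / 2.0:
            visible.append(obj)
    return visible
```

The matching step can then compare only these candidate objects against the captured image, rather than searching the whole frame.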
The visibility of the detected object can then be enhanced by conventional methods such as boosting object contrast, increasing object color saturation, enhancing object text readability, modifying object color, and/or reducing noise. The disclosed approach may also incorporate the information about the object that is retrieved from the database.
The accompanying figures, in which like reference numerals refer to identical or functionally-similar elements throughout the separate views and which are incorporated in and form a part of the specification, further illustrate the present invention and, together with the detailed description of the invention, serve to explain the principles of the present invention.
The particular values and configurations discussed in these non-limiting examples can be varied and are cited merely to illustrate at least one embodiment and are not intended to limit the scope thereof.
The embodiments will now be described more fully hereinafter with reference to the accompanying drawings, in which illustrative embodiments of the invention are shown. The embodiments disclosed herein can be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. Like numbers refer to like elements throughout. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
The system 10 is generally composed of: 1) the set of sensors 12 (including at least one camera 21) that capture images, determine a vehicle location and orientation, and detect various stationary objects; 2) the database 22 that contains information about the objects such as road signs, road lines, and road markings, as well as road scenes; 3) the processing unit 24, which analyzes and processes the information provided by the sensors 12 and the database 22, and enhances the image/video captured; and 4) an output unit 26 which contains at least a display screen. Such a system 10 may also include other output devices 28 such as audio outputs.
The sensors 12 employed in system 10 can be divided into three groups: (visible light and/or infrared (IR)) video cameras 14; location sensors 16 and/or orientation sensors 18; and object detection sensors 20. System 10 can include at least one main camera 21 that captures scenes. The main camera 21 can work with, for example, visible light or IR. System 10 can also contain additional IR cameras among the sensing devices 14, particularly if the main camera 21 relies on visible light. The IR cameras may cover multiple frequency bands for better object detection and classification.
A GPS or a similar device may be applied for location determination of the vehicle. The location sensing device 16 may, for example, be implemented in the context of a GPS device/sensor. Furthermore, the orientation of the vehicle can also be obtained from the GPS by detecting its trajectory. The orientation sensing device 18 may also be implemented in the context of a GPS device or with GPS components. In this manner, the location and orientation sensing devices 16, 18 may be implemented as or with a single GPS module or component, depending upon design considerations. Alternatively, orientation can also be found using a dedicated orientation sensor such as a magnetic sensor. Finally, various sensors such as radars, LIDARs, and other devices that project light are useful for detecting objects and determining their 3-D locations and shapes.
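Obtaining orientation from the GPS trajectory amounts to computing the bearing between two consecutive fixes. A minimal sketch, using the standard initial-bearing formula on a sphere (the disclosure does not prescribe a particular formula):

```python
import math

def heading_from_gps(prev_fix, curr_fix):
    """Estimate the vehicle heading from two consecutive GPS fixes,
    each given as (latitude, longitude) in degrees.

    Returns the initial bearing in degrees, measured clockwise from
    true north, using the great-circle initial-bearing formula.
    """
    lat1, lon1 = (math.radians(v) for v in prev_fix)
    lat2, lon2 = (math.radians(v) for v in curr_fix)
    dlon = lon2 - lon1
    x = math.sin(dlon) * math.cos(lat2)
    y = (math.cos(lat1) * math.sin(lat2)
         - math.sin(lat1) * math.cos(lat2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360.0
```

In practice the two fixes should be far enough apart (or filtered over time) that GPS noise does not dominate the bearing estimate; a magnetic sensor can supply orientation when the vehicle is stationary.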
The database 22 can contain data indicative of, for example, the road scene, which is mainly viewed from a driver facing the forward direction. Database 22 can also contain data indicative of attributes about stationary objects such as road signs, road lines, road markings, and so forth. The attributes of an object may include its location (in 3-D), size, shape, color, material property (metal, wood, etc.), the text contained, etc.
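One way to organize such a record is sketched below. The field names and types are illustrative assumptions; the disclosure does not fix a database schema.

```python
from dataclasses import dataclass

@dataclass
class StationaryObject:
    """One database record for a stationary roadside object.

    Mirrors the attributes listed in the specification: 3-D location,
    size, shape, color, material property, and contained text.
    """
    object_id: int
    kind: str         # e.g. "road sign", "road line", "road marking"
    location: tuple   # (x, y, z) position in a 3-D map frame
    size: tuple       # (width, height) in meters
    shape: str        # e.g. "octagon" for a STOP sign
    color: tuple      # nominal RGB color of the object
    material: str     # e.g. "metal", "wood"
    text: str = ""    # legend printed on the object, if any
```

A record like this supports both the matching step (location, size, shape) and the enhancement step (color, text) described below.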
The detection reliability and accuracy can further be improved by incorporating information captured by various object detection sensors, such as the sensor(s) 12 shown in the accompanying figures.
The visibility of the detected object can be enhanced by conventional methods such as boosting object contrast, increasing object color saturation, enhancing object text readability, modifying object color, and/or reducing noise. It may also incorporate the information about the object that is retrieved from the database by:
Mixing: The prior information can be combined with the captured scene in a weighted fashion. For example, a STOP sign in a captured image may have a faded red background and darkened white text. To improve the visibility, the saturation of the red color will be enhanced and the white color will be brightened when the captured image is combined with the colors specified in the database 22 for the sign. The relative weighting depends on the confidence level of the detection accuracy, the confidence level of the database accuracy, and the weather condition. For example, under optimal weather conditions, the captured image may be displayed via output unit 26 without alterations. Under bad weather conditions, however, increased reliance on database 22 may be required, particularly if the detection is confirmed by multiple sensors 12. The weighting may also be user-adjustable so that a user may select the tradeoff that best fits his or her preference.
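The weighted mixing described above can be sketched per pixel as follows. The particular weighting rule (the product of detection confidence, database confidence, and a weather severity factor) is an illustrative choice, not one mandated by the disclosure.

```python
def mix_pixel(captured_rgb, reference_rgb, detect_conf, db_conf, weather_factor):
    """Blend a captured pixel color toward the database's nominal color.

    `detect_conf` and `db_conf` are confidence levels in [0, 1];
    `weather_factor` is in [0, 1], where 0 means clear weather (show the
    captured image unaltered) and 1 means severe weather (lean fully on
    the database color).  The product rule below is an assumed weighting.
    """
    w = detect_conf * db_conf * weather_factor
    return tuple(
        round((1.0 - w) * c + w * r)
        for c, r in zip(captured_rgb, reference_rgb)
    )
```

A user-adjustable preference could scale `weather_factor` up or down to realize the tradeoff mentioned above.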
Insertion: It is possible to insert information that is not currently visible but exists in the database 22. This can be considered an extreme case of mixing. This may occur, for example, on a day of heavy fog, when a plate carrying a road sign is detected by a radar device and its location and shape match the information stored in the database. A synthetic road sign may then be added into the scene for display.
Guided filtering: Snow and rain noise can often be effectively reduced by temporal and/or spatial filtering. However, conventional filtering may also lead to a blurred scene and lost detail. By applying the location and shape information of the objects, effective edge-preserving filtering can be implemented, which removes the noise while maintaining detail fidelity.
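A minimal sketch of such edge-preserving filtering is shown below: a mean filter that skips, and never averages across, pixels the database marks as object boundaries. Real implementations would use more sophisticated guided or bilateral filters; this version only illustrates the idea of letting known object geometry steer the smoothing.

```python
def masked_smooth(image, edge_mask):
    """Noise-reducing 3x3 mean filter that respects known object edges.

    `image` is a 2-D list of intensities; `edge_mask[i][j]` is True where
    the database's location/shape information places an object boundary.
    Edge pixels are left untouched and excluded from their neighbors'
    averages, so smoothing never blurs across a known boundary.
    """
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for i in range(h):
        for j in range(w):
            if edge_mask[i][j]:
                continue  # preserve detail on known edges
            acc, n = 0.0, 0
            for di in (-1, 0, 1):
                for dj in (-1, 0, 1):
                    ii, jj = i + di, j + dj
                    if 0 <= ii < h and 0 <= jj < w and not edge_mask[ii][jj]:
                        acc += image[ii][jj]
                        n += 1
            out[i][j] = acc / n
    return out
```

Temporal filtering (averaging co-registered frames over time) could be masked in the same way to suppress falling snow and rain streaks.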
Note that the disclosed embodiments are described herein with reference to flowchart illustrations and/or block diagrams of methods, systems, and computer program products and data structures according to embodiments of the invention. It will be understood that each block of the illustrations, and combinations of blocks, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function/act specified in the block or blocks.
The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions/acts specified in the block or blocks.
As will be appreciated by one skilled in the art, the disclosed embodiments can be implemented as a method, data-processing system, or computer program product. For example, the process flow or method described above can be implemented in the context of a data-processing system, computer program, processor-readable media, etc.
Accordingly, the embodiments may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects, all generally referred to as a “circuit” or “module.” Furthermore, the disclosed approach may take the form of a computer program product on a computer-usable storage medium having computer-usable program code embodied in the medium. Any suitable computer readable medium may be utilized, including hard disks, USB flash drives, DVDs, CD-ROMs, optical storage devices, magnetic storage devices, etc.
Computer program code for carrying out operations of the present invention may be written in an object oriented programming language (e.g., JAVA, C++, etc.). The computer program code, however, for carrying out operations of the present invention may also be written in conventional procedural programming languages such as the “C” programming language or in a visually oriented programming environment such as, for example, Visual Basic.
The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer. In the latter scenario, the remote computer may be connected to a user's computer through a local area network (LAN), a wide area network (WAN), or a wireless data network (e.g., WiFi, WiMax, 802.11x, or a cellular network), or the connection can be made to an external computer via most third party supported networks (e.g., through the Internet via an internet service provider).
As illustrated in the accompanying figures, the disclosed embodiments can be implemented in the context of a data-processing system that includes, for example, an operating system 151, a software module 152, and an interface 153.
The following discussion is intended to provide a brief, general description of suitable computing environments in which the system and method may be implemented. Although not required, the disclosed embodiments will be described in the general context of computer-executable instructions, such as program modules, being executed by a single computer. In most instances, a “module” constitutes a software application.
Generally, program modules (e.g., module 152) can include, but are not limited to, routines, subroutines, software applications, programs, objects, components, data structures, etc., that perform particular tasks or implement particular abstract data types and instructions. Moreover, those skilled in the art will appreciate that the disclosed method and system may be practiced with other computer system configurations such as, for example, hand-held devices, multi-processor systems, data networks, microprocessor-based or programmable consumer electronics, networked personal computers, minicomputers, mainframe computers, servers, and the like.
Note that the term module as utilized herein may refer to a collection of routines and data structures that perform a particular task or implement a particular abstract data type. Modules may be composed of two parts: an interface, which lists the constants, data types, variables, and routines that can be accessed by other modules or routines, and an implementation, which is typically private (accessible only to that module) and which includes source code that actually implements the routines in the module. The term module may also simply refer to an application such as a computer program designed to assist in the performance of a specific task such as word processing, accounting, inventory management, etc.
The interface 153 (e.g., a graphical user interface) can serve to display results, whereupon a user may supply additional inputs or terminate a particular session. In some embodiments, operating system 151 and interface 153 can be implemented in the context of a “windows” system. It can be appreciated, of course, that other types of systems are possible. For example, rather than a traditional “windows” system, other operating systems such as, for example, a real time operating system (RTOS) more commonly employed in wireless systems may also be employed with respect to operating system 151 and interface 153. The software application 154 can include, for example, module(s) 152, which can include instructions for carrying out steps or logical operations such as those of method 50 and other process steps described herein.
It will be appreciated that variations of the above-disclosed and other features and functions, or alternatives thereof, may be desirably combined into many other different systems or applications. Also, that various presently unforeseen or unanticipated alternatives, modifications, variations or improvements therein may be subsequently made by those skilled in the art which are also intended to be encompassed by the following claims.
This application claims priority under 35 U.S.C. 119(e) to U.S. Provisional Patent Application Ser. No. 61/708,112, entitled “Visibility Improvement in Bad Weather Using Enhanced Reality,” which was filed on Oct. 1, 2012, the disclosure of which is incorporated herein by reference in its entirety.
| Number | Date | Country |
|---|---|---|
| 61708112 | Oct 2012 | US |