Driver assistant system using influence mapping for conflict avoidance path determination

Information

  • Patent Grant
  • Patent Number
    9,761,142
  • Date Filed
    Tuesday, September 3, 2013
  • Date Issued
    Tuesday, September 12, 2017
Abstract
A driver assistance system for a vehicle includes a plurality of sensors disposed at a vehicle and operable to detect objects at least one of ahead of the vehicle and sideward of the vehicle. The driver assistance system includes a data processor operable to process data captured by the sensors to determine the presence of objects ahead and/or sideward of the vehicle. Responsive to the data processing, the driver assistance system is operable to determine at least one of respective speeds of the determined objects and respective directions of travel of the determined objects. The driver assistance system is operable to determine respective influence values for the determined objects. Responsive to the respective determined speeds and/or directions of travel of the determined objects and responsive to the determined respective influence values, at least one path of travel for the vehicle is determined that limits conflict with the determined objects.
Description
CROSS REFERENCE TO RELATED APPLICATIONS

The present application is related to U.S. provisional applications, Ser. No. 61/718,382, filed Oct. 25, 2012, and Ser. No. 61/696,416, filed Sep. 4, 2012, which are hereby incorporated herein by reference in their entireties.


FIELD OF THE INVENTION

The present invention relates to imaging systems, vision systems and/or driver assistance systems for vehicles and, more particularly, to a machine vision system for fully or partially autonomous driving and for evasive steering and braking for collision avoidance and impact degrading.


BACKGROUND OF THE INVENTION

Use of imaging sensors in vehicle imaging systems is common and known. Examples of such known systems are described in U.S. Pat. Nos. 5,949,331; 5,670,935; and/or 5,550,677, which are hereby incorporated herein by reference in their entireties.


Use of plural imaging sensors and environmental sensors in vehicle machine vision and (human) vision systems is known. It is known to fuse vehicle sensors to achieve redundant or enhanced data in combination. Redundancy adds to the security of the driver assistance system.


The use of input from multiple sensor systems for a conflict or collision avoidance and impact degrading system utilizing object detection and tracking is also known. The task of the collision avoidance system is to analyze the sensor data to determine the potential environmental collision hazards, either for initiating warnings, evasive steering, braking or acceleration, or for capturing a record.


SUMMARY OF THE INVENTION

The present invention provides a conflict avoidance system or vision system or imaging system for a vehicle that utilizes two or more cameras to capture images exterior of the vehicle (such as forward and/or rearward of the vehicle), and provides communication/data signals, including camera data or image data, that may be displayed at a display screen viewable by the driver of the vehicle, such as when, for example, the driver is backing up the vehicle. The captured data may be processed and, responsive to such image processing, the system may detect an object at or near the vehicle and in the path of travel of the vehicle, such as when the vehicle is backing up. The vision system may be operable to display a surround view or bird's eye view of the environment at or around or at least partially surrounding the subject or equipped vehicle.


The present invention provides a driver assistance system or conflict or collision avoidance system or vision system or imaging system for a vehicle that utilizes one or more cameras or other external sensors to capture data or images exterior of the vehicle, and an (image) data processor/processing system for determining the potential environmental collision hazards for initiating evasive steering, braking and/or acceleration. The system of the present invention enters the influence of the detected hazardous objects in the vehicle's environment into a 2D influence map. The higher an object's hazardous potential (such as due to a greater speed of the object or due to the direction of travel of the object, such as when an object is determined to be heading in a direction of travel towards the equipped vehicle or into the path of travel of the equipped vehicle), the more strongly that object influences its environment or surroundings (and thus a greater influence value or weighting is applied to that object by the system of the present invention). The driver assistance system or conflict or collision avoidance system chooses the optimal path around the objects with the least influence or potential conflict or obstruction within the influence map.


Optionally, the vehicle's maneuvering and braking abilities and meta information or other data (such as geographic, altitude, seasonal, climate, weather, urban vs. rural location, traffic density, car2car/car2x data, or the like) and/or other environmental properties or considerations may be considered as well. Optionally, legislative and ethical considerations may also be considered in determining the objects' influence ratings.


These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a plan view of a vehicle with a vision system and imaging sensors or cameras that provide exterior fields of view in accordance with the present invention;



FIG. 2 is a schematic top view of an influence map of a (typical) highway scene (no pedestrian, no cyclist, no cross traffic, no opposing traffic) in accordance with the present invention;



FIG. 3 is an exemplary influence map suitable for use in the system of the present invention;



FIG. 4 is a table (TABLE 1) showing an example of a ranking scheme, with the gray-shaded values having the highest ranking value;



FIG. 5 is a table (TABLE 2) showing how speed of an object relative to the equipped vehicle may be taken into account as an influence rating parameter;



FIG. 6 is a look up table (TABLE 3) showing data that may be provided with entries that may be specified by the system's manufacturer, the vehicle's manufacturer or by legislation or commissions; and



FIG. 7 is a table (TABLE 4) showing how distance of an object to the equipped vehicle may be taken into account as an influence rating parameter.





DESCRIPTION OF THE PREFERRED EMBODIMENTS

A vehicle vision system and/or driver assist system and/or object detection system and/or alert system operates to capture images exterior of the vehicle and may process the captured image data to display images and to detect objects at or near the vehicle and in the predicted path of the vehicle, such as to assist a driver of the vehicle in maneuvering the vehicle in a rearward direction. The vision system includes a processor that is operable to receive image data from the vehicle cameras and may provide a displayed image that is representative of the subject vehicle (such as for a top down or bird's eye or surround view, such as discussed below).


Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 includes an imaging system or vision system 12 that includes at least one exterior facing imaging sensor or camera, such as a rearward facing imaging sensor or camera 14a (and the system may optionally include multiple exterior facing imaging sensors or cameras, such as a forwardly facing camera 14b at the front (or at the windshield) of the vehicle, and sideward/rearward facing cameras 14c, 14d at respective sides of the vehicle), which captures images exterior of the vehicle, with the camera having a lens for focusing images at or onto an imaging array or imaging plane or imager of the camera (FIG. 1). The vision system 12 includes a control or processor 18 that is operable to process image data captured by the cameras and may provide displayed images at a display device 16 for viewing by the driver of the vehicle (although shown in FIG. 1 as being part of or incorporated in or at an interior rearview mirror assembly 20 of the vehicle, the control and/or the display device may be disposed elsewhere at or in the vehicle).


2D and 3D environment reconstruction out of image and sensor data is used to determine potential hazards in the path of the vehicle. The system may judge an object's properties and its resulting potential as a hazard. Although it is known to rate hazardous influence in a so-called influence map in robotics and computer games (influence mapping), automotive applications present challenges overcome by the present invention. The present invention uses an influence map for generating a manageable number of collision avoidance and impact degrading paths by following the most promising ones during computation.


With reference to FIG. 2, (A) represents distance markers within a time frame (speed) (the greater the distance, the faster), and (B) through (F) each represent a collision avoidance and impact degrading (steering and braking or acceleration) path. Also, (1) represents a relatively fast vehicle and (2) represents a relatively slow vehicle, (3) represents an opposing road side (very high influence level), (4) represents a side strip, (5) represents a hard shoulder/emergency lane (high influence level), and (6) represents a soft shoulder (very high influence level). As shown in FIG. 2, (7) represents a speed vector of another vehicle, (8) represents the subject vehicle, which has a relatively high speed, faster than vehicle (1), and (9) represents the subject vehicle's own speed vector.


In the illustration of FIG. 2, all of the vehicles are traveling at more or less high speed and in the same direction (see (1), (7) and (9)), since a highway scene is shown (with high traffic, but without any traffic jam or accidents or the like). On the assumption that the subject vehicle (8) is approaching the given scenario at a higher speed (9) than the fastest other vehicle (1), the collision intervention system may engage. Alert levels may be employed for determining the necessary actions to warn or intervene, and the driver's driving ability may be reflected in those levels. A further assumption is that the necessity for the intervention system to engage has been given. Because a collision is always to be avoided and it is desirable to have a substantially conflict-free driving experience, especially when being driven by a fully or partially autonomous system, the system may be engaged at all times during vehicle operation.


When engaged, the intervention system receives data about objects (such as other traffic participants or obstacles on the road) from the vehicle image sensors and/or other environmental sensors (such as a RADAR or LIDAR or LASER and/or the like) and possibly via remote sources (car2X, car2car and/or the like). These may already be classified (such as, for example, a car, truck, cyclist, pedestrian, motorcyclist, horse carriage, policeman riding a horse, deer, lost load (as obstacle), pole, traffic island, traffic bollard and/or the like). These data are processed to create a spatial depth map in order to arrive at a 3D or 2D (preferred) world reconstruction, which requires the detection of the object distances and widths (preferably also the shapes) via any suitable means. Some systems deliver a 3D imprint by nature, such as stereo vision camera systems, while other systems reconstruct by sensor fusion, such as mono camera systems plus a distance measuring laser.


The identified objects are mapped into the 2D (top view) influence map (FIG. 2). By monitoring the surrounding objects, the system is able to determine the speed and direction of each object. That information is attached to each object as a speed vector (7). The hazardous potential of each object is rated by the determination algorithm. The faster an object is (relative to the subject or host or equipped vehicle), the stronger its influence value is (see Tables 1 and 2), and a greater weighting or influence irradiation is applied to the faster object or objects. The influence ratings or weighting factors equate to the colors or shading in the two dimensional influence map of FIG. 2, where the higher the influence of an object, the darker the color or shading at or around the object.
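

As a rough illustration of such a speed-based rating, a minimal sketch follows; the thresholds and scores below are invented stand-ins for the values of Tables 1 and 2, which are not reproduced here:

    # Hypothetical speed-based influence rating; the real values would come
    # from tables such as TABLE 1 / TABLE 2 (these thresholds are invented).
    def influence_value(relative_speed_mps):
        """Influence score for an object, given its speed relative to
        the equipped vehicle (meters per second, positive = closing)."""
        if relative_speed_mps <= 0.0:   # falling behind: low hazard
            return 1.0
        if relative_speed_mps < 5.0:
            return 2.0
        if relative_speed_mps < 15.0:
            return 4.0
        return 7.0                      # closing fast: highest rating

    print(influence_value(10.0))        # -> 4.0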


The influence is irradiated into (increased or weighted towards) the corresponding object's surrounding area. The influence or weighting is weighted more forward and backward of a vehicle than sideward, since vehicles typically do not interfere substantially sideward. The faster an object's speed is, the more the object irradiates ahead. As can be seen in FIG. 2, the irradiations of objects that are quite close to one another coalesce when they superpose, such that groups of objects form areas with an elevated level of irradiation. It is the task of the algorithm to search for the least dangerous collision avoidance and impact degrading path. This is done by laying out a future path of travel of the subject vehicle under consideration of the vehicle's maneuvering, braking and acceleration physics. There may be a vehicle kinematic model or look up table employed, which may be generated very precisely but may deliver just the most significant (possibly quite simplified) key data necessary for generating a feasible path.
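

A minimal sketch of such anisotropic irradiation into a top-view grid, assuming a 1 meter cell size and invented decay constants (none of these numbers come from this disclosure):

    import numpy as np

    def splat_influence(grid, x, y, heading, speed, influence):
        """Add one object's influence into a top-view grid (1 cell = 1 m),
        weighted more fore/aft than sideward and reaching further ahead
        the faster the object moves. Decay constants are assumptions."""
        h, w = grid.shape
        ys, xs = np.mgrid[0:h, 0:w]
        dx, dy = xs - x, ys - y
        lon = dx * np.cos(heading) + dy * np.sin(heading)   # along heading
        lat = -dx * np.sin(heading) + dy * np.cos(heading)  # sideward
        sigma_lon = np.where(lon > 0, 8.0 + 0.8 * speed, 8.0)
        sigma_lat = 2.0   # little sideward interference
        grid += influence * np.exp(-(lon / sigma_lon) ** 2
                                   - (lat / sigma_lat) ** 2)

    m = np.zeros((60, 200))
    splat_influence(m, x=50, y=30, heading=0.0, speed=25.0, influence=4.0)
    splat_influence(m, x=58, y=30, heading=0.0, speed=10.0, influence=2.0)
    # Close objects' contributions simply superpose and coalesce, as in FIG. 2.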


The future paths of travel of all known foreign vehicles (or objects) are also estimated. Assumptions about their reaction times and their maneuvering, braking and acceleration physics/abilities may have to be made. Different object classes may have different driving parameter assumptions. These may add to the objects' influence on the map. Additionally, the system may take meta data into account depending on the situation. This may comprise data from a navigation system, such as data that, for example, includes information as to whether the soft shoulder is drivable in an emergency (or may be a canyon), weather condition data, road surface data, (subject vehicle's) tire wear and inflation data, and the like, either provided by on board data storage or sensors and systems or from remote sensors or systems or data storage.


To lay out conflict or collision avoidance and impact degrading paths, the system seeks to plan a path with the least interference from the irradiation (lowest level of influence) and the lowest level of lengthwise and sidewise acceleration (lowest level of steering intervention and hard braking or strong acceleration), and optionally the least irritation to other traffic participants (reflected in the choice of the resulting paths at the end of determination). Within the influence map of FIG. 2, this means the least speed marker change over distance (f(distance/time)) and the least path curvature. At times when the system determines that an accident is unavoidable, the system may seek a path which reduces or minimizes the hazard of an impact. This is why other vehicles' influence areas extend quite a bit farther at the edges than at the center of the rear (and front); an offset impact is always more dangerous than a central impact (statistical evidence supports this). Since there is a large number of possible assumed future scenarios of the subject vehicle's driving behavior, the other traffic participants' driving behavior and the other interactions, the number of resulting reactions becomes huge even for just a few (milli-)seconds into the future, so the system may have assessment algorithms and criteria to filter out the 'interesting' ones (those best matching the criteria) and to evaluate these further, rather than evaluating all (hundreds) of the possibilities together. This may lead to optimal and to suboptimal results.
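

One hedged reading of this filtering step is a beam search that keeps only the k best-scoring candidate paths per increment; the scoring weights and k below are assumptions for illustration, not the patented criteria themselves:

    # Sketch: score candidate paths by integrated influence plus a penalty
    # for hard acceleration, and keep only the k most "interesting" ones.
    def score(path, influence_at):
        infl = sum(influence_at(x, y) for x, y, _ in path)
        accel = sum(abs(a) for _, _, a in path)
        return infl + 0.5 * accel       # lower is better

    def beam_search(seeds, expand, influence_at, k=5, steps=20):
        paths = list(seeds)
        for _ in range(steps):
            paths = [p + [s] for p in paths for s in expand(p)]
            paths.sort(key=lambda p: score(p, influence_at))
            paths = paths[:k]           # drop all but the best k
        return paths

    # Toy usage: each step advances 1 m and picks a lateral offset.
    flat = lambda x, y: 0.0
    best = beam_search([[(0, 0, 0.0)]],
                       lambda p: [(p[-1][0] + 1, p[-1][1] + d, float(d))
                                  for d in (-1, 0, 1)],
                       flat, k=3, steps=5)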


In the example of an influence map based path determination of FIG. 2, the 'interesting' paths are identified by 'B', 'C', 'D', 'E' and 'F'. It becomes apparent that path 'F' may be a quite undesirable choice since it requires hard braking. Paths 'E' and 'B' involve quite hectic steering maneuvers, which may drive other traffic participants into panic maneuvers or at least irritate them. Path 'C' may already be considered quite safe, but the subject vehicle first has to accelerate and then to brake or slow down to the slow speed of the drivers ahead, and vehicle (1) has to brake more aggressively since its braking space is diminished by the subject vehicle changing into its lane. In this example, path 'D' would probably be the preferred or optimal choice, since nearly no substantial interference with any other traffic participant occurs and the furthest way is free, at least within the detection distance of the on board sensors (the detection distance may change under different environmental conditions, and in good cases may extend or reach from about 30 meters to about 500 meters, depending on the system's components).


Object influence ratings may also be altered or weighted by the hazard that an impact of the subject vehicle poses to other vehicles' occupants (or other objects' occupants or the like and, in the case of a pedestrian, to the pedestrian). A motorcyclist may be more vulnerable than a truck, and thus the influence value or weighting value may be chosen higher for a motorcyclist. This data may be provided by a look up table (see Table 3) with entries that may be specified by the system's manufacturer, the vehicle's manufacturer or by legislation or commissions. Ethical commissions may state whether it is preferable to endanger one motorcyclist rather than one school bus, and may decide other kinds of conflicting cases. Optionally, the ratings may differ in different cultural areas. By that, the system may engage a look up table which equates to the region that the vehicle is currently in (controlled online) or was sold in (set at time of production).
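

Such a class-based weighting could be held in a simple region-keyed look up table; the classes and numbers below are illustrative placeholders, not the contents of Table 3:

    # Hypothetical stand-in for TABLE 3: vulnerability weighting by object
    # class, possibly differing per region (all numbers are invented).
    VULNERABILITY = {
        "EU": {"pedestrian": 10, "cyclist": 9, "motorcyclist": 8,
               "car": 4, "truck": 2},
        "US": {"pedestrian": 10, "cyclist": 9, "motorcyclist": 7,
               "car": 4, "truck": 2},
    }

    def class_weight(obj_class, region="EU"):
        # Unknown classes get a cautious default weighting.
        return VULNERABILITY[region].get(obj_class, 6)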


Optionally, the distance of a foreign object to the respective vehicle may be taken into account as another influence rating parameter (see Table 4). As an optional parameter, which may add to the influence area of a foreign object or road participant, the object's blinking (turn signaling) may be considered. At the times a vehicle blinks (for example, to the left), its forward irradiating 'aura' may be extended in the direction of the blinking. By that, the influence map rises in that area, which may lower the probability that the own vehicle enters the area that the blinking vehicle's driver is intending to enter. The driver of the equipped or subject vehicle may rather stay behind the blinking vehicle.


The system is operable continuously as the vehicle is driven along the road. Thus, the system is always collecting environmental data, which are fed into the influence mapping. Further, the system recapitulates the current state in time slots (fractions of seconds long) and reevaluates the situation (via the influence map). As the milliseconds progress, a collision avoidance path earlier laid out as optimal may be abandoned, and a better one at that point in time may be selected as the preferred or optimal path, since the other traffic participants may act at least in part differently than assumed earlier, or objects that were not detected previously may come into view of the sensors of the subject vehicle.
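

A minimal replanning loop along these lines (the time slot length, the switching rule and the sense/plan interfaces are all assumptions for the sketch):

    import time

    def run(sense, build_map, plan, follow, slot=1 / 30):
        """Re-evaluate the situation every time slot: rebuild the influence
        map from fresh detections and abandon the previously chosen path
        whenever a newly planned one scores better (lower = better)."""
        best = None
        while True:
            imap = build_map(sense())       # freshest environmental data
            candidate = plan(imap)          # assumed to return a path with .score
            if best is None or candidate.score < best.score:
                best = candidate            # earlier optimal path abandoned
            follow(best)
            time.sleep(slot)                # fraction-of-a-second time slot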


One solution for determining avoidance paths that may be optimal or semi optimal may be to handle the subject vehicle and all foreign objects/vehicles as marbles, each having an influence value, rolling or gliding over the influence map, with the influence values determining the heights (relating to corresponding elevations and valleys).


The marbles may have an assumed mass 'm' exposed to an assumed gravity 'g' and an inherent inertia. When already in motion (according to the speed vectors (7) in FIG. 2), there may be an assumed kinetic energy inherent to each marble. By that, the marble may be turned away and slowed down when running in the direction of an elevation, and may be turned toward and accelerated when heading into a valley or when a faster marble closes up from behind, which may cause the map to rise in that region. Due to the influence of each object or vehicle, the influence map under the marble may change continuously while the marble glides or rolls.


More specifically, the marble's acceleration/deceleration force and direction due to the slope of its resting surface at a specific point in the map may be calculated by superposing each surrounding object's influence contribution. Each object's contribution may be added into one common value at a certain point, such as can be seen in FIG. 3. The contribution 'E' may depend on the distance 'r' and the influence score 'I'. Since the relative speed 's' may affect the influence, in that each object's influence advances in the direction of its speed vector s⃗, the speed component in the direction of 'r' may be taken into account as well (see equation (1)).

E = (I / r) · s⃗  (1)
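

Read literally, a point's value is the superposition of each surrounding object's contribution, scaled by its influence score, its distance and its speed component toward the point. A sketch under that reading follows; since the scalar/vector form of equation (1) is ambiguous in the text, the (1 + s_r) factor is an assumption:

    import math

    # Equation (1) read as a scalar: E = (I / r) * (1 + s_r), where s_r is
    # the component of the object's speed vector toward the point (assumed).
    def contribution(obj, px, py):
        rx, ry = px - obj["x"], py - obj["y"]
        r = math.hypot(rx, ry) or 1e-6          # guard against r = 0
        s_r = (obj["sx"] * rx + obj["sy"] * ry) / r
        return (obj["I"] / r) * (1.0 + max(0.0, s_r))

    def map_value(objs, px, py, exclude=None):
        # Superpose every surrounding object's contribution (FIG. 3); the
        # marble under test is excluded when computing its own normal.
        return sum(contribution(o, px, py) for o in objs if o is not exclude)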


To calculate the slope of the area that the marble is resting on, the area may be assumed to be a triangle between three points a, b and c, with values Ea, Eb and Ec, which surround the marble. The slope normal 'n' is the vector (cross) product (a-b)×(a-c).


While calculating one marble's normal, its own influence on the influence map may not be taken into account.
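

The slope normal of that triangle can be computed directly as the stated cross product; a minimal numpy sketch, with the summed map values E as the z coordinates of the corners:

    import numpy as np

    def slope_normal(a, b, c):
        """n = (a - b) x (a - c) for triangle corners (x, y, E)."""
        a, b, c = np.asarray(a), np.asarray(b), np.asarray(c)
        n = np.cross(a - b, a - c)
        return n if n[2] >= 0 else -n       # orient the normal upward

    # Corners around the marble; z is the summed influence at each corner.
    n = slope_normal((0.0, 0.0, 3.0), (1.0, 0.0, 3.0), (0.5, 1.0, 2.0))
    # The x/y components of n point downhill: the acceleration direction.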


The exemplary influence map in FIG. 3 shows the influence of an object or vehicle 'II' with a speed vector to the left and an influence level of value 5 (5 rings), and an object or vehicle 'III' with a speed vector to the right and an influence level of value 7 (7 rings), whose influence areas mostly irradiate circumferentially and in the direction of the speed vector. The influence level of objects II and III at the corners of the triangle on which the object to be calculated (under test) is resting can be determined by counting the number of rings (and thereby the influence value) enclosing the specific point or area or region. By summing up the influence of both other objects, the triangle has two corners with a height of 3 and one with a level of 2. By that, the triangle's normal is tilted toward the upper left from upright (and by that the slope of the triangle will be toward the upper left). When simulating the next time increment, object I is accelerated in the upper left direction. In this example, the triangle is chosen quite wide for the sake of illustration. The triangle may preferably be chosen in an infinitesimally small manner, and the influence may be calculated not by counting rings in INTEGER form but in FLOAT form by equation (1), to match the normal vector n more precisely.


The normal, and by that the accelerating component of g, of each object (marble) may be calculated accordingly. At a certain point in time, each marble may have a certain inherent inertia and acceleration. By that it is possible to presume each marble's new position and speed (or inertia) at a sequential time increment. This is already sufficient to run a basic conflict and/or collision avoidance. The nature of the marble system is to deflect away from the most influencing objects in the local surroundings. The paths that will be taken in the (near) future are mostly determined by the current influence map landscape. The higher an influence level in an area is, the more it influences the future course.
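

A single simulation increment for one marble might then look like the following sketch (semi-implicit Euler integration; the gravity constant and time step are assumptions, and the mass cancels out):

    G = 9.81       # assumed "gravity" acting on the marbles
    DT = 1 / 30    # time increment, tied here to the sensor rate

    def step(marble, normal):
        """One simulation increment: the slope normal's x/y components
        accelerate the marble downhill; its inertia carries it onward."""
        nx, ny, nz = normal
        ax, ay = G * nx / nz, G * ny / nz   # downhill acceleration
        marble["vx"] += ax * DT             # speed (inertia) first
        marble["vy"] += ay * DT
        marble["x"] += marble["vx"] * DT    # then position
        marble["y"] += marble["vy"] * DT

    m = {"x": 0.0, "y": 0.0, "vx": 20.0, "vy": 0.0}
    step(m, (0.0, 1.0, 1.0))    # slope tilted toward +y pushes it that way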


In practice, the increment's length may depend on the system's computing capability or the environmental sensors' detection rate (such as, for example, 1/30 second or thereabouts). It may be desirable to have infinitesimally small time steps for the most accurate object curve calculation.


More sophisticated systems may aspire to presume the paths with the least conflict potential, so as not to run into suboptimal paths. Because every successive influence map (inertia and position of each marble or object) can be calculated out of the status of its predecessor, the future approximation and acceleration conflicts may be calculable and thus simulatable. Of course, the accuracy of the simulated result decreases the further the time increment lies in the future. Because of that, the simulation may be repeated from the beginning at each, or nearly each, new time increment that passes in real time. Since identical simulations of the identical starting scenario come to the identical simulated result, the simulation may not be able to find (simulate) alternative avoidance paths. Because of that, there may be an artificial disturbance (factor) which may (possibly randomly) influence the simulation results.


The disturbance may act on the subject vehicle's marble alone or may be applied to all marbles in the system. As a specific solution, the disturbance may be an acceleration impulse. As an alternative specific solution, the disturbance may be a random change of the marble's resting area's slope normal. By that, the system may 'fall' into alternative solutions at fork points, at which just small disturbances can tilt the system to one or another extreme, as known from weather forecast simulations. Unlike weather simulations, the alternative fork paths may have no tendency toward chaotic oscillation, at most dampened swinging.
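

One hedged reading of the acceleration impulse variant, applied to a marble once per simulation run (the impulse magnitude is an assumption):

    import math
    import random

    def disturb(marble, magnitude=0.5):
        """Apply a small random acceleration impulse (assumed magnitude,
        m/s) so repeated simulations can fall into alternative paths."""
        angle = random.uniform(0.0, 2.0 * math.pi)
        marble["vx"] += magnitude * math.cos(angle)
        marble["vy"] += magnitude * math.sin(angle)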


The alternative paths found at a current point in time may be assessed by a decision algorithm, which may rank the paths by one, some or all of the following criteria, or may draw a score by pooling one, some or all of the following criteria (a sketch of such pooling follows the list):

    • One path's lowest end point (after a certain amount of time increments);
    • One path's maximum influence elevation experienced on the whole;
    • One path's median influence elevation experienced on the whole way;
    • One path's maximum influence slope experienced on the whole way;
    • One path's maximum acceleration experienced on the whole way;
    • One path's maximum forward acceleration/deceleration experienced on the whole way;
    • One path's maximum lateral acceleration experienced on the whole way;
    • One path's maximum speed;
    • One path's median speed on the whole way; and/or
    • One path's shortest way.
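

A sketch of such pooling, with invented weights (any subset of the listed criteria could be pooled in the same way):

    # Pool several of the listed criteria into one score; the weights are
    # invented, and a lower pooled score marks the preferred path.
    def pooled_score(p, w_end=1.0, w_peak=2.0, w_lat=1.5, w_len=0.1):
        return (w_end * p["end_elevation"]        # path's end point height
                + w_peak * p["max_influence"]     # worst influence met
                + w_lat * p["max_lateral_accel"]  # hardest steering
                + w_len * p["length"])            # shorter ways preferred

    paths = [{"end_elevation": 1.0, "max_influence": 4.0,
              "max_lateral_accel": 2.0, "length": 120.0},
             {"end_elevation": 0.5, "max_influence": 6.0,
              "max_lateral_accel": 1.0, "length": 150.0}]
    best = min(paths, key=pooled_score)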


In a more advanced system, the simulated accelerations may be limited to stay within the physical limits of the real objects by reflecting the (possibly partially assumed) objects' properties.


In further advanced systems, the system may be able to distinguish between a normal, uncritical situation and a critical situation. The system may come to a decision by assessing the predetermined possible paths. There may be certain maximum limits on presumed deceleration (braking) measures and/or lateral acceleration (hard curving) measures in all optional paths which, when exceeded, may turn the system into a kind of 'critical' mode. Then the system may not brake as comfortably as possible, but as soon and as heavily/aggressively as possible. The system may be allowed to ignore general traffic rules. By that, it may turn onto the emergency lane for evading a predicted critical situation or collision (however, the system would not make such a maneuver when in the usual 'uncritical' mode). The system may pass in the non-fast lane (overtaking on the right in right-hand traffic). The system may change lanes without blinking. The system may elect to go off road in case it determines that this is the least hazardous way out of the detected situation or hazardous condition.
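

The mode switch described here could be as simple as comparing every optional path's required deceleration and lateral acceleration against comfort limits (the limit values below are assumptions):

    # Assumed comfort limits; when every optional path exceeds one of
    # them, the system flips into 'critical' mode.
    MAX_COMFORT_DECEL = 3.0   # m/s^2
    MAX_COMFORT_LAT = 2.5     # m/s^2

    def mode(paths):
        if all(p["max_decel"] > MAX_COMFORT_DECEL or
               p["max_lateral_accel"] > MAX_COMFORT_LAT for p in paths):
            return "critical"     # brake hard; traffic rules may be ignored
        return "uncritical"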


The system may be able to adapt over time by evolutionary learning of its inherent parameters. Positive scenario evaluations may strengthen a parameter or parameter set, and negative scenario evaluations may cause the system to alter the parameter set. Different parameter optima may be found for different driving conditions, including the weather, road (motorway/city, crowded/less crowded, bumpy/smooth, asphalt/gravel) and vehicle conditions.


Furthermore, advanced systems, especially when connected in real time via car2car or car2x or the like, may jointly simulate the other vehicles' and the subject vehicle's influence effects and may come to solutions which may be the common best (not necessarily the individual's best), and may communicate each participant's presumed and dedicated driving path. When there are objects and traffic participants nearby which are not connected, the system may assume these to be comparably highly influential (and less predictable and controllable than the other connected participants), with the result that these participants may be circumscribed with comparably wider clearance. The common calculation unit may be placed externally as a remote server.


Therefore, the present invention provides a conflict and/or collision avoidance system that determines the position and speed of other vehicles on the road on which the subject vehicle is traveling and, when it is determined that the subject vehicle is approaching the other vehicles, the system determines one or more possible paths that avoid the other vehicles or objects and the system may select a preferred or optimal path that avoids the other vehicles and objects and requires the least aggressive maneuvering (such as hard braking and/or hard steering of the subject vehicle). The system may generate an alert to the driver of the selected path, and may display the path or paths to the driver for the driver to select. Optionally, the system may control the vehicle, such as the braking system and/or steering system of the vehicle, to assist in maneuvering through the traffic along a selected path. The system may consider the size and speed and type of the other vehicles in determining the appropriate preferred or optimal path of travel for the subject vehicle.


As discussed above, the system may be operable to classify and 'label' or identify one or multiple objects and to set the speed and trajectory parameters and 'meta' properties to rank their hazardous potential or influence, even when the detected object is far from the subject vehicle and still a "spot" on the horizon, and when detection systems such as radar, laser and cameras are still unable to determine such parameters of the distant object. This hazardous influence ranking may be done by taking the speed, the distance, the size, the mass, and the deformability and vulnerability of the subject vehicles or objects into account. There may be a look up table of each object property's influence value in use. In order to avoid overwhelming the driver with too much object information and data, there may be a certain level of influence, or a limited number of objects with the highest ranking, that is brought to the driver's attention. In the example of such a ranking scheme shown in Table 1 (with Tables 2-4 showing sub-tables of the metrics used), the gray-shaded values are those of the three objects with the highest ranking value, which would be the data of choice. When the vehicles' desired destinations are known due to data transmission, the intended paths can be predetermined. As meta information, the local traffic rules may be regarded by the rating algorithms, as well as when choosing the ranking of the information that will be presented to the driver.
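

Such a limitation to the highest-ranking objects is a simple top-k selection; a sketch, where the field name is a hypothetical stand-in for the Table 1 ranking value:

    # Bring only the k highest-ranking objects to the driver's attention
    # ("rank" stands in for the TABLE 1 ranking value; k = 3 matches the
    # three gray-shaded entries in the example).
    def top_objects(objects, k=3):
        return sorted(objects, key=lambda o: o["rank"], reverse=True)[:k]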


The system may utilize one or more sensors in detecting objects and vehicles on the road ahead and alongside (and optionally behind) the subject vehicle. For example, the subject vehicle may include one or more cameras or imagers that capture image data of the scene occurring forwardly and/or sidewardly and/or rearwardly of the subject vehicle. The cameras have respective fields of view exterior of the vehicle and an image processor or image processors may process the image data to determine or detect objects or vehicles present in the field of view of the camera or cameras.


The camera or sensor may comprise any suitable camera or sensor. Optionally, the camera may comprise a “smart camera” that includes the imaging sensor array and associated circuitry and image processing circuitry and electrical connectors and the like as part of a camera module, such as by utilizing aspects of the vision systems described in PCT Application No. PCT/US2012/066570, filed Nov. 27, 2012, and/or PCT Application No. PCT/US2012/066571, filed Nov. 27, 2012, which are hereby incorporated herein by reference in their entireties.


The system includes an image processor operable to process image data captured by the camera or cameras, such as for detecting objects or other vehicles or pedestrians or the like in the field of view of one or more of the cameras. For example, the image processor may comprise an EyeQ2 or EyeQ3 image processing chip available from Mobileye Vision Technologies Ltd. of Jerusalem, Israel, and may include object detection software (such as the types described in U.S. Pat. Nos. 7,855,755; 7,720,580; and/or 7,038,577, which are hereby incorporated herein by reference in their entireties), and may analyze image data to detect vehicles and/or other objects. Responsive to such image processing, and when an object or other vehicle is detected, the system may generate an alert to the driver of the vehicle and/or may generate an overlay at the displayed image to highlight or enhance display of the detected object or vehicle, in order to enhance the driver's awareness of the detected object or vehicle or hazardous condition during a driving maneuver of the equipped vehicle.


The vehicle may include any type of sensor or sensors, such as imaging sensors or radar sensors or lidar sensors or ladar sensors or ultrasonic sensors or the like. The imaging sensor or camera may capture image data for image processing and may comprise any suitable camera or sensing device, such as, for example, an array of a plurality of photosensor elements arranged in at least 640 columns and 480 rows (preferably a megapixel imaging array or the like), with a respective lens focusing images onto respective portions of the array. The photosensor array may comprise a plurality of photosensor elements arranged in a photosensor array having rows and columns. The logic and control circuit of the imaging sensor may function in any known manner, and the image processing and algorithmic processing may comprise any suitable means for processing the images and/or image data.


For example, the vision system and/or processing and/or camera and/or circuitry may utilize aspects described in U.S. Pat. Nos. 7,005,974; 5,760,962; 5,877,897; 5,796,094; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; 7,859,565; 5,550,677; 5,670,935; 6,636,258; 7,145,519; 7,161,616; 7,230,640; 7,248,283; 7,295,229; 7,301,466; 7,592,928; 7,881,496; 7,720,580; 7,038,577; 6,882,287; 5,929,786 and/or 5,786,772, and/or International Publication Nos. WO 2011/028686; WO 2010/099416; WO 2012/061567; WO 2012/068331; WO 2012/075250; WO 2012/103193; WO 2012/0116043; WO 2012/0145313; WO 2012/0145501; WO 2012/145818; WO 2012/145822; WO 2012/158167; WO 2012/075250; WO 2012/103193; WO 2012/0116043; WO 2012/0145501; WO 2012/0145343; WO 2012/154919; WO 2013/019707; WO 2013/016409; WO 2012/145822; WO 2013/067083; WO 2013/070539; WO 2013/043661; WO 2013/048994; WO 2013/063014, WO 2013/081984; WO 2013/081985; WO 2013/074604; WO 2013/086249; WO 2013/103548; WO 2013/109869 and/or PCT Application No. PCT/US2012/056014, filed Sep. 19, 2012, and/or PCT/US2012/071219, filed Dec. 21, 2012, and/or PCT Application No. PCT/US2013/026101, filed Feb. 14, 2013, and/or PCT Application No. PCT/US2013/027342, filed Feb. 22, 2013, and/or PCT Application No. PCT/US2013/036701, filed Apr. 16, 2013 and/or U.S. patent application Ser. No. 13/964,134, filed Aug. 12, 2013 and published Feb. 20, 2014 as U.S. Publication No. 2014/0052340; Ser. No. 13/942,758, filed Jul. 16, 2013 and published Jan. 23, 2014 as U.S. Publication No. 2014/0025240; Ser. No. 13/942,753, filed Jul. 16, 2013 and published Jan. 30, 2014 as U.S. Publication No. 2014/0028852; Ser. No. 13/927,680, filed Jun. 26, 2013 and published Jan. 2, 2014 as U.S. Publication No. 2014/0005907; Ser. No. 13/916,051, filed Jun. 12, 2013 and published Dec. 26, 2013 as U.S. Publication No. 2013/0344736; Ser. No. 13/894,870, filed May 15, 2013 and published Nov. 28, 2013 as U.S. Publication No. 2013/0314503; Ser. No. 13/887,724, filed May 6, 2013 and published Nov. 14, 2013 as U.S. Publication No. 2013/0298866; Ser. No. 13/851,378, filed Mar. 27, 2013 and published Nov. 14, 2013 as U.S. Publication No. 2013/0300869; Ser. No. 61/848,796, filed Mar. 22, 2012 and published Oct. 24, 2013 as U.S. Publication No. 2013/0278769; Ser. No. 13/847,815, filed Mar. 20, 2013 and published Oct. 31, 2013 as U.S. Publication No. 2013/0286193; Ser. No. 13/800,697, filed Mar. 13, 2013 and published Oct. 3, 2013 as U.S. Publication No. 2013/0258077; Ser. No. 13/785,099, filed Mar. 5, 2013 and published Sep. 19, 2013 as U.S. Publication No. 2013/0242099; Ser. No. 13/779,881, filed Feb. 28, 2013 and published Sep. 5, 2013 as U.S. Publication No. 2013/0231825; Ser. No. 13/774,317, filed Feb. 22, 2013 and published Aug. 29, 2013 as U.S. Publication No. 2013/0222592; Ser. No. 13/774,315, filed Feb. 22, 2013 and published Aug. 22, 2013 as U.S. Publication No. 2013/0215271; Ser. No. 13/681,963, filed Nov. 20, 2012 and published Jun. 6, 2013 as U.S. Publication No. 2013/0141578; Ser. No. 13/660,306, filed Oct. 25, 2012 and published May 9, 2013 as U.S. Publication No. 2013/0116859; Ser. No. 13/653,577, filed Oct. 17, 2012 and published Apr. 25, 2013 as U.S. Publication No. 2013/0099908; and/or Ser. No. 13/534,657, filed Jun. 27, 2012 and published Jan. 3, 2013 as U.S. Publication No. 2013/0002873, and/or U.S. provisional applications, Ser. No. 61/845,061, filed Jul. 11, 2013; Ser. 
No. 61/844,630, filed Jul. 10, 2013; Ser. No. 61/844,173, filed Jul. 9, 2013; Ser. No. 61/844,171, filed Jul. 9, 2013; Ser. No. 61/840,542; Ser. No. 61/838,619, filed Jun. 24, 2013; Ser. No. 61/838,621, filed Jun. 24, 2013; Ser. No. 61/837,955, filed Jun. 21, 2013; Ser. No. 61/836,900, filed Jun. 19, 2013; Ser. No. 61/836,380, filed Jun. 18, 2013; Ser. No. 61/834,129, filed Jun. 12, 2013; Ser. No. 61/834,128, filed Jun. 12, 2013; Ser. No. 61/833,080, filed Jun. 10, 2013; Ser. No. 61/830,375, filed Jun. 3, 2013; Ser. No. 61/830,377, filed Jun. 3, 2013; Ser. No. 61/825,752, filed May 21, 2013; Ser. No. 61/825,753, filed May 21, 2013; Ser. No. 61/823,648, filed May 15, 2013; Ser. No. 61/823,644, filed May 15, 2013; Ser. No. 61/821,922, filed May 10, 2013; Ser. No. 61/819,835, filed May 6, 2013; Ser. No. 61/819,033, filed May 3, 2013; Ser. No. 61/16,956, filed Apr. 29, 2013; Ser. No. 61/815,044, filed Apr. 23, 2013; Ser. No. 61/814,533, filed Apr. 22, 2013; Ser. No. 61/813,361, filed Apr. 18, 2013; Ser. No. 61/840,407, filed Apr. 10, 2013; Ser. No. 61/808,930, filed Apr. 5, 2013; Ser. No. 61/807,050, filed Apr. 1, 2013; Ser. No. 61/806,674, filed Mar. 29, 2013; Ser. No. 61/806,673, filed Mar. 29, 2013; Ser. No. 61/804,786, filed Mar. 25, 2013; Ser. No. 61/793,592, filed Mar. 15, 2013; Ser. No. 61/793,614, filed Mar. 15, 2013; Ser. No. 61/772,015, filed Mar. 4, 2013; Ser. No. 61/772,014, filed Mar. 4, 2013; Ser. No. 61/770,051, filed Feb. 27, 2013; Ser. No. 61/770,048, filed Feb. 27, 2013; Ser. No. 61/766,883, filed Feb. 20, 2013; Ser. No. 61/760,366, filed Feb. 4, 2013; Ser. No. 61/760,364, filed Feb. 4, 2013; Ser. No. 61/758,537, filed Jan. 30, 2013; Ser. No. 61/756,832, filed Jan. 25, 2013; Ser. No. 61/754,804, filed Jan. 21, 2013; Ser. No. 61/745,925, filed Dec. 26, 2012; Ser. No. 61/745,864, filed Dec. 26, 2012; Ser. No. 61/736,104, filed Dec. 12, 2012; Ser. No. 61/736,103, filed Dec. 12, 2012; Ser. No. 61/735,314, filed Dec. 10, 2012; Ser. No. 61/734,457, filed Dec. 7, 2012; Ser. No. 61/733,598, filed Dec. 5, 2012; Ser. No. 61/733,093, filed Dec. 4, 2012; Ser. No. 61/727,912, filed Nov. 19, 2012; Ser. No. 61/727,911, filed Nov. 19, 2012; Ser. No. 61/727,910, filed Nov. 19, 2012; Ser. No. 61/713,772, filed Oct. 15, 2012; Ser. No. 61/710,924, filed Oct. 8, 2012; and/or Ser. No. 61/710,247, filed Oct. 2, 2012, which are all hereby incorporated herein by reference in their entireties. The system may communicate with other communication systems via any suitable means, such as by utilizing aspects of the systems described in International Publication No. WO 2013/043661, PCT Application No. PCT/US10/038477, filed Jun. 14, 2010, and/or PCT Application No. PCT/US2012/066571, filed Nov. 27, 2012, and/or U.S. patent application Ser. No. 13/202,005, filed Aug. 17, 2011, now U.S. Pat. No. 9,126,525, which are hereby incorporated herein by reference in their entireties.


The imaging device and control and image processor and any associated illumination source, if applicable, may comprise any suitable components, and may utilize aspects of the cameras and vision systems described in U.S. Pat. Nos. 5,550,677; 5,877,897; 6,498,620; 5,670,935; 5,796,094; 6,396,397; 6,806,452; 6,690,268; 7,005,974; 7,937,667; 7,123,168; 7,004,606; 6,946,978; 7,038,577; 6,353,392; 6,320,176; 6,313,454; and 6,824,281, and/or International Publication Nos. WO 2010/099416 and/or WO 2011/028686, and/or U.S. patent application Ser. No. 12/508,840, filed Jul. 24, 2009, and published Jan. 28, 2010 as U.S. Pat. Publication No. US 2010-0020170, and/or PCT Application No. PCT/US2012/048110, filed Jul. 25, 2012, and/or U.S. patent application Ser. No. 13/534,657, filed Jun. 27, 2012 and published Jan. 3, 2013 as U.S. Publication No. 2013/0002873, which are all hereby incorporated herein by reference in their entireties. The camera or cameras may comprise any suitable cameras or imaging sensors or camera modules, and may utilize aspects of the cameras or sensors described in U.S. patent application Ser. No. 12/091,359, filed Apr. 24, 2008 and published Oct. 1, 2009 as U.S. Publication No. US-2009-0244361; and/or Ser. No. 13/260,400, filed Sep. 26, 2011, now U.S. Pat. No. 8,542,451, and/or U.S. Pat. Nos. 7,965,336 and/or 7,480,149, which are hereby incorporated herein by reference in their entireties. The imaging array sensor may comprise any suitable sensor, and may utilize various imaging sensors or imaging array sensors or cameras or the like, such as a CMOS imaging array sensor, a CCD sensor or other sensors or the like, such as the types described in U.S. Pat. Nos. 5,550,677; 5,670,935; 5,760,962; 5,715,093; 5,877,897; 6,922,292; 6,757,109; 6,717,610; 6,590,719; 6,201,642; 6,498,620; 5,796,094; 6,097,023; 6,320,176; 6,559,435; 6,831,261; 6,806,452; 6,396,397; 6,822,563; 6,946,978; 7,339,149; 7,038,577; 7,004,606; 7,720,580; and/or 7,965,336, and/or International Publication Nos. WO/2009/036176 and/or WO/2009/046268, which are all hereby incorporated herein by reference in their entireties.


The camera module and circuit chip or board and imaging sensor may be implemented and operated in connection with various vehicular vision-based systems, and/or may be operable utilizing the principles of such other vehicular systems, such as a vehicle headlamp control system, such as the type disclosed in U.S. Pat. Nos. 5,796,094; 6,097,023; 6,320,176; 6,559,435; 6,831,261; 7,004,606; 7,339,149; and/or 7,526,103, which are all hereby incorporated herein by reference in their entireties, a rain sensor, such as the types disclosed in commonly assigned U.S. Pat. Nos. 6,353,392; 6,313,454; 6,320,176; and/or 7,480,149, which are hereby incorporated herein by reference in their entireties, a vehicle vision system, such as a forwardly, sidewardly or rearwardly directed vehicle vision system utilizing principles disclosed in U.S. Pat. Nos. 5,550,677; 5,670,935; 5,760,962; 5,877,897; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; and/or 7,859,565, which are all hereby incorporated herein by reference in their entireties, a trailer hitching aid or tow check system, such as the type disclosed in U.S. Pat. No. 7,005,974, which is hereby incorporated herein by reference in its entirety, a reverse or sideward imaging system, such as for a lane change assistance system or lane departure warning system or for a blind spot or object detection system, such as imaging or detection systems of the types disclosed in U.S. Pat. Nos. 7,720,580; 7,038,577; 5,929,786 and/or 5,786,772, and/or U.S. patent applications, Ser. No. 11/239,980, filed Sep. 30, 2005, now U.S. Pat. No. 7,881,496, and/or U.S. provisional applications, Ser. No. 60/628,709, filed Nov. 17, 2004; Ser. No. 60/614,644, filed Sep. 30, 2004; Ser. No. 60/618,686, filed Oct. 14, 2004; Ser. No. 60/638,687, filed Dec. 23, 2004, which are hereby incorporated herein by reference in their entireties, a video device for internal cabin surveillance and/or video telephone function, such as disclosed in U.S. Pat. Nos. 5,760,962; 5,877,897; 6,690,268; and/or 7,370,983, and/or U.S. patent application Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. US-2006-0050018, which are hereby incorporated herein by reference in their entireties, a traffic sign recognition system, a system for determining a distance to a leading or trailing vehicle or object, such as a system utilizing the principles disclosed in U.S. Pat. Nos. 6,396,397 and/or 7,123,168, which are hereby incorporated herein by reference in their entireties, and/or the like.


Optionally, the circuit board or chip may include circuitry for the imaging array sensor and or other electronic accessories or features, such as by utilizing compass-on-a-chip or EC driver-on-a-chip technology and aspects such as described in U.S. Pat. No. 7,255,451 and/or U.S. Pat. No. 7,480,149; and/or U.S. patent application Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. US-2006-0061008, and/or Ser. No. 12/578,732, filed Oct. 14, 2009 and published Apr. 22, 2010 as U.S. Publication No. 2010/0097469, which are hereby incorporated herein by reference in their entireties.


Optionally, the vision system may include a display for displaying images captured by one or more of the imaging sensors for viewing by the driver of the vehicle while the driver is normally operating the vehicle. Optionally, for example, the vision system may include a video display device disposed at or in the interior rearview mirror assembly of the vehicle, such as by utilizing aspects of the video mirror display systems described in U.S. Pat. No. 6,690,268 and/or U.S. patent application Ser. No. 13/333,337, filed Dec. 21, 2011 and published Jun. 28, 2012 as U.S. Publication No. 2012/0162427, which are hereby incorporated herein by reference in their entireties. The video mirror display may comprise any suitable devices and systems and optionally may utilize aspects of the compass display systems described in U.S. Pat. Nos. 7,370,983; 7,329,013; 7,308,341; 7,289,037; 7,249,860; 7,004,593; 4,546,551; 5,699,044; 4,953,305; 5,576,687; 5,632,092; 5,677,851; 5,708,410; 5,737,226; 5,802,727; 5,878,370; 6,087,953; 6,173,508; 6,222,460; 6,513,252; and/or 6,642,851, and/or European patent application, published Oct. 11, 2000 under Publication No. EP 0 1043566, and/or U.S. patent application Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. US-2006-0061008, which are all hereby incorporated herein by reference in their entireties. Optionally, the video mirror display screen or device may be operable to display images captured by a rearward viewing camera of the vehicle during a reversing maneuver of the vehicle (such as responsive to the vehicle gear actuator being placed in a reverse gear position or the like) to assist the driver in backing up the vehicle, and optionally may be operable to display the compass heading or directional heading character or icon when the vehicle is not undertaking a reversing maneuver, such as when the vehicle is being driven in a forward direction along a road (such as by utilizing aspects of the display system described in International Publication No. WO 2012/051500, which is hereby incorporated herein by reference in its entirety).


Optionally, the vision system (utilizing the forward facing camera and a rearward facing camera and other cameras disposed at the vehicle with exterior fields of view) may be part of or may provide a display of a top-down view or birds-eye view system of the vehicle or a surround view at the vehicle, such as by utilizing aspects of the vision systems described International Publication Nos. WO 2010/099416; WO 2011/028686; WO 2012/075250; WO 2013/019795; WO 2012-075250; WO 2012/154919; WO 2012/0116043; WO 2012/0145501; and/or WO 2012/0145313, and/or PCT Application No. PCT/CA2012/000378, filed Apr. 25, 2012, and/or PCT Application No. PCT/US2012/066571, filed Nov. 27, 2012, and/or PCT Application No. PCT/US2012/068331, filed Dec. 7, 2012, and/or PCT Application No. PCT/US2013/022119, filed Jan. 18, 2013, and/or U.S. patent application Ser. No. 13/333,337, filed Dec. 21, 2011 and published Jun. 28, 2012 as U.S. Publication No. 2012/0162427, which are hereby incorporated herein by reference in their entireties.


Optionally, a video mirror display may be disposed rearward of and behind the reflective element assembly and may comprise a display such as the types disclosed in U.S. Pat. Nos. 5,530,240; 6,329,925; 7,855,755; 7,626,749; 7,581,859; 7,446,650; 7,370,983; 7,338,177; 7,274,501; 7,255,451; 7,195,381; 7,184,190; 5,668,663; 5,724,187 and/or 6,690,268, and/or in U.S. patent applications, Ser. No. 12/091,525, filed Apr. 25, 2008, now U.S. Pat. No. 7,855,755; Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. US-2006-0061008; and/or Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. US-2006-0050018, which are all hereby incorporated herein by reference in their entireties. The display is viewable through the reflective element when the display is activated to display information. The display element may be any type of display element, such as a vacuum fluorescent (VF) display element, a light emitting diode (LED) display element, such as an organic light emitting diode (OLED) or an inorganic light emitting diode, an electroluminescent (EL) display element, a liquid crystal display (LCD) element, a video screen display element or backlit thin film transistor (TFT) display element or the like, and may be operable to display various information (as discrete characters, icons or the like, or in a multi-pixel manner) to the driver of the vehicle, such as passenger side inflatable restraint (PSIR) information, tire pressure status, and/or the like. The mirror assembly and/or display may utilize aspects described in U.S. Pat. Nos. 7,184,190; 7,255,451; 7,446,924 and/or 7,338,177, which are all hereby incorporated herein by reference in their entireties. The thicknesses and materials of the coatings on the substrates of the reflective element may be selected to provide a desired color or tint to the mirror reflective element, such as a blue colored reflector, such as is known in the art and such as described in U.S. Pat. Nos. 5,910,854; 6,420,036; and/or 7,274,501, which are hereby incorporated herein by reference in their entireties.


Optionally, the display or displays and any associated user inputs may be associated with various accessories or systems, such as, for example, a tire pressure monitoring system or a passenger air bag status or a garage door opening system or a telematics system or any other accessory or system of the mirror assembly or of the vehicle or of an accessory module or console of the vehicle, such as an accessory module or console of the types described in U.S. Pat. Nos. 7,289,037; 6,877,888; 6,824,281; 6,690,268; 6,672,744; 6,386,742; and 6,124,886, and/or U.S. patent application Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. US-2006-0050018, which are hereby incorporated herein by reference in their entireties.


Changes and modifications to the specifically described embodiments may be carried out without departing from the principles of the present invention, which is intended to be limited only by the scope of the appended claims as interpreted according to the principles of patent law.

Claims
  • 1. A driver assistance system for a vehicle, said driver assistance system comprising: a plurality of sensors disposed at a vehicle equipped with said driver assistance system, wherein said plurality of sensors comprise at least one vehicle-based camera and at least one vehicle-based non-camera sensor, and wherein said plurality of sensors are operable to detect objects at least one of ahead of the equipped vehicle and sideward of the equipped vehicle;wherein said at least one vehicle-based non-camera sensor is selected from the group consisting of a vehicle-based radar sensor disposed at the equipped vehicle and sensing exterior of the equipped vehicle and a vehicle-based lidar sensor disposed at the equipped vehicle and sensing exterior of the equipped vehicle;wherein said driver assistance system includes a data processor operable to process data captured by said at least one vehicle-based non-camera sensor and said at least one vehicle-based camera to determine the presence of objects at least one of ahead of the equipped vehicle and sideward of the equipped vehicle;wherein, responsive to said data processing, said driver assistance system is operable to determine at least one of respective speeds of the determined objects and respective directions of travel of the determined objects;wherein said driver assistance system is operable to determine respective influence values for the determined objects and wherein determined respective influence values are based on a determined potential hazard to the equipped vehicle presented by the determined objects;wherein, responsive to the at least one of respective speeds and respective directions of travel of the determined objects and responsive to the determined respective influence values, a plurality of paths for the equipped vehicle is determined;wherein the determined plurality of paths is assessed by a decision algorithm that ranks each of the determined paths based on likelihood of collision along the respective determined path with one or more determined objects;wherein said driver assistance system ranks the determined paths based on (i) the determined speeds of the respective objects relative to the equipped vehicle for the respective determined path, (ii) determined distances to the respective objects from the equipped vehicle for the respective determined path, (iii) a level of steering intervention required for the equipped vehicle to follow the respective determined path, (iv) a level of braking required for the equipped vehicle to follow the respective determined path, and (v) a level of acceleration required for the equipped vehicle to follow the respective determined path;wherein said data processor processes data captured by said at least one vehicle-based non-camera sensor and said at least one vehicle-based camera to determine a respective type of object for the determined objects;wherein a selected path of travel is selected from the plurality of determined paths responsive at least in part to the rankings of the ranked paths and responsive at least in part to the determined types of objects along one or more of the determined paths; andwherein said driver assistance system selects the selected path of travel based in part on at least one of (i) a legislative consideration and (ii) an ethical consideration.
  • 2. The driver assistance system of claim 1, wherein said plurality of sensors comprises a plurality of vehicle-based cameras each having an exterior field of view.
  • 3. The driver assistance system of claim 1, wherein the respective influence values for the detected objects are weighted in the directions of respective speed vectors of the determined objects.
  • 4. The driver assistance system of claim 3, wherein the respective influence values for the detected objects are weighted in relation to magnitudes of the respective speed vectors.
  • 5. The driver assistance system of claim 1, wherein the determined respective influence values for the determined objects comprise weighted values with increased weighting for objects with greater speeds.
  • 6. The driver assistance system of claim 1, wherein the determined respective influence values for the determined objects are ranked according to their hazardous potential and the path of travel is selected responsive to the rankings of the determined respective influence values.
  • 7. The driver assistance system of claim 1, wherein said driver assistance system generates an alert to the driver that is indicative of the determined path of travel.
  • 8. The driver assistance system of claim 1, wherein said decision algorithm ranks the determined paths based at least in part on lengths of the determined paths.
  • 9. The driver assistance system of claim 8, wherein at least one of the determined objects comprises a vehicle.
  • 10. The driver assistance system of claim 1, wherein said driver assistance system is operable to at least in part control at least one of a brake system of the equipped vehicle, a steering system of the equipped vehicle and an accelerator system of the equipped vehicle to guide the equipped vehicle along the selected path of travel.
  • 11. The driver assistance system of claim 1, wherein said driver assistance system determines at least one alternative path of travel for the equipped vehicle to follow that limits conflict with the determined objects.
  • 12. The driver assistance system of claim 11, wherein the selected path of travel or the at least one alternative path of travel for the equipped vehicle to follow is selected by iterating an influence map current condition into a future map condition in time steps.
  • 13. The driver assistance system of claim 1, wherein said driver assistance system determines the selected path of travel based at least in part on a driving condition at the road being traveled by the equipped vehicle.
  • 14. A driver assistance system for a vehicle, said driver assistance system comprising:
a plurality of vehicle-based cameras disposed at a vehicle equipped with said driver assistance system;
at least one vehicle-based non-camera sensor disposed at the equipped vehicle, wherein said at least one non-camera sensor is selected from the group consisting of a vehicle-based radar sensor disposed at the equipped vehicle and sensing exterior of the equipped vehicle and a vehicle-based lidar sensor disposed at the equipped vehicle and sensing exterior of the equipped vehicle;
wherein said plurality of vehicle-based cameras and said vehicle-based non-camera sensor are operable to detect objects at least one of ahead of the equipped vehicle and sideward of the equipped vehicle;
wherein said driver assistance system includes a data processor operable to process data captured by said vehicle-based cameras and said vehicle-based non-camera sensor to determine the presence of objects at least one of ahead of the equipped vehicle and sideward of the equipped vehicle;
wherein, responsive to said data processing, said driver assistance system is operable to determine at least one of respective speeds of the determined objects and respective directions of travel of the determined objects;
wherein said driver assistance system is operable to determine respective influence values for the determined objects and wherein determined respective influence values are based on a determined potential hazard to the equipped vehicle presented by the determined objects;
wherein, responsive to the at least one of respective speeds and respective directions of travel of the determined objects and responsive to the determined respective influence values, a plurality of paths of travel for the equipped vehicle is determined;
wherein the determined plurality of paths is assessed by a decision algorithm that ranks each of the determined paths based on likelihood of collision along the respective determined path with one or more determined objects;
wherein said driver assistance system ranks the determined paths based on (i) the determined speeds of the respective objects relative to the equipped vehicle for the respective determined path, (ii) determined distances to the respective objects from the equipped vehicle for the respective determined path, (iii) a level of steering intervention required for the equipped vehicle to follow the respective determined path, (iv) a level of braking required for the equipped vehicle to follow the respective determined path, and (v) a level of acceleration required for the equipped vehicle to follow the respective determined path;
wherein said data processor processes data captured by said plurality of vehicle-based cameras and said vehicle-based non-camera sensor to determine a respective type of object for the determined objects;
wherein a selected path of travel is selected from the plurality of determined paths responsive at least in part to the rankings of the ranked paths and responsive at least in part to the determined types of objects along one or more of the determined paths;
wherein said driver assistance system selects the selected path of travel based in part on at least one of (i) a legislative consideration and (ii) an ethical consideration; and
wherein at least one of (i) said driver assistance system is operable to at least in part control at least one of a brake system of the equipped vehicle, a steering system of the equipped vehicle and an accelerator system of the equipped vehicle to guide the equipped vehicle along the selected path of travel and (ii) said driver assistance system generates an alert to the driver that is indicative of the selected path of travel.
  • 15. The driver assistance system of claim 14, wherein at least one of (i) the respective influence values for the detected objects are weighted in the directions of respective speed vectors of the determined objects, (ii) the respective influence values for the detected objects are weighted in relation to magnitudes of the respective speed vectors and (iii) the determined respective influence values for the determined objects comprise weighted values with increased weighting for objects with greater speeds.
  • 16. The driver assistance system of claim 14, wherein the determined respective influence values for the determined objects are ranked according to their hazardous potential and the path of travel is determined responsive to the rankings of the determined respective influence values.
  • 17. The driver assistance system of claim 14, wherein said decision algorithm ranks the determined paths based at least in part on lengths of the determined paths.
  • 18. The driver assistance system of claim 14, wherein said driver assistance system determines at least one alternative path of travel for the equipped vehicle to follow that limits conflict with the determined objects.
  • 19. A driver assistance system for a vehicle, said driver assistance system comprising:
a plurality of vehicle-based cameras disposed at a vehicle equipped with said driver assistance system;
at least one vehicle-based non-camera sensor disposed at the equipped vehicle, wherein said at least one vehicle-based non-camera sensor is selected from the group consisting of a vehicle-based radar sensor disposed at the equipped vehicle and sensing exterior of the equipped vehicle and a vehicle-based lidar sensor disposed at the equipped vehicle and sensing exterior of the equipped vehicle;
wherein said plurality of vehicle-based cameras and said vehicle-based non-camera sensor are operable to detect objects at least one of ahead of the equipped vehicle and sideward of the equipped vehicle;
wherein said driver assistance system includes a data processor operable to process data captured by said vehicle-based cameras and said vehicle-based non-camera sensor to determine the presence of objects at least one of ahead of the equipped vehicle and sideward of the equipped vehicle;
wherein, responsive to said data processing, said driver assistance system is operable to determine at least one of respective speeds of the determined objects and respective directions of travel of the determined objects;
wherein said driver assistance system is operable to determine respective influence values for the determined objects and wherein determined respective influence values are based on a determined potential hazard to the equipped vehicle presented by the determined objects;
wherein, responsive to the at least one of respective speeds and respective directions of travel of the determined objects and responsive to the determined respective influence values, a plurality of paths of travel for the equipped vehicle is determined;
wherein at least one of (i) the respective influence values for the detected objects are weighted in the directions of respective speed vectors of the determined objects, (ii) the respective influence values for the detected objects are weighted in relation to magnitudes of the respective speed vectors, (iii) the determined respective influence values for the determined objects comprise weighted values with increased weighting for objects with greater speeds and (iv) the determined respective influence values for the determined objects are ranked according to their hazardous potential and the plurality of paths of travel are determined responsive to the rankings of the determined respective influence values;
wherein the determined plurality of paths is assessed by a decision algorithm that ranks each of the determined paths based on likelihood of collision along the respective determined path with one or more determined objects;
wherein said driver assistance system ranks the determined paths based on (i) the determined speeds of the respective objects relative to the equipped vehicle for the respective determined path, (ii) determined distances to the respective objects from the equipped vehicle for the respective determined path, (iii) a level of steering intervention required for the equipped vehicle to follow the respective determined path, (iv) a level of braking required for the equipped vehicle to follow the respective determined path, and (v) a level of acceleration required for the equipped vehicle to follow the respective determined path;
wherein said data processor processes data captured by said plurality of vehicle-based cameras and said vehicle-based non-camera sensor to determine respective types of objects for the determined objects;
wherein a selected path of travel is selected from the plurality of determined paths responsive at least in part to the rankings of the ranked paths and responsive at least in part to the determined types of objects along one or more of the determined paths; and
wherein said driver assistance system selects the selected path of travel based in part on at least one of (i) a legislative consideration and (ii) an ethical consideration.
  • 20. The driver assistance system of claim 19, wherein said decision algorithm ranks the determined paths based at least in part on lengths of the determined paths.
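The following sketches are editorial illustrations, not text of the patent. The first is a minimal sketch of the influence mapping recited in the claims above, assuming a 2-D grid representation: each determined object contributes an influence value that grows with its hazard potential and speed (claim 5), and that is weighted in the direction (claim 3) and in relation to the magnitude (claim 4) of its speed vector. All function names, parameters, and constants here are illustrative assumptions.

```python
# Illustrative sketch only; a grid-based influence map under assumed conventions.
import numpy as np

def build_influence_map(objects, grid_shape=(100, 100), cell_size=0.5):
    """objects: list of dicts with 'pos' (x, y) in meters, 'vel' (vx, vy) in m/s,
    and 'hazard', a scalar rating of the object's hazard potential (assumed)."""
    grid = np.zeros(grid_shape)
    ys, xs = np.mgrid[0:grid_shape[0], 0:grid_shape[1]]
    cell_centers = np.stack([xs * cell_size, ys * cell_size], axis=-1)
    for obj in objects:
        pos = np.asarray(obj["pos"], dtype=float)
        vel = np.asarray(obj["vel"], dtype=float)
        speed = np.linalg.norm(vel)
        offset = cell_centers - pos                    # vector from object to each cell
        dist = np.linalg.norm(offset, axis=-1) + 1e-6
        # Base influence falls off with distance and grows with speed (claim 5).
        base = obj["hazard"] * (1.0 + speed) / (1.0 + dist)
        if speed > 0:
            heading = vel / speed
            # Cells lying ahead of the object along its speed vector get extra
            # weight (claim 3), scaled by the vector's magnitude (claim 4).
            ahead = np.clip((offset @ heading) / dist, 0.0, 1.0)
            base *= 1.0 + ahead * speed
        grid += base
    return grid
```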
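The next sketch illustrates one plausible form of the decision algorithm named in claims 1, 14 and 19: each candidate path is scored on the five recited factors (relative object speed, object distance, and the steering, braking and acceleration levels the path demands), with path length added for dependent claims 8, 17 and 20. The linear cost form and the weights are assumptions for illustration, not the patent's method.

```python
# Illustrative sketch only; ranks candidate paths best-first by an assumed cost.
from dataclasses import dataclass

@dataclass
class CandidatePath:
    max_relative_speed: float    # highest closing speed to any object (m/s)
    min_object_distance: float   # closest approach to any object (m)
    steering_effort: float       # normalized 0..1 level of steering intervention
    braking_effort: float        # normalized 0..1 level of braking
    accel_effort: float          # normalized 0..1 level of acceleration
    length: float                # path length (m); see claims 8, 17, 20

def rank_paths(paths, weights=(1.0, 1.0, 0.5, 0.5, 0.5, 0.1)):
    """Return paths sorted best-first; lower cost stands in for lower
    likelihood of collision along the path."""
    w_speed, w_dist, w_steer, w_brake, w_accel, w_len = weights
    def cost(p):
        return (w_speed * p.max_relative_speed
                + w_dist / (p.min_object_distance + 1e-6)  # closer objects cost more
                + w_steer * p.steering_effort
                + w_brake * p.braking_effort
                + w_accel * p.accel_effort
                + w_len * p.length)
    return sorted(paths, key=cost)
```

In use, the top-ranked path would then be filtered against the determined object types and the legislative or ethical considerations recited in the independent claims before being selected.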
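Finally, claim 12 selects a path by iterating the influence map's current condition into future map conditions in time steps. A minimal sketch of that iteration, assuming constant-velocity object prediction and reusing the hypothetical build_influence_map() from the first sketch; the horizon and step size are likewise assumptions.

```python
# Illustrative sketch only; rolls the influence map forward in time steps.
def predict_maps(objects, n_steps=10, dt=0.2, **map_kwargs):
    """Yield (t, influence_map) for each future time step, with each object
    advanced along its current speed vector."""
    for k in range(1, n_steps + 1):
        t = k * dt
        future = [
            {**obj, "pos": (obj["pos"][0] + obj["vel"][0] * t,
                            obj["pos"][1] + obj["vel"][1] * t)}
            for obj in objects
        ]
        yield t, build_influence_map(future, **map_kwargs)

def path_cost_over_time(path_cells, maps):
    """Sum the influence a path encounters, pairing each time step's map with
    the (row, col) grid cell the vehicle would occupy at that step."""
    return sum(grid[cell] for (_, grid), cell in zip(maps, path_cells))
```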
US Referenced Citations (373)
Number Name Date Kind
4720790 Miki et al. Jan 1988 A
4987357 Masaki Jan 1991 A
4991054 Walters Feb 1991 A
5001558 Burley et al. Mar 1991 A
5003288 Wilhelm Mar 1991 A
5012082 Watanabe Apr 1991 A
5016977 Baude et al. May 1991 A
5027001 Torbert Jun 1991 A
5027200 Petrossian et al. Jun 1991 A
5044706 Chen Sep 1991 A
5055668 French Oct 1991 A
5059877 Teder Oct 1991 A
5064274 Alten Nov 1991 A
5072154 Chen Dec 1991 A
5073012 Lynam Dec 1991 A
5076673 Lynam et al. Dec 1991 A
5086253 Lawler Feb 1992 A
5096287 Kakinami et al. Mar 1992 A
5097362 Lynas Mar 1992 A
5115346 Lynam May 1992 A
5121200 Choi Jun 1992 A
5124549 Michaels et al. Jun 1992 A
5130709 Toyama et al. Jul 1992 A
5148014 Lynam Sep 1992 A
5151816 Varaprasad et al. Sep 1992 A
5168378 Black Dec 1992 A
5170374 Shimohigashi et al. Dec 1992 A
5172235 Wilm et al. Dec 1992 A
5177685 Davis et al. Jan 1993 A
5182502 Slotkowski et al. Jan 1993 A
5184956 Langlais et al. Feb 1993 A
5189561 Hong Feb 1993 A
5193000 Lipton et al. Mar 1993 A
5193029 Schofield Mar 1993 A
5204778 Bechtel Apr 1993 A
5208701 Maeda May 1993 A
5245422 Borcherts et al. Sep 1993 A
5253109 O'Farrell Oct 1993 A
5255442 Schierbeek et al. Oct 1993 A
5276389 Levers Jan 1994 A
5285060 Larson et al. Feb 1994 A
5289182 Brillard et al. Feb 1994 A
5289321 Secor Feb 1994 A
5305012 Faris Apr 1994 A
5307136 Saneyoshi Apr 1994 A
5309137 Kajiwara May 1994 A
5313072 Vachss May 1994 A
5325096 Pakett Jun 1994 A
5325386 Jewell et al. Jun 1994 A
5329206 Slotkowski et al. Jul 1994 A
5331312 Kudoh Jul 1994 A
5336980 Levers Aug 1994 A
5341437 Nakayama Aug 1994 A
5351044 Mathur et al. Sep 1994 A
5355118 Fukuhara Oct 1994 A
5374852 Parkes Dec 1994 A
5386285 Asayama Jan 1995 A
5394333 Kao Feb 1995 A
5406395 Wilson et al. Apr 1995 A
5406414 O'Farrell et al. Apr 1995 A
5410346 Saneyoshi et al. Apr 1995 A
5414257 Stanton May 1995 A
5414461 Kishi et al. May 1995 A
5416313 Larson et al. May 1995 A
5416318 Hegyi May 1995 A
5416478 Morinaga May 1995 A
5424952 Asayama Jun 1995 A
5426294 Kobayashi et al. Jun 1995 A
5430431 Nelson Jul 1995 A
5434407 Bauer et al. Jul 1995 A
5440428 Hegg et al. Aug 1995 A
5444478 Lelong et al. Aug 1995 A
5451822 Bechtel et al. Sep 1995 A
5457493 Leddy et al. Oct 1995 A
5461357 Yoshioka et al. Oct 1995 A
5461361 Moore Oct 1995 A
5469298 Suman et al. Nov 1995 A
5471515 Fossum et al. Nov 1995 A
5475494 Nishida et al. Dec 1995 A
5497306 Pastrick Mar 1996 A
5498866 Bendicks et al. Mar 1996 A
5500766 Stonecypher Mar 1996 A
5510983 Iino Apr 1996 A
5515448 Nishitani May 1996 A
5521633 Nakajima et al. May 1996 A
5528698 Kamei et al. Jun 1996 A
5529138 Shaw et al. Jun 1996 A
5530240 Larson et al. Jun 1996 A
5530420 Tsuchiya et al. Jun 1996 A
5535314 Alves et al. Jul 1996 A
5537003 Bechtel et al. Jul 1996 A
5539397 Asanuma et al. Jul 1996 A
5541590 Nishio Jul 1996 A
5550677 Schofield et al. Aug 1996 A
5555555 Sato et al. Sep 1996 A
5568027 Teder Oct 1996 A
5574443 Hsieh Nov 1996 A
5581464 Woll et al. Dec 1996 A
5594222 Caldwell Jan 1997 A
5610756 Lynam et al. Mar 1997 A
5614788 Mullins Mar 1997 A
5619370 Guinosso Apr 1997 A
5632092 Blank et al. May 1997 A
5634709 Iwama Jun 1997 A
5642299 Hardin et al. Jun 1997 A
5648835 Uzawa Jul 1997 A
5650944 Kise Jul 1997 A
5660454 Mori et al. Aug 1997 A
5661303 Teder Aug 1997 A
5666028 Bechtel et al. Sep 1997 A
5670935 Schofield et al. Sep 1997 A
5677851 Kingdon et al. Oct 1997 A
5699044 Van Lente et al. Dec 1997 A
5724316 Brunts Mar 1998 A
5732379 Eckert et al. Mar 1998 A
5737226 Olson et al. Apr 1998 A
5760828 Cortes Jun 1998 A
5760931 Saburi et al. Jun 1998 A
5760962 Schofield et al. Jun 1998 A
5761094 Olson et al. Jun 1998 A
5765116 Wilson-Jones et al. Jun 1998 A
5765118 Fukatani Jun 1998 A
5781437 Wiemer et al. Jul 1998 A
5786772 Schofield et al. Jul 1998 A
5790403 Nakayama Aug 1998 A
5790973 Blaker et al. Aug 1998 A
5793308 Rosinski et al. Aug 1998 A
5793420 Schmidt Aug 1998 A
5796094 Schofield et al. Aug 1998 A
5835255 Miles Nov 1998 A
5837994 Stam et al. Nov 1998 A
5844505 Van Ryzin Dec 1998 A
5844682 Kiyomoto et al. Dec 1998 A
5845000 Breed et al. Dec 1998 A
5848802 Breed et al. Dec 1998 A
5850176 Kinoshita et al. Dec 1998 A
5850254 Takano et al. Dec 1998 A
5867591 Onda Feb 1999 A
5877707 Kowalick Mar 1999 A
5877897 Schofield et al. Mar 1999 A
5878357 Sivashankar et al. Mar 1999 A
5878370 Olson Mar 1999 A
5883739 Ashihara et al. Mar 1999 A
5884212 Lion Mar 1999 A
5890021 Onoda Mar 1999 A
5896085 Mori et al. Apr 1999 A
5899956 Chan May 1999 A
5915800 Hiwatashi et al. Jun 1999 A
5923027 Stam et al. Jul 1999 A
5924212 Domanski Jul 1999 A
5929786 Schofield et al. Jul 1999 A
5949331 Schofield et al. Sep 1999 A
5959555 Furuta Sep 1999 A
5963247 Banitt Oct 1999 A
5986796 Miles Nov 1999 A
5990469 Bechtel et al. Nov 1999 A
5990649 Nagao et al. Nov 1999 A
6020704 Buschur Feb 2000 A
6049171 Stam et al. Apr 2000 A
6066933 Ponziana May 2000 A
6084519 Coulling et al. Jul 2000 A
6097023 Schofield et al. Aug 2000 A
6097024 Stam et al. Aug 2000 A
6144022 Tenenbaum et al. Nov 2000 A
6175300 Kendrick Jan 2001 B1
6178034 Allemand et al. Jan 2001 B1
6198409 Schofield et al. Mar 2001 B1
6201642 Bos et al. Mar 2001 B1
6222447 Schofield et al. Apr 2001 B1
6223114 Boros et al. Apr 2001 B1
6227689 Miller May 2001 B1
6250148 Lynam Jun 2001 B1
6266082 Yonezawa et al. Jul 2001 B1
6266442 Laumeyer et al. Jul 2001 B1
6285393 Shimoura et al. Sep 2001 B1
6294989 Schofield et al. Sep 2001 B1
6297781 Turnbull et al. Oct 2001 B1
6302545 Schofield et al. Oct 2001 B1
6310611 Caldwell Oct 2001 B1
6313454 Bos et al. Nov 2001 B1
6317057 Lee Nov 2001 B1
6320176 Schofield et al. Nov 2001 B1
6320282 Caldwell Nov 2001 B1
6333759 Mazzilli Dec 2001 B1
6341523 Lynam Jan 2002 B2
6353392 Schofield et al. Mar 2002 B1
6370329 Teuchert Apr 2002 B1
6392315 Jones et al. May 2002 B1
6396397 Bos et al. May 2002 B1
6411204 Bloomfield et al. Jun 2002 B1
6420975 DeLine et al. Jul 2002 B1
6424273 Gutta et al. Jul 2002 B1
6430303 Naoi et al. Aug 2002 B1
6442465 Breed et al. Aug 2002 B2
6477464 McCarthy et al. Nov 2002 B2
6497503 Dassanayake et al. Dec 2002 B1
6498620 Schofield et al. Dec 2002 B2
6516664 Lynam Feb 2003 B2
6523964 Schofield et al. Feb 2003 B2
6534884 Marcus et al. Mar 2003 B2
6539306 Turnbull Mar 2003 B2
6547133 DeVries, Jr. et al. Apr 2003 B1
6553130 Lemelson et al. Apr 2003 B1
6559435 Schofield et al. May 2003 B2
6574033 Chui et al. Jun 2003 B1
6589625 Kothari et al. Jul 2003 B1
6594583 Ogura et al. Jul 2003 B2
6611202 Schofield et al. Aug 2003 B2
6611610 Stam et al. Aug 2003 B1
6636258 Strumolo Oct 2003 B2
6650455 Miles Nov 2003 B2
6672731 Schnell et al. Jan 2004 B2
6674562 Miles Jan 2004 B1
6678614 McCarthy et al. Jan 2004 B2
6680792 Miles Jan 2004 B2
6690268 Schofield et al. Feb 2004 B2
6700605 Toyoda et al. Mar 2004 B1
6704621 Stein et al. Mar 2004 B1
6710908 Miles et al. Mar 2004 B2
6711474 Treyz et al. Mar 2004 B1
6714331 Lewis et al. Mar 2004 B2
6717610 Bos et al. Apr 2004 B1
6735506 Breed et al. May 2004 B2
6741377 Miles May 2004 B2
6744353 Sjönell Jun 2004 B2
6757109 Bos Jun 2004 B2
6762867 Lippert et al. Jul 2004 B2
6794119 Miles Sep 2004 B2
6795221 Urey Sep 2004 B1
6802617 Schofield et al. Oct 2004 B2
6806452 Bos et al. Oct 2004 B2
6819231 Berberich et al. Nov 2004 B2
6822563 Bos et al. Nov 2004 B2
6823241 Shirato et al. Nov 2004 B2
6824281 Schofield et al. Nov 2004 B2
6831261 Schofield et al. Dec 2004 B2
6850156 Bloomfield et al. Feb 2005 B2
6882287 Schofield Apr 2005 B2
6889161 Winner et al. May 2005 B2
6891563 Schofield et al. May 2005 B2
6909753 Meehan et al. Jun 2005 B2
6946978 Schofield Sep 2005 B2
6953253 Schofield et al. Oct 2005 B2
6968736 Lynam Nov 2005 B2
6975775 Rykowski et al. Dec 2005 B2
6989736 Berberich et al. Jan 2006 B2
7004606 Schofield Feb 2006 B2
7005974 McMahon et al. Feb 2006 B2
7038577 Pawlicki et al. May 2006 B2
7062300 Kim Jun 2006 B1
7065432 Moisel et al. Jun 2006 B2
7079017 Lang et al. Jul 2006 B2
7085637 Breed et al. Aug 2006 B2
7092548 Laumeyer et al. Aug 2006 B2
7111968 Bauer et al. Sep 2006 B2
7116246 Winter et al. Oct 2006 B2
7123168 Schofield Oct 2006 B2
7145519 Takahashi et al. Dec 2006 B2
7149613 Stam et al. Dec 2006 B2
7161616 Okamoto et al. Jan 2007 B1
7167796 Taylor et al. Jan 2007 B2
7195381 Lynam et al. Mar 2007 B2
7202776 Breed Apr 2007 B2
7205904 Schofield Apr 2007 B2
7227459 Bos et al. Jun 2007 B2
7227611 Hull et al. Jun 2007 B2
7311406 Schofield et al. Dec 2007 B2
7325934 Schofield et al. Feb 2008 B2
7325935 Schofield et al. Feb 2008 B2
7338177 Lynam Mar 2008 B2
7339149 Schofield et al. Mar 2008 B1
7344261 Schofield et al. Mar 2008 B2
7355524 Schofield Apr 2008 B2
7370983 De Wind et al. May 2008 B2
7380948 Schofield et al. Jun 2008 B2
7388182 Schofield et al. Jun 2008 B2
7402786 Schofield et al. Jul 2008 B2
7423248 Schofield et al. Sep 2008 B2
7425076 Schofield et al. Sep 2008 B2
7446650 Schofield et al. Nov 2008 B2
7459664 Schofield et al. Dec 2008 B2
7460951 Altan Dec 2008 B2
7480149 DeWard et al. Jan 2009 B2
7490007 Taylor et al. Feb 2009 B2
7492281 Lynam et al. Feb 2009 B2
7526103 Schofield et al. Apr 2009 B2
7561181 Schofield et al. Jul 2009 B2
7581859 Lynam Sep 2009 B2
7592928 Chinomi et al. Sep 2009 B2
7616781 Schofield et al. Nov 2009 B2
7619508 Lynam et al. Nov 2009 B2
7639149 Katoh Dec 2009 B2
7681960 Wanke et al. Mar 2010 B2
7720580 Higgins-Luthman May 2010 B2
7777611 Desai Aug 2010 B2
7855755 Weller et al. Dec 2010 B2
7859565 Schofield et al. Dec 2010 B2
7881496 Camilleri et al. Feb 2011 B2
7914187 Higgins-Luthman et al. Mar 2011 B2
7965336 Bingle et al. Jun 2011 B2
8013780 Lynam et al. Sep 2011 B2
8027029 Lu et al. Sep 2011 B2
8058977 Lynam Nov 2011 B2
8340866 Hanzawa et al. Dec 2012 B2
8694224 Chundrlik, Jr. et al. Apr 2014 B2
8849495 Chundrlik, Jr. et al. Sep 2014 B2
20020015153 Downs Feb 2002 A1
20020044065 Quist et al. Apr 2002 A1
20020113873 Williams Aug 2002 A1
20020159270 Lynam et al. Oct 2002 A1
20030137586 Lewellen Jul 2003 A1
20030179084 Skrbina Sep 2003 A1
20030222982 Hamdan et al. Dec 2003 A1
20030227777 Schofield Dec 2003 A1
20040012488 Schofield Jan 2004 A1
20040016870 Pawlicki et al. Jan 2004 A1
20040032321 McMahon et al. Feb 2004 A1
20040051634 Schofield et al. Mar 2004 A1
20040114381 Salmeen et al. Jun 2004 A1
20040128065 Taylor et al. Jul 2004 A1
20040200948 Bos et al. Oct 2004 A1
20050078389 Kulas et al. Apr 2005 A1
20050134966 Burgner Jun 2005 A1
20050134983 Lynam Jun 2005 A1
20050146792 Schofield et al. Jul 2005 A1
20050169003 Lindahl et al. Aug 2005 A1
20050195488 McCabe et al. Sep 2005 A1
20050200700 Schofield et al. Sep 2005 A1
20050232469 Schofield et al. Oct 2005 A1
20050264891 Uken et al. Dec 2005 A1
20060018511 Stam et al. Jan 2006 A1
20060018512 Stam et al. Jan 2006 A1
20060028731 Schofield et al. Feb 2006 A1
20060050018 Hutzel et al. Mar 2006 A1
20060061008 Karner et al. Mar 2006 A1
20060091813 Stam et al. May 2006 A1
20060103727 Tseng May 2006 A1
20060164230 DeWind et al. Jul 2006 A1
20060250501 Wildmann et al. Nov 2006 A1
20060290479 Akatsuka et al. Dec 2006 A1
20070023613 Schofield et al. Feb 2007 A1
20070104476 Yasutomi et al. May 2007 A1
20070109406 Schofield et al. May 2007 A1
20070109651 Schofield et al. May 2007 A1
20070109652 Schofield et al. May 2007 A1
20070109653 Schofield et al. May 2007 A1
20070109654 Schofield et al. May 2007 A1
20070120657 Schofield et al. May 2007 A1
20070176080 Schofield et al. Aug 2007 A1
20080180529 Taylor et al. Jul 2008 A1
20090037052 Ogasawara et al. Feb 2009 A1
20090113509 Tseng et al. Apr 2009 A1
20090177347 Breuer et al. Jul 2009 A1
20090243824 Peterson et al. Oct 2009 A1
20090244361 Gebauer et al. Oct 2009 A1
20090295181 Lawlor et al. Dec 2009 A1
20100020170 Higgins-Luthman et al. Jan 2010 A1
20100045797 Schofield et al. Feb 2010 A1
20100097469 Blank et al. Apr 2010 A1
20100228437 Hanzawa et al. Sep 2010 A1
20110246156 Zecha et al. Oct 2011 A1
20120033196 Vanek et al. Feb 2012 A1
20120062743 Lynam et al. Mar 2012 A1
20120218412 Dellantoni et al. Aug 2012 A1
20120245817 Cooprider et al. Sep 2012 A1
20120262340 Hassan et al. Oct 2012 A1
20120303222 Cooprider et al. Nov 2012 A1
20130124052 Hahne May 2013 A1
20130131918 Hahne May 2013 A1
20130231825 Chundrlik, Jr. Sep 2013 A1
20140067206 Pflug Mar 2014 A1
20140309884 Wolf Oct 2014 A1
20140313339 Diessner Oct 2014 A1
Foreign Referenced Citations (1)
Number Date Country
WO2013081985 Jun 2013 WO
Related Publications (1)
Number Date Country
20140067206 A1 Mar 2014 US
Provisional Applications (2)
Number Date Country
61718382 Oct 2012 US
61696416 Sep 2012 US