The present disclosure is directed to a frontal section camera arrangement for a vehicle, to a vehicle including the frontal section camera arrangement, to a method for operating a frontal section camera arrangement and to a computer program.
In some jurisdictions, commercial vehicles are legally required to be equipped with Lane Departure Warning Systems that are typically established by camera systems using front-looking cameras situated behind the windshield. This particular mounting position has the disadvantage that the very near field in front of a commercial vehicle cannot be observed by that camera, especially in the case of US trucks with the cabin behind the engine.
Another disadvantage is that additional advanced driver assistance system (ADAS) functions, such as the detection of smaller objects like, for example, traffic signs or lights, cannot be accomplished because of the shaking of the suspended cabin. Yet another disadvantage is that the direct field of vision is limited by the camera head itself, which has a negative impact on the New Car Assessment Programme (NCAP) rating.
Document WO 2019/209791 A1 presents a vehicle sensor cleaning system that includes one or more vehicle sensors, including external view cameras, and a cleaning device. The system determines parameters for a cleaning event based on sensed information, operating parameters of the vehicle, or environmental information. The system cleans the one or more sensors to allow for safe operation of the vehicle.
There is still a need to improve cleaning especially of a camera arrangement.
An object of the disclosure is to provide an improved arrangement and method, namely at least a frontal section camera arrangement with improved cleaning and a method for operating the frontal section camera arrangement. In particular it is an object to implement in a frontal section camera arrangement a fully automated cleaning system and further enable an optimized consumption of the available cleaning resources for the frontal section camera arrangement.
The frontal section camera arrangement includes a camera arrangement and a frontal section appliance for being arranged on a frontal section of an exterior of the vehicle. The camera arrangement includes a camera unit that is arranged on the frontal section appliance for the frontal section of the exterior of the vehicle, which is, in particular, a chassis or bumper of the vehicle, and configured to provide image data.
The camera arrangement also includes a camera cleaning unit adapted to clean the camera unit of the camera arrangement, wherein the camera cleaning unit includes a compressed air provision unit for providing compressed air for use in an air-based cleaning process for the camera unit, and also includes a liquid provision unit for providing a liquid for use in a liquid-based cleaning process for the camera unit.
The camera cleaning unit is configured to receive operation instructions for driving the camera cleaning unit in the air-based cleaning process and/or the liquid-based cleaning process. The camera arrangement further includes a cleaning control unit that is connected to the camera unit for receiving the image data and is also connected to the camera cleaning unit for providing the operation instructions. In the frontal section camera arrangement of the first aspect of the disclosure, the cleaning control unit is advantageously configured to detect a contamination on the camera unit and, upon detecting the contamination on the camera unit, to activate the air-based cleaning process by providing an operation instruction indicative thereof. Further, upon determining that the contamination is still on the camera unit after the air-based cleaning process has been performed for a predetermined process time, the cleaning control unit is configured to additionally or alternatively activate the liquid-based cleaning process by providing an operation instruction indicative thereof.
Thus, the frontal section camera arrangement, which, upon operation, is intended to be arranged via the frontal section appliance on the exterior of the vehicle, in particular on the chassis, bumper or any other part of the vehicle that is decoupled from shaking movements of the cabin caused by vehicle dynamic effects, enables a fixed orientation of the camera unit while providing a field of view that is not obstructed by vehicle components such as wipers, motor compartment, et cetera, and does not obstruct the field of view of the driver. The downside of this exterior mounting position, however, is that the camera unit is exposed to contamination, so that cleaning of the camera lens becomes necessary. Because the camera is typically, but not necessarily, configured to fulfill ADAS functions, the driver usually does not see its image and therefore cannot supervise its cleanliness, so that automatic cleaning is beneficial. However, if contamination detection is triggered too often in known cleaning systems, the limited cleaning resources such as water or other cleaning liquids may be used in a non-efficient manner. The frontal section camera arrangement therefore has a camera cleaning unit that is operable in different operation processes, namely an air-based cleaning process that uses compressed air from a compressed air provision unit, and a liquid-based cleaning process that uses a liquid provided by a liquid provision unit.
First, upon detection of a contamination on the camera unit, the air-based cleaning process is activated. Contamination refers to any material located on the camera unit and obstructing its intended field of view, and includes, for example, mud, dust, water drops, snow, ice, oil, grease, insects, etc. If after a predetermined process time, the contamination on the camera unit has not disappeared, the cleaning control unit alternatively or additionally activates the liquid-based cleaning process, thus saving the liquid resource in cases where the air-based cleaning process is sufficient to eliminate the contamination and clean the camera unit.
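By way of illustration only, a minimal Python sketch of this resource-saving two-stage cascade is given below; the function names, the injected callables and the numeric process time are assumptions introduced for this example and are not part of the disclosure.

```python
from typing import Callable

AIR_PROCESS_TIME_S = 2.0  # illustrative value for the predetermined process time

def clean_camera(
    contamination_detected: Callable[[], bool],  # e.g. image-based blockage detection
    run_air_cleaning: Callable[[float], None],   # drives the compressed air provision unit
    run_liquid_cleaning: Callable[[], None],     # drives the liquid provision unit
) -> None:
    """Two-stage cascade: compressed air first, liquid only if air was not sufficient."""
    if not contamination_detected():
        return                                   # lens already clean, no resources consumed
    run_air_cleaning(AIR_PROCESS_TIME_S)         # air-based cleaning for the predetermined time
    if contamination_detected():
        run_liquid_cleaning()                    # air alone insufficient: escalate to liquid
```

In this sketch the liquid resource is only consumed when the check after the predetermined process time shows that the air-based cleaning process was not sufficient.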
Thus, the frontal section camera arrangement provides an efficient and fully automated solution for removing contamination of the camera unit located in an exterior of the vehicle, while further enabling an optimized consumption of the available cleaning resources, for instance, the liquid used in the liquid-based cleaning process.
In the following, embodiments of the frontal section camera arrangement of the first aspect of the disclosure will be described.
In an embodiment, the camera unit can be mounted or arranged at the foremost position of the vehicle. By mounting the camera at the foremost position of the vehicle, the blind areas in front of the vehicle are reduced or eliminated, because the motor compartment is taken out of the camera's field of view. Furthermore, the effect of cabin movements propagating to the camera and affecting the stability of its orientation is mitigated by assembling the camera on the chassis. Since the exterior mounting position renders a camera in the cabin obsolete, the direct field of vision for the driver is also increased.
In another embodiment, the frontal section camera arrangement includes a wiping unit including a wiping element and a wiper actuator. The cleaning control unit is further configured to activate the wiping unit for wiping the camera unit, in particular during the liquid-based cleaning process, in particular after provision of the liquid.
In an embodiment, the cleaning control unit is further configured to identify a type of contamination from a predetermined list of identifiable types of contaminations, and to directly select a corresponding one of the air-based cleaning process and the liquid-based cleaning process in dependence on the identified type of contamination and a predetermined association rule between identifiable types of contaminations and cleaning processes. Suitable identifiable contaminants or types of contamination include, for example, dust, mud, water, snow, ice, oil, grease and/or insects. Each of the types of contaminations is associated via a predetermined association rule with a starting cleaning process. For example, in a particular embodiment, in the case of mud having been identified as a contaminant from a list of identifiable types of contaminations, it is assumed that an air-based cleaning process with a low air flow will not suffice to clean the camera unit, and the cleaning control unit directly provides an operation instruction indicative of an air-based cleaning process with a high air flow, or of a liquid-based cleaning process, without having to operate the air-based cleaning process in advance.
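The following short sketch illustrates, purely as an example, how such an association rule could be represented; the listed contaminant names and the assignment of cleaning processes are assumptions chosen to mirror the mud example above.

```python
# Illustrative association rule between identifiable types of contamination and the
# cleaning process to start with; the concrete mapping is application-specific.
ASSOCIATION_RULE = {
    "dust":    "air_low_flow",
    "water":   "air_low_flow",
    "snow":    "air_high_flow",
    "mud":     "liquid",          # assumed too adherent for a low air flow alone
    "ice":     "liquid",
    "oil":     "air_and_liquid",
    "grease":  "air_and_liquid",
    "insects": "liquid",
}

def select_starting_process(contamination_type: str) -> str:
    # Unknown types fall back to the escalating cascade described further below.
    return ASSOCIATION_RULE.get(contamination_type, "cascade")
```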
In an embodiment, the cleaning control unit can be further connected to one or more sensing units of the vehicle for receiving corresponding sensing data. In this particular embodiment, the cleaning control unit is further configured to detect and/or identify the contamination using the sensing data provided or ascertained by the sensing units.
In particular, the sensing units are connected to the cleaning control unit and configured to provide the sensing data used for detecting and/or identifying the contamination on the camera unit and may include one or more of the following sensors.
A wiper status sensor for sensing the state of a wiper unit for wiping, in particular, the windshield and/or the headlights. Typically, the wipers are automatically activated when dedicated sensors detect rain. Thus, the state of the wipers is an indication of an expected presence of water on the camera unit. For example, depending on the frequency of the wiper, the air-based cleaning process is activated for drying, wherein the cleaning interval is adjusted according to that frequency. In a predictive step, it is estimated to which degree the camera unit is obstructed with raindrops so that the cleaning process can be selected accordingly.
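As an illustration of how the wiper frequency could be translated into a drying interval, a small sketch follows; the numerical values are assumptions made only for this example.

```python
def drying_interval_s(wiper_frequency_hz: float) -> float:
    """Illustrative rule: the faster the wipers run, the more rain is expected on the
    camera unit, so the air-based drying process is repeated more often."""
    if wiper_frequency_hz <= 0.0:
        return float("inf")        # wipers off: no periodic drying required
    base_interval_s = 60.0         # assumed drying interval at a wiper frequency of 1 Hz
    return max(5.0, base_interval_s / wiper_frequency_hz)
```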
A radar sensor, preferably in a frontward oriented mounting position close to the camera unit. The camera arrangement can be operated together with a front looking radar as a radar sensor. When the data from two different sensors such as the camera and the radar need to be merged, it is of great advantage if these sensors are mounted on a common body, so that their relative orientation is fixed. The radar sensor is usually arranged alongside the camera unit in a frontward-oriented mounting position. Here, the cleaning control unit activates the respective cleaning process based on a determination of whether an object observed by the radar (for example, a pedestrian, tree, streetlight, sign, etc.) is also detected by the camera unit itself. The sensing data from the radar sensor is thus compared to the image data provided by the camera unit. If the object or objects detected by the radar are not detected by the camera, or vice versa, the cascade of cleaning processes is initiated as described above. Using contextual information from the radar sensor, the logic for activating the corresponding cleaning process could be, in a particular example, and given that the respective objects are detectable by both sensors, as follows:
First, an object is detected by the radar. If the object is also detected by the camera unit, no cleaning process is activated. If, however, the object is not detected by the camera unit, the cleaning control unit activates the corresponding cleaning process.
Optionally, the predictive step for activating the cleaning process performed by the cleaning control unit takes account of the different detection ranges of the sensors, so that the occurrence of an object that is detected by the radar in the far range can be predicted for the camera in the near field.
If the radar sensor does not detect any object for a certain time, the situation can be classified as non-critical and a cleaning of the camera unit can be initiated.
In addition, the radar sensor can identify if an object is located in front of the camera unit, so that it can be expected that the occlusion is only temporary, and no activation of the cleaning process is required.
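Purely for illustration, the radar cross-check described above could be expressed as follows; the data structures, the assumed camera detection range and the identifier-based matching are simplifications introduced for this sketch.

```python
from dataclasses import dataclass

@dataclass
class RadarTrack:
    object_id: int        # identifier of the tracked object
    distance_m: float     # distance of the object from the sensor

def radar_cross_check(
    radar_tracks: list[RadarTrack],
    camera_object_ids: set[int],       # objects re-identified in the camera image data
    camera_range_m: float = 80.0,      # assumed near-field detection range of the camera
) -> bool:
    """Return True if a cleaning process should be activated.

    An object observed by the radar within the camera's detection range but not
    detected by the camera unit indicates a possible contamination."""
    for track in radar_tracks:
        if track.distance_m <= camera_range_m and track.object_id not in camera_object_ids:
            return True                # expected object missing in the image data
    return False
```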
Another sensor that can additionally or alternatively be used is a LIDAR sensor, preferably in a frontward oriented mounting position close to the camera unit.
Alternatively, or additionally, an ultrasound sensor, preferably in a frontward oriented mounting position close to the camera unit, and/or an infrared sensor, also preferably in a frontward oriented mounting position close to the camera unit are used for determination and provision of sensing data as explained above.
Further, in another embodiment, the frontal section camera arrangement additionally or alternatively includes an auxiliary camera unit that is different from the camera unit, for providing the sensing data. The auxiliary sensor is in this particular embodiment another camera that is preferably situated on a vehicle side different from that where the camera to be cleaned is installed. Examples are a pair of front and rear-view cameras, left and right side looking cameras, or cameras on the truck (front looking) and on the rear-end of the trailer. The cleaning demand is derived from the capability of detecting and re-identifying the same object, where in such constellations a temporal synchronization based on vehicle odometry and a prediction of the detected object needs to be considered. The logic for activating the corresponding cleaning process is, in a particular example, as follows: if a first sensor, for example a first camera unit, detects a specific object, it is predicted via a temporal and dynamic model based on vehicle data at what time and location the same object would be detected by a second sensor, that is, the second camera unit. If that second camera unit does not recognize the predicted object, a cleaning process of the camera unit is initiated.
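A minimal sketch of the temporal prediction step for such an auxiliary-camera constellation is given below; the constant-speed model and the parameter names are assumptions made only for this example.

```python
def predicted_reappearance_delay_s(
    camera_offset_m: float,       # longitudinal distance between the two camera units
    vehicle_speed_mps: float,     # current speed from vehicle odometry
) -> float:
    """Predict after how many seconds an object seen by the first camera unit should
    enter the field of view of the second camera unit (constant-speed assumption)."""
    if vehicle_speed_mps <= 0.0:
        return float("inf")       # vehicle stationary: no re-identification expected
    return camera_offset_m / vehicle_speed_mps

def needs_cleaning(object_seen_by_first: bool, object_reidentified_by_second: bool) -> bool:
    """A cleaning process of the second camera unit is initiated if the predicted
    object is not recognized by it."""
    return object_seen_by_first and not object_reidentified_by_second
```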
In general, and different from a reversing camera, which is optionally equipped with a cleaning system, too, the frontal camera unit for the ADAS is operated for most of the time during vehicle operation. Thus, the cleaning efficiency in terms of resource consumption is even more critical than for a reversing camera, which is operated only sporadically. To realize the most economical resource consumption, a cleaning-process cascade including an air-based cleaning process and a liquid-based cleaning process is advantageously established, leveraging the multipath (air and water) cleaning technology. Different cleaning processes can thus be activated sequentially with increasing resource consumption, and intermediate checks of the cleaning success are included to determine either that the camera unit is clean or that further cleaning is required. In particular, the first cleaning process is an air-based cleaning process involving only provision of air. Different airflows can be used, so that if the contamination remains on the sensor, more airflow is provided. A second cleaning process is the liquid-based cleaning process, the liquid preferably being, but not limited to, water or an aqueous solution, wherein also different flow rates of the liquid can be used. Additionally, both cleaning processes can be combined, if required for eliminating the contaminant. Preferably, this procedure is supported by a blockage detection that determines different states of occlusion by contaminants (for example water droplets, dust, mud, oil, and insects).
If an obstruction or contamination is detected and the type of obstruction is known (or assumed), the obstruction is classified or predicted based on sensor data and/or environmental data, such as weather data. In a particular embodiment, the corresponding cleaning process is activated by the cleaning control unit as follows: the cleaning process (air- or liquid-based) is selected according to the expected type, and optionally also the expected persistence, of the contamination. For example, water, snow, loose dust, et cetera, can be regarded as level-1 contaminants for which the cleaning process begins with the activation of the air-based cleaning process. Ice, dried or wet dust, mud, et cetera, can be regarded as level-2 contaminants for which a liquid-based cleaning process is activated. This also happens if the air-based cleaning process has failed. Further, in this particular example, oil and grease are considered as level-3 contaminants for which a combination of air- and liquid-based cleaning processes is activated. This is also the case when the liquid-based cleaning process has failed. If after the activation of a combination of the air-based and the liquid-based cleaning, for example after a predetermined time span after the activation of the combination, the contaminant or obstruction has not been removed, the cleaning process is deemed unsuccessful and a corresponding signal is provided, for example to the driver or to the electronic control unit.
If an obstruction or contamination is detected, but the type of obstruction or contamination is unknown, the cleaning process is selected and escalated based on the success of lower-level cleaning, with processes activated in order of increasing resource consumption. For example, first a level-1 cleaning process consisting of an air-based cleaning process alone is activated, preferably with increasing airflow rate until a maximum flow rate is reached. If unsuccessful, a so-called level-2 cleaning process consisting of a liquid-based cleaning process alone is activated, preferably with increasing liquid flow rate until a maximum flow rate is reached. If unsuccessful, a level-3 cleaning process consisting of a combination of an air-based and a liquid-based cleaning process is activated, preferably with increasing air and/or liquid flow rate until a maximum flow rate is reached. If after the activation of a combination of the air-based and the liquid-based cleaning, for example after a predetermined time span after the activation of the combination, the contaminant or obstruction has not been removed, the cleaning process is deemed unsuccessful and a corresponding signal is provided, for example to the driver or to the electronic control unit.
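The escalation for an unknown contamination type could, for instance, be organized as sketched below; the number of flow-rate steps, the relative flow values and the process time per step are illustrative assumptions.

```python
import time
from typing import Callable

# Illustrative escalation levels with increasing resource consumption; each level is
# only entered if the preceding level did not remove the contamination.
CLEANING_LEVELS = (
    ("air",            (0.3, 0.6, 1.0)),  # level 1: air only, relative air flow increased stepwise
    ("liquid",         (0.5, 1.0)),       # level 2: liquid only, relative liquid flow increased
    ("air_and_liquid", (1.0,)),           # level 3: combined process as last escalation step
)

def escalate_cleaning(
    run_process: Callable[[str, float], None],  # drives the camera cleaning unit
    still_contaminated: Callable[[], bool],     # intermediate check of the cleaning success
    process_time_s: float = 2.0,                # assumed predetermined process time per step
) -> bool:
    """Return True if the camera unit was cleaned, False if cleaning is deemed unsuccessful."""
    for process, flow_rates in CLEANING_LEVELS:
        for flow in flow_rates:
            run_process(process, flow)
            time.sleep(process_time_s)
            if not still_contaminated():
                return True
    return False    # all levels exhausted: signal the driver or the electronic control unit
```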
In an embodiment, which may include any or any combination of the technical features described above, the frontal section appliance includes one or more connection elements for attaching the camera unit to the chassis, grid or bumper of the vehicle. In a further embodiment, the camera cleaning unit is integrated into the frontal section appliance.
In another embodiment, the camera unit and the camera cleaning unit are integrated into a common housing element. Alternatively, in another embodiment, the camera unit and the camera cleaning unit include each a respective housing element and are arranged on the vehicle such that the cleaning unit is able to clean the camera unit.
In yet another embodiment the cleaning control unit is integrated with the camera unit in a common housing element. Alternatively, in another embodiment, the cleaning control unit is part of an electronic control unit of the vehicle that is connected via at least one communication channel to the camera unit and to the camera cleaning unit.
According to a second aspect of the present disclosure, a vehicle that includes a frontal section camera arrangement according to the first aspect of the disclosure is described. In the vehicle, the camera unit of the camera arrangement is arranged on a frontal section of an exterior of the vehicle, in particular a chassis, a grid or a bumper of the vehicle. The vehicle thus shares the advantages of the frontal section camera arrangement of the first aspect of the disclosure or of any of its embodiments.
In the following, advantageous embodiments of the vehicle of the second aspect of the disclosure will be described.
In a particular embodiment, the compressed air provision unit is pneumatically connected to an air compressor of the vehicle for receiving the compressed air. Since compressed air is typically produced using ambient air taken from the environment, there is virtually no shortage of compressed air. Thus, the air-based cleaning process is preferred as a starting process in cases where the nature of the contaminant is unknown, or in cases where the nature of the contaminant or obstruction is known or identified and corresponds to a type of contamination which in principle can be cleaned with air according to a predetermined association rule between types of contaminations, contaminants or obstructions (these three terms are regarded as synonyms in the present description) and the cleaning processes (air-based, liquid-based and, optionally, wipers).
Additionally, or alternatively, the liquid provision unit is connected to or formed by a liquid tank including the liquid, in particular water, an aqueous solution or a liquid containing a cleansing component such as an alcohol or a detergent. In a particular embodiment, the liquid provision unit is, or is connected to, the tank storing the liquid used for cleaning the windshield of the vehicle. Since this tank has a finite volume, it is desirable to limit the use of the liquid therein to those cases where it is truly necessary, that is, to cases where the air-based cleaning process is not sufficient to eliminate the contamination or obstruction on the camera unit.
The object is achieved according to a third aspect of the present disclosure, with a method for operating a frontal section camera arrangement of a vehicle. The method includes:
Thus, the method of the third aspect shares the advantages of the frontal section camera arrangement of the first aspect of the disclosure or of any of its embodiments.
In the following, embodiments of the method of the third aspect of the disclosure will be described.
In particular, an embodiment of the inventive method further includes:
In yet another embodiment, the method alternatively or additionally includes:
In yet another embodiment, the step of detecting and/or identifying a type of contamination using the sensing data, further includes:
Optionally, upon determining that the selected object has not been detected at the predetermined location and/or point in time by the camera unit and/or one or more of the sensing units, the method includes deciding that a contamination on the camera unit has been detected.
A fourth aspect of the disclosure is formed by a computer program including instructions which, when the program is executed by a cleaning control unit of a frontal section camera arrangement, cause the cleaning control unit to carry out the steps of the method of the third aspect of the disclosure.
These and other aspects of the disclosure will be apparent from and elucidated with reference to the embodiments described hereinafter.
The embodiments of the disclosure are described in the following on the basis of the drawings in comparison with the state of the art, which is also partly illustrated. The latter is not necessarily intended to represent the embodiments to scale. Drawings are, where useful for explanation, shown in schematized and/or slightly distorted form. With regard to additions to the teachings immediately recognizable from the drawings, reference is made to the relevant state of the art. It should be borne in mind that numerous modifications and changes can be made to the form and detail of an embodiment without deviating from the general idea of the disclosure.
The invention will now be described with reference to the drawings wherein:
The camera arrangement 100 includes a camera unit 102 arranged on the frontal section appliance 103 for the frontal section 1008 of the exterior of the vehicle 1000, in particular a chassis 1004 or bumper 1005 or grill of the vehicle 1000. The camera arrangement also includes a camera cleaning unit 104 that is advantageously adapted to clean the camera unit 102. The camera cleaning unit 104 includes a compressed air provision unit 106 that is adapted for providing compressed air A for use in an air-based cleaning process AP for the camera unit 102, and also a liquid provision unit 108 for providing a liquid L for use in a liquid-based cleaning process LP for the camera unit 102. The camera cleaning unit 104 is configured to receive operation instructions OI for driving the camera cleaning unit 104 in the air-based cleaning process and/or the liquid-based cleaning process.
The camera arrangement 100 further includes a cleaning control unit 114 that is connected to the camera unit 102 for receiving the image data ID and connected to the camera cleaning unit 104 for providing the operation instructions OI. The cleaning control unit 114, in general, is advantageously configured to determine, using the image data, whether a contaminant, contamination or obstruction 116 is on the camera unit 102 and is obstructing the view of the camera unit 102 in a way that interferes with the expected functionality of the camera unit 102 in the ADAS 400. The cleaning control unit 114 is thus advantageously configured to detect a contamination 116 on the camera unit 102 and, upon detecting the contamination 116 on the camera unit 102, to activate the air-based cleaning process AP by providing an operation instruction OI indicative thereof. After a predetermined process time, that is, the time during which the respective process, in this case the air-based cleaning process, has been in operation providing air A to clean the camera unit 102, the cleaning control unit checks whether the contamination or obstruction 116 is still on the camera unit 102. Upon determining that the contamination 116 is still on the camera unit 102, the cleaning control unit is advantageously configured, for example, to increase the flow of air, for instance if the determination indicates that there has been some removal of the contaminant, that is, that the air-based cleaning process has been at least partially effective, or to additionally or alternatively activate the liquid-based cleaning process LP by providing an operation instruction OI indicative thereof, in particular when the determination indicates that the air-based cleaning process has not been effective.
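The decision between increasing the air flow and switching to the liquid-based process LP, taken after the predetermined process time, could be sketched as follows; the normalized blockage scores and the threshold logic are assumptions of this example and not a feature of the disclosed image data evaluation.

```python
def next_operation_instruction(
    blockage_before: float,        # assumed normalized blockage score before air cleaning
    blockage_after: float,         # score after the predetermined process time
    current_air_flow: float,
    max_air_flow: float = 1.0,
) -> str:
    """Illustrative decision logic of the cleaning control unit after the air-based process."""
    if blockage_after <= 0.0:
        return "done"                          # contamination removed
    partially_effective = blockage_after < blockage_before
    if partially_effective and current_air_flow < max_air_flow:
        return "increase_air_flow"             # air is working, but not strongly enough yet
    return "activate_liquid_process"           # air ineffective or already at maximum flow
```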
Preferably, the cleaning control unit 114 is further configured to identify a type of contamination 116 from a predetermined list of identifiable types of contaminations 118. Such a list 118 is for example stored in the cleaning control unit 114, which is then advantageously configured to directly select a corresponding one of the air-based cleaning process AP and the liquid-based cleaning process LP in dependence on the identified type of contamination 116 and a predetermined association rule 119 between identifiable types of contaminations in the list 118 and a corresponding cleaning process, for example, AP or LP or a combination of both.
In this particular vehicle 1000, the cleaning control unit 114 is further connected to a sensing unit 120 of the vehicle 1000 for receiving corresponding sensing data (SD). The cleaning control unit 114 is further configured to detect and/or identify the type of contamination 116 using the sensing data SD.
For example, if an obstruction or contamination 116 is detected and the type of contamination corresponds to an item in the list 118, the corresponding cleaning process is activated by the cleaning control unit as follows: the cleaning process (air- or liquid-based) is selected according to the identified type of obstruction, and optionally also the expected persistence of the contamination. For example, water, snow, loose dust, et cetera, can be regarded as level-1 contaminants for which the cleaning process begins with the activation of the air-based cleaning process. On the other hand, ice, dried or wet dust, mud, et cetera, can be regarded as level-2 contaminants for which a liquid-based cleaning process is activated. This also happens if the air-based cleaning process has failed. Further, in this particular example, oil and grease are considered as level-3 contaminants for which a combination of air- and liquid-based cleaning processes is activated. This is also the case when the liquid-based cleaning process has failed. If after the activation of a combination of the air-based and the liquid-based cleaning, for example after a predetermined time span after the activation of the combination, the contaminant or obstruction has not been removed, the cleaning process is deemed unsuccessful and a corresponding signal is provided, for example to the driver or to an electronic control unit of the vehicle 1000.
If, on the other hand, an obstruction or contamination is detected, but the type of obstruction or contamination is unknown, that is, it does not correspond to an item in the list 118, the cleaning process is selected and escalated based on the success of lower-level cleaning, with processes activated in order of increasing resource consumption. For example, first a level-1 cleaning process consisting of an air-based cleaning process alone is activated, preferably with increasing airflow rate until a maximum flow rate is reached. If unsuccessful, a so-called level-2 cleaning process consisting of a liquid-based cleaning process alone is activated, preferably with increasing liquid flow rate until a maximum flow rate is reached. If unsuccessful, a level-3 cleaning process consisting of a combination of an air-based and a liquid-based cleaning process is activated, preferably with increasing air and/or liquid flow rate until a maximum flow rate is reached. If after the activation of a combination of the air-based and the liquid-based cleaning, for example after a predetermined time span after the activation of the combination, the contaminant or obstruction has not been removed, the cleaning process is deemed unsuccessful and a corresponding signal is provided, for example to the driver or to an electronic control unit 1010 of the vehicle 1000.
In the vehicle 1000, the camera cleaning unit 104 is connected to an air compressor 1012 that is configured to compress air, in particular ambient air, and to provide compressed air A to the camera cleaning unit 104. The compressor 1012 may be part of a pneumatic system of the vehicle 1000, which is further configured to provide compressed air to other pneumatic units, such as a braking unit or a suspension unit. Alternatively, the compressor unit 1012 is a dedicated unit for providing compressed air A to the camera cleaning unit 104.
The sensing units 120, 122, 124, 126, 128, 130 and 132 are connected to the cleaning control unit 114, in particular via a CAN bus, and are configured to provide corresponding sensing data SD that is to be used for detecting and/or identifying the contamination 116 on the camera unit 102.
In particular, one of the sensing units is a wiper status sensor 122, adapted for sensing the state of a wiper unit for wiping, in particular, the windshield and/or the headlights of the vehicle 1000. Typically, the wipers are automatically activated when dedicated sensors detect rain. Thus, the state of the wipers is an indication of an expected presence of water on the camera unit. For example, depending on the frequency of the wiper, the air-based cleaning process is activated for drying, wherein the cleaning interval is adjusted according to that frequency. In a predictive step, it is estimated to which degree the camera unit is obstructed with raindrops so that the cleaning process can be selected accordingly.
Another sensing unit is a radar sensor 120, preferably in a frontward oriented mounting position on the frontal section 1008 of the chassis 1004, close to or proximate the camera unit 102. The camera unit 102 can be operated together with a front looking radar as a radar sensor 120. When the data (ID, SD, see
An additional sensing unit is a LIDAR sensor 124, preferably in a frontward oriented mounting position close to or proximate the camera unit 102, in the frontal section 1008 of the chassis 1004 of the vehicle 1000. The way of operation is similar to that described for the radar sensor 120. The same applies to an ultrasound sensor 126 and/or to an infrared sensor 128.
The vehicle also includes auxiliary camera units 130, 132 different than the camera unit 102, and mounted at the sides of the vehicle. The auxiliary sensors include, in this particular vehicle 1000 of
Another approach to derive optimal distribution of the cleaning resources is the so called Predictive Cleaning Cross Validation (PCCV) strategy, where-different from a standalone camera-information from sensors other than the one to be cleaned, that is, the camera unit 102, are used to determine the actual demand for cleaning, which is supported by a predictive step in order to estimate when a certain event can be expected relative to the sensor.
The principle of the Cleaning Cross Validation strategy is to evaluate sensing data that is provided by sensors or sensing units other than the camera unit to be cleaned. The general system architecture includes the camera unit 102 to be cleaned, an ECU 1010 for the activation of the cleaning event, to which at least one other sensor is connected besides the camera unit 102, and a camera cleaning unit 104. The logic for the cleaning activation is either directly transferrable from the auxiliary sensor to the camera-cleaning activation or needs to be transformed by considering a temporal synchronization and prediction, as will be explained in the following with respect to
An object 121 is detected by the radar sensor 120. If the object 121 is also detected by the camera unit 102, no cleaning process is activated. If however, the object 121 is not detected by the camera unit 102, a cleaning process is activated. Optionally, the predictive step takes account of the different detection ranges of the radar sensor 120 and the camera unit 102, so that the occurrence of an object 121 that is detected by the radar sensor 120 in the far range, can be predicted for the camera unit 102 in the near field, for example with a time delay depending on the velocity of the vehicle.
If the radar sensor 120 does not detect any object 121 for a certain time, the situation can be classified as non-critical and, in a particular embodiment, a cleaning process of the camera unit 102 is initiated.
In addition, the radar sensor 120 can identify if an object 121 is located in front of the camera unit 102, so that it can be expected that the occlusion is only temporary, and no cleaning is required.
The method 500 may include optional steps, indicated by the boxes with discontinuous lines in
Also optionally, the method 500 may include, in a step 503, receiving sensing data SD from one or more sensing units 120, 122, 124, 126, 128, 130, 131, 132; and then detecting and/or identifying, in the step 505, a type of contamination 118 using the sensing data SD, wherein, in particular, the sensing units include one or more of
Optionally, upon determining that the selected object has not been detected at the predetermined location and/or point in time by the camera unit and/or one or more of the sensing units, the method includes, in a step 518, deciding that a contamination on the camera unit has been detected.
If the type of contamination is unknown, for example, it does not correspond to any type of identifiable contaminant, a cascaded cleaning with incremented profile is started in step 610. First, the air-based cleaning process is activated by providing an operation instruction indicative thereof. Upon determining, in step 612, that the contamination is still on the camera unit, after having performed the air-based cleaning process during a predetermined process time T, step 610 is configured to additionally or alternatively activate the liquid-based cleaning process by providing an operation instruction indicative thereof.
If, on the other hand, the contamination is identified in step 608 as belonging to a list of identifiable types of contaminants, an adjusted cleaning process with a profile adapted to the identified contaminant is started in step 614. For example, a cleaning process (air- or liquid-based) is selected according to the expected type, and optionally also the expected persistence of the contamination. For example, water, snow, loose dust, et cetera, can be regarded as level-1 contaminants for which the cleaning process begins with the activation of the air-based cleaning process. Ice, dried or wet dust, mud, et cetera, can be regarded as level-2 contaminants for which a liquid-based cleaning process is activated. This also happens if the air-based cleaning process has failed, as determined during the verification step 616. Further, in this particular example, oil and grease are considered as level-3 contaminants for which a combination of air- and liquid-based cleaning process is activated. This is also the case when the liquid-based cleaning process has failed. If after the activation of a combination of the air-based and the liquid-based cleaning, for example after a predetermined time span after the activation of the combination, the contaminant or obstruction has not been removed, the cleaning process is deemed as unsuccessful and a corresponding signal is provided, for example to the driver or to the electronic control unit.
In summary, the disclosure is directed to a frontal section camera arrangement including a camera arrangement and a frontal section appliance for being arranged on a frontal section of the vehicle. The camera arrangement includes a camera unit configured to provide image data and a camera cleaning unit adapted to clean the camera unit, and including a compressed air provision unit for an air-based cleaning process, and a liquid provision unit for a liquid-based cleaning process. A cleaning control unit is configured to detect a contamination on the camera unit and activate the air-based cleaning process, and, upon determining that the contamination has not been removed, to additionally or alternatively activate the liquid-based cleaning process thus enabling an improved use of the cleaning resources.
It is understood that the foregoing description is that of the preferred embodiments of the invention and that various changes and modifications may be made thereto without departing from the spirit and scope of the invention as defined in the appended claims.
This application is a continuation application of international patent application PCT/EP2022/054664, filed Feb. 24, 2022, designating the United States, and the entire content of the application is incorporated herein by reference.