The present disclosure relates to a vehicle optical sensing system for detecting a vehicle fuel door status, and more particularly to a system that determines an open or closed state of the fuel door.
Pressure sensors in a vehicle fuel tank may be used to determine that the fuel tank pressure is lower than a normal pressure. This low pressure scenario sometimes occurs because the vehicle fuel cap is not secured to a vehicle fuel fill passage.
At least some implementations of a system to detect a vehicle fuel door alert condition are described. The system includes an optical sensor and a controller. The optical sensor may have a field of view (FOV) that includes a vehicle fuel door region, and the optical sensor may provide an output that includes image data associated with the fuel door region. The controller is in communication with the optical sensor to receive the output and may include at least one processor adapted to analyze the optical sensor output and provide a control output at least when the optical sensor output indicates that the vehicle fuel door is at least partially open.
In at least some implementations, a system to detect a vehicle fuel door alert condition includes an optical sensor and a controller. The optical sensor is adapted to monitor an area of a vehicle that includes a vehicle fuel door region. The controller is couplable to the optical sensor, and includes memory and at least one processor. The memory is a non-transitory computer readable medium having instructions stored thereon for determining the vehicle fuel door alert condition. The instructions include receiving an image from the optical sensor that includes a region of interest that includes the fuel door region, analyzing the image using image processing techniques to determine at least one criteria associated with the alert condition, and when at least one criteria is determined, then providing an alert signal.
Further, at least some implementations of a method of detecting a vehicle fuel door alert condition using a controller in a vehicle are described. The method includes: receiving at the controller at least one image from an optical sensor, wherein the at least one image comprises a region of interest associated with a vehicle fuel door region; using at least one image, determining at the controller whether the vehicle fuel door alert condition exists, wherein the alert condition is associated with a fuel door in the vehicle fuel door region being at least partially open; and when the alert condition is determined to exist, then providing an alert signal from the controller.
Other embodiments can be derived from combinations of the above and from the embodiments shown in the drawings and the descriptions that follow.
The following detailed description of preferred implementations and best mode will be set forth with regard to the accompanying drawings, in which:
Referring in more detail to the drawings,
As shown in
The vehicle fuel door 20 may be coupled to the vehicle 12 in any suitable manner. As best shown in
As shown in
Vehicle electronics 30 also may comprise one or more VCMs 36 configured to perform various vehicle tasks. One non-limiting example of a vehicle task includes monitoring a fuel tank pressure and providing a ‘check engine’ warning via the instrument panel 32 when a fuel tank pressure is determined to be below a threshold. VCM 36 also could provide an informational message or signal to the ECU 16, which may be used in the method described below—e.g., since a low pressure indication could result from the fuel cap 22 not being properly secured to the port 26. Generally, skilled artisans will appreciate that a ‘check engine’ warning light is an ambiguous indicator; i.e., it does not indicate a source or root cause of a problem, only that a problem exists. Further, even if the user were provided a less ambiguous indication—e.g., that a low fuel tank pressure exists—this would not indicate a root cause either (e.g., which may be simply that the fuel cap 22 is not secured). This of course is merely an example of a task of the VCM 36 and an example of how VCM data may be used by the ECU 16 in some implementations; other implementations are contemplated also.
One or more VCMs 36 may be coupled to the ECU 16 via a vehicle communication bus 52. Or in other implementations, discrete electrical connections could be used or any other suitable type of communication link (e.g., optical links, short range wireless links, etc.).
Referring again to
Processor(s) 84 can be any type of device capable of processing electronic instructions including microprocessors, microcontrollers, electronic control circuits comprising integrated or discrete components, application specific integrated circuits (ASICs), and the like. The processor(s) 84 can be a dedicated processor(s)—used only for ECU 16—or it can be shared with other vehicle systems (e.g., VCMs 36). Processor(s) 84 execute various types of digitally-stored instructions, such as software or firmware programs which may be stored in memory 82, which enable the ECU 16 to provide a variety of vehicle services. For instance, processor(s) 84 can execute programs, process data and/or instructions, and thereby carry out at least part of the method discussed herein. In at least one embodiment, processor(s) 84 may be configured in hardware, software, or both: to receive image data from camera 14; to evaluate the image data using an image processing algorithm; and to determine whether a fuel door alert condition exists. When an alert condition is detected, the processor(s) 84 also may generate an alert signal that may be used by the instrument panel 32 and/or audio system 34 to notify the vehicle user of an abnormal fuel door status, as will be explained in greater detail below.
In at least one embodiment, the processor 84 executes an image processing algorithm stored on memory 82. The algorithm may use any suitable image processing techniques, including but not limited to: pixelation, linear filtering, principal components analysis, independent component analysis, hidden Markov models, anisotropic diffusion, partial differential equations, self-organizing maps, neural networks, wavelets, etc.—as those terms are understood by skilled artisans. The algorithm may be used to identify regions of interest in an image or image data, compare real-time image data to stored image data, and use one or more image processing techniques such as edge detection to determine the fuel door status, as will be discussed in greater detail below.
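By way of a non-limiting illustration, identifying a region of interest in an image or image data may begin with isolating a rectangular sub-array of the frame for further processing. The following is a minimal sketch assuming a grayscale frame stored as a two-dimensional array; the coordinates, array sizes, and function name are hypothetical and are not part of the disclosed implementation.

```python
import numpy as np

def extract_roi(image, top, left, height, width):
    """Crop a rectangular region of interest (e.g., a fuel door
    region) from a grayscale image stored as a 2-D array."""
    return image[top:top + height, left:left + width]

# Hypothetical 100x200 frame with a brighter 20x30 patch standing
# in for the fuel door region within the camera's field of view.
frame = np.zeros((100, 200), dtype=np.uint8)
frame[40:60, 150:180] = 200
roi = extract_roi(frame, 40, 150, 20, 30)
```

The cropped sub-array may then be passed to whichever comparison or edge detection technique the algorithm employs.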
As used herein, stored image data or stored images include images which include pattern information regarding a region of interest. Thus, the stored image could include an entire frame of image data from camera 14 (e.g., which would include the ground and environment, as well as part of vehicle 12—including the fuel door region 18). However, this is not required. For example, the stored image could include a portion of the image—e.g., only pattern data of the region of interest. This could be a pixel pattern representative of a fuel door—e.g., in a closed state or in an open state. The pattern could include shape, relative size, contrasting features, etc. Thus, the term stored image data or stored image should be construed broadly.
As used herein, real-time image data and real-time images are data which is received from camera 14 in actual time, in near actual time, during the current ignition cycle, or even during the current ignition cycle and including a predetermined period of time prior to the ignition cycle. For example, in one embodiment, real-time image data or images include those images received by the camera 14 and transmitted to the ECU 16 (e.g., actual time less any processing delays at the camera 14 and/or ECU 16 and less any transmission lag time). In another embodiment, the real-time image data or images include any images received by camera 14 within a predetermined period of time from when the image data or image was first captured by camera 14. And in other embodiments, real-time image data or images could include a predetermined period of time prior to a vehicle ignition event (e.g., provided the camera 14 is powered and operative during this time period).
The optical sensing system 10 may be operable with a single camera 14; however, as will be explained below, in at least one embodiment, the system 10 comprises multiple cameras. Camera 14 may be positioned to capture image data that includes the fuel door region 18. For example, when the fuel door is on the driver's side of the vehicle, the camera 14 may be mounted at a driver side region 54 of the vehicle 12 (e.g., on or around a driver's side mirror or any other suitable feature on the driver side of the vehicle 12). Characteristics or parameters of camera 14 include a horizontal field of view (HFOV) of approximately 180° (e.g., using a fisheye lens). In at least one embodiment, the HFOV of camera 14 may be approximately 185°; however, this is not required and in other implementations, the HFOV could be larger or smaller. The vertical field of view (VFOV) may be narrower and may accommodate any suitable aspect ratio (e.g., 4:3, 16:9, etc.). It should be appreciated that the terms HFOV and VFOV are relative terms; thus, depending upon the orientation of camera 14 when mounted in the vehicle 12, the HFOV may not be horizontal with respect to the actual horizon and the VFOV may not be vertical with respect thereto. However, in at least one implementation, the HFOV of camera 14 is horizontal with respect to the actual horizon (see
In at least one implementation, camera 14 may be digital and may provide digital image data to the ECU 16; however, this is not required (e.g., analog images or video could be processed by ECU 16 instead). Camera 14 may be configured for day and/or low-light conditions; e.g., in digital camera implementations, an imaging sensor having a pixel array (not shown) of camera 14 could be adapted to process visible light, near-infrared light, or any combination thereof making camera 14 operable in day- or night-time conditions. Other optical sensor or camera implementations are also possible (e.g., sensors capable of thermal imaging, infrared imaging, image intensifying, etc.). In
Further, while the vehicle 12 in
Turning now to
The method 500 may begin with step 505 at any suitable time once the system 10 is powered (e.g., any time following vehicle ignition or power up). In at least one embodiment, the method 500 may be initiated only when the vehicle 12 is stationary (e.g., when the vehicle transmission is in PARK); however, this is not required. In at least one embodiment, the ECU 16 may receive an informational message indicating a transmission shift to PARK (e.g., from one of the VCMs 36).
In step 505, the processor 84 of ECU 16 calls up or retrieves one or more images stored in memory 82 which comprise at least the region of interest—i.e., the fuel door region 18.
Next in step 510, the processor 84 of the ECU 16 may receive one or more real-time (R/T) images from the camera 14. In at least one embodiment, the portion of the body panel 38 captured in the real-time image(s) may be identical or nearly so to the portion of the body panel 38 captured in the stored images (of step 505)—e.g., since the position and orientation of camera 14 may be fixed. As will be described below, since the stored and real-time image(s) will be compared to one another, this may simplify some image processing aspects—e.g., since the relative position of the region of interest (the fuel door region 18) may be the same in both the stored and the real-time images. Step 510 further may include at least temporarily storing these real-time image(s) in memory 82 (e.g., during image processing by processor 84).
In step 515 which follows, processor 84 of the ECU 16 may analyze and/or compare the one or more stored images (step 505) to the one or more real-time images (obtained in step 510). In at least one embodiment, the analysis and/or comparison is of a specific region of interest (A)—the fuel door 20 (see also
As used herein, a match may include fuel door features (captured in the stored image) being identical to corresponding fuel door features (captured in the real-time image). Thus, a match could include a pixel-for-pixel comparison between the stored and real-time images. In some embodiments, a match may be determined without each pixel from the stored image being identical to the corresponding pixel of the real-time image. For example, a match may be determined when a sufficient threshold quantity of pixels can be correlated between the two images. Further, such correlations may take into account a number of image processing factors such as image luminance differences, environmental noise or distortion differences, etc. Techniques for correlating stored and real-time images that constitute a match even when not every corresponding pixel is identical will be appreciated by artisans familiar with pattern recognition and other image processing techniques.
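One non-limiting way to realize such a threshold-based match is sketched below. The per-pixel tolerance and match-fraction values, and all names, are illustrative assumptions rather than the disclosed algorithm; the tolerance absorbs small luminance and noise differences between the stored and real-time images.

```python
import numpy as np

def images_match(stored, realtime, pixel_tol=10, match_fraction=0.95):
    """Declare a match when a sufficient fraction of corresponding
    pixels agree within a tolerance, rather than requiring a strict
    pixel-for-pixel identity between the two images."""
    if stored.shape != realtime.shape:
        return False
    # Widen the dtype so the subtraction cannot wrap around.
    diff = np.abs(stored.astype(np.int16) - realtime.astype(np.int16))
    agreeing = np.count_nonzero(diff <= pixel_tol)
    return agreeing / diff.size >= match_fraction

# Hypothetical stored pattern and a noisy real-time capture of it.
stored = np.full((20, 30), 120, dtype=np.uint8)
noisy = stored.copy()
noisy[0, :5] = 180  # a few disagreeing pixels (e.g., glare)
```

Under these assumptions, `images_match(stored, noisy)` still reports a match because only a small fraction of pixels disagree.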
Steps 505, 510, and 515 have been discussed with respect to stored image(s) of the fuel door region 18, wherein the fuel door 20 shown in the stored image was in the closed state; however, other comparison techniques also could be used. For example, the stored image(s) could portray the fuel door 20 in an open state (e.g., in various states of being partially open, or fully open), and the real-time image(s) could be compared to these stored image(s). In this instance, if one of the real-time images matches one of the stored images, then processor 84 determines a match of an open state of the fuel door 20 and the method 500 proceeds to step 520. Likewise, in this instance, if no match of the open state is determined, then the method could proceed directly to step 525. This technique could be used singly or in combination with techniques of closed state detection described above.
In step 520, the processor 84 may set a first counter or first flag to a value indicating a criteria that the fuel door 20 is at least partially open (FLAG #1=‘true’). Again, in some embodiments, it may be desirable to establish multiple criteria before alerting the vehicle user that fuel door 20 is open—e.g., to minimize false alarms or false positive determinations (e.g., thereby avoiding potential user frustration due to false alarms). Thus, in at least one implementation, the processor 84 monitors the status of more than one flag (e.g., such as the flag of step 520). Other criteria are discussed below. Following step 520, the method proceeds to step 525.
It should be appreciated that steps 505, 515, and 520 may be performed in some embodiments, but not in others. For example, in at least one embodiment, the method may begin with step 510 (e.g., receiving one or more real-time images from camera 14) and then proceed to step 525. Also, in yet other embodiments, using the image processing algorithm stored in memory 82, the processor 84 may not compare stored image(s) to real-time image(s), but instead the processor 84 may determine any other suitable criteria associated with the fuel door 20 being in an open state (e.g., using image processing techniques).
In step 525, the processor 84 performs image processing of one or more real-time images. These one or more real-time images may be the same real-time image(s) used in steps 510-515, or different real-time images (e.g., subsequently obtained). In at least one embodiment, they are the same real-time image(s) used in steps 510-515 above. The image processing of step 525 may comprise any suitable technique, including but not limited to, classification techniques, feature extraction techniques, pattern recognition techniques, projection techniques, and the like. In at least one embodiment of step 525, the processor 84 uses an edge detection algorithm. For example, the edge detection algorithm may use feature extraction and/or pattern recognition techniques, among others—e.g., to identify a periphery of the fuel door 20 against a background which is not indicative of the fuel door 20 being in a closed state. For example, the fuel door 20 typically is flush to the body panel 38 of the vehicle 12 when the door is in the closed state (or fully closed). And when the fuel door 20 is at least partially open, the door is typically not flush but instead protrudes outwardly. The outwardly protruding portion may extend within the field of view of the camera 14, enabling its edges to be detected. Thus, the processor 84 may determine an edge by determining a discontinuity in brightness or luminance in the image—e.g., associated with the periphery of the fuel door in contrast to the environment of the vehicle 12 or the vehicle itself (e.g., body panel 38). Of course, this is merely one example; and other analogous implementations are also possible.
In step 530 (
In step 535, the processor 84 may set a second counter or second flag to a value indicating a criteria that the fuel door 20 is at least partially open (FLAG #2=‘true’)—in response to the edge detection/determination of steps 525 and 530. Thus, in at least one instance, the processor 84 potentially determines two criteria in steps 505-535—the first criteria based on a comparison of a stored image to a real-time image and the second criteria based on an edge detection using the same or a different real-time image.
In step 540, the processor 84 may determine the values of the first and second flags (i.e., FLAG #1 and FLAG #2). If both the first and second flags are ‘true,’ then the method may proceed to step 545 (e.g., sending an alert signal to the vehicle user, as discussed below). Or if the processor determines that only one (or neither) of the flags is ‘true,’ then the method 500 may proceed to step 550 or immediately loop back and repeat at least part of the method (e.g., beginning again with step 510), e.g., to continue to monitor for alert conditions.
In at least one embodiment, any combination of steps 505-540 could be repeated before proceeding further to determine additional criteria. For example, in one embodiment multiple real-time images may be required to match a stored image, and/or multiple processed real-time images may be required to indicate an open fuel door 20 (e.g., using edge detection). Additional flags could be set counting these instances, and the threshold quantity of flags (in step 540) may be higher before the method proceeds to step 545.
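The flag-counting decision of step 540, generalized to the higher threshold quantity of flags just described, may be sketched as follows; the function name and default threshold are illustrative assumptions, not the disclosed logic.

```python
def should_alert(flags, required=2):
    """Proceed to the alert step only when at least `required`
    criteria flags are 'true' (a step 540 analogue), which helps
    minimize false positives from any single technique."""
    return sum(bool(f) for f in flags) >= required
```

For example, with the two flags of steps 520 and 535, `should_alert([True, True])` would route the method to step 545, while `should_alert([True, False])` would route it to step 550.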
In step 545, the processor 84 sends or transmits an output in the form of an alert signal from the ECU 16 to the vehicle electronics 30 in response to the multiple criteria determined to be ‘true’ in step 540. This alert signal may be an electrical signal, an optical signal, short range wireless signal, etc. and may be sent to at least one of the vehicle control modules 36 which in turn provides a suitable alert to the vehicle user (e.g., via the instrument panel 32 and/or audio system 34). Of course, an alert could be sent directly from the ECU 16 (e.g., instead of sending an alert signal to vehicle electronics 30 which then performs the alert). Once received by the instrument panel 32 and/or audio system 34, a visual alert, audible alert, tactile alert, or combination thereof may be provided to the vehicle user. Following step 545, the method 500 may end.
When the method proceeds from step 540 to step 550, the processor 84 may check and/or reset the first and second flags. More specifically, the processor may ensure that both flags have values other than ‘true’ (e.g., ‘none’ or ‘false’). Performing step 550 may be desirable when the system architect of system 10 desires the flags both to be ‘true’ within a prescribed or predetermined period of time of one another (and thus before sending an alert signal from the ECU 16). For example, in at least one embodiment, it may not be desirable to send the alert signal when the edge detection algorithm determines a criteria indicating that the fuel door 20 is open hours or days after an earlier criteria indicated that a real-time image matched a stored image. While not illustrated in
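The prescribed-time-window condition described above may be sketched as follows, using timestamps recorded when each flag became ‘true’; the window length, names, and time representation are illustrative assumptions.

```python
def flags_within_window(t_flag1, t_flag2, window_s=60.0):
    """Require both criteria to have become 'true' within a
    prescribed period of one another before an alert is sent;
    a stale flag (or an unset one, given as None) fails the test
    and would be reset per step 550."""
    if t_flag1 is None or t_flag2 is None:
        return False
    return abs(t_flag1 - t_flag2) <= window_s
```

Under these assumptions, a flag set hours after its counterpart would not trigger the alert signal, matching the rationale given above.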
Regardless, following step 550, the method proceeds to step 510 as well. And method 500 may receive one or more subsequent or newer real-time images from camera 14 and continue through at least some of steps 515-550 again. In at least one embodiment, method 500 is periodic. For example, the loop described above in method 500 is not continuously executed. For example, the method 500 may be performed each time the transmission of vehicle 12 moves from PARK to another gear, once per vehicle ignition cycle, etc., just to name a few non-limiting examples. Limiting the repetition of the method 500 may improve an overall performance of the system 10. Recall for example that in at least one embodiment, the system 10 may be used for advanced driver assistance (e.g., lane detection, blind-spot detection, etc.). Thus, it may be desirable to limit the computational demands on processor 84 by only occasionally running or operating method 500—e.g., especially since in at least one embodiment, the primary purpose of the system 10 is not to determine whether the fuel door 20 is ajar, whether the fuel cap 22 is dangling, or whether a nozzle 24 remains in the vehicle 12.
Other implementations are also possible which may be used singly or in combination with method 500. For example, two exemplary criteria were described above which were associated with a region of interest A that includes the fuel door 20 (as shown in
With respect to these other regions of interest (B and C), any portion of the steps of method 500 could be used to determine whether the fuel cap 22 is hanging or dangling or the nozzle 24 is in the vehicle 12. For example, memory 82 may comprise stored image(s) of the fuel cap 22 not dangling below the fuel door 20 and/or stored image(s) of the fuel cap 22 dangling below the fuel door 20—and these images may be compared to real-time image(s) obtained from camera 14. A similar technique may be employed to determine whether the nozzle 24 is present in the vehicle port 26.
In another example, image processing techniques (including using an edge detection algorithm) could be employed to analyze real-time image(s) and detect whether the fuel cap 22 is or is not dangling below the fuel door 20. Again, a similar technique could be employed to determine whether the nozzle 24 is or is not present in the vehicle port 26. Regardless of how the fuel cap 22 or nozzle 24 may be detected or determined, the processor 84 may presume that if the cap 22 is dangling or the nozzle is in the port 26, then the fuel door 20 is open; thus, in at least one embodiment, the processor 84 may send the alert signal, even if step 540 of method 500 did not determine both the first and second flags were ‘true.’
Thus, the processor 84 may determine other criteria—e.g. associated with these other regions of interest B, C (e.g., a third flag, a fourth flag, etc.). Further, the processor 84 may determine whether to issue the alert signal from the ECU 16 based upon more than two flags being ‘true’ (or different combinations of the flags being ‘true’).
In at least one embodiment, processor 84 determines that the fuel nozzle 24 is located in region of interest C and that the vehicle 12 has been shifted from PARK (e.g., to DRIVE, REVERSE, NEUTRAL, etc.). In this instance, the processor 84 promptly transmits an alert signal (e.g., to the vehicle electronics 30) to warn the driver that the vehicle 12 is pulling away from a filling station with the nozzle 24 engaged with the fuel port 26. While more than one criteria may be used, in at least one implementation, a single criteria is needed to trigger this alert signal—namely, identification of the nozzle 24 in region of interest C.
In some embodiments, a different alert signal may be used—e.g., alert signals which are not used by the vehicle 12 to cause visual, audible, or tactile alerts, but instead an alert signal which operates an emergency inhibit function. For example, when the vehicle 12 changes gears (e.g., from PARK to any other gear) and the nozzle 24 has been determined to be within the port 26, the alert signal may be sent to a VCM 36 which controls the vehicle drive train. In one embodiment, a VCM 36 may cause the vehicle 12 to brake automatically—e.g. inhibiting the vehicle 12 from moving away from the filling station with the nozzle 24 within the port 26.
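The single-criterion drive-away response described above may be sketched as follows; the gear labels, return values, and function name are illustrative assumptions rather than the disclosed control interface.

```python
def drive_away_response(nozzle_in_port, gear):
    """If the nozzle is detected in the fill port and the vehicle
    is shifted out of PARK, a single criterion suffices: return
    the action the alert signal should trigger."""
    if nozzle_in_port and gear != "PARK":
        # e.g., send the alert signal to a VCM that commands an
        # automatic brake, inhibiting the vehicle from pulling away.
        return "inhibit"
    return "none"
```

Under these assumptions, shifting to DRIVE with the nozzle still in the port yields the inhibit action, while the same shift with no nozzle detected yields no action.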
Other implementations include requiring all criteria to be determined as ‘true’ simultaneously or otherwise be in a ‘true’ state at the same time. The order of determining criteria could be changed; e.g., in method 500, steps 510-520 could occur after steps 525-535, or the like.
In another implementation, ECU 16 may require other additional criteria that the fuel door 20 is open prior to sending the alert signal. For example, one of the VCMs 36 in the vehicle 12 may determine a low fuel tank pressure condition and provide an informational message to the ECU 16 regarding the low pressure condition. And this criteria (from VCM 36), combined with criteria determined using image data from the camera 14, may trigger the ECU 16 to send the alert signal. For example, the processor 84 may send an alert signal based on this VCM 36 criteria and criteria associated with region of interest B (e.g., that the fuel cap 22 is dangling).
Thus, there has been described an optical sensing system which can be used to determine a fuel door status—e.g., whether a fuel door alert condition exists. For example, an alert condition may include determining that a vehicle fuel door is open, determining that a vehicle fuel cap is dangling, and/or determining that a fuel station filling nozzle remains in a vehicle (e.g., just prior to the vehicle driving away from the filling station). The system may include an electronic control unit (ECU) and one or more sensors. To determine the fuel door status, the ECU may employ image processing techniques (e.g., comparing stored images to real-time images from the sensors, using real-time edge detection techniques, etc.). When the ECU determines that the fuel door is at least partially open, the ECU may provide an alert signal which may be used to notify a user of the vehicle of the condition.
It should be understood that all references to direction and position, unless otherwise indicated, refer to the orientation of the optical sensing system illustrated in the drawings. In general, up or upward generally refers to an upward direction within the plane of the paper and down or downward generally refers to a downward direction within the plane of the paper.
While the forms of the invention herein disclosed constitute presently preferred embodiments, many others are possible. It is not intended herein to mention all the possible equivalent forms or ramifications of the invention. It is understood that the terms used herein are merely descriptive, rather than limiting, and that various changes may be made without departing from the spirit or scope of the invention.