The present disclosure relates generally to a non-contact obstacle and gesture detection system for a motor vehicle and methods of operating the non-contact obstacle and gesture detection system.
This section provides background information related to the present disclosure which is not necessarily prior art.
Motor vehicles are increasingly being equipped with sensors that detect the environment and terrain surrounding the motor vehicle. For example, some vehicles include sensor systems that provide images of the terrain and/or other objects in the vicinity of the vehicle. Sensing systems have also been used to detect the presence and position of objects near the motor vehicle while the vehicle is moving. The signals and data generated by these sensor systems can be used by other systems of the motor vehicle to provide safety features such as vehicle control, collision avoidance, and parking assistance. Such sensing systems are generally used to assist the driver while he or she is driving the motor vehicle and/or to intervene in controlling the vehicle.
Additionally, closure members for vehicles (e.g. doors, lift gates, etc.) can be provided with powered actuation mechanisms capable of opening and/or closing the closure members. Typically, powered actuation systems include a power-operated device such as, for example, an electric motor and a rotary-to-linear conversion device that are operable for converting the rotary output of the electric motor into translational movement of an extensible member. Such power actuated operation can lead to issues with the closure members unintentionally striking surrounding objects or obstacles. For example, an object near the closure member may obstruct the opening or closing of the closure member and/or the closure member may be damaged if opened under power and strikes the obstacle. However, known sensing systems or obstacle detection systems may not properly address potential situations involving obstacles. Furthermore, including powered actuation systems each with respective obstacle detection sensors on more than one closure member of a vehicle can lead to increased complexity and cost.
Thus, there is a need for improved obstacle and gesture detection systems that control movement of the closure member in response to detecting an object or gesture.
This section provides a general summary of the disclosure, and is not a comprehensive disclosure of its full scope or all of its features.
It is an object of the present disclosure to provide an object detection system for a motor vehicle with at least one powered closure member and at least one non-powered closure member. The system includes at least one non-contact sensor for detecting an object adjacent the motor vehicle within a first field of view. The system also includes at least one power actuator coupled to the at least one powered closure member for moving the at least one powered closure member. An electronic control unit is coupled to the at least one non-contact sensor and the at least one power actuator and is configured to detect the object adjacent the motor vehicle using the at least one non-contact sensor and control movement of the at least one powered closure member accordingly. The electronic control unit is also configured to detect the at least one non-powered closure member is not in a closed position. The electronic control unit adjusts the first field of view of the at least one non-contact sensor to a second field of view and operates the at least one non-contact sensor using the second field of view in response to detecting the at least one non-powered closure member is not in the closed position.
It is another object of the present disclosure to provide a method of detecting an object adjacent a motor vehicle with at least one powered closure member and at least one non-powered closure member using an object detection system. The method includes the step of detecting the object adjacent the motor vehicle within a first field of view using at least one non-contact sensor and controlling movement of the at least one powered closure member using at least one power actuator accordingly. The method proceeds by detecting the at least one non-powered closure member is not in a closed position. The method also includes the step of adjusting the first field of view of the at least one non-contact sensor to a second field of view and operating the at least one non-contact sensor using the second field of view in response to detecting the at least one non-powered closure member is not in the closed position.
It is another object of the present disclosure to provide an object detection system for a motor vehicle with a first closure member and a second closure member. The system includes at least one non-contact sensor for detecting an object within a field of view adjacent the first closure member and the second closure member. The system also includes a sensor for determining a position of the second closure member. The system further includes an electronic control unit coupled to the at least one non-contact sensor and the sensor and configured to modify the field of view in response to detecting, using the sensor, a change in position of the second closure member.
It is another object of the present disclosure to provide an object detection system for a motor vehicle with a first closure member and a second closure member, the system having at least one non-contact sensor for detecting an object within a field of view adjacent the first closure member and the second closure member, a sensor for determining a position of the second closure member, and an electronic control unit coupled to the at least one non-contact sensor and the sensor and configured to modify the field of view in response to detecting, using the sensor, a change in position of the second closure member, such that the modified field of view does not detect the second closure member using the at least one non-contact sensor.
Further areas of applicability will become apparent from the description provided herein. The description and specific examples in this summary are intended for purposes of illustration only and are not intended to limit the scope of the present disclosure.
The drawings described herein are for illustrative purposes only of selected embodiments and not all implementations, and are not intended to limit the present disclosure to only that actually shown. With this in mind, various features and advantages of example embodiments of the present disclosure will become apparent from the following written description when considered in combination with the appended drawings, in which:
Corresponding reference numerals indicate corresponding parts throughout the several views of the drawings.
In the following description, details are set forth to provide an understanding of the present disclosure. In some instances, certain circuits, structures and techniques have not been described or shown in detail in order not to obscure the disclosure.
In general, the present disclosure relates to an obstacle and gesture detection system of the type well-suited for use in many applications. More specifically, a non-contact obstacle and gesture detection system for a motor vehicle and methods of operating the non-contact obstacle and gesture detection system are disclosed herein. The non-contact obstacle and gesture detection system of this disclosure will be described in conjunction with one or more example embodiments. However, the specific example embodiments disclosed are merely provided to describe the inventive concepts, features, advantages and objectives with sufficient clarity to permit those skilled in this art to understand and practice the disclosure.
Referring to the Figures, wherein like numerals indicate corresponding parts throughout the several views, the non-contact obstacle and gesture detection system 20 for the motor vehicle 22 is shown. As best shown in
The front door node 24 includes a first electronic control unit 32 that has a plurality of input-output terminals adapted to connect to a power source and to a vehicle bus (e.g., CAN or controller area network). The rear door node 28 includes a second electronic control unit 34 also adapted to connect to the power source and to the vehicle bus (e.g., CAN or controller area network). The first electronic control unit 32 and the second electronic control unit 34 are in communication with one another.
A first latch assembly 36 is in communication with the first electronic control unit 32 for latching the first closure member 26 relative to the motor vehicle 22 (e.g., to a vehicle body 37 of the motor vehicle 22). The system 20 also includes a first cinching actuator 38 coupled to the first latch assembly 36 for cinching the first closure member 26 to the vehicle body 37 of the motor vehicle 22. The system 20 also includes a second latch assembly 40 in communication with the second electronic control unit 34 for latching the second closure member 30 relative to the motor vehicle 22 and a second cinching actuator 42 coupled to the second latch assembly 40 for cinching the second closure member 30 to the motor vehicle 22. For each latch assembly 36, 40, the cinching actuators 38, 42 may be included with or separate from the latch assemblies 36, 40.
At least one first handle switch (e.g., first inside and outside switches 44, 46 on the front passenger door 26) is coupled to the first electronic control unit 32 for detecting operation of a first handle of the first closure member 26. At least one second handle switch (e.g., second inside and outside switches 48, 50 on the rear passenger door 30) is also coupled to the second electronic control unit 34 for detecting operation of a second handle of the second closure member 30. Inside and outside switches 44, 46, 48, 50 on the front passenger door 26 and the rear passenger door 30 may be used to indicate that a user 21 is attempting to move the door 26, 30.
A plurality of non-contact sensors, also referred to as non-contact obstacle and gesture sensors 52, 54, 56, 58, are coupled to the first electronic control unit 32 for detecting the obstacle or gesture adjacent the motor vehicle 22. A first power actuator 60 is coupled to the first closure member 26 and to the first electronic control unit 32 for moving the first closure member 26 relative to the vehicle body 37. Similarly, a second power actuator 62 is coupled to the second closure member 30 and to the second electronic control unit 34 for moving the second closure member 30 relative to the vehicle body 37. Each of the first and second power actuators 60, 62 is configured to include an electric motor, a reduction geartrain, a slip clutch, and a drive mechanism, which together define a power assembly; however, it should be appreciated that various other power actuators 60, 62 may be used instead. Each of the first and second power actuators 60, 62 is configured to include a sensor for detecting the position (such as the absolute position) of the respective door that the power actuator 60, 62 is configured to move. For example, the sensor associated with the power actuator 60, 62 may be a Hall-effect sensor arrangement in a configuration as is illustratively shown in U.S. Pat. No. 11,713,609, the entire contents of which are incorporated herein by reference. Other sensors for detecting the position of the closure panels may include, for example, a position sensor associated with the closure panel hinge.
Referring to
So, one or more first Hall-effect sensors 68 may be provided and positioned to send signals to the first electronic control unit 32 that are indicative of rotational movement of the electric motor and indicative of the rotational speed of the electric motor, for example, based on counting signals from the at least one Hall-effect sensor 68 detecting a target on a motor output shaft. In situations where the sensed motor speed is greater than a threshold speed and where a current sensor (not shown) registers a significant change in the current draw, the first electronic control unit 32 may determine that the user 21 is manually moving door 26 while the electric motor is also operating, thus moving vehicle door 26 between its open and closed positions. The first electronic control unit 32 may then send a signal to the electric motor to stop and may even disengage the slip clutch (if provided). Conversely, when the first electronic control unit 32 is in a power open or power close mode and the first Hall-effect sensors 68 indicate that a speed of the electric motor is less than a threshold speed (e.g., zero) and a current spike is registered, the first electronic control unit 32 may determine that an obstacle is in the way of vehicle door 26. If such an obstacle is detected, the system 20 may take any suitable action, such as sending a signal to turn off the electric motor. As such, the first electronic control unit 32 receives feedback from the Hall-effect sensors 68 to ensure that an obstacle contact has not occurred during movement of vehicle door 26 from the closed position to the open position, or vice versa. Similarly, although not shown in
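The speed-and-current logic described above can be sketched as follows. This is a minimal illustration only; the function name, threshold values, and signal representations are hypothetical and not the actual controller implementation:

```python
def classify_motor_event(motor_speed_rpm, current_spike, powered_mode,
                         speed_threshold_rpm=50.0, stall_threshold_rpm=0.0):
    """Classify motor feedback into possible door events.

    motor_speed_rpm: speed derived from Hall-effect pulse counting
    current_spike:   True if the current sensor registered a significant change
    powered_mode:    True while the ECU is in a power open/close mode
    """
    if motor_speed_rpm > speed_threshold_rpm and current_spike:
        # Door moving faster than the motor alone should drive it:
        # the user is likely moving the door manually.
        return "manual_override"  # stop motor; optionally disengage slip clutch
    if powered_mode and motor_speed_rpm <= stall_threshold_rpm and current_spike:
        # Motor stalled under power with a current spike: an obstacle
        # is likely in the path of the vehicle door.
        return "obstacle_contact"  # turn off the electric motor
    return "normal"
```

In practice the thresholds would be calibrated per vehicle and the current-spike test would operate on filtered sensor data rather than a boolean flag.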
The first power actuator 60 for the first closure member 26 is shown mounted within an interior chamber 72 of door 26. Power swing door actuator 60 further includes a connector mechanism or extensible member 74 of the drive mechanism to connect to the vehicle body 37. So, the first electronic control unit 32 is in communication with the first power actuator 60 (e.g., the electric motor) for providing electric control signals thereto. The first and second electronic control units 32, 34 each can include a microprocessor 76 and a memory 78 having executable computer readable instructions stored thereon.
As is also schematically shown in
The first electronic control unit 32 can, for example, receive a command signal from the internal/external handle switch 44, 46 to initiate an opening or closing of vehicle door 26. Upon receiving the command signal, first electronic control unit 32 can proceed to provide a signal to the electric motor (e.g., of first power actuator 60) in the form of a pulse width modulated voltage, for instance, (for speed control) to turn on the motor and initiate pivotal swinging movement of vehicle door 26. While providing the signal, first electronic control unit 32 also obtains feedback from the Hall-effect sensors 68 of the electric motor to ensure that contact with an obstacle has not occurred. If no obstacle is present, the motor can continue to generate a rotational force to actuate the spindle drive mechanism. Once vehicle door 26 is positioned at the desired location, the motor may be turned off and vehicle door 26 can be held at that location (i.e., door check).
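The power-open sequence described above (PWM drive, Hall-effect feedback, stop on obstacle, then hold) can be sketched as a simple polling loop. The `motor` and `hall_sensor` interfaces below are hypothetical placeholders for the actual hardware abstractions:

```python
import time

def power_open_door(motor, hall_sensor, target_position, duty_cycle=0.6):
    """Drive the door toward target_position using PWM speed control,
    stopping immediately on apparent obstacle contact.

    Returns True if the target was reached, False if stopped early.
    """
    motor.set_pwm(duty_cycle)              # turn the motor on
    while hall_sensor.position() < target_position:
        # Feedback check: a stalled motor plus a current spike
        # suggests the door has contacted an obstacle.
        if hall_sensor.speed() <= 0 and motor.current_spike():
            motor.set_pwm(0.0)             # stop the electric motor
            return False
        time.sleep(0.01)                   # feedback polling interval
    motor.set_pwm(0.0)                     # hold door at desired location (door check)
    return True
```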
The first electronic control unit 32 can also receive additional input from the plurality of non-contact obstacle or gesture sensors 52, 54, 56, 58 positioned on a portion of vehicle door 26, such as on a door mirror, or the like. The plurality of non-contact obstacle or gesture sensors 52, 54, 56, 58, for example, assess whether an obstacle, such as another car, tree, or post, is near or in close proximity to vehicle door 26 or whether a user 21 is making a gesture near the vehicle door 26. A gesture may be, for example, a simple motion of the object; if the object is a person, the gesture may include a motion of the hand, such as a left-to-right swipe defining a simple gesture, or a more complex series of motions, such as an intricate combination of gestures in which a hand moves up, then up again, then down, then further down, then left, then right, and then in a circle. Other motions, such as a foot motion, a head motion, or a motion of the entire body of the person, such as a gait or walk or strut of the person, may be detected. If such an obstacle or gesture is detected, the plurality of non-contact obstacle or gesture sensors 52, 54, 56, 58 will send a signal to first electronic control unit 32, and first electronic control unit 32 will proceed to control the electric motor (e.g., to stop movement of vehicle door 26, and thus prevent vehicle door 26 from hitting the obstacle).
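One way to recognize such a gesture sequence is to match an ordered list of detected motion primitives against a gesture vocabulary. The primitive names and vocabulary below are illustrative assumptions, not the disclosed implementation:

```python
# Illustrative gesture vocabulary; real sensors may report richer motion data.
GESTURES = {
    "complex_open": ["up", "up", "down", "down", "left", "right", "circle"],
    "simple_swipe": ["left_to_right"],
}

def match_gesture(detected_motions):
    """Return the name of the first gesture whose primitive sequence
    appears, in order, within the detected motion stream, else None."""
    for name, pattern in GESTURES.items():
        it = iter(detected_motions)
        # 'step in it' consumes the iterator, giving an ordered
        # subsequence test: each primitive must appear after the last.
        if all(step in it for step in pattern):
            return name
    return None
```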
The plurality of non-contact obstacle or gesture sensors 52, 54, 56, 58 can also include at least one outwardly facing sensor 54, for example an inner trim sensor 54 disposed along a lower inner edge of the front door 26 inside the motor vehicle 22. Outwardly facing sensor 54 may also be provided in a door handle 49 or other location for providing an outwardly facing field of view. In more detail, the at least one inner trim sensor 54 is used to detect objects in the path of the closing door 26, such as a knee (e.g., the at least one inner trim sensor 54 can be off while the door 26 is opening in an arc defined by its pivotal coupling to the vehicle body 37). The plurality of non-contact obstacle or gesture sensors 52, 54, 56, 58 can also include at least one shut face sensor 56 disposed along a shut face of the door 26.
The plurality of non-contact obstacle or gesture sensors 52, 54, 56, 58 can also include a rocker panel sensor 58 disposed in a rocker panel of the vehicle 22 (e.g., under the front door 26). The rocker panel sensor 58 disposed on or inside the rocker panel 57 can be used to detect objects (e.g., the curb) in the path of the front door 26 while it is opening or while the front door 26 is closing (e.g., a leg). The plurality of non-contact obstacle or gesture sensors 52, 54, 56, 58 can each include a 77 gigahertz transceiver with an azimuth field of view of 140 degrees within a sealed housing. Such sensors 52, 54, 56, 58 are configured for detection of static objects (e.g., poles, vehicles, walls, curbs, etc.). The plurality of non-contact obstacle or gesture sensors 52, 54, 56, 58 are also configured for detection of slower moving dynamic objects (e.g., pedestrians). The plurality of non-contact obstacle or gesture sensors 52, 54, 56, 58 have illustrative operating parameters including −40° to +85° Celsius operation, 9 to 16 Volt operation, a minimum detection range of 2 cm, a maximum detection range of 15 meters, and a range resolution of 4 cm. Nevertheless, alternative sensors 52, 54, 56, 58, sensing technology, and arrangements of the plurality of non-contact obstacle or gesture sensors 52, 54, 56, 58 are contemplated.
According to an aspect, the second field of view 102 is reduced compared to the first field of view 100. So, the electronic control unit 32 can ignore part of the field of view of the at least one non-contact sensor 52, 54, 56, 58 when an adjacent or partner door (e.g., rear passenger door 30 when the at least one powered closure member 26, 30 is the front door 26) is detected to move from its closed position. Since the adjacent door (e.g., rear passenger door 30) may be non-powered, an absolute position of that door may not be known, only whether it is open, as described in more detail below. For example, the electronic control unit 32 can ignore part of the field of view by disregarding data received from the at least one non-contact sensor 52, 54, 56, 58 associated with the part of the field of view within which the adjacent closure member may be moving or positioned. The electronic control unit 32 can disregard data associated with part of the field of view based on angular data received by the electronic control unit 32, such as for example when a FMCW radar sensor outputting angular data associated with a detected obstacle is employed, and/or based on distance data received by the electronic control unit 32, such as for example when a FMCW radar sensor outputting radial distance data associated with a detected obstacle is employed. As a result, the field of view may be modified such that the closure member does not overlap with the modified field of view, where the closure member would otherwise trigger a false obstacle detection. In other words, while the closure panel may occupy an angular zone of the field of view as detected by the at least one non-contact sensor 52, 54, 56, 58, the electronic control unit 32 may receive data indicating the detection of an obstacle, but ignore or not act on such detection data knowing the obstacle is the closure panel.
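Disregarding returns that fall where the open partner door may be can be sketched as a simple angular and radial mask applied to each radar detection. The zone boundaries and data layout below are placeholder assumptions, not values from the disclosure:

```python
def filter_detections(detections, excluded_zone):
    """Drop radar returns that fall inside the zone the partner door
    can occupy, so the door itself never triggers obstacle handling.

    detections:    iterable of (angle_deg, range_m) tuples from the sensor
    excluded_zone: (min_angle_deg, max_angle_deg, max_range_m) describing
                   the partner door's swing path in the sensor frame
    """
    min_a, max_a, max_r = excluded_zone
    kept = []
    for angle, rng in detections:
        if min_a <= angle <= max_a and rng <= max_r:
            continue  # inside the partner door's swing path: ignore
        kept.append((angle, rng))
    return kept
```

A target beyond the door's maximum reach (large range) or outside the swing arc (angle out of band) is still reported as a genuine obstacle.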
In one possible configuration, the field of view may be continuously modified during the detected movement of the closure panel. In another possible configuration, the electronic control unit 32 may be configured to modify the field of view from a first field of view when the position of the second closure member is detected to be in the closed position to a second field of view when the position of the second closure member is detected not to be in the closed position, wherein the second field of view is not modified after the second closure member is detected not to be in the closed position. In other words, the modified field of view is selected to ensure that a fully opened closure member is not detected by the at least one non-contact sensor 52, 54, 56, 58 in any position of the closure member. Such a configuration may be employed, for example, when the moving closure member is not configured with an absolute position sensor, but rather with a door ajar sensor (e.g., provided in a door latch) configured only to detect when the door is not in a closed position.
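With only a door-ajar sensor, the electronic control unit cannot track the partner door continuously, so it switches once to a fixed, conservative field of view that excludes the door's entire swing arc. A minimal sketch, with illustrative azimuth limits that are assumptions rather than disclosed values:

```python
FULL_FOV_DEG = (-70.0, 70.0)     # first field of view (partner door closed)
REDUCED_FOV_DEG = (-70.0, 20.0)  # second field of view: excludes the entire
                                 # swing arc of a fully opened partner door

def select_field_of_view(partner_door_ajar):
    """Return the azimuth limits to use, based only on an ajar signal.

    Because no absolute position is available, the reduced view is
    static: the worst case (fully open partner door) must be assumed
    whenever the door is not confirmed closed.
    """
    return REDUCED_FOV_DEG if partner_door_ajar else FULL_FOV_DEG
```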
Continuing to refer to
Thus, the adjustment of the first field of view 100 of the at least one non-contact sensor 52, 54, 56, 58 to the second field of view 102 and operating the at least one non-contact sensor 52, 54, 56, 58 using the second field of view 102 as described herein may involve the ajar signal being sent from the non-powered partner door 26, 30 to the powered door controller (e.g., electronic control unit 32). If the electronic control unit 32 knows the non-powered door 26, 30 is not in the home or closed position, the algorithm of the electronic control unit 32 can be modified to ignore targets in the swing path of the non-powered door 26, 30. This will effectively limit the field of view of the powered door 26, 30. The risk of limiting the field of view is mitigated by the fact that the non-powered door 26, 30 will physically stop an obstacle or object from contacting the powered door 26, 30 in the event there are additional obstacles in the area. If the non-powered door 26, 30 is open, it will be struck first by the object or obstacle. Responsibility for protecting the non-powered door 26, 30 against obstacle contact resides with the operator 21, as with normal mechanical or non-powered doors 26, 30. Depending on how the algorithm of the electronic control unit 32 is executed, or on the software architecture, the electronic control unit 32 may require an additional bus (e.g., CAN) message for the ajar signal to trigger the change of field of view or otherwise ignore the targets or objects within a swing path of the non-powered or partner door 26, 30, for example.
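Receiving the ajar signal over the vehicle bus and switching the detection algorithm could look roughly like the handler below. The message ID, payload layout, and class structure are entirely hypothetical, introduced only to illustrate the ajar-message-to-field-of-view flow:

```python
# Hypothetical CAN message ID and payload layout for the door-ajar signal.
DOOR_AJAR_MSG_ID = 0x3B2

class FovController:
    """Tracks whether the sensing algorithm should ignore targets
    in the non-powered partner door's swing path."""

    def __init__(self):
        self.ignore_partner_swing_path = False

    def on_can_message(self, msg_id, payload):
        """Switch the detection algorithm when the non-powered partner
        door reports it has left the closed (home) position."""
        if msg_id != DOOR_AJAR_MSG_ID:
            return  # not the ajar message; ignore
        ajar = bool(payload[0] & 0x01)  # assumed bit 0: door not fully latched
        self.ignore_partner_swing_path = ajar
```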
The adjustment of the first field of view 100 of the at least one non-contact sensor 52, 54, 56, 58 to the second field of view 102 and operating the at least one non-contact sensor 52, 54, 56, 58 using the second field of view 102 described herein can be implemented on vehicles 22 where either only the front pair or rear pair of doors are powered (see e.g.,
According to another aspect and as previously discussed, the at least one non-contact sensor 52, 54, 56, 58 may be an outside mirror sensor 52 disposed on an outside mirror of the motor vehicle 22. The first field of view 100 and the second field of view 102 of the outside mirror sensor 52 extend outwardly from the outside mirror sensor 52 and fan out increasingly wider toward a rear of the motor vehicle 22.
So, on four door vehicles (e.g., motor vehicle 22) with four powered doors 26, 30, it is possible for the position of the doors 26, 30 away from the closed position to be known or communicated to the electronic control unit 32. It is expected that the at least one non-contact sensor 52, 54, 56, 58 will detect the adjacent or partner door (i.e., door 26, 30 on the same side of the vehicle 22) when the partner door is ajar and/or in motion. In this case, the object identified by the at least one non-contact sensor 52, 54, 56, 58 can be judged against known position information of other doors 26, 30 on the vehicle 22, and a data-based decision regarding the status of the obstacle (false or not) can be made.
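Judging a detected target against the known positions of the other powered doors can be sketched as a simple proximity check in the sensor's angular frame. The function and tolerance below are illustrative assumptions:

```python
def is_false_obstacle(detection_angle_deg, door_angles_deg, tolerance_deg=5.0):
    """Return True if a detected target lines up with the known position
    of another (powered) door, i.e. the 'obstacle' is probably a door.

    detection_angle_deg: bearing of the detected target in the sensor frame
    door_angles_deg:     reported opening angles of the other doors,
                         mapped into the same angular frame
    """
    return any(abs(detection_angle_deg - a) <= tolerance_deg
               for a in door_angles_deg)
```

If the check returns True the detection can be suppressed; otherwise it is treated as a genuine obstacle and movement of the powered door is stopped.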
Referring back to
Although, in
Clearly, changes may be made to what is described and illustrated herein without, however, departing from the scope defined in the accompanying claims. The foregoing description of the embodiments has been provided for purposes of illustration and description. It is not intended to be exhaustive or to limit the disclosure. Individual elements or features of a particular embodiment are generally not limited to that particular embodiment, but, where applicable, are interchangeable and can be used in a selected embodiment, even if not specifically shown or described. The same may also be varied in many ways. Such variations are not to be regarded as a departure from the disclosure, and all such modifications are intended to be included within the scope of the disclosure.
The terminology used herein is for the purpose of describing particular example embodiments only and is not intended to be limiting. As used herein, the singular forms “a,” “an,” and “the” may be intended to include the plural forms as well, unless the context clearly indicates otherwise. The terms “comprises,” “comprising,” “including,” and “having,” are inclusive and therefore specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. The method steps, processes, and operations described herein are not to be construed as necessarily requiring their performance in the particular order discussed or illustrated, unless specifically identified as an order of performance. It is also to be understood that additional or alternative steps may be employed.
When an element or layer is referred to as being “on,” “engaged to,” “connected to,” or “coupled to” another element or layer, it may be directly on, engaged, connected or coupled to the other element or layer, or intervening elements or layers may be present. In contrast, when an element is referred to as being “directly on,” “directly engaged to,” “directly connected to,” or “directly coupled to” another element or layer, there may be no intervening elements or layers present. Other words used to describe the relationship between elements should be interpreted in a like fashion (e.g., “between” versus “directly between,” “adjacent” versus “directly adjacent,” etc.). As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
Although the terms first, second, third, etc. may be used herein to describe various elements, components, regions, layers and/or sections, these elements, components, regions, layers and/or sections should not be limited by these terms. These terms may be only used to distinguish one element, component, region, layer or section from another region, layer or section. Terms such as “first,” “second,” and other numerical terms when used herein do not imply a sequence or order unless clearly indicated by the context. Thus, a first element, component, region, layer or section discussed below could be termed a second element, component, region, layer or section without departing from the teachings of the example embodiments.
Spatially relative terms, such as “inner,” “outer,” “beneath,” “below,” “lower,” “above,” “upper,” and the like, may be used herein for ease of description to describe one element or feature's relationship to another element(s) or feature(s) as illustrated in the figures. Spatially relative terms may be intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements described as “below” or “beneath” other elements or features would then be oriented “above” the other elements or features. Thus, the example term “below” can encompass both an orientation of above and below. The device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly.
This utility application claims the benefit of U.S. Provisional Application No. 63/472,543 filed Jun. 12, 2023. The entire disclosure of the above application is incorporated herein by reference.