The present disclosure relates generally to motor vehicle-mounted wireless access systems and object impact avoidance systems and, more particularly, to such systems in which transmitted and reflected wireless signals are used to detect object motion and in which activation of one or more motor vehicle actuators, of one or more audible devices and/or of one or more illumination devices is controlled based on the detected motion.
Many vehicles today are equipped with a passive entry system, or “PES.” In some PES implementations, a key fob communicates with a computer of the motor vehicle, and the motor vehicle computer operates to automatically unlock one or more door locks of the motor vehicle in response to detection of the key fob being in close proximity to the motor vehicle. This allows an operator of the vehicle to approach the vehicle and open the door without having to manually unlock the door with a key or to manually press a button on the key fob. In some such applications, the motor vehicle computer is also configured to automatically lock the vehicle in response to detection of the key fob being outside of the close proximity of the motor vehicle.
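The lock/unlock-on-proximity behavior described above can be sketched in a few lines. This is an illustrative sketch only: the distance thresholds, the hysteresis band, and the function name are assumptions, not values taken from this disclosure or any particular PES implementation.

```python
# Hypothetical passive-entry lock logic; thresholds are illustrative only.
UNLOCK_RANGE_M = 1.5   # assumed "close proximity" radius around the vehicle
LOCK_RANGE_M = 3.0     # assumed walk-away radius (hysteresis band)

def next_lock_state(current_locked: bool, fob_distance_m: float) -> bool:
    """Return True if the doors should be locked after this fob reading.

    The hysteresis band between the two radii prevents rapid lock/unlock
    cycling when the fob hovers near a single threshold.
    """
    if fob_distance_m <= UNLOCK_RANGE_M:
        return False            # fob is close: unlock
    if fob_distance_m >= LOCK_RANGE_M:
        return True             # fob has walked away: lock
    return current_locked       # in the hysteresis band: keep prior state
```

The two-threshold design choice is common in proximity systems generally; the disclosure itself only states that unlocking and locking occur based on fob proximity.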
Another known type of hands-free vehicle access or entry system employs an infrared (“IR”) detector assembly. Typically, such systems use an active near-infrared arrangement including multiple IR LEDs and one or more sensors in communication with a computer or other circuitry. The computer is typically operable in such an assembly to calculate the distance of an object from the assembly by timing the interval between emission of IR radiation and reception by the sensor(s) of at least a portion of the emitted IR radiation that is reflected by the object back to the sensor(s), and then interpreting the timing information to determine movement of the object within the IR field. Exemplary IR movement recognition systems are disclosed in US Patent Application Publications 20120200486, 20150069249, 20120312956, and 20150248796, the disclosures of which are incorporated herein by reference in their entireties.
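The time-of-flight distance calculation described above follows from the round-trip travel of the emitted radiation at the speed of light: the measured interval covers the path to the object and back, so the one-way distance is half the product of that interval and the speed of light. A minimal sketch, with illustrative function names not taken from the cited publications:

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0

def round_trip_distance_m(emit_time_s: float, receive_time_s: float) -> float:
    """One-way distance to the reflecting object; the pulse covers the
    path twice, hence the division by two."""
    return SPEED_OF_LIGHT_M_S * (receive_time_s - emit_time_s) / 2.0

def is_approaching(prev_distance_m: float, curr_distance_m: float) -> bool:
    """Movement within the IR field can be inferred from successive
    distance samples: a shrinking distance means the object approaches."""
    return curr_distance_m < prev_distance_m
```

For example, a 2-nanosecond round trip corresponds to an object roughly 0.3 m from the assembly.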
For the purposes of promoting an understanding of the principles of this disclosure, reference will now be made to a number of illustrative embodiments shown in the attached drawings and specific language will be used to describe the same.
This disclosure relates to an object detection system mountable to or carried by a motor vehicle in any of various locations at or about the motor vehicle. In some embodiments, the object detection system may be implemented solely in the form of a hands-free vehicle access system. In some such embodiments, one or more illumination devices may be implemented to provide visual feedback of objects being detected. In other embodiments, the object detection system may be implemented in the form of a combined hands-free vehicle access system and object impact avoidance system. In such embodiments, the object detection system operates in a hands-free vehicle access mode under some operating conditions and in an object impact avoidance mode under other operating conditions.
Referring now to
In some embodiments, the object detection system 10 may include a vehicle control computer 24 electrically connected to the object detection module 12 and having at least one processor or controller 26 and at least one memory 28. In some embodiments, the vehicle control computer 24 may include a communication circuit 30 for receiving the vehicle access signals wirelessly transmitted by the transmitter 22 of the key fob 20. In some embodiments, the communication circuit 18 of the object detection module 12 and the communication circuit 30 of the vehicle control computer 24 may be configured to wirelessly communicate with one another in a conventional manner so that the processors 14, 26 may conduct information transfer wirelessly via the communication circuits 18, 30.
In some embodiments, the object detection system 10 may include one or more actuator driver circuits 40 for controllably driving one or more corresponding actuators 46. In some such embodiments, the one or more actuator driver circuits 40 may include at least one processor or controller 42 and at least one memory 44 in addition to one or more conventional driver circuits, although in other embodiments the processor or controller 42 and the memory 44 may be omitted. In some embodiments, one, some or all of the one or more driver circuits 40 may be electrically connected to the vehicle control computer 24 so that the processor or controller 26 of the vehicle control computer 24 may control the operation of one or more actuators 46 via control of such one or more driver circuits 40. Alternatively or additionally, at least one, some or all of the one or more driver circuits 40 may be electrically connected to the object detection module 12 as illustrated by dashed-line connection in
In some embodiments, the object detection system 10 may include one or more conventional vehicle operating parameter sensors, sensing systems and/or switches 50 carried by the motor vehicle and electrically connected to, or otherwise communicatively coupled to, the vehicle control computer 24. Examples of such vehicle operating parameter sensors, sensing systems and/or switches 50 may include, but are not limited to, an engine ignition sensor or sensing system, a vehicle speed sensor or sensing system, a transmission gear selector position sensor, sensing system or switch, a transmission gear position sensor, sensing system or switch, and the like.
In some embodiments, the object detection system 10 may include one or more conventional audio and/or illumination device driver circuits 60 for controllably driving one or more corresponding audio (or audible) devices and/or one or more illumination devices 66. In some such embodiments, the one or more audio and/or illumination device driver circuits 60 may include at least one processor or controller 62 and at least one memory 64 in addition to one or more conventional driver circuits, although in other embodiments the processor or controller 62 and the memory 64 may be omitted. In some embodiments, one, some or all of the one or more driver circuits 60 may be electrically connected to the vehicle control computer 24 so that the processor or controller 26 of the vehicle control computer 24 may control the operation of one or more audio and/or illumination devices 66 via control of such one or more driver circuits 60. Alternatively or additionally, at least one, some or all of the one or more driver circuits 60 may be electrically connected to the object detection module 12 as illustrated by dashed-line connection in
Referring now to
In the embodiment illustrated in
Radiation emission and detection assemblies 100 are conventionally associated with processors or controllers 141 as depicted in
In some embodiments, the IR LEDs 102 and IR sensors 104 illustratively take the form of an IR sensor module available from NEONODE, INC. (San Jose, Calif.). Such modules typically contain multiple pairs of IR emitter LEDs 102 and IR sensors 104 for receiving reflected IR radiation, and typically have a range of about 200 millimeters (mm) of off-surface detection. Arranging the IR LEDs 102 and the IR sensors 104 in pairs permits a higher resolution of detection. For instance, the assembly 100 of IR LEDs 102 and IR sensors 104 is capable of detecting the difference between a single finger and multiple fingers and, as a result, is capable of detecting gesturing by a user's hand.
The embodiment of the object detection module 121 illustrated in
In the embodiment illustrated in
The one or more illumination devices 112 is/are illustratively included to provide visual feedback of one or more conditions relating to detection by the radiation emission and detection assembly 100 of an object within a sensing region of the assembly 100. In one example embodiment, two illumination devices 112 may be provided for producing the desired visual feedback. In one implementation of this example embodiment, a first one of the illumination devices 112 may be configured and controlled to illuminate with a first color to visibly indicate the detected presence by the radiation emission and detection assembly 100 of an object within the sensing region, and the second illumination device 112 may be configured and controlled to illuminate with a second color, different from the first, to visibly indicate that the detected object exhibits a predefined gesture. In another example embodiment, three illumination devices 112 may be provided. In this embodiment, a first one of the illumination devices 112 may be controlled to illuminate with a first color to visibly indicate the detected presence of an object within an area of the sensing region in which the radiation emission and detection assembly 100 is unable to determine whether the detected object exhibits a predefined gesture (e.g., the object may be within a sub-region of the sensing region which is too small to allow determination of whether the object exhibits the predefined gesture), a second one of the illumination devices 112 may be controlled to illuminate with a second color to visibly indicate the detected presence of an object within an area of the sensing region in which the radiation emission and detection assembly 100 is able to determine whether the detected object exhibits a predefined gesture, and a third one of the illumination devices 112 may be controlled to illuminate with a third color to visibly indicate that the object within the sensing region is detected by the radiation emission and detection assembly 100 as exhibiting a predefined gesture.
In other embodiments, the one or more illumination devices 112 may include any number of illumination devices 112. Multiple illumination devices 112, for example, may be illuminated in one or more colors to provide a desired visual feedback. In any such embodiments, one or more illumination devices 112 may be LEDs, and one or more such LEDs may illustratively be provided in the form of RGB LEDs capable of illumination in more than one color. According to this variant, it will be appreciated that positive visual indication of various modes of operation of the radiation emission and detection assembly 100 may be carried out in numerous different colors, with each such color indicative of a different state of operation of the object detection module 121. As one non-limiting example, the color red may serve to indicate that the radiation emission and detection assembly 100 has detected an object (e.g., a hand or foot) within the sensing region, but is unable to determine whether the detected object is exhibiting a predefined gesture. The color green, in contrast, may serve to indicate that the detected object is exhibiting a predefined gesture and, consequently, that the predefined vehicle command associated with that predefined gesture (e.g., unlocking the vehicle closure, opening the vehicle closure, etc.) is being effected. In addition to green, other colors might be uniquely associated with different predefined commands. Thus, while green illumination might reflect that a closure for the vehicle is being unlocked, blue illumination, for example, may reflect that a fuel door latch has been opened, and purple illumination may reflect that a window is being opened.
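The color-to-state association described above reduces to a simple lookup from operating state to feedback color. The state identifiers and the "off" fallback in the sketch below are illustrative assumptions; only the example colors (red, green, blue, purple) come from the text:

```python
# Illustrative state-to-color table; state names are assumed, and the
# colors mirror the non-limiting examples given in the description.
STATE_COLORS = {
    "object_detected_no_gesture": "red",     # object seen, gesture unresolved
    "unlock_closure": "green",               # predefined unlock gesture effected
    "open_fuel_door": "blue",                # fuel door latch command
    "open_window": "purple",                 # window-opening command
}

def feedback_color(state: str) -> str:
    """Color with which to illuminate the RGB LED(s) for a given state."""
    return STATE_COLORS.get(state, "off")    # unknown state: no illumination
```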
In still other embodiments, in addition to or alternatively to color distinction, different operating modes, i.e., different detection modes, of the radiation emission and detection assembly 100 may be visually distinguished from one another by controlling the at least one illumination device 112 to switch on and off with different respective frequencies and/or duty cycles. In some embodiments which include multiple illumination devices 112, the different operating modes of the radiation emission and detection assembly 100 may be additionally or alternatively distinguished visually from one another by activating different subsets of the multiple illumination devices 112 for different operating or detection modes, and/or by sequentially activating the multiple illumination devices 112 or subsets thereof with different respective activation frequencies and/or duty cycles.
The object detection module 121 further illustratively includes a number N of conventional supporting circuits (SC) and conventional driver circuits (DC) 1141-114N, wherein N may be any positive integer. The supporting circuit(s) (SC) is/are each electrically connected to the processor or controller 141, and may include one or more conventional circuits configured to support the operation of the processor or controller 141 and/or other electrical circuits and/or components of the object detection module 121. Example supporting circuits may include, but are not limited to, one or more voltage supply regulation circuits, one or more capacitors, one or more resistors, one or more inductors, one or more oscillator circuits, and the like. The driver circuit(s) (DC) include one or more inputs electrically connected to the processor or controller 141 and one or more outputs electrically connected to the one or more illumination devices 112 and the plurality of IR LEDs 102. The driver circuit(s) DC is/are conventional and is/are configured to be responsive to one or more control signals supplied by the processor or controller 141 to selectively drive, i.e., activate and deactivate, the plurality of IR LEDs 102 and the one or more illumination devices 112.
It will be understood that the terms “processor” and “controller” used in this disclosure are comprehensive of any computer, processor, microchip processor, integrated circuit, or any other element(s), whether singly or in multiple parts, capable of carrying programming for performing the functions specified in the claims and this written description. The at least one processor or controller 141 may be a single such element which is resident on a printed circuit board with the other elements of the inventive access system. It may, alternatively, reside remotely from the other elements of the system. For example, but without limitation, the at least one processor or controller 141 may take the form of a physical processor or controller on-board the object detection module 121. Alternately or additionally, the at least one processor or controller 141 may be or include programming in the at least one processor or controller 26 of the vehicle control computer 24 illustrated in
In the embodiment illustrated in
In one embodiment, electrical power for the object detection module 12, the vehicle control computer 24, the actuator driver circuit(s) 40, the actuator(s) 46, the audio/illumination device driver circuit(s) 60 and the audio/illumination device(s) 66 is illustratively provided by a conventional electrical power source and/or system on-board the motor vehicle. In alternate embodiments, electrical power for the object detection module 12, the actuator driver circuit(s) 40, the actuator(s) 46, the audio/illumination device driver circuit(s) 60 and/or the audio/illumination device(s) 66 may be provided by one or more local power sources, e.g., one or more batteries, on-board the associated module(s), circuit(s) and/or device(s).
Referring now to
In some embodiments, the processor or controller 141 is operable upon detection of the object OB within the sensing region R to selectively illuminate the at least one illumination device 112 in a manner which visibly indicates the detected presence of the object OB within the sensing region R. In some such embodiments, the processor or controller 141 is operable upon detection of the object OB within the sensing region to selectively illuminate the at least one illumination device in a manner which indicates that the object OB is within a sub-region of the sensing region R that is too small to make a determination of whether the object OB exhibits the predefined gesture, and is operable to selectively illuminate the at least one illumination device in a manner which indicates that the object OB is within a sub-region of the sensing region R in which a determination can be made of whether the object OB exhibits the predefined gesture. In embodiments in which the at least one illumination device 112 is provided in the form of an array 110 of illumination devices spaced apart at least partially across the sensing region R, the processor or controller 141 is illustratively operable to selectively illuminate illumination devices 112 in the array 110 in a manner which correlates the location of the detected object OB within the sensing region R to a corresponding location or region along the illumination device array 110. In any case, the memory 16 illustratively has instructions stored therein which, when executed by the processor 141, cause the processor 141 to carry out the functions described below. It will be understood that in other embodiments, such instructions may be stored, in whole or in part, in one or more other memory units within the system 10 and/or may be executed, in whole or in part, by one or more other processors and/or controllers within the system 10.
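Correlating the detected object's location to a corresponding region along the illumination device array can be sketched as a position-to-index mapping. The normalization scheme and the choice to light the nearest LED plus its immediate neighbors are assumptions made for illustration:

```python
def leds_for_position(object_x_m: float, region_width_m: float,
                      n_leds: int) -> list[int]:
    """Indices of the LED nearest the object's lateral position within the
    sensing region, plus its in-range neighbors, so the lit region of the
    array visibly tracks the object."""
    fraction = max(0.0, min(1.0, object_x_m / region_width_m))
    idx = min(n_leds - 1, int(fraction * n_leds))
    return [i for i in (idx - 1, idx, idx + 1) if 0 <= i < n_leds]
```

With an eight-device array spanning a one-meter region, an object at mid-span lights the three central devices, while an object at either edge lights the two endmost devices.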
In a first example state of operation illustrated in
As illustrated by example in
In a second example state of operation illustrated in
In this example, the illumination devices 112″ are illuminated in the color amber (or yellow or gold), which serves as a visual feedback indication that the object OB is positioned within the sensing region R such that any subsequent gestures made by the object OB can be recognized by the processor or controller 141 as a predefined gesture or any of multiple different predefined gestures. As noted above, however, one or more other colors may alternatively be employed as desired. Alternatively or additionally still, one or more of the illumination devices 112″ (or one or more of the illumination devices 112 generally) may be controlled in another visually distinctive manner to provide the visual indication that the object OB is positioned within the sensing region R such that any subsequent gestures made by the object OB can be recognized by the processor or controller 141 as a predefined gesture or any of multiple different predefined gestures, e.g., sequentially activating and deactivating the illumination devices 112″ (or one or more illumination devices 112 generally) with a predefined frequency, activating and deactivating one or more of the illumination devices 112″ (or one or more illumination devices 112 generally) with a predefined frequency and/or duty cycle, and/or activating in any manner only a subset of the illumination devices 112″ (or any subset of the illumination devices 112 generally).
In a third example state of operation illustrated in
The memory 16 illustratively has stored therein a vehicle access condition value which represents the predefined gesture. In alternate embodiments, the vehicle access condition value may be stored in one or more of the memory 16, the memory 28, the memory 44 and the memory 64. In some embodiments, the vehicle access condition value is illustratively stored in the form of a predefined set or sequence of values, and the processor 141 is illustratively operable to process the signal(s) produced by the assembly 100 to convert such signals to a detected set or sequence of values, to then compare the detected set or sequence of values to the stored, predefined set or sequence of values and to then determine that the predefined gesture has been exhibited and detected by the assembly 100 if the detected set or sequence of values matches the vehicle access condition value in the form of the stored, predefined set or sequence of values. In some such embodiments, the object detection module 121 may have a “learning” mode of operation in which the predefined gesture may be programmed by exhibiting the predefined gesture within the sensing region R of the assembly 100, then converting the signals produced by the assembly 100 in response to the exhibited gesture to a learned set or sequence of values, and then storing the learned set or sequence of values as the predefined set or sequence of values corresponding to the predefined gesture. In some embodiments, two or more different vehicle access condition values may be stored in the memory 16 (and/or any of the memories 28, 44 and 64) each corresponding to a different one of two or more corresponding predefined gestures, and the processor 141 may be operable to compare detected sets or sequences of values produced by the assembly 100 to each of the two or more different stored vehicle access condition values to determine whether one of the two or more predefined gestures has been exhibited.
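The stored-sequence comparison and "learning" mode described above can be sketched as follows. The class name, the quantization step, and matching by exact equality of quantized sequences are illustrative assumptions; a production implementation would likely tolerate timing and amplitude variation with a more forgiving comparison (e.g., dynamic time warping):

```python
class GestureStore:
    """Sketch of learned-gesture storage and matching against detected
    sets or sequences of sensor values (names are assumptions)."""

    def __init__(self):
        self._gestures = {}   # gesture name -> stored quantized sequence

    def learn(self, name: str, raw_samples, levels: int = 8) -> None:
        """'Learning' mode: convert the signals produced while the gesture
        is exhibited into a stored sequence of values."""
        self._gestures[name] = self._quantize(raw_samples, levels)

    def match(self, raw_samples, levels: int = 8):
        """Compare a detected sequence against every stored vehicle access
        condition value; return the matching gesture name, or None."""
        detected = self._quantize(raw_samples, levels)
        for name, stored in self._gestures.items():
            if detected == stored:
                return name
        return None

    @staticmethod
    def _quantize(samples, levels):
        # Map each normalized sample (0.0-1.0) to a discrete level so small
        # sensor noise does not defeat the exact sequence comparison.
        return [min(levels - 1, int(s * levels)) for s in samples]
```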
In some such embodiments, each of the multiple predefined gestures may be associated with a different user of the motor vehicle, and in other such embodiments any single user may have two or more predefined gestures stored in the memory 16.
In some embodiments, the processor or controller 141 may be responsive to (i) detection of the object OB within a sub-region of the sensing region R that is too small to enable the assembly 100 to determine whether the object OB exhibits a predefined gesture, (ii) detection of the object OB positioned within the sensing region R such that any subsequent gestures made by the object OB can be recognized by the processor or controller 141 as a predefined gesture or any of multiple different predefined gestures, and/or (iii) detection of the predefined gesture, to control at least one of the audio/illumination device driver circuits 60 to activate one or more respective audio and/or illumination devices 66 in addition to, or instead of, the one or more illumination devices 112.
While the foregoing example illustrates the selective illumination of several of the illumination devices 112 simultaneously, it will be appreciated that the number of lights illuminated in any given situation may vary depending on the type of feedback desired, the number and/or type of illumination devices 112 being employed in the system, etc. Likewise, although one or more of the illumination devices 112 may be activated with one or more colors and/or be activated and deactivated, i.e., switched on and off, to provide visual feedback of the position of the object OB, one or more illumination devices 112 may alternatively be activated (and deactivated) in any manner which visually directs, e.g., coaxes, the user to move the object OB in a particular direction and/or to a particular position relative to the assembly 100.
In one embodiment, the at least one processor or controller 141 is illustratively operable, upon determining from the radiation emission and detection assembly 100 that a predefined gesture has been exhibited by an object OB within the sensing region R of the assembly 100, to communicate instructions to the vehicle control computer 24 to effect the desired operation (e.g., to unlock or lock a closure, such as a door, rear hatch, tailgate, etc., to open a closure, such as a rear hatch, tailgate, etc., and/or to activate, i.e., turn on, one or more interior and/or exterior vehicle illumination devices). In some alternate embodiments, the at least one processor or controller 141 may be operable, upon such determination, to control one or more actuator driver circuits 40 and/or one or more audio/illumination device driver circuits 60 directly to effect the desired operation. In other alternate embodiments, the at least one processor or controller 141 may be operable, upon such determination, to communicate instructions to one or more other processors or controllers, e.g., the at least one processor or controller 42 and/or the at least one processor or controller 62, to effect the desired operation. In still other alternate embodiments, the at least one processor or controller 141 may be operable, upon such determination, to effect the desired operation in part and to instruct one or more other processors or controllers, e.g., 26, 42, 62, to also effect the desired operation in part.
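Routing a recognized gesture to the driver circuit that effects the associated operation reduces to a lookup-and-dispatch step. The gesture names, driver identifiers, and callable-driver interface below are assumptions for illustration; the disclosure itself leaves the communication mechanism open (direct control, instructions to the vehicle control computer 24, or instructions to other processors):

```python
def dispatch(gesture: str, drivers: dict) -> bool:
    """Route a recognized predefined gesture to the driver callable that
    effects the associated vehicle operation. Returns True on success."""
    # Hypothetical gesture-to-driver table; entries are illustrative only.
    actions = {
        "unlock_gesture": "door_lock_actuator",
        "open_hatch_gesture": "hatch_actuator",
        "lights_gesture": "illumination_driver",
    }
    driver_name = actions.get(gesture)
    if driver_name is None or driver_name not in drivers:
        return False            # unrecognized gesture or driver not present
    drivers[driver_name]()      # activate the corresponding driver circuit
    return True
```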
In some embodiments, one or more aspects of the gesture access process described above and illustrated by example with respect to
In embodiments in which the gesture access process illustrated by example in
Referring now to
The at least one radar transmitter 132 is illustratively conventional, and is configured to be responsive to control signals produced by the processor or controller 141 to emit radio frequency (RF) radiation outwardly from the assembly 100. In one embodiment, the at least one radar transmitter 132 is configured to emit radiation in the so-called short-range-radar (SRR) band, e.g., at and around 24 gigahertz (GHz). Alternatively or additionally, the at least one radar transmitter 132 may be configured to emit radiation in the so-called long-range-radar (LRR) band, e.g., at and around 77 GHz. It will be understood, however, that these numerical frequency ranges are provided only by way of example, and that the at least one radar transmitter 132 may be alternatively or additionally configured to emit radiation at radar frequencies less than 1 GHz and up to or greater than 300 GHz. In any case, each of the plurality of radar detectors 134 is configured to detect radar signals in frequency range(s) corresponding to that/those of the at least one radar transmitter 132, and to produce radiation detection signals corresponding thereto.
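Although the disclosure gives no formulas, the standard Doppler relation illustrates how reflected radar signals at these carrier frequencies reveal object motion: an object with radial speed v shifts the reflected carrier by f_d = 2·v·f0/c. The function below is a sketch using that textbook relation, not a disclosed algorithm:

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0

def doppler_shift_hz(carrier_hz: float, radial_speed_m_s: float) -> float:
    """Frequency shift of radiation reflected by an object moving toward
    the sensor (positive speed) or away from it (negative speed)."""
    return 2.0 * radial_speed_m_s * carrier_hz / SPEED_OF_LIGHT_M_S
```

At the 24 GHz SRR band, a 1 m/s approach shifts the return by roughly 160 Hz; at the 77 GHz LRR band the same motion produces a shift of roughly 514 Hz, which is one reason higher carrier frequencies afford finer velocity resolution.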
The radiation detection signals produced by the radar detectors 134 illustratively include reflected radar signals if the emitted radiation is reflected by an object in a sensing region of the assembly 130, in accordance with a conventional time sequence in which the at least one radar transmitter 132 is activated to emit radiation and at least a portion of such emitted radiation is reflected by the object toward and detected by at least one of the radar detectors 134. As illustrated by example in
Referring again to
Referring now to
Referring now to
The object detection module 12, as described above with respect to
Referring now to
As further illustrated in
Referring now to
As another example implementation of the object detection module 12 in a motor vehicle, the object detection module 121 or the object detection module 122 may likewise be embodied in a motor vehicle access handle assembly (e.g., a door handle) 300 as illustrated by example in
As in the door handle assembly 200, the grip cover 312 includes an opening 322 therein configured to receive the lens 314, and the lens 314 may be secured to the grip cover 312 within the opening 322 via any conventional means. As further illustrated in
The circuit substrate 116, 136 is illustratively mounted to a support member 316 between sidewalls 324 of the grip cover 312. In some embodiments, the radiation emission and detection assembly 100, 130, the illumination device array 110 and the circuit substrate 116, 136 are all illustratively configured such that, when assembled, the radiation emission and detection assembly 100, 130 and the illumination device array 110 are together aligned with the opening 322 and the lens 314 described above. In alternate embodiments, the grip cover 312 may be at least partially light transmissive, and in such embodiments illumination of the one or more illumination devices 112 is viewable through the grip cover 312. In still other embodiments, the grip cover 312 may define another opening and be fitted with another lens through which illumination of the one or more illumination devices 112 may be viewed. In any case, the support member 316 is illustratively dimensioned to be sandwiched between the handle base 206 and the grip cover 212 so as to securely position the object detection module 121, 122 within the housing defined by the handle base 206 and the grip cover 212.
With particular reference to
In either of the motor vehicle access handle assemblies 200, 300 illustrated in
As yet another example implementation of the object detection module 12 in a motor vehicle, any of the object detection modules 121-124 may be embodied in a motor vehicle access assembly 400 as illustrated by example in
In embodiments in which the object detection module 12 is provided in the form of the object detection module 123 or 124, the radiation emission and detection assembly 100, 130 is illustratively provided in the form of a radiation assembly or module 150, 160 as described above, and in embodiments in which the object detection module 12 is provided in the form of the object detection module 121 or 122, the radiation emission and detection assembly 100, 130 and the one or more illumination devices 112 are together provided in the form of a radiation assembly or module 120, 140 as also described above. In the embodiment illustrated in
Thusly positioned, the at least one radiation transmitter, e.g., the plurality of IR LEDs 102 or the at least one radar transmitter, is positioned relative to the vertical seam 415 such that, when activated, radiation is emitted outwardly through the vertically oriented seam 415 at least partially along its length and, if an object is positioned within a sensing region of the radiation assembly or module 120, 140, 150, 160, at least some reflected radiation signals are reflected back towards (and in some embodiments, through) the vertically oriented seam 415 to be detected by one or more of the radiation receivers, e.g., one or more of the IR sensors 104 or one or more of the radar detectors 134. Otherwise, the respective processor or controller 141-144 is operable as described above with respect to
As further illustrated by example in
As a further example implementation of the object detection module 12 in a motor vehicle, any of the object detection modules 121-124 may be embodied in a license plate bracket and sensor assembly 500 as illustrated by example in
With specific reference to
As best shown in
An object detection assembly 542, in the form of one of the object detection modules 121-124, overlies the first flange 536. The object detection assembly 542 illustratively includes a radiation emission and detection assembly 544, e.g., in the form of one of the radiation assemblies or modules 120, 140, 150, 160, at the viewing angle α relative to the plane C for detecting movement in a sensing region in front of the assembly 544. It should be appreciated that since the viewing angle α is acute relative to the plane C of the back plate 524, once the assembly 500 is attached or mounted to the motor vehicle 522, the radiation emission and detection assembly 544 is pointed generally toward the feet of an operator standing behind the motor vehicle 522, thus allowing the assembly 544 to detect movement in the region of the feet of the operator.
As best shown in
As best shown in
As best shown in
As best shown in
As best shown in
The processor or controller 141-144 of the object detection assembly 542 is depicted in the example embodiment illustrated in
In the illustrated embodiment, the one or more illumination devices 112 is/are depicted in the form of a plurality of light emitting diodes 572 mounted to the circuit board 570 in alignment with the slit 568. Each LED in the plurality of light emitting diodes 572 is electrically connected to the circuit board 570 for emitting light in response to the detection of movement by the assembly 544 as described above. A lens 574 is illustratively disposed between the circuit board 570 and the cover member 566, and overlies the plurality of light emitting diodes 572 for holding the light emitting diodes 572 in place and for protecting the light emitting diodes 572 while allowing light from the light emitting diodes 572 to pass through the lens 574. It should be appreciated that other light emitting devices could be utilized instead of light emitting diodes 572.
In addition to, or as an alternative to the light emitting diodes 572, an audible device 573 (schematically shown and which may be one of the audio devices 66 depicted in
A plurality of first ribbon wires 576 and a jumper board 578 extend between and electrically connect the circuit board 570 and the radiation emission and detection assembly 544. The first ribbon wires 576 extend along the lower and flank segments 558, 560 of the plate frame 554. A first potting material 582 is disposed between the back plate 524 and the ribbon wires 580 and jumper board 578 for damping vibrations between the back plate 524 and the assembly 544, first ribbon wires 576 and jumper board 578 and for holding the first ribbon wires 576 and jumper board 578 in place relative to the back plate 524.
As best shown in
As best shown in
As best shown in
The second ribbon wires 586 further extend through the passage 604 for allowing the second ribbon wires 586 to be connected to a computer of the motor vehicle 522 for electrically connecting the circuit board 570 to the computer, e.g., the vehicle control computer 24, of the motor vehicle 522. More specifically, the wires 576, 580, 586 electrically connect the license plate bracket and sensor assembly 500 to the existing passive entry system of the motor vehicle 522.
Operation of the license plate bracket and sensor assembly 500 is as described above with respect to
In embodiments in which the object detection assembly 542 is implemented in the form of the object detection module 121 or 122 illustrated in
In embodiments in which the object detection assembly 542 is implemented in the form of the object detection module 123 or 124 illustrated in
In the second example embodiment of the license plate bracket and sensor assembly 500′ illustrated in
Referring now to
In some embodiments, at least one object detection module 12 illustrated in any of
Referring now to
It will be further understood that the process 700 may be executed using any of the object detection modules 121-124. In this regard, dashed-line boxes are shown around some of the steps or groups of steps of the process 700 to identify steps which are part of the process 700 when the object detection module 12 is implemented in the form of the object detection module 121 or the object detection module 122 to include at least one illumination device 112. As will be described below, such steps are illustratively omitted in embodiments in which the object detection module 12 is implemented in the form of the object detection module 123 or the object detection module 124 which do not include any such illumination devices 112.
The process 700 illustratively begins at step 702 where the processor or controller 14 is operable to determine whether a Key Fob signal has been detected. As described above, the Key Fob signal is illustratively produced by a conventional Key Fob 20 or other mobile electronic device. In some embodiments, the Key Fob signal is received by the communication circuit 30 of the vehicle control computer 24 and passed, processed or unprocessed, to the processor or controller 14. In other embodiments in which the object detection module 12 includes a communication circuit 18, the Key Fob signal may be received directly by the processor or controller 14. In any case, until the Key Fob signal is detected, the process 700 loops back to step 702.
If the Key Fob signal is received by the communication circuit 30 of the vehicle control computer 24, the processor or controller 26 of the vehicle control computer 24 is illustratively operable to decode the received Key Fob signal and determine whether it matches at least one Key Fob code stored in the memory 28. If not, the processor or controller 26 disregards or ignores the Key Fob signal and the process 700 loops back to step 702. Likewise, if the Key Fob signal is received by the communication circuit 18 of the object detection module 12, the processor 14 is similarly operable to determine whether the received Key Fob signal matches at least one Key Fob code stored in the memory 16 or in the memory 28. If not, the process 700 likewise loops back to step 702. Thus, the process 700 advances along the “YES” branch of step 702 only if the received Key Fob signal matches at least one stored Key Fob code, such that the gesture access process proceeds only for authorized users, i.e., only for users carrying a Key Fob 20 that is recognizable by the object detection system 10. It will be understood that some embodiments of the process 700 may not include step 702, and in such embodiments the process 700 begins at step 704.
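The step 702 authorization check described above can be expressed as a minimal sketch, here in Python; the function and variable names (e.g., `stored_fob_codes`) are illustrative assumptions and not part of the disclosure.

```python
def fob_authorized(received_code, stored_fob_codes):
    """Step 702 sketch: proceed along the YES branch only when the decoded
    Key Fob code matches at least one stored Key Fob code."""
    return received_code in stored_fob_codes

def wait_for_fob(decoded_signals, stored_fob_codes):
    """Loop back to step 702 until an authorized Key Fob signal is seen."""
    for code in decoded_signals:
        if fob_authorized(code, stored_fob_codes):
            return code   # advance to step 704
    return None           # no authorized Key Fob detected yet
```

As described above, unrecognized codes are simply disregarded, so the gesture access process proceeds only for authorized users.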
Following the “YES” branch of step 702 (in embodiments which include step 702), the process 700 advances to step 704 where the processor or controller 14 is operable to monitor the object detection assembly; more specifically, to monitor the radiation emission and detection assembly 100, 130 of the respective object detection module 121-124 for object detection signals produced thereby, if any. In some embodiments, the processor or controller 14 is operable at step 704 to activate the radiation emission and detection assembly 100, 130 to begin transmitting radiation following step 702, and in other embodiments the radiation emission and detection assembly 100, 130 may already be operating and the processor or controller 14 may be operable at step 704 to begin monitoring the signals being produced by the previously activated radiation emission and detection assembly 100, 130.
In any case, following step 704 the processor or controller 14 is operable at step 706 to determine whether any object detection signals have been produced by the radiation emission and detection assembly 100, 130 of the respective object detection module 121-124. If not, then an object has not been detected within the sensing region of the radiation emission and detection assembly 100, 130 of the respective object detection module 121-124. In some embodiments, the process 700 advances from the “NO” branch of step 706 back to the beginning of step 702 as illustrated by example in
In embodiments in which the object detection module 12 is implemented in the form of the object detection module 121 or the object detection module 122, the process 700 illustratively includes step 708. Conversely, in embodiments in which the object detection module 12 is implemented in the form of the object detection module 123 or the object detection module 124, the process 700 does not include step 708. In implementations of the process 700 which include it, step 708 illustratively includes step 710 in which the processor or controller 14 is operable to identify one or more illumination devices 112 to illuminate based on the received object detection (OD) signal(s) produced by the radiation emission and detection assembly 100, 130 of the respective object detection module 121, 122. Thereafter at step 712, the processor or controller 14 is operable to control one or more of the driver circuit(s) DC to illuminate the identified illumination device(s) 112 according to a predefined detection scheme.
In one embodiment, the processor or controller 14 is operable at steps 710 and 712 to identify and illuminate at least one of the illumination devices 112 according to various different detection or illumination schemes. For example, if an object is determined, based on the object detection signals produced by the radiation emission and detection assembly 100, 130, to be within the sensing region of the radiation emission and detection assembly 100, 130 but within a sub-region of the sensing region that is too small to allow determination by the radiation emission and detection assembly 100, 130 and/or by the processor or controller 14 of whether the object within the sensing region exhibits a predefined gesture, the processor or controller 14 is operable to control illumination of the one or more illumination devices 112 according to an “insufficient detection” illumination scheme. In one embodiment in which the object detection module 121 or 122 includes a plurality of illumination devices in the form of an array 110 extending at least partially across the sensing region as described above with respect to the example illustrated in
As another example, if an object is determined, based on the object detection signals produced by the radiation emission and detection assembly 100, 130, to be within the sensing region of the radiation emission and detection assembly 100, 130 and also within a sub-region of the sensing region in which the radiation emission and detection assembly 100, 130 and/or the processor or controller 14 can determine whether the object therein exhibits a predefined gesture, the processor or controller 14 is operable to control illumination of the one or more illumination devices 112 according to an “object detection” illumination scheme. In one embodiment in which the object detection module 121 or 122 includes a plurality of illumination devices in the form of an array 110 extending at least partially across the sensing region as described above with respect to the example illustrated in
In embodiments which include step 708, the process 700 advances from step 712 to step 714, and in embodiments which do not include step 708 the process 700 advances from the “YES” branch of step 706 to step 714. In any case, the processor or controller 14 is operable at step 714 to compare the received object detection signals (OD), i.e., received from the radiation emission and detection assembly 100, 130, to one or more vehicle access condition (VAC) values stored in the memory 16 (or the memory 28, 42 and/or 64), and to determine at step 716 whether the VAC is satisfied. In some embodiments, for example, the stored VAC is satisfied if the object detected within a suitable sub-region of the sensing region of the radiation emission and detection assembly 100, 130 exhibits a predefined gesture which, when processed by the processor or controller 14 to determine a corresponding vehicle access value, matches the stored VAC as described above. Alternatively or additionally, as also described above, one or more VAC values stored in the memory 16, 28, 42 and/or 64 may be associated in the memory with a corresponding Key Fob code, and in some embodiments multiple VAC values are stored in the memory 16, 28, 42, 64 with each associated with a different Key Fob code. In some such embodiments, vehicle access may be granted only if the combination of the Key Fob code and associated VAC are satisfied.
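The VAC comparison of steps 714 and 716, including the per-Key-Fob association of VAC values, might be sketched as follows; the mapping name `stored_vacs` and the string-valued gesture codes are hypothetical.

```python
def vac_satisfied(gesture_value, fob_code, stored_vacs):
    """Steps 714-716 sketch: compare the vehicle access value derived from
    the detected gesture against the VAC stored for this Key Fob code.
    `stored_vacs` maps each Key Fob code to its associated VAC value, so
    access is granted only when the Key Fob code and VAC combination match."""
    return stored_vacs.get(fob_code) == gesture_value
```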
In embodiments in which the object detection module 12 is implemented in the form of the object detection module 121 or the object detection module 122, the process 700 illustratively includes step 718 to which the process 700 advances from the “YES” branch of step 716. Conversely, in embodiments in which the object detection module 12 is implemented in the form of the object detection module 123 or the object detection module 124, the process 700 does not include step 718. In implementations of the process 700 which include it, step 718 illustratively includes step 720 in which the processor or controller 14 is operable to control one or more of the driver circuit(s) DC to illuminate the identified illumination device(s) 112 according to another predefined detection or illumination scheme different from the “insufficient detection” and “object detection” schemes described above. For example, if an object previously determined to be within the sensing region of the radiation emission and detection assembly 100, 130 is determined, based on the object detection signals produced by the radiation emission and detection assembly 100, 130, to exhibit a predefined gesture as described above, the processor or controller 14 is illustratively operable to control illumination of one or more illumination devices 112 according to an “access grant” illumination scheme. In one embodiment in which the object detection module 121 or 122 includes a plurality of illumination devices in the form of an array 110 extending at least partially across the sensing region as described above with respect to the example illustrated in
In embodiments which include step 718, the process 700 advances from step 718 to step 724, and in embodiments which do not include step 718 the process 700 advances from the “YES” branch of step 716 to step 724. In any case, the processor or controller 14 is operable at step 724 to control one or more of the actuator driver circuits 40 to activate one or more corresponding vehicle access actuators 46 in order to actuate one or more corresponding vehicle access closure devices. Examples of such vehicle access closure devices may include, but are not limited to, one or more access closure locks, one or more access closure latches, and the like. At step 724, the processor or controller 14 may be operable to, for example, control at least one lock actuator associated with at least one access closure of the motor vehicle to unlock the access closure from a locked state or condition and/or to lock the access closure from an unlocked state or condition, and/or to control at least one latch actuator associated with at least one access closure of the motor vehicle to at least partially open the access closure from a closed position or condition and/or to close the access closure from an at least partially open position or condition.
In some embodiments, the process 700 may optionally include a step 726 to which the process 700 advances from step 724, as illustrated by dashed-line representation in
In embodiments in which the object detection module 12 is implemented in the form of the object detection module 121 or the object detection module 122, the process 700 may illustratively include step 722 to which the process 700 advances from the “NO” branch of step 716. Conversely, in embodiments in which the object detection module 12 is implemented in the form of the object detection module 123 or the object detection module 124, the process 700 does not include step 722. In implementations of the process 700 which include it, the processor or controller 14 is illustratively operable at step 722 to control one or more of the driver circuit(s) DC to illuminate the identified illumination device(s) 112 according to another predefined detection or illumination scheme different from the “insufficient detection,” “object detection” and “access grant” schemes described above. For example, if an object previously determined to be within the sensing region of the radiation emission and detection assembly 100, 130 is determined, based on the object detection signals produced by the radiation emission and detection assembly 100, 130, to fail to exhibit a predefined gesture as described above within a predefined time period following execution of step 712, the processor or controller 14 may illustratively be operable to control illumination of one or more illumination devices 112 according to a “fail” illumination scheme. In one embodiment in which the object detection module 121 or 122 includes a plurality of illumination devices in the form of an array 110 extending at least partially across the sensing region as described above with respect to the example illustrated in
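The four predefined illumination schemes described above (“insufficient detection,” “object detection,” “access grant” and “fail”) can be summarized as a single selection sketch; the boolean inputs are assumed to have already been derived from the object detection signals and are not part of the disclosure.

```python
def illumination_scheme(in_region, resolvable, gesture_exhibited, timed_out):
    """Sketch of the scheme selection at steps 710-712, 720 and 722."""
    if not in_region:
        return None                      # no object detected, nothing to illuminate
    if not resolvable:
        return "insufficient detection"  # sub-region too small to resolve a gesture
    if gesture_exhibited:
        return "access grant"            # predefined gesture recognized (step 720)
    if timed_out:
        return "fail"                    # no gesture within the time period (step 722)
    return "object detection"            # object detected, awaiting a gesture
```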
Referring now to
The process 800 illustratively begins at step 802 where the processor or controller 14 is operable to determine whether a Key Fob signal has been detected. Illustratively, the processor or controller 14 is operable to execute step 802 as described above with respect to step 702 of the process 700. Thus, the process 800 advances along the “YES” branch of step 802 only if the received Key Fob signal matches at least one stored Key Fob code, such that the process 800 proceeds from step 802 only for authorized users, i.e., only for users carrying a Key Fob 20 that is recognizable by the object detection system 10. It will be understood that some embodiments of the process 800 may not include step 802, and in such embodiments the process 800 begins at step 804.
Following the “YES” branch of step 802 (in embodiments which include step 802), the process 800 advances to step 804 where the processor or controller 14 is operable to monitor one or more of the vehicle operating parameter sensors and/or switches 50 mounted to or within or otherwise carried by the motor vehicle. Illustratively, signals produced by the one or more monitored sensors and/or the status(es) of the one or more switches monitored at step 804 are indicative of an operating condition or state, e.g., engine running or not, and/or of a moving condition or state of the motor vehicle, e.g., motor vehicle stationary, moving, enabled to move, etc. As described above with respect to
Following step 804, the process 800 advances to step 806 where the processor or controller 14 is operable to determine a mode based on the monitored vehicle sensor(s) and/or switch(es). Generally, the mode determined by the processor or controller 14 at step 806 is a gesture access (GA) mode if the signal(s) produced by the monitored vehicle sensor(s) and/or the operational state(s) of the monitored switch(es) correspond to a state or condition of the motor vehicle conducive to gesture access operation of the system 10, and is an object impact avoidance (OIA) mode if the signal(s) produced by the monitored vehicle sensor(s) and/or the operational state(s) of the monitored switch(es) correspond to a state or condition of the motor vehicle conducive to object impact avoidance operation of the system 10. In the former case, for example, the processor 14 may operate in the gesture access mode if the motor vehicle is stationary and disabled from moving, and in the latter case, for example, the processor 14 may operate in the object impact avoidance mode if the motor vehicle is moving or is enabled to move.
For purposes of this disclosure, the phrase “disabled from moving” should be understood to mean at least that the engine of the motor vehicle may or may not be running and, if the engine is running, that one or more actuators are preventing the motor vehicle from moving in the forward or reverse direction. In some embodiments, for example, an engine ignition switch in the “off” position means that the motor vehicle is disabled from moving, and the processor 14 may be operable at step 806 under such conditions to set mode=GA. In other example embodiments, an engine ignition switch in the “run” or “on” position means that the engine is running, and the processor 14 may be then operable at step 806 under such conditions to determine the status of one or more other vehicle operating parameters such as the transmission selection lever, the vehicle brakes and/or vehicle road speed. In some such embodiments, the processor 14 may be operable at step 806 when the engine is running to set mode=GA if, and as long as, the transmission selection lever is in “park” or otherwise not in a selectable gear (e.g., in the case of a manual transmission) and/or the vehicle brakes are engaged and/or the vehicle speed is zero. The phrase “enabled to move,” on the other hand, should be understood to mean at least that the engine of the motor vehicle has been started, and in some embodiments the processor 14 may be operable at step 806 under conditions in which the engine ignition switch is in the “run” or “on” position to set mode=OIA. In some embodiments in which the processor or controller 14 has determined that the engine has been started, the processor 14 may then be further operable at step 806 to determine the status of at least one other vehicle operating parameter such as the transmission selection lever, the vehicle brakes or vehicle road speed. 
In some such embodiments, the processor 14 may be operable at step 806 when the engine is running to set mode=OIA if, and as long as, a drive gear (forward or reverse) of the motor vehicle transmission has been selected, and/or the vehicle brakes are disengaged and/or vehicle speed is greater than zero. Those skilled in the art will recognize other vehicle operating parameters which may be used alone, in combination with one or more of the above-described vehicle operating parameters and/or in combination with other vehicle operating parameters to determine when and whether the motor vehicle is disabled from moving or enabled to move, and it will be understood that any such other vehicle operating parameters are intended to fall within the scope of this disclosure. Moreover, those skilled in the art will recognize other vehicle operating conditions conducive to gesture access mode of operation or in which gesture access mode may be safely executed, and it will be understood that the processor or controller 14 may be alternatively configured to set mode=GA at step 806 according to any such other vehicle operating conditions. Further still, those skilled in the art will recognize other vehicle operating conditions conducive to object impact avoidance mode of operation or in which object impact avoidance mode may be safely executed, and it will be understood that the processor or controller 14 may be alternatively configured to set mode=OIA at step 806 according to any such other vehicle operating conditions. It will be appreciated that configuring the processor or controller 14 to set mode=GA or OIA based on any such other vehicle operating conditions will involve only mechanical steps for a skilled programmer.
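One possible encoding of the step 806 mode determination, under the example conditions described above and choosing one reading of the “and/or” combinations, is sketched below; the parameter set is one illustrative combination of the monitored sensors and switches.

```python
def determine_mode(ignition_on, drive_gear_selected, brakes_engaged, road_speed):
    """Step 806 sketch: GA when the vehicle is disabled from moving,
    OIA when it is enabled to move."""
    if not ignition_on:
        return "GA"    # ignition off: vehicle disabled from moving
    if not drive_gear_selected and brakes_engaged and road_speed == 0:
        return "GA"    # engine running but held stationary (e.g., in park)
    return "OIA"       # drive gear selected, brakes released, or moving
```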
If, at step 806, the processor or controller 14 has set mode=GA, the process 800 advances to step 808 to execute a GA control process. In some embodiments, the GA control process may be the process 700 illustrated in
If, at step 806, the processor or controller 14 has set mode=OIA, the process 800 advances to step 810 to execute an OIA control process. An example of one such OIA process is illustrated in
Referring now to
The process 900 illustratively begins at step 902 where the processor or controller 14 is operable to determine whether a Key Fob signal has been detected. Illustratively, the processor or controller 14 is operable to execute step 902 as described above with respect to step 702 of the process 700. Thus, the process 900 advances along the “YES” branch of step 902 only if the received Key Fob signal matches at least one stored Key Fob code, such that the process 900 proceeds from step 902 only for authorized users, i.e., only for users carrying a Key Fob 20 that is recognizable by the object detection system 10. It will be understood that some embodiments of the process 900 may not include step 902, and in such embodiments the process 900 begins at steps 904 and 906.
Following the “YES” branch of step 902 (in embodiments which include step 902), the process 900 advances to steps 904 and 906. At step 904, the processor 14 is illustratively operable to execute a GA control process. In some embodiments, the GA control process may be the process 700 illustrated in
At step 906, the processor or controller 14 is operable to determine, e.g., by monitoring the engine ignition switch included in the vehicle sensors/switches 50, whether the engine ignition status IGN is “on” or “running.” If not, the process 900 loops back to the beginning of step 906. Thus, as long as the engine of the motor vehicle is not running, the processor or controller 14 will continue to execute the GA control process at step 904. If, however, the processor or controller 14 determines at step 906 that the engine ignition status IGN is “on” or “running,” thus indicating that the engine of the motor vehicle has been started and is running, the process 900 advances to step 908 where the processor or controller 14 is operable to monitor one or more vehicle sensors and/or switches. Thereafter at step 910, the processor or controller 14 is operable to compare the signal(s) and/or state(s) of the monitored vehicle sensor(s) and/or switch(es) to gesture access (GA) and/or object detection (OD) conditions, and thereafter at step 912 the processor or controller 14 is operable to determine a mode as either gesture access (GA) or object impact avoidance (OIA) based on the comparison. Illustratively, the processor or controller 14 is operable to execute steps 908-912 as described above with respect to step 806 of the process 800.
Following step 912, the processor or controller 14 is illustratively operable to determine whether the mode determined at step 912 is GA or OIA. If GA, the process 900 loops back to the beginning of steps 904 and 906. Thus, with the engine running, as long as the vehicle operating parameters correspond to gesture access operating conditions, the processor or controller 14 will continue to execute the GA control process at step 904. However, if the processor or controller 14 determines at step 914 that the mode determined at step 912 is OIA, the process 900 advances to step 916 where the processor or controller 14 is operable to suspend execution of the GA control process executing at step 904 and to execute an object impact avoidance control process beginning at step 918.
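The supervisory logic of steps 906-916 reduces to a small decision sketch; the returned strings are illustrative labels only.

```python
def next_action(ignition_on, mode):
    """Steps 906-916 sketch: with the ignition off, the GA process keeps
    running; once the engine is running, GA continues only while the
    determined mode is 'GA', otherwise GA is suspended and OIA begins."""
    if not ignition_on:
        return "run GA"               # step 906 loops; GA continues at step 904
    if mode == "GA":
        return "run GA"               # step 914 loops back to steps 904 and 906
    return "suspend GA, run OIA"      # step 916, then the OIA process at step 918
```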
At step 918, the processor or controller 14 is operable to monitor the object detection assembly; more specifically, to monitor the radiation emission and detection assembly 130 of the respective object detection module 122, 124 for object detection signals produced thereby, if any. Thereafter at step 920, the processor or controller 14 is operable to compare the object detection signal(s) produced by the assembly 130 to one or more object detection parameters (ODP) stored in the memory 16 (and/or stored in the memory 28, 44 or 64). In some embodiments, for example, the one or more stored ODPs is/are satisfied by an object detected anywhere within the distance D2 of the radiation emission and detection assembly 130 as illustrated in
Following step 920, the processor or controller 14 is operable at step 922 to determine whether the one or more stored ODPs has/have been satisfied. If so, the process 900 advances to step 924 where the processor or controller 14 is operable to control one or more of the actuator driver circuits 40 to control one or more corresponding actuators 48 to activate one or more corresponding object avoidance devices, mechanisms and/or systems 50 of the motor vehicle. Examples of such object avoidance devices, mechanisms and/or systems 50 may include, but are not limited to, one or more electronically controllable motor vehicle access closure latches or latching systems, an automatic (i.e., electronically controllable) engine ignition system, an automatic (i.e., electronically controllable) motor vehicle braking system, an automatic (i.e., electronically controllable) motor vehicle steering system, an automated (i.e., electronically controllable) motor vehicle driving system (e.g., “self-driving” or “autonomous driving” system), and the like. Thus, depending upon the location of the object detection module 12 on and relative to the motor vehicle, the processor or controller 14 may execute step 924 by locking one or more electronically controllable access closure latches or latching systems, by automatically turning off the engine ignition system, by activating an electrically controllable motor vehicle braking system to automatically apply braking force to stop or slow the motor vehicle, by controlling an automatic steering system so as to avoid impact with the detected object and/or by controlling an automated vehicle driving system so as to avoid impact with the detected object. 
Those skilled in the art will recognize other object impact avoidance devices, mechanisms and/or systems which may be controlled at step 924 to avoid or mitigate impact with the detected object, and it will be understood that any such other object impact avoidance devices, mechanisms and/or systems are intended to fall within the scope of this disclosure. In any case, the process 900 illustratively loops from step 924 back to the beginning of step 918 so that the processor or controller 14 continues to execute the object impact avoidance control process of steps 918-924 as long as the one or more stored ODP conditions continue to be satisfied.
In some embodiments, the processor or controller 14 may be additionally operable at step 924 to control one or more audio and/or illumination driver circuits 60 to activate one or more corresponding audio devices and/or illumination devices 66. Examples of the one or more audio devices 66 which the processor or controller 14 may activate at step 924 may include, but are not limited to, a vehicle horn, one or more electronically controllable audible warning devices, e.g., in the form of one or more predefined alarm sounds, sequences or the like, one or more electronically controllable audio notification devices or systems, one or more electronically controllable audio voice messaging devices or systems, or the like. Examples of the one or more illumination devices 66 which the processor or controller 14 may activate at step 924 may include, but are not limited to, one or more electronically controllable visible warning devices, one or more exterior vehicle lights, one or more interior vehicle lights, or the like.
If, at step 922, the processor or controller 14 determines that the one or more stored ODPs is/are not, or no longer, satisfied, the process 900 advances to step 926 where the processor or controller 14 is operable to control the one or more actuator driver circuits 40 to reset the corresponding one or more actuators 48 activated at step 924. If, at step 924, the processor or controller 14 activated one or more audible and/or illumination devices 66, the processor or controller 14 is further operable at step 926 to reset or deactivate such one or more activated audible and/or illumination devices 66. Following step 926, the process 900 loops back to steps 904 and 906 where the processor or controller 14 is operable at step 904 to again execute the GA control process and at steps 906-914 to determine whether to continue to execute the GA control process or whether to again suspend the GA process and execute the OIA process of steps 918-924. It will be understood that if step 924 has not yet been executed prior to determining at step 922 that the ODPs is/are not satisfied, step 926 may be bypassed and the process 900 may proceed directly from the “NO” branch of step 922 to steps 904 and 906.
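The OIA detect/activate/reset cycle of steps 918-926, including the bypass of step 926 when no actuator was activated, might be sketched as follows; a distance of `None` stands in for “no object detected,” which is an assumption of this sketch rather than part of the disclosure.

```python
def run_oia(detected_distances, odp_threshold):
    """Steps 918-926 sketch: for each monitored detection sample, activate
    the avoidance actuators while an object lies within the ODP threshold
    distance, reset them once the ODP is no longer satisfied, and bypass
    the reset step when nothing was activated."""
    actions, active = [], False
    for d in detected_distances:
        satisfied = d is not None and d <= odp_threshold
        if satisfied:
            active = True
            actions.append("activate")   # step 924
        elif active:
            active = False
            actions.append("reset")      # step 926
        else:
            actions.append("idle")       # bypass step 926, back to steps 904/906
    return actions
```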
In some embodiments of the process 800 illustrated in
In a first example, a gesture access system for a motor vehicle may comprise at least one radiation transmitter configured to be mounted to the motor vehicle and, when activated, to emit radiation outwardly away from the motor vehicle, at least one radiation receiver configured to be mounted to the motor vehicle and to produce radiation detection signals, the radiation detection signals including reflected radiation signals if the emitted radiation is reflected by an object toward and detected by the at least one radiation receiver, at least one illumination device configured to be mounted to the motor vehicle and, when activated, to produce light visible from outside the motor vehicle, at least one processor operatively coupled to the at least one radiation transmitter, to the at least one radiation receiver and to the at least one illumination device, and at least one memory having instructions stored therein which, when executed by the at least one processor, cause the at least one processor to activate the at least one radiation transmitter and to process the radiation detection signals to: determine whether an object is within a sensing region of the at least one radiation receiver, activate the at least one illumination device according to a first illumination scheme if the object is determined to be within the sensing region, determine whether the object within the sensing region exhibits a predefined gesture, and if the object within the sensing region is determined to exhibit the predefined gesture, activate the at least one illumination device according to a second illumination scheme different from the first illumination scheme, and control at least one actuator associated with an access closure of the motor vehicle to at least one of unlock the access closure from a locked condition, lock the access closure from an unlocked condition, open the access closure from a closed position and close the access closure from an open position.
A second example includes the subject matter of the first example, and wherein the instructions stored in the at least one memory may further include instructions which, when executed by the at least one processor, cause the at least one processor to: determine a sub-region of the sensing region occupied by the object if the object is determined to be within the sensing region, and activate the at least one illumination device according to a third illumination scheme, different from the first and second illumination schemes, if the sub-region occupied by the object is too small to allow determination of whether the object within the sensing region exhibits the predefined gesture.
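The three-way scheme selection of the second example might reduce to a small decision function; the cell-count threshold used here to model "too small to allow determination" is an assumed implementation detail:

```python
def select_scheme(occupied_cells: int, min_cells: int, gesture_matched: bool) -> int:
    """Pick an illumination scheme (1, 2, or 3) per the second example."""
    if occupied_cells < min_cells:
        return 3  # third scheme: occupied sub-region too small to classify a gesture
    return 2 if gesture_matched else 1  # second scheme on a gesture, else first
```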
A third example includes the subject matter of the first example or the second example, and wherein the at least one radiation transmitter may be configured to be mounted to the motor vehicle separately and remotely from the at least one radiation receiver.
A fourth example includes the subject matter of the first example or the second example, and wherein the at least one radiation transmitter and the at least one radiation receiver may together comprise a radiation emission and detection assembly configured to be mounted to the motor vehicle.
A fifth example includes the subject matter of any of the first example through the fourth example, and wherein the instructions stored in the at least one memory may further include instructions which, when executed by the at least one processor, cause the at least one processor to activate at least one of one or more auxiliary illumination devices and one or more audio devices on or within the motor vehicle if the object within the sensing region is determined to exhibit the predefined gesture.
A sixth example includes the subject matter of any of the first example through the fifth example, and wherein the at least one memory may have a key fob code stored therein, and wherein the instructions stored in the at least one memory may further include instructions which, when executed by the at least one processor, cause the at least one processor to receive a key fob signal wirelessly transmitted by a key fob within a key fob signal detection area of the motor vehicle, to determine a code based on the received key fob signal, and to activate the at least one radiation transmitter and process the radiation detection signals only if the determined code matches the stored key fob code.
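The key fob gate of the sixth example amounts to comparing a code derived from the received fob signal against the stored code before any sensing is activated. A minimal sketch, assuming byte-string codes (the function name is illustrative):

```python
import hmac

def may_activate_transmitter(received_code: bytes, stored_code: bytes) -> bool:
    """Gate radiation-transmitter activation on a key fob code match.

    hmac.compare_digest performs a timing-safe comparison, a reasonable
    precaution for an access-control check even in this simplified form.
    """
    return hmac.compare_digest(received_code, stored_code)
```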
A seventh example includes the subject matter of any of the first example through the sixth example, and wherein the at least one memory further may have at least a first vehicle access condition value stored therein corresponding to a first predefined gesture, and wherein the instructions stored in the at least one memory may further include instructions which, when executed by the at least one processor, cause the at least one processor to determine that the object within the sensing region exhibits the predefined gesture if the processed radiation detection signals match the at least the first vehicle access condition value stored in the at least one memory.
An eighth example includes the subject matter of the seventh example, and wherein the first vehicle access condition value may be associated in the at least one memory with a first key fob code, and the at least one memory may further have at least a second vehicle access condition value stored therein corresponding to a second predefined gesture and the second vehicle access condition value is associated in the at least one memory with a second key fob code different from the first key fob code, and wherein the instructions stored in the at least one memory may further include instructions which, when executed by the at least one processor, cause the at least one processor to receive a key fob signal wirelessly transmitted by a key fob within a key fob signal detection area of the motor vehicle, to determine a code based on the received key fob signal, and to determine that the object within the sensing region exhibits the predefined gesture if the processed radiation detection signals match the at least the stored first vehicle access condition value and the determined code matches the stored first key fob code or if the processed radiation detection signals match the at least the stored second vehicle access condition value and the determined code matches the stored second key fob code.
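The eighth example pairs each stored key fob code with its own vehicle access condition value, so a gesture only counts when performed by the holder of the matching fob. A hypothetical lookup (all names and values here are illustrative assumptions):

```python
# Assumed association of key fob codes with vehicle access condition values
FOB_GESTURES = {
    b"fob-code-1": "gesture-value-1",
    b"fob-code-2": "gesture-value-2",
}

def gesture_authorized(fob_code: bytes, matched_gesture_value: str) -> bool:
    """True only if the recognized gesture is the one paired with this fob's code."""
    return FOB_GESTURES.get(fob_code) == matched_gesture_value
```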
A ninth example includes the subject matter of any of the first example through the eighth example, and wherein the at least one illumination device may comprise at least one multi-color LED, and wherein the instructions stored in the at least one memory may further include instructions which, when executed by the at least one processor, cause the at least one processor to activate the at least one illumination device according to the first illumination scheme by controlling the at least one multi-color LED to emit visible light of a first color, and to activate the at least one illumination device according to the second illumination scheme by controlling the at least one multi-color LED to emit visible light of a second color different from the first color.
A tenth example includes the subject matter of the ninth example, and wherein the instructions stored in the at least one memory may further include instructions which, when executed by the at least one processor, cause the at least one processor to activate the at least one illumination device according to the third illumination scheme by controlling the at least one multi-color LED to emit visible light of a third color different from the first and second colors.
An eleventh example includes the subject matter of any of the first example through the tenth example, and wherein the instructions stored in the at least one memory may further include instructions which, when executed by the at least one processor, cause the at least one processor to activate the at least one illumination device according to the first illumination scheme by controlling the at least one illumination device to switch on and off with at least one of a first frequency and a first duty cycle, and to activate the at least one illumination device according to the second illumination scheme by controlling the at least one illumination device to switch on and off with at least one of a second frequency different from the first frequency and a second duty cycle different from the first duty cycle.
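The frequency/duty-cycle schemes of the eleventh example can be modeled as a periodic on/off function of time; distinct schemes then differ only in the `(frequency, duty_cycle)` pair passed in. A sketch under those assumptions:

```python
def led_on(t_seconds: float, frequency_hz: float, duty_cycle: float) -> bool:
    """On/off state at time t for a blink scheme defined by frequency and duty cycle."""
    period = 1.0 / frequency_hz
    # On for the first duty_cycle fraction of each period, off for the remainder
    return (t_seconds % period) < duty_cycle * period
```

With, say, 2 Hz at 50% duty for the first scheme and 5 Hz at 20% for the second, the two states are visually distinguishable to a user approaching the vehicle.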
A twelfth example includes the subject matter of the eleventh example, and wherein the instructions stored in the at least one memory may further include instructions which, when executed by the at least one processor, cause the at least one processor to activate the at least one illumination device according to the third illumination scheme by controlling the at least one illumination device to switch on and off with at least one of a third frequency different from the first and second frequencies and a third duty cycle different from the first and second duty cycles.
A thirteenth example includes the subject matter of any of the first example through the twelfth example, and wherein the at least one illumination device may comprise a plurality of illumination devices, and wherein the instructions stored in the at least one memory may further include instructions which, when executed by the at least one processor, cause the at least one processor to activate the at least one illumination device according to the first illumination scheme by controlling at least a first one of the plurality of illumination devices to illuminate, and to activate the at least one illumination device according to the second illumination scheme by controlling at least a second one of the plurality of illumination devices, different from the at least the first one of the plurality of illumination devices, to illuminate.
A fourteenth example includes the subject matter of the thirteenth example, and wherein the instructions stored in the at least one memory may further include instructions which, when executed by the at least one processor, cause the at least one processor to activate the at least one illumination device according to the third illumination scheme by controlling at least a third one of the plurality of illumination devices, different from the at least the first one of the plurality of illumination devices and from the at least the second one of the plurality of illumination devices, to illuminate.
A fifteenth example includes the subject matter of any of the first example through the eighth example, and wherein the at least one illumination device may comprise a plurality of illumination devices each configured to selectively emit visible light in any of a plurality of colors, and wherein the instructions stored in the at least one memory may further include instructions which, when executed by the at least one processor, cause the at least one processor to activate the at least one illumination device according to the first illumination scheme by controlling one or more of the plurality of illumination devices to emit visible light of a first one of the plurality of colors, and to activate the at least one illumination device according to the second illumination scheme by controlling one or more of the plurality of illumination devices to emit visible light of a second one of the plurality of colors different from the first one of the plurality of colors.
A sixteenth example includes the subject matter of the fifteenth example, and wherein the instructions stored in the at least one memory may further include instructions which, when executed by the at least one processor, cause the at least one processor to activate the at least one illumination device according to the third illumination scheme by controlling one or more of the plurality of illumination devices to emit visible light of a third one of the plurality of colors different from the first one of the plurality of colors and from the second one of the plurality of colors.
A seventeenth example includes the subject matter of any of the first example through the eighth example, and wherein the at least one illumination device may comprise a plurality of illumination devices each configured to selectively emit visible light, and wherein the instructions stored in the at least one memory may further include instructions which, when executed by the at least one processor, cause the at least one processor to activate the at least one illumination device according to the first illumination scheme by controlling one or more of the plurality of illumination devices to switch on and off with at least one of a first frequency and a first duty cycle, and to activate the at least one illumination device according to the second illumination scheme by controlling one or more of the plurality of illumination devices to switch on and off with at least one of a second frequency different from the first frequency and a second duty cycle different from the first duty cycle.
An eighteenth example includes the subject matter of the seventeenth example, and wherein the instructions stored in the at least one memory may further include instructions which, when executed by the at least one processor, cause the at least one processor to activate the at least one illumination device according to the third illumination scheme by controlling one or more of the plurality of illumination devices to switch on and off with at least one of a third frequency different from the first and second frequencies and a third duty cycle different from the first and second duty cycles.
A nineteenth example includes the subject matter of any of the first example through the eighteenth example, and wherein the at least one illumination device may comprise two or more illumination devices spaced apart at least partially across the sensing region, and wherein the instructions stored in the at least one memory may include instructions which, when executed by the at least one processor, cause the at least one processor to activate according to at least one of the first, second and third illumination schemes at least one of the two or more illumination devices aligned with the portion of the sensing region occupied by the object.
A twentieth example includes the subject matter of the nineteenth example, and wherein the at least one radiation receiver may comprise two or more radiation sensors or receivers spaced apart at least partially across the sensing region, each of the two or more radiation sensors aligned with a corresponding one of the two or more illumination devices.
A twenty first example includes the subject matter of any of the first example through the twentieth example, and wherein the at least one radiation transmitter may comprise a plurality of infrared LEDs for emitting the radiation in the form of infrared radiation, and wherein the at least one radiation receiver may comprise a plurality of infrared radiation sensors.
A twenty second example includes the subject matter of the twenty first example, and wherein the plurality of infrared LEDs may be arranged as an array of the plurality of infrared LEDs, and wherein the plurality of infrared radiation sensors may be arranged as an array of the plurality of infrared radiation sensors.
A twenty third example includes the subject matter of the twenty second example, and wherein the array of infrared LEDs may be arranged to align with the array of infrared radiation sensors such that each infrared LED in the array of infrared LEDs is positioned adjacent to a corresponding one of the infrared radiation sensors in the array of infrared radiation sensors.
A twenty fourth example includes the subject matter of the twenty second example or the twenty third example, and wherein the at least one illumination device may comprise a plurality of illumination devices arranged as an array of the plurality of illumination devices, and wherein the array of illumination devices may be arranged to align with the array of infrared radiation sensors such that each illumination device in the array of illumination devices is positioned adjacent to a corresponding one of the infrared radiation sensors in the array of infrared radiation sensors.
A twenty fifth example includes the subject matter of any of the first example through the twentieth example, and wherein the at least one radiation transmitter may comprise at least one radar transmitter configured to emit radar signals when activated, and wherein the at least one radiation receiver may comprise at least one radar receiver configured to detect reflected radar signals and to produce the radiation detection signals.
A twenty sixth example includes the subject matter of the twenty fifth example, and wherein the at least one radar receiver may comprise two or more radar receivers spaced apart at least partially across the sensing region.
A twenty seventh example includes the subject matter of the twenty fifth example or the twenty sixth example, and wherein the at least one illumination device may comprise two or more illumination devices spaced apart at least partially across the sensing region.
A twenty eighth example includes the subject matter of any of the first example through the twenty seventh example, and wherein the system may further comprise a housing for mounting to at least a portion of the motor vehicle, and wherein at least one of the at least one radiation transmitter and the at least one radiation receiver may be mounted to or within the housing, and wherein the at least one illumination device may be mounted to or within the housing, and wherein the access closure of the motor vehicle may comprise one of a front, rear and side access closure of the motor vehicle.
A twenty ninth example includes the subject matter of any of the first example through the twenty seventh example, and wherein the system may further comprise a circuit substrate for mounting to at least a portion of the motor vehicle, and wherein at least one of the at least one radiation transmitter and the at least one radiation receiver may be operatively mounted to the circuit substrate, and wherein the access closure of the motor vehicle may comprise one of a front, rear and side access closure of the motor vehicle.
A thirtieth example includes the subject matter of the twenty ninth example, and wherein the at least one illumination device may be operatively mounted to the circuit substrate.
A thirty first example includes the subject matter of the twenty ninth example, and wherein the circuit substrate may comprise a first circuit substrate mounted to at least a first portion of the motor vehicle, and further may comprise a second circuit substrate for mounting to at least a second portion of the motor vehicle proximate to or remote from the first portion of the motor vehicle.
A thirty second example includes the subject matter of any of the first example through the twenty seventh example, and wherein the system may further comprise a license plate bracket having a housing for mounting to the motor vehicle and supporting a license plate against the motor vehicle, and wherein the at least one radiation transmitter and the at least one radiation receiver may be mounted to or within the housing, and wherein the at least one illumination device may be mounted to or within the housing, and wherein the access closure of the motor vehicle may comprise a rear access closure of the motor vehicle.
A thirty third example includes the subject matter of the thirty second example, and wherein the at least one actuator may comprise at least one of a latch for releasably securing the rear access closure in a closed position, a locking device for locking and unlocking the rear access closure in its closed position and at least one motor for opening and closing the rear access closure.
A thirty fourth example includes the subject matter of the thirty second example or the thirty third example, and wherein the rear access closure may be one of a rear hatch door and a trunk lid of the motor vehicle.
A thirty fifth example includes the subject matter of any of the first example through the twenty seventh example, and wherein the access closure may comprise an access door of the motor vehicle, and wherein the system may further comprise a handle assembly mountable to the access door, the handle assembly including a housing, and wherein the at least one radiation transmitter and the at least one radiation receiver may be mounted to or within the housing, and wherein the at least one illumination device may be mounted to or within the housing.
In a thirty sixth example, a gesture access system for a motor vehicle may comprise a housing configured to be mounted to a motor vehicle adjacent to a first door of the motor vehicle and aligned with a vertically-oriented seam defined between the first door and one of a second door of the motor vehicle adjacent to the first door and a stationary exterior member of the motor vehicle adjacent to the first door, the housing recessed within the motor vehicle relative to an outer surface of the first door, a radiation assembly carried by the housing, the radiation assembly including at least one radiation transmitter configured, when activated, to emit radiation outwardly through the vertically-oriented seam, and at least one radiation receiver configured to produce radiation detection signals, the radiation detection signals including reflected radiation signals if the emitted radiation is reflected by an object back inwardly through the vertically-oriented seam and detected by the at least one radiation receiver, at least one processor operatively connected to the radiation assembly, and at least one memory having instructions stored therein which, when executed by the at least one processor, cause the at least one processor to activate the at least one radiation transmitter and to process the radiation detection signals to: determine whether an object is within a sensing region of the radiation assembly opposite the vertically-oriented seam and, if so, whether the object exhibits a predefined gesture while within the sensing region, and if the object exhibits the predefined gesture while within the sensing region of the radiation assembly, control at least one actuator associated with the first door to at least one of unlock the first door from a locked condition, lock the first door from an unlocked condition and at least partially open the first door from a closed position.
A thirty seventh example includes the subject matter of the thirty sixth example, and wherein the at least one radiation transmitter may comprise an array of infrared LEDs each configured to emit infrared radiation when activated, and wherein the at least one radiation receiver may comprise an array of infrared radiation sensors each configured to detect reflected infrared radiation and produce corresponding radiation signals, and wherein the array of infrared LEDs and the array of infrared radiation sensors may each be arranged vertically relative to the housing and aligned with the vertically-oriented seam.
A thirty eighth example includes the subject matter of the thirty sixth example, and wherein the at least one radiation transmitter may comprise at least one radar transmitter configured to emit radar signals when activated, and wherein the at least one radiation receiver may comprise at least one radar receiver configured to detect reflected radar signals and to produce the radiation detection signals, and wherein the at least one radar transmitter and the at least one radar receiver may each be arranged relative to the housing to be aligned with the vertically-oriented seam.
A thirty ninth example includes the subject matter of any of the thirty sixth example through the thirty eighth example, and wherein the system may further comprise a recess or pocket provided along an inside edge of the first door, the recess or pocket dimensioned to receive two or more fingers of a human hand in order to facilitate opening the first door, and wherein the at least one processor may be operable to control the at least one actuator associated with the first door to at least partially open the first door sufficiently to allow the two or more fingers of a human hand to access and engage the recess or pocket.
A fortieth example includes the subject matter of any of the thirty sixth example through the thirty ninth example, and wherein the system may further comprise at least one illumination device configured to produce visible light, the at least one illumination device mounted to or within the housing and arranged relative to the housing to emit the visible light outwardly away from the motor vehicle through the vertically-oriented seam, and wherein the instructions stored in the at least one memory may further include instructions which, when executed by the at least one processor, cause the at least one processor to activate the at least one illumination device according to a first illumination scheme when the object is within the sensing region of the radiation assembly and to activate the at least one illumination device according to a second illumination scheme, different from the first illumination scheme, if the object exhibits the predefined gesture while within the sensing region of the radiation assembly.
A forty first example includes the subject matter of any of the thirty sixth example through the fortieth example, and wherein the at least one memory may have a key fob code stored therein, and wherein the instructions stored in the at least one memory may further include instructions which, when executed by the at least one processor, cause the at least one processor to receive a key fob signal wirelessly transmitted by a key fob within a key fob signal detection area of the motor vehicle, to determine a code based on the received key fob signal, and to activate the at least one radiation transmitter and process the radiation detection signals only if the determined code matches the stored key fob code.
In a forty second example, a gesture access and object impact avoidance system for a motor vehicle may comprise at least one radar signal transmitter configured to be mounted to the motor vehicle and, when activated, to emit radar signals, at least one radar signal receiver configured to be mounted to the motor vehicle and to produce radar detection signals, the radar detection signals including at least one reflected radar signal if at least one of the emitted radar signals is reflected by an object toward and detected by the at least one radar signal receiver, at least one processor operatively connected to the at least one radar signal transmitter and to the at least one radar signal receiver, and configured to activate the at least one radar signal transmitter, and at least one memory having instructions stored therein which, when executed by the at least one processor, cause the at least one processor to: monitor at least one vehicle operating parameter signal produced by at least one vehicle operating parameter sensor or switch, if the monitored at least one vehicle operating parameter signal satisfies a first vehicle operating condition, operate in a gesture access mode by monitoring the radar detection signals to determine whether an object is within a sensing region of the at least one radar signal receiver and, if so, controlling at least one actuator associated with an access closure of the motor vehicle to lock, unlock, open or close the access closure if the object within the sensing region exhibits a predefined gesture, and if the monitored at least one vehicle operating parameter signal satisfies a second vehicle operating condition different from the first vehicle operating condition, operate in an object impact avoidance mode by monitoring the radar detection signals to determine whether an object is within a predefined distance of the at least one radar signal receiver and, if so, at least one of activating at least one warning device and controlling at least one actuator associated with at least one impact avoidance device of the motor vehicle.
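The mode arbitration of the forty second example keys off monitored vehicle operating parameters. The specific parameters below (speed and a park/not-park state) are assumed for illustration; the disclosure leaves the operating conditions general:

```python
def select_mode(speed_kph: float, in_park: bool) -> str:
    """Hypothetical mode arbitration: which consumer gets the radar detection signals."""
    # First vehicle operating condition (assumed): parked and stationary,
    # so offering gesture access at the access closure is appropriate.
    if in_park and speed_kph == 0.0:
        return "gesture_access"
    # Second vehicle operating condition (assumed): the vehicle is in motion,
    # so the same radar hardware is repurposed for impact avoidance.
    return "impact_avoidance"
```

Sharing one radar transceiver between the two modes is the point of this example: the same detection signals feed either the gesture classifier or the proximity warning, never both at once.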
A forty third example includes the subject matter of the forty second example, and wherein the at least one radar signal transmitter and the at least one radar signal receiver may be provided together in the form of a radar signal transceiver module configured to be mounted to the motor vehicle.
A forty fourth example includes the subject matter of the forty second example or the forty third example, and wherein the system may further comprise a housing configured to be mounted to the motor vehicle, and wherein the at least one radar signal transmitter and the at least one radar signal receiver may be mounted together to or within the housing.
A forty fifth example includes the subject matter of any of the forty second example through the forty fourth example, and wherein the at least one radar signal receiver may comprise a plurality of radar signal receivers spaced apart at least partially across the sensing region.
A forty sixth example includes the subject matter of any of the forty second example through the forty fifth example, and wherein the system may further comprise at least one illumination device configured to be mounted to the motor vehicle and, when activated, to produce visible light, and wherein the instructions stored in the at least one memory may further include instructions which, when executed by the at least one processor, cause the at least one processor, when operating in the gesture access mode, to activate the at least one illumination device according to a first illumination scheme if the object is determined to be within the sensing region of the radar signal receiver, and to activate the at least one illumination device according to a second illumination scheme, different from the first illumination scheme, if the object within the sensing region exhibits the predefined gesture.
A forty seventh example includes the subject matter of the forty sixth example, and wherein the instructions stored in the at least one memory may further include instructions which, when executed by the at least one processor, cause the at least one processor, when operating in the gesture access mode, to process the radar detection signals to determine a sub-region of the sensing region occupied by the object if the object is determined to be within the sensing region of the at least one radar signal receiver, and to activate the at least one illumination device according to a third illumination scheme, different from the first illumination scheme and the second illumination scheme, if the sub-region occupied by the object is too small to allow determination of whether the object within the sensing region exhibits the predefined gesture.
A forty eighth example includes the subject matter of the forty sixth example or the forty seventh example, and wherein at least one illumination device may be configured to produce the visible light in each of at least first and second different colors, and wherein the instructions stored in the at least one memory may further include instructions which, when executed by the at least one processor, cause the at least one processor to activate the at least one illumination device according to the first and second illumination schemes by controlling the at least one illumination device to produce the visible light in the first and second respective colors.
A forty ninth example includes the subject matter of the forty eighth example, and wherein at least one illumination device may be further configured to produce the visible light in a third color different from the first and second colors, and wherein the instructions stored in the at least one memory may further include instructions which, when executed by the at least one processor, cause the at least one processor to activate the at least one illumination device according to the third illumination scheme by controlling the at least one illumination device to produce the visible light in the third color.
A fiftieth example includes the subject matter of any of the forty sixth example through the forty ninth example, and wherein the instructions stored in the at least one memory may further include instructions which, when executed by the at least one processor, cause the at least one processor, when operating in the gesture access mode, to activate the at least one illumination device according to the first illumination scheme by controlling the at least one illumination device to switch on and off with at least one of a first frequency and a first duty cycle, and to activate the at least one illumination device according to the second illumination scheme by controlling the at least one illumination device to switch on and off with at least one of a second frequency and a second duty cycle, the second frequency different from the first frequency and the second duty cycle different from the first duty cycle.
A fifty first example includes the subject matter of the fiftieth example, and wherein the instructions stored in the at least one memory may further include instructions which, when executed by the at least one processor, cause the at least one processor, when operating in the gesture access mode, to activate the at least one illumination device according to the third illumination scheme by controlling the at least one illumination device to switch on and off with at least one of a third frequency and a third duty cycle, the third frequency different from the first and second frequencies and the third duty cycle different from the first and second duty cycles.
A fifty second example includes the subject matter of any of the forty sixth example through the fifty first example, and wherein the at least one illumination device may comprise a plurality of illumination devices spaced apart at least partially across the sensing region.
A fifty third example includes the subject matter of the fifty second example, and wherein the instructions stored in the at least one memory may further include instructions which, when executed by the at least one processor, cause the at least one processor, when operating in the gesture access mode, to activate the at least one illumination device according to the first illumination scheme by controlling at least a first one of the plurality of illumination devices to illuminate, and to activate the at least one illumination device according to the second illumination scheme by controlling at least a second one of the plurality of illumination devices, different from the first one of the plurality of illumination devices, to illuminate.
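The illumination feedback described in the preceding examples — distinct schemes distinguished by color, switching frequency, duty cycle, or which of several spaced-apart devices illuminates — can be sketched as a simple scheme-selection routine. All names here (`Scheme`, `select_scheme`, the particular colors and frequencies) are illustrative assumptions, not taken from the disclosure:

```python
# Hypothetical sketch of the illumination feedback of the forty sixth
# through fifty third examples: one scheme signals "object detected",
# a second signals "gesture recognized", and a third signals "object
# too small to evaluate". Colors, frequencies, and duty cycles are
# illustrative only.
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class Scheme:
    color: str          # visible-light color produced by the device
    blink_hz: float     # on/off switching frequency
    duty_cycle: float   # fraction of each period the light is on

# Three schemes differing in color, frequency, and duty cycle, as the
# forty ninth through fifty first examples permit.
OBJECT_DETECTED  = Scheme("amber", 1.0, 0.50)   # first scheme
GESTURE_MATCHED  = Scheme("green", 4.0, 0.25)   # second scheme
OBJECT_TOO_SMALL = Scheme("red",   2.0, 0.75)   # third scheme

def select_scheme(in_region: bool, gesture: bool,
                  sub_region_too_small: bool) -> Optional[Scheme]:
    """Pick the feedback scheme for the current radar-derived state."""
    if not in_region:
        return None                      # nothing to signal
    if sub_region_too_small:
        return OBJECT_TOO_SMALL          # third illumination scheme
    return GESTURE_MATCHED if gesture else OBJECT_DETECTED
```

Under this sketch, the same selection logic serves whether the schemes differ by color (forty ninth example), by blink frequency and duty cycle (fiftieth and fifty first examples), or by which of multiple devices is driven (fifty third example).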
A fifty fourth example includes the subject matter of any of the forty second example through the fifty third example, wherein the at least one memory may have a key fob code stored therein, and wherein the instructions stored in the at least one memory may further include instructions which, when executed by the at least one processor, cause the at least one processor to receive a key fob signal wirelessly transmitted by a key fob within a key fob signal detection area of the motor vehicle, to determine a code based on the received key fob signal, and to activate the at least one radar transmitter and process the radar detection signals only if the determined code matches the stored key fob code.
A fifty fifth example includes the subject matter of any of the forty second example through the fifty third example, and wherein the at least one memory may further have at least a first vehicle access condition value stored therein corresponding to a first predefined gesture, and wherein the instructions stored in the at least one memory may further include instructions which, when executed by the at least one processor, cause the at least one processor to determine that the object within the sensing region exhibits the predefined gesture if the processed radar detection signals match at least the first vehicle access condition value stored in the at least one memory.
A fifty sixth example includes the subject matter of the fifty fifth example, and wherein the first vehicle access condition value may be associated in the at least one memory with a first key fob code, and the at least one memory may further have at least a second vehicle access condition value stored therein corresponding to a second predefined gesture and the second vehicle access condition value may be associated in the at least one memory with a second key fob code different from the first key fob code, and wherein the instructions stored in the at least one memory may further include instructions which, when executed by the at least one processor, cause the at least one processor to receive a key fob signal wirelessly transmitted by a key fob within a key fob signal detection area of the motor vehicle, to determine a code based on the received key fob signal, and to determine that the object within the sensing region exhibits the predefined gesture if the processed radar signals match at least the stored first vehicle access condition value and the determined code matches the stored first key fob code or if the processed radar signals match at least the stored second vehicle access condition value and the determined code matches the stored second key fob code.
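The per-fob gesture association of the fifty fifth and fifty sixth examples — each stored key fob code paired with its own vehicle access condition value, so that a different predefined gesture grants access for each fob — can be illustrated as a lookup followed by a match test. The dictionary, function name, and the simple equality used as the "match" are assumptions for illustration only:

```python
# Illustrative sketch of the fifty fifth/fifty sixth examples: access is
# granted only when the code decoded from the received key fob signal is
# known AND the processed radar signals match the vehicle access
# condition value associated with that code. Gesture labels are
# stand-ins for the stored condition values.
STORED = {
    "FOB-A": "swipe-down",   # first key fob code -> first condition value
    "FOB-B": "swipe-left",   # second key fob code -> second condition value
}

def gesture_grants_access(received_code: str, processed_gesture: str) -> bool:
    """Return True only for a recognized fob whose associated
    predefined gesture matches the processed radar signals."""
    expected = STORED.get(received_code)
    return expected is not None and processed_gesture == expected
```

In a real implementation the match against the stored condition value would be a comparison of processed radar signatures rather than string equality; the structure of the decision is what this sketch is meant to show.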
A fifty seventh example includes the subject matter of any of the forty second example through the fifty sixth example, wherein the at least one warning device may comprise at least one of one or more illuminating devices and one or more audible sound producing devices.
A fifty eighth example includes the subject matter of any of the forty second example through the fifty seventh example, and wherein the at least one impact avoidance device of the motor vehicle may comprise at least one of an electronically controllable motor vehicle braking system, an electronically controllable motor vehicle steering system and an electronically controllable locking system for selectively locking at least one access closure of the motor vehicle.
A fifty ninth example includes the subject matter of any of the forty second example through the fifty eighth example, and wherein the at least one vehicle operating parameter sensor or switch may comprise at least one of an ignition switch, a transmission gear position sensor and a vehicle speed sensor.
In a sixtieth example, a gesture access and object impact avoidance system for a motor vehicle may comprise at least one radar signal transmitter configured to be mounted to the motor vehicle and, when activated, to emit radar signals, at least one radar signal receiver configured to be mounted to the motor vehicle and to produce radar detection signals, the radar detection signals including at least one reflected radar signal if at least one of the emitted radar signals is reflected by an object toward and detected by the at least one radar signal receiver, at least one processor operatively connected to the at least one radar transmitter and the at least one radar receiver, the at least one processor configured to activate the at least one radar signal transmitter and to be operable in either of (i) a gesture access mode to control an actuator associated with an access closure of the motor vehicle to lock, unlock, open or close the access closure if an object within a sensing region of the at least one radar signal receiver exhibits a predefined gesture, and (ii) an object impact avoidance mode to activate a warning device or control an actuator associated with an impact avoidance device of the motor vehicle if an object is within a predefined distance of the at least one radar signal receiver, and at least one memory having instructions stored therein which, when executed by the at least one processor, cause the at least one processor to operate in the gesture access mode if the motor vehicle is disabled from moving, and to operate in the object impact avoidance mode if the motor vehicle is moving or enabled to move.
A sixty first example includes the subject matter of the sixtieth example, and wherein the at least one memory may have a key fob code stored therein, and wherein the instructions stored in the at least one memory may further include instructions which, when executed by the at least one processor, cause the at least one processor to receive a key fob signal wirelessly transmitted by a key fob within a key fob signal detection area of the motor vehicle, to determine a code based on the received key fob signal, and to operate in the gesture access mode only if the determined code matches the stored key fob code.
A sixty second example includes the subject matter of the sixtieth example or the sixty first example, and wherein the at least one radar signal transmitter and the at least one radar signal receiver may be provided together in the form of a radar signal transceiver module configured to be mounted to the motor vehicle.
A sixty third example includes the subject matter of any of the sixtieth example through the sixty second example, wherein the system may further comprise a housing configured to be mounted to the motor vehicle, and wherein the at least one radar signal transmitter and the at least one radar signal receiver may be mounted together to or within the housing.
A sixty fourth example includes the subject matter of any of the sixtieth example through the sixty third example, and wherein the at least one radar signal receiver may comprise a plurality of radar signal receivers spaced apart at least partially across the sensing region.
A sixty fifth example includes the subject matter of any of the sixtieth example through the sixty fourth example, and wherein the system may further comprise at least one illumination device configured to be mounted to the motor vehicle and further configured, when activated, to produce visible light, and wherein the instructions stored in the at least one memory may further include instructions which, when executed by the at least one processor, cause the at least one processor, when operating in the gesture access mode, to activate the at least one illumination device according to a first illumination scheme if the object is determined to be within the sensing region of the at least one radar signal receiver, and to activate the at least one illumination device according to a second illumination scheme, different from the first illumination scheme, if the object within the sensing region exhibits the predefined gesture.
A sixty sixth example includes the subject matter of the sixty fifth example, and wherein the instructions stored in the at least one memory may further include instructions which, when executed by the at least one processor, cause the at least one processor, when operating in the gesture access mode, to process the radar detection signals to determine a sub-region of the sensing region occupied by the object if the object is determined to be within the sensing region of the at least one radar signal receiver, and to activate the at least one illumination device according to a third illumination scheme, different from the first illumination scheme and the second illumination scheme, if the sub-region occupied by the object is too small to allow determination of whether the object within the sensing region exhibits the predefined gesture.
A sixty seventh example includes the subject matter of the sixty fifth example or the sixty sixth example, and wherein the at least one illumination device may comprise a plurality of illumination devices spaced apart at least partially across the sensing region.
A sixty eighth example includes the subject matter of any of the sixtieth example through the sixty seventh example, and wherein the at least one warning device may comprise at least one of one or more illuminating devices and one or more audible sound producing devices.
A sixty ninth example includes the subject matter of any of the sixtieth example through the sixty eighth example, and wherein the at least one impact avoidance device of the motor vehicle may comprise at least one of an electronically controllable motor vehicle braking system, an electronically controllable motor vehicle steering system and an electronically controllable locking system for selectively locking at least one access closure of the motor vehicle.
A seventieth example includes the subject matter of any of the sixtieth example through the sixty ninth example, and wherein the instructions stored in the at least one memory may further include instructions which, when executed by the at least one processor, cause the at least one processor to monitor at least one vehicle operating parameter signal produced by at least one vehicle operating parameter sensor or switch, and to determine whether the motor vehicle is disabled from moving, is moving or is enabled to move based on the at least one vehicle operating parameter signal.
A seventy first example includes the subject matter of the seventieth example, and wherein the at least one vehicle operating parameter sensor or switch may comprise at least one of an ignition switch, a transmission gear position sensor and a vehicle speed sensor.
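The determination called for in the seventieth and seventy first examples — classifying the vehicle as disabled from moving, moving, or enabled to move from an ignition switch, a transmission gear position sensor, and a vehicle speed sensor — can be sketched as a short classifier. The signal names, gear labels, and the zero-speed threshold are illustrative assumptions:

```python
# Hedged sketch of the seventieth/seventy first examples: derive the
# vehicle movement state from the three example operating parameter
# sources named in the disclosure. Thresholds and labels are
# illustrative only.
def movement_state(ignition_on: bool, gear: str, speed_kph: float) -> str:
    """Classify the vehicle as "moving", "enabled", or "disabled"."""
    if speed_kph > 0.0:
        return "moving"                  # speed sensor reports motion
    if ignition_on and gear not in ("P", "N"):
        return "enabled"                 # stationary but able to move
    return "disabled"                    # safe to offer gesture access
```

Paired with the sixtieth example, "disabled" selects the gesture access mode while "moving" or "enabled" selects the object impact avoidance mode.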
In a seventy second example, a method is provided for processing reflected radar signals produced by at least one radar signal receiver mounted to a motor vehicle, the reflected radar signals including at least one radar signal transmitted by at least one radar signal transmitter, also mounted to the motor vehicle, and reflected by an object toward and detected by the at least one radar signal receiver. In this seventy second example, the method may comprise monitoring, with at least one processor, at least one vehicle operating parameter signal produced by at least one vehicle operating parameter sensor or switch carried by the motor vehicle, if the monitored at least one vehicle operating parameter signal satisfies a first vehicle operating condition, operating in a gesture access mode by processing the reflected radar signals with the at least one processor to determine whether an object is within a sensing region of the at least one radar signal receiver and, if so, controlling at least one actuator associated with an access closure of the motor vehicle with the at least one processor to lock, unlock, open or close the access closure if the object within the sensing region exhibits a predefined gesture, and if the monitored at least one vehicle operating parameter signal satisfies a second vehicle operating condition different from the first vehicle operating condition, operating in an object impact avoidance mode by processing the reflected radar signals with the at least one processor to determine whether an object is within a predefined distance of the at least one radar signal receiver and, if so, at least one of activating at least one warning device with the at least one processor and controlling at least one actuator associated with at least one impact avoidance device of the motor vehicle with the at least one processor.
In a seventy third example, a method is provided for processing reflected radar signals produced by at least one radar signal receiver mounted to a motor vehicle, the reflected radar signals including at least one radar signal transmitted by at least one radar signal transmitter, also mounted to the motor vehicle, and reflected by an object toward and detected by the at least one radar signal receiver. In this seventy third example, the method may comprise monitoring, with at least one processor, at least one vehicle operating parameter signal produced by at least one vehicle operating parameter sensor or switch carried by the motor vehicle, determining, with the at least one processor, whether the motor vehicle is moving or enabled to move based on the at least one vehicle operating parameter signal, if the motor vehicle is determined by the processor to be moving or enabled to move, operating in an object impact avoidance mode by processing the reflected radar signals with the at least one processor to determine whether an object is within a predefined distance of the at least one radar signal receiver and, if so, at least one of activating, with the at least one processor, at least one warning device and controlling, with the at least one processor, at least one actuator associated with an impact avoidance device of the motor vehicle to activate the at least one impact avoidance device, and otherwise operating in a gesture access mode by processing the reflected radar signals with the at least one processor to determine whether an object is within a sensing region of the at least one radar signal receiver and, if so, controlling at least one actuator associated with an access closure of the motor vehicle with the at least one processor to lock, unlock, open or close the access closure if the object within the sensing region exhibits a predefined gesture.
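The overall mode arbitration of the seventy second and seventy third examples can be illustrated as a single per-cycle dispatch: impact avoidance while the vehicle is moving or enabled to move, gesture access otherwise. The radar processing is stubbed out, and all function names, the distance threshold, and the returned action labels are illustrative assumptions:

```python
# Minimal sketch of the seventy third example's method: one radar cycle
# is dispatched to object impact avoidance mode when the vehicle can
# move, and to gesture access mode otherwise. The 0.5 m predefined
# distance is an arbitrary illustrative value.
from typing import Optional

PREDEFINED_DISTANCE_M = 0.5

def process_cycle(vehicle_can_move: bool,
                  object_distance_m: Optional[float],
                  gesture_seen: bool) -> str:
    """Return the action the processor would take for one radar cycle."""
    if vehicle_can_move:
        # Object impact avoidance mode: warn and/or actuate an impact
        # avoidance device if an object is within the predefined distance.
        if (object_distance_m is not None
                and object_distance_m <= PREDEFINED_DISTANCE_M):
            return "warn_or_actuate_avoidance"
        return "no_action"
    # Gesture access mode: actuate the access closure only if an object
    # in the sensing region exhibits the predefined gesture.
    if object_distance_m is not None and gesture_seen:
        return "actuate_access_closure"  # lock/unlock/open/close
    return "no_action"
```

The sketch makes explicit the design point carried by both method examples: the same radar hardware serves two functions, with the vehicle's movement state alone selecting which function is active.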
While this disclosure has been illustrated and described in detail in the foregoing drawings and description, the same is to be considered as illustrative and not restrictive in character, it being understood that only illustrative embodiments thereof have been shown and described and that all changes and modifications that come within the spirit of this disclosure are desired to be protected. For example, while some embodiments are illustrated in the attached drawings and described above as including at least one illumination device 112 for providing visual feedback during gesture access operation, any of the object detection modules 12 which include at least one illumination device 112 may alternatively include at least one audible device responsive to at least one control signal to produce at least one audible signal. In some such embodiments, at least one audible device may be configured to produce sounds of different volumes and/or frequencies. In other such embodiments, two or more audible devices may be included, each producing sound with a different volume and/or frequency. In any such embodiments, the at least one audible device may be controlled to switch on and off with a predefined frequency and/or duty cycle. In some such embodiments which include multiple audible devices, at least two of the multiple audible devices may be controlled to switch on and off with different frequencies and/or duty cycles. Obviously, many modifications and variations of this disclosure are possible in light of the above teachings, and it is to be understood that the various features described herein may be practiced in any combination whether or not specifically recited in the appended claims.
This is a continuation of U.S. patent application Ser. No. 16/164,570, filed Oct. 18, 2018, which is a continuation-in-part of U.S. patent application Ser. No. 15/262,647, filed Sep. 12, 2016, which claims the benefit of and priority to U.S. Provisional Patent Application Ser. No. 62/217,842, filed Sep. 12, 2015, and which is also a continuation-in-part of U.S. patent application Ser. No. 15/378,823, filed Dec. 14, 2016, which claims the benefit of and priority to U.S. Provisional Patent Application Ser. No. 62/266,917, filed Dec. 14, 2015, and which also claims the benefit of and priority to PCT/US2018/037517, filed Jun. 14, 2018, the disclosures of which are all expressly incorporated herein by reference in their entireties.
Number | Name | Date | Kind |
---|---|---|---|
5682135 | Labonde | Oct 1997 | A |
6086131 | Bingle et al. | Jul 2000 | A |
6676186 | Greif | Jan 2004 | B2 |
8333492 | Dingman | Dec 2012 | B2 |
8643628 | Eriksson et al. | Feb 2014 | B1 |
8868299 | Kroemke | Oct 2014 | B2 |
8917239 | Eriksson et al. | Dec 2014 | B2 |
9001087 | Eriksson et al. | Apr 2015 | B2 |
9102266 | Dingman | Aug 2015 | B2 |
9164625 | Holmgren et al. | Oct 2015 | B2 |
9394737 | Gehin | Jul 2016 | B2 |
9446739 | Herthan | Sep 2016 | B2 |
9470033 | Dudar | Oct 2016 | B1 |
9598003 | Dingman | Mar 2017 | B2 |
9646436 | Campbell et al. | May 2017 | B1 |
9670702 | Sugita | Jun 2017 | B2 |
9694735 | Sheehan | Jul 2017 | B2 |
9739082 | Krauss | Aug 2017 | B2 |
9745778 | Bingle | Aug 2017 | B1 |
9776556 | Dingman | Oct 2017 | B2 |
9812017 | Krauss | Nov 2017 | B2 |
9892583 | Bingle | Feb 2018 | B2 |
9922472 | Jergess | Mar 2018 | B2 |
9956940 | Ette | May 2018 | B2 |
10087673 | Rosenmarkle | Oct 2018 | B1 |
10137363 | Parshionikar | Nov 2018 | B2 |
10246009 | McMahon et al. | Apr 2019 | B2 |
20010011836 | Grey | Aug 2001 | A1 |
20010052839 | Nahata et al. | Dec 2001 | A1 |
20030020645 | Akiyama | Jan 2003 | A1 |
20040031908 | Neveux et al. | Feb 2004 | A1 |
20060226953 | Shelley | Oct 2006 | A1 |
20060232379 | Shelly | Oct 2006 | A1 |
20080068145 | Weghaus et al. | Mar 2008 | A1 |
20090160211 | Krishnan | Jun 2009 | A1 |
20090302635 | Nakamura | Dec 2009 | A1 |
20100106182 | Patel | Apr 2010 | A1 |
20100275530 | Laskowski | Nov 2010 | A1 |
20110196568 | Nickolaou et al. | Aug 2011 | A1 |
20110309912 | Muller | Dec 2011 | A1 |
20120200486 | Meinel et al. | Aug 2012 | A1 |
20120312956 | Chang et al. | Dec 2012 | A1 |
20140156112 | Lee | Jun 2014 | A1 |
20140169139 | Lee | Jun 2014 | A1 |
20140204599 | Miura | Jul 2014 | A1 |
20150009062 | Hethan | Jan 2015 | A1 |
20150069249 | Alameh et al. | Mar 2015 | A1 |
20150248796 | Iyer et al. | Sep 2015 | A1 |
20150277848 | Grothe et al. | Oct 2015 | A1 |
20160096509 | Ette | Apr 2016 | A1 |
20160300410 | Jones et al. | Oct 2016 | A1 |
20160376819 | Bingle | Dec 2016 | A1 |
20170074009 | Banter | Mar 2017 | A1 |
20170138097 | Patel | May 2017 | A1 |
20170152697 | Dehelean | Jun 2017 | A1 |
20170158115 | Linden | Jun 2017 | A1 |
20170166166 | Lindic | Jun 2017 | A1 |
20170174179 | Schumacher | Jun 2017 | A1 |
20170234054 | Kumar et al. | Aug 2017 | A1 |
20170306684 | Baruco | Oct 2017 | A1 |
20170369016 | Gurghian et al. | Dec 2017 | A1 |
20180065542 | Dingman | Mar 2018 | A1 |
20180178788 | Ikedo et al. | Jun 2018 | A1 |
20180238098 | Rhode et al. | Aug 2018 | A1 |
20180238099 | Schatz et al. | Aug 2018 | A1 |
20190061689 | Breer | Feb 2019 | A1 |
20190128040 | Mitchell | May 2019 | A1 |
20190162010 | Rafrafi et al. | May 2019 | A1 |
20190162821 | Rafrafi et al. | May 2019 | A1 |
20190162822 | Rafrafi et al. | May 2019 | A1 |
Number | Date | Country |
---|---|---|
102016007388 | Dec 2016 | DE |
2082908 | Jul 2009 | EP |
2738337 | Jun 2014 | EP |
1020120032145 | Apr 2012 | KR |
2009152956 | Dec 2009 | WO |
Entry |
---|
Search Report and Written Opinion for International Patent Application No. PCT/US2016/066623 dated Apr. 3, 2017. |
Search Report and Written Opinion for International Patent Application No. PCT/US2016/051299 dated Dec. 26, 2016. |
Search Report and Written Opinion for International Patent Application No. PCT/US2018/037517 dated Mar. 11, 2019. |
Non-final Office Action for U.S. Appl. No. 15/262,746; dated Apr. 20, 2018. |
Non-final Office Action for U.S. Appl. No. 15/378,823; dated Jul. 27, 2018. |
Number | Date | Country | |
---|---|---|---|
20190186177 A1 | Jun 2019 | US |
Number | Date | Country | |
---|---|---|---|
62217842 | Sep 2015 | US | |
62266917 | Dec 2015 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 16164570 | Oct 2018 | US |
Child | 16284347 | US | |
Parent | PCT/US2018/037517 | Jun 2018 | US |
Child | 16164570 | Oct 2018 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 15262647 | Sep 2016 | US |
Child | 16164570 | US | |
Parent | 15378823 | Dec 2016 | US |
Child | 16164570 | Oct 2018 | US |