This disclosure relates to a camera monitoring system (CMS) for use in a commercial truck, for example, incorporating exterior cameras with sensors for detecting emergency vehicle lights.
Non-emergency vehicles must yield the roadway to emergency vehicles, which typically involves pulling the non-emergency vehicle to the side of the road. This can be quite challenging for a commercial vehicle due to its large size.
Typically, a truck driver relies on continuously checking the sideview mirrors to identify approaching emergency vehicles. In many cases it is difficult for the driver to see emergency vehicles due to the presence and length of the trailer. This is especially true in queue situations, where it may be difficult or nearly impossible to detect emergency vehicles approaching from far behind.
TETRA radio has been used to detect emergency vehicles in regions where TETRA is deployed. However, there are situations where TETRA is not used or is not available.
In one exemplary embodiment, a camera system for a first vehicle includes first and second exterior cameras that are configured to capture images outside of the first vehicle corresponding to legally prescribed views. First and second displays are arranged within a vehicle cabin of the first vehicle and are configured to depict the captured images respectively from the first and second exterior cameras. A vehicle location detector is configured to determine a location of the first vehicle. A warning device is configured to provide at least one alert within the vehicle cabin. A controller is in communication with the first and second displays, the first and second exterior cameras, the vehicle location detector, and the warning device. The controller is configured to identify a lighting characteristic of one or more vehicles using at least one of the first and second exterior cameras, recognize a target vehicle from the one or more vehicles by comparing the lighting characteristic to a database of emergency vehicle lighting characteristics for the location of the first vehicle, detect location information of the target vehicle relative to the first vehicle, and activate the warning device to provide the alert based upon the location information.
In a further embodiment of any of the above, the legally prescribed views include Class II and Class IV views, and the target vehicle is outside the legally prescribed views when recognized.
In a further embodiment of any of the above, the first and second displays are arranged respectively at left and right side A-pillars within the vehicle cabin.
In a further embodiment of any of the above, the vehicle location detector is a navigation system.
In a further embodiment of any of the above, the warning device includes at least one of a visual alert and an audio alert.
In a further embodiment of any of the above, the visual alert is provided on at least one of the first and second displays.
In a further embodiment of any of the above, the alert communicates an estimated time of arrival and/or distance of the target vehicle relative to the first vehicle.
In a further embodiment of any of the above, the warning device is provided by an audio source. The controller is configured to signal the audio source to provide an audible alert within the vehicle cabin indicating the location information of the target vehicle relative to the first vehicle.
In a further embodiment of any of the above, the lighting characteristic includes light color, light intensity, light pattern, and combinations thereof.
In a further embodiment of any of the above, the controller is configured to classify the lighting characteristic as a common lighting characteristic or an emergency vehicle lighting characteristic based upon the emergency vehicle lighting characteristics organized by location in the database.
In another exemplary embodiment, a method of alerting a host vehicle driver of an approaching emergency vehicle is configured to be used with a camera mirror system that has first and second exterior cameras configured to capture images outside of a host vehicle that correspond to legally prescribed views, and first and second displays within a vehicle cabin configured to depict the captured images respectively from the first and second exterior cameras. The method includes the steps of determining a location of the host vehicle, identifying a lighting characteristic of one or more vehicles using at least one of the first and second exterior cameras, comparing the lighting characteristic to a database of emergency vehicle lighting characteristics for the location of the host vehicle, recognizing a target vehicle in response to the comparing step, detecting location information of the target vehicle relative to the host vehicle in response to the recognizing step, and alerting the host vehicle driver of a location of the target vehicle relative to the host vehicle.
In a further embodiment of any of the above, the legally prescribed views include Class II and Class IV views, and the target vehicle is outside the legally prescribed views when recognized.
In a further embodiment of any of the above, the first and second displays are arranged respectively at left and right side A-pillars within the vehicle cabin.
In a further embodiment of any of the above, the determining step is performed by identifying a GPS coordinate for the host vehicle using a navigation system.
In a further embodiment of any of the above, the alerting step is performed by providing at least one of a visual alert and an audio alert.
In a further embodiment of any of the above, the visual alert is provided on at least one of the first and second displays.
In a further embodiment of any of the above, the alert communicates an estimated time of arrival and/or distance of the target vehicle relative to the host vehicle.
In a further embodiment of any of the above, the alerting step is performed by providing an audible alert with an audio source.
In a further embodiment of any of the above, the lighting characteristic includes light color, light intensity, light pattern, and combinations thereof.
In a further embodiment of any of the above, the comparing step is performed by classifying the lighting characteristic as a common lighting characteristic or an emergency vehicle lighting characteristic based upon the emergency vehicle lighting characteristics organized by location in the database.
The disclosure can be further understood by reference to the following detailed description when considered in connection with the accompanying drawings wherein:
The embodiments, examples and alternatives of the preceding paragraphs, the claims, or the following description and drawings, including any of their various aspects or respective individual features, may be taken independently or in any combination. Features described in connection with one embodiment are applicable to all embodiments, unless such features are incompatible.
A schematic view of a commercial vehicle 10 is illustrated in
Each of the camera arms 16a, 16b includes a base that is secured to, for example, the cab 12. A pivoting arm is supported by the base and may articulate relative thereto. At least one rearward facing camera 20a, 20b is arranged respectively within the camera arms 16a, 16b. The exterior cameras 20a, 20b respectively provide exterior fields of view FOVEX1, FOVEX2 that each include at least one of the Class II and Class IV views (
First and second video displays 18a, 18b are arranged on the driver and passenger sides within the vehicle cab 12, on or near the left and right A-pillars, to display the Class II and Class IV views on their respective sides of the vehicle 10. These views provide rear facing side views along the vehicle 10 that are captured by the exterior cameras 20a, 20b.
If video of Class V and Class VI views are also desired, a camera housing 16c and camera 20c may be arranged at or near the front of the vehicle 10 to provide those views (
The displays 18a, 18b, 18c face a driver region 24 within the cabin 22 where an operator is seated on a driver seat 26. It may be desirable to incorporate a vehicle detection system (VDS) 29 into the CMS 15 to alert the vehicle operator to one or more approaching emergency vehicles before the driver would otherwise be able to identify them due to driver distraction or vantage point. That is, the emergency vehicle may be outside the legally prescribed views depicted on the displays 18a, 18b, 18c, but may still fall within the images captured by one or more of the exterior cameras 20a, 20b. Accordingly, the VDS 29 may give the vehicle operator an early warning to clear the road for the approaching emergency vehicle(s), or target vehicle(s). One example VDS 29 is illustrated in
As shown in
Sensors 36-44 may be used with the VDS 29 to gather additional information in connection with providing an alert of one or more approaching emergency vehicles, for example, by determining the location of the emergency vehicle relative to the vehicle 10. A controller or ECU 30 is in communication with the displays 18a-18c, the external cameras 20a, 20b, the VLD 58 and the warning device 60, and is configured to identify a lighting characteristic of one or more vehicles using the external cameras 20a, 20b. The lighting characteristic includes light color, light intensity, light pattern, and combinations thereof. The controller 30 can recognize a target vehicle by comparing the identified lighting characteristic to a database, which may be organized by region and/or country, for example. The database includes lighting characteristics (e.g., light color, light intensity, etc.) of emergency vehicles for the location of the vehicle 10. That is, emergency vehicles of a particular country and/or region may have distinctive colors and/or patterns. By referencing the vehicle location against the database, the VDS 29 can recognize an emergency vehicle.
In addition to comparing the lighting characteristic, the controller 30 may classify the lighting characteristic as one of a common characteristic or an emergency characteristic. A vehicle whose lighting characteristic is classified as an emergency characteristic is recognized as a target vehicle, while a vehicle whose lighting characteristic is classified as a common characteristic (e.g., flashing lights by a vehicle signaling a desire to pass) is not. For example, if the VLD 58 determines that the vehicle 10 is in a particular region and the controller 30 identifies a lighting characteristic for each of a second vehicle and a third vehicle, the controller 30 will compare each respective lighting characteristic to the database of emergency vehicle lighting characteristics for that particular region. The controller 30 then classifies the second vehicle lighting characteristic as an emergency characteristic, thus recognizing the second vehicle as a target vehicle. Similarly, the controller 30 classifies the third vehicle lighting characteristic as a common characteristic and thus ignores the third vehicle. The controller 30 may compare emergency vehicle lighting characteristics for a respective state, country, and/or region where the vehicle 10 is located. It will be appreciated that the database includes emergency vehicle lighting characteristics for law enforcement, firefighter, medical, and other rescue personnel vehicles.
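The classification step described above can be sketched as follows. This is a minimal illustration only: the region-keyed database contents, the fields of the `LightingCharacteristic` record, and the `classify` function are assumptions standing in for whatever form the controller 30 and its database actually take.

```python
# Hypothetical sketch of classifying an observed lighting characteristic
# against a region-organized database of emergency vehicle lighting
# characteristics. All data and names here are illustrative assumptions.
from dataclasses import dataclass


@dataclass(frozen=True)
class LightingCharacteristic:
    color: str    # e.g. "blue", "red", "white"
    pattern: str  # e.g. "strobe", "rotating", "steady"


# Emergency lighting characteristics organized by region (assumed data).
EMERGENCY_DB = {
    "DE": {LightingCharacteristic("blue", "strobe")},
    "US": {LightingCharacteristic("red", "strobe"),
           LightingCharacteristic("blue", "strobe")},
}


def classify(region: str, observed: LightingCharacteristic) -> str:
    """Return 'emergency' if the observed lights match a known emergency
    characteristic for the region; otherwise return 'common'."""
    if observed in EMERGENCY_DB.get(region, set()):
        return "emergency"
    return "common"


# Blue strobes in the "DE" region match an emergency characteristic,
# while steady white lights (e.g. a passing request) are common.
print(classify("DE", LightingCharacteristic("blue", "strobe")))   # emergency
print(classify("DE", LightingCharacteristic("white", "steady")))  # common
```

A frozen dataclass is used so that characteristics are hashable and can be stored in per-region sets, keeping the lookup a simple membership test.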
The controller 30 is also configured to detect location information of the target vehicle relative to the vehicle 10 and signal the warning device 60 to communicate an alert of the target vehicle. The warning device 60 may communicate the alert visually using the displays 18a-18c, the electronic display 46, and/or audibly using the speaker 48. The alert includes an estimated time of arrival and/or estimated distance of the target vehicle relative to the vehicle 10. The estimated time and/or estimated distance may be communicated by displaying distance numbers, changing colors, flashing symbols at different rates, changing audible alert frequency and/or noise amplitude, for example.
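The mapping from estimated time of arrival to an alert presentation (changing colors and flash rates) can be illustrated as below. The thresholds, color names, and flash rates are illustrative assumptions, not values from the disclosure.

```python
# Hedged sketch: derive visual alert parameters from the target vehicle's
# estimated time of arrival and distance, as described above. The specific
# thresholds and colors are assumptions for illustration only.
def alert_parameters(eta_seconds: float, distance_m: float) -> dict:
    if eta_seconds < 10:
        color, flash_hz = "red", 4.0     # imminent: fast red flashing
    elif eta_seconds < 30:
        color, flash_hz = "amber", 2.0   # near: moderate amber flashing
    else:
        color, flash_hz = "yellow", 1.0  # distant: slow yellow flashing
    return {
        "text": f"Emergency vehicle {distance_m:.0f} m behind, ~{eta_seconds:.0f} s",
        "color": color,
        "flash_hz": flash_hz,
    }


p = alert_parameters(eta_seconds=8, distance_m=120)
print(p["color"], p["flash_hz"])  # red 4.0
```

An audible alert could follow the same pattern, with the urgency tier driving tone frequency or amplitude instead of color and flash rate.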
The controller 30 is operably connected to the components 20a, 20b, 36-48, 58 and 60 through a vehicle data bus 52, which may be a controller area network ("CAN") bus or LIN bus, for example. The controller 30 is configured to provide an alert in the event that a target (or emergency) vehicle has been recognized. The controller 30 can be one controller or multiple controllers, if desired.
In one example the controller 30 is configured to continuously determine location information of the target vehicle. When the warning device 60 communicates the alert, the controller 30 initiates a new cycle to determine current location information of the target vehicle relative to the vehicle 10. Thus, the controller 30 can account for target vehicle feedback and therefore provide real-time target vehicle location information. As an example, target vehicle feedback can include change of speed.
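The continuous update cycle described above can be sketched as a simple per-cycle recomputation; the function name and the constant cycle period are assumptions for illustration.

```python
# Illustrative real-time update cycle: re-estimate the target vehicle's
# closing speed and time of arrival from two successive distance
# measurements, so that a change of speed (target vehicle feedback)
# is reflected in the next alert. Names here are assumptions.
def update_cycle(prev_distance_m: float, curr_distance_m: float,
                 dt_s: float) -> tuple[float, float]:
    """Return (closing_speed_m_s, eta_s) for one measurement cycle."""
    closing_speed = (prev_distance_m - curr_distance_m) / dt_s
    # If the target is not closing, there is no finite arrival time.
    eta_s = curr_distance_m / closing_speed if closing_speed > 0 else float("inf")
    return closing_speed, eta_s


# Target closed from 200 m to 190 m over a 0.5 s cycle:
speed, eta = update_cycle(prev_distance_m=200.0, curr_distance_m=190.0, dt_s=0.5)
print(speed, eta)  # 20.0 9.5
```

Running this every cycle gives the controller an up-to-date estimate even as the target vehicle accelerates or slows, which is the feedback behavior the passage describes.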
It should be noted that the controller 30 may include one or more discrete units. Moreover, a portion of the controller 30 may be provided in the vehicle 10, while another portion of the controller 30 may be located elsewhere. In terms of hardware architecture, such a computing device can include a processor, memory, and one or more input and/or output (I/O) device interface(s) that are communicatively coupled via a local interface. The local interface can include, for example but not limited to, one or more buses and/or other wired or wireless connections. The local interface may have additional elements, which are omitted for simplicity, such as controllers, buffers (caches), drivers, repeaters, and receivers to enable communications. Further, the local interface may include address, control, and/or data connections to enable appropriate communications among the aforementioned components.
The controller 30 may be a hardware device for executing software, particularly software stored in memory. The controller 30 can be a custom made or commercially available processor, a central processing unit (CPU), an auxiliary processor among several processors associated with the controller, a semiconductor-based microprocessor (in the form of a microchip or chip set) or generally any device for executing software instructions.
The memory can include any one or combination of volatile memory elements (e.g., random access memory (RAM, such as DRAM, SRAM, SDRAM, VRAM, etc.)) and/or nonvolatile memory elements (e.g., ROM, hard drive, tape, CD-ROM, etc.). Moreover, the memory may incorporate electronic, magnetic, optical, and/or other types of storage media. The memory can also have a distributed architecture, where various components are situated remotely from one another, but can be accessed by the processor. The database may be stored in memory.
The software in the memory may include one or more separate programs, each of which includes an ordered listing of executable instructions for implementing logical functions. A system component embodied as software may also be construed as a source program, executable program (object code), script, or any other entity comprising a set of instructions to be performed. When constructed as a source program, the program is translated via a compiler, assembler, interpreter, or the like, which may or may not be included within the memory.
The input and output devices that may be coupled to the system I/O interface(s) may include input devices, for example but not limited to, a keyboard, mouse, scanner, microphone, camera, mobile device, proximity device, etc. Further, the output devices may include, for example but not limited to, a printer, a display, etc. Finally, the input and output devices may further include devices that communicate both as inputs and outputs, for instance but not limited to, a modulator/demodulator (modem; for accessing another device, system, or network), a radio frequency (RF) or other transceiver, a telephonic interface, a bridge, a router, etc.
When the controller 30 is in operation, the processor can be configured to execute software stored within the memory, to communicate data to and from the memory, and to generally control operations of the computing device pursuant to the software. Software in memory, in whole or in part, is read by the processor, perhaps buffered within the processor, and then executed.
It should also be understood that although a particular component arrangement is disclosed in the illustrated embodiment, other arrangements will benefit herefrom. Although particular step sequences are shown, described, and claimed, it should be understood that steps may be performed in any order, separated or combined unless otherwise indicated and will still benefit from the present invention.
Although the different examples have specific components shown in the illustrations, embodiments of this invention are not limited to those particular combinations. It is possible to use some of the components or features from one of the examples in combination with features or components from another one of the examples.
Although an example embodiment has been disclosed, a worker of ordinary skill in this art would recognize that certain modifications would come within the scope of the claims. For that reason, the following claims should be studied to determine their true scope and content.
This application claims priority to U.S. Provisional Application No. 63/165,241 filed on Mar. 24, 2021.
Number | Date | Country
---|---|---
63165241 | Mar. 2021 | US