The present disclosure generally relates to vehicle views and, more specifically, to detection and presentation of obstructed vehicle views.
Generally, a vehicle includes a windshield, a rear window, and side windows that partially define a cabin of the vehicle and enable a driver and/or other occupant(s) (e.g., passengers) to view an area surrounding the vehicle. Oftentimes, the windshield is formed from laminated safety glass, and the side and rear windows are formed from tempered glass, laminated glass, polycarbonate, acrylic resins, and/or other materials.
A vehicle also typically includes mirrors (e.g., a rearview mirror, side mirrors) to facilitate a driver in viewing a surrounding area next to and/or behind the vehicle. Oftentimes, the mirrors of the vehicle include a reflective layer (e.g., formed of metallic material) and a glass or plastic layer coupled to the reflective layer to protect the reflective layer from becoming damaged.
The appended claims define this application. The present disclosure summarizes aspects of the embodiments and should not be used to limit the claims. Other implementations are contemplated in accordance with the techniques described herein, as will be apparent to one having ordinary skill in the art upon examination of the following drawings and detailed description, and these implementations are intended to be within the scope of this application.
Example embodiments are shown for detection and presentation of obstructed vehicle views. An example disclosed vehicle includes a first side mirror, a first camera adjacent to the first side mirror, a first sensor module to detect opaque material on the first side mirror, an obstruction identifier to determine whether viewing of a first area via the first side mirror is obstructed, and a display to present, via the first camera, the first area responsive to the obstruction identifier determining that viewing via the first side mirror is obstructed.
An example disclosed method for detection and presentation of obstructed vehicle views includes detecting, via a first sensor module, opaque material on a first side mirror and determining, via a processor, whether viewing of a first area via the first side mirror is obstructed. The example disclosed method also includes capturing the first area via a camera and presenting the first area via a display responsive to determining that viewing of the first area is obstructed.
An example disclosed vehicle includes a rearview mirror, a camera adjacent to a rearview window, a sensor module to detect opaque material on the rearview mirror, an obstruction identifier to determine whether viewing of an area via the rearview mirror is obstructed, and a display to present, via the camera, the area responsive to the obstruction identifier determining that viewing via the rearview mirror is obstructed.
For a better understanding of the invention, reference may be made to embodiments shown in the following drawings. The components in the drawings are not necessarily to scale and related elements may be omitted, or in some instances proportions may have been exaggerated, so as to emphasize and clearly illustrate the novel features described herein. In addition, system components can be variously arranged, as known in the art. Further, in the drawings, like reference numerals designate corresponding parts throughout the several views.
While the invention may be embodied in various forms, there are shown in the drawings, and will hereinafter be described, some exemplary and non-limiting embodiments, with the understanding that the present disclosure is to be considered an exemplification of the invention and is not intended to limit the invention to the specific embodiments illustrated.
Generally, a vehicle includes a windshield, a rear window, and side windows that partially define a cabin of the vehicle and enable a driver and/or other occupant(s) (e.g., passengers) to view an area surrounding the vehicle. Oftentimes, the windshield is formed from laminated safety glass, and the side and rear windows are formed from tempered glass, laminated glass, polycarbonate, acrylic resins, and/or other materials.
Further, a vehicle typically includes a rearview mirror and side mirrors (i.e., side-view mirrors, wing mirrors, fender mirrors) to facilitate a driver in viewing a surrounding area next to and/or behind the vehicle. Oftentimes, rearview mirrors and side mirrors include a reflective layer (e.g., formed of aluminum and/or other metallic material(s)) and a glass or plastic layer coupled to the reflective layer to protect the reflective layer from becoming damaged.
In some instances, opaque material collects on a mirror and/or a corresponding window, thereby potentially preventing a driver from viewing surrounding areas of the vehicle. For example, a film of condensation and/or ice may form on a window and/or a mirror when a temperature of a glass layer of the window and/or mirror is at or below a dew point temperature of air adjacent to the glass layer, causing moisture to condense on a surface of the window and/or mirror. In other examples, the opaque material is rain droplets and/or snow that collects on a surface of a window and/or a mirror. Further, in some examples, cracks may form in a glass layer of a window and/or a mirror, thereby potentially resulting in an opaque surface.
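By way of illustration only, and not as part of the disclosed embodiments, the dew-point condition described above can be sketched in a few lines of Python using the Magnus approximation; the coefficient values, function names, and example readings below are assumptions chosen for this sketch.

```python
import math

def dew_point_c(air_temp_c, relative_humidity_pct):
    """Approximate dew point (deg C) via the Magnus formula (illustrative only)."""
    a, b = 17.62, 243.12  # Magnus coefficients, valid roughly for -45..60 deg C
    gamma = math.log(relative_humidity_pct / 100.0) + (a * air_temp_c) / (b + air_temp_c)
    return (b * gamma) / (a - gamma)

def condensation_likely(glass_temp_c, air_temp_c, relative_humidity_pct):
    """Condensation tends to form when the glass is at or below the dew point."""
    return glass_temp_c <= dew_point_c(air_temp_c, relative_humidity_pct)

# Example: 10 deg C mirror glass with adjacent air at 12 deg C and 95% relative humidity
print(condensation_likely(10.0, 12.0, 95.0))  # True -> condensation likely
```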
Example apparatus and methods disclosed herein include sensor modules that detect when opaque material located on a mirror and/or an adjacent window of the vehicle prevents a driver of the vehicle from viewing a surrounding area via the mirror. The example apparatus and methods disclosed herein further include a display of the vehicle that presents image(s) and/or video of the obstructed view of the surrounding area that is captured via a camera of the vehicle. For example, the display presents image(s) and/or video of an area next to and/or behind the vehicle when opaque material on a side mirror (i.e., a side-view mirror, a wing mirror, a fender mirror) and/or an adjacent side window prevents the driver from viewing that area via the side mirror. Additionally or alternatively, the display presents image(s) and/or video of an area behind the vehicle when opaque material on a rearview mirror and/or a rearview window prevents the driver from viewing that area via the rearview mirror.
Turning to the figures,
In the illustrated example, the vehicle 100 includes a cabin 102, a windshield 104, a rearview window 106, a side window 108 (e.g., a first window, a front driver-side window), a side window 110 (e.g., a second window, a front passenger-side window), a side window 112 (e.g., a third window, a back driver-side window), and a side window 114 (e.g., a fourth window, a back passenger-side window). For example, the windshield 104 is formed from laminated safety glass. The rearview window 106, the side window 108, the side window 110, the side window 112, and the side window 114 are formed from tempered glass, laminated glass, polycarbonate, acrylic resins, and/or other transparent material(s).
Additionally, the vehicle 100 includes a side mirror 116 (e.g., a first side mirror, a driver-side side mirror) adjacent to the side window 108, a side mirror 118 (e.g., a second side mirror, a passenger-side side mirror) adjacent to the side window 110, and a rearview mirror 120. For example, the side mirror 116 enables a driver of the vehicle 100 to view an area 122 (e.g., a first area) adjacent to and/or behind a driver-side of the vehicle 100. The side mirror 118 enables the driver to view an area 124 (e.g., a second area) adjacent to and/or behind a passenger-side of the vehicle 100. Further, the rearview mirror 120 enables the driver to view an area 126 (e.g., a third area) behind the vehicle 100 through the rearview window 106.
The vehicle 100 of the illustrated example also includes an infotainment head unit 128 that provides an interface between the vehicle 100 and a user (e.g., the driver). The infotainment head unit 128 includes digital and/or analog interfaces (e.g., input devices and output devices) to receive input from and display information for the user(s). The input devices include, for example, a control knob, an instrument panel, a digital camera for image capture and/or visual command recognition, a touch screen, an audio input device (e.g., cabin microphone), buttons, or a touchpad. The output devices may include instrument cluster outputs (e.g., dials, lighting devices), actuators, a display 130 (e.g., a heads-up display, a center console display such as a liquid crystal display (LCD), an organic light emitting diode (OLED) display, a flat panel display, a solid state display, etc.), and/or a speaker 132. In the illustrated example, the infotainment head unit 128 includes hardware (e.g., a processor or controller, memory, storage, etc.) and software (e.g., an operating system, etc.) for an infotainment system (such as SYNC® and MyFord Touch® by Ford®, Entune® by Toyota®, IntelliLink® by GMC®, etc.). For example, the infotainment head unit 128 displays the infotainment system via the display 130.
Further, the vehicle 100 of the illustrated example includes a camera 134 (e.g., a first camera) adjacent to the side mirror 116, a camera 136 (e.g., a second camera) adjacent to the side mirror 118, and a camera 138 (e.g., a third camera) adjacent to the rearview window 106. The camera 134 captures image(s) and/or video of the area 122 adjacent to the vehicle 100. For example, the camera 134 is coupled to the side mirror 116 or another surface of the vehicle 100 adjacent to the side mirror 116 to enable the camera 134 to capture the image(s) and/or video of the area 122. Further, the camera 136 captures image(s) and/or video of the area 124 adjacent to the vehicle 100. For example, the camera 136 is coupled to the side mirror 118 or another surface of the vehicle 100 adjacent to the side mirror 118 to enable the camera 136 to capture the image(s) and/or video of the area 124. Additionally, the camera 138 captures image(s) and/or video of the area 126 behind the vehicle 100. For example, the camera 138 is coupled to the vehicle 100 adjacent to the rearview window 106 to enable the camera 138 to capture the image(s) and/or video of the area 126.
In the illustrated example, the vehicle 100 also includes sensor modules that detect whether opaque material (e.g., condensation, rain droplets, ice, snow, cracked surfaces, etc.) is located on and/or within a transparent or reflective surface of the vehicle 100 (e.g., the windshield 104, the rearview window 106, the side window 108, the side window 110, the side window 112, the side window 114, the side mirror 116, the side mirror 118, the rearview mirror 120). The sensor modules of the vehicle 100 include hardware (e.g., a sensor, a transmitter, a processor, memory, storage, etc.) to detect opaque material on a vehicle surface. Further, the sensor modules may include software to detect opaque material on a vehicle surface. For example, one or more of the sensor modules detect whether opaque material is on a vehicle surface by comparing light intensity measurement(s) collected by sensor(s) adjacent to a first side of a vehicle surface to light intensity measurement(s) collected by other sensor(s) adjacent to an opposing second side of the vehicle surface. Additionally or alternatively, one or more of the sensor modules detect whether opaque material is on a vehicle surface by comparing a reference light intensity to light intensity measurement(s) collected by sensor(s) adjacent to a first side of a vehicle surface for a light beam that is emitted by a light transmitter adjacent to an opposing second side of the vehicle surface. Example sensor modules are disclosed as opaqueness detection assemblies in U.S. Application ______, Docket No. 83791863 (NGE File No. 026780.8669), filed on Mar. 24, 2017 and U.S. Application ______, Docket No. 83791849 (NGE File No. 026780.8670), filed on Mar. 24, 2017, which are incorporated herein by reference in their entireties.
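By way of illustration only, the two comparison strategies described above reduce to simple transmission checks, as in the following Python sketch; the function names and the 0.7 transmission threshold are assumptions and are not taken from the referenced applications.

```python
def opaque_by_pair(first_side_lux, second_side_lux, min_transmission=0.7):
    """Compare intensity measured on one side of the surface to the opposing side;
    a low transmission ratio suggests opaque material on or within the surface."""
    if second_side_lux <= 0:
        return False  # no usable light on the reference side
    return (first_side_lux / second_side_lux) < min_transmission

def opaque_by_reference(measured_lux, reference_lux, min_transmission=0.7):
    """Compare the measured intensity of an emitted beam to a stored reference
    captured when the surface was known to be clear."""
    return measured_lux < min_transmission * reference_lux

# Example: a beam measured at 120 lux against a clear-surface reference of 400 lux
print(opaque_by_reference(120.0, 400.0))  # True -> opaque material likely present
```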
As illustrated in
Additionally, the vehicle 100 of the illustrated example includes a sensor module 152 (e.g., a seventh sensor module) that detects whether an object is positioned in the cabin 102 of the vehicle 100 that obstructs the driver from viewing the area 126 via the rearview mirror 120. For example, the sensor module 152 includes transmitter(s) (e.g., light transmitters 402 of
The vehicle 100 also includes an obstruction identifier 154 that determines whether opaque material and/or object(s) within the cabin 102 are obstructing the driver's view of the area 122 via the side mirror 116, the area 124 via the side mirror 118, and/or the area 126 via the rearview mirror 120.
For example, in response to the sensor module 140 detecting opaque material on the side mirror 116 and/or the sensor module 142 detecting opaque material on the side window 108, the obstruction identifier 154 determines whether viewing of the area 122 by the driver is obstructed by opaque material. In response to the obstruction identifier 154 determining that viewing of the area 122 via the side mirror 116 is obstructed, the camera 134 captures image(s) and/or video of the area 122 and the display 130 presents the image(s) and/or video of the area 122 to the driver.
Additionally or alternatively, in response to the sensor module 144 detecting opaque material on the side mirror 118 and/or the sensor module 146 detecting opaque material on the side window 110, the obstruction identifier 154 determines whether viewing of the area 124 by the driver is obstructed by opaque material. In response to the obstruction identifier 154 determining that viewing of the area 124 via the side mirror 118 is obstructed, the camera 136 captures image(s) and/or video of the area 124 and the display 130 presents the image(s) and/or video of the area 124 to the driver.
Further, the obstruction identifier 154 determines whether viewing of the area 126 behind the vehicle 100 is obstructed in response to the sensor module 148 detecting opaque material on the rearview mirror 120, the sensor module 150 detecting opaque material on the rearview window 106, and/or the sensor module 152 detecting an object within the cabin 102. The camera 138 captures image(s) and/or video of the area 126 and the display 130 presents the image(s) and/or video of the area 126 to the driver in response to the obstruction identifier 154 determining that viewing of the area 126 via the rearview mirror 120 is obstructed.
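As a non-limiting sketch of how the obstruction identifier 154 might combine the sensor-module results to select camera feeds for the display 130, the following Python example uses hypothetical data structures and names; it is not the claimed implementation.

```python
from dataclasses import dataclass

@dataclass
class ViewChannel:
    """Hypothetical grouping of a mirror view with its sensor-module results."""
    name: str
    mirror_obstructed: bool                 # e.g., result from sensor module 140/144/148
    window_obstructed: bool                 # e.g., result from sensor module 142/146/150
    cabin_object_obstructed: bool = False   # e.g., result from sensor module 152 (rearview only)

def views_to_present(channels):
    """Return the camera feeds the display should present, one per obstructed view."""
    return [c.name for c in channels
            if c.mirror_obstructed or c.window_obstructed or c.cabin_object_obstructed]

channels = [
    ViewChannel("driver-side area 122", mirror_obstructed=True, window_obstructed=False),
    ViewChannel("passenger-side area 124", mirror_obstructed=False, window_obstructed=False),
    ViewChannel("rear area 126", mirror_obstructed=False, window_obstructed=False,
                cabin_object_obstructed=True),
]
print(views_to_present(channels))  # ['driver-side area 122', 'rear area 126']
```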
The light transmitters 402 of the illustrated example are coupled to the rearview mirror 120 of the vehicle 100. In other examples, the light transmitters 402 may be coupled to any surface near the rearview mirror 120 within the cabin 102 that facilitates detection of objects within the cabin 102 that obstruct the driver from viewing the area 126 via the rearview mirror 120. Additionally, the sensors 406 are coupled to and/or are otherwise positioned adjacent to the rearview window 106 of the vehicle 100.
The sensors 406 collect light intensity measurements of the light beams 404. When one or more of the sensors 406 collects a light intensity measurement that is less than a reference light intensity associated with an unobstructed view, the sensor module 400 detects that an object located within the cabin 102 obstructs the driver's view of the area 126 via the rearview mirror 120. In some examples, the light transmitters 402 and/or the sensors 406 are positioned and/or oriented such that objects that do not block a view of the driver (e.g., the driver, vehicle seats, seat headrests) do not affect (e.g., reduce) the light intensity measurements collected by the sensors 406.
In the illustrated example, the light transmitters 402 emit the light beams 404 in a crisscross pattern toward the respective sensors 406 to increase an area within the cabin 102 in which an object may be detected by the sensor module 400.
For example, the light transmitters 402 include a light transmitter 402a (e.g., a first of the light transmitters) that emits a light beam 404a (e.g., a first light beam) toward the rearview window 106, a light transmitter 402b (e.g., a second of the light transmitters) that emits a light beam 404b (e.g., a second light beam) toward the rearview window 106, a light transmitter 402c (e.g., a third of the light transmitters) that emits a light beam 404c (e.g., a third light beam) toward the rearview window 106, and a light transmitter 402d (e.g., a fourth of the light transmitters) that emits a light beam 404d (e.g., a fourth light beam) toward the rearview window 106. Further, the sensors 406 include a sensor 406a (e.g., a first of the sensors) that is to receive the light beam 404a, a sensor 406b (e.g., a second of the sensors) that is to receive the light beam 404b, a sensor 406c (e.g., a third of the sensors) that is to receive the light beam 404c, and a sensor 406d (e.g., a fourth of the sensors) that is to receive the light beam 404d.
In the illustrated example, the light transmitter 402a is coupled to an upper driver-side corner 408a of the rearview mirror 120 and emits the light beam 404a toward the sensor 406a coupled to a lower driver-side corner 410a of the rearview window 106. Further, the light transmitter 402b is coupled to a lower driver-side corner 408b of the rearview mirror 120 and emits the light beam 404b toward the sensor 406b coupled to an upper driver-side corner 410b of the rearview window 106 such that the light beam 404a and the light beam 404b crisscross within the cabin 102 of the vehicle 100. Additionally, the light transmitter 402c is coupled to an upper passenger-side corner 408c of the rearview mirror 120 and emits the light beam 404c toward the sensor 406c coupled to a lower passenger-side corner 410c of the rearview window 106. The light transmitter 402d is coupled to a lower passenger-side corner 408d of the rearview mirror 120 and emits the light beam 404d toward the sensor 406d coupled to an upper passenger-side corner 410d of the rearview window 106 such that the light beam 404c and the light beam 404d crisscross within the cabin 102 of the vehicle 100.
The light transmitter 502 of the illustrated example is coupled to the rearview mirror 120 of the vehicle 100. In other examples, the light transmitter 502 may be coupled to any surface near the rearview mirror 120 within the cabin 102 that facilitates detection of objects within the cabin 102 that obstruct the driver from viewing the area 126 via the rearview mirror 120. Further, the sensors 506 are coupled to and/or are otherwise positioned adjacent to the rearview window 106 of the vehicle 100. In the illustrated example, the sensors 506 form a matrix of sensors that include a plurality of sensor arrays. In other examples, the sensors 506 are positioned along an outer edge 508 of the rearview window 106.
The sensors 506 collect light intensity measurements of the light beam 504. When one or more of the sensors 506 collects a light intensity measurement that is less than a reference light intensity associated with an unobstructed view, the sensor module 500 detects that an object located within the cabin 102 obstructs the driver's view of the area 126 via the rearview mirror 120. In some examples, the light transmitter 502 and/or the sensors 506 are positioned and/or oriented such that objects that do not block a view of the driver (e.g., the driver, vehicle seats, seat headrests) do not affect (e.g., reduce) the light intensity measurements collected by the sensors 506.
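A minimal sketch of the beam-blocking check performed by the sensor module 400 and the sensor module 500 might look as follows, assuming per-beam reference intensities and an illustrative tolerance; the names and values are assumptions made for this example.

```python
def cabin_object_detected(measurements, references, tolerance=0.9):
    """Flag an object when any sensor reads noticeably below its clear-path
    reference intensity (names and tolerance are illustrative only)."""
    return any(m < tolerance * r for m, r in zip(measurements, references))

# Example: four beams (in the style of 404a-404d); the second beam is partially blocked
clear_path_refs = [500.0, 500.0, 500.0, 500.0]
readings = [495.0, 210.0, 498.0, 502.0]
print(cabin_object_detected(readings, clear_path_refs))  # True -> object in the cabin
```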
As illustrated in
The on-board computing platform 702 includes a microcontroller unit, controller or processor 710 and memory 712. In some examples, the processor 710 of the on-board computing platform 702 is structured to include the obstruction identifier 154. Alternatively, in some examples, the obstruction identifier 154 is incorporated into another electronic control unit (ECU) with its own processor 710 and memory 712. The processor 710 may be any suitable processing device or set of processing devices such as, but not limited to, a microprocessor, a microcontroller-based platform, an integrated circuit, one or more field programmable gate arrays (FPGAs), and/or one or more application-specific integrated circuits (ASICs). The memory 712 may be volatile memory (e.g., RAM including non-volatile RAM, magnetic RAM, ferroelectric RAM, etc.), non-volatile memory (e.g., disk memory, FLASH memory, EPROMs, EEPROMs, memristor-based non-volatile solid-state memory, etc.), unalterable memory (e.g., EPROMs), read-only memory, and/or high-capacity storage devices (e.g., hard drives, solid state drives, etc.). In some examples, the memory 712 includes multiple kinds of memory, particularly volatile memory and non-volatile memory.
The memory 712 is computer readable media on which one or more sets of instructions, such as the software for operating the methods of the present disclosure, can be embedded. The instructions may embody one or more of the methods or logic as described herein. For example, the instructions reside completely, or at least partially, within any one or more of the memory 712, the computer readable medium, and/or within the processor 710 during execution of the instructions.
The terms “non-transitory computer-readable medium” and “computer-readable medium” include a single medium or multiple media, such as a centralized or distributed database, and/or associated caches and servers that store one or more sets of instructions. Further, the terms “non-transitory computer-readable medium” and “computer-readable medium” include any tangible medium that is capable of storing, encoding or carrying a set of instructions for execution by a processor or that cause a system to perform any one or more of the methods or operations disclosed herein. As used herein, the term “computer readable medium” is expressly defined to include any type of computer readable storage device and/or storage disk and to exclude propagating signals.
The sensors 704 are arranged in and around the vehicle 100 to monitor properties of the vehicle 100 and/or an environment in which the vehicle 100 is located. One or more of the sensors 704 may be mounted to measure properties around an exterior of the vehicle 100. Additionally or alternatively, one or more of the sensors 704 may be mounted inside a cabin of the vehicle 100 or in a body of the vehicle 100 (e.g., an engine compartment, wheel wells, etc.) to measure properties in an interior of the vehicle 100. For example, the sensors 704 include accelerometers, odometers, tachometers, pitch and yaw sensors, wheel speed sensors, microphones, tire pressure sensors, biometric sensors and/or sensors of any other suitable type. In the illustrated example, the sensors 704 include the sensor module 140 and the sensor module 142 to detect whether viewing the area 122 via the side mirror 116 is obstructed; the sensor module 144 and the sensor module 146 to detect whether viewing the area 124 via the side mirror 118 is obstructed; and the sensor module 148, the sensor module 150, and the sensor module 152 (e.g., the sensor module 400, the sensor module 500) to detect whether viewing the area 126 via the rearview mirror 120 is obstructed.
The ECUs 706 monitor and control the subsystems of the vehicle 100. For example, the ECUs 706 are discrete sets of electronics that include their own circuit(s) (e.g., integrated circuits, microprocessors, memory, storage, etc.) and firmware, sensors, actuators, and/or mounting hardware. The ECUs 706 communicate and exchange information via a vehicle data bus (e.g., the vehicle data bus 708). Additionally, the ECUs 706 may communicate properties (e.g., status of the ECUs 706, sensor readings, control state, error and diagnostic codes, etc.) to and/or receive requests from each other. For example, the vehicle 100 may have seventy or more of the ECUs 706 that are positioned in various locations around the vehicle 100 and are communicatively coupled by the vehicle data bus 708.
In the illustrated example, the ECUs 706 include a body control module 714 and a door control unit 716. For example, the body control module 714 controls one or more subsystems throughout the vehicle 100, such as an immobilizer system, an HVAC system, etc. For example, the body control module 714 includes circuits that drive one or more of relays (e.g., to control wiper fluid, etc.), brushed direct current (DC) motors (e.g., to control wipers, etc.), stepper motors, LEDs, etc. The door control unit 716 controls one or more electrical systems located on doors of the vehicle 100, such as power windows, power locks, power mirrors, mirror heating elements, etc. For example, the door control unit 716 includes circuits that drive one or more of relays, brushed direct current (DC) motors (e.g., to control power seats, power locks, power windows, etc.), stepper motors, LEDs, etc.
The vehicle data bus 708 communicatively couples the infotainment head unit 128, the on-board computing platform 702, the sensors 704, and the ECUs 706. In some examples, the vehicle data bus 708 includes one or more data buses. The vehicle data bus 708 may be implemented in accordance with a controller area network (CAN) bus protocol as defined by International Standards Organization (ISO) 11898-1, a Media Oriented Systems Transport (MOST) bus protocol, a CAN flexible data (CAN-FD) bus protocol (ISO 11898-7) and/or a K-line bus protocol (ISO 9141 and ISO 14230-1), and/or an Ethernet™ bus protocol IEEE 802.3 (2002 onwards), etc.
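As a hedged illustration of how a request might be carried over a CAN-based vehicle data bus, the following Python sketch sends a single frame; the channel name, arbitration identifier, and payload are assumptions, and the python-can library is used only as an example (a real or virtual CAN interface must be available).

```python
import can  # pip install python-can

# Open a SocketCAN interface (channel name is an assumption for illustration)
bus = can.interface.Bus(channel="can0", bustype="socketcan")

# Hypothetical frame: request that the display present the driver-side camera feed
msg = can.Message(arbitration_id=0x3A0, data=[0x01], is_extended_id=False)
bus.send(msg)
bus.shutdown()
```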
Initially, at block 802, the obstruction identifier 154 identifies a mirror of the vehicle 100. For example, the obstruction identifier 154 identifies the side mirror 116 (e.g., a first side mirror). At block 804, a sensor module detects whether opaque material (e.g., condensation, rain droplets, ice, snow, a cracked surface) is located on the mirror. For example, the sensor module 140 (e.g., a first sensor module) detects whether opaque material is on the side mirror 116.
In response to the sensor module detecting that opaque material is on the mirror, the method 800 proceeds to block 806 at which the obstruction identifier 154 determines whether the opaque material obstructs a driver from viewing an area adjacent to the vehicle 100 via the mirror. For example, the obstruction identifier 154 determines whether the opaque material obstructs the driver from viewing the area 122 (e.g., a first area) via the side mirror 116. In response to the obstruction identifier 154 determining that the opaque material obstructs the driver from viewing the area adjacent to the vehicle 100, the method 800 proceeds to block 808. At block 808, a camera collects image(s) and/or video of the area and the display 130 of the vehicle 100 presents the obstructed view to the driver. For example, the camera 134 (e.g., a first camera) collects image(s) and/or video of the area 122 and the display 130 presents the image(s) and/or video of the area 122 to the driver.
Otherwise, in response to the sensor module not detecting opaque material at block 804 or upon the obstruction identifier 154 determining at block 806 that the opaque material does not obstruct the view of the driver, the method 800 proceeds to block 810. At block 810, another sensor module detects whether opaque material is on a window that corresponds to the identified mirror. For example, the sensor module 142 (e.g., a second sensor module) detects whether opaque material is on the side window 108 (e.g., a first side window). In response to the sensor module detecting that opaque material is on the window, the method 800 proceeds to block 812 at which the obstruction identifier 154 determines whether the opaque material obstructs a driver from viewing the area adjacent to the vehicle 100 via the identified mirror. For example, the obstruction identifier 154 determines whether the opaque material obstructs the driver from viewing the area 122 via the side mirror 116. In response to the obstruction identifier 154 determining that the opaque material obstructs the driver from viewing the area adjacent to the vehicle 100, the method 800 proceeds to block 808 at which the display 130 presents the obstructed view to the driver. For example, the camera 134 collects or captures image(s) and/or video of the area 122 and the display 130 presents the image(s) and/or video of the area 122 to the driver.
The method proceeds to block 814 in response to the display 130 presenting the image(s) and/or video of the obstructed area at block 808, in response to the sensor module not detecting opaque material at block 810, or in response to the obstruction identifier 154 determining that the opaque material does not obstruct the view of the driver at block 812. At block 814, the obstruction identifier 154 determines whether there is another mirror of the vehicle 100 to monitor. In response to determining that there is another mirror (e.g., the side mirror 118, the rearview mirror 120), the method 800 returns to block 802 to repeat blocks 802, 804, 806, 808, 810, 812.
For example, the sensor module 144 (e.g., a third sensor module) detects whether opaque material is on the side mirror 118 (e.g., a second side mirror) (block 804). In response to the sensor module 144 detecting opaque material on the side mirror 118, the obstruction identifier 154 determines whether a view of the area 124 (e.g., a second area) by the driver is obstructed by the opaque material (block 806). Further, the sensor module 146 (e.g., a fourth sensor module) detects whether opaque material is on the side window 110 (e.g., a second side window) (block 810). In response to the sensor module 146 detecting opaque material on the side window 110, the obstruction identifier 154 determines whether a view of the area 124 by the driver is obstructed by the opaque material (block 812). In response to the obstruction identifier 154 determining that viewing of the area 124 via the side mirror 118 is obstructed, the camera 136 captures and the display 130 presents image(s) and/or video of the area 124 to the driver (block 808).
Additionally or alternatively, the sensor module 148 (e.g., a fifth sensor module) detects whether opaque material is on the rearview mirror 120 (block 804). In response to the sensor module 148 detecting opaque material on the rearview mirror 120, the obstruction identifier 154 determines whether a view of the area 126 (e.g., a third area) by the driver is obstructed by the opaque material (block 806). Further, the sensor module 150 (e.g., a sixth sensor module) and/or the sensor module 152 (e.g., a seventh sensor module) detects whether opaque material is on the rearview window 106 (block 810). In response to the sensor module 150 and/or the sensor module 152 detecting opaque material on the rearview window 106, the obstruction identifier 154 determines whether a view of the area 126 by the driver is obstructed by the opaque material (block 812). In response to the obstruction identifier 154 determining that viewing of the area 126 via the rearview mirror 120 is obstructed, the camera 138 captures and the display 130 presents image(s) and/or video of the area 126 to the driver (block 808).
In response to determining at block 814 that there is not another mirror to monitor, the method 800 proceeds to block 816 at which the sensor module 152 detects whether there is an object in the cabin 102 of the vehicle 100 that lies within the driver's view of the area 126 via the rearview mirror 120. In response to the sensor module 152 detecting that there is not an object in the cabin 102, the method 800 returns to block 802. In response to the sensor module 152 detecting that there is an object in the cabin 102, the method 800 proceeds to block 818 at which the obstruction identifier 154 determines whether the object obstructs a driver from viewing the area 126 via the rearview mirror 120. In response to the obstruction identifier 154 determining that the driver's view of the area 126 via the rearview mirror 120 is not obstructed, the method 800 returns to block 802. Otherwise, in response to the obstruction identifier 154 determining that the driver's view of the area 126 via the rearview mirror 120 is obstructed by the object, the method 800 proceeds to block 820 at which the camera 138 captures and the display 130 presents image(s) and/or video of the area 126 to the driver.
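A compact, non-limiting sketch of one pass of the method 800 described above is given below in Python; the callable names, data layout, and stubbed sensor readings are assumptions, while the commented block numbers correspond to the blocks of the method as described.

```python
def monitor_views(views, object_in_cabin, rear_view_blocked_by_object, present):
    """One pass of the monitoring method sketched above (illustrative only).
    Each view is a dict of boolean-returning callables standing in for sensor modules."""
    for view in views:                                          # blocks 802, 814
        obstructed = False
        if view["opaque_on_mirror"]():                          # block 804
            obstructed = view["view_obstructed"]()              # block 806
        if not obstructed and view["opaque_on_window"]():       # block 810
            obstructed = view["view_obstructed"]()              # block 812
        if obstructed:
            present(view["name"])                               # block 808

    if object_in_cabin():                                       # block 816
        if rear_view_blocked_by_object():                       # block 818
            present("rear area 126")                            # block 820

# Minimal example with stubbed sensor readings for the driver-side view
views = [{"name": "driver-side area 122",
          "opaque_on_mirror": lambda: True,
          "opaque_on_window": lambda: False,
          "view_obstructed": lambda: True}]
monitor_views(views, lambda: False, lambda: False,
              present=lambda name: print("presenting camera feed for", name))
```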
In this application, the use of the disjunctive is intended to include the conjunctive. The use of definite or indefinite articles is not intended to indicate cardinality. In particular, a reference to “the” object or “a” and “an” object is intended to denote also one of a possible plurality of such objects. Further, the conjunction “or” may be used to convey features that are simultaneously present instead of mutually exclusive alternatives. In other words, the conjunction “or” should be understood to include “and/or”. The terms “includes,” “including,” and “include” are inclusive and have the same scope as “comprises,” “comprising,” and “comprise” respectively.
The above-described embodiments, and particularly any “preferred” embodiments, are possible examples of implementations that are merely set forth for a clear understanding of the principles of the invention. Many variations and modifications may be made to the above-described embodiment(s) without substantially departing from the spirit and principles of the techniques described herein. All modifications are intended to be included herein within the scope of this disclosure and protected by the following claims.
This application is related to U.S. Application Docket No. 83791863 (NGE File No. 026780.8669), filed on Mar. 24, 2017 and U.S. Application ______, Docket No. 83791849 (NGE File No. 026780.8670), filed on Mar. 24, 2017, which are incorporated herein by reference in their entireties.