The present disclosure relates to a capacitive depth sensor system and method for use in an autonomous vehicle environment.
It is often difficult to determine when a vehicle's engine will stall, or when its electrical function will be compromised, as the vehicle moves through water. Autonomous vehicles driving through flooded terrain must evaluate, in real time, possible routes that avoid flood water exceeding the operating limitations of the vehicle.
Some water depth sensing technologies use ultrasonic sensors to detect flood water and alert the operator of the vehicle while driving through flooded terrain. However, ultrasonic sensors may not offer the same precision as capacitive-based sensors, and may be of limited use to an autonomous driving system when it determines navigational routes and takes mitigating actions upon encountering flood water along a route.
The detailed description is set forth with reference to the accompanying drawings. The use of the same reference numerals may indicate similar or identical items. Various embodiments may utilize elements and/or components other than those illustrated in the drawings, and some elements and/or components may not be present in various embodiments. Elements and/or components in the figures are not necessarily drawn to scale. Throughout this disclosure, depending on the context, singular and plural terminology may be used interchangeably.
The systems and methods disclosed herein are configured to detect water depth levels under and around a vehicle as it is driven in flooded terrain using a capacitive water depth sensor system. The capacitive water depth sensor system can be tuned to operate with very low power consumption and, as such, can be used to continuously monitor for water and flooding while the vehicle is parked for extended periods of time. In one embodiment, the capacitive water depth sensor system includes a plurality of capacitive sensors that are integrated into a cavity in the wheel well molding of the vehicle, with gaps to allow air to escape from the top of the wheel well molding. In another embodiment, the capacitive water depth sensor system includes a plurality of capacitive sensors that are integrated directly into the wheel well molding. The capacitive sensors are integrated into the cavity of the wheel well molding by way of application of a conductive primer to the plastic of the wheel well molding. The conductive primer creates an electrostatic field similar to the electrostatic field generated between the two plates of a capacitive element. The electrostatic field may be created between the capacitive sensors and the sheet metal of the vehicle to which the wheel well is attached. When groundwater rises within the space between the wheel well and the sheet metal of the vehicle, a processor connected to the capacitive sensors can detect the level of the water based on a change in the electrostatic field, which depends at least in part on the dielectric constant of water. In some embodiments, the primer may be painted on a portion of the vehicle other than the wheel well, and an electrostatic field may be generated by the primer. This embodiment may provide more sensitivity to moisture collecting on the sensor created by the primer painted on the vehicle.
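By way of illustration only, and not as a description of the disclosed implementation, the following sketch models the capacitive element formed between the conductive primer and the vehicle sheet metal as an idealized parallel-plate capacitor; the plate geometry and all numeric values are hypothetical placeholders.

```python
# Illustrative sketch: idealized parallel-plate model of a primer-coated
# sensor plate opposite the vehicle sheet metal. Geometry is hypothetical.

EPSILON_0 = 8.854e-12  # vacuum permittivity, F/m
ER_AIR = 1.0006        # relative permittivity of air
ER_WATER = 80.0        # relative permittivity of water (approx., room temp.)

def plate_capacitance(area_m2: float, gap_m: float, rel_permittivity: float) -> float:
    """C = eps_r * eps_0 * A / d for an idealized parallel-plate sensor."""
    return rel_permittivity * EPSILON_0 * area_m2 / gap_m

# Hypothetical sensor: 5 cm x 5 cm plate, 1 cm from the sheet metal.
dry = plate_capacitance(0.05 * 0.05, 0.01, ER_AIR)
wet = plate_capacitance(0.05 * 0.05, 0.01, ER_WATER)
print(f"dry: {dry*1e12:.2f} pF, submerged: {wet*1e12:.2f} pF")  # ~80x increase
```

The roughly 80-fold increase in capacitance when water displaces air is the physical effect the system relies on to distinguish a submerged sensor from a dry one.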
In some embodiments, there may be a hole in the wheel well to allow air to escape as the water level rises under the vehicle between the capacitive sensors and the sheet metal of the vehicle. The hole in the wheel well enables an accurate reading of the water depth level because the medium between the sheet metal and any capacitive sensors not submerged in water remains air, with its known dielectric constant. When a capacitive sensor is enclosed in an area without access to air, and water vapor is present in the enclosed area, the dielectric constant of the area between the plates housing the capacitive sensor is not exactly the same as that of dry air, and may therefore give rise to inaccurate measurements.
The capacitive water depth sensor system also includes one or more processors that can analyze the intensity and pattern of a capacitive signal generated by the capacitive sensors to determine when the capacitive sensors have accumulated dirt or snow. The one or more processors can further distinguish between splashes of water detected by the capacitive sensors as the vehicle travels over a wet surface and water detected when the vehicle is being driven in standing water. The one or more processors can further determine when the vehicle is simply sitting (parked) on flooded terrain. To this end, the one or more processors may identify any sudden changes in a capacitive signal of the capacitive sensors that occur over a short period (less than a few seconds) as environmental substances begin to coat or stick to the capacitive sensors. Environmental substances may include, but are not limited to, water, snow, dirt, mud, and/or any other substances that have the ability to adhere to the capacitive sensors. The capacitive sensors may detect one or more environmental substances as the environmental substances are projected upward into the gap between the wheel well and the sheet metal but do not remain in the gap. Accordingly, the one or more processors may classify these sudden changes in the capacitive signal as noise signals that are filtered out of a capacitive signal corresponding to a rising water level underneath the vehicle. For example, as the water level underneath a vehicle increases while the vehicle moves from a wet surface where the tires of the vehicle are not submerged to a surface where the tires are completely submerged, the water droplets and any other environmental substances detected by the capacitive sensors are filtered out of the capacitive signal corresponding to the rising water level.
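One plausible way to realize this filtering, offered purely as an illustrative sketch and not as the disclosed implementation, is to require an elevated capacitive reading to persist for longer than a few seconds before treating it as standing or rising water; the class name and tuning values below are hypothetical.

```python
# Illustrative sketch (assumed logic): reject short-lived capacitance spikes
# (splashes, flung mud) and keep only changes that persist long enough to
# indicate standing or rising water.

from collections import deque

class SplashFilter:
    def __init__(self, threshold_pf: float, hold_seconds: float, sample_hz: float):
        self.threshold = threshold_pf
        self.need = int(hold_seconds * sample_hz)  # samples that must stay elevated
        self.window = deque(maxlen=self.need)

    def update(self, capacitance_pf: float) -> bool:
        """Return True only when the signal has stayed above threshold for
        the full hold period (i.e., it is not a transient splash)."""
        self.window.append(capacitance_pf > self.threshold)
        return len(self.window) == self.need and all(self.window)

# Hypothetical tuning: a 3-second hold at 10 Hz sampling.
water_present = SplashFilter(threshold_pf=5.0, hold_seconds=3.0, sample_hz=10.0)
```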
In one example embodiment, a nozzle attached to a liquid dispensing tank may be attached to a component connecting the wheel well to the sheet metal. The space or gap between the wheel well and the sheet metal may be referred to as a chamber. The one or more processors may send instructions to an actuator that may cause the dispensing tank to dispense a cleaning solution onto the capacitive sensors via the nozzle to remove any environmental substances that adhere to the capacitive sensors. These environmental substances may include dirt, mud, snow, or ice. In some embodiments, the one or more processors may send a signal to a display in the cab of the vehicle instructing the operator of the vehicle to press a button on the touch display, or elsewhere in the cab of the vehicle, that will cause the cleaning solution to be dispensed onto the capacitive sensors. In other embodiments, the one or more processors may determine that the capacitive signal received from the capacitive sensors corresponds to a noisy capacitive signal, and may send a signal to the actuator to dispense the cleaning solution onto the capacitive sensors.
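As a minimal sketch of the automatic variant just described, assuming a hypothetical actuator interface and a simple variance-based noise score (the text does not specify how "noisy" is scored), the trigger logic might look like the following.

```python
# Illustrative sketch (hypothetical interface): trigger the cleaning nozzle
# when the capacitive signal is classified as noisy, e.g., when its
# short-window variance exceeds a tuned limit.

from statistics import pvariance

def maybe_clean(recent_samples_pf: list, noise_limit: float, actuator) -> bool:
    """Dispense cleaning solution when the signal variance suggests an
    adhered coating (dirt, mud, snow, or ice) rather than standing water."""
    if len(recent_samples_pf) >= 2 and pvariance(recent_samples_pf) > noise_limit:
        actuator.dispense()  # hypothetical actuator API
        return True
    return False
```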
This disclosure applies to both the powertrain and the electrical infrastructure within the vehicle. In some embodiments, the vehicle may be propelled by an internal combustion engine, a battery electric vehicle (BEV) powertrain, a plug-in hybrid electric vehicle (PHEV) powertrain, one or more turbines, or any other propulsion technology that, if exposed to rising water of a certain depth, may be compromised. The sensors disclosed herein detect water depth levels so as to prevent the compromise of any supporting electrical drives, engine controls, and/or wiring infrastructure included in the vehicle. It should be noted that the word vehicle is inclusive of, but not limited to, petrol-fueled cars, vans, buses, or trucks. The word vehicle also includes electric and/or hybrid cars, vans, buses, or trucks.
In some embodiments, additional capacitive sensors may be placed on or around the engine of the vehicle, or on certain areas around an electrical system of the vehicle. If a particular location of the engine or electrical system happens to be particularly sensitive to moisture, additional capacitive sensors may be placed in these locations to allow earlier detection to help prevent the engine from flooding or the electrical system from short-circuiting. These and other advantages of the present disclosure are provided in greater detail herein.
Embodiments described herein may further include the capacitive water depth sensor system in communication with an autonomous vehicle (AV) controller of an AV. The AV may operate independently, or be part of an AV fleet in communication with a fleet control server. The AV controller may be configured to determine water levels, and rates of rising water using the capacitive sensor system described herein, and perform a mitigating action that causes the vehicle to either clean malfunctioning sensors, or in cases of functional sensors that detect rising water, move the vehicle to a location that mitigates damage risk. Other mitigating actions may be performed as well, including disabling or powering down critical components when the vehicle cannot be moved to another location and sending warning messages to the fleet control server, to occupants of the AV, to other parties with equity in the vehicle or trip, or to emergency personnel.
In some embodiments, the capacitive depth sensor system 102 comprises a processor 104, memory 106, and communications interface 108. The memory 106 stores instructions that are executed by the processor 104 to perform aspects of the discussed water depth condition analysis and generate the warnings disclosed herein. When referring to operations executed by the capacitive depth sensor system 102, it will be understood that this includes the execution of instructions by the processor 104.
The capacitive depth sensor system 102 may be affixed to the inside of a fender portion of a vehicle 150. The capacitive depth sensor system 102 may be implemented on a printed circuit board (PCB).
The processor 104 may perform the same functions as those described with general reference to the processor throughout the application; that is, the processor 104 may perform the blocks in FIG. 10. Capacitive sensor(s) 110 may comprise a first capacitive sensor and a second capacitive sensor as discussed herein. The processor 104 may receive signals from a detector circuit that may be included in the capacitive depth sensor system 102, which indicates when the capacitance of one or both of the capacitive sensor(s) 110 has changed.
The capacitive sensors 208, 210, 212, and 214 may be affixed to the fender molding 202. Water may rise through the gap 218 and may cause a change in the capacitive signal generated by these capacitive sensors as the water rises vertically in the gap 218. The capacitive sensors 208, 210, 212, and 214 may each generate an electrostatic field and a corresponding capacitive signal when there is no water in the gap 218. When there is no water in the gap 218, the capacitive signal generated by each of the capacitive sensors 208, 210, 212, and 214 is based at least in part on the dielectric constant of the ambient air. When water interacts with, or disturbs, the electrostatic field generated by a capacitive sensor, the capacitive signal generated by the capacitive sensor is based at least in part on the dielectric constant of water. The dielectric constant of water is approximately 80 times that of air. Because capacitance is directly proportional to the dielectric constant of the medium between the plates, while, for a given charge, the electrostatic field strength is inversely proportional to that dielectric constant, the presence of water decreases the electrostatic field across a capacitive sensor and increases its capacitance (an increase in the capacitive signal). This decrease in the electrostatic field may correspond to a positive numeric value in the capacitive signal. The one or more processors (not shown in
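Purely as an illustrative sketch of how a processor could translate such signals into a depth reading, the following assumes the sensors 208, 210, 212, and 214 are mounted at known (here hypothetical) heights and reports the height of the highest sensor whose signal indicates submersion.

```python
# Illustrative sketch: estimate water depth from a vertical stack of
# capacitive sensors (e.g., 208, 210, 212, 214) mounted at known heights
# inside the wheel-well gap. Heights and thresholds are hypothetical.

SENSOR_HEIGHTS_CM = {208: 10, 210: 20, 212: 30, 214: 40}  # above ground

def water_depth_cm(readings_pf: dict, submerged_threshold_pf: float) -> int:
    """Return the height of the highest sensor whose capacitive signal
    indicates water (the signal rises with water's ~80x dielectric constant)."""
    wet = [SENSOR_HEIGHTS_CM[sid]
           for sid, c in readings_pf.items()
           if c > submerged_threshold_pf]
    return max(wet, default=0)

depth = water_depth_cm({208: 9.8, 210: 9.5, 212: 1.2, 214: 1.1}, 5.0)
print(depth)  # 20 -> water has reached the sensor mounted at 20 cm
```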
At block 1004, if the capacitive signal is below the threshold value (YES), the method may progress to block 1006, where the AV controller (e.g., the AV controller 1100 as described hereafter with respect to
In some embodiments, the AV controller 1100 may generate the signal to dispense the cleaning solution. In another embodiment, a remote administrative operator (not shown in
At block 1004, if the capacitive signal is not below the threshold value (NO), the method may progress to block 1010, where the AV controller may display an image on the display corresponding to a water depth level associated with the capacitive signal. The water depth level may be detected as explained above with reference to
The method may then progress to block 1010, where the AV controller determines whether a capacitive signal is received from a capacitive sensor corresponding to the highest water level. Responsive to determining that the signal is received (YES), the method may progress to block 1012, where the AV controller can perform one or more mitigating actions, which may include causing the vehicle to move to higher ground and/or generating a message to occupants of the AV providing instructions for exiting the vehicle, etc. For example, determining that the capacitive sensor corresponds to the highest water level may correspond to the example depicted in
The AV controller may also, in some aspects, perform mitigating actions that include displaying an instruction message or warning to vehicle occupants that informs them of actions being taken by the vehicle, such as rerouting a trip due to flood waters, etc. In some embodiments, the AV controller may send a message to a mobile device via an application service hosted by a server (e.g., to a mobile application executing on the mobile device).
If, at block 1012, the AV controller determines that a capacitive signal corresponding to the highest water level is not received (NO), the AV controller determines whether a capacitive signal is received from at least two capacitive sensors, as shown in block 1014. For example, the AV controller may determine whether the water 822 has exceeded capacitive sensors 814 and 812 as in
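Read together, blocks 1004 through 1014 suggest control logic along the following lines; this is an assumed sketch, and all controller method names are hypothetical.

```python
# Illustrative sketch of the decision flow described above (blocks
# 1004-1014); every controller method name here is hypothetical.

def handle_capacitive_signal(signal, threshold, controller):
    if signal < threshold:                        # block 1004: YES branch
        controller.dispense_cleaning_solution()   # block 1006: suspect a
        return                                    #   fouled sensor
    controller.display_water_depth(signal)        # block 1010: show depth image
    if controller.highest_level_sensor_active():  # highest-water-level check
        controller.perform_mitigating_actions()   # block 1012: move to higher
        return                                    #   ground, warn occupants
    if controller.active_sensor_count() >= 2:     # block 1014: at least two
        controller.warn_rising_water()            #   sensors report water
```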
Examples of partial autonomy modes can include autonomy levels 1 through 4 as understood in the art of autonomous driving technology. By way of a short overview of the levels of autonomy, a vehicle having Level-1 autonomy may include a single automated driver assistance feature, such as for steering or acceleration. Adaptive cruise control is one such example of a Level-1 autonomous system. Level-2 autonomy in vehicles may provide partial automation of steering and acceleration functionality, where the automated system(s) are supervised by a human driver who performs non-automated operations such as braking and other controls. Level-3 autonomy in a vehicle can provide conditional automation and control of driving features, such as, for example, “environmental detection” capabilities by which the vehicle can make informed decisions for itself. For example, a Level-3 autonomous vehicle may control accelerating past a slow-moving vehicle, while a driver remains ready to retake control of the vehicle if the system is unable to execute the task. Level-4 autonomous vehicles may generally include a high level of autonomy and can operate independently of a human driver, but still include manual controls for a human-operator override. Level-4 automation may also enable a self-driving mode to intervene responsive to a predefined conditional trigger, such as a road hazard or a system failure. A vehicle with Level-5 autonomy may not include any human vehicle controls and is generally configured to operate with non-human control system(s). Vehicle 1103 may include any level of autonomy. In a preferred embodiment, vehicle 1103 may be a Level-4 or Level-5 autonomous vehicle.
AV controller 1100 may generally include an object collision avoidance system 1110, and a capacitive water depth sensor system 1135 configured to interface with vehicle drive, communication, sensory, and other systems, to receive control instructions from mobility control module 1105 for navigating and driving vehicle 1103 according to embodiments of the present disclosure. Example vehicle systems may include a drive wheel controller 1115. The object collision avoidance system 1110 may communicate one or more control signals to internal drive components and systems, such as the drive wheel controller 1115, which controls one or more traction motor(s) 1120, and interfaces with external systems and vehicles via a wireless transmitter 1130, which may be disposed in communication with the mobility control module 1105.
The mobility control module 1105 may include one or more processor(s) 1150 and a memory 1155. Processor(s) 1150 may be one or more commercially available general-purpose processor(s), such as a processor from the Intel® or ARM® architecture families. In some aspects, mobility control module 1105 may be implemented in a system on a chip (SoC) configuration, to include other system components such as RAM, flash storage, and I/O buses. Alternatively, mobility control module 1105 can be implemented using purpose-built integrated circuits, or any other suitable technology now known or later developed.
Mobility control module 1105 may receive navigational data from navigation receiver(s) 1140, the proximity sensor(s) 1133, and the capacitive water depth sensor system 1135, determine a navigational path from a first location to a second location based on water depth information, and provide instructions to the drive wheel controller 1115 for autonomous and/or semi-autonomous operation that moves the vehicle 1103 to a location with an operable water depth. Operable water depth, as described in greater detail hereafter, may be a maximum water depth at which the vehicle 1103 is able to operate within its intended design.
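As an illustrative sketch only, one way such a path determination could weigh water depth is to exclude candidate destinations whose recorded depth meets or exceeds the operable water depth; the data shape is hypothetical, and the 18-inch figure echoes the example threshold given later in this disclosure.

```python
# Illustrative sketch (hypothetical data shape): choose the nearest known
# location whose recorded water depth is within the operable water depth.

OPERABLE_DEPTH_IN = 18.0  # example threshold, per the example given later

def pick_destination(candidates: list):
    """candidates: [{'location': ..., 'known_depth_in': float,
    'distance_m': float}, ...]; return the nearest operable one, or None."""
    safe = [c for c in candidates if c["known_depth_in"] < OPERABLE_DEPTH_IN]
    return min(safe, key=lambda c: c["distance_m"], default=None)
```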
The memory 1155 may include executable instructions implementing the basic functionality of the capacitive water depth sensor system 1135, and/or a database of locations in a particular geographic area. An example geographic area is discussed with respect to
The capacitive water depth sensor system 1135 may be substantially similar or identical to the capacitive depth sensor system 102 described with respect to
The object collision avoidance system 1110 may be disposed in communication with the capacitive water depth sensor system 1135, and may include one or more proximity sensor(s) 1133, one or more navigation receiver(s) 1140, and a navigation interface 1145 through which users of the vehicle 1103 may receive information associated with water depth surrounding the vehicle 1103, and may receive information from other vehicles in an autonomous vehicle fleet 1116 with which the vehicle 1103 may be associated. Mobility control module 1105 may communicate wheel engagement instructions, braking instructions, and other information to the drive wheel controller 1115, including signals for control of the one or more traction motor(s) 1120.
In an example embodiment, the mobility control module 1105 may further include a key 1139, which may be configured to activate operation of the vehicle 1103. The key 1139 may be a physical key or may be an identification code or a password entered by a user via a touch screen interface (e.g., the interface device 1125). The identification code may be associated with a service provider (not shown in
The vehicle 1103 may communicate with one or more other AVs in fleet 1116 in various ways, including via an indirect communication channel using network 1160, and/or via a direct communication channel between the vehicle 1103 and other vehicles of the fleet 1116, such as AV 1218 as depicted in
The wireless transmitter 1130 may embody any technology now known or later developed, using one or more telecommunications, vehicle-to-vehicle, or other communication protocols to communicate with one or more other autonomous vehicles in the fleet 1116 and/or with the autonomous vehicle fleet control server 1170 using a wireless communication network such as, for example, network 1160. For example, the AV controller 1100 may receive water depth information from the capacitive water depth sensor system 1135. The water depth information may include an operative water depth associated with the capacitive sensor mounted at the top-most portion of the fender with respect to a surface of the ground. More particularly, the operative water depth may be associated with the capacitive signal from that sensor. Accordingly, the AV controller 1100 may generate vehicle control instructions for performing a mitigating action based on the water depth information.
In one example, the AV controller 1100 may send a message to the fleet control server 1170 via network 1160, and receive a response message from the fleet control server 1170 via the network 1160. The response message may indicate a route recommendation for moving vehicle 1103 to a location with an operable water depth, where another AV in fleet 1116 provides water depth information to the fleet control server 1170, and that second AV's water depth information is used in the analysis for moving vehicle 1103 to the recommended location.
Network 1160 may be and/or include the Internet, a private network, a public network, or another configuration that operates using any one or more known communication protocols such as, for example, transmission control protocol/Internet protocol (TCP/IP), Bluetooth®, Wi-Fi, Ultra-Wideband (UWB), and cellular technologies such as Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), High-Speed Downlink Packet Access (HSDPA), Long-Term Evolution (LTE), Global System for Mobile Communications (GSM), and Fifth Generation (5G), to name a few examples. The network 1160 illustrates an example of one possible communication infrastructure in which the connected devices and vehicles 1103 and/or 1218 (as shown in
Navigation receiver(s) 1140 can include one or more of a global positioning system (GPS) receiver, and/or other related satellite navigation systems such as GLONASS, Galileo, or other similar systems known in the art of autonomous vehicle operation. Additionally, navigation receiver(s) 1140 can be configured to receive locally based navigation cues to aid in precise navigation through space-restricted areas, such as, for example, in a crowded street, and/or in a distributed beacon environment.
Proximity sensor(s) 1133 may work in connection with navigation receiver(s) 1140 to provide situational awareness to the mobility control module 1105 for autonomous navigation. For example, proximity sensor(s) 1133 may provide a secondary source of data that can indicate the presence of standing or moving water, extreme weather, obstacles partially submerged in standing or moving water, and other conditions. Proximity sensor(s) 1133 may alert the mobility control module 1105 to the presence of sensed obstacles, and provide trajectory information to the mobility control module 1105, where the trajectory information is indicative of moving objects or people that may interact with the vehicle 1103. Sensed obstacles can include other vehicles, pedestrians, animals, structures, curbs, and other random objects.
In some aspects, the AV controller 1100 may perform one or more mitigating actions based, at least in part, on information received from proximity sensor(s) 1133. The trajectory information may include one or more of a relative distance, a trajectory, a speed, a size approximation, a weight approximation, and/or other information that may indicate the physical characteristics of a physical object or person. The mobility control module 1105 may be configured to aggregate information from the navigation receiver(s) 1140, such as current position and speed, along with sensed obstacles from proximity sensor(s) 1133, and interpret the aggregated information to compute a safe path towards a destination such that the vehicle 1103 avoids collisions. In some implementations, proximity sensor(s) 1133 may be configured to determine the lateral dimensions of the path upon which the vehicle 1103 is traveling (e.g., determining relative distance from the side of a sidewalk or curb), to help aid the mobility control module 1105 in maintaining precise navigation on a particular path.
The interface device 1125 may allow a passenger and/or operator of the vehicle 1103 to receive information associated with any mitigating actions. The interface device 1125 may include a touch screen interface surface (not shown in
Mobility control module 1105 may connect with the drive wheel controller 1115, which in turn may operate one or more traction motor(s) 1120. The mobility control module 1105 may communicate with the drive wheel controller 1115 for providing autonomous and/or semi-autonomous navigation to selected points of interest such as, for example, a location with a navigable level of floodwater. The drive wheel controller 1115 may also control one or more drive mechanisms such as, for example, one or more brushless direct current (DC) motor(s), or another traction motor technology.
For example, the AV controller 1100 may determine that the water 1205 is not the cause of the capacitive sensor signal triggering the alert, but rather the capacitive sensor is covered by an environmental substance such as dirt, mud, road debris, tar, etc. In an embodiment, AV controller 1100 may generate one or more vehicle control instruction(s) for causing the vehicle 1103 to send a sensor clean instruction to a capacitive sensor cleaning system. In one aspect, the sensor clean instruction may be configured to cause an actuator of the capacitive sensor cleaning system (not shown in
The AV controller 1100 may also update a vehicle maintenance log (not shown in
In another example embodiment, the mitigating action may include instructions that cause the vehicle 1103 to navigate to a car wash for self-cleaning, which may clear the sensor system of any debris or environmental substance.
The vehicle 1103 may not be experiencing a false signal due to dirty capacitive sensors, but rather may be surrounded by static or rising surface water 1205. According to another example embodiment, performing the mitigating action can include generating one or more instructions, via mobility control module 1105, that cause the vehicle 1103 to move/return to a location with operable water depth (e.g., a location 1211). The operable water depth may be a water depth known to be navigable by the vehicle 1103 based on known configuration parameters of the vehicle 1103. For example, vehicle height from the road (that is, clearance between the road surface and the undercarriage of vehicle 1103), position of critical electrical components onboard vehicle 1103, such as motors, circuitry, etc., and other factors associated with vehicle 1103 design may be known and tested prior to deployment in the field. In one aspect, operable water depth may be a water depth determined to be less than a threshold water depth. The threshold water depth can be a maximum water depth over which vehicle 1103 is not designed to operate.
AV controller 1100 may determine that the location 1211 is associated with operable water depth in various ways. In one example, the determination may be based on a persistent memory of water depths that were observed (detected) and saved to memory by the capacitive water depth sensor system 1135. The water depth information may be recorded with respect to time, with respect to the vehicle location, and/or based on information received from the autonomous vehicle fleet control server 1170. In this example, if vehicle 1103 traversed water at a 12″ depth within the last three minutes at the location 1211, which may be a relatively short distance (e.g., 15 feet, 100 yards, etc.) from a present position of vehicle 1103, the AV controller 1100 may determine that the prior location 1211 is likely to remain below the threshold water depth for safe vehicle operation. In an example embodiment, a threshold water depth may be 18″ or less. Accordingly, AV controller 1100 may determine that, 3 minutes ago, the water depth at location 1211 was 12″, and thus within a range for operable water depth. The AV controller 1100 may base the determination on a rate of rising water, weather conditions, camera information, data received from the fleet control server 1170, and other factors that may provide a predictive data set for predicting a water depth at a remote location such as location 1211.
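A minimal sketch of the worked example above follows, assuming a simple linear projection of the observed depth at an estimated rise rate; the linear model is an assumption, since the text does not specify the predictive data set's form.

```python
# Illustrative sketch: a depth observed recently at a nearby prior location
# is projected forward at an estimated rise rate and compared against the
# operable-depth threshold. The linear rise model is a simplification.

def location_is_operable(observed_depth_in: float,
                         minutes_ago: float,
                         rise_rate_in_per_min: float,
                         threshold_in: float = 18.0) -> bool:
    """Return True if the projected depth stays under the threshold."""
    projected = observed_depth_in + rise_rate_in_per_min * minutes_ago
    return projected < threshold_in

# 12" observed 3 minutes ago, water rising ~1"/min -> 15" projected: operable.
print(location_is_operable(12.0, 3.0, 1.0))  # True
```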
AV controller 1100 may further determine that emergency help may be needed from another vehicle, such as a tow vehicle (not shown). Accordingly, AV controller 1100 may send a message to the fleet control server 1170 requesting emergency help. In an embodiment, the message sent to the fleet control server 1170 may include location information, vehicle information, water depth information, the nature of the emergency, or other contemplated information. The autonomous vehicle fleet control server 1170 may also be disposed in communication with emergency services associated with a city, region, etc., such that information received from the capacitive sensor system is shared with emergency services. The fleet control server 1170 can also receive traffic information, emergency vehicle information, and the like, and response messages to the vehicle 1103 can include that information.
In another example embodiment, vehicle 1103 may receive a response message from the autonomous vehicle fleet control server 1170 indicating a route recommendation for vehicle 1103. The response message may include location information 1230, indicative of the location 1220, and water depth information 1240 that can include value(s) associated with the water depth at location 1220. The message may further include route information 1235, which can provide GPS information associated with a route 1225 for navigating to the location 1220, and/or vehicle identification information (e.g., the vehicle ID 1245) associated with a second autonomous vehicle 1218 of fleet 1116 that may provide such information. Accordingly, vehicle 1103 may move/return to the location 1220 based at least in part on the operable water depth information 1240 for the location 1220.
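Expressed as an illustrative sketch only (the disclosure does not specify a wire format), the response-message fields named above might be grouped as follows; the field types are assumptions.

```python
# Illustrative sketch of the response-message fields (1230, 1235, 1240,
# 1245) described above; the concrete format and types are hypothetical.

from dataclasses import dataclass

@dataclass
class RouteRecommendation:
    location_info: tuple        # 1230: coordinates of the location 1220
    route_info: list            # 1235: GPS waypoints for the route 1225
    water_depth_info: float     # 1240: reported depth at location 1220
    source_vehicle_id: str      # 1245: fleet AV (e.g., AV 1218) that
                                #       supplied the depth reading
```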
In another example embodiment, the mitigating action may include instructions for performing a body height adjusting operation. In this example, vehicle 1103 may include one or more lifts configured to extend a clearance between the road 1210 and vehicle body portions that include sensors associated with the capacitive water depth sensor system 1135. For example, the AV controller 1100 may cause the mobility control module 1105 to instruct body lift actuators (not shown in
The AV controller 1100 may perform other mitigating actions based on water depth information, including instructing one or more vehicle systems to disengage, power down, or otherwise prepare for imminent water contact. By powering down, the AV controller 1100 may mitigate possible damage associated with water covering critical or sensitive engine components.
In another example, the mitigating action may include providing power to one or more window actuators to provide emergency vehicle access or a means of exit in a catastrophic flood event.
A mitigating action may further include causing a water diversion mechanism (not shown in
At step 1302, the AV controller 1100 may receive a first capacitive signal from a first capacitive sensor in a vehicle wheel well of an autonomous vehicle (AV). The AV may be, for example, vehicle 1103 as shown in
At step 1304, the capacitive water depth sensor system 1135 may determine that the first capacitive signal exceeds a threshold value.
At step 1306, the capacitive water depth sensor system 1135 may receive a second capacitive signal from a second capacitive sensor corresponding to a water level.
At step 1308, the capacitive water depth sensor system 1135 may send, to AV controller 1100, first water depth information comprising a water depth level associated with the first capacitive signal.
At step 1310, the capacitive water depth sensor system 1135 may receive, from the AV controller 1100, a vehicle control instruction for performing a mitigating action.
At step 1312, the capacitive water depth sensor system 1135 may perform the mitigating action based on the vehicle control instruction.
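Taken together, steps 1302 through 1312 suggest an end-to-end sequence along the following lines; this sketch assumes hypothetical interfaces on both the sensor system and the AV controller.

```python
# Illustrative sketch of steps 1302-1312 (assumed, hypothetical interfaces).

def run_depth_monitoring(sensor_system, av_controller):
    first = sensor_system.read_first_sensor()              # step 1302
    if not sensor_system.exceeds_threshold(first):         # step 1304
        return
    second = sensor_system.read_second_sensor()            # step 1306
    depth_info = sensor_system.depth_from(first, second)
    av_controller.receive_depth_info(depth_info)           # step 1308
    instruction = av_controller.mitigation_instruction()   # step 1310
    sensor_system.perform(instruction)                     # step 1312
```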
In the above disclosure, reference has been made to the accompanying drawings, which form a part hereof, which illustrate specific implementations in which the present disclosure may be practiced. It is understood that other implementations may be utilized, and structural changes may be made without departing from the scope of the present disclosure. References in the specification to “one embodiment,” “an embodiment,” “an example embodiment,” etc., indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, one skilled in the art will recognize such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.
Implementations of the systems, apparatuses, devices, and methods disclosed herein may comprise or utilize one or more devices that include hardware, such as, for example, one or more processors and system memory, as discussed herein.
An implementation of the devices, systems, and methods disclosed herein may communicate over a computer network. A “network” and a “bus” are defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices. When information is transferred or provided over a network, a bus, or another communications connection (either hardwired, wireless, or any combination of hardwired or wireless) to a computer, the computer properly views the connection as a transmission medium. Transmission media can include a network and/or data links, which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general-purpose or special-purpose computer. Combinations of the above should also be included within the scope of non-transitory computer-readable media.
Computer-executable instructions comprise, for example, instructions and data which, when executed at a processor, cause the processor to perform a certain function or group of functions. The computer-executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the features or acts described above. Rather, the described features and acts are disclosed as example forms of implementing the claims.
Those skilled in the art will appreciate that the present disclosure may be practiced in network computing environments with many types of computer system configurations, including in-dash vehicle computers, personal computers, desktop computers, laptop computers, message processors, handheld devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, tablets, pagers, routers, switches, various storage devices, and the like. The disclosure may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by any combination of hardwired and wireless data links) through a network, both perform tasks. In a distributed system environment, program modules may be located in both the local and remote memory storage devices.
Further, where appropriate, the functions described herein can be performed in one or more of hardware, software, firmware, digital components, or analog components. For example, one or more application-specific integrated circuits (ASICs) can be programmed to carry out one or more of the systems and procedures described herein. Certain terms are used throughout the description and claims to refer to particular system components. As one skilled in the art will appreciate, components may be referred to by different names. This document does not intend to distinguish between components that differ in name, but not function.
It should be noted that the sensor embodiments discussed above may comprise computer hardware, software, firmware, or any combination thereof to perform at least a portion of their functions. For example, a sensor may include computer code configured to be executed in one or more processors and may include hardware logic/electrical circuitry controlled by the computer code. These example devices are provided herein for purposes of illustration and are not intended to be limiting. Embodiments of the present disclosure may be implemented in further types of devices, as would be known to persons skilled in the relevant art(s).
At least some embodiments of the present disclosure have been directed to computer program products comprising such logic (e.g., in the form of software) stored on any computer-usable medium. Such software, when executed in one or more data processing devices, causes a device to operate as described herein.
While various embodiments of the present disclosure have been described above, it should be understood that they have been presented by way of example only, and not limitation. It will be apparent to persons skilled in the relevant art that various changes in form and detail can be made therein without departing from the spirit and scope of the present disclosure. Thus, the breadth and scope of the present disclosure should not be limited by any of the above-described exemplary embodiments but should be defined only in accordance with the following claims and their equivalents. The foregoing description has been presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the present disclosure to the precise form disclosed. Many modifications and variations are possible in light of the above teaching. Further, it should be noted that any or all of the aforementioned alternate implementations may be used in any combination desired to form additional hybrid implementations of the present disclosure. For example, any of the functionality described with respect to a particular device or component may be performed by another device or component. Further, while specific device characteristics have been described, embodiments of the disclosure may relate to numerous other device characteristics. Further, although embodiments have been described in language specific to structural features and/or methodological acts, it is to be understood that the disclosure is not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as illustrative forms of implementing the embodiments. Conditional language, such as, among others, “can,” “could,” “might,” or “may,” unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments could include, while other embodiments may not include, certain features, elements, and/or steps. Thus, such conditional language is not generally intended to imply that features, elements, and/or steps are in any way required for one or more embodiments. Although certain aspects of various embodiments may have been described using a singular word or phrase (such as “a signal” or “a processor”) it should be understood that the description may be equally applicable to plural words or phrases (such as “signals” and “processors”).