Embodiments of the present disclosure relate to systems and methods for enhancing electric vehicle range estimation using sensor inputs.
As countries and consumers work toward reducing the burning of fossil fuels, electric vehicles are becoming increasingly prevalent on the road.
Unlike the relatively quick process of refueling a vehicle having an internal combustion engine, recharging an electric vehicle's batteries is time-consuming. Due to the time-consuming nature of electric vehicle recharging, it is imperative that battery life estimates, including vehicle range estimates, be as accurate as possible, in order to enable electric vehicle users to travel efficiently between instances of recharging the electric vehicle.
Current systems generally use vehicle navigation systems to determine the distance intended to be traveled in order to determine a range estimate. However, environmental factors can affect the actual range of an electric vehicle, which often results in changes to the range estimate during travel. Many electric vehicle users experience range anxiety, especially when the estimated range drops rapidly while climbing a hill or driving in a headwind condition.
For at least these reasons, electric vehicle range estimation systems and methods which incorporate environmental factors when determining an estimated vehicle range are needed.
According to an object of the present disclosure, a method for enhancing battery electric vehicle (BEV) range estimation using sensor inputs is provided.
In certain preferred aspects, the present methods may comprise: determining a BEV range estimate of a BEV, generating a determined BEV range estimate; determining whether one or more conditions are present; when one or more conditions are present, performing a BEV range correction, generating a corrected BEV range estimate, comprising assigning a value to each condition, generating an assigned value for each condition, multiplying the assigned value by an associated gain factor for each condition, and adding, to the determined BEV range estimate, a product of the assigned value and the associated gain factor, for each condition; and outputting the corrected BEV range estimate.
According to an exemplary embodiment, the determining the BEV range estimate may comprise analyzing, for the BEV, one or more of the following: a battery life estimate of one or more batteries of the BEV, using a battery monitoring system of the BEV; and a mass of the BEV.
According to an exemplary embodiment, the one or more conditions may comprise one or more of the following: a presence of wind; a presence of one or more road conditions; a presence of one or more terrain conditions; a presence of one or more weather phenomena; a speed of the BEV; a presence of a hill; an occurrence of the BEV ascending the hill; an occurrence of the BEV descending the hill; an occurrence of the BEV towing an object; and a mass of an object being towed by the BEV.
According to an exemplary embodiment, the determining whether the one or more conditions are present may comprise receiving one or more inputs from one or more sensors in electronic communication with a computing device of the BEV.
According to an exemplary embodiment, the one or more sensors may comprise one or more of the following: one or more cameras; one or more radar sensors; and one or more LiDAR sensors.
According to an exemplary embodiment, the assigning the value to each condition may comprise, when a condition is present, assigning a value of 1 to the condition, and, when a condition is not present, assigning a value of 0 to the condition.
According to an exemplary embodiment, the assigning the value to each condition may comprise, when the condition is a hill, assigning a value of 0 when the hill is not present, assigning a value of +1 when the hill is present and the BEV is ascending the hill, and assigning a value of −1 when the hill is present and the BEV is descending the hill.
According to an object of the present disclosure, a system for enhancing BEV range estimation using sensor inputs is provided. The system may comprise a BEV comprising a processor and a graphical user interface. The processor may be configured to determine a BEV range estimate of a BEV, generating a determined BEV range estimate, determine whether one or more conditions are present, and, when one or more conditions are present, perform a BEV range correction, generating a corrected BEV range estimate, comprising assigning a value to each condition, generating an assigned value for each condition, multiplying the assigned value by an associated gain factor for each condition, and adding, to the determined BEV range estimate, a product of the assigned value and the associated gain factor, for each condition. The processor may further be configured to output, via the graphical user interface, the corrected BEV range estimate.
According to an exemplary embodiment, the determining the BEV range estimate may comprise analyzing, for the BEV, one or more of the following: a battery life estimate of one or more batteries of the BEV, using a battery monitoring system of the BEV; and a mass of the BEV.
According to an exemplary embodiment, the one or more conditions may comprise one or more of the following: a presence of wind; a presence of one or more road conditions; a presence of one or more terrain conditions; a presence of one or more weather phenomena; a speed of the BEV; a presence of a hill; an occurrence of the BEV ascending the hill; an occurrence of the BEV descending the hill; an occurrence of the BEV towing an object; and a mass of an object being towed by the BEV.
According to an exemplary embodiment, the determining whether the one or more conditions are present may comprise receiving one or more inputs from one or more sensors in electronic communication with a computing device of the BEV.
According to an exemplary embodiment, the system may further comprise the one or more sensors. The one or more sensors may comprise one or more of the following: one or more cameras; one or more radar sensors; and one or more LiDAR sensors.
According to an exemplary embodiment, the assigning the value to each condition may comprise, when a condition is present, assigning a value of 1 to the condition, and, when a condition is not present, assigning a value of 0 to the condition.
According to an exemplary embodiment, the assigning the value to each condition may comprise, when the condition is a hill, assigning a value of 0 when the hill is not present, assigning a value of +1 when the hill is present and the BEV is ascending the hill, and assigning a value of −1 when the hill is present and the BEV is descending the hill.
According to an object of the present disclosure, a system for enhancing BEV range estimation using sensor inputs is provided. The system may comprise a BEV comprising a graphical user interface, and a computing device, comprising a processor and a memory. The memory may be configured to store programming instructions that, when executed by the processor, cause the processor to determine a BEV range estimate of a BEV, generating a determined BEV range estimate, determine whether one or more conditions are present, and, when one or more conditions are present, perform a BEV range correction, generating a corrected BEV range estimate, comprising assigning a value to each condition, generating an assigned value for each condition, multiplying the assigned value by an associated gain factor for each condition, and adding, to the determined BEV range estimate, a product of the assigned value and the associated gain factor, for each condition. The memory may further be configured to store programming instructions that, when executed by the processor, cause the processor to output, such as via the graphical user interface, the corrected BEV range estimate.
According to an exemplary embodiment, the determining the BEV range estimate may comprise analyzing, for the BEV, one or more of the following: a battery life estimate of one or more batteries of the BEV, using a battery monitoring system of the BEV; and a mass of the BEV.
According to an exemplary embodiment, the one or more conditions may comprise one or more of the following: a presence of wind; a presence of one or more road conditions; a presence of one or more terrain conditions; a presence of one or more weather phenomena; a speed of the BEV; a presence of a hill; an occurrence of the BEV ascending the hill; an occurrence of the BEV descending the hill; an occurrence of the BEV towing an object; and a mass of an object being towed by the BEV.
According to an exemplary embodiment, the BEV further may comprise one or more sensors, the determining whether the one or more conditions are present may comprise receiving one or more inputs from one or more sensors in electronic communication with a computing device of the BEV, and the one or more sensors may comprise one or more of the following: one or more cameras; one or more radar sensors; and one or more LiDAR sensors.
According to an exemplary embodiment, the assigning the value to each condition may comprise, when a condition is present, assigning a value of 1 to the condition, and, when a condition is not present, assigning a value of 0 to the condition.
According to an exemplary embodiment, the assigning the value to each condition may comprise, when the condition is a hill, assigning a value of 0 when the hill is not present, assigning a value of +1 when the hill is present and the BEV is ascending the hill, and assigning a value of −1 when the hill is present and the BEV is descending the hill.
The accompanying drawings, which are incorporated in and form a part of the Detailed Description, illustrate various non-limiting and non-exhaustive embodiments of the subject matter and, together with the Detailed Description, serve to explain principles of the subject matter discussed below. Unless specifically noted, the drawings referred to in this Brief Description of Drawings should be understood as not being drawn to scale and like reference numerals refer to like parts throughout the various figures unless otherwise specified.
The following Detailed Description is merely provided by way of example and not of limitation. Furthermore, there is no intention to be bound by any expressed or implied theory presented in the preceding background or in the following Detailed Description.
Reference will now be made in detail to various exemplary embodiments of the subject matter, examples of which are illustrated in the accompanying drawings. While various embodiments are discussed herein, it will be understood that they are not intended to limit to these embodiments. On the contrary, the presented embodiments are intended to cover alternatives, modifications, and equivalents, which may be included within the spirit and scope of the various embodiments as defined by the appended claims. Furthermore, in this Detailed Description, numerous specific details are set forth in order to provide a thorough understanding of embodiments of the present subject matter. However, embodiments may be practiced without these specific details. In other instances, well known methods, procedures, components, and circuits have not been described in detail as not to unnecessarily obscure aspects of the described embodiments.
Some portions of the detailed descriptions which follow are presented in terms of procedures, logic blocks, processing, and other symbolic representations of operations on data within an electrical device. These descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. In the present application, a procedure, logic block, process, or the like, is conceived to be one or more self-consistent procedures or instructions leading to a desired result. The procedures are those requiring physical manipulations of physical quantities. Usually, although not necessarily, these quantities may take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated in an electronic system, device, and/or component.
It should be borne in mind, however, that these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussions, it is appreciated that throughout the description of embodiments, discussions utilizing terms such as “determining,” “communicating,” “taking,” “comparing,” “monitoring,” “calibrating,” “estimating,” “initiating,” “providing,” “receiving,” “controlling,” “transmitting,” “isolating,” “generating,” “aligning,” “synchronizing,” “identifying,” “maintaining,” “displaying,” “switching,” or the like, refer to the actions and processes of an electronic item such as: a processor, a sensor processing unit (SPU), a processor of a sensor processing unit, an application processor of an electronic device/system, or the like, or a combination thereof. The item manipulates and transforms data represented as physical (electronic and/or magnetic) quantities within the registers and memories into other data similarly represented as physical quantities within memories or registers or other such information storage, transmission, processing, or display components.
It is understood that the term “vehicle” or “vehicular” or other similar term as used herein is inclusive of motor vehicles in general such as passenger automobiles including sports utility vehicles (SUV), buses, trucks, various commercial vehicles, watercraft including a variety of boats and ships, aircraft, and the like, and includes hybrid vehicles, electric vehicles, plug-in hybrid electric vehicles, hydrogen-powered vehicles and other alternative fuel vehicles (e.g., fuels derived from resources other than petroleum). As referred to herein, a hybrid vehicle is a vehicle that has two or more sources of power, for example a vehicle that is both gasoline-powered and electric-powered. In aspects, a vehicle may comprise an internal combustion engine system as disclosed herein.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used herein, the singular forms “a,” “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. Although terms such as “first” and “second” may be used herein to describe various components, these terms are merely intended to distinguish one component from another component, and the terms do not limit the nature, sequence or order of the constituent components. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items. Throughout the specification, unless explicitly described to the contrary, the word “comprise” and variations such as “comprises” or “comprising” will be understood to imply the inclusion of stated elements but not the exclusion of any other elements. In addition, the terms “unit”, “-er”, “-or”, and “module” described in the specification mean units for processing at least one function and operation, and can be implemented by hardware components or software components and combinations thereof.
Although an exemplary embodiment is described as using a plurality of units to perform the exemplary process, it is understood that the exemplary processes may also be performed by one or a plurality of modules. Additionally, it is understood that the term controller/control unit refers to a hardware device that includes a memory and a processor and is specifically programmed to execute the processes described herein. The memory is configured to store the modules and the processor is specifically configured to execute said modules to perform one or more processes which are described further below.
Further, the control logic of the present disclosure may be embodied as non-transitory computer readable media on a computer readable medium containing executable program instructions executed by a processor, controller or the like. Examples of computer readable media include, but are not limited to, ROM, RAM, compact disc (CD)-ROMs, magnetic tapes, floppy disks, flash drives, smart cards and optical data storage devices. The computer readable medium can also be distributed in network coupled computer systems so that the computer readable media is stored and executed in a distributed fashion, e.g., by a telematics server or a Controller Area Network (CAN).
Unless specifically stated or obvious from context, as used herein, the term “about” is understood as within a range of normal tolerance in the art, for example within 2 standard deviations of the mean. “About” can be understood as within 10%, 9%, 8%, 7%, 6%, 5%, 4%, 3%, 2%, 1%, 0.5%, 0.1%, 0.05%, or 0.01% of the stated value. Unless otherwise clear from the context, all numerical values provided herein are modified by the term “about”.
Embodiments described herein may be discussed in the general context of processor-executable instructions residing on some form of non-transitory processor-readable medium, such as program modules, executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, etc., that perform particular tasks or implement particular abstract data types. The functionality of the program modules may be combined or distributed as desired in various embodiments.
In the figures, a single block may be described as performing a function or functions; however, in actual practice, the function or functions performed by that block may be performed in a single component or across multiple components, and/or may be performed using hardware, using software, or using a combination of hardware and software. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, logic, circuits, and steps have been described generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure. Also, the example systems and/or electronic devices described herein may include components other than those shown, including well-known components.
Various techniques described herein may be implemented in hardware, software, firmware, or any combination thereof, unless specifically described as being implemented in a specific manner. Any features described as modules or components may also be implemented together in an integrated logic device or separately as discrete but interoperable logic devices. If implemented in software, the techniques may be realized at least in part by a non-transitory processor-readable storage medium comprising instructions that, when executed, perform one or more of the methods described herein. The non-transitory processor-readable data storage medium may form part of a computer program product, which may include packaging materials.
The non-transitory processor-readable storage medium may comprise random access memory (RAM) such as synchronous dynamic random access memory (SDRAM), read only memory (ROM), non-volatile random access memory (NVRAM), electrically erasable programmable read-only memory (EEPROM), FLASH memory, other known storage media, and the like. The techniques additionally, or alternatively, may be realized at least in part by a processor-readable communication medium that carries or communicates code in the form of instructions or data structures and that can be accessed, read, and/or executed by a computer or other processor.
Various embodiments described herein may be executed by one or more processors, such as one or more motion processing units (MPUs), sensor processing units (SPUs), host processor(s) or core(s) thereof, digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), application specific instruction set processors (ASIPs), field programmable gate arrays (FPGAs), a programmable logic controller (PLC), a complex programmable logic device (CPLD), discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein, or other equivalent integrated or discrete logic circuitry. The term “processor,” as used herein may refer to any of the foregoing structures or any other structure suitable for implementation of the techniques described herein. As employed in the subject specification, the term “processor” can refer to substantially any computing processing unit or device comprising, but not limited to comprising, single-core processors; single-processors with software multithread execution capability; multi-core processors; multi-core processors with software multithread execution capability; multi-core processors with hardware multithread technology; parallel platforms; and parallel platforms with distributed shared memory. Moreover, processors can exploit nano-scale architectures such as, but not limited to, molecular and quantum-dot based transistors, switches and gates, in order to optimize space usage or enhance performance of user equipment. A processor may also be implemented as a combination of computing processing units.
In addition, in some aspects, the functionality described herein may be provided within dedicated software modules or hardware modules configured as described herein. Also, the techniques could be fully implemented in one or more circuits or logic elements. A general purpose processor may be a microprocessor, but in the alternative, the processor may be any processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of an SPU/MPU and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with an SPU core, MPU core, or any other such configuration. One or more components of an SPU or electronic device described herein may be embodied in the form of one or more of a “chip,” a “package,” an Integrated Circuit (IC).
The actual range of a battery electric vehicle (BEV) is highly influenced by external/environmental conditions, such as, e.g., wind, road conditions (e.g., road material, smoothness, etc.), speed, whether the BEV is climbing or descending hills, whether the BEV is towing, and/or other conditions. One of the top customer complaints regarding BEVs is range. Many customers experience range anxiety, especially when the vehicle range drops rapidly while, e.g., climbing a hill or in a headwind condition. These and other conditions can be accounted for by using, e.g., onboard camera and radar systems from, e.g., an advanced driver assistance system (ADAS).
Due to all of these external influences, the predicted/estimated BEV range can have a large error. The systems and methods of the present disclosure may use the ADAS's camera and radar sensors (among other possible sensors) to detect these and other external conditions that may change the driving range of the BEV.
Referring now to
According to an exemplary embodiment, the vehicle 100 may comprise one or more sensors such as, for example, one or more LiDAR sensors 105, one or more radio detection and ranging (RADAR) sensors 110, one or more cameras 115, and/or one or more position determining sensors 120 (e.g., one or more Global Positioning System devices), among other suitable sensors. According to an exemplary embodiment, the one or more sensors may be in electronic communication with one or more computing devices 125. The one or more computing devices 125 may be separate from the one or more sensors and/or may be incorporated into the one or more sensors.
According to an exemplary embodiment, the computing device 125 may comprise a processor 130 and/or a memory 135. The memory 135 may be configured to store programming instructions that, when executed by the processor 130, may cause the processor 130 to perform one or more tasks such as, e.g., receiving one or more inputs from one or more sensors (e.g., one or more cameras, one or more radar sensors, and/or other suitable sensors), determining whether one or more conditions are present which may affect BEV range and range estimation based on the one or more inputs, weighting these one or more conditions based on how the one or more conditions may affect BEV range, generating a correction factor for each of the present one or more conditions, applying the correction factor or factors, determining one or more vehicle actions, and/or performing one or more vehicle actions, among other functions.
According to an exemplary embodiment, the memory 135 may be configured to store a smart BEV range estimation algorithm which may be executed by the processor 130. The smart BEV range estimation algorithm may comprise an ADAS and map info processing model.
According to an exemplary embodiment, the smart BEV range estimation algorithm, when executed by the processor 130, may be configured to cause the vehicle 100 to perform one or more vehicle actions such as, e.g., updating the BEV range estimation, displaying the updated BEV range estimation, and/or one or more other vehicle actions. According to an exemplary embodiment, the one or more conditions may comprise, e.g., the presence of wind, the presence of one or more road and/or terrain conditions, the presence of one or more weather phenomena (e.g., rain, snow, sleet, hail, etc.), the speed of the BEV, whether the BEV is climbing or descending hills, whether the BEV is towing an object and/or the mass of the object being towed by the BEV, and/or other suitable conditions.
Referring now to
At 202, a BEV range estimate is determined. The BEV range estimate may be determined using, e.g., a battery life estimate of the one or more batteries of the BEV, a mass of the BEV, and/or other suitable factors. According to an exemplary embodiment, the battery life estimate may be determined via, e.g., a battery monitoring system configured to measure current, voltage and/or temperature of the one or more batteries, and/or a current percentage of charge of the one or more batteries.
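As a non-limiting illustration, step 202 might compute a base range estimate from the battery state and the vehicle mass. The sketch below is hypothetical; the function name, units, and coefficients are assumptions for illustration only and are not part of the disclosure:

```python
# Hypothetical sketch of determining a base BEV range estimate (step 202).
# Names, units, and coefficients are illustrative assumptions only.

def base_range_estimate_km(charge_pct: float,
                           usable_capacity_kwh: float,
                           vehicle_mass_kg: float,
                           nominal_consumption_kwh_per_km: float = 0.15,
                           nominal_mass_kg: float = 2000.0) -> float:
    """Estimate remaining range from battery charge and vehicle mass."""
    remaining_energy_kwh = usable_capacity_kwh * (charge_pct / 100.0)
    # Scale nominal consumption with mass relative to a nominal vehicle mass.
    consumption = nominal_consumption_kwh_per_km * (vehicle_mass_kg / nominal_mass_kg)
    return remaining_energy_kwh / consumption

# 80% of a 75 kWh pack at nominal mass: 60 kWh / 0.15 kWh/km = 400 km
print(round(base_range_estimate_km(80.0, 75.0, 2000.0), 1))  # prints 400.0
```

In practice the battery monitoring system described above would supply the charge percentage, and the consumption model would be far more detailed; the sketch only shows the shape of the computation.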
At 204, it is determined whether one or more conditions are present which may affect BEV range and range estimation. According to an exemplary embodiment, the one or more conditions may comprise, e.g., the presence of wind, the presence of one or more road and/or terrain conditions, the presence of one or more weather phenomena (e.g., rain, snow, sleet, hail, etc.), the speed of the BEV, whether the BEV is climbing or descending hills, whether the BEV is towing an object and/or the mass of the object being towed by the BEV, and/or other suitable conditions.
According to an exemplary embodiment, the presence of a condition may be determined using one or more suitable methods. One or more sensors (e.g., cameras, radar sensors, LiDAR sensors, etc.) coupled to the BEV and/or accessible by the BEV may be used in order to determine the presence and/or absence of the one or more conditions, and determining whether the one or more conditions are present may comprise receiving one or more inputs from the one or more sensors. For example, an angle of the BEV may be used to determine whether a hill is present and/or whether the BEV is ascending or descending the hill, the presence of a trailer or other suitable object may be determined in order to determine whether the BEV is towing an object, road friction may be analyzed to determine whether one or more weather phenomena are present, road/terrain shape may be analyzed to determine whether the BEV is on a smooth road, a rough road, off-road, etc., and/or one or more other suitable means may be used to determine the presence and/or absence of one or more conditions.
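A minimal sketch of step 204 is shown below, mapping sensor-derived quantities to condition flags. The input names and thresholds are illustrative assumptions, not values taken from the disclosure:

```python
# Hypothetical sketch of step 204: mapping sensor inputs to condition flags.
# Field names and thresholds are illustrative assumptions only.

def detect_conditions(pitch_deg: float,
                      trailer_detected: bool,
                      road_friction: float,
                      headwind_mps: float) -> dict:
    """Return a dict of condition flags inferred from sensor inputs."""
    return {
        "hill_ascending": pitch_deg > 2.0,   # e.g., from an inertial/position sensor
        "hill_descending": pitch_deg < -2.0,
        "towing": trailer_detected,          # e.g., from a rear camera/radar
        "weather": road_friction < 0.5,      # low friction suggests rain/snow/ice
        "headwind": headwind_mps > 5.0,
    }

flags = detect_conditions(pitch_deg=4.0, trailer_detected=False,
                          road_friction=0.8, headwind_mps=7.5)
print(flags["hill_ascending"], flags["headwind"])  # prints True True
```

A production implementation would fuse multiple sensors per condition (e.g., camera-based terrain classification plus map data); the dictionary of boolean flags simply illustrates the output of the determination.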
According to an exemplary embodiment, when the one or more conditions are not present, then the system, at 212, outputs the determined BEV range estimate. According to an exemplary embodiment, when the one or more conditions are present, then, at 206, a BEV range correction is performed.
According to an exemplary embodiment, the BEV range correction may comprise assigning a value to each condition. For example, when a condition is present, a value of 1 may be assigned to the condition, and when a condition is not present, a value of 0 may be assigned to the condition. According to an exemplary embodiment, in determining the assigned value of the condition of the presence of a hill, the assigned value may be equal to 0 if a hill is not present, +1 when a hill is present and the BEV is ascending the hill, and −1 when a hill is present and the BEV is descending the hill. According to an exemplary embodiment, the BEV range correction may comprise, for each condition, multiplying the assigned value by a gain factor.
According to an exemplary embodiment, for each condition, a value of the associated gain factor may correlate to the condition's effect on the determined BEV range estimate. According to an exemplary embodiment, each gain factor may be tuned based on the BEV and the influence the condition has on the determined BEV range estimate for that particular BEV.
According to an exemplary embodiment, the BEV range correction may comprise adding, to the determined BEV range estimate, the product, for each condition, of the value assigned to the condition and the gain factor associated with the condition, in accordance with Equation 1.
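Equation 1 is not reproduced in the text; consistent with the surrounding description of the assigned values and gain factors, it may be reconstructed as follows (the symbols $R$ and $V_i$ are assumptions; $K$ follows the text):

```latex
R_{\text{corrected}} = R_{\text{estimate}} + \sum_{i=1}^{n} V_i \, K_i \tag{1}
```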
Where, for each condition (1 through n), K is the gain factor.
At 208, using Equation 1, the corrected BEV range estimate is determined. The corrected BEV range estimate may then, at 210, be output. According to an exemplary embodiment, the BEV may comprise a graphical user interface through which the system may output the determined BEV range estimate and/or the corrected BEV range estimate via, e.g., a display, a speaker, and/or other suitable means.
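Putting steps 204 through 210 together, the correction of Equation 1 can be sketched as below. The gain values (in km per unit assigned value) are illustrative assumptions and, as noted above, would be tuned for a particular BEV:

```python
# Hypothetical sketch of the BEV range correction (steps 204-210).
# Gain factors (km per unit assigned value) are illustrative assumptions.

GAIN_KM = {
    "headwind": -15.0,   # a headwind reduces range
    "rough_road": -10.0,
    "weather": -12.0,
    "towing": -40.0,
    "hill": -25.0,       # sign of the assigned value handles ascent vs. descent
}

def assign_value(condition: str, present: bool, ascending: bool = False) -> int:
    """Assign 0/1 per condition; a hill uses +1 (ascending) / -1 (descending)."""
    if not present:
        return 0
    if condition == "hill":
        return 1 if ascending else -1
    return 1

def corrected_range_km(base_range_km: float, conditions: dict) -> float:
    """Apply Equation 1: add the product of each assigned value and its gain."""
    corrected = base_range_km
    for name, gain in GAIN_KM.items():
        state = conditions.get(name, {})
        value = assign_value(name, state.get("present", False),
                             state.get("ascending", False))
        corrected += value * gain
    return corrected

# Headwind plus an ascending hill: 300 - 15 - 25 = 260 km
print(corrected_range_km(300.0, {
    "headwind": {"present": True},
    "hill": {"present": True, "ascending": True},
}))  # prints 260.0
```

Note that a descending hill yields an assigned value of −1, so multiplying by the negative hill gain increases the corrected range, matching the behavior described above.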
Referring now to
The hardware architecture of
Some or all components of the computing device 300 may be implemented as hardware, software, and/or a combination of hardware and software. The hardware may comprise, but is not limited to, one or more electronic circuits. The electronic circuits may comprise, but are not limited to, passive components (e.g., resistors and capacitors) and/or active components (e.g., amplifiers and/or microprocessors). The passive and/or active components may be adapted to, arranged to, and/or programmed to perform one or more of the methodologies, procedures, or functions described herein.
As shown in
At least some of the hardware entities 314 may be configured to perform actions involving access to and use of memory 312, which may be a Random Access Memory (RAM), a disk drive and/or a Compact Disc Read Only Memory (CD-ROM), among other suitable memory types. Hardware entities 314 may comprise a disk drive unit 316 comprising a computer-readable storage medium 318 on which may be stored one or more sets of instructions 320 (e.g., programming instructions such as, but not limited to, software code) configured to implement one or more of the methodologies, procedures, or functions described herein. The instructions 320 may also reside, completely or at least partially, within the memory 312 and/or within the CPU 306 during execution thereof by the computing device 300.
The memory 312 and the CPU 306 may also constitute machine-readable media. The term “machine-readable media”, as used here, refers to a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions 320. The term “machine-readable media”, as used here, also refers to any medium that is capable of storing, encoding or carrying a set of instructions 320 for execution by the computing device 300 and that cause the computing device 300 to perform any one or more of the methodologies of the present disclosure.
Referring now to
Vehicle 100 may be configured to be incorporated in or with a vehicle having the same or similar system architecture as that shown in
As shown in
Operational parameter sensors that are common to both types of vehicles may comprise, for example: a position sensor 434 such as an accelerometer, gyroscope and/or inertial measurement unit; a speed sensor 436; and/or an odometer sensor 438. The vehicle system architecture 400 also may comprise a clock 442 that the system uses to determine vehicle time and/or date during operation. The clock 442 may be encoded into the vehicle on-board computing device 420, it may be a separate device, or multiple clocks may be available.
The vehicle system architecture 400 also may comprise various sensors that operate to gather information about the environment in which the vehicle is traveling. These sensors may comprise, for example: a location sensor 444 (for example, a Global Positioning System (GPS) device); and object detection sensors such as one or more cameras 446, a LiDAR sensor system 448, and/or a RADAR and/or a sonar system 450. The sensors also may comprise environmental sensors 452 such as, e.g., a humidity sensor, a precipitation sensor, a light sensor, and/or an ambient temperature sensor. The object detection sensors may be configured to enable the vehicle system architecture 400 to detect objects that are within a given distance range of the vehicle in any direction, while the environmental sensors 452 may be configured to collect data about environmental conditions within the vehicle's area of travel. According to an exemplary embodiment, the vehicle system architecture 400 may comprise one or more lights 454 (e.g., headlights, flood lights, flashlights, etc.).
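The operational and environmental sensor readings described above can be sketched as a simple data structure passed to the on-board computing device. The following is an illustrative sketch only: the class names, field names, and units are assumptions introduced for explanation and are not part of the disclosure; the reference numerals from the text are noted in comments for orientation.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class OperationalReadings:
    # Operational parameter sensors common to both vehicle types.
    heading_deg: float   # position sensor 434 (e.g., IMU/gyroscope)
    speed_mps: float     # speed sensor 436
    odometer_km: float   # odometer sensor 438
    timestamp_s: float   # clock 442

@dataclass
class EnvironmentalReadings:
    # Sensors gathering information about the travel environment.
    latitude: float                           # location sensor 444 (GPS)
    longitude: float
    humidity_pct: Optional[float] = None      # environmental sensors 452
    ambient_temp_c: Optional[float] = None
    precipitation: bool = False
    ambient_light_lux: Optional[float] = None

@dataclass
class SensorFrame:
    """One snapshot of vehicle state forwarded to the on-board computing device."""
    operational: OperationalReadings
    environmental: EnvironmentalReadings

# Example snapshot with illustrative values.
frame = SensorFrame(
    OperationalReadings(heading_deg=92.0, speed_mps=24.6,
                        odometer_km=18234.2, timestamp_s=1_700_000_000.0),
    EnvironmentalReadings(latitude=42.33, longitude=-83.05,
                          humidity_pct=61.0, ambient_temp_c=4.5),
)
```

In such a sketch, each snapshot bundles the operational and environmental readings so that downstream logic (e.g., a range-estimation routine) can consume one consistent frame per sampling interval.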
During operations, information may be communicated from the sensors to an on-board computing device 420 (e.g., computing device 300 of
Geographic location information may be communicated from the location sensor 444 to the on-board computing device 420, which may then access a map of the environment that corresponds to the location information to determine known fixed features of the environment such as streets, buildings, stop signs and/or stop/go signals. Captured images from the cameras 446 and/or object detection information captured from sensors such as LiDAR 448 may be communicated from those sensors to the on-board computing device 420. The object detection information and/or captured images may be processed by the on-board computing device 420 to detect objects in proximity to the vehicle. Any known or to be known technique for making an object detection based on sensor data and/or captured images may be used in the embodiments disclosed in this document.
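The information flow just described can be sketched as follows: the on-board computing device receives a GPS fix, looks up known fixed map features near that fix, and merges them with objects reported by the camera/LiDAR detectors. All function names, the in-memory map, and the distance threshold below are hypothetical illustrations, not the disclosed implementation.

```python
import math

# Hypothetical map of known fixed features (streets, signs, signals)
# keyed by coordinates; illustrative values only.
MAP_FEATURES = [
    {"name": "stop sign", "lat": 42.3301, "lon": -83.0501},
    {"name": "traffic signal", "lat": 42.3340, "lon": -83.0550},
]

def nearby_fixed_features(lat, lon, radius_m=100.0):
    """Return known fixed features within radius_m of the GPS fix."""
    found = []
    for feature in MAP_FEATURES:
        # Equirectangular approximation; adequate at short ranges.
        dx = math.radians(feature["lon"] - lon) * math.cos(math.radians(lat))
        dy = math.radians(feature["lat"] - lat)
        dist_m = 6_371_000.0 * math.hypot(dx, dy)
        if dist_m <= radius_m:
            found.append((feature["name"], round(dist_m, 1)))
    return found

def fuse(detected_objects, lat, lon):
    """Merge detector output (e.g., from cameras/LiDAR) with map features."""
    return {
        "detected": detected_objects,
        "fixed": nearby_fixed_features(lat, lon),
    }

# Example: two detected objects near a known stop sign.
scene = fuse(["pedestrian", "vehicle"], 42.3302, -83.0502)
```

The design point this sketch illustrates is the separation of concerns in the paragraph above: map lookup supplies fixed features from location alone, while perception supplies dynamic objects, and the on-board computing device combines the two views of the scene.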
What has been described above includes examples of the subject disclosure. It is, of course, not possible to describe every conceivable combination of components or methodologies for purposes of describing the subject matter, but it is to be appreciated that many further combinations and permutations of the subject disclosure are possible. Accordingly, the claimed subject matter is intended to embrace all such alterations, modifications, and variations that fall within the spirit and scope of the appended claims.
In particular and in regard to the various functions performed by the above described components, devices, systems and the like, the terms (including a reference to a “means”) used to describe such components are intended to correspond, unless otherwise indicated, to any component which performs the specified function of the described component (e.g., a functional equivalent), even though not structurally equivalent to the disclosed structure, which performs the function in the herein illustrated exemplary aspects of the claimed subject matter.
The aforementioned systems and components have been described with respect to interaction between several components. It can be appreciated that such systems and components can include those components or specified sub-components, some of the specified components or sub-components, and/or additional components, and according to various permutations and combinations of the foregoing. Sub-components can also be implemented as components communicatively coupled to other components rather than included within parent components (hierarchical). Additionally, it should be noted that one or more components may be combined into a single component providing aggregate functionality or divided into several separate sub-components. Any components described herein may also interact with one or more other components not specifically described herein.
In addition, while a particular feature of the subject innovation may have been disclosed with respect to only one of several implementations, such feature may be combined with one or more other features of the other implementations as may be desired and advantageous for any given or particular application. Furthermore, to the extent that the terms “includes,” “including,” “has,” “contains,” variants thereof, and other similar words are used in either the detailed description or the claims, these terms are intended to be inclusive in a manner similar to the term “comprising” as an open transition word without precluding any additional or other elements.
Thus, the embodiments and examples set forth herein were presented in order to best explain various selected embodiments of the present invention and its particular application and to thereby enable those skilled in the art to make and use embodiments of the invention. However, those skilled in the art will recognize that the foregoing description and examples have been presented for the purposes of illustration and example only. The description as set forth is not intended to be exhaustive or to limit the embodiments of the invention to the precise form disclosed.