Embodiments and examples of the invention are generally in the field of vehicles with autonomous driving and data processing systems for autonomous driving. More particularly, embodiments and examples of the invention relate to a deep neural network (DNN) based driving assistance system.
One type of vehicle is the electric powered vehicle, which is gaining popularity due to its use of clean energy. Electric vehicles use a rechargeable battery to power an induction motor that drives the vehicle. The rechargeable battery can be charged by being plugged into an electrical outlet or wirelessly by way of an inductive charging system. For a wireless or inductive charging system, a vehicle can be electrically coupled to a charging spot or pad to receive electrical power magnetically from the charging system to recharge its battery. Accurate alignment with the charging pad is essential to obtain the coupling strength necessary for inductive charging to properly recharge the vehicle battery. This requires a driver to manually maneuver the vehicle and accurately align a magnetic attractor under the vehicle with a magnet on the charging spot or pad to recharge the battery. The magnetic attractor under the vehicle is typically out of sight of the driver and, as a result, maneuvering the vehicle so that the magnetic attractor is accurately positioned over the charging spot or pad for proper alignment during charging can be difficult.
Embodiments and examples of a deep neural network (DNN) based driving assistance system are disclosed. The disclosed embodiments and examples can be for an end-to-end DNN or a DNN having intermediate outputs. For one example, a vehicle data processing system includes one or more sensors and a driving assistance system. The one or more sensors obtain data describing an environment around a vehicle. The driving assistance system is coupled to the one or more sensors and configured to continuously detect a designated object in the environment around the vehicle based on the captured data from the one or more sensors using a deep neural network (DNN). The driving assistance system is also configured to output commands from the DNN to autonomously steer the vehicle to the designated object in the environment to enable proper coupling of the vehicle with the designated object.
For one example, the designated object includes a charging pad of a wireless charging system, and the driving assistance system can be a charging assistance system to continuously detect the charging pad and to output commands to autonomously steer the vehicle to couple with the charging pad and enable wireless charging with the wireless charging system for recharging an electric battery of the vehicle. For other examples, the designated object can be a trailer hitch component, cable charging component, gas filling component or other device or component for coupling with the vehicle. The driving assistance system can be configured to output commands from the DNN to autonomously steer the vehicle to any of these designated objects in the environment to enable coupling of the vehicle with the designated objects.
For one example, the one or more sensors can include at least one camera, light detection and ranging (LIDAR) device, ultrasonic device, inertial measurement unit (IMU), and/or global positioning system (GPS) device. The camera can be any type of camera (e.g., surround camera, 2D or 3D camera, infrared camera, or night vision camera) to capture image data surrounding the vehicle including the designated object. The LIDAR device can measure the distance to the designated object by illuminating the designated object with a laser. The ultrasonic device can detect objects and distances using ultrasound waves. The IMU can collect angular velocity and linear acceleration data, and the GPS device can obtain GPS data and calculate the geographical position of the vehicle.
For one example, a DNN includes a convolutional neural network (CNN) to detect the designated object in the data from the one or more sensors and a recurrent neural network (RNN) to track the designated object and output commands to a steering control system to steer the vehicle to enable coupling with the designated object. The DNN can further include one or more sub-networks to detect obstacles or other objects in the environment based on the data from the one or more sensors. For one example, the driving assistance system can receive initialization information regarding the location of the designated object from a database. Information regarding the designated object or other objects can be updated in the database.
Other devices, systems, methods and computer-readable mediums for an end-to-end deep neural network based charging assistance system are described.
The appended drawings illustrate examples and exemplary embodiments and are, therefore, not to be considered limiting in scope.
A deep neural network (DNN) based driving assistance system is disclosed. Deep neural networks (DNNs) are disclosed that learn features of designated objects or other objects for coupling with a vehicle based on statistical structures or correlations within input sensor data. The learned features can be provided to a mathematical model that can map detected features to an output. The mathematical model used by the DNN can be specialized for a specific task to be performed, e.g., detecting and tracking a designated object such as a charging pad for wireless charging. The disclosed embodiments or examples can implement end-to-end DNN driving assistance or a DNN with intermediate outputs for driving assistance.
For one example, a vehicle data processing system includes one or more sensors and a driving assistance system. The one or more sensors obtain data describing an environment around a vehicle. The driving assistance system is coupled to the one or more sensors and configured to detect and track a designated object in the environment around the vehicle based on the captured data from the one or more sensors using a DNN. The driving assistance system is also configured to output commands from the DNN used to autonomously steer or maneuver the vehicle to the designated object in the environment to enable coupling of the vehicle with the designated object.
For one example, the driving assistance system can be used for wireless charging assistance to detect a charging pad, coupled to a wireless charging system, in an environment surrounding the vehicle. The driving assistance system can use a DNN to detect and track the charging pad and output commands used to autonomously steer the vehicle to the charging pad for enabling wireless coupling for recharging an electric battery of the vehicle without requiring a magnetic attractor. The vehicle can also be coupled to other designated objects such as, for example, a trailer hitch component, cable charging component, gas filling component or like components or devices, with the driving assistance system outputting commands used to autonomously steer or maneuver the vehicle to any of these designated objects using the DNN. By using a DNN, a driving assistance system can provide an end-to-end operation for autonomously steering or maneuvering a vehicle to a designated object such as, e.g., a charging pad for wireless charging.
As set forth herein, various embodiments, examples and aspects will be described with reference to details discussed below, and the accompanying drawings will illustrate various embodiments and examples. The following description and drawings are illustrative and are not to be construed as limiting. Numerous specific details are described to provide a thorough understanding of various embodiments and examples. However, in certain instances, well-known or conventional details are not described in order to provide a concise discussion of the embodiments and examples.
For one example, electric motor 108 can be an alternating current (AC) induction motor, a brushless direct-current (DC) motor, or a brushed DC motor. Exemplary motors can include a rotor having magnets that can rotate around an electrical wire or a rotor having electrical wires that can rotate around magnets. Other exemplary motors can include a center section holding magnets for a rotor and an outer section having coils. For one example, when driving wheels 109, electric motor 108 is connected to electric battery 104, which provides an electric current on the wire that creates a magnetic field to move the magnets in the rotor, generating torque to drive wheels 109. For one example, electric battery 104 can be a 120V or 240V rechargeable battery to power electric motor 108 or other electric motors for vehicle 110. Examples of electric battery 104 can include lead-acid, nickel-cadmium, nickel-metal hydride, lithium ion, lithium polymer, or other types of rechargeable batteries. For one example, electric battery 104 can be located on the floor and run along the bottom of vehicle 110. For one example, steering control system 105 can control electric motor 108 and wheels 109 based on commands from driving assistance system 107.
As a rechargeable battery, for one example, electric battery 104 can be charged wirelessly using a wireless charging system 115 connected to a charging pad 117 having a charging pad pattern 116. Wireless charging system 115 and charging pad 117 can be located in a garage, parking lot, gasoline station or any location for wirelessly charging vehicle 110. Wireless charging system 115 can have alternating current (AC) connectors coupled to a power source that can charge a 120V or 240V rechargeable battery. For example, wireless charging system 115 can provide kilowatts (kW) of power to charging pad 117, e.g., 3.7 kW, 7.7 kW, 11 kW, or 22 kW of power delivered to inductive receiver 104 of vehicle 110. For one example, charging pad 117 with charging pad pattern 116 is a designated object in the environment for coupling with vehicle 110. For example, driving assistance system 107 of vehicle 110 can detect the designated object (e.g., charging pad 117 with charging pad pattern 116) and continuously track the designated object using a deep neural network (DNN) to output commands including steering commands, braking commands, transmission commands, switching-off motor commands, etc. Each of these commands can be forwarded to the respective subsystems of vehicle 110 to control their respective functions, e.g., braking, powertrain and steering.
For one example, steering control system 105 receives and processes commands from driving assistance system 107 to output steering signals such as forward, backward, stop, velocity, yaw direction, yaw velocity, etc. to steering subsystems of vehicle 110 to perform respective functions. In this way, driving assistance system 107 and steering control system 105 can be used to autonomously steer or maneuver vehicle 110 such that inductive receiver 104 is positioned substantially and directly above charging pad 117 to receive electric power inductively by way of wireless charging system 115. The DNN used by driving assistance system 107 can be trained to detect and track other designated objects in the environment for coupling with vehicle 110 including a trailer hitch component, cable charging component, gas filling component or other device or component for coupling with vehicle 110.
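For illustration only, the following is a minimal sketch of how such DNN output commands might be represented and dispatched to the respective subsystems; the command names and subsystem methods (e.g., set_yaw_rate, apply, handle) are hypothetical placeholders and are not part of the disclosed embodiments.

```python
from dataclasses import dataclass
from enum import Enum, auto

class CommandType(Enum):
    STEER = auto()         # yaw direction / yaw velocity
    BRAKE = auto()         # stop or slow the vehicle
    TRANSMISSION = auto()  # forward / backward
    SWITCH_OFF = auto()    # switch off the motor once coupling is complete

@dataclass
class DrivingCommand:
    type: CommandType
    value: float = 0.0     # e.g., yaw rate in rad/s or target velocity in m/s

def dispatch(command: DrivingCommand, steering, braking, powertrain) -> None:
    """Forward one DNN output command to the respective vehicle subsystem (illustrative)."""
    if command.type is CommandType.STEER:
        steering.set_yaw_rate(command.value)
    elif command.type is CommandType.BRAKE:
        braking.apply(command.value)
    elif command.type in (CommandType.TRANSMISSION, CommandType.SWITCH_OFF):
        powertrain.handle(command)
```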
For one example, driving assistance system 107 can use rear vision sensors 112 near pillar C (103) to capture images of charging pad 117 and charging pad pattern 116 including the environment around or surrounding wireless charging system 115. Driving assistance system 107 can also use front vision sensors 106 near pillar A (101) to capture images in front of vehicle 110. Front and rear vision sensors 106 and 112 can include any type of camera to capture images such as a two or three dimensional (2D or 3D) camera, infrared camera, night vision camera, or a surround view or stereo camera to capture a 360-degree surround image around vehicle 110. Driving assistance system 107 inputs those captured images (e.g., input feature maps) to deep neural networks (DNNs) disclosed herein that detect features (e.g., charging pad pattern 116 on charging pad 117) in the images to output commands to steering control system 105 in order to autonomously steer and maneuver vehicle 110 such that inductive receiver 104 is positioned over charging pad 117 for wireless charging. By using a DNN (e.g., as disclosed in
For one example, driving assistance system 107 and steering control system 105 can be one or more programs running on a computer or data processing system including one or more processors, central processing units (CPUs), system-on-chip (SoC) devices or micro-controllers and memory to run or implement respective functions and operations. For other examples, driving assistance system 107 and steering control system 105 can each be an electronic control unit (ECU) including a micro-controller and memory storing code to implement the end-to-end driving assistance including wireless charging assistance as disclosed herein. Driving assistance system 107 can implement WiFi, cellular or Bluetooth communication and related wireless communication protocols and, in this way, driving assistance system 107 can have access to other services including cloud services (e.g., cloud-based system 120 and database 121 shown in
For other examples, driving assistance system 107 can use additional data captured from other sensors to input data to the DNN in order to output commands for autonomously steering vehicle 110 over the charging pad 117. Other sensors can include a light detection and ranging (LIDAR) device 119 at the rear of vehicle 110 that can measure distance to a target by illuminating the target with a pulsed laser. LIDAR device 119 can be positioned in other locations such as on top of vehicle 110, and additional LIDAR devices can be located on vehicle 110 to measure the distance to a target object using light. Additional sensors such as sensors 118-1 and 118-2 can be located on either side of vehicle 110 near pillar A (101) and can include ultrasonic devices that detect objects and distances using ultrasound waves, an inertial measurement unit (IMU) that can collect angular velocity and linear acceleration data, and a global positioning system (GPS) device that can receive GPS satellite data and calculate the geographical position of vehicle 110. Data from sensors 118-1 and 118-2 can also be input to a DNN used to output commands to steering control system 105 for autonomously steering vehicle 110 over the charging pad 117 or other designated objects for coupling with vehicle 110.
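For illustration only, the following sketch shows one possible way to bundle the multi-sensor data described above into a single time-stamped input record for the DNN; the field names, shapes, and units are assumptions rather than part of the disclosed embodiments.

```python
from dataclasses import dataclass
from typing import Optional, Tuple
import numpy as np

@dataclass
class SensorFrame:
    """One time-stamped bundle of sensor data fed to the DNN (illustrative only)."""
    timestamp: float                     # seconds, vehicle clock
    rear_image: np.ndarray               # H x W x 3 image from rear vision sensors 112
    front_image: Optional[np.ndarray]    # image from front vision sensors 106, if used
    lidar_range_m: Optional[float]       # distance to target from LIDAR device 119
    ultrasonic_range_m: Optional[float]  # distance from sensors 118-1 / 118-2
    imu_yaw_rate: float                  # angular velocity from the IMU
    imu_accel: np.ndarray                # 3-axis linear acceleration
    gps_lat_lon: Optional[Tuple[float, float]]  # coarse geographical position of vehicle 110
```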
For one example, one or more ECUs can be part of a global positioning system (GPS) or a wireless connection system or modem to communicate with cloud-based system 120 and database 121. Examples of communication protocols include Global System for Mobile Communications (GSM), General Packet Radio Service (GPRS), cdmaOne, CDMA2000, Evolution-Data Optimized (EV-DO), Enhanced Data Rates for GSM Evolution (EDGE), Universal Mobile Telecommunications System (UMTS), Digital Enhanced Cordless Telecommunications (DECT), Digital AMPS (IS-136/TDMA), and Integrated Digital Enhanced Network (iDEN), as well as IEEE 802.11 wireless protocols, long-term evolution (LTE) 3G+ protocols, and Bluetooth and Bluetooth low energy (BLE) protocols.
For one example, database 121 can be part of cloud-based system 120 and store initialization, localization, or area data for vehicle 110 indicating where a designated object, e.g., charging pad 117, is located in an environment surrounding vehicle 110 for coupling with vehicle 110. For other examples, database 121 can be located within or be accessible by vehicle 110. Database 121 can store location information of charging service stations from maps and driving history related to vehicle 110. Vehicle 110 can have a database that stores such information and can be updated periodically from database 121, which can be located in cloud-based system 120. This database data can be forwarded to driving assistance system 107 via a wireless connection or by way of network topology 150. For one example, vehicle 110 can communicate with a server in cloud-based system 120 that provides vehicle 110 with information on other possible wireless charging locations and updates a DNN in driving assistance system 107 for an updated charging location. Driving assistance system 107 can use the updated information, e.g., updated DNN computations, to autonomously steer or maneuver vehicle 110 to the updated charging location.
For one example, vehicle 110 can communicate with other vehicles wirelessly, e.g., asking whether another vehicle is leaving a spot having a designated object for coupling, by way of a vehicle-to-vehicle (V2V) communication protocol. For other examples, vehicle 110 can receive data wirelessly from components of a highway and street system infrastructure such as charging stations, RFID readers, cameras, traffic lights, lane markers, street lights, signage, parking meters, etc. by way of a vehicle-to-infrastructure (V2I) communication protocol, which can assist driving assistance system 107 if related to the surroundings of vehicle 110. Information and data retrieved wirelessly using V2V and V2I communication protocols can be forwarded to driving assistance system 107 and processed, and can be used to assist in detecting a designated object, e.g., charging pad 117, or other objects and obstacles.
For one example, each ECU can run firmware or code, or be hard-wired, to perform its function and control any number of electronic components operating within vehicle 110. For example, ECU network areas 150-A, 150-B and 150-C can have ECUs controlling electronic components or subsystems for braking, steering, powertrain, climate control, ignition, stability, lighting, airbags, sensors, etc. The ECUs in the different networking areas of vehicle 110 can communicate with each other by way of network topology 150 and network busses 158 and 159. Although two network busses are shown in
For one example, on-board computer 207 can run programs or modules to implement driving assistance system 107 and steering control system 105 to perform wireless charging by autonomously steering or maneuvering vehicle 110 such that inductive receiver 104 is above charging pad 117. For other examples, on-board computer 207 can receive voice commands that are processed to control interfaces on vehicle dashboard 237. For one example, driver tablet 210 is a tablet computer and can provide a touch screen with haptic feedback and controls. A driver of vehicle 110 can use driver tablet 210 to access vehicle function controls such as, e.g., climate control settings. Driver tablet 210 can be coupled to on-board computer 207 or another vehicle computer or ECU (not shown).
Display 202 can include a light emitting diode (LED) display, liquid crystal display (LCD), organic light emitting diode (OLED) display, or quantum dot display, which can run substantially from one side of vehicle dashboard 237 to the other. For one example, coast-to-coast display 202 can be a curved display integrated into dashboard 237 and spanning substantially its full width (or coast-to-coast). One or more graphical user interfaces can be provided in a plurality of display areas such as display areas 1 (214), 2 (216), and 3 (218) of coast-to-coast display 202. Such graphical user interfaces can include status menus shown in, e.g., display areas 1 (214) and 3 (218), in which display area 3 (218) shows charging pad view 217. For one example, display area 1 (214) can show rear view, side view, or surround view images of vehicle 110 from one or more cameras, which can be located outside or inside of vehicle 110.
Referring to
Examples of I/O devices 320 include external devices such as a pen, Bluetooth devices and other like devices controlled by I/O controller 318. Network interface 317 can include modems and wired and wireless transceivers and can communicate using any type of networking protocol including wired or wireless WAN and LAN protocols, including LTE and Bluetooth standards. Memory device 310 can be any type of memory including random access memory (RAM) or dynamic random-access memory (DRAM), which requires power continually in order to refresh or maintain the data in the memory. Non-volatile storage 306 can be a mass storage device including a magnetic hard drive, a magnetic optical drive, an optical drive, a digital video disc (DVD) RAM, a flash memory, or other types of memory systems, which maintain data (e.g., large amounts of data) even after power is removed from the system.
For one example, memory devices 310 or database 312 can store user information and parameters related to the DNN used by driving assistance system 107 including user information for applications on display 202. Although memory devices 310 and database 312 are shown coupled to system bus 301, processor(s) 302 can be coupled to any number of external memory devices or databases locally or remotely by way of network interface 317, e.g., database 312 can be secured storage in cloud-based system 120. For one example, processor(s) 302 can implement techniques and operations described herein. Display(s) 315 can represent display 202 in
Examples and embodiments disclosed herein can be embodied in a data processing system architecture, data processing system or computing system, or a computer-readable medium or computer program product. Aspects, features, and details of the disclosed examples and embodiments can take the form of hardware or software or a combination of both, which can be referred to as a system or engine. The disclosed examples and embodiments can also be embodied in the form of a computer program product including one or more computer readable mediums having computer readable code which can be executed by one or more processors (e.g., processor(s) 302) to implement the techniques and operations disclosed herein.
Referring to
Referring to
For one example, convolutional layer 504 can provide a first level of filtering of the images for a designated object (e.g., charging pad 117) and convolutional layer 506 can provide a second level of filtering on the output of the first level to detect additional features (e.g., charging pad pattern 116). The computations for a CNN include applying the convolution mathematical operation to each filter to produce the output of that filter. CNN 402 can thus use the filters of convolutional layers 504 and 506 to detect charging pad 117 having charging pad pattern 116, which can be a designated object for coupling with a vehicle. The filters also can be configured to filter images for detection of other types of designated objects described herein. The outputs of convolutional layers 504 and 506 feed into fully connected layers 508, which can produce an output feature map of detected features of a designated object. For one example, an output feature map can include multi-dimensional data describing a detected object in space and confidences of the detected object type, i.e., whether the detected object is a charging pad 117 having charging pad pattern 116. Such data can be used by CNN 402 to determine that a charging pad 117 has been detected based on confidences that it has charging pad pattern 116. For other examples, fully connected layers 508 can be omitted and the output of convolutional layers 504 and 506 can be used by RNN 406 to track the designated object and output steering commands.
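For illustration only, a minimal sketch of such a two-level convolutional detector followed by fully connected layers is given below, assuming a PyTorch-style framework; the layer sizes, class count, and output format (a box estimate plus class confidences) are illustrative assumptions and not taken from the disclosure.

```python
import torch
import torch.nn as nn

class PadDetectorCNN(nn.Module):
    """Illustrative two-stage CNN: coarse filtering, finer pattern filtering,
    then fully connected layers producing a location plus confidence output."""
    def __init__(self, num_classes: int = 2):  # e.g., charging pad vs. background
        super().__init__()
        self.conv1 = nn.Sequential(             # first level of filtering (cf. layer 504)
            nn.Conv2d(3, 16, kernel_size=5, stride=2, padding=2), nn.ReLU())
        self.conv2 = nn.Sequential(             # second level of filtering (cf. layer 506)
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU())
        self.fc = nn.Sequential(                # fully connected layers (cf. 508)
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, 4 + num_classes))     # 4 box values + class confidences

    def forward(self, images: torch.Tensor) -> torch.Tensor:
        features = self.conv2(self.conv1(images))
        out = self.fc(features)
        box, confidences = out[:, :4], out[:, 4:].softmax(dim=-1)
        return torch.cat([box, confidences], dim=-1)
```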
Referring to
For one example, RNN 406 illustrates an exemplary recurrent neural network in which a previous state of the network influences the output of the current state of the network. The use of RNNs generally revolves around using mathematical models to predict the future based on a prior sequence of inputs. For example, RNN 406 can perform modeling to predict an upcoming steering, braking, or powertrain command of a vehicle based on previous feature maps of a designated object and track that designated object in subsequent data from CNN 402. RNN 406 has an input layer 512 that receives an input vector of a detected designated object from CNN 402, hidden layers 514 to implement a recurrent function, a feedback mechanism 515 to enable a 'memory' of previous states, and an output layer 516 to output a result such as a steering command, e.g., left, right, straight, yaw rate or velocity.
For one example, RNN 406 can operate on time steps. The state of the RNN at a given time step is influenced by the previous time step via feedback mechanism 515. For example, each time step can be based on data from an image captured at one point in time, and the next time step can be based on data from an image captured at a subsequent point in time. For a given time step, the state of the hidden layers 514 is defined by the previous state and the input at the current time step. For one example, an initial input (x_1) at a first time step can be processed by the hidden layers 514. A second input (x_2) can be processed by the hidden layers 514 using state information that is determined during the processing of the initial input (x_1). A given state can be computed as s_t = f(U x_t + W s_{t-1}), where U and W are parameter matrices. The function f is generally a nonlinearity, such as the hyperbolic tangent function (tanh) or a variant of the rectifier function f(x) = max(0, x). The mathematical functions used in the hidden layers 514 of RNN 406 can vary depending on the specific object to track, e.g., charging pad 117 or a hitch or gas filling component. For one example, hidden layers 514 can include spatio-temporal convolution (ST-Conv) layers that can shift along both spatial and temporal dimensions. RNN 406 and input layer 512 can receive input from sensors, including images from cameras and vehicle dynamics data such as LIDAR, ultrasonic, inertial, GPS, speed, torque, and wheel angle data, which can be synchronized with varying time stamps to assist in determining output commands 407. For one example, RNN 406 can be trained for multi-task learning to determine specific output commands 407 for steering vehicle 110 to charging pad 117 or another designated object, such as steering commands, stop and accelerate commands, switch-off commands, or other vehicle commands. RNN 406 can be trained to minimize loss when coupling vehicle 110 to charging pad 117.
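For illustration only, the following sketch implements the recurrence s_t = f(U x_t + W s_{t-1}) with f chosen as tanh, assuming a PyTorch-style framework; the dimensions and the single yaw-rate output head are illustrative assumptions rather than the disclosed RNN 406.

```python
import torch
import torch.nn as nn

class TrackingRNNCell(nn.Module):
    """Illustrative recurrent cell computing s_t = tanh(U x_t + W s_{t-1}) and
    mapping the hidden state to a single steering output (e.g., a yaw rate)."""
    def __init__(self, input_dim: int, hidden_dim: int):
        super().__init__()
        self.U = nn.Linear(input_dim, hidden_dim, bias=False)   # input weights U
        self.W = nn.Linear(hidden_dim, hidden_dim, bias=False)  # recurrent weights W
        self.out = nn.Linear(hidden_dim, 1)                     # e.g., yaw-rate command

    def forward(self, x_t: torch.Tensor, s_prev: torch.Tensor):
        s_t = torch.tanh(self.U(x_t) + self.W(s_prev))  # feedback of the previous state
        return self.out(s_t), s_t

# Unrolling over a sequence of per-time-step feature vectors (e.g., from a CNN):
cell = TrackingRNNCell(input_dim=64, hidden_dim=32)
state = torch.zeros(1, 32)
for x_t in torch.randn(10, 1, 64):          # ten illustrative time steps
    command, state = cell(x_t, state)
```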
Referring to
Once a given network has been structured for a task in driving assistance system 400, the neural network is trained using training dataset 522. Training frameworks 524 can be used to enable hardware acceleration of the training process. For example, training frameworks 524 can hook into an untrained neural network 526 and enable the untrained neural network to be trained to detect designated objects in a vehicle environment 100 including charging pad 117 or, alternatively, designated objects such as a trailer hitch component, cable charging component, gas filling component or other device or component for coupling with vehicle 110.
To start the training process, initial weights (filters) may be chosen randomly or by pre-training using a deep belief network. The training cycle can be performed in either a supervised or unsupervised manner. Supervised learning is a learning technique in which training is performed as a mediated operation, such as when training dataset 522 includes input paired with the desired output for the input, or where training dataset 522 includes input having known output and the output of the neural network is manually graded. The network processes the inputs and compares the resulting outputs against a set of expected or desired outputs. Errors are then propagated back through the system. Training frameworks 524 can then adjust the weights that control untrained neural network 526. Training frameworks 524 can provide tools to monitor how well untrained neural network 526 is converging towards a model suitable for generating correct answers based on known input data. The training process occurs repeatedly as the weights of the network are adjusted to refine the output generated by the neural network. The training process can continue until the neural network reaches a statistically desired accuracy associated with a trained neural network 528. The trained neural network 528 can then be deployed to implement any number of machine learning operations such as CNN 402, sub-branch network 404 and RNN 406.
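For illustration only, a minimal supervised training loop of the kind described above is sketched below, assuming a PyTorch-style framework; the optimizer, loss function, and data loader are illustrative assumptions rather than the specific training frameworks 524.

```python
import torch
import torch.nn as nn

def train_supervised(model: nn.Module, loader, epochs: int = 10, lr: float = 1e-3):
    """Illustrative supervised loop: compare outputs to desired outputs,
    back-propagate the error, and adjust the weights repeatedly."""
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()                 # e.g., error on steering targets
    for _ in range(epochs):
        for inputs, desired in loader:     # dataset of input / desired-output pairs
            outputs = model(inputs)
            loss = loss_fn(outputs, desired)
            optimizer.zero_grad()
            loss.backward()                # propagate errors back through the network
            optimizer.step()               # adjust the weights
    return model
```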
Unsupervised learning is a learning method in which the network attempts to train itself using unlabeled data. Thus, for unsupervised learning the training dataset 522 will include input data without any associated output data. The untrained neural network 526 can learn groupings within the unlabeled input and can determine how individual inputs are related to the overall dataset. Unsupervised training can be used to generate a self-organizing map, which is a type of trained neural network 528 capable of performing operations useful in reducing the dimensionality of data. Unsupervised training can also be used to perform anomaly detection, which allows the identification of data points in an input dataset that deviate from the normal patterns of the data.
Variations on supervised and unsupervised training may also be employed. Semi-supervised learning is a technique in which training dataset 522 includes a mix of labeled and unlabeled data of the same distribution. Incremental learning is a variant of supervised learning in which input data is continuously used to further train the model. Incremental learning enables the trained neural network 528 to adapt to new data 523 without forgetting the knowledge instilled within the network during initial training, providing a result 530. Whether supervised or unsupervised, the training process for particularly deep neural networks may be too computationally intensive for a single compute node. Instead of using a single compute node, a distributed network of computational nodes can be used to accelerate the training process.
For one example, CNN 556 receives input from sensors 552 and can process the input using one or more convolutional layers to generate an intermediate feature map. This intermediate feature map can be fed into a plurality of subnetworks such as subnetworks 1-3 (557-1 to 557-3). Subnetwork 1 (557-1) can include one or more convolutional layers to detect a pad 560 (i.e., charging pad 117). Subnetwork 2 (557-2) can include one or more convolutional layers to detect free space 561. Subnetwork 3 (557-3) can include one or more convolutional layers to detect obstacles 562 or other objects. The detected pad 560, free space 561 and obstacles 562 or other objects are fed into RNN 570, which also receives vehicle dynamics 554 and the output of RNN 567. RNN 567 receives geometry conversion 558 data that provides a virtual bird's eye view of detected pad 560 and the surrounding area. Geometry conversion 558 can receive vehicle 110 sensor data to create the bird's eye view. RNN 570 can track detected pad 560, free space 561, and obstacles 562 to determine driving commands using the bird's eye view from RNN 567 and vehicle dynamics 554.
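For illustration only, the following sketch shows a shared convolutional backbone whose intermediate feature map feeds three subnetwork heads in the manner of subnetworks 557-1 to 557-3, assuming a PyTorch-style framework; the layer shapes and single-channel head outputs are illustrative assumptions.

```python
import torch
import torch.nn as nn

class MultiHeadPerception(nn.Module):
    """Illustrative backbone (cf. CNN 556) whose intermediate feature map feeds
    three subnetworks: pad detection, free-space detection, and obstacle detection."""
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU())
        self.pad_head = nn.Conv2d(32, 1, kernel_size=1)         # cf. subnetwork 1 (557-1)
        self.free_space_head = nn.Conv2d(32, 1, kernel_size=1)  # cf. subnetwork 2 (557-2)
        self.obstacle_head = nn.Conv2d(32, 1, kernel_size=1)    # cf. subnetwork 3 (557-3)

    def forward(self, images: torch.Tensor):
        feats = self.backbone(images)                           # intermediate feature map
        return (self.pad_head(feats),
                self.free_space_head(feats),
                self.obstacle_head(feats))
```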
Initially, at operation 602, data is obtained from one or more sensors describing an environment of a vehicle, e.g., vehicle 110. For example, images from front and back vision sensors 106 and 112 are obtained. Alternatively, data from other sensors can be obtained such as sensors 118-1 and 118-2 and LIDAR 119. The sensor data is fed into driving assistance system 107.
At operation 604, a designated object is detected and tracked in the environment using a DNN. For example, driving assistance system 107 can feed images from sensor data into CNN 402, which detects (filters) features of a designated object. Referring to
At operation 606, driving and steering signals are output based on commands from the DNN to autonomously steer or maneuver the vehicle to the designated object for coupling. For example, output commands 407, e.g., steering commands, from RNN 406 are forwarded to steering control system 105. Steering control system 105 can receive a continuous stream of steering commands from RNN 406 to autonomously steer or maneuver vehicle 110 to a designated object for coupling, e.g., wireless charging pad 117 or other designated objects described herein.
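For illustration only, operations 602-606 can be pictured as the following loop; the object names and methods (read, detect_and_track, execute, coupled) are hypothetical placeholders for the sensor, DNN, and steering control interfaces and are not part of the disclosed embodiments.

```python
def charging_assistance_loop(sensors, dnn, steering_control, coupled) -> None:
    """Illustrative end-to-end loop for operations 602-606: obtain sensor data,
    detect and track the designated object with the DNN, and stream the resulting
    commands to the steering control system until coupling is complete."""
    while not coupled():
        frame = sensors.read()                  # operation 602: obtain sensor data
        commands = dnn.detect_and_track(frame)  # operation 604: detect and track with DNN
        for command in commands:                # operation 606: output driving signals
            steering_control.execute(command)
```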
At operation 702, data from sensors describing an environment are obtained. For example, images from front and back vision sensors 106 and 112 are obtained. Alternatively, data from other sensors can be obtained such as sensors 118-1 and 118-2 and LIDAR 119. The sensor data is fed into driving assistance system 107.
At operation 704, an obstacle or other object is detected in the environment using a DNN. For example, sub-branch network 404 can be a branch or subnetwork of CNN 402 with a subset of convolutional layers 504 and 506 and fully connected layers 508. Sub-branch network 404 can be configured to detect obstacles or other objects in the environment of vehicle 110. For one example, images from front vision sensors 106 and rear vision sensors 112 may show an obstacle (e.g., a person or another vehicle) within the environment of vehicle 110. Sub-branch network 404 can be modeled and trained to detect such obstacles and provide a warning to a driver. Alternatively, sub-branch network 404 can be trained and configured to detect other types of objects such as a charging pad, cables, segmentation of the ground in the surrounding environment, parking lines, etc. The detection of such other objects can be taken into consideration as a loss during training of CNN 402.
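For illustration only, the following sketch shows one way such an auxiliary detection could be taken into account as an additional loss term during training, assuming a PyTorch-style framework; the particular loss functions and the weighting factor are illustrative assumptions, not the disclosed training scheme.

```python
import torch
import torch.nn as nn

def multi_task_loss(pad_pred: torch.Tensor, pad_target: torch.Tensor,
                    obstacle_pred: torch.Tensor, obstacle_target: torch.Tensor,
                    obstacle_weight: float = 0.5) -> torch.Tensor:
    """Illustrative combined loss: the main pad-detection loss plus a weighted
    term for the sub-branch network's obstacle / other-object detection."""
    pad_loss = nn.functional.mse_loss(pad_pred, pad_target)
    obstacle_loss = nn.functional.binary_cross_entropy_with_logits(
        obstacle_pred, obstacle_target)
    return pad_loss + obstacle_weight * obstacle_loss
```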
At operation 706, a warning is provided of the detected obstacle. For example, referring to
In the foregoing specification, the invention has been described with reference to specific examples and exemplary embodiments thereof. It will, however, be evident that various modifications and changes may be made thereto without departing from the broader spirit and scope of disclosed examples and embodiments. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.