The present disclosure relates generally to autonomous vehicles. More particularly, the present disclosure is related to a modular enclosure for an in-vehicle computer system.
An in-vehicle computer system resides inside an autonomous vehicle and enables the autonomous vehicle to operate autonomously. The in-vehicle computer system includes many modules and components. The modules and components of the in-vehicle computer system are typically scattered in different chassis at different locations inside the autonomous vehicle.
This disclosure recognizes various problems and previously unmet needs related to autonomous vehicle technology, and more specifically to the lack of a modular enclosure that is able to house a set of components to facilitate the autonomous function of an autonomous vehicle while meeting a set of requirements comprising a space requirement, a communication requirement, a cooling requirement, and a shock absorption requirement. This disclosure further recognizes the lack of an enclosure that can be integrated into any semi-tractor truck and “plug in” to the truck to connect to vehicle control components (e.g., brakes, steering wheel, engine, etc.) and sensors (e.g., cameras, Radars, etc.) and facilitate the autonomous operations of the autonomous vehicle.
Certain embodiments of the present disclosure provide unique technical solutions to technical problems of current autonomous vehicle technology, including those problems described herein, to improve autonomous vehicle technology. More specifically, the present disclosure contemplates an unconventional enclosure that is configured to house a set of components (that are essential to facilitate the autonomous function of an autonomous vehicle) within a confined space while being able to meet the set of requirements. In particular, certain systems are designed and deployed with respect to the enclosure to allow the set of components to fit within the confined space as defined by the space requirements for the enclosure. The set of components includes a sensor processing unit, a compute unit, a vehicle control unit, a data diagnostics unit, and a communication gateway.
The disclosed system is configured to provide an enclosure that is modular, meaning that the enclosure is configured to be integrated into any semi-truck tractor unit, be connected to the sensors (and other components that control the movement of the vehicle), and enable the vehicle to operate autonomously. Therefore, the enclosure obviates the need for cables, chassis, and disparate components that make up the computing devices onboard an autonomous vehicle. For example, the modular enclosure may be installed in any vehicle without the need for additional cabling. This, in turn, reduces the installation time. Furthermore, in the long term, not having additional cabling reduces maintenance and tripping hazards. The modular enclosure provides connectors to be connected to the sensors and other components of the vehicle upon installation.
In current practices, if the set of components is placed within a confined space, it would be challenging to provide appropriate power signals to each component so that each component is able to function with at least a minimum required power. Further, when the set of components is placed within a confined space, the components will be overheated as a result of electrical signals passing through layers and wires on circuit boards of the blades on which the components are placed. It would also be challenging to provide appropriate cooling to the components due to the components being closely spaced. If cables are used to distribute power to the components and enable the data communications of the components, the set of components will not fit within the required threshold space and/or satisfy the space requirements, communication requirements, or cooling requirements. The same applies to the blades on which the components are deployed. As the blades are placed closely together within a confined space, it becomes challenging to provide appropriate cooling, an appropriate power distribution, and communication lines to the blades.
The disclosed system provides a technical solution to these and other challenges in autonomous vehicle technology. For example, the disclosed system is configured to provide an appropriate power distribution among the components to allow communication throughput (e.g., data rate) more than a threshold communication throughput while being confined within a stringent physical space to satisfy the space requirement. For example, an unconventional backplane is designed and implemented for the enclosure, where the backplane includes transmission lines, circuit boards, bus wires, and the like configured to enable communicating power signals to the components, data communications among the components, and data communications from the enclosure to other devices, such as the vehicle subsystems, external devices, and the like.
The disclosed system is further configured to provide appropriate cooling to the components (implemented on the blades) by implementing a cooling system that is configured to satisfy the cooling requirement for the enclosure. The cooling requirement may indicate that the temperature within the enclosure is less than a threshold temperature (e.g., less than 18° C., less than 22° C., and the like). In one example, the cooling system may be coolant-based (e.g., liquid-based) and configured to pump the coolant through the pipes that circulate the coolant between the blades and a heat exchanger.
The disclosed system is further configured to provide shock absorption to the enclosure, for example, by implementing a shock absorption system to dampen the vibrations that may be generated from the movements of the autonomous vehicle while the autonomous vehicle is traveling on a road. Therefore, the safety and durability of the components are improved, e.g., by implementing the shock absorption system and the cooling system. Furthermore, the required communication throughput for the components is provided by the backplane, thereby obviating the need for cabling that suffers from difficulty in maintenance and causes tripping hazards. In addition, cabling is more prone to electrical surges and damage, at least because cables are usually left on the floor.
In this manner, the components within the enclosure are kept secure from physical damage that may occur due to overheating and/or bumps on the road on which the autonomous vehicle is traveling. Furthermore, the modular enclosure provides technical improvements to the current autonomous vehicle technology by providing the ability to deploy and integrate the enclosure into any semi-tractor truck with minimum to no cabling, which reduces the deployment time and provides serviceability, meaning that the modular enclosure may be provided to autonomous vehicles as a “plug-in” solution to facilitate the autonomous navigation of the autonomous vehicles.
In certain embodiments, a system comprises an autonomous vehicle and an enclosure associated with the autonomous vehicle. The autonomous vehicle is configured to travel on a road autonomously. The enclosure is configured to house a set of components that facilitates the autonomous function of the autonomous vehicle. The set of components comprises a sensor processing unit, a compute unit, a vehicle control unit, a communication gateway, and a data diagnostics unit. The sensor processing unit is configured to detect objects from sensor data captured by at least one sensor. The compute unit is configured to determine a navigation path for the autonomous vehicle based at least in part upon an input signal received from the sensor processing unit. The vehicle control unit is configured to control the autonomous function of the autonomous vehicle based at least in part upon input signals received from other components from among the set of components. The communication gateway is configured to establish communications between the autonomous vehicle and other devices. The data diagnostics unit is configured to determine health data for at least one component of the autonomous vehicle. The enclosure is further configured to meet a set of requirements comprising a space requirement, a communication requirement, a cooling requirement, and a shock absorption requirement. The space requirement indicates that the enclosure is to have a dimension less than a threshold dimension. The communication requirement indicates to provide transmission lines to facilitate a communication throughput more than a threshold communication throughput among the set of components. The cooling requirement indicates to satisfy a threshold temperature within the enclosure. The shock absorption requirement indicates to satisfy a threshold damping factor. The enclosure comprises a backplane that is configured to satisfy the communication requirement. 
The backplane comprises the transmission lines that enable communications among the components of the set of components. The backplane is connected to a set of manifolds configured to circulate a coolant between a heat exchanger and the set of components. The backplane further comprises a set of connectors to connect at least one of the set of components to the at least one sensor. The backplane is positioned across a side of the set of components and against a back wall of the enclosure.
Certain embodiments of this disclosure may include some, all, or none of these advantages. These advantages and other features will be more clearly understood from the following detailed description taken in conjunction with the accompanying drawings and claims.
For a more complete understanding of this disclosure, reference is now made to the following brief description, taken in connection with the accompanying drawings and detailed description, wherein like reference numerals represent like parts.
As described above, previous technologies fail to provide efficient, reliable, and safe solutions to house a set of components that facilitate the autonomous function of an autonomous vehicle while meeting a set of requirements. The present disclosure provides various systems, methods, and devices to provide a modular enclosure configured to house a set of components that facilitate the autonomous function of an autonomous vehicle while meeting a set of requirements. Embodiments of the present disclosure and its advantages may be understood by referring to
In general, the system 100 provides several practical applications, technical improvements, and technical advantages that overcome the previously unmet technical problems in autonomous vehicle technology, which are described below. The system 100 is configured to reduce the form-factor of the set of components 104. In particular, certain systems are designed and deployed with respect to the enclosure 102 to allow the essential set of components 104 to fit within the confined space as defined by the space requirements 150 for the enclosure 102. The system 100 is further configured to provide the enclosure 102 that is modular, meaning that the enclosure 102 is configured to be integrated into any semi-truck tractor unit, be connected to the sensors 346 (and other components that control the movement of the vehicle), and enable the vehicle to operate autonomously. Therefore, the enclosure 102 obviates the need for cables, chassis, and scattered, disparate components that make up the computing devices onboard an autonomous vehicle 302. For example, the modular enclosure 102 may be installed in any vehicle without the need for additional cabling that may include sensor cables and a truck harness. The modular enclosure 102 provides connectors to be connected to the sensors 346 and other components of the vehicle upon installation.
In current practices, if the set of components 104 is placed within a confined space, it would be challenging to provide appropriate power signals to each component 104 so that each component 104 is able to function with at least a minimum required power. Further, when the set of components 104 is placed within a confined space, the components 104 will be overheated as a result of electrical signals passing through layers, components, and wires on circuit boards of the blades on which the components 104 are placed. It would also be challenging to provide appropriate cooling to the components 104 due to the components being closely spaced. If cables are used to distribute power signals to the components 104 and enable the data communications of the components 104, the set of components 104 would not fit within the required threshold space or satisfy the space requirements 150 in current practices. The same applies to the blades 118 on which the components 104 are deployed. As the blades 118 are placed closely together within a confined space, it becomes challenging to provide appropriate cooling and an appropriate power distribution to the blades 118.
The system 100 provides a solution to these and other challenges in autonomous vehicle technology. For example, the system 100 is configured to provide an appropriate power distribution and sequencing among the components 104 to allow communication throughput (e.g., data rate) more than a threshold communication throughput 160. For example, an unconventional backplane 204 (see
The system 100 is further configured to provide appropriate cooling to the components 104 (and blades 118) by implementing a cooling system 106 that is configured to satisfy the cooling requirement 154 for the enclosure 102. The cooling requirement 154 may indicate that the temperature within the enclosure 102 is less than a threshold temperature 162 (e.g., less than 18° C., less than 22° C., and the like). In one example, the cooling system 106 may be liquid-based and configured to pump cooled liquid through the pipes that circulate through the blades 118.
The system 100 is further configured to provide shock absorption to the enclosure 102, for example, by implementing a shock absorption system 108 to dampen the vibrations that may be generated from movements of the autonomous vehicle 302 while the autonomous vehicle 302 is traveling on a road. Therefore, the safety and durability of the components 104 are improved, e.g., by implementing the shock absorption system 108 and the cooling system 106. Furthermore, the required communication throughput for the components 104 is provided by the backplane (204 in
In this manner, the components 104 are kept secure from physical damage that may occur due to overheating and/or bumps on the road on which the autonomous vehicle 302 is traveling. Furthermore, the modular enclosure 102 provides technical improvements to the current autonomous vehicle technology by providing the ability to deploy and integrate the enclosure 102 into any semi-tractor truck with minimum to no cabling, which reduces the deployment time. Furthermore, the modular enclosure 102 provides an additional technical improvement to the current autonomous vehicle technology by providing serviceability, meaning that the modular enclosure 102 may be provided to autonomous vehicles as a “plug-in” product to facilitate the autonomous navigation of the autonomous vehicles.
Network 110 may include any interconnecting system capable of transmitting audio, video, signals, data, messages, or any combination of the preceding. Network 110 may include all or a portion of a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), a personal area network (PAN), a wireless PAN (WPAN), an overlay network, a software-defined network (SDN), a virtual private network (VPN), a packet data network (e.g., the Internet), a mobile telephone network (e.g., cellular networks, such as 4G or 5G), a plain old telephone service (POTS) network, a wireless data network (e.g., WiFi, WiGig, WiMAX, etc.), a long-term evolution (LTE) network, a universal mobile telecommunications system (UMTS) network, a peer-to-peer (P2P) network, a Bluetooth network, a near field communication (NFC) network, a Zigbee network, a Z-wave network, and/or any other suitable network.
In certain embodiments, the autonomous vehicle 302 may include a semi-truck tractor unit attached to a trailer to transport cargo or freight from one location to another location (see
Control device 350 may be generally configured to control the operation of the autonomous vehicle 302 and its components and to facilitate autonomous driving of the autonomous vehicle 302. The control device 350 may be further configured to determine a pathway in front of the autonomous vehicle 302 that is safe to travel and free of objects or obstacles, and navigate the autonomous vehicle 302 to travel in that pathway. This process is described in more detail in
The control device 350 may be configured to detect objects on and around a road traveled by the autonomous vehicle 302 by analyzing the sensor data 140 and/or map data 142. For example, the control device 350 may detect objects on and around the road by implementing object detection machine learning modules 146. The object detection machine learning modules 146 may be implemented using neural networks and/or machine learning algorithms for detecting objects from images, videos, infrared images, point clouds, audio feed, Radar data, etc. The object detection machine learning modules 146 are described in more detail further below. The control device 350 may receive sensor data 140 from the sensors 346 positioned on the autonomous vehicle 302 to determine a safe pathway to travel. The sensor data 140 may include data captured by the sensors 346.
Sensors 346 may be configured to capture any object within their detection zones or fields of view, such as landmarks, lane markers, lane boundaries, road boundaries, vehicles, pedestrians, road/traffic signs, among others. In some embodiments, the sensors 346 may be configured to detect rain, fog, snow, and/or any other weather condition. The sensors 346 may include a light detection and ranging (LiDAR) sensor, a Radar sensor, a video camera, an infrared camera, an ultrasonic sensor system, a wind gust detection system, a microphone array, a thermocouple, a humidity sensor, a barometer, an inertial measurement unit, a positioning system, an infrared sensor, a motion sensor, a rain sensor, and the like. In some embodiments, the sensors 346 may be positioned around the autonomous vehicle 302 to capture the environment surrounding the autonomous vehicle 302. See the corresponding description of
The control device 350 is described in greater detail in
The processor 122 may be one of the data processors 370 described in
The processor(s) 122 may include a sensor processing unit 124, a compute unit 126, a vehicle control unit 128, and a data diagnostics unit 130. The processor(s) 122 are operably coupled to one another and to other components of the control device 350.
The sensor processing unit 124 may include one or more hardware and/or software processors that are configured to process the sensor data 140 and detect objects from the sensor data 140 captured by the sensors 346. In certain embodiments, the sensor processing unit 124 may perform pre-processing on the sensor data 140 before communicating it to another unit. The pre-processing may include initial identification of objects detected from the sensor data 140, among other operations. In certain embodiments, the sensor processing unit 124 may communicate data with any of the vehicle subsystems (340 in
The compute unit 126 may include one or more hardware and/or software processors that are configured to determine a navigation path for the autonomous vehicle 302 based at least on the input signals received from the sensor processing unit 124. In certain embodiments, the compute unit 126 may communicate data with any of the vehicle subsystems (340 in
The vehicle control unit 128 may include one or more hardware and/or software processors that are configured to control the autonomous function of the autonomous vehicle 302 based at least on the input signals received from other components of the control device 350, including the components 104. The vehicle control unit 128 may also be referred to as a vehicle control and interface unit. In certain embodiments, the vehicle control unit 128 may communicate data with any of the vehicle subsystems (340 in
The data diagnostics unit 130 may include one or more hardware and/or software processors that are configured to determine health data for each component of the autonomous vehicle 302. The health data may indicate, for example, performance percentage, capacity, utilization, fuel level, oil level, tire air level, operating temperature, and any other indications that may convey the health of a component.
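For illustration only, health data of the kind the data diagnostics unit 130 determines could be represented as a simple record with a derived overall health indication. The field names and threshold values below are hypothetical assumptions, not part of this disclosure:

```python
from dataclasses import dataclass


@dataclass
class ComponentHealth:
    """Hypothetical health-data record for one vehicle component."""
    component: str
    performance_pct: float   # performance percentage, 0-100
    utilization_pct: float   # utilization percentage, 0-100
    operating_temp_c: float  # operating temperature in Celsius

    def is_healthy(self, max_temp_c: float = 85.0) -> bool:
        # Flag a component unhealthy if it underperforms or overheats
        # (example criteria; a real diagnostics unit would use many more signals).
        return self.performance_pct >= 50.0 and self.operating_temp_c <= max_temp_c


engine = ComponentHealth("engine", performance_pct=92.0,
                         utilization_pct=60.0, operating_temp_c=70.0)
print(engine.is_healthy())  # True for these sample values
```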
Communication gateway 132 may include the network interface 134. The network interface 134 may be a component of the network communication subsystem 392 described in
The memory 136 may be one of the data storages 390 described in
Map data 142 may include a virtual map of a city or an area that includes the road traveled by an autonomous vehicle 302. In some examples, the map data 142 may include the map 458 and map database 436 (see
Driving instructions 144 may be implemented by the planning module 462 (See descriptions of the planning module 462 in
Routing plan 145 may be a plan for traveling from a start location (e.g., a first autonomous vehicle launchpad/landing pad) to a destination (e.g., a second autonomous vehicle launchpad/landing pad). For example, the routing plan 145 may specify a combination of one or more streets, roads, and highways in a specific order from the start location to the destination. The routing plan 145 may specify stages, including the first stage (e.g., moving out from a start location/launch pad), a plurality of intermediate stages (e.g., traveling along particular lanes of one or more particular street/road/highway), and the last stage (e.g., entering the destination/landing pad). The routing plan 145 may include other information about the route from the start position to the destination, such as road/traffic signs in that routing plan 145, etc.
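The staged structure described above for the routing plan 145 (a start stage, intermediate stages, and a final stage) could be sketched as a minimal data structure. The names below are illustrative assumptions, not the disclosure's implementation:

```python
from dataclasses import dataclass, field


@dataclass
class RoutingPlan:
    """Hypothetical staged routing plan: launch pad -> road segments -> landing pad."""
    start: str
    destination: str
    stages: list = field(default_factory=list)

    def add_stage(self, description: str) -> None:
        # Stages are kept in travel order: first stage, intermediates, last stage.
        self.stages.append(description)


plan = RoutingPlan(start="launchpad A", destination="landing pad B")
plan.add_stage("exit launchpad A")                       # first stage
plan.add_stage("travel in right lane of Highway 1")      # intermediate stage
plan.add_stage("enter landing pad B")                    # last stage
print(len(plan.stages))  # 3
```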
Object detection machine learning modules 146 may be implemented by the processor 122 executing software instructions 138, and may be generally configured to detect objects and obstacles from the sensor data 140. The object detection machine learning modules 146 may be implemented using neural networks and/or machine learning algorithms for detecting objects from any data type, such as images, videos, infrared images, point clouds, audio feed, Radar data, etc.
In some embodiments, the object detection machine learning modules 146 may be implemented using machine learning algorithms, such as Support Vector Machine (SVM), Naive Bayes, Logistic Regression, k-Nearest Neighbors, Decision Trees, or the like. In some embodiments, the object detection machine learning modules 146 may utilize a plurality of neural network layers, convolutional neural network layers, Long-Short-Term-Memory (LSTM) layers, Bi-directional LSTM layers, recurrent neural network layers, and/or the like, in which weights and biases of these layers are optimized in the training process of the object detection machine learning modules 146. The object detection machine learning modules 146 may be trained by a training dataset that may include samples of data types labeled with one or more objects in each sample. For example, the training dataset may include sample images of objects (e.g., vehicles, lane markings, pedestrians, road signs, obstacles, etc.) labeled with object(s) in each sample image. Similarly, the training dataset may include samples of other data types, such as videos, infrared images, point clouds, audio feed, Radar data, etc., labeled with object(s) in each sample data. The object detection machine learning modules 146 may be trained, tested, and refined by the training dataset and the sensor data 140. The object detection machine learning modules 146 use the sensor data 140 (which are not labeled with objects) to increase their accuracy of predictions in detecting objects. Similar operations and embodiments may apply for training the object detection machine learning modules 146 using the training dataset that includes sound data samples each labeled with a respective sound source and a type of sound. For example, supervised and/or unsupervised machine learning algorithms may be used to validate the predictions of the object detection machine learning modules 146 in detecting objects in the sensor data 140.
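As a minimal, hypothetical illustration of the supervised pattern described above (labeled training samples used to predict labels for new samples), one of the named algorithms, k-nearest neighbors, can be sketched in pure Python. A real object detection machine learning module 146 would operate on images, point clouds, or Radar data rather than the toy two-dimensional feature vectors used here:

```python
import math
from collections import Counter


def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest labeled samples.

    `train` is a list of (feature_vector, label) pairs.
    """
    # Sort labeled samples by Euclidean distance to the query point.
    dists = sorted((math.dist(features, query), label) for features, label in train)
    # Majority vote among the k closest samples.
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]


# Toy labeled dataset: 2-D features standing in for, e.g., object width/height.
training_set = [
    ((1.0, 1.0), "pedestrian"),
    ((1.2, 0.9), "pedestrian"),
    ((5.0, 2.0), "vehicle"),
    ((5.5, 2.2), "vehicle"),
]
print(knn_predict(training_set, (5.2, 2.1)))  # vehicle
```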
The modular enclosure 102 may be a physical structural component that is configured to house the set of components 104. The enclosure 102 is further configured to meet a set of requirements 148. The set of requirements 148 may include a space requirement 150, a communication requirement 152, a cooling requirement 154, and a shock absorption requirement 156.
The space requirement 150 may indicate that the enclosure 102 is to have a dimension less than a threshold dimension 158. In other words, the space requirement 150 may indicate that the enclosure 102 should occupy less than the threshold dimension 158 in the physical space. In one example, the threshold dimension 158 may be 36 inches×24 inches×22 inches. In the same or other examples, the threshold dimension 158 may be any suitable dimension that can fit between the driver seat 112 and the passenger seat 114 in a semi-tractor truck and has a height less than a shoulder-level height of a driver (while sitting in the driver seat 112) to allow the driver to view outside the vehicle from side and back windows. In certain embodiments, the enclosure 102 is configured to fit within a threshold volume of space. For example, the enclosure 102 may be configured to fit between a driver seat 112 and a passenger seat 114 in a cabin of the autonomous vehicle 302.
The communication requirement 152 may indicate to provide at least a threshold communication throughput 160 among the components 104, between the components 104 and the external devices 170, and between the components 104 and other components of the autonomous vehicle 302. For example, the communication requirement 152 may be met by transmission lines, cables, and bus wires capable of transmitting data with at least the threshold communication throughput 160. The communication throughput 160 may be referred to as data bitrate. In certain embodiments, the communication requirement 152 may also indicate to supply power (and voltage) signals to the components 104 to keep the circuit boards of the components 104 operational. In certain embodiments, the communication requirement 152 may also indicate to maintain signal integrity above a threshold signal integrity, such as a threshold jitter noise, a threshold data packet loss, a threshold signal bandwidth, and the like for the data communications of the components 104.
The cooling requirement 154 may indicate to satisfy a threshold temperature 162 within the enclosure 102. The threshold temperature 162 may be 18° C., 22° C., and the like. The shock absorption requirement 156 may indicate to satisfy a threshold damping factor 164. For example, as the autonomous vehicle 302 travels on a road, the enclosure 102 may experience vibrations from the movements of the autonomous vehicle 302. The shock absorption requirement 156 may be used to determine if the vibrations are dampened below the threshold damping factor 164.
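Each of the four requirements 148 described above reduces to a comparison against a threshold, so the checks can be sketched as simple predicate functions. The function names and the numeric threshold values for throughput and damping below are illustrative assumptions, not values stated in this disclosure:

```python
def meets_space_requirement(dims_in, threshold=(36, 24, 22)):
    """Each enclosure dimension (inches) must be below the threshold dimension 158."""
    return all(d < t for d, t in zip(dims_in, threshold))


def meets_communication_requirement(throughput_gbps, threshold_gbps=10.0):
    """Throughput must exceed the threshold communication throughput 160 (example value)."""
    return throughput_gbps > threshold_gbps


def meets_cooling_requirement(temp_c, threshold_c=22.0):
    """Temperature inside the enclosure must stay below the threshold temperature 162."""
    return temp_c < threshold_c


def meets_shock_requirement(damping_factor, threshold=0.7):
    """Measured damping must be below the threshold damping factor 164 (example value)."""
    return damping_factor < threshold


print(meets_space_requirement((34, 23, 20)))  # True: fits under 36x24x22 inches
print(meets_cooling_requirement(25.0))        # False: enclosure too warm
```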
The components 104 may be implemented on blades 118. Each blade 118 may be a thin server or other electronic component that is configured to fit into a bay within the enclosure 102. Each blade 118 may include a computing system that is configured to perform the operations of one or more components 104. In the illustrated embodiment, the blades 118 are vertically positioned inside the enclosure 102. In other embodiments, the blades 118 may be positioned horizontally and/or in any orientation. In the illustrated embodiment, the blades 118 are positioned above the pump 107. In other embodiments, the blades 118 may be positioned at any location with respect to the pump 107. In certain embodiments, each component 104 may be implemented in a single blade 118. Thus, if a component 104 ever needs to be updated, changed, or serviced, the blade 118 currently in place (that hosts the component 104) may be replaced with a new blade 118. In certain embodiments, a component 104 may be implemented in multiple blades 118. In certain embodiments, multiple components 104 may be implemented on a single blade 118. Each blade 118 may include a set of connectors configured to accept one or more manifolds of the cooling system 106. For example, the set of connectors on the blade 118 may be connected to the manifolds of the cooling system 106 to allow the cooled liquid to circulate through the pipes of the cooling system 106 and through the blades 118, cooling down the blades 118.
Each blade 118 may also include a set of connectors that are configured to accept transmission lines, bus wires, and the like that are connected on one side to the backplane (204 in
In certain embodiments, the enclosure 102 may include a cooling system 106 that is configured to satisfy the cooling requirement 154. The cooling system 106 may include a set of manifolds that is configured to pump and flow a cooled liquid toward the set of components 104 (implemented on the blades 118). For example, the set of manifolds may be connected to a set of pipes in which the liquid flows. The set of pipes may run through the components 104 (implemented on the blades 118). When the cooled liquid flows through the pipes from the inlets, it reduces the temperature of the electrical components on the blades 118. The cooled liquid is circulated through the blades 118. As it circulates, the liquid absorbs the heat from the electrical components on the blades 118, which warms up the liquid inside the pipes. The warmed-up liquid flows out from the outlets to one or more heat exchangers (e.g., chillers or radiators). The heat exchangers cool down the liquid so that it can be circulated back into the blades 118. The cooling system 106 further includes a pump 107 (e.g., a single pump, a dual pump, etc.) configured to pump the cooled liquid to the components 104 (or the blades 118).
In certain embodiments, the control device 350 may monitor the temperature within the enclosure 102 by temperature sensor(s) placed within the enclosure 102. The control device 350 may communicate fluid control commands 166 to the cooling system 106 to control the flow of the coolant (e.g., liquid) through the pipes. For example, if the temperature detected by the temperature sensor(s) rises above the threshold temperature 162, the control device 350 may communicate a fluid control command 166 that leads to flowing a lower-temperature liquid through the pipes that run through the blades 118. In another example, the fluid control command 166 may indicate to increase the flow rate (or circulation rate) of the cooled liquid until the temperature detected by the temperature sensor(s) falls below the threshold temperature 162.
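The feedback behavior described above, increasing coolant flow while the measured temperature exceeds the threshold, can be sketched as a simple control rule. The function name, flow-rate units, and step size are hypothetical; an actual control device 350 would use a proper control loop and hardware interfaces:

```python
def fluid_control_command(temp_c, threshold_c=22.0,
                          current_flow_rate=1.0, step=0.25, max_rate=2.0):
    """Return the coolant flow rate to command.

    While the measured temperature exceeds the threshold, step the flow
    rate up (capped at the pump's maximum); otherwise hold the current rate.
    """
    if temp_c > threshold_c:
        return min(current_flow_rate + step, max_rate)
    return current_flow_rate


print(fluid_control_command(26.0))  # 1.25: above threshold, pump harder
print(fluid_control_command(20.0))  # 1.0: below threshold, hold the rate
```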
In certain embodiments, the enclosure 102 may include a shock absorption system 108 that is configured to satisfy the shock absorption requirement 156. In certain embodiments, the shock absorption system 108 may be implemented underneath the enclosure 102. In other embodiments, the shock absorption system 108 may be implemented in any suitable location. In certain embodiments, the shock absorption system 108 may be separate from and in addition to the shock absorption system that is underneath the cabin of the autonomous vehicle 302 and above the tires.
The shock absorption system 108 may include multiple shock absorbers or dampeners positioned at different locations. For example, the shock dampeners or absorbers may be placed underneath the enclosure 102, below the blades 118, below the pump 107, between the blades 118 and the pump 107, or in any other suitable locations. Each shock absorber may be configured to absorb or dampen the vibrations caused by movements of the autonomous vehicle 302 while traveling and movements caused by a person roaming inside the cabin of the autonomous vehicle 302. In other words, each shock absorber may be configured to absorb or dampen compression and rebound of the springs and suspension. In certain embodiments, the shock absorbers may be formed from absorbing polymers, viscoelastic polymers, rubber, neoprene, silicone, etc.
In certain embodiments, the shock absorption system 108 may provide vibration data 168 to the control device 350. The vibration data 168 may indicate a damping factor and levels of movement (e.g., vertical movement) of the enclosure 102, the blades 118, and/or other components within the enclosure 102 with respect to the road. The control device 350 may analyze the vibration data 168 and determine if the shock absorption requirement 156 is met, e.g., if the damping factor indicated by the vibration data 168 is less than the threshold damping factor and/or the enclosure 102 reaches a steady state within a threshold time period (e.g., within five seconds, ten seconds, etc.). If the control device 350 determines that the shock absorption requirement 156 is not met and/or that the enclosure 102 does not reach the steady state within the threshold time period, the control device 350 may update the routing plan 145 to avoid further bumps on the road. The control device 350 may flag the shock absorption system 108 to be serviced upon arrival at a destination.
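The shock absorption check described above can be sketched as a small predicate over the vibration data. The threshold values and the dictionary field names are illustrative assumptions only.

```python
# Illustrative check of the shock absorption requirement 156; the
# thresholds and field names below are assumed, not from the disclosure.
THRESHOLD_DAMPING_FACTOR = 0.7   # assumed threshold damping factor
THRESHOLD_SETTLE_TIME_S = 5.0    # e.g., enclosure must settle within 5 s

def shock_requirement_met(vibration_data):
    """vibration_data: dict with 'damping_factor' and 'settle_time_s'.
    Returns True when the damping factor is below the threshold and the
    enclosure reaches a steady state within the threshold time period."""
    return (vibration_data["damping_factor"] < THRESHOLD_DAMPING_FACTOR
            and vibration_data["settle_time_s"] <= THRESHOLD_SETTLE_TIME_S)
```

When this returns False, a supervisory routine could update the routing plan and flag the system for service, as described above.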
Each blade 118 may be slid in or inserted into a bay 210 (e.g., a designated location) within the enclosure 102. In the illustrated embodiment, the interior of enclosure 102 may also be accessed from a side to allow insertion of the blade 118 into a respective bay 210.
The enclosure 102 may include the backplane 204 that is configured to satisfy the communication requirement (152 in
The backplane 204 may further include a set of connectors to connect at least one of the components (e.g., the sensor processing unit 124 in
The autonomous vehicle 302 may include various vehicle subsystems that support the operation of the autonomous vehicle 302. The vehicle subsystems 340 may include a vehicle drive subsystem 342, a vehicle sensor subsystem 344, a vehicle control subsystem 348, and/or network communication subsystem 392. The components or devices of the vehicle drive subsystem 342, the vehicle sensor subsystem 344, and the vehicle control subsystem 348 shown in
The vehicle drive subsystem 342 may include components operable to provide powered motion for the autonomous vehicle 302. In an example embodiment, the vehicle drive subsystem 342 may include an engine/motor 342a, wheels/tires 342b, a transmission 342c, an electrical subsystem 342d, and a power source 342e.
The vehicle sensor subsystem 344 may include a number of sensors 346 configured to sense information about an environment or condition of the autonomous vehicle 302. The vehicle sensor subsystem 344 may include one or more cameras 346a or image capture devices, a radar unit 346b, one or more thermal sensors 346c, a wireless communication unit 346d (e.g., a cellular communication transceiver), an inertial measurement unit (IMU) 346e, a laser range finder/LiDAR unit 346f, a Global Positioning System (GPS) transceiver 346g, and a wiper control system 346h. The vehicle sensor subsystem 344 may also include sensors configured to monitor internal systems of the autonomous vehicle 302 (e.g., an O2 monitor, a fuel gauge, an engine oil temperature, etc.).
The IMU 346e may include any combination of sensors (e.g., accelerometers and gyroscopes) configured to sense position and orientation changes of the autonomous vehicle 302 based on inertial acceleration. The GPS transceiver 346g may be any sensor configured to estimate a geographic location of the autonomous vehicle 302. For this purpose, the GPS transceiver 346g may include a receiver/transmitter operable to provide information regarding the position of the autonomous vehicle 302 with respect to the Earth. The radar unit 346b may represent a system that utilizes radio signals to sense objects within the local environment of the autonomous vehicle 302. In some embodiments, in addition to sensing the objects, the radar unit 346b may additionally be configured to sense the speed and the heading of the objects proximate to the autonomous vehicle 302. The laser range finder or LiDAR unit 346f may be any sensor configured to use lasers to sense objects in the environment in which the autonomous vehicle 302 is located. The cameras 346a may include one or more devices configured to capture a plurality of images of the environment of the autonomous vehicle 302. The cameras 346a may be still image cameras or motion video cameras.
Cameras 346a may be rear-facing and front-facing so that pedestrians, and any hand signals made by them or signs held by them, may be observed from all around the autonomous vehicle. These cameras 346a may include video cameras, cameras with filters for specific wavelengths, as well as any other cameras suitable to detect hand signals, hand-held traffic signs, or both hand signals and hand-held traffic signs. A sound detection array, such as a microphone or array of microphones, may be included in the vehicle sensor subsystem 344. The microphones of the sound detection array may be configured to receive audio indications of the presence of, or instructions from, authorities, including sirens and commands such as “Pull over.” These microphones are mounted, or located, on the external portion of the vehicle, specifically on the outside of the tractor portion of an autonomous vehicle. Microphones used may be any suitable type, mounted such that they are effective both when the autonomous vehicle is at rest, as well as when it is moving at normal driving speeds.
The vehicle control subsystem 348 may be configured to control the operation of the autonomous vehicle 302 and its components. Accordingly, the vehicle control subsystem 348 may include various elements such as a throttle and gear selector 348a, a brake unit 348b, a navigation unit 348c, a steering system 348d, and/or an autonomous control unit 348e. The throttle and gear selector 348a may be configured to control, for instance, the operating speed of the engine and, in turn, control the speed of the autonomous vehicle 302. The throttle and gear selector 348a may be configured to control the gear selection of the transmission. The brake unit 348b can include any combination of mechanisms configured to decelerate the autonomous vehicle 302. The brake unit 348b can slow the autonomous vehicle 302 in a standard manner, including by using friction to slow the wheels or engine braking. The brake unit 348b may include an anti-lock brake system (ABS) that can prevent the brakes from locking up when the brakes are applied. The navigation unit 348c may be any system configured to determine a driving path or route for the autonomous vehicle 302. The navigation unit 348c may additionally be configured to update the driving path dynamically while the autonomous vehicle 302 is in operation. In some embodiments, the navigation unit 348c may be configured to incorporate data from the GPS transceiver 346g and one or more predetermined maps so as to determine the driving path for the autonomous vehicle 302. The steering system 348d may represent any combination of mechanisms that may be operable to adjust the heading of autonomous vehicle 302 in an autonomous mode or in a driver-controlled mode.
The autonomous control unit 348e may represent a control system configured to identify, evaluate, and avoid or otherwise negotiate potential obstacles or obstructions in the environment of the autonomous vehicle 302. In general, the autonomous control unit 348e may be configured to control the autonomous vehicle 302 for operation without a driver or to provide driver assistance in controlling the autonomous vehicle 302. In some embodiments, the autonomous control unit 348e may be configured to incorporate data from the GPS transceiver 346g, the radar unit 346b, the LiDAR unit 346f, the cameras 346a, and/or other vehicle subsystems to determine the driving path or trajectory for the autonomous vehicle 302.
The network communication subsystem 392 may comprise network interfaces, such as routers, switches, modems, and/or the like. The network communication subsystem 392 may be configured to establish communication between the autonomous vehicle 302 and other systems, servers, etc. The network communication subsystem 392 may be further configured to send and receive data from and to other systems.
Many or all of the functions of the autonomous vehicle 302 can be controlled by the in-vehicle control computer 350. The in-vehicle control computer 350 may include at least one data processor 370 (which can include at least one microprocessor) that executes processing instructions 380 stored in a non-transitory computer-readable medium, such as the data storage device 390 or memory. The in-vehicle control computer 350 may also represent a plurality of computing devices that may serve to control individual components or subsystems of the autonomous vehicle 302 in a distributed fashion. In some embodiments, the data storage device 390 may contain processing instructions 380 (e.g., program logic) executable by the data processor 370 to perform various methods and/or functions of the autonomous vehicle 302, including those described with respect to
The data storage device 390 may contain additional instructions as well, including instructions to transmit data to, receive data from, interact with, or control one or more of the vehicle drive subsystem 342, the vehicle sensor subsystem 344, and the vehicle control subsystem 348. The in-vehicle control computer 350 can be configured to include a data processor 370 and a data storage device 390. The in-vehicle control computer 350 may control the function of the autonomous vehicle 302 based on inputs received from various vehicle subsystems (e.g., the vehicle drive subsystem 342, the vehicle sensor subsystem 344, and the vehicle control subsystem 348).
The sensor fusion module 402 can perform instance segmentation 408 on image and/or point cloud data items to identify an outline (e.g., boxes) around the objects and/or obstacles located around the autonomous vehicle. The sensor fusion module 402 can perform temporal fusion 410 where objects and/or obstacles from one image and/or one frame of point cloud data item are correlated with or associated with objects and/or obstacles from one or more images or frames subsequently received in time.
The sensor fusion module 402 can fuse the objects and/or obstacles from the images obtained from the camera and/or point cloud data item obtained from the LiDAR sensors. For example, the sensor fusion module 402 may determine, based on the locations of two cameras, that one half of a vehicle located in front of the autonomous vehicle captured by one of the cameras is the same vehicle as that captured by another camera. The sensor fusion module 402 may send the fused object information to the tracking or prediction module 446 and the fused obstacle information to the occupancy grid module 460. The in-vehicle control computer may include the occupancy grid module 460 which can retrieve landmarks from a map database 458 stored in the in-vehicle control computer. The occupancy grid module 460 can determine drivable areas and/or obstacles from the fused obstacles obtained from the sensor fusion module 402 and the landmarks stored in the map database 458. For example, the occupancy grid module 460 can determine that a drivable area may include a speed bump obstacle.
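The temporal fusion 410 step above, where detections in one frame are correlated with detections in the next, can be sketched as a greedy overlap-based association. The box format `(x1, y1, x2, y2)`, the IoU metric, and the 0.3 threshold are illustrative assumptions; the disclosure does not specify the association method.

```python
# Minimal sketch of temporal fusion: associate outline boxes from one
# frame with boxes from the next frame by greatest overlap (IoU).
def iou(a, b):
    """Intersection-over-union of two boxes (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def associate(prev_boxes, curr_boxes, min_iou=0.3):
    """Return {prev_index: curr_index} for best-overlap pairs."""
    matches = {}
    for i, p in enumerate(prev_boxes):
        best = max(range(len(curr_boxes)),
                   key=lambda j: iou(p, curr_boxes[j]), default=None)
        if best is not None and iou(p, curr_boxes[best]) >= min_iou:
            matches[i] = best
    return matches
```

A production tracker would also handle births and deaths of tracks and resolve competing matches; this sketch only shows the correlation idea.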
As shown in
The radar 456 on the autonomous vehicle can scan an area surrounding the autonomous vehicle or an area towards which the autonomous vehicle is driven. The Radar data may be sent to the sensor fusion module 402 that can use the Radar data to correlate the objects and/or obstacles detected by the radar 456 with the objects and/or obstacles detected from both the LiDAR point cloud data item and the camera image. The Radar data also may be sent to the tracking or prediction module 446 that can perform data processing on the Radar data to track objects by object tracking module 448 as further described below.
The in-vehicle control computer may include a tracking or prediction module 446 that receives the locations of the objects from the point cloud and the objects from the image, and the fused objects from the sensor fusion module 402. The tracking or prediction module 446 also receives the Radar data with which the tracking or prediction module 446 can track objects by object tracking module 448 from one point cloud data item and one image obtained at one time instance to another (or the next) point cloud data item and another image obtained at another subsequent time instance.
The tracking or prediction module 446 may perform object attribute estimation 450 to estimate one or more attributes of an object detected in an image or point cloud data item. The one or more attributes of the object may include a type of object (e.g., pedestrian, car, or truck, etc.). The tracking or prediction module 446 may perform behavior prediction 452 to estimate or predict the motion pattern of an object detected in an image and/or a point cloud. The behavior prediction 452 can be performed to detect a location of an object in a set of images received at different points in time (e.g., sequential images) or in a set of point cloud data items received at different points in time (e.g., sequential point cloud data items). In some embodiments, the behavior prediction 452 can be performed for each image received from a camera and/or each point cloud data item received from the LiDAR sensor. In some embodiments, the tracking or prediction module 446 can reduce computational load by performing behavior prediction 452 only on every other image, or after every pre-determined number of images received from a camera or point cloud data items received from the LiDAR sensor (e.g., after every two images or after every three point cloud data items).
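The load-reduction idea above, running prediction only on every Nth received frame, can be sketched as follows. The function names are stand-ins; `predict_behavior` represents the actual behavior prediction 452.

```python
# Sketch of the "predict on every Nth frame" load-reduction strategy.
def process_stream(frames, predict_behavior, every_n=3):
    """Run the (expensive) prediction only on every Nth frame, skipping
    the frames in between to reduce computational load."""
    results = []
    for idx, frame in enumerate(frames):
        if idx % every_n == 0:
            results.append(predict_behavior(frame))
    return results
```

With `every_n=2` this predicts on every other image, matching the first example given above.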
The behavior prediction 452 feature may determine the speed and direction of the objects that surround the autonomous vehicle from the Radar data, where the speed and direction information can be used to predict or determine motion patterns of objects. A motion pattern may comprise predicted trajectory information of an object over a pre-determined length of time in the future after an image is received from a camera. Based on the motion pattern predicted, the tracking or prediction module 446 may assign motion pattern situational tags to the objects (e.g., “located at coordinates (x,y),” “stopped,” “driving at 50 mph,” “speeding up” or “slowing down”). The situational tags can describe the motion pattern of the object. The tracking or prediction module 446 may send the one or more object attributes (e.g., types of the objects) and motion pattern situational tags to the planning module 462. The tracking or prediction module 446 may perform an environment analysis 454 using any information acquired by system 400 and any number and combination of its components.
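The tag assignment above can be sketched as a simple mapping from Radar-derived speed estimates to situational tags. The speed thresholds are assumptions; the tag strings echo the examples given in the text.

```python
# Illustrative assignment of motion pattern situational tags from speed
# estimates; the 0.5 and 1.0 mph thresholds are assumed values.
def situational_tag(speed_mph, prev_speed_mph):
    """Return a situational tag for an object given its current and
    previously observed speeds (e.g., from Radar data)."""
    if speed_mph < 0.5:
        return "stopped"
    if speed_mph > prev_speed_mph + 1.0:
        return "speeding up"
    if speed_mph < prev_speed_mph - 1.0:
        return "slowing down"
    return f"driving at {round(speed_mph)} mph"
```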
The in-vehicle control computer may include the planning module 462 that receives the object attributes and motion pattern situational tags from the tracking or prediction module 446, the drivable area and/or obstacles, and the vehicle location and pose information from the fused localization module 426 (further described below).
The planning module 462 can perform navigation planning 464 to determine a set of trajectories on which the autonomous vehicle can be driven. The set of trajectories can be determined based on the drivable area information, the one or more object attributes of objects, the motion pattern situational tags of the objects, and the location of the obstacles. In some embodiments, the navigation planning 464 may include determining an area next to the road where the autonomous vehicle can be safely parked in a case of emergencies. The planning module 462 may include behavioral decision making 466 to determine driving actions (e.g., steering, braking, throttle) in response to determining changing conditions on the road (e.g., traffic light turned yellow, or the autonomous vehicle is in an unsafe driving condition because another vehicle drove in front of the autonomous vehicle and in a region within a pre-determined safe distance of the location of the autonomous vehicle). The planning module 462 performs trajectory generation 468 and selects a trajectory from the set of trajectories determined by the navigation planning operation 464. The selected trajectory information may be sent by the planning module 462 to the control module 470.
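The "select a trajectory from the set of trajectories" step can be sketched as scoring each candidate and picking the best. This toy version scores only obstacle clearance; a real planner would also weigh progress, comfort, and the situational tags described above, and the trajectory format here is an assumption.

```python
# Toy sketch of trajectory selection: among candidate paths, pick the
# one whose closest approach to a known obstacle is largest.
def select_trajectory(candidates, obstacle_xy):
    """candidates: list of paths, each a list of (x, y) waypoints.
    Returns the path with the greatest minimum clearance."""
    def clearance(traj):
        return min(((x - obstacle_xy[0]) ** 2 +
                    (y - obstacle_xy[1]) ** 2) ** 0.5 for x, y in traj)
    return max(candidates, key=clearance)
```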
The in-vehicle control computer may include a control module 470 that receives the proposed trajectory from the planning module 462 and the autonomous vehicle location and pose from the fused localization module 426. The control module 470 may include a system identifier 472. The control module 470 can perform a model-based trajectory refinement 474 to refine the proposed trajectory. For example, the control module 470 can apply filtering (e.g., Kalman filter) to make the proposed trajectory data smooth and/or to minimize noise. The control module 470 may perform the robust control 476 by determining, based on the refined proposed trajectory information and current location and/or pose of the autonomous vehicle, an amount of brake pressure to apply, a steering angle, a throttle amount to control the speed of the vehicle, and/or a transmission gear. The control module 470 can send the determined brake pressure, steering angle, throttle amount, and/or transmission gear to one or more devices in the autonomous vehicle to control and facilitate precise driving operations of the autonomous vehicle.
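The trajectory refinement step above can be illustrated with a simpler smoother than the Kalman filter mentioned in the text: an exponential moving average, which likewise suppresses noise in the proposed trajectory samples. Using it here in place of a Kalman filter is an assumption made for brevity.

```python
# Minimal sketch of trajectory smoothing: exponential moving average
# over a 1-D sequence of trajectory samples (stand-in for the Kalman
# filtering mentioned in model-based trajectory refinement 474).
def smooth(points, alpha=0.5):
    """Return the exponentially smoothed sequence; alpha in (0, 1]
    controls how strongly new samples override the running estimate."""
    out, s = [], None
    for p in points:
        s = p if s is None else alpha * p + (1 - alpha) * s
        out.append(s)
    return out
```

The refined samples would then feed robust control 476 to compute brake pressure, steering angle, throttle, and gear.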
The deep image-based object detection 424 performed by the image-based object detection module 418 can also be used to detect landmarks (e.g., stop signs, speed bumps, etc.) on the road. The in-vehicle control computer may include a fused localization module 426 that obtains landmarks detected from images, the landmarks obtained from a map database 436 stored on the in-vehicle control computer, the landmarks detected from the point cloud data item by the LiDAR-based object detection module 412, the speed and displacement from the odometer sensor 444, or a rotary encoder, and the estimated location of the autonomous vehicle from the GPS/IMU sensor 438 (e.g., GPS sensor 440 and IMU sensor 442) located on or in the autonomous vehicle. Based on this information, the fused localization module 426 can perform a localization operation 428 to determine a location of the autonomous vehicle, which can be sent to the planning module 462 and the control module 470.
The fused localization module 426 can estimate pose 430 of the autonomous vehicle based on the GPS and/or IMU sensors 438. The pose of the autonomous vehicle can be sent to the planning module 462 and the control module 470. The fused localization module 426 can also estimate the status (e.g., location, possible angle of movement) of the trailer unit (e.g., trailer status estimation 434) based on, for example, the information provided by the IMU sensor 442 (e.g., angular rate and/or linear velocity). The fused localization module 426 may also check the map content 432.
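A common way to fuse two independent position estimates, such as one from GPS and one from odometry, is inverse-variance weighting, shown below as a one-dimensional sketch. The disclosure does not specify the fusion method, so this particular scheme is an assumption chosen for illustration.

```python
# Hedged sketch of fusing two scalar position estimates (e.g., GPS and
# odometry) by inverse-variance weighting: the estimate with the lower
# variance (higher confidence) receives the larger weight.
def fuse_positions(est_a, var_a, est_b, var_b):
    """Return (fused_estimate, fused_variance) for two scalar estimates."""
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused, fused_var
```

Note that the fused variance is smaller than either input variance, reflecting the gain from combining sensors.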
In the example of
The transmission lines 804a-f are generally used for modeling physical traces on the circuit board that form the circuit 800. The transmission lines 804a-f may be configured to emulate the physical properties of the physical traces. For example, the physical traces may be electrically conducting copper lines that are manufactured on the circuit board 800. The transmission lines are generally copper lines or traces that are designed to conduct electromagnetic signals. A transmission line may be formed over a dielectric material that forms the circuit board. The capacitors 806a-b generally represent or model parasitic capacitance along the transmission lines. The via 806 may be an instance of a via discussed with respect to
The first section 901 of the circuit 900 may include a PCIe interface 908, transmission lines 910a-f, capacitors 912a-b, and a via 914. The PCIe interface 908 may be used as a driver to create electrical signals. The electrical signals may flow through the output ports of the PCIe interface 908 toward the transmission lines 910a and 910c, respectively. Ultimately, the electrical signals may flow through the communication paths from the first component 902 to the second component 904 and to the third component 906 and reach the PCIe interface 928.
The transmission lines 910a-f are generally used for modeling physical traces on the circuit board at the first section 901 of the circuit 900. The transmission lines 910a-f may be configured to emulate the physical properties of the physical traces. For example, the physical traces may be electrically conducting copper lines that are manufactured on the circuit board 900. The transmission lines are generally copper lines or traces that are designed to conduct electromagnetic signals. A transmission line may be formed over a dielectric material that forms the circuit board. The capacitors 912a-b generally represent parasitic capacitance along the transmission lines.
The via 914 generally represents an electrical connection between copper layers in a printed circuit board (PCB) that forms the circuit 900. The via 914 may be configured to emulate an electrical connection between a component on one layer of the circuit board to another component on another layer of the circuit board. The via 914 may be an instance of a via discussed with respect to
The s-parameter component 916 may be configured to model the connectors that are used to connect the first component 902 to the second component 904. The first component 902 may be connected to the second component 904 via the s-parameter component 916.
The second section 903 of the circuit 900 may include transmission lines 918a-f and vias 920a-b. The transmission lines 918a-f may represent physical traces on the circuit board at the second section 903. In one example, transmission lines 918a-f may be substantially similar to other transmission lines 910a-f. For example, transmission lines 918a-f may substantially have the same or similar physical properties (such as resistance, length, width, material, etc.) as the transmission lines 910a-f. In other examples, transmission lines 918a-f may have different physical properties compared to other transmission lines 910a-f. The vias 920a-b may be configured to emulate an electrical connection between a component on one layer of the circuit board to another component on another layer of the circuit board. The vias 920a-b may be an instance of a via discussed with respect to
The third section 905 of the circuit 900 may include transmission lines 924a-f, vias 926a-b, and a PCIe interface 928. The transmission lines 924a-f may represent physical traces on the circuit board at the third section 905. In one example, transmission lines 924a-f may be substantially similar to other transmission lines on the circuit board 900. For example, the transmission lines 924a-f may substantially have the same or similar physical properties (such as resistance, length, width, material, etc.) as other transmission lines. In other examples, the transmission lines 924a-f may have different physical properties compared to other transmission lines on the circuit board 900.
The vias 926a-b may be configured to emulate an electrical connection between a component on one layer of the circuit board to another component on another layer of the circuit board. The vias 926a-b may be instances of a via discussed with respect to
The PCIe interface 928 may be configured to emulate a receiver that receives the electrical signals from the PCIe interface 908, where the electrical signals travel through the communication paths from the PCIe interface 908 toward the PCIe interface 928. The differential electrical signals from two outputs of the PCIe interface 908 may travel through the transmission lines 910a-f, 918a-f, and 924a-f, capacitors 912a-b, vias 914, 920a-b, and 926a-b, and s-parameter components 916, 922 to reach the PCIe interface 928.
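The cascaded signal path described above, driver, transmission line segments, vias, connectors, and receiver, is commonly analyzed by multiplying per-element two-port matrices in path order. The sketch below uses ABCD (transmission) matrices with purely illustrative element values; the disclosure does not specify this analysis method, and the helper names are assumptions.

```python
# Simplified two-port cascade model of a signal path: multiply the ABCD
# matrices of the elements in the order the signal traverses them.
def cascade(matrices):
    """Multiply 2x2 ABCD matrices (a, b, c, d) in signal-path order."""
    a, b, c, d = 1.0, 0.0, 0.0, 1.0  # identity (empty path)
    for (a2, b2, c2, d2) in matrices:
        a, b, c, d = (a * a2 + b * c2, a * b2 + b * d2,
                      c * a2 + d * c2, c * b2 + d * d2)
    return a, b, c, d

def series_impedance(z):
    """ABCD matrix of a series element (e.g., a via's inductance)."""
    return (1.0, z, 0.0, 1.0)

def shunt_admittance(y):
    """ABCD matrix of a shunt element (e.g., a parasitic capacitance)."""
    return (1.0, 0.0, y, 1.0)
```

Each transmission line, via, and connector (s-parameter) block in the circuit 900 would contribute one matrix to the cascade, and the overall matrix relates the driver's output to what the receiver sees.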
While several embodiments have been provided in this disclosure, it should be understood that the disclosed systems and methods might be embodied in many other specific forms without departing from the spirit or scope of this disclosure. The present examples are to be considered as illustrative and not restrictive, and the intention is not to be limited to the details given herein. For example, the various elements or components may be combined or integrated into another system or certain features may be omitted, or not implemented.
In addition, techniques, systems, subsystems, and methods described and illustrated in the various embodiments as discrete or separate may be combined or integrated with other systems, modules, techniques, or methods without departing from the scope of this disclosure. Other items shown or discussed as coupled or directly coupled or communicating with each other may be indirectly coupled or communicating through some interface, device, or intermediate component whether electrically, mechanically, or otherwise. Other examples of changes, substitutions, and alterations are ascertainable by one skilled in the art and could be made without departing from the spirit and scope disclosed herein.
To aid the Patent Office, and any readers of any patent issued on this application in interpreting the claims appended hereto, applicants note that they do not intend any of the appended claims to invoke 35 U.S.C. § 112 (f) as it exists on the date of filing hereof unless the words “means for” or “step for” are explicitly used in the particular claim.
Implementations of the disclosure can be described in view of the following clauses, the features of which can be combined in any reasonable manner.
Clause 1. A system comprising:
Clause 2. The system of Clause 1, wherein the enclosure is modular such that the enclosure is configured to be integrated with a semi-tractor truck.
Clause 3. The system of Clause 1, wherein the communication requirement further indicates to provide the communication throughput between at least one component of the set of components and the other devices.
Clause 4. The system of Clause 1, wherein the other devices comprise at least one of:
Clause 5. The system of Clause 1, wherein the threshold dimension is 36 inches×24 inches×22 inches.
Clause 6. The system of Clause 1, wherein the enclosure is placed within a cabin of the autonomous vehicle.
Clause 7. The system of Clause 1, wherein the enclosure is placed between a driver seat and a passenger seat of the autonomous vehicle.
Clause 8. The system of Clause 1, wherein the cooling requirement is satisfied by a liquid-based cooling system.
Clause 9. The system of Clause 1, wherein the communication requirement further indicates to facilitate the communication throughput more than the threshold communication throughput between at least one component of the set of components and the other devices.
Clause 10. The system of Clause 1, wherein the enclosure further comprises a cooling system configured to satisfy the cooling requirement.
Clause 11. A physical enclosure, comprising:
Clause 12. The physical enclosure of Clause 11, wherein a cooling system comprises:
Clause 13. The physical enclosure of Clause 11, wherein:
Clause 14. The physical enclosure of Clause 11, wherein:
Clause 15. The physical enclosure of Clause 14, wherein:
Clause 16. The physical enclosure of Clause 11, wherein the autonomous vehicle is a semi-truck tractor unit attached to a trailer.
Clause 17. The physical enclosure of Clause 11, wherein the at least one sensor comprises a camera, a light detection and ranging sensor, a motion sensor, a Radar sensor, or an infrared sensor.
Clause 18. The physical enclosure of Clause 11, wherein the physical enclosure is modular such that the physical enclosure is configured to be integrated with a semi-tractor truck.
Clause 19. The physical enclosure of Clause 11, wherein the communication requirement further indicates to provide the communication throughput between at least one component of the set of components and the other devices.
Clause 20. The physical enclosure of Clause 11, wherein the other devices comprise at least one of:
This application claims priority to U.S. Provisional Patent Application Ser. No. 63/495,687 filed Apr. 12, 2023, and entitled “Modular Enclosure for an In-Vehicle Computer System,” which is incorporated herein by reference.