An autonomous vehicle (or AV) is a vehicle having a processor, programming instructions, and drivetrain components that are controllable by the processor without requiring a human operator. An autonomous vehicle may be fully autonomous in that the autonomous vehicle does not require a human operator for most or all driving conditions and functions, or an autonomous vehicle may be semi-autonomous in that a human operator may be required in certain conditions or for certain operations, or that a human operator may override the autonomous vehicle's autonomous system and may take control of the autonomous vehicle.
In some implementations, a desiccant assembly within a sensor housing includes a desiccant chamber configured to hold a desiccant element; a transfer window positioned between the desiccant chamber and a sensor chamber of the sensor housing; and a permeable membrane covering the transfer window and configured to allow water vapor to transfer from the sensor chamber to the desiccant chamber.
In some implementations, a lidar system includes a sensor housing including a sensor chamber; a sensor disposed within the sensor chamber; and a desiccant assembly disposed within the sensor housing, the desiccant assembly comprising a desiccant chamber configured to hold a desiccant element; and a permeable membrane positioned between the desiccant chamber and the sensor chamber and configured to allow water vapor to transfer from the sensor chamber to the desiccant chamber.
In some implementations, an equipment housing includes an equipment chamber configured to hold electronic equipment; and a desiccant assembly, comprising a desiccant chamber configured to hold a desiccant element; and a transfer assembly positioned between the desiccant chamber and the equipment chamber, wherein the transfer assembly is configured to allow water vapor to transfer from the equipment chamber to the desiccant chamber and to prevent particulate matter from transferring from the desiccant chamber to the equipment chamber.
In some implementations, a method of manufacturing a sensor housing includes providing a desiccant chamber configured to hold a desiccant element; positioning a transfer window between the desiccant chamber and a sensor chamber of the sensor housing; and disposing a permeable membrane over the transfer window, wherein the permeable membrane is configured to allow water vapor to transfer from the sensor chamber to the desiccant chamber.
The following detailed description of example implementations refers to the accompanying drawings, which are incorporated herein and form a part of the specification. The same reference numbers in different drawings may identify the same or similar elements. In general, the left-most digit(s) of a reference number identifies the drawing in which the reference number first appears.
Autonomous vehicles (“AVs”) may use a number of different sensors for situational awareness. The sensors, which may be part of a self-driving system (“SDS”) in the AV, may include one or more of a camera, a lidar (Light Detection and Ranging) device, and/or an inertial measurement unit (“IMU”), among other examples. Sensors such as cameras and lidar may be used to capture and analyze scenes around the AV to detect objects. Sometimes, a scene representation, such as a point cloud obtained from the AV's lidar, may be combined with one or more images from one or more cameras to obtain further insights into the scene or situation around the AV. It can be appreciated that sensors external to the vehicle, such as cameras and lidar sensors, may be subjected to extreme weather conditions and fluctuations in those conditions, such as, for example, high temperatures, low temperatures, temperature changes, snow conditions, extreme wind conditions, and/or rain, among other examples. Weather conditions and fluctuations in weather conditions may cause humidity (e.g., water vapor) to build up within and/or to ingress into a sensor enclosure, causing the sensor to perform at suboptimal levels. For example, condensation can occur when temperatures change at a sufficiently high rate.
Some implementations described herein limit condensation buildup within a sensor assembly, thereby mitigating negative effects upon electronics within the sensor assembly due to condensation. For example, some aspects may facilitate reduction of the effects of condensation upon optical sensors operating in outdoor environments, such as those integrated on the external portion of vehicles (e.g., AVs). According to some aspects, a sensor assembly may include a desiccant assembly within a sensor housing. The desiccant assembly may include a desiccant chamber configured to hold at least one desiccant element (e.g., one or more desiccant blocks) and may include a transfer window positioned between the desiccant chamber and a sensor chamber of the sensor housing. A permeable membrane may cover the transfer window and may be configured to allow water vapor to transfer from the sensor chamber to the desiccant chamber, while preventing liquid water and particulate matter from transferring from the desiccant chamber to the sensor chamber. The permeable membrane and/or a set of dimensions of the at least one transfer window may be configured so that a leak rate corresponding to a transfer of water vapor from the sensor chamber to the desiccant chamber causes prevention of condensation of water within the sensor chamber.
The vehicle 102 may include any moving form of conveyance that is capable of carrying one or more human occupants and/or cargo and that is powered by any form of energy. The vehicle 102 may include, for example, a land vehicle (e.g., a car, a truck, a van, or a train), an aircraft (e.g., an unmanned aerial vehicle or a drone), or a watercraft. In the example of
As shown in
In some implementations, the vehicle 102 may travel along a road in a semi-autonomous or autonomous manner. The vehicle 102 may be configured to detect objects 110 in a proximity of the vehicle 102. An object 110 may include, for example, another vehicle (e.g., an autonomous vehicle or a non-autonomous vehicle that requires a human operator for most or all driving conditions and functions), a cyclist (e.g., a rider of a bicycle, electric scooter, or motorcycle), a pedestrian, a road feature (e.g., a roadway boundary, a lane marker, a sidewalk, a median, a guard rail, a barricade, a sign, a traffic signal, a railroad crossing, or a bike path), and/or another object that may be on a roadway or in proximity of a roadway, such as a tree or an animal.
To detect objects 110, the vehicle 102 may be equipped with one or more sensors, such as a lidar system, as described in more detail elsewhere herein. The lidar system may be configured to transmit a light pulse 112 to detect objects 110 located within a distance or range of distances of the vehicle 102. The light pulse 112 may be incident on an object 110 and may be reflected back to the lidar system as a reflected light pulse 114. The reflected light pulse 114 may be incident on the lidar system and may be processed to determine a distance between the object 110 and the vehicle 102. The reflected light pulse 114 may be detected using, for example, a photodetector or an array of photodetectors positioned and configured to receive the reflected light pulse 114. In some implementations, a lidar system may be included in another system other than a vehicle 102, such as a robot, a satellite, and/or a traffic light, or may be used as a standalone system. Furthermore, implementations described herein are not limited to autonomous vehicle applications and may be used in other applications, such as robotic applications, radar system applications, metric applications, and/or system performance applications.
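The distance determination described above follows the time-of-flight principle: the reflected light pulse 114 returns after traveling to the object and back. A minimal sketch of that calculation is shown below; the function name and the 667 ns example are illustrative assumptions, not part of any particular lidar system.

```python
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def range_from_round_trip(t_emit_s: float, t_return_s: float) -> float:
    """Estimate target distance from a lidar pulse's round-trip time.

    The pulse travels to the object and back, so the one-way distance
    is half the round-trip path length.
    """
    round_trip_s = t_return_s - t_emit_s
    if round_trip_s < 0:
        raise ValueError("return time must not precede emission time")
    return SPEED_OF_LIGHT_M_PER_S * round_trip_s / 2.0

# A pulse returning 667 ns after emission corresponds to roughly 100 m.
print(range_from_round_trip(0.0, 667e-9))
```

In practice a photodetector array timestamps many such returns per revolution, but each point in the resulting point cloud reduces to this same calculation.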
The lidar system may provide lidar data, such as information about a detected object 110 (e.g., information about a distance to the object 110, a speed of the object 110, and/or a direction of movement of the object 110), to one or more other components of the on-board system 104. Additionally, or alternatively, the vehicle 102 may transmit lidar data to the remote computing device 106 (e.g., a server, a cloud computing system, and/or a database) via the network 108. The remote computing device 106 may be configured to process the lidar data and/or to transmit a result of processing the lidar data to the vehicle 102 via the network 108.
The network 108 may include one or more wired and/or wireless networks. For example, the network 108 may include a wireless wide area network (e.g., a cellular network or a public land mobile network), a local area network (e.g., a wired local area network or a wireless local area network (WLAN), such as a Wi-Fi network), a personal area network (e.g., a Bluetooth network), a near-field communication network, a telephone network, a private network, the Internet, and/or a combination of these or other types of networks. The network 108 enables communication among the devices of environment 100.
As indicated above,
The power system 202 may be configured to generate mechanical energy for the vehicle 102 to move the vehicle 102. For example, the power system 202 may include an engine that converts fuel to mechanical energy (e.g., via combustion) and/or a motor that converts electrical energy to mechanical energy.
The one or more sensors 204 may be configured to detect operational parameters of the vehicle 102 and/or environmental conditions of an environment in which the vehicle 102 operates. For example, the one or more sensors 204 may include an engine temperature sensor 210, a battery voltage sensor 212, an engine rotations per minute (RPM) sensor 214, a throttle position sensor 216, a battery sensor 218 (to measure current, voltage, and/or temperature of a battery), a motor current sensor 220, a motor voltage sensor 222, a motor position sensor 224 (e.g., a resolver and/or encoder), a motion sensor 226 (e.g., an accelerometer, gyroscope and/or inertial measurement unit), a speed sensor 228, an odometer sensor 230, a clock 232, a position sensor 234 (e.g., a global navigation satellite system (GNSS) sensor and/or a global positioning system (GPS) sensor), one or more cameras 236, a lidar system 238, one or more other ranging systems 240 (e.g., a radar system and/or a sonar system), and/or an environmental sensor 242 (e.g., a precipitation sensor and/or ambient temperature sensor).
The one or more controllers 206 may be configured to control operation of the vehicle 102. For example, the one or more controllers 206 may include a brake controller 244 to control braking of the vehicle 102, a steering controller 246 to control steering and/or direction of the vehicle 102, a throttle controller 248 and/or a speed controller 250 to control speed and/or acceleration of the vehicle 102, a gear controller 252 to control gear shifting of the vehicle 102, a routing controller 254 to control navigation and/or routing of the vehicle 102 (e.g., using map data), and/or an auxiliary device controller 256 to control one or more auxiliary devices associated with the vehicle 102, such as a testing device, an auxiliary sensor, and/or a mobile device transported by the vehicle 102.
The on-board computing device 208 may be configured to receive sensor data from one or more sensors 204 and/or to provide commands to one or more controllers 206. For example, the on-board computing device 208 may control operation of the vehicle 102 by providing a command to a controller 206 based on sensor data received from a sensor 204. In some implementations, the on-board computing device 208 may be configured to process sensor data to generate a command. The on-board computing device 208 may include memory, one or more processors, an input component, an output component, and/or a communication component, as described in more detail elsewhere herein.
As an example, the on-board computing device 208 may receive navigation data, such as information associated with a navigation route from a start location of the vehicle 102 to a destination location for the vehicle 102. In some implementations, the navigation data is accessed and/or generated by the routing controller 254. For example, the routing controller 254 may access map data and identify possible routes and/or road segments that the vehicle 102 can travel to move from the start location to the destination location. In some implementations, the routing controller 254 may identify a preferred route, such as by scoring multiple possible routes, applying one or more routing techniques (e.g., minimum Euclidean distance, Dijkstra's algorithm, and/or Bellman-Ford algorithm), accounting for traffic data, and/or receiving a user selection of a route, among other examples. The on-board computing device 208 may use the navigation data to control operation of the vehicle 102.
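As one sketch of the routing techniques mentioned above, Dijkstra's algorithm can identify a lowest-cost route over a weighted road graph. The road names and edge costs below are hypothetical, chosen only to illustrate how the routing controller 254 might score possible road segments.

```python
import heapq

def shortest_route(graph, start, goal):
    """Dijkstra's algorithm over a weighted road graph.

    graph maps each node to a list of (neighbor, cost) edges; returns
    (total_cost, path) for the lowest-cost route, or (inf, []) if the
    goal is unreachable.
    """
    queue = [(0.0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, edge_cost in graph.get(node, []):
            if neighbor not in visited:
                heapq.heappush(queue, (cost + edge_cost, neighbor, path + [neighbor]))
    return float("inf"), []

# Hypothetical road segments with travel costs (e.g., minutes).
roads = {
    "depot": [("ave_a", 4.0), ("ave_b", 2.0)],
    "ave_a": [("dest", 5.0)],
    "ave_b": [("ave_a", 1.0), ("dest", 8.0)],
}
print(shortest_route(roads, "depot", "dest"))  # (8.0, ['depot', 'ave_b', 'ave_a', 'dest'])
```

A production routing controller would additionally weight edges by live traffic data, as the passage notes, but the underlying shortest-path search is unchanged.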
As the vehicle travels along the route, the on-board computing device 208 may receive sensor data from various sensors 204. For example, the position sensor 234 may provide geographic location information to the on-board computing device 208, which may then access a map associated with the geographic location information to determine known fixed features associated with the geographic location, such as streets, buildings, stop signs, and/or traffic signals, which may be used to control operation of the vehicle 102.
In some implementations, the on-board computing device 208 may receive one or more images captured by one or more cameras 236, may analyze the one or more images (e.g., to detect object data), and may control operation of the vehicle 102 based on analyzing the images (e.g., to avoid detected objects). Additionally, or alternatively, the on-board computing device 208 may receive object data associated with one or more objects detected in a vicinity of the vehicle 102 and/or may generate object data based on sensor data. The object data may indicate the presence or absence of an object, a location of the object, a distance between the object and the vehicle 102, a speed of the object, a direction of movement of the object, an acceleration of the object, a trajectory (e.g., a heading) of the object, a shape of the object, a size of the object, a footprint of the object, and/or a type of the object (e.g., a vehicle, a pedestrian, a cyclist, a stationary object, or a moving object). The object data may be detected by, for example, one or more cameras 236 (e.g., as image data), the lidar system 238 (e.g., as lidar data) and/or one or more other ranging systems 240 (e.g., as radar data or sonar data). The on-board computing device 208 may process the object data to detect objects in a proximity of the vehicle 102 and/or to control operation of the vehicle 102 based on the object data (e.g., to avoid detected objects).
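The object-data attributes listed above can be grouped into a simple record. The field names here are illustrative assumptions for a sketch, not an actual SDS data model.

```python
from dataclasses import dataclass

@dataclass
class DetectedObject:
    """One detected object, aggregating attributes such as type,
    position, speed, heading, and motion state."""
    object_type: str        # e.g., "vehicle", "pedestrian", "cyclist"
    x_m: float              # position relative to the vehicle, meters
    y_m: float
    speed_m_per_s: float
    heading_rad: float
    is_moving: bool

    def distance_m(self) -> float:
        """Straight-line distance from the vehicle to the object."""
        return (self.x_m ** 2 + self.y_m ** 2) ** 0.5

obj = DetectedObject("pedestrian", 3.0, 4.0, 1.2, 0.0, True)
print(obj.distance_m())  # 5.0
```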
In some implementations, the on-board computing device 208 may use the object data (e.g., current object data) to predict future object data for one or more objects. For example, the on-board computing device 208 may predict a future location of an object, a future distance between the object and the vehicle 102, a future speed of the object, a future direction of movement of the object, a future acceleration of the object, and/or a future trajectory (e.g., a future heading) of the object. For example, if an object is a vehicle and map data indicates that the vehicle is at an intersection, then the on-board computing device 208 may predict whether the object will likely move straight or turn. As another example, if the sensor data and/or the map data indicates that the intersection does not have a traffic light, then the on-board computing device 208 may predict whether the object will stop prior to entering the intersection.
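A minimal constant-velocity sketch of such a prediction is given below, assuming the object simply holds its current speed and heading; a real predictor would also condition on map data and intersection context, as the passage describes.

```python
import math

def predict_future_position(x_m, y_m, speed_m_per_s, heading_rad, dt_s):
    """Constant-velocity prediction: project the object's current speed
    and heading forward by dt_s seconds."""
    return (x_m + speed_m_per_s * math.cos(heading_rad) * dt_s,
            y_m + speed_m_per_s * math.sin(heading_rad) * dt_s)

# An object at (10, 0) moving 5 m/s along heading 0 rad, predicted 2 s ahead.
print(predict_future_position(10.0, 0.0, 5.0, 0.0, 2.0))  # (20.0, 0.0)
```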
The on-board computing device 208 may generate a motion plan for the vehicle 102 based on sensor data, navigation data, and/or object data (e.g., current object data and/or future object data). For example, based on current locations of objects and/or predicted future locations of objects, the on-board computing device 208 may generate a motion plan to move the vehicle 102 along a surface and avoid collision with other objects. In some implementations, the motion plan may include, for one or more points in time, a speed of the vehicle 102, a direction of the vehicle 102, and/or an acceleration of the vehicle 102. Additionally, or alternatively, the motion plan may indicate one or more actions with respect to a detected object, such as whether to overtake the object, yield to the object, pass the object, or the like. The on-board computing device 208 may generate one or more commands or instructions based on the motion plan, and may provide those command(s) to one or more controllers 206 for execution.
As indicated above,
The housing 302 may be rotatable (e.g., by 360 degrees) around an axle 314 (or hub) of the motor 310. The housing 302 may include an aperture 316 (e.g., an emitter and/or receiver aperture) made of a material transparent to light. Although a single aperture 316 is shown in
The housing 302 may house the light emitter system 304, the light detector system 306, and/or the optical element structure 308. The light emitter system 304 may be configured and/or positioned to generate and emit pulses of light through the aperture 316 and/or through a transparent material of the housing 302. For example, the light emitter system 304 may include one or more light emitters, such as laser emitter chips or other light emitting devices. The light emitter system 304 may include any number of individual light emitters (e.g., 8 emitters, 64 emitters, or 128 emitters), which may emit light at substantially the same intensity or of varying intensities. The light detector system 306 may include a photodetector or an array of photodetectors configured and/or positioned to receive light reflected back through the housing 302 and/or the aperture 316.
The optical element structure 308 may be positioned between the light emitter system 304 and the housing 302, and/or may be positioned between the light detector system 306 and the housing 302. The optical element structure 308 may include one or more lenses, waveplates, and/or mirrors that focus and direct light that passes through the optical element structure 308. The light emitter system 304, the light detector system 306, and/or the optical element structure 308 may rotate with a rotatable housing 302 or may rotate inside of a stationary housing 302.
The analysis device 312 may be configured to receive (e.g., via one or more wired and/or wireless connections) sensor data collected by the light detector system 306, analyze the sensor data to measure characteristics of the received light, and generate output data based on the sensor data. In some implementations, the analysis device 312 may provide the output data to another system that can control operations and/or provide recommendations with respect to an environment from which the sensor data was collected. For example, the analysis device 312 may provide the output data to the on-board system 104 (e.g., the on-board computing device 208) of the vehicle 102 to enable the on-board system 104 to process the output data and/or use the output data (or the processed output data) to control operation of the vehicle 102. The analysis device 312 may be integrated into the lidar system 300 or may be external from the lidar system 300 and communicatively connected to the lidar system 300 via a network. The analysis device 312 may include memory, one or more processors, an input component, an output component, and/or a communication component, as described in more detail elsewhere herein.
As indicated above,
The pocket 424 may be configured to receive at least one desiccant element 428. In the illustrated example, the at least one desiccant element 428 includes two desiccant blocks 428. In some aspects, the at least one desiccant element 428 may be removable. In some other aspects, the at least one desiccant element 428 may be fixed. In some aspects, the at least one desiccant element 428 may include an adhesive material adhered to at least one side of a transfer wall 430. The at least one desiccant element 428 may be either configured to be replaceable at a predetermined service interval, or configured to be permanently integrated and operable for the duration of the expected life of the sensor. In one example, a replaceable desiccant element 428 may be placed in the pocket 424 in a location that is accessible for a maintenance technician to perform a replacement service. In other examples, the at least one desiccant element 428 may be permanently integrated within a sensor (e.g., lidar) assembly and may be placed at any location that advantageously leverages size, weight, and space considerations, as well as mass balance considerations in cases where the sensor is a mechanical (e.g., spinning or rotating) sensor such as a mechanical lidar. The at least one desiccant element 428 may be shaped in any number of ways to leverage space and performance considerations of the sensor. The at least one desiccant element 428 may be made of a molecular sieve powder mixed with a polymer binder and formed into the desired shape.
For example, the permeable membrane 434 may include a material selected so that a leak rate corresponding to a transfer of water vapor from the sensor chamber 402 to the desiccant chamber 426 facilitates maintaining a relative humidity level within the sensor chamber 402 at or below a humidity threshold. In some aspects, the permeable membrane 434 may include a polymer material. For example, the polymer material may include expanded polytetrafluoroethylene (ePTFE). In some aspects, the permeable membrane 434 may be an adhesive material adhered to the transfer wall 430. For example, the permeable membrane 434 may be adhered to an upper surface (e.g., a surface facing the desiccant chamber 426) of the transfer wall 430 and/or to a lower surface (e.g., a surface facing the sensor chamber 402) of the transfer wall 430.
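As an illustrative model of the leak-rate behavior described above (not a model specified by this disclosure), assume the chamber air is well mixed and the desiccant acts as an ideal vapor sink; the chamber's relative humidity then decays exponentially at a rate set by the membrane's effective leak rate and the chamber volume.

```python
import math

def chamber_relative_humidity(rh_initial_pct, leak_rate_m3_per_s,
                              chamber_volume_m3, elapsed_s):
    """First-order model: vapor transfers through the membrane at a rate
    proportional to the vapor present, so RH decays exponentially."""
    k = leak_rate_m3_per_s / chamber_volume_m3  # effective exchange rate, 1/s
    return rh_initial_pct * math.exp(-k * elapsed_s)

def membrane_meets_threshold(rh_initial_pct, leak_rate_m3_per_s,
                             chamber_volume_m3, elapsed_s, threshold_pct):
    """True if the modeled RH falls to or below the humidity threshold."""
    return chamber_relative_humidity(
        rh_initial_pct, leak_rate_m3_per_s, chamber_volume_m3, elapsed_s
    ) <= threshold_pct
```

Under these assumptions, a membrane material or transfer-window area can be sized by choosing the leak rate so that the threshold check holds over the expected temperature-swing interval.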
As shown in
As shown in
In some aspects, the desiccant assembly 414 may include one or more mechanical assemblies configured to open and close, and/or partially open and partially close, the at least one transfer window 432. For example, a mechanical flapper or slider may be configured to cover the at least one transfer window 432 in response to actuation by an actuator. In some aspects, the actuator may be communicatively coupled with the on-board computing device 208 depicted in
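One way to sketch the actuation logic described above is a hysteresis controller that opens the transfer window at high humidity and closes it once humidity falls, so the actuator does not chatter near a single setpoint. The threshold values are illustrative assumptions.

```python
def flapper_command(rh_pct: float, window_open: bool,
                    open_above_pct: float = 55.0,
                    close_below_pct: float = 45.0) -> bool:
    """Return whether the transfer window should be open.

    Between the two thresholds the window holds its current state,
    which prevents rapid toggling of the mechanical flapper or slider.
    """
    if rh_pct >= open_above_pct:
        return True
    if rh_pct <= close_below_pct:
        return False
    return window_open
```

An on-board computing device could evaluate this on each humidity reading and command the actuator only when the returned state differs from the current one.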
As indicated above,
For example, in some aspects, although described herein in the context of sensors, similar desiccant assemblies may be used in conjunction with any type of electronic equipment to prevent condensation within an equipment chamber. For example, an equipment housing may include an equipment chamber configured to hold electronic equipment; and a desiccant assembly. The desiccant assembly may include, as described herein, a desiccant chamber configured to hold a desiccant element; and a transfer assembly positioned between the desiccant chamber and the equipment chamber. The transfer assembly may include at least one transfer window and at least one permeable membrane and may be configured to allow water vapor to transfer from the equipment chamber to the desiccant chamber and to prevent particulate matter from transferring from the desiccant chamber to the equipment chamber.
As shown in
The method 500 may include additional aspects, such as any single aspect or any combination of aspects described below and/or described in connection with one or more other methods or operations described elsewhere herein. In a first aspect, the desiccant chamber comprises a pocket defined in a surface of a desiccant assembly body, the desiccant assembly body configured to separate the desiccant chamber from the sensor chamber. In a second aspect, alone or in combination with the first aspect, the pocket comprises a recess defined in the surface of the desiccant assembly body, the recess comprising a transfer wall within which the transfer window is defined. In a third aspect, alone or in combination with one or more of the first and second aspects, the recess is configured to receive the desiccant element.
In a fourth aspect, alone or in combination with one or more of the first through third aspects, the desiccant element comprises an adhesive material adhered to at least one side of the transfer wall. In a fifth aspect, alone or in combination with one or more of the first through fourth aspects, the desiccant chamber is configured to hold at least one additional desiccant element. In a sixth aspect, alone or in combination with one or more of the first through fifth aspects, the method 500 includes positioning at least one additional transfer window between the desiccant chamber and the sensor chamber. In a seventh aspect, alone or in combination with one or more of the first through sixth aspects, the method 500 includes providing an access component configured to isolate the desiccant chamber from an environment external to the sensor chamber.
In an eighth aspect, alone or in combination with one or more of the first through seventh aspects, providing the access component comprises removably attaching a chamber cover to a surface of the desiccant assembly body, the desiccant assembly body configured to separate the desiccant chamber from the sensor chamber. In a ninth aspect, alone or in combination with one or more of the first through eighth aspects, the method 500 includes selecting the permeable membrane so that a leak rate corresponding to a transfer of water vapor from the sensor chamber to the desiccant chamber causes prevention of condensation of water within the sensor chamber. In a tenth aspect, alone or in combination with one or more of the first through ninth aspects, the method 500 includes configuring the permeable membrane to prevent a transfer of liquid water and particulate matter from the desiccant chamber to the sensor chamber. In an eleventh aspect, alone or in combination with one or more of the first through tenth aspects, the permeable membrane comprises a material selected so that a leak rate corresponding to a transfer of water vapor from the sensor chamber to the desiccant chamber facilitates maintaining a relative humidity level within the sensor chamber at or below a humidity threshold. In a twelfth aspect, alone or in combination with one or more of the first through eleventh aspects, the method 500 includes configuring a set of dimensions of the transfer window so that a leak rate corresponding to a transfer of water vapor from the sensor chamber to the desiccant chamber facilitates maintaining a relative humidity level within the sensor chamber at or below a humidity threshold.
In a thirteenth aspect, alone or in combination with one or more of the first through twelfth aspects, the permeable membrane comprises a polymer material. In a fourteenth aspect, alone or in combination with one or more of the first through thirteenth aspects, the polymer material comprises expanded polytetrafluoroethylene. In a fifteenth aspect, alone or in combination with one or more of the first through fourteenth aspects, the desiccant element is removable. In a sixteenth aspect, alone or in combination with one or more of the first through fifteenth aspects, the method 500 includes disposing the desiccant assembly within the sensor housing at a location that is selected so that a mass balance associated with the sensor housing facilitates a mechanical operation of a sensor within the sensor housing.
Although
The foregoing disclosure provides illustration and description, but is not intended to be exhaustive or to limit the implementations to the precise forms disclosed. Modifications may be made in light of the above disclosure or may be acquired from practice of the implementations.
As used herein, the term “component” is intended to be broadly construed as hardware, firmware, or a combination of hardware and software. It will be apparent that systems and/or methods described herein may be implemented in different forms of hardware, firmware, and/or a combination of hardware and software. The hardware and/or software code described herein for implementing aspects of the disclosure should not be construed as limiting the scope of the disclosure. Thus, the operation and behavior of the systems and/or methods are described herein without reference to specific software code—it being understood that software and hardware can be used to implement the systems and/or methods based on the description herein.
As used herein, satisfying a threshold may, depending on the context, refer to a value being greater than the threshold, greater than or equal to the threshold, less than the threshold, less than or equal to the threshold, equal to the threshold, not equal to the threshold, or the like.
Although particular combinations of features are recited in the claims and/or disclosed in the specification, these combinations are not intended to limit the disclosure of various implementations. Features from different implementations and/or aspects disclosed herein can be combined. For example, one or more features from a method implementation may be combined with one or more features of a device, system, or product implementation. Features described herein may be combined in ways not specifically recited in the claims and/or disclosed in the specification. Although each dependent claim listed below may directly depend on only one claim, the disclosure of various implementations includes each dependent claim in combination with every other claim in the claim set. As used herein, a phrase referring to “at least one of” a list of items refers to any combination and permutation of those items, including single members. As an example, “at least one of: a, b, or c” is intended to cover a, b, c, a-b, a-c, b-c, and a-b-c, as well as any combination with multiple of the same item. As used herein, the term “and/or” used to connect items in a list refers to any combination and any permutation of those items, including single members (e.g., an individual item in the list). As an example, “a, b, and/or c” is intended to cover a, b, c, a-b, a-c, b-c, and a-b-c.
No element, act, or instruction used herein should be construed as critical or essential unless explicitly described as such. Also, as used herein, the articles “a” and “an” are intended to include one or more items, and may be used interchangeably with “one or more.” Further, as used herein, the article “the” is intended to include one or more items referenced in connection with the article “the” and may be used interchangeably with “the one or more.” Furthermore, as used herein, the term “set” is intended to include one or more items (e.g., related items, unrelated items, or a combination of related and unrelated items), and may be used interchangeably with “one or more.” Where only one item is intended, the phrase “only one” or similar language is used. Also, as used herein, the terms “has,” “have,” “having,” or the like are intended to be open-ended terms. Further, the phrase “based on” is intended to mean “based, at least in part, on” unless explicitly stated otherwise. Also, as used herein, the term “or” is intended to be inclusive when used in a series and may be used interchangeably with “and/or,” unless explicitly stated otherwise (e.g., if used in combination with “either” or “only one of”).
This patent application claims priority to U.S. Provisional Patent Application No. 63/399,099, filed on Aug. 18, 2022, entitled “Desiccant Integration for Enclosure Humidity Control” and assigned to the assignee hereof. The disclosure of the prior application is considered part of and is incorporated by reference into this patent application.