The present disclosure is generally directed to vehicle glass structures and, in particular, toward vehicle windshields.
In recent years, transportation methods have changed substantially. This change is due in part to a concern over the limited availability of natural resources, a proliferation in personal technology, and a societal shift to adopt more environmentally friendly transportation solutions. These considerations have encouraged the development of a number of new driving systems.
Driver assist systems are becoming increasingly popular and can assist drivers by monitoring traffic movement, optimizing features such as cruise control, issuing lane departure warnings, performing pre-collision braking, and detecting objects in blind spots. Various imaging systems may be used with such driver assist systems and can be configured to determine an environment surrounding the vehicle.
Traditional light imaging detection and ranging (LIDAR) solutions typically operate by emitting a pulsed laser light toward a target and measuring the reflected light to determine a distance or range to the target. Conventional LIDAR systems have been used in generating detailed three-dimensional maps, including determining a real-time three-dimensional environmental map for air and/or land-based vehicles.
Vehicle manufacturers employing conventional LIDAR systems generally mount the LIDAR system a specific height above the vehicle roof to view a complete periphery around the vehicle. Because safety may be dependent on the unimpeded measurement range of the LIDAR system, vehicle manufacturers typically mount the conventional LIDAR system several feet above the roof of a particular vehicle.
Embodiments of the present disclosure will be described in connection with a vehicle, and in some embodiments, an electric vehicle, rechargeable electric vehicle, and/or hybrid-electric vehicle and associated systems.
Vehicle manufacturers employing conventional Light Imaging, Detection, And Ranging (LIDAR) systems generally mount the LIDAR system a specific height above a vehicle roof to view a complete periphery around the vehicle from a certain distance from the vehicle. This viewing distance may be limited by the mount height of the LIDAR system and any portion of the vehicle that extends into the vertical angular measurement range of the LIDAR system. Because safety may be dependent on the unimpeded measurement range of the LIDAR system, vehicle manufacturers may mount the conventional LIDAR system several feet above the roof of a particular vehicle. As can be appreciated, this functional design requirement limits the aesthetic and even aerodynamic design of a complete vehicle system.
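The geometric relationship described above, between mount height, the sensor's vertical angular measurement range, and the near-field region the sensor cannot see, can be illustrated with simple trigonometry. The following is a minimal sketch only; the function name, heights, and angle are hypothetical values chosen for illustration and are not taken from the disclosure:

```python
import math

def ground_visibility_distance(mount_height_m: float, max_downward_angle_deg: float) -> float:
    """Horizontal distance from the sensor at which a beam aimed at the
    sensor's steepest downward angle first reaches the ground.

    Ground-level targets closer than this distance fall below the
    sensor's vertical angular measurement range and go undetected.
    """
    return mount_height_m / math.tan(math.radians(max_downward_angle_deg))

# With the same angular range, a higher mount leaves a larger
# near-field region around the vehicle below the sensor's view.
low_mount = ground_visibility_distance(1.5, 15.0)   # hypothetical: behind a windshield
high_mount = ground_visibility_distance(2.5, 15.0)  # hypothetical: on a roof mast
```

This is one reason mount placement is a trade-off rather than a pure "higher is better" rule: raising the sensor clears the vehicle body from the measurement range but changes the near-field coverage geometry.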
Embodiments of the present disclosure describe a windshield, which may be constructed in such a way that allows for high-performance and long-range LIDAR sensors (and/or imaging systems) to be mounted behind the glass of the windshield. In addition to proper material construction, the windshield angle may be set to allow for operational efficiency and aerodynamics. As described herein, the windshield material may also increase safety.
In some embodiments, the windshield may include a combination safety glass laminate that is modified at an upper center portion to include a cut away section where an alkali-aluminosilicate sheet glass may be disposed, inserted, or fused. This section of sheet glass may provide high scratch-resistance and hardness as well as superior optical transmission characteristics for LIDAR sensors, which may be mounted behind the windshield in this region.
In some embodiments, the arrangement of various components of the vehicle 100 may be described with reference to a representative coordinate system 102. The representative coordinate system 102 includes an X-axis, a Y-axis, and a Z-axis, which may be used to define a relationship between components of the vehicle 100, a dimension of components of the vehicle 100, and/or a direction of components, or features of components, of the vehicle 100. For instance, as shown in
Although shown in the form of a car, it should be appreciated that the vehicle 100 described herein may include any conveyance or model of a conveyance, where the conveyance was designed for the purpose of moving one or more tangible objects, such as people, animals, cargo, and the like. The term “vehicle” does not require that a conveyance moves or is capable of movement. Typical vehicles may include but are in no way limited to cars, trucks, motorcycles, buses, automobiles, trains, railed conveyances, boats, ships, marine conveyances, submarine conveyances, airplanes, spacecraft, flying machines, human-powered conveyances, and the like.
In some embodiments, the vehicle 100 may include a number of sensors, devices, and/or systems that are capable of assisting in driving operations, e.g., autonomous or semi-autonomous control. Examples of the various sensors and systems may include, but are in no way limited to, one or more of cameras (e.g., independent, stereo, combined image, etc.), infrared (IR) sensors, radio frequency (RF) sensors, ultrasonic sensors (e.g., transducers, transceivers, etc.), radio object-detection and ranging (RADAR) sensors (e.g., object-detection sensors and/or systems), LIDAR systems, odometry sensors and/or devices (e.g., encoders, etc.), orientation sensors (e.g., accelerometers, gyroscopes, magnetometer, etc.), navigation sensors and systems (e.g., GPS, etc.), and other ranging, imaging, and/or object-detecting sensors. In some embodiments, the sensors may be disposed in an interior 150 space, or cabin, of the vehicle 100 and/or on an outside of the vehicle 100. Additionally or alternatively, the sensors and systems may be disposed in one or more portions of the vehicle 100 (e.g., the frame 104, or chassis, a body panel, a compartment, etc.).
The vehicle sensors and systems may be selected and/or configured to suit a level of operation associated with the vehicle 100. Among other things, the number of sensors used in a system may be altered to increase or decrease information available to a vehicle control system (e.g., affecting control capabilities of the vehicle 100). Additionally or alternatively, the sensors and systems may be part of one or more advanced driver assistance systems (ADAS) associated with a vehicle 100. In any event, the sensors and systems may be used to provide driving assistance at any level of operation (e.g., from fully-manual to fully-autonomous operations, etc.) as described herein.
The various levels of vehicle control and/or operation can be described as corresponding to a level of autonomy associated with a vehicle 100 for vehicle driving operations. For instance, at Level 0, or fully-manual driving operations, a driver (e.g., a human driver) may be responsible for all the driving control operations (e.g., steering, accelerating, braking, etc.) associated with the vehicle 100. Level 0 may be referred to as a “No Automation” level. At Level 1, the vehicle 100 may be responsible for a limited number of the driving operations associated with the vehicle 100, while the driver is still responsible for most driving control operations. An example of a Level 1 vehicle 100 may include a vehicle 100 in which the throttle control and/or braking operations may be controlled by the vehicle 100 (e.g., cruise control operations, etc.). Level 1 may be referred to as a “Driver Assistance” level. At Level 2, the vehicle 100 may collect information (e.g., via one or more driving assistance systems, sensors, etc.) about an environment of the vehicle 100 (e.g., an area surrounding the vehicle 100, roadway, traffic, ambient conditions, etc.) and use the collected information to control driving operations (e.g., steering, accelerating, braking, etc.) associated with the vehicle 100. In a Level 2 autonomous vehicle 100, the driver may be required to perform other aspects of driving operations not controlled by the vehicle 100. Level 2 may be referred to as a “Partial Automation” level. It should be appreciated that Levels 0-2 all involve the driver monitoring the driving operations of the vehicle 100.
At Level 3, the driver may be separated from controlling all the driving operations of the vehicle 100 except when the vehicle 100 makes a request for the operator to act or intervene in controlling one or more driving operations. In other words, the driver may be separated from controlling the vehicle 100 unless the driver is required to take over for the vehicle 100. Level 3 may be referred to as a “Conditional Automation” level. At Level 4, the driver may be separated from controlling all the driving operations of the vehicle 100 and the vehicle 100 may control driving operations even when a user fails to respond to a request to intervene. Level 4 may be referred to as a “High Automation” level. At Level 5, the vehicle 100 can control all the driving operations associated with the vehicle 100 in all driving modes. The vehicle 100 in Level 5 may continually monitor traffic, vehicular, roadway, and/or environmental conditions while driving the vehicle 100. In Level 5, there is no human driver interaction required in any driving mode. Accordingly, Level 5 may be referred to as a “Full Automation” level. It should be appreciated that in Levels 3-5, the vehicle 100, and/or one or more automated driving systems associated with the vehicle 100, monitors the driving operations of the vehicle 100 and the driving environment.
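The level taxonomy above can be summarized in a small lookup, which is useful, for example, when gating driving features by automation level. This is an illustrative sketch; the identifiers mirror the level labels used above but are otherwise hypothetical:

```python
from enum import IntEnum

class AutomationLevel(IntEnum):
    NO_AUTOMATION = 0           # driver performs all driving control operations
    DRIVER_ASSISTANCE = 1       # vehicle handles limited tasks (e.g., cruise control)
    PARTIAL_AUTOMATION = 2      # vehicle controls driving; driver monitors and assists
    CONDITIONAL_AUTOMATION = 3  # driver acts only when the vehicle requests intervention
    HIGH_AUTOMATION = 4         # vehicle continues even if the driver ignores a request
    FULL_AUTOMATION = 5         # no human driver interaction required in any mode

def driver_must_monitor(level: AutomationLevel) -> bool:
    """Levels 0-2 require the driver to monitor driving operations;
    at Levels 3-5 the automated driving system performs the monitoring."""
    return level <= AutomationLevel.PARTIAL_AUTOMATION
```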
As shown in
Referring now to
In some embodiments, the vehicle 100 may include a ranging and imaging system 112, such as one or more LIDAR sensors and associated hardware, or the like. The LIDAR sensor 112 may be configured to detect visual information in an environment surrounding at least a portion of the vehicle 100. The visual information detected in the environment within a field of view of the LIDAR sensor 112 may be processed (e.g., via one or more sensor and/or system processors, etc.) to generate a complete view of an environment 200 around, or partially around, the vehicle 100. The LIDAR sensor 112 may be configured to generate changing views (e.g., approximately across 180 degrees in front, and along the sides, of the vehicle 100 when viewed from the plan view, etc.) of the environment 200 in real time (or near real time), for instance, as the vehicle 100 drives. In some cases, the LIDAR sensor 112 may have an effective detection limit 204 that is some distance from a center axis of the vehicle 100 outward over approximately 180 degrees. The effective detection limit 204 of the LIDAR sensor 112 defines a view zone 208 (e.g., an area and/or volume, etc.) surrounding at least a portion of the vehicle 100. Any object falling outside of the view zone 208 may be in the undetected zone 212 and would not be detected by the LIDAR sensor 112 of the vehicle 100 at a given point in time. Although shown disposed behind the laminate vehicle windshield 154, the LIDAR sensor 112 may be disposed behind one or more other laminate glass surfaces (e.g., side windows, quarter glass, rear window, sensor windows, etc.) of the vehicle 100. In some embodiments, the vehicle 100 may include a number of LIDAR sensors 112 disposed in, and/or around, the vehicle 100 to provide a 360-degree view surrounding the vehicle 100.
In any event, the composition and arrangement of the other laminate glass surfaces may be the same as, or substantially similar to, that of the laminate vehicle windshield 154 as described herein.
The LIDAR sensor 112 may include one or more components configured to measure distances to targets using laser illumination. In some embodiments, the LIDAR sensor 112 may provide 3D imaging data of an environment around the vehicle 100. The imaging data may be processed to generate at least a portion of a full 360-degree view of the environment around the vehicle 100. The LIDAR sensor 112 may include a laser light generator configured to generate a plurality of target illumination laser beams (e.g., laser light channels). In some embodiments, this plurality of laser beams may be aimed at, or directed to, a rotating reflective surface (e.g., a mirror) and guided outwardly from the LIDAR sensor 112 (e.g., through a portion of the laminate vehicle windshield 154, etc.) into a measurement environment. The rotating reflective surface may be configured to continually rotate 360 degrees about an axis, such that the plurality of laser beams is directed outwardly from the vehicle 100 in a predetermined field of view. A photodiode receiver of the LIDAR sensor 112 may detect when light from the plurality of laser beams emitted into the measurement environment returns (e.g., reflected echo) to the LIDAR sensor 112. The LIDAR sensor 112 may calculate, based on the time between the emission of light and the detected return of light, a distance from the vehicle 100 to the illuminated target. In some embodiments, the LIDAR sensor 112 may generate over 2.0 million points per second and have an effective operational range of at least 100 meters.
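The range calculation described above is a time-of-flight measurement: the round-trip time of a laser pulse, multiplied by the speed of light and halved, gives the one-way distance to the illuminated target. A minimal sketch of this relationship follows; the function and variable names are illustrative, not taken from the disclosure:

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0

def lidar_range_m(emit_time_s: float, return_time_s: float) -> float:
    """Distance to an illuminated target from pulse round-trip time.

    The pulse travels to the target and back, so the one-way distance
    is half the round-trip time multiplied by the speed of light.
    """
    round_trip_s = return_time_s - emit_time_s
    if round_trip_s < 0:
        raise ValueError("return must follow emission")
    return SPEED_OF_LIGHT_M_S * round_trip_s / 2.0

# A round trip of roughly 667 ns corresponds to a target near 100 m,
# the effective operational range noted above.
```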
Examples of the LIDAR sensor 112 as described herein may include, but are not limited to, at least one of Velodyne® LiDAR™ HDL-64E 64-channel LIDAR sensors, Velodyne® LiDAR™ HDL-32E 32-channel LIDAR sensors, Velodyne® LiDAR™ PUCK™ VLP-16 16-channel LIDAR sensors, Leica Geosystems Pegasus: Two mobile sensor platform, Garmin® LIDAR-Lite v3 measurement sensor, Quanergy M8 LiDAR sensors, Quanergy S3 solid state LiDAR sensor, LeddarTech® LeddarVU compact solid state fixed-beam LIDAR sensors, other industry-equivalent LIDAR sensors and/or systems, and may perform illuminated target and/or obstacle detection in an environment around the vehicle 100 using any known or future-developed standard and/or architecture.
Sensor data and information may be collected by one or more sensors or systems 116A-K, 112 of the vehicle 100 monitoring the vehicle sensing environment 200. This information may be processed (e.g., via a processor, computer-vision system, etc.) to determine targets (e.g., objects, signs, people, markings, roadways, conditions, etc.) inside one or more detection zones 208, 216A-D associated with the vehicle sensing environment 200. In some cases, information from multiple sensors 116A-K may be processed to form composite sensor detection information. For example, a first sensor 116A and a second sensor 116F may correspond to a first camera 116A and a second camera 116F aimed in a forward traveling direction of the vehicle 100. In this example, images collected by the cameras 116A, 116F may be combined to form stereo image information. This composite information may increase the capabilities of a single sensor in the one or more sensors 112, 116A-K by, for example, adding the ability to determine depth associated with targets in the one or more detection zones 204, 208, 216A-D. Similar image data may be collected by rear view sensors (e.g., LIDAR sensors 112, etc.) or cameras (e.g., sensors 116G, 116H) aimed in a rearward traveling direction of the vehicle 100 (e.g., facing in a direction running from the front 110 to the rear 120 of the vehicle 100, etc.).
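The depth recovery from combined stereo camera images described above follows the standard stereo relation: depth equals focal length times camera baseline, divided by the pixel disparity between the two images. The sketch below illustrates this relation only; the parameter values in the comment are hypothetical and are not derived from the disclosure:

```python
def stereo_depth_m(focal_length_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a target visible to both cameras of a stereo pair.

    Z = f * B / d: a larger disparity (shift of the target between the
    two images) means a closer target; as disparity approaches zero,
    the target approaches infinite distance.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_length_px * baseline_m / disparity_px

# Hypothetical example: a 700 px focal length, 0.5 m baseline, and
# 35 px disparity place the target at 10 m.
```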
In some embodiments, multiple sensors 116A-K may be effectively joined to increase a sensing zone and provide increased sensing coverage. For instance, multiple RADAR sensors 116B disposed on the front 110 of the vehicle may be joined to provide a zone 216B of coverage that spans across an entirety of the front 110 of the vehicle. In some cases, the multiple RADAR sensors 116B may cover a detection zone 216B that includes one or more other sensor detection zones 216A. These overlapping detection zones may provide redundant sensing, enhanced sensing, and/or provide greater detail in sensing within a particular portion (e.g., zone 216A) of a larger zone (e.g., zone 216B). Additionally or alternatively, the sensors 116A-K of the vehicle 100 may be arranged to create a complete coverage, via one or more sensing zones 208, 216A-D around the vehicle 100. In some areas, the sensing zones 216C of two or more sensors 116D, 116E may intersect at an overlap zone 220. In some areas, the angle and/or detection limit of two or more sensing zones 216C, 216D (e.g., of two or more sensors 116E, 116J, 116K) may meet at a virtual intersection point 224.
The vehicle 100 may include a number of sensors 116E, 116G, 116H, 116J, 116K disposed proximal to the rear 120 of the vehicle 100. These sensors can include, but are in no way limited to, an imaging sensor, camera, IR, LIDAR sensors, RADAR sensors, RF sensors, ultrasonic sensors, and/or other object-detection sensors. Among other things, these sensors 116E, 116G, 116H, 116J, 116K may detect targets near or approaching the rear of the vehicle 100. For example, another vehicle approaching the rear 120 of the vehicle 100 may be detected by one or more of the ranging and imaging systems (e.g., LIDAR sensor 112, etc.), rear-view cameras 116G, 116H, and/or rear facing RADAR sensors 116J, 116K. As described above, the images from the rear-view cameras 116G, 116H may be processed to generate a stereo view (e.g., providing depth associated with an object or environment, etc.) for targets visible to both cameras 116G, 116H. As another example, the vehicle 100 may be driving and one or more of the LIDAR sensor 112, front-facing cameras 116A, 116F, front-facing RADAR sensors 116B, and/or ultrasonic sensors 116C may detect targets in front of the vehicle 100. This approach may provide critical sensor information to a vehicle control system in at least one of the autonomous driving levels described above. For instance, when the vehicle 100 is driving autonomously (e.g., Level 3, Level 4, or Level 5) and detects other vehicles stopped in a travel path, the sensor detection information may be sent to the vehicle control system of the vehicle 100 to control a driving operation (e.g., braking, decelerating, etc.) associated with the vehicle 100 (in this example, slowing the vehicle 100 so as to avoid colliding with the stopped other vehicles).
As yet another example, the vehicle 100 may be operating and one or more of the LIDAR sensor 112, and/or the side-facing sensors 116D, 116E (e.g., RADAR, ultrasonic, camera, combinations thereof, and/or other type of sensor), may detect targets at a side of the vehicle 100. It should be appreciated that the sensors 116A-K may detect a target that is both at a side 160 and a front 110 of the vehicle 100 (e.g., disposed at a diagonal angle to a centerline of the vehicle 100 running from the front 110 of the vehicle 100 to the rear 120 of the vehicle). Additionally or alternatively, the sensors 112, 116A-K may detect a target that is both, or simultaneously, at a side 160 and a rear 120 of the vehicle 100 (e.g., disposed at a diagonal angle to the centerline of the vehicle 100).
Referring to
In contrast to standard laminated windshields, which may sandwich a polyvinyl butyral layer between identical layers of soda-lime glass, the laminate vehicle windshield 154 described herein may employ a LIDAR optically-tuned glass layer (e.g., first glass sheet 304) that is adhered to a low-iron soda-lime glass layer (e.g., second glass sheet 312) via the interlayer 308. The laminate vehicle windshield 154 may further provide a void area 310 in the interlayer 308 and the second glass sheet 312 where only a portion of the first glass sheet 304 is arranged. In this arrangement, a LIDAR sensor 112 may be disposed in the interior 150 of the vehicle 100 behind the laminate vehicle windshield 154 and, more specifically, behind the void area 310, such that light emitted or received by the LIDAR sensor 112 only passes through the first glass sheet 304 (e.g., the alkali-aluminosilicate layer) of the laminate vehicle windshield 154.
The laminate vehicle windshield 154 may comprise a lower edge 316, an upper edge 318, a left-side edge 320, and a right-side edge 322 disposed about a periphery of the laminate vehicle windshield 154. In some embodiments, the edges 316, 318, 320, 322 may be formed as linear, arcuate, or a combination of edge shapes or types, which are joined together (e.g., via radii, chamfers, corners, etc.) forming a periphery of the laminate vehicle windshield 154 and/or the first glass sheet 304. Although described as having a lower edge, an upper edge, a left-side edge, and a right-side edge, for example, when viewed from a vehicle front 110, it should be appreciated that the sides, or edges, of the laminate vehicle windshield 154 may correspond to a first edge (e.g., lower edge 316), a second edge (e.g., upper edge 318), a third edge (e.g., left-side edge 320), and a fourth edge (e.g., right-side edge 322), respectively. In some embodiments, the laminate vehicle windshield 154 may be symmetrical about a windshield centerline 314. The windshield centerline 314 may pass through a center of the vehicle 100 (e.g., along the YZ-plane running through the center of the vehicle 100).
In
The total thickness of the laminate vehicle windshield 154 may depend on a number of factors including, but in no way limited to, safety requirements, light transmission qualities, vehicle design, etc., and/or combinations thereof. In one embodiment, the total thickness of the laminate vehicle windshield 154, measured from the surface of the first glass sheet 304 adjacent to the outer windshield space 404 to the surface of the second glass sheet 312 adjacent to the inner windshield space 408, may be in a range of 1.7 mm to 7 mm thick, or any value within that range. In some embodiments, the first glass sheet 304 may be in a range of 0.5 mm to 2.0 mm thick, the interlayer 308 may be in a range of 0.2 mm to 1.5 mm thick, and the second glass sheet 312 may be in a range of 1.0 mm to 3.5 mm thick, and/or any value therebetween. It should be appreciated that the thicknesses of the laminate vehicle windshield 154, and/or the thicknesses of the layers 304, 308, 312 of the laminate vehicle windshield 154, are not limited to the ranges provided and may be thinner or thicker than the values provided without departing from the scope of the disclosure.
Referring now to
The laminate vehicle windshield 154 may be supported by one or more support structures 508A-508C. The support structures 508A-508C may correspond to points where the laminate vehicle windshield 154 is attached to the frame 104 of the vehicle 100. In some embodiments, the support structures 508A-508C may comprise a mechanical adhesive interface between the frame 104, or a portion of the frame 104 defining a frame for the laminate vehicle windshield 154, and the laminate vehicle windshield 154. In one embodiment, the support structures 508A-508C may define a frame for the laminate vehicle windshield 154. The frame for the laminate vehicle windshield 154 may be configured to support the laminate vehicle windshield 154 via adhesive contact (e.g., via a urethane adhesive material disposed between the laminate vehicle windshield 154 and the frame, etc.).
In some embodiments, the support structures 508A-508C may follow a periphery of the laminate vehicle windshield 154, the first glass sheet 304, and/or the void area 310. For example, the second support structure 508B and the third support structure 508C may surround the void area 310 disposed in the laminate vehicle windshield 154. In some embodiments, the second support structure 508B may connect with the third support structure 508C forming a continuous support frame for the laminate vehicle windshield 154. Additionally or alternatively, the first support structure 508A and the third support structure 508C may surround the laminate vehicle windshield 154 (e.g., following a periphery of the laminate vehicle windshield 154 and/or the first glass sheet 304, etc.). Additional details of the shape of the support structures 508A-508C are described in conjunction with
In some embodiments, the windshield support structure 608 may correspond to a material interface, adhesive contact, and/or frame between the laminate vehicle windshield 154 and the frame 104 of the vehicle 100. As shown in
In any event, the windshield support structures 508A-C, 608 described herein may be made from plastic, urethane, polyurethane, rubber, composite, and/or other materials. In some embodiments, the windshield support structures 508A-C, 608 may provide a compliant interface between the laminate vehicle windshield 154 and the frame 104 of the vehicle 100. Among other things, this compliant interface may provide shock absorbing characteristics for the laminate vehicle windshield 154, provide an even contact surface between the laminate vehicle windshield 154 and the frame 104 of the vehicle 100, and/or accommodate differences in coefficients of thermal expansion between the laminate vehicle windshield 154 and the frame 104 of the vehicle 100.
Referring to
The windshield support structure 608 of
The windshield support structure 608 of
The windshield support structure 608 of
Although shown as arcuate and substantially rectangular shapes, it should be appreciated that the void area 310 of the laminate vehicle windshield 154 described herein may be of any shape, open or closed. For instance, any linear, curvilinear, or combination linear and curvilinear shape may be used to notch an edge (e.g., the upper edge 318, etc.) of the interlayer 308 and/or the second glass sheet 312 of the laminate vehicle windshield 154 forming the void area 310. Additionally or alternatively, any polygonal, circular, elliptical, and/or combination shape may be removed from, or not formed in, the interlayer 308 and/or the second glass sheet 312 to form the void area 310 of the laminate vehicle windshield 154.
In addition, the void areas 310, 610A-610C described herein may be shown as disposed adjacent to a particular edge (e.g., upper edge 318) of the laminate vehicle windshield 154 but embodiments of the present disclosure are not so limited. For instance, depending on the configuration of the laminate vehicle windshield 154 and/or the placement of the LIDAR sensor 112 or LIDAR sensor unit 512, the void areas 310, 610A-610C may be oriented in a center of the laminate vehicle windshield 154, adjacent to the lower edge 316, the left-side edge 320, and/or the right-side edge 322 of the laminate vehicle windshield 154.
The exemplary systems and methods of this disclosure have been described in relation to vehicle glass structures and vehicle windshields. However, to avoid unnecessarily obscuring the present disclosure, the preceding description omits a number of known structures and devices. This omission is not to be construed as a limitation of the scope of the claimed disclosure. Specific details are set forth to provide an understanding of the present disclosure. It should, however, be appreciated that the present disclosure may be practiced in a variety of ways beyond the specific detail set forth herein.
A number of variations and modifications of the disclosure can be used. It would be possible to provide for some features of the disclosure without providing others.
The present disclosure, in various embodiments, configurations, and aspects, includes components, methods, processes, systems and/or apparatus substantially as depicted and described herein, including various embodiments, subcombinations, and subsets thereof. Those of skill in the art will understand how to make and use the systems and methods disclosed herein after understanding the present disclosure. The present disclosure, in various embodiments, configurations, and aspects, includes providing devices and processes in the absence of items not depicted and/or described herein or in various embodiments, configurations, or aspects hereof, including in the absence of such items as may have been used in previous devices or processes, e.g., for improving performance, achieving ease, and/or reducing cost of implementation.
The foregoing discussion of the disclosure has been presented for purposes of illustration and description. The foregoing is not intended to limit the disclosure to the form or forms disclosed herein. In the foregoing Detailed Description, for example, various features of the disclosure are grouped together in one or more embodiments, configurations, or aspects for the purpose of streamlining the disclosure. The features of the embodiments, configurations, or aspects of the disclosure may be combined in alternate embodiments, configurations, or aspects other than those discussed above. This method of disclosure is not to be interpreted as reflecting an intention that the claimed disclosure requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment, configuration, or aspect. Thus, the following claims are hereby incorporated into this Detailed Description, with each claim standing on its own as a separate preferred embodiment of the disclosure.
Moreover, though the description of the disclosure has included description of one or more embodiments, configurations, or aspects and certain variations and modifications, other variations, combinations, and modifications are within the scope of the disclosure, e.g., as may be within the skill and knowledge of those in the art, after understanding the present disclosure. It is intended to obtain rights, which include alternative embodiments, configurations, or aspects to the extent permitted, including alternate, interchangeable and/or equivalent structures, functions, ranges, or steps to those claimed, whether or not such alternate, interchangeable and/or equivalent structures, functions, ranges, or steps are disclosed herein, and without intending to publicly dedicate any patentable subject matter.
Embodiments include a laminate vehicle windshield, comprising: a first glass sheet; a second glass sheet disposed behind the first glass sheet; and an interlayer disposed between the first glass sheet and second glass sheet, wherein the first glass sheet and the second glass sheet are bonded together via adhesive contact with the interlayer; wherein the second glass sheet and the interlayer each include a void area disposed behind the first glass sheet and within a periphery of the first glass sheet.
Aspects of the above laminate vehicle windshield include wherein the first glass sheet comprises alkali-aluminosilicate, and wherein the second glass sheet comprises low iron soda-lime. Aspects of the above laminate vehicle windshield include wherein the first glass sheet is light permeable in a range of 900 nm to 1550 nm. Aspects of the above laminate vehicle windshield include wherein the interlayer comprises a polyvinyl butyral resin. Aspects of the above laminate vehicle windshield include wherein the first glass sheet is 0.5 mm to 2.0 mm thick, wherein the interlayer is 0.2 mm to 1.5 mm thick, and wherein the second glass sheet is 1.0 to 3.5 mm thick. Aspects of the above laminate vehicle windshield include wherein the void area is arranged as a substantially rectangular cutout in the second glass sheet and the interlayer. Aspects of the above laminate vehicle windshield include wherein the substantially rectangular cutout notches an edge of a periphery of the second glass sheet and the interlayer. Aspects of the above laminate vehicle windshield include wherein the substantially rectangular cutout comprises at least four substantially linear edges, and wherein each of the at least four substantially linear edges of the substantially rectangular cutout is offset from an adjacent edge of the second glass sheet and the interlayer.
Embodiments include a vehicle windshield assembly, comprising: a laminate vehicle windshield, comprising: a first glass sheet; a second glass sheet disposed behind the first glass sheet; and an interlayer disposed between the first glass sheet and second glass sheet, wherein the first glass sheet and the second glass sheet are bonded together via adhesive contact with the interlayer; wherein the second glass sheet and the interlayer each include a void area disposed behind the first glass sheet and within a periphery of the first glass sheet; and a support structure surrounding the periphery of the first glass sheet and following a periphery of the void area in the second glass sheet and the interlayer, wherein the support structure comprises a mechanical frame to which the laminate vehicle windshield is attached.
Aspects of the above vehicle windshield assembly include wherein the first glass sheet comprises alkali-aluminosilicate, and wherein the second glass sheet comprises low iron soda-lime. Aspects of the above vehicle windshield assembly include wherein the first glass sheet is light permeable in a range of 900 nm to 1550 nm. Aspects of the above vehicle windshield assembly include wherein the interlayer comprises a polyvinyl butyral resin. Aspects of the above vehicle windshield assembly include wherein the first glass sheet is 0.5 mm to 2.0 mm thick, wherein the interlayer is 0.2 mm to 1.5 mm thick, and wherein the second glass sheet is 1.0 mm to 3.5 mm thick. Aspects of the above vehicle windshield assembly include wherein the void area is arranged as a substantially rectangular cutout in the second glass sheet and the interlayer, and wherein a portion of the support structure surrounds at least three sides of the substantially rectangular cutout. Aspects of the above vehicle windshield assembly include wherein the substantially rectangular cutout notches an edge of a periphery of the second glass sheet and the interlayer. Aspects of the above vehicle windshield assembly include wherein the substantially rectangular cutout comprises at least four substantially linear edges, and wherein each of the at least four substantially linear edges of the substantially rectangular cutout is offset from an adjacent edge of the second glass sheet and the interlayer. Aspects of the above vehicle windshield assembly include wherein a urethane adhesive material is disposed between the second glass sheet and the support structure attaching the laminate vehicle windshield to the support structure.
Embodiments include a vehicle, comprising: a vehicle chassis comprising a windshield frame defining a mount periphery for a windshield; a laminate vehicle windshield disposed in the windshield frame of the vehicle chassis, the laminate vehicle windshield comprising: a first glass sheet; a second glass sheet disposed behind the first glass sheet; and an interlayer disposed between the first glass sheet and second glass sheet, wherein the first glass sheet and the second glass sheet are bonded together via adhesive contact with the interlayer; wherein the second glass sheet and the interlayer each include a void area disposed behind the first glass sheet and within a periphery of the first glass sheet; and a support structure surrounding the periphery of the first glass sheet and following a periphery of the void area in the second glass sheet and the interlayer, wherein the support structure comprises a mechanical adhesive interface between the windshield frame and the laminate vehicle windshield, and wherein the laminate vehicle windshield separates an interior cabin of the vehicle from an exterior of the vehicle.
Aspects of the above vehicle further comprise a light imaging detection and ranging (LIDAR) sensor disposed inside the cabin of the vehicle, wherein the LIDAR sensor is arranged behind the first glass sheet, and wherein a path defining a field of view of the LIDAR sensor passes through the first glass sheet from the inside of the cabin of the vehicle to the exterior of the vehicle without passing through the second glass sheet in the void area. Aspects of the above vehicle further comprise wherein the first glass sheet comprises alkali-aluminosilicate that is light permeable in a range of 900 nm to 1550 nm, wherein the second glass sheet comprises low iron soda-lime, and wherein the interlayer comprises a polyvinyl butyral resin.
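For illustration only (code of this kind forms no part of the disclosure), the recited stack dimensions and the light-permeable band of the first glass sheet can be sketched as simple range checks. All names and the example values below are hypothetical; the numeric bounds are the ranges recited above:

```python
# Illustrative sketch only: range checks against the recited laminate
# dimensions (mm) and the light-permeable band of the first sheet (nm).
from dataclasses import dataclass

FIRST_SHEET_MM = (0.5, 2.0)    # alkali-aluminosilicate outer sheet
INTERLAYER_MM = (0.2, 1.5)     # polyvinyl butyral (PVB) interlayer
SECOND_SHEET_MM = (1.0, 3.5)   # low iron soda-lime inner sheet
LIDAR_BAND_NM = (900, 1550)    # light-permeable band of the first sheet


def in_range(value: float, bounds: tuple) -> bool:
    lo, hi = bounds
    return lo <= value <= hi


@dataclass
class LaminateStack:
    first_sheet_mm: float
    interlayer_mm: float
    second_sheet_mm: float

    def within_recited_ranges(self) -> bool:
        # True only if every layer thickness falls in its recited range.
        return (in_range(self.first_sheet_mm, FIRST_SHEET_MM)
                and in_range(self.interlayer_mm, INTERLAYER_MM)
                and in_range(self.second_sheet_mm, SECOND_SHEET_MM))


def wavelength_passes_first_sheet(wavelength_nm: float) -> bool:
    # A LIDAR emission must lie within the light-permeable band to pass
    # through the first glass sheet in the void area.
    return in_range(wavelength_nm, LIDAR_BAND_NM)


# Hypothetical example stack: 1.1 mm outer sheet, 0.76 mm PVB, 2.1 mm inner sheet.
stack = LaminateStack(first_sheet_mm=1.1, interlayer_mm=0.76, second_sheet_mm=2.1)
print(stack.within_recited_ranges())        # True
print(wavelength_passes_first_sheet(905))   # True: 905 nm is inside 900-1550 nm
```

The sketch simply encodes the recited bounds as inclusive intervals; it is one way of stating the claimed ranges, not a prescribed implementation.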
Any one or more of the aspects/embodiments as substantially disclosed herein.
Any one or more of the aspects/embodiments as substantially disclosed herein optionally in combination with any one or more other aspects/embodiments as substantially disclosed herein.
One or more means adapted to perform any one or more of the above aspects/embodiments as substantially disclosed herein.
The phrases “at least one,” “one or more,” “or,” and “and/or” are open-ended expressions that are both conjunctive and disjunctive in operation. For example, each of the expressions “at least one of A, B and C,” “at least one of A, B, or C,” “one or more of A, B, and C,” “one or more of A, B, or C,” “A, B, and/or C,” and “A, B, or C” means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B and C together.
The term “a” or “an” entity refers to one or more of that entity. As such, the terms “a” (or “an”), “one or more,” and “at least one” can be used interchangeably herein. It is also to be noted that the terms “comprising,” “including,” and “having” can be used interchangeably.