System and method for generating and interpreting point clouds of a rail corridor along a survey path

Information

  • Patent Grant
  • Patent Number
    11,782,160
  • Date Filed
    Wednesday, October 6, 2021
  • Date Issued
    Tuesday, October 10, 2023
Abstract
An autonomous system for generating and interpreting point clouds of a rail corridor along a survey path while moving on a railroad corridor assessment platform. The system includes two LiDAR sensors configured to scan along scan planes that intersect but are not coplanar at all points. The LiDAR sensors are housed in autonomously controlled, temperature-controlled protective enclosures.
Description
FIELD

This disclosure relates to the field of automated railroad assessment systems. More particularly, this disclosure relates to a system and method for generating point clouds of a rail corridor along a survey path of a test vehicle.


BACKGROUND

In the last decade, light detection and ranging or “LiDAR” technology has been used in the railroad assessment industry to acquire 3D laser scans of the surroundings of a rail corridor. For example, PCT Publication Number WO2018/208153 entitled “System and Method for Mapping a Railway Track” to Fugro Technologies B.V. describes the use of a 3D laser scanner mounted on the front of a locomotive to gather and generate geo-referenced 3D point cloud data which includes point data corresponding to the two rails on which the vehicle is moving as well as the surroundings of the railroad track. To give the LiDAR equipment a broad view, such equipment is placed at the very front of the train, on the locomotive.


A similar system for gathering 3D data using a device on the front of a locomotive is described in U.S. Patent Application Publication Number 2018/0370552 entitled “Real Time Machine Vision System for Vehicle Control and Protection” to Solfice Research, Inc. The system described can gather point cloud data including data gathered using LiDAR.


U.S. Patent Application Publication Number 2019/0135315 entitled “Railway Asset Tracking and Mapping System” to Herzog Technologies, Inc. describes the use of a LiDAR system for gathering and storing the positions and imagery of physical railroad assets along a rail corridor. Unlike the previously described systems, the Herzog system is mounted on a hi-rail vehicle, which can be disruptive to normal train traffic along a railroad.


U.S. Pat. No. 9,175,998 entitled “Ballast Delivery and Computation System and Method” to Georgetown Rail Equipment Company describes a system mounted on a hi-rail vehicle using LiDAR to determine ballast profiles and whether ballast needs to be replaced.


All of the examples discussed above are mounted either on the front of a locomotive or on a hi-rail vehicle. In the former examples, the system is necessarily attached to a locomotive. In the latter examples, hi-rail vehicles are used, which can be disruptive to railroads, causing downtime while the hi-rail vehicle is operating.


What is needed, therefore, is an alternative to the different ways LiDAR has been used in the past to gather data related to railroads and their surroundings.


SUMMARY

The above and other needs are met by a system for generating and interpreting point clouds of a rail corridor along a survey path while moving on a railroad corridor assessment platform. In one embodiment, the system includes a railroad corridor assessment platform; a first LiDAR sensor configured to scan along a first scan plane, the first LiDAR sensor attached to the railroad corridor assessment platform in a rear-facing direction; a second LiDAR sensor configured to scan along a second scan plane, the second LiDAR sensor attached to the railroad corridor assessment platform in a rear-facing direction, wherein the first scan plane crosses the second scan plane but is not coplanar at all points with the second scan plane; a first sensor enclosure housing for protecting the first LiDAR sensor, the first sensor enclosure further comprising a first sensor enclosure LiDAR sensor cap configured to move from a first sensor enclosure LiDAR sensor cap open position in which the first LiDAR sensor is exposed to a first sensor enclosure LiDAR sensor cap closed position in which the first LiDAR sensor is not exposed, and from the first sensor enclosure LiDAR sensor cap closed position to the first sensor enclosure LiDAR sensor cap open position; and a second sensor enclosure housing for protecting the second LiDAR sensor, the second sensor enclosure further comprising a second sensor enclosure LiDAR sensor cap configured to move from a second sensor enclosure LiDAR sensor cap open position in which the second LiDAR sensor is exposed to a second sensor enclosure LiDAR sensor cap closed position in which the second LiDAR sensor is not exposed, and from the second sensor enclosure LiDAR sensor cap closed position to the second sensor enclosure LiDAR sensor cap open position.
The system preferably further includes a data storage device; an Inertial Measurement Unit (IMU); a geo-location device; and a high-performance computing system in electrical communication with the first LiDAR sensor, the second LiDAR sensor, the data storage device, the IMU, and the geo-location device, the computing system comprising a high-performance processor wherein the processor controls operations of the first LiDAR sensor and the second LiDAR sensor, and wherein the processor performs a method for generating and interpreting point clouds of a rail corridor, the method comprising operations of (i) obtaining a first set of point cloud data using the first LiDAR sensor; (ii) obtaining a second set of point cloud data using the second LiDAR sensor; (iii) obtaining railroad corridor assessment platform attitude information using the IMU; (iv) obtaining geo-location information using the geo-location device; (v) combining the first set of point cloud data, the second set of point cloud data, the railroad corridor assessment platform attitude information, and the geo-location information to generate a combined point cloud; (vi) identifying rail corridor features of interest found in the combined point cloud; (vii) creating an inventory of the identified rail corridor features of interest; and (viii) storing the combined point cloud on the data storage device.
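The combining operation (v) above amounts to transforming each sensor-frame scan into a common geo-referenced frame using the platform attitude and position, then merging the results. The sketch below illustrates one way this could be done; the ZYX Euler convention, the frame layout, and all function names are illustrative assumptions rather than details from the disclosure.

```python
import numpy as np

def rotation_matrix(roll, pitch, yaw):
    """Build a 3x3 rotation matrix from IMU attitude angles (radians, ZYX order)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def combine_point_clouds(scans, attitudes, positions):
    """Transform each sensor-frame scan (N x 3 array) into a common world frame
    using the per-scan attitude (roll, pitch, yaw) and platform position,
    then concatenate all scans into one combined point cloud."""
    world_points = []
    for pts, att, pos in zip(scans, attitudes, positions):
        R = rotation_matrix(*att)
        world_points.append(pts @ R.T + np.asarray(pos))
    return np.vstack(world_points)
```

A production pipeline would interpolate attitude and position per LiDAR return rather than per scan, but the rigid-body structure of the computation is the same.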


The system for generating and interpreting point clouds of a rail corridor described above preferably further includes (A) a first sensor enclosure further including (i) a first sensor enclosure outer shell comprising a first sensor enclosure outer shell first aperture and a first sensor enclosure outer shell second aperture; (ii) a first sensor enclosure first high resolution camera in electrical communication with the computing system, the first sensor enclosure first high resolution camera oriented to view from the inside of the first sensor enclosure through the first sensor enclosure outer shell first aperture to gather digital image data of a rail corridor; and (iii) a first sensor enclosure second high resolution camera in electrical communication with the computing system, the first sensor enclosure second high resolution camera oriented to view from the inside of the first sensor enclosure through the first sensor enclosure outer shell second aperture to gather digital image data of a rail corridor; and (B) a second sensor enclosure including (i) a second sensor enclosure outer shell comprising a second sensor enclosure outer shell first aperture and a second sensor enclosure outer shell second aperture; (ii) a second sensor enclosure first high resolution camera in electrical communication with the computing system, the second sensor enclosure first high resolution camera oriented to view from the inside of the second sensor enclosure through the second sensor enclosure outer shell first aperture to gather digital image data of a rail corridor; and (iii) a second sensor enclosure second high resolution camera in electrical communication with the computing system, the second sensor enclosure second high resolution camera oriented to view from the inside of the second sensor enclosure through the second sensor enclosure outer shell second aperture to gather digital image data of a rail corridor.
The system for generating and interpreting point clouds of a rail corridor may further include (A) a wheel mounted shaft encoder for sending trigger signals to the first sensor enclosure first high resolution camera, the first sensor enclosure second high resolution camera, the second sensor enclosure first high resolution camera, and the second sensor enclosure second high resolution camera as the railroad corridor assessment platform moves along a survey path; (B) the high-performance processor wherein the processor controls operations of the first sensor enclosure first high resolution camera, the first sensor enclosure second high resolution camera, the second sensor enclosure first high resolution camera, and the second sensor enclosure second high resolution camera, and wherein the processor performs a method for generating and interpreting digital image data, the method comprising operations of (i) receiving pulses from the shaft encoder and triggering the first sensor enclosure first high resolution camera, the first sensor enclosure second high resolution camera, the second sensor enclosure first high resolution camera, and the second sensor enclosure second high resolution camera to obtain digital image data at the same time instances; (ii) obtaining a first set of digital image data using the first sensor enclosure first high resolution camera; (iii) obtaining a second set of digital image data using the first sensor enclosure second high resolution camera; (iv) obtaining a third set of digital image data using the second sensor enclosure first high resolution camera; (v) obtaining a fourth set of digital image data using the second sensor enclosure second high resolution camera; (vi) combining the first set of digital image data, the second set of digital image data, the third set of digital image data, and the fourth set of digital image data to form a combined set of digital image data comprising a plurality of digital images and generating a combined 
panoramic digital image of the rail corridor; and (vii) storing the combined set of digital image data on the data storage device. Alternatively or additionally, the system for generating and interpreting point clouds of a rail corridor may further include (A) the first sensor enclosure further including (i) a first sensor enclosure inner shell comprising a first sensor enclosure inner shell first aperture and a first sensor enclosure inner shell second aperture, wherein the first sensor enclosure inner shell is configured to move relative to the first sensor enclosure outer shell from a first sensor enclosure inner shell open position wherein the first sensor enclosure outer shell first aperture is in line with the first sensor enclosure inner shell first aperture and the first sensor enclosure outer shell second aperture is in line with the first sensor enclosure inner shell second aperture to a first sensor enclosure inner shell closed position wherein the first sensor enclosure outer shell first aperture is not in line with the first sensor enclosure inner shell first aperture and the first sensor enclosure outer shell second aperture is not in line with the first sensor enclosure inner shell second aperture resulting in (1) the first sensor enclosure outer shell first aperture being blocked by the first sensor enclosure inner shell to protect the first sensor enclosure first high resolution camera and (2) the first sensor enclosure outer shell second aperture being blocked by the first sensor enclosure inner shell to protect the first sensor enclosure second high resolution camera; (ii) a first inner shell motorized linear actuator in electrical communication with the computing system and connected to the first sensor enclosure inner shell for moving the first sensor enclosure inner shell from the first sensor enclosure inner shell open position to the first sensor enclosure inner shell closed position or from the first sensor enclosure inner shell closed 
position to the first sensor enclosure inner shell open position depending upon control instructions from the computing system; (iii) a first sensor enclosure LiDAR cap configured to move from a first sensor enclosure cap open position in which the first LiDAR sensor is exposed to a first sensor enclosure cap closed position in which the first LiDAR sensor is not exposed, and from the first sensor enclosure cap closed position to the first sensor enclosure cap open position; and (iv) a first LiDAR cap motorized actuator in electrical communication with the computing system and connected to the first sensor enclosure LiDAR cap for moving the first sensor enclosure LiDAR cap from the first sensor enclosure cap closed position to the first sensor enclosure cap open position or from the first sensor enclosure cap open position to the first sensor enclosure cap closed position depending on control instructions from the computing system; and (B) the second sensor enclosure further including (i) a second sensor enclosure inner shell comprising a second sensor enclosure inner shell first aperture and a second sensor enclosure inner shell second aperture, wherein the second sensor enclosure inner shell is configured to move relative to the second sensor enclosure outer shell from a second sensor enclosure inner shell open position wherein the second sensor enclosure outer shell first aperture is in line with the second sensor enclosure inner shell first aperture and the second sensor enclosure outer shell second aperture is in line with the second sensor enclosure inner shell second aperture to a second sensor enclosure inner shell closed position wherein the second sensor enclosure outer shell first aperture is not in line with the second sensor enclosure inner shell first aperture and the second sensor enclosure outer shell second aperture is not in line with the second sensor enclosure inner shell second aperture resulting in (1) the second sensor enclosure outer shell 
first aperture being blocked by the second sensor enclosure inner shell to protect the second sensor enclosure first high resolution camera and (2) the second sensor enclosure outer shell second aperture being blocked by the second sensor enclosure inner shell to protect the second sensor enclosure second high resolution camera; (ii) a second inner shell motorized linear actuator in electrical communication with the computing system and connected to the second sensor enclosure inner shell for moving the second sensor enclosure inner shell from the second sensor enclosure inner shell open position to the second sensor enclosure inner shell closed position or from the second sensor enclosure inner shell closed position to the second sensor enclosure inner shell open position depending upon control instructions from the computing system; (iii) a second sensor enclosure LiDAR cap configured to move from a second sensor enclosure cap open position in which the second LiDAR sensor is exposed to a second sensor enclosure cap closed position in which the second LiDAR sensor is not exposed, and from the second sensor enclosure cap closed position to the second sensor enclosure cap open position; and (iv) a second LiDAR cap motorized actuator in electrical communication with the computing system and connected to the second sensor enclosure LiDAR cap for moving the second sensor enclosure LiDAR cap from the second sensor enclosure cap closed position to the second sensor enclosure cap open position or from the second sensor enclosure cap open position to the second sensor enclosure cap closed position depending on control instructions from the computing system.
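The shaft-encoder-driven capture described above (operation (i) of the imaging method) triggers every camera on the same encoder pulse so the four image sets correspond to identical time instances along the survey path. A minimal sketch of that triggering logic follows; the camera interface (cameras as callables) and the pulse divisor are assumptions for illustration.

```python
class EncoderCameraTrigger:
    """Fires all cameras on the same wheel-encoder pulse so the image sets
    are captured at the same time instances (hypothetical interface; the
    pulse spacing and camera API are assumptions, not from the disclosure)."""

    def __init__(self, cameras, pulses_per_trigger=1):
        self.cameras = cameras              # e.g. four high resolution cameras
        self.pulses_per_trigger = pulses_per_trigger
        self._pulse_count = 0
        self.frames = []                    # list of simultaneous image tuples

    def on_encoder_pulse(self):
        """Called once per encoder pulse; triggers every camera together."""
        self._pulse_count += 1
        if self._pulse_count % self.pulses_per_trigger == 0:
            self.frames.append(tuple(cam() for cam in self.cameras))
```

Because the encoder counts wheel rotation rather than time, capture spacing stays uniform along the track regardless of platform speed.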


Some versions of the system described above may further include (A) a climatic sensor on the railroad corridor assessment platform, the climatic sensor in electrical communication with the computing system; and (B) the high-performance processor wherein the processor controls operations of the first inner shell motorized linear actuator and the second inner shell motorized linear actuator, and wherein the processor performs a method for protecting the first sensor enclosure first high resolution camera, the first sensor enclosure second high resolution camera, the second sensor enclosure first high resolution camera, and the second sensor enclosure second high resolution camera, the method comprising operations of (i) receiving climatic conditions data from the climatic sensor; (ii) activating the first inner shell motorized linear actuator to move the first sensor enclosure inner shell from the first sensor enclosure inner shell open position to the first sensor enclosure inner shell closed position based on the received climatic conditions data; (iii) activating the second inner shell motorized linear actuator to move the second sensor enclosure inner shell from the second sensor enclosure inner shell open position to the second sensor enclosure inner shell closed position based on the received climatic conditions data; (iv) activating the first LiDAR cap motorized actuator to move the first sensor enclosure LiDAR cap from the first sensor enclosure cap open position to the first sensor enclosure cap closed position; and (v) activating the second LiDAR cap motorized actuator to move the second sensor enclosure LiDAR cap from the second sensor enclosure cap open position to the second sensor enclosure cap closed position.
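The climate-driven protection sequence above (operations (i)-(v)) can be summarized as a single controller that closes both inner shells and both LiDAR caps when the climatic sensor reports adverse conditions, and reopens them otherwise. The sketch below assumes a precipitation-rate reading and a simple open/close actuator interface; both are illustrative, not from the disclosure.

```python
class EnclosureProtectionController:
    """Closes inner shells and LiDAR caps when the climatic sensor reports
    precipitation, and reopens them when conditions clear (hypothetical
    sensor field and actuator API, assumed for illustration)."""

    def __init__(self, shell_actuators, cap_actuators):
        self.shell_actuators = shell_actuators  # first and second inner shells
        self.cap_actuators = cap_actuators      # first and second LiDAR caps

    def on_climate_data(self, precipitation_rate_mm_h):
        """Apply operations (ii)-(v) based on the received climatic data."""
        closing = precipitation_rate_mm_h > 0.0
        for actuator in self.shell_actuators + self.cap_actuators:
            actuator.close() if closing else actuator.open()
        return "closed" if closing else "open"
```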


Some versions of the system described above may further include (A) a motion sensor for sensing motion of the railroad corridor assessment platform, the motion sensor in electrical communication with the computing system; and (B) the high-performance processor wherein the processor controls operations of the first inner shell motorized linear actuator and the second inner shell motorized linear actuator, and wherein the processor performs a method for protecting the first sensor enclosure first high resolution camera, the first sensor enclosure second high resolution camera, the second sensor enclosure first high resolution camera, and the second sensor enclosure second high resolution camera, the method comprising operations of (i) receiving a motion sensor signal from the motion sensor indicating that the railroad corridor assessment platform is moving relative to a railroad track below a minimum speed threshold programmed into the computing system; (ii) activating the first inner shell motorized linear actuator to move the first sensor enclosure inner shell from the first sensor enclosure inner shell open position to the first sensor enclosure inner shell closed position based on the received motion sensor signal; (iii) activating the second inner shell motorized linear actuator to move the second sensor enclosure inner shell from the second sensor enclosure inner shell open position to the second sensor enclosure inner shell closed position based on the received motion sensor signal; (iv) activating the first LiDAR cap motorized actuator to move the first sensor enclosure LiDAR cap from the first sensor enclosure cap open position to the first sensor enclosure cap closed position; and (v) activating the second LiDAR cap motorized actuator to move the second sensor enclosure LiDAR cap from the second sensor enclosure cap open position to the second sensor enclosure cap closed position.


Some versions of the system described above may further include (A) a temperature sensor on the railroad corridor assessment platform in electrical communication with the computing system and proximate to the first sensor enclosure and the second sensor enclosure; (B) a heating and cooling system in electrical communication with the computing system, the heating and cooling system further including (i) an air blower; (ii) a heater for heating air blown from the air blower; (iii) an air chiller for cooling air blown from the air blower; and (iv) an air duct for channeling air from the air blower to the first sensor enclosure and the second sensor enclosure; and (C) the high-performance processor wherein the processor controls operations of the heating and cooling system, and wherein the processor performs a method for regulating air temperature in the first sensor enclosure and the second sensor enclosure, the method comprising operations of (i) receiving temperature data from the temperature sensor; (ii) activating the air blower; and (iii) activating the heater or the air chiller based on the received temperature data.
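The temperature-regulation method above (activating the heater or the air chiller based on received temperature data) is essentially a thermostat decision around the blower. A minimal sketch follows; the setpoint and deadband values are illustrative assumptions, as the disclosure does not specify them.

```python
def regulate_enclosure_air(temp_c, setpoint_c=20.0, deadband_c=2.0):
    """Decide which heating/cooling element to activate alongside the blower
    (operation (iii)), given the enclosure temperature from the sensor.
    Setpoint and deadband are illustrative values, not from the disclosure."""
    if temp_c < setpoint_c - deadband_c:
        return "heater"       # enclosure too cold: heat the blown air
    if temp_c > setpoint_c + deadband_c:
        return "chiller"      # enclosure too hot: chill the blown air
    return "blower_only"      # within deadband: circulate air unmodified
```

The deadband prevents the heater and chiller from rapidly alternating when the reading hovers near the setpoint.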


Some versions of the system described above may further include (A) the first sensor enclosure further comprising at least one first sensor air aperture through which air can be directed across the first LiDAR sensor; and (B) the second sensor enclosure further comprising at least one second sensor air aperture through which air can be directed across the second LiDAR sensor.


Some versions of the system described above may further include (A) a first sensor enclosure in which the first LiDAR sensor is housed; (B) a second sensor enclosure in which the second LiDAR sensor is housed; (C) a temperature sensor in electrical communication with the computing system and located proximate to the first LiDAR sensor and the second LiDAR sensor; (D) a heating and cooling system in electrical communication with the computing system, the heating and cooling system further including (i) an air blower; (ii) a heater for heating air blown from the air blower; (iii) an air chiller for cooling air blown from the air blower; and (iv) a duct for channeling air from the air blower to the first sensor enclosure and the second sensor enclosure depending on temperature data sent by the temperature sensor to the computing system.


In another aspect, a method for generating and interpreting point clouds of a rail corridor is disclosed. In one embodiment, the method includes (A) obtaining a first set of point cloud data using a processor and a first LiDAR sensor oriented to scan along a first scan plane and attached to a railroad corridor assessment platform in a rear-facing orientation wherein the first LiDAR sensor is in electrical communication with the processor; (B) obtaining a second set of point cloud data using the processor and a second LiDAR sensor oriented to scan along a second scan plane and attached to the railroad corridor assessment platform in a rear-facing orientation wherein the second LiDAR sensor is in electrical communication with the processor, wherein the first scan plane crosses the second scan plane but is not coplanar at all points with the second scan plane, and wherein neither the first scan plane nor the second scan plane intersects a main body of any rail car adjoined to a rear end of the railroad corridor assessment platform; (C) obtaining railroad corridor assessment platform attitude data using an Inertial Measurement Unit (IMU) in electrical communication with the processor; (D) obtaining geo-location data of the railroad corridor assessment platform using a geo-location device in electrical communication with the processor; (E) combining the first set of point cloud data, the second set of point cloud data, the railroad corridor assessment platform attitude data, and the geo-location data to generate a combined point cloud using the processor; (F) identifying rail corridor features of interest found in the combined point cloud using the processor; (G) creating an inventory of the identified rail corridor features of interest using the processor; and (H) storing the combined point cloud on a data storage device in electrical communication with the processor.
In some embodiments, the method may further include (A) receiving pulses from a wheel mounted shaft encoder in electrical communication with the processor and triggering a first sensor enclosure first high resolution camera, a first sensor enclosure second high resolution camera, a second sensor enclosure first high resolution camera, and a second sensor enclosure second high resolution camera to obtain digital image data at the same time instances; (B) obtaining a first set of digital image data using the first sensor enclosure first high resolution camera; (C) obtaining a second set of digital image data using the first sensor enclosure second high resolution camera; (D) obtaining a third set of digital image data using the second sensor enclosure first high resolution camera; (E) obtaining a fourth set of digital image data using the second sensor enclosure second high resolution camera; (F) combining the first set of digital image data, the second set of digital image data, the third set of digital image data, and the fourth set of digital image data to form a combined set of digital image data comprising a plurality of digital images and generating a combined panoramic digital image of a rail corridor; and (G) storing the combined set of digital image data on a data storage device in electrical communication with the processor. Such methods may further include colorizing the combined point cloud using the combined set of digital image data and the processor.
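Colorizing the combined point cloud with the combined image data, as described above, typically means projecting each 3D point into a camera image and sampling the pixel color at the projected location. The sketch below uses a simple pinhole model; the intrinsic matrix K, the camera pose (R, t), and the default gray fallback are assumptions for illustration, not details from the disclosure.

```python
import numpy as np

def colorize_point_cloud(points, image, K, R, t):
    """Assign each 3D point (N x 3) the RGB of the pixel it projects to,
    using a pinhole camera with intrinsics K and pose (R, t), where the
    columns of R are the camera axes in world coordinates. Points behind
    the camera or projecting outside the image keep a default gray."""
    h, w = image.shape[:2]
    cam = (points - t) @ R                   # world frame -> camera frame
    colors = np.full((len(points), 3), 128, dtype=np.uint8)
    in_front = cam[:, 2] > 0                 # only points ahead of the camera
    uv = cam[in_front] @ K.T
    uv = uv[:, :2] / uv[:, 2:3]              # perspective divide to pixels
    u, v = uv[:, 0].astype(int), uv[:, 1].astype(int)
    valid = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    idx = np.flatnonzero(in_front)[valid]
    colors[idx] = image[v[valid], u[valid]]  # sample pixel color per point
    return np.hstack([points, colors.astype(float)])  # N x 6: XYZ + RGB
```

With multiple cameras, each point would be colorized from whichever camera gives the best view, but the per-camera projection step is the same.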


Some versions of the method described above may further include (A) triggering a first sensor enclosure third high resolution camera and a second sensor enclosure third high resolution camera to obtain digital image data at the same time instances as the first sensor enclosure first high resolution camera, the first sensor enclosure second high resolution camera, the second sensor enclosure first high resolution camera, and the second sensor enclosure second high resolution camera; (B) obtaining a fifth set of digital image data using the first sensor enclosure third high resolution camera; (C) obtaining a sixth set of digital image data using the second sensor enclosure third high resolution camera; and (D) combining the first set of digital image data, the second set of digital image data, the third set of digital image data, the fourth set of digital image data, the fifth set of digital image data, and the sixth set of digital image data to form a combined set of digital image data comprising a plurality of digital images and generating a combined panoramic digital image of a rail corridor. Such methods may further include geo-referencing the colorized combined point cloud using a geo-referencing device.


Some versions of the methods described above may further include (A) housing the first LiDAR sensor, the first sensor enclosure first high resolution camera, the first sensor enclosure second high resolution camera, and the first sensor enclosure third high resolution camera in a first sensor enclosure including a first sensor enclosure LiDAR cap for protecting the first LiDAR sensor; (B) housing the second LiDAR sensor, the second sensor enclosure first high resolution camera, the second sensor enclosure second high resolution camera, and the second sensor enclosure third high resolution camera in a second sensor enclosure including a second sensor enclosure LiDAR cap for protecting the second LiDAR sensor; and (C) blowing temperature-controlled air to the first sensor enclosure and the second sensor enclosure using a heating and cooling system including an air blower wherein temperature-controlled air is blown through an air duct to the first sensor enclosure and the second sensor enclosure and wherein the heating and cooling system is controlled by the computing system based on temperature data received from a temperature sensor proximate to the first LiDAR sensor and the second LiDAR sensor.


Some versions of the methods described above may further include (A) blowing temperature-controlled air through an aperture in the first sensor enclosure adjacent to the first LiDAR sensor for blowing away flying debris and precipitation from the first LiDAR sensor and to maintain the first sensor enclosure LiDAR cap at a temperature above freezing to eliminate the accumulation of frozen precipitation; and (B) blowing temperature-controlled air through an aperture in the second sensor enclosure adjacent to the second LiDAR sensor for blowing away flying debris and precipitation from the second LiDAR sensor and to maintain the second sensor enclosure LiDAR cap at a temperature above freezing to eliminate the accumulation of frozen precipitation.


Some versions of the methods described above may further include (A) housing the first LiDAR sensor in a first sensor enclosure including (i) a first sensor enclosure LiDAR cap configured to move from a first sensor enclosure cap open position in which the first LiDAR sensor is exposed to a first sensor enclosure cap closed position in which the first LiDAR sensor is not exposed, and from the first sensor enclosure cap closed position to the first sensor enclosure cap open position; and (ii) a first LiDAR cap motorized linear actuator in electrical communication with the computing system and connected to the first sensor enclosure LiDAR cap for moving the first sensor enclosure LiDAR cap from the first sensor enclosure cap closed position to the first sensor enclosure cap open position or from the first sensor enclosure cap open position to the first sensor enclosure cap closed position depending on control instructions from the computing system; (B) housing the first sensor enclosure first high-resolution camera in the first sensor enclosure including (i) a first sensor enclosure outer shell; (ii) a first sensor enclosure outer shell first aperture through which the first sensor enclosure first high-resolution camera obtains digital image data; (iii) a first sensor enclosure inner shell configured to move relative to the first sensor outer shell from a first sensor enclosure inner shell open position wherein the first sensor enclosure outer shell first aperture is open and the first sensor enclosure first high-resolution camera is exposed to weather outside the first sensor enclosure to a first sensor enclosure inner shell closed position wherein the first sensor enclosure outer shell first aperture is blocked by the first sensor inner shell and the first sensor enclosure first high-resolution camera is not exposed to weather outside the first sensor enclosure; (C) housing the first sensor enclosure second high-resolution camera in the first sensor enclosure
including (i) a first sensor enclosure outer shell second aperture through which the first sensor enclosure second high-resolution camera obtains digital image data; (ii) the first sensor enclosure inner shell configured to move relative to the first sensor enclosure outer shell from the first sensor enclosure inner shell open position wherein the first sensor enclosure outer shell second aperture is open and the first sensor enclosure second high-resolution camera is exposed to weather outside the first sensor enclosure to the first sensor enclosure inner shell closed position wherein the first sensor enclosure outer shell second aperture is blocked by the first sensor inner shell and the first sensor enclosure second high-resolution camera is not exposed to weather outside the first sensor enclosure; (iii) a first inner shell motorized linear actuator connected to the first sensor enclosure inner shell and in electrical communication with the processor for moving the first sensor enclosure inner shell from the first sensor enclosure inner shell open position to the first sensor enclosure inner shell closed position and from the first sensor enclosure inner shell closed position to the first sensor enclosure inner shell open position depending on instructions from the processor; (D) housing the second LiDAR sensor in a second sensor enclosure including (i) a second sensor enclosure LiDAR cap configured to move from a second sensor enclosure cap open position in which the second LiDAR sensor is exposed to a second sensor enclosure cap closed position in which the second LiDAR sensor is not exposed, and from the second sensor enclosure cap closed position to the second sensor enclosure cap open position; and (ii) a second LiDAR cap motorized linear actuator in electrical communication with the computing system and connected to the second sensor enclosure LiDAR cap for moving the second sensor enclosure LiDAR cap from the second sensor enclosure cap closed position 
to the second sensor enclosure cap open position or from the second sensor enclosure cap open position to the second sensor enclosure cap closed position depending on control instructions from the computing system; (E) housing the second sensor enclosure first high-resolution camera in the second sensor enclosure including (i) a second sensor enclosure outer shell; (ii) a second sensor enclosure outer shell first aperture through which the second sensor enclosure first high-resolution camera obtains digital image data; (iii) a second sensor enclosure inner shell configured to move relative to the second sensor enclosure outer shell from a second sensor enclosure inner shell open position wherein the second sensor enclosure outer shell first aperture is open and the second sensor enclosure first high-resolution camera is exposed to weather outside the second sensor enclosure to a second sensor enclosure inner shell closed position wherein the second sensor enclosure outer shell first aperture is blocked by the second sensor enclosure inner shell and the second sensor enclosure first high-resolution camera is not exposed to weather outside the second sensor enclosure; and (F) housing the second sensor enclosure second high-resolution camera in the second sensor enclosure including (i) a second sensor enclosure outer shell second aperture through which the second sensor enclosure second high-resolution camera obtains digital image data; (ii) the second sensor enclosure inner shell configured to move relative to the second sensor enclosure outer shell from the second sensor enclosure inner shell open position wherein the second sensor enclosure outer shell second aperture is open and the second sensor enclosure second high-resolution camera is exposed to weather outside the second sensor enclosure to the second sensor enclosure inner shell closed position wherein the second sensor enclosure outer shell second aperture is blocked by the second sensor enclosure inner shell and the second sensor enclosure 
second high-resolution camera is not exposed to weather outside the second sensor enclosure; and (iii) a second inner shell motorized linear actuator connected to the second sensor enclosure inner shell and in electrical communication with the processor for moving the second sensor enclosure inner shell from the second sensor enclosure inner shell open position to the second sensor enclosure inner shell closed position and from the second sensor enclosure inner shell closed position to the second sensor enclosure inner shell open position depending on instructions from the processor.


Some versions of the methods described above may further include (A) detecting weather conditions outside the first sensor enclosure and the second sensor enclosure using a climatic sensor in electrical communication with the processor; (B) activating the first inner shell motorized linear actuator to move the first sensor enclosure inner shell from the first sensor enclosure inner shell open position to the first sensor enclosure inner shell closed position based on information received by the processor from the climatic sensor; (C) activating the second inner shell motorized linear actuator to move the second sensor enclosure inner shell from the second sensor enclosure inner shell open position to the second sensor enclosure inner shell closed position based on information received by the processor from the climatic sensor; (D) activating the first LiDAR cap motorized linear actuator to move the first sensor enclosure LiDAR cap from the first sensor enclosure cap open position to the first sensor enclosure cap closed position; and (E) activating the second LiDAR cap motorized linear actuator to move the second sensor enclosure LiDAR cap from the second sensor enclosure cap open position to the second sensor enclosure cap closed position.


Some versions of the methods described above may further include (A) detecting movement of the railroad corridor assessment platform using a motion sensor; (B) activating the first inner shell motorized linear actuator to move the first sensor enclosure inner shell from the first sensor enclosure inner shell open position to the first sensor enclosure inner shell closed position based on information received by the processor from the motion sensor; (C) activating the second inner shell motorized linear actuator to move the second sensor enclosure inner shell from the second sensor enclosure inner shell open position to the second sensor enclosure inner shell closed position based on information received by the processor from the motion sensor; (D) activating the first LiDAR cap motorized linear actuator to move the first sensor enclosure LiDAR cap from the first sensor enclosure cap open position to the first sensor enclosure cap closed position; and (E) activating the second LiDAR cap motorized linear actuator to move the second sensor enclosure LiDAR cap from the second sensor enclosure cap open position to the second sensor enclosure cap closed position.
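The protective logic recited in the two preceding paragraphs can be sketched as follows. This is an illustrative sketch only; the function and argument names are hypothetical and the disclosed embodiments do not depend on any particular software implementation:

```python
# Illustrative sketch (hypothetical names): close the inner shells and LiDAR
# caps when a climatic sensor reports precipitation or a motion sensor reports
# that the platform is moving, mirroring steps (B)-(E) above.

def protect_sensors(climatic_reading, platform_moving, actuators):
    """Drive each motorized linear actuator to its closed position if needed.

    climatic_reading: dict with a boolean 'precipitation' flag (assumed layout)
    platform_moving: bool reported by the motion sensor
    actuators: dict of callables, one per actuator, taking a position string
    """
    must_close = climatic_reading.get("precipitation", False) or platform_moving
    if must_close:
        # Steps (B)-(E): both inner shells, then both LiDAR caps.
        for name in ("first_inner_shell", "second_inner_shell",
                     "first_lidar_cap", "second_lidar_cap"):
            actuators[name]("closed")
    return must_close
```

In this sketch the processor's control instructions are reduced to a single position string per actuator; a real controller would also confirm actuator travel before reporting the enclosures closed.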


In another aspect, a system for generating and interpreting point clouds of a rail corridor along a survey path while moving on a railroad corridor assessment platform is disclosed. In some embodiments, the system includes (A) a railroad corridor assessment platform; (B) a first LiDAR sensor configured to scan along a first scan plane, the first LiDAR sensor attached to the railroad corridor assessment platform in a rear-facing direction; (C) a second LiDAR sensor configured to scan along a second scan plane, the second LiDAR sensor attached to the railroad corridor assessment platform in a rear-facing direction, wherein the first scan plane crosses the second scan plane but is not coplanar at all points with the second scan plane and wherein neither the first scan plane nor the second scan plane intersects a main body of any rail car adjoined to a rear end of the railroad corridor assessment platform; (D) a data storage device; (E) an Inertial Measurement Unit (IMU); (F) a geo-location device; and (G) a high-performance computing system in electrical communication with the first LiDAR sensor, the second LiDAR sensor, the data storage device, the IMU, and the geo-location device, the computing system comprising a high-performance processor wherein the processor controls operations of the first LiDAR sensor and the second LiDAR sensor for obtaining and storing point cloud data.


Some versions of the systems described above may further include (A) a first sensor enclosure in which the first LiDAR sensor is housed; (B) a second sensor enclosure in which the second LiDAR sensor is housed; (C) a temperature sensor in electrical communication with the computing system and located proximate to the first LiDAR sensor and the second LiDAR sensor; (D) a heating and cooling system in electrical communication with and controlled by the computing system, the heating and cooling system further including (i) an air blower; (ii) a heater for heating air blown from the air blower; (iii) an air chiller for cooling air blown from the air blower; and (iv) a duct for channeling air from the air blower to the first sensor enclosure and the second sensor enclosure depending on temperature data sent by the temperature sensor to the computing system.
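The duct-fed temperature control described above can be illustrated with a simple setpoint sketch. The thresholds and names below are assumptions chosen for illustration and are not values from this disclosure:

```python
# Hypothetical sketch of the heating/cooling selection: a temperature sensor
# near the LiDAR sensors determines whether the blower delivers heated air,
# chilled air, or no conditioned air. Setpoints are illustrative assumptions.

COLD_LIMIT_C = 5.0    # below this, blow heated air into the enclosures
HOT_LIMIT_C = 35.0    # above this, blow chilled air into the enclosures

def select_airflow(enclosure_temp_c):
    """Return which conditioning mode the blower and duct should run in."""
    if enclosure_temp_c < COLD_LIMIT_C:
        return "heat"      # air blower + heater
    if enclosure_temp_c > HOT_LIMIT_C:
        return "chill"     # air blower + air chiller
    return "off"           # temperature within limits; no conditioned air
```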


Some versions of the systems described above may further include (A) the first sensor enclosure further including (i) a first sensor enclosure inner shell comprising a first sensor enclosure inner shell first aperture and a first sensor enclosure inner shell second aperture, wherein the first sensor enclosure inner shell is configured to move relative to the first sensor enclosure outer shell from a first sensor enclosure inner shell open position wherein the first sensor enclosure outer shell first aperture is in line with the first sensor enclosure inner shell first aperture and the first sensor enclosure outer shell second aperture is in line with the first sensor enclosure inner shell second aperture to a first sensor enclosure inner shell closed position wherein the first sensor enclosure outer shell first aperture is not in line with the first sensor enclosure inner shell first aperture and the first sensor enclosure outer shell second aperture is not in line with the first sensor enclosure inner shell second aperture resulting in (1) the first sensor enclosure outer shell first aperture being blocked by the first sensor enclosure inner shell to protect the first sensor enclosure first high-resolution camera and (2) the first sensor enclosure outer shell second aperture being blocked by the first sensor enclosure inner shell to protect the first sensor enclosure second high-resolution camera; (ii) a first inner shell motorized linear actuator in electrical communication with the computing system and connected to the first sensor enclosure inner shell for moving the first sensor enclosure inner shell from the first sensor enclosure inner shell open position to the first sensor enclosure inner shell closed position and from the first sensor enclosure inner shell closed position to the first sensor enclosure inner shell open position depending upon control instructions from the computing system; (iii) a first sensor enclosure LiDAR cap configured to move 
from a first sensor enclosure cap open position in which the first LiDAR sensor is exposed to a first sensor enclosure cap closed position in which the first LiDAR sensor is not exposed, and from the first sensor enclosure cap closed position to the first sensor enclosure cap open position; and (iv) a first LiDAR cap motorized linear actuator in electrical communication with the computing system and connected to the first sensor enclosure LiDAR cap for moving the first sensor enclosure LiDAR cap from the first sensor enclosure cap closed position to the first sensor enclosure cap open position or from the first sensor enclosure cap open position to the first sensor enclosure cap closed position depending on control instructions from the computing system; and (B) the second sensor enclosure further including (i) a second sensor enclosure inner shell comprising a second sensor enclosure inner shell first aperture and a second sensor enclosure inner shell second aperture, wherein the second sensor enclosure inner shell is configured to move relative to the second sensor enclosure outer shell from a second sensor enclosure inner shell open position wherein the second sensor enclosure outer shell first aperture is in line with the second sensor enclosure inner shell first aperture and the second sensor enclosure outer shell second aperture is in line with the second sensor enclosure inner shell second aperture to a second sensor enclosure inner shell closed position wherein the second sensor enclosure outer shell first aperture is not in line with the second sensor enclosure inner shell first aperture and the second sensor enclosure outer shell second aperture is not in line with the second sensor enclosure inner shell second aperture resulting in (1) the second sensor enclosure outer shell first aperture being blocked by the second sensor enclosure inner shell to protect the second sensor enclosure first high-resolution camera and (2) the second sensor enclosure outer shell 
second aperture being blocked by the second sensor enclosure inner shell to protect the second sensor enclosure second high-resolution camera; (ii) a second inner shell motorized linear actuator in electrical communication with the computing system and connected to the second sensor enclosure inner shell for moving the second sensor enclosure inner shell from the second sensor enclosure inner shell open position to the second sensor enclosure inner shell closed position and from the second sensor enclosure inner shell closed position to the second sensor enclosure inner shell open position depending upon control instructions from the computing system; (iii) a second sensor enclosure LiDAR cap configured to move from a second sensor enclosure cap open position in which the second LiDAR sensor is exposed to a second sensor enclosure cap closed position in which the second LiDAR sensor is not exposed, and from the second sensor enclosure cap closed position to the second sensor enclosure cap open position; and (iv) a second LiDAR cap motorized linear actuator in electrical communication with the computing system and connected to the second sensor enclosure LiDAR cap for moving the second sensor enclosure LiDAR cap from the second sensor enclosure cap closed position to the second sensor enclosure cap open position or from the second sensor enclosure cap open position to the second sensor enclosure cap closed position depending on control instructions from the computing system.


Some versions of the systems described above may further include a climatic sensor in electrical communication with the computing system.


Some versions of the systems described above may further include a motion sensor for sensing motion of the railroad corridor assessment platform, wherein the motion sensor is in electrical communication with the computing system.


The summary provided herein is intended to provide examples of particular disclosed embodiments and is not intended to cover all potential embodiments or combinations of embodiments. Therefore, this summary is not intended to limit the scope of the invention disclosure in any way, a function which is reserved for the appended claims.





BRIEF DESCRIPTION OF THE DRAWINGS

Further features, aspects, and advantages of the present disclosure will become better understood by reference to the following detailed description, appended claims, and accompanying figures, wherein elements are not to scale so as to more clearly show the details, wherein like reference numbers indicate like elements throughout the several views, and wherein:



FIG. 1 shows an image of a rear section of a railroad corridor assessment platform including a boxcar showing a cross-sectional view of the railroad boxcar revealing a portion of a railroad corridor assessment system including a rear-mounted first sensor enclosure including a first LiDAR sensor;



FIG. 2 shows a rear view of the boxcar shown in FIG. 1 including the first rear-mounted sensor enclosure including the first LiDAR sensor and a second rear-mounted sensor enclosure including a second LiDAR sensor;



FIG. 3A shows a side cross-sectional view of the first sensor enclosure shown in a first sensor enclosure inner shell open position, exposing the first LiDAR sensor so that point cloud data can be gathered by the first LiDAR sensor;



FIG. 3B shows a side cross-sectional view of the first sensor enclosure shown in a first sensor enclosure inner shell closed position, concealing and protecting the first LiDAR sensor from the elements including inclement weather;



FIG. 4 shows a close-up end view of the first sensor enclosure shown in FIGS. 1-3B;



FIG. 5 shows a partially transparent plan view looking down on and into the first sensor enclosure and the second sensor enclosure as they would be mounted on the railroad boxcar wherein the first LiDAR sensor and the second LiDAR sensor are configured in preferred orientations to gather point cloud data;



FIG. 6 shows a partially transparent view looking up and into the first sensor enclosure and the second sensor enclosure as they would be mounted on the railroad boxcar wherein the first LiDAR sensor and the second LiDAR sensor are configured in preferred orientations to gather point cloud data;



FIG. 7 shows a partially transparent end view looking directly at the first sensor enclosure and the second sensor enclosure as they would be mounted on a rear wall of the railroad boxcar wherein the first LiDAR sensor and the second LiDAR sensor are configured in preferred orientations to gather point cloud data;



FIG. 8 shows a partially transparent perspective view looking at and into the first sensor enclosure;



FIG. 9 shows a first sensor enclosure outer shell and mounting plate along with a first sensor enclosure inner shell motorized linear actuator;



FIG. 10 shows a simple schematic of the railroad corridor assessment system including electrical connections from a computing system to various components of the first sensor enclosure and the second sensor enclosure as well as air flow connections through a duct from a heating and cooling system to the first sensor enclosure and the second sensor enclosure;



FIG. 11A shows a first side view of the first sensor enclosure with the outer shell removed to reveal a range of preferred orientations of the first LiDAR sensor along a Z,Y plane with the first LiDAR sensor being rotatable about an X axis coming out of the page at point 178;



FIG. 11B shows a second side view of the first sensor enclosure with the outer shell removed to reveal a range of preferred orientations of the first LiDAR sensor along a Z,Y plane with the first LiDAR sensor being rotatable about an X axis coming out of the page at point 178;



FIG. 12A shows a first end view of the first sensor enclosure with the outer shell removed to reveal a range of preferred orientations of the first LiDAR sensor along an X,Y plane with the first LiDAR sensor being rotatable about a Z axis coming out of the page at point 180;



FIG. 12B shows a second end view of the first sensor enclosure with the outer shell removed to reveal a range of preferred orientations of the first LiDAR sensor along an X,Y plane with the first LiDAR sensor being rotatable about a Z axis coming out of the page at point 180;



FIG. 13 shows a plan view looking down on the railroad boxcar with the first sensor enclosure and the second sensor enclosure and an adjoining railroad boxcar coupled to the railroad boxcar with the sensor enclosures wherein the view shows a preferred first scan plane of the first LiDAR sensor and a preferred second scan plane of the second LiDAR sensor wherein the scan planes intersect but are not coplanar at all points and wherein the scan planes do not intersect a main body of the adjoining rail car, thereby providing a 360-degree scan line for gathering point cloud data using the first LiDAR sensor and the second LiDAR sensor; and



FIG. 14 shows a side view looking at the railroad boxcar with the first sensor enclosure and the second sensor enclosure and an adjoining railroad boxcar coupled to the railroad boxcar with the sensor enclosures as shown in FIG. 13 wherein the view shows a preferred first scan plane of the first LiDAR sensor and a preferred second scan plane of the second LiDAR sensor wherein the scan planes intersect but are not coplanar at all points and wherein the scan planes do not intersect a main body of the adjoining rail car, thereby providing a 360-degree scan line for gathering point cloud data using the first LiDAR sensor and the second LiDAR sensor.





The figures are provided to illustrate concepts of the invention disclosure and are not intended to embody all potential embodiments of the invention. Therefore, the figures are not intended to limit the scope of the invention disclosure in any way, a function which is reserved for the appended claims.


DETAILED DESCRIPTION

Various terms used herein are intended to have particular meanings. Some of these terms are defined below for the purpose of clarity. The definitions given below are meant to cover all forms of the words being defined (e.g., singular, plural, present tense, past tense). If the definition of any term below diverges from the commonly understood and/or dictionary definition of such term, the definitions below control.


Air or Gas: broadly defined as any gas or mixtures thereof.


Data Communication: a first feature is said to be in data communication with a second feature if the first feature is configured to transmit information to the second feature and the second feature is configured to receive such data, whether such data is transmitted through one or more electrical conductors (e.g., wires), cables (including optical fiber), wirelessly, or a combination thereof.


Electrical Communication: a first feature is said to be in electrical communication with a second feature if there is a conductive path for electricity in any form to flow between the first feature and the second feature thereby electrically connecting the first feature with the second feature. Being in electrical communication does not necessarily mean that electricity is actively flowing but that such structures are configured so that electricity could flow easily from the first feature to the second feature. Features that are in electrical communication may also be in data communication with one another. Therefore, for features that normally transfer or receive data, if such features are said to be in electrical communication with one another, it can be inferred that such features are also in data communication with one another.


Fluid Communication: a first feature is said to be in fluid communication with a second feature if there is a duct or path for air to flow between the first feature and the second feature.


Proximate: a first feature is said to be proximate to a second feature if the first feature is attached to or otherwise extends all the way to the second feature or if the first feature is located close to or extends to a location close to the second feature.



FIGS. 1-10 show different views and aspects of a railroad corridor assessment system 100 for generating and interpreting point clouds of a rail corridor along a survey path while moving on a railroad corridor assessment platform 102. The system 100 includes a computing system 104 including a high-performance processor 106. The computing system 104 is in electrical communication with a first LiDAR sensor 108A and a second LiDAR sensor 108B as shown schematically in FIG. 10. In the embodiment shown in FIG. 1 and FIG. 2, the railroad corridor assessment platform 102 includes a railroad boxcar 110 including a roof 112 and a plurality of walls 114 including a rear wall 114A. FIG. 1 shows a cross-sectional view of the railroad boxcar 110 and some of its contents. The first LiDAR sensor 108A is housed in a first sensor enclosure 116A attached near the top of the rear wall 114A near a first side 118A of the railroad boxcar 110 as shown in FIG. 2. Similarly, the second LiDAR sensor 108B is housed in a second sensor enclosure 116B attached near the top of the rear wall 114A near a second side 118B of the railroad boxcar 110 as shown in FIG. 1 and FIG. 2.


An important aspect of the railroad corridor assessment system 100 is the orientation of the first LiDAR sensor 108A and the second LiDAR sensor 108B. The first LiDAR sensor 108A is oriented and configured to scan a first scan plane 120A. The second LiDAR sensor 108B is oriented and configured to scan a second scan plane 120B which, although intersecting with the first scan plane 120A, is not in the same plane as the first scan plane 120A. The first scan plane 120A and the second scan plane 120B intersect as shown in FIG. 5, FIG. 13, and FIG. 14. Specific information regarding the angles of orientation is discussed in more detail below. An adjoining railroad boxcar 122 is shown coupled to a rear end 123 of the railroad boxcar 110 in FIG. 13 and FIG. 14. These figures show how the first LiDAR sensor 108A scanning along the first scan plane 120A and the second LiDAR sensor 108B scanning along the second scan plane 120B are able to acquire 360 degrees of point cloud data despite the adjoining railroad boxcar 122 being coupled so close to the railroad boxcar 110. Because of the specific orientation of the LiDAR sensors 108, the first scan plane 120A and the second scan plane 120B fit tightly between a main body 124 of the railroad boxcar 110 and a main body 126 of the adjoining railroad boxcar 122 without the first scan plane 120A or the second scan plane 120B intersecting the main body 124 of the railroad boxcar 110 or the main body of any adjoining rail car of any type such as, for example, the main body 126 of the adjoining railroad boxcar 122. As such, the visual presence of any railroad boxcar coupled to and behind the railroad boxcar 110 is effectively minimized from the perspective of the LiDAR sensors 108 when the LiDAR sensors 108 gather point cloud data. 
Additionally, with the selection of preferred scan planes, there is no interference between the scan planes and the adjacent boxcar main body even when the cars travel on curved track and the corners of one side of the boxcar bodies are necessarily closer. Taking the adjoining railroad boxcar 122 as an example, a "main body" of a railroad boxcar is defined herein as walls 126A (if present) and a base platform 126B including wheels and axles supporting such platform but excluding coupling hardware 128. Prior to this disclosure, similar LiDAR systems were only used on the front of locomotives or on the roofs of hi-rail vehicles. A unique feature of the railroad corridor assessment system 100 is that the LiDAR sensors 108 are attached to a rear wall of the railroad boxcar 110 as opposed to the roof 112 of the railroad boxcar 110 or the front of the locomotive pushing or pulling the railroad boxcar 110. Another capability unique to a cross-plane LiDAR sensor configuration is the ability to detect planar surfaces that are perpendicular to the railroad tracks of a railroad corridor. LiDAR sensors that are mounted so the scan planes are perpendicular to the direction of travel (like on the front of a locomotive) are not able to measure surfaces that are also perpendicular to the direction of travel such as, for example, sign faces and building walls. In contrast, LiDAR sensors configured with a scan angle of more or less than 90 degrees to the direction of travel as described herein (nominally about 79 degrees for the second LiDAR sensor 108B, shown as angle ρ in FIG. 5, and about 101 degrees for the first LiDAR sensor 108A, shown as angle τ) allow measurement of the back of a planar surface (e.g., a sign face) as the sensors 108 approach the planar surface with the first scan plane 120A from the first LiDAR sensor 108A, and of the front surface (by the second scan plane 120B) as the sensors 108 move past the sign (for a surface/sign on the B side of the car).
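The clearance property described above can be checked with simple plane geometry. The sketch below is illustrative only: the car width, coupling gap, and coordinate convention are assumptions, not dimensions from this disclosure. With travel along +Y, a scan plane yawed at angle θ from the direction of travel drifts rearward by span/tan(θ) as it sweeps across the width of the car:

```python
# Geometry sketch (assumed dimensions, not from the disclosure): verify that a
# scan plane yawed ~101 degrees from the direction of travel, anchored at one
# rear corner, stays within an assumed clear gap between coupled car bodies.

import math

CAR_WIDTH_M = 3.0      # assumed boxcar body width
COUPLING_GAP_M = 0.9   # assumed clear distance between coupled car bodies

def rearward_offset(theta_deg, lateral_span_m):
    """Rearward drift (m) of the scan line across a lateral span.

    theta_deg: scan-plane angle from the direction of travel
    (90 degrees = perpendicular, which produces zero drift).
    """
    return lateral_span_m / math.tan(math.radians(theta_deg))

# At 101 degrees the line drifts rearward ~0.58 m across the car width,
# which is less than the assumed 0.9 m gap, so the plane clears the
# adjoining car body.
drift = abs(rearward_offset(101.0, CAR_WIDTH_M))
clears = drift < COUPLING_GAP_M
```

The same arithmetic shows why a perpendicular plane (θ = 90 degrees) cannot see perpendicular surfaces: its along-track component is zero, whereas at 79 or 101 degrees the plane retains an along-track component that strikes sign faces and building walls.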


The LiDAR sensors 108 are used to gather point cloud data along a railroad corridor. Such point cloud data is used by the processor 106 to generate and interpret point clouds, revealing various features along a railroad corridor including signage, furniture, adjoining tracks, ballast profile, drainage ditch profile, embankments, and tunnel walls. Real-time point clouds are generated by the processor, preferably in LAS file format. The point cloud data that is gathered and the generated point clouds are stored on a data storage device 130 in electrical communication with the computing system 104. The data storage device 130 is preferably in the form of a network-attached storage (NAS) computer data storage server. In order to produce a correctly referenced point cloud, some additional devices are included in the system 100, including an Inertial Measurement Unit (IMU) 131 in electrical communication with the processor 106 as well as a geo-location device such as, for example, a GPS device 132 in electrical communication with the processor 106. These additional devices help provide real-time LiDAR sensor 108 attitude information (based on the attitude of the boxcar 110 on which the LiDAR sensors 108 are installed) and real-time GPS position information in conjunction with the gathered LiDAR sensor point cloud data. The processor 106 controls operations of the first LiDAR sensor 108A and the second LiDAR sensor 108B and performs a method for generating and interpreting point clouds of a rail corridor. 
The method includes operations of obtaining a first set of point cloud data using the first LiDAR sensor 108A; obtaining a second set of point cloud data using the second LiDAR sensor 108B; obtaining boxcar 110 attitude information using the IMU 131; obtaining GPS information using the geo-location device 132; combining the first set of point cloud data, the second set of point cloud data, the IMU attitude information, and the GPS information together to generate a combined point cloud using the processor 106; identifying rail corridor features of interest found in the combined point cloud using the processor 106; creating an inventory of the identified rail corridor features of interest using the processor 106; and storing the combined point cloud and the inventory of identified rail corridor features on the data storage device 130.
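The combining operation above can be sketched in simplified two-dimensional form. The data layout and function below are hypothetical illustrations, not the disclosed implementation; a production system would apply the full 3-D attitude (roll, pitch, and yaw) from the IMU rather than a single heading angle:

```python
# Simplified sketch (hypothetical names and 2-D layout): each LiDAR return in
# the platform frame is rotated by the platform heading from the IMU and
# translated by the GPS-derived position to produce geo-referenced points.

import math

def combine_point_clouds(cloud_a, cloud_b, heading_rad, position):
    """Merge two platform-frame clouds into one geo-referenced cloud.

    cloud_a, cloud_b: lists of (x, y) points in the platform frame
    heading_rad: platform heading from the IMU (2-D simplification)
    position: (easting, northing) of the platform from the geo-location device
    """
    cos_h, sin_h = math.cos(heading_rad), math.sin(heading_rad)
    easting, northing = position
    combined = []
    for x, y in cloud_a + cloud_b:
        # Rotate by attitude, then translate by position.
        combined.append((easting + x * cos_h - y * sin_h,
                         northing + x * sin_h + y * cos_h))
    return combined
```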


In addition to the processor 106, the computing system preferably further includes one or more LiDAR controllers, a local data storage server, and a high-performance graphics processing unit (GPU) compute server. The LiDAR sensors 108 used are preferably Riegl VUX-1HA sensors available from RIEGL Laser Measurement Systems GmbH based in Horn, Austria. Raw and post-processed trajectory and LiDAR sensor data are archived to the data storage device 130 for back-office re-processing and analysis. The railroad corridor assessment system 100 is capable of gathering data and generating point clouds with survey-grade accuracy while the railroad corridor assessment platform 102 is moving at speeds up to and even greater than 70 miles per hour on a railroad track. Electrical power for the various devices described in this disclosure can be provided by a diesel generator, photovoltaic solar panels, or a set of batteries onboard the railroad corridor assessment platform 102. Preferably, batteries are available on the railroad corridor assessment platform 102 and are charged by an onboard generator and one or more photovoltaic solar panels mounted on the roof of the railroad corridor assessment platform 102. In addition, individual devices may include backup battery power from smaller batteries in electrical communication with those devices.


In addition to the LiDAR sensors 108, the railroad corridor assessment system 100 also preferably includes a plurality of first sensor enclosure high resolution cameras 134 and a plurality of second sensor enclosure high resolution cameras 136. The plurality of first sensor enclosure high resolution cameras 134 preferably includes at least a first sensor enclosure first high-resolution camera 134A and a first sensor enclosure second high resolution camera 134B. In the embodiments shown in FIGS. 3 and 5-8, there is also a first sensor enclosure third high resolution camera 134C. Similarly, the plurality of second sensor enclosure high resolution cameras 136 preferably includes at least a second sensor enclosure first high-resolution camera 136A and a second sensor enclosure second high resolution camera 136B. In the embodiments shown in FIGS. 3 and 5-8, there is also a second sensor enclosure third high resolution camera 136C. As indicated by their names, the first sensor enclosure high resolution cameras 134 are located in the first sensor enclosure 116A and the second sensor enclosure high resolution cameras 136 are located in the second sensor enclosure 116B. The first sensor enclosure high resolution cameras 134 and the second sensor enclosure high resolution cameras 136 are all in electrical communication with and are controlled by the computing system 104.


The first sensor enclosure 116A includes a first sensor enclosure outer shell 138A including a first sensor enclosure outer shell first aperture 140A, a first sensor enclosure outer shell second aperture 140B, and a first sensor enclosure outer shell third aperture 140C. The first sensor enclosure first high-resolution camera 134A is oriented to view from the inside of the first sensor enclosure 116A through the first sensor enclosure outer shell first aperture 140A to gather digital image data of a rail corridor. The first sensor enclosure second high resolution camera 134B is oriented to view from the inside of the first sensor enclosure 116A through the first sensor enclosure outer shell second aperture 140B to gather digital image data of a rail corridor. The first sensor enclosure third high resolution camera 134C is oriented to view from the inside of the first sensor enclosure 116A through the first sensor enclosure outer shell third aperture 140C to gather digital image data of a rail corridor.


The second sensor enclosure 116B includes a second sensor enclosure outer shell 138B including a second sensor enclosure outer shell first aperture 142A, a second sensor enclosure outer shell second aperture 142B, and a second sensor enclosure outer shell third aperture 142C. The second sensor enclosure first high-resolution camera 136A is oriented to view from the inside of the second sensor enclosure 116B through the second sensor enclosure outer shell first aperture 142A to gather digital image data of a rail corridor. The second sensor enclosure second high resolution camera 136B is oriented to view from the inside of the second sensor enclosure 116B through the second sensor enclosure outer shell second aperture 142B to gather digital image data of a rail corridor. The second sensor enclosure third high resolution camera 136C is oriented to view from the inside of the second sensor enclosure 116B through the second sensor enclosure outer shell third aperture 142C to gather digital image data of a rail corridor.


As shown in the Figures, in embodiments in which three high-resolution digital cameras are used, preferably one of the three cameras faces up, one faces out to the side away from the railroad boxcar 110, and one faces down. Using all six high-resolution digital cameras, it is possible to generate a combined 360-degree panoramic digital image of a rail corridor using the processor 106. The digital image data from each camera (134A, 134B, 134C, 136A, 136B, and 136C) are synchronized using a boxcar wheel mounted shaft encoder 143. Preferably, the shaft encoder 143 produces 10,000 pulses per revolution, or one pulse for every 0.287 millimeter (mm) of travel. The encoder pulses are divided to produce a camera trigger every 1.5 to 2 meters while the railroad boxcar 110 is moving at 70 miles per hour. This trigger is used to acquire an image from all six cameras (134A, 134B, 134C, 136A, 136B, and 136C) at the same instant and position so the images can be combined into a single panoramic image. They cannot be accurately combined, nor geo-referenced as a panoramic image, if they are not acquired at the same instant. The processor performs a method for generating and interpreting digital image data. 
The method includes operations of obtaining a first set of digital image data using the first sensor enclosure first high resolution camera 134A being triggered by signals from the shaft encoder 143; obtaining a second set of digital image data using the first sensor enclosure second high resolution camera 134B being triggered by signals from the shaft encoder 143; obtaining a third set of digital image data using the first sensor enclosure third high resolution camera 134C being triggered by signals from the shaft encoder 143; obtaining a fourth set of digital image data using the second sensor enclosure first high resolution camera 136A being triggered by signals from the shaft encoder 143; obtaining a fifth set of digital image data using the second sensor enclosure second high resolution camera 136B being triggered by signals from the shaft encoder 143; obtaining a sixth set of digital image data using the second sensor enclosure third high resolution camera 136C being triggered by signals from the shaft encoder 143; combining the first set of digital image data, the second set of digital image data, the third set of digital image data, the fourth set of digital image data, the fifth set of digital image data, and the sixth set of digital image data to form a combined set of digital image data including a plurality of digital images and generating a combined panoramic digital image of the rail corridor using the processor 106; time stamping the plurality of digital images using the processor 106; and storing the combined set of digital image data on the data storage device 130. The time stamping of the digital image data allows for geo-referencing and/or coloring a generated LiDAR point cloud by superimposing the generated LiDAR point cloud with the combined panoramic digital image of the rail corridor. Acquired images are preferably able to resolve text with a font height as small as 2 inches at a distance of from about 2 meters to about 15 meters. 
The combined panoramic digital image of the rail corridor provides a way to visually assess site conditions at the point and time an image of a specific site is obtained and generated by the railroad corridor assessment system 100.
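The encoder-divided trigger described above reduces to a distance-based divisor calculation. The following is a minimal illustrative sketch, not part of the patented system; the function name and the use of rounding are assumptions, and only the 10,000 pulse-per-revolution and 0.287 mm-per-pulse figures come from the description above.

```python
# Illustrative sketch of the distance-based camera trigger: with 10,000 pulses
# per wheel revolution and 0.287 mm of travel per pulse, dividing the pulse
# stream yields a camera trigger at a fixed spacing regardless of train speed.
MM_PER_PULSE = 0.287  # 10,000 pulses over a ~2.87 m wheel circumference

def trigger_divisor(spacing_m: float) -> int:
    """Encoder pulses to count between camera triggers for a given image spacing."""
    return round(spacing_m * 1000.0 / MM_PER_PULSE)

# A 1.5 m image spacing needs 5226 pulses per trigger; a 2.0 m spacing needs 6969.
```

Dividing the raw pulse stream this way is why all six cameras can be fired at the same instant and position: each trigger corresponds to a fixed distance traveled, not a fixed time interval.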


Because high-resolution digital cameras and their attached optic lenses are sensitive equipment, it is undesirable to have such devices exposed to the elements in inclement weather, airborne debris, or other detrimental conditions. The sensor enclosures 116 shown in FIGS. 3-9 provide a way to address this problem. First, the railroad corridor assessment system 100 preferably includes a climatic conditions sensor 144 in data communication with the computing system 104. The climatic sensor 144 may include a plurality of sensors to detect different types of weather conditions. One important condition to determine is whether there is precipitation and, if so, the severity of the precipitation. When a minimum threshold of precipitation programmed into the computing system 104 is detected by the climatic sensor 144, data is sent to the computing system 104 to take further action to autonomously protect the high-resolution digital cameras (134A, 134B, 134C, 136A, 136B, and 136C). Similarly, after a precipitation event has ended, the climatic sensor 144 communicates the new weather status to the computing system 104 to re-expose the digital cameras (134A, 134B, 134C, 136A, 136B, and 136C) to the elements and enable gathering additional digital images. In order to protect the digital cameras (134A, 134B, 134C, 136A, 136B, and 136C), the first sensor enclosure 116A includes a round first sensor enclosure inner shell 146A and the second sensor enclosure 116B includes a round second sensor enclosure inner shell 146B. As shown in FIG. 3, FIG. 8 and FIG. 9, the first sensor enclosure inner shell 146A fits inside a round first sensor enclosure outer shell wall 148A and the first sensor enclosure 116A is configured so that the first sensor enclosure inner shell 146A can rotate and move relative to the first sensor enclosure outer shell wall 148A. 
The first sensor enclosure inner shell 146A includes a first sensor enclosure inner shell first aperture 150A, a first sensor enclosure inner shell second aperture 150B, and a first sensor enclosure inner shell third aperture 150C which line up with the first sensor enclosure outer shell first aperture 140A, the first sensor enclosure outer shell second aperture 140B, and the first sensor enclosure outer shell third aperture 140C, respectively, when the first sensor enclosure inner shell 146A is in a first sensor enclosure inner shell open position (see FIG. 9), exposing the first sensor enclosure first high resolution camera 134A, the first sensor enclosure second high resolution camera 134B, and the first sensor enclosure third high resolution camera 134C to weather outside the first sensor enclosure 116A. The first sensor enclosure 116A includes a first inner shell motorized linear actuator 152A in electrical communication with the processor 106 and connected to the first sensor enclosure inner shell 146A and a first sensor enclosure mounting plate 174A wherein the mounting plate 174A is mounted to the rear wall 114A of the railroad boxcar 110. The first inner shell motorized linear actuator 152A is controlled by the processor 106 to move the first sensor enclosure inner shell 146A from the first sensor enclosure inner shell open position (see FIG. 9) to a first sensor enclosure inner shell closed position (see FIG. 5 and FIG. 
6) wherein the respective apertures no longer line up and the first sensor enclosure inner shell 146A blocks the first sensor enclosure outer shell first aperture 140A, the first sensor enclosure outer shell second aperture 140B, and the first sensor enclosure outer shell third aperture 140C, thereby protecting the first sensor enclosure first high resolution camera 134A, the first sensor enclosure second high resolution camera 134B, and the first sensor enclosure third high resolution camera 134C from weather outside the first sensor enclosure 116A. The exact same features are found on the second sensor enclosure 116B, which includes a second inner shell motorized linear actuator 152B in electrical communication with the processor 106 and which is connected to a second sensor enclosure inner shell 146B and a second sensor enclosure mounting plate 174B, wherein the mounting plate 174B is mounted to the rear wall 114A of the railroad boxcar 110. The second inner shell motorized linear actuator 152B is controlled by the processor 106 to move the second sensor enclosure inner shell 146B from a second sensor enclosure inner shell open position (wherein the second sensor enclosure first high resolution camera 136A, the second sensor enclosure second high resolution camera 136B, and the second sensor enclosure third high resolution camera 136C are exposed through a second sensor enclosure inner shell first aperture 154A, a second sensor enclosure inner shell second aperture 154B, and a second sensor enclosure inner shell third aperture 154C, respectively) to a second sensor enclosure inner shell closed position (wherein the second sensor enclosure inner shell 146B blocks the second sensor enclosure outer shell first aperture 142A, the second sensor enclosure outer shell second aperture 142B, and the second sensor enclosure outer shell third aperture 142C), thereby protecting the second sensor enclosure first high resolution camera 136A, the second sensor enclosure second high 
resolution camera 136B, and the second sensor enclosure third high resolution camera 136C from weather outside the second sensor enclosure 116B.


In addition to protecting the digital cameras (134A, 134B, 134C, 136A, 136B, and 136C) and associated lenses, the sensor enclosures 116 are also configured to protect the LiDAR sensors 108. FIG. 3A shows a side cross-sectional view of the first sensor enclosure 116A wherein the LiDAR sensor 108A is exposed and collecting data. FIG. 3B shows the same view as FIG. 3A except in FIG. 3B, the LiDAR sensor is not exposed. A first sensor enclosure LiDAR cap 155A is used to protect the first LiDAR sensor 108A and can be moved relative to the first sensor enclosure outer shell 138A from a first sensor enclosure cap open position shown in FIG. 3A to a first sensor enclosure cap closed position shown in FIG. 3B by a first LiDAR cap motorized linear actuator 156A which is in electrical communication with the processor 106 and which is connected to the first sensor enclosure LiDAR cap 155A and a distal end 159A of the first LiDAR sensor 108A as shown in FIG. 3A. Both the first sensor enclosure 116A and the second sensor enclosure 116B have this same configuration, including a second sensor enclosure cap open position in which the second LiDAR sensor 108B is exposed and a second sensor enclosure cap closed position in which the second LiDAR sensor 108B is concealed and protected. As shown in FIGS. 5 and 6, the second sensor enclosure 116B includes a second sensor enclosure LiDAR cap 155B for protecting the second LiDAR sensor 108B. The second sensor enclosure 116B also includes a second LiDAR cap motorized linear actuator 156B in electrical communication with the processor 106 and connected to the second sensor enclosure LiDAR cap 155B and a distal end of the second LiDAR sensor 108B. 
The LiDAR caps 155 can be opened or closed using the LiDAR cap motorized linear actuators 156 based on signals sent by the processor 106, wherein such signals are generated by the processor 106 based on data from another source such as, for example, a clock, the climatic sensor 144, or a motion sensor 157 discussed in more detail below.


In some embodiments, the processor performs a method for protecting the LiDAR sensors (108A and 108B) and the high-resolution cameras (134A, 134B, 134C, 136A, 136B, and 136C). The method includes operations of receiving climatic conditions data from the climatic sensor 144; using the processor 106 to activate the first inner shell motorized linear actuator 152A to move the first sensor enclosure inner shell 146A from the first sensor enclosure inner shell open position to the first sensor enclosure inner shell closed position and activate the first LiDAR cap motorized linear actuator 156A to move the first sensor enclosure LiDAR cap 155A from the first sensor enclosure cap open position to the first sensor enclosure cap closed position based on the received climatic conditions data; and using the processor to activate the second inner shell motorized linear actuator 152B to move the second sensor enclosure inner shell 146B from the second sensor enclosure inner shell open position to the second sensor enclosure inner shell closed position and activate the second LiDAR cap motorized linear actuator 156B to move the second sensor enclosure LiDAR cap 155B from the second sensor enclosure cap open position to the second sensor enclosure cap closed position based on the received climatic conditions data. This would usually occur in inclement weather. 
If the climatic sensor 144 sends data to the processor indicating that it is safe to expose the high-resolution cameras (134A, 134B, 134C, 136A, 136B, and 136C) to allow for the cameras to obtain data, a method includes operations of receiving climatic conditions data from the climatic sensor 144; using the processor 106 to activate the first inner shell motorized linear actuator 152A to move the first sensor enclosure inner shell 146A from the first sensor enclosure inner shell closed position to the first sensor enclosure inner shell open position and activate the first LiDAR cap motorized linear actuator 156A to move the first sensor enclosure LiDAR cap 155A from the first sensor enclosure cap closed position to the first sensor enclosure cap open position based on the received climatic conditions data; and using the processor to activate the second inner shell motorized linear actuator 152B to move the second sensor enclosure inner shell 146B from the second sensor enclosure inner shell closed position to the second sensor enclosure inner shell open position and activate the second LiDAR cap motorized linear actuator 156B to move the second sensor enclosure LiDAR cap 155B from the second sensor enclosure cap closed position to the second sensor enclosure cap open position based on the received climatic conditions data.
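The two precipitation-driven methods above amount to a single open/close rule applied identically to both enclosures. The following is a minimal sketch; the threshold value and all names are illustrative assumptions and do not appear in the description above.

```python
# Illustrative open/close logic driven by the climatic sensor: inner shells and
# LiDAR caps close while precipitation meets the programmed threshold and
# re-open once the precipitation event ends. The threshold value is an assumption.
PRECIP_THRESHOLD = 0.5  # assumed units: mm/hr, programmed into the computing system

def shells_and_caps_open(precip_rate: float) -> bool:
    """True -> actuators drive the inner shells and LiDAR caps to the open
    (exposed) position; False -> drive them to the closed (protected) position."""
    return precip_rate < PRECIP_THRESHOLD
```

In practice the same boolean would be fanned out to all four actuators (152A, 152B, 156A, 156B) so that cameras and LiDAR sensors are protected or exposed together.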


The railroad corridor assessment system 100 also preferably includes a motion sensor 157 for detecting motion of the railroad corridor assessment platform 102. Depending on data received from the motion sensor 157 by the computing device 104, the processor 106 preferably (1) activates the first inner shell motorized linear actuator 152A to move the first sensor enclosure inner shell 146A to the first sensor enclosure inner shell open position (see FIG. 9) and activates the first LiDAR cap motorized linear actuator 156A to move the first sensor enclosure LiDAR cap 155A to the first sensor enclosure cap open position (see FIG. 3A) when the railroad corridor assessment platform 102 is moving and (2) activates the first inner shell motorized linear actuator 152A to move the first sensor enclosure inner shell 146A to the first sensor enclosure inner shell closed position (see FIG. 5 and FIG. 6) and activates the first LiDAR cap motorized linear actuator 156A to move the first sensor enclosure LiDAR cap 155A to the first sensor enclosure cap closed position (see FIG. 3B) when the railroad corridor assessment platform 102 stops moving. Similarly, the processor 106 also preferably activates the second inner shell motorized linear actuator 152B to move the second sensor enclosure inner shell 146B to the second sensor enclosure inner shell open position and activates the second LiDAR cap motorized linear actuator 156B to move the second sensor enclosure LiDAR cap 155B to the second sensor enclosure cap open position (see FIG. 3A) when the railroad corridor assessment platform 102 is moving and activates the second inner shell motorized linear actuator 152B to move the second sensor enclosure inner shell 146B to the second sensor enclosure inner shell closed position and activates the second LiDAR cap motorized linear actuator 156B to move the second sensor enclosure LiDAR cap 155B to the second sensor enclosure cap closed position when the railroad corridor assessment platform 102 stops moving. The motion sensor 157 is broadly defined as any device providing data to the processor 106 indicating that the railroad corridor assessment platform 102 is moving or has stopped moving.


As indicated above, the processor 106 controls operations of the first inner shell motorized linear actuator 152A and the second inner shell motorized linear actuator 152B. In some embodiments, the processor performs a method for protecting the high-resolution cameras (134 and 136). The method includes operations of receiving a motion sensor signal from the motion sensor 157 indicating that the railroad corridor assessment platform 102 has stopped moving relative to a railroad track or is moving below a minimum speed threshold programmed into the computing system 104; using the processor 106 to activate the first inner shell motorized linear actuator 152A to move the first sensor enclosure inner shell 146A from the first sensor enclosure inner shell open position to the first sensor enclosure inner shell closed position based on the received motion sensor signal; and using the processor 106 to activate the second inner shell motorized linear actuator 152B to move the second sensor enclosure inner shell 146B from the second sensor enclosure inner shell open position to the second sensor enclosure inner shell closed position based on the received motion sensor signal. These steps are usually performed if the railroad corridor assessment platform 102 has ceased moving along a railroad track. If, on the other hand, the railroad corridor assessment platform 102 starts moving from a stalled or stopped state, in some embodiments, the processor performs a method for exposing the high-resolution cameras (134 and 136) so that they can gather data. 
The method includes operations of receiving a motion sensor signal from the motion sensor 157 indicating that the railroad corridor assessment platform 102 has started moving relative to a railroad track at or above a minimum speed threshold programmed into the computing system 104; using the processor 106 to activate the first inner shell motorized linear actuator 152A to move the first sensor enclosure inner shell 146A from the first sensor enclosure inner shell closed position to the first sensor enclosure inner shell open position based on the received motion sensor signal; and using the processor 106 to activate the second inner shell motorized linear actuator 152B to move the second sensor enclosure inner shell 146B from the second sensor enclosure inner shell closed position to the second sensor enclosure inner shell open position based on the received motion sensor signal.
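The motion-based methods above reduce to a speed comparison against the programmed minimum threshold. A minimal sketch follows; the threshold value and names are illustrative assumptions.

```python
# Illustrative motion-based rule: expose the cameras and LiDAR sensors while
# the platform moves at or above a programmed minimum speed, and conceal them
# when it stops or drops below that speed. The threshold value is an assumption.
MIN_SPEED_MPH = 5.0  # assumed minimum speed threshold programmed into the system

def exposed_while_moving(speed_mph: float) -> bool:
    """True -> inner shells and LiDAR caps driven to their open positions;
    False -> driven to their closed (protected) positions."""
    return speed_mph >= MIN_SPEED_MPH
```

A stopped platform (speed 0) therefore always yields the closed position, matching the protection behavior described above.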


As indicated above, the railroad corridor assessment system 100 and other systems like it are vulnerable to extreme weather conditions and flying dirt and debris in disturbed air created by a consist moving at speed along a rail corridor. The features described above regarding the first sensor enclosure 116A and the second sensor enclosure 116B address some weather concerns. In addition to these features, the railroad corridor assessment system 100 includes a temperature sensor 158 on the railroad corridor assessment platform 102 in electrical communication with the computing system 104 and proximate to the first sensor enclosure 116A and the second sensor enclosure 116B. In some embodiments there are separate temperature sensors including a first temperature sensor 158A in the first sensor enclosure 116A and a second temperature sensor 158B in the second sensor enclosure 116B. The railroad corridor assessment system 100 preferably includes a heating and cooling system 160 in electrical communication with the computing system 104. The heating and cooling system 160 preferably includes an air blower 162, a heater 164 for heating air blown from the air blower 162, an air chiller 166 for cooling air blown from the air blower 162, and an air duct 168 in fluid communication with the heating and cooling system 160 and the sensor enclosures 116 for channeling air from the air blower 162 to the first sensor enclosure 116A and the second sensor enclosure 116B. Preferably, the heater 164 and the chiller 166 are combined into a single combination heater/chiller 167 such as, for example, a Peltier thermoelectric heating and cooling device capable of providing heating or cooling depending on electrical control signals received by such device from the processor 106. 
Depending on temperature readings from the first temperature sensor 158A and the second temperature sensor 158B sent to the computing system 104, the processor 106 can be programmed to activate the air blower 162 in addition to either the heating or cooling function of the combination heater/chiller 167, depending on whether the sensor enclosures 116 need to be heated or cooled. If temperatures in the sensor enclosures 116 are within an acceptable range, the processor 106 optionally can activate only the air blower 162 to circulate air to and through the sensor enclosures 116. The climate control features allow the system 100 to operate in extreme weather and climate conditions.


The sensor enclosures 116 each include a sensor enclosure outer cap (170A and 170B) and the LiDAR sensor caps (155A and 155B) shown in FIG. 3A, FIG. 3B and FIG. 4. The sensor enclosure outer cap design includes a plurality of cap apertures 172 through which blown air from the air blower 162 exits the enclosures 116 and is used as an air curtain to prevent dirt, debris or precipitation from interfering with the lenses of the LiDAR sensors 108. These features also allow warm air from the air blower 162, heated by the heater 164, to exit the sensor enclosures 116 to melt snow and ice that would otherwise form around the LiDAR sensors 108 and digital cameras (134 and 136) in cold weather. As such, if the motion sensor 157 communicates to the computing system 104 that the railroad corridor assessment platform 102 is moving, the processor can be programmed to automatically activate the air blower 162 so that air will flow out the cap apertures 172 to blow away flying dirt and debris. In one embodiment, the processor 106 performs a method for regulating air temperature in the first sensor enclosure 116A and the second sensor enclosure 116B, and the method includes the operations of receiving temperature data from the temperature sensor 158, activating the air blower 162, and activating the heater 164 or the chiller 166 (or, in the case of a single device, the combination heater/chiller 167) based on the received temperature data. If the temperature in the first sensor enclosure 116A and the second sensor enclosure 116B rises above an upper threshold programmed into the computing system 104, the processor 106 can be programmed to automatically activate the air blower 162 and the combination heater/chiller 167 to provide cool air. 
If, on the other hand, the temperature in the first sensor enclosure 116A and the second sensor enclosure 116B falls below a lower threshold programmed into the computing system 104, the processor 106 can be programmed to automatically activate the air blower 162 and the combination heater/chiller 167 to provide heated air.
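The two-threshold temperature regulation described above can be sketched as a simple mode selection. This is an illustrative assumption-laden sketch, not the patented implementation; the threshold values and names are hypothetical.

```python
# Illustrative sketch of the enclosure temperature regulation: the blower is
# run to circulate air, and the combination heater/chiller is commanded to heat
# or chill only when a programmed threshold is crossed. Thresholds are assumptions.
LOWER_C = 5.0    # assumed lower threshold programmed into the computing system
UPPER_C = 40.0   # assumed upper threshold

def hvac_command(enclosure_temp_c: float) -> tuple[bool, str]:
    """Return (blower_on, heater/chiller mode) for a temperature reading."""
    if enclosure_temp_c > UPPER_C:
        return True, "chill"  # blower plus chiller: provide cool air
    if enclosure_temp_c < LOWER_C:
        return True, "heat"   # blower plus heater: warm air melts snow and ice
    return True, "off"        # in range: circulate air through the enclosures only
```

Running the blower even in the "off" mode mirrors the option described above of circulating unconditioned air through the enclosures when temperatures are acceptable.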


With reference to FIGS. 3 and 5-8, each sensor enclosure 116 includes a mounting plate (174A and 174B) for mounting the enclosures 116 to the back wall 114A of the railroad boxcar 110. Each enclosure 116 also includes an internal frame member 176 (first internal frame member 176A and second internal frame member 176B) configured to be attached on the mounting plate 174 (first mounting plate 174A and second mounting plate 174B) along a first plane substantially parallel with the rear wall 114A of the railroad boxcar 110. The internal frame member 176 can be swiveled to a new position by loosening attachment devices (e.g., bolts, screws, or other attachment devices known to persons having ordinary skill in the art), rotating the internal frame member 176 and any LiDAR sensor attached thereto, and retightening the attachment devices with the internal frame member 176 reoriented at a different angle. For each separate enclosure, a LiDAR sensor (e.g., first LiDAR sensor 108A) is attached to an internal frame member (e.g., first internal frame member 176A) in a hinged configuration as shown so that the LiDAR sensors 108 can rotate relative to the internal frame members 176 in the enclosures 116 about an X axis as shown in FIG. 11A and FIG. 11B. For reference, the axes (X, Y, and Z) in a standard Cartesian coordinate system as defined herein are shown in FIG. 11A, FIG. 11B, FIG. 12A, and FIG. 12B with the non-listed axis being orthogonal to the drawing sheet. For example, in FIG. 11A and FIG. 11B, the X axis is orthogonal to the drawing sheet and comes out of the page at point 178. In FIG. 12A and FIG. 12B, the Z axis is orthogonal to the drawing sheet and comes out of the page at point 180. FIG. 12A and FIG. 12B show a view along a Y,Z oriented plane of the first sensor enclosure 116A with the first sensor enclosure outer shell wall 148A and the first sensor enclosure outer cap 170A removed to show internal parts. 
The range of rotation of the first LiDAR sensor 108A about the X axis preferably ranges from about +23 degrees (shown as angle α in FIG. 11A) to about −23 degrees (shown as angle β in FIG. 11B). The range of rotation of the first LiDAR sensor about the Z axis preferably ranges from about +32 degrees (shown as angle θ in FIG. 12A) to about −32 degrees (shown as angle ω in FIG. 12B). If the LiDAR sensors 108 are not rotated at all, the first scan plane 120A and the second scan plane 120B overlap completely (are coplanar at all points) along the X,Y plane shown in FIG. 13 (or a plane parallel to the X,Y plane). However, the first LiDAR sensor 108A and the second LiDAR sensor 108B are oriented in the manner shown in FIGS. 5, 6, and 12A-15. Because of the unique orientations of the first LiDAR sensor 108A and the second LiDAR sensor 108B described and shown herein, a full 360-degree scan to gather point cloud data can be accomplished using the railroad corridor assessment system 100 between the rear wall 114A of the railroad boxcar 110 and a front wall of an adjoining railroad boxcar if another boxcar is adjoined to the railroad boxcar. As shown in FIG. 13 and FIG. 14, despite the example adjoining boxcar 122 being coupled very close to the railroad boxcar 110 and the LiDAR sensors 108 along the back wall 114A of the railroad boxcar, the first scan plane 120A and the second scan plane 120B have clear “views” to gather point cloud data without the adjoining boxcar 122 interfering with such scans.


The previously described embodiments of the present disclosure have many advantages, including gathering LiDAR point cloud data from a standard-sized railroad boxcar that can easily be added to or removed from a consist. The specific orientation of the LiDAR sensors 108 allows for the gathering of point cloud data even if another boxcar is coupled directly behind the railroad corridor assessment platform 102. The LiDAR sensor enclosures 116 are situated below a roofline of the railroad corridor assessment platform 102. Point clouds can be assessed in real-time by the processor 106 to identify and inventory various features along a rail corridor such as signage, furniture, adjoining tracks, and Positive Train Control (PTC) assets. The processor can use the generated point cloud(s) to measure ballast profiles, measure drainage ditch profiles, and identify and measure embankments and tunnel walls. If the same survey path is run more than once, on the later survey(s), the processor 106 can be programmed to detect changes in the features that were previously detected in the point clouds of a prior survey by comparing the new and old point clouds. The point cloud data is geo-referenced and time stamped to correlate such data with new or additional data.


An additional advantage is the use of a plurality of digital cameras to gather a 360-degree panoramic ribbon digital image of a rail corridor. The image data is preferably time-stamped and can be combined with the LiDAR point cloud data to add color to the point cloud(s). The sensitive cameras are protected in the sensor enclosures 116 which automatically open or close depending on (1) the weather and (2) whether the railroad corridor assessment platform 102 is moving at a minimum speed. The heating and cooling system 160 provides temperature-controlled air to the sensor enclosures 116 so that the system 100 can keep operating even in extreme heat or cold. Additionally, the enclosures 116 include cap apertures through which the temperature-controlled air can exit the enclosures 116 and, while exiting, act like an air curtain to blow away any flying dust or debris from the devices in the enclosures 116.


The foregoing description of preferred embodiments of the present disclosure has been presented for purposes of illustration and description. The described preferred embodiments are not intended to be exhaustive or to limit the scope of the disclosure to the precise form(s) disclosed. For example, various features said to be in electrical communication with one another may be communicating wirelessly and powered locally by batteries or other power sources. Obvious modifications or variations are possible in light of the above teachings. The embodiments are chosen and described in an effort to provide the best illustrations of the principles of the disclosure and its practical application, and to thereby enable one of ordinary skill in the art to utilize the concepts revealed in the disclosure in various embodiments and with various modifications as are suited to the particular use contemplated. All such modifications and variations are within the scope of the disclosure as determined by the appended claims when interpreted in accordance with the breadth to which they are fairly, legally, and equitably entitled.


Any element in a claim that does not explicitly state “means for” performing a specified function, or “step for” performing a specific function, is not to be interpreted as a “means” or “step” clause as specified in 35 U.S.C. § 112, ¶6. In particular, the use of “step of” in the claims herein is not intended to invoke the provisions of 35 U.S.C. § 112, ¶6.

Claims
  • 1. A system for generating and interpreting point clouds of a rail corridor along a survey path while moving on a railroad corridor assessment platform, the system comprising: a. a railroad corridor assessment platform;b. a first LiDAR sensor configured to scan along a first scan plane, the first LiDAR sensor attached to the railroad corridor assessment platform;c. a second LiDAR sensor configured to scan along a second scan plane, the second LiDAR sensor attached to the railroad corridor assessment platform;d. a first sensor enclosure housing for protecting the first LiDAR sensor, the first sensor enclosure further comprising a first sensor enclosure LiDAR sensor cap configured to move from a first sensor enclosure LiDAR sensor cap open position in which the first LiDAR sensor is exposed to a first sensor enclosure LiDAR sensor cap closed position in which the first LiDAR sensor is not exposed, and from the first sensor enclosure LiDAR sensor cap closed position to the first sensor enclosure LiDAR sensor cap open position; ande. a second sensor enclosure housing for protecting the second LiDAR sensor, the second sensor enclosure further comprising a second sensor enclosure LiDAR sensor cap configured to move from a second sensor enclosure LiDAR sensor cap open position in which the second LiDAR sensor is exposed to a second sensor enclosure LiDAR sensor cap closed position in which the second LiDAR sensor is not exposed, and from the second sensor enclosure LiDAR sensor cap closed position to the second sensor enclosure LiDAR sensor cap open position.
  • 2. The system for generating and interpreting point clouds of a rail corridor of claim 1 further comprising: a. a first sensor enclosure further comprising: i. a first sensor enclosure outer shell comprising a first sensor enclosure outer shell first aperture and a first sensor enclosure outer shell second aperture;ii. a first sensor enclosure first high resolution camera in electrical communication with the computing system, the first sensor enclosure first high resolution camera oriented to view from the inside of the first sensor enclosure through the first sensor enclosure outer shell first aperture to gather digital image data of a rail corridor; andiii. a first sensor enclosure second high resolution camera in electrical communication with the computing system, the first sensor enclosure second high resolution camera oriented to view from the inside of the first sensor enclosure through the first sensor enclosure outer shell second aperture to gather digital image data of a rail corridor; andb. a second sensor enclosure further comprising: i. a second sensor enclosure outer shell comprising a second sensor enclosure outer shell first aperture and a second sensor outer shell second aperture;ii. a second sensor enclosure first high resolution camera in electrical communication with the computing system, the second sensor enclosure first high resolution camera oriented to view from the inside of the second sensor enclosure through the second sensor enclosure outer shell first aperture to gather digital image data of a rail corridor; andiii. a second sensor enclosure second high resolution camera in electrical communication with the computing system, the second sensor enclosure second high resolution camera oriented to view from the inside of the second sensor enclosure through the second sensor enclosure outer shell second aperture to gather digital image data of a rail corridor.
  • 3. The system for generating and interpreting point clouds of a rail corridor of claim 2 further comprising: a. a wheel mounted shaft encoder for sending trigger signals to the first sensor enclosure first high resolution camera, the first sensor enclosure second high resolution camera, the second sensor enclosure first high resolution camera, and the second sensor enclosure second high resolution camera as the railroad corridor assessment platform moves along a survey path;b. a high-performance processor wherein the processor controls operations of the first sensor enclosure first high resolution camera, the first sensor enclosure second high resolution camera, the second sensor enclosure first high resolution camera, and the second sensor enclosure second high resolution camera, and wherein the processor performs a method for generating and interpreting digital image data, the method comprising operations of: i. receiving pulses from the shaft encoder and triggering the first sensor enclosure first high resolution camera, the first sensor enclosure second high resolution camera, the second sensor enclosure first high resolution camera, and the second sensor enclosure second high resolution camera to obtain digital image data at the same time instances;ii. obtaining a first set of digital image data using the first sensor enclosure first high resolution camera;iii. obtaining a second set of digital image data using the first sensor enclosure second high resolution camera;iv. obtaining a third set of digital image data using the second sensor enclosure first high resolution camera;v. obtaining a fourth set of digital image data using the second sensor enclosure second high resolution camera;vi. 
combining the first set of digital image data, the second set of digital image data, the third set of digital image data, and the fourth set of digital image data to form a combined set of digital image data comprising a plurality of digital images and generating a combined panoramic digital image of the rail corridor; andvii. storing the combined set of digital image data on the data storage device.
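The encoder-triggered capture recited in claim 3 can be sketched in code. This is an illustrative sketch only, not the patented implementation: the class name `SurveyCapture`, the `pulses_per_frame` parameter, and the side-by-side "panorama" stand-in are all assumptions introduced here for clarity.

```python
# Hypothetical sketch of claim 3's capture loop: a wheel-mounted shaft encoder
# emits pulses as the platform moves; at a fixed pulse interval, all four
# high-resolution cameras are triggered at the same time instance and the four
# frames are combined into one record. All names are illustrative.

class SurveyCapture:
    def __init__(self, cameras, pulses_per_frame):
        self.cameras = cameras              # four camera trigger callables
        self.pulses_per_frame = pulses_per_frame
        self.pulse_count = 0
        self.records = []                   # combined (panoramic) records

    def on_encoder_pulse(self):
        """Called once per shaft-encoder pulse."""
        self.pulse_count += 1
        if self.pulse_count % self.pulses_per_frame == 0:
            # Trigger every camera for the same instance, then combine.
            frames = [cam() for cam in self.cameras]
            self.records.append(self.combine(frames))

    @staticmethod
    def combine(frames):
        # Placeholder for panorama stitching: keep the four frames together.
        return tuple(frames)

# Usage: four stub "cameras" that each return a labeled frame.
cams = [lambda i=i: f"cam{i}_frame" for i in range(4)]
capture = SurveyCapture(cams, pulses_per_frame=10)
for _ in range(25):                         # 25 pulses -> 2 capture events
    capture.on_encoder_pulse()
print(len(capture.records))                 # 2
```

Keying capture to encoder pulses rather than a clock gives a constant sampling interval in distance along the survey path, independent of platform speed.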
  • 4. The system for generating and interpreting point clouds of a rail corridor of claim 3 further comprising: a. the first sensor enclosure further comprising: i. a first sensor enclosure inner shell comprising a first sensor enclosure inner shell first aperture and a first sensor enclosure inner shell second aperture, wherein the first sensor enclosure inner shell is configured to move relative to the first sensor enclosure outer shell from a first sensor enclosure inner shell open position wherein the first sensor enclosure outer shell first aperture is in line with the first sensor enclosure inner shell first aperture and the first sensor enclosure outer shell second aperture is in line with the first sensor enclosure inner shell second aperture to a first sensor enclosure inner shell closed position wherein the first sensor enclosure outer shell first aperture is not in line with the first sensor enclosure inner shell first aperture and the first sensor enclosure outer shell second aperture is not in line with the first sensor enclosure inner shell second aperture resulting in (1) the first sensor enclosure outer shell first aperture being blocked by the first sensor enclosure inner shell to protect the first sensor enclosure first high resolution camera and (2) the first sensor enclosure outer shell second aperture being blocked by the first sensor enclosure inner shell to protect the first sensor enclosure second high resolution camera; andii. a first inner shell motorized linear actuator in electrical communication with the computing system and connected to the first sensor enclosure inner shell for moving the first sensor enclosure inner shell from the first sensor enclosure inner shell open position to the first sensor enclosure inner shell closed position or from the first sensor enclosure inner shell closed position to the first sensor enclosure inner shell open position depending upon control instructions from the computing system; andiii. 
a first LiDAR cap motorized actuator in electrical communication with the computing system and connected to the first sensor enclosure LiDAR sensor cap for moving the first sensor enclosure LiDAR sensor cap from the first sensor enclosure LiDAR sensor cap closed position to the first sensor enclosure LiDAR sensor cap open position or from the first sensor enclosure LiDAR sensor cap open position to the first sensor enclosure LiDAR sensor cap closed position depending on control instructions from the computing system; andb. the second sensor enclosure further comprising i. a second sensor enclosure inner shell comprising a second sensor enclosure inner shell first aperture and a second sensor enclosure inner shell second aperture, wherein the second sensor enclosure inner shell is configured to move relative to the second sensor enclosure outer shell from a second sensor enclosure inner shell open position wherein the second sensor enclosure outer shell first aperture is in line with the second sensor enclosure inner shell first aperture and the second sensor enclosure outer shell second aperture is in line with the second sensor enclosure inner shell second aperture to a second sensor enclosure inner shell closed position wherein the second sensor enclosure outer shell first aperture is not in line with the second sensor enclosure inner shell first aperture and the second sensor enclosure outer shell second aperture is not in line with the second sensor enclosure inner shell second aperture resulting in (1) the second sensor enclosure outer shell first aperture being blocked by the second sensor enclosure inner shell to protect the second sensor enclosure first high resolution camera and (2) the second sensor enclosure outer shell second aperture being blocked by the second sensor enclosure inner shell to protect the second sensor enclosure second high resolution camera; andii. 
a second inner shell motorized linear actuator in electrical communication with the computing system and connected to the second sensor enclosure inner shell for moving the second sensor enclosure inner shell from the second sensor enclosure inner shell open position to the second sensor enclosure inner shell closed position or from the second sensor enclosure inner shell closed position to the second sensor enclosure inner shell open position depending upon control instructions from the computing system; andiii. a second LiDAR cap motorized actuator in electrical communication with the computing system and connected to the second sensor enclosure LiDAR sensor cap for moving the second sensor enclosure LiDAR sensor cap from the second sensor enclosure LiDAR sensor cap closed position to the second sensor enclosure LiDAR sensor cap open position or from the second sensor enclosure LiDAR sensor cap open position to the second sensor enclosure LiDAR sensor cap closed position depending on control instructions from the computing system.
  • 5. The system for generating and interpreting point clouds of a rail corridor of claim 4 further comprising: a. a climatic sensor on the railroad corridor assessment platform, the climatic sensor in electrical communication with the high performance processor; andb. the high-performance processor wherein the processor controls operations of the first inner shell motorized linear actuator and the second inner shell motorized linear actuator, and wherein the processor performs a method for protecting the first sensor enclosure first high resolution camera, the first sensor enclosure second high resolution camera, the second sensor enclosure first high resolution camera, and the second sensor enclosure second high resolution camera, the method comprising operations of: i. receiving climatic conditions data from the climatic sensor;ii. activating the first inner shell motorized linear actuator to move the first sensor enclosure inner shell from the first sensor enclosure inner shell open position to the first sensor enclosure inner shell closed position based on the received climatic conditions data;iii. activating the second inner shell motorized linear actuator to move the second sensor enclosure inner shell from the second sensor enclosure inner shell open position to the second sensor enclosure inner shell closed position based on the received climatic conditions data;iv. activating the first LiDAR cap motorized actuator to move the first sensor enclosure LiDAR cap from the first sensor enclosure cap open position to the first sensor enclosure cap closed position; andv. activating the second LiDAR cap motorized actuator to move the second sensor enclosure LiDAR cap from the second sensor enclosure cap open position to the second sensor enclosure cap closed position.
  • 6. The system for generating and interpreting point clouds of a rail corridor of claim 5 further comprising: a. a motion sensor for sensing motion of the railroad corridor assessment platform, the motion sensor in electrical communication with the high performance processor; andb. the high-performance processor wherein the processor controls operations of the first inner shell motorized linear actuator and the second inner shell motorized linear actuator, and wherein the processor performs a method for protecting the first sensor enclosure first high resolution camera, the first sensor enclosure second high resolution camera, the second sensor enclosure first high resolution camera, and the second sensor enclosure second high resolution camera, the method comprising operations of: i. receiving a motion sensor signal from the motion sensor indicating that the railroad corridor assessment platform is moving relative to a railroad track below a minimum speed threshold programmed into the computing system;ii. activating the first inner shell motorized linear actuator to move the first sensor enclosure inner shell from the first sensor enclosure inner shell open position to the first sensor enclosure inner shell closed position based on the received motion sensor signal;iii. activating the second inner shell motorized linear actuator to move the second sensor enclosure inner shell from the second sensor enclosure inner shell open position to the second sensor enclosure inner shell closed position based on the received motion sensor signal;iv. activating the first LiDAR cap motorized actuator to move the first sensor enclosure LiDAR cap from the first sensor enclosure cap open position to the first sensor enclosure cap closed position; andv. activating the second LiDAR cap motorized actuator to move the second sensor enclosure LiDAR cap from the second sensor enclosure cap open position to the second sensor enclosure cap closed position.
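Claims 5 and 6 both drive the same protective hardware (inner shells and LiDAR caps) from different triggers: climatic conditions and low platform speed. A minimal decision sketch, assuming a hypothetical boolean weather flag and speed threshold not specified in the claims:

```python
# Sketch of the protection logic in claims 5 and 6: close the inner shells and
# lower the LiDAR caps when the climatic sensor reports precipitation or the
# platform falls below a minimum speed threshold. The threshold value and the
# boolean precipitation flag are illustrative assumptions.

MIN_SPEED_KPH = 5.0   # hypothetical programmed minimum speed threshold

def enclosures_should_close(precipitation_detected: bool, speed_kph: float) -> bool:
    """True when the shells and caps should move to their closed positions."""
    if precipitation_detected:        # claim 5: climatic conditions data
        return True
    if speed_kph < MIN_SPEED_KPH:     # claim 6: below minimum speed threshold
        return True
    return False

def drive_actuators(actuators, close: bool):
    """Command every protection actuator to the requested position (stubbed)."""
    return {name: ("closed" if close else "open") for name in actuators}

actuators = ["first_inner_shell", "second_inner_shell",
             "first_lidar_cap", "second_lidar_cap"]
state = drive_actuators(actuators, enclosures_should_close(False, 2.0))
print(state["first_lidar_cap"])   # closed
```

The point of the OR structure is that either trigger alone is sufficient to seal the optics: a stopped platform near passing traffic is as hazardous to exposed lenses as rain.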
  • 7. The system for generating and interpreting point clouds of a rail corridor of claim 1 further comprising: a. a temperature sensor on the railroad corridor assessment platform in electrical communication with the computing system and proximate to the first sensor enclosure and the second sensor enclosure;b. a heating and cooling system in electrical communication with the computing system, the heating and cooling system further comprising: i. an air blower;ii. a heater for heating air blown from the air blower;iii. an air chiller for cooling air blown from the air blower; andiv. an air duct for channeling air from the air blower to the first sensor enclosure and the second sensor enclosure; andc. the high-performance processor wherein the processor controls operations of the heating and cooling system, and wherein the processor performs a method for regulating air temperature in the first sensor enclosure and the second sensor enclosure, the method comprising operations of: i. receiving temperature data from the temperature sensor;ii. activating the air blower; andiii. activating the heater or the air chiller based on the received temperature data.
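The temperature-regulation method of claim 7 (read temperature, run the blower, select heater or chiller) can be sketched as a simple deadband controller. The setpoint and deadband values below are assumptions for illustration; the claims specify no numbers:

```python
# Sketch of claim 7's regulation loop: the processor reads the temperature
# sensor near the enclosures, keeps the air blower running, and activates the
# heater or the air chiller. Setpoint and deadband are hypothetical values.

SETPOINT_C = 20.0
DEADBAND_C = 3.0

def hvac_command(temp_c: float) -> dict:
    """Return blower/heater/chiller activation for one control cycle."""
    cmd = {"blower": True, "heater": False, "chiller": False}
    if temp_c < SETPOINT_C - DEADBAND_C:
        cmd["heater"] = True      # push warm air through the duct
    elif temp_c > SETPOINT_C + DEADBAND_C:
        cmd["chiller"] = True     # push cooled air through the duct
    return cmd

print(hvac_command(-5.0))   # heater active
print(hvac_command(35.0))   # chiller active
print(hvac_command(21.0))   # blower only, inside the deadband
```

A deadband between the heating and cooling thresholds prevents the heater and chiller from chattering on and off around a single setpoint.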
  • 8. The system for generating and interpreting point clouds of a rail corridor of claim 7 further comprising: a. the first sensor enclosure further comprising at least one first sensor air aperture through which air can be directed across the first LiDAR sensor; andb. the second sensor enclosure further comprising at least one second sensor air aperture through which air can be directed across the second LiDAR sensor.
  • 9. A method for generating and interpreting point clouds of a rail corridor, the method comprising: a. obtaining a first set of point cloud data using a processor and a first LiDAR sensor oriented to scan along a first scan plane and attached to a railroad corridor assessment platform wherein the first LiDAR sensor is in electrical communication with the processor, and wherein the first LiDAR sensor is housed in a first sensor enclosure further comprising a first sensor enclosure LiDAR cap configured to move from a first sensor enclosure LiDAR sensor cap open position in which the first LiDAR sensor is exposed to a first sensor enclosure LiDAR sensor cap closed position in which the first LiDAR sensor is not exposed, and from the first sensor enclosure LiDAR sensor cap closed position to the first sensor enclosure LiDAR sensor cap open position; andb. obtaining a second set of point cloud data using the processor and a second LiDAR sensor oriented to scan along a second scan plane and attached to the railroad corridor assessment platform in a rear-facing orientation wherein the second LiDAR sensor is in electrical communication with the processor wherein the second LiDAR sensor is housed in a second sensor enclosure further comprising a second sensor enclosure LiDAR cap configured to move from a second sensor enclosure LiDAR sensor cap open position in which the second LiDAR sensor is exposed to a second sensor enclosure LiDAR sensor cap closed position in which the second LiDAR sensor is not exposed, and from the second sensor enclosure LiDAR sensor cap closed position to the second sensor enclosure LiDAR sensor cap open position.
  • 10. The method of claim 9 further comprising: a. receiving pulses from a wheel mounted shaft encoder in electrical communication with the processor and triggering a first sensor enclosure first high resolution camera, a first sensor enclosure second high resolution camera, a second sensor enclosure first high resolution camera, and a second sensor enclosure second high resolution camera to obtain digital image data at the same time instances;b. obtaining a first set of digital image data using the first sensor enclosure first high resolution camera;c. obtaining a second set of digital image data using the first sensor enclosure second high resolution camera;d. obtaining a third set of digital image data using the second sensor enclosure first high resolution camera;e. obtaining a fourth set of digital image data using the second sensor enclosure second high resolution camera;f. combining the first set of digital image data, the second set of digital image data, the third set of digital image data, and the fourth set of digital image data to form a combined set of digital image data comprising a plurality of digital images and generating a combined panoramic digital image of a rail corridor; andg. storing the combined set of digital image data on a data storage device in electrical communication with the processor.
  • 11. The method of claim 10 further comprising colorizing the combined point cloud using the combined set of digital image data and the processor.
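The colorization step of claim 11 assigns each LiDAR point the color of the image pixel it projects onto. A minimal sketch using a pinhole camera model; the intrinsics (`fx`, `fy`, `cx`, `cy`) and the tiny test "image" are assumptions, and a real system would use calibrated camera/LiDAR extrinsics:

```python
# Sketch of point-cloud colorization (claim 11): project each LiDAR point into
# a camera image with a pinhole model and take that pixel's RGB value. The
# intrinsics and the 2x2 test image are illustrative assumptions.

def colorize(points, image, fx, fy, cx, cy):
    """points: (x, y, z) in the camera frame, z forward; image: rows of RGB."""
    h, w = len(image), len(image[0])
    colored = []
    for x, y, z in points:
        if z <= 0:
            continue                      # point is behind the camera
        u = int(fx * x / z + cx)          # pinhole projection to pixel column
        v = int(fy * y / z + cy)          # and pixel row
        if 0 <= u < w and 0 <= v < h:
            colored.append(((x, y, z), image[v][u]))
    return colored

# 2x2 test image: top row red/green, bottom row blue/white.
img = [[(255, 0, 0), (0, 255, 0)],
       [(0, 0, 255), (255, 255, 255)]]
pts = [(0.0, 0.0, 1.0), (0.0, 0.0, -1.0)]   # one in front, one behind
out = colorize(pts, img, fx=1.0, fy=1.0, cx=0.0, cy=0.0)
print(out)   # only the point in front of the camera receives a color
```

Because the cameras and LiDAR are triggered at the same time instances (claim 10), the projection can treat each image and its nearby scan lines as sharing one platform pose.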
  • 12. The method of claim 10 further comprising: a. triggering a first sensor enclosure third high resolution camera and a second sensor enclosure third high resolution camera to obtain digital image data at the same time instances as the first sensor enclosure first high resolution camera, the first sensor enclosure second high resolution camera, the second sensor enclosure first high resolution camera, and the second sensor enclosure second high resolution camera;b. obtaining a fifth set of digital image data using a first sensor enclosure third high resolution camera;c. obtaining a sixth set of digital image data using a second sensor enclosure third high resolution camera;d. combining the first set of digital image data, the second set of digital image data, the third set of digital image data, the fourth set of digital image data, the fifth set of digital image data and the sixth set of digital image data to form a combined set of digital image data comprising a plurality of digital images and generating a combined panoramic digital image of a rail corridor.
  • 13. The method of claim 11 further comprising geo-referencing the colorized combined point cloud using a geo-referencing device.
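Geo-referencing as in claim 13 maps points from the platform frame into map coordinates using the pose reported by the geo-referencing device. A 2D rigid-transform sketch under that assumption; a real system would use a full 3D pose (position plus roll/pitch/yaw):

```python
# Sketch of geo-referencing (claim 13): rotate platform-frame points by the
# platform heading, then translate by its surveyed map position. A 2D rigid
# transform is shown for brevity; the pose values are illustrative.

import math

def georeference(points_local, easting, northing, heading_rad):
    """Map (x, y) platform-frame points into map coordinates."""
    c, s = math.cos(heading_rad), math.sin(heading_rad)
    out = []
    for x, y in points_local:
        gx = c * x - s * y + easting     # rotate, then translate
        gy = s * x + c * y + northing
        out.append((gx, gy))
    return out

# A point 10 m ahead of a platform at map position (500, 200), heading 0.
print(georeference([(10.0, 0.0)], 500.0, 200.0, 0.0))   # [(510.0, 200.0)]
```

Applying the per-scan pose to every point is what turns the two sensors' local scans into a single, seamless corridor point cloud in map coordinates.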
  • 14. The method of claim 12 further comprising: a. housing the first LiDAR sensor, the first sensor enclosure first high resolution camera, the first sensor enclosure second high resolution camera, and the first sensor enclosure third high resolution camera in the first sensor enclosure;b. housing the second LiDAR sensor, the second sensor enclosure first high resolution camera, the second sensor enclosure second high resolution camera, and the second sensor enclosure third high resolution camera in the second sensor enclosure;c. blowing temperature-controlled air to the first sensor enclosure and the second sensor enclosure using a heating and cooling system including an air blower wherein temperature-controlled air is blown through an air duct to the first sensor enclosure and the second sensor enclosure and wherein the heating and cooling system is controlled by the processor based on temperature data received by a temperature sensor proximate to the first LiDAR sensor and the second LiDAR sensor.
  • 15. The method of claim 14 further comprising: a. blowing temperature-controlled air through an aperture in the first sensor enclosure adjacent to the first LiDAR sensor for blowing away flying debris and precipitation from the first LiDAR sensor and to maintain the first sensor enclosure LiDAR cap at a temperature above freezing to eliminate the accumulation of frozen precipitation; andb. blowing temperature-controlled air through an aperture in the second sensor enclosure adjacent to the second LiDAR sensor for blowing away flying debris and precipitation from the second LiDAR sensor and to maintain the second sensor enclosure LiDAR cap at a temperature above freezing to eliminate the accumulation of frozen precipitation.
  • 16. The method of claim 12 further comprising: a. housing the first LiDAR sensor in a first sensor enclosure comprising a first LiDAR cap motorized linear actuator in electrical communication with the computing system and connected to the first sensor enclosure LiDAR sensor cap for moving the first sensor enclosure LiDAR sensor cap from the first sensor enclosure LiDAR sensor cap closed position to the first sensor enclosure LiDAR sensor cap open position or from the first sensor enclosure LiDAR sensor cap open position to the first sensor enclosure LiDAR sensor cap closed position depending on control instructions from the computing system;b. housing the first sensor enclosure first high-resolution camera in the first sensor enclosure comprising: i. a first sensor enclosure outer shell;ii. a first sensor enclosure outer shell first aperture through which the first sensor enclosure first high-resolution camera obtains digital image data;iii. a first sensor enclosure inner shell configured to move relative to the first sensor outer shell from a first sensor enclosure inner shell open position wherein the first sensor enclosure outer shell first aperture is open and the first sensor enclosure first high-resolution camera is exposed to weather outside the first sensor enclosure to a first sensor enclosure inner shell closed position wherein the first sensor enclosure outer shell first aperture is blocked by the first sensor inner shell and the first sensor enclosure first high-resolution camera is not exposed to weather outside the first sensor enclosure;c. housing the first sensor enclosure second high-resolution camera in the first sensor enclosure comprising: i. a first sensor enclosure outer shell second aperture through which the first sensor enclosure second high-resolution camera obtains digital image data;ii. 
the first sensor enclosure inner shell configured to move relative to the first sensor enclosure outer shell from the first sensor enclosure inner shell open position wherein the first sensor enclosure outer shell second aperture is open and the first sensor enclosure second high-resolution camera is exposed to weather outside the first sensor enclosure to the first sensor enclosure inner shell closed position wherein the first sensor enclosure outer shell second aperture is blocked by the first sensor inner shell and the first sensor enclosure second high-resolution camera is not exposed to weather outside the first sensor enclosure;iii. a first inner shell motorized linear actuator connected to the first sensor enclosure inner shell and in electrical communication with the processor for moving the first sensor enclosure inner shell from the first sensor enclosure inner shell open position to the first sensor enclosure inner shell closed position and from the first sensor enclosure inner shell closed position to the first sensor enclosure inner shell open position depending on instructions from the processor;d. housing the second LiDAR sensor in a second sensor enclosure comprising a second LiDAR cap motorized linear actuator in electrical communication with the computing system and connected to the second sensor enclosure LiDAR sensor cap for moving the second sensor enclosure LiDAR sensor cap from the second sensor enclosure LiDAR sensor cap closed position to the second sensor enclosure LiDAR sensor cap open position or from the second sensor enclosure LiDAR sensor cap open position to the second sensor enclosure LiDAR sensor cap closed position depending on control instructions from the computing system;e. housing the second sensor enclosure first high-resolution camera in the second sensor enclosure comprising: i. a second sensor enclosure outer shell;ii. 
a second sensor enclosure outer shell first aperture through which the second sensor enclosure first high-resolution camera obtains digital image data;iii. a second sensor enclosure inner shell configured to move relative to the second sensor outer shell from a second sensor enclosure inner shell open position wherein the second sensor enclosure outer shell first aperture is open and the second sensor enclosure first high-resolution camera is exposed to weather outside the second sensor enclosure to a second sensor inner shell closed position wherein the second sensor enclosure outer shell first aperture is blocked by the second sensor inner shell and the second sensor enclosure first high-resolution camera is not exposed to weather outside the second sensor enclosure; andf. housing the second sensor enclosure second high-resolution camera in the second sensor enclosure comprising: i. a second sensor enclosure outer shell second aperture through which the second sensor enclosure second high-resolution camera obtains digital image data;ii. the second sensor enclosure inner shell configured to move relative to the second sensor enclosure outer shell from the second sensor enclosure inner shell open position wherein the second sensor enclosure outer shell second aperture is open and the second sensor enclosure second high-resolution camera is exposed to weather outside the second sensor enclosure to the second sensor enclosure inner shell closed position wherein the second sensor enclosure outer shell second aperture is blocked by the second sensor inner shell and the second sensor enclosure second high-resolution camera is not exposed to weather outside the second sensor enclosure; andiii. 
a second inner shell motorized linear actuator connected to the second sensor enclosure inner shell and in electrical communication with the processor for moving the second sensor enclosure inner shell from the second sensor enclosure inner shell open position to the second sensor enclosure inner shell closed position and from the second sensor enclosure inner shell closed position to the second sensor enclosure inner shell open position depending on instructions from the processor.
  • 17. The method of claim 16 further comprising: a. detecting weather conditions outside the first sensor enclosure and the second sensor enclosure using a climatic sensor in electrical communication with the processor;b. activating the first inner shell motorized linear actuator to move the first sensor enclosure inner shell from the first sensor enclosure inner shell open position to the first sensor enclosure inner shell closed position based on information received by the processor from the climatic sensor;c. activating the second inner shell motorized linear actuator to move the second sensor enclosure inner shell from the second sensor enclosure inner shell open position to the second sensor enclosure inner shell closed position based on information received by the processor from the climatic sensor;d. activating the first LiDAR cap motorized linear actuator to move the first sensor enclosure LiDAR sensor cap from the first sensor enclosure LiDAR sensor cap open position to the first sensor enclosure LiDAR sensor cap closed position; ande. activating the second LiDAR cap motorized linear actuator to move the second sensor enclosure LiDAR sensor cap from the second sensor enclosure LiDAR sensor cap open position to the second sensor enclosure LiDAR sensor cap closed position.
  • 18. The method of claim 16 further comprising: a. detecting movement of the railroad corridor assessment platform using a motion sensor;b. activating the first inner shell motorized linear actuator to move the first sensor enclosure inner shell from the first sensor enclosure inner shell open position to the first sensor enclosure inner shell closed position based on information received by the processor from the motion sensor;c. activating the second inner shell motorized linear actuator to move the second sensor enclosure inner shell from the second sensor enclosure inner shell open position to the second sensor enclosure inner shell closed position based on information received by the processor from the motion sensor;d. activating the first LiDAR cap motorized actuator to move the first sensor enclosure LiDAR sensor cap from the first sensor enclosure LiDAR sensor cap open position to the first sensor enclosure LiDAR sensor cap closed position; ande. activating the second LiDAR cap motorized actuator to move the second sensor enclosure LiDAR sensor cap from the second sensor enclosure LiDAR sensor cap open position to the second sensor enclosure LiDAR sensor cap closed position.
  • 19. The system for generating and interpreting point clouds of a rail corridor of claim 1 further comprising: a. a temperature sensor in electrical communication with the processor and located proximate to the first LiDAR sensor and the second LiDAR sensor;b. a heating and cooling system in electrical communication with the processor, the heating and cooling system further comprising: i. an air blower;ii. a heater for heating air blown from the air blower;iii. an air chiller for cooling air blown from the air blower; andiv. a duct for channeling air from the air blower to the first sensor enclosure and the second sensor enclosure depending on temperature data sent by the temperature sensor to the computing system.
CROSS-REFERENCE(S) TO RELATED APPLICATION(S)

This application is a continuation of and claims priority to U.S. Nonprovisional patent application Ser. No. 17/076,899 entitled “SYSTEM AND METHOD FOR GENERATING AND INTERPRETING POINT CLOUDS OF A RAIL CORRIDOR ALONG A SURVEY PATH,” which was filed on Oct. 22, 2020, which is a continuation of and claims priority to U.S. Nonprovisional patent application Ser. No. 16/876,484 entitled “SYSTEM AND METHOD FOR GENERATING AND INTERPRETING POINT CLOUDS OF A RAIL CORRIDOR ALONG A SURVEY PATH,” which was filed on May 18, 2020, which is a nonprovisional application claiming priority to (1) U.S. Provisional Patent Application No. 62/848,630, invented by Darel Mesher and entitled “Autonomous Track Assessment System,” which was filed on May 16, 2019; (2) U.S. Provisional Patent Application No. 62/988,630, invented by Darel Mesher and entitled “Autonomous Track Assessment System,” which was filed on Mar. 12, 2020; and (3) U.S. Provisional Patent Application No. 63/016,661, invented by Darel Mesher and entitled “Autonomous Track Assessment System,” which was filed on Apr. 28, 2020, the entireties of which are incorporated herein by reference.

7357326 Hattersley et al. Apr 2008 B2
7392117 Bilodeau et al. Jun 2008 B1
7392595 Heimann Jul 2008 B2
7394553 Carr et al. Jul 2008 B2
7403296 Farritor et al. Jul 2008 B2
7412899 Mian et al. Aug 2008 B2
7463348 Chung Dec 2008 B2
7499186 Waisanen Mar 2009 B2
7502670 Harrison Mar 2009 B2
7516662 Nielsen et al. Apr 2009 B2
7555954 Pagano et al. Jul 2009 B2
7564569 Mian et al. Jul 2009 B2
7602937 Mian et al. Oct 2009 B2
7616329 Villar et al. Nov 2009 B2
7659972 Magnus et al. Feb 2010 B2
7680631 Selig et al. Mar 2010 B2
7681468 Verl et al. Mar 2010 B2
7698028 Bilodeau et al. Apr 2010 B1
7755660 Nejikovsky et al. Jul 2010 B2
7755774 Farritor et al. Jul 2010 B2
7769538 Rousseau Aug 2010 B2
7832281 Mian et al. Nov 2010 B2
7869909 Harrison Jan 2011 B2
7882742 Martens Feb 2011 B1
7899207 Mian et al. Mar 2011 B2
7920984 Farritor Apr 2011 B2
7937246 Farritor et al. May 2011 B2
7942058 Turner May 2011 B2
8006559 Mian et al. Aug 2011 B2
8079274 Mian et al. Dec 2011 B2
8081320 Villar et al. Dec 2011 B2
8111387 Douglas et al. Feb 2012 B2
8140250 Mian et al. Mar 2012 B2
8150105 Mian et al. Apr 2012 B2
8155809 Bilodeau et al. Apr 2012 B1
8180590 Szwilski et al. May 2012 B2
8188430 Mian May 2012 B2
8190377 Fu May 2012 B2
8209145 Paglinco et al. Jun 2012 B2
8263953 Fomenkar et al. Sep 2012 B2
8289526 Kilian et al. Oct 2012 B2
8326582 Mian et al. Dec 2012 B2
8335606 Mian et al. Dec 2012 B2
8345948 Zarembski et al. Jan 2013 B2
8352410 Rousselle et al. Jan 2013 B2
8345099 Bloom et al. Feb 2013 B2
8365604 Kahn Feb 2013 B2
8405837 Nagle, II et al. Mar 2013 B2
8412393 Anderson Apr 2013 B2
8418563 Wigh et al. Apr 2013 B2
8423240 Mian Apr 2013 B2
8424387 Wigh et al. Apr 2013 B2
8478480 Mian et al. Jul 2013 B2
8485035 Wigh et al. Jul 2013 B2
8490887 Jones Jul 2013 B2
8514387 Scherf et al. Aug 2013 B2
8577647 Farritor et al. Nov 2013 B2
8615110 Landes Dec 2013 B2
8625878 Haas et al. Jan 2014 B2
8649932 Mian et al. Feb 2014 B2
8655540 Mian et al. Feb 2014 B2
8682077 Longacre, Jr. Mar 2014 B1
8700924 Mian et al. Apr 2014 B2
8711222 Aaron et al. Apr 2014 B2
8724904 Fujiki May 2014 B2
8806948 Kahn et al. Aug 2014 B2
8818585 Bartonek Aug 2014 B2
8820166 Wigh et al. Sep 2014 B2
8868291 Mian et al. Oct 2014 B2
8875635 Turner Nov 2014 B2
8887572 Turner Nov 2014 B2
8903574 Cooper et al. Dec 2014 B2
8925873 Gamache et al. Jan 2015 B2
8934007 Snead Jan 2015 B2
8942426 Bar-am Jan 2015 B2
8958079 Kainer et al. Feb 2015 B2
9036025 Haas et al. May 2015 B2
9049433 Prince Jun 2015 B1
9050984 Li et al. Jun 2015 B2
9111444 Kaganovich Aug 2015 B2
9121747 Mian et al. Sep 2015 B2
9134185 Mian et al. Sep 2015 B2
9175998 Turner Nov 2015 B2
9177210 King Nov 2015 B2
9187104 Fang et al. Nov 2015 B2
9195907 Longacre, Jr. Nov 2015 B1
9205849 Cooper et al. Dec 2015 B2
9205850 Shimada Dec 2015 B2
9212902 Enomoto et al. Dec 2015 B2
9222904 Harrison Dec 2015 B2
9234786 Groll et al. Jan 2016 B2
9255913 Kumar et al. Feb 2016 B2
9297787 Fisk Mar 2016 B2
9310340 Mian et al. Apr 2016 B2
9336683 Inomata et al. May 2016 B2
9340219 Gamache et al. May 2016 B2
9346476 Dargy et al. May 2016 B2
9347864 Farritor et al. May 2016 B2
9389205 Mian et al. Jul 2016 B2
9415784 Bartonek et al. Aug 2016 B2
9423415 Nanba et al. Aug 2016 B2
9429545 Havira et al. Aug 2016 B2
9441956 Kainer et al. Sep 2016 B2
9446776 Cooper et al. Sep 2016 B2
9454816 Mian et al. Sep 2016 B2
9469198 Cooper et al. Oct 2016 B2
9518947 Bartonek et al. Dec 2016 B2
9533698 Warta Jan 2017 B2
9562878 Graham et al. Feb 2017 B2
9571796 Mian et al. Feb 2017 B2
9575007 Rao et al. Feb 2017 B2
9580091 Kraeling et al. Feb 2017 B2
9581998 Cooper et al. Feb 2017 B2
9607446 Cooper et al. Mar 2017 B2
9618335 Mesher Apr 2017 B2
9619691 Pang et al. Apr 2017 B2
9619725 King Apr 2017 B2
9628762 Farritor Apr 2017 B2
9664567 Sivathanu et al. May 2017 B2
9669852 Combs Jun 2017 B2
9671358 Cooper et al. Jun 2017 B2
9689760 Lanza di Scalea et al. Jun 2017 B2
9714043 Mian et al. Jul 2017 B2
9744978 Bhattacharjya et al. Aug 2017 B2
9752993 Thompson et al. Sep 2017 B1
9771090 Warta Sep 2017 B2
9796400 Puttagunta et al. Oct 2017 B2
9810533 Fosburgh et al. Nov 2017 B2
9822492 Hartl et al. Nov 2017 B2
9825662 Mian et al. Nov 2017 B2
9849894 Mesher Dec 2017 B2
9849895 Mesher Dec 2017 B2
9860962 Mesher Jan 2018 B2
9873442 Mesher Jan 2018 B2
9921584 Rao et al. Mar 2018 B2
9922416 Mian et al. Mar 2018 B2
9950716 English Apr 2018 B2
9950720 Mesher Apr 2018 B2
9981671 Fraser et al. May 2018 B2
9981675 Cooper et al. May 2018 B2
9983593 Cooper et al. May 2018 B2
9989498 Lanza di Scalea et al. Jun 2018 B2
10035498 Richardson et al. Jul 2018 B2
10040463 Singh Aug 2018 B2
10043154 King Aug 2018 B2
10077061 Schmidt et al. Sep 2018 B2
10081376 Singh Sep 2018 B2
10086857 Puttagunta et al. Oct 2018 B2
10167003 Bilodeau Jan 2019 B1
10286877 Lopez Galera et al. May 2019 B2
10322734 Mesher Jun 2019 B2
10349491 Mesher Jul 2019 B2
10352831 Kondo et al. Jul 2019 B2
10362293 Mesher Jul 2019 B2
10370014 Matson et al. Aug 2019 B2
10384697 Mesher Aug 2019 B2
10392035 Berggren Aug 2019 B2
10401500 Yang et al. Sep 2019 B2
10408606 Raab Sep 2019 B1
10414416 Hampapur Sep 2019 B2
10502831 Eichenholz Dec 2019 B2
10518791 Singh Dec 2019 B2
10543861 Bartek et al. Jan 2020 B1
10582187 Mesher Mar 2020 B2
10611389 Khosla Apr 2020 B2
10613550 Khosla Apr 2020 B2
10616556 Mesher Apr 2020 B2
10616557 Mesher Apr 2020 B2
10616558 Mesher Apr 2020 B2
10618537 Khosla Apr 2020 B2
10625760 Mesher Apr 2020 B2
10730538 Mesher Aug 2020 B2
10796192 Fernandez Oct 2020 B2
10816347 Wygant et al. Oct 2020 B2
10822008 Wade Nov 2020 B2
10829135 Anderson et al. Nov 2020 B2
10919546 Llorenty et al. Feb 2021 B1
10954637 Kaiser Mar 2021 B2
10989694 Kawabata et al. Apr 2021 B2
11001283 Dick et al. May 2021 B2
11046340 Matson et al. Jun 2021 B2
11107233 Saniei et al. Aug 2021 B2
11169269 Mesher Nov 2021 B2
11196981 Mesher Dec 2021 B2
11259007 Mesher Feb 2022 B2
11338832 Brick et al. May 2022 B1
11358617 Dick et al. Jun 2022 B2
11377130 Mesher Jul 2022 B2
11399172 Mesher Jul 2022 B2
11427232 Davis et al. Aug 2022 B2
11479281 Dick et al. Oct 2022 B2
20010045495 Olson et al. Nov 2001 A1
20020065610 Clark et al. May 2002 A1
20020070283 Young Jun 2002 A1
20020093487 Rosenberg Jul 2002 A1
20020099507 Clark et al. Jul 2002 A1
20020150278 Wustefeld Oct 2002 A1
20020196456 Komiya et al. Dec 2002 A1
20030059087 Waslowski et al. Mar 2003 A1
20030062414 Tsikos et al. Apr 2003 A1
20030072001 Mian et al. Apr 2003 A1
20030075675 Braune et al. Apr 2003 A1
20030140509 Casagrande Jul 2003 A1
20030160193 Sanchez Revuelta et al. Aug 2003 A1
20030164053 Ignagni Sep 2003 A1
20040021858 Shima et al. Feb 2004 A1
20040084069 Woodard May 2004 A1
20040088891 Theurer May 2004 A1
20040095135 Nejikovsky et al. May 2004 A1
20040122569 Bidaud Jun 2004 A1
20040189452 Li Sep 2004 A1
20040247157 Lages Dec 2004 A1
20040263624 Nejikovsky Dec 2004 A1
20050121539 Takada et al. Jun 2005 A1
20050244585 Schmeling Nov 2005 A1
20050279240 Pedanekar et al. Dec 2005 A1
20060017911 Villar Jan 2006 A1
20060098843 Chew May 2006 A1
20060171704 Bingle Aug 2006 A1
20060231685 Mace et al. Oct 2006 A1
20070136029 Selig et al. Jun 2007 A1
20070150130 Welles Jun 2007 A1
20070211145 Kilian et al. Sep 2007 A1
20070265780 Kesler et al. Nov 2007 A1
20070289478 Becker et al. Dec 2007 A1
20080007724 Chung Jan 2008 A1
20080177507 Mian et al. Jul 2008 A1
20080212106 Hoffmann Sep 2008 A1
20080298674 Baker Dec 2008 A1
20080304065 Hesser Dec 2008 A1
20080304083 Farritor et al. Dec 2008 A1
20090040503 Kilian Feb 2009 A1
20090073428 Magnus Mar 2009 A1
20090196486 Distante et al. Aug 2009 A1
20090250533 Akiyama et al. Oct 2009 A1
20090273788 Nagle et al. Nov 2009 A1
20090319197 Villar et al. Dec 2009 A1
20100007551 Pagliuco Jan 2010 A1
20100026551 Szwilski Feb 2010 A1
20100106309 Grohman et al. Apr 2010 A1
20100207936 Minear Aug 2010 A1
20100289891 Akiyama Nov 2010 A1
20110064273 Zarembski et al. Mar 2011 A1
20110209549 Kahn Sep 2011 A1
20110251742 Haas et al. Oct 2011 A1
20120026352 Natroshvilli et al. Feb 2012 A1
20120051643 Ha et al. Mar 2012 A1
20120062731 Enomoto et al. Mar 2012 A1
20120192756 Miller et al. Aug 2012 A1
20120216618 Bloom et al. Aug 2012 A1
20120218868 Kahn et al. Aug 2012 A1
20120222579 Turner et al. Sep 2012 A1
20120245908 Berggren Sep 2012 A1
20120263342 Haas Oct 2012 A1
20120300060 Farritor Nov 2012 A1
20130070083 Snead Mar 2013 A1
20130092758 Tanaka et al. Apr 2013 A1
20130096739 Landes et al. Apr 2013 A1
20130155061 Jahanashahi et al. Jun 2013 A1
20130170709 Distante et al. Jul 2013 A1
20130191070 Kainer Jul 2013 A1
20130202090 Belcher et al. Aug 2013 A1
20130230212 Landes Sep 2013 A1
20130231873 Fraser Sep 2013 A1
20130276539 Wagner et al. Oct 2013 A1
20130313372 Gamache et al. Nov 2013 A1
20130317676 Cooper et al. Nov 2013 A1
20140069193 Graham et al. Mar 2014 A1
20140129154 Cooper May 2014 A1
20140142868 Bidaud May 2014 A1
20140151512 Cooper Jun 2014 A1
20140177656 Mian et al. Jun 2014 A1
20140200952 Hampapur et al. Jul 2014 A1
20140333771 Mian et al. Nov 2014 A1
20140339374 Mian et al. Nov 2014 A1
20150106038 Turner Apr 2015 A1
20150131108 Kainer et al. May 2015 A1
20150219487 Maraini Aug 2015 A1
20150225002 Branka et al. Aug 2015 A1
20150268172 Naithani et al. Sep 2015 A1
20150269722 Naithani et al. Sep 2015 A1
20150284912 Delmonic et al. Oct 2015 A1
20150285688 Naithani et al. Oct 2015 A1
20150375765 Mustard Dec 2015 A1
20160002865 English et al. Jan 2016 A1
20160039439 Fahmy et al. Feb 2016 A1
20160059623 Kilian Mar 2016 A1
20160121912 Puttagunta May 2016 A1
20160159381 Fahmy Jun 2016 A1
20160207551 Mesher Jul 2016 A1
20160209003 Mesher Jul 2016 A1
20160212826 Mesher Jul 2016 A1
20160221592 Puttagunta Aug 2016 A1
20160249040 Mesher Aug 2016 A1
20160282108 Martinod Restrepo et al. Sep 2016 A1
20160304104 Witte et al. Oct 2016 A1
20160305915 Witte et al. Oct 2016 A1
20160312412 Schrunk, III Oct 2016 A1
20160318530 Johnson Nov 2016 A1
20160321513 Mitti et al. Nov 2016 A1
20160325767 LeFabvre et al. Nov 2016 A1
20160368510 Simon et al. Dec 2016 A1
20170029001 Berggren Feb 2017 A1
20170034892 Mesher Feb 2017 A1
20170066459 Singh Mar 2017 A1
20170106885 Singh Apr 2017 A1
20170106887 Mian et al. Apr 2017 A1
20170182980 Davies et al. Jun 2017 A1
20170203775 Mesher Jul 2017 A1
20170205379 Prince et al. Jul 2017 A1
20170219471 Fisk et al. Aug 2017 A1
20170267264 English et al. Sep 2017 A1
20170297536 Giraud et al. Oct 2017 A1
20170305442 Viviani Oct 2017 A1
20170313286 Giraud et al. Nov 2017 A1
20170313332 Paget et al. Nov 2017 A1
20170336293 Kondo et al. Nov 2017 A1
20180038957 Kawazoe et al. Feb 2018 A1
20180039842 Schuchmann et al. Feb 2018 A1
20180057030 Puttagunta et al. Mar 2018 A1
20180079433 Mesher Mar 2018 A1
20180079434 Mesher Mar 2018 A1
20180106000 Fruehwirt Apr 2018 A1
20180120440 O'Keefe May 2018 A1
20180127006 Wade May 2018 A1
20180220512 Mesher Aug 2018 A1
20180222504 Birch et al. Aug 2018 A1
20180276494 Fernandez Sep 2018 A1
20180281829 Euston et al. Oct 2018 A1
20180297621 Matson et al. Oct 2018 A1
20180339720 Singh Nov 2018 A1
20180370552 Puttagunta et al. Dec 2018 A1
20180372875 Juelsgaard et al. Dec 2018 A1
20190039633 Li Feb 2019 A1
20190054937 Graetz Feb 2019 A1
20190107607 Danziger Apr 2019 A1
20190135315 Dargy et al. May 2019 A1
20190156569 Jung May 2019 A1
20190179026 Englard et al. Jun 2019 A1
20190248393 Khosla Aug 2019 A1
20190310470 Weindorf et al. Oct 2019 A1
20190344813 Kaiser et al. Nov 2019 A1
20190349563 Mesher Nov 2019 A1
20190349564 Mesher Nov 2019 A1
20190349565 Mesher Nov 2019 A1
20190349566 Mesher Nov 2019 A1
20190357337 Mesher Nov 2019 A1
20190367060 Mesher Dec 2019 A1
20190367061 Mesher Dec 2019 A1
20200025578 Wygant Jan 2020 A1
20200034637 Olson et al. Jan 2020 A1
20200086903 Mesher Mar 2020 A1
20200116865 Yang et al. Apr 2020 A1
20200122753 Buerger Apr 2020 A1
20200156677 Mesher May 2020 A1
20200160733 Dick et al. May 2020 A1
20200164904 Dick et al. May 2020 A1
20200180667 Kim et al. Jun 2020 A1
20200198672 Underwood et al. Jun 2020 A1
20200221066 Mesher Jul 2020 A1
20200231193 Chen et al. Jul 2020 A1
20200239049 Dick et al. Jul 2020 A1
20200302592 Ebersohn et al. Sep 2020 A1
20200346673 Mesher Nov 2020 A1
20200361502 Metzger Nov 2020 A1
20200363532 Mesher Nov 2020 A1
20200400542 Fisk et al. Dec 2020 A1
20210019548 Fernandez Jan 2021 A1
20210041398 Van Wyk et al. Feb 2021 A1
20210041877 Lacaze et al. Feb 2021 A1
20210061322 Dick et al. Mar 2021 A1
20210072393 Mesher Mar 2021 A1
20210078622 Miller et al. Mar 2021 A1
20210229714 Dick et al. Jul 2021 A1
20210327087 Saniei et al. Oct 2021 A1
20210370993 Qian Dec 2021 A1
20210396685 Qian Dec 2021 A1
20210403060 Pertosa Dec 2021 A1
20220035037 Mesher Feb 2022 A1
20220116580 Mesher Apr 2022 A1
20220189001 Fernandez Jun 2022 A1
20220242466 Brick et al. Aug 2022 A1
20220258779 Dick et al. Aug 2022 A1
20220324497 Brick et al. Oct 2022 A1
Foreign Referenced Citations (178)
Number Date Country
2019338073 Oct 2020 AU
2019338073 Aug 2021 AU
2020337499 Mar 2022 AU
2061014 Aug 1992 CA
2574428 Feb 2006 CA
2607634 Apr 2008 CA
2574428 Oct 2009 CA
2782341 Jun 2011 CA
2844113 Feb 2013 CA
2986580 Sep 2014 CA
2867560 Apr 2015 CA
2607634 Jun 2015 CA
2945614 Oct 2015 CA
2945614 Oct 2015 CA
2732971 Jan 2016 CA
2996128 Mar 2016 CA
2860073 May 2016 CA
2867560 Jul 2017 CA
3042136 Jun 2018 CA
3071417 Jan 2019 CA
3071425 Jan 2019 CA
3070280 Jul 2021 CA
104751602 Jul 2015 CN
106291538 Jan 2017 CN
106364503 Feb 2017 CN
106373191 Feb 2017 CN
106384190 Feb 2017 CN
1045356526 Jun 2017 CN
107688024 Feb 2018 CN
206984011 Feb 2018 CN
108009484 May 2018 CN
108657222 Oct 2018 CN
19831176 Jan 2000 DE
19831215 Jan 2000 DE
10040139 Jul 2002 DE
19826422 Sep 2002 DE
60015268 Mar 2005 DE
19943744 Jan 2006 DE
19919604 Aug 2009 DE
102012207427 Jul 2013 DE
102009018036 Feb 2014 DE
102014119056 Jun 2016 DE
0274081 Jul 1988 EP
1079322 Feb 2001 EP
1146353 Oct 2001 EP
1158460 Nov 2001 EP
1168269 Jan 2002 EP
1197417 Apr 2002 EP
1236634 Sep 2002 EP
1098803 Jan 2003 EP
1285225 Jul 2004 EP
1600351 Jan 2007 EP
1892503 Jul 2007 EP
1918702 May 2008 EP
1964026 Sep 2008 EP
2322901 May 2011 EP
1992167 May 2016 EP
3024123 May 2016 EP
2806065 Sep 2016 EP
3138753 Mar 2017 EP
3138753 Mar 2017 EP
3138754 Mar 2017 EP
2697738 Aug 2017 EP
2697738 Aug 2017 EP
3351452 Jul 2018 EP
2998927 Sep 2018 EP
3431359 Jan 2019 EP
3310963 Mar 2019 EP
3420135 Oct 2019 EP
3561501 Oct 2019 EP
3442849 Jan 2020 EP
3392411 Feb 2020 EP
3105599 Apr 2020 EP
3433154 Jun 2020 EP
3658439 Jun 2020 EP
3689706 Aug 2020 EP
3554919 Oct 2020 EP
3555365 Oct 2020 EP
3746346 Dec 2020 EP
3580393 Apr 2021 EP
2674809 Oct 1992 FR
3049255 Sep 2017 FR
3077553 Feb 2018 FR
3049255 Apr 2018 FR
3052416 Jul 2019 FR
3077553 Aug 2019 FR
2265779 Oct 1993 GB
2378344 Feb 2003 GB
2383635 Jun 2005 GB
2536746 Sep 2016 GB
2536746 Mar 2017 GB
60039555 Mar 1985 JP
63302314 Dec 1988 JP
6011316 Jan 1994 JP
06322707 Nov 1994 JP
H07146131 Jun 1995 JP
7280532 Oct 1995 JP
H07294443 Nov 1995 JP
H07294444 Nov 1995 JP
10332324 Dec 1998 JP
11172606 Jun 1999 JP
2000221146 Aug 2000 JP
2000241360 Sep 2000 JP
H0924828 Jul 2002 JP
2002294610 Oct 2002 JP
2003074004 Mar 2003 JP
2003121556 Apr 2003 JP
2004132881 Apr 2004 JP
2007240342 Sep 2007 JP
4008082 Nov 2007 JP
2010229642 Oct 2010 JP
5283548 Sep 2013 JP
5812595 Nov 2015 JP
2015209205 Nov 2015 JP
2016191264 Nov 2016 JP
6068012 Jan 2017 JP
2017020862 Jan 2017 JP
6192717 Sep 2017 JP
6327413 May 2018 JP
6425990 Nov 2018 JP
2019065650 Apr 2019 JP
6530979 Jun 2019 JP
101562635 Oct 2015 KR
101706271 Feb 2017 KR
1020180061929 Jun 2018 KR
1020220043457 Apr 2022 KR
2142892 Dec 1999 RU
101851 Jan 2011 RU
1418105 Aug 1988 SU
200005576 Feb 2000 WO
200008459 Feb 2000 WO
2000-73118 Dec 2000 WO
2001066401 Sep 2001 WO
2001066401 May 2003 WO
2005036199 Apr 2005 WO
2005036199 Apr 2005 WO
2005098352 Oct 2005 WO
2006008292 Jan 2006 WO
2006014893 Feb 2006 WO
2008146151 Jan 2009 WO
2009007817 Mar 2009 WO
2011002534 Jan 2011 WO
2012142548 Oct 2012 WO
2013146502 Mar 2013 WO
2013177393 Nov 2013 WO
2014017015 Jan 2014 WO
2015003772 Jan 2015 WO
2015160300 Oct 2015 WO
2015165560 Nov 2015 WO
2016008201 Jan 2016 WO
2016027072 Feb 2016 WO
2016007393 Jul 2016 WO
2016168576 Oct 2016 WO
2016168623 Oct 2016 WO
2017159701 Sep 2017 WO
2018010827 Jan 2018 WO
2018158712 Sep 2018 WO
2018207469 Nov 2018 WO
2018208153 Nov 2018 WO
2018210441 Nov 2018 WO
2019023613 Jan 2019 WO
2019023658 Jan 2019 WO
2019023613 Jan 2019 WO
2019023658 Jan 2019 WO
2019086158 May 2019 WO
2019149456 Aug 2019 WO
2019212693 Nov 2019 WO
2020053699 Mar 2020 WO
2020058215 Mar 2020 WO
2020078703 Apr 2020 WO
2020232431 Nov 2020 WO
2020232443 Nov 2020 WO
2022058127 Mar 2022 WO
2022087506 Apr 2022 WO
2022111983 Jun 2022 WO
2022130488 Jun 2022 WO
2022130510 Jun 2022 WO
2022133032 Jun 2022 WO
Non-Patent Literature Citations (141)
Entry
US 8,548,242 B1, 10/2013, Longacre, Jr. (withdrawn)
T. Kanade, ed., Three-Dimensional Machine Vision, Kluwer Academic Publishers (1987) [Part 1].
T. Kanade, ed., Three-Dimensional Machine Vision, Kluwer Academic Publishers (1987) [Part 2].
D.D. Davis et al., “Tie Condition Inspection a Case Study of Tie Failure Rate, Mods, and Clustering,” Report No. R-714, Association of American Railroads Research and Test Department (Jul. 1989).
John Choros et al., “Prevention of Derailments due to Concrete Tie Rail Seat Deterioration,” Proceedings of ASME/IEEE Joint Rail Conference & Internal Combustion Engine Spring Technical Conference. No. 40096 (2007).
US Patent and Trademark Office, Final Office Action for U.S. Appl. No. 16/255,928 dated Apr. 27, 2020.
US Patent and Trademark Office, Non-Final Office Action for U.S. Appl. No. 16/742,057 dated May 26, 2020.
Invitation to Pay Additional Fees, PCT App. Ser. No. PCT/US2020/033449 dated Jul. 9, 2020.
International Preliminary Report on Patentability, PCT App. Ser. No. PCT/IB2018/058574 dated Aug. 6, 2020.
International Preliminary Report on Patentability, PCT App. Ser. No. PCT/US2020/033374 dated Aug. 14, 2020.
Julio Molleda et al., “A Profile Measurement System for Rail Manufacturing using Multiple Laser Range Finders” (2015).
International Search Report and Written Opinion of the International Searching Authority, PCT App. Ser. No. PCT/US2020/033449 dated Sep. 14, 2020 (including Kovalev et al. “Freight car models and their computer-aided dynamic analysis”, Multibody System Dynamics, Nov. 2009).
“Laser Triangulation for Track Change and Defect Detection”, U.S. Department of Transportation, Federal Railroad Administration (Mar. 2020).
“Extended Field Trials of LRAIL for Automated Track Change Detection”, U.S. Department of Transportation, Federal Railroad Administration (Apr. 2020).
US Patent and Trademark Office, Final Office Action for U.S. Appl. No. 16/742,057 dated May 26, 2020.
Paul et al., “A Technical Evaluation of Lidar-Based Measurement of River Water Levels”, Water Resources Research (Apr. 4, 2020).
Ahn et al., “Estimating Water Reflectance at Near-Infrared Wavelengths for Turbid Water Atmospheric Correction: A Preliminary Study for GOCI-II”, Remote Sensing (Nov. 18, 2020).
Hart et al., “Automated Railcar and Track Inspection Projects: A Review of Collaborations Between CVRL and RailTEC”, presentation by Computer Vision and Robotics Laboratory and Railroad Engineering Program (RailTEC) University of Illinois at Urbana-Champaign (2017).
US Patent and Trademark Office, Non-Final Office Action for U.S. Appl. No. 16/802,763 dated Jun. 29, 2021.
Yang et al., “Automated Extraction of 3-D Railway Tracks from Mobile Laser Scanning Point Clouds”, IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, vol. 7, No. 12, Dec. 2014.
Li et al., “Rail Component Detection, Optimization, and Assessment for Automatic Rail Track Inspection”, IEEE Transactions of Intelligent Transportation Systems, vol. 15, No. 2, Apr. 2014.
International Preliminary Report on Patentability, PCT Application No. PCT/US2020/033449, completed May 24, 2021 and dated Aug. 12, 2021.
Espino et al., “Rail and Turnout Detection Using Gradient Information and Template Matching”, 2013 IEEE International Conference on Intelligent Rail Transportation Proceedings (2013).
U.S. Patent and Trademark Office, Non-Final Office Action for U.S. Appl. No. 17/243,746 dated Aug. 27, 2021.
Supplementary European Search Report, European Patent Application No. 18920660, dated Feb. 28, 2022.
US Patent and Trademark Office, Non-Final Office Action for U.S. Appl. No. 14/725,490 dated Feb. 23, 2018.
Shawn Landers et al., “Development and Calibration of a Pavement Surface Performance Measure and Prediction Models for the British Columbia Pavement Management System” (2002).
Zheng Wu, “Hybrid Multi-Objective Optimization Models for Managing Pavement Assets” (Jan. 25, 2008).
“Pavement Condition Index 101”, OGRA's Milestones (Dec. 2009).
“Rail Radar Bringing the Track Into the Office” presentation given to CN Rail Engineering on Jan. 21, 2011.
Rail Radar, Inc. Industrial Research Assistance Program Application (IRAP) (Aug. 10, 2012).
“Rail Radar Automated Track Assessment” paper distributed at the Association of American Railroads (AAR) Transportation Test Center in Oct. 2010 by Rail Radar, Inc.
US Patent and Trademark Office, Non-Final Office Action for U.S. Appl. No. 14/725,490 dated Mar. 30, 2017.
US Patent and Trademark Office, Final Office Action for U.S. Appl. No. 14/725,490 dated Aug. 16, 2017.
Kantor, et al., “Automatic Railway Classification Using Surface and Subsurface Measurements” Proceedings of the 3rd International Conference on Field and Service Robotics, pp. 43-48 (2001).
Magnes, Daniel L., “Non-Contact Technology for Track Speed Rail Measurements (Orian)” SPIE vol. 2458, pp. 45-51 (1995).
Ryabichenko, et al. “CCD Photonic System for Rail Width Measurement” SPIE vol. 3901, pp. 37-44 (1999).
Gingras, Dennis, “Optics and Photonics Used in Road Transportation” (1998).
Liviu Bursanescu and François Blais, “Automated Pavement Distress Data Collection and Analysis: a 3-D Approach” (1997).
US Patent and Trademark Office, Non-Final Office Action for U.S. Appl. No. 14/724,925 dated Feb. 26, 2016.
US Patent and Trademark Office, Non-Final Office Action for U.S. Appl. No. 14/724,890 dated Jul. 29, 2016.
US Patent and Trademark Office, Final Office Action for U.S. Appl. No. 14/724,890 dated Nov. 10, 2016.
US Patent and Trademark Office, Non-Final Office Action for U.S. Appl. No. 14/724,890 dated Mar. 24, 2017.
US Patent and Trademark Office, Non-Final Office Action for U.S. Appl. No. 16/255,928 dated Oct. 18, 2019.
US Patent and Trademark Office, Final Office Action for U.S. Appl. No. 16/127,956 dated Jul. 9, 2019.
Judgment on jury verdict in Case No. 2:21-cv-1289, dated Aug. 26, 2022.
Tetra Tech, Inc.'s opening claim construction brief in Case No. 2:21-cv-1289, dated Oct. 4, 2021.
Pavemetrics' response to Tetra Tech's claim construction brief in Case No. 2:21-cv-1289, dated Oct. 18, 2021.
Tetra Tech, Inc.'s Reply claim construction brief in Case No. 2:21-cv-1289, dated Oct. 25, 2021.
Pavemetrics' sur-reply to Tetra Tech's claim construction brief in Case No. 2:21-cv-1289, dated Nov. 1, 2021.
Claim construction order in Case No. 2:21-cv-1289, dated Dec. 1, 2021.
Order regarding motion for summary judgment in Case No. 2:21-cv-1289, dated Apr. 27, 2022.
Final pretrial conference order in Case No. 2:21-cv-1289, dated Aug. 8, 2022.
Pavemetrics trial brief in Case No. 2:21-cv-1289, dated Aug. 10, 2022.
Jury instructions in Case No. 2:21-cv-1289, dated Aug. 24, 2022.
Verdict form in Case No. 2:21-cv-1289, dated Aug. 24, 2022.
Final pretrial conference order in Case No. 2:21-cv-1289, dated Aug. 26, 2022.
MVTec Software GmbH, Halcon Solution Guide I: Basics, available at http://download.mvtec.com/halcon-10.0-solution-guide-i.pdf (2013) (“Halcon Solution Guide”).
National Instruments, NI Vision for LabVIEW User Manual, available at https://www.ni.com/pdf/manuals/371007b.pdf (2005) (“LabVIEW 2005 Manual”).
Wenbin Ouyang & Bugao Xu, Pavement Cracking Measurements Using 3D Laser-Scan Images, 24 Measurement Sci. & Tech. 105204 (2013) (“Ouyang”).
Processing: A Practical Approach With Examples in Matlab (2011) (“Solomon”).
Çağlar Aytekin et al., Railway Fastener Inspection by Real-Time Machine Vision, 45 IEEE Transactions on Sys., Man, and Cybernetics: Sys. 1101 (Jan. 2015) (“Aytekin”).
Jinfeng Yang et al., An Efficient Direction Field-Based Method for the Detection of Fasteners on High-Speed Railways, 11 Sensors 7364 (2011) (“Yang”).
Urszula Marmol & Slawomir Mikrut, Attempts at Automatic Detection of Railway Head Edges from Images and Laser Data, 17 Image Processing & Commc'n 151 (2012) (“Marmol”).
Xavier Gibert-Serra et al., A Machine Vision System for Automated Joint Bar Inspection from a Moving Rail Vehicle, Proc. 2007 ASME/IEEE Joint Rail Conf. & Internal Combustion Engine Spring Tech. Conf. 289 (2007) (“Gibert-Serra”).
Sick Sensor Intelligence, Product Catalog 2014/2015: Vision, available at https://www.sick.com/media/docs/2/02/302/Product_catalog_Vision_en_IM005 0302.PDF (2013) (“Sick Catalog”).
Sick Sensor Intelligence, Application: 3D Vision for Cost-Efficient Maintenance of Rail Networks, TETRATECH_0062963-64 (Jan. 2015) (“Sick Article”).
Matrox Electronic Systems, Ltd., Matrox Imaging Library version 9 User Guide, available at https://www.matrox.com/apps/imaging_documentation_files/mil_userguide.pdf (2008) (“Matrox MIL 9 User Guide”).
MVTec Software GmbH, Halcon: the Power of Machine Vision, available at https://pyramidimaging.com/specs/MVTec/Halcon%2011.pdf (2013) (“Halcon Overview”).
Tordivel AS, Scorpion Vision Software: Version X Product Data, available at http://www.tordivel.no/scorpion/pdf/Scorpion%20X/PD-2011-0005%20Scorpion%20X%20Product%20Data.pdf (2010) (“Scorpion Overview”).
OpenCV 3.0.0.-dev documentation, available at https://docs.opencv.org/3.0-beta/index.html (2014) (“OpenCV”).
Mathworks Help Center, Documentation: edge, available at https://www.mathworks.com/help/images/ref/edge.html (2011) (“Matlab”).
National Instruments, NI Vision for LabVIEW Help, available at https://www.ni.com/pdf/manuals/370281w.zip (2014) (“LabVIEW”).
Intel Integrated Performance Primitives for Intel Architecture, Reference Manual, vol. 2: Image and Video Processing, available at http://www.nacad.ufrj.br/online/intel/Documentation/en_US/ipp/ippiman.pdf (Mar. 2009).
Andrew Shropshire Boddiford, Improving the Safety and Efficiency of Rail Yard Operations Using Robotics, UT Elec. Theses and Dissertations, available at http://hdl.handle.net/2152/2911 (2013).
Leszek Jarzebowicz & Slawomir Judek, 3D Machine Vision System for Inspection of Contact Strips in Railway Vehicle Current Collectors, 2014 Int'l Conf. on Applied Elecs. 139 (2014).
Peng Li, A Vehicle-Based Laser System for Generating High-Resolution Digital Elevation Models, K-State Elec. Theses, Dissertations, and Reports, available at http://hdl.handle.net/2097/3890 (2010).
Pavemetrics' Preliminary Invalidity Contentions in Case No. 2:21-cv-1289, dated Jul. 15, 2021.
Exhibits 2-9 to Pavemetrics' Preliminary Invalidity Contentions in Case No. 2:21-cv-1289, dated Jul. 15, 2021.
Pavemetrics' Invalidity Contentions and Preliminary Identification in Case No. 2:21-cv-1289, dated Sep. 13, 2021.
Exhibit 2 to Pavemetrics' Invalidity Contentions and Preliminary Identification in Case No. 2:21-cv-1289, dated Sep. 13, 2021.
Exhibit 3 to Pavemetrics' Invalidity Contentions and Preliminary Identification in Case No. 2:21-cv-1289, dated Sep. 13, 2021.
U.S. Patent and Trademark Office, Non-Final Office Action for U.S. Appl. No. 16/889,016 dated Sep. 23, 2021.
U.S. Patent and Trademark Office, Non-Final Office Action for U.S. Appl. No. 16/877,106 dated Sep. 20, 2021.
U.S. Patent and Trademark Office, Non-Final Office Action for U.S. Appl. No. 16/898,544 dated Sep. 24, 2021.
International Preliminary Report on Patentability, PCT App. No. PCT/US2020/033374 dated Nov. 16, 2021.
Korean Intellectual Property Office, International Search Report for Int. App. No. PCT/IB2018/058574 dated Feb. 27, 2019.
Korean Intellectual Property Office, Written Opinion of the International Searching Authority for Int. App. No. PCT/IB2018/058574 dated Feb. 27, 2019.
US Patent and Trademark Office, Non-Final Office Action for U.S. Appl. No. 16/127,956 dated Dec. 31, 2018.
D.D. Davis et al., “Tie Performance—A Progress Report of the Des Plaines Test Site,” Report No. R-746, Association of American Railroads Research and Test Department (Apr. 1990).
Mattias Johannesson, “Architectures for Sheet-of-Light Range Imaging,” Report No. LiTH-ISY-I-1335, Image Processing Group, Department of Electrical Engineering, Linköping University (Feb. 27, 1992).
Mattias Johannesson, “Sheet-of-light Range Imaging,” Linköping Studies in Science and Technology. Dissertations No. 399 (1995).
M. Johannesson, SIMD Architectures for Range and Radar Imaging, PhD thesis, University of Linköping (1995).
Erik Åstrand, “Automatic Inspection of Sawn Wood,” Linköping Studies in Science and Technology. Dissertations. No. 424 (1996).
Mattias Johannesson, “Sheet-of-Light range imaging experiments with MAPP2200,” Report No. LiTH-ISY-I-1401, Image Processing Group, Department of Electrical Engineering, Linköping University (Sep. 28, 1992).
M. de Bakker et al., “A Smart Range Image Sensor,” Proceedings of the 24th European Solid-State Circuits Conference (1998): 208-11.
Dr. Mats Gokstorp et al., “Smart Vision Sensors,” International Conference on Image Processing (Oct. 4-7, 1998), Institute of Electrical and Electronics Engineers, Inc.
Mattias Johannesson, et al., “An Image Sensor for Sheet-of-Light Range Imaging,” IAPR Workshop on Machine Vision Applications (Dec. 7-9, 1992).
Mattias Johannesson, “Can Sorting using sheet-of-light range imaging and MAPP2200,” Institute of Electrical and Electronics Engineers; International Conference on Systems, Man and Cybernetics (Oct. 17-20, 1993).
Michiel de Bakker, et al., “Smart PSD array for sheet-of-light range imaging,” The International Society for Optical Engineering. Sensors and Camera Systems for Scientific, Industrial, and Digital Photography Applications (Jan. 24-26, 2000).
Umayal Chidambaram, "Edge Extraction of Color and Range Images" (Dec. 2003).
Franz Pernkopf et al., “Detection of surface defects on raw milled steel blocks using range imaging” The International Society for Optical Engineering. Machine Vision Applications in Industrial Inspection X (Jan. 21-22, 2002).
Murhed, Anders, “IVP Integrated Vision Products,” Pulp and Paper International 44.12 (Dec. 1, 2002).
Anders Åstrand, “Smart Image Sensors,” Linköping Studies in Science and Technology. Dissertations No. 319 (1993).
Mattias Johannesson et al., “Five Contributions to the Art of Sheet-of-light Range Imaging on MAPP2200,” Report No. LiTH-ISY-R-1611, Image Processing Group, Department of Electrical Engineering, Linköping University (Apr. 14, 1994).
Newman et al., "A Survey of Automated Visual Inspection," Computer Vision and Image Understanding, vol. 61, no. 2, pp. 231-262 (Mar. 1995).
J. Velten et al., “Application of a Brightness-Adapted Edge Detector for Real-Time Railroad Tie Detection in Video Images,” Institute of Electrical and Electronics Engineers (1999).
R. Gordon Kennedy, “Problems of Cartographic Design in Geographic Information Systems for Transportation,” Cartographic Perspectives (Jul. 20, 1999).
Richard Reiff, “An Evaluation of Remediation Techniques for Concrete Tie Rail Seat Abrasion in the Fast Environment,” American Railway Engineering Association, Bulletin 753 (1995).
Russell H. Lutch et al., “Causes and Preventative Methods for Rail Seat Abrasion in North America's Railroads,” Conference Paper (Oct. 2014).
Nigel Peters and Steven R. Mattson, “CN 60E Concrete Tie Development,” AREMA: 25 (2003).
Arthur L. Clouse et al. “Track Inspection Into the 21st Century” (Sep. 19, 2006).
Railroad Safety Advisory Committee (RSAC), Minutes of Meeting, Dec. 10, 2008, Washington, D.C.
Dennis P. Curtin, “An Extension to the Textbook of Digital Photography, Pixels and Images” (2007).
US Patent and Trademark Office, Non-Final Office Action for U.S. Appl. No. 17/076,899 dated Jan. 29, 2021.
Handbook of Computer Vision and Applications, vol. 2, Academic Press, “Signal Processing and Pattern Recognition” (1999).
International Advances in Nondestructive Testing, vol. 16, Gordon and Breach Science Publishers, S.A. (1991).
Babenko, Pavel, dissertation entitled “Visual Inspection of Railroad Tracks”, University of Central Florida (2009).
Shah, Mubarak, “Automated Visual Inspection/Detection of Railroad Track”, Florida Department of Transportation (Jul. 2010).
Metari et al., “Automatic Track Inspection Using 3D Laser Profilers to Improve Rail Transit Asset Condition Assessment and State of Good Repair—A Preliminary Study”, TRB 93rd Annual Meeting (Nov. 15, 2013).
Laurent, John et al., “Implementation and Validation of a New 3D Automated Pavement Cracking Measurement Equipment” (2010).
Final Written Judgment, U.S. Patent Trial and Appeal Board, Inter Partes Review, Tetra Tech Canada, Inc. v. Georgetown Rail Equipment Company (2020).
Tetra Tech, Inc. Annual Report excerpts (2020).
Declaration of David Drakes, Pavemetrics Systems, Inc. v. Tetra Tech, Inc. (case 2:21-cv-1289) (Mar. 22, 2021).
Declaration of John Laurent, Pavemetrics Systems, Inc. v. Tetra Tech, Inc. (case 2:21-cv-1289) (Mar. 22, 2021).
“An Automated System for Rail Transit Infrastructure Inspection”, 1st Quarterly Report, USDOT and University of Massachusetts Lowell (Sep. 30, 2012).
IRI Measurements Using the LCMS presentation, Pavemetrics (2012).
"An Automated System for Rail Transit Infrastructure Inspection", 2nd Quarterly Report, USDOT and University of Massachusetts Lowell (Jan. 15, 2013).
Ritars 3rd Quarterly Meeting Minutes, “An Automated System for Rail Transit Infrastructure Inspection” (May 14, 2013).
“An Automated System for Rail Transit Infrastructure Inspection”, 5th Quarterly Report, USDOT and University of Massachusetts Lowell (Oct. 15, 2013).
25th Annual Road Profile User's Group Meeting agenda, San Antonio, Texas (Sep. 16, 2013).
“LCMS—Laser Crack Measurement System” presentation, Pavemetrics Systems Inc. (Sep. 2013).
Metari, et al., “An Automatic Track Inspection Using 3D Laser Profilers to Improve Rail Transit Asset Condition Assessment and State of Good Repair: A Preliminary Study” presentation, Transportation Research Board 93rd Annual Meeting (given Jan. 14, 2014).
Laurent, et al., "Detection of Range-Based Rail Gage and Missing Rail Fasteners: Use of High-Resolution Two- and Three-dimensional Images" (Jan. 2014).
“3D Mapping of Pavements: Geometry and DTM” presentation, Pavemetrics Systems Inc. (Sep. 2014).
“Laser Rail Inspection System (LRAIL)” datasheet, Pavemetrics Systems Inc. (Oct. 2014).
Pavemetrics Systems Inc. webpage screenshot (Dec. 18, 2014).
Pavemetrics Systems Inc. LRAIL webpage (Feb. 20, 2015).
Pavemetrics' Memorandum in Opposition to motion for Preliminary Injunction, Pavemetrics Systems, Inc. v. Tetra Tech, Inc. (case 2:21-cv-1289) (Mar. 22, 2021).
Pavemetrics' Compulsory Counterclaim for Declaratory Judgment, Pavemetrics Systems, Inc. v. Tetra Tech, Inc. (case 2:21-cv-1289) (Mar. 24, 2021).
Supplementary European Search Report, App. No. EP20806472 (dated May 11, 2023).
Related Publications (1)

Number          Date       Country
20220035037 A1  Feb. 2022  US

Provisional Applications (3)

Number    Date       Country
63016661  Apr. 2020  US
62988630  Mar. 2020  US
62848630  May 2019   US

Continuations (2)

Parent    Date       Country  Child
17076899  Oct. 2020  US       17494940
16876484  May 2020   US       17076899