This disclosure relates to the field of railway track inspection and assessment systems. More particularly, this disclosure relates to a railway track inspection and assessment system using one or more 3D sensors oriented at an oblique angle relative to a railway track for gathering data from a side of a rail.
Tie plate damage to wooden crossties caused by crosstie surface abrasion is a significant form of distress that negatively impacts crosstie condition by reducing rail fastener holding capability. Referring to
The rail 100 includes a rail head 110 located at a top of the rail 100, a rail web 112, and a rail foot 114 located below the rail web 112 and the rail head 110. A bottom of the rail foot 114 is referred to as a rail base seat 116, and a top of the rail foot 114 is referred to as a rail base surface 118.
With current three-dimensional (3D) triangulation-based measurement technologies used for railway track assessment, in which 3D sensors are positioned above the rail assembly, neither the elevation of the rail base seat 116 nor that of the tie plate base 108 can be measured directly. Therefore, the elevation of the tie plate base 108 must be estimated by measuring the elevation of a top surface of the tie plate 102 (i.e., the tie plate surface 120) and subtracting an estimated thickness of the tie plate 102.
The plate cut value increases as the tie plate 102 cuts downward into an upper surface of the crosstie 106 to which the tie plate 102 is fastened (the tie plate base 108 penetrates or cuts into the upper crosstie surface 122). Conventional methods of determining plate cut value require calculating the difference between the surface elevation of outermost tie plate edges (on the “field” side outside of the rails and on the “gauge” side that is between the rails) and the adjacent upper crosstie surface 122 elevations near the edge of the tie plate 102. Referring to
Plate Cut = Crosstie Surface Elevation − (Plate Surface Elevation − Plate Thickness Estimate)  (Equation 1)
A plate cut value of 0 millimeters (mm) would represent an undamaged (new) crosstie surface, as shown in
In addition to plate cut in wooden crossties, concrete crosstie surface abrasion is a significant form of distress which negatively impacts concrete crosstie condition. Referring to
With 3D triangulation-based measurement technologies used for railway track assessment, in which sensors are positioned above the track surface, neither the elevation of the rail base seat 130 nor the thickness of the rail pad can be measured directly. Therefore, the rail base seat elevation must be estimated by measuring a rail base surface elevation 140 and subtracting an estimated rail base thickness.
As a rail base seat wears the underlying pad 128, the pad thickness is reduced to zero. At the point of a zero thickness pad, the rail seat abrasion is said to be 0 mm, representing the point at which the rail base seat 130 is beginning to contact the upper crosstie surface 132. As the rail base seat 130 continues to abrade and penetrate into the upper crosstie surface 132, the rail seat abrasion values increase.
The conventional method of determining the rail seat abrasion parameter requires calculating the difference between the rail base seat elevation (for the field and the gauge sides of the rail) and the adjacent crosstie surface field and gauge elevations near the rail base, as shown in
Rail Seat Abrasion = Crosstie Surface Elevation − (Rail Base Surface Elevation − Rail Base Thickness Estimate)  (Equation 2)
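For illustration only, the two conventional calculations above can be expressed as the following short sketch; the function names are hypothetical and the assumption that all inputs are elevations in millimeters is illustrative rather than part of the disclosure.

```python
def conventional_plate_cut(crosstie_surface_elev_mm: float,
                           plate_surface_elev_mm: float,
                           plate_thickness_estimate_mm: float) -> float:
    """Equation 1: plate cut estimated from the tie plate surface elevation.

    Requires the tie plate surface to be visible from above, which is often
    not the case when ballast or other debris covers the plate.
    """
    plate_base_elev_mm = plate_surface_elev_mm - plate_thickness_estimate_mm
    return crosstie_surface_elev_mm - plate_base_elev_mm


def conventional_rail_seat_abrasion(crosstie_surface_elev_mm: float,
                                    rail_base_surface_elev_mm: float,
                                    rail_base_thickness_estimate_mm: float) -> float:
    """Equation 2: rail seat abrasion estimated from the rail base surface elevation."""
    rail_base_seat_elev_mm = rail_base_surface_elev_mm - rail_base_thickness_estimate_mm
    return crosstie_surface_elev_mm - rail_base_seat_elev_mm
```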
In practice, it is common to have significant amounts of ballast 124 or other track debris obscuring the rail base surface for substantial portions of a rail network, as illustrated in
What is needed, therefore, is a means to measure plate cut and rail seat abrasion values in all track conditions. The capability to determine elevations for all crosstie plates and rail base surfaces regardless of whether they are obscured by ballast or other debris would significantly improve the ability to report plate cut measures for all wooden crossties and rail seat abrasion measures for all concrete crossties in a rail owner's network.
In another aspect, current track assessment systems used by various companies that obtain 3D elevation maps of railway tracks view such tracks and associated features vertically, and such systems are unable to obtain full views of the sides (rail webs) of rails. What is needed, therefore, is a means to obtain 3D profiles and 3D elevation maps of the rail webs of rails to analyze various track features.
In a related aspect, manufacturer markings are often placed on rail webs and contain important information regarding the physical characteristics of the rails on which such markings are located as well as other information including the age of the particular rails and the manufacturer of the particular rails. What is needed, therefore, is a way to access such information when assessing a railway track using a track assessment system on a moving rail vehicle operating along such railway track. Such information could be used for inventory purposes and/or to help with analysis of the degradation of particular rails along a railway track.
In another aspect, sensors and structured light emitters are often disrupted by debris moving around beneath rail vehicles carrying track assessment systems, such debris building up on the optics of such sensors and light emitters or on substantially transparent panels for protecting such sensors and light emitters. What is needed, therefore, is a means to easily remove such debris buildup while a track assessment system is in operation, moving along a railway track.
The above and other needs are met by a system for inspecting a railway track, the system comprising a processor; at least one sensor oriented to capture data of the railway track, the at least one sensor in electronic communication with the processor; a data storage device in electronic communication with the processor; and computer executable instructions stored on a computer readable storage medium in communication with the processor. The computer executable instructions are operable to determine an elevation of a surface of a rail head of a rail located on the railway track based on a distance to the rail head from the at least one sensor; determine an elevation of a surface of a crosstie of the railway track based on a distance to a top surface of the crosstie from the at least one sensor; estimate a total rail height and underlying rail support height; and calculate a crosstie wear value based on the determined rail head surface elevation, crosstie surface elevation, and estimated total rail height of the rail and underlying rail support height of an underlying rail support. The underlying rail support can be, for example, a tie plate (for wooden crosstie applications) or a pad (for concrete crosstie applications).
Preferably, the system for inspecting a railway track described above is located on a rail vehicle and further includes an encoder electromechanically engaged with a wheel of the rail vehicle and in communication with the processor to provide location data of the rail vehicle. Preferably, the system also comprises a GPS antenna in communication with the processor for detecting a location of the system.
Preferably, the at least one sensor of the system for inspecting a railway track described above further comprises a light emitter and a camera in communication with the processor, wherein the camera captures a field of view of the railway track including reflected light from the light emitter to generate a three-dimensional elevation map of the railway track. Alternatively, the at least one sensor may comprise one or more time of flight sensors. In some embodiments, the at least one sensor may comprise one or more light emitters, one or more cameras, and one or more time of flight sensors.
In addition to the system described above, a method of determining wear of a railway track is also disclosed, such method comprising the steps of shining a beam of light along a railway track; interrogating the railway track using at least one sensor which forms part of a track assessment system housed on a rail vehicle; receiving data corresponding to the railway track based on the interrogation of the railway track using the at least one sensor; determining an elevation of a rail head of the railway track based on the received data; determining an elevation of a top surface of a rail crosstie of the railway track based on the received data; estimating a total rail height of the railway track and a height of an underlying rail support; and determining a crosstie wear value based on the elevation of the rail head, the elevation of the top surface of the crosstie, the estimated total rail height, and the estimated height of the underlying rail support.
In a preferred embodiment, the estimated height of the rail is based on one or more visual indicators displayed on the rail which are visually captured by the at least one sensor and compared by the processor to a database of rail markings used by the manufacturer of the rail.
In a preferred embodiment, the method described above further comprises the step of determining a geographic location of one or more railway track features corresponding to the data captured on the at least one sensor, wherein the estimated total rail height is based on the geographic location of the one or more railway track features.
In a preferred embodiment, the method described above further comprises the step of determining an estimated total rail height by using the processor to access a database which includes data which correlates specific geographic track locations to the identities of the specific types of rails placed at those geographic track locations.
In one embodiment (in which the underlying rail support comprises a crosstie plate), the step of estimating a total rail height of the railway track and a height of an underlying rail support further comprises estimating a thickness of the crosstie plate. This method may further include the step of estimating a thickness of the tie plate based on received data at a plurality of locations along a length of track, wherein the estimated tie plate thickness is based on a maximum distance from the top surface of the rail head to the top surface of the rail crosstie along the length of track.
In one embodiment of the method described above, the crosstie wear value is a plate cut value corresponding to an amount that the tie plate has cut into a top surface of the rail crosstie being interrogated.
In one embodiment (in which the rail crosstie is a concrete rail crosstie), the crosstie wear value is a rail seat abrasion value corresponding to an amount that a rail base seat has cut into a top surface of the concrete rail crosstie being interrogated. In a related embodiment, the underlying rail support comprises a pad and the rail crosstie being interrogated is a concrete crosstie.
In one embodiment (in which the underlying rail support comprises a pad separating a rail from a concrete crosstie), the method further comprises the step of estimating a thickness of the pad. This method may further include the step of estimating a thickness of the pad based on received data at a plurality of locations along a length of track, wherein the estimated pad thickness is based on a maximum distance from the top surface of the rail head to the top surface of the rail crosstie along the length of track.
The disclosure herein also covers a railway track assessment apparatus for gathering, storing, and processing profiles of one or both rails on a railway track while the apparatus travels on a rail vehicle along the railway track. Such apparatus includes a processor; a system controller in communication with the processor; a data storage device in communication with the processor; a power source for providing electric power to the track assessment apparatus; and a first sensor pod attached adjacent to an undercarriage of the rail vehicle. The first sensor pod includes a first 3D sensor in communication with the system controller wherein the first 3D sensor is oriented at an oblique angle β relative to a railway track bed surface supporting rails on which the rail vehicle is moving wherein such orientation provides the first 3D sensor a side view of a first side of a first rail of the railway track so that the first 3D sensor can obtain data from the first side of the first rail; and a first structured light generator in communication with the system controller. In one embodiment, the first sensor pod is oriented at an oblique angle α relative to the undercarriage of the rail vehicle.
The first sensor pod can further include a first sensor enclosure wherein the first 3D sensor and the first structured light generator are attached adjacent to the first sensor enclosure inside of the first sensor enclosure; a first thermal sensor; and a first heating and cooling device wherein the system controller further includes a temperature controller in communication with the first thermal sensor and the first heating and cooling device wherein the first heating and cooling device is activated or deactivated by the temperature controller based on feedback from the first thermal sensor so that the temperature inside the first sensor enclosure is maintained within a specific range.
The railway track assessment apparatus may further include an encoder engaged with a wheel of the rail vehicle to transmit pulses to the system controller based on the direction and distance travelled of the rail vehicle; a GNSS receiver in communication with the processor for providing position data of the railway track assessment apparatus to the processor; the system controller further including a sensor trigger controller; and computer executable instructions stored on a computer readable storage medium in communication with the sensor trigger controller operable to convert wheel encoder pulses to a desired profile measurement interval; and reference profile scans to geo-spatial coordinates by synchronizing encoder pulses with GNSS receiver position data.
The railway track assessment apparatus preferably further includes (1) a second sensor pod attached adjacent to the undercarriage of the rail vehicle, the second sensor pod including a second 3D sensor in communication with the system controller wherein the second 3D sensor is oriented at an oblique angle β relative to a railway track bed surface supporting rails on which the rail vehicle is moving wherein such orientation provides the second 3D sensor a side view of a second side of the first rail of the railway track so that the second 3D sensor can obtain data from the second side of the first rail; and a second structured light generator in communication with the system controller; (2) a third sensor pod attached adjacent to the undercarriage of the rail vehicle, the third sensor pod including a third 3D sensor in communication with the system controller wherein the third 3D sensor is oriented at an oblique angle β relative to a railway track bed surface supporting rails on which the rail vehicle is moving wherein such orientation provides the third 3D sensor a side view of a first side of a second rail of the railway track so that the third 3D sensor can obtain data from the first side of the second rail; and a third structured light generator in communication with the system controller; and (3) a fourth sensor pod attached adjacent to the undercarriage of the rail vehicle, the fourth sensor pod including a fourth 3D sensor in communication with the system controller wherein the fourth 3D sensor is oriented at an oblique angle β relative to a railway track bed surface supporting rails on which the rail vehicle is moving wherein such orientation provides the fourth 3D sensor a side view of a second side of the second rail of the railway track so that the fourth 3D sensor can obtain data from the second side of the second rail; and a fourth structured light generator in communication with the system controller. The railway track assessment apparatus may further include (1) an encoder engaged with a wheel of the rail vehicle to transmit pulses to the system controller based on the direction and distance travelled of the rail vehicle; (2) a GNSS receiver in communication with the processor for providing position data of the railway track assessment apparatus to the processor; (3) the system controller further including a sensor trigger controller; and (4) computer executable instructions stored on a computer readable storage medium in communication with the processor operable to (a) synchronize the repeated activation of the first 3D sensor, the second 3D sensor, the third 3D sensor, and the fourth 3D sensor; (b) combine profile scans from the first 3D sensor and the second 3D sensor into a first combined profile scan, and combine profile scans from the third 3D sensor and the fourth 3D sensor into a second combined profile scan; and (c) reference the first combined profile scan and the second combined profile scan to geo-spatial coordinates by synchronizing encoder pulses with GNSS receiver position data.
Additionally or alternatively, the railway track assessment apparatus may further include a first sensor enclosure inside which the first 3D sensor and the first structured light generator are attached adjacent to the first sensor enclosure; and a cover plate forming a wall of the first sensor enclosure wherein the cover plate further includes a first cover plate aperture with a first glass panel covering the first cover plate aperture, and a second cover plate aperture with a second glass panel covering the second cover plate aperture. Preferably, the first glass panel includes a light transmission band that is compatible with the wavelengths of the first structured light generator, allowing most of any generated light from the first structured light generator to pass through the first glass panel and not be reflected back into the first sensor enclosure by the first glass panel.
The railway track assessment apparatus preferably further includes an air blower in communication with the system controller; and first ducting extending from the air blower to a position proximate to the first glass panel and the second glass panel, wherein the air blower is activated at specified times to blow air through the ducting to dislodge and deflect debris from the first glass panel and the second glass panel. In one specific embodiment, the railway track assessment apparatus further includes (1) an air distribution lid attached adjacent to the cover plate, the air distribution lid further including (a) a first duct mount; (b) a first walled enclosure adjacent to the first glass panel; (c) a first enclosed channel providing space for air to flow from the first duct mount to the first walled enclosure proximate to the first glass panel; (d) a first air distribution lid first aperture at least partially covering the first glass panel; (e) a second duct mount; (f) a second walled enclosure adjacent to the second glass panel; and (g) a second enclosed channel providing space for air to flow from the second duct mount to the second walled enclosure proximate to the second glass panel; (2) an air blower in communication with the system controller; and (3) first ducting from the air blower to the air distribution lid wherein the first ducting further includes a first duct attached adjacent to the first duct mount and a second duct attached adjacent to the second duct mount, wherein the air blower is activated by the system controller at specified times and, when activated, the air blower causes air to flow from the air blower, through the ducting, through first duct mount and the second duct mount, through the first enclosed channel and the second enclosed channel, to the first walled enclosure and the second walled enclosure to dislodge debris from the first glass panel and the second glass panel during operation of the railway track assessment apparatus.
Additionally or alternatively, the railway track assessment apparatus may further include (1) a first database stored on a computer readable medium in communication with the processor wherein the first database includes manufacturing data regarding the physical characteristics of specified rails cross-referenced with rail web markings; (2) the system controller further including a 3D sensor controller; (3) computer executable instructions stored on a computer readable storage medium in communication with the 3D sensor controller operable to allow the 3D sensor controller to (a) gather a first scanline of a rail being scanned from the first 3D sensor, such first scanline providing information regarding a physical characteristic of a rail feature shown in the scanline; and (b) gather multiple scanlines to form an elevation map of the rail being scanned; and (4) computer executable instructions stored on a computer readable storage medium in communication with the processor operable to allow the processor to (a) analyze alpha-numeric markings on a side of the rail being scanned using an optical character recognition algorithm, such alpha-numeric markings analyzed using the elevation map; (b) access the first database; (c) cross-reference alpha-numeric markings in the elevation map with manufacturing data in the first database; (d) measure a first physical characteristic of the rail being scanned using the processor to analyze the first scanline; (e) using a machine vision algorithm, compare the first physical characteristic of the rail being scanned with a same type of physical characteristic of a rail found in the first database, wherein the rail found in the first database matches the alpha-numeric markings that were decoded by the 3D sensor controller applicable to the first scanline; and (f) determine the condition of the first physical characteristic of the rail being scanned based on the comparison between the first physical characteristic of the rail being scanned and the same type of physical characteristic of a rail found in the first database. 
In the same or similar embodiment, the railway track assessment apparatus further includes (1) a wireless transmitter and receiver in communication with the processor; (2) a second database stored on a computer readable medium in communication with the processor but geographically remote from the processor, wherein the second database includes manufacturing data regarding the physical characteristics of specified rails cross-referenced with rail web markings; and (3) computer executable instructions stored on a computer readable storage medium in communication with the processor operable to allow the processor to (a) access the second database; (b) cross-reference alpha-numeric markings in the elevation map with manufacturing data in the second database; and (c) measure the first physical characteristic of the rail being scanned using the processor to analyze the first scanline; (d) using a machine vision algorithm, compare the first physical characteristic of the rail being scanned with a same type of physical characteristic of a rail found in the second database, wherein the rail found in the second database matches the alpha-numeric markings that were deciphered by the 3D sensor controller applicable to the first scanline; and (e) determine the condition of the first physical characteristic of the rail being scanned based on the comparison between the first physical characteristic of the rail being scanned and the same type of physical characteristic of a rail found in the second database.
Additionally or alternatively, the railway track assessment apparatus may further include (1) the system controller further including a 3D sensor controller; and (2) computer executable instructions stored on a computer readable storage medium in communication with the 3D sensor controller operable to allow the 3D sensor controller to gather an elevation map of the first side of the first rail using the first 3D sensor; (3) computer executable instructions stored on a computer readable storage medium in communication with the processor operable to allow the processor to (a) determine whether there is an elevation variation in the elevation map; (b) if there is an elevation variation in the elevation map, (I) determine the likely cause of the elevation variation based on the size and shape of the elevation variation; (II) assign a specific type of rail component identity to that elevation variation; (III) analyze the elevation variation under the presumption that the elevation variation coincides with the assigned specific type of rail component; and (IV) save the elevation map, the identity of the assigned rail component, and the measurements made during the analysis of the elevation variation to the data storage device.
Additionally or alternatively, the railway track assessment apparatus may further include (1) the system controller further including a 3D sensor controller; (2) computer executable instructions stored on a computer readable storage medium in communication with the 3D sensor controller operable to allow the 3D sensor controller to gather a scanline of the first side of the first rail using the first 3D sensor; (3) computer executable instructions stored on a computer readable storage medium in communication with the processor operable to allow the processor to (a) calibrate the first 3D sensor to determine the real world unit width of a pixel in a scanline; (b) locate a pixel representing a rail base bottom using a horizontal edge detection machine vision algorithm; (c) determine whether a tie is present in the scanline by detecting a generally smooth planar surface in proximity to and below the first rail using a machine vision algorithm; (d) if a tie is present in the scanline, locate a pixel representing the top of the tie surface using a machine vision algorithm; (e) calculate the difference in elevation between the bottom of the rail and the top of the tie surface representing the thickness of a pad under the first rail; and (f) based on the calculated thickness of the pad, determine the amount of rail seat abrasion on the tie under the first rail.
Additionally or alternatively, the railway track assessment apparatus may further include (1) the system controller further including a 3D sensor controller; (2) computer executable instructions stored on a computer readable storage medium in communication with the 3D sensor controller operable to allow the 3D sensor controller to gather a scanline of the first side of the first rail using the first 3D sensor; (3) computer executable instructions stored on a computer readable storage medium in communication with the processor operable to allow the processor to (a) calibrate the first 3D sensor to determine the real world unit width of a pixel in a scanline; (b) locate a pixel representing a rail base bottom using a horizontal edge detection machine vision algorithm; (c) determine whether a tie is present in the scanline by detecting a generally smooth planar surface in proximity to and below the first rail using a machine vision algorithm; (d) if a tie is present in the scanline, locate a pixel representing the top of the tie surface using a machine vision algorithm; and (e) calculate the difference in elevation between the bottom of the rail and the top of the tie surface representing the amount of plate cut in the first rail.
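By way of a non-limiting sketch, the scanline processing recited in the two preceding paragraphs could be organized as follows; the locator callables stand in for the edge-detection and plane-detection machine vision algorithms named above, and the assumption that scanline elevations are stored in pixel units (converted by a calibrated scale factor) is illustrative.

```python
import numpy as np


def rail_seat_wear_from_scanline(scanline: np.ndarray,
                                 mm_per_pixel: float,
                                 locate_rail_base_bottom,
                                 locate_tie_surface) -> float | None:
    """Sketch of steps (a)-(f): elevation gap between rail base bottom and tie top.

    `scanline` is one row of elevation values (in pixel units) from the oblique
    3D sensor; the two locator callables represent the machine vision routines
    and return pixel indices, or None when the feature is not present.
    """
    base_px = locate_rail_base_bottom(scanline)   # horizontal edge detection
    tie_px = locate_tie_surface(scanline)         # smooth planar surface below the rail
    if base_px is None or tie_px is None:
        return None                               # no tie visible in this scanline
    # Positive gap = remaining pad thickness (concrete tie); once the gap reaches
    # zero, further penetration is reported as rail seat abrasion, or as plate
    # cut in the wooden-tie case.
    return float(scanline[base_px] - scanline[tie_px]) * mm_per_pixel
```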
Additionally or alternatively, the system controller may further include (1) a laser power controller in communication with the first structured light generator and the processor; and (2) computer executable instructions stored on a computer readable storage medium in communication with the laser power controller operable to allow the laser power controller to adjust the power to the structured light generator based on the light intensity of the most recent profile scan made by the first 3D sensor.
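One simple way to realize the described laser power adjustment is a proportional correction toward a target intensity; the target value, gain, and power limits in this sketch are illustrative assumptions rather than values taken from the disclosure.

```python
def adjust_laser_power(current_power_pct: float,
                       last_scan_mean_intensity: float,
                       target_intensity: float = 128.0,
                       gain: float = 0.1,
                       min_pct: float = 5.0,
                       max_pct: float = 100.0) -> float:
    """Nudge structured light power so the next profile is neither saturated nor dim."""
    error = target_intensity - last_scan_mean_intensity
    new_power = current_power_pct + gain * error
    return max(min_pct, min(max_pct, new_power))
```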
A method for analyzing the side of a rail of a railway track is also disclosed, such method including the steps of (1) scanning the first side of a first rail of a railway track with an optical scanning system using a first sensor that is attached adjacent to a rail vehicle and oriented at an oblique angle relative to a railway track bed surface supporting rails on which the rail vehicle is moving wherein the first sensor obtains data regarding the first side of the first rail; (2) generating a first 3D profile of the first side of the first rail based on the data gathered by the first sensor using a system controller; (3) analyzing the first 3D profile using a processor, such processor operating using a machine vision algorithm.
The method may further include (4) generating a 3D elevation map of the first side of the first rail by combining a plurality of 3D profiles including the first 3D profile using a processor; (5) analyzing alpha-numeric markings on the first side of the first rail shown in the 3D elevation map using the processor operating an optical character recognition algorithm; (6) referencing the 3D elevation map to geo-spatial coordinates using the processor by synchronizing the plurality of 3D profiles forming the 3D elevation map with the location of the rail vehicle when those 3D profiles were generated using a GNSS receiver; and (7) storing the referenced 3D elevation map to a data storage device using a processor. This method may further include (8) accessing a database stored on a computer readable medium using a processor, wherein the database includes manufacturing data regarding the physical characteristics of specified rails cross-referenced with alpha-numeric rail markings located on the sides of the specified rails; (9) cross-referencing the alpha-numeric markings in the elevation map with manufacturing data in the database using the processor; (10) measuring a physical characteristic of the first side of the first rail using the processor to analyze the first 3D profile; (11) comparing the physical characteristic of the first side of the first rail shown in the first 3D profile with a same type of physical characteristic of a rail found in the database matching the alpha-numeric markings that were detected by the processor; and (12) determining the condition of the first physical characteristic of the first side of the first rail being scanned based on the comparison between the first physical characteristic of the first side of the first rail being scanned and the same type of physical characteristic of the specified rails found in the database.
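As an illustration of steps (8) through (12), a hypothetical lookup keyed by the OCR-decoded rail web marking might be queried as follows; the table, its field names, and the approximate nominal heights are placeholders rather than data from the disclosure.

```python
# Hypothetical manufacturing database keyed by OCR-decoded rail web markings.
# Heights are approximate nominal values for common North American sections.
RAIL_SECTION_DB = {
    "136 RE": {"nominal_height_mm": 185.7},
    "115 RE": {"nominal_height_mm": 168.3},
}


def assess_rail_height(decoded_marking: str, measured_height_mm: float):
    """Compare a measured rail height against the section's nominal value."""
    section = RAIL_SECTION_DB.get(decoded_marking)
    if section is None:
        return None                                  # marking not in database
    wear_mm = section["nominal_height_mm"] - measured_height_mm
    return {"section": decoded_marking, "vertical_head_wear_mm": wear_mm}
```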
Additionally or alternatively, the method may include (4) controlling the temperature of the inside of a first sensor enclosure in which the first sensor is located using a temperature controller in communication with a thermal sensor and a heating and cooling device.
Additionally or alternatively, the method may further include (4) scanning the second side of the first rail of the railway track with the optical scanning system using a second sensor that is attached adjacent to the rail vehicle and oriented at an oblique angle relative to the undercarriage of the rail vehicle wherein the second sensor obtains data regarding the second side of the first rail; (5) generating a second 3D profile of the second side of the first rail based on the data gathered by the second sensor using the system controller; (6) scanning the first side of a second rail of the railway track with the optical scanning system using a third sensor that is attached adjacent to the rail vehicle and oriented at an oblique angle relative to the undercarriage of the rail vehicle wherein the third sensor obtains data regarding the first side of the second rail; (7) generating a third 3D profile of the first side of the second rail based on the data gathered by the third sensor using the system controller; (8) scanning the second side of the second rail of the railway track with the optical scanning system using a fourth sensor that is attached adjacent to the rail vehicle and oriented at an oblique angle relative to the undercarriage of the rail vehicle wherein the fourth sensor obtains data regarding the second side of the second rail; (9) generating a fourth 3D profile of the second side of the second rail based on the data gathered by the fourth sensor using the system controller; (10) analyzing the first 3D profile using the processor, such processor operating using a machine vision algorithm. Such method may further include (11) synchronizing activation of the first sensor, the second sensor, the third sensor, and the fourth sensor using the system controller in communication with an encoder; (12) combining the first 3D profile from the first sensor, the second 3D profile from the second sensor, the third 3D profile from the third sensor and the fourth 3D profile from the fourth sensor into a single combined 3D profile scan; and (13) referencing the combined 3D profile scan to geo-spatial coordinates by synchronizing encoder pulses with GNSS receiver position data from a GNSS receiver.
Additionally or alternatively, the method may further include (4) generating a 3D elevation map of the first side of the first rail by combining a plurality of 3D profiles including the first 3D profile using the processor; (5) determining whether there is an elevation variation in the 3D elevation map using the processor operating a machine vision algorithm; and (6) if there is an elevation variation in the elevation map, (a) determining the likely cause of the elevation variation based on the size and shape of the elevation variation using the processor operating a machine vision algorithm; (b) assigning a specific type of rail component identity to that elevation variation using the processor; (c) analyzing the elevation variation under the presumption that the elevation variation coincides with the assigned specific type of rail component using the processor; and (d) saving the elevation map, the identity of the assigned rail component, and any measurements made during the analysis of the elevation variation to a data storage device using the processor.
Additionally or alternatively, the method may further include (4) calibrating the first sensor to determine the real world unit width of a pixel in a 3D profile; (5) locating a pixel representing a rail base bottom using the processor operating a machine vision algorithm; (6) determining whether a tie is present in the 3D profile using the processor operating a machine vision algorithm; and (7) if a tie is present in the 3D profile, (a) locating a pixel representing the top of the tie using the processor operating a machine vision algorithm; (b) calculating the thickness of a pad under the first rail using the processor; and (c) determining the amount of rail seat abrasion on the tie under the first rail based on the calculated thickness of the pad using the processor.
Additionally or alternatively, the method may further include (4) calibrating the first sensor to determine the real world unit width of a pixel in a 3D profile; (5) locating a pixel representing a rail base bottom using the processor operating a machine vision algorithm; (6) determining whether a tie is present in the 3D profile using the processor operating a machine vision algorithm; and (7) if a tie is present in the 3D profile, (a) locating a pixel representing the top of the tie using the processor operating a machine vision algorithm; and (b) calculating the plate cut under the first rail using the processor.
Additionally or alternatively, the method may further include adjusting the power to a structured light generator based on the light intensity of the most recent 3D profile scan made by the first sensor using the processor and a laser power controller.
In another aspect, a method of clearing debris from the view of a sensor of a railway track assessment system is disclosed. The method includes the steps of (1) blowing air from an air blowing device on a rail vehicle wherein such air is blown through a duct to an exit location wherein the exit location is proximate to a sensor enclosure including a transparent window through which a sensor has a field of view; and (2) clearing the transparent window of debris using the air exiting the duct at the exit location.
In another aspect, a sensor pod is disclosed including a sill mount attached adjacent to an undercarriage of a rail vehicle; a first side bracket attached adjacent to a first side of the sill mount; a second side bracket attached adjacent to a second side of the sill mount; and a sensor enclosure wherein a first side of the sensor enclosure is attached adjacent to the first side bracket and a second side of the sensor enclosure is attached adjacent to the second side bracket. In one preferred embodiment, (1) the first side bracket further includes a plurality of elongate first side bracket apertures; (2) the second side bracket further includes a plurality of second side bracket apertures; and (3) the sensor enclosure further includes (a) a first plurality of tapped holes for receiving fasteners located along a first side of the sensor enclosure that align with the plurality of elongate first side bracket apertures; and (b) a second plurality of tapped holes for receiving fasteners located along a second side of the sensor enclosure that align with the plurality of elongate second side bracket apertures, wherein the sensor enclosure can be selectively fastened to the first side bracket at different angles using first fasteners extending through the plurality of elongate first side bracket apertures into the first plurality of tapped holes and wherein the sensor enclosure can be selectively fastened to the second side bracket at different angles using second fasteners extending through the plurality of elongate second side bracket apertures into the second plurality of tapped holes.
In another aspect, a method for gathering, storing, and processing profiles of one or both rails on a railway track using a railway track assessment apparatus while the railway track assessment apparatus travels on a rail vehicle along the railway track is disclosed, the method comprising locating a plurality of pixels using a railway track assessment apparatus, the railway track assessment apparatus comprising a processor; a system controller in communication with the processor; a data storage device in communication with the processor; a power source for providing electric power to the track assessment apparatus; a first sensor pod attached adjacent to an undercarriage of the rail vehicle, the first sensor pod comprising a first 3D sensor in communication with the system controller, the first 3D sensor configured in an orientation at an oblique angle β relative to a railway track bed surface supporting rails on which the rail vehicle is moving and configured in an orientation substantially perpendicular to the supporting rails to obtain a side view of a first side of a first rail of the railway track and obtain data from the railway track; and a first structured light generator in communication with the system controller. The method further comprises determining a distance between the 3D sensor and a center of a rail along the railway track using the processor; and calibrating using the processor to determine the real-world width of a pixel from a scan made by the 3D sensor of the rail.
In some embodiments, the method described above further comprises identifying the rail type of the rail using the processor by using optical character recognition to identify rail markings on the rail; and determining rail head wear using the processor by comparing a rail head height of the rail with the design specifications for the rail. The method may further comprise determining a rail height of the rail. Additionally or alternatively, the located plurality of pixels may comprise a rail head top pixel, a rail head bottom pixel, a rail base top pixel, and a rail base bottom pixel.
Additionally or alternatively, the method described above may further comprise identifying the rail type of the rail using the processor by using optical character recognition to identify rail markings on the rail; and determining rail head face wear using the processor by comparing a measured rail head face position with a rail head face position that was present when the rail was first manufactured.
Additionally or alternatively, the method described above may further comprise determining a cant angle of the rail.
Additionally or alternatively, the method described above may further comprise locating a tie surface pixel using the 3D sensor and the processor; and determining a pad thickness of a pad between a rail base of the rail and a tie under the rail base.
Additionally or alternatively, the method described above may further comprise detecting a large localized surface elevation difference along a minimum length of the rail; and analyzing the large localized surface elevation difference as a joint bar. The analyzing step further comprises a member selected from the group consisting of: performing linear and geospatial referencing of the joint of the joint bar, taking inventory of the joint bar, determining a condition of the joint, determining a width of any gap detected between rail segment ends joined using the joint bar, determining whether the joint bar is compliant with any required specifications of the owner of the railroad where the rail is located, and combinations thereof.
Additionally or alternatively, the method described above may further comprise detecting a rail switch point; and analyzing an image of the detected rail switch point using the processor wherein the image is gathered by the 3D sensor. The method may further comprise a step selected from the group consisting of determining whether any part of the detected rail switch point has broken off, determining whether a top of the detected rail switch point is at a correct height, and determining whether the detected rail switch point sits flush against a main rail surface of the rail.
Additionally or alternatively, the method described above may further comprise detecting a concrete rail clip; and analyzing an image of the detected concrete rail clip using the processor; and determining whether there are any abnormalities or breakages of the detected concrete rail clip.
Additionally or alternatively, the method described above may further comprise detecting a small localized surface elevation difference along a minimum length of the rail; analyzing the small localized surface elevation difference as a rail weld; and recording data related to the detected small localized surface elevation on the data storage device.
Additionally or alternatively, the method described above may further comprise detecting a wire based localized surface elevation difference along a minimum length of the rail; analyzing the wire based localized surface elevation difference as a bond wire; and recording data related to the detected wire based localized surface elevation on the data storage device.
Additionally or alternatively, the method described above may further comprise detecting a hole based localized surface elevation difference along a minimum length of the rail; analyzing the hole based localized surface elevation difference as a rail hole; and recording data related to the detected hole based localized surface elevation on the data storage device.
Additionally or alternatively, the method described above may further comprise detecting a crack based or gap based localized surface elevation difference along a minimum length of the rail; analyzing the crack based or gap based localized surface elevation difference as a broken rail; and recording data related to the detected crack based or gap based localized surface elevation on the data storage device.
In some embodiments, the method may further comprise determining whether a crushed railhead is present; determining whether a battered rail joint is present; and determining whether misaligned rails are present.
In some embodiments, the method may further comprise analyzing rail height transitions.
The summary provided herein is intended to provide examples of particular disclosed embodiments and is not intended to cover all potential embodiments or combinations of embodiments. Therefore, this summary is not intended to limit the scope of the invention disclosure in any way, a function which is reserved for the appended claims.
Further features, aspects, and advantages of the present disclosure will become better understood by reference to the following detailed description, appended claims, and accompanying figures, wherein elements are not to scale so as to more clearly show the details, wherein like reference numbers indicate like elements throughout the several views, and wherein:
Various terms used herein are intended to have particular meanings. Some of these terms are defined below for the purpose of clarity. The definitions given below are meant to cover all forms of the words being defined (e.g., singular, plural, present tense, past tense). If the definition of any term below diverges from the commonly understood and/or dictionary definition of such term, the definitions below control.
“Track”, “Railway track”, “track bed”, “rail assembly”, or “railway track bed” is defined herein to mean a section of railway including the rails, crossties (or “ties”), components holding the rails to the crossties, components holding the rails together, and ballast material.
A “processor” is defined herein to include a processing unit including, for example, one or more microprocessors, an application-specific instruction-set processor, a network processor, a vector processor, a scalar processor, or any combination thereof, or any other control logic apparatus now known or later developed that is capable of performing the tasks described herein, or any combination thereof.
The phrase “in communication with” means that two or more devices are in communication with one another physically (e.g., by wire) or indirectly (e.g., by wireless communication).
When referring to the mechanical joining together (directly or indirectly) of two or more objects, the term “adjacent” means proximate to or adjoining. For example, for the purposes of this disclosure, if a first object is said to be attached “adjacent to” a second object, the first object is either attached directly to the second object or the first object is attached indirectly (i.e., attached through one or more intermediary objects) to the second object.
Embodiments of the present disclosure provide methods and apparatuses for determining plate cut and rail seat abrasion values without requiring the upper surface of a crosstie plate for wooden crossties or rail base for concrete crossties to be visible to sensors located in proximity of a rail assembly. Methods described herein enable determination of plate cut and rail seat abrasion values when all or portions of the rail assembly are obscured by ballast or other debris, and only require that a top of the rail head and a portion of an underlying crosstie surface be visible to sensors passing overhead.
As shown in
For embodiments employing one or more light emitters 208, such light emitters 208 are used to project a light, preferably a laser line, onto a surface of an underlying rail assembly to use in association with three-dimensional sensors to three-dimensionally triangulate the rail assembly. In a preferred embodiment, a camera 224 in communication with the processor 202 via a camera interface 226 is oriented such that a field of view 228 of the camera 224 captures the rail assembly including the light projected from the light emitter 208. The camera 224 may include a combination of lenses and filters. Using known techniques of three-dimensional triangulation, a three-dimensional elevation map of an underlying railway track bed can be generated by the processor 202 after vectors of elevations are gathered by the camera 224 as the rail vehicle 222 moves along the rail. Elevation maps generated based on the gathered elevation and intensity data can be interrogated by the processor 202 or other processing device using machine vision algorithms. Suitable cameras and sensors may include commercially available three-dimensional sensors and cameras, such as three-dimensional cameras manufactured by SICK AG based in Waldkirch, Germany.
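A minimal sketch of how successive triangulated profiles might be assembled into such an elevation map follows; the single scale factor and the assumption that each profile arrives as a one-dimensional array of range counts are simplifications of the camera's internal triangulation.

```python
import numpy as np


def profiles_to_elevation_map(profiles: list[np.ndarray],
                              mm_per_count: float) -> np.ndarray:
    """Stack per-trigger elevation profiles into a 2D elevation map.

    Each profile is a 1-D array of triangulated range counts across the track;
    multiplying by the calibrated scale factor yields elevations in millimeters.
    Rows advance with the rail vehicle's longitudinal travel.
    """
    return np.vstack(profiles).astype(float) * mm_per_count
```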
ToF sensors are preferably based on pulsed laser light or LiDAR technologies. Such technologies determine the distance between the sensor and a measured surface by calculating an amount of time required for a light pulse to propagate from an emitting device, reflect from a point on the surface to be measured, and return back to a detecting device. The ToF sensors may be a single-point measurement device or may be an array measurement device, commonly referred to as a ToF camera, such as those manufactured by Basler AG or pmdtechnologies AG.
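The time-of-flight principle described above reduces to a one-line range computation, shown here for reference; the round-trip time would be reported by the ToF device itself.

```python
SPEED_OF_LIGHT_MM_PER_NS = 299.792458  # millimeters per nanosecond


def tof_range_mm(round_trip_time_ns: float) -> float:
    """Range from a pulsed ToF measurement: half the round-trip path length."""
    return 0.5 * SPEED_OF_LIGHT_MM_PER_NS * round_trip_time_ns
```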
Referring to
Referring again to
In a preferred embodiment, data from the camera 224 and one or more sensors 212 is combined, and a calibration process is preferably performed between the camera 224 and one or more sensors 212 using a known dimensional calibration target such that the camera 224 and one or more sensors 212 combine to generate a 3D elevation map as described in greater detail below.
The encoder 216 is located at a wheel 230 of the rail vehicle 222 and is in communication with the processor 202 via the encoder interface 220. The encoder 216 preferably operates at a rate of at least 12,500 pulses per revolution of the wheel 230 with a longitudinal distance of approximately 0.25 mm per pulse. Measurements from sensors 212 of the track assessment system are preferably synchronized with data from the encoder 216 to determine locations of measurements of the track assessment system and a generated three-dimensional elevation map. In one embodiment, the track assessment system further includes a GPS antenna 232 in communication with the processor 202 via a GPS interface 234 to further provide geo-position synchronization data during measurement of a rail assembly.
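By way of example only, the encoder-based location synchronization might be sketched as follows; the 0.25 mm-per-pulse figure comes from the paragraph above, while the nearest-fix GPS matching is an assumed implementation detail.

```python
MM_PER_PULSE = 0.25  # approximate longitudinal travel per encoder pulse (from above)


def chainage_mm(pulse_count: int) -> float:
    """Longitudinal distance travelled since the start of the survey."""
    return pulse_count * MM_PER_PULSE


def tag_measurement(pulse_count: int,
                    gps_fixes: list[tuple[float, float, float]]):
    """Attach the nearest GPS fix (chainage_mm, lat, lon) to a measurement."""
    d = chainage_mm(pulse_count)
    return min(gps_fixes, key=lambda fix: abs(fix[0] - d))
```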
In order to extend the ability to estimate plate cut measures in areas with obscured crosstie plates (
Methods disclosed herein determine a difference between a wooden crosstie surface elevation 300 and an estimated tie plate base elevation 302. The improved rail head surface elevation method described herein measures a rail head surface elevation 304 as a reference elevation and calculates a vertical offset from the rail head surface elevation 304 to establish the estimated tie plate base elevation 302. This vertical offset is calculated as the sum of an estimated rail height 306 and an estimated tie plate thickness 308. The total height of the entire rail is the sum of the “estimated rail height” 306 (which includes the distance from the rail head surface elevation 304 to a rail base surface elevation 310) and the estimated tie plate thickness 308. A plate cut measurement 312 based on rail head surface elevation (which is insensitive to the presence of rail base surface debris) may be determined, for example, as follows:
Plate Cut Measurement = Crosstie Surface Elevation − (Rail Head Surface Elevation − (Rail Height Estimate + Estimated Crosstie Plate Thickness))  (Equation 3)
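Equation 3 translates directly into a small helper; this sketch assumes all elevations and estimates are expressed in millimeters.

```python
def plate_cut_from_rail_head(crosstie_surface_elev_mm: float,
                             rail_head_surface_elev_mm: float,
                             rail_height_estimate_mm: float,
                             tie_plate_thickness_estimate_mm: float) -> float:
    """Equation 3: plate cut referenced to the (never obscured) rail head surface."""
    tie_plate_base_elev_mm = rail_head_surface_elev_mm - (
        rail_height_estimate_mm + tie_plate_thickness_estimate_mm)
    return crosstie_surface_elev_mm - tie_plate_base_elev_mm
```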
Estimated rail height 306 may be determined, for example, from a) the specifications of known rail sizes and types, b) by using a representative fixed elevation estimate, or c) by calculating the elevation difference between the rail head and rail base top surface at regular intervals along the length of the track.
Exemplary methods of determining the estimated rail height 306 can include analyzing data collected on the track assessment system 200, including location data from one or both of the encoder 216 and GPS antenna 232 to determine a position at which measurements of the rail assembly are taken. Location data may be used to determine a particular type of rail used based on data provided by an owner or operator of a particular railway, such data accessed directly from an onboard data storage device (e.g., the data storage device 206) or wirelessly from a remote data storage device. For example, an owner or operator of a railway may provide data regarding the manufacturer and size of a rail used at particular locations of the railway, and the estimated rail height 306 may be determined based on known dimensions of the rail available from the manufacturer.
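For example, such a location-keyed lookup of owner-supplied rail data might be sketched as follows; the segment table, its milepost bounds, and the rail heights shown are hypothetical.

```python
# Hypothetical owner-supplied table: (start_milepost, end_milepost, rail_height_mm)
RAIL_SEGMENTS = [
    (0.0, 12.4, 185.7),
    (12.4, 30.0, 168.3),
]


def rail_height_at(milepost: float) -> float | None:
    """Return the nominal rail height for the segment containing this milepost."""
    for start, end, height_mm in RAIL_SEGMENTS:
        if start <= milepost < end:
            return height_mm
    return None  # location not covered by owner-supplied data
```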
In another exemplary method, data collected from the track assessment system 200 may be analyzed to detect visual marks or indicators 314 located on the rail, as shown in
In yet another exemplary method, the estimated rail height 306 (
The estimated tie plate thickness 308 shown in
Referring to
Field Rail Height = Rail Head Elevation − Field Rail Base Elevation  (Equation 4)
Gauge Rail Height = Rail Head Elevation − Gauge Rail Base Elevation  (Equation 5)
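Where the rail base top surface is visible, Equations 4 and 5 can be evaluated at regular intervals and aggregated into the estimated rail height 306; the median aggregation in this sketch is an assumption, since the disclosure leaves the aggregation method open.

```python
import statistics


def rail_heights(rail_head_elev_mm: float,
                 field_rail_base_elev_mm: float,
                 gauge_rail_base_elev_mm: float) -> tuple[float, float]:
    """Equations 4 and 5 evaluated at one measurement location."""
    field_height = rail_head_elev_mm - field_rail_base_elev_mm
    gauge_height = rail_head_elev_mm - gauge_rail_base_elev_mm
    return field_height, gauge_height


def estimate_rail_height(samples_mm: list[float]) -> float:
    """Aggregate interval measurements taken where the rail base is unobscured."""
    return statistics.median(samples_mm)  # debris-tolerant choice (assumption)
```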
Various sensors and technologies may be employed to determine elevations of components of the track and to provide additional measurements when calculating rail height, rail base thickness, or tie plate thickness estimates. These technologies can include fixed point or LiDAR-based Time of Flight (ToF) range sensors referenced to 3D triangulation elevation measurement systems.
In order to extend the ability to estimate rail seat abrasion (RSA) measurements in areas with obscured rail base surfaces, the rail base seat elevation measures can be referenced to the top surface of the rail head 110, which, as the surface on which the wheels travel, is an area of the track structure that is never obscured. Rail seat abrasion measurements referenced from the rail head elevation produce valid RSA measures, even in conditions where the presence of ballast, debris or foliage in and around the track obscures all but the top surface of the rail head and a small portion of the crosstie surface.
Methods and embodiments of the present disclosure are further capable of determining a rail seat abrasion (RSA) value of a section of track. Referring to
Rail Seat Abrasion = Crosstie Surface Elevation − (Rail Head Elevation − (Rail Height Estimate + Rail Base Thickness Estimate))  (Equation 6)
With further reference to
Referring now to
Field Side Total Rail Height = Rail Head Elevation − (Field Side Crosstie Elevation + Pad Thickness)  (Equation 7)
Gauge Side Total Rail Height = Rail Head Elevation − (Gauge Side Crosstie Elevation + Pad Thickness)  (Equation 8)
The combined rail height and rail base thickness (collectively, the “total rail height”), plus pad thickness, can be determined by calculating a running maximum of the difference between the rail head surface elevation 304 and the concrete crosstie surface elevation 328, as shown in
Calculating the rail seat elevation as the difference between the rail head elevation and the combined rail height and rail base thickness measurement allows RSA measurements to be made in situations where the rail base is obscured with track debris, such as ballast stones. The presence of track debris, and ballast stones in particular, on the top surface of the rail base (e.g., the rail foot and crosstie plates) is a common occurrence.
Rail Seat Abrasion = Crosstie Surface Elevation − (Rail Head Elevation − (Rail Height Estimate [including rail head and rail web] + Rail Base Thickness Estimate))  (Equation 9)
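Combining the running-maximum estimate with Equation 9, a sketch of the rail seat abrasion computation referenced to the rail head might read as follows; units are assumed to be millimeters.

```python
def total_rail_height_plus_pad(head_minus_tie_samples_mm: list[float]) -> float:
    """Running maximum of (rail head elevation - concrete crosstie surface elevation).

    Per Equations 7 and 8, the largest observed difference corresponds to an
    unworn seat and approximates the total rail height (rail height plus rail
    base thickness) plus the design pad thickness.
    """
    return max(head_minus_tie_samples_mm)


def rail_seat_abrasion(crosstie_surface_elev_mm: float,
                       rail_head_elev_mm: float,
                       rail_height_estimate_mm: float,
                       rail_base_thickness_estimate_mm: float) -> float:
    """Equation 9: rail seat abrasion referenced to the rail head surface."""
    rail_base_seat_elev_mm = rail_head_elev_mm - (
        rail_height_estimate_mm + rail_base_thickness_estimate_mm)
    return crosstie_surface_elev_mm - rail_base_seat_elev_mm
```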
The method described above is insensitive to the presence of debris on the rail base surface. For example,
Referring now to
The fixed-point Time of Flight or LiDAR sensors can be positioned to provide measurements for rail base, rail head and crosstie surface elevations for both the field and gauge side of each rail. These systems would be capable of providing real-time rail seat abrasion measures in both clear rail base and obscured rail base scenarios.
In operation, the track assessment system 200 scans an underlying track, and the track assessment system 200 preferably moves along the track to gather data at various points along the track. Data from the track assessment system includes elevational data corresponding to an elevation of the rail head and elevations of the top surfaces of crossties. Elevation data may be stored on the data storage device 206 (
Embodiments of the present disclosure refer to an elevation or surface elevation of various components of a rail assembly, such as the concrete crosstie surface elevation 320, rail head surface elevation 304, and other surface elevations. As shown in
Methods and embodiments described herein advantageously allow for the detection and measurement of plate cut and rail seat abrasion in environments where all or portions of crosstie plates and other components of the rail assembly are obscured by debris such as ballast stones. One embodiment as shown in
In certain embodiments a 3D track assessment system 500 can be used as shown schematically in
In a preferred embodiment, the 3D track assessment system 500 includes a first sensor 502A, a first structured light generator 506A, a first heating and cooling device 522A (e.g., solid state or piezo electric), and a first thermal sensor 524A all substantially sealed in a first enclosure 526A forming part of a first sensor pod 528A; a second sensor 502B, a second structured light generator 506B, a second heating and cooling device 522B, and a second thermal sensor 524B all substantially sealed in a second enclosure 526B forming part of a second sensor pod 528B; a third sensor 502C, a third structured light generator 506C, a third heating and cooling device 522C, and a third thermal sensor 524C all substantially sealed in a third enclosure 526C forming part of a third sensor pod 528C; and a fourth sensor 502D, a fourth structured light generator 506D, a fourth heating and cooling device 522D, and a fourth thermal sensor 524D all substantially sealed in a fourth enclosure 526D forming part of a fourth sensor pod 528D.
The controller 514 further includes a 3D sensor controller 530 in communication with the 3D sensors 502, a sensor trigger controller 532 in communication with the 3D sensors 502, a structured light power controller 534 in communication with the structured light generators 506, and a temperature controller 536 in communication with the heating and cooling devices 522 and the thermal sensors 524. The system controller 514 further includes a network interface 538 in communication with the processor 508 and the 3D sensor controller 530, sensor trigger controller 532, structured light power controller 534, and the temperature controller 536. The triggering for the 3D sensors 502 is generated by converting pulses from an encoder 538 (e.g., a quadrature wheel encoder attached adjacent to a wheel 540 on the survey rail vehicle 504, wherein the encoder 538 is capable of generating 12,500 pulses per revolution, with a corresponding direction signal) using the dedicated sensor trigger controller 532, a component of the dedicated system controller 514, which allows the very high resolution wheel encoder pulses to be converted programmatically to a desired profile measurement interval. For example, the wheel 540 could produce encoder pulses every 0.25 mm of travel, and the sensor trigger controller 532 would reduce the sensor trigger pulses to one every 1.5 mm and generate a signal corresponding to the forward survey direction, or a different signal for a reverse survey direction.
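A simplified software sketch of the pulse-decimation behavior described above (the actual sensor trigger controller 532 is dedicated hardware, and the class and signal names here are hypothetical): with 0.25 mm of travel per encoder pulse and a desired 1.5 mm profile interval, one trigger is issued every sixth pulse, tagged with the survey direction.

```python
from typing import Optional


class TriggerDecimator:
    """Converts high-resolution encoder pulses into sensor trigger pulses."""

    def __init__(self, mm_per_pulse: float = 0.25, trigger_interval_mm: float = 1.5):
        # Number of encoder pulses per sensor trigger (6 in this example).
        self.pulses_per_trigger = round(trigger_interval_mm / mm_per_pulse)
        self._pulse_count = 0

    def on_encoder_pulse(self, forward: bool) -> Optional[str]:
        """Return a direction-specific trigger every Nth encoder pulse."""
        self._pulse_count += 1
        if self._pulse_count >= self.pulses_per_trigger:
            self._pulse_count = 0
            return "TRIGGER_FORWARD" if forward else "TRIGGER_REVERSE"
        return None
```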
The configuration of the four 3D sensors 502 and light generators 506 ensures that the complete rail profile is captured by combining the trigger-synchronized left and right 3D sensor profiles of both rails 520 on a railway track simultaneously to produce a single combined scan for each rail. These scans can be referenced to geo-spatial coordinates using the processor 508 by synchronizing the wheel encoder 538 pulses to GNSS receiver positions acquired from the GNSS satellite network (e.g., GPS). This combined rail profile and position reference information can then be saved in the data storage device 510.
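One simple way the geo-spatial referencing could be sketched is by linearly interpolating GNSS fixes against wheel encoder pulse counts; the function below is an illustrative assumption, not the specific synchronization performed by the processor 508.

```python
from bisect import bisect_left
from typing import List, Tuple


def georeference_scanline(scan_pulse_count: int,
                          gnss_fixes: List[Tuple[int, float, float]]) -> Tuple[float, float]:
    """Interpolate a (lat, lon) position for a scanline from GNSS fixes.

    Each fix is a (wheel encoder pulse count, latitude, longitude) tuple
    recorded when a GNSS position was received.
    """
    pulses = [p for p, _, _ in gnss_fixes]
    i = bisect_left(pulses, scan_pulse_count)
    if i == 0:
        return gnss_fixes[0][1], gnss_fixes[0][2]
    if i == len(gnss_fixes):
        return gnss_fixes[-1][1], gnss_fixes[-1][2]
    p0, lat0, lon0 = gnss_fixes[i - 1]
    p1, lat1, lon1 = gnss_fixes[i]
    if p1 == p0:
        return lat0, lon0
    t = (scan_pulse_count - p0) / (p1 - p0)
    return lat0 + t * (lat1 - lat0), lon0 + t * (lon1 - lon0)
```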
The 3D sensors 502 and structured light generators 506 are housed in the substantially sealed water tight enclosures 526. Because of the heating and cooling devices 522, thermal sensors 524, and the dedicated temperature controller 536, the inside of the enclosures 526 can be heated when the ambient temperature is below a low temperature threshold and cooled when the ambient air temperature is above a high temperature threshold. The thermal sensors 524 provide feedback to the temperature controller 536 so that the temperature controller can activate the heating function or the cooling function of the heating and cooling devices on an as-needed basis. These sealed and climate-controlled enclosures 526 ensure the correct operation and extend the operational life of the sensitive sensors 502 and light generators 506 by maintaining a clean and dry environment within acceptable ambient temperature limits. The temperature control function is part of the system controller 514 with a dedicated heating and cooling device interface inside each enclosure.
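A minimal thermostat-style sketch of the heating and cooling decision described above; the threshold values and names are placeholders, and the real temperature controller 536 drives dedicated hardware inside each enclosure.

```python
def climate_action(enclosure_temp_c: float,
                   low_threshold_c: float = 5.0,
                   high_threshold_c: float = 40.0) -> str:
    """Decide whether to heat, cool, or idle based on thermal sensor feedback."""
    if enclosure_temp_c < low_threshold_c:
        return "HEAT"   # below the low temperature threshold
    if enclosure_temp_c > high_threshold_c:
        return "COOL"   # above the high temperature threshold
    return "IDLE"       # within acceptable temperature limits
```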
Sensor pod 528 structural components such as the sides of the enclosures 526, internal frames 546, the sill mount 549, the side brackets 550, and the air distribution lids 556 are preferably made of impact-resistant and non-corrosive materials including, for example, aluminum or stainless steel. Although metal is preferred, other materials could be used instead of metal, including, for example, polycarbonate or ABS plastics. The power supply 512 can draw on different types of power sources such as, for example, electricity generated by the rail vehicle 504 (originating from the liquid fuel used to propel the rail vehicle 504 and output as electricity), a generator burning a fuel and outputting electricity, solar panels outputting electricity, or a battery source. Power is preferably fed to the system controller 514 and from there is fed to other components of the system 500 in communication with or otherwise electrically tied to the system controller 514. For example, power is fed from the system controller 514 to the processor 508, components in communication with the processor 508, and the sensor pods 528 (including all electrical hardware in the sensor pods 528). The operator interface 516 can come in the form of different devices including, for example, an onboard computer with a monitor and input device (e.g., a keyboard), a computing tablet, a computing cellular device, or other similar device known to a person having ordinary skill in the art.
Each 3D sensor profile gathered from operating the 3D sensors is analyzed by the system controller 514, and the light intensity from the light generators 506 is adjusted to optimize the exposure levels. Low-intensity profile scans result in an increase of structured light generator power, and overexposed profile scans result in a reduction of the structured light generator drive power level. The laser power controller 534 also monitors structured light source temperature and current and is able to shut down each individual light source in the event that safe operating limits are exceeded.
Computer executable instructions stored on a computer readable storage medium in communication with the system controller 514 are used to run an algorithm to control the amount of power supplied to the structured light generators 506. The structured light generators 506 are preferably laser line generators and are referred to below as “lasers”. An example of this algorithm is shown in the flowchart in
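A hedged sketch of the kind of closed-loop exposure control described above; the target intensity range, step size, and safety limits shown are illustrative placeholders rather than values taken from the flowchart.

```python
def adjust_laser_power(current_power_pct: float,
                       mean_profile_intensity: float,
                       laser_temp_c: float,
                       laser_current_a: float,
                       low_intensity: float = 80.0,
                       high_intensity: float = 200.0,
                       step_pct: float = 5.0,
                       max_temp_c: float = 55.0,
                       max_current_a: float = 2.0) -> float:
    """Closed-loop exposure control for one structured light generator.

    Underexposed profiles raise the drive power, overexposed profiles lower
    it, and the laser is shut down if safe operating limits are exceeded.
    """
    if laser_temp_c > max_temp_c or laser_current_a > max_current_a:
        return 0.0  # shut down this light source
    if mean_profile_intensity < low_intensity:
        return min(current_power_pct + step_pct, 100.0)
    if mean_profile_intensity > high_intensity:
        return max(current_power_pct - step_pct, 0.0)
    return current_power_pct
```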
On system 500 initialization, each 3D sensor 502 is configured with required operational parameters by the 3D sensor controller 530. These parameters can include exposure times, region of interest, gain levels, and sensor processing algorithms. The 3D sensor controller 530 is programmed with the specific 3D sensor operational parameters from a configuration file for each sensor by the processor 508, allowing external changes to the sensor parameters as required.
During operation of the system 500, 3D sensor scan data is streamed from the system controller 514 to the processor 508 for storage in the data storage device 510, linear referencing (wheel encoder based), geo-spatial referencing (GNSS receiver based), processing, and analysis. The processor 508 is programmed with algorithms for real-world profile coordinate correction (converting the oblique scan angle profile data to real-world coordinates) and for feature detection and assessment. Features can include the recognition of rail web manufacturer markings (including both branded (raised) and stamped (recessed) marks). These marks are repeated (typically every 4 to 8 feet) on a rail web of a rail on one or both sides of the rail. These marks include the weight of the rail (e.g., 115, 136, or 140 lb/yard rail), heat treating information, the manufacturer, and the year and lot/month of manufacture. Examples of 3D rail web elevation profile-based rail manufacturer marks are shown in
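As a simplified illustration, the oblique-to-real-world correction can be sketched as a 2D rotation of each profile point by the sensor mounting angle; the actual correction may also account for sensor position offsets and calibration, and the names below are assumptions.

```python
import math
from typing import List, Tuple


def correct_oblique_profile(profile: List[Tuple[float, float]],
                            oblique_angle_deg: float) -> List[Tuple[float, float]]:
    """Rotate (lateral, range) points from the tilted sensor frame into
    track-relative (horizontal, vertical) coordinates."""
    a = math.radians(oblique_angle_deg)
    cos_a, sin_a = math.cos(a), math.sin(a)
    corrected = []
    for lateral, rng in profile:
        horizontal = lateral * cos_a - rng * sin_a
        vertical = lateral * sin_a + rng * cos_a
        corrected.append((horizontal, vertical))
    return corrected
```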
After the alphanumeric markings are processed and recorded, in real time or near real time, the system 500 can access the onboard database 576A or the cloud-based database 576B via a wireless transmitter/receiver 578 in communication with the processor 508. By accessing the database(s) 576, the system 500 is then informed on the specific design specifications for that specific section of rail being analyzed. Current measurements of this section of rail made by the system 500 can then be compared to design specifications for the section of rail to determine changes that have occurred to that rail section over time. The system uses various rail processing steps for this analysis and it preferably processes the data streams from all four 3D sensors simultaneously or substantially simultaneously.
With additional reference to
The processor 508 can also determine rail cant angle by determining the angle that the vertical centerline of the combined field and gauge rail profile (the calculated centerline of the measured rail cross-section) makes with the plane of the tie surfaces to which the rails are fastened (Step 822).
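A minimal sketch of one way a cant angle could be computed from a profile, given a direction vector for the measured rail centerline and a direction vector along the tie surface (both hypothetical inputs derived upstream):

```python
import math
from typing import Tuple


def rail_cant_angle_deg(centerline_vec: Tuple[float, float],
                        tie_surface_vec: Tuple[float, float]) -> float:
    """Angle between the rail's vertical centerline and the perpendicular to
    the tie surface, within a single cross-sectional profile (degrees)."""
    cx, cy = centerline_vec     # direction of the measured rail centerline
    tx, ty = tie_surface_vec    # direction along the tie surface plane
    dot = cx * tx + cy * ty
    norm = math.hypot(cx, cy) * math.hypot(tx, ty)
    angle_between = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return 90.0 - angle_between  # 0 degrees when the rail stands square to the ties
```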
The processor 508 can also locate tie surface pixels if a particular scanline includes a tie (Step 824) by identifying regions of the profile where the surface normal variation (or the profile gradient variation) is low, representing a smooth region of the correct dimension (typically 8 to 10 inches wide) and placement (next to both rails). This information can be used to help determine pad thickness between a rail base and a tie under the rail base (Step 826). More details regarding these steps are discussed below and shown in
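A hedged sketch of locating candidate tie-surface spans in a single elevation profile by searching for regions of low gradient variation; the tolerance and minimum width (about 200 mm, i.e. roughly 8 inches) are placeholder values.

```python
from typing import List, Tuple


def find_tie_surface_regions(elevations_mm: List[float],
                             spacing_mm: float,
                             gradient_tol: float = 0.05,
                             min_width_mm: float = 200.0) -> List[Tuple[int, int]]:
    """Return (start, end) index pairs of smooth spans wide enough to be a tie
    surface (roughly 8 to 10 inches, i.e. about 200 to 250 mm)."""
    regions, start = [], None
    for i in range(1, len(elevations_mm)):
        gradient = abs(elevations_mm[i] - elevations_mm[i - 1]) / spacing_mm
        if gradient <= gradient_tol:
            if start is None:
                start = i - 1
        else:
            if start is not None and (i - start) * spacing_mm >= min_width_mm:
                regions.append((start, i))
            start = None
    if start is not None and (len(elevations_mm) - start) * spacing_mm >= min_width_mm:
        regions.append((start, len(elevations_mm)))
    return regions
```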
The processor 508 can also determine rail web surface elevation variations (or anomalies) (Step 828) by determining the difference between the localized rail web surface profile and the extended rail web median profile (a profile derived by calculating the median profile elevation at each point of the profile over a rail length of from about 5 meters to about 6 meters). If a large localized surface elevation difference of a minimum length (an anomaly) is detected, the processor 508 presumes that the anomaly represents a rail joint bar, and that joint is then analyzed (Step 830). This step can further include sub-steps such as linear and geospatial referencing of the joint, taking inventory of the joint bar (including size and type), determining the joint condition (e.g., whether it is broken or has missing bolts), measuring the width of any gap detected between the joined rail segment ends (the rail joint gap), and determining whether the joint is compliant with any required specifications from the railroad owner. An image reproduced from the combination of multiple scanlines of a joint bar is shown in
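A minimal sketch of flagging localized rail web anomalies against an extended median profile; the threshold, minimum anomaly length, and 5.5 m median window are illustrative stand-ins for the values used by the system.

```python
from statistics import median
from typing import List, Tuple


def web_elevation_anomalies(web_elevs_mm: List[float],
                            scan_interval_mm: float = 1.5,
                            median_window_m: float = 5.5,
                            elev_threshold_mm: float = 3.0,
                            min_length_mm: float = 150.0) -> List[Tuple[int, int]]:
    """Return scanline index ranges where the local rail web elevation departs
    from the extended median profile by more than a threshold over a minimum
    length (candidate joint bars, welds, bond wires, holes, etc.)."""
    half = int((median_window_m * 1000.0 / scan_interval_mm) / 2)
    anomalies, start = [], None
    for i, elev in enumerate(web_elevs_mm):
        window = web_elevs_mm[max(0, i - half): i + half + 1]
        if abs(elev - median(window)) > elev_threshold_mm:
            if start is None:
                start = i
        else:
            if start is not None and (i - start) * scan_interval_mm >= min_length_mm:
                anomalies.append((start, i))
            start = None
    if start is not None and (len(web_elevs_mm) - start) * scan_interval_mm >= min_length_mm:
        anomalies.append((start, len(web_elevs_mm)))
    return anomalies
```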
If a small localized surface elevation anomaly is detected, the processor 508 will presume it is a rail weld, which can be visually analyzed by the processor 508 (Step 832), and data related to this feature is recorded on the data storage device 510. If a wire-based localized elevation anomaly is detected, the processor 508 presumes it is a bond wire, which can be analyzed by the processor (Step 834), and data related to this feature is recorded on the data storage device 510. If a hole-based localized elevation anomaly is detected, the processor 508 will presume that it represents a rail hole, which can be analyzed by the processor 508 (Step 836), and data related to this feature is recorded to the data storage device 510. If a crack- or gap-based localized elevation anomaly is detected, the processor 508 presumes it represents a broken rail, which can be analyzed by the processor 508 (Step 838), and data related to this feature is recorded to the data storage device 510. The processor 508 can also determine elevation anomalies related to manufacturer markings (Step 840) discussed above in association with
The system 500 can also be used to locate railhead surface elevation variations (Step 842). If the processor 508 detects what it believes is a rail joint based on visual analysis (Step 844), the processor 508 can then visually analyze crushed railheads (Step 846), visually analyze battered railhead joints (Step 848), visually analyze misaligned rails (Step 850), and visually analyze rail height transitions (Step 852). For all of the features analyzed in steps 800-852, the system 500 can record and keep track of data associated with that particular location on the rail being interrogated, such information including time, linear referencing (wheel encoder based), and geo-spatial referencing (GNSS receiver based). Such information can be stored in the data storage device 510. Not all of the rail processing steps described above (800-852) have to occur together or in the specific order listed.
Because of the unique orientation of the sensors 502 relative to the rails being interrogated, the system 500 can also be used to make direct determinations of rail seat abrasion or plate cut. Computer executable instructions stored on a computer readable storage medium in communication with the processor 508 are used to run this algorithm, which is shown in a flowchart in
If the system 500 is detecting plate cut values, a slightly different algorithm is used because the estimated thickness of the tie plate must be accounted for. Computer executable instructions stored on a computer readable storage medium in communication with the processor 508 are used to run this algorithm, which is shown in a flowchart in
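As a hedged sketch only (the actual steps are shown in the referenced flowchart), a rail-head-referenced plate cut calculation could account for the tie plate thickness as follows, assuming the tie plate base lies one estimated plate thickness below the estimated rail base seat elevation; all names and this layering assumption are illustrative.

```python
def plate_cut_mm(crosstie_surface_elev_mm: float,
                 rail_head_elev_mm: float,
                 rail_height_estimate_mm: float,
                 rail_base_thickness_estimate_mm: float,
                 tie_plate_thickness_estimate_mm: float) -> float:
    """Plate cut referenced to the rail head, assuming the tie plate base lies
    one estimated plate thickness below the estimated rail base seat."""
    rail_base_seat_elev_mm = rail_head_elev_mm - (
        rail_height_estimate_mm + rail_base_thickness_estimate_mm)
    tie_plate_base_elev_mm = rail_base_seat_elev_mm - tie_plate_thickness_estimate_mm
    return crosstie_surface_elev_mm - tie_plate_base_elev_mm
```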
One of the advantages of the embodiments described herein is that some of the embodiments represent the first non-contact 3D measurement and analysis of rail webs for the purposes of rail manufacturing mark inventory (required for accurate rail asset management and not previously possible) at the network level. The ability to take this type of inventory allows regulatory compliance assessments for rail at the network level (confirming the right rail type for the place and use where it has been installed). Certain embodiments herein also represent the first 3D measurement and analysis of other side-of-rail hardware (joints, welds, bond wires, and rail pad thickness for PCC ties) not previously possible at the network level. These embodiments augment emerging downward-looking 3D track assessment technologies with the ability to view the ‘side’ of the rails, which are among the most critical components of the overall track structure. Such embodiments produce for the first time a more complete 3D view of the track surface, including such side views.
The foregoing description of preferred embodiments of the present disclosure has been presented for purposes of illustration and description. The described preferred embodiments are not intended to be exhaustive or to limit the scope of the disclosure to the precise form(s) disclosed. Obvious modifications or variations are possible in light of the above teachings. The embodiments are chosen and described in an effort to provide the best illustrations of the principles of the disclosure and its practical application, and to thereby enable one of ordinary skill in the art to utilize the concepts revealed in the disclosure in various embodiments and with various modifications as are suited to the particular use contemplated. All such modifications and variations are within the scope of the disclosure as determined by the appended claims when interpreted in accordance with the breadth to which they are fairly, legally, and equitably entitled.
This application is a continuation of and claims priority to U.S. Nonprovisional application Ser. No. 16/889,016 entitled “APPARATUS AND METHOD FOR GATHERING DATA FROM SENSORS ORIENTED AT AN OBLIQUE ANGLE RELATIVE TO A RAILWAY TRACK” which was filed on Jun. 1, 2020, which is a continuation of and claims priority to U.S. Nonprovisional application Ser. No. 16/255,928 entitled “APPARATUS AND METHOD FOR GATHERING DATA FROM SENSORS ORIENTED AT AN OBLIQUE ANGLE RELATIVE TO A RAILWAY TRACK” which was filed on Jan. 24, 2019, which is a continuation-in-part of and claims priority to U.S. Nonprovisional application Ser. No. 16/127,956, now U.S. Pat. No. 10,625,760, entitled “APPARATUS AND METHOD FOR CALCULATING WOODEN CROSSTIE PLATE CUT MEASUREMENTS AND RAIL SEAT ABRASION MEASUREMENTS BASED ON RAIL HEAD HEIGHT” which was filed on Sep. 11, 2018, which claims priority to U.S. Provisional Application Ser. No. 62/679,467 entitled “APPARATUS AND METHOD FOR CALCULATING WOODEN TIE PLATE CUT MEASUREMENTS AND RAIL SEAT ABRASION MEASUREMENTS” which was filed on Jun. 1, 2018, each of which is incorporated herein by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
3562419 | Stewart et al. | Feb 1971 | A |
3942000 | Dieringer | Mar 1976 | A |
4040738 | Wagner | Aug 1977 | A |
4198164 | Cantor | Apr 1980 | A |
4265545 | Slaker | May 1981 | A |
4330775 | Iwamoto et al. | May 1982 | A |
4490038 | Theurer et al. | Dec 1984 | A |
4531837 | Panetti | Jul 1985 | A |
4554624 | Wickham et al. | Nov 1985 | A |
4600012 | Kohayakawa et al. | Jul 1986 | A |
4653316 | Fukuhara | Mar 1987 | A |
4676642 | French | Jun 1987 | A |
4691565 | Theurer | Sep 1987 | A |
4700223 | Shoutaro et al. | Oct 1987 | A |
4731853 | Hata | Mar 1988 | A |
4775238 | Weber | Oct 1988 | A |
4781060 | Berndt | Nov 1988 | A |
4899296 | Khattak | Feb 1990 | A |
4900153 | Weber et al. | Feb 1990 | A |
4915504 | Thurston | Apr 1990 | A |
4974168 | Marx | Nov 1990 | A |
5199176 | Theurer et al. | Apr 1993 | A |
5203089 | Trefouel et al. | Apr 1993 | A |
5221044 | Guins | Jun 1993 | A |
5245855 | Burgel et al. | Sep 1993 | A |
5247338 | Danneskiold-Samsoe et al. | Sep 1993 | A |
5275051 | De Beer | Jan 1994 | A |
5353512 | Theurer et al. | Oct 1994 | A |
5433111 | Hershey et al. | Jul 1995 | A |
5487341 | Newman | Jan 1996 | A |
5493499 | Theurer et al. | Feb 1996 | A |
5612538 | Hackel et al. | Mar 1997 | A |
5623244 | Cooper | Apr 1997 | A |
5627508 | Cooper et al. | May 1997 | A |
5671679 | Straub et al. | Sep 1997 | A |
5721685 | Holland et al. | Feb 1998 | A |
5743495 | Welles | Apr 1998 | A |
5744815 | Gurevich et al. | Apr 1998 | A |
5757472 | Wangler et al. | May 1998 | A |
5786750 | Cooper | Jul 1998 | A |
5787815 | Andersson et al. | Aug 1998 | A |
5791063 | Kesler et al. | Aug 1998 | A |
5793491 | Wangler et al. | Aug 1998 | A |
5793492 | Vanaki | Aug 1998 | A |
5804731 | Jaeggi | Sep 1998 | A |
5808906 | Sanchez-Revuelta et al. | Sep 1998 | A |
5912451 | Gurevich et al. | Jun 1999 | A |
5969323 | Gurevich | Oct 1999 | A |
5970438 | Clark et al. | Oct 1999 | A |
5986547 | Korver et al. | Nov 1999 | A |
6025920 | Dec | Feb 2000 | A |
6055322 | Salganicoff | Apr 2000 | A |
6055862 | Martens | May 2000 | A |
6062476 | Stern et al. | May 2000 | A |
6064428 | Trosino et al. | May 2000 | A |
6069967 | Rozmus et al. | May 2000 | A |
6128558 | Kernwein | Oct 2000 | A |
6243657 | Tuck et al. | Jun 2001 | B1 |
6252977 | Salganicoff | Jun 2001 | B1 |
6324912 | Wooh | Dec 2001 | B1 |
6347265 | Bidaud | Feb 2002 | B1 |
6356299 | Trosino et al. | Mar 2002 | B1 |
6357297 | Makino et al. | Mar 2002 | B1 |
6405141 | Carr et al. | Jun 2002 | B1 |
6416020 | Gronskov | Jul 2002 | B1 |
6496254 | Bostrom | Dec 2002 | B2 |
6523411 | Mian et al. | Feb 2003 | B1 |
6540180 | Anderson | Apr 2003 | B2 |
6570497 | Puckette, IV | May 2003 | B2 |
6600999 | Clark et al. | Jul 2003 | B2 |
6615648 | Ferguson et al. | Sep 2003 | B1 |
6634112 | Carr et al. | Oct 2003 | B2 |
6647891 | Holmes et al. | Nov 2003 | B2 |
6665066 | Nair et al. | Dec 2003 | B2 |
6681160 | Bidaud | Jan 2004 | B2 |
6698279 | Stevenson | Mar 2004 | B1 |
6715354 | Wooh | Apr 2004 | B2 |
6768551 | Mian et al. | Jul 2004 | B2 |
6768959 | Ignagni | Jul 2004 | B2 |
6804621 | Pedanckar | Oct 2004 | B1 |
6854333 | Wooh | Feb 2005 | B2 |
6862936 | Kenderian et al. | Mar 2005 | B2 |
6873998 | Dorum | Mar 2005 | B1 |
6909514 | Nayebi | Jun 2005 | B2 |
6976324 | Theurer et al. | Dec 2005 | B2 |
6995556 | Nejikovsky et al. | Feb 2006 | B2 |
7023539 | Kowalski | Apr 2006 | B2 |
7034272 | Leonard | Apr 2006 | B1 |
7036232 | Casagrande | May 2006 | B2 |
7054762 | Pagano et al. | May 2006 | B2 |
7084989 | Johannesson et al. | Aug 2006 | B2 |
7130753 | Pedanekar | Oct 2006 | B2 |
7152347 | Herzog et al. | Dec 2006 | B2 |
7164476 | Shima et al. | Jan 2007 | B2 |
7164975 | Bidaud | Jan 2007 | B2 |
7208733 | Mian et al. | Apr 2007 | B2 |
7213789 | Matzan | May 2007 | B1 |
7298548 | Mian | Nov 2007 | B2 |
7328871 | Mace et al. | Feb 2008 | B2 |
7355508 | Mian et al. | Apr 2008 | B2 |
7357326 | Hattersley et al. | Apr 2008 | B2 |
7392117 | Bilodeau et al. | Jun 2008 | B1 |
7392595 | Heimann | Jul 2008 | B2 |
7394553 | Carr et al. | Jul 2008 | B2 |
7403296 | Farritor et al. | Jul 2008 | B2 |
7412899 | Mian et al. | Aug 2008 | B2 |
7463348 | Chung | Dec 2008 | B2 |
7499186 | Waisanen | Mar 2009 | B2 |
7502670 | Harrison | Mar 2009 | B2 |
7516662 | Nieisen et al. | Apr 2009 | B2 |
7555954 | Pagano et al. | Jul 2009 | B2 |
7564569 | Mian et al. | Jul 2009 | B2 |
7602937 | Mian et al. | Oct 2009 | B2 |
7616329 | Villar et al. | Nov 2009 | B2 |
7659972 | Magnus et al. | Feb 2010 | B2 |
7680631 | Selig et al. | Mar 2010 | B2 |
7681468 | Verl et al. | Mar 2010 | B2 |
7698028 | Bilodeau et al. | Apr 2010 | B1 |
7755660 | Nejikovsky et al. | Jul 2010 | B2 |
7755774 | Farritor et al. | Jul 2010 | B2 |
7769538 | Rousseau | Aug 2010 | B2 |
7832281 | Mian et al. | Nov 2010 | B2 |
7869909 | Harrison | Jan 2011 | B2 |
7882742 | Martens | Feb 2011 | B1 |
7899207 | Mian et al. | Mar 2011 | B2 |
7920984 | Farritor | Apr 2011 | B2 |
7937246 | Farritor et al. | May 2011 | B2 |
7942058 | Turner | May 2011 | B2 |
8006559 | Mian et al. | Aug 2011 | B2 |
8079274 | Mian et al. | Dec 2011 | B2 |
8081320 | Villar et al. | Dec 2011 | B2 |
8111387 | Douglas et al. | Feb 2012 | B2 |
8140250 | Mian et al. | Mar 2012 | B2 |
8150105 | Mian et al. | Apr 2012 | B2 |
8155809 | Bilodeau et al. | Apr 2012 | B1 |
8180590 | Szwilski et al. | May 2012 | B2 |
8188430 | Mian | May 2012 | B2 |
8190377 | Fu | May 2012 | B2 |
8209145 | Paglinco et al. | Jun 2012 | B2 |
8263953 | Fomenkar et al. | Sep 2012 | B2 |
8289526 | Kilian et al. | Oct 2012 | B2 |
8326582 | Mian et al. | Dec 2012 | B2 |
8335606 | Mian et al. | Dec 2012 | B2 |
8345948 | Zarembski et al. | Jan 2013 | B2 |
8352410 | Rousselle et al. | Jan 2013 | B2 |
8345099 | Bloom et al. | Feb 2013 | B2 |
8365604 | Kahn | Feb 2013 | B2 |
8405837 | Nagle, II et al. | Mar 2013 | B2 |
8412393 | Anderson | Apr 2013 | B2 |
8418563 | Wigh et al. | Apr 2013 | B2 |
8423240 | Mian | Apr 2013 | B2 |
8424387 | Wigh et al. | Apr 2013 | B2 |
8478480 | Mian et al. | Jul 2013 | B2 |
8485035 | Wigh et al. | Jul 2013 | B2 |
8490887 | Jones | Jul 2013 | B2 |
8514387 | Scherf et al. | Aug 2013 | B2 |
8577647 | Farritor et al. | Nov 2013 | B2 |
8615110 | Landes | Dec 2013 | B2 |
8625878 | Haas et al. | Jan 2014 | B2 |
8649932 | Mian et al. | Feb 2014 | B2 |
8655540 | Mian et al. | Feb 2014 | B2 |
8682077 | Longacre, Jr. | Mar 2014 | B1 |
8700924 | Mian et al. | Apr 2014 | B2 |
8711222 | Aaron et al. | Apr 2014 | B2 |
8724904 | Fujiki | May 2014 | B2 |
8806948 | Kahn et al. | Aug 2014 | B2 |
8818585 | Bartonek | Aug 2014 | B2 |
8820166 | Wigh et al. | Sep 2014 | B2 |
8868291 | Mian et al. | Oct 2014 | B2 |
8875635 | Turner et al. | Nov 2014 | B2 |
8887572 | Turner | Nov 2014 | B2 |
8903574 | Cooper et al. | Dec 2014 | B2 |
8925873 | Gamache et al. | Jan 2015 | B2 |
8934007 | Snead | Jan 2015 | B2 |
8942426 | Bar-Am | Jan 2015 | B2 |
8958079 | Kainer et al. | Feb 2015 | B2 |
9036025 | Haas et al. | May 2015 | B2 |
9049433 | Prince | Jun 2015 | B1 |
9050984 | Li et al. | Jun 2015 | B2 |
9111444 | Kaganovich | Aug 2015 | B2 |
9121747 | Mian et al. | Sep 2015 | B2 |
9134185 | Mian et al. | Sep 2015 | B2 |
9175998 | Turner et al. | Nov 2015 | B2 |
9177210 | King | Nov 2015 | B2 |
9187104 | Fang et al. | Nov 2015 | B2 |
9195907 | Longacre, Jr. | Nov 2015 | B1 |
9205849 | Cooper et al. | Dec 2015 | B2 |
9205850 | Shimada | Dec 2015 | B2 |
9212902 | Enomoto et al. | Dec 2015 | B2 |
9222904 | Harrison | Dec 2015 | B2 |
9234786 | Groll et al. | Jan 2016 | B2 |
9255913 | Kumar et al. | Feb 2016 | B2 |
9297787 | Fisk | Mar 2016 | B2 |
9310340 | Mian et al. | Apr 2016 | B2 |
9336683 | Inomata et al. | May 2016 | B2 |
9340219 | Gamache et al. | May 2016 | B2 |
9346476 | Dargy et al. | May 2016 | B2 |
9347864 | Farritor et al. | May 2016 | B2 |
9389205 | Mian et al. | Jul 2016 | B2 |
9415784 | Bartonek et al. | Aug 2016 | B2 |
9423415 | Nanba et al. | Aug 2016 | B2 |
9429545 | Havira et al. | Aug 2016 | B2 |
9441956 | Kainer et al. | Sep 2016 | B2 |
9446776 | Cooper et al. | Sep 2016 | B2 |
9454816 | Mian et al. | Sep 2016 | B2 |
9469198 | Cooper et al. | Oct 2016 | B2 |
9518947 | Bartonek et al. | Dec 2016 | B2 |
9533698 | Warta | Jan 2017 | B2 |
9562878 | Graham et al. | Feb 2017 | B2 |
9571796 | Mian et al. | Feb 2017 | B2 |
9575007 | Rao et al. | Feb 2017 | B2 |
9580091 | Kraeling et al. | Feb 2017 | B2 |
9581998 | Cooper et al. | Feb 2017 | B2 |
9607446 | Cooper et al. | Mar 2017 | B2 |
9618335 | Mesher | Apr 2017 | B2 |
9619691 | Pang et al. | Apr 2017 | B2 |
9619725 | King | Apr 2017 | B2 |
9628762 | Farritor | Apr 2017 | B2 |
9664567 | Sivathanu et al. | May 2017 | B2 |
9669852 | Combs | Jun 2017 | B2 |
9671358 | Cooper et al. | Jun 2017 | B2 |
9689760 | Lanza di Scalea et al. | Jun 2017 | B2 |
9714043 | Mian et al. | Jul 2017 | B2 |
9744978 | Bhattacharjya et al. | Aug 2017 | B2 |
9752993 | Thompson et al. | Sep 2017 | B1 |
9771090 | Warta | Sep 2017 | B2 |
9796400 | Puttagunta et al. | Oct 2017 | B2 |
9810533 | Fosburgh et al. | Nov 2017 | B2 |
9822492 | Hartl et al. | Nov 2017 | B2 |
9825662 | Mian et al. | Nov 2017 | B2 |
9849894 | Mesher | Dec 2017 | B2 |
9849895 | Mesher | Dec 2017 | B2 |
9860962 | Mesher | Jan 2018 | B2 |
9873442 | Mesher | Jan 2018 | B2 |
9921584 | Rao et al. | Mar 2018 | B2 |
9922416 | Mian et al. | Mar 2018 | B2 |
9950716 | English | Apr 2018 | B2 |
9950720 | Mesher | Apr 2018 | B2 |
9981671 | Fraser et al. | May 2018 | B2 |
9981675 | Cooper et al. | May 2018 | B2 |
9983593 | Cooper et al. | May 2018 | B2 |
9989498 | Lanza di Scalea et al. | Jun 2018 | B2 |
10035498 | Richardson et al. | Jul 2018 | B2 |
10040463 | Singh | Aug 2018 | B2 |
10043154 | King | Aug 2018 | B2 |
10077061 | Schmidt et al. | Sep 2018 | B2 |
10081376 | Singh | Sep 2018 | B2 |
10086857 | Puttagunta et al. | Oct 2018 | B2 |
10167003 | Bilodeau | Jan 2019 | B1 |
10286877 | Lopez Galera et al. | May 2019 | B2 |
10322734 | Mesher | Jun 2019 | B2 |
10349491 | Mesher | Jul 2019 | B2 |
10352831 | Kondo et al. | Jul 2019 | B2 |
10362293 | Mesher | Jul 2019 | B2 |
10370014 | Matson et al. | Aug 2019 | B2 |
10384697 | Mesher | Aug 2019 | B2 |
10392035 | Berggren | Aug 2019 | B2 |
10401500 | Yang et al. | Sep 2019 | B2 |
10408606 | Raab | Sep 2019 | B1 |
10414416 | Hampapur | Sep 2019 | B2 |
10502831 | Eichenholz | Dec 2019 | B2 |
10518791 | Singh | Dec 2019 | B2 |
10543861 | Bartek et al. | Jan 2020 | B1 |
10582187 | Mesher | Mar 2020 | B2 |
10611389 | Khosla | Apr 2020 | B2 |
10613550 | Khosla | Apr 2020 | B2 |
10616556 | Mesher | Apr 2020 | B2 |
10616557 | Mesher | Apr 2020 | B2 |
10616558 | Mesher | Apr 2020 | B2 |
10618537 | Khosla | Apr 2020 | B2 |
10625760 | Mesher | Apr 2020 | B2 |
10730538 | Mesher | Aug 2020 | B2 |
10752271 | Chung et al. | Aug 2020 | B2 |
10796192 | Fernandez | Oct 2020 | B2 |
10816347 | Wygant et al. | Oct 2020 | B2 |
10822008 | Wade | Nov 2020 | B2 |
10829135 | Anderson et al. | Nov 2020 | B2 |
10919546 | Llorenty et al. | Feb 2021 | B1 |
10953899 | Chung et al. | Mar 2021 | B2 |
10954637 | Kaiser | Mar 2021 | B2 |
10989694 | Kawabata et al. | Apr 2021 | B2 |
11001283 | Dick et al. | May 2021 | B2 |
11046340 | Matson et al. | Jun 2021 | B2 |
11107233 | Saniei et al. | Aug 2021 | B2 |
11169269 | Mesher | Nov 2021 | B2 |
11196981 | Mesher | Dec 2021 | B2 |
11259007 | Mesher | Feb 2022 | B2 |
11338832 | Brick et al. | May 2022 | B1 |
11358617 | Dick et al. | Jun 2022 | B2 |
11377130 | Mesher | Jul 2022 | B2 |
11399172 | Mesher | Jul 2022 | B2 |
11427232 | Davis et al. | Aug 2022 | B2 |
11433931 | Chung et al. | Sep 2022 | B2 |
11479281 | Pick et al. | Oct 2022 | B2 |
20010045495 | Olson et al. | Nov 2001 | A1 |
20020065610 | Clark et al. | May 2002 | A1 |
20020070283 | Young | Jun 2002 | A1 |
20020093487 | Rosenberg | Jul 2002 | A1 |
20020099507 | Clark et al. | Jul 2002 | A1 |
20020150278 | Wustefeld | Oct 2002 | A1 |
20020196456 | Komiya et al. | Dec 2002 | A1 |
20030059087 | Waslowski et al. | Mar 2003 | A1 |
20030062414 | Tsikos et al. | Apr 2003 | A1 |
20030072001 | Mian et al. | Apr 2003 | A1 |
20030075675 | Braune et al. | Apr 2003 | A1 |
20030140509 | Casagrande | Jul 2003 | A1 |
20030160193 | Sanchez Revuelta et al. | Aug 2003 | A1 |
20030164053 | Ignagni | Sep 2003 | A1 |
20040021858 | Shima et al. | Feb 2004 | A1 |
20040084069 | Woodard | May 2004 | A1 |
20040088891 | Theurer | May 2004 | A1 |
20040095135 | Nejikovsky et al. | May 2004 | A1 |
20040122569 | Bidaud | Jun 2004 | A1 |
20040189452 | Li | Sep 2004 | A1 |
20040247157 | Lages | Dec 2004 | A1 |
20040263624 | Nejikovsky | Dec 2004 | A1 |
20050121539 | Takada et al. | Jun 2005 | A1 |
20050244585 | Schmeling | Nov 2005 | A1 |
20050279240 | Pedanekar et al. | Dec 2005 | A1 |
20060017911 | Villar | Jan 2006 | A1 |
20060098843 | Chew | May 2006 | A1 |
20060171704 | Bingle | Aug 2006 | A1 |
20060231685 | Mace et al. | Oct 2006 | A1 |
20070136029 | Selig et al. | Jun 2007 | A1 |
20070150130 | Welles | Jun 2007 | A1 |
20070211145 | Kilian et al. | Sep 2007 | A1 |
20070265780 | Kesler et al. | Nov 2007 | A1 |
20070289478 | Becker et al. | Dec 2007 | A1 |
20080007724 | Chung | Jan 2008 | A1 |
20080177507 | Mian et al. | Jul 2008 | A1 |
20080212106 | Hoffmann | Sep 2008 | A1 |
20080298674 | Baker | Dec 2008 | A1 |
20080304065 | Hesser | Dec 2008 | A1 |
20080304083 | Farritor et al. | Dec 2008 | A1 |
20090040503 | Kilian | Feb 2009 | A1 |
20090073428 | Magnus | Mar 2009 | A1 |
20090196486 | Distante et al. | Aug 2009 | A1 |
20090250533 | Akiyama et al. | Oct 2009 | A1 |
20090273788 | Nagle et al. | Nov 2009 | A1 |
20090319197 | Villar et al. | Dec 2009 | A1 |
20100007551 | Pagliuco | Jan 2010 | A1 |
20100026551 | Szwilski | Feb 2010 | A1 |
20100106309 | Grohman et al. | Apr 2010 | A1 |
20100207936 | Minear | Aug 2010 | A1 |
20100289891 | Akiyama | Nov 2010 | A1 |
20110064273 | Zarembski et al. | Mar 2011 | A1 |
20110209549 | Kahn | Sep 2011 | A1 |
20110251742 | Haas et al. | Oct 2011 | A1 |
20120026352 | Natroshvilli et al. | Feb 2012 | A1 |
20120051643 | Ha et al. | Mar 2012 | A1 |
20120062731 | Enomoto et al. | Mar 2012 | A1 |
20120192756 | Miller et al. | Aug 2012 | A1 |
20120216618 | Bloom et al. | Aug 2012 | A1 |
20120218868 | Kahn et al. | Aug 2012 | A1 |
20120222579 | Turner et al. | Sep 2012 | A1 |
20120245908 | Berggren | Sep 2012 | A1 |
20120263342 | Haas | Oct 2012 | A1 |
20120300060 | Farritor | Nov 2012 | A1 |
20130070083 | Snead | Mar 2013 | A1 |
20130092758 | Tanaka et al. | Apr 2013 | A1 |
20130096739 | Landes et al. | Apr 2013 | A1 |
20130155061 | Jahanashahi et al. | Jun 2013 | A1 |
20130170709 | Distante et al. | Jul 2013 | A1 |
20130191070 | Kainer | Jul 2013 | A1 |
20130202090 | Belcher et al. | Aug 2013 | A1 |
20130230212 | Landes | Sep 2013 | A1 |
20130231873 | Fraser et al. | Sep 2013 | A1 |
20130276539 | Wagner et al. | Oct 2013 | A1 |
20130313372 | Gamache et al. | Nov 2013 | A1 |
20130317676 | Cooper et al. | Nov 2013 | A1 |
20140069193 | Graham et al. | Mar 2014 | A1 |
20140129154 | Cooper | May 2014 | A1 |
20140142868 | Bidaud | May 2014 | A1 |
20140151512 | Cooper | Jun 2014 | A1 |
20140177656 | Mian et al. | Jun 2014 | A1 |
20140200952 | Hampapur et al. | Jul 2014 | A1 |
20140333771 | Mian et al. | Nov 2014 | A1 |
20140339374 | Mian et al. | Nov 2014 | A1 |
20150106038 | Turner | Apr 2015 | A1 |
20150131108 | Kainer et al. | May 2015 | A1 |
20150219487 | Maraini | Aug 2015 | A1 |
20150225002 | Branka et al. | Aug 2015 | A1 |
20150268172 | Naithani et al. | Sep 2015 | A1 |
20150269722 | Naithani et al. | Sep 2015 | A1 |
20150284912 | Delmonic et al. | Oct 2015 | A1 |
20150285688 | Naithani et al. | Oct 2015 | A1 |
20150375765 | Mustard | Dec 2015 | A1 |
20160002865 | English et al. | Jan 2016 | A1 |
20160039439 | Fahmy et al. | Feb 2016 | A1 |
20160059623 | Kilian | Mar 2016 | A1 |
20160121912 | Puttagunta et al. | May 2016 | A1 |
20160159381 | Fahmy | Jun 2016 | A1 |
20160207551 | Mesher | Jul 2016 | A1 |
20160209003 | Mesher | Jul 2016 | A1 |
20160212826 | Mesher | Jul 2016 | A1 |
20160221592 | Puttagunta | Aug 2016 | A1 |
20160249040 | Mesher | Aug 2016 | A1 |
20160282108 | Martinod Restrepo et al. | Sep 2016 | A1 |
20160304104 | Witte et al. | Oct 2016 | A1 |
20160305915 | Witte et al. | Oct 2016 | A1 |
20160312412 | Schrunk, III | Oct 2016 | A1 |
20160318530 | Johnson | Nov 2016 | A1 |
20160321513 | Mitti et al. | Nov 2016 | A1 |
20160325767 | LeFabvre et al. | Nov 2016 | A1 |
20160368510 | Simon et al. | Dec 2016 | A1 |
20170029001 | Berggren | Feb 2017 | A1 |
20170034892 | Mesher | Feb 2017 | A1 |
20170066459 | Singh | Mar 2017 | A1 |
20170106885 | Singh | Apr 2017 | A1 |
20170106887 | Mian et al. | Apr 2017 | A1 |
20170182980 | Davies et al. | Jun 2017 | A1 |
20170203775 | Mesher | Jul 2017 | A1 |
20170205379 | Prince et al. | Jul 2017 | A1 |
20170219471 | Fisk et al. | Aug 2017 | A1 |
20170267264 | English et al. | Sep 2017 | A1 |
20170297536 | Giraud et al. | Oct 2017 | A1 |
20170305442 | Viviani | Oct 2017 | A1 |
20170313286 | Giraud et al. | Nov 2017 | A1 |
20170313332 | Paget et al. | Nov 2017 | A1 |
20170336293 | Kondo et al. | Nov 2017 | A1 |
20180038957 | Kawazoe et al. | Feb 2018 | A1 |
20180039842 | Schuchmann et al. | Feb 2018 | A1 |
20180057030 | Puttagunta et al. | Mar 2018 | A1 |
20180079433 | Mesher | Mar 2018 | A1 |
20180079434 | Mesher | Mar 2018 | A1 |
20180106000 | Fruehwirt | Apr 2018 | A1 |
20180120440 | O'Keefe | May 2018 | A1 |
20180127006 | Wade | May 2018 | A1 |
20180220512 | Mesher | Aug 2018 | A1 |
20180222504 | Birch et al. | Aug 2018 | A1 |
20180276494 | Fernandez | Sep 2018 | A1 |
20180281829 | Euston et al. | Oct 2018 | A1 |
20180297621 | Matson et al. | Oct 2018 | A1 |
20180339720 | Singh | Nov 2018 | A1 |
20180370552 | Puttagunta et al. | Dec 2018 | A1 |
20180372875 | Juelsgaard et al. | Dec 2018 | A1 |
20190039633 | Li | Feb 2019 | A1 |
20190054937 | Graetz | Feb 2019 | A1 |
20190107607 | Danziger | Apr 2019 | A1 |
20190135315 | Dargy et al. | May 2019 | A1 |
20190156569 | Jung et al. | May 2019 | A1 |
20190179026 | Englard et al. | Jun 2019 | A1 |
20190248393 | Khosla | Aug 2019 | A1 |
20190310470 | Weindorf et al. | Oct 2019 | A1 |
20190344813 | Kaiser et al. | Nov 2019 | A1 |
20190349563 | Mesher | Nov 2019 | A1 |
20190349564 | Mesher | Nov 2019 | A1 |
20190349565 | Mesher | Nov 2019 | A1 |
20190349566 | Mesher | Nov 2019 | A1 |
20190357337 | Mesher | Nov 2019 | A1 |
20190367060 | Mesher | Dec 2019 | A1 |
20190367061 | Mesher | Dec 2019 | A1 |
20200025578 | Wygant et al. | Jan 2020 | A1 |
20200034637 | Olson et al. | Jan 2020 | A1 |
20200086903 | Mesher | Mar 2020 | A1 |
20200116865 | Yang et al. | Apr 2020 | A1 |
20200122753 | Buerger | Apr 2020 | A1 |
20200156677 | Mesher | May 2020 | A1 |
20200160733 | Dick et al. | May 2020 | A1 |
20200164904 | Dick et al. | May 2020 | A1 |
20200180667 | Kim et al. | Jun 2020 | A1 |
20200198672 | Underwood et al. | Jun 2020 | A1 |
20200221066 | Mesher | Jul 2020 | A1 |
20200231193 | Chen et al. | Jul 2020 | A1 |
20200239049 | Dick et al. | Jul 2020 | A1 |
20200302592 | Ebersohn et al. | Sep 2020 | A1 |
20200346673 | Mesher | Nov 2020 | A1 |
20200361502 | Metzger | Nov 2020 | A1 |
20200363532 | Mesher | Nov 2020 | A1 |
20200400542 | Fisk et al. | Dec 2020 | A1 |
20210019548 | Fernandez | Jan 2021 | A1 |
20210041398 | Van Wyk et al. | Feb 2021 | A1 |
20210041877 | Lacaze et al. | Feb 2021 | A1 |
20210061322 | Dick et al. | Mar 2021 | A1 |
20210072393 | Mesher | Mar 2021 | A1 |
20210078622 | Miller et al. | Mar 2021 | A1 |
20210229714 | Dick et al. | Jul 2021 | A1 |
20210327087 | Saniei et al. | Oct 2021 | A1 |
20210370993 | Qian | Dec 2021 | A1 |
20210396685 | Qian | Dec 2021 | A1 |
20210403060 | Pertosa | Dec 2021 | A1 |
20220035037 | Mesher | Feb 2022 | A1 |
20220116580 | Mesher | Apr 2022 | A1 |
20220189001 | Fernandez | Jun 2022 | A1 |
20220242466 | Brick et al. | Aug 2022 | A1 |
20220258779 | Dick et al. | Aug 2022 | A1 |
20220324497 | Brick et al. | Oct 2022 | A1 |
Number | Date | Country |
---|---|---|
2019338073 | Aug 2021 | AU |
2061014 | Aug 1992 | CA |
2069971 | Mar 1993 | CA |
2574428 | Feb 2006 | CA |
2607634 | Apr 2008 | CA |
2574428 | Oct 2009 | CA |
2782341 | Jun 2011 | CA |
2844113 | Feb 2013 | CA |
2986580 | Sep 2014 | CA |
2867560 | Apr 2015 | CA |
2607634 | Jun 2015 | CA |
2945614 | Oct 2015 | CA |
2945614 | Oct 2015 | CA |
2732971 | Jan 2016 | CA |
2996128 | Mar 2016 | CA |
2860073 | May 2016 | CA |
2867560 | Jul 2017 | CA |
2955105 | Jul 2017 | CA |
3042136 | Jun 2018 | CA |
3070280 | Jul 2021 | CA |
104751602 | Jul 2015 | CN |
106291538 | Jan 2017 | CN |
106364503 | Feb 2017 | CN |
106373191 | Feb 2017 | CN |
106384190 | Feb 2017 | CN |
104535652 | Jun 2017 | CN |
107688024 | Feb 2018 | CN |
206984011 | Feb 2018 | CN |
108009484 | May 2018 | CN |
108657222 | Oct 2018 | CN |
113626975 | Nov 2021 | CN |
19831176 | Jan 2000 | DE |
19831215 | Jan 2000 | DE |
10040139 | Jul 2002 | DE |
19826422 | Sep 2002 | DE |
60015268 | Mar 2005 | DE |
19943744 | Jan 2006 | DE |
19919604 | Aug 2009 | DE |
102012207427 | Jul 2013 | DE |
102009018036 | Feb 2014 | DE |
102014119056 | Jun 2016 | DE |
0274081 | Jul 1988 | EP |
1079322 | Feb 2001 | EP |
1146353 | Oct 2001 | EP |
1158460 | Nov 2001 | EP |
1168269 | Jan 2002 | EP |
1197417 | Apr 2002 | EP |
1236634 | Sep 2002 | EP |
1098803 | Jan 2003 | EP |
1285225 | Jul 2004 | EP |
1600351 | Jan 2007 | EP |
1892503 | Jul 2007 | EP |
1918702 | May 2008 | EP |
1964026 | Sep 2008 | EP |
1992167 | May 2016 | EP |
3024123 | May 2016 | EP |
2806065 | Sep 2016 | EP |
3138753 | Mar 2017 | EP |
3138753 | Mar 2017 | EP |
3138754 | Mar 2017 | EP |
2697738 | Aug 2017 | EP |
2697738 | Aug 2017 | EP |
3351452 | Jul 2018 | EP |
2998927 | Sep 2018 | EP |
3431359 | Jan 2019 | EP |
3310963 | Mar 2019 | EP |
3420135 | Oct 2019 | EP |
3561501 | Oct 2019 | EP |
3442849 | Jan 2020 | EP |
3105599 | Apr 2020 | EP |
3433154 | Jun 2020 | EP |
3689706 | Aug 2020 | EP |
3554919 | Oct 2020 | EP |
3555365 | Oct 2020 | EP |
3580393 | Apr 2021 | EP |
3685117 | Nov 2021 | EP |
2674809 | Oct 1992 | FR |
3049255 | Sep 2017 | FR |
3077553 | Feb 2018 | FR |
3049255 | Apr 2018 | FR |
3052416 | Jul 2019 | FR |
3077553 | Aug 2019 | FR |
2265779 | Oct 1993 | GB |
2378344 | Feb 2003 | GB |
2383635 | Jun 2005 | GB |
2403861 | Mar 2006 | GB |
2419759 | Feb 2007 | GB |
2536746 | Sep 2016 | GB |
2536746 | Mar 2017 | GB |
51001138 | Jan 1976 | JP |
60039555 | Mar 1985 | JP |
63302314 | Dec 1988 | JP |
6011316 | Jan 1994 | JP |
06322707 | Nov 1994 | JP |
H07146131 | Jun 1995 | JP |
7280532 | Oct 1995 | JP |
H07294443 | Nov 1995 | JP |
H07294444 | Nov 1995 | JP |
10332324 | Dec 1998 | JP |
11172606 | Jun 1999 | JP |
2000221146 | Aug 2000 | JP |
2000241360 | Sep 2000 | JP |
H0924828 | Jul 2002 | JP |
2002294610 | Oct 2002 | JP |
2003074004 | Mar 2003 | JP |
2003121556 | Apr 2003 | JP |
2004132881 | Apr 2004 | JP |
2007240342 | Sep 2007 | JP |
4008082 | Nov 2007 | JP |
2010229642 | Oct 2010 | JP |
5283548 | Sep 2013 | JP |
5812595 | Nov 2015 | JP |
2015209205 | Nov 2015 | JP |
2016191264 | Nov 2016 | JP |
6068012 | Jan 2017 | JP |
2017020862 | Jan 2017 | JP |
6192717 | Sep 2017 | JP |
6327413 | May 2018 | JP |
6425990 | Nov 2018 | JP |
2019065650 | Apr 2019 | JP |
6530979 | Jun 2019 | JP |
101562635 | Oct 2015 | KR |
101706271 | Feb 2017 | KR |
1020180061929 | Jun 2018 | KR |
1020220043457 | Apr 2022 | KR |
2142892 | Dec 1999 | RU |
101851 | Jan 2011 | RU |
1418105 | Aug 1988 | SU |
200005576 | Feb 2000 | WO |
200008459 | Feb 2000 | WO |
2000-73118 | Dec 2000 | WO |
2001066401 | Sep 2001 | WO |
0186227 | Nov 2001 | WO |
2001066401 | May 2003 | WO |
2005036199 | Apr 2005 | WO |
2005036199 | Apr 2005 | WO |
2005098352 | Oct 2005 | WO |
2006008292 | Jan 2006 | WO |
2006014893 | Feb 2006 | WO |
2008146151 | Jan 2009 | WO |
2009007817 | Mar 2009 | WO |
2010091970 | Aug 2010 | WO |
2011002534 | Jan 2011 | WO |
2011126802 | Oct 2011 | WO |
2012142548 | Oct 2012 | WO |
2013146502 | Mar 2013 | WO |
2013177393 | Nov 2013 | WO |
2014017015 | Jan 2014 | WO |
2015003772 | Jan 2015 | WO |
2015160300 | Oct 2015 | WO |
2015165560 | Nov 2015 | WO |
2016008201 | Jan 2016 | WO |
2016027072 | Feb 2016 | WO |
2016007393 | Jul 2016 | WO |
2016168576 | Oct 2016 | WO |
2016168623 | Oct 2016 | WO |
2017159701 | Sep 2017 | WO |
2018010827 | Jan 2018 | WO |
2018158712 | Sep 2018 | WO |
2018207469 | Nov 2018 | WO |
2018208153 | Nov 2018 | WO |
2018210441 | Nov 2018 | WO |
2019023613 | Jan 2019 | WO |
2019023658 | Jan 2019 | WO |
2019023613 | Jan 2019 | WO |
2019023658 | Jan 2019 | WO |
2019086158 | May 2019 | WO |
2019149456 | Aug 2019 | WO |
2019212693 | Nov 2019 | WO |
2020053699 | Mar 2020 | WO |
2020078703 | Apr 2020 | WO |
2020232431 | Nov 2020 | WO |
2020232443 | Nov 2020 | WO |
2022058127 | Mar 2022 | WO |
2022087506 | Apr 2022 | WO |
2022111983 | Jun 2022 | WO |
2022130488 | Jun 2022 | WO |
2022130510 | Jun 2022 | WO |
2022133032 | Jun 2022 | WO |
Entry |
---|
US 8,548,242 B1, 10/2013, Longacre, Jr. (withdrawn) |
US Patent and Trademark Office, Non-Final Office Action for U.S. Appl. No. 14/725,490 dated Feb. 23, 2018. |
Shawn Landers et al., “Development and Calibration of a Pavement Surface Performance Measure and Prediction Models for the British Columbia Pavement Management System” (2002). |
Zheng Wu, “Hybrid Multi-Objective Optimization Models for Managing Pavement Assetts” (Jan. 25, 2008). |
“Pavement Condition Index 101”, OGRA's Milestones (Dec. 2009). |
“Rail Radar Bringing the Track Into the Office” presentation given to CN Rail Engineering on Jan. 21, 2011. |
Rail Radar, Inc. Industrial Research Assistance Program Application (IRAP) (Aug. 10, 2012). |
“Rail Radar Automated Track Assessment” paper distributed at the Association of American Railways (AAR) Transportation Test Center in Oct. 2010 by Rail Radar, Inc. |
US Patent and Trademark Office, Non-Final Office Action for U.S. Appl. No. 14/725,490 dated Mar. 30, 2017. |
US Patent and Trademark Office, Final Office Action for U.S. Appl. No. 14/725,490 dated Aug. 16, 2017. |
Kantor, et al., “Automatic Railway Classification Using Surface And Subsurface Measurements” Proceedings of the 3rd International Conference on Field and Service Robitics, pp. 43-48 (2001). |
Magnes, Daniel L., “Non-Contact Technology for Track Speed Rail Measurements (ORIAN)” SPIE vol. 2458, pp. 45-51 (1995). |
Ryabichenko, et al. “CCD Photonic System For Rail Width Measurement” SPIE vol. 3901, pp. 37-44 (1999). |
Gingras, Dennis, “Optics and Photonics Used in Road Transportation” (1998). |
Liviu Bursanescu and François Blais, “Automated Pavement Distress Data Collection and Analysis: a 3-D Approach” (1997). |
US Patent and Trademark Office, Non-Final Office Action for U.S. Appl. No. 14/724,925 dated Feb. 26, 2016. |
US Patent and Trademark Office, Non-Final Office Action for U.S. Appl. No. 14/724,890 dated Jul. 29, 2016. |
US Patent and Trademark Office, Final Office Action for U.S. Appl. No. 14/724,890 dated Nov. 10, 2016. |
US Patent and Trademark Office, Non-Final Office Action for U.S. Appl. No. 14/724,890 dated Mar. 24, 2017. |
Korean Intellectual Property Office, International Search Report for Int. App. No. PCT/IB2018/058574 dated Feb. 27, 2019. |
Korean Intellectual Property Office, Written Opinion of the International Searching Authority for Int. App. No. PCT/IB2018/058574 dated Feb. 27, 2019. |
US Patent and Trademark Office, Non-Final Office Action for U.S. Appl. No. 16/127,956 dated Dec. 31, 2018. |
D.D. Davis et al., “Tie Performance—A Progress Report of the Des Plaines Test Site,” Report No. R-746, Association of American Railroads Research and Test Department (Apr. 1990). |
Mattias Johanneson, “Architectures for Sheet-of-Light Range Imaging,” Report No. LiTH-ISY-I-1335, Image Processing Group, Department of Electrical Engineering, Linköping University (Feb. 27, 1992). |
U.S. Appl. No. 60/584,769, “System & Method For Inspecting Railroad Track” by John Nagle & Steven C. Orrell. |
Mattias Johannesson, “Sheet-of-light Range Imaging,” Linköping Studies in Science and Technology. Dissertations No. 399 (1995). |
M. Johannesson, SIMD Architectures for Range and Radar Imaging, PHD thesis, University of Linköping (1995). |
Erik Åstrand, “Automatic Inspection of Sawn Wood,” Linköping Studies in Science and Technology. Dissertations. No. 424 (1996). |
Mattias Johannesson, “Sheet-of-Light range imaging experiments with MAPP2200,” Report No. LiTH-ISY-I-1401, Image Processing Group, Department of Electrical Engineering, Linköping University (Sep. 28, 1992). |
M. de Bakker et al., “A Smart Range Image Sensor,” Proceedings of the 24th European Solid-State Circuits Conference (1998):208-11;xii+514. |
Dr. Mats Gokstorp et al., “Smart Vision Sensors,” International Conference on Image Processing (Oct. 4-7, 1998), Institute of Electrical and Electronics Engineers, Inc. |
Mattias Johanneson, et al., “An Image Sensor for Sheet-of-Light Range Imaging,” IAPR Workshop on Machine Vision Applications (Dec. 7-9, 1992). |
Mattias Johannesson, “Can Sorting using sheet-of-light range imaging and MAPP2200,” Institute of Electrical and Electronics Engineers; International Conference on Systems, Man and Cybernetics (Oct. 17-20, 1993). |
Michiel de Bakker, et al., “Smart PSD array for sheet-of-light range imaging,” The International Society for Optical Engineering. Sensors and Camera Systems for Scientific, Industrial, and Digital Photography Applications (Jan. 24-26, 2000). |
Umayal Chidambaram, “Edge Extraction of Color and Range Images,” (Dec. 2003). |
Franz Pernkopf et al., “Detection of surface defects on raw milled steel blocks using range imaging” The International Society for Optical Engineering. Machine Vision Applications in Industrial Inspection X (Jan. 21-22, 2002). |
Murhed, Anders, “IVP Integrated Vision Products,” Pulp and Paper International 44.12 (Dec. 1, 2002). |
Anders Åstrand, “Smart Image Sensors,” Linköping Studies in Science and Technology. Dissertations. No. 319 (1993). |
Mattias Johannesson et al., “Five Contributions to the Art of Sheet-of-light Range Imaging on MAPP2200,” Report No. LiTH-ISY-R-1611, Image Processing Group, Department of Electrical Engineering, Linköping University (Apr. 14, 1994). |
Federal Register, vol. 73 (70695-70696). |
Newman et al., “A Survey of Automated Visual Inspection,” Computer Vision an Image Understanding vol. 61, No. 2, March, pp. 231-262, 1995. |
J. Velten et al., “Application of a Brightness-Adapted Edge Detector for Real-Time Railroad Tie Detection in Video Images,” Institute of Electrical and Electronics Engineers (1999). |
R. Gordon Kennedy, “Problems of Cartographic Design in Geographic Information Systems for Transportation,” Cartographic Perspectives (Jul. 20, 1999). |
Richard Reiff, “An Evaluation of Remediation Techniques For Concrete Tie Rail Seat Abrasion In the Fast Environment,” American Railway Engineering Association, Bulletin 753 (1995). |
Russell H. Lutch et al., “Causes and Preventative Methods for Rail Seat Abrasion in North America's Railroads,” Conference Paper (Oct. 2014). |
Nigel Peters and Steven R. Mattson, “CN 60E Concrete Tie Development,” AREMA: 25 (2003). |
Federal Register, vol. 76, No. 175, pp. 55819-55825. |
National Transportation Safety Board, “Railroad Accident Brief” (NTSB/RAB-06/03). |
Arthur L. Clouse et al. “Track Inspection Into the 21st Century” (Sep. 19, 2006). |
Federal Register, vol. 76, No. 63, pp. 18001-18346 (18073). |
Railroad Safety Advisory Committee (RSAC), Minutes of Meeting, Dec. 10, 2008, Washington, D.C. |
Dennis P. Curtin, “An Extension to The Textbook of Digital Photography, Pixels and Images” (2007). |
Holland L.P.'s Combined Motion for Early Markman Claim Construction and Summary Judgment of Non-Infringement in Georgetown Rail Equipment Company v. Holland L.P., (E.D. Tex.) (Tyler) (6:13-cv-366). |
Georgetown Rail Equipment Company's Response to Holland L.P.'s Combined Motion for Early Markman Claim Construction and Summary Judgment of Non-Infringement in Georgetown Rail Equipment Company v. Holland L.P., (E.D. Tex.) (Tyler) (6:13-cv-366). |
Georgetown Rail Equipment Company's P.R. 4-5(a) Opening Markman Claim Construction Brief in Georgetown Rail Equipment Company v. Holland L.P., (E.D. Tex.) (Tyler) (6:13-cv-366). |
Holland L.P.'s Responsive Markman Claim Construction Brief Under P.R. 4-5 in Georgetown Rail Equipment Company v. Holland L.P., (E.D. Tex.) (Tyler) (6:13-cv-366). |
Claim Construction Memorandum Opinion and Order in Georgetown Rail Equipment Company v. Holland L.P., (E.D. Tex.) (Tyler) (6:13-cv-366). |
Public Judgment and Reasons in Georgetown Rail Equipment Company v. Rail Radar Inc. and Tetra Tech EBA Inc. (T-896-15) (2018 FC 70). |
US Patent and Trademark Office, Final Office Action for U.S. Appl. No. 16/255,928 dated Apr. 27, 2020. |
US Patent and Trademark Office, Final Office Action for U.S. Appl. No. 16/742,057 dated May 26, 2020. |
Paul et al., “A Technical Evaluation of Lidar-Based Measurement of River Water Levels”, Water Resources Research (Apr. 4, 2020). |
Ahn et al., “Estimating Water Reflectance at Near-Infrared Wavelengths for Turbid Water Atmospheric Correction: A Preliminary Study for GOCI-II”, Remote Sensing (Nov. 18, 2020). |
Hart et al., “Automated Railcar and Track Inspection Projects: A Review of Collaborations Between CVRL and RailTEC”, presentation by Computer Vision and Robotics Laboratory and Railroad Engineering Program (RailTEC) University of Illinois at Urbana-Champaign (2017). |
US Patent and Trademark Office, Non-Final Office Action for U.S. Appl. No. 16/802,763 dated Jun. 29, 2021. |
Yang et al., “Automated Extraction of 3-D Railway Tracks from Mobile Laser Scanning Point Clouds”, IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, vol. 7, No. 12, Dec. 2014. |
Li et al., “Rail Component Detection, Optimization, and Assessment for Automatic Rail Track Inspection”, IEEE Transactions of Intelligent Transportation Systems, vol. 15, No. 2, Apr. 2014. |
U.S. Patent and Trademark Office, Non-Final Office Action for U.S. Appl. No. 16/898,544 dated Sep. 24, 2021. |
U.S. Patent and Trademark Office, Non-Final Office Action for U.S. Appl. No. 16/889,016 dated Sep. 23, 2021. |
U.S. Patent and Trademark Office, Non-Final Office Action for U.S. Appl. No. 16/877,106 dated Sep. 20, 2021. |
International Preliminary Report on Patentability, PCT Application No. PCT/US2020/033449, completed May 24, 2021 and dated Aug. 12, 2021. |
Espino et al., “Rail and Turnout Detection Using Gradient Information and Template Matching”, 2013 IEEE International Conference on Intelligent Rail Transportation Proceedings (2013). |
U.S. Patent and Tademark Office, Non-Final Office Action for U.S. Appl. No. 17/243,746 dated Aug. 27, 2021. |
Supplementary European Search Report, European U.S. Appl. No. 18/920,660, filed Feb. 28, 2022. |
Tetra Tech, Inc.'s opening claim construction brief in Case No. 2:21-cv-1289, dated Oct. 4, 2021. |
Pavemetric's response to Tetra tech's claim construction brief in Case No. 2:21-cv-1289, dated Oct. 18, 2021. |
Tetra Tech, Inc.'s Reply claim construction brief in Case No. 2:21-cv-1289, dated Oct. 25, 2021. |
Pavemetric's sur-reply to Tetra tech's claim construction brief in Case No. 2:21-cv-1289, dated Nov. 1, 2021. |
Claim construction order in Case No. 2:21-cv-1289, dated Dec. 1, 2021. |
Order regarding motion for summary judgment in Case No. 2:21-cv-1289, dated Apr. 27, 2022. |
Final pretrial conference order in Case No. 2:21-cv-1289, dated Aug. 8, 2022. |
Pavemetrics trial brief in Case No. 2:21-cv-1289, dated Aug. 10, 2022. |
Jury instructions in Case No. 2:21-cv-1289, dated Aug. 24, 2022. |
Verdict form in Case No. 2:21-cv-1289, dated Aug. 24, 2022. |
Final pretrial conference order in Case No. 2:21-cv-1289, dated Aug. 26, 2022. |
Judgment jury verdict in Case No. 2:21-cv-1289, dated Aug. 26, 2022. |
Çağlar Aytekin et al., Railway Fastener Inspection by Real-Time Machine Vision, 45 IEEE Transactions on Sys., Man, and Cybernetics: Sys. 1101 (Jan. 2015) (“Aytekin”). |
Jinfeng Yang et al., An Efficient Direction Field-Based Method for the Detection of Fasteners on High-Speed Railways, 11 Sensors 7364 (2011) (“Yang”). |
Urszula Marmol & Slawomir Mikrut, Attempts at Automatic Detection of Railway Head Edges from Images and Laser Data, 17 Image Processing & Commc'n 151 (2012) (“Marmol”). |
Xaxier Gibert-Serra et al., A Machine Vision System for Automated Joint Bar Inspection from a Moving Rail Vehicle, Proc. 2007 ASME/IEEE Joint Rail Conf. & Internal Combustion Engine Spring Tech. Conf. 289 (2007) (“Gibert-Serra”). |
Sick Sensor Intelligence, Product Catalog 2014/2015: Vision, available at https://www.sick.com/media/docs/2/02/302/Product_catalog_Vision_en_IM005 0302.PDF (2013) (“Sick Catalog”). |
Sick Sensor Intelligence, Application: 3D Vision for Cost-Efficient Maintenance of Rail Networks, Tetratech_0062963-64 (Jan. 2015) (“Sick Article”). |
Matrox Electronic Systems, Ltd., Matrox Imaging Library version 9 User Guide, available athttps://www.matrox.com/apps/imaging_documentation_files/mil_userguide.pdf (2008) (“Matrox MIL 9 User Guide”). |
MVTec Software GmbH, Halcon: the Power of Machine Vision, available at https://pyramidimaging.com/specs/MVTec/Halcon%2011.pdf (2013)(“Halcon Overview”). |
Tordivel AS, Scorpion Vision Software: Version X Product Data, available at http://www.tordivel.no/scorpion/pdf/Scorpion%20X/PD-2011-0005%20Scorpion%20X%20Product%20Data.pdf (2010) (“Scorpion Overview”). |
OpenCV 3.0.0.—dev documentation, available at https://docs.opencv.org/3.0-beta/index.html (2014) (“OpenCV”). |
Mathworks Help Center, Documentation: edge, available https://www.mathworks.com/help/images/ref/edge.html (2011) (“Matlab”). |
National Instruments, NI Vision for LabView Help, available https://www.ni.com/pdf/manuals/370281w.zip (2014) (“LabView”). |
Intel Integrated Performance Primitives for Intel Architecture, Reference Manual, vol. 2: Image and Video Processing, available at http://www.nacad.ufrj.br/online/intel/Documentation/en_US/ipp/ippiman.pdf (Mar. 2009). |
Andrew Shropshire Boddiford, Improving the Safety and Efficiency of Rail Yard Operations Using Robotics, UT Elec. Theses and Dissertations, available at http://hdl.handle.net/2152/2911 (2013). |
Leszek Jarzebowicz & Slawomir Judek, 3D Machine Vision System for Inspection of Contact Strips in Railway Vehicle Current Collectors, 2014 Int'l Conf. on Applied Elecs. 139 (2014). |
Peng Li, A Vehicle-Based Laser System for Generating High-Resolution Digital Elevation Models, K-State Elec. Theses, Dissertations, and Reports, available at http://hdl.handle.net/2097/3890 (2010). |
Pavemetrics' Preliminary Invalidity Contentions in Case No. 2:21-cv-1289, dated Jul. 15, 2021. |
Exhibits 2-9 to Pavemetrics' Preliminary Invalidity Contentions in Case No. 2:21-cv-1289, dated Jul. 15, 2021. |
Pavemetrics' Invalidity Contentions and Preliminary Identification in Case No. 2:21-cv-1289, dated Sep. 13, 2021. |
Exhibit 2 to ,Pavemetrics' Invalidity Contentions and Preliminary Identification in Case No. 2:21-cv-1289, dated Sep. 13, 2021. |
Exhibit 3 to ,Pavemetrics' Invalidity Contentions and Preliminary Identification in Case No. 2:21-cv-1289, dated Sep. 13, 2021. |
US Patent and Trademark Office, Non-Final Office Action for U.S. Appl. No. 16/742,057 dated May 26, 2020. |
Invitation to Pay Additional Fees, PCT App. Ser. No. PCT/US2020/033449 dated Jul. 9, 2020. |
International Report on Patentability, PCT App. Ser. No. PCT/IB2018/058574 dated Aug. 6, 2020. |
International Report on Patentability, PCT App. Ser. No. PCT/US2020/033374 dated Aug. 14, 2020. |
Julio Molleda et al., “A Profile Measurement System for Rail Manufacturing using Multiple Laser Range Finders” (2015).
International Search Report and Written Opinion of the International Searching Authority, PCT App. Ser. No. PCT/US2020/033449 dated Sep. 14, 2020 (including Kovalev et al., “Freight car models and their computer-aided dynamic analysis”, Multibody System Dynamics, Nov. 2009).
“Laser Triangulation for Track Change and Defect Detection”, U.S. Department of Transportation, Federal Railroad Administration (Mar. 2020).
“Extended Field Trials of LRAIL for Automated Track Change Detection”, U.S. Department of Transportation, Federal Railroad Administration (Apr. 2020).
T. Kanade, ed., Three-Dimensional Machine Vision, Kluwer Academic Publishers (1987) [Part 1].
T. Kanade, ed., Three-Dimensional Machine Vision, Kluwer Academic Publishers (1987) [Part 2].
D.D. Davis et al., “Tie Condition Inspection: A Case Study of Tie Failure Rate, Modes, and Clustering,” Report No. R-714, Association of American Railroads Research and Test Department (Jul. 1989).
John Choros et al., “Prevention of Derailments due to Concrete Tie Rail Seat Deterioration,” Proceedings of ASME/IEEE Joint Rail Conference & Internal Combustion Engine Spring Technical Conference, No. 40096 (2007).
US Patent and Trademark Office, Non-Final Office Action for U.S. Appl. No. 16/255,928 dated Oct. 18, 2019.
US Patent and Trademark Office, Final Office Action for U.S. Appl. No. 16/127,956 dated Jul. 9, 2019.
US Patent and Trademark Office, Non-Final Office Action for U.S. Appl. No. 17/076,899 dated Jan. 29, 2021.
Handbook of Computer Vision and Applications, vol. 2, Academic Press, “Signal Processing and Pattern Recognition” (1999).
International Advances in Nondestructive Testing, vol. 16, Gordon and Breach Science Publishers, S.A. (1991).
Babenko, Pavel, dissertation entitled “Visual Inspection of Railroad Tracks”, University of Central Florida (2009).
Shah, Mubarak, “Automated Visual Inspection/Detection of Railroad Track”, Florida Department of Transportation (Jul. 2010).
Metari et al., “Automatic Track Inspection Using 3D Laser Profilers to Improve Rail Transit Asset Condition Assessment and State of Good Repair—A Preliminary Study”, TRB 93rd Annual Meeting (Nov. 15, 2013).
Laurent, John et al., “Implementation and Validation of a New 3D Automated Pavement Cracking Measurement Equipment” (2010).
Final Written Decision, U.S. Patent Trial and Appeal Board, Inter Partes Review, Tetra Tech Canada, Inc. v. Georgetown Rail Equipment Company (2020).
Tetra Tech, Inc. Annual Report excerpts (2020).
Federal Railroad Administration Track Safety Standards Fact Sheet.
Declaration of David Drakes, Pavemetrics Systems, Inc. v. Tetra Tech, Inc. (case 2:21-cv-1289) (Mar. 22, 2021).
Declaration of John Laurent, Pavemetrics Systems, Inc. v. Tetra Tech, Inc. (case 2:21-cv-1289) (Mar. 22, 2021).
“An Automated System for Rail Transit Infrastructure Inspection”, 1st Quarterly Report, USDOT and University of Massachusetts Lowell (Sep. 30, 2012).
IRI Measurements Using the LCMS presentation, Pavemetrics (2012).
High-speed 3D imaging of rail YouTube URL link and associated image.
LCMS for High-speed Rail Inspection video URL link and image.
“An Automated System for Rail Transit Infrastructure Inspection”, 2d Quarterly Report, USDOT and University of Massachusetts Lowell (Jan. 15, 2013).
Ritars 3rd Quarterly Meeting Minutes, “An Automated System for Rail Transit Infrastructure Inspection” (May 14, 2013).
“An Automated System for Rail Transit Infrastructure Inspection”, 5th Quarterly Report, USDOT and University of Massachusetts Lowell (Oct. 15, 2013).
25th Annual Road Profile User's Group Meeting agenda, San Antonio, Texas (Sep. 16, 2013).
“LCMS-Laser Crack Measurement System” presentation, Pavemetrics Systems Inc. (Sep. 2013).
Metari, et al., “An Automatic Track Inspection Using 3D Laser Profilers to Improve Rail Transit Asset Condition Assessment and State of Good Repair: A Preliminary Study” presentation, Transportation Research Board 93rd Annual Meeting (given Jan. 14, 2014).
Laurent, et al., “Detection of Range-Based Rail Gage and Missing Rail Fasteners: Use of High-Resolution Two- and Three-dimensional Images” (Jan. 2014).
“3D Mapping of Pavements: Geometry and DTM” presentation, Pavemetrics Systems Inc. (Sep. 2014).
“Laser Rail Inspection System (LRAIL)” datasheet, Pavemetrics Systems Inc. (Oct. 2014).
Pavemetrics Systems Inc. webpage screenshot (Dec. 18, 2014).
Pavemetrics Systems Inc. LRAIL webpage (Feb. 20, 2015).
Pavemetrics' Memorandum in Opposition to Motion for Preliminary Injunction, Pavemetrics Systems, Inc. v. Tetra Tech, Inc. (case 2:21-cv-1289) (Mar. 22, 2021).
Pavemetrics' Compulsory Counterclaim for Declaratory Judgment, Pavemetrics Systems, Inc. v. Tetra Tech, Inc. (case 2:21-cv-1289) (Mar. 24, 2021).
MVTec Software GmbH, Halcon Solution Guide I: Basics, available at http://download.mvtec.com/halcon-10.0-solution-guide-i.pdf (2013) (“Halcon Solution Guide”).
National Instruments, NI Vision for LabView User Manual, available at https://www.ni.com/pdf/manuals/371007b.pdf (2005) (“LabView 2005 Manual”).
Wenbin Ouyang & Bugao Xu, Pavement Cracking Measurements Using 3D Laser-Scan Images, 24 Measurement Sci. & Tech. 105204 (2013) (“Ouyang”).
Chris Solomon & Toby Breckon, Fundamentals of Digital Image Processing: A Practical Approach with Examples in Matlab (2011).
| Number | Date | Country |
|---|---|---|
| 20230058445 A1 | Feb 2023 | US |

| Number | Date | Country |
|---|---|---|
| 62679467 | Jun 2018 | US |

| | Number | Date | Country |
|---|---|---|---|
| Parent | 16889016 | Jun 2020 | US |
| Child | 17981581 | | US |
| Parent | 16255928 | Jan 2019 | US |
| Child | 16889016 | | US |

| | Number | Date | Country |
|---|---|---|---|
| Parent | 16127956 | Sep 2018 | US |
| Child | 16255928 | | US |