The Rail Safety Improvement Act of 2008, as enacted by the U.S. Congress, requires all Class I railroads and passenger rail operators to implement a mandatory Positive Train Control (PTC) collision avoidance system. PTC introduces continuous global positioning system (GPS) based location and speed tracking, with sophisticated on-board wireless technology that enables enforcement of vehicle movement in a rail system from a centralized control center. PTC utilizes a mapping of the tracks in a rail system and of assets located along the tracks, such as rail crossings, signals, mile markers, and the like. The mapping information is configured into track data files or subdivision files for each segment of track in the rail system.
The PTC track data must be verified prior to its use. The verification process requires that every feature in the PTC onboard track data be verified to have a position accurate to within 2.2 meters of the position reported by a precision GPS unit.
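As a rough sketch of such a tolerance check, the great-circle distance between a mapped feature and the GPS fix can be compared against the 2.2-meter limit. This is an illustrative calculation only; the function name and the use of the haversine formula are assumptions, not the prescribed FRA procedure:

```python
import math

def within_tolerance(lat1, lon1, lat2, lon2, tol_m=2.2):
    """Check whether a track-data position lies within tol_m meters of
    the position reported by a precision GPS unit.

    Uses the haversine great-circle distance, which is more than
    adequate at the few-meter scales involved here.
    """
    r = 6371000.0  # mean Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    dist = 2 * r * math.asin(math.sqrt(a))
    return dist <= tol_m
```

A feature roughly one meter from the GPS fix passes, while one roughly eleven meters away fails.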
Current processes for track-data verification require a vehicle to be driven along a rail system to each of a plurality of assets whose GPS location is to be verified. The vehicle is moved along the tracks to align a precision GPS unit, mounted on the vehicle or towed therebehind, alongside the asset, e.g. the GPS unit is positioned such that a line drawn between the GPS unit and the asset is generally perpendicular to the length of the tracks. This alignment is typically performed visually, or “eyeballed,” by an operator traveling in the vehicle.
Depending on the location of the GPS unit relative to the operator, the alignment may be verified by aligning the asset with a window of the vehicle or a mark placed on or adjacent to the window. Alternatively, the operator may exit the vehicle to view the asset and the GPS unit and to instruct a driver of the vehicle to move the vehicle into proper alignment, such as when the GPS unit is mounted on a trailer towed behind the vehicle. Such a verification system is time-consuming, prone to operator error, and may expose the operator to dangerous conditions.
Additionally, the current process requires a representative of the Federal Railroad Administration (FRA) to witness the validation. This can create scheduling conflicts or difficulties and is an inefficient use of manpower resources.
Embodiments of the invention are defined by the claims below, not this summary. A high-level overview of various aspects of the invention is provided here for that reason, to provide an overview of the disclosure, and to introduce a selection of concepts that are further described in the Detailed Description section below. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in isolation to determine the scope of the claimed subject matter. In brief, this disclosure describes, among other things, a track-data verification vehicle and method for generating a video-based record of track-verification activities.
The track-data verification vehicle is configured for travel on the tracks of a rail system. The vehicle includes a 360° video camera and a precision GPS unit mounted on top of the vehicle. A positive train control (PTC) track data verification system is disposed interior to the vehicle. A second video camera is positioned within the vehicle to capture a view of a monitor associated with the PTC track data verification system and of user interactions with a set of input controls.
A video monitor is also located within the vehicle and displays a 360° video image captured by the 360° video camera. The video image includes a pair of indicators that depict locations in the video image at which objects depicted therein are physically in a desired alignment with the GPS unit.
In use, the track-data verification vehicle is driven along the tracks to the location of an asset, the position of which is to be verified. A 360° video image captured by the 360° video camera is displayed on the video monitor within the vehicle. Using the indicators included in the 360° video image, the asset is aligned with the GPS unit, e.g. the representation of the asset in the video image is aligned with the respective indicator. The position of the asset relative to the tracks is then recorded by providing an input to a keypad associated with the PTC track data verification system.
Simultaneously with positioning the vehicle and aligning the asset with the GPS unit, the second video camera inside the vehicle captures an image of the display monitor associated with the PTC track data verification system and of the keypad associated therewith. The second video camera thus also captures the operator's inputs to the keypad.
The data captured by the PTC track data verification system including the GPS data, the 360° video, and the video captured by the second video camera are synchronized to provide correspondence therebetween. Thus, the data elements can be reviewed at a later date and can be referenced relative to one another. In one embodiment, a composite video is generated for reviewing the data verification process. The composite video includes the 360° video superimposed over a portion of the video image captured by the second video camera to enable an operator to view both videos simultaneously. As such, verification of the track-data can be reviewed and approved by an FRA representative when it is convenient for the representative and without the representative being required to ride along in the track-data verification vehicle.
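The synchronization described above can be sketched as nearest-timestamp matching between the keypad events, the GPS fixes, and the video frames. The data shapes and function names here are hypothetical illustrations, not the disclosed implementation:

```python
import bisect

def synchronize(event_times, gps_fixes, frame_times):
    """For each verification event (keypad input), find the GPS fix and
    the 360-degree video frame closest in time, so the three records
    can be cross-referenced during later review.

    gps_fixes is a time-sorted list of (timestamp, lat, lon) tuples;
    frame_times is a sorted list of frame timestamps.
    """
    def nearest(sorted_ts, t):
        # Only the neighbors around the insertion point can be closest.
        i = bisect.bisect_left(sorted_ts, t)
        candidates = sorted_ts[max(i - 1, 0):i + 1]
        return min(candidates, key=lambda x: abs(x - t))

    fix_times = [f[0] for f in gps_fixes]
    fixes_by_time = {f[0]: f for f in gps_fixes}
    return [
        {"event": t,
         "gps": fixes_by_time[nearest(fix_times, t)],
         "frame": nearest(frame_times, t)}
        for t in event_times
    ]
```

Each returned record pairs one keypad event with its best-matching GPS fix and video frame, which is the correspondence needed for later review.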
Illustrative embodiments of the invention are described in detail below with reference to the attached drawing figures, and wherein:
The subject matter of select embodiments of the invention is described with specificity herein to meet statutory requirements. But the description itself is not intended to limit the scope of the claims. Rather, the claimed subject matter might be embodied in other ways to include different components, steps, or combinations thereof similar to the ones described in this document, in conjunction with other present or future technologies. Terms should not be interpreted as implying any particular order among or between various steps herein disclosed unless and except when the order of individual steps is explicitly described.
With initial reference to
An external camera 14 and a positioning system 16 are mounted on top of the vehicle 10. The external camera 14 is disposed on a support post 18 to provide sufficient height above the vehicle 10 so that the vehicle 10 does not overly obstruct the external camera's view of the ground or objects on or near the ground, as shown best by viewing regions or viewing cones 20 and 22 depicted in
The external camera 14 preferably comprises a 360° camera configured to generate an image that spans 360° horizontally around the external camera 14 and/or the vehicle 10. The external camera 14 includes a plurality of image sensors 24 or an array of cameras that each capture an image from a respective, overlapping viewing region 20, 22, 26, 28 (see
The positioning system 16 is preferably a precision global positioning system (GPS) unit configured to provide centimeter-level positional accuracy; however, other more or less precise units may be employed. Other positioning technologies, including the GLONASS system operated by the Russian Aerospace Defense Forces, the Galileo system provided by the European Union and European Space Agency, or the Long Range Navigation (LORAN) hyperbolic radio navigation system developed by the United States, among other satellite-based and non-satellite-based systems, can be employed instead of or in addition to GPS. The complete positioning system 16 may be mounted on top of the external camera 14, or only a receiver or antenna portion thereof might be mounted on the external camera 14 while the remainder of the unit 16 is disposed within the vehicle 10 or integrated into a control unit 50 as discussed below. The position reported by the positioning system 16 is the position of the portion of the positioning system 16 that is co-located with the camera 14. Alternatively, the receiver or antenna portion of the positioning system 16 may be provided in association with the vehicle 10 with a known offset from the external camera 14. The actual position of the external camera 14 and/or of the asset to be verified can thus be calculated based on the known offset.
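The known-offset correction might be computed as a simple flat-earth translation of the antenna fix into the camera's position. The vehicle-frame parameters and the function name below are assumptions for illustration; the flat-earth approximation is fine for offsets of a few meters:

```python
import math

def camera_position(antenna_lat, antenna_lon, heading_deg,
                    forward_m=0.0, right_m=0.0):
    """Translate a GPS antenna fix to the external camera's position,
    given a fixed offset in the vehicle frame (meters forward and to
    the right of the antenna) and the vehicle heading (degrees
    clockwise from true north).
    """
    theta = math.radians(heading_deg)
    # Rotate the vehicle-frame offset into north/east components.
    north = forward_m * math.cos(theta) - right_m * math.sin(theta)
    east = forward_m * math.sin(theta) + right_m * math.cos(theta)
    # Convert meters to degrees of latitude/longitude.
    lat = antenna_lat + north / 111_320.0
    lon = antenna_lon + east / (111_320.0 * math.cos(math.radians(antenna_lat)))
    return lat, lon
```

The same translation, applied in reverse, recovers the antenna's position from a fix co-located with the camera.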
With reference now to
The PTC data-verification unit 30 also includes an input device or keypad 38 disposed in close proximity to the display device 34. The keypad 38 includes a plurality of buttons 40 that are associated with respective soft-key fields 42 displayed in the PTC data verification screen 36. In an embodiment, the keypad 38 is mounted or located separately from the display device 34 and comprises another form of input device, such as for example, a microphone for receiving voice commands or notes, a mouse, a track-pad, or the like. As depicted in
An input-capture system comprising an internal camera 48 is mounted in the interior of the vehicle 10 and positioned to capture an image of the display device 34 and the keypad 38. As depicted in
In another embodiment, the input-capture system comprises an application executing on a computing device, such as a control unit 50 (described below) that interfaces with the PTC data-verification unit 30 to record the provision of inputs thereto. The input-capture system may also capture a representation of a display presented by the PTC data-verification unit 30 or data presented thereon concurrently with provision of the inputs.
The control unit 50, such as a laptop computer, or other computing device, is provided in the vehicle 10. The control unit 50 is communicatively coupled to both the external and internal cameras 14, 48 and is configured to control operation thereof. The control unit 50 also provides a memory for storage of images captured by the external and internal cameras 14, 48 or an external storage medium 52, such as a hard drive, flash memory, or similar device may be provided for storage of the images. The control unit 50 may be communicatively coupled to the positioning system 16 to obtain position data therefrom and to associate the position data with the images captured by the cameras 14, 48. The control unit 50 might also provide any necessary processing functions of the positioning system 16.
Upon capture of the plurality of images by the 360° external camera 14, the control unit 50 provides processing necessary to generate a composite 360° image 54 as depicted in
The 360° view provides context for the image 54 and for an asset therein. Features ahead of and behind the vehicle 10 and the asset can be seen. For example, a sign 55F which lies ahead of the vehicle 10 is seen from a front side thereof and a sign 55B which lies behind the vehicle 10 is seen from a back side thereof (
The 360° image 54 can be presented on a monitor 56 associated with the control unit 50 or on a secondary monitor 58 for viewing by operators seated within the vehicle 10. The secondary monitor 58 can be positioned as desired for viewing from any seating position within the vehicle, e.g. from a front seat or a rear seat.
The control unit 50 also displays one or more indicators 60 concurrently with and superimposed or overlaid on the composite 360° image 54. The indicators 60 comprise any visual symbol or feature that is useable to identify a location within the image 54 that corresponds to a physical location alongside the vehicle 10 that is in a desired alignment with the positioning system 16. Preferably, the indicators 60 depict a location that falls on a line 62 that passes through the positioning system 16 and that is perpendicular to a centerline 64 of the vehicle 10. As depicted in
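Assuming the composite 360° image 54 uses an equirectangular projection with the vehicle's forward direction at the center column, the indicators 60 fall at fixed azimuths of ±90°, which map to fixed pixel columns. The projection and parameter names below are assumptions, not details disclosed above:

```python
def indicator_columns(image_width, heading_offset_deg=0.0):
    """Pixel columns at which the alignment indicators are drawn on an
    equirectangular 360-degree image whose horizontal axis spans
    [-180, 180) degrees of azimuth, with the vehicle's forward
    direction at the center column.

    The indicators mark azimuths of -90 and +90 degrees: the line
    through the positioning unit perpendicular to the vehicle
    centerline. heading_offset_deg corrects for any fixed angular
    misalignment of the camera on its mount (a hypothetical parameter).
    """
    def col(azimuth_deg):
        # Shift so azimuth -180 maps to column 0, then scale to pixels.
        az = (azimuth_deg + heading_offset_deg + 180.0) % 360.0
        return int(round(az / 360.0 * image_width)) % image_width

    return col(-90.0), col(90.0)
```

Because the mapping depends only on azimuth, the indicator columns stay fixed as the vehicle moves, which is what lets the operator align an asset's image with a static on-screen mark.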
In one embodiment, the control unit 50 is configured to generate a vertically downward looking, bird's eye view of the area surrounding the vehicle 10. This view can be similarly displayed on one or both of the monitors 56, 58 with the indicators 60 and may aid identifying and aligning the positioning system 16 with low-lying objects like grade crossings.
With continued reference to
Initially, the vehicle 10 is positioned on the rails 12 of a segment of a rail system for which the PTC track data for a plurality of assets is to be verified. An operator/driver of the vehicle 10 is positioned in a driver's seat 68 while a second operator is positioned in a right- or left-side rear seat 70, 72 within reach of the keypad 38 associated with the PTC data-verification unit 30. In one embodiment, only the operator/driver is required to operate the vehicle 10 for verification of the track data; the PTC data-verification unit 30 is positioned to enable viewing of the screen 36 and access to the keypad 38 by the operator/driver from the driver's seat 68.
The PTC data-verification unit 30 is initiated to execute a PTC track-data verification application. The screen 36 is thus displayed on the display device 34 for viewing by the second operator. The control unit 50 is also initiated to activate the external and internal cameras 14, 48. The 360° image 54 captured by the external camera 14 is displayed by the control unit 50 on the monitor 56 or on the secondary monitor 58 for viewing by the operator/driver and/or the second operator. The image 54 comprises a video and can be stored by the control unit 50 in whole, or only select portions of the video image 54, such as portions within a predetermined distance or temporal window around an asset being verified, might be stored. Alternatively, only one or more still images 54 might be stored.
The internal camera 48 captures an image 74 of the display device 34 and the keypad 38. The image 74 can be stored in whole or in part similarly to that of the 360° image 54.
The vehicle 10 is driven along the rails 12 to an asset, the position of which is to be verified. For example, as depicted in
Upon attaining alignment of the signal 66 with the indicator 60, the second operator provides an appropriate input to the keypad 38 to indicate the location associated with the signal 66 to the PTC track data-verification unit 30. The internal camera 48 captures an image or video of the screen 36 and of the operator's hand 76 providing the input to the keypad 38, as depicted in
The location of the signal 66, or of other assets verified using the vehicle 10, is described as the location associated with the asset because the indicated location may not be the actual location of the asset. The asset is typically located a distance transversely away from the positioning system 16 and the rails 12. The verified location is thus the location of the asset relative to the length of the rails 12, e.g. the location at which a line drawn to the asset from the rails 12 is perpendicular to the length of the rails 12. In another embodiment, a transverse distance from the rails 12 to the asset may be measured or estimated by one of the operators or by an application executing on the computing unit 32 to provide a more exact location of the asset.
The 360° image 54 captured by the external camera 14 may optionally be superimposed or overlaid on the image 74 captured by the internal camera 48 to provide a combined image 78 (
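One way to sketch the picture-in-picture layout of the combined image 78 is to compute the rectangle at which the scaled 360° video is pasted into the internal-camera frame. The one-third scaling, the margin, and the top-right placement are assumptions for illustration:

```python
def overlay_rect(base_w, inset_w, inset_h, margin=10):
    """Compute the (x, y, w, h) rectangle at which the 360-degree video
    is superimposed over the internal-camera frame of width base_w to
    build the composite review video.

    Integer math scales the inset to one third of the base width while
    preserving its aspect ratio, then places it in the top-right
    corner with a fixed margin.
    """
    w = base_w // 3                       # inset spans one third of the base width
    h = inset_h * w // inset_w            # preserve the inset's aspect ratio
    x, y = base_w - w - margin, margin    # top-right corner placement
    return x, y, w, h
```

A video compositor would then draw the resized 360° frame at this rectangle on every frame of the internal-camera video, giving the reviewer both views at once.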
Many different arrangements of the various components depicted, as well as components not shown, are possible without departing from the scope of the claims below. Embodiments of the technology have been described with the intent to be illustrative rather than restrictive. Alternative embodiments will become apparent to readers of this disclosure after and because of reading it. Alternative means of implementing the aforementioned can be completed without departing from the scope of the claims below. Certain features and subcombinations are of utility and may be employed without reference to other features and subcombinations and are contemplated within the scope of the claims.
This application claims priority to U.S. Provisional Patent Application No. 61/883,486 filed Sep. 27, 2013 and titled TRACK-DATA VERIFICATION, the disclosure of which is hereby incorporated herein in its entirety by reference.
Number | Name | Date | Kind |
---|---|---|---|
6434452 | Gray | Aug 2002 | B1 |
7755660 | Nejikovsky | Jul 2010 | B2 |
7805227 | Welles et al. | Sep 2010 | B2 |
8073581 | Morris et al. | Dec 2011 | B2 |
8150568 | Gray | Apr 2012 | B1 |
8452467 | Kumar | May 2013 | B2 |
8538609 | Kumar | Sep 2013 | B2 |
8712610 | Kumar | Apr 2014 | B2 |
20090037039 | Yu et al. | Feb 2009 | A1 |
20090276108 | Kumar et al. | Nov 2009 | A1 |
20120179309 | Wilson et al. | Jul 2012 | A1 |
20120274772 | Fosburgh | Nov 2012 | A1 |
Entry |
---|
Barber et al., “Geometric validation of a ground-based mobile laser scanning system.” ISPRS Journal of Photogrammetry and Remote Sensing, vol. 63 No. 1, Jan. 2008, pp. 128-141. |
Redding et al., “Vision-based target localization from a fixed-wing miniature air vehicle,” Proceedings of the 2006 American Control Conference, Jun. 2006, pp. 2862-2867. |
Number | Date | Country | |
---|---|---|---|
20150094885 A1 | Apr 2015 | US |
Number | Date | Country | |
---|---|---|---|
61883486 | Sep 2013 | US |