SIGNAL ASPECT ENFORCEMENT

Information

  • Publication Number: 20240109569
  • Date Filed: December 13, 2023
  • Date Published: April 4, 2024
Abstract
A signal aspect enforcement method for a rail vehicle includes determining, by an on-board controller on the rail vehicle, a position of the rail vehicle; determining, by the on-board controller, whether a first signal aspect for a signal matches a second signal aspect for the signal; and determining, by the on-board controller, at least one of a route of the rail vehicle or a speed limit of the rail vehicle when the first signal aspect matches the second signal aspect. The first signal aspect is determined from first data from a first system on the rail vehicle, and the second signal aspect is determined from second data from a second system on the rail vehicle, the second system being different from the first system.
Description
BACKGROUND

Safe and efficient train operation relies on the consistent transmission and receipt of signals that provide instructions to train drivers or train control systems. When a vehicle is able to communicate with a wayside/central movement authority unit (MAU), a current movement authority is communicated to the vehicle and the vehicle is allowed to proceed to the limit of the received movement authority, under the speed restrictions specified in the movement authority.


For communication-based train control (CBTC) operation, signal enforcement is only possible when communicating with a train having an established position. A wayside MAU sends an appropriate movement authority to the train, based on the reported status of a signal aspect. In certain failure situations, such as if the vehicle position is not determined or the communication with the MAU is not functioning, the on-board (on board the vehicle) CBTC controller commands the vehicle to brake to a stop because the movement authority cannot be determined.


When the vehicle is operating unattended and encounters a failure situation, the vehicle is forced to stop and a special recovery procedure, typically involving sending crew to the failed vehicle, is necessary to continue operation. This process results in service delays and passenger dissatisfaction.





BRIEF DESCRIPTION OF THE DRAWINGS


FIGS. 1A and 1B are diagrams including a side view and top view of a signal enforcement system architecture, in accordance with some embodiments.



FIG. 2 is a graph of a positioning system operation, in accordance with some embodiments.



FIG. 3 is a graph of a braking calculation, in accordance with some embodiments.



FIG. 4 is a diagram of a signal and trackside beacon arrangement, in accordance with some embodiments.



FIG. 5 is a diagram of a trackside beacon installation, in accordance with some embodiments.



FIG. 6 is a diagram of a signal and trackside beacon and retroreflector arrangement, in accordance with some embodiments.



FIGS. 7A and 7B are diagrams of route selection in response to signal aspect, in accordance with some embodiments.



FIG. 8 is a flowchart of a signal aspect enforcement method, in accordance with some embodiments.



FIG. 9 is a diagram of a camera built-in test with an external light source, in accordance with some embodiments.



FIG. 10 is a diagram of a camera built-in test with an internal light source (right side), in accordance with some embodiments.



FIG. 11 is a diagram of a signal aspect command line and non-intrusive current monitoring, in accordance with some embodiments.



FIG. 12 is a high-level block diagram of a processor-based system usable in conjunction with one or more embodiments.





DETAILED DESCRIPTION

The following disclosure provides many different embodiments, or examples, for implementing different features of the provided subject matter. Specific examples of components, values, operations, materials, arrangements, or the like, are described below to simplify the present disclosure. These are, of course, merely examples and are not intended to be limiting. Other components, values, operations, materials, arrangements, or the like, are contemplated. For example, the formation of a first feature over or on a second feature in the description that follows may include embodiments in which the first and second features are formed in direct contact, and may also include embodiments in which additional features may be formed between the first and second features, such that the first and second features may not be in direct contact. In addition, the present disclosure may repeat reference numerals and/or letters in the various examples. This repetition is for the purpose of simplicity and clarity and does not in itself dictate a relationship between the various embodiments and/or configurations discussed.


Further, spatially relative terms, such as “beneath,” “below,” “lower,” “above,” “upper” and the like, may be used herein for ease of description to describe one element or feature's relationship to another element(s) or feature(s) as illustrated in the figures. The spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. The apparatus may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein may likewise be interpreted accordingly.


Some embodiments provide a signal aspect enforcement system and method using on-vehicle (on-board) sensors to determine the signal aspect and an associated movement authority past the signal. Some embodiments provide a signal aspect enforcement system and method that extracts expected signals and associated allowable signal aspects from a map.


Some embodiments provide identification of an ego track of a vehicle using multi-sensor data from on-board sensors. Some embodiments provide a high-integrity, accurate vehicle path and switch position using multi-sensor data from on-board sensors. Some embodiments provide methods and architectures to extract path information from 2D and 3D sensors to achieve performance and integrity for extracting rail path. Some embodiments provide methods and architectures to combine path information from 2D and 3D sensors to achieve performance and integrity for extracting rail path. The path or trajectory of the ego vehicle is determined by tracking the train position over time from the path measured by the 2D and 3D sensors, cross-checked against the trajectory estimated from the motion measurements provided by the inertial measurement units (IMUs), and accepted if the cross-check is successful.
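For illustration, a minimal sketch of this cross-check, assuming simplified two-dimensional positions and a hypothetical agreement tolerance (the disclosure does not fix thresholds or sensor interfaces):

    import math

    def cross_check_path(sensor_path, imu_path, tolerance_m=0.5):
        """Accept the sensor-derived path only if every sample agrees with the
        IMU-propagated trajectory within a tolerance (hypothetical value)."""
        if len(sensor_path) != len(imu_path):
            return False
        for (xs, ys), (xi, yi) in zip(sensor_path, imu_path):
            if math.hypot(xs - xi, ys - yi) > tolerance_m:
                return False  # cross-check failed: reject the measured path
        return True

    # Path tracked from the 2D/3D sensors vs. the IMU-estimated trajectory
    sensor_path = [(0.0, 0.0), (1.0, 0.1), (2.0, 0.2)]
    imu_path = [(0.0, 0.0), (1.0, 0.0), (2.1, 0.2)]
    print(cross_check_path(sensor_path, imu_path))  # True: path is accepted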


Some embodiments provide a signal aspect enforcement system and method that accurately determine a vehicle position on a track (ego track) and use the determined vehicle position to select a signal (ego signal) and check a detected signal aspect of the signal against an allowable aspect for the signal. In some embodiments for determining the ego signal, all the signals in the field of view of 2D and 3D sensors are detected and then, based on the train position, the expected signals are extracted from the map, including their allowable aspects. From the extracted tracks, the ego track is related to the contextual information of the extracted tracks in terms of relative spatial order (e.g., if the ego track is the middle track, there should be neighboring tracks on either side). Then, based on the ego track, the ego signal is selected for the corresponding track (e.g., in the above example the ego signal corresponds to the middle signal of the three signals) and the detected aspect of the ego signal is cross-checked with the allowable aspect for that signal stored in the map database. Supervision of the signal aspect with respect to the aspect of the previous signal, and of the speed limit of the vehicle for this segment, is also applied.
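A simplified sketch of this ego-signal selection and aspect cross-check, with hypothetical data structures (the disclosure does not prescribe a concrete representation):

    def select_ego_signal(ego_track_index, detected_signals, allowable_aspects):
        """Pick the signal whose lateral order matches the ego track's relative
        spatial order, then cross-check its detected aspect against the map."""
        # Sort detections left to right so signal order mirrors track order
        signals = sorted(detected_signals, key=lambda s: s["lateral_offset_m"])
        ego_signal = signals[ego_track_index]  # e.g., middle track -> middle signal
        if ego_signal["aspect"] not in allowable_aspects[ego_signal["id"]]:
            raise ValueError("detected aspect is not allowable for this signal")
        return ego_signal

    detections = [
        {"id": "S1", "lateral_offset_m": -4.0, "aspect": "green"},
        {"id": "S2", "lateral_offset_m": 0.0, "aspect": "red"},
        {"id": "S3", "lateral_offset_m": 4.0, "aspect": "green"},
    ]
    allowable = {"S1": {"green", "red"}, "S2": {"red", "yellow"}, "S3": {"green"}}
    # Ego track is the middle of three extracted tracks (index 1)
    print(select_ego_signal(1, detections, allowable)["id"])  # S2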



FIGS. 1A and 1B are side view and top view diagrams of a signal aspect enforcement system architecture, in accordance with some embodiments. A vehicle 102 moves along tracks 104. The track (or path) occupied by the vehicle 102 may be referred to as the ego track (or ego path). The vehicle 102, in accordance with some embodiments, is a train, subway, monorail, car, bus or other suitable vehicle. The tracks 104, in accordance with some embodiments, are train tracks, rails, roads or other vehicle guideways. On board the vehicle 102 is an on-board computer 106, also referred to as a vehicle on-board controller (VOBC). In at least some embodiments, on-board computer 106 is an on-board controller, processor, processing device or the like. The on-board computer 106 is communicably connected to and communicates with an on-board radar 108, an on-board camera 110 and one or more on-board beacons 112. The on-board camera 110, in accordance with some embodiments, includes one or more visible spectrum, near infra-red (IR) spectrum, long wavelength infra-red spectrum or other cameras of suitable wavelengths. A beacon system 116, in accordance with some embodiments, includes one or more of an ultra-wide band, dedicated short range communications, radio frequency identification or other internet-of-things system, or another suitable beacon system capable of providing signal aspect status, beacon identification (ID) and/or ranging information to the vehicle 102. The on-board beacons 112 (FIG. 1B) are communicably connected to a beacon antenna 114.


Alongside the tracks 104 is a beacon system 116, including a signal aspect beacon 118 and a positioning beacon 120. The signal aspect beacon 118 associated with signal 124 is installed a predetermined distance before the signal 124 to increase the on-board beacon detection range. A retroreflector 122 and a signal 124 are electrically connected to the signal aspect beacon 118. In accordance with an embodiment, retroreflector 122 and signal aspect beacon 118 receive power from the signal aspect of signal 124. In accordance with various embodiments, retroreflector 122, signal 124, Movement Authority Unit (MAU) 128 and signal aspect beacon 118 are connected to individual power sources (not shown). A MAU 128 is communicably connected with the beacon system 116, in particular with the positioning beacon 120 and signal aspect beacon 118 and communicates movement authority data to the vehicle 102. In some embodiments, retroreflector 122 is an active retroreflector.


Signal 124 has three signal aspects: a green signal aspect 130 which indicates that the vehicle 102 is able to proceed normally, a yellow signal aspect 132 which indicates that the vehicle 102 is able to proceed with cautionary speed restrictions and a red signal aspect 134 which indicates that the vehicle 102 must stop. When a signal aspect is to be enforced, the signal aspect is illuminated or “lit.” When a signal aspect is not to be enforced, the signal aspect is not illuminated or “dark.”


The signal aspect enforcement system and method, in accordance with an embodiment, uses on-board visible spectrum, near infra-red spectrum and/or long-wavelength infra-red camera 110, beacon system 116 and the on-board radar 108 with the associated active retroreflector 122 to determine the signal aspect and the associated movement authority past the signal 124. The beacon system 116 includes an on-board beacon (requester) 112 and a signal aspect beacon (responder) 118 associated with the signal 124. The on-board radar 108, in accordance with an embodiment, is a commercial, off-the-shelf radar such as a 77 GHz millimeter (mm) wavelength radar, which is capable of providing range, azimuth and a measure of the strength of the reflection for detected targets. In accordance with an embodiment, a 3D radar is used to provide the relative elevation of the returns.


The on-board computer 106 contains a database with all the signals 124 in the system along with the corresponding signal location, ID, associated beacon IDs, signal aspects (including the aspect colors and aspect location in the signal structure) and restrictions associated with each aspect. In at least one embodiment, the signal location is provided with respect to a guideway map. The guideway map includes both the gradient and curvature of the track 104 as well as the location of signals 124 relative to the adjacent track 104. The guideway map includes the line-of-sight (LOS) range at any given location on the track 104. The LOS range is the furthest distance along the tracks that is not obstructed (by tunnel walls or other structures).
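A hypothetical layout for one record in this database (the disclosure lists the contents but not a schema; field names and values are illustrative):

    from dataclasses import dataclass, field

    @dataclass
    class SignalRecord:
        signal_id: str
        location_m: float   # along-track position in the guideway map
        beacon_ids: tuple   # IDs of the associated trackside beacons
        aspects: dict = field(default_factory=dict)       # colour -> position in the signal head
        restrictions: dict = field(default_factory=dict)  # colour -> speed limit in km/h (0 = stop)

    signal_124 = SignalRecord(
        signal_id="124",
        location_m=1520.0,
        beacon_ids=("118", "120"),
        aspects={"green": "bottom", "yellow": "middle", "red": "top"},
        restrictions={"green": 80, "yellow": 40, "red": 0},
    )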


Sensors such as cameras, radars and beacons typically are limited to being able to detect objects or features within their LOS. Objects or features beyond the LOS are not “visible” to these sensors, except in certain multipath conditions in which the measurements are not trustworthy. A beacon, e.g., a radio device using WiFi, LTE or 5G, can serve a sensing role, but the sensing role may be limited by the LOS due to being based on radiofrequency wireless technology. The beacon can serve a communications role that supports wireless communications, but the communications role may likewise be limited by the LOS due to being based on radiofrequency wireless technology. The beacon can also serve a communications role that supports wired communication, which is not limited by LOS. Using a proper installation schema of beacons and connectivity between the beacons (wired and wireless), the beacon information can be relayed beyond LOS, and in some cases anywhere.


The on-board computer 106 calculates the braking distance required to bring the vehicle 102 to a complete stop in consideration of the grade of the track 104 and the guaranteed emergency brake rate (GEBR), accounting for failures in the braking system and latencies in the braking system. Even when the position is not determined, the grade is estimated by differencing the motion acceleration, calculated from the speed measurements, and the measured acceleration, using Equation (1).





$\text{Grade} = \dfrac{V_{\text{Measured}}(t_i) - V_{\text{Measured}}(t_{i-1})}{t_i - t_{i-1}} - a_{\text{Measured}}$   Equation (1)


wherein, in Equation (1):

    • $V_{\text{Measured}}(t_i)$ is the measured speed at time $t_i$;
    • $V_{\text{Measured}}(t_{i-1})$ is the measured speed at time $t_{i-1}$; and
    • $a_{\text{Measured}}$ is the average of the accelerations measured at times $t_i$ and $t_{i-1}$.
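A direct transcription of Equation (1), assuming speeds in m/s, times in s and accelerations in m/s² (units are not stated in the disclosure):

    def estimate_grade(v_i, v_im1, t_i, t_im1, a_measured):
        """Equation (1): grade (expressed as an acceleration component) is the
        speed-derived motion acceleration minus the measured acceleration."""
        motion_accel = (v_i - v_im1) / (t_i - t_im1)
        return motion_accel - a_measured

    # Speed rose 25.0 -> 25.5 m/s over 1 s while the accelerometers averaged
    # 0.6 m/s^2; the -0.1 m/s^2 difference is attributed to the grade.
    print(estimate_grade(25.5, 25.0, 1.0, 0.0, 0.6))  # approximately -0.1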


The on-board computer 106 determines the maximum worst case braking distance of the vehicle using Equation (2). This determination considers the propulsion runaway acceleration, the propulsion cutoff time, the emergency brake engagement time and the GEBR.






$d_{EB} = V_{LOS} \times t_{PCO} + 0.5\,(a_{PRW} + \text{Grade}) \times t_{PCO}^{2} + (V_{LOS} + (a_{PRW} + \text{Grade}) \times t_{PCO}) \times t_{EBE} + 0.5\,\text{Grade} \times t_{EBE}^{2} + (V_{LOS} + (a_{PRW} + \text{Grade}) \times t_{PCO} + \text{Grade} \times t_{EBE})^{2} / (2 \times (GEBR + \text{Grade}))$   Equation (2)


wherein, in Equation (2):

    • $d_{EB}$ is the worst case braking distance;
    • $V_{LOS}$ is the LOS speed;
    • $t_{PCO}$ is the propulsion cutoff time;
    • $a_{PRW}$ is the propulsion runaway acceleration; and
    • $t_{EBE}$ is the emergency brake engagement time.
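A direct transcription of Equation (2) as reconstructed above, keeping the source's sign convention for Grade and assuming SI units (illustrative input values):

    def worst_case_braking_distance(v_los, t_pco, a_prw, grade, t_ebe, gebr):
        """Equation (2): worst case braking distance d_EB across three phases."""
        # Phase 1: propulsion runaway until the propulsion cutoff completes
        d1 = v_los * t_pco + 0.5 * (a_prw + grade) * t_pco ** 2
        v1 = v_los + (a_prw + grade) * t_pco
        # Phase 2: grade-only motion until the emergency brake engages
        d2 = v1 * t_ebe + 0.5 * grade * t_ebe ** 2
        v2 = v1 + grade * t_ebe
        # Phase 3: braking at the guaranteed emergency brake rate (GEBR)
        d3 = v2 ** 2 / (2.0 * (gebr + grade))
        return d1 + d2 + d3

    # Illustrative values: 20 m/s LOS speed, 1 s propulsion cutoff, 1 m/s^2
    # runaway acceleration, level track, 1.5 s engagement, 1.2 m/s^2 GEBR
    print(worst_case_braking_distance(20.0, 1.0, 1.0, 0.0, 1.5, 1.2))  # ~236 m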


The signal 124, signal ID, and associated aspect are determined based on multiple sensors. For example, in an embodiment, the sensors include an on-board visible spectrum, near IR spectrum and/or LWIR camera 110 and/or a beacon system 116 and/or the on-board radar 108. In an embodiment, the sensors include one or more mobile LiDARs (3D sensors) with cameras (visible color or monochrome and/or infrared; 2D sensors), radars/imaging radars (3D sensors) and/or inertial measurement units (IMUs). In an embodiment, the sensors are used to achieve rail path extraction.


When the vehicle's position on the guideway is known, the on-board computer 106 determines that the signal aspect is the red signal aspect 134 if a retroreflector 122 associated with the red signal aspect 134 of the signal 124 is detected at its expected location as given in the map and at the expected along-tracks distance to the signal 124. Verifying the location of the retroreflector prevents false positives.


The on-board computer 106 determines the signal aspect based on data provided by independent sensors such as the on-board camera 110, beacon system 116, the on-board radar 108 and retroreflector 122. Two sensors, for example, the on-board camera 110 and beacon system 116, are sufficient to provide the information necessary to determine the signal ID and the vehicle position on the guideway from which the along-tracks distance to the signal is derived. To determine the signal aspect and the along-tracks distance to the signal based on the on-board radar 108 and the associated retroreflector 122, the vehicle position is provided to the on-board computer 106.


When the vehicle 102 position on the guideway is unknown or the vehicle 102 is not able to communicate with the MAU 128, the on-board camera 110 records image data representing the guideway, and the on-board computer 106 determines the signal ID and the aspect of the signal 124 using object and color recognition algorithms to process the image data received from the on-board camera 110. The on-board computer 106 compares the signal ID and aspect information determined from the image data with the signal ID and aspect information received from the signal aspect beacon 118 to determine a match. Matching establishes confidence in the signal ID and aspect determined from image data generated by the on-board camera 110.

When the vehicle 102 position on the guideway is unknown or the vehicle 102 is not able to communicate with the MAU 128, the on-board computer 106 uses image and color recognition algorithms to identify “dark” signal aspects in the image data from the on-board camera 110. A signal aspect is “dark” when the associated signal is not illuminated.

When the vehicle 102 position on the guideway is unknown or the vehicle 102 is not able to communicate with the MAU 128, the on-board computer determines whether the vehicle 102 is authorized to proceed along the route and what speed limit the vehicle 102 must respect based on the signal ID and the signal aspect. For example, if the signal 124 aspect is red, the vehicle 102 is not allowed to proceed along the route and is directed to stop. If the signal 124 aspect is green, the vehicle 102 is allowed to proceed along the route, respecting the speed limit associated with a green signal.

When the vehicle 102 position on the guideway is unknown or the vehicle 102 is not able to communicate with the MAU 128, the on-board computer 106 determines the vehicle position from the along-tracks distance determined by the on-board radar 108. The on-board computer 106 determines if the vehicle 102 has sufficient distance to come to a stop before reaching a signal 124 and sends a signal to the vehicle emergency brakes (not shown) if the signal aspect is the red signal aspect 134 and the vehicle 102 does not have sufficient distance to come to a stop before reaching the signal 124.

When the train position is unknown, or is known only with a relatively large position uncertainty, the accuracy of the vehicle's path on the guideway is relatively poor; hence, online extraction of the ego path of the train is needed, using the set of autonomous sensors on board the train. The tracks for the ego vehicle with on-board sensors are determined considering the sensor location(s) and the rail(s) with respect to the vehicle body. The other tracks detected are classified as neighboring tracks and then, based on the ego track and the vehicle speed, the predicted ego trajectory is estimated.


In an embodiment, a method extracts the ego track and identifies neighbour tracks using a path extraction component that uses diverse on-board sensor technologies (e.g., a 3D sensor such as LiDAR/imaging radar and a 2D sensor such as a camera) to extract the ego track as well as neighbour tracks within the sensors' field of view (FOV). In an embodiment, the component structure processes the diverse sensors independently and performs late sensor fusion for improved integrity. In an embodiment, a temporal diversity principle is utilized (tracking objects over a time window) to reduce false detections. In an embodiment, classical detectors (non-AI-based detectors) are utilized to supervise the output of AI-based detectors (e.g., neural networks) for algorithmic diversity, and to reject possible false detections from the AI detectors. In an embodiment, the fusion component incorporates data association rules that allow associating objects from the diverse sensor pipelines only if they have compliant object types and sufficient overlap. These data association rules help with rejecting some false detections for improved integrity. For example, a camera sensor may detect a type “pedestrian” while the LiDAR sensor incorrectly detects the object as “signal”; since the types are not compliant, the object is flagged and rejected. In an embodiment, the fusion component also incorporates “track management,” which tracks the confidence level of each tracked object in terms of how consistently the apparently stationary object is detected over time. Moving objects are rejected by monitoring their motion over time. Tracked objects with a low confidence level are not stored. Only very trusted objects, i.e., those with a sufficiently high confidence level in their detection and their motion status (i.e., stationary), are used in localization, to thus provide high integrity. In an embodiment, temporal trackers are used to allow the tracking of features of large objects that may not fit within the FOV of the sensors (e.g., platforms).
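For illustration, a minimal sketch of such a data association rule, using an axis-aligned intersection-over-union (IoU) as the overlap measure with an illustrative threshold (the disclosure does not specify the overlap metric):

    def iou(a, b):
        """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
        ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
        ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
        inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
        area_a = (a[2] - a[0]) * (a[3] - a[1])
        area_b = (b[2] - b[0]) * (b[3] - b[1])
        return inter / (area_a + area_b - inter) if inter else 0.0

    def associate(camera_obj, lidar_obj, min_iou=0.5):
        """Associate detections from the two pipelines only when the object
        types are compliant and the overlap is sufficient."""
        if camera_obj["type"] != lidar_obj["type"]:
            return False  # e.g., camera "pedestrian" vs. LiDAR "signal": reject
        return iou(camera_obj["box"], lidar_obj["box"]) >= min_iou

    cam = {"type": "signal", "box": (10, 10, 20, 30)}
    lid = {"type": "signal", "box": (11, 12, 21, 31)}
    print(associate(cam, lid))  # True: compliant types and sufficient overlap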


The ego track information of the train is important because contextual scene features relating the landmark objects to the ego track improve the robustness of the solution. In one or more embodiments, the ego track path is extracted in the local map from the sequence of sensor positions provided by the fusion of the Object Detector and Tracker component, without the use of a Path Extraction (PE) component. In this case, the path corresponds only to the path already traversed by the train, but SIL4 odometry and IMU rate gyro information can be used to predict the forward path from the previously constructed train path. In one or more embodiments, after the first successful localization of the train position and orientation, the known position and orientation of the train are used to extract the forward path of the ego track from the map spline information. Then, a consistency check/supervision is performed between the path extracted using the PE component and the path extracted from the spline information in the stored reference map. Inconsistencies are flagged and initiate safe reactions for high integrity.


In an embodiment, path extraction is used for an autonomous train. The path information is used as a basis for various vehicle situational awareness functions, such as (1) forward object detection that detects obstacles along an ego-path hindering the safe train operation, (2) signal recognition that detects an ego signal (i.e., a signal corresponding to the train of interest path or trajectory) based on ego path information and recognizes the signal aspect, and (3) positioning that localizes the train by understanding the surrounding environment of the train including path information. Thus, accurate and complete path information is used to support successive functions for autonomous driving.


The beacon system 116 allows the vehicle 102 to determine the signal ID and signal aspect, and to reestablish the position of the vehicle 102 on the guideway with the detection of at least two wayside beacons of the beacon system 116. By establishing the position of the vehicle on the guideway and the signal aspect, the on-board computer 106 establishes the route the vehicle 102 is authorized to take and the speed limit of the vehicle. The speed limit is determined by the on-board computer using the speed limit associated with the signal aspect and the speed limit associated with the vehicle's position on the guideway from the guideway map.


The on-board computer 106 determines if the signal aspect is commanded on and if the signal 124 is actually “lit” based on signal data received by on-board beacon 112. The on-board beacon 112 receives signal data from a signal aspect beacon 118 associated with the signal 124. The on-board computer uses the signal data received from the signal aspect beacon 118 associated with the signal 124 to determine if a “dark” signal aspect reported by the on-board camera 110 is valid.


A communication system such as WiFi and/or LTE and/or Bluetooth and/or UWB is used to determine the signal ID and aspect via communication message from a wayside radio 138 to the on-board radio 136. The wayside radio antenna 142 (or antenna array) location on the guideway is in the database map. In at least some embodiments, the wayside radio is a dedicated radio associated with the signal. In at least some embodiments, the wayside radio is a generic radio for communicating different types of information.


The range between the on-board radio antenna 140 and the wayside radio antenna 142 (or antenna array) is measured using range estimation based on signal-to-noise ratio range measurement techniques (e.g., RSSI). The signal-to-noise ratio range values behave according to a Poisson distribution as a function of the range between these two devices. In accordance with another embodiment, the range is measured using range estimation based on time-of-flight range measurement techniques (e.g., FTM RTT). In accordance with another embodiment, the range is measured using angle of arrival and/or angle of departure estimation based on a MIMO (Multiple Input Multiple Output) antenna array.
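As a sketch of the time-of-flight alternative only, the one-way range follows from the measured round-trip time once the responder's turnaround delay is removed (values and interface are illustrative, not the disclosure's implementation):

    SPEED_OF_LIGHT_M_S = 299_792_458.0

    def range_from_round_trip(t_round_trip_s, t_turnaround_s=0.0):
        """One-way range from a measured round-trip time, after removing the
        responder's (assumed known) turnaround delay."""
        return SPEED_OF_LIGHT_M_S * (t_round_trip_s - t_turnaround_s) / 2.0

    # A 1.2 us round trip with a 0.5 us responder turnaround gives ~105 m
    print(range_from_round_trip(1.2e-6, 0.5e-6))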


In at least some embodiments, Wi-Fi, LTE, Bluetooth or UWB communication systems typically operate within the 2.4 GHz to 10 GHz base frequency band.



FIG. 4 is a diagram 400 of a signal and trackside beacon arrangement, in accordance with some embodiments. A signal 402, similar to signal 124 in FIG. 1, includes three signal aspects: a green aspect 404, a yellow aspect 406 and a red aspect 408. The green aspect 404 is electrically connected to and sends power to a green aspect trackside beacon 410. The yellow aspect 406 is electrically connected to and sends power to a yellow aspect trackside beacon 412. The red aspect 408 is electrically connected to and sends power to a red aspect trackside beacon 414. A MAU 420 is communicably connected with a position trackside beacon 416, similar to positioning beacon 120 in FIG. 1B. The MAU 420 sends signals corresponding to the signal aspect, speed limit and route to the position trackside beacon 416. The position trackside beacon 416 is connected to a power source 418.


The Movement Authority Unit (MAU) calculates the movement authority for each train based on the train position, switch status, signal aspect, or the like. The MAU is located at the central control room or at station rooms. It is a wayside central control unit. The MAU receives the location of each train, the switch statuses, route statuses, signal aspect statuses, etc., and determines the movement authority for each train.


Signal aspect beacon 118 receives power from an associated aspect (e.g., the red signal aspect 134) of a signal 124. Positioning beacon 120 does not receive power from an associated aspect but is in communication with MAU 128, receiving signal aspect information and relaying the signal ID and aspect information to the on-board beacon 112. If the signal aspect is red, for example, the on-board beacon 112 receives the signal data from the positioning beacon 120 and the signal aspect beacon 118, and the on-board computer 106 uses the signal data to establish the position of the vehicle 102 and determine the signal aspect. If two positioning beacons 120 are used, the position is reestablished and the signal aspect is determined. If the signal aspect is not red, then the signal aspect beacon 118 is not powered and therefore is not detected by the on-board beacon 112. However, if the signal aspect is red, then the signal aspect beacon 118 is powered and therefore is detected by the on-board beacon 112.
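The resulting detection logic can be sketched as follows, with hypothetical beacon IDs; each aspect beacon is detectable only while its aspect is powered (compare Table 1 below):

    def infer_aspect(detected_beacon_ids, aspect_beacons):
        """Infer the displayed aspect from which aspect beacons respond."""
        lit = [aspect for aspect, beacon_id in aspect_beacons.items()
               if beacon_id in detected_beacon_ids]
        if len(lit) != 1:
            return None  # zero or multiple powered aspect beacons: treat as a fault
        return lit[0]

    aspect_beacons = {"green": "B610", "yellow": "B612", "red": "B614"}
    # "B616" is the always-powered positioning beacon and carries no aspect
    print(infer_aspect({"B614", "B616"}, aspect_beacons))  # red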



FIG. 2 is a graph plotting the speed of a vehicle as the speed of the vehicle increases through various thresholds until a braking command is generated. The graph plots the speed of the vehicle on the vertical axis against time on the horizontal axis. As shown by the graph, the vehicle's speed begins at zero at time zero. As time passes, the vehicle gains speed. The train reaches a LOS speed limit at the first dashed line. The train reaches a first speed threshold (Speed threshold 1) at the second dashed line. When the train reaches the first speed threshold, audio and visual warnings are generated to inform the driver that the first speed threshold has been reached. If the brakes are not applied, the speed of the vehicle continues to increase until a second speed threshold (Speed threshold 2) is reached, indicated by the third dashed line. At this time, a braking command is generated, and the vehicle speed reduces to zero, i.e., the vehicle comes to a stop.


If a vehicle, such as vehicle 102 in FIG. 1, enters a non-CBTC territory, or the position of the vehicle 102 is not determined, or the communication with a MAU 128 is not established, on-board computer 106 supervises that the speed of the vehicle 102 does not exceed the LOS speed limit determined from the vehicle's LOS visible distance along the track. If the speed of the vehicle 102 exceeds the LOS speed limit, a warning, e.g., audio and/or visual, is generated for the driver of the vehicle. When the speed increases further, the on-board computer 106 commands the vehicle 102 to brake to a stop.



FIG. 3 is a graph depicting the speed of a vehicle as a propulsion runaway condition occurs, plotting vehicle speed 300 on the vertical axis against time on the horizontal axis. In a first phase (t0-t1), the vehicle speed is increasing in a propulsion runaway condition 302. When a predetermined threshold speed 308, based on a signal aspect determined by the signal aspect enforcement system of FIGS. 1A and 1B, is reached at t1, the signal aspect enforcement system sends an engine cut-off signal. The engine is cut off and the vehicle moves without propulsion during a coasting period 304. At time t2, the signal aspect enforcement system sends a brake request. The emergency brakes are applied during an emergency braking period 306, and the vehicle slows down. In some embodiments, only the braking command is sent, which is further decomposed by the vehicle's braking system into an engine (propulsion) cut-off command sent to the propulsion system followed by a braking command to the braking system.



FIG. 5 is a diagram 500 of a side-view of a trackside beacon installation, such as beacon system 116 including signal aspect beacon 118 and positioning beacon 120, in accordance with some embodiments. A vehicle 502 moves along a track 503, i.e., the ego track. The vehicle 502 includes an on-board beacon 504 connected to an on-board computer 505. A trackside beacon 506, positioned along the track 503, communicates with the on-board beacon 504. Using time-of-flight information from the communication, the on-board computer 505 determines a first along-tracks distance 512 between the trackside beacon 506 and the on-board beacon 504. A second along-tracks distance 514 from a signal 508 to the trackside beacon 506 is communicated by the trackside beacon 506 to the on-board beacon 504 or, in accordance with some embodiments, provided to the on-board computer 505 by guideway map data.


An along-track-to-signal distance 510 to the signal 508 is determined by the on-board computer 505 by adding the first along-tracks distance 512 and the second along-tracks distance 514. The along-track-to-signal distance 510 is the along-track distance from the signal 508 to the vehicle 502. A worst case braking distance for the vehicle 502 at a given location on the track 503 is determined by the on-board computer 505 using the speed of the vehicle 502, the weight of the vehicle 502, the slope of the track 503 and other factors related to qualities of the vehicle or track 503. The speed of the vehicle 502 is controlled by the on-board computer 505 so that the along-track-to-signal distance 510, the distance from the vehicle 502 to the next signal 508, is always greater than the worst case braking distance. If the signal 508 has a red aspect and the worst case braking distance is greater than the along-track-to-signal distance, the on-board computer 505 sends a signal to engage the vehicle's emergency brakes (not shown).


A worst case braking distance must be smaller than the smallest LOS distance in the line, otherwise the vehicle's capability to stop before a red signal aspect, such as the red signal aspect 134 in FIG. 1, is not guaranteed. The worst case braking distance is compared against an along-tracks distance 510 to the signal 508. If the signal aspect is the red signal aspect 134 and the worst case braking distance is greater than the along-tracks distance to the signal 508, emergency brakes are requested.
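A minimal sketch of this supervision check, assuming the along-track distances and the worst case braking distance are already available in metres:

    def emergency_brake_required(aspect, d_to_signal_m, d_worst_case_brake_m):
        """Request emergency brakes when the signal is red and the worst case
        braking distance is no longer smaller than the distance to the signal."""
        return aspect == "red" and d_worst_case_brake_m >= d_to_signal_m

    # Along-track-to-signal distance 510 = first distance 512 + second distance 514
    d_510 = 85.0 + 40.0
    print(emergency_brake_required("red", d_510, 150.0))  # True: cannot stop in time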



FIG. 6 is a diagram 600 of a signal and trackside beacons and retroreflector arrangement, in accordance with some embodiments. A signal 602 includes three signal aspects: a green aspect 604, a yellow aspect 606 and a red aspect 608. The green aspect 604 is electrically connected to and sends power to a green aspect trackside beacon 610. The yellow aspect 606 is electrically connected to and sends power to a yellow aspect trackside beacon 612. The red aspect 608 is electrically connected to and sends power to a red aspect trackside beacon 614. A trackside positioning beacon 616 is electrically connected to a power source 618. The red aspect 608 is electrically connected to and sends power to a red aspect retroreflector 620.


Each signal 602 is associated with a red aspect retroreflector 620 driven by the red aspect 608 of the signal. In accordance with some embodiments, multiple red aspect retroreflectors 620 are implemented. In accordance with some embodiments, the red aspect retroreflector 620 is an active retroreflector, such as a Van Atta retroreflector or equivalent, powered by the signal's red aspect 608. The red aspect retroreflector 620 significantly boosts the strength of the return reflection. For example, if the red aspect 608 is illuminated or “lit,” then the associated red aspect retroreflector 620 is powered and retro-reflects the radar signal to the radar, such as the on-board radar 108 in FIG. 1. If the red aspect 608 is not illuminated or “dark,” then no (or a significantly weaker) retro-reflection is observed by the radar. The red aspect retroreflector 620 associated with a signal 602 is installed a predetermined distance before the signal, such as signal 124 in FIG. 1, to increase the on-board radar detection range.



FIGS. 7A and 7B are diagrams 700 of route selection in response to a signal aspect identified by the signal aspect enforcement system shown in FIG. 1, in accordance with some embodiments. A signal 702 showing a green aspect 704 instructs a vehicle, such as vehicle 102 in FIG. 1, to take a normal, non-diverging route 706 as the vehicle moves along the tracks 708. A signal 710 showing a yellow aspect 712 instructs a vehicle, such as vehicle 102 in FIG. 1, to take a turnout diverging route 714.


The on-board computer 106 determines the reaction to the signal aspect. For example: for a red aspect, stop before the signal; for a yellow aspect, proceed at the speed limit specified in the database for the yellow aspect; and for a green aspect, proceed at the speed limit specified in the database for the green aspect. The on-board computer 106 determines the route based on the signal aspect.
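A sketch of this lookup, with illustrative speed limits (the actual limits reside in the on-board map database):

    ASPECT_REACTIONS = {
        "red": {"action": "stop before the signal", "speed_limit_kmh": 0},
        "yellow": {"action": "proceed on the diverging route", "speed_limit_kmh": 40},
        "green": {"action": "proceed on the normal route", "speed_limit_kmh": 80},
    }

    def reaction_for(aspect):
        """Look up the commanded reaction for a determined signal aspect."""
        return ASPECT_REACTIONS[aspect]

    print(reaction_for("yellow"))  # proceed on the diverging route at 40 km/h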



FIG. 8 is a flowchart 800 of a signal aspect enforcement method, in accordance with some embodiments. Initially, the position of a vehicle, such as vehicle 102 in FIG. 1, is unknown or communication with the vehicle is not available and the flow begins in process 802. The flow proceeds to process 804 wherein the process determines if the vehicle speed is less than the maximum LOS speed. If the speed is not less than the maximum LOS speed, the flow proceeds to process 806 and the emergency brakes are engaged. If the speed is less than the maximum LOS speed or emergency brakes have been engaged, the flow continues to process 808 where the grade of the tracks is determined. The flow continues to process 810 wherein the worst case braking distance for the vehicle is determined.


The flow then proceeds in parallel to a camera process 812 and a beacon system process 814 to determine vehicle and signal parameters. In accordance with other embodiments, the camera process 812 and the beacon system process 814 proceed serially. In at least one embodiment, camera process 812 proceeds prior to beacon system process 814. In some embodiments, the camera process 812 and the beacon system process 814 proceed simultaneously or nearly simultaneously, because the data generated by the processes represent the vehicle's current position and situation, and the two sets of data are then compared. A camera, such as the on-board camera 110 in FIG. 1, generates image data representing the guideway near the vehicle including a signal, such as signal 124 in FIG. 1. The camera process 812 begins by using image and color recognition processes to determine the signal ID of the signal from the image data generated by the camera in process 816. The vehicle position is determined from the signal ID in process 818. The signal aspect of the signal 124 is determined using color and image recognition processes on the image data generated by the camera in process 820. The distance along the tracks from the vehicle to the signal is determined from visual data generated by the camera in process 822.


The on-board camera 110 is installed on board the vehicle 102 looking forward. The images/frames captured by the on-board camera 110 are processed, using machine vision algorithms and/or neural network algorithms, to identify the signal ID in process 816 and the associated aspects in process 820. The signal ID and aspect are checked to verify consistency with the expected signal ID and aspect: that the signal ID identified based on the camera images/frames is a valid signal ID contained in the guideway map database, that the number of aspects identified based on the camera images/frames matches the expected number of aspects specified in the database, that the aspect colours identified based on the camera images/frames match the expected colours specified in the database, and that the signal aspects' spatial arrangement matches the expected spatial arrangement specified in the guideway map database.


If the checks to verify consistency are repeatedly passed, for a predetermined minimum number of check cycles (typically 3), then the signal attributes are deemed verified.
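A sketch of this debounce logic, with the typical value of three consecutive passing cycles:

    class ConsistencyVerifier:
        """Deem the signal attributes verified only after the consistency
        checks pass for a minimum number of consecutive cycles."""

        def __init__(self, required_cycles=3):
            self.required_cycles = required_cycles
            self.consecutive_passes = 0

        def update(self, check_passed):
            # Any failed cycle resets the counter
            self.consecutive_passes = self.consecutive_passes + 1 if check_passed else 0
            return self.consecutive_passes >= self.required_cycles

    verifier = ConsistencyVerifier()
    results = [verifier.update(ok) for ok in (True, True, False, True, True, True)]
    print(results)  # [False, False, False, False, False, True]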


The on-board controller determines the along-tracks distance to the signal in process 822 and determines the position of the vehicle in process 818 on the guideway (based on the signal ID, the signal location on the map and the along-tracks distance to the signal) and the aspect of the signal.












TABLE 1

                         Signal aspect
                         Red          Yellow       Green
    Red aspect           Power on     Power off    Power off
    Yellow aspect        Power off    Power on     Power off
    Green aspect         Power off    Power off    Power on
    Positioning          Power on     Power on     Power on









The beacon system process 814 begins by determining the signal ID from data received from the signal aspect beacon in process 824. The vehicle position is determined using time-of-flight information and data from the guideway map in process 826. The signal aspect is determined from data received from the signal aspect beacon in process 828. The distance along the tracks from the vehicle to the signal is determined from the time-of-flight information and data from the guideway map in process 830.


The on-board beacons, such as on-board beacons 112 in FIG. 1, periodically scan which trackside beacons, such as trackside beacons 118 in FIG. 1, are available within a predefined range (typically 200 m or longer). If a trackside beacon 118 is within the scanning range, the trackside beacon 118 responds to the on-board beacon 112 with the trackside beacon ID. Then, the on-board beacon determines the range to the trackside beacon, based on time of flight or equivalent techniques, and reports the range and the beacon's ID to the on-board computer 106.


The on-board computer 106 checks if the trackside beacon ID is associated with a signal in process 824. The on-board computer 106 determines the vehicle position on the guideway based on at least one signal aspect trackside beacon 118 and one positioning trackside beacon 120 associated with the signal in process 826. The on-board computer 106 determines the signal aspect reported by the trackside beacon 118 in process 828.


The on-board computer 106 converts the range measured by the on-board beacon 112 to the along-tracks distance to the signal aspect trackside beacon 118 and determines the along-tracks distance to the signal in process 830.


The parameters determined by the camera process 812 and the beacon system process 814 are then compared in process 832 to determine if there is a two-out-of-two (2oo2) match. In 2oo2 duplex voting, both the data from the camera and the data from the beacon system must “agree” to proceed. If there is not a match, the process sends an alarm indicating a camera and/or radar failure in process 834. The alarm is sent to the vehicle on-board controller and in some cases to a diagnostics computer located at the central control room or maintenance depot. This check provides a high level of safety integrity. In some embodiments, the level of safety integrity is Safety Integrity Level (SIL) 4. For a device to be rated as SIL 4, the device is required to have demonstrable on-demand reliability. SIL 4 is based on the International Electrotechnical Commission (IEC) standard IEC 61508 and the EN 50126 and EN 50129 standards. SIL 4 requires the probability of failure per hour to range from 10⁻⁸ to 10⁻⁹. These checks include 2oo2 voting between the signal ID and signal aspect determined based on the camera and the beacon system.
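A minimal sketch of the 2oo2 comparison (the reaction on mismatch is simplified to returning None; per the description above, an alarm and, for some mismatches, a brake application follow):

    def vote_2oo2(camera_reading, beacon_reading):
        """Two-out-of-two voting: both independent channels must agree exactly;
        any disagreement is treated as unsafe."""
        if camera_reading == beacon_reading:
            return camera_reading  # the agreed value may be used
        return None                # mismatch: alarm and safe reaction follow

    print(vote_2oo2(("S124", "red"), ("S124", "red")))    # ('S124', 'red')
    print(vote_2oo2(("S124", "red"), ("S124", "green")))  # None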


The signal ID and aspect derived from the camera images/frames and from the beacon system must be an exact match, otherwise the brakes are applied. In some cases, due to visibility constraints, weather or other environmental conditions, the camera does not detect the signal. In that case, the signal's attributes from the beacon system are trusted and an alarm is sent indicating camera failure, because the camera is more sensitive to weather and environmental influences.


At times, a signal aspect becomes “dark,” i.e., the signal aspect is commanded to be illuminated or “lit” but is not actually illuminated. In this situation, the beacon system still detects the correct signal aspect. If the beacon system determines that an aspect is “on” but the camera does not, while no other aspect is determined to be on by the camera, then the beacon system is trusted and an alarm is sent indicating a not illuminated or “dark” aspect.


The vehicle's position derived from the camera images/frames and the beacon system must agree within a predetermined range (typically 5 m), otherwise the brakes are applied. If the vehicle position derived from the camera images/frames and the beacon system agree within the expected range, then the vehicle's position derived from the beacon system is used because the beacon system is more accurate than the camera.


The vehicle position derived from the beacon system is used to determine an along-tracks position window (typically 5 m to 10 m) in which a retroreflector associated with the signal red aspect is expected to be detected. The vehicle position derived from the beacon system is more accurate than the position derived by the camera. The beacon system has a greater range than the camera system.


If the retroreflector is detected within the expected window, a check is performed to verify that the signal aspect derived based on the 2oo2 voting between the camera and the beacon system is red. If the signal aspect derived based on the 2oo2 voting between the camera and the beacon system is red, the signal aspect is confirmed to be red; otherwise, an alarm is sent indicating radar failure.


If the retroreflector is not detected within the expected window, a check is performed to verify that the signal aspect derived based on the 2oo2 voting between the camera and the beacon system is not red. If the signal aspect derived based on the 2oo2 voting between the camera and the beacon system is not red, the signal aspect derived based on the 2oo2 voting between the camera and the beacon system is confirmed; otherwise, an alarm is sent indicating radar failure.
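Both branches reduce to a single consistency rule, sketched here: the retroreflector must be detected if and only if the voted aspect is red:

    def confirm_with_retroreflector(voted_aspect, reflector_detected):
        """Cross-check the 2oo2 camera/beacon aspect against the radar: a
        detected red-aspect retroreflector must coincide with a red vote."""
        if reflector_detected == (voted_aspect == "red"):
            return voted_aspect  # aspect confirmed
        raise RuntimeError("radar failure alarm")

    print(confirm_with_retroreflector("red", True))     # red confirmed
    print(confirm_with_retroreflector("green", False))  # green confirmed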


If there is a 2oo2 match in process 832, or an alarm has been sent in process 834, the process provides the signal aspect and the distance along the track from the vehicle to the signal in process 836. The process provides the vehicle position in process 838 to a radar process 840. The on-board computer 106 determines the signal aspect using the on-board radar 108 and retroreflector 122 in process 842 and determines the distance along the tracks from the vehicle to the signal in process 844. These values are compared in process 846 to the aspect and distance along the tracks from the vehicle to the signal provided by process 836. If the values do not match in process 846, an alarm is sent indicating a camera and/or radar failure in process 834. If the values match in process 846, the process determines the route in process 848. The process determines the speed limit in process 850. The process determines if the distance along the tracks from the vehicle to the signal is less than the worst case braking distance in process 852. If the distance along the tracks from the vehicle to the signal is less than the worst case braking distance, the process makes an emergency brake request at process 854 and returns to process 808 to determine the grade of the track. If the distance along the tracks from the vehicle to the signal is not less than the worst case braking distance, the process controls the vehicle to stop before the signal is reached in process 856 and the flow returns to process 808.



FIG. 9 is a set-up 900 for conducting a camera built-in test (testing for the color red) with an external colored light source, in accordance with some embodiments. An external colored light source 902 includes areas corresponding to the green aspect 904, the yellow aspect 906 and the red aspect 908 of a signal. The external colored light source 902 has a height H and a width W. The colored aspects 904, 906 and 908 are circular and have diameters D. The camera generates an image 910. The camera searches the image 910 for red, green and blue pixels that form a shape 916 having a height h and a width w. The camera identifies three areas within the shape 916 having a diameter d: two of the areas 912 are “dark,” having red, green and blue pixels on, and the third area 914 has only red pixels on.


The method of checking the camera's health includes a pixel and pixel-color health check, a pattern check, a location-within-the-FOV check, a size check and an intensity check.


The images/frames 910 reported by the camera are compared against the expected images/frames while the camera is facing a colored light source 902 and/or colored signs with a known pattern, installed on or near the camera housing or on the wayside at known locations, while the camera distance from the colored light source 902 and/or colored signs is within a known distance range.
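A simplified sketch of such a frame comparison, tolerating a small fraction of mismatching pixels (the threshold and frame representation are illustrative; the disclosure does not specify them):

    def camera_health_check(reported_frame, expected_frame, max_mismatch_ratio=0.02):
        """Compare the reported frame against the expected frame for the known
        test pattern, allowing a small fraction of deviating pixels."""
        mismatches = sum(1 for r, e in zip(reported_frame, expected_frame) if r != e)
        return mismatches / len(expected_frame) <= max_mismatch_ratio

    # Frames as flat lists of (R, G, B) tuples for simplicity
    expected = [(255, 0, 0)] * 100               # known all-red test area
    reported = [(255, 0, 0)] * 99 + [(0, 0, 0)]  # one deviating pixel
    print(camera_health_check(reported, expected))  # True: 1% mismatch tolerated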



FIG. 10 is a diagram 1000 of a camera test system, in accordance with some embodiments. A camera 1002 is positioned within a camera enclosure 1004 having three non-transparent surfaces 1006 and a transparent surface 1008. The camera 1002 is communicably connected with a microcontroller unit (MCU) 1010. Within the camera enclosure 1004, two colored light sources 1012 are positioned so that they are within the camera's outer field of view (FOV) 1014 but outside of the camera's inner FOV 1016. The two colored light sources 1012 are communicably connected with the MCU 1010. The MCU 1010 is communicably connected with a computer 1018. The camera 1002 is communicably connected with computer 1018.


The signal aspect enforcement system and method, in accordance with an embodiment, checks the health of a visible spectrum/near IR/LWIR camera 1002 with a dedicated colored light source 1012 installed at the camera's enclosure 1004.


The images/frames reported by the camera 1002 are compared against the expected images/frames while the camera 1002 is facing a colored light source 1012, with a known pattern, installed at the camera's enclosure 1004.



FIG. 11 is a diagram 1100 of a signal aspect command line and non-intrusive current monitoring, in accordance with some embodiments. A MAU 1102 is communicably connected with a relay 1104. The MAU 1102 is communicably connected with a red aspect trackside beacon 1106. The MAU 1102 communicates a signal aspect command line to the relay 1104 and the red aspect trackside beacon 1106. The relay 1104 is electrically connected to and provides power to the red aspect 1108 of a signal 1110. The relay is electrically connected to and provides power to the red aspect trackside beacon 1106. The red aspect trackside beacon 1106 monitors the power from the relay 1104 using non-intrusive current monitoring.


The red aspect trackside beacon 1106 detects whether the red aspect 1108 is commanded on and actually illuminated or “lit,” using non-intrusive current monitoring performed by placing an inductive coil on the signal aspect command line feeding the filament of the red aspect 1108.



FIG. 12 is a block diagram of an on-board computer 1200 in accordance with some embodiments.


In some embodiments, the on-board computer 1200 is a general purpose computing device including a hardware processor 1202 and a non-transitory, computer-readable storage medium 1204. Computer-readable storage medium 1204, amongst other things, is encoded with, i.e., stores, computer program code 1206, i.e., a set of executable instructions. Execution of instructions 1206 by hardware processor 1202 represents (at least in part) a movement control tool which implements a portion or all of the methods described herein in accordance with one or more embodiments (hereinafter, the noted processes and/or methods).


Processor 1202 is electrically coupled to computer-readable storage medium 1204 via a bus 1208. Processor 1202 is also electrically coupled to an I/O interface 1210 by bus 1208. A network interface 1212 is also electrically connected to processor 1202 via bus 1208. Network interface 1212 is connected to a network 1214, so that processor 1202 and computer-readable storage medium 1204 are capable of connecting to external elements via network 1214. Processor 1202 is configured to execute computer program code 1206 encoded in computer-readable storage medium 1204 in order to cause the on-board computer 1200 to be usable for performing a portion or all of the noted processes and/or methods. In one or more embodiments, processor 1202 is a central processing unit (CPU), a multi-processor, a distributed processing system, an application specific integrated circuit (ASIC), and/or a suitable processing unit.


In one or more embodiments, computer-readable storage medium 1204 is an electronic, magnetic, optical, electromagnetic, infrared, and/or a semiconductor system (or apparatus or device). For example, computer-readable storage medium 1204 includes a semiconductor or solid-state memory, a magnetic tape, a removable computer diskette, a random access memory (RAM), a read-only memory (ROM), a rigid magnetic disk, and/or an optical disk. In one or more embodiments using optical disks, computer-readable storage medium 1204 includes a compact disk-read only memory (CD-ROM), a compact disk-read/write (CD-R/W), and/or a digital video disc (DVD).


In one or more embodiments, computer-readable storage medium 1204 stores computer program code 1206 configured to cause the on-board computer 1200 to be usable for performing a portion or all of the noted processes and/or methods. In one or more embodiments, computer-readable storage medium 1204 also stores information which facilitates performing a portion or all of the noted processes and/or methods. In one or more embodiments, computer-readable storage medium 1204 stores parameters 1207.


The on-board computer 1200 includes I/O interface 1210. I/O interface 1210 is coupled to external circuitry. In one or more embodiments, I/O interface 1210 includes a keyboard, keypad, mouse, trackball, trackpad, touchscreen, and/or cursor direction keys for communicating information and commands to processor 1202.


The on-board computer 1200 also includes network interface 1212 coupled to processor 1202. Network interface 1212 allows the on-board computer 1200 to communicate with network 1214, to which one or more other computer systems are connected. Network interface 1212 includes wireless network interfaces such as BLUETOOTH, WIFI, WIMAX, GPRS, or WCDMA; or wired network interfaces such as ETHERNET, USB, or IEEE-1394. In one or more embodiments, a portion or all of the noted processes and/or methods is implemented in two or more on-board computers 1200.


The on-board computer 1200 is configured to receive information through I/O interface 1210. The information received through I/O interface 1210 includes one or more of instructions, data, design rules, libraries of standard cells, and/or other parameters for processing by processor 1202. The information is transferred to processor 1202 via bus 1208. The on-board computer 1200 is configured to receive information related to a user interface (UI) through I/O interface 1210. The information is stored in computer-readable storage medium 1204 as UI 1242.


In some embodiments, a portion or all of the noted processes and/or methods is implemented as a standalone software application for execution by a processor. In some embodiments, a portion or all of the noted processes and/or methods is implemented as a software application that is a part of an additional software application. In some embodiments, a portion or all of the noted processes and/or methods is implemented as a plug-in to a software application.


In some embodiments, the processes are realized as functions of a program stored in a non-transitory computer readable recording medium. Examples of a non-transitory computer readable recording medium include, but are not limited to, external/removable and/or internal/built-in storage or memory unit, e.g., one or more of an optical disk, such as a DVD, a magnetic disk, such as a hard disk, a semiconductor memory, such as a ROM, a RAM, a memory card, and the like.


A signal aspect enforcement method for a rail vehicle with an unknown position is performed by receiving speed measurements from a speed measuring device by an on-board controller and determining, using the received speed measurements, if the rail vehicle speed is less than a predetermined line-of-sight threshold speed. The on-board controller receives grade measurements from a grade measuring device and determines the grade of the rail. The on-board controller determines the worst-case braking distance of the rail vehicle using the rail vehicle speed and the grade of the rail. The on-board controller receives image data including a first signal aspect from a camera system and beacon/radio data including a second signal aspect from a beacon/radio system. The on-board controller determines if the first signal aspect matches the second signal aspect and determines the route of the rail vehicle and the speed limit of the rail vehicle based on the first signal aspect.


When the rail vehicle speed is greater than a line-of-sight threshold speed, the on-board controller outputs a brake request.


The image data includes signal identification and the on-board controller uses the signal identification to determine the position of the rail vehicle.


The beacon/radio data includes signal identification and the on-board controller uses the signal identification to determine the position of the rail vehicle.


The image data includes signal identification and the on-board controller uses the signal identification to determine the position of the rail vehicle and the beacon/radio data includes signal identification and the on-board controller uses the signal identification to determine the position of the rail vehicle.


The on-board controller uses the image data to determine a first along-tracks distance to a signal and the beacon/radio data to determine a second along-tracks distance to a signal.


The on-board controller performs tests on the camera to confirm the camera's ability to identify shapes and colours.


A signal aspect enforcement system for a rail vehicle includes an on-board controller; a camera system, in communication with the on-board controller, providing image data including signal aspect to the on-board controller; a beacon/radio system, in communication with the on-board controller, providing received beacon/radio data including signal aspect and signal location to the on-board controller; and a radar system, in communication with the on-board controller, providing radar data to the on-board controller. The on-board controller is configured to use the image data, the received beacon/radio data, and the radar data to determine the signal aspect.
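As a structural illustration of the components just listed, the sketch below models the sensor inputs the on-board controller fuses. The class and field names are invented for illustration and do not appear in the disclosure.

```python
# Structural sketch only; class and field names are invented.
from dataclasses import dataclass
from typing import Optional

@dataclass
class SensorReport:
    aspect: Optional[str]        # signal aspect decoded by this source, if any
    signal_id: Optional[str]     # identification of the signal, if decoded
    distance_m: Optional[float]  # along-tracks distance to the signal, if measured

@dataclass
class OnBoardInputs:
    camera: SensorReport   # image data from the camera system
    beacon: SensorReport   # beacon/radio data, including signal location
    radar: SensorReport    # radar data, used here for ranging

def fused_aspect(inputs: OnBoardInputs) -> Optional[str]:
    """Report an aspect only when every source that decoded one agrees."""
    aspects = {r.aspect for r in (inputs.camera, inputs.beacon, inputs.radar)
               if r.aspect is not None}
    return aspects.pop() if len(aspects) == 1 else None
```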


The on-board controller is configured to receive a rail vehicle speed measurement from a speed measuring device and to determine whether the measured rail vehicle speed is less than a predetermined line-of-sight threshold speed.


The on-board controller is configured to receive rail grade measurements from a rail grade measuring device and to determine the rail grade.


The on-board controller is configured to use the rail vehicle speed and the rail grade to determine a worst-case braking distance for the rail vehicle.
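The disclosure does not give an explicit braking formula, so the sketch below uses the standard constant-deceleration stopping model, v²/(2a), derated for downhill grade; the guaranteed deceleration value is an assumption.

```python
# Worst-case braking sketch under an assumed guaranteed deceleration; the
# disclosure does not specify the exact model.
import math

G = 9.81  # gravitational acceleration, m/s^2

def worst_case_braking_distance(speed_mps: float,
                                downhill_grade_percent: float,
                                guaranteed_decel_mps2: float = 1.0) -> float:
    """Stopping distance from v^2 = 2 * a_eff * d, with grade derating."""
    theta = math.atan(downhill_grade_percent / 100.0)    # track angle from grade
    a_eff = guaranteed_decel_mps2 - G * math.sin(theta)  # downhill grade reduces braking
    if a_eff <= 0.0:
        raise ValueError("no stopping guarantee on this grade")
    return speed_mps ** 2 / (2.0 * a_eff)
```

For example, at 10 m/s on a 2% downhill grade with 1.0 m/s² guaranteed deceleration, this sketch yields a stopping distance of roughly 62 m.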


The radar data is processed by the on-board controller to determine the along-tracks distance to the signal.


The on-board controller is configured to determine the route and the speed limit of the rail vehicle based on the determined signal aspect.


The on-board controller is configured to perform tests on the camera to confirm the camera's ability to identify shapes and colours.


A signal aspect enforcement method for a rail vehicle includes receiving speed measurements from a speed measuring device at an on-board controller and determining, using the received speed measurements, whether the rail vehicle speed is less than a predetermined line-of-sight threshold speed. The on-board controller receives grade measurements from a grade measuring device, determines the grade of the rail, and determines the worst-case braking distance of the rail vehicle using the rail vehicle speed and the grade of the rail. The on-board controller receives image data including a first signal aspect from a camera system and beacon/radio data including a second signal aspect from a beacon/radio system, and determines whether the first signal aspect matches the second signal aspect. The on-board controller determines the route and the speed limit of the rail vehicle based on the first signal aspect. When the rail vehicle speed is greater than the line-of-sight threshold speed, the on-board controller outputs a brake request. The image data includes signal identification, and the on-board controller uses the signal identification to determine the position of the rail vehicle.


The beacon/radio data includes signal identification and the on-board controller uses the signal identification to determine the position of the rail vehicle.


The image data includes signal identification and the on-board controller uses the signal identification to determine the position of the rail vehicle and the beacon/radio data includes signal identification and the on-board controller uses the signal identification to determine the position of the rail vehicle.


The on-board controller uses the image data to determine a first along-tracks distance to a signal and the beacon/radio data to determine a second along-tracks distance to a signal.


When the first along-tracks distance to a signal does not match the second along-tracks distance to the signal, the on-board controller outputs an alarm.
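A minimal sketch of this cross-check follows; the tolerance value and the alarm output are placeholders, not values from the disclosure.

```python
# Cross-check sketch; tolerance and alarm mechanism are placeholders.

def distances_agree(camera_m: float, beacon_m: float, tol_m: float = 5.0) -> bool:
    """True when the two along-tracks distances match within tolerance."""
    agree = abs(camera_m - beacon_m) <= tol_m
    if not agree:
        print("ALARM: along-tracks distance mismatch")  # stand-in for the alarm output
    return agree
```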


The on-board controller performs tests on the camera to confirm the camera's ability to identify shapes and colours.


In an embodiment, a signal aspect enforcement method for a rail vehicle with an unknown position includes determining, by an on-board controller on the rail vehicle, a position of the rail vehicle; determining, by the on-board controller, whether a first signal aspect for a signal matches a second signal aspect for the signal; and determining, by the on-board controller, at least one of a route of the rail vehicle or a speed limit of the rail vehicle when the first signal aspect matches the second signal aspect. In an embodiment, the first signal aspect is determined from first data from a first system on the rail vehicle, and the second signal aspect is determined from second data from a second system on the rail vehicle, which system is different from the first system used to determine the first signal aspect.


In an embodiment, the first system is one of a camera system on the rail vehicle, a beacon/radio system on the rail vehicle, a LiDAR system on the rail vehicle, or a radar system on the rail vehicle, and the second system is one of a camera system on the rail vehicle, a beacon/radio system on the rail vehicle, a LiDAR system on the rail vehicle, or a radar system on the rail vehicle, which system is different from the first system.


In an embodiment, data from the camera system is used to determine the first signal aspect, the camera system being a stereo camera system or multi-camera system, and data from the LiDAR system or the radar system is used to determine the second signal aspect.


In an embodiment, the method further includes receiving a speed measurement from a speed measuring device by the on-board controller, and determining a first speed of the rail vehicle; determining if the first speed is less than a predetermined line-of-sight threshold speed; receiving a grade measurement from a grade measuring device by the on-board controller and determining a first grade of a rail on which the rail vehicle is traveling; determining a worst-case braking distance of the rail vehicle using the first speed and the first grade; and outputting a brake request from the on-board controller when the first speed is greater than the predetermined line-of-sight threshold speed.


In an embodiment, the first data includes a first identification of the signal, the method further including using the on-board controller to make a first determination of the position of the rail vehicle using the first identification of the signal.


In an embodiment, the second data includes a second identification of the signal, the method further including using the on-board controller to make a second determination of the position of the rail vehicle using the second identification of the signal.


In an embodiment, the LiDAR system is used to generate third data including a third identification of the signal, and the radar system is used to generate fourth data including a fourth identification of the signal, the method further including using the on-board controller to make a third determination of the position of the rail vehicle using the third identification of the signal; using the on-board controller to make a fourth determination of the position of the rail vehicle using the fourth identification of the signal; and providing a position of the rail vehicle only if the first, second, third, and fourth determinations of the position all match within a certain predefined tolerance.


In an embodiment, the on-board controller determines the position of the rail vehicle only if a position based on data from the camera system, a position based on data from the beacon/radio system, a position based on data from the LiDAR system, a position based on data from the radar system, and a position (if available) supplied by another independent positioning system all match within a predefined tolerance.
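One way to express this all-sources-agree rule is the consensus sketch below, assuming each system reports a scalar along-track position; the tolerance value is illustrative.

```python
# Position-consensus sketch; representation and tolerance are assumptions.
from typing import Optional, Sequence

def consensus_position(estimates_m: Sequence[float],
                       tol_m: float = 2.0) -> Optional[float]:
    """Return a position only when every available estimate agrees within tol_m.

    estimates_m holds along-track positions from the camera, beacon/radio,
    LiDAR, and radar systems, plus any independent positioning system when
    available; an unavailable source is simply omitted.
    """
    if not estimates_m:
        return None
    if max(estimates_m) - min(estimates_m) > tol_m:
        return None  # no safe position can be reported
    return sum(estimates_m) / len(estimates_m)
```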


In an embodiment, at least two of the camera system, the beacon/radio system, the LiDAR system, and the radar system on the rail vehicle are used to extract at least two rail tracks, the at least two rail tracks including a first track that the rail vehicle occupies and a second track nearby the first track.


In an embodiment, the on-board controller determines an ego track the rail vehicle is on, and an association of the ego track with the signal.


In an embodiment, data from the camera system, data from the LiDAR system, and data from the radar system are used to generate a map, the map including a location of the signal on the map, and an ego track the rail vehicle is on.


In an embodiment, the on-board controller cross checks the generated location of the signal against a stored signal location, and raises an alarm if the generated location of the signal does not match the stored signal location.
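The cross-check against the stored signal location might look like the sketch below; the map representation, lookup, and tolerance are assumptions for illustration.

```python
# Map cross-check sketch; map format, lookup, and tolerance are assumptions.
from math import hypot
from typing import Dict, Optional, Tuple

def cross_check_signal_location(signal_id: str,
                                generated_xy: Tuple[float, float],
                                stored_map: Dict[str, Tuple[float, float]],
                                tol_m: float = 3.0) -> bool:
    """Compare the generated signal location with the stored one; alarm on mismatch."""
    stored_xy: Optional[Tuple[float, float]] = stored_map.get(signal_id)
    if stored_xy is None:
        print(f"ALARM: signal {signal_id} not found in stored map")
        return False
    if hypot(generated_xy[0] - stored_xy[0], generated_xy[1] - stored_xy[1]) > tol_m:
        print(f"ALARM: generated location of signal {signal_id} deviates from stored location")
        return False
    return True
```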


In an embodiment, the on-board controller uses data from the camera system to determine the first signal aspect.


In an embodiment, the on-board controller uses data from the beacon/radio system to determine the second signal aspect.


In an embodiment, the on-board controller accepts a signal aspect only if the first signal aspect based on the data from the camera system and the second signal aspect based on the data from the beacon/radio system match, and the signal aspect is a valid aspect in a database.
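This acceptance rule reduces to a short check, sketched below with an illustrative stand-in for the aspect database.

```python
# Acceptance-rule sketch; the aspect set is an illustrative stand-in for the database.
from typing import Optional

VALID_ASPECTS = {"red", "yellow", "green"}  # stand-in for the valid-aspect database

def accept_aspect(camera_aspect: str, beacon_aspect: str) -> Optional[str]:
    """Accept an aspect only when both sources agree and it is a known aspect."""
    if camera_aspect == beacon_aspect and camera_aspect in VALID_ASPECTS:
        return camera_aspect
    return None  # reject: sources disagree or aspect not in the database
```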


In an embodiment, data from the camera system includes an identification of the signal, an ego track, a nearby track, and an extracted path (trajectory), and the on-board controller uses the identification of the signal, the ego track, the nearby track, and the extracted path (trajectory) together with a map to determine the position of the rail vehicle.


In an embodiment, the on-board controller uses data from the camera system to determine a first along-tracks distance to the signal, uses data from the beacon/radio system to determine a second along-tracks distance to the signal, uses data from the LiDAR system to determine a third along-tracks distance to the signal, and uses data from the radar system to determine a fourth along-tracks distance to the signal.


In an embodiment, the on-board controller determines an along-tracks distance to the signal only if all of the first through fourth along-tracks distances to the signal match within a certain tolerance.


In an embodiment, a signal aspect enforcement method for a rail vehicle includes determining, by an on-board controller, whether a first signal aspect for a signal matches a second signal aspect for the signal; and determining, by the on-board controller, at least one of a route of the rail vehicle or a speed limit of the rail vehicle when the first signal aspect matches the second signal aspect. In an embodiment, the first signal aspect is determined from first data from a first system on the rail vehicle, and the second signal aspect is determined from second data from a second system on the rail vehicle, which system is different from the first system used to determine the first signal aspect.


In an embodiment, the first system is one of a camera system on the rail vehicle, a beacon/radio system on the rail vehicle, a LiDAR system on the rail vehicle, or a radar system on the rail vehicle, and the second system is one of a camera system on the rail vehicle, a beacon/radio system on the rail vehicle, a LiDAR system on the rail vehicle, or a radar system on the rail vehicle, which system is different from the first system.


The foregoing outlines features of several embodiments so that those skilled in the art may better understand the aspects of the present disclosure. Those skilled in the art should appreciate that they may readily use the present disclosure as a basis for designing or modifying other processes and structures for carrying out the same purposes and/or achieving the same advantages of the embodiments introduced herein. Those skilled in the art should also realize that such equivalent constructions do not depart from the spirit and scope of the present disclosure, and that they may make various changes, substitutions, and alterations herein without departing from the spirit and scope of the present disclosure.

Claims
  • 1. A signal aspect enforcement method for a rail vehicle with an unknown position, the method comprising: determining, by an on-board controller on the rail vehicle, a position of the rail vehicle; determining, by the on-board controller, whether a first signal aspect for a signal matches a second signal aspect for the signal; and determining, by the on-board controller, at least one of a route of the rail vehicle or a speed limit of the rail vehicle when the first signal aspect matches the second signal aspect, wherein: the first signal aspect is determined from first data from a first system on the rail vehicle, and the second signal aspect is determined from second data from a second system on the rail vehicle, which system is different from the first system used to determine the first signal aspect.
  • 2. The method of claim 1, wherein: the first system is one of a camera system on the rail vehicle, a beacon/radio system on the rail vehicle, a LiDAR system on the rail vehicle, or a radar system on the rail vehicle, and the second system is one of a camera system on the rail vehicle, a beacon/radio system on the rail vehicle, a LiDAR system on the rail vehicle, or a radar system on the rail vehicle, which system is different from the first system.
  • 3. The method of claim 2, wherein: data from the camera system is used to determine the first signal aspect, the camera system being a stereo camera system or multi-camera system, and data from the LiDAR system or the radar system is used to determine the second signal aspect.
  • 4. The method of claim 2, wherein the first data includes a first identification of the signal, the method further comprising using the on-board controller to make a first determination of the position of the rail vehicle using the first identification of the signal.
  • 5. The method of claim 4, wherein the second data includes a second identification of the signal, the method further comprising using the on-board controller to make a second determination of the position of the rail vehicle using the second identification of the signal.
  • 6. The method of claim 5, wherein: the LiDAR system is used to generate third data including a third identification of the signal, and the radar system is used to generate fourth data including a fourth identification of the signal, the method further comprising: using the on-board controller to make a third determination of the position of the rail vehicle using the third identification of the signal; using the on-board controller to make a fourth determination of the position of the rail vehicle using the fourth identification of the signal; and providing a position of the rail vehicle only if the first, second, third, and fourth determinations of the position all match within a certain predefined tolerance.
  • 7. The method of claim 2, wherein the on-board controller determines the position of the rail vehicle only if a position based on data from the camera system, a position based on data from the beacon/radio system, a position based on data from the LiDAR system, a position based on data from the radar system, and a position (if available) supplied by another independent positioning system all match within a predefined tolerance.
  • 8. The method of claim 2, wherein at least two of the camera system, the beacon/radio system, the LiDAR system, and the radar system on the rail vehicle are used to extract at least two rail tracks, the at least two rail tracks including a first track that the rail vehicle occupies and a second track nearby the first track.
  • 9. The method of claim 2, wherein data from the camera system, data from the LiDAR system, and data from the radar system are used to generate a map, the map including: a location of the signal on the map, and an ego track the rail vehicle is on.
  • 10. The method of claim 9, wherein the on-board controller cross checks the generated location of the signal against a stored signal location, and raises an alarm if the generated location of the signal does not match the stored signal location.
  • 11. The method of claim 2, wherein the on-board controller uses data from the camera system to determine the first signal aspect.
  • 12. The method of claim 11, wherein the on-board controller uses data from the beacon/radio system to determine the second signal aspect.
  • 13. The method of claim 12, wherein the on-board controller accepts a signal aspect only if the first signal aspect based on the data from the camera system and the second signal aspect based on the data from the beacon/radio system match, and the signal aspect is a valid aspect in a database.
  • 14. The method of claim 2, wherein: data from the camera system includes an identification of the signal, an ego track, a nearby track, and an extracted path (trajectory), and the on-board controller uses the identification of the signal, the ego track, the nearby track, and the extracted path (trajectory) together with a map to determine the position of the rail vehicle.
  • 15. The method of claim 2, wherein the on-board controller uses data from the camera system to determine a first along-tracks distance to the signal, uses data from the beacon/radio system to determine a second along-tracks distance to the signal, uses data from the LiDAR system to determine a third along-tracks distance to the signal, and uses data from the radar system to determine a fourth along-tracks distance to the signal.
  • 16. The method of claim 15, wherein the on-board controller determines an along-tracks distance to the signal only if all of the first through fourth along-tracks distances to the signal match within a certain tolerance.
  • 17. The method of claim 1, further comprising: receiving a speed measurement from a speed measuring device by the on-board controller, and determining a first speed of the rail vehicle; determining if the first speed is less than a predetermined line-of-sight threshold speed; receiving a grade measurement from a grade measuring device by the on-board controller and determining a first grade of a rail on which the rail vehicle is traveling; determining a worst-case braking distance of the rail vehicle using the first speed and the first grade; and outputting a brake request from the on-board controller when the first speed is greater than the predetermined line-of-sight threshold speed.
  • 18. The method of claim 1, wherein the on-board controller determines an ego track the rail vehicle is on, and an association of the ego track with the signal.
  • 19. A signal aspect enforcement method for a rail vehicle, the method comprising: determining, by an on-board controller, whether a first signal aspect for a signal matches a second signal aspect for the signal; and determining, by the on-board controller, at least one of a route of the rail vehicle or a speed limit of the rail vehicle when the first signal aspect matches the second signal aspect, wherein: the first signal aspect is determined from first data from a first system on the rail vehicle, and the second signal aspect is determined from second data from a second system on the rail vehicle, which system is different from the first system used to determine the first signal aspect.
  • 20. The method of claim 19, wherein: the first system is one of a camera system on the rail vehicle, a beacon/radio system on the rail vehicle, a LiDAR system on the rail vehicle, or a radar system on the rail vehicle, and the second system is one of a camera system on the rail vehicle, a beacon/radio system on the rail vehicle, a LiDAR system on the rail vehicle, or a radar system on the rail vehicle, which system is different from the first system.
PRIORITY CLAIM

The present application is a continuation-in-part of U.S. patent application Ser. No. 17/073,145, filed Oct. 16, 2020, and claims the priority of U.S. Provisional Application No. 62/916,672, filed Oct. 17, 2019, each of which is incorporated herein by reference in its entirety.

Provisional Applications (1)
Number Date Country
62916672 Oct 2019 US
Continuation in Parts (1)
Number Date Country
Parent 17073145 Oct 2020 US
Child 18538568 US