TRAIN COLLISION AVOIDANCE AND ALERT

Information

  • Publication Number
    20190248392
  • Date Filed
    February 12, 2018
  • Date Published
    August 15, 2019
Abstract
Systems and methods are provided for generating a safety notification to a first train with respect to a speed of a second train along a track. The system includes a camera transceiver, comprising a light emitter configured to emit a narrow bandwidth light, an image sensor, one or more light reflectors and a lens subsystem; and a processor configured to perform the steps of: determining a speed and a position of the first train; receiving images from the camera transceiver; processing the images to calculate a speed and a position of a potential obstacle on the track; responsively to calculating the potential obstacle speed and position, determining a safe speed of the first train; and providing a control signal to control the first train.
Description
FIELD OF THE INVENTION

The present invention is directed to systems and methods for train operation and safety, and in particular methods for train signaling.


BACKGROUND

Loss of life and property in the event of a train collision can be enormous. An ever-present danger is the possibility of a collision between trains if there is a track switching malfunction or error. Consequently, multiple, redundant safety mechanisms are imperative.


Light beam signaling between trains is described in U.S. Pat. No. 3,365,572 to Henry, “Automatic collision prevention, alarm and control system” and in U.S. Pat. No. 6,290,188 to Bassett, “Collision avoidance system for track-guided vehicles”. Henry describes a system in which light emitted by trains is turned on and off by a preset duty cycle in order to differentiate that light from ambient light, thereby assisting in visual recognition of trains that might be on a collision course. Bassett describes a system whereby polarized light emitted by light sources on one train is reflected diffusely by on-coming trains, but reflected in a polarized manner by other objects.


SUMMARY

Embodiments of the present invention provide systems and methods for increasing the safety and efficiency of trains by facilitating the distant recognition of obstacles, including trains on the same tracks. According to an embodiment of the present invention, a system may include a camera transceiver, including a light emitter configured to emit a narrow bandwidth light, an image sensor, one or more light reflectors and a lens subsystem; and a processor configured to perform the steps of: determining a train speed and a train position; receiving images from the camera transceiver; processing the images to calculate a speed and a position of a potential obstacle on the track; responsively to calculating the potential obstacle speed and position, determining a safe speed of the train; and responsively to determining the safe speed, providing a control signal to control the train. The control signal may control an automatic brake unit, an acceleration unit or an audio or visual driver alert unit of the train, or a turning, switching or other control unit of the railway system. Determining the safe speed may include determining a rate of deceleration or acceleration.


In some embodiments, the safe speed may be to stop. The train may be a first train, the potential obstacle may be a second train travelling towards the first train on the track, processing the images may include determining the length of the second train and determining the safe speed may include determining that the train must stop before reaching a junction between the trains. Alternatively, the train may be a first train, the potential obstacle may be a second train travelling towards the first train on the track, processing the images may include determining the length of the second train and determining the safe speed may include determining that the train must take a rail switch at a junction located at a point between the trains.


In some embodiments, the notification further includes a safety level, and the system may be further configured to display the safety level on a screen viewable by the driver or by a railway system supervisor. Additionally, the display may be a stand-alone display, a multimedia/GPS navigation display, or a smartphone/PDA display. The safety level may be one of a set of possible levels comprising a first level indicating that the first train speed may be safe and a second level indicating that the speed may be unsafe. In further embodiments, the train may be a driverless train. The potential obstacle may be a second train and processing the images may include identifying a representation of the second train in the images. The train may be a first train, the potential obstacle may be an on-coming second train on the track and calculating the speed and position of the obstacle may include receiving the images while the relative speed between the first and second trains may be changing.


Receiving the images may further include determining a contour of the track from the images and responsively orienting the camera transceiver to a view of the track. The laser emitter and the image sensor may respectively transmit and receive laser pulses through one or more common lenses of the lens subsystem. The train may be a first train, the potential obstacle may be a second train travelling towards the first train on the track, and processing the images may include determining the length of the second train by matching a pattern of one or more of a light reflector configuration, a shutter rate, and a color to a pre-defined pattern defining a length. The camera transceiver may be a front camera transceiver and the system may further include back camera transceivers on the second train.


Processing the images may include identifying reflections from the back camera transceivers on a second train and determining the safe speed may include determining a speed to prevent the first train from hitting the back of the second train.


The system may include reflectors installed along the track, processing the images may include identifying an obstruction by identifying a lack of reflected light along a portion of the track, and determining the safe speed may include determining to stop the train due to the lack of reflected light. The system may include a warning tower situated near a railway crossing having reflectors installed on the crossing, processing the images may include identifying from a field of view of the warning tower an obstruction of the reflectors, and determining the safe speed may include determining to stop the train due to the obstruction. The system may include a warning tower situated near a railway crossing, a vehicle approaching the railway crossing may be equipped with a second light emitter and the image sensor may transmit and receive laser pulses.


The system may include a warning tower situated near a railway crossing having reflectors installed on the road, processing the images may include identifying from a field of view of the warning tower an obstruction of the reflectors and responsively determining to stop the vehicle. The system may include a warning tower situated near a railway crossing, processing the images may include identifying from a field of view of the warning tower an obstruction of a registration plate of the vehicle, and determining the vehicle safe speed may include determining to stop the vehicle.


There is further provided, according to embodiments of the present invention, a method for generating a safety notification for a train on a railway track with respect to a vehicle moving on a road crossing the track, the method including: configuring a camera transceiver including a light emitter configured to emit a narrow bandwidth light, an image sensor, light reflectors and a lens subsystem; and providing a processor configured to be communicatively coupled to the camera transceiver and to a memory that may include computer-readable instructions, which cause the processor to perform the steps of: determining a vehicle speed and a vehicle position; receiving images from the camera transceiver; processing the images to calculate a length of a vehicle travelling on the road; responsively to calculating the length of the vehicle, determining a safe speed of the train; and responsively to determining the safe speed, providing a control signal to control the train.


There is further provided, according to embodiments of the present invention, a method for generating a safety notification with respect to a speed of a first train moving on a track, the method including: configuring a camera transceiver including a light emitter configured to emit a narrow bandwidth light, an image sensor, one or more light reflectors and a lens subsystem; and providing a processor configured to be communicatively coupled to the camera transceiver and to a memory that may include computer-readable instructions, which cause the processor to perform the steps of: determining a first train speed and a first train position; receiving images from the camera transceiver; processing the images to calculate a speed, a position, and a length of a second train travelling on the track; responsively to calculating the second train speed, position, and length of the second train, determining a safe speed of the first train; and providing a control signal to control the first train.


The present invention will be more fully understood from the following detailed description of embodiments thereof.





BRIEF DESCRIPTION OF DRAWINGS

The accompanying drawings illustrate embodiments of the disclosed subject matter and explain principles of embodiments of the disclosed subject matter. Structural details are shown only to the extent necessary for an understanding of the disclosed subject matter and the various ways in which it may be practiced.



FIG. 1 is a schematic, block diagram of a monitoring system for detecting obstacles on a railway track, according to an embodiment of the present invention.



FIGS. 2A-2C are schematic, pictorial illustrations of a railway system including two trains equipped with the monitoring system, according to an embodiment of the present invention.



FIG. 3 is a schematic, pictorial illustration of a railway system including a train and a railway crossing tower equipped with the monitoring system, according to an embodiment of the present invention.



FIG. 4 is a schematic, pictorial illustration of a railway system including a train equipped with the monitoring system and tracks having “cat eye” reflectors, according to an embodiment of the present invention.



FIG. 5 is a schematic, pictorial illustration of a camera transceiver of the monitoring system, according to an embodiment of the present invention.



FIG. 6 is a schematic, pictorial illustration of the camera transceiver recording multiple images of an on-coming train, according to an embodiment of the present invention.



FIGS. 7A-7D are schematic, pictorial illustrations of scenarios of operation of the monitoring system, when there is a risk of a potential collision between two trains on a track with a junction, according to an embodiment of the present invention.



FIG. 8 is a schematic, pictorial illustration of a scenario of operation of the monitoring system, when there is a risk of a potential collision between two trains on a track without a junction, according to an embodiment of the present invention.



FIG. 9 is a graph of train velocity, indicating the train deceleration due to braking, according to an embodiment of the present invention.



FIGS. 10A-10H are schematic, pictorial illustrations of scenarios of operation of the monitoring system, when there is a risk of a potential collision between a train and a vehicle at a railway crossing, according to an embodiment of the present invention.



FIGS. 11A and 11B are pictorial orthogonal views of a camera transceiver of the monitoring system, according to an embodiment of the present invention.





DETAILED DESCRIPTION OF THE INVENTION

In the following detailed description of various embodiments, reference is made to the accompanying drawings that form a part thereof, and in which are shown by way of illustrating specific embodiments by which the invention may be practiced. It is understood that other embodiments may be envisioned and structural changes made without departing from the scope of the present invention.



FIG. 1 is a schematic, block diagram of a monitoring system 20 for monitoring a rail track for obstacles, according to an embodiment of the present invention. The monitoring system is configured to operate on a train in motion and to monitor the track, typically in the forward direction of the train.


The monitoring system includes a processing device, such as a monitoring processor 22, which receives and processes inputs from one or more sources, including a camera transceiver 24. The camera transceiver includes elements described further herein below, such as an image sensor 26, servo motor sensors 27 and a light emitter 28.


The image sensor may be a complementary metal-oxide-semiconductor (CMOS) megapixel chip or similar sensor, typically having a high resolution, such as 40 megapixels. Lenses of the camera described further herein below provide the sensor with a view towards the track in front of the train. The field of view of the camera and other aspects of orientation, zoom, and focus may be controlled by the processor, which sends control signals that adjust the orientation of the camera and/or the lens subsystem.


The light emitter may be an infrared, conduction-cooled bar package (CCP) laser or a similar emitter, generally configured to emit narrow bandwidth light. Light emission from the emitter may be pulsed, or may operate in continuous wave (CW) or quasi-continuous wave (Q-CW) mode. The processor controls the light emitter and receives images from the camera sensor. Images may include images with reflections of emitted light and images without such reflections, enabling the processor to process the two types of images and to filter out all light other than the reflections.
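
As a rough illustration of this two-image filtering step, the following sketch (an assumed implementation, not the patent's own algorithm) subtracts a frame captured with the emitter off from a frame captured with the emitter on, so that only the narrow-band reflections remain above a threshold; the array shapes, threshold value, and function name are illustrative.

import numpy as np

def isolate_reflections(frame_emitter_on: np.ndarray,
                        frame_emitter_off: np.ndarray,
                        threshold: int = 30) -> np.ndarray:
    """Return a binary mask of pixels that brighten only when the emitter fires.

    Both frames are 2-D uint8 arrays (grayscale, already passed through the
    band-pass/interference filter described in the text). Ambient light appears
    in both frames and cancels in the difference; retro-reflector returns
    appear only in the emitter-on frame.
    """
    diff = frame_emitter_on.astype(np.int16) - frame_emitter_off.astype(np.int16)
    return (diff > threshold).astype(np.uint8)

# Example with synthetic frames: a single bright "reflector" pixel at (10, 20).
off = np.full((48, 64), 40, dtype=np.uint8)   # ambient background
on = off.copy()
on[10, 20] = 220                              # reflection visible only with the emitter on
mask = isolate_reflections(on, off)
print(mask.sum(), np.argwhere(mask))          # -> 1 reflector pixel, at [[10 20]]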


The monitoring system may include reflectors 30, which are typically installed on the rear and/or front of some or all trains travelling on the same railway system. The system may also include track “cat eye” reflectors 32, which may be installed on the tracks, as described further herein below. Light from the light emitters is reflected from the two types of reflectors and sensed by the image sensor.


Light emitters or reflectors may include shutters 34, which may be controlled by the processor to modulate the frequency of pulses of reflected light. In some embodiments, trains of a given railway system may have multiple reflectors, with or without shutters, installed on both the front and back sides of the trains.


Processor inputs may include a driver input 36, which may be a human driver or an autonomous computer-based mechanism for controlling the operation of the train. Input may also be received from one or more external data sources 38, such as long-range and/or local-area wireless data networks, such as cellular, Wi-Fi, or satellite networks, as well as from in-train sensors 40, which provide speed and location data with respect to the train, including GPS data from GPS receivers.


Received data may also include map data stored in a memory 42 of the processor, which is also configured to store the processor software. Based on the map data, the processor may monitor the type of track on which a train is being driven. The map data also includes rail configuration information related to locations of railway junctions and crossings, which enable the processor to perform calculations described herein below when approaching such junctions and crossings.


In additional embodiments, means of driver input, such as a touch screen input, may be provided to indicate the driver's intentions. Based on the speeds and positions of the train and surrounding trains and obstacles, the processor determines whether the current speed is safe. If not, the processor may provide a control signal to control the train, such as a safety/warning notification or alert sent to a display and/or audio output 44.


The display may be a stand-alone display, or an existing train multimedia/GPS navigation display, or a smartphone/PDA display viewable by the train driver or conductor, and/or a railway system supervisor. The control signal may indicate a safety level such as “safe” or “unsafe”; the level may also be indicated numerically or by color and/or volume. The control signal may be configured to indicate a safe speed, which may include an indication of a need to decelerate or accelerate in order to avoid a potential collision. When a train is approaching a junction, the control signal may be an instruction to turn at the junction. In addition, the control signal may control an automatic control unit 46, such as a brake system or an acceleration system of the train, or a turning, switching or other control unit of the railway system.
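
The mapping from the calculated safe speed to one of the control signals listed above might look as follows; this is a minimal sketch, and the function name, thresholds, and signal labels are assumptions rather than elements disclosed in the text.

from dataclasses import dataclass

@dataclass
class ControlSignal:
    safety_level: str        # "safe" or "unsafe", as displayed to the driver or supervisor
    action: str              # "none", "alert", "brake", "accelerate", or "stop"
    target_speed_kmh: float

def derive_control_signal(current_speed_kmh: float,
                          safe_speed_kmh: float,
                          alert_margin_kmh: float = 5.0) -> ControlSignal:
    """Map the calculated safe speed to a notification/actuation decision."""
    if safe_speed_kmh <= 0.0:
        return ControlSignal("unsafe", "stop", 0.0)
    if current_speed_kmh > safe_speed_kmh + alert_margin_kmh:
        return ControlSignal("unsafe", "brake", safe_speed_kmh)
    if current_speed_kmh > safe_speed_kmh:
        return ControlSignal("unsafe", "alert", safe_speed_kmh)
    if current_speed_kmh < safe_speed_kmh - alert_margin_kmh:
        return ControlSignal("safe", "accelerate", safe_speed_kmh)  # e.g., to clear a junction sooner
    return ControlSignal("safe", "none", current_speed_kmh)

print(derive_control_signal(current_speed_kmh=160.0, safe_speed_kmh=120.0))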



FIGS. 2A and 2B are schematic, pictorial illustrations of a portion 50 of a railway system including a track 52 having a low curvature, on which a first train 56 and a second train 58 are travelling, both equipped with the monitoring system described above, according to an embodiment of the present invention. As indicated in FIG. 2A, the trains at a given moment may be travelling towards each other on the same track 52. Typically such an incident might be due to a timing or switching error. On the portion 50 of the system illustrated in FIGS. 2A and 2B, a turnout track 54 provides a route for the second train 58 to travel in order to avoid a collision with the first train 56. The turnout track joins the track 52 at a switch or junction 59.


Installed towards the front of both trains are camera transceivers 24 and reflectors 30 of the monitoring system 20, as described above. Light emitted from the camera transceivers is reflected from reflectors of the opposing trains and in turn sensed by the image sensors of the camera transceivers. The processors receive the images, identify the reflections, and determine the speeds and locations of the opposing trains. Based on the speeds and locations of both trains, the processors each determine whether or not the train 58 can turn onto the track 54 and avoid a collision without either train changing speed, or whether one or both of the trains needs to change its speed or stop.



FIG. 2B illustrates a scenario whereby the train 58 proceeds on the track 54, while the train 56 stops. In the scenario shown, the train 56 must stop, as it would otherwise reach junction 59 before the full length of the train 58 could pass, thereby causing a collision. In alternative scenarios, the train 56 might have to stop to avoid being hit, or the train 58 might increase its speed while the train 56 decreases its speed, such that neither train would have to stop.



FIG. 2C is a schematic, pictorial illustration of a portion 60 of the railway system, including a track 62 having a high curvature, on which trains 58 and 66 are travelling, both equipped with the monitoring system 20 described above, according to an embodiment of the present invention. On the train 58, the processor of the monitoring system 20 operates the camera transceiver of the train to scan a field of view 64, which is a relatively narrow angle within a wider scan angle 69. The servo motor sensors 27, described above, are configured to enable the field of view 64 of the camera transceiver to “pan” across the scan angle 69. The processor processes acquired images of the scan to identify in the images the reflectors 30 of the train 66. As described above, reflectors are typically installed on the rear or front of all trains. The processor may also adjust lenses of the camera transceiver so that the track and possible obstacles on the track remain in the field of view of the camera.



FIG. 3 is a schematic, pictorial illustration of a scenario in which the train 56 is travelling on train system portion 70 towards a railway crossing 74, elements of monitoring system 20 being installed on the train, according to an embodiment of the present invention. The illustrated portion 70 of the railway system includes a crossing tower, or a warning tower 72, positioned before the road crossing. The warning tower may also include elements of monitoring system 20, as described further herein below. Reflectors 76 may also be installed at the junction to indicate the presence of the junction. In embodiments of the present invention, the warning tower 72 detects a vehicle 78 travelling towards the junction and determines the vehicle's speed. The warning tower also determines the location and speed of the train 56. If the vehicle is not slowing to stop before reaching the junction (or alternatively, is not travelling fast enough to pass the junction before the train), the tower may send to the train 56 a signal, by means of the light emitter of the warning tower's camera transceiver, or by a wireless communication network, to indicate the potential collision. The signal is recorded by the camera transceiver of the train 56 and an image sent to the monitoring processor 22 of the train 56. The monitoring processor 22 is configured to analyze the signal to determine whether the train should slow down or stop.



FIG. 4 is a schematic, pictorial illustration of a railway system portion 90, on which the train 56, equipped with elements of the monitoring system 20, is travelling, according to an embodiment of the present invention. The track 52 may be laid out with “cat eye” track reflectors 42, typically ground-mounted retro-reflectors. The monitoring processor 22 of the monitoring system 20 is configured to recognize when a view of a track reflector is obstructed, meaning that there is an obstruction on the track. FIG. 4 shows an example whereby a vehicle 92 has stopped on the track. The camera transceiver 24 of the monitoring system 20 does not have a view of at least two track reflectors, the view being blocked by the vehicle 92. The blocked view is determined by the monitoring processor 22 from images recorded by the camera transceiver 24.



FIG. 5 is a schematic, pictorial illustration of camera transceiver 24, installed, by way of example, in the train 56 described above, and configured to record images of an on-coming train, such as train 58, according to an embodiment of the present invention. Reflectors 30 on train 58 appear as pixels in an image recorded on an image sensor 122 of the camera transceiver. The processor is generally configured to detect such pixels, typically by filtering noise from the image as described further herein below. A distance between the reflectors, indicated as distance D, is proportional to the distance between the pixels of the reflectors appearing in the images (indicated in the figure as hi), divided by the focal length, f, of a lens 124 of the camera transceiver 24. The subscript i of the parameter h indicates the time of the measurement. It is to be understood that the elements indicated in the figure are for illustration purposes only. For example, lens 124 is illustrative of the actual lens subsystem employed in the camera transceiver.



FIG. 6 is a schematic, pictorial illustration of in-train camera transceiver 24 in a first train 56 recording multiple images of on-coming train 58, according to an embodiment of the present invention. Multiple images are required in order to determine a velocity v of the on-coming train, which may be determined as a function of the distance, D, between reflectors on the on-coming train.


As shown in the figure, images are recorded at three distinct times, at three respective distances, z1, z2, and z3.


The relations between these distances, the speeds of the passing and on-coming trains, and the lens parameters described above are as follows:







zi = D*f*cos θi*cos αi/(h(i+1) − hi) + (D/2)*sin θi*cos αi*(h(i+1) + hi)/(h(i+1) − hi)

z(i+1)*cos α(i+1) = zi*cos αi − v2*Δt*cos θi − ((v1.(i+1) + v1.i)/2)*Δt*cos β







Where:





    • D: distance between the headlights (or reflectors) of the on-coming train;

    • hi: distance of the headlight pixels in the image recorded on the camera sensor, relative to the center of the image;

    • v2: velocity of the on-coming train;

    • v1.1, v1.2, v1.3: velocities of the passing train at three distinct times (separated by a constant time difference), calculated initially during the period in which the train travels from A1 to A2, and subsequently on an on-going basis until the end of the maneuver. Note: the velocity of the train may change while proceeding with the maneuver. Until the trains first identify each other: v1.1 = v1.2 = v1.3 = v1 (a sketch of solving these relations for v2 follows this list).
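
The second relation above can be rearranged to yield the on-coming train speed directly: v2 = (zi*cos αi − z(i+1)*cos α(i+1) − ((v1.(i+1) + v1.i)/2)*Δt*cos β)/(Δt*cos θi). A minimal sketch of that rearrangement follows; it is plain algebra on the relation as reconstructed here, and the sample ranges, angles, and function name are illustrative assumptions.

import math

def oncoming_speed(z_i: float, z_ip1: float,
                   v1_i: float, v1_ip1: float,
                   dt: float,
                   alpha_i: float = 0.0, alpha_ip1: float = 0.0,
                   theta_i: float = 0.0, beta: float = 0.0) -> float:
    """Solve z(i+1)*cos(a(i+1)) = z(i)*cos(a(i)) - v2*dt*cos(th(i))
    - ((v1(i+1) + v1(i))/2)*dt*cos(beta) for v2, the on-coming train speed.

    Distances in metres, speeds in m/s, dt in seconds, angles in radians.
    """
    own_advance = 0.5 * (v1_i + v1_ip1) * dt * math.cos(beta)
    closing = z_i * math.cos(alpha_i) - z_ip1 * math.cos(alpha_ip1) - own_advance
    return closing / (dt * math.cos(theta_i))

# Straight-track example: the gap shrinks from 4000 m to 3917 m in 1 s while the
# observing train moves at 55.83 m/s, so the on-coming train moves at about 27 m/s.
print(round(oncoming_speed(4000.0, 3917.0, 55.83, 55.83, 1.0), 2))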





As indicated in FIG. 5, which is an overhead view of the field of view of the camera transceiver 24, the track may not be straight, but rather curved, in which case the line of sight between the camera and the on-coming train may be at an angle αi from a straight projection in the direction of motion of the first train. The angle between that straight projection and the line of the headlights is an angle θi, and the angle between the line of sight and the passing vehicle may be an angle β.


The above equation then needs to be adjusted, such that:







zi = D*f*cos θi*cos αi/(h(i+1) − hi) + (D/2)*sin θi*cos αi*(h(i+1) + hi)/(h(i+1) − hi)

z(i+1)*cos α(i+1) = zi*cos αi − v2*Δt*cos θi − ((v1.(i+1) + v1.i)/2)*Δt*cos β







FIGS. 7A-7D are schematic, pictorial illustrations of the railway system 50, described above with respect to FIGS. 2A and 2B, during operation of the trains 56 and 58 on the track 52, according to an embodiment of the present invention.


As indicated in FIG. 7A, the trains 56 and 58 are travelling towards each other on track 52. Light emitted by the camera transceivers 24 on the respective trains is reflected by the respective reflectors 30 on the trains. Lens systems of the respective camera transceivers, described above with respect to FIG. 5, provide each camera transceiver with a field of view of an area in front of each train. As indicated in FIG. 7A, the train 56 has a field of view 136, and the train 58 has a field of view 138. At a point in time referred to herein below as T0, the train 56 is at a location A0 and the train 58 is at a location B0.


The illustration of FIG. 7B indicates the distance LT between the trains at time T0, when the trains first identify each other, and illustrates a scenario described with respect to FIG. 2B above, where S1 and S2 are the respective braking distances of the two trains, whereby both trains must stop to avoid a collision.






LT01 = v1*tS

LT02 = v2*tS

tS = 1 sec


Assuming the following inequalities hold:






S1 + S2 + LT01 + LT02 < LT

Lj1 − S1 >= LT01


then the train 56 must stop. If, on the other hand, the following inequality holds:






Lj1 − S1 < LT01


then both trains must stop.
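
The two inequalities above translate directly into a decision rule. The sketch below mirrors the text's cases; the fallback branch for an identification made with insufficient margin, the default tS of 1 second, and the sample distances are assumptions.

def stop_decision(v1: float, v2: float, s1: float, s2: float,
                  l_t: float, l_j1: float, t_s: float = 1.0) -> str:
    """Apply the FIG. 7B inequalities.

    v1, v2: train speeds (m/s); s1, s2: braking distances (m);
    l_t: distance between the trains at identification (m);
    l_j1: distance from the train 56 to the junction (m); t_s: margin time (s).
    """
    l_t01 = v1 * t_s  # margin distance of the train 56
    l_t02 = v2 * t_s  # margin distance of the train 58
    if l_j1 - s1 < l_t01:
        return "both trains must stop"
    if s1 + s2 + l_t01 + l_t02 < l_t:
        return "train 56 must stop; train 58 may proceed to the turnout"
    # Case not spelled out in the text: identification occurred with less than
    # the combined stopping margin available (assumed fallback).
    return "insufficient margin at identification: stop both trains"

print(stop_decision(v1=44.4, v2=44.4, s1=1536.0, s2=1536.0, l_t=4405.0, l_j1=2000.0))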


The illustration of FIG. 7C indicates the parameters that determine the minimum distance LT between the trains at which they must first identify each other, that is, the distance at which the monitoring systems on the trains must be capable of recognizing on-coming trains. These parameters are: the maximum permissible speed of the trains on the railway system (v1, v2 for respective trains 56 and 58); the braking distance given the maximum permissible speed (LT01, LT02 for respective trains 56 and 58); and the maximum length, l, of the train 58. Then:






LT01 = v1*tS

LT02 = v2*tS

LTB2 = v2*tS

    • TI — identification time

LTI = LTI1 + LTI2

    • LTI — the reduction in the distance between the trains until they first identify each other (z3 − z1).

LTI1 = v1*TI

LTI2 = v2*TI

(LT02 + LTB2 + LTI2 + l)/v2 = (LT01 + LT1 + LTI1)/v1






LT1 + LT01 + LTI1 = (LT02 + LTB2 + LTI2 + l)*v1/v2

LT = (LT01 + LTI1 + LT1) + LT02 + LTI2

LT = (LT02 + LTB2 + LTI2 + l)*v1/v2 + LT02 + LTI2


Assuming, for example, a maximum speed v1 = 201 km/h (e.g., class 7 on the U.S. Northeast Corridor), a minimum speed v2 = 97 km/h, a maximum train length l of 1672 m (e.g., class 4 freight trains), a measurement precision of 7% obtained with a 40-megapixel camera, an identification time TI of 9.7 sec, and tS = 1 sec, then:






LT = (v2*tS + v2*tS + v2*TI + l)*v1/v2 + v2*tS + v2*TI

LT = v1*tS + v1*tS + v1*TI + l*v1/v2 + v2*tS + v2*TI

LT = (2*v1 + v2)*tS + (v1 + v2)*TI + l*v1/v2

LT = (2*55.83 + 26.94)*1 + (55.83 + 26.94)*9.7 + 1672*55.83/26.94

LT = 4406 m

    • LT is the minimum distance between the trains at which identification must occur. For the parameters in the example, the value must be 4715 meters (including the speed measurement inaccuracy).
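
A quick numeric check of the example above, with the speeds converted to m/s, reproduces the 4406 m figure stated in the text:

v1 = 201 / 3.6   # 55.83 m/s, fastest permissible train
v2 = 97 / 3.6    # 26.94 m/s, slowest on-coming train
l = 1672.0       # maximum train length, m
t_s = 1.0        # safety margin, s
t_i = 9.7        # identification time, s

l_t = (2 * v1 + v2) * t_s + (v1 + v2) * t_i + l * v1 / v2
print(round(l_t))  # -> 4406 (m), before adding the speed-measurement inaccuracy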


In further embodiments, the length of the train 58 may be considered. The train 56 does not have to stop if the train 58 can clear the junction by its entire length before the train 56 reaches the junction. FIG. 7D illustrates a scenario described with respect to FIG. 2B above, whereby the train 56 either stops or continues depending on the length of the train 58. The length of the train 58 can be communicated to the train 56 by light signals related to the reflectors 30 on the train 58. The length can be communicated by installing a reflector for each unit length of train, or by providing a shutter with each reflector, set to permit reflection at regular intervals, such that the pulse frequency is indicative of the length.
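
One possible reading of the shutter-based length encoding is sketched below: if the sending train pulses its reflectors at a frequency proportional to its length, the observing train can recover the length by timing the pulses. The proportionality constant, the timing approach, and the function name are illustrative assumptions, not values given in the text.

def train_length_from_pulses(pulse_times_s: list, metres_per_hertz: float = 100.0) -> float:
    """Estimate train length from the timestamps of observed reflection pulses.

    Assumes, for illustration only, that the sending train sets its shutter so
    that the pulse frequency in Hz equals length / metres_per_hertz.
    """
    if len(pulse_times_s) < 2:
        raise ValueError("need at least two pulses to measure a frequency")
    intervals = [b - a for a, b in zip(pulse_times_s, pulse_times_s[1:])]
    mean_period = sum(intervals) / len(intervals)
    return (1.0 / mean_period) * metres_per_hertz

# Pulses observed every 0.125 s -> 8 Hz -> an 800 m long train under the assumed scale.
print(train_length_from_pulses([0.0, 0.125, 0.25, 0.375]))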


A margin of stop time, tS, may be calculated as the amount of time by which the second train, the train 58, clears the junction before the first train, the train 56, reaches the junction. If this time is calculated to be greater than a preset value, such as 1 second, then both trains are allowed to proceed. If tS is less than the preset value, then one or both of the trains must stop. The calculation of tS is as follows:






LT01 = v1*tS

LTB2 = v2*tS

v2*T = LT − Lj1 + LTB2 + l

v2*T = LT − Lj1 + v2*tS + l

T = (Lj1 − v1*tS)/v1

v2*(Lj1 − v1*tS)/v1 = LT − Lj1 + v2*tS + l

v2*Lj1/v1 − v2*tS = LT − Lj1 + v2*tS + l

2*v2*tS = v2*Lj1/v1 + Lj1 − LT − l

tS = Lj1*(1/(2*v1) + 1/(2*v2)) − (LT + l)/(2*v2)
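
Applying the final expression for tS directly gives the go/no-go decision for the junction; the sample distances below are illustrative only and reuse the speeds from the earlier example.

def junction_clear_margin(l_t: float, l_j1: float, train2_length: float,
                          v1: float, v2: float) -> float:
    """Margin time tS (s) by which the train 58 clears the junction, per FIG. 7D.

    l_t: separation between the trains (m); l_j1: distance from the train 56 to
    the junction (m); train2_length: length of the train 58 (m); v1, v2: speeds (m/s).
    """
    return l_j1 * (1.0 / (2.0 * v1) + 1.0 / (2.0 * v2)) - (l_t + train2_length) / (2.0 * v2)

t_s = junction_clear_margin(l_t=4406.0, l_j1=4200.0, train2_length=1672.0, v1=55.83, v2=26.94)
print(round(t_s, 1), "both may proceed" if t_s >= 1.0 else "one or both trains must stop")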



FIG. 8 illustrates a scenario in which no junction separates the two trains. If the trains are travelling towards each other, both trains need to stop, but the abruptness of the deceleration can be controlled. FIG. 9 is a graph of train velocity, indicating the train deceleration due to braking. Assuming that various levels of braking deceleration can be applied, the appropriate level is determined by the following equations:






S1 + S2 + (v1 + v2)*tS = LT

S1 = x1 + v1*(TD + TB)

S2 = x2 + v2*(TD + TB)

x1 + x2 + (v1 + v2)*(tS + TD + TB) = LT

    • where,
    • S1 and S2 are the respective overall stopping distances of the two trains,
    • x1, x2 are the respective braking distances of the two trains, and
    • xD and xB are the respective distances travelled during the driver supervision time and the brake build-up time;
    • Tx — braking time;
    • TD — driver supervision time, assumed 9 sec;
    • TB — emergency brake build-up time, assumed 5 sec;
    • TI — identification time, taken equal to the driver supervision time, 9 sec;
    • tS = 1 sec.


Then:

    • x1 = x2 = 1536 m — braking distances of the two trains (not including any speed measurement inaccuracy).


v1 = v2 = 160 km/h

    • LT = 4405 m
    • LT is the minimum distance between the trains at which identification must occur. For the parameters in the example, the value must be 4715 meters (including the speed measurement inaccuracy).
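
The last relation above can be checked numerically for the stated example: with both trains at 160 km/h and LT = 4405 m, each train's braking distance comes out at about 1536 m.

v1 = v2 = 160 / 3.6            # m/s
t_s, t_d, t_b = 1.0, 9.0, 5.0  # safety margin, driver supervision, brake build-up (s)
l_t = 4405.0                   # separation at identification (m)

# x1 + x2 + (v1 + v2) * (t_s + t_d + t_b) = l_t, with x1 == x2:
x = (l_t - (v1 + v2) * (t_s + t_d + t_b)) / 2.0
print(round(x))  # -> 1536 (m) braking distance per train, excluding measurement inaccuracy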


Alternatively, the trains are travelling in the same direction, but the front train is travelling slower than the back train. The monitoring system of the back train detects the front train and determines the safe speed of travel of the back train.



FIGS. 10A-10H are schematic, pictorial illustrations of a railway system 90, on which the train 56 is approaching the railway crossing 74 at which non-rail vehicles may pass, as described above with respect to FIG. 3. The warning tower 72 situated near the crossing 74 includes at least one monitoring system 20, which has several fields of view, 142, 144, and 146. The field of view 142 provides a view of a vehicle 140, which is approaching the crossing along a road 150, which may be configured with reflectors 42.


Field of view 144 provides a view of the crossing itself, which may also be configured with reflectors 42. Field of view 146 provides a view of the train 56, which, like the vehicle, is approaching the crossing.


The camera transceiver 24 of the train 56 also has a field of view 148, which provides a view of the crossing and the warning tower. As the vehicle 140 and the train 56 approach the crossing, the warning tower detects images of the vehicle 140 and the train 56 at a time T0 (that is, although the vehicle and the train may not be detected at the same instant, they are both detected at time T0). The train may be detected by light from the warning tower reflected by the reflectors 30 on the train. The vehicle may be detected by light from the warning tower reflected from the vehicle itself, such as from the vehicle headlights, or a vehicle registration plate (license plate), or from reflectors, similar to the train reflectors 30, which may be installed on the vehicle. Alternatively or additionally, the vehicle may be detected by a lack of reflection from the road reflectors 42 located on the road.


In response to detecting the vehicle and the train at time T0, the processor of the warning tower monitoring system determines the respective speeds and locations of the vehicle and train, by the process described above with respect to FIGS. 5 and 6. The monitoring processor 22 may then signal the vehicle 140 and/or the train 56 of an impending threat of collision. A notification to the train from the warning tower may be provided by emitting from the light emitter of the warning tower camera transceiver a pulsed light that indicates the warning. Alternatively, light emitted from the train and reflected by the warning tower may be pulsed by controlling a reflector shutter, as described further herein below. In particular, the train must be warned as to whether to stop, to slow down, or to speed up.


Similar means of notification may be provided to the vehicle, if the monitoring system is also installed on the vehicle. Alternatively, a standard railroad crossing light may be controlled by the warning tower to signal the impending arrival of the train.


In the scenario illustrated in FIG. 10A, the train can clear the junction before there is a collision with the vehicle 140, such that the train, after clearing the crossing, reaches a point AT. Parameters related to the train movement are indicated in FIG. 10B. A calculation as to whether or not the train can proceed without hitting the vehicle is as follows:






LT0 = vT*tT

LT = vT*T

LT1 + m + LT0 + l1 = LT

LT1 + m + vT*tT + l1 = vT*T

T = (LT1 + m + vT*tT + l1)/vT

LC − n − LC0 = LCT

LC − n − vČ*tT = vČ*(LT1 + m + vT*tT + l1)/vT

vČ = (vC0 + vCT)/2

LC − n − vČ*tT = vČ*(LT1 + m + l1)/vT + vČ*tT

2*vČ*tT = LC − n − vČ*(LT1 + m + l1)/vT

tT = (LC − n)/(2*vČ) − (LT1 + m + l1)/(2*vT) >= 1 sec


The parameter tT is the amount of time by which the train passes the intersection before the vehicle reaches the intersection. When this is greater than 1 second, the train can safely continue without slowing down.
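
The tT expression above can be evaluated directly by the warning tower or the train-side processor; in the sketch below the distances, speeds, and function name are illustrative assumptions.

def crossing_clear_margin(l_c: float, n: float, v_c_mean: float,
                          l_t1: float, m: float, l1: float, v_t: float) -> float:
    """tT: time by which the train clears the crossing before the vehicle arrives.

    l_c: vehicle distance to the crossing (m); n: vehicle-side margin distance (m);
    v_c_mean: mean vehicle speed (m/s); l_t1: train distance to the crossing (m);
    m: crossing-side margin distance (m); l1: train length (m); v_t: train speed (m/s).
    """
    return (l_c - n) / (2.0 * v_c_mean) - (l_t1 + m + l1) / (2.0 * v_t)

t_t = crossing_clear_margin(l_c=500.0, n=10.0, v_c_mean=20.0,
                            l_t1=600.0, m=20.0, l1=200.0, v_t=45.0)
print(round(t_t, 1), "train may continue" if t_t >= 1.0 else "train must slow or stop")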


In a further scenario, the train cannot clear the junction before the vehicle 140 without causing a collision, and may therefore have to stop at a point AS before the junction. Parameters for this scenario are indicated in FIG. 10C. A calculation as to whether or not the train can proceed without hitting the vehicle is as follows:






LT0 = vT*tS

LT = vT*T

LT1 + m − LT0 = LT

LT1 + m − vT*tS = vT*T

T = (LT1 + m − vT*tS)/vT

LCT − lC − LC0 + n = LC

vČ*(LT1 + m − vT*tS)/vT − lC − vČ*tS + n = LC

vČ = (vC0 + vCT)/2

vČ*(LT1 + m)/vT − lC − 2*vČ*tS + n = LC

vČ*(LT1 + m)/vT − lC − LC + n = 2*vČ*tS

tS = (LT1 + m)/(2*vT) − (lC + LC − n)/(2*vČ)


The equations to determine the length of vehicle 140, as illustrated in FIG. 10D, are as follows:






lC + k*LC/m = vC0*T2

lC + LC1 + k*(LC − LC1)/m = vC0*T3

lC + LC1 + (k/m)*LC − (k/m)*LC1 = vC0*T3

vC0 = LC1/T1

LC1 − (k/m)*LC1 = LC1*T3/T1 − LC1*T2/T1

(k/m)*LC1 = LC1 − LC1*T3/T1 + LC1*T2/T1

k/m = 1 − T3/T1 + T2/T1 = 1 − (T3 − T2)/T1

lC = LC1*T2/T1 − LC*[1 − (T3 − T2)/T1]
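
The final expression can be evaluated once the timing measurements T1, T2, and T3 and the distances LC1 and LC are available; the sample values below are purely illustrative.

def vehicle_length(l_c1: float, l_c: float, t1: float, t2: float, t3: float) -> float:
    """Vehicle length per the FIG. 10D result: lC = LC1*T2/T1 - LC*(1 - (T3 - T2)/T1)."""
    return l_c1 * t2 / t1 - l_c * (1.0 - (t3 - t2) / t1)

print(round(vehicle_length(l_c1=30.0, l_c=60.0, t1=2.0, t2=1.0, t3=2.8), 2))  # -> 9.0 m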



FIG. 10F is a schematic, pictorial illustration of camera transceiver 24, installed, by way of example, on the warning tower 72, and configured to record images of an approaching vehicle, such as the vehicle 140, according to an embodiment of the present invention. A registration plate 141 (license plate) or the headlights of the vehicle appear as pixels in an image recorded on an image sensor 122 of the camera transceiver. The processor is generally configured to detect such pixels, typically by filtering noise from the image as described further herein below. The length of the registration plate, or the distance between the headlights of the vehicle, indicated as distance D, is proportional to the distance between the corresponding pixels appearing in the images (indicated in the figure as hi), divided by the focal length, f, of a lens 124 of the camera transceiver. The subscript i of the parameter h indicates the time of the measurement. It is to be understood that the elements indicated in the figure are for illustration purposes only. For example, lens 124 is illustrative of the actual lens subsystem employed in the camera transceiver.


The equations to determine the velocity of vehicle 140, in the scenario illustrated in FIG. 10G, are as follows:







zi = D*f*cos θi*cos αi/(h(i+1) − hi) + (D/2)*sin θi*cos αi*(h(i+1) + hi)/(h(i+1) − hi)

z(i+1)*cos α(i+1) = zi*cos αi − vC*Δt*cos θi







Where:





    • D: registration plate length or distance between the headlights of the vehicle;

    • hi: distance of the headlight pixels in the image recorded on the camera sensor, relative to the center of the image;

    • vC: velocity of the vehicle 140.





The calculation of parameter tS for the vehicle 140 is as follows and is illustrated in FIG. 10E:






V0 = (LC1 − J0 + J1)/T1

J0 = b*LC/h

J1 = b*(LC − LC1)/h

V0 = LC1*(1 − b/h)/T1

VC0 = LC1/T1

V0 = VC0*(1 − b/h)

VC0 = V0/(1 − b/h)

tS >= 1 sec

tS >= 1/(1 − b/h)

    • If b = 0.5 m (maximum) and h = 3 m (minimum), then tS >= 1.2 sec.
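
A numeric check of the bound above, using the text's limiting values of b and h and an illustrative measured speed V0:

b = 0.5    # m, maximum value used in the text's example
h = 3.0    # m, minimum value used in the text's example
v0 = 15.0  # m/s, apparent speed from the image geometry (illustrative assumption)

v_c0 = v0 / (1.0 - b / h)      # corrected initial vehicle speed VC0
t_s_min = 1.0 / (1.0 - b / h)  # required margin time
print(round(v_c0, 1), round(t_s_min, 1))  # -> 18.0 m/s and 1.2 s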


In a further scenario, illustrated in FIG. 10H, the vehicle 140, or some other obstacle, is stopped at the junction. The warning tower detects the presence of the vehicle because the track reflectors 42 are obscured. Consequently, the train must stop before reaching the junction. Whether or not the train can stop using regular braking is determined by the following equations:






LT0 = vT*tS

s1 + LT0 − m = LTS

s1 + vT*tS − m = LTS



FIGS. 11A and 11B are orthogonal illustrations of a camera transceiver 300, according to an embodiment of the present invention. The camera transceiver includes a light emitter aperture 302 and an image sensor aperture 304, for transmission of light from a light emitter 306 and image capture by an image sensor 308. Panning of the camera transceiver line of sight is performed by a mirror system 316 and servo sensors 301. As described above, the image sensor is typically a high resolution CMOS sensor, providing sufficient resolution to detect the reflected light of on-coming trains at distances according to the speed of trains in the rail system, such as was calculated above with respect to FIG. 7B. The light emitter is typically a narrow bandwidth infrared laser, such as an 830 nm diode bar. The power, dispersion, and duty cycle of the emitter are configured to conform to the International Electrotechnical Commission (IEC) 60825-1 standard for safety of laser products.


An image sensor lens subsystem 310 and the mirror system 316 can be controlled by the processor to adjust the camera transceiver field of view (FOV), including orientation, zoom and focus, as described above. A light emitter lens subsystem 318 can be similarly controlled to control the field of light dispersion. In alternative embodiments, the image sensor and light emitter can be positioned to share a single aperture, with some common mirroring and lens subsystem elements. Before the trains first identify each other, the FOV of the image sensor is set to be approximately the same as the field of dispersion of the light emitter. Typically the image sensor lens subsystem also includes a band pass filter 312, as well as an interference filter, configured at the wavelength of the emitter, such as 830 nm. The image sensor lens subsystem also includes an optical zoom.


Typically a train has one or more reflectors installed, as described above. One such reflector, indicated as a retro reflector 322, may be incorporated within the camera transceiver. The reflector 322 may also return pulsed light in order to communicate information such as train length, the pulsing being implemented by a shutter 330 controlled by the processor.


It is to be understood that elements of the monitoring system may be combined in different combinations in different embodiments of the present invention. Processing elements of the system may be implemented in digital electronic circuitry, or in computer hardware, firmware, software, or in combinations thereof. Such elements can be implemented as a computer program product, tangibly embodied in an information carrier, such as a non-transient, machine-readable storage device, for execution by, or to control the operation of, data processing apparatus, such as a programmable processor, or deployed to be executed on one or multiple computers at one site or distributed across multiple sites. Memory storage may also include multiple distributed memory units, including one or more types of storage media including, but not limited to, magnetic media, optical media, and integrated circuits such as read-only memory devices (ROM) and random access memory (RAM). The system may have one or more network interface modules controlling the sending and receiving of data packets over networks.


Method steps associated with the system and process can be rearranged and/or one or more such steps can be omitted to achieve the same, or similar, results to those described herein. It is to be understood that the embodiments described hereinabove are cited by way of example, and that the present invention is not limited to what has been particularly shown and described hereinabove. Rather, the scope of the present invention includes variations and modifications thereof which would occur to persons skilled in the art upon reading the foregoing description and which are not disclosed in the prior art.

Claims
  • 1. A system installed on a first train for generating a safety notification with respect to a speed of a second train along a track, the system comprising: a camera transceiver, comprising a light emitter configured to emit a narrow bandwidth light, an image sensor, one or more light reflectors and a lens subsystem; and a processor communicatively coupled to the camera transceiver and to a memory that comprises computer-readable instructions, which cause the processor to perform the steps of: determining a train speed and a train position; receiving images from the camera transceiver; processing the images to calculate a speed and a position of a potential obstacle on the track; responsively to calculating the potential obstacle speed and position, determining a safe speed of the train; and responsively to determining the safe speed, providing a control signal to control the first train.
  • 2. The system of claim 1, wherein the control signal controls an automatic brake unit, an acceleration unit or an audio or visual driver alert unit of the train, or a turning, switching or other control unit of the railway system.
  • 3. The system of claim 1, wherein determining the safe speed comprises determining a rate of deceleration or acceleration.
  • 4. The system of claim 1, wherein the safe speed is to stop.
  • 5. The system of claim 1, wherein the train is a first train, wherein the potential obstacle is a second train travelling towards the first train on the track, wherein processing the images comprises determining the length of the second train and wherein determining the safe speed comprises determining that the train must stop before reaching a junction between the trains.
  • 6. The system of claim 1, wherein the train is a first train, wherein the potential obstacle is a second train travelling towards the first train on the track, wherein processing the images comprises determining the length of the second train and wherein determining the safe speed comprises determining that the train must take a rail switch at a junction located at a point between the trains.
  • 7. The system of claim 1, wherein the notification further comprises a safety level, and wherein the system is further configured to display the safety level on a screen viewable by the driver or by a railway system supervisor.
  • 8. The system of claim 7, wherein the display is a stand-alone display, a multimedia/GPS navigation display, or a smartphone/PDA display.
  • 9. The system of claim 8, wherein the safety level is one of a set of possible levels comprising a first level that the first train speed is safe and a second notification that the speed is unsafe.
  • 10. The system of claim 1, wherein the train is a driverless train.
  • 11. The system of claim 1, wherein the potential obstacle is a second train and wherein processing the images comprises identifying a representation of the second train in the images.
  • 12. The system of claim 1, wherein the train is a first train, wherein the potential obstacle is an on-coming second train on the track and wherein calculating the speed and position of the obstacle comprises receiving the images while the relative speed between the first train and second trains is changing.
  • 13. The system of claim 1, wherein receiving the images further comprises determining a contour of the track from the images and responsively orienting the camera transceiver to a view of the track.
  • 14. The system of claim 1, wherein the laser emitter and the image sensor respectively transmit and receive laser pulses through one or more common lenses of the lens subsystem.
  • 15. The system of claim 1, wherein the train is a first train, wherein the potential obstacle is a second train travelling towards the first train on the track, wherein processing the images comprises determining the length of the second train by matching a pattern of one or more of a light reflector configuration, a shutter rate, and a color to a pre-defined pattern defining a length.
  • 16. The system of claim 1, wherein the camera transceiver is a front camera transceiver and wherein the system further comprises back camera transceivers on the second train.
  • 17. The system of claim 16, wherein processing the images further comprises identifying reflections from the back camera transceivers on a second train and wherein determining the safe speed comprises determining a speed to prevent the first train from hitting the back of the second train.
  • 18. The system of claim 1, further comprising reflectors installed along the track, wherein processing the images comprises identifying an obstruction by identifying a lack of reflected light along a portion of the track, and wherein determining the safe speed comprises determining to stop the train due to the lack of reflected light.
  • 19. The system of claim 1, further comprising a warning tower situated near a railway crossing having reflectors installed on the crossing, wherein processing the images comprises identifying from a field of view of the warning tower an obstruction of the reflectors, and wherein determining the safe speed comprises determining to stop the train due to the obstruction.
  • 20. The system of claim 1, further comprising a warning tower situated near a railway crossing, wherein a vehicle approaching the railway crossing is equipped with a second light emitter and the image sensor transmits and receives laser pulses.
  • 21. The system of claim 1, further comprising a warning tower situated near a railway crossing having reflectors installed on the road, wherein processing the images comprises: identifying from a field of view of the warning tower an obstruction of the reflectors and responsively determining to stop the vehicle.
  • 22. The system of claim 1, further comprising a warning tower situated near a railway crossing, wherein processing the images comprises identifying from a field of view of the warning tower an obstruction of a registration plate of the vehicle, and wherein determining the vehicle safe speed comprises determining to stop the vehicle.
  • 23. A method for generating a safety notification for a train on a railway track with respect to a vehicle moving on a road crossing the track, the method comprising: configuring a camera transceiver comprising a light emitter configured to emit a narrow bandwidth light, an image sensor, light reflectors and a lens subsystem; and providing a processor configured to be communicatively coupled to the camera transceiver and to a memory that comprises computer-readable instructions, which cause the processor to perform the steps of: determining a vehicle speed and a vehicle position; receiving images from the camera transceiver; processing the images to calculate a length of a vehicle travelling on the road; responsively to calculating the length of the vehicle, determining a safe speed of the train; and responsively to determining the safe speed, providing a control signal to control the first train.
  • 24. A method for generating a safety notification with respect to a speed of a first train moving on a track, the method comprising: configuring a camera transceiver comprising a light emitter configured to emit a narrow bandwidth light, an image sensor, one or more light reflectors and a lens subsystem; and providing a processor configured to be communicatively coupled to the camera transceiver and to a memory that comprises computer-readable instructions, which cause the processor to perform the steps of: determining a first train speed and a first train position; receiving images from the camera transceiver; processing the images to calculate a speed, a position, and a length of a second train travelling on the track; responsively to calculating the second train speed, position, and length of the second train, determining a safe speed of the first train; and providing a control signal to control the first train.