The present invention relates to a station monitoring apparatus, a station monitoring method, and a program.
When a passenger gets onto a vehicle stopped at a station, the passenger or belongings of the passenger may be caught in a door of the vehicle. When the vehicle begins to move in this state, there is a possibility that the passenger gets injured. Accordingly, Patent Document 1 describes that human body dynamic state detection sensors are arranged on a platform at a predetermined pitch and that, in order to sense light of a light emitter mounted on a train, a plurality of light receivers are also arranged on the platform at a predetermined pitch. The system described in Patent Document 1 then computes a movement velocity of a person by use of a detection result of the human body dynamic state detection sensors, also computes a movement velocity of the train by use of a detection result of the light receivers, and urgently stops the train when the two movement velocities match.
[Patent Document]
[Patent Document 1] Japanese Patent Application Publication No. H10-297491
[DISCLOSURE OF THE INVENTION]
[Technical Problem]
On a platform of a station, a person may move in the same direction as a vehicle. There is a possibility that the technique described in Patent Document 1 mistakenly determines that such a person is caught in a door of the vehicle.
An object of the present invention is to accurately sense that a person or belongings of the person are caught in a door of a vehicle.
The present invention provides a station monitoring apparatus including:
an image processing unit that determines a position of a door of a vehicle and a position of a person by analyzing an image capturing a platform of a station; and
a decision unit that decides, after the vehicle has begun to move, by use of the position of the door and the position of the person, whether to perform predetermined processing.
The present invention provides a station monitoring method including:
by a computer,
determining a position of a door of a vehicle and a position of a person by analyzing an image capturing a platform of a station; and
deciding, after the vehicle has begun to move, by use of the position of the door and the position of the person, whether to perform predetermined processing.
The present invention provides a program causing a computer to execute:
processing of determining a position of a door of a vehicle and a position of a person by analyzing an image capturing a platform of a station; and
processing of deciding, after the vehicle has begun to move, by use of the position of the door and the position of the person, whether to perform predetermined processing.
The present invention can accurately sense that a person or belongings of the person are caught in a door of a vehicle.
The above-described object, other objects, features, and advantageous effects will become more apparent from a preferred example embodiment described below and the following accompanying drawings.
Hereinafter, an example embodiment of the present invention is described by use of the drawings. Note that, a similar reference sign is assigned to a similar component in all the drawings, and description thereof will not be repeated accordingly.
Then, the station monitoring apparatus 10 outputs a sensing result to an output apparatus 30. The output apparatus 30 is, for example, a display apparatus intended for a person concerned with the operation of the train. The display apparatus may be placed on the same track side of the same platform as the capture apparatus 20, or may be placed in a monitoring center. Moreover, the output apparatus 30 may be a portable terminal carried by a station employee. Moreover, the output apparatus 30 may be a speaker placed on the same platform as the capture apparatus 20. Note that, when the output apparatus 30 is a display apparatus, the output apparatus 30 may further display an image generated by the capture apparatus 20.
In the example illustrated in the present drawing, the decision unit 120 uses data stored in a history data storage unit 122. The history data storage unit 122 stores at least one of: history data of the position of a person and the position of a door determined by the image processing unit 110; and history data of the relative position between the position of a person and the position of a door. The decision unit 120 decides, by use of the history data, whether to perform the predetermined processing. Details of the processing performed by the image processing unit 110 and the processing performed by the decision unit 120 are described by use of other drawings. Note that, the history data storage unit 122 may be a part of the station monitoring apparatus 10, or may be located outside the station monitoring apparatus 10.
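For illustration only, the history data described above might be organized as the following kind of time-stamped records; the record layout and field names are assumptions made for this sketch and are not part of the embodiment.

from dataclasses import dataclass
from datetime import datetime
from typing import List, Optional, Tuple

@dataclass
class HistoryRecord:
    # Hypothetical layout of one entry in the history data storage unit 122.
    captured_at: datetime                               # generation date and time of the image
    door_position: Tuple[float, float]                  # sensed position of the door 50 (e.g., pixel coordinates)
    person_position: Optional[Tuple[float, float]]      # sensed position of a person, if any
    relative_position: Optional[Tuple[float, float]] = None  # door-to-person offset, if stored

# The decision unit 120 would read past records from a history such as this.
history: List[HistoryRecord] = []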
The bus 1010 is a data transmission path through which the processor 1020, the memory 1030, the storage device 1040, the input/output interface 1050, and the network interface 1060 transmit/receive data to/from one another. However, a method of mutually connecting the processor 1020 and the like is not limited to bus connection.
The processor 1020 is a processor achieved by a central processing unit (CPU), a graphics processing unit (GPU), or the like.
The memory 1030 is a main storage apparatus achieved by a random access memory (RAM), or the like.
The storage device 1040 is an auxiliary storage apparatus achieved by a hard disk drive (HDD), a solid state drive (SSD), a memory card, a read only memory (ROM), or the like. The storage device 1040 stores a program module that achieves each of functions (e.g., the image processing unit 110 and the decision unit 120) of the station monitoring apparatus 10. The processor 1020 reads each of the program modules onto the memory 1030, executes the read program module, and thereby achieves each of the functions being related to the program module. Moreover, the storage device 1040 also functions as the history data storage unit 122.
The input/output interface 1050 is an interface for connecting the station monitoring apparatus 10 and various kinds of input/output equipment (e.g., the capture apparatus 20 and the output apparatus 30).
The network interface 1060 is an interface for connecting the station monitoring apparatus 10 to a network. The network is, for example, a local area network (LAN) or a wide area network (WAN). A method of connecting the network interface 1060 to a network may be wireless connection or may be wired connection. Note that, the station monitoring apparatus 10 may be connected to at least one of the capture apparatus 20 and the output apparatus 30 via the network interface 1060.
Each time it generates an image, each of the capture apparatuses 20 transmits the image to the station monitoring apparatus 10. The station monitoring apparatus 10 acquires the image from the capture apparatus 20 together with date and time information indicating the generation date and time of the image (step S10). Next, the image processing unit 110 and the decision unit 120 perform the following processing for each of the doors 50 of the vehicle 40.
First, the image processing unit 110 senses a position of a person and a position of the door 50 included in the image, and stores the sensing result in the history data storage unit 122 as data associated with the date and time information, i.e., as the history data described above (step S20). Herein, the image processing unit 110 preferably limits the persons to be sensing targets to persons located within a predetermined distance (e.g., within 2 m) from the door 50. For example, the image processing unit 110 first senses the position of the door 50 by processing the image. Next, the image processing unit 110 decides, as a region to be processed for sensing a person, a region located within the predetermined distance from the door 50 sensed in the image. In this way, a processing load of the station monitoring apparatus 10 can be reduced.
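As a minimal sketch of this region limitation, assuming pixel coordinates and caller-supplied detectors (detect_door and detect_persons are hypothetical placeholders for the actual image analysis, and the 200-pixel window is an assumed stand-in for the predetermined distance), the person search can be restricted to a neighborhood of the sensed door as follows.

def sense_positions(image, detect_door, detect_persons, max_distance_px=200):
    # Sense the door 50 first, then limit person sensing to a window around it.
    door_x, door_y = detect_door(image)
    # Region to be processed for sensing a person: a square window centered on the door.
    region = (door_x - max_distance_px, door_y - max_distance_px,
              door_x + max_distance_px, door_y + max_distance_px)
    persons = detect_persons(image, region)  # only this region is processed
    return (door_x, door_y), persons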
Then, the decision unit 120 computes a movement velocity of the position of the door 50 by use of the history data stored in the history data storage unit 122 and the position of the door 50 currently generated by the image processing unit 110. Moreover, the decision unit 120 computes a movement velocity of the position of the person by use of the history data stored in the history data storage unit 122 and the position of the person currently generated by the image processing unit 110 (step S30). The velocity of the door 50 and the velocity of the person are expressed as, for example, the number of pixels moved per frame (or per unit time), or an actual movement distance per frame (or per unit time). Then, when the difference between the two computed movement velocities is within a criterion (step S40: Yes), the decision unit 120 increases the value of a counter, i.e., a continuation time, by one frame (step S50). Then, when the value of the counter has reached a criterion, i.e., when a state in which the difference between the movement velocity of the door 50 and the movement velocity of the person is within the criterion has continued for a time equal to or more than a criterion time (step S60: Yes), there is a high possibility that a person or belongings of the person are caught in the door 50, and the decision unit 120 therefore executes predetermined processing (step S70). The criterion (criterion time) in the step S60 is, for example, equal to or more than 1/30 seconds. Moreover, the criterion time is preferably equal to or less than one second. Moreover, the processing performed in the step S70 is, for example, processing of stopping the vehicle 40 having the door 50. In this instance, the decision unit 120 may cause a speaker placed in the station to output a caution sound.
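The following is a minimal sketch of steps S30 to S70 for one door, assuming that positions are pixel coordinates sensed once per frame; the velocity criterion and the criterion time in frames are illustrative assumptions, not values taken from the embodiment.

class CatchDetector:
    # Sketch of steps S30-S70 for a single door 50.
    def __init__(self, velocity_diff_criterion=2.0, criterion_frames=10):
        self.velocity_diff_criterion = velocity_diff_criterion  # pixels per frame (assumed)
        self.criterion_frames = criterion_frames                # e.g., 10 frames = 10/30 s (assumed)
        self.counter = 0                                        # continuation time, in frames

    @staticmethod
    def _velocity(prev, curr):
        # Movement in pixels over one frame.
        return (curr[0] - prev[0], curr[1] - prev[1])

    def update(self, prev_door, curr_door, prev_person, curr_person):
        # Returns True when the predetermined processing (e.g., stopping the vehicle) should be performed.
        door_v = self._velocity(prev_door, curr_door)            # step S30
        person_v = self._velocity(prev_person, curr_person)
        diff = ((door_v[0] - person_v[0]) ** 2 + (door_v[1] - person_v[1]) ** 2) ** 0.5
        if diff <= self.velocity_diff_criterion:                 # step S40: Yes
            self.counter += 1                                    # step S50
            return self.counter >= self.criterion_frames         # step S60
        self.counter = 0                                         # step S52
        return False

In such a sketch, one detector instance would be kept for each door 50 and each person within the sensing region, and its update would be driven once per acquired image.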
Note that, in the step S60, the criterion time may not be a fixed value, and may vary in response to a movement distance (the number of movement pixels) per unit time. Specifically, as the movement distance per unit time becomes longer, the criterion time is shortened. For example, when the number of movement pixels of the door 50 between a certain time T and a time T+α exceeds a predetermined number of pixels (e.g., equal to or more than 10 pixels), the time interval α is set as the criterion time. In other words, the time interval α taken for the number of movement pixels of the door 50 to exceed the predetermined number of pixels is set as the criterion time. By allowing the criterion time to be dynamic in this way, the criterion time can be prolonged at a time point when an electric train begins to run, and can be shortened when the electric train reaches a given velocity. This reduces the risk that the criterion time is too short and minute noise affects the decision, or that the criterion time is too long and the decision becomes excessively late, and enables an optimum criterion time to be set in response to the running speed of the electric train.
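A sketch of this dynamic criterion time, under the assumption that the door position is available as pixel coordinates for each past frame, could take the form below; the 10-pixel threshold and the upper bound on the number of frames are illustrative assumptions.

def dynamic_criterion_frames(door_positions, pixel_threshold=10, max_frames=30):
    # door_positions: list of (x, y) door positions, one per frame, oldest first.
    # Returns, as the criterion time, the number of frames alpha that the door 50
    # needed to move at least pixel_threshold pixels, counted back from the newest frame.
    newest = door_positions[-1]
    for alpha in range(1, min(max_frames, len(door_positions))):
        past = door_positions[-1 - alpha]
        moved = ((newest[0] - past[0]) ** 2 + (newest[1] - past[1]) ** 2) ** 0.5
        if moved >= pixel_threshold:
            return alpha       # the train is already moving fast: short criterion time
    return max_frames          # the train has barely moved yet: long criterion time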
Moreover, in the step S40, the decision unit 120 computes the difference between the two movement velocities as a vector. Then, a component of the vector in a particular direction (e.g., the movement direction of the vehicle 40) is preferably used as a basis for the decision. For example, when the component, in the movement direction of the vehicle 40, of the vector representing the difference between the two movement velocities is less than a criterion, the movement amounts of the person and the electric train may be decided to be equal. An influence of noise resulting from body movement of a person can be eased by making the decision based only on the component in the movement direction of the vehicle 40.
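A minimal sketch of this direction-limited decision, assuming 2-D velocity vectors in pixel units and an externally supplied vector (of any length) for the movement direction of the vehicle 40, could compute the scalar projection as follows.

def velocity_diff_along_vehicle(door_velocity, person_velocity, vehicle_direction):
    # Returns the magnitude of the component, in the movement direction of the
    # vehicle 40, of the vector representing the difference of the two velocities.
    diff = (door_velocity[0] - person_velocity[0],
            door_velocity[1] - person_velocity[1])
    norm = (vehicle_direction[0] ** 2 + vehicle_direction[1] ** 2) ** 0.5
    if norm == 0.0:
        return 0.0
    return abs(diff[0] * vehicle_direction[0] + diff[1] * vehicle_direction[1]) / norm

This value, rather than the magnitude of the full difference vector, would then be compared against the criterion in the step S40.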
Moreover, the decision unit 120 preferably outputs, to the output apparatus 30, information (hereinafter, referred to as door position information) indicating the position of the door 50 that has a high possibility of catching a person or belongings of the person (i.e., the door 50 decided to have reached the criterion in the step S60). As described above, the output apparatus 30 is, for example, a portable terminal carried by a station employee and/or a display apparatus placed in the station. In this way, an action of a station employee can be accelerated. Note that, the door position information output to the output apparatus 30 is generated by use of, for example, the position of the door 50 (e.g., positions of the ends 52 and 54) sensed by the image processing unit 110.
Herein, the door position information output to the output apparatus 30 may include, for example, a position of the capture apparatus 20 that has generated the image in which the door 50 is captured. In this case, the capture apparatus 20 transmits identification information of the capture apparatus 20 to the station monitoring apparatus 10 together with the image. The station monitoring apparatus 10 stores, in advance, the identification information of the capture apparatus 20 in association with the position of the capture apparatus 20. Then, the decision unit 120 determines, by use of the stored information, the position related to the identification information transmitted from the capture apparatus 20.
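For illustration only, the association between identification information and positions might be held as a simple mapping such as the following; the identifiers and position values are hypothetical.

# Hypothetical mapping, stored in advance, from the identification information of
# each capture apparatus 20 to its position (e.g., platform and car position).
camera_positions = {
    "cam-01": "platform 1, car 1",
    "cam-02": "platform 1, car 2",
}

def lookup_camera_position(camera_id):
    # Returns the stored position related to the transmitted identification information.
    return camera_positions.get(camera_id)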
Note that, when the difference between the movement velocities is more than the criterion in the step S40 (step S40: No), the decision unit 120 resets the value of the counter (step S52), and returns to the step S10.
When the image processing unit 110 senses the position of the door and the position of the person (step S20), the decision unit 120 computes a difference between the two positions. The difference indicates a relative distance between the door and the person (step S32). Then, when the relative distance is equal to or less than a criterion value (step S42: Yes), the decision unit 120 performs the processing in and after the step S50. Moreover, when the relative distance is more than the criterion value (step S42: No), the decision unit 120 resets the counter (step S52), and returns to the step S10.
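A minimal sketch of this relative-distance variant, under the same pixel-coordinate assumption as before and with an illustrative criterion value, is shown below.

class RelativeDistanceDetector:
    # Sketch of the variant using steps S32 and S42 for a single door 50.
    def __init__(self, distance_criterion_px=50.0, criterion_frames=10):
        self.distance_criterion_px = distance_criterion_px   # criterion value (assumed)
        self.criterion_frames = criterion_frames              # criterion time in frames (assumed)
        self.counter = 0

    def update(self, door_position, person_position):
        # Returns True when the predetermined processing should be performed.
        dx = door_position[0] - person_position[0]
        dy = door_position[1] - person_position[1]
        distance = (dx * dx + dy * dy) ** 0.5                  # step S32
        if distance <= self.distance_criterion_px:             # step S42: Yes
            self.counter += 1                                  # step S50
            return self.counter >= self.criterion_frames       # step S60
        self.counter = 0                                       # step S52
        return False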
As above, according to the present example embodiment, the station monitoring apparatus 10 determines the position of the door 50 of the vehicle 40 and the position of a person by analyzing an image capturing a platform of a station. Then, whether to perform predetermined processing is decided by use of these positions. For example, even when a person moves at a velocity close to that of the vehicle 40 at a place distant from the door 50, the station monitoring apparatus 10 does not decide that the person or belongings of the person are caught in the door 50. Moreover, even when the belongings of a person caught in the door 50 are thin, this fact can be sensed. Thus, the fact that a person or belongings of the person are caught in the door 50 can be accurately sensed.
While an example embodiment of the present invention has been described above with reference to the drawings, the example embodiment is an exemplification of the present invention, and various configurations other than the configuration described above can also be adopted.
Moreover, although a plurality of processes (pieces of processing) are described in order in a plurality of flowcharts used in the above description, an execution order of processes executed in each example embodiment is not limited to the described order. In each example embodiment, an order of illustrated processes can be changed to an extent that causes no problem in terms of content. Moreover, the example embodiments described above can be combined with one another to an extent that their contents do not contradict.
Some or all of the above-described example embodiments can also be described as, but are not limited to, the following supplementary notes.
an image processing unit that determines a position of a door of a vehicle and a position of a person by analyzing an image capturing a platform of a station; and
a decision unit that decides, after the vehicle has begun to move, by use of the position of the door and the position of the person, whether to perform predetermined processing.
the decision unit performs the predetermined processing when deciding that a difference between a movement velocity of any of the persons and a movement velocity of the door is within a criterion continuously for a criterion time.
the decision unit performs the predetermined processing when deciding that a relative distance between any of the persons and the door is within a criterion continuously for a criterion time.
the criterion time is equal to or more than 1/30 seconds.
the position of the door is an end of the door.
the door is a double-leaf door, and
the position of the door is an end of a tangent line of two door leaves.
the image processing unit determines, as the position of the person, a center of a region where the person is present.
the image processing unit estimates a position of a joint of the person, and determines, as the position of the person, a closest position to the position of the door among the estimated positions of the joints.
the predetermined processing includes processing of stopping the vehicle.
the predetermined processing includes output of information indicating the position of the door.
the decision unit outputs information indicating the position of the door to a portable terminal carried by a station employee.
by a computer,
by the computer,
performing the predetermined processing when deciding that a difference between a movement velocity of any of the persons and a movement velocity of the door is within a criterion continuously for a criterion time.
by the computer,
performing the predetermined processing when deciding that a relative distance between any of the persons and the door is within a criterion continuously for a criterion time.
the criterion time is equal to or more than 1/30 seconds.
the position of the door is an end of the door.
the door is a double-leaf door, and
the position of the door is an end of a tangent line of two door leaves.
by the computer,
determining, as the position of the person, a center of a region where the person is present.
by the computer,
estimating a position of a joint of the person, and determining, as the position of the person, a closest position to the position of the door among the estimated positions of the joints.
the predetermined processing includes processing of stopping the vehicle.
the predetermined processing includes output of information indicating the position of the door.
by the computer,
outputting information indicating the position of the door to a portable terminal carried by a station employee.
processing of determining a position of a door of a vehicle and a position of a person by analyzing an image capturing a platform of a station; and
processing of deciding, after the vehicle has begun to move, by use of the position of the door and the position of the person, whether to perform predetermined processing.
the predetermined processing when deciding that a difference between a movement velocity of any of the persons and a movement velocity of the door is within a criterion continuously for a criterion time.
the predetermined processing when deciding that a relative distance between any of the persons and the door is within a criterion continuously for a criterion time.
the criterion time is equal to or more than 1/30 seconds.
the position of the door is an end of the door.
the door is a double-leaf door, and
the position of the door is an end of a tangent line of two door leaves.
deciding, as the position of the person, a center of a region where the person is present.
estimating a position of a joint of the person, and deciding, as the position of the person, a closest position to the position of the door among the estimated positions of the joints.
the predetermined processing includes processing of stopping the vehicle.
the predetermined processing includes output of information indicating the position of the door.
outputting information indicating the position of the door to a portable terminal carried by a station employee.
Filing Document: PCT/JP2020/000684
Filing Date: 1/10/2020
Country: WO