The present disclosure relates to an image processing apparatus, an image processing method, and a computer readable medium.
In related techniques, an abnormality in a monitoring target such as a fence has been monitored by a monitoring person in a monitoring room who watches camera images from a plurality of cameras. For example, when the monitoring person determines that there is a suspicious point in the monitoring target, he/she turns the camera toward the monitoring target and controls the camera to zoom in so as to detect an abnormality in the monitoring target. However, it may take a human a long time to detect an abnormality in a monitoring target, and this delay may greatly increase the cost of finding and handling the abnormality.
Thus, recently, a system for monitoring an abnormality in a monitoring target using an optical fiber has been proposed (e.g., Patent Literature 1).
In the technique described in Patent Literature 1, an optical fiber detection sensor identifies a deflection or the like generated in a fence and detects an intrusion of a moving body such as a person, as well as the location where the intrusion is detected. Then, the photographed video of the camera currently showing the moving body and the photographed video of a camera adjacent to that camera are displayed separately on the same screen.
Patent Literature 1: Japanese Unexamined Patent Application Publication No. 2009-017416
The technique described in Patent Literature 1 displays the camera image of the camera in which a moving body appears and the camera image of a camera adjacent to that camera. However, there is a problem in that it is difficult for a monitoring person to visually recognize, from the display of camera images alone, that an abnormality has occurred.
Thus, an object of the present disclosure is to provide an image processing apparatus, an image processing method, and a computer readable medium that can solve the above-described problem and can display an image in such a way that an occurrence of an abnormality can be visually recognized easily.
In an example aspect, an image processing apparatus includes:
a display unit; and
a control unit configured to control the display unit to display a sensing data image indicating sensing data of an optical fiber and a camera image of a camera which photographs an area in which a predetermined event is detected by the sensing data.
In another example aspect, an image processing method performed by an image processing apparatus includes:
acquiring a sensing data image indicating sensing data of an optical fiber and a camera image of a camera which photographs an area in which a predetermined event is detected by the sensing data; and
displaying the sensing data image and the camera image.
In another example aspect, a non-transitory computer readable medium storing a program causing a computer to execute:
a procedure for acquiring a sensing data image indicating sensing data of an optical fiber and a camera image of a camera which photographs an area in which a predetermined event is detected by the sensing data; and
a procedure for displaying the sensing data image and the camera image.
According to the above-described aspects, it is possible to achieve an effect that an image can be displayed in such a way that an occurrence of an abnormality can be visually recognized easily.
Hereinafter, an embodiment of the present disclosure will be described with reference to the drawings. In the embodiment described below, as an example, a monitoring target to be monitored is described as a fence, but the monitoring target is not limited to a fence.
First, a configuration of the monitoring system according to this embodiment will be described with reference to the drawings.
The monitoring system according to this embodiment includes fences 10, an optical fiber cable 20, an optical fiber detection unit 30, an image processing apparatus 40 that includes a control unit 41 and a display unit 42, and a plurality of cameras 50.
The optical fiber cable 20 is a cable formed by covering one or more optical fibers. The optical fiber cable 20 is laid on the fences 10 and buried in the ground along the fences 10. Specifically, the optical fiber cable 20 extends from the optical fiber detection unit 30 along the fences 10, is turned back at a turning point, and returns to the optical fiber detection unit 30. The part of the optical fiber cable 20 between the optical fiber detection unit 30 and the turning point is laid on the fences 10, and the other part of the optical fiber cable 20 is buried in the ground along the fences 10. However, this method of laying and burying the optical fiber cable 20 is merely an example, and the present disclosure is not limited to it.
The camera 50 photographs an area where the fences 10 are installed. The camera 50 is implemented by, for example, a fixed camera or a PTZ (Pan Tilt Zoom) camera. A plurality of cameras 50 may be installed so that the entire area where the fences 10 are installed can be photographed; the number of installed cameras 50 and the installation spacing between them are not particularly limited. For example, when high-performance cameras 50 having a long maximum shooting distance are used, the number of installed cameras can be reduced and the installation spacing between the cameras 50 can be increased.
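As a purely illustrative sketch of this sizing trade-off (the numbers below are assumptions introduced for explanation, not values fixed by this disclosure), the required number of cameras can be estimated from the fence length and the effective shooting range of one camera:

```python
import math

fence_length_m = 560            # assumed total length of the fence line to cover
coverage_per_camera_m = 80      # assumed effective shooting range of one camera 50

cameras_needed = math.ceil(fence_length_m / coverage_per_camera_m)
print(cameras_needed)           # 7; a camera with a longer maximum shooting
                                # distance reduces this count and widens the spacing
```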
The monitoring system according to the first embodiment monitors the fences 10 and its surroundings using an optical fiber sensing technique that uses an optical fiber as a sensor.
Specifically, the optical fiber detection unit 30 makes pulsed light incident on at least one optical fiber included in the optical fiber cable 20. As the pulsed light travels through the optical fiber in the direction of the fences 10, backscattered light is generated at each transmission distance. This backscattered light returns to the optical fiber detection unit 30 via the same optical fiber through which the pulsed light was transmitted.
At this time, the optical fiber detection unit 30 makes pulsed light incident in the clockwise direction and receives the resulting backscattered light, and likewise makes pulsed light incident in the counterclockwise direction and receives the resulting backscattered light. Thus, the optical fiber detection unit 30 receives backscattered light from two directions.
Here, the fence 10 vibrates when an event such as a person grabbing and shaking the fence 10 occurs, and the vibration of the fence 10 is transmitted to the optical fiber. The vibration pattern transmitted to the optical fiber fluctuates dynamically and differs according to the type of the event occurring in the fence 10. In the first embodiment, for example, the following events are assumed as predetermined events that occur in the fences 10.
(1) A person grabs the fence 10 and shakes it.
(2) A person hits the fence 10.
(3) A person climbs the fence 10.
(4) A person places a ladder against the fence 10 and climbs the ladder.
(5) A person or an animal wanders around the fence 10.
(6) A person digs around the fence 10.
Thus, the backscattered light received by the optical fiber detection unit 30 includes a pattern corresponding to the state of the fence 10, i.e., a pattern corresponding to the event occurring in the fence 10. In this embodiment, this fact is used to detect the state of the fence 10, specifically, to detect a predetermined event occurring in the fence 10, by the method described below.
The optical fiber detection unit 30 can identify the location in the fence 10 at which the backscattered light is generated based on the time difference between the time when the pulsed light is incident on the optical fiber and the time when the backscattered light is received from the optical fiber. Further, in the first embodiment, as described above, the fences 10 are composed of a plurality of fences 10 connected to each other. Therefore, the optical fiber detection unit 30 can also identify, from the identified location, the individual fence 10 in which the event has occurred.
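This time-of-flight calculation can be illustrated with the following minimal sketch (the refractive index, the fence boundaries, and the function names are assumptions introduced for explanation, not part of this disclosure): the backscattered light makes a round trip, so the one-way distance is half the round-trip distance, and the identified distance can then be mapped to an individual fence 10.

```python
import bisect

C_VACUUM = 299_792_458.0   # speed of light in vacuum [m/s]
GROUP_INDEX = 1.468        # assumed group refractive index of a silica fiber

def scatter_distance_m(round_trip_time_s: float) -> float:
    """Distance along the fiber at which the backscattered light was generated."""
    speed_in_fiber = C_VACUUM / GROUP_INDEX
    return speed_in_fiber * round_trip_time_s / 2.0   # halve the round trip

# Assumed layout: fence k occupies [boundaries[k], boundaries[k+1]) metres of fiber.
boundaries = [390.0, 420.0, 450.0, 480.0, 510.0, 540.0, 570.0]

def fence_index(distance_m: float) -> int | None:
    """Map a distance along the fiber to the individual fence 10 at that distance."""
    i = bisect.bisect_right(boundaries, distance_m) - 1
    return i if 0 <= i < len(boundaries) - 1 else None

d = scatter_distance_m(4.9e-6)                  # backscatter received 4.9 us after the pulse
print(f"{d:.0f} m -> fence {fence_index(d)}")   # ~500 m -> fence 3
```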
Thus, the optical fiber detection unit 30 can generate, for example, vibration data indicating the vibration detected at each location as sensing data.
In such vibration data, the vibration pattern appearing at each position differs according to the event occurring in the fence 10 at that position.
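Such vibration data can be pictured as a two-dimensional "waterfall" whose axes are position along the fiber and time. The following sketch (shapes and values are assumptions for illustration) shows one way to hold and summarize it:

```python
import numpy as np

# Hypothetical waterfall: rows are sampling instants, columns are 1 m position bins.
n_samples, n_positions = 1_000, 600
waterfall = np.zeros((n_samples, n_positions))

# An event around 500 m shows up as energy concentrated in the matching columns
# over the time interval (rows) during which the event is occurring.
rng = np.random.default_rng(0)
waterfall[300:400, 495:505] += rng.normal(1.0, 0.2, size=(100, 10))

rms_per_position = np.sqrt((waterfall ** 2).mean(axis=0))  # quick activity profile
print(int(rms_per_position.argmax()))                      # ~ position of the event
```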
Thus, in this embodiment, the control unit 41 performs machine learning (e.g., deep learning) on the vibration pattern when a predetermined event is occurring in the fence 10 and detects whether a predetermined event is occurring in the fence 10 using a result of the machine learning (initial training model).
First, a method of the machine learning will be described with reference to the drawings.
The control unit 41 first acquires, from the optical fiber detection unit 30, vibration data at the time when a predetermined event is occurring in the fence 10 (Step S1) and extracts vibration patterns from the acquired vibration data (Step S2).
Next, the control unit 41 checks the vibration patterns against the supervised data, classifies the vibration patterns (Step S3), and performs supervised learning (Step S4). By doing so, an initial training model is obtained (Step S5). When a vibration pattern corresponding to an event occurring in the fence 10 is input, the initial training model outputs the predetermined event that may be applicable if there is a possibility that the input event corresponds to any of the predetermined events. Alternatively, the initial training model may output, together with the predetermined event that may be applicable, a confidence indicating how likely it is that this predetermined event is occurring. Further, an importance and a priority of an event based on the confidence may be displayed. For example, the importance of the event "a person climbs the fence 10" may be set higher than that of the event "a person or an animal wanders around the fence 10", and the former event may be output with a higher priority.
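For illustration only, Steps S1 to S5 could be realized along the following lines (a sketch under assumptions: the feature extraction and the classifier are placeholders chosen for readability, not the learning method fixed by this disclosure):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Predetermined events (1)-(6) listed above.
EVENTS = ["shake", "hit", "climb", "ladder", "wander", "dig"]

def extract_features(pattern: np.ndarray) -> np.ndarray:
    """Step S2 (illustrative): reduce a raw vibration time series to summary features."""
    spectrum = np.abs(np.fft.rfft(pattern))
    return np.array([
        pattern.std(),             # overall vibration strength
        np.abs(pattern).max(),     # peak amplitude
        float(spectrum.argmax()),  # dominant frequency bin
        spectrum.mean(),           # broadband energy
    ])

def train_initial_model(patterns, labels):
    """Steps S3-S5 (illustrative): fit a classifier on labeled vibration patterns."""
    X = np.stack([extract_features(p) for p in patterns])
    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X, labels)           # labels are indices into EVENTS (supervised data)
    return model
```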
Next, a method of detecting whether a predetermined event is occurring in the fence 10 will be described.
In this case, the control unit 41 first acquires a vibration pattern corresponding to an event occurring in the fence 10 from the optical fiber detection unit 30. Next, the control unit 41 inputs this vibration pattern to the initial training model. When the control unit 41 obtains a predetermined event that may be applicable as the output of the initial training model, it detects that the predetermined event is occurring. Moreover, when the control unit 41 obtains a confidence together with the predetermined event that may be applicable, it may determine that the predetermined event is detected only if the confidence is equal to or greater than a threshold.
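Reusing the names from the training sketch above, the detection with a confidence threshold could look as follows (again only a sketch; the threshold value is an assumed operating point):

```python
import numpy as np

CONFIDENCE_THRESHOLD = 0.8   # assumed; tuned per installation site

def detect_event(model, pattern):
    """Return (event, confidence) if a predetermined event is detected, else None."""
    features = extract_features(pattern).reshape(1, -1)
    proba = model.predict_proba(features)[0]        # confidence per learned class
    best = int(np.argmax(proba))
    if proba[best] >= CONFIDENCE_THRESHOLD:
        return EVENTS[model.classes_[best]], float(proba[best])
    return None                                     # confidence below threshold
```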
As described above, in this embodiment, the vibration pattern when a predetermined event is occurring in the fence 10 is machine-learned, and a result of the machine learning is used to detect a predetermined event occurring in the fence 10.
It may be difficult for a human analysis to extract, from the data, features for detecting a predetermined event occurring in the fence 10. In this embodiment, since a training model is built from a large number of patterns, a predetermined event occurring in the fence 10 can be detected with high accuracy even in such cases.
Note that in the machine learning according to this embodiment, a training model may be generated in the initial state based on two or more pieces of supervised data. In addition, this training model may additionally learn newly detected patterns. At this time, the specific condition for detecting a predetermined event occurring in the fence 10 may be adjusted based on the updated training model.
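One way to let the model newly learn a newly detected pattern is to use an incrementally trainable classifier; the sketch below swaps in scikit-learn's SGDClassifier (which supports partial_fit) purely for illustration, reusing EVENTS and extract_features from the training sketch above:

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

incremental_model = SGDClassifier(loss="log_loss", random_state=0)
ALL_CLASSES = np.arange(len(EVENTS))   # must be declared up front for partial_fit

def learn_new_pattern(pattern: np.ndarray, label_index: int) -> None:
    """Fold one newly labeled vibration pattern into the existing model."""
    X = extract_features(pattern).reshape(1, -1)
    incremental_model.partial_fit(X, [label_index], classes=ALL_CLASSES)
```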
When detecting that a predetermined event is occurring in the fence 10, the control unit 41 of the image processing apparatus 40 controls the camera 50 which photographs an area including that fence 10, for example, by adjusting its angle and zoom magnification.
Additionally, the control unit 41 may control two or more cameras 50 which photograph an area including the fence 10 in which the predetermined event is detected among the plurality of cameras 50. In this case, a function may be assigned to each camera 50. For example, at least one of the two or more cameras 50 may photograph the face of a person present in the above-mentioned area, so that the photographed face image is used for face authentication, while at least one other camera 50 may photograph the entire area, so that the photographed image is used for monitoring the behavior of a person or an animal present in the area. Moreover, the two or more cameras 50 may photograph the area from different angles. Furthermore, at least one of the two or more cameras 50 may perform photographing to complement the photographing of another camera 50. For example, when there is a blind spot in the above-mentioned area that cannot be photographed by the other camera 50, the at least one camera 50 may photograph the blind spot.
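The division of functions among two or more cameras 50 could be organized as in the following sketch (the Camera class and its point_at method are hypothetical stand-ins for a real PTZ control API):

```python
from dataclasses import dataclass

@dataclass
class Camera:
    """Hypothetical camera handle; point_at stands in for a real PTZ/zoom API."""
    name: str
    position_m: float

    def point_at(self, location_m: float, zoom: float) -> None:
        print(f"{self.name}: aiming at {location_m} m, zoom x{zoom}")

def assign_roles(cameras: list[Camera], event_location_m: float) -> None:
    """Divide functions: one close-up for face authentication, the rest wide views."""
    face_cam, *wide_cams = cameras
    face_cam.point_at(event_location_m, zoom=8.0)   # face image for authentication
    for cam in wide_cams:                           # behaviour monitoring, different
        cam.point_at(event_location_m, zoom=1.0)    # angles, and blind-spot coverage

assign_roles([Camera("50-A", 480.0), Camera("50-B", 520.0)], event_location_m=500.0)
```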
The display unit 42 is installed in a monitoring room or the like which monitors the entire area where the fences 10 are installed and performs various displays under the control of the control unit 41.
Specifically, the control unit 41 further controls the display unit 42 to display, for example, a sensing data image indicating the sensing data generated by the optical fiber detection unit 30 and a camera image of the camera 50 which photographs an area including the fence 10 in which a predetermined event is detected based on the sensing data.
Hereinafter, specific Display Examples displayed by the display unit 42 according to the first embodiment will be described.
First, a Display Example 1 will be described with reference to the drawings.
In the Display Example 1, the display unit 42 displays a sensing data image P11 and a camera image P12.
The sensing data image P11 indicates the sensing data generated by the optical fiber detection unit 30. This sensing data is obtained by arranging vibration data similar to the vibration data described above in chronological order.
The camera image P12 is the image of the camera 50, controlled by the control unit 41, which photographs the area including the fence 10 in which the event of a person digging around the fence is detected.
The sensing data image P11 is not limited to the one described above.
Alternatively, the sensing data image P11 may indicate sensing data in the following form.
In this sensing data, each position is expressed as a distance from the optical fiber detection unit 30.
The horizontally long rectangle in the range of about 90 m to 370 m from the optical fiber detection unit 30 indicates the underground, and the upper side of this horizontally long rectangle indicates the boundary with the ground. That is, in this range, the sensing data indicates vibrations detected by the part of the optical fiber cable 20 buried in the ground.
Further, the horizontally long rectangle in the range of about 390 m to about 560 m from the optical fiber detection unit 30 indicates the fence 10, and the lower side of this horizontally long rectangle indicates the boundary with the ground. That is, in this range, the sensing data indicates vibrations detected by the part of the optical fiber cable 20 laid on the fences 10.
In this sensing data, the vibration caused by the detected event is displayed at the position corresponding to the location where the event occurs.
Alternatively, the sensing data image P11 may indicate the sensing data in yet another form.
In addition, as for the sensing data image P11 and the camera image P12, when a user (e.g., a monitoring person in a monitoring room; the same applies hereinafter) designates a specific period of time or a specific time, the sensing data image P11 at the designated time and the camera image P12 of the camera 50 corresponding to the location of the detected vibration data may be displayed.
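Retrieving the display for a user-designated time amounts to a nearest-timestamp lookup over the stored images; a minimal sketch follows (the sorted in-memory timestamp list is an assumption for illustration):

```python
import bisect
from datetime import datetime

def nearest_index(timestamps: list[datetime], designated: datetime) -> int:
    """Index of the stored timestamp closest to the user-designated time.

    `timestamps` is assumed to be kept in ascending order.
    """
    i = bisect.bisect_left(timestamps, designated)
    if i == 0:
        return 0
    if i == len(timestamps):
        return len(timestamps) - 1
    before, after = timestamps[i - 1], timestamps[i]
    return i if after - designated < designated - before else i - 1
```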
Next, a Display Example 2 will be described with reference to the drawings.
In the Display Example 2, the display unit 42 displays a sensing data image P21, an overhead image P22, area identifying information P23, and a camera image P24.
The overhead image P22 is an image of the entire area where the fences 10 are installed as seen from above. As in the Display Example 1, in this Display Example 2 the control unit 41 detects an event that a person is digging around the fence 10.
The area identifying information P23 has a balloon shape and is displayed superimposed on the overhead image P22. It indicates, by an arrow, the area including the fence 10 in which the control unit 41 has detected the occurrence of the above-mentioned event, together with a warning message that a person is digging around the fence 10.
The sensing data image P21 and the camera image P24 are similar to the sensing data image P11 and the camera image P12 of the Display Example 1, respectively.
Next, a Display Example 3 will be described with reference to the drawings.
In the Display Example 3, the display unit 42 displays a sensing data image P31, an overhead image P32, a camera image P34, and an event information image P35.
The event information image P35 indicates event information representing the occurrence status of events. In this example, the event information includes, for each event, the date and time of its occurrence and its type.
Further, the event information image P35 may be configured in such a way that the user can select an event. Specifically, when the user designates a specific event from a plurality of events displayed in the event information image P35, the control unit 41 displays the sensing data image P31, the overhead image P32, and the camera image P34 corresponding to the date and time of this event.
The sensing data image P31 and the camera image P34 are similar to the sensing data image P11 and the camera image P12 of the Display Example 1, respectively.
Next, a hardware configuration of a computer 60 that implements the image processing apparatus 40 will be described with reference to the drawings.
The computer 60 includes a processor 601, a memory 602, a storage 603, an input/output interface 604, and a communication interface 605.
The processor 601 is an arithmetic processing apparatus such as a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit). The memory 602 is a memory such as a RAM (Random Access Memory) or a ROM (Read Only Memory). The storage 603 is a storage apparatus such as an HDD (Hard Disk Drive), an SSD (Solid State Drive), or a memory card. The storage 603 may be a memory such as a RAM or a ROM.
The storage 603 stores a program that implements the function of the control unit 41 included in the image processing apparatus 40. The processor 601 implements the function of the control unit 41 by executing the program. Here, the processor 601 may execute the program after reading it into the memory 602 or without reading it into the memory 602. The memory 602 and the storage 603 also play a role to store information and data held by the control unit 41.
The above program can be stored and provided to a computer (including the computer 60) using any type of non-transitory computer readable media. Non-transitory computer readable media include any type of tangible storage media. Examples of non-transitory computer readable media include magnetic storage media (such as floppy disks, magnetic tapes, hard disk drives, etc.), optical magnetic storage media (e.g. magneto-optical disks), CD-ROM (Compact Disc-Read Only Memory), CD-R (CD-Recordable), CD-R/W (CD-ReWritable), and semiconductor memories (such as mask ROM, PROM (Programmable ROM), EPROM (Erasable PROM), flash ROM, RAM (Random Access Memory), etc.). The program may be provided to a computer using any type of transitory computer readable media. Examples of transitory computer readable media include electric signals, optical signals, and electromagnetic waves. Transitory computer readable media can provide the program to a computer via a wired communication line (e.g. electric wires, and optical fibers) or a wireless communication line.
The input/output interface 604 is connected to a display apparatus 6041, an input apparatus 6042, and so on. The display apparatus 6041 is, for example, an LCD (Liquid Crystal Display) or a CRT (Cathode Ray Tube) display; it implements the display unit 42 and displays a screen corresponding to drawing data processed by the processor 601. The input apparatus 6042 receives an operator's operation input and is, for example, a keyboard, a mouse, or a touch sensor. The display apparatus 6041 and the input apparatus 6042 may be combined and implemented as a touch panel.
The communication interface 605 transmits data to and receives data from an external apparatus. For example, the communication interface 605 communicates with an external apparatus via a wired communication path or a wireless communication path.
Hereinafter, an operation flow of the image processing apparatus 40 according to the first embodiment will be described with reference to the drawings.
First, the control unit 41 acquires the sensing data generated by the optical fiber detection unit 30 and detects, based on the sensing data, that a predetermined event is occurring in the fence 10 (Step S11).
After that, the control unit 41 controls the display unit 42 to display a sensing data image indicating the sensing data and a camera image of the camera 50 which photographs the area including the fence 10 where the predetermined event is detected (Step S12). Specifically, the control unit 41 controls the display unit 42 to display the images as in the Display Example 1. Alternatively, the control unit 41 may control the display unit 42 to display the images as in the Display Example 2 or Display Example 3.
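Put together, Steps S11 and S12 form a simple acquire-detect-display loop. The sketch below is illustrative only: detection_unit, display, and the attributes of the sensing object are hypothetical, and detect_event is the inference sketch shown earlier.

```python
def monitoring_loop(detection_unit, model, cameras, display) -> None:
    """Illustrative acquire-detect-display loop (Steps S11 and S12)."""
    while True:
        sensing = detection_unit.read()                # hypothetical: pattern + location
        result = detect_event(model, sensing.pattern)  # Step S11: detect the event
        if result is None:
            continue
        event, confidence = result
        camera = min(cameras,                          # camera covering the event area
                     key=lambda c: abs(c.position_m - sensing.location_m))
        display.show(sensing_image=sensing.render(),   # Step S12: sensing data image
                     camera_image=camera.capture(),    # and the camera image together
                     caption=f"{event} ({confidence:.0%})")
```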
As described above, according to this embodiment, the sensing data image indicating the sensing data of the optical fiber and the camera image of the camera 50 which photographs the area including the fence 10 where a predetermined event is detected by the sensing data are displayed on the display unit 42. Since not only the camera image but also the sensing data image on which the detection is based is displayed, it becomes easy for a monitoring person to visually recognize that the predetermined event (abnormality) has occurred.
Moreover, the first embodiment uses the optical fiber sensing technique which uses an optical fiber as a sensor. This provides advantages such as immunity to electromagnetic noise, no need to supply power to the sensor, excellent environmental resistance, and easy maintenance.
In the above-described first embodiment, an example in which the fences 10 are installed outdoors is described.
In contrast, the second embodiment is an example in which the fences 10 are experimentally installed indoors (in a room, to be more specific).
In the second embodiment, fences 10a and 10b are installed in the room, and the optical fiber cable 20 is laid on the fences 10a and 10b.
In the second embodiment, one of the predetermined events that occur in the fences 10a and 10b is an event that a person touches the fence 10a or 10b. Therefore, cameras 50A and 50B are installed in the room so that the fences 10a and 10b can be photographed when a person touches them. Further, a door 70 is installed in the room, and a person enters and exits the room through the door 70.
Note that illustration of the optical fiber detection unit 30 and the image processing apparatus 40 is omitted here.
Further, the only difference between the monitoring system according to the second embodiment and that according to the above-described first embodiment is that, in the second embodiment, the fences are installed indoors. The basic configuration and operation of the second embodiment are otherwise the same as those of the first embodiment. Therefore, hereinafter, only specific Display Examples displayed by the display unit 42 according to the second embodiment will be described.
First, a Display Example 1 will be described with reference to the drawings.
In the Display Example 1, the display unit 42 displays a sensing data image P41 and a camera image P42.
The sensing data image P41 indicates sensing data generated by the optical fiber detection unit 30. Here, the sensing data indicates that a person has touched the fence 10b. Thus, the control unit 41 detects an event that a person is touching the fence 10b and controls the camera 50A which photographs an area including the fence 10b.
The camera image P42 is a camera image of the camera 50A which photographs the area including the fence 10b and is controlled by the control unit 41. In the camera image of the camera 50A, a person is surrounded by a square frame (the same applies to the following Display Examples).
Next, a Display Example 2 will be described with reference to the drawings.
In the Display Example 2, the display unit 42 displays a sensing data image P51 and a camera image P52.
Like in the Display Example 1, in this Display Example 2, the control unit 41 detects an event that a person is touching the fence 10b. However, in this Display Example 2, unlike the Display Example 1, the control unit 41 controls the two cameras 50A and 50B which photograph an area including the fence 10b.
Thus, the camera image P52 includes the camera images of the two cameras 50A and 50B controlled by the control unit 41. Furthermore, each of the camera images of the cameras 50A and 50B includes not only an image at the time when the above-described event is detected (second image from the left) but also an image after that time (leftmost image) and images before that time (third and fourth images from the left). In order to easily distinguish the image at the time when the event is detected from the other images, that image may be displayed in an emphasized manner, for example, by placing a symbol or a thick frame line on it.
Note that the vertical arrangement of the camera images of the cameras 50A and 50B may be swapped. The order in which the camera images of the cameras 50A and 50B are arranged one above the other may be selected by the user.
In addition, as for the sensing data image P51 and the camera image P52, when the user designates a specific period of time or a specific time and a camera 50, the sensing data image P51 at the designated time and the camera image P52 of the designated camera 50 corresponding to the location of the detected vibration data may be displayed.
In the second embodiment, since only the two cameras 50A and 50B are provided, the camera images of both cameras can be included in the camera image P52. However, when the number of cameras 50 is large and the control unit 41 controls many of them, it may not be possible to include the camera images of all the cameras 50 controlled by the control unit 41 in the camera image P52. In such a case, the camera images to be included in the camera image P52 may be selected by the user.
The sensing data image P51 is similar to the sensing data image P41 in the Display Example 1.
Next, a Display Example 3 will be described with reference to the drawings.
In the Display Example 3, the display unit 42 displays a sensing data image P61, an overhead image P62, and a camera image P63.
The overhead image P62 is an image of the entire room where the fences 10a and 10b are installed as seen from above.
The camera image P63 is similar to the camera image P42 of the Display Example 1.
The sensing data image P61 is similar to the sensing data image P41 in the Display Example 1.
Next, a Display Example 4 will be described with reference to the drawings.
In the Display Example 4, the display unit 42 displays a sensing data image P71, an overhead image P72, a camera image P73, and an event information image P74.
The event information image P74 indicates event information representing the occurrence status of events. In this example, the event information includes, for each event, the date and time of its occurrence and its type.
Further, the event information image P74 may be configured in such a way that the user can select an event. Specifically, when the user designates a specific event from a plurality of events displayed in the event information image P74, the control unit 41 displays the sensing data image P71, the overhead image P72, and the camera image P73 corresponding to the date and time of this event.
The overhead image P72 is similar to the overhead image P62 of the Display Example 3.
The camera image P73 is similar to the camera image P42 of the Display Example 1.
The sensing data image P71 is similar to the sensing data image P41 in the Display Example 1.
Next, a Display Example 5 will be described with reference to the drawings.
In the Display Example 5, the display unit 42 displays an overhead image P81 and area identifying information P82.
The overhead image P81 is similar to the overhead image P62 of the Display Example 3.
The area identifying information P82 has a balloon shape and is displayed superimposed on the overhead image P81. It indicates, by an arrow, the area including the location of the fence 10b at which the control unit 41 has detected the occurrence of the above-mentioned event, together with a warning message that a person is touching the fence 10b.
As described above, the basic configuration and operation of the second embodiment are the same as those of the first embodiment. Thus, the effect of the second embodiment is the same as that of the first embodiment.
In the above-described embodiments, an example in which the monitoring target is the fences 10 has been described, but the monitoring target is not limited to the fences 10. First, the installation site of the monitoring target may be an airport, a port, a plant, a nursing facility, a company building, a border, a nursery, a home, or the like. The monitoring target may be a wall, a pipeline, a utility pole, a civil engineering structure, a floor, or the like in addition to a fence. Further, the laying or burying site of the optical fiber cable 20 may be a wall, a pipeline, a utility pole, a civil engineering structure, a floor, or the like, in addition to a fence and underground. For example, when the fence 10 installed in a nursing facility is to be monitored, examples of a predetermined event that could occur in the fence 10 include a person hitting the fence 10, a person leaning against the fence 10 due to injury or the like, and a person climbing over the fence 10 to escape.
In the above-described embodiments, it has been described that the fence 10 vibrates when a predetermined event occurs. However, when such an event occurs, the sound, temperature, strain, stress, and the like of the fence 10 also change, and these changes are transmitted to the optical fiber. The patterns of sound, temperature, strain, stress, and the like also fluctuate dynamically and differ according to the type of the event occurring in the fence 10. For this reason, the optical fiber detection unit 30 may use a distributed acoustic sensor, a distributed temperature sensor, and the like in addition to a distributed vibration sensor to detect changes in vibration, sound, temperature, strain, stress, and the like, and generate sensing data. Then, the control unit 41 may detect an event occurring in the fence 10 based on sensing data in which these changes are reflected. This further improves the detection accuracy.
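Combining modalities can be as simple as concatenating per-sensor features before classification; the sketch below (function names assumed, shown only to illustrate the fusion idea) gives the classifier evidence from vibration, sound, and temperature at the same location:

```python
import numpy as np

def fused_features(vibration: np.ndarray,
                   acoustic: np.ndarray,
                   temperature: np.ndarray) -> np.ndarray:
    """Concatenate summary features from distributed vibration/acoustic/temperature
    sensing for one location; each argument is a 1-D time series."""
    def summarize(x: np.ndarray) -> np.ndarray:
        return np.array([x.mean(), x.std(), np.abs(x).max()])
    return np.concatenate([summarize(vibration),
                           summarize(acoustic),
                           summarize(temperature)])
```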
In the above-described embodiments, when a predetermined event is occurring in the fence 10, the control unit 41 controls the angle, zoom magnification, and so on of the camera 50 which photographs an area including this fence 10. However, the control unit 41 may continue to perform control even after a predetermined event has occurred. For example, the control unit 41 may control the camera 50 to track a person, an animal, a car, and the like present in the above-mentioned area. Moreover, when a person wandering around the fence 10 leaves an object such as a suspicious object, the control unit 41 may control one camera 50 to photograph the object and another camera 50 to track the person.
Moreover, the control unit 41 and the display unit 42 of the image processing apparatus 40 may be provided separately from each other. For example, the display unit 42 may be provided in a monitoring room, and the image processing apparatus 40 including the control unit 41 may be provided outside the monitoring room.
In the above-described embodiments, only one optical fiber detection unit 30 is provided, and the optical fiber cable 20 is used exclusively by it. However, the present disclosure is not limited to this.
For example, the optical fiber detection unit 30 may be provided in a communication carrier station, and the optical fiber cable 20 may be shared between existing communication equipment provided inside the communication carrier station and the optical fiber detection unit 30.
Alternatively, one optical fiber detection unit 30 may be provided in each of a plurality of communication carrier stations, and the optical fiber cable 20 may be shared between the plurality of optical fiber detection units 30 provided in the respective communication carrier stations.
Further alternatively, a plurality of optical fiber detection units 30 may be provided in one communication carrier station, and the optical fiber cable 20 may be shared between the plurality of optical fiber detection units 30.
Although the present disclosure has been described with reference to the embodiments, the present disclosure is not limited to the above-described embodiments. Various changes that can be understood by those skilled in the art can be made to the configurations and details of the present disclosure within the scope of the present disclosure.