This application claims priority to Japanese Patent Application No. 2021-032645 filed on Mar. 2, 2021, incorporated herein by reference in its entirety.
The present disclosure relates to an abnormal behavior notification device, an abnormal behavior notification system, an abnormal behavior notification method, and a recording medium.
Conventionally, the following technique is known (see, for example, Japanese Unexamined Patent Application Publication No. 2020-61079 (JP 2020-61079 A)). When a first vehicle detects a traffic violation vehicle with an in-vehicle camera, the first vehicle transmits an evidence image of the traffic violation, characteristic information of the traffic violation vehicle, etc. to a server, and the server transmits the characteristic information of the traffic violation vehicle to a second vehicle that is located near the estimated position of the traffic violation vehicle. The second vehicle captures images of the license plate, driver, etc. of the traffic violation vehicle and transmits the images to the server, and the server transmits this information to a client (police system, etc.).
In recent years, vehicle theft techniques have become more sophisticated, and vehicles can be stolen silently. Moreover, a theft may take only a few minutes. For this reason, even when the vehicle is parked in the garage at home, it is difficult to catch the criminal by capturing the scene of the theft. Therefore, there is a need for vehicle owners to be urgently notified when their property behaves differently than usual, such as when the vehicle is stolen.
Further, with the arrival of an aging society with a declining birthrate, it is socially important to watch over persons certified as requiring long-term care and elderly persons living alone. For related persons such as family members or friends, if the person certified as requiring long-term care or the elderly person behaves differently than usual or wanders about, there are safety concerns, such as the person going missing or getting involved in some kind of trouble. Therefore, there is a need for these related persons to be urgently notified when the person certified as requiring long-term care or the elderly person behaves differently than usual, such as by wandering about.
The technique described in JP 2020-61079 A captures images of the license plate, driver, etc. of a traffic violation vehicle and provides the images to a client when an unspecified traffic violation vehicle is detected. Thus, it does not assume that information is provided to the user when an object or person that the user desires to watch over behaves differently than usual as described above, and there is room for improvement.
In view of the above issue, an object of the present disclosure is to provide an abnormal behavior notification device, an abnormal behavior notification system, an abnormal behavior notification method, and a recording medium that enable an alert to be notified when a detection target desired to be watched over by a user exhibits an abnormal behavior different than usual.
The gist of the present disclosure is as follows.
(1) An abnormal behavior notification device including: a registration unit that registers identification information for identifying a detection target in a storage unit; a determination unit that determines whether the detection target is shown in an image captured on or around a road, based on the identification information; an abnormal behavior determination unit that determines whether the detection target exhibits an abnormal behavior that is different from a normal behavior of the detection target, when the detection target is shown in the image; and a transmission unit that transmits an alert when the detection target exhibits the abnormal behavior.
(2) The abnormal behavior notification device according to (1) described above, in which the image is an image captured by a mobile body traveling on the road.
(3) The abnormal behavior notification device according to (2) described above, in which: the normal behavior is that the detection target moves in a predetermined movement route and a predetermined time zone; and the abnormal behavior determination unit determines that the detection target exhibits the abnormal behavior that is different from the normal behavior when a position of the detection target based on a position of the mobile body when the image showing the detection target is captured is not included in the predetermined movement route, or when a time at which the image is captured is not included in the predetermined time zone.
(4) The abnormal behavior notification device according to any one of (1) to (3) described above, in which the detection target is a vehicle, and the identification information is information of a license plate of the vehicle.
(5) The abnormal behavior notification device according to any one of (1) to (3) described above, in which the detection target is a specific person, and the identification information is a facial image of the specific person.
(6) The abnormal behavior notification device according to any one of (1) to (5) described above, in which the registration unit registers the identification information received from a user terminal.
(7) The abnormal behavior notification device according to (6) described above, in which the registration unit registers the normal behavior received from the user terminal together with the identification information.
(8) The abnormal behavior notification device according to (6) or (7) described above, in which the transmission unit transmits the alert to the user terminal.
(9) The abnormal behavior notification device according to (3) described above, further including an estimation unit that specifies, based on the identification information, from a plurality of images showing the detection target captured by the mobile body in the past, positions of the detection target when the images are captured, and estimates the predetermined movement route and the predetermined time zone based on the specified positions of the detection target and imaging times of the images.
(10) The abnormal behavior notification device according to (1) described above, in which: the detection target is a specific person, and the normal behavior is that the specific person is accompanied by an attendant; and when the specific person is shown in the image and the same other person is not shown in the image continuously for a predetermined time or more within a predetermined distance from the specific person, the abnormal behavior determination unit determines that the specific person exhibits the abnormal behavior that is different from the normal behavior.
(11) The abnormal behavior notification device according to (10) described above, in which the identification information is a facial image of the specific person.
(12) An abnormal behavior notification system including a user terminal owned by a user and an abnormal behavior notification device communicably connected to the user terminal, the abnormal behavior notification system including: an acquisition unit that acquires identification information for identifying a detection target input to the user terminal; a registration unit that registers the identification information in a storage unit; a determination unit that determines whether the detection target is shown in an image captured on or around a road, based on the identification information; an abnormal behavior determination unit that determines whether the detection target exhibits an abnormal behavior that is different from a normal behavior of the detection target, when the detection target is shown in the image; and a transmission unit that transmits an alert to the user terminal when the detection target exhibits the abnormal behavior.
(13) An abnormal behavior notification method including: a step of registering identification information for identifying a detection target in a storage unit; a step of determining whether the detection target is shown in an image captured on or around a road, based on the identification information; a step of determining whether the detection target exhibits an abnormal behavior that is different from a normal behavior of the detection target, when the detection target is shown in the image; and a step of transmitting an alert when the detection target exhibits the abnormal behavior.
(14) A recording medium recording a program that causes a computer to function as: a registration unit that registers identification information for identifying a detection target in a storage unit; a determination unit that determines whether the detection target is shown in an image captured on or around a road, based on the identification information; an abnormal behavior determination unit that determines whether the detection target exhibits an abnormal behavior that is different from a normal behavior of the detection target, when the detection target is shown in the image; and a transmission unit that transmits an alert when the detection target exhibits the abnormal behavior.
The present disclosure has the effect of making it possible to provide an abnormal behavior notification device, an abnormal behavior notification system, an abnormal behavior notification method, and a program that enable an alert to be notified when a detection target desired to be watched over by a user exhibits an abnormal behavior different than usual.
Features, advantages, and technical and industrial significance of exemplary embodiments of the disclosure will be described below with reference to the accompanying drawings, in which like signs denote like elements, and wherein:
Hereinafter, several embodiments according to the present disclosure will be described with reference to the drawings. However, these descriptions are intended merely to illustrate embodiments of the present disclosure and are not intended to limit the present disclosure to such particular embodiments.
The mobile body 100 is a vehicle such as an automobile that travels on the road. In the present embodiment, as an example, the mobile body 100 is an autonomous driving bus that travels on a road based on a predetermined command and transports passengers, and is regularly operated in a smart city. A smart city, as proposed by the Ministry of Land, Infrastructure, Transport and Tourism, is a sustainable city or district for which management (planning, maintenance, operation, etc.) is performed while utilizing new technologies such as information and communication technology (ICT) to address various issues of the city in an effort to realize overall optimization. The mobile body 100 is not limited to a vehicle that is autonomously driven, and may be a vehicle that is manually driven.
The mobile body 100 is provided with a camera, captures images of the surroundings of the mobile body 100 during operation, and generates images showing surrounding vehicles, people, structures, and the like. The mobile body 100 then transmits the generated images to the server 200.
The server 200 is a device that manages a plurality of mobile bodies 100 and issues an operation command to each mobile body 100. The operation command includes information such as the operation route and operation time of the mobile body 100 and the bus stops where the mobile body 100 stops, and is transmitted from the server 200 to the mobile body 100. The server 200 receives the images transmitted from the mobile body 100, and issues an alert (warning) when a detection target registered in advance is shown in an image and the detection target exhibits an abnormal behavior that is different from usual. The alert is transmitted to, for example, the user terminal 300 that has registered the detection target.
The user terminal 300 is, for example, a portable computer such as a smartphone, a mobile phone terminal, a tablet terminal, a personal information terminal, or a wearable computer (smart watch or the like). The user terminal 300 may be a personal computer (PC). In order to register the detection target in the server 200, the user terminal 300 transmits the registration information related to the detection target to the server 200. Further, the user terminal 300 receives the alert transmitted from the server 200 and notifies the user of the alert.
The detection target is a target for which the user requests detection of the abnormal behavior, and corresponds to a vehicle (automobile) owned by the user, a person (family member, friend, etc.), an object, a structure, or the like that the user watches over. The detection target broadly includes anything for which the user requests detection of the abnormal behavior, such as a pet owned by the user or the user's home (entrance, windows, walls, etc.), as long as images of the detection target can be captured by the camera of the mobile body 100.
Since the mobile body 100 is regularly operated in the smart city, the situation of vehicles, people, structures, or the like in the smart city is recorded in the images captured by the camera of the mobile body 100. Therefore, the server 200 can monitor the events occurring in the smart city by collecting and analyzing the images captured by the mobile body 100. In particular, when there are a plurality of the mobile bodies 100, the server 200 can monitor the events occurring in the smart city in detail based on more images.
The camera does not have to be provided on the mobile body 100, and may be, for example, a plurality of surveillance cameras (fixed point cameras) installed at predetermined locations in the smart city. In this case, the abnormal behavior notification system 1000 is configured by connecting the surveillance cameras, the server 200, and the user terminal 300 so as to be able to communicate with each other via the communication network 500 such as the Internet. Also in this case, the server 200 can monitor the events occurring in the smart city by collecting and analyzing the images captured by the surveillance cameras.
When the detection target registered in the server 200 exists on or near the operation route of the mobile body 100 while the mobile body 100 is in operation, the detection target encounters the mobile body 100 and its image is captured by the camera mounted on the mobile body 100. When an image of the detection target is captured, the server 200 recognizes the position of the detection target and the time at the time of imaging by using the captured image and the position information of the mobile body 100. The server 200 also recognizes the state of the detection target at the time of imaging from the image. Upon recognizing the above, the server 200 determines whether the detection target exhibits an abnormal behavior that is different from the normal behavior of the detection target registered in the server 200.
The abnormal behavior of the detection target includes the case where the detection target exists at a time or location that is different from usual, such as when the detection target exists in a time zone different than usual or at a location different than usual. For example, when the detection target is a vehicle and the vehicle is mainly used for commuting in the morning and evening, the time zone and route in which the vehicle is driven are generally constant. In this case, the normal behavior of the vehicle is to travel on the commuting route during the morning and evening hours, and it is an abnormal behavior for the vehicle to travel during the daytime hours or to travel on a route different from the commuting route. In addition, when the detection target is an elderly person, the time zone and route in which the elderly person takes a walk are often fixed. In this case, the normal behavior of the elderly person is to take a walk on the usual route during the usual time zone, and it is an abnormal behavior that is different from usual for the elderly person to take a walk in a time zone different from the usual time zone or on a route different from the usual route.
Further, the abnormal behavior of the detection target includes a case where the detection target is in a state different from the normal state, such as a case where the detection target is acting in a state different from the normal state. For example, when the detection target is a specific person and the specific person usually acts together with another person, the abnormal behavior of the detection target is that the specific person acts alone. For example, when the detection target is a person certified as requiring long-term care, the person certified as requiring long-term care often takes a walk with an accompanying caregiver. In this case, the normal behavior of the person certified as requiring long-term care is to take a walk with a caregiver, and it is an abnormal behavior that is different from usual for the person certified as requiring long-term care to go out alone. Further, for example, when the detection target is a gate at home and the gate is normally closed, the abnormal behavior of the detection target is that the gate is open.
In order to detect the abnormal behavior of these detection targets, a combination of the detection target and the normal behavior of the detection target is registered in advance in the server 200. Registration is performed based on the registration information related to the detection target transmitted from the user terminal 300.
When the detection target exhibits an abnormal behavior that is different from usual, the user who has received the alert can take appropriate actions based on the alert. For example, when the detection target is a vehicle owned by the user, the vehicle may have been stolen, and since the user can notice the theft at an early stage, actions such as calling the police can be taken immediately. As a result, early arrest of the criminal is facilitated. When the detection target is a person certified as requiring long-term care or an elderly person, the detection target may be behaving differently than usual or wandering about, so that the user who has received the alert can take actions such as searching for the person.
The control unit 110 of the mobile body 100 is composed of a processor. The processor has one or more central processing units (CPUs) and peripheral circuits thereof. The processor may further include other arithmetic circuits such as a logical operation unit, a numerical operation unit, or a graphic processing unit. The control unit 110 provides a function that meets a predetermined purpose by controlling peripheral devices such as the positioning information receiving unit 130 or the camera 140 through execution of a computer program executably deployed in the work area of the storage unit 150.
The communication I/F 120 of the mobile body 100 is a communication interface with the communication network 500, and includes, for example, an antenna and a signal processing circuit that executes various processes related to wireless communication such as modulation and demodulation of wireless signals. The communication I/F 120 receives, for example, a downlink radio signal from a radio base station connected to the communication network 500, and transmits an uplink radio signal to the radio base station. The communication I/F 120 takes out a signal transmitted from the server 200 to the mobile body 100 from the received downlink radio signal and passes the signal to the control unit 110. Further, the communication I/F 120 generates an uplink radio signal including the signal transmitted from the control unit 110 to the server 200, and transmits the radio signal.
The positioning information receiving unit 130 of the mobile body 100 acquires positioning information indicating the current position and posture of the mobile body 100. For example, the positioning information receiving unit 130 can be a global positioning system (GPS) receiver. Each time the positioning information receiving unit 130 receives the positioning information, the positioning information receiving unit 130 outputs the acquired positioning information to the control unit 110 via the in-vehicle network.
The camera 140 of the mobile body 100 is an in-vehicle camera having a two-dimensional detector composed of an array of photoelectric conversion elements having sensitivity to visible light, such as a charge coupled device (CCD) or complementary metal-oxide semiconductor (CMOS), and an imaging optical system that forms an image of the region to be imaged on the two-dimensional detector. The camera 140 is mounted so as to face the outside of the mobile body 100. The camera 140 captures images of the surroundings of the mobile body 100 (for example, the area in front of the mobile body 100), such as on or around the road, at predetermined imaging cycles (for example, 1/30 second to 1/10 second), and generates images showing the surroundings of the mobile body 100. The camera 140 may be composed of a stereo camera, and may be configured to acquire the distance to each structure in the image based on the parallax of the left and right images. Each time the camera 140 generates an image, the camera 140 outputs the generated image to the control unit 110 via the in-vehicle network together with the imaging time.
The storage unit 150 of the mobile body 100 has, for example, a volatile semiconductor memory and a non-volatile semiconductor memory. Information such as internal parameters of the camera 140 is stored in the storage unit 150. The internal parameters include the mounting position of the camera 140 on the mobile body 100, the posture of the camera 140 with respect to the mobile body 100, the focal length of the camera 140, and the like.
The server 200, which is one mode of the abnormal behavior notification device, has a control unit 210, a communication I/F 220, and a storage unit 230. The control unit 210 of the server 200 is composed of a processor, as in the control unit 110 of the mobile body 100. The communication I/F 220 of the server 200 includes a communication module connected to the communication network 500. For example, the communication I/F 220 may include a communication module that complies with a wired local area network (LAN) standard. The server 200 is connected to the communication network 500 via the communication I/F 220. As in the storage unit 150 of the mobile body 100, the storage unit 230 of the server 200 has, for example, a volatile semiconductor memory and a non-volatile semiconductor memory.
The user terminal 300 has a control unit 310, a communication I/F 320, a storage unit 330, a display unit 340, an input unit 350, a camera 360, and a speaker 370. The control unit 310 is composed of a processor, as in the control unit 110 of the mobile body 100.
The communication I/F 320 of the user terminal 300 is configured in the same manner as the communication I/F 120 of the mobile body 100. As in the storage unit 150 of the mobile body 100, the storage unit 330 of the user terminal 300 has, for example, a volatile semiconductor memory and a non-volatile semiconductor memory. The display unit 340 of the user terminal 300 is composed of, for example, a liquid crystal display (LCD), and displays an alert when the user terminal 300 receives an alert from the server 200. The input unit 350 of the user terminal 300 is composed of, for example, a touch sensor, a mouse, a keyboard, and the like, and information according to the user's operation is input. When the input unit 350 is composed of a touch sensor, the display unit 340 and the input unit 350 may be configured as an integrated touch panel. The camera 360 of the user terminal 300 is configured in the same manner as the camera 140 of the mobile body 100, and has a two-dimensional detector composed of an array of photoelectric conversion elements, and an imaging optical system that forms an image of a region to be imaged and detected on the two-dimensional detector. The speaker 370 of the user terminal 300 issues an alert by voice when the user terminal 300 receives an alert from the server 200.
The image acquisition unit 110a of the control unit 110 acquires the image data generated by the camera 140. For example, the image acquisition unit 110a acquires an image generated by the camera 140 at predetermined time intervals. The image data is associated with the imaging time.
The transmission unit 110b of the control unit 110 performs a process of transmitting, to the server 200 via the communication I/F 120, the image acquired by the image acquisition unit 110a, the imaging time at which the image was captured, the positioning information received by the positioning information receiving unit 130 at the imaging time at which the image was captured, and the internal parameters of the camera 140.
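As a minimal illustrative sketch (not part of the disclosure; all field names and units are assumptions), the per-frame data transmitted by the transmission unit 110b could be bundled roughly as follows:

```python
# Hypothetical per-frame payload; field names and units are assumptions.
from dataclasses import dataclass, asdict
import json

@dataclass
class CameraIntrinsics:
    focal_length_px: float          # focal length of the camera 140
    mount_position_m: tuple         # mounting position on the mobile body 100 (x, y, z)
    mount_orientation_rpy: tuple    # posture of the camera 140 relative to the mobile body

@dataclass
class FramePayload:
    image_jpeg: bytes               # image generated by the camera 140
    imaging_time: str               # imaging time, e.g. an ISO 8601 string
    latitude: float                 # positioning information of the mobile body 100
    longitude: float
    heading_deg: float
    intrinsics: CameraIntrinsics    # internal parameters of the camera 140

def serialize_metadata(payload: FramePayload) -> bytes:
    """Serialize everything except the raw image bytes (sent separately)."""
    meta = asdict(payload)
    meta.pop("image_jpeg")
    return json.dumps(meta).encode("utf-8")
```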
The functional block of the control unit 210 of the server 200 shown in
The reception unit 210a of the control unit 210 receives, via the communication I/F 220, the image transmitted from the mobile body 100, the imaging time, the positioning information of the mobile body 100, and the internal parameters of the camera 140. Further, the reception unit 210a receives, via the communication I/F 220, the registration information related to the detection target transmitted from the user terminal 300.
The registration unit 210b of the control unit 210 registers the registration information related to the detection target received from the user terminal 300 in the storage unit 230. Specifically, the registration unit 210b registers the combination of the identification information for identifying the detection target and the normal behavior of the detection target in the storage unit 230. The identification information is information such as a vehicle number or a facial image of a person. When the detection target is a vehicle, the registration unit 210b registers the combination of the vehicle number and the normal behavior of the vehicle received from the user terminal 300. When the detection target is a person requiring long-term care or an elderly person, the registration unit 210b registers the combination of the facial image of the person and the normal behavior of the person received from the user terminal 300.
The normal behavior of the detection target is included in the registration information received from the user terminal 300. When the detection target is a vehicle, the registration unit 210b registers the normal behavior received from the user terminal 300, including the time zone in which the vehicle travels and the route in which the vehicle travels. When the detection target is a person requiring long-term care or an elderly person, the registration unit 210b registers the normal behavior received from the user terminal 300, including the time zone in which the person walks, the route, the presence or absence of a caregiver, and the like. Alternatively, the normal behavior of the detection target may be estimated by the server 200. In this case, the registration information received from the user terminal 300 does not have to include the normal behavior.
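As a minimal sketch of how such a combination might be stored (hypothetical structures, not the actual schema of the storage unit 230):

```python
# Hypothetical record layout for the storage unit 230; not the actual schema.
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class NormalBehavior:
    route: List[Tuple[float, float]]            # (lat, lon) waypoints of the usual movement route
    time_zone: Tuple[float, float]              # usual time zone as (start_hour, end_hour)
    accompanied: Optional[bool] = None          # for a person: whether an attendant is usually present

@dataclass
class DetectionTarget:
    user_terminal_id: str                       # user terminal 300 to which alerts are returned
    identification_info: object                 # vehicle number string or facial image bytes
    normal_behavior: Optional[NormalBehavior]   # None when the server 200 estimates it instead

registry = {}                                   # target ID -> DetectionTarget

def register(target_id: str, target: DetectionTarget) -> None:
    """Registration unit 210b: store the identification information with the normal behavior."""
    registry[target_id] = target
```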
Every time the reception unit 210a receives the image from the mobile body 100, the detection target determination unit 210c of the control unit 210 determines whether the detection target is shown in the image captured by the mobile body 100 while the mobile body 100 moves, based on the identification information for identifying the detection target registered by the registration unit 210b.
To make this determination, the detection target determination unit 210c can use a segmentation identifier that has been trained in advance to output, for each pixel of an input image and for each type of object that the pixel may represent, the certainty that the pixel represents an object of that type, and to identify the type with the maximum certainty as the object represented by that pixel. As such an identifier, the detection target determination unit 210c can use, for example, a deep neural network (DNN) having a convolutional neural network (CNN) architecture for segmentation, such as a fully convolutional network (FCN). Alternatively, the detection target determination unit 210c may use a segmentation identifier based on another machine learning method, such as a random forest or a support vector machine. In either case, the detection target determination unit 210c inputs an image into the segmentation identifier to identify the pixels in which a desired object appears in the image. The detection target determination unit 210c then sets a group of pixels in which the same type of object is shown as the region in which the object is represented.
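The per-pixel identification described above can be illustrated with the following sketch, assuming that a segmentation network (such as an FCN) has already produced a per-pixel certainty map; the class list and array shapes are assumptions:

```python
# Sketch of the per-pixel identification; `scores` is assumed to be the output
# of a segmentation network (H x W x C certainties), and CLASS_NAMES is an
# assumed label set.
import numpy as np

CLASS_NAMES = ["background", "vehicle", "person"]

def extract_region(scores: np.ndarray, target_class: str):
    """Return the mask and bounding box of pixels whose most-certain type is
    `target_class`, or None if no such pixel exists."""
    class_map = scores.argmax(axis=-1)               # type with maximum certainty per pixel
    mask = class_map == CLASS_NAMES.index(target_class)
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)                        # group pixels showing the same type of object
    bbox = (int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max()))
    return mask, bbox

# Example with random certainties for a small image.
result = extract_region(np.random.rand(4, 6, len(CLASS_NAMES)), "vehicle")
```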
As described above, the server 200 may estimate the normal behavior of the detection target. In this case, the normal behavior estimation unit 210d of the control unit 210 estimates the normal behavior of the detection target. From a plurality of images showing the detection target captured by the mobile body 100 in the past, the normal behavior estimation unit 210d identifies the position of the detection target when the images are captured, and estimates a predetermined movement route and a predetermined time zone in the normal behavior based on the specified position of the detection target and the imaging time of the images. When the detection target is a vehicle and the image includes a vehicle matching the vehicle number registered by the registration unit 210b based on the determination result of the detection target determination unit 210c, the normal behavior estimation unit 210d specifies the position of the vehicle with respect to the world coordinate system, based on the positioning information of the mobile body 100 when the image was captured, the position of the vehicle in the image (the position of the vehicle with respect to the camera coordinate system), and the internal parameters of the camera 140.
At this time, specifically, the normal behavior estimation unit 210d obtains a conversion formula that converts the camera coordinate system, which uses the position of the camera 140 of the mobile body 100 as the origin point and the optical axis direction of the camera 140 as one axial direction, into the world coordinate system. Such a conversion formula is represented by a combination of a rotation matrix representing rotation between the coordinate systems and a translation vector representing translation between the coordinate systems. The normal behavior estimation unit 210d converts the position of the vehicle included in the image shown by the camera coordinate system into coordinates in the world coordinate system according to the conversion formula. As a result, the position of the vehicle when the image is captured can be obtained. When the image includes a vehicle that matches the vehicle number registered by the registration unit 210b, the normal behavior estimation unit 210d may simply set the position of the mobile body 100 when the image is captured as the position of the vehicle.
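The conversion from the camera coordinate system to the world coordinate system can be written compactly as p_world = R p_cam + t. A minimal sketch with placeholder values for the rotation matrix R and the translation vector t (in practice they would be derived from the positioning information and the internal parameters of the camera 140):

```python
# Sketch of the camera-to-world conversion; R and t below are placeholders,
# not values derived from an actual mobile body 100.
import numpy as np

def camera_to_world(p_cam: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Apply the conversion formula p_world = R @ p_cam + t."""
    return R @ p_cam + t

# Placeholder example: camera yawed 90 degrees and located at (100, 200, 0) in
# the world coordinate system; the detected vehicle lies 15 m along the optical
# axis (assumed here to be the z axis of the camera coordinate system).
yaw = np.deg2rad(90.0)
R = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
              [np.sin(yaw),  np.cos(yaw), 0.0],
              [0.0,          0.0,         1.0]])
t = np.array([100.0, 200.0, 0.0])
p_vehicle_world = camera_to_world(np.array([0.0, 0.0, 15.0]), R, t)
```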
Then, based on the plurality of pieces of position information of the vehicle that is the detection target thus obtained and the imaging times of the images used to identify each piece of position information, the normal behavior estimation unit 210d estimates the normal route and normal time zone in which the vehicle travels as the normal behavior of the vehicle.
In the example shown in
More specifically, the normal behavior estimation unit 210d estimates the normal route and time zone in which the vehicle travels, for example, by rule-based estimation or estimation using machine learning.
In the rule-based estimation, for example, based on the probability that a point P indicating the specified position of the vehicle exists in each small region S, a set of the small regions S having an existence probability equal to or more than a predetermined value is estimated as the normal vehicle route. The existence probability is represented by, for example, the number of points P existing in each small region S within the period (for example, one month, half a year, one year, etc.) in which the position information (points P) of the vehicle is collected. Further, the time range corresponding to the points P included in the small regions S having an existence probability equal to or more than the predetermined value is estimated as the normal time zone.
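A minimal sketch of this rule-based estimation, with an assumed grid size for the small regions S and an assumed existence threshold (both values are illustrative, not prescribed by the disclosure):

```python
# Sketch of the rule-based estimation; GRID_DEG and MIN_COUNT are illustrative.
from collections import defaultdict

GRID_DEG = 0.001      # size of a small region S in degrees of latitude/longitude
MIN_COUNT = 5         # points P required for a small region to be part of the normal route

def estimate_normal_behavior(observations):
    """observations: list of (lat, lon, imaging_time) for the detection target,
    collected over the observation period. Returns (route_cells, time_zone)."""
    cells = defaultdict(list)                       # small region S -> imaging times of its points P
    for lat, lon, ts in observations:
        cell = (round(lat / GRID_DEG), round(lon / GRID_DEG))
        cells[cell].append(ts)
    route_cells = {c for c, times in cells.items() if len(times) >= MIN_COUNT}
    hours = [ts.hour + ts.minute / 60
             for c in route_cells for ts in cells[c]]
    time_zone = (min(hours), max(hours)) if hours else None
    return route_cells, time_zone
```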
As for the number of points P, it suffices to collect a predetermined number necessary for estimating the normal behavior by rule-based estimation or estimation using machine learning, and the predetermined number is, for example, 100. In the case of machine learning, learning with more than the predetermined number of points may be avoided in order to suppress the adverse effects of overlearning (overfitting).
Further, when an alert has been transmitted to the user terminal 300 and the cancel button of the user terminal 300 (described later) is pressed, causing the user terminal 300 to report that the alert is unnecessary, the normal behavior estimation unit 210d may perform learning while excluding the position and time of the detection target that triggered the alert.
Also when the detection target is a person certified as requiring long-term care or an elderly person, the normal behavior estimation unit 210d estimates the normal route and time zone in which the person moves as the normal behavior, using the same method as when the detection target is a vehicle. In particular, it may be difficult for the user to grasp the normal behavior of a person who may wander about, in which case the normal behavior cannot be transmitted from the user terminal 300. In some embodiments, therefore, the normal behavior is estimated on the server 200 side.
Further, when the detection target corresponding to the identification information is included in the image based on the determination result of the detection target determination unit 210c, the normal behavior estimation unit 210d may estimate the normal behavior of the detection target from the state of the detection target shown in the image. For example, when the detection target shown in
The normal behavior of the detection target estimated by the normal behavior estimation unit 210d as described above may be registered in the storage unit 230 by the registration unit 210b together with the identification information of the detection target. Alternatively, the normal behavior of the detection target estimated by the normal behavior estimation unit 210d may not be registered, and the configuration may be such that when images serving as the source of the estimation are acquired, the normal behavior is sequentially updated based on these images.
The abnormal behavior determination unit 210e of the control unit 210 determines whether the detection target exhibits an abnormal behavior based on the combination of the identification information for identifying the detection target registered by the registration unit 210b and the normal behavior of the detection target, and the images received by the reception unit 210a from the mobile body 100. With the normal behavior of the detection target being that the detection target moves in a predetermined movement route and a predetermined time zone, when the position of the detection target based on the position of the mobile body 100 when the image showing the detection target is captured is not included in the predetermined movement route, or the time at which the image showing the detection target is captured is not included in the predetermined time zone, the abnormal behavior determination unit 210e determines that the detection target exhibits an abnormal behavior that is different from the normal behavior.
More specifically, when the image includes the detection target corresponding to the identification information registered by the registration unit 210b based on the determination result of the detection target determination unit 210c, the abnormal behavior determination unit 210e specifies the position of the detection target with respect to the world coordinate system, based on the positioning information of the mobile body 100 when the image was captured, the position of the detection target in the image (the position of the detection target with respect to the camera coordinate system), and the internal parameters of the camera 140. Then, the abnormal behavior determination unit 210e compares the position of the detection target thus obtained and the time at which the image including the detection target was captured, with the route and time zone in the normal behavior of the detection target. When the position of the detection target is not included in the route of the normal behavior or when the time at which the image showing the detection target was captured is not included in the time zone of the normal behavior, the abnormal behavior determination unit 210e determines that the behavior of the detection target is abnormal.
It should be noted that the abnormal behavior determination unit 210e may determine that the behavior of the detection target is abnormal when the position of the detection target is not included in the route of the normal behavior and when the time at which the image showing the detection target was captured is not included in the time zone of the normal behavior.
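A minimal sketch of this determination, reusing the grid-cell route and (start hour, end hour) time-zone representation assumed in the estimation sketch above; the flag for the both-conditions variant mirrors the note in the preceding paragraph:

```python
# Sketch of the abnormality determination; GRID_DEG matches the estimation
# sketch above, and `require_both` corresponds to the variant noted above.
GRID_DEG = 0.001

def is_abnormal(lat, lon, imaging_time, route_cells, time_zone, require_both=False):
    """route_cells: grid cells forming the predetermined movement route.
    time_zone: (start_hour, end_hour) of the predetermined time zone."""
    cell = (round(lat / GRID_DEG), round(lon / GRID_DEG))
    off_route = cell not in route_cells
    hour = imaging_time.hour + imaging_time.minute / 60
    off_time = not (time_zone[0] <= hour <= time_zone[1])
    # Default: either condition justifies an alert; the variant requires both.
    return (off_route and off_time) if require_both else (off_route or off_time)
```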
For example, when the detection target is a vehicle owned by the user and the image includes a vehicle matching the vehicle number registered by the registration unit 210b based on the determination result of the detection target determination unit 210c, the abnormal behavior determination unit 210e, as in the normal behavior estimation unit 210d, specifies the position of the vehicle with respect to the world coordinate system, based on the positioning information of the mobile body 100 when the image was captured, the position of the vehicle in the image (the position of the vehicle with respect to the camera coordinate system), and the internal parameters of the camera 140. Then, the abnormal behavior determination unit 210e compares the position of the vehicle thus obtained and the time at which the image including the vehicle was captured, with the route and time zone in the normal behavior of the vehicle.
It should be noted that the abnormal behavior determination unit 210e may determine whether the position of the detection target is included in the route of the normal behavior based on a region obtained by expanding the width of the route of the normal behavior. For example, when the route of the normal behavior registered by the user is the route A1 shown in
Further, when the detection target corresponding to the identification information is included in the image and the state of the detection target shown in the image is different from the state of the normal behavior registered by the registration unit 210b, the abnormal behavior determination unit 210e determines that the behavior of the detection target is abnormal. For example, in the case where the detection target is a specific person and the normal behavior is that this specific person is accompanied by an attendant, when the specific person is shown in the image and the same other person is not shown in the image continuously for a predetermined time or more within a predetermined distance from the specific person, the abnormal behavior determination unit 210e determines that the specific person exhibits an abnormal behavior that is different from the normal behavior.
When the state of the normal behavior of the person certified as requiring long-term care 30 registered by the registration unit 210b is that the person certified as requiring long-term care 30 acts together with the caregiver 40 as shown in
In contrast, when the same other person (caregiver 40) exists continuously for a predetermined time or more within a predetermined distance from the person certified as requiring long-term care 30 as shown in
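A minimal sketch of the attendant check, assuming that the detection step has already reduced each image to the set of other persons detected within the predetermined distance of the specific person; the time threshold is illustrative:

```python
# Sketch of the attendant check; each frame is assumed to have been reduced to
# (timestamp_in_seconds, ids_of_other_persons_within_the_predetermined_distance).
PREDETERMINED_TIME_S = 60.0   # illustrative value

def acts_alone(frames) -> bool:
    """Return True (abnormal) if no single other person stays within the
    predetermined distance continuously for the predetermined time or more."""
    run_start = {}            # person id -> time at which the current continuous run began
    longest_run = 0.0
    for t, ids in frames:
        for pid in ids:
            run_start.setdefault(pid, t)
            longest_run = max(longest_run, t - run_start[pid])
        for pid in list(run_start):
            if pid not in ids:
                del run_start[pid]    # the run is broken once the person leaves the vicinity
    return longest_run < PREDETERMINED_TIME_S
```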
When the abnormal behavior determination unit 210e determines an abnormal behavior of the detection target, the alert transmission unit 210f of the control unit 210 transmits an alert to the user terminal 300 that has transmitted the registration information related to the detection target. The alert transmission unit 210f may transmit the latest position information of the detection target that has been determined to have exhibited the abnormal behavior together with the alert.
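A minimal sketch (hypothetical message format, not the actual protocol) of an alert carrying the latest position information of the detection target:

```python
# Hypothetical alert message format; field names are assumptions.
import json
from datetime import datetime, timezone

def build_alert(target_id: str, latest_position=None) -> bytes:
    msg = {
        "type": "abnormal_behavior_alert",
        "target_id": target_id,
        "issued_at": datetime.now(timezone.utc).isoformat(),
    }
    if latest_position is not None:                 # latest position may accompany the alert
        msg["latest_position"] = {"lat": latest_position[0], "lon": latest_position[1]}
    return json.dumps(msg).encode("utf-8")
```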
In the example of
When the user owning the user terminal 300 to which the alert is transmitted receives the alert, the user recognizes that the registered detection target exhibits an abnormal behavior that is different from usual. When the abnormal behavior is a behavior that the user does not know in advance, the user can take appropriate actions for the abnormal behavior. For example, when the detection target is a vehicle, it is conceivable that the vehicle has been stolen and the thief is driving the vehicle in a time zone or route different than usual. Therefore, the user who has received the alert can take appropriate measures such as calling the police.
On the other hand, the user owning the user terminal 300 to which the alert is transmitted can cancel the alert when the abnormal behavior is a behavior that the user knows in advance. For example, in the example of
The registration information acquisition unit 310a of the control unit 310 acquires the registration information related to the detection target, which is input by the user by operating the input unit 350. As described above, the registration information related to the detection target includes the identification information for identifying the detection target and the normal behavior of the detection target. As described above, the identification information is, for example, information on the license plate of the vehicle when the detection target is a vehicle, and is a facial image when the detection target is a person certified as requiring long-term care or an elderly person.
When the identification information is a facial image, the registration information acquisition unit 310a acquires, as the identification information, an image showing the face of a person obtained by the user by capturing an image of a person certified as requiring long-term care or an elderly person with the camera 360 of the user terminal 300, for example.
The registration information transmission unit 310b of the control unit 310 performs a process of transmitting, to the server 200 via the communication I/F 320, the registration information acquired by the registration information acquisition unit 310a.
Then, when the user presses a transmission button 342d, the registration information transmission unit 310b transmits the vehicle number and the normal behavior to the server 200. In the example shown in
The alert reception unit 310c of the control unit 310 receives, via the communication I/F 320, the alert transmitted from the server 200. When the latest position information of the detection target is transmitted from the server 200 together with the alert, the alert reception unit 310c receives the latest position information of the detection target.
The alert notification unit 310d of the control unit 310 performs a process for notifying the user of the alert received by the alert reception unit 310c. Specifically, the alert notification unit 310d performs a process of displaying the alert on the display unit 340 or a process of outputting the alert by voice from the speaker 370.
When the user who has been notified of the alert has expected the behavior of the vehicle and the displayed alert is therefore not actually required, the user can cancel the alert by pressing a button 342f for canceling the alert. When the alert is canceled, a message indicating the cancellation is sent to the server 200.
Consequently, the reception unit 210a of the control unit 210 of the server 200 receives the registration information related to the detection target transmitted from the user terminal 300 (step S20). Next, the registration unit 210b of the control unit 210 registers the registration information related to the detection target received from the user terminal 300 in the storage unit 230 (step S22). In this way, the identification information for identifying the detection target for which the user desires to detect the abnormal behavior and the normal behavior of the detection target are registered in the server 200.
When the camera 140 of the mobile body 100 captures images of the surroundings of the mobile body 100, the image acquisition unit 110a of the control unit 110 of the mobile body 100 acquires the image data generated by the camera 140 (step S10). Then, the transmission unit 110b of the control unit 110 transmits the image data acquired by the image acquisition unit 110a to the server 200 (step S12). The transmission unit 110b transmits information such as the imaging time at which the image was captured, the positioning information of the mobile body 100 when the image was captured, and the internal parameters of the camera 140 to the server 200 together with the image data.
The reception unit 210a of the control unit 210 of the server 200 receives the image data transmitted from the mobile body 100, and also receives the information such as the imaging time, the positioning information of the mobile body 100, and the internal parameters of the camera 140 (step S24). Next, the detection target determination unit 210c of the control unit 210 determines whether the detection target exists in the image received from the mobile body 100 (step S26), and when the detection target exists, the abnormal behavior determination unit 210e determines whether the behavior of the detection target is an abnormal behavior different than usual (step S28) based on the normal behavior of the detection target registered in the storage unit 230. When the behavior of the detection target is an abnormal behavior different than usual, the alert transmission unit 210f of the control unit 210 transmits an alert to the user terminal 300 (step S29).
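As a minimal sketch, steps S24 to S29 on the server 200 side can be tied together as follows; the detection, localization, and judgment callables stand in for the units described above and are assumptions, not the actual implementation:

```python
# Sketch of steps S24 to S29; detect/locate/judge/send_alert are stand-ins for
# the units described above and are passed in as callables.
def handle_frame(frame, registry, detect, locate, judge, send_alert):
    """frame: image plus metadata received from the mobile body 100 (step S24)."""
    for target_id, target in registry.items():
        hit = detect(frame, target.identification_info)       # step S26: is the target shown?
        if hit is None:
            continue
        lat, lon, ts = locate(frame, hit)                      # position of the target at imaging time
        if judge(lat, lon, ts, target.normal_behavior):        # step S28: abnormal behavior?
            send_alert(target.user_terminal_id,                # step S29: alert to the user terminal 300
                       {"target_id": target_id, "position": (lat, lon), "time": ts})
```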
The alert reception unit 310c of the control unit 310 of the user terminal 300 receives the alert transmitted from the server 200 (step S34). Subsequently, the alert notification unit 310d of the control unit 310 notifies the user of the alert received by the alert reception unit 310c (step S36). As a result, the alert is displayed on the display unit 340, and the alert is output by voice from the speaker 370.
In
First, the reception unit 210a of the control unit 210 of the server 200 receives the image data transmitted from the mobile body 100, the imaging time, the positioning information of the mobile body 100, and the internal parameters of the camera 140 (step S40). Next, the detection target determination unit 210c of the control unit 210 determines whether the detection target exists in the image received from the mobile body 100 (step S42). When the detection target exists in the image, the normal behavior estimation unit 210d specifies the position of the detection target based on the position of the detection target in the image and the position of the mobile body 100 when the image was captured (step S44), and accumulates the combination of the position of the detection target and the imaging time of the image in the storage unit 230 (step S46). On the other hand, when the detection target does not exist in the image in step S42, the process returns to step S40 and the processes of step S40 and after are performed again.
After step S46, the normal behavior estimation unit 210d determines whether a predetermined number of combinations of the position of the detection target and the time have been accumulated (step S48), and when the predetermined number has been accumulated, estimates the normal behavior of the detection target based on the accumulated predetermined number of positions of the detection target and times (step S50). When the predetermined number has not been accumulated in step S48, the process returns to step S40 and the processes of step S40 and after are performed again.
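A minimal sketch of this accumulate-and-estimate loop of steps S44 to S50, using the example value of 100 for the predetermined number:

```python
# Sketch of steps S44 to S50; 100 is the example value of the predetermined number.
PREDETERMINED_NUMBER = 100

accumulated = []    # (position, imaging_time) combinations; stands in for the storage unit 230

def accumulate_and_maybe_estimate(position, imaging_time, estimate):
    """Append one combination (step S46); once the predetermined number has been
    accumulated (step S48), run the estimation (step S50), e.g. the rule-based
    estimation sketched earlier."""
    accumulated.append((position, imaging_time))
    if len(accumulated) >= PREDETERMINED_NUMBER:
        return estimate(accumulated)
    return None
```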
Modification
When the user's schedule is registered in the storage unit 330 of the user terminal 300, the user terminal 300 may share the schedule information with the server 200. In this case, even when the abnormal behavior determination unit 210e determines that the detection target exhibits an abnormal behavior, the alert transmission unit 210f of the control unit 210 of the server 200 does not need to transmit an alert when the abnormal behavior is based on a behavior registered in the schedule. This suppresses the transmission of alerts that are unnecessary for the user.
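A minimal sketch of this schedule-based suppression; the schedule format is an assumption:

```python
# Sketch of schedule-based alert suppression; the schedule format is an assumption.
def should_send_alert(abnormal, imaging_time, position, schedule) -> bool:
    """schedule: list of (start, end, in_region) entries shared from the user
    terminal 300, where in_region(position) reports whether the position lies
    within the scheduled area."""
    if not abnormal:
        return False
    for start, end, in_region in schedule:
        if start <= imaging_time <= end and in_region(position):
            return False        # the behavior is registered in the schedule: no alert needed
    return True
```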
Further, when the detection target is a vehicle owned by the user, the position information of the user terminal 300 and the position information of the vehicle may be shared on the server 200 side, and when the user terminal 300 and the vehicle are not at the same position while the vehicle is moving, it may be determined that the vehicle has been stolen and an alert may be transmitted to the owner.
Furthermore, when the vehicle that is the detection target is equipped with a driver monitoring camera, the driver may be constantly identified by the driver monitoring camera. When a person who is not registered in advance is driving the vehicle, this information may be transmitted from the vehicle to the server 200, and an alert may be transmitted from the server 200 to the user terminal 300 of the user who owns the vehicle.
As described above, according to the present embodiment, the user can receive an alert when the detection target desired to be watched over exhibits an abnormal behavior different than usual, so that the user can detect the abnormal behavior at an early stage. Therefore, the user can take appropriate measures for the detection target that exhibits the abnormal behavior.
Foreign Patent Documents: JP 2020-061079 A (Japan, published Apr. 2020).