This application is based upon and claims the benefit of priority of the prior Japanese Patent Application No. 2019-183371, filed on Oct. 4, 2019, the entire contents of which are incorporated herein by reference.
The embodiment discussed herein is related to an information transmission system, an information transmission method, and an edge device.
For example, a business entity that provides a service to users (hereafter also referred to simply as a business entity) constructs and operates an information processing system for providing the service to the users. For example, the business entity constructs an information processing system that analyzes the action pattern of an analysis target from video images captured in each of a plurality of edge devices (hereafter also referred to simply as edges).
In such an information processing system, each edge device identifies an analysis target that appears in a captured video image and extracts, in advance, information indicating the identified analysis target (hereafter, the information is also referred to as a feature). When accepting a condition from the user, a management apparatus, which is to analyze the action pattern of an analysis target, acquires features extracted from video images that meet the condition, from the edge devices, and analyzes the action pattern of the analysis target based on the acquired features.
Thereby, the information processing system may analyze the action pattern of an analysis target while reducing the amount of communication between each edge device and the management apparatus (for example, see Japanese Laid-open Patent Publication Nos. 2003-324720, 11-015981, 2016-071639, and 2016-127563).
According to an aspect of the embodiments, an information transmission system includes a first edge device configured to detect a first feature corresponding to a first analysis target, and transmit the first feature; a second edge device configured to receive the first feature from the first edge device, detect a second feature corresponding to a second analysis target, determine whether the first feature and the second feature are similar, and transmit, when the first feature and the second feature are similar, first correspondence information indicating that the first analysis target and the second analysis target correspond to each other; and a server configured to receive the first correspondence information from the second edge device.
The object and advantages of the invention will be realized and attained by means of the elements and combinations particularly pointed out in the claims.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are not restrictive of the invention.
In the related art, the feature extracted by each edge device is information that may identify an individual. Thus, from the viewpoint of security and the like, the operator may not be able to transmit the features acquired from the edge devices to the management apparatus and may not be able to accumulate the features in the management apparatus. Therefore, the operator may not be able to associate, in the management apparatus, the features extracted by the different edge devices with each other, and may not be able to analyze the action pattern of the analysis target.
In one aspect, an object of the invention is to provide an information transmission system capable of associating features extracted by different edge devices without transmitting the features to the management apparatus.
[Configuration of Information Processing System]
A configuration of an information processing system 10 will now be described.
As illustrated in
In the example illustrated in
Like the edge device 2a, the edge device 2b detects an analysis target (hereafter also referred to as a second analysis target) from a video image captured by a camera, and extracts a feature (hereafter also referred to as a second feature) corresponding to the detected analysis target.
The edge device 2b then determines whether the first feature received from the edge device 2a and the second feature extracted by the edge device 2b are similar. As a result, if it is determined that the first feature and the second feature are similar, the edge device 2b generates information indicating that the first analysis target detected by the edge device 2a and the second analysis target detected by the edge device 2b correspond to each other (hereafter the information is also referred to as first correspondence information or simply as correspondence information), and transmits the generated information to the management apparatus 1. For example, the edge device 2b generates first correspondence information indicating that the first analysis target and the second analysis target are the same targets, and transmits the first correspondence information to the management apparatus 1.
For example, each edge device 2 generates first correspondence information indicating a combination of features that are features extracted in different edge devices 2 and are related to the same analysis target. Each edge device 2 transmits, instead of a feature extracted in the edge device 2, the generated first correspondence information to the management apparatus 1.
Thereby, the management apparatus 1 may identify a combination of features that are extracted in different edge devices 2 and are related to the same analysis target, without acquiring a feature extracted in each of the edge devices 2. Therefore, without accumulating features in the management apparatus 1, a business entity may perform association of features acquired in different edge devices 2 and may analyze the action pattern of an analysis target.
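By way of illustration only, the exchange described above may be sketched as the following minimal in-memory simulation. This sketch is not part of the embodiment; the class names, the string-valued features, and the equality-based similarity test are all assumptions made for the sake of the example.

```python
class Server:
    """Stands in for the management apparatus 1: it accumulates only
    correspondence information, never the features themselves."""

    def __init__(self):
        self.correspondence = []

    def receive(self, info):
        self.correspondence.append(info)


class EdgeDevice:
    """Stands in for an edge device 2: features stay local."""

    def __init__(self, name, server):
        self.name = name
        self.server = server
        self.features = {}  # identifier -> feature, kept locally

    def detect(self, identifier, feature):
        # Detect an analysis target and extract its feature.
        self.features[identifier] = feature

    def send_features(self, other):
        # Transmit locally extracted features to another edge device.
        for identifier, feature in self.features.items():
            other.receive_feature(self.name, identifier, feature)

    def receive_feature(self, sender, identifier, feature):
        # Compare the received feature with locally extracted ones and
        # transmit correspondence information to the server only on a match.
        for own_id, own_feature in self.features.items():
            if feature == own_feature:  # assumed similarity test
                self.server.receive((identifier, own_id))
```

For example, if edge device 2a detects feature "0101" and edge device 2b detects feature "0202" for the same target, transmitting 2a's features to 2b leaves the server holding only the pair ("0101", "0202"), not the feature values.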
[Hardware Configuration of Information Processing System]
A hardware configuration of the information processing system 10 will now be described.
First, the hardware configuration of the edge device 2 will be described.
As illustrated in
The storage medium 204 includes, for example, a program storage area (not illustrated) for storing a program 210 for performing a process for transmitting the first correspondence information from each edge device 2 to the management apparatus 1 (hereafter the process is also referred to as an information transmission process). The storage medium 204 also includes, for example, a storage unit 230 (hereafter also referred to as an information storage area 230) that stores information for use in performing the information transmission process. The storage medium 204 may be, for example, a hard disk drive (HDD) or a solid-state drive (SSD).
The CPU 201 executes the program 210 loaded from the storage medium 204 into the memory 202 to perform the information transmission process.
The communication device 203 wirelessly communicates with the access point 3, for example, by using wireless fidelity (Wi-Fi; registered trademark) or the like.
Next, the hardware configuration of the management apparatus 1 will be described.
As illustrated in
The storage medium 104 includes, for example, a program storage area (not illustrated) for storing a program 110 for performing the information transmission process. The storage medium 104 also includes, for example, a storage unit 130 (hereafter also referred to as an information storage area 130) that stores information for use in performing the information transmission process. The storage medium 104 may be, for example, an HDD or an SSD.
The CPU 101 executes the program 110 loaded from the storage medium 104 into the memory 102 to perform the information transmission process.
The communication device 103 communicates with the access point 3 in a wired manner via the network NW, for example.
[Functions of Information Processing System]
The functions of the information processing system 10 will now be described.
First, the block diagram of functions of the edge device 2 will be described.
As illustrated in
For example, as illustrated in
The video acquisition unit 211 acquires the video data 231 captured by a camera (not illustrated) mounted on each edge device 2 and stores the acquired video data 231 in the information storage area 230.
The information receiving unit 212 receives a target time at which the information transmission process is to be performed, from an operation terminal (not illustrated) in which the business entity performs various operations.
The information receiving unit 212 acquires another feature 232 transmitted from another edge device 2. For example, the information receiving unit 212 acquires another feature 232 corresponding to another analysis target detected by another edge device 2.
The information receiving unit 212 receives the preference information 133 transmitted from the management apparatus 1 and stores the received preference information 133 in the information storage area 230. The preference information 133 is information indicating, to each edge device 2, another edge device 2 to which the edge device 2 is to preferentially transmit the feature 232.
The time control unit 213 identifies, among one or more pieces of video data 231 stored in the information storage area 230, one or more pieces of video data 231 corresponding to the target time received by the information receiving unit 212.
The target detecting unit 214 detects an analysis target determined in advance, by using the one or more pieces of video data 231 identified by the time control unit 213. For example, the target detecting unit 214 determines whether an analysis target appears in the one or more pieces of video data 231 identified by the time control unit 213.
The feature extracting unit 215 extracts the feature 232 corresponding to an analysis target detected by the target detecting unit 214. For example, the feature extracting unit 215 analyzes, among the one or more pieces of video data 231 identified by the time control unit 213, the video data 231 in which an analysis target detected by the target detecting unit 214 appears, thereby extracting the feature 232 corresponding to the analysis target.
The information transmitting unit 216 transmits the feature 232 extracted by the feature extracting unit 215, to another edge device 2.
The similarity determination unit 217 compares another feature 232 received by the information receiving unit 212 with the feature 232 extracted by the feature extracting unit 215. The similarity determination unit 217 determines whether the similarity relationship between the other feature 232 received by the information receiving unit 212 and the feature 232 extracted by the feature extracting unit 215 satisfies a predetermined condition. For example, the similarity determination unit 217 determines whether each of the other feature 232 received by the information receiving unit 212 and the feature 232 extracted by the feature extracting unit 215 is the feature 232 corresponding to the same analysis target (for example, the same person).
When the similarity determination unit 217 determines that the similarity relationship satisfies the predetermined condition, the information generating unit 218 generates the first correspondence information 233 indicating that the other analysis target detected by the other edge device 2 corresponds to the analysis target detected by the target detecting unit 214. In this case, the information transmitting unit 216 transmits the first correspondence information 233 generated by the information generating unit 218, to the management apparatus 1.
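By way of illustration only, the processing of the similarity determination unit 217 and the information generating unit 218 may be sketched as follows. The vector representation of the feature 232, the cosine-similarity measure, and the threshold value are assumptions of this sketch and are not prescribed by the embodiment.

```python
import math

SIMILARITY_THRESHOLD = 0.9  # assumed predetermined condition


def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)


def make_correspondence(own_id, own_vec, other_id, other_vec):
    """Return first correspondence information when the similarity
    relationship between the two features satisfies the condition,
    otherwise None (in which case nothing is sent to the server)."""
    if cosine_similarity(own_vec, other_vec) >= SIMILARITY_THRESHOLD:
        return {"feature_1": other_id, "feature_2": own_id}
    return None
```

Only the pairing of identifiers crosses the network; the feature vectors themselves remain on the edge devices.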
Next, a block diagram of functions of the management apparatus 1 will be described.
In the management apparatus 1, as illustrated in
For example, as illustrated in
The information receiving unit 111 receives the respective pieces of first correspondence information 233 transmitted from the edge devices 2 and stores the received respective pieces of first correspondence information 233 in the information storage area 130.
From the pieces of first correspondence information 233 stored in the information storage area 130, the information generating unit 112 generates the second correspondence information 131 indicating the correspondence relationships among those pieces of first correspondence information 233.
The number-of-times tallying unit 113 tallies the numbers of times that the first correspondence information 233 is transmitted from the edge devices 2. In this case, the information generating unit 112 generates the number-of-times information 132 indicating the numbers of times of transmission tallied by the number-of-times tallying unit 113.
The edge identification unit 114 references the number-of-times information 132 generated by the information generating unit 112 and identifies, for each edge device 2, the other edge device 2 to which that edge device 2 is to preferentially transmit the feature 232. In this case, the information generating unit 112 generates the preference information 133 indicating the edge device 2 identified by the edge identification unit 114.
The information transmitting unit 115 transmits the preference information 133 generated by the information generating unit 112, to each edge device 2.
[Outline of First Embodiment]
The outline of a first embodiment will now be described.
As illustrated in
Meanwhile, as illustrated in
Thereafter, as illustrated in
As a result, when it is determined in S22 that the features 232 are similar (YES in S23), the second edge device 2 transmits the first correspondence information 233 indicating that the first analysis target detected by the first edge device 2 in S1 and the second analysis target detected by the second edge device 2 in S11 correspond to each other, to the management apparatus 1 (S24).
When it is determined in S22 that the features 232 are not similar (NO in S23), the second edge device 2 does not perform S24.
Thereby, the management apparatus 1 may identify a combination of the features 232 that are extracted in different edge devices 2 and are related to the same analysis target, without acquiring the feature 232 extracted in each of the edge devices 2. Therefore, without accumulating the features 232 in the management apparatus 1, the business entity may perform association of the features 232 acquired in different edge devices 2 and may analyze the action pattern of an analysis target.
[Specific Example of First Embodiment]
A specific example in the first embodiment will now be described.
As illustrated in
In contrast, as illustrated in
Thereafter, as illustrated in
This enables the management apparatus 1 to determine whether the analysis target OB1 detected by the edge device 2a and the analysis target OB2 detected by the edge device 2b are the same analysis target, by referencing the first correspondence information 233 transmitted from the edge device 2b. Therefore, without acquiring the feature 232 from each of the edge device 2a and the edge device 2b, the management apparatus 1 may analyze the action pattern of each analysis target.
[Details of First Embodiment]
The first embodiment will now be described in detail.
[Information Transmission Process Performed in Each Edge Device]
First, the information transmission process performed in each edge device 2 will be described.
As illustrated in
When any analysis target determined in advance is detected (YES in S111), the feature extracting unit 215 of the edge device 2a extracts the feature 232 corresponding to the analysis target detected in S111, from the video data 231 stored in the information storage area 230 (S112).
Thereafter, the feature extracting unit 215 of the edge device 2a stores the feature 232 extracted in S112, in the information storage area 230.
Subsequently, the information transmitting unit 216 of the edge device 2a references the preference information 133 stored in the information storage area 230 and determines a certain number of edge devices 2 to which the feature 232 extracted in S112 is to be transmitted (S114).
The certain number as used herein is a number greater than or equal to one and, for example, may be determined in advance by the business entity. For example, the business entity may determine the certain number within a range where the processing burden on each edge device 2 and the traffic volume between the edge devices 2 do not exceed predetermined thresholds. The details of S114 will be described later.
The information transmitting unit 216 transmits the features 232 (including the feature 232 extracted in S112) stored in the information storage area 230 to the certain number of edge devices 2 determined in S114 (S115).
For example, in this case, the information transmitting unit 216 transmits not only the feature 232 extracted in S112 but all the features 232 stored in the information storage area 230.
As illustrated in
When the feature 232 from another edge device 2 is received (YES in S121), the similarity determination unit 217 of the edge device 2a determines whether the feature 232 received in S121 and each of the features 232 stored in the information storage area 230 are similar (S122).
For example, the similarity determination unit 217 determines whether the feature 232 received in S121 is similar to any of the features 232 previously detected by the target detecting unit 214 or any of the features 232 previously received by the information receiving unit 212.
As a result, when it is determined that the features 232 are similar (YES in S123), the information generating unit 218 of the edge device 2a generates the first correspondence information 233 indicating a combination of the features 232 determined in S122 to be similar (S124). Specific examples of the first correspondence information 233 will be described below.
[Specific Examples of First Correspondence Information]
The first correspondence information 233 depicted in each of
For example, in the piece of information with “item number” of “1” in the first correspondence information 233 depicted in
In the piece of information with “item number” of “2” in the first correspondence information 233 depicted in
In the piece of information with “item number” of “3” in the first correspondence information 233 depicted in
In the piece of information with “item number” of “1” in the first correspondence information 233 depicted in
In the piece of information with “item number” of “2” in the first correspondence information 233 depicted in
In the piece of information with “item number” of “3” in the first correspondence information 233 depicted in
With reference now to
In S123, when it is determined that the features 232 are not similar (NO in S123), the edge device 2a does not perform S124 and S125.
[Information Transmission Process Performed in Management Apparatus]
Next, the information transmission process performed in the management apparatus 1 will be described.
As illustrated in
When the first correspondence information 233 is received from any of the edge devices 2 (YES in S131), the information generating unit 112 determines whether the first correspondence information 233 received in S131 corresponds to each of the pieces of first correspondence information 233 stored in the information storage area 130 (S132).
As a result, when it is determined that the received first correspondence information 233 corresponds to any of the pieces of stored first correspondence information 233 (YES in S133), the information generating unit 112 of the management apparatus 1 associates together the features 232 included in a combination of the pieces of first correspondence information 233 that are determined in S133 to correspond to each other, thereby generating a piece of the second correspondence information 131 (S134). A specific example of the second correspondence information 131 will be described below.
[Specific Example of Second Correspondence Information]
The second correspondence information 131 depicted in
For example, in the piece of information with “item number” of “1” of the first correspondence information 233 described with reference to
For example, these pieces of information indicate that “0101” and “0202” are the features 232 generated from the same analysis target and that “0202” and “0402” are the features 232 generated from the same analysis target. Therefore, by referencing these pieces of information, the management apparatus 1 may determine that “0101” and “0402” are also the features 232 generated from the same analysis target.
Accordingly, as illustrated in
Similarly, in the piece of information with “item number” of “3” of the first correspondence information 233 described with reference to
Therefore, as illustrated in
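By way of illustration only, the transitive association that yields the second correspondence information 131 may be sketched with a union-find structure over feature identifiers. The data structure and function names are assumptions of this sketch; the embodiment does not prescribe them.

```python
def build_groups(pairs):
    """Merge pairs of feature identifiers transitively, so that
    features reachable through any chain of first correspondence
    information end up in the same group, corresponding to a piece
    of second correspondence information."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    for a, b in pairs:
        union(a, b)

    groups = {}
    for x in parent:
        groups.setdefault(find(x), set()).add(x)
    return sorted(sorted(g) for g in groups.values())
```

For example, the pairs ("0101", "0202") and ("0202", "0402") are merged into a single group, reflecting the determination above that "0101" and "0402" are also the features 232 generated from the same analysis target.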
With reference now to
The number-of-times tallying unit 113 of the management apparatus 1 adds one to the number of times corresponding to a combination of the edge devices 2 from which the features 232 indicated by the first correspondence information 233 received in S131 are extracted, among the numbers of times included in the number-of-times information 132 stored in the information storage area 130 (S136). Specific examples of the number-of-times information 132 will be described below.
[Specific Examples of Number-Of-Times Information]
In the number-of-times information 132 depicted in
For example, “−”, “110 (times)”, “205 (times)”, “2 (times)”, and so on are stored in the boxes in the column with the header “2a” among the headers “2a”, “2b”, “2c”, “2d”, and so on arranged in the horizontal direction. For example, these boxes indicate that, for the first correspondence information 233 transmitted from the edge device 2a, the number of times of transmission of the first correspondence information 233 corresponding to a combination of the edge device 2a and the edge device 2b is “110 (times)”, the number of times of transmission of the first correspondence information 233 corresponding to a combination of the edge device 2a and the edge device 2c is “205 (times)”, and the number of times of transmission of the first correspondence information 233 corresponding to a combination of the edge device 2a and the edge device 2d is “2 (times)”.
For example, “121 (times)”, “−”, “55 (times)”, “300 (times)”, and so on are stored in the boxes in the column with the header “2b” among the headers “2a”, “2b”, “2c”, “2d”, and so on arranged in the horizontal direction. For example, these boxes indicate that, for the first correspondence information 233 transmitted from the edge device 2b, the number of times of transmission of the first correspondence information 233 corresponding to a combination of the edge device 2b and the edge device 2a is “121 (times)”, the number of times of transmission of the first correspondence information 233 corresponding to a combination of the edge device 2b and the edge device 2c is “55 (times)”, and the number of times of transmission of the first correspondence information 233 corresponding to a combination of the edge device 2b and the edge device 2d is “300 (times)”. Description of the other information included in
For example, after the state depicted in
With reference now to
For example, in the case where, in the number-of-times information 132 described with reference to
The edge identification unit 114 may, for example, start execution of S141 when the total number of times that the first correspondence information 233 is transmitted from the edge devices 2 exceeds a threshold. For example, the edge identification unit 114 may generate the preference information 133 only when the number of times that the first correspondence information 233 is transmitted from the edge devices 2 exceeds the number of times determined in advance as the number of times for generating the preference information 133 having a high accuracy.
The information generating unit 112 of the management apparatus 1 generates the preference information 133 indicating combinations of the edge devices 2 identified in S141 (S142). A specific example of the preference information 133 will be described below.
[Specific Example of Preference Information]
The preference information 133 depicted in
For example, in S141, when a combination of the edge device 2b and the edge device 2d is identified, the information generating unit 112 stores “2b” and “2d” in “edge device (1)” and “edge device (2)”, respectively, of the piece of information with the “item number” of “1”. Description of the other piece of information included in
In S141, the edge identification unit 114 may calculate, for each edge device 2, the transmission ratio of transmission from the edge device 2 to the other edge devices 2, in accordance with the numbers of times included in the number-of-times information 132 stored in the information storage area 130. In S142, the information generating unit 112 may generate, as the preference information 133, information indicating the transmission ratio calculated by the edge identification unit 114.
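By way of illustration only, the tallying of S136 and the identification of S141 to S142 may be sketched as follows. The directed counter mirrors the number-of-times information 132, in which the count for transmissions from the edge device 2a may differ from the count for transmissions from the edge device 2b; summing both directions per combination before applying the threshold is an assumption of this sketch.

```python
from collections import Counter


def tally(counter, transmitting_edge, partner_edge):
    """S136: add one to the number of times corresponding to the
    combination of edge devices from which the features indicated
    by the received first correspondence information were extracted."""
    counter[(transmitting_edge, partner_edge)] += 1


def preference_info(counter, threshold):
    """S141-S142: identify combinations of edge devices whose total
    number of times of transmission (both directions summed, as an
    assumption of this sketch) exceeds the threshold, and return
    them as preference information."""
    totals = Counter()
    for (a, b), n in counter.items():
        totals[tuple(sorted((a, b)))] += n
    return sorted(pair for pair, n in totals.items() if n > threshold)
```

With the counts of the example above (110 and 121 times for the combination of 2a and 2b, 300 times for 2b and 2d, 2 times for 2a and 2d) and a threshold of 200, the combinations (2a, 2b) and (2b, 2d) would be identified.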
With reference now to
[Details of S114]
As illustrated in
When it is determined that the preference information 133 is not stored in the information storage area 230 (NO in S152), the information transmitting unit 216 determines the edge devices 2 to which the feature 232 is to be transmitted, so that each edge device 2 has a uniform probability that the edge device 2 serves as a transmission destination of the feature 232 (S154). For example, in this case, the information transmitting unit 216 determines the edge devices 2 to which the feature 232 is to be transmitted, so as to equalize the number of times that each edge device 2 receives the feature 232 from another edge device 2.
When it is determined that the preference information 133 is stored in the information storage area 230 (YES in S152), the information transmitting unit 216 determines the edge devices 2 to which the feature 232 is to be transmitted, so that a combination of the edge devices 2 corresponding to the preference information 133 stored in the information storage area 230 serves as the source and destination of transmission of the feature 232 at a higher probability than another combination of the edge devices 2 (S153).
For example, the preference information 133 described with reference to
Therefore, in S153, for example, the information transmitting unit 216 determines the edge devices 2 to which the feature 232 is to be transmitted, so that the probability that a combination of the source and destination of transmission of the feature 232 will be the edge device 2b and the edge device 2d and the probability that a combination of the source and destination of transmission of the feature 232 will be the edge device 2a and the edge device 2c are high.
Thereby, the management apparatus 1 may perform control so that, between the edge devices 2 in which the feature 232 corresponding to the same analysis target is highly likely to be detected, the feature 232 is transmitted and received at a higher frequency. Therefore, the management apparatus 1 may generate the second correspondence information 131 more efficiently and may perform association of analysis targets detected in different edge devices 2 more efficiently.
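By way of illustration only, the destination determination of S153 and S154 may be sketched as a weighted random choice. The weight values are assumptions of this sketch; the embodiment only requires that preferred combinations be chosen with a higher probability than others, and that the choice be uniform when no preference information 133 is stored.

```python
import random


def choose_destinations(own_edge, all_edges, preferred_pairs, k, rng=random):
    """Choose k destination edge devices.  Edges that form a preferred
    combination with this edge are given a higher selection weight
    (S153); with no preference information every other edge is
    equally likely (S154)."""
    candidates = [e for e in all_edges if e != own_edge]
    preferred = {
        b if a == own_edge else a
        for a, b in preferred_pairs
        if own_edge in (a, b)
    }
    weights = [3.0 if e in preferred else 1.0 for e in candidates]
    chosen = set()
    while len(chosen) < min(k, len(candidates)):
        chosen.add(rng.choices(candidates, weights=weights)[0])
    return sorted(chosen)
```

For example, with the preferred combination (2b, 2d), the edge device 2b selects the edge device 2d as a transmission destination more often than the edge devices 2a and 2c.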
[Specific Examples of Information Transmission Process]
Specific examples of the information transmission process will now be described.
(Specific Examples of Edge Device 2a)
First, specific examples of the edge device 2a will be described.
In the case (where t=1) illustrated in
In the case (where t=2) illustrated in
In the case (where t=3) illustrated in
In the case (where t=4) illustrated in
In the case (where t=4) illustrated in
(Specific Examples of Edge Device 2b)
Next, specific examples of the edge device 2b will be described.
In the case (where t=1) illustrated in
In the case (where t=2) illustrated in
In the case (where t=2) illustrated in
In the case (where t=3) illustrated in
In the case (where t=4) illustrated in
(Specific Examples of Edge Device 2c)
Next, specific examples of the edge device 2c will be described.
In the case (where t=0) illustrated in
In the case (where t=1) illustrated in
In the case (where t=3) illustrated in
In the case (where t=3) illustrated in
In the case (where t=4) illustrated in
For example, in the specific examples described above, at predetermined intervals, each edge device 2 compares the extracted features 232 and the features 232 stored in the information storage area 230 with the features 232 received from other edge devices 2 in a round-robin way.
This enables each edge device 2 to identify a combination of the features 232 for which it may be determined that these features 232 have been extracted from the same analysis target.
This also enables the management apparatus 1 to generate the second correspondence information 131 based on the first correspondence information 233 transmitted from each edge device 2.
For example, in the specific examples described above, in the case (where t=2) illustrated in
In this way, the first edge device 2 in the present embodiment transmits the first feature 232 corresponding to the first analysis target detected by the first edge device 2, to the second edge device 2. The second edge device 2 determines whether the similarity relationship between the second feature 232 corresponding to the second analysis target detected by the second edge device 2 and the first feature 232 received from the first edge device 2 satisfies a condition.
When determining that the similarity relationship satisfies the predetermined condition, the second edge device 2 transmits the first correspondence information 233 indicating that the first analysis target and the second analysis target correspond to each other, to the management apparatus 1.
For example, each edge device 2 transmits only the first correspondence information 233 indicating a combination of features that are features 232 extracted in different edge devices 2 and are related to the same analysis target, to the management apparatus 1.
Thereby, the management apparatus 1 may identify a combination of the features 232 that are extracted in different edge devices 2 and are related to the same analysis target, without acquiring the feature 232 from each of the edge devices 2. Therefore, without accumulating the features 232 in the management apparatus 1, the business entity may perform association of the features 232 acquired in different edge devices 2. Accordingly, the business entity may analyze the action pattern of an analysis target.
Without depending on the position of each edge device 2, the management apparatus 1 may perform association of the features 232 acquired by different edge devices 2. Therefore, even when each edge device 2 is a moving device (for example, an onboard device), the management apparatus 1 may analyze the action pattern of an analysis target.
For example, in the case where there is an analysis target captured for a long time period by a camera of the same edge device 2 (for example, an analysis target that has not moved for a long time period), the feature 232 corresponding to the same analysis target is detected a plurality of successive times in each edge device 2. In such a case, in each edge device 2, for the sake of simplicity of the processing involved in comparison of the features 232, the features 232 detected a plurality of successive times are desirably provided with the same identification information.
Each edge device 2 may compare the newly extracted feature 232 with the feature 232 stored in the information storage area 230, as desired. When identifying a combination of the similar features 232 by this comparison, each edge device 2 may determine that there has been an analysis target captured for a long time period by a camera of the same edge device 2, and may provide each of the features 232 corresponding to the identified combination with the same identification information.
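By way of illustration only, the reuse of identification information for successively detected features may be sketched as follows. The function name, the dictionary representation of stored features, and the injected similarity predicate are assumptions of this sketch.

```python
def assign_identifier(new_feature, stored, next_id, is_similar):
    """Give a newly extracted feature the same identification
    information as a stored feature when the two are similar,
    i.e. when the same analysis target has been captured a
    plurality of successive times by the same edge device.
    `stored` maps identifier -> feature; `is_similar` stands in
    for the similarity determination of the edge device."""
    for ident, feature in stored.items():
        if is_similar(new_feature, feature):
            return ident          # reuse the existing identifier
    stored[next_id] = new_feature  # first appearance of this target
    return next_id
```

Providing the same identifier to successive detections keeps the comparison workload down, since the edge device need not treat each successive capture of a stationary target as a new feature.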
All examples and conditional language provided herein are intended for the pedagogical purposes of aiding the reader in understanding the invention and the concepts contributed by the inventor to further the art, and are not to be construed as limitations to such specifically recited examples and conditions, nor does the organization of such examples in the specification relate to a showing of the superiority and inferiority of the invention. Although one or more embodiments of the present invention have been described in detail, it should be understood that the various changes, substitutions, and alterations could be made hereto without departing from the spirit and scope of the invention.
Number | Date | Country | Kind
---|---|---|---
2019-183371 | Oct 2019 | JP | national