This application claims the benefit of Taiwan application Serial No. 108109409, filed Mar. 19, 2019, the disclosure of which is incorporated by reference herein in its entirety.
The disclosure relates in general to a person re-identification method, a person re-identification system and an image screening method.
Machine learning and artificial intelligence have gradually become important in human life. Governments and industries of numerous countries have increased research and development budgets for the fields of machine learning and artificial intelligence, with the aim of improving the quality of human life and reducing inconvenience in daily life. The person re-identification technology can be applied to anti-terrorism and surveillance in public spaces (e.g., stations, malls and department stores) or private environments and spaces. In a mall, the person re-identification technology can detect and track customers entering the mall. With the integration of other sensors, items picked up by customers can be sensed in real time, and customers can then be allowed to exit the mall and complete a payment without being required to pay at a cashier. In an airport or a department store, passengers who have not yet boarded can be quickly located before an airplane takes off, and separated parents or children can be quickly found.
When the person re-identification technology performs recognition on an unknown person, the unknown person is compared with an identified person image set by using humanoid detection, face recognition and object tracking technologies. Because different features can be formed by clothing styles or accessories carried by persons viewed at different angles, and interference can be generated by different lighting, distances and backgrounds among different cameras, person identification frequently fails to obtain accurate results.
The disclosure is directed to a person re-identification method, a person re-identification system and an image screening method.
According to one embodiment of the disclosure, a person re-identification method is provided, including the following steps. A plurality of identified images of an identified person are analyzed to obtain an interrelated array, which records a plurality of degrees of association among the identified images. The interrelated array is converted to a directed graph, which has at least one connected group. A representative path of each connected group is obtained. According to the representative paths, a part of the identified images are outputted as at least one representative image set for performing a person re-identification process on an image to be identified.
According to another embodiment of the disclosure, a person re-identification system is provided, including an association calculation unit, a directed graph establishment unit, a path analysis unit and a selection unit. The association calculation unit analyzes a plurality of identified images of an identified person to obtain an interrelated array, which records a plurality of degrees of association among the identified images. The directed graph establishment unit converts the interrelated array to a directed graph, which has at least one connected group. The path analysis unit obtains a representative path of each connected group. The selection unit outputs, according to the representative paths, a part of the identified images as a representative image set for performing a person re-identification process on an image to be identified.
According to yet another embodiment of the disclosure, an image screening method is provided, including the following steps. A plurality of identified images are analyzed to obtain an interrelated array, which records a plurality of degrees of association among the identified images. The interrelated array is converted to a directed graph, which has at least one connected group. A representative path of each connected group is obtained. According to the representative paths, a part of the identified images are outputted as at least one representative image set.
To better understand the above and other aspects of the disclosure, embodiments are described in detail with the accompanying drawings below.
In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the disclosed embodiments. It will be apparent, however, that one or more embodiments may be practiced without these specific details. In other instances, well-known structures and devices are schematically shown in order to simplify the drawing.
Various embodiments are described to explain how the disclosure uses an appropriate image screening method to improve accuracy and operation speed of the person re-identification technology. However, the contents disclosed in the embodiments are not to be construed as limitations to the claimed scope of the disclosure.
In the disclosure, the association calculation unit 100, the directed graph establishment unit 200, the path analysis unit 300 and the selection unit 400 perform an “image screening process”. In the image screening process, a most symbolic representative image set IMS11 is selected from the identified images. The accuracy of the person re-identification technology can be improved by using the representative image set IMS11.
The input unit 600, the algorithm unit 700, the classification unit 800 and the output unit 900 perform a “person re-identification process”. In the “person re-identification process”, by using an image IMb to be identified of an unidentified person, it is determined whether the unidentified person is similar to a particular identified person, or to which identified person the unidentified person is most similar. Operation details of the above components are described in detail with an accompanying flowchart below.
The image screening process of steps S100 to S400 is described below. In step S100, the association calculation unit 100 analyzes multiple identified images IMa of an identified person to obtain an interrelated array MXr.
The range of the degree of association WT is 0 to 1, where 0 represents no association at all and 1 represents full association. In one embodiment, the degree of association WT can be calculated by the algorithm unit 700 by using a convolutional neural network (CNN) algorithm. For example, the #3 identified image IMa is a side view, and the #5 identified image IMa is a rear view, and the #8 identified image IMa is a side view; the degree of association WT of the #3 identified image IMa with respect to the #5 identified image IMa is 0.001, which is low (one is a side view and one is a rear view); the degree of association WT of the #3 identified image IMa with respect to the #8 identified image IMa is 0.999, which is high (both are side views).
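The calculation of the degree of association WT can be sketched as follows. The use of cosine similarity over CNN feature vectors (rescaled to [0, 1]) and all function names are illustrative assumptions; the disclosure only fixes that the algorithm unit 700 uses a convolutional neural network and that WT ranges from 0 to 1.

```python
import numpy as np

def degree_of_association(feat_i, feat_j):
    """Degree of association WT in [0, 1], sketched here as the cosine
    similarity of two CNN feature vectors rescaled to [0, 1].
    Assumes non-zero feature vectors; the exact formula is not fixed
    by the disclosure."""
    a = np.asarray(feat_i, dtype=float)
    b = np.asarray(feat_j, dtype=float)
    cos = float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return (cos + 1.0) / 2.0  # map [-1, 1] onto [0, 1]

def interrelated_array(features):
    """Build the interrelated array MXr: entry (i, j) is the degree of
    association WT of identified image i with respect to image j."""
    n = len(features)
    mxr = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            mxr[i, j] = degree_of_association(features[i], features[j])
    return mxr
```

With this sketch, two side-view images whose feature vectors nearly coincide yield a WT close to 1, while a side view and a rear view with near-orthogonal features yield a WT near 0.5 or below, mirroring the #3/#8 versus #3/#5 example above.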
In step S200, the directed graph establishment unit 200 compares each degree of association WT in the interrelated array MXr against a threshold, and converts each degree of association WT to a binary value BC, so as to obtain an adjacency array MXc.
A binary value BC of “0” represents a low degree of association WT, and a binary value BC of “1” represents a high degree of association WT. In other embodiments, the threshold can also be 0.8, 0.85, 0.9 or 0.95. The number of binary values BC of “1” in the adjacency array MXc decreases as the threshold gets higher, and increases as the threshold gets lower.
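A minimal sketch of this thresholding step, assuming the interrelated array is held as a NumPy array and that self-association on the diagonal is ignored (an assumption the text does not fix):

```python
import numpy as np

def adjacency_array(mxr, threshold=0.9):
    """Convert the interrelated array MXr into the binary adjacency array
    MXc: a degree of association WT at or above the threshold becomes a
    binary value BC of 1 (high association), anything below becomes 0.
    The text names 0.8, 0.85, 0.9 and 0.95 as candidate thresholds."""
    mxc = (np.asarray(mxr, dtype=float) >= threshold).astype(int)
    np.fill_diagonal(mxc, 0)  # ignore self-association (assumption)
    return mxc
```

Raising the threshold can only turn 1s into 0s, which is the monotonic behavior described above.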
In step S220, the directed graph establishment unit 200 converts the adjacency array MXc to the directed graph DG1. In the directed graph DG1, each identified image IMa corresponds to one node, and the directed graph DG1 has the connected group CG11.
In step S300, the path analysis unit 300 analyzes the directed graph DG1 to obtain a diameter list dL of the connected group CG11.
In the connected group CG11, there may be multiple paths between an ith node and a jth node, among which the shortest is d(i, j). The length of a path is calculated as the number of connecting edges (the degree of association WT is not considered). As shown in equation (1) below, among the shortest paths d(i, j) from the ith node to all other nodes, the longest is defined as the eccentric distance ecc(i) of the ith node.
ecc(i) = max_{j ∈ CG11} d(i, j)   (1)
As shown in equation (2) below, the largest eccentric distance ecc(i) among all nodes in the connected group CG11 is defined as the diameter Dia(CG11).
Dia(CG11) = max_{i ∈ CG11} ecc(i) = max_{i, j ∈ CG11} d(i, j)   (2)
That is to say, the diameter Dia(CG11) is the longest of the shortest paths connecting any two nodes in the connected group CG11. Multiple node pairs may attain the diameter Dia(CG11), and each such pair is recorded in the diameter list dL.
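Equations (1) and (2) can be sketched in code as follows; because a path length counts connecting edges only, breadth-first search suffices. The adjacency-list representation and the recording of diameter-attaining node pairs into the diameter list dL are illustrative assumptions:

```python
from collections import deque

def shortest_lengths(adj, src):
    """BFS path lengths d(src, j) counting connecting edges, per the text
    (the degree of association WT is not used as an edge weight)."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def diameter_list(adj, group):
    """Eccentric distance ecc(i) = max_j d(i, j) per equation (1); the
    diameter Dia is the largest ecc(i) per equation (2); the diameter
    list dL collects every node pair (i, j) attaining Dia.
    Assumes the nodes of `group` are mutually reachable."""
    ecc, dists = {}, {}
    for i in group:
        dists[i] = shortest_lengths(adj, i)
        ecc[i] = max(dists[i][j] for j in group)
    dia = max(ecc.values())
    dl = [(i, j) for i in group for j in group if dists[i].get(j) == dia]
    return dia, dl
```

On a simple four-node chain, for instance, the diameter is 3 and the diameter list records the two end-to-end node pairs.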
In step S320, the path analysis unit 300 obtains the representative path PH11 of the connected group CG11 according to the diameter list dL.
In step S400, the selection unit 400 outputs, according to the representative path PH11, a part of the identified images IMa as the representative image set IMS11.
According to steps S100 to S400 above, the association calculation unit 100, the directed graph establishment unit 200, the path analysis unit 300 and the selection unit 400 have completed the “image screening process”, such that four identified images IMa are selected from the ten identified images IMa to form the representative image set IMS11. The representative image set IMS11 can fully represent the contents of the ten identified images IMa, thus effectively improving the accuracy and operation speed of the person re-identification technology.
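One way to realize this selection is to recover a concrete shortest path attaining the diameter and take its nodes as the representative images; this reading, and the breadth-first parent-link reconstruction below, are assumptions for illustration:

```python
from collections import deque

def representative_path(adj, src, dst):
    """Recover one shortest path from src to dst by BFS with parent
    links. Its nodes are the identified images selected into the
    representative image set (a sketch, not the disclosure's fixed
    procedure). Returns None if dst is unreachable from src."""
    parent = {src: None}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        if u == dst:
            path = []
            while u is not None:  # walk parent links back to src
                path.append(u)
                u = parent[u]
            return path[::-1]
        for v in adj[u]:
            if v not in parent:
                parent[v] = u
                queue.append(v)
    return None
```

Under this sketch, a ten-node connected group whose diameter path touches four nodes would yield exactly four selected identified images, matching the example above.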
Next, how the person re-identification process is performed by using the representative image set IMS11 is described below. In step S600, the input unit 600 receives an image IMb to be identified of an unidentified person.
In step S700, the algorithm unit 700 calculates the degrees of association between the image IMb to be identified and the identified images IMa of the representative image set IMS11.
In step S800, the classification unit 800 determines, according to the calculation result of the algorithm unit 700, whether the unidentified person is similar to a particular identified person or to which identified person the unidentified person is most similar, and the output unit 900 outputs the determination result.
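The person re-identification process described above can be sketched as a nearest-representative search. The max-pooling over each representative image set, the acceptance threshold, and the cosine-based degree of association are all illustrative assumptions, not fixed by the text:

```python
import numpy as np

def re_identify(feat_b, representative_sets, accept=0.9):
    """Compare the feature of the image IMb to be identified against each
    identified person's representative image set; the person with the
    highest degree of association wins if it clears an acceptance
    threshold, otherwise the person is reported as unknown (None)."""
    def wt(a, b):
        a = np.asarray(a, dtype=float)
        b = np.asarray(b, dtype=float)
        cos = float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
        return (cos + 1.0) / 2.0  # degree of association in [0, 1]

    best_id, best_wt = None, 0.0
    for person_id, feats in representative_sets.items():
        score = max(wt(feat_b, f) for f in feats)  # max over the set
        if score > best_wt:
            best_id, best_wt = person_id, score
    return (best_id, best_wt) if best_wt >= accept else (None, best_wt)
```

Because only the screened representative image sets are compared, far fewer comparisons are needed than against every identified image, which is the operation-speed benefit the screening process targets.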
In the embodiment above, the quantity of the connected group CG11 of the directed graph DG1 is one, and the quantity of the representative path PH11 obtained is one. However, the disclosure is not limited to the above examples; the quantity of the connected group can be greater than or equal to two, and the quantity of the representative path can also be greater than or equal to two.
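Splitting a directed graph into its connected groups can be sketched as follows; treating connectivity as weak connectivity (edge direction ignored) is an assumption the text leaves open:

```python
def connected_groups(adj):
    """Split the directed graph into its connected groups using weak
    connectivity (edge direction ignored). Each group can then be
    screened independently for its own representative path and
    representative image set."""
    # Build an undirected view of the directed edges.
    undirected = {u: set() for u in adj}
    for u, vs in adj.items():
        for v in vs:
            undirected[u].add(v)
            undirected.setdefault(v, set()).add(u)
    seen, groups = set(), []
    for start in undirected:
        if start in seen:
            continue
        stack, group = [start], []
        seen.add(start)
        while stack:  # depth-first flood fill of one group
            u = stack.pop()
            group.append(u)
            for v in undirected[u]:
                if v not in seen:
                    seen.add(v)
                    stack.append(v)
        groups.append(sorted(group))
    return groups
```

A graph with two disjoint clusters thus yields two connected groups, and in turn two representative paths and two representative image sets, as the embodiment allows.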
Alternatively, the directed graph DG2 may have two connected groups, such that the representative path PH21 and the representative path PH22 are obtained, and the representative image set IMS21 and the representative image set IMS22 are outputted accordingly.
Further, the image screening process used in the embodiments above is not limited to person re-identification applications.
According to the embodiments above, by analyzing the interrelated array MXr, the adjacency array MXc, the directed graph DG1 (the directed graph DG2, or the directed graph DG3), and the representative path PH11 (or the representative path PH21, the representative path PH22, the representative path PH31, or the representative path PH32), the symbolic representative image set IMS11 (the representative image set IMS21, the representative image set IMS22, the representative image set IMS31, or the representative image set IMS32) is obtained. Thus, the accuracy and operation speed of the person re-identification technology are effectively improved.
It will be apparent to those skilled in the art that various modifications and variations can be made to the disclosed embodiments. It is intended that the specification and examples be considered as exemplary only, with a true scope of the disclosure being indicated by the following claims and their equivalents.
Number | Date | Country | Kind |
---|---|---|---|
108109409 | Mar 2019 | TW | national |
Number | Name | Date | Kind |
---|---|---|---|
8488913 | Lin et al. | Jul 2013 | B2 |
9087297 | Filippova | Jul 2015 | B1 |
9602559 | Barros | Mar 2017 | B1 |
9704029 | Bourdev et al. | Jul 2017 | B2 |
9767385 | Nguyen et al. | Sep 2017 | B2 |
20100121973 | Lobacheva | May 2010 | A1 |
20120148165 | Yabu | Jun 2012 | A1 |
20130136416 | Sathish | May 2013 | A1 |
20140324864 | Choe | Oct 2014 | A1 |
20150055931 | Koivukangas | Feb 2015 | A1 |
20150154956 | Brown | Jun 2015 | A1 |
20160092736 | Mai | Mar 2016 | A1 |
20160163311 | Crook | Jun 2016 | A1 |
20160171381 | Brewer | Jun 2016 | A1 |
20160266939 | Shear | Sep 2016 | A1 |
20170344617 | Sen | Nov 2017 | A1 |
20180018142 | Kim et al. | Jan 2018 | A1 |
20180063601 | Chee | Mar 2018 | A1 |
20180075300 | Mai | Mar 2018 | A1 |
20180329744 | Shear et al. | Nov 2018 | A1 |
Number | Date | Country |
---|---|---|
102843511 | Dec 2012 | CN |
103488744 | Jan 2014 | CN |
103617435 | Mar 2014 | CN |
105279508 | Jan 2016 | CN |
107578015 | Jan 2018 | CN |
3 002 710 | Apr 2016 | EP |
I267734 | Dec 2006 | TW |
201305923 | Feb 2013 | TW |
WO 2016159199 | Oct 2016 | WO |
WO 2017177371 | Oct 2017 | WO
Entry |
---|
Chen et al., “Similarity Learning with Spatial Constraints for Person Re-identification”, 10 pages. |
Kodirov et al., “Person Re-identification by Unsupervised ℓ1 Graph Learning”, 18 pages. |
Li et al., “Human Reidentification with Transferred Metric Learning”, 14 pages. |
Nanda et al., “An unsupervised meta-graph clustering based prototype-specific feature quantification for human re-identification in video surveillance”, Engineering Science and Technology, an International Journal, vol. 20, 2017, pp. 1041-1053. |
Wang et al., “Graph Matching with Adaptive and Branching Path Following”, IEEE Trans. on Pattern Analysis and Machine Intelligence, 14 pages. |
Zhou et al., “Robust and Efficient Graph Correspondence Transfer for Person Re-identification”, May 15, 2018, 13 pages. |
Number | Date | Country | |
---|---|---|---|
20200302186 A1 | Sep 2020 | US |