This disclosure relates to an image analysis device, an image analysis method, and an image analysis program.
There is a conventionally known technique for associating objects, such as machines or placed objects, shown in a drawing image such as a blueprint or a layout drawing, and/or their components with attribute information showing their attributes.
For example, Patent Document 1 illustrates the case where components such as utility poles and utility access holes are recognized so as to associate the recognized components with attribute information showing their attributes.
The technique disclosed in Patent Document 1 requires, as a prerequisite, that the components shown in the drawing image are represented by their predetermined shapes such as symbols and their predetermined shapes are associated with their attribute information with a one-to-one correspondence. Therefore, the technique disclosed in Patent Document 1 has a problem that in a case where a component is shown with a generic shape such as a circle or a square, the attribute information of the component cannot be uniquely identified from the shape, and as a result, the attribute information of the component cannot be outputted.
The present disclosure is made to solve the above-mentioned problem, and the object is to make it possible to output the attribute information of the objects and/or their components shown in drawing images even in a case where they are shown with generic shapes.
The image analysis device according to this disclosure is an image analysis device that outputs attribute information of an input image, and includes: a similar image searching unit to search a knowledge database, which includes drawing images and attribute information showing the attributes of the drawing images, for a drawing image similar to the input image; an image correspondence detection unit to associate a feature point of the drawing image retrieved by the similar image searching unit with a feature point of the input image and detect the feature point of the drawing image as a corresponding point; and an attribute output unit to extract the attribute information of the corresponding point detected by the image correspondence detection unit from the knowledge database and output it as the attribute information of the input image.
The present disclosure makes it possible to output the attribute information of the objects and/or their components shown in a drawing image even in a case where they are shown with generic shapes.
The image analysis device 100 includes a similar image searching unit 1, an image correspondence detection unit 2, an attribute output unit 3, and a knowledge database 4. The knowledge database 4 stores knowledge data including drawing images and attribute information showing the attributes of the drawing images. The knowledge database 4 may be included in the image analysis device 100 or may be externally connected.
Now, the description returns to
Meanwhile, in a case where the feature amount of the drawing image G1 is vectorized, the searching unit 12 may obtain the degree of similarity using the cosine similarity, as shown in Formula (1) for example, and output the drawing images to the image correspondence detection unit 2 in descending order of the degree of similarity.
S = (f · g) / (|f| |g|)   Formula (1)
S is the degree of similarity, f is the feature amount obtained from the drawing image G1, and g is the feature amount of a drawing image included in the knowledge data.
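As an illustrative sketch, Formula (1) can be computed as follows, assuming the feature amounts f and g are available as plain numeric vectors; the function name and vector representation are assumptions for illustration only.

```python
import math

def cosine_similarity(f, g):
    """Degree of similarity S per Formula (1): S = (f . g) / (|f| |g|)."""
    dot = sum(a * b for a, b in zip(f, g))
    norm_f = math.sqrt(sum(a * a for a in f))
    norm_g = math.sqrt(sum(b * b for b in g))
    if norm_f == 0.0 or norm_g == 0.0:
        return 0.0  # undefined for a zero vector; treat as dissimilar
    return dot / (norm_f * norm_g)
```

Identical feature vectors yield a similarity of 1.0, and orthogonal vectors yield 0.0, so sorting by this value from high to low orders the knowledge-data drawing images from most to least similar.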
Further, when the feature amount is extracted by the feature amount extraction unit 11 using a graph kernel algorithm, the searching unit 12 may calculate the degree of similarity using an estimation algorithm such as that of Non-Patent Document 1. Non-Patent Document 1: N. Shervashidze et al., "Weisfeiler-Lehman Graph Kernels," JMLR, vol. 12, pp. 2539-2561, 2011.
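The core idea of the Weisfeiler-Lehman kernel of Non-Patent Document 1 can be sketched as follows: node labels are iteratively refined with their neighbors' labels, and the similarity of two graphs is the inner product of their label-count vectors. This is a simplified illustration, not the paper's implementation; the adjacency-dictionary representation and string relabeling scheme are assumptions.

```python
from collections import Counter

def wl_features(adj, labels, iterations=2):
    """Collect Weisfeiler-Lehman label counts over all refinement rounds.
    adj: dict node -> list of neighbor nodes; labels: dict node -> str."""
    feats = Counter(labels.values())
    cur = dict(labels)
    for _ in range(iterations):
        new = {}
        for v in adj:
            # Refine each label with the sorted multiset of neighbor labels.
            neigh = sorted(cur[u] for u in adj[v])
            new[v] = cur[v] + "|" + ",".join(neigh)
        cur = new
        feats.update(cur.values())
    return feats

def wl_similarity(f1, f2):
    """Kernel value: inner product of the two label-count vectors."""
    return sum(f1[k] * f2[k] for k in f1 if k in f2)
```

A graph's similarity to itself is simply the sum of squared label counts, and structurally different graphs share fewer refined labels and so score lower.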
The feature point extraction unit 21 extracts the feature points of the objects and/or their components shown in the drawing image G1 by using local feature amounts such as HOG. The feature point extraction unit 21 extracts the vertexes forming the objects and/or their components as their feature points. The feature point extraction unit 21 may further extract, as the feature points, branch points and/or end points forming the objects and/or their components. The feature point extraction unit 21 outputs information showing the extracted feature points to the corresponding point matching unit 22.
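The vertex extraction described above can be illustrated with the following sketch, which assumes an object's contour has already been traced as an ordered list of points (the HOG-based detection itself is not shown); the angle threshold and function name are assumptions.

```python
import math

def polyline_vertices(points, angle_thresh_deg=30.0):
    """Extract vertex feature points from a closed contour.
    points: ordered (x, y) tuples; a point is a vertex when the
    direction of travel turns by more than the threshold there."""
    n = len(points)
    vertices = []
    for i in range(n):
        x0, y0 = points[i - 1]
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        a1 = math.atan2(y1 - y0, x1 - x0)  # incoming direction
        a2 = math.atan2(y2 - y1, x2 - x1)  # outgoing direction
        # Signed turn wrapped to (-pi, pi], then taken as magnitude.
        turn = abs((a2 - a1 + math.pi) % (2 * math.pi) - math.pi)
        if math.degrees(turn) > angle_thresh_deg:
            vertices.append((x1, y1))
    return vertices
```

On a rectangular component the four corners exceed the threshold while collinear contour points do not, so only the vertices survive as feature points.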
The corresponding point matching unit 22 calculates the distance between a feature point of the drawing image G2 retrieved by the similar image searching unit 1 and a feature point of the drawing image G1 extracted by the feature point extraction unit 21, and detects, as a corresponding point, a feature point of the drawing image G2 whose distance is short. The corresponding point matching unit 22 outputs the information showing the detected corresponding points to the attribute output unit 3.
The corresponding point matching unit 22 calculates the distance between the feature points based on, for example, Formula (2).
d = ∥f − g∥₂   Formula (2)
Here, d is the distance between feature points, f is the feature amount of a feature point of the drawing image G1, and g is the feature amount of a feature point of the drawing image G2. For the feature amounts of feature points, the feature amounts extracted by the feature amount extraction unit 11 may be used. The feature amounts of the feature points of the drawing image G2 may be stored in advance in the knowledge database 4.
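The matching step can be sketched as follows: for each feature point of the drawing image G1, the nearest feature point of the drawing image G2 under the Euclidean distance of Formula (2) is taken as its corresponding point. The dictionary layout of the feature amounts and the `max_distance` cutoff for a "short" distance are assumptions for illustration.

```python
import math

def l2_distance(f, g):
    """Formula (2): d = ||f - g||_2 between two feature-amount vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(f, g)))

def match_corresponding_points(feats_g1, feats_g2, max_distance=0.5):
    """Map each feature point of G1 to its nearest feature point of G2.
    feats_g1 / feats_g2: dict feature-point id -> feature vector."""
    matches = {}
    for p1, f in feats_g1.items():
        best, best_d = None, float("inf")
        for p2, g in feats_g2.items():
            d = l2_distance(f, g)
            if d < best_d:
                best, best_d = p2, d
        if best_d <= max_distance:  # keep only sufficiently short distances
            matches[p1] = best
    return matches
```
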
In this Embodiment, the corresponding point matching unit 22 detects G24A as the corresponding point of the feature point G14A, G24B as the corresponding point of the feature point G14B, G24C as the corresponding point of the feature point G14C, and G24D as the corresponding point of the feature point G14D to output the information showing the detected corresponding points G24A, G24B, G24C, and G24D to the attribute output unit 3.
The attribute output unit 3 extracts, from the knowledge database 4, the attribute information matching the corresponding points detected by the image correspondence detection unit 2 and outputs it as the attribute information of the drawing image G1. Specifically, the attribute output unit 3 identifies the component G24 of the drawing image G2 as the object or component consistent with the corresponding points G24A, G24B, G24C, and G24D outputted from the corresponding point matching unit 22 of the image correspondence detection unit 2. Further, the attribute output unit 3 extracts the attribute information ("control device") of the component G24 of the drawing image G2 from the knowledge database 4. Then, the attribute output unit 3 outputs "control device" extracted from the knowledge database 4 as the attribute information of the component G14 of the drawing image G1 represented by the feature points G14A, G14B, G14C, and G14D.
The image analysis device 100 may include a knowledge data update unit (not shown). The knowledge data update unit may store, in the knowledge database 4, the knowledge data in which the object and/or its component of the drawing image and the attribute information outputted from the attribute output unit 3 are associated with each other.
The searching unit 12 of the similar image searching unit 1 extracts the feature amounts of the drawing images stored in the knowledge database 4 (ST12). Further, the searching unit 12 calculates the degrees of similarity based on the feature amount of the drawing image G1 extracted in ST11 and the feature amounts of the drawing images extracted in ST12 (ST13).
The searching unit 12 repeats ST12 and ST13 until the degree of similarity has been calculated for every drawing image stored in the knowledge database 4 (ST14).
The searching unit 12 sorts the drawing images stored in the knowledge database 4 by the degree of similarity to output them to the image correspondence detection unit 2 in order of the degree of similarity from high to low (ST15).
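Steps ST12 through ST15 can be sketched as the following loop; the list-of-tuples layout of the knowledge data and the pluggable similarity function are assumptions for illustration.

```python
def search_similar_images(input_feature, knowledge_db, similarity_fn):
    """Score every drawing image in the knowledge database against the
    input feature (ST12-ST14), then return them sorted from high to low
    similarity (ST15). knowledge_db: list of (image_id, feature, attrs)."""
    scored = []
    for image_id, feature, attrs in knowledge_db:     # loop over ST12-ST14
        s = similarity_fn(input_feature, feature)     # ST13
        scored.append((s, image_id, attrs))
    scored.sort(key=lambda t: t[0], reverse=True)     # ST15
    return scored
```
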
The feature point extraction unit 21 of the image correspondence detection unit 2 extracts the feature points of the drawing image G2 retrieved by the similar image searching unit 1 (ST22).
The corresponding point matching unit 22 of the image correspondence detection unit 2 calculates the distances between the feature points of the drawing image G2 retrieved by the similar image searching unit 1 and the feature points of the inputted drawing image G1 (ST23). Further, the corresponding point matching unit 22 detects, as corresponding points, feature points of the drawing image G2 whose distances are short, and outputs the information showing the detected corresponding points to the attribute output unit 3 (ST24).
As described above, according to the present embodiment, the image analysis device 100 can output attribute information to be given to a new input image based on the drawing images and the attribute information showing the attributes of the drawing images stored in the knowledge database 4. This helps assign the attribute information to the objects and/or their components shown in the input image.
In Embodiment 2, in order for the image analysis device 100 to reduce its processing load while it maintains high accuracy in outputting the attribute information of the components shown in the drawing image, the image analysis device 100 can receive, as an external input, not only a drawing image but also a text related to the attribute information of the drawing image.
In a case where an externally inputted text includes the word "elevator", the searching unit 12a searches the knowledge database 4 for drawing images whose attribute information relates to the word "elevator" and which are similar to the input drawing image G1. In this way, the similar image searching unit 1a narrows down the drawing images to be extracted from the knowledge database 4 by using the input text, so that the attribute information of the components shown in the drawing image can be outputted with high accuracy while the processing load on the image analysis device 100 is kept low.
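The narrowing-down of Embodiment 2 can be sketched as a keyword filter applied before the similarity calculation; the knowledge-data layout, the `keywords` attribute field, and the word-overlap test are assumptions for illustration.

```python
def search_with_text(input_feature, text, knowledge_db, similarity_fn):
    """Restrict the similarity search to knowledge data whose attribute
    information relates to a word of the input text, then score and sort
    only those candidates. knowledge_db: list of (image_id, feature, attrs)."""
    words = set(text.lower().split())
    candidates = [
        entry for entry in knowledge_db
        if words & {w.lower() for w in entry[2].get("keywords", [])}
    ]
    scored = [(similarity_fn(input_feature, feat), image_id)
              for image_id, feat, attrs in candidates]
    scored.sort(reverse=True)
    return scored
```

Because non-matching knowledge data never reaches the distance calculations, the processing load falls roughly in proportion to how selective the text filter is.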
The embodiments described above are only examples of implementation of the present disclosure, and application examples including the following additions and changes to the configurations are conceivable.
In the above-described embodiments, the searching unit 12 of the similar image searching unit 1 sorts the drawing images stored in the knowledge database 4 by the degree of similarity and outputs them to the image correspondence detection unit 2 in order of the degree of similarity from high to low. However, not limited to the above, the searching unit 12 may output only the drawing images having a degree of similarity equal to or higher than a threshold value to the image correspondence detection unit 2. For example, suppose that the degree of similarity is represented by numerical values from 0 to 100, where a larger value means a higher degree of similarity, and that a threshold for determining similarity is set in advance. The searching unit 12 may sort the drawing images stored in the knowledge database 4 in descending order of the degree of similarity, search for the drawing images having degrees of similarity equal to or higher than the preset threshold value, and output the retrieved drawing images to the image correspondence detection unit 2. Then, the image correspondence detection unit 2 may calculate the distances between the feature points of all the drawing images outputted from the similar image searching unit 1 and the feature points of the input image, and detect, as a corresponding point, a feature point of a drawing image whose distance is short.
Further, the searching unit 12 may be configured to lower the threshold value as the number of the components shown in the drawing image G1 increases. This makes it easier to find, in the knowledge database 4, drawing images that include components corresponding to the components shown in the drawing image G1. Further, instead of using a threshold value, the searching unit 12 may output a preset number of drawing images to the image correspondence detection unit 2 in descending order of the degree of similarity.
In this case, the searching unit 12 may be configured to increase the number of the drawing images to be outputted to the image correspondence detection unit 2 as the number of the components shown in the drawing image G1 increases. This likewise makes it easier to find, in the knowledge database 4, drawing images that include components corresponding to the components shown in the drawing image G1.
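The two candidate-selection variations above, threshold-based with relaxation by component count, and fixed-count, can be sketched as follows; the relaxation rule of 5 points per additional component is purely an assumed example, not a value from the disclosure.

```python
def select_candidates(scored, threshold=None, top_k=None, num_components=1):
    """Pick candidate drawing images from a list of (similarity, image_id)
    pairs already sorted high to low. Either keep those at or above a
    threshold (lowered as the drawing grows more complex) or keep a
    preset number of top entries."""
    if threshold is not None:
        # Lower the effective threshold for drawings with many components.
        eff = max(0, threshold - 5 * (num_components - 1))
        return [item for item in scored if item[0] >= eff]
    if top_k is not None:
        return scored[:top_k]
    return scored
```
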
In the above-described embodiments, the attribute information of the component G14 is outputted based on the attribute information of the component G24 having a shape that is the same as or similar to that of the component G14 of the drawing image G1. However, this is not a limitation. The image correspondence detection unit 2 may output the attribute information not only of a component with the same or a similar shape but also of a component whose feature points have short feature distances.
This application is a Continuation of PCT International Application No. PCT/JP2020/004551 filed on Feb. 6, 2020, which is hereby expressly incorporated by reference into the present application.
| | Number | Date | Country |
|---|---|---|---|
| Parent | PCT/JP2020/004551 | Feb 2020 | US |
| Child | 17857802 | | US |