1. Field of the Invention
The invention relates to a technique of detecting the correlation between an image and information on the image.
2. Description of the Related Art
In recent years, an image mining method has been developed as a technique of detecting information obtained from the relationships between visual features of many images in an image group and association information items (text information and numerical value information) regarding the images. The image mining method includes a process of arranging images in a virtual three-dimensional space on the basis of various perspectives (in ascending order of performance values, for example) so as to assist a user in finding the relationships between visual features of the images and performance values while the user views the arranged images.
The image mining method may be used in the fields of product design and manufacturing. For example, automobile manufacturers design engines of different shapes so as to analyze shapes of engines which attain excellent mileages. When the shapes of engines which attain excellent mileages are to be determined, pairs of information items, i.e., an image representing the distribution of fuel concentration for one of the engines having different shapes and a mileage information item (performance value) regarding the image, are obtained. The user analyzes the pairs of the image and the mileage information item so as to obtain information derived from the relationships between the shapes of engines and performances. There are various examples of processes using image mining, such as a process of analyzing the relationships between various shapes of magnetic heads and performance values. A related technique is disclosed in Japanese Laid-open Patent Publication No. 2000-305940.
A product including a component A and a component B attached to each other by soldering is taken as an example. It is assumed that, if the attachment of the components A and B is not properly performed by soldering, a product including the components A and B is determined to be a defective product. When it is considered that a stress applied to a product relates to the production of a nondefective product or a defective product, a number of samples of nondefective products and a number of samples of defective products are prepared in order to determine the correlation between the stress and the production of a nondefective product or a defective product. Images visually representing stresses applied to the samples, and attribute data blocks of the nondefective products and the defective products which are associated with the images in advance, are used to assist a user in finding the relationship between the stress and the production of a nondefective product or a defective product. Here, it is assumed that the user views an image group including the nondefective products and the defective products separately arranged, and finds a certain feature in a region of the image group. However, even if the user finds a relationship between visual features of the images and performance values of products (nondefective products or defective products, for example), when the relationship found by the user appears only in local regions of the images, the visual features of the images are merely qualitatively represented. For example, when the user finds a certain feature in specific portions of images of nondefective products in the image group including the nondefective products and the defective products, the user merely presumes that the feature of those portions of the images may include some relationship which distinguishes between the nondefective products and the defective products.
Therefore, the user cannot obtain detailed information on the relationship between the feature of those portions of the images and the association information items.
According to an aspect of an embodiment, a method of operating an apparatus having a display device for analyzing a plurality of images each representing a similar item includes the steps of: displaying the plurality of the images in parallel by the display device; enabling a user to select partial regions of the images such that each of the partial regions represents a similar portion of each item; extracting information associated with each of the partial regions of the images; and displaying data of the information of each of the partial regions in parallel in a format different from that appearing in the images.
An embodiment of the invention will now be described with reference to the accompanying drawings. Images to be processed in this embodiment are obtained as results of simulations, and more particularly, are results of simulations which obtain magnitudes of stresses generated when printed boards and components are attached to each other by soldering. Different colors in the images representing the results of the simulations correspond to different magnitudes of stress applied between the printed boards and the components attached to each other by soldering. A plurality of images of the embodiment each represents a similar item.
Images 2 to 7 represent printed boards including components attached thereto by soldering. The images 2 to 7 represent different products.
An image analyzing device of this embodiment performs the following processing on the image group 1. First, a user selects an arbitrary region in one of the images included in the image group 1. In
As described above, the image analyzing device displays visual differences between the region in one of the images selected by the user and the regions in the other images corresponding to the region in the one of the images selected by the user as information items which can be compared with one another. Accordingly, the user finds the correlations between the visual differences and differences in performance with ease.
As described above, a method of operating the image analyzing device having a display device for analyzing a plurality of images each representing a similar item includes the steps of: displaying the plurality of the images in parallel by the display device; enabling a user to select partial regions of the images such that each of the partial regions represents a similar portion of said each item; extracting information associated with each of the partial regions of the images; and displaying data of the information of each of the partial regions in parallel in a format different from that appearing in the images.
Referring now to
The controller 102 entirely controls the image analyzing device 101. The controller 102 is a central processing unit, for example. The controller 102 executes an image processing program 108 developed in the memory 105. The image processing program 108 allows the controller 102 to execute image processing.
The input unit 103 receives various instructions which are input by the user and which are to be supplied to the controller 102. The input unit 103 includes a keyboard, a mouse, and a touch panel. The instructions may be obtained through a network 107-1.
The output unit 104 outputs an image group to be analyzed and a result of calculation performed using the controller 102, for example. The output unit 104 is connected to a display device, for example, and the image group and the result of calculation performed using the controller 102 are displayed in the display device. Furthermore, the output unit 104 may output the image group and the calculation result through the network 107-1 to an external computer.
The memory 105 is a storage region in which the image processing program 108 which is executed using the controller 102 is developed. Furthermore, the memory 105 stores therein data representing a result of calculation performed using the controller 102, image data, and feature value data, for example. The memory 105 is a RAM (Random Access Memory), for example.
The storage unit 106 stores therein the image processing program 108 and image data, for example. The storage unit 106 is a hard disk device, for example.
The network I/F 107 is connected to the network 107-1 and enables transmission and reception of information between the image analyzing device 101 and the external computer, for example. The controller 102 is also capable of obtaining and outputting image information and calculation parameters through the network I/F 107.
A function of the image analyzing device 101 will now be described.
An image database 21 is a database storing image information items. An image selection module 22 obtains an image information item corresponding to an image selected by the user from an image group. A selection image 23 is an image information item of the image selected by the user from the image group.
A region specifying module 24 obtains information on a selection region (region information item). The selection region is included in an image to be subjected to image analyzing processing and is selected by the user using the input unit 103, for example. In addition to the selection by the user, the selection region may be selected in other ways. For example, a region which is visually different from corresponding regions in other images may be automatically extracted in accordance with an association information item. The association information item represents a feature of an object to be displayed as an image. The association information item includes a text information item and a numerical value information item associated with an image information item, which correspond to information used to determine whether a product corresponding to the image information item is a nondefective product and information representing a performance value of the product, for example. When the selection region is extracted using the association information item, a computer performs comparison between the images. A similar region searching module 25 searches images other than the image including the selection region for regions similar to the selection region so that the regions similar to the selection region are detected. Similar regions 26 (similar portions) are detected by the similar region searching module 25, are included in the images other than the image including the selection region, and correspond to the selection region of the selected image.
A feature value extracting module 27 calculates feature values of the selection region and the similar regions in the images. Feature values 28 are determined by color, color distribution (coloration), distribution of a contour, pattern, and texture, for example.
A feature value display module 29 controls the output unit 104 to display the calculated feature values in a screen. The feature values are represented by histograms, for example.
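The feature value extraction described above may be sketched as follows, under the assumption that each region is reduced to a color histogram whose dimensions count the pixels of each quantized color; the function name, the pixel representation (a list of quantized color indices), and the number of colors are illustrative assumptions, not part of the embodiment.

```python
# Illustrative sketch of a color-histogram feature value: each region is
# reduced to one bin per quantized color, counting how many of its pixels
# have that color. Names and data layout are assumptions for illustration.

def color_histogram(region_pixels, num_colors=4):
    """Return a feature vector with one bin per quantized color index."""
    histogram = [0] * num_colors
    for color_index in region_pixels:   # each pixel is a quantized color index
        histogram[color_index] += 1
    return histogram

# Example: a region of 8 pixels using colors 0 (white), 2 (gray), 3 (black).
region = [2, 2, 3, 3, 3, 2, 0, 0]
print(color_histogram(region))  # -> [2, 0, 3, 3]
```

Such a histogram is a multidimensional vector whose dimensions correspond to colors, which is the form assumed later when correlation coefficients are calculated for individual dimensions.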
A performance value database 30 stores therein association information items which are associated with image information items.
The association information items include text information items and numerical value information items which are associated with the image information items. For example, information items used to determine whether products corresponding to the image information items are nondefective products and information items representing performance values of the products are stored in the performance value database 30. A correlation coefficient calculation module 31 calculates coefficients of correlations between the feature values of the selection region 9 and the similar regions in the images and the association information items.
Correlation coefficients 32 for individual dimensions are values representing degrees of the correlations between the feature values and the correlation information items for individual dimensions. A correlation coefficient comparing module 33 detects a dimension 34 which attains a maximum correlation coefficient from among the correlation coefficients 32 for individual dimensions.
By executing the image processing program 108, the controller 102, the input unit 103, the output unit 104, the memory 105, the storage unit 106, and the network I/F 107 function as the image selection module 22, the region specifying module 24, the similar region searching module 25, the feature value extracting module 27, the feature value display module 29, the correlation coefficient calculation module 31, and the correlation coefficient comparing module 33.
The image information items stored in the image database 21 will now be described.
In
Furthermore, in
Here, the junction regions 8-1 to 8-9 will be described.
Processes performed using the image analyzing device 101 will now be described in detail. First, a process of displaying the feature values for individual regions of the images in the image group performed using the image analyzing device 101 will be described.
The image analyzing device 101 arranges the images of the image group stored in the image database 21 in a virtual three-dimensional space so as to display the images in a screen.
The user views the image group 1 displayed in the screen so as to detect visual features of the images. For example, it is assumed that the user determines that regions of black (color3) surrounded by regions of gray (color2) are small in the images corresponding to the nondefective products whereas the regions of black (color3) surrounded by the regions of gray (color2) are large in the images corresponding to the defective products.
The user selects a certain image from among the images included in the image group 1 displayed in the screen by inputting information for specifying the image to be selected through the input unit 103, for example. In this embodiment, it is assumed that the user selects the image 2 from the image group 1. The image selection module 22 obtains a selection image information item corresponding to the image 2 from the image database 21 in step S01.
Then, the region specifying module 24 obtains a selection region information item in the selected image 2 in accordance with a user's instruction in step S02. The user selects a certain region in the selected image 2 displayed in the screen using the input unit 103, for example. The user assumes that regions of black (color3) surrounded by regions of gray (color2) are small in the images corresponding to the nondefective products whereas the regions of black (color3) surrounded by the regions of gray (color2) are large in the images corresponding to the defective products while viewing the images displayed in the screen. Note that one of the gray (color2) regions corresponds to the region 2-3 in
The similar region searching module 25 searches the images for similar regions in step S03. The similar regions are included in the images other than the selected image and correspond to the selection region 9. The images other than the selected image 2 in the image group 1 in
When the user selects corresponding regions in the other images manually, a considerable amount of time is required in proportion to the number of images. Therefore, the similar region searching module 25 performs a semiautomatic operation of specifying the similar regions in the other images. Use of this operation reduces the user's labor required for specifying the similar regions. Furthermore, since positions and sizes are uniformly specified when compared with a case where the similar regions in the other images are manually specified, comparison accuracy is improved.
The operation of searching for the similar regions performed in step S03 using the similar region searching module 25 will now be described in detail. An outline of the operation of searching for the similar regions is described below. The similar region searching module 25 detects the images 3 to 7 associated with the selected image 2 from the image group 1. Then, the similar region searching module 25 specifies regions in the images 3 to 7 which correspond to the selection region 9 in the image 2. Thereafter, the similar region searching module 25 determines the regions specified in the images 3 to 7 to be similar regions and outputs them.
In this embodiment, mainly two criteria are employed for determining “similarity” when the similar region searching module 25 performs the operation of searching for the similar regions. A first criterion is the closeness between the relative position of the selection region in the selected image and the relative positions of candidate regions in the other images. A second criterion is the closeness of pixel values in the regions. The priorities assigned to the first and second criteria for the operation of searching for the similar regions depend on the image group to be processed and the subject to be processed. Therefore, it is difficult to determine a single weighting function. In this embodiment, the similar regions are appropriately specified in accordance with the user's operations.
Here, an operation of automatically searching, in the images 3 to 7, regions corresponding to the selection region 9 and the vicinities thereof for similar regions corresponding to the selection region 9, performed using the similar region searching module 25 in step S11, will be described in detail. The images 2 to 7 in the image group 1 are obtained as results of simulations. The similar regions include, as described above, regions whose positional relationships to the other images are the same as the positional relationship between the selection region 9 and the image 2. The similar regions further include regions in which the distances (degrees of dissimilarity) between the selection region and the regions are small. Therefore, the similar region searching module 25 searches for the candidates of the similar regions on the basis of the two criteria. An example of a method for selecting regions to be candidates of similar regions will be described hereinafter.
The similar region searching module 25 detects the region B0 in an image P selected from among the other images in step S22. The position of the region B0 in the image P relatively corresponds to the position of the selection region 9 in the image 2. The position of the selection region 9 is obtained as a position in the coordinate system of the image 2, and the range of the selection region 9 is also obtained using the coordinate system. Accordingly, the region B0, which is located at a coordinate position in the image P corresponding to the coordinate position of the selection region 9 in the image 2 and which has the same size as the selection region 9, can be determined. Here, since the variable “i” is “0”, the region B0 located at the position corresponding to the coordinate position of the selection region 9 is determined.
Then, from step S23 to step S26, regions located in positions shifted by small amounts from the region B0 corresponding to the selection region 9 are searched for. The similar region searching module 25 increments the variable “i” by one in step S23. The similar region searching module 25 determines a region Bi which is shifted from the region B0 by i pixels in step S24. For example, the region B11 is obtained by shifting the region B0 rightward by one pixel, and the region B13 is obtained by shifting the region B0 downward by one pixel. Therefore, the regions B11 and B13 shown in
Note that eight regions are shifted from the region B0 by two pixels: the regions obtained by shifting the region B0 rightward, upward, leftward, or downward by two pixels, and the regions obtained by shifting the region B0 by one pixel in each of two adjacent directions (rightward and upward, upward and leftward, leftward and downward, or downward and rightward). Here, the region obtained by shifting the region B0 rightward by one pixel and upward by one pixel may be obtained by shifting the region B0 rightward by one pixel and then upward by one pixel, or by shifting the region B0 upward by one pixel and then rightward by one pixel. Although identical regions may thus be obtained by different shifting orders, regions to be specified should not be duplicated.
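The enumeration of shifted regions described above may be sketched as follows, under the assumption that a region "shifted by i pixels" corresponds to an offset at a Manhattan distance of i from the region B0, and that a set is used so that identical regions reached by different shifting orders are counted only once; the function name is an illustrative assumption.

```python
# Illustrative sketch of the shift enumeration: the candidate positions at
# step i are the (dx, dy) offsets whose Manhattan distance from B0 equals i.
# Collecting them in a set prevents the same region from being specified
# twice when it is reachable by different shifting orders.

def shifted_offsets(i):
    """Return the set of (dx, dy) offsets at Manhattan distance i from B0."""
    offsets = set()
    for dx in range(-i, i + 1):
        dy = i - abs(dx)
        offsets.add((dx, dy))
        offsets.add((dx, -dy))   # the set absorbs the duplicate when dy == 0
    return offsets

print(sorted(shifted_offsets(1)))  # -> [(-1, 0), (0, -1), (0, 1), (1, 0)]
print(len(shifted_offsets(2)))     # -> 8, the eight regions enumerated above
```

At i = 2 this yields exactly the eight regions listed in the text, with each diagonal region appearing once regardless of the order of its two one-pixel shifts.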
Then, the similar region searching module 25 calculates distances (degrees of dissimilarity) between images in step S25. Specifically, a distance (degree of dissimilarity) between the image in the selection region 9 and the image in the region Bi specified in the image P is obtained. The distances (degrees of dissimilarity) between the images are values used to evaluate displacement between the selection region 9 and the regions to be the candidates of the similar regions, and are obtained for selecting one of the regions to be the candidates of the similar regions. The distances (degrees of dissimilarity) between images are obtained as follows, for example.
It is assumed that n pixels are included in the selection region 9 and n pixels are included in a region Bi to be a candidate of one of the similar regions. The pixels have unique values. Assuming that the pixels included in the selection region 9 are denoted by sn and the pixels in the region to be the candidate of one of the similar regions are denoted by rn, the selection region 9 is represented by S(s1 to sn) and the region to be the candidate of one of the similar regions is represented by R(r1 to rn). Then, differences between the unique values of the pixels included in the selection region 9 and the unique values of the correspondingly positioned pixels included in the region to be the candidate of one of the similar regions are obtained. The differences are obtained for the individual pairs of corresponding pixels, each of the obtained differences is squared, and the squared differences obtained for the individual pairs are added to one another so that a total sum di over all the pixels in the regions is obtained. The total sum di corresponds to a distance Di between the image of the selection region 9 and the image in the region to be the candidate of one of the similar regions. Note that not only the total sum but also distances between vectors of image features (vector format) such as color histograms in the respective regions may be employed as the distances between the images.
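The distance calculation described above may be sketched as follows, assuming each region is given as a list of n pixel values in corresponding order; the function and variable names are illustrative assumptions.

```python
# Illustrative sketch of the distance Di (degree of dissimilarity): the
# difference between each pair of corresponding pixel values is squared,
# and the squared differences are summed over all n pixels.

def region_distance(selection_pixels, candidate_pixels):
    """Sum of squared differences between corresponding pixel values."""
    return sum((s - r) ** 2 for s, r in zip(selection_pixels, candidate_pixels))

s_region = [10, 20, 30]   # pixels s1..sn of the selection region 9
r_region = [12, 20, 27]   # pixels r1..rn of a candidate region Bi
print(region_distance(s_region, r_region))  # -> 4 + 0 + 9 = 13
```

As the text notes, a distance between feature vectors (such as color histograms) of the two regions could be substituted for this pixel-wise sum without changing the rest of the procedure.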
The similar region searching module 25 determines whether the variable “i” is larger than a constant “T” in step S26. The constant “T” is a value used to determine a range in which the operation of searching for the regions to be the candidates of the similar regions is performed. The constant “T” is appropriately determined in accordance with a feature of the image group 1. For example, information on a range in which displacement among the products corresponding to the images is considered to be generated is obtained in advance, and the constant “T” is determined in accordance with the information. The number of pixels to be moved in order to specify a region may be determined in accordance with a degree of the displacement.
When the determination is negative in step S26, the process from step S23 onwards is repeatedly performed. On the other hand, when the determination is affirmative in step S26, the similar region searching module 25 sorts the detected regions Bi in ascending order of the distances Di of the images of the regions Bi to be the candidates of the similar regions in step S27. Note that the number of the candidates of the similar regions to be obtained is determined as “k”. After the regions Bi are sorted in ascending order of the distances Di of the images, k regions Bi are selected from among the regions Bi in ascending order of the distances Di as the candidates of the similar regions. In this way, the candidates of the similar regions are obtained.
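The selection in step S27 may be sketched as follows, assuming each candidate is represented as a (region identifier, distance Di) pair produced by the search in steps S22 to S26; names are illustrative assumptions.

```python
# Illustrative sketch of step S27: candidates are sorted in ascending order
# of their distances Di, and the k candidates with the smallest distances
# are kept as the candidates of the similar regions.

def top_k_candidates(candidates, k):
    """Sort candidates in ascending order of distance and keep the first k."""
    return sorted(candidates, key=lambda pair: pair[1])[:k]

candidates = [("B0", 40.0), ("B11", 12.5), ("B13", 3.0), ("B21", 25.0)]
print(top_k_candidates(candidates, 2))  # -> [('B13', 3.0), ('B11', 12.5)]
```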
Furthermore, as another method for specifying regions in step S11, a certain area having a predetermined size and including the region B0 as its center may be determined, and the region B0 and the regions B10, B11, B12, B13, and so on may be set in the certain area.
The similar region searching module 25 terminates the operation of searching for the similar regions after all the images relating to the selected image 2 in the image group 1 have been subjected to the searching operation. When the similar region searching operation performed on all the images in the image group 1 is terminated, the regions in the other images which are the candidates of the similar regions obtained as results of the searching operation are displayed in step S13.
The similar region searching module 25 obtains image information items of regions to be selected included in the images 3 to 7 from the image database 21 in step S14. The user selects one of the similar regions displayed in the screen using the mouse, for example. The similar region searching module 25 obtains an image information item of the candidate of the similar region selected by the user from the image database 21. The similar region searching module 25 performs an operation of reducing the number of the candidates of the similar regions using the selected candidate of the similar region in step S15. When it is assumed that a region in the image 3 is selected in step S14, the similar region searching module 25 performs the operation of reducing the number of the similar regions on the remaining images 4 to 7. A criterion for selecting the similar regions is described below. For example, when the position of the candidate of the similar region selected by the user in its image corresponds to the position of the selection region 9 in the image 2, the similar region searching module 25 detects, from among the candidates of the similar regions in the remaining images, candidates whose positional relationships to the corresponding images are similar to the positional relationship between the selection region 9 and the image 2 as the similar regions. On the other hand, when the candidate of the similar region selected by the user has pixel values similar to the pixel values of the image in the selection region 9, the similar region searching module 25 detects, from among the candidates of the similar regions in the remaining images, candidates which have pixel values of high degrees of similarity to the pixel values of the image in the selection region 9 as the similar regions.
In
On the other hand, it is assumed that the user selects the region 10-2 in the image 3 as a similar region. The similar region searching module 25 changes a display color of the selected region 10-2, for example. The position of the region 10-2 relative to the image 3 is not the most similar to the position of the selection region 9 relative to the image 2. Therefore, the similar region searching module 25 detects, from among the candidates of the similar regions in the images 4 to 7, regions which have pixel values of high degrees of similarity to the pixel values of the image in the selection region 9 as the similar regions. In
Note that the user may manually select the similar regions in the images as needed. Furthermore, the user may newly specify a region other than the candidates of the similar regions.
The operation of reducing the number of the candidates performed in step S15 will be described in detail hereinafter.
In
Hereinafter, processes executed using the similar region searching module 25 will be described. The similar region searching module 25 obtains a region information item corresponding to a region in the image P selected by the user in step S51. Here, it is assumed that the region C1 in the image P is selected by the user. Subsequently, the similar region searching module 25 obtains a region information item corresponding to one of regions in the image P which have not been selected by the user in step S52. Here, it is assumed that the region C2 in the image P is determined to be one of the regions in the image P which have not been selected by the user and selected by the similar region searching module 25.
Then, the similar region searching module 25 calculates a positional displacement E1 between a relative position of the region C1 in the image P and a relative position of the selection region 9 in the image 2 in step S53. For example, the similar region searching module 25 compares coordinate position information of the region C1 relative to the image P with coordinate position information of the selection region 9 relative to the selected image 2. Subsequently, the similar region searching module 25 calculates a positional displacement E2 between a relative position of the region C2, which is selected by the similar region searching module 25, in the image P and the relative position of the selection region 9 in the image 2 in step S54. For example, the similar region searching module 25 compares coordinate position information of the region C2 relative to the image P with the coordinate position information of the selection region 9 relative to the selected image 2. Then, the similar region searching module 25 specifies the image Q in step S55.
The similar region searching module 25 compares the displacement E1 between the relative position of the selection region 9 in the selected image 2 and the relative position of the region C1 in the image P with the displacement E2 between the relative position of the selection region 9 in the selected image 2 and the relative position of the region C2 in the image P in step S56. When it is determined that the displacement E1 is smaller than the displacement E2 in step S56, that is, the determination is affirmative in step S56, the relative position of the region C1 associated with the displacement E1 in the image P is determined to be similar to the relative position of the selection region 9 in the selected image 2. On the other hand, when the displacement E1 is not smaller than the displacement E2 in step S56, that is, the determination is negative in step S56, the region C2 associated with the displacement E2 is determined to have a high degree of similarity to the selection region 9. When the determination is affirmative in step S56, the similar region searching module 25 selects a region F1 in which a displacement between a relative position of the region F1 in the image Q and the relative position of the selection region 9 in the selected image 2 is minimum from the image Q in step S57.
On the other hand, when the determination is negative in step S56, the similar region searching module 25 selects a region F2 including an image which is the most similar to the image in the selection region 9 from the image Q in step S58. Then, the similar region searching module 25 determines whether all the images included in the image group 1 have been subjected to the above-described processing in step S59. When the determination is negative in step S59, one of the remaining images is set and the process from step S51 onward is performed on the one of the remaining images. On the other hand, when the determination is affirmative in step S59, the process of
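The decision made in steps S53 to S58 may be sketched as follows. It is assumed here that the displacement of a candidate is modeled as the Euclidean distance between its position and the relative position of the selection region 9, and that each candidate in the image Q carries a precomputed pixel distance to the selection region; all data structures and names are illustrative assumptions, not part of the embodiment.

```python
# Illustrative sketch of the candidate-reduction decision: if the region C1
# selected by the user in image P is positionally closer to the selection
# region 9 than the unselected region C2 (E1 < E2), the positional criterion
# is applied in image Q (region F1); otherwise the pixel-similarity
# criterion is applied (region F2).
import math

def displacement(pos, selection_pos):
    """Euclidean displacement between a region position and the selection."""
    return math.dist(pos, selection_pos)

def choose_in_image_q(selection_pos, c1_pos, c2_pos, q_candidates):
    """q_candidates: list of (position, pixel_distance_to_selection) pairs."""
    e1 = displacement(c1_pos, selection_pos)   # step S53: user-selected C1
    e2 = displacement(c2_pos, selection_pos)   # step S54: unselected C2
    if e1 < e2:
        # Step S57: position mattered to the user; pick the region F1 whose
        # relative position is closest to that of the selection region 9.
        return min(q_candidates, key=lambda c: displacement(c[0], selection_pos))
    # Step S58: pixel similarity mattered; pick the region F2 whose image
    # is the most similar (smallest pixel distance) to the selection region 9.
    return min(q_candidates, key=lambda c: c[1])

selection_pos = (5, 5)
q_candidates = [((5, 6), 30.0), ((9, 9), 2.0)]
print(choose_in_image_q(selection_pos, (5, 5), (8, 8), q_candidates))
# -> ((5, 6), 30.0): E1 < E2, so the positional criterion decides
```

In this sketch the user's single choice in one image thus determines which of the two similarity criteria is propagated to all remaining images.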
As described above, the similar region searching module 25 reduces the number of the candidates of the similar regions so as to determine the similar regions. Thereafter, the feature value extracting module 27 obtains feature values of the selection region 9 and the similar regions. The feature value display module 29 displays histograms on the basis of the obtained feature values.
Note that the region B0 may be arranged in the image P so as to relatively correspond to the position of the selection region 9 in the selected image 2, and may also have the pixel values most similar to the pixel values of the selection region 9. In this case, for example, the region B0 may be displayed in the screen by changing a color of a frame surrounding the region B0. For example, a candidate of a similar region corresponding to the relative position of the selection region 9 in the selected image 2 is displayed surrounded by a frame of a first color, and a candidate of a similar region which is the most similar to the selection region 9 is displayed surrounded by a frame of a second color. A candidate of a similar region which is arranged so as to correspond to the relative position of the selection region 9 in the selected image 2 and which is the most similar to the selection region 9 is displayed with a frame of a third color. When the user selects the frame of the third color, the similar region searching module 25 displays a question asking the user which criterion is to be used for the operation of reducing the number of the candidates of the similar regions in the remaining images. The similar region searching module 25 allows the user to determine whether matching of relative positions between regions is employed as the criterion or similarity of the regions is employed as the criterion, for example. In accordance with information on the user's determination, the similar region searching module 25 performs the operation of reducing the number of the similar regions in the remaining images.
Next, an operation of calculating feature values in the regions, performed in step S06, will be described.
The feature value display module 29 displays the feature values corresponding to the selection region 9 and the similar regions in the image information items. For example, when the color histograms are used for the feature values, the feature value display module 29 displays color histograms for the selection region 9 and the similar regions in the images.
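A coarse color histogram of the kind the feature value display module 29 could use as a feature value may be sketched as follows (a hypothetical helper, assuming RGB pixel tuples; the text does not prescribe a binning):

```python
def color_histogram(region_pixels, bins=4):
    """Compute a coarse RGB color histogram for a region.

    region_pixels: iterable of (r, g, b) tuples with components in 0..255.
    Each channel is quantized into `bins` levels; the result is a
    flattened bins**3-dimensional vector (one count per color cell),
    which serves as the multidimensional feature value of the region.
    """
    step = 256 // bins
    hist = [0] * (bins ** 3)
    for r, g, b in region_pixels:
        i = min(r // step, bins - 1)  # quantized red level
        j = min(g // step, bins - 1)  # quantized green level
        k = min(b // step, bins - 1)  # quantized blue level
        hist[i * bins * bins + j * bins + k] += 1
    return hist
```

Each dimension of the resulting vector corresponds to one color cell, which is the sense in which the later correlation analysis treats "dimensions" as colors.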
Next, an operation of calculating correlation coefficients among the regions in the images will be described. The feature values of the images corresponding to the regions can be represented by numerical values by performing the processes up to step S06.
The image analyzing device 101 determines in step S07 whether analysis of the correlations between the image feature values and the association information items is to be performed. When the determination is affirmative in step S07, correlation analyzing processing is performed in step S08. The correlation coefficient calculation module 31 performs the processing below.
The feature values calculated using the feature value display module 29 are represented by multidimensional vectors whose dimensions correspond to colors. Therefore, the correlation coefficient calculation module 31 generates distribution diagrams representing the relationships between dimensional values and the association information items for the individual dimensions of the multidimensional vectors in step S31. The association information items are values representing performances of the products, for example; as the values representing performances of the products, “1” is assigned to nondefective products and “0” is assigned to defective products. It is assumed in this embodiment that as the absolute values of the correlation coefficients approach “1”, the relationships between the visual features of the images and the performance values are strong, whereas as the absolute values approach “0”, the relationships are weak. Therefore, when the image feature values change as the performance values increase, the correlation between the performance values and the image feature values is strong, that is, a high correlation is attained.
In step S32, the correlation coefficient calculation module 31 determines whether distribution diagrams for all the dimensions of the multidimensional vectors have been generated. When the determination is negative in step S32, the correlation coefficient calculation module 31 further performs the operation of generating a distribution diagram illustrating the relationship between a performance value and a feature value. On the other hand, when the determination is affirmative in step S32, the correlation coefficient calculation module 31 detects correlation coefficients from the distribution diagrams of the different dimensions in step S33. The correlation coefficient calculation module 31 performs the calculation represented by Equation 1 so as to obtain the correlation coefficients for the individual dimensions.
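Equation 1 itself is not reproduced in the text. Given the symbol definitions that follow (r, n, x<sub>i</sub>, y<sub>i</sub>, x<sub>a</sub>, y<sub>a</sub>), it matches the standard Pearson product-moment correlation coefficient, which may be reconstructed as:

```latex
r = \frac{\sum_{i=1}^{n} (x_i - x_a)(y_i - y_a)}
         {\sqrt{\sum_{i=1}^{n} (x_i - x_a)^2}\;\sqrt{\sum_{i=1}^{n} (y_i - y_a)^2}}
```

This form is consistent with the bounds −1 ≤ r ≤ 1 and the averages x<sub>a</sub> and y<sub>a</sub> described below.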
In Equation 1, “r” denotes a correlation coefficient and is not less than “−1” and not greater than “1”, “n” denotes the number of samples of images, “xi” denotes an image feature value of an i-th sample, “yi” denotes a performance value of the i-th sample, “xa” denotes an average value of the image feature values of all the samples, and “ya” denotes an average value of the performance values of all the samples.
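As a concrete sketch (the function names are illustrative, not from the text), the per-dimension calculation of Equation 1 and the subsequent selection of the dimension with the maximum correlation in step S34 can be written as:

```python
import math

def correlation_coefficient(x, y):
    """Pearson correlation coefficient r of Equation 1, with -1 <= r <= 1.

    x: image feature values x_i of the n samples (one dimension).
    y: performance values y_i of the same n samples.
    """
    n = len(x)
    xa = sum(x) / n  # average image feature value of all samples
    ya = sum(y) / n  # average performance value of all samples
    num = sum((xi - xa) * (yi - ya) for xi, yi in zip(x, y))
    den = (math.sqrt(sum((xi - xa) ** 2 for xi in x))
           * math.sqrt(sum((yi - ya) ** 2 for yi in y)))
    return num / den

def best_dimension(r_by_dim):
    """Index of the dimension (color) whose |r| is maximal (step S34)."""
    return max(range(len(r_by_dim)), key=lambda d: abs(r_by_dim[d]))
```

With performance values such as 1 for nondefective and 0 for defective products, a dimension whose feature value separates the two classes yields |r| close to 1.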
The correlation coefficient comparing module 33 detects a dimension having the maximum correlation coefficient from among the correlation coefficients for the individual dimensions (colors) of the multidimensional vectors in step S34. The correlation coefficient comparing module 33 is thereby capable of obtaining a dimension (color) having the maximum correlation coefficient with respect to a performance value (a nondefective product or a defective product).
Here, a case where the user selects, from among the selection region 9 and the similar regions, a region whose feature value is to be displayed will be described.
The user selects a certain image from the image group 1 displayed in the screen. The user selects the certain image by inputting information used to specify the image to be selected through the input unit 103, for example. In this embodiment, the user selects the image 2 from the image group 1. The image 2 is referred to as a selected image hereinafter. The image selection module 22 obtains a selection image information item corresponding to the selected image 2 from the image database 21 in step S41. Then, the region specifying module 24 obtains a selection region information item of a region selected from the selected image 2 in step S42. The user determines that the region 2-4 surrounded by the region 2-3 is small in the images corresponding to the nondefective products whereas the region 2-4 surrounded by the region 2-3 is large in the images corresponding to the defective products (refer to
Here, another example of the processing of calculating the region B0 performed in step S22 will be described. The images in the image group 1 are preferably images which are easy to compare with one another. However, even if the images are obtained as results of simulations, pixels of the images may be displaced. Furthermore, even if the images are obtained under an identical photographing condition, differences between positions of a camera relative to the products or differences between inclinations of the camera relative to the products may arise. Therefore, portions of the products corresponding to specific pixels in the corresponding images are not necessarily located in the same position in the image group 1. Accordingly, in the other images, the positions corresponding to an object of interest may be displaced from the coordinate of the selection region 9 relative to the selected image 2. Therefore, the similar region searching module 25 may search, in the vicinity of the regions in the other images which are located so as to relatively correspond to the coordinate position of the selection region 9, for regions which include objects the most similar to the object of interest included in the selection region 9. For example, the similar region searching module 25 shifts the regions by several peripheral pixels and detects whether a region to be a candidate of a similar region which has the smallest distance to the image in the selection region 9 exists. When the determination is affirmative, the similar region searching module 25 sets the region having the smallest distance as the region B0. Note that a value of the range of the several peripheral pixels is smaller than the constant “T”, which is a value used to determine a range in which the operation of searching for the regions to be the candidates of the similar regions is performed.
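The neighborhood search described above amounts to small-displacement template matching. A minimal sketch, assuming grayscale images as 2D lists and sum-of-absolute-differences as the distance (the text does not fix a distance measure), could look like this:

```python
def refine_region(image, template, x0, y0, search_radius=2):
    """Search around (x0, y0) for the placement best matching `template`.

    image: 2D list of pixel values (image[y][x]).
    template: 2D list holding the pixels of the selection region 9.
    (x0, y0): top-left coordinate in `image` that relatively corresponds
    to the position of the selection region 9 in the selected image 2.
    search_radius: how many peripheral pixels to shift in each direction;
    assumed smaller than the constant "T" bounding the candidate search.
    Returns the (x, y) whose window has the smallest sum of absolute
    differences to the template, i.e. the refined region B0.
    """
    h, w = len(template), len(template[0])
    best_xy, best_dist = (x0, y0), float("inf")
    for dy in range(-search_radius, search_radius + 1):
        for dx in range(-search_radius, search_radius + 1):
            x, y = x0 + dx, y0 + dy
            if y < 0 or x < 0 or y + h > len(image) or x + w > len(image[0]):
                continue  # window would fall outside the image
            dist = sum(abs(image[y + r][x + c] - template[r][c])
                       for r in range(h) for c in range(w))
            if dist < best_dist:
                best_dist, best_xy = dist, (x, y)
    return best_xy
```

The exhaustive scan over a small window is adequate here precisely because the expected displacements are only a few pixels.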
The image processing described above is applicable to the field of image mining, which assists in finding knowledge from an image group including many images. In this embodiment, images in a manufacturing field are taken as examples. However, the image processing of this embodiment may be applicable to a wide range of fields, such as searching, analyzing, and mining of multimedia information (images, video images, drawings, three-dimensional CAD (Computer Aided Design) data, and volume data), knowledge management, PLM (Product Lifecycle Management), CAE (Computer Aided Engineering), designing, manufacturing, marketing, and medical care.
Note that, as another embodiment, an operation of specifying regions in the images which may be associated with the performance values using differences between the performance values may be performed. For example, for each image of the products, the image analyzing device 101 obtains correlations between the performance value associated with the image in advance and color distributions of the regions in the image. The image analyzing device 101 searches the images for regions having correlation coefficients close to 1 or −1, and specifies regions in the images whose correlation coefficients relative to the performance values are close to 1 or −1.
Furthermore, as an application of this embodiment, when the correlations between the images and the performance values are obtained, performances corresponding to other images can be predicted in accordance with the obtained correlations. For example, it is assumed that the image analyzing device 101 obtains the correlation coefficients between the image features and the performance values in advance. Thereafter, when obtaining image information items, the image analyzing device 101 predicts the performance values from the image information items and the correlation coefficients. Accordingly, the user can predict performances of the products.
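One plausible way to realize such a prediction (an assumption; the text does not specify the predictor) is a least-squares linear fit on a well-correlated feature dimension, which can be sketched as:

```python
def fit_linear_predictor(features, performances):
    """Fit y ~ a*x + b by least squares on one feature dimension.

    features: image feature values x_i of the training samples.
    performances: the associated performance values y_i.
    Returns a function mapping a new image's feature value to a
    predicted performance value. This is only one illustrative
    choice of predictor built on the observed correlation.
    """
    n = len(features)
    xa = sum(features) / n
    ya = sum(performances) / n
    sxx = sum((x - xa) ** 2 for x in features)
    sxy = sum((x - xa) * (y - ya) for x, y in zip(features, performances))
    a = sxy / sxx          # slope of the fitted line
    b = ya - a * xa        # intercept of the fitted line
    return lambda x: a * x + b
```

A strong correlation (|r| near 1) makes such a fit meaningful; a weak one would make the predicted values unreliable.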
The embodiments can be implemented in computing hardware (computing apparatus) and/or software, such as (in a non-limiting example) any computer that can store, retrieve, process and/or output data and/or communicate with other computers. The results produced can be displayed on a display of the computing hardware. A program/software implementing the embodiments may be recorded on computer-readable media comprising computer-readable recording media. The program/software implementing the embodiments may also be transmitted over transmission communication media. Examples of the computer-readable recording media include a magnetic recording apparatus, an optical disk, a magneto-optical disk, and/or a semiconductor memory (for example, RAM, ROM, etc.). Examples of the magnetic recording apparatus include a hard disk device (HDD), a flexible disk (FD), and a magnetic tape (MT). Examples of the optical disk include a DVD (Digital Versatile Disc), a DVD-RAM, a CD-ROM (Compact Disc-Read Only Memory), and a CD-R (Recordable)/RW. An example of communication media includes a carrier-wave signal.
Further, according to an aspect of the embodiments, any combinations of the described features, functions and/or operations can be provided.
Priority claim: Japanese Patent Application No. 2007-333712, filed December 2007 (JP, national).