This utility patent application claims the benefit of Japanese Priority Patent Application JP 2012-0411192 filed in the Japan Patent Office on Feb. 28, 2012, which is incorporated herein by reference in its entirety.
The present disclosure relates to a terminal apparatus, an information processing apparatus, a display method, and a display control method.
In recent years, digital video cameras on which a large-capacity battery is mounted and which are easily carried due to their compact size have proliferated widely. Therefore, in general households, opportunities to use digital video cameras have sharply increased compared to the past. On the other hand, since the number of photographed and saved moving images is vast, it is very troublesome to search for desired moving images or scenes. Accordingly, technologies for suggesting relevance or the like between moving images to users and facilitating search have been developed.
For example, Japanese Unexamined Patent Application Publication No. 2009-141820, Japanese Unexamined Patent Application Publication No. 2009-151896, and Japanese Unexamined Patent Application Publication No. 2009-159514 disclose technologies for assisting users in understanding relevance between moving images to be reproduced when a plurality of photographed moving images are reproduced on one display. For example, Japanese Unexamined Patent Application Publication No. 2009-141820, Japanese Unexamined Patent Application Publication No. 2009-151896, and Japanese Unexamined Patent Application Publication No. 2009-159514 disclose configurations in which images generated based on a plurality of moving images are combined to generate a composite image and the composite image is displayed on a display unit.
When the technologies disclosed in Japanese Unexamined Patent Application Publication No. 2009-141820, Japanese Unexamined Patent Application Publication No. 2009-151896, and Japanese Unexamined Patent Application Publication No. 2009-159514 are applied, the relevance between the moving images can be understood. Therefore, even when the number of moving images is considerable, users can easily understand the contents of the moving images, for example, by guessing the contents of differently relevant moving images from the contents of a reproduced moving image. In the technologies disclosed in Japanese Unexamined Patent Application Publication No. 2009-141820, Japanese Unexamined Patent Application Publication No. 2009-151896, and Japanese Unexamined Patent Application Publication No. 2009-159514, however, an operation or the like of switching the criterion for evaluating the relevance midway is not assumed. That is, the relevance between the moving images detected based on a predetermined criterion is displayed, but the relevance between the moving images is not presented seamlessly from various viewpoints.
It is desirable to provide a novel and improved terminal apparatus, a novel and improved information processing apparatus, a novel and improved display method, and a novel and improved display control method capable of realizing a user interface more improved in convenience.
According to an embodiment of the present disclosure, there is provided a terminal apparatus including a display unit that displays information regarding a representative content of each cluster located in lower layers of a cluster to which a selected representative content belongs when information regarding the representative content of each cluster in a predetermined layer is displayed in accordance with a result of hierarchical clustering based on a first rule and the information regarding the representative content is selected. When a predetermined condition is satisfied, the information displayed on the display unit is changed to information regarding a representative content of each cluster extracted in accordance with a result of hierarchical clustering based on a second rule different from the first rule.
Further, according to another embodiment of the present disclosure, there is provided an information processing apparatus including a display control unit that causes information regarding a representative content of each cluster located in lower layers of a cluster to which a selected representative content belongs to be displayed when information regarding the representative content of each cluster in a predetermined layer is displayed in accordance with a result of hierarchical clustering based on a first rule and the information regarding the representative content is selected. When a predetermined condition is satisfied, the display control unit causes information regarding the representative content of each cluster to be displayed in accordance with a result of hierarchical clustering based on a second rule different from the first rule.
Further, according to another embodiment of the present disclosure, there is provided a display method including displaying information regarding a representative content of each cluster located in lower layers of a cluster to which a selected representative content belongs when information regarding the representative content of each cluster in a predetermined layer is displayed in accordance with a result of hierarchical clustering based on a first rule and the information regarding the representative content is selected, and changing display information to information regarding a representative content of each cluster extracted in accordance with a result of hierarchical clustering based on a second rule different from the first rule when a predetermined condition is satisfied.
Further, according to another embodiment of the present disclosure, there is provided a display control method including causing information regarding a representative content of each cluster located in lower layers of a cluster to which a selected representative content belongs to be displayed when information regarding the representative content of each cluster in a predetermined layer is displayed in accordance with a result of hierarchical clustering based on a first rule and the information regarding the representative content is selected, and causing information regarding the representative content of each cluster to be displayed in accordance with a result of hierarchical clustering based on a second rule different from the first rule when a predetermined condition is satisfied.
According to the embodiments of the present technology described above, it is possible to realize a user interface with improved convenience.
Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the appended drawings. Note that, in this specification and the appended drawings, structural elements that have substantially the same function and structure are denoted with the same reference numerals, and repeated explanation of these structural elements is omitted.
Hereinafter, the flow of description to be made below will be described in brief.
First, the functional configuration of an information processing apparatus 10 capable of realizing an image classification technology according to an embodiment will be described with reference to
Next, the functional configuration (example of a configuration in which a U/I is considered) of the information processing apparatus 10 capable of realizing the image classification technology according to the embodiment will be described with reference to
Next, a functional configuration of an information processing apparatus 20 capable of realizing an image classification technology in consideration of region division according to the embodiment will be described with reference to
Next, a functional configuration of an information processing apparatus 30 capable of realizing an overlook-image generation technology applied from the image classification technology according to the embodiment will be described with reference to
1. Preview
2. Details of Image Classification Technology
3. Details of Overlook-Image Generation Technology
4. Example of Hardware Configuration
5. Summarization
<1. Preview>
First, the overview of an image classification technology and the overview of an overlook-image generation technology according to the embodiment will be described. Further, the configuration of a system capable of realizing the image classification technology and the overlook-image generation technology according to the embodiment will be described in brief.
[1-1. Overview of Image Classification Technology]
The image classification technology according to the embodiment relates to a technology of clustering images or image regions based on an image feature amount. In the following description, a technology of extracting representative images of clusters and combining the extracted representative images to generate a composite image will also be introduced. Further, an example of the configuration of a user interface that prepares a plurality of combinations (hereinafter referred to as rules) of feature amount coordinate axes and extracts a desired image group by freely reorganizing the rules or layers, using results obtained by performing hierarchical clustering of images for each of the rules, will also be introduced.
[1-2. Overview of Overlook-Image Generation Technology]
On the other hand, the overlook-image generation technology according to this embodiment relates to a display control method for easily understanding the contents of an image group using the result of the hierarchical clustering based on an image feature amount. In the following description, for example, a method of realizing a user interface for extracting a desired image group by freely reorganizing the rules or layers, and a method of generating a composite image well suited for viewing the image group in an overlook manner, will be introduced. The overlook-image generation technology to be described here is an application of the image classification technology.
[1-3. System Configuration]
The image classification technology and the overlook-image generation technology according to the embodiment are able to be realized, for example, using a single computer or a plurality of computers connected to each other via a network. Further, the image classification technology and the overlook-image generation technology according to the embodiment are able to be realized using a system or the like in which a cloud computing system and an information terminal are combined. Furthermore, the image classification technology and the overlook-image generation technology according to the embodiment are able to be realized by a system that combines an apparatus performing the image clustering with a terminal apparatus that acquires and displays display data on which the image clustering result is reflected.
The overview of the image classification technology, the overview of the overlook-image generation technology, and the configuration of the system capable of realizing these technologies according to the embodiment have been described. Hereinafter, the image classification technology and the overlook-image generation technology according to the embodiment will sequentially be described in detail.
<2. Details of Image Classification Technology>
The image classification technology according to the embodiment will be described below.
[2-1. Exemplary Configuration #1 (Case in Which Region Division Is Not Considered)]
First, a configuration (exemplary configuration #1) in which an image feature amount is considered using one image as a unit will be described.
(2-1-1. General Configuration of Information Processing Apparatus 10 (
The general configuration of an information processing apparatus 10 capable of realizing the image classification technology according to the embodiment will be described with reference to
As shown in
When an image group to be clustered is input to the information processing apparatus 10, the input image group is input to the feature detection and classification unit 101 and the representative image extraction unit 102. The feature detection and classification unit 101 to which the image group is input detects an image feature amount from each of the images included in the input image group and clusters the images based on the detected image feature amount. A result (hereinafter, classification result) of the clustering performed by the feature detection and classification unit 101 is input to the representative image extraction unit 102.
The representative image extraction unit 102 to which the image group and the classification result are input extracts a representative image of each cluster based on the input image group and the classification result. The representative image of each cluster extracted by the representative image extraction unit 102 is input to the composite image generation unit 103. When the representative image of each cluster is input, the composite image generation unit 103 combines the input representative images to generate a composite image. The composite image generated by the composite image generation unit 103 is displayed on a display device (not shown). The display device may be a display mounted on the information processing apparatus 10 or may be a display of a terminal apparatus that can exchange information via a network or the like.
The general configuration of the information processing apparatus 10 has been described.
(2-1-2. Preprocessing When Applied to Moving Image (
Next, referring to
As shown in
Next, the information processing apparatus 10 determines whether the difference obtained in step S102 is less than or equal to a predetermined threshold value (S103). When the difference is less than or equal to the predetermined threshold value, the information processing apparatus 10 causes the process to proceed to step S105. Conversely, when the difference is not less than or equal to the predetermined threshold value, the information processing apparatus 10 causes the process to proceed to step S104.
When the process proceeds to step S104, the information processing apparatus 10 sets the interest frame image as the latest representative image (S104). Next, the information processing apparatus 10 determines whether the process ends for all of the frames (S105). When the process ends for all of the frames, the information processing apparatus 10 retains the image group extracted as the representative image and ends the preprocessing. Conversely, when the process does not end for all of the frames, the information processing apparatus 10 causes the process to proceed to step S106. When the process proceeds to step S106, the information processing apparatus 10 updates the interest frame to a subsequent frame (S106) and causes the process to proceed to step S101.
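Although the embodiment does not prescribe a particular implementation, the loop of steps S101 to S106 can be sketched as follows. This minimal Python sketch assumes grayscale frame arrays of equal size and uses the mean absolute pixel difference as the difference measure; both are illustrative assumptions, not details taken from the specification.

```python
import numpy as np

def extract_representative_frames(frames, threshold=10.0):
    """Representative-frame extraction along steps S101-S106. `frames` is
    assumed to be an iterable of equally sized grayscale arrays, and the
    mean absolute pixel difference is an assumed difference measure."""
    representatives = []
    latest = None
    for frame in frames:                          # S106: advance the interest frame
        frame = np.asarray(frame, dtype=float)
        if latest is None:                        # first frame starts the list
            latest = frame
            representatives.append(frame)
            continue
        diff = np.mean(np.abs(frame - latest))    # S102: obtain the difference
        if diff > threshold:                      # S103: difference exceeds threshold
            latest = frame                        # S104: set as latest representative
            representatives.append(frame)
    return representatives                        # retained image group (S105)
```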
The preprocessing performed when the image classification technology is applied to a moving image has been described.
(2-1-3. Detailed Configuration and Operation of Information Processing Apparatus 10 (
Next, the detailed configuration and operation of the information processing apparatus 10 will be described with reference to
First,
The detection result of the image feature amount obtained by the image feature detection unit 111 is able to be collected as a feature amount table shown in
For example, when a DC component feature amount is used as the image feature amount, the image feature detection unit 111 has a configuration shown in
The histogram calculation unit 121 converts the format of an image into a predetermined format to acquire a histogram. For example, when the histograms of brightness, saturation, and hue are acquired, the histogram calculation unit 121 converts an image into an HSV color space and calculates the histograms using the converted values. Further, when a histogram of contrast or the like is acquired, the histogram calculation unit 121 calculates the contrast value of N by M pixels near an interest pixel, considers the calculated contrast value as the contrast value of the interest pixel, and calculates the histogram. Information regarding the calculated histogram is input to the representative value calculation unit 122.
When the information regarding the histogram is input, the representative value calculation unit 122 calculates the representative value from the input histogram. For example, the representative value calculation unit 122 calculates a highest frequency value, an average value, a dispersion value, a standard deviation, or the like as the representative value. The representative value calculated by the representative value calculation unit 122 is output as the image feature amount of the input image.
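As a rough illustration of this pipeline (the histogram calculation unit 121 and the representative value calculation unit 122), the following sketch converts an 8-bit BGR image to HSV, builds per-channel histograms, and reduces each to representative values. The bin count and the particular set of statistics are assumptions, not values from the specification.

```python
import numpy as np
import cv2

def dc_component_feature(bgr_image):
    """Sketch of units 121-122: HSV conversion, hue/saturation/brightness
    histograms, and representative values per channel."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)   # predetermined format
    features = []
    for channel in range(3):                           # H, S, V in turn
        values = hsv[:, :, channel].ravel()
        hist, bin_edges = np.histogram(values, bins=32)
        mode = bin_edges[np.argmax(hist)]              # highest frequency value
        features.extend([mode, values.mean(), values.var(), values.std()])
    return np.array(features)                          # image feature amount
```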
When a texture component feature amount is used as the image feature amount, the image feature detection unit 111 has a configuration shown in
First, the input image passes through the lowpass filter 131 so that a high-frequency component is cut. The image in which the high-frequency component is cut by the lowpass filter 131 is input to the edge detection filter 132. For example, a Sobel, Prewitt, Roberts cross, Laplacian, or Canny filter is used as the edge detection filter 132. Edge information detected by the edge detection filter 132 is input to the histogram calculation unit 133.
For example, the histogram calculation unit 133 to which the edge information is input generates a histogram based on the edge information. Information regarding the histogram generated by the histogram calculation unit 133 is input to the representative value calculation unit 134. When the information regarding the histogram is input, the representative value calculation unit 134 calculates a representative value from the input histogram. For example, the representative value calculation unit 134 calculates a highest frequency value, an average value, a dispersion value, a standard deviation, or the like as the representative value. The representative value calculated by the representative value calculation unit 134 is output as the image feature amount of the input image.
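The texture pipeline (units 131 to 134) can be sketched in the same spirit. In this illustrative sketch, a Gaussian filter stands in for the lowpass filter 131 and a Sobel filter for the edge detection filter 132; the sigma and bin count are assumptions.

```python
import numpy as np
from scipy import ndimage

def texture_component_feature(gray_image):
    """Sketch of units 131-134: lowpass filtering, Sobel edge detection,
    edge histogram, and representative values."""
    smoothed = ndimage.gaussian_filter(gray_image.astype(float), sigma=2.0)  # unit 131
    gx = ndimage.sobel(smoothed, axis=1)               # unit 132: horizontal gradient
    gy = ndimage.sobel(smoothed, axis=0)               # unit 132: vertical gradient
    edge_strength = np.hypot(gx, gy)                   # edge information
    hist, bin_edges = np.histogram(edge_strength.ravel(), bins=32)  # unit 133
    mode = bin_edges[np.argmax(hist)]                  # highest frequency value
    return np.array([mode, edge_strength.mean(),       # unit 134: representative values
                     edge_strength.var(), edge_strength.std()])
```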
When a space frequency feature amount is used as the image feature amount, the image feature detection unit 111 has a configuration shown in
First, the input image is converted from a space region to a frequency region by the two-dimensional FFT unit 141. An output (hereinafter referred to as an FFT output) from the two-dimensional FFT unit 141 is input to the DC component extraction unit 142 and the AC component extraction unit 143. The DC component extraction unit 142 to which the FFT output is input extracts a DC component from the FFT output and outputs the extracted DC component as a DC component feature amount. Further, the AC component extraction unit 143 extracts an AC component from the FFT output and inputs the extracted AC component to the two-dimensional IFFT unit 144. The two-dimensional IFFT unit 144 converts the input AC component from the frequency region back into a space region and inputs the result as an IFFT output to the binarization unit 145. The binarization unit 145 binarizes the input IFFT output.
The output (hereinafter, a binarization output) of the binarization unit 145 is input to the straight component detection filter 146. When the binarization output is input, the straight component detection filter 146 detects a straight component from the binarization output. For example, a Hough transform or the like is able to be used as the straight component detection filter 146.
A straight line detected by the straight component detection filter 146 represents the characteristics of a space frequency band that considerably occupies the space frequency region. For example, the detection of a horizontal line indicates that a plurality of strong horizontal edges are present in the original image. That is, when the straight lines are detected, the edges that occupy a great number in the original image are obtained. Therefore, the detection result of the straight lines is able to be used as an image feature amount. Further, the detection result of the straight lines detected by the straight component detection filter 146 is output as an AC component feature amount.
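A minimal sketch of the space frequency pipeline (units 141 to 146) follows. The binarization threshold (the mean of the restored image) and the Hough parameters are assumptions; OpenCV's probabilistic Hough transform stands in for the straight component detection filter 146.

```python
import numpy as np
import cv2

def space_frequency_features(gray_image):
    """Sketch of units 141-146: 2D FFT, DC/AC separation, inverse FFT of
    the AC component, binarization, and straight-line detection."""
    spectrum = np.fft.fft2(gray_image.astype(float))        # unit 141
    dc_feature = np.abs(spectrum[0, 0])                     # unit 142: DC component
    ac = spectrum.copy()
    ac[0, 0] = 0.0                                          # unit 143: drop the DC bin
    restored = np.abs(np.fft.ifft2(ac))                     # unit 144: back to space
    binary = (restored > restored.mean()).astype(np.uint8) * 255   # unit 145
    lines = cv2.HoughLinesP(binary, 1, np.pi / 180,         # unit 146
                            threshold=80, minLineLength=30, maxLineGap=5)
    ac_feature = 0 if lines is None else len(lines)         # count of dominant lines
    return dc_feature, ac_feature                           # DC and AC feature amounts
```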
When a face feature amount is used as the image feature amount, the image feature detection unit 111 has a configuration shown in
First, an image is input to the face recognition unit 151. When the image is input, the face recognition unit 151 recognizes a face included in the input image using any face recognition technology. The face recognition result obtained by the face recognition unit 151 is input to the face matching unit 152. When the face recognition result is input, the face matching unit 152 recognizes whose face is included in the image by verifying the input face recognition result against information regarding faces registered in the face database 153. For example, the faces of family members, friends, or the like are registered in the face database 153. In this case, when the faces of family members, friends, or the like are included in the image, the detection result is output as the image feature amount.
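For illustration only, the face feature pipeline (units 151 to 153) might look like the following sketch. An OpenCV Haar cascade stands in for "any face recognition technology", and the match_against() call on the database object is a hypothetical placeholder for the verification against registered faces.

```python
import cv2

def face_feature(bgr_image, face_database):
    """Sketch of units 151-153: detect faces, then verify each against a
    database of registered faces (family members, friends, etc.)."""
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + 'haarcascade_frontalface_default.xml')
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    detected = []
    for (x, y, w, h) in faces:
        # match_against() is a hypothetical placeholder for the face matching
        # unit 152 verifying the region against the face database 153
        person = face_database.match_against(gray[y:y + h, x:x + w])
        if person is not None:
            detected.append(person)
    return detected                               # used as the image feature amount
```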
When a scene recognition feature amount is used as the image feature amount, the image feature detection unit 111 has a configuration shown in
First, the input image is input to the color distribution determination unit 161, the edge detection filter 162, and the bandpass filter 166. The color distribution determination unit 161 determines a color distribution of the input image and inputs the determination result to the scene determination unit 165. The edge detection filter 162 detects edge information from the input image and inputs the detection result to the straight component detection filter 163. The straight component detection filter 163 detects straight lines from the input detection result of the edge information and inputs the detection result to the long straight line number counting unit 164.
The long straight line number counting unit 164 counts the number of long straight lines from the input detection result of the straight lines and inputs the count result to the scene determination unit 165. The component of the input image passing through the bandpass filter 166 is input to the histogram calculation unit 167. Further, when many natural objects are imaged, the output of the bandpass filter is estimated to be strong. The histogram calculation unit 167 generates a histogram of edge information of the input component and inputs information (hereinafter referred to as a BPH coefficient) regarding the generated histogram to the scene determination unit 165.
The scene determination unit 165 determines a scene based on the determination result of the color distribution determination unit 161, the output of the long straight line number counting unit 164, and the BPH coefficient. For example, the scene determination unit 165 determines a corresponding scene from a scene group of landscape, indoor, outdoor, urban, portrait, or the like. For example, since a scene image of urban or indoor is assumed to have many artificial objects, an output (the number of long lines) of the long straight line number counting unit 164 can be used as a determination value.
For example, the scene determination unit 165 determines an urban scene when many long straight lines are present, a landscape scene when the upper portion of an image has a high sky presence ratio and the lower portion of the image has a high green presence ratio, and a portrait scene when a skin color occupies a large area. The determination result of the scene determination unit 165 is output as a scene parameter.
Next, the configuration of the color distribution determination unit 161 will be described in more detail with reference to
As shown in
The image input to the color distribution determination unit 161 is first input to the HSV conversion unit 1611. The HSV conversion unit 1611 converts the input image to an expression of an HSV format. The image expressed in the HSV format by the HSV conversion unit 1611 is input to the azure determination unit 1612, the green determination unit 1615, and the skin color determination unit 1618. First, the azure determination unit 1612 determines whether an interest pixel is azure within the range of the hue H. When the interest pixel is azure, a flag value flag of “flag=1” is output. Conversely, when the interest pixel is not azure, a flag value flag of “flag=0” is output. The flag value flag output by the azure determination unit 1612 is input to the sky probability calculation unit 1613.
Based on the input flag value flag, the sky probability calculation unit 1613 calculates a probability “Ps(px)=(height−y)/height×flag” that an interest pixel px is sky. Here, “height” is the height of an image and y is the coordinate of the interest pixel in the height direction. The probability Ps(px) calculated by the sky probability calculation unit 1613 is input to the cumulating unit 1614. Likewise, while the interest pixel is moved in the entire image, the flag value flag and the probability Ps of each interest pixel are calculated in sequence and are input to the cumulating unit 1614. The cumulating unit 1614 cumulates the probabilities Ps calculated for all of the pixels of the image and outputs the cumulative value as a sky presence determination value.
Likewise, the green determination unit 1615 determines whether an interest pixel is green. When the interest pixel is green, the flag value flag of “flag=1” is output. Conversely, when the interest pixel is not green, the flag value flag of “flag=0” is output. The flag value flag output by the green determination unit 1615 is input to the green probability calculation unit 1616. Based on the input flag value flag, the green probability calculation unit 1616 calculates a probability “Pg(px)=y/height×flag” that the interest pixel px is green. The probability Pg(px) calculated by the green probability calculation unit 1616 is input to the cumulating unit 1617.
Likewise, while the interest pixel is moved in the entire image, the flag value flag and the probability Pg are calculated in sequence and are input to the cumulating unit 1617. The cumulating unit 1617 cumulates the probabilities Pg calculated for all of the pixels of the image and outputs the cumulative value as a green presence determination value.
The skin color determination unit 1618 determines whether the interest pixel is skin-colored. When the interest pixel is skin-colored, the flag value flag of “flag=1” is output. Conversely, when the interest pixel is not skin-colored, the flag value flag of “flag=0” is output. The flag value flag output by the skin color determination unit 1618 is input to the cumulating unit 1619. Further, while the interest pixel is moved in the entire image, the flag value flag is calculated in sequence and is input to the cumulating unit 1619. The cumulating unit 1619 cumulates the flag values calculated for all of the pixels of the image and outputs the cumulative value as a skin color presence determination value.
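The three determination paths of the color distribution determination unit 161 can be sketched together as follows, directly implementing Ps(px)=(height−y)/height×flag and Pg(px)=y/height×flag and the flag cumulation for the skin color. The hue ranges assumed for azure, green, and skin are illustrative (OpenCV's hue axis runs from 0 to 179).

```python
import numpy as np
import cv2

def color_distribution_determination(bgr_image):
    """Sketch of units 1611-1619: HSV conversion, per-pixel color flags,
    position-weighted probabilities, and cumulation over all pixels."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)        # HSV conversion unit 1611
    h = hsv[:, :, 0].astype(float)
    height = bgr_image.shape[0]
    y = np.arange(height, dtype=float).reshape(-1, 1)       # row coordinate of px

    azure_flag = ((h >= 90) & (h <= 130)).astype(float)     # unit 1612 (assumed range)
    sky_value = np.sum((height - y) / height * azure_flag)  # Ps(px) cumulated (1613-14)

    green_flag = ((h >= 35) & (h <= 85)).astype(float)      # unit 1615 (assumed range)
    green_value = np.sum(y / height * green_flag)           # Pg(px) cumulated (1616-17)

    skin_flag = ((h >= 0) & (h <= 25)).astype(float)        # unit 1618 (assumed range)
    skin_value = np.sum(skin_flag)                          # area only, unit 1619

    return sky_value, green_value, skin_value               # presence determination values
```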
As described above, the determination results (the sky presence determination value, the green presence determination value, and the skin color presence determination value) of the color distribution determination unit 161 are used to determine a scene. For example, when azure occupies the upper portion of an image and green occupies the lower portion of the image, the scene of the image is determined to be a “landscape” scene. To perform this determination, a determination reference indicating where the azure and green colors are mainly distributed is calculated. With regard to the skin color, since its relevance to the position in the image is low, the area itself of the skin color occupying the image is used as the skin color presence determination value.
The configuration of the color distribution determination unit 161 has been described.
Next, specific examples of scene determination conditions used for the scene determination unit 165 to determine a scene will be introduced with reference to
In
As described above, the skin color presence determination value, the sky presence determination value, the green presence determination value, the number of long straight lines, and the BPH coefficient are input to the scene determination unit 165. In the example of
Likewise, when the input values do not satisfy the determination conditions of the second priority, the scene determination unit 165 determines whether the input values satisfy the determination conditions of the third priority. When the input values do not satisfy the determination conditions of the third priority, the scene determination unit 165 determines whether the input values satisfy the determination conditions of the fourth priority. Thus, the scene determination unit 165 sequentially verifies the determination conditions in accordance with the priority and outputs the scene recognition result when the determination conditions are satisfied. In the example of
The specific example of the scene determination condition used for the scene determination unit 165 to determine a scene has been introduced. Of course, any scene determination condition other than the determination conditions of the introduced specific example can be set. Further, the scene determination unit 165 may calculate a ratio between each input value and each threshold value as a scene recognition probability and output the calculated scene recognition probability as the scene recognition result. In this case, the scene recognition probability may be converted into multivalue data and output.
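Since the determination conditions of the referenced figure are not reproduced here, the following sketch shows only the priority-ordered evaluation scheme itself; the thresholds and the rule set are invented placeholders, not conditions from the specification.

```python
def determine_scene(skin, sky, green, long_lines, bph):
    """Sketch of the scene determination unit 165: conditions are verified
    in priority order and the first satisfied one yields the scene."""
    rules = [                                    # (priority order, assumed values)
        (lambda: skin > 5000.0, 'portrait'),     # large skin color area
        (lambda: long_lines > 20, 'urban'),      # many long straight lines
        (lambda: sky > 8000.0 and green > 8000.0, 'landscape'),
        (lambda: bph > 1000.0, 'outdoor'),       # strong natural-object response
    ]
    for condition, scene in rules:
        if condition():
            return scene                         # scene recognition result
    return 'indoor'                              # default when no condition holds
```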
Next, the configuration and operation of the feature classification unit 112, which can classify the feature amounts based on hierarchical clustering (for example, the shortest distance method or Ward's method), will be described with reference to
As shown in
When the image feature amount is input to the feature classification unit 112, the hierarchical clustering processing unit 171 classifies the images into clusters in accordance with the hierarchical clustering based on the input image feature amount. In this case, a lineage tree of the clustering result is generated and input to the grouping unit 172. When the lineage tree of the clustering result is input, the grouping unit 172 groups cluster components based on the lineage tree of the clustering result. The grouping unit 172 outputs the grouping result as a classification result.
Hereinafter, the operation of the hierarchical clustering processing unit 171 will be described in more detail with reference to
As shown in
Next, the hierarchical clustering processing unit 171 updates the lineage tree by collecting the two clusters for which the inter-cluster distance is the minimum into one cluster (S113). Next, the hierarchical clustering processing unit 171 determines whether the clusters have been collected into one cluster (S114). When the clusters have been collected into one cluster, the hierarchical clustering processing unit 171 ends the hierarchical clustering process. Conversely, when the clusters have not been collected into one cluster, the hierarchical clustering processing unit 171 returns the process to step S111.
For example, as shown in
Referring to the inter-cluster distance matrix shown in the upper drawing of
The lineage tree created in the second hierarchical clustering process is expressed as in
Next, referring to the inter-cluster distance matrix shown in the lower drawing of
In the shown example, since the clusters are integrated into one cluster in the fourth hierarchical clustering process, the hierarchical clustering process ends. In this way, the hierarchical clustering process is performed. Information regarding the lineage tree indicating the result of the hierarchical clustering process is input to the grouping unit 172.
Next, the operation of the grouping unit 172 will be described in more detail with reference to
As shown in
When the process proceeds to step S124, the grouping unit 172 determines whether searching for all the divisions ends (S124). When searching for all the divisions ends, the grouping unit 172 causes the process to proceed to step S125. Conversely, when searching for all the divisions does not end, the grouping unit 172 causes the process to proceed to step S121. When the process proceeds to step S125, the grouping unit 172 searches for all of the unmarked divisions from the current division, registers leaf images (that is, images corresponding to the clusters hanging from those divisions) in the same cluster, and records the classification result (S125).
Next, the grouping unit 172 sets the subsequent division as the current division (S126). Next, the grouping unit 172 determines whether to end the searching of all the divisions (S127). When the searching of all the divisions ends, the grouping unit 172 outputs the grouping result as the classification result and ends the series of processes relevant to the grouping process. Conversely, when the searching of all the divisions does not end, the grouping unit 172 causes the process to proceed to step S125.
The contents of the above-described processes are expressed schematically in
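The merge loop of steps S111 to S114 and the subsequent grouping can be reproduced compactly with SciPy, where linkage() plays the role of the hierarchical clustering processing unit 171 (building the lineage tree) and fcluster() that of the grouping unit 172 (cutting the tree into groups). Ward's method, the cut distance, and the random feature table are illustrative choices.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
feature_table = rng.random((20, 4))        # rows: images, columns: feature amounts

lineage_tree = linkage(feature_table, method='ward')                  # S111-S114
classification = fcluster(lineage_tree, t=1.5, criterion='distance')  # grouping

for cluster_id in np.unique(classification):
    members = np.where(classification == cluster_id)[0]
    print(f'cluster {cluster_id}: images {members.tolist()}')
```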
In addition to the hierarchical clustering described above, an optimization clustering method such as the k-means method can be applied to the clustering process performed by the feature classification unit 112. In this case, as shown in
As shown in
Next, the optimization clustering processing unit 181 calculates the average position or the centroid of the classes and sets the position as the seed position of a new class (S134: see the lower drawing of
Specifically, as shown in
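A minimal k-means sketch corresponding to this flow follows; the centroid update matches step S134 stated above, while the initialization, convergence test (seeds no longer moving), and fixed random seed are illustrative assumptions.

```python
import numpy as np

def optimization_clustering(features, k, iterations=100, seed=0):
    """k-means sketch: random seed positions, assignment of each feature
    point to the nearest seed, and seed updates to the class centroids
    (S134) until the seeds settle."""
    rng = np.random.default_rng(seed)
    seeds = features[rng.choice(len(features), size=k, replace=False)]
    for _ in range(iterations):
        distances = np.linalg.norm(features[:, None, :] - seeds[None, :, :], axis=2)
        labels = distances.argmin(axis=1)             # nearest-seed classification
        new_seeds = np.array([features[labels == j].mean(axis=0)
                              if np.any(labels == j) else seeds[j]
                              for j in range(k)])     # centroid as new seed (S134)
        if np.allclose(new_seeds, seeds):             # seeds settled: converged
            break
        seeds = new_seeds
    return labels, seeds
```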
The configuration and operation of the feature classification unit 112 capable of classifying the feature amounts in accordance with the optimization clustering method have been described. To facilitate the description, a two-dimensional feature amount space has been exemplified. In practice, however, a multi-dimensional feature amount space is used.
The configuration and operation of the feature detection and classification unit 101 have been described.
Next, the configuration and operation of the representative image extraction unit 102 will be described. The representative image extraction unit 102 extracts a representative image of each cluster based on the clustering result (see
As shown in
The image group considered as a clustering target and the clustering result are input to the average value/centroid calculation unit 191 and the representative image determination unit 192. When receiving the image group and the clustering result, the average value/centroid calculation unit 191 first calculates the average value or the centroid of the feature amounts of the images of each cluster. The average value or the centroid of the feature amounts calculated by the average value/centroid calculation unit 191 is input to the representative image determination unit 192. When the average value or the centroid of the feature amounts is input, for example, the representative image determination unit 192 determines, as the representative image, an image having a feature amount closest to the input average value or the input centroid of the feature amounts and outputs information regarding the determined representative image.
Next, the operation of the average value/centroid calculation unit 191 will be described in more detail with reference to
As shown in
Next, the average value/centroid calculation unit 191 determines whether the process has ended for all the rows of the current feature amount table (S145). When the process has ended for the rows, the average value/centroid calculation unit 191 causes the process to proceed to step S146. Conversely, when the process has not ended for the rows, the average value/centroid calculation unit 191 causes the process to proceed to step S144. When the process proceeds to step S144, the average value/centroid calculation unit 191 updates the current row of the feature amount table (S144) and causes the process to proceed to step S142.
When the process proceeds to step S146, the average value/centroid calculation unit 191 calculates the average value and registers the average value as the average value of the feature amounts in the currently processed cluster (S146). Next, the average value/centroid calculation unit 191 determines whether the process ends for all of the clusters (S147). When the process ends for all of the clusters, the average value/centroid calculation unit 191 ends the series of processes relevant to the average value calculation process. Conversely, when the process does not end for all of the clusters, the average value/centroid calculation unit 191 causes the process to proceed to step S148. When the process proceeds to step S148, the average value/centroid calculation unit 191 updates the currently processed cluster (S148) and causes the process to proceed to step S141.
The operation of the average value/centroid calculation unit 191 has been described.
Next, the operation of the representative image determination unit 192 will be described in more detail with reference to
As shown in
Next, the representative image determination unit 192 determines whether the evaluation value calculated in step S154 is the minimum (S155). When the evaluation value is the minimum, the representative image determination unit 192 causes the process to proceed to step S156. Conversely, when the evaluation value is not the minimum, the representative image determination unit 192 causes the process to proceed to step S158. When the process proceeds to step S156, the representative image determination unit 192 updates the representative image of the cluster (S156). Next, the representative image determination unit 192 updates the current row of the feature amount table to the subsequent row (S157) and causes the process to proceed to step S151.
When the process proceeds from step S155 to step S158, the representative image determination unit 192 determines whether all of the rows of the feature amount table end (S158). When all of the rows of the feature amount table end, the representative image determination unit 192 ends the series of processes relevant to the representative image extraction process. Conversely, when all of the rows of the feature amount table do not end, the representative image determination unit 192 causes the process to proceed to step S157.
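Combining the operations of the average value/centroid calculation unit 191 and the representative image determination unit 192, the extraction can be sketched as follows. The Euclidean distance to the cluster average is used here as the evaluation value, which is one of the possibilities the text allows, not a prescribed choice.

```python
import numpy as np

def extract_representative_images(feature_table, classification):
    """Sketch of the representative image extraction unit 102: average the
    feature amounts of each cluster (unit 191, S141-S148), then pick the
    image whose feature amount is closest to that average (unit 192)."""
    representatives = {}
    for cluster_id in np.unique(classification):
        rows = np.where(classification == cluster_id)[0]
        average = feature_table[rows].mean(axis=0)               # cluster average
        evaluation = np.linalg.norm(feature_table[rows] - average, axis=1)
        representatives[cluster_id] = rows[evaluation.argmin()]  # minimum evaluation
    return representatives            # cluster id -> index of representative image
```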
The operation of the representative image determination unit 192 has been described.
The configuration and operation of the representative image extraction unit 102 have been described.
Next, the configuration and operation of the composite image generation unit 103 will be described. The composite image generation unit 103 combines the representative images extracted from the clusters to generate a composite image. As a method of generating the composite image, for example, a method of simply arraying the plurality of representative images and combining the representative images can be considered, as in
As shown in
Next, the composite image generation unit 103 combines a non-composite image having the side detected in step S163 and the composite image on the side of the non-composite image (S164). At this time, the composite image generation unit 103 expands, contracts, or rotates one image as necessary so that the composite boundaries are minimized, combining the sides of the images with minimal superimposition. Further, to make the boundaries natural, the composite image generation unit 103 performs an LPF process, a morphing process, or the like on the composite boundary portions. Thus, the composite boundaries do not stand out in the process.
Next, the composite image generation unit 103 determines whether the process ends for all of the representative images (S165). When the process ends for all of the representative images, the composite image generation unit 103 ends the series of processes relevant to the process of combining the representative image. Conversely, when the process does not end for all of the representative images, the composite image generation unit 103 causes the process to proceed to step S162 and performs the processes of steps S162 to S165 again. For example, as shown in
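As one way to realize the side detection of steps S162 and S163, the sketch below compares normalized histograms of thin strips along each side and returns the most analogous pair. The strip width, bin count, and use of grayscale histograms are assumptions.

```python
import numpy as np

def side_histogram(image, side, width=8, bins=16):
    """Normalized histogram of a strip along one side of a grayscale image;
    the strip width and bin count are illustrative assumptions."""
    strips = {'top': image[:width], 'bottom': image[-width:],
              'left': image[:, :width], 'right': image[:, -width:]}
    hist, _ = np.histogram(strips[side].ravel(), bins=bins, range=(0, 255))
    return hist / hist.sum()                        # normalized for comparison

def most_analogous_sides(composite, candidate):
    """Sketch of steps S162-S163: find the pair of sides whose feature
    amounts (side-strip histograms here) are most analogous."""
    best, best_score = None, np.inf
    for s1 in ('top', 'bottom', 'left', 'right'):
        for s2 in ('top', 'bottom', 'left', 'right'):
            score = np.abs(side_histogram(composite, s1)
                           - side_histogram(candidate, s2)).sum()
            if score < best_score:                  # smaller sum -> more analogous
                best, best_score = (s1, s2), score
    return best, best_score
```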
The configuration and operation of the composite image generation unit 103 have been described.
The detailed configuration and operation of the information processing apparatus 10 have been described.
(2-1-4. Configuration and Operation in Which U/I Is Considered (
Hereinafter, the configuration and operation of the information processing apparatus 10 configured so that a user can choose images to be clustered while viewing a composite image via a user interface (hereinafter referred to as a U/I) and can adjust various parameters will be described. In this case, the information processing apparatus 10 is modified to have a configuration shown in
As shown in
First,
The image group including the selected images is clustered. However, the configurations and operations of the feature detection and classification unit 101, the representative image extraction unit 102, and the composite image generation unit 103 are substantially the same as the configurations and operations in the information processing apparatus 10 shown in
The U/I providing unit 11 receives a user's input on the U/I screen on which the composite image is displayed, as shown in
When one cluster is selected, as described above, for example, an image group included in the selected cluster is extracted and the clustering process is performed again, as necessary, as in
In the above-described example, a U/I configuration has been shown in which the resolution is changed without changing the coordinate axes of the feature amount space. On the other hand, as shown in
First, when the user operates the U/I to select a kind of desired feature amount, a clustering process is performed using the coordinate axes corresponding to the selected kind of feature amount as the reference. Therefore, as shown in
The configuration and operation of the information processing apparatus 10 in which the U/I is considered have been described.
The configuration in which an image feature amount is considered using one image as a unit has been described.
[2-2. Exemplary Configuration #2 (Case in Which Region Division Is Considered)]
Next, a configuration (exemplary configuration #2) in which a plurality of divided regions are subjected to the clustering process will be described.
(2-2-1. General Configuration of Information Processing Apparatus 20 (FIG. 45))
The general configuration of an information processing apparatus 20 capable of realizing an image classification technology according to the embodiment will be described with reference to
As shown in
When an image group to be clustered is input to the information processing apparatus 20, the input image group is input to the image region division unit 201. When the image group is input to the image region division unit 201, the image region division unit 201 divides each image included in the input image group into a plurality of regions and generates a divided image group. The divided image group generated by the image region division unit 201 is input to the feature detection and classification unit 202 and the representative image extraction unit 203. When the divided image group is input, the feature detection and classification unit 202 detects an image feature amount from each of the divided images of the input divided image group and clusters the images based on the detected image feature amounts. The result (classification result) of the clustering performed by the feature detection and classification unit 202 is input to the representative image extraction unit 203.
When the divided image group and the classification result are input, the representative image extraction unit 203 extracts a representative image of each cluster based on the input divided image group and the classification result. The representative image of each cluster extracted by the representative image extraction unit 203 is input to the composite image generation unit 204. When the representative image of each cluster is input, the composite image generation unit 204 combines the input representative images to generate a composite image. The composite image generated by the composite image generation unit 204 is displayed on a display device (not shown). The display device may be a display mounted on the information processing apparatus 20 or may be a display of a terminal apparatus that can exchange information via a network or the like.
The general configuration of the information processing apparatus 20 has been described. The configuration of the information processing apparatus 20, other than the portions used to handle the divided images, is substantially the same as the configuration of the above-described information processing apparatus 10. As a region division method applicable to the embodiment, various methods can be considered. For example, a method of using an N-digitized image to be described below, a method to which the clustering method is applied, and a method to which graph theory is applied can be considered.
(2-2-2. Detailed Configuration and Operation of Information Processing Apparatus 20 (
Next, the detailed configuration and operation of the information processing apparatus 20 will be described with reference to
First,
The detection result of the image feature amount obtained by the image feature detection unit 211 is able to be collected as a feature amount table shown in
The configuration and operation of the feature detection and classification unit 202 have been described. Further, the divided images are clustered, but the feature detection and classification unit 202 can have substantially the same configuration as the above-described feature detection and classification unit 101.
Next,
When the image to be divided is input to the image region division unit 201, the input image is input to the N-digitized processing unit 221. The N-digitized processing unit 221 generates an N-digitized image by performing N-digitization on the input image. The N-digitized image generated by the N-digitized processing unit 221 is input to the region integration processing unit 222. When the N-digitized image is input, the region integration processing unit 222 integrates pixels considered as noise in the N-digitized image into other pixel values. As the integration method, for example, a method of using a maximum appearance pixel color filter to be described below is considered.
The image (hereinafter referred to as an N-digitized image) subjected to the integration process by the region integration processing unit 222 is input to the region division processing unit 223. The region division processing unit 223 performs region division such that the pixels having the same pixel value among the pixels of the input N-digitized image are included in the same region, and then outputs the divided image corresponding to each divided region.
Hereinafter, the flow of a region integration process performed by the image region division unit 201 will be described with reference to
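A compact sketch of the three units is shown below. A median filter stands in here for the region integration processing unit 222 (the text names a maximum appearance pixel color filter, described with the referenced figures), and connected-component labeling realizes the region division processing unit 223; N and the filter size are illustrative.

```python
import numpy as np
from scipy import ndimage

def divide_regions(gray_image, n_levels=4):
    """Sketch of the image region division unit 201 (units 221-223):
    N-digitization, noise integration, and connected-region division."""
    quantized = gray_image.astype(int) * n_levels // 256         # unit 221
    integrated = ndimage.median_filter(quantized, size=5)        # unit 222 (stand-in)
    divided_images = []
    for level in range(n_levels):                                # unit 223
        labels, count = ndimage.label(integrated == level)       # same-valued regions
        for region_id in range(1, count + 1):
            divided_images.append(labels == region_id)           # one region mask each
    return divided_images
```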
The configuration and operation of the image region division unit 201 have been described.
The detailed configuration and operation of the information processing apparatus 20 having the configuration in which the plurality of divided regions are subjected to the clustering process have been described.
<3. Details of Overlook-Image Generation Technology>
Hereinafter, an overlook-image generation technology according to the embodiment will be described in detail.
[3-1. Entire Configuration and Operation of Information Processing Apparatus 30 (
First, the general configuration of an information processing apparatus 30 capable of realizing the overlook-image generation technology according to the embodiment will be described with reference to
As shown in
When an image group to be clustered is input to the information processing apparatus 30, the input image group is input to the clustering unit 301. When the image group is input, the clustering unit 301 clusters the input image group.
As a clustering method applicable to the process of the clustering unit 301, for example, a Nearest Neighbor method, a k-means method, an EM algorithm, a neural network, or a support vector machine can be used. Examples of the available axes of the image feature amount include a color (RGB or the like), an edge (a size, a direction, or the like of the pixel value of a neighboring pixel), a texture (a total sum of difference values of adjacent pixels of a pixel within a given range), object information (a segmentation result, a size or position of a region, or the like), composition information (information regarding landscape, a structural object, or the like estimated from a positional relation between image feature amounts), and meta-information (a time, a place (GPS information or the like), a tag given by a user, or the like).
The clustering result obtained by the clustering unit 301 is input to the representative image determination unit 302. When the clustering result is input, the representative image determination unit 302 determines the representative image of each cluster based on the input clustering result. For example, the representative image determination unit 302 determines the representative image of each cluster using the image feature amount used in the clustering, the maximum value, the minimum value, or the centroid of an evaluation value, or the like. The representative image of each cluster determined by the representative image determination unit 302 is input to the image composition unit 303.
When the representative image of each cluster is input, the image composition unit 303 combines the input representative images of the clusters to generate a composite image. The composite image generated by the image composition unit 303 is input to the display control unit 304. When the composite image is input, the display control unit 304 displays the input composite image on the display unit 305. Further, when one representative image included in the composite image is selected via the operation input unit 306, the display control unit 304 acquires the composite image based on a clustering result one layer more detailed for the cluster corresponding to the representative image and displays the acquired composite image on the display unit 305.
When the representative image included in the composite image based on the most detailed clustering result is selected, the display control unit 304 changes the axes (rules) of the image feature amounts, acquires the composite image obtained from the clustering result based on the changed axes, and displays the acquired composite image on the display unit 305. Further, even when an operation of changing the axes is performed through the operation input unit 306, the display control unit 304 acquires the composite image obtained from the clustering result based on the changed axes and displays the acquired composite image on the display unit 305. The display control unit 304 may retain the clustering result based on various axes or the composite image.
The configuration of the information processing apparatus 30 has been described.
[3-2. Detailed Configuration and Operation of Information Processing Apparatus 30 (
Next, the detailed configuration and operation of the information processing apparatus 30 will be described with reference to
In
As a method of determining the representative image, for example, as in
The representative image determination unit 302 may be configured to determine a representative image in accordance with a method using segmentation. Here, the method of determining a representative image using the segmentation will be described with reference to
First,
Next, the representative image determination unit 302 searches for an analogous region in each of the divided regions of each image (S302: see
Next, the representative image determination unit 302 determines whether all of the regions which are the search candidates have been searched (S304). When all of the regions have been searched, the representative image determination unit 302 causes the process to proceed to step S305. Conversely, when not all of the regions have been searched, the representative image determination unit 302 causes the process to proceed to step S302. When the process proceeds to step S305, the representative image determination unit 302 performs the clustering based on the number of links, clustering hierarchically from the image with the largest number of links (S305). Next, the representative image determination unit 302 determines the representative image of each cluster (S306) and ends the series of processes relevant to the determination of the representative image. At this time, for example, the representative image determination unit 302 sets an image including the region most linked with other regions as the representative image.
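The link-counting flow of steps S302 to S306 can be sketched as follows, assuming the region division of step S301 has already produced per-region feature vectors; the distance measure and the link threshold are illustrative assumptions.

```python
import numpy as np

def representative_by_segmentation(region_features, link_threshold=0.5):
    """Link-counting sketch for steps S302-S306. region_features maps an
    image id to a list of feature vectors of its divided regions; regions
    of different images are linked when their feature distance falls below
    the (assumed) threshold."""
    link_counts = {img: [0] * len(regions)
                   for img, regions in region_features.items()}
    images = list(region_features)
    for i, img_a in enumerate(images):                 # search analogous regions
        for img_b in images[i + 1:]:                   # across pairs of images
            for a, fa in enumerate(region_features[img_a]):
                for b, fb in enumerate(region_features[img_b]):
                    if np.linalg.norm(np.asarray(fa) - np.asarray(fb)) < link_threshold:
                        link_counts[img_a][a] += 1     # count the link on both sides
                        link_counts[img_b][b] += 1
    # the image holding the most-linked region becomes the representative (S306)
    return max(images, key=lambda img: max(link_counts[img], default=0))
```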
The functions of the clustering unit 301 and the representative image determination unit 302 have been described.
(Movement Between Clustering Result)
Next, movement between succession type clustering results and movement between transition type clustering results will be described with reference to
As the movement between the clustering results, there are two kinds of movement methods. One of the movement methods is a method of performing movement between the clustering results by changing a resolution without changing the coordinate axes of a feature amount space. This method is referred to as a “succession type” method below, as shown in
As in
When the number of images of the selected cluster is less than or equal to a threshold value, the information processing apparatus 30 causes the process to proceed to step S316. Conversely, when the number of images of the selected cluster is not less than or equal to a threshold value, the information processing apparatus 30 causes the process to proceed to step S315. When the process proceeds to step S315, the information processing apparatus 30 performs the movement between the succession type clustering results (S315) and causes the process to proceed to step S312. On the other hand, when the process proceeds to step S316, the information processing apparatus 30 performs the movement between the transition type clustering results (S316) and causes the process to proceed to step S312. The movement between the clustering results may be performed using the prepared clustering results or the clustering results may be calculated every time.
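The decision between the two movement types (steps S315 and S316) reduces to a threshold test on the number of images in the selected cluster, as in the following sketch; the rule names and their rotation order are hypothetical details.

```python
def select_next_view(selected_cluster_images, current_rule, rules, threshold=10):
    """Hard-switching decision sketch: refine the resolution under the same
    rule (succession type, S315) while many images remain; switch to the
    next rule (transition type, S316) once the count drops to the threshold."""
    if len(selected_cluster_images) <= threshold:          # few images remain (S314)
        next_rule = rules[(rules.index(current_rule) + 1) % len(rules)]
        return ('transition', next_rule)                   # S316: change the axes
    return ('succession', current_rule)                    # S315: refine resolution

# Usage sketch: a 4-image cluster triggers the transition type.
print(select_next_view(['a.jpg', 'b.jpg', 'c.jpg', 'd.jpg'],
                       'color', ['color', 'texture', 'scene']))
```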
In the case of the method shown in
In the case of the soft switching, as in
When the number of images of the selected cluster is less than or equal to a threshold value, the information processing apparatus 30 causes the process to proceed to step S326. Conversely, when the number of images of the selected cluster is not less than or equal to a threshold value, the information processing apparatus 30 causes the process to proceed to step S325. When the process proceeds to step S325, the information processing apparatus 30 performs the movement between the transition type clustering results (S325) and causes the process to proceed to step S322. On the other hand, when the process proceeds to step S326, the information processing apparatus 30 acquires images from another clustering result, adds the acquired images to the original clustering result (S326), and causes the process to proceed to step S322. The movement between the clustering results may be performed using the prepared clustering results or the clustering results may be calculated every time.
The transition type or succession type may be selected through the U/I.
In the case of the hard switching, as in
When the information processing apparatus 30 receives a transition type command issued at the time of selecting the transition type, the information processing apparatus 30 determines that the transition type is selected and causes the process to proceed to step S337. Conversely, when the information processing apparatus 30 receives a succession type command issued at the time of selecting the succession type, the information processing apparatus 30 determines that the succession type is selected and causes the process to proceed to step S335. When the process proceeds to step S335, the information processing apparatus 30 determines whether the number of images of the selected cluster is less than or equal to a threshold value (S335).
When the number of images of the selected cluster is less than or equal to a threshold value, the information processing apparatus 30 causes the process to proceed to step S337. Conversely, when the number of images of the selected cluster is not less than or equal to a threshold value, the information processing apparatus 30 causes the process to proceed to step S336. When the process proceeds to step S336, the information processing apparatus 30 performs the movement between the succession type clustering results (S336) and causes the process to proceed to step S332. On the other hand, when the process proceeds to step S337, the information processing apparatus 30 performs the movement between the transition type clustering results (S337) and causes the process to proceed to step S332. The movement between the clustering results may be performed using the prepared clustering results or the clustering results may be calculated every time.
In the case of the soft switching, as in
When the information processing apparatus 30 receives a transition type command issued at the time of selecting the transition type, the information processing apparatus 30 determines that the transition type is selected and causes the process to proceed to step S347. Conversely, when the information processing apparatus 30 receives a succession type command issued at the time of selecting the succession type, the information processing apparatus 30 determines that the succession type is selected and causes the process to proceed to step S345. When the process proceeds to step S345, the information processing apparatus 30 determines whether the number of images of the selected cluster is less than or equal to a threshold value (S345).
When the number of images of the selected cluster is less than or equal to a threshold value, the information processing apparatus 30 causes the process to proceed to step S347. Conversely, when the number of images of the selected cluster exceeds the threshold value, the information processing apparatus 30 causes the process to proceed to step S346. When the process proceeds to step S346, the information processing apparatus 30 performs the movement between the succession type clustering results (S346) and causes the process to proceed to step S342. On the other hand, when the process proceeds to step S347, the information processing apparatus 30 acquires images from another clustering result, adds the acquired images to the original clustering result (S347), and causes the process to proceed to step S342. The movement between the clustering results may be performed using previously prepared clustering results, or the clustering results may be calculated each time.
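The two command-gated flows (steps S334 to S337 and S344 to S347) differ from the automatic flows only in that an explicit U/I command can force the transition type. A hypothetical dispatcher covering both the hard and the soft variants might look as follows; the names and return labels are illustrative assumptions, not the specification's code.

```python
def handle_selection(command, selected_images, threshold=10, soft=False):
    """command: 'transition' or 'succession', issued through the U/I.
    soft: False for hard switching, True for soft switching."""
    def transition_action():
        # Hard switching: move to the transition type clustering result.
        # Soft switching: acquire images from another clustering result
        # and add them to the original clustering result instead.
        return "acquire and add images" if soft else "transition movement"

    if command == "transition":
        return transition_action()        # steps S337 / S347
    if len(selected_images) <= threshold:
        return transition_action()        # too few images remain
    return "succession movement"          # steps S336 / S346
```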
As a transition type selecting operation, for example, an operation or the like of designating a region by a dragging operation or a pinch-out operation can be considered, as shown in
Further, as a succession type selecting operation, for example, an operation or the like of designating an image by a flicking operation, a tapping operation, a sliding operation, a dragging operation, or the like can be considered, as shown in
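For illustration only, the correspondence between these operations and the two commands could be encoded as a simple lookup; the (target, gesture) encoding and the gesture names are assumptions, not part of the disclosure.

```python
# Hypothetical mapping from U/I operations to type-selecting commands.
GESTURE_TO_COMMAND = {
    ("region", "drag"):      "transition",  # designating a region
    ("region", "pinch_out"): "transition",
    ("image", "flick"):      "succession",  # designating an image
    ("image", "tap"):        "succession",
    ("image", "slide"):      "succession",
    ("image", "drag"):       "succession",
}

def command_for(target, gesture):
    """Return the type-selecting command for a received operation."""
    return GESTURE_TO_COMMAND.get((target, gesture))
```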
As shown in
The movement between the succession type clustering results and the movement between the transition type clustering results have been described.
(Arraying Method and Composition Method when Combining)
Next, an arraying method and a composition method at the time of combining the representative images will be described with reference to
First,
The contents of the process can also be modified as in
The contents of the process can also be modified as in
The contents of the process can also be modified, as in
The method in
(1) The periphery of an object in each representative image is cut out by seam carving, graph cutting, or the like. (2) The objects are connected to each other. As a method of connecting the objects at this time, for example, the following can be considered: the pixels of the periphery (contour portion) of each object (the red dotted line portion) are divided into several contour regions (the regions into which the red dotted line is partitioned by the blue line portions), a normalized color histogram is computed for each contour region, and contour regions whose color histograms are close to each other are connected, for example, regions for which the total sum of the differences of each bin of the normalized histograms is small, regions for which the difference of the bin of the maximum frequency or the difference of the dispersion of the histograms is small, or regions for which the absolute difference between the contour regions is small (a sketch of this histogram matching is given after this enumeration).
(3) The objects are disposed. As a method of disposing the objects, for example, one method or a combination of a plurality of methods can be considered.
(4) As a method of displaying the composite image (an image in which the objects are connected), for example, various methods can be considered.
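As a minimal sketch of the histogram matching described in step (2) above, assuming RGB pixels and a normalized 8-bins-per-channel color histogram (the function names and bin count are illustrative choices, not the specification's):

```python
import numpy as np

def region_histogram(pixels, bins=8):
    """Normalized color histogram of one contour region.
    pixels: array of shape (N, 3) with RGB values in [0, 256)."""
    hist, _ = np.histogramdd(pixels, bins=(bins,) * 3, range=[(0, 256)] * 3)
    hist = hist.ravel()
    return hist / max(hist.sum(), 1.0)

def closest_regions(regions_a, regions_b):
    """Find the pair of contour regions (one per object) whose normalized
    histograms have the smallest total per-bin absolute difference, that
    is, the regions to connect in step (2)."""
    best, best_pair = float("inf"), None
    for i, ra in enumerate(regions_a):
        ha = region_histogram(ra)
        for j, rb in enumerate(regions_b):
            d = float(np.abs(ha - region_histogram(rb)).sum())
            if d < best:
                best, best_pair = d, (i, j)
    return best_pair, best
```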
The arraying method and the composition method at the time of combining the representative images have been described. The arraying method and the composition method described above are merely examples. As shown in
The overlook-image generation technology according to the embodiment has been described in detail.
<4. Example Hardware Configuration (FIG. 79)>
Functions of each constituent included in the information processing apparatuses 10, 20, and 30 described above can be realized by using, for example, the hardware configuration of the information processing apparatus shown in FIG. 79.
As shown in FIG. 79, this hardware mainly includes a CPU 902, a ROM 904, a RAM 906, a host bus 908, and a bridge 910. Furthermore, this hardware includes an external bus 912, an interface 914, an input unit 916, an output unit 918, a storage unit 920, a drive 922, a connection port 924, and a communication unit 926.
The CPU 902 functions as an arithmetic processing unit or a control unit, for example, and controls the entire operation or a part of the operation of each structural element based on various programs recorded on the ROM 904, the RAM 906, the storage unit 920, or a removable recording medium 928. The ROM 904 is a means for storing, for example, a program to be loaded into the CPU 902 or data or the like used in an arithmetic operation. The RAM 906 temporarily or perpetually stores, for example, a program to be loaded into the CPU 902 or various parameters or the like arbitrarily changed in execution of the program.
These structural elements are connected to each other by, for example, the host bus 908 capable of performing high-speed data transmission. For its part, the host bus 908 is connected through the bridge 910 to the external bus 912 whose data transmission speed is relatively low, for example. Furthermore, the input unit 916 is, for example, a mouse, a keyboard, a touch panel, a button, a switch, or a lever. Also, the input unit 916 may be a remote control that can transmit a control signal by using an infrared ray or other radio waves.
The output unit 918 is, for example, a display device such as a CRT, an LCD, a PDP, or an ELD, an audio output device such as a speaker or headphones, a printer, a mobile phone, or a facsimile that can visually or auditorily notify a user of acquired information. Moreover, the CRT is an abbreviation for Cathode Ray Tube. The LCD is an abbreviation for Liquid Crystal Display. The PDP is an abbreviation for Plasma Display Panel. Also, the ELD is an abbreviation for Electro-Luminescence Display.
The storage unit 920 is a device for storing various data. The storage unit 920 is, for example, a magnetic storage device such as an HDD, a semiconductor storage device, an optical storage device, or a magneto-optical storage device. The HDD is an abbreviation for Hard Disk Drive.
The drive 922 is a device that reads information recorded on the removable recording medium 928 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, or writes information to the removable recording medium 928. The removable recording medium 928 is, for example, a DVD medium, a Blu-ray medium, an HD-DVD medium, various types of semiconductor storage media, or the like. Of course, the removable recording medium 928 may be, for example, an electronic device or an IC card on which a non-contact IC chip is mounted. The IC is an abbreviation for Integrated Circuit.
The connection port 924 is a port for connecting an externally connected device 930, such as a USB port, an IEEE1394 port, a SCSI port, an RS-232C port, or an optical audio terminal. The externally connected device 930 is, for example, a printer, a mobile music player, a digital camera, a digital video camera, or an IC recorder. Moreover, the USB is an abbreviation for Universal Serial Bus. Also, the SCSI is an abbreviation for Small Computer System Interface.
The communication unit 926 is a communication device to be connected to a network 932, and is, for example, a communication card for a wired or wireless LAN, Bluetooth (registered trademark), or WUSB, an optical communication router, an ADSL router, or a modem for various types of communication. The network 932 connected to the communication unit 926 is configured from a wired or wirelessly connected network, and is the Internet, a home-use LAN, infrared communication, visible light communication, broadcasting, or satellite communication, for example. Moreover, the LAN is an abbreviation for Local Area Network. Also, the WUSB is an abbreviation for Wireless USB. Furthermore, the ADSL is an abbreviation for Asymmetric Digital Subscriber Line.
<5. Summary>
Finally, the technical spirit and essence according to the embodiment will be described in brief. The technical spirit and essence to be described below can be applied to various apparatuses such as PCs, cellular phones, portable game consoles, portable information terminals, information home appliances, car navigation systems, and photo frames.
The functional configuration of the above-described information processing apparatus can be expressed as follows. For example, a terminal apparatus described in (1) below can sequentially change the clustering results based on a first rule (a combination of feature amount coordinate axes) to more detailed contents (the contents of a lower layer) and can also change the display to the clustering results based on a second rule. Therefore, a user operating the terminal apparatus can seamlessly perform an operation of searching for information and an operation of acquiring information from another viewpoint.
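The behavior summarized here, drilling down under a first rule and then changing to a second rule, can be sketched with standard tools. The following fragment, using SciPy, is a toy illustration under assumed names: each rule is a combination of feature amount coordinate axes, and the representative of a cluster is taken, for this sketch only, to be the member closest to the cluster mean.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

def cluster_by_rule(features, axes, n_clusters):
    """Hierarchical clustering restricted to one rule's feature axes."""
    z = linkage(features[:, axes], method="ward")
    return fcluster(z, t=n_clusters, criterion="maxclust")

def representatives(features, labels):
    """One representative index per cluster: the member closest to the
    cluster mean (an assumed definition of 'representative')."""
    reps = {}
    for label in np.unique(labels):
        members = np.where(labels == label)[0]
        center = features[members].mean(axis=0)
        reps[int(label)] = int(members[np.argmin(
            np.linalg.norm(features[members] - center, axis=1))])
    return reps

features = np.random.rand(100, 6)                     # toy feature amounts
first_rule = cluster_by_rule(features, [0, 1, 2], 4)  # first rule's axes
second_rule = cluster_by_rule(features, [3, 4, 5], 4) # second rule's axes
print(representatives(features, first_rule))
```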
Additionally, the present technology may also be configured as below.
(1) A terminal apparatus including:
a display unit that displays information regarding a representative content of each cluster located in lower layers of a cluster to which a selected representative content belongs when information regarding the representative content of each cluster in a predetermined layer is displayed in accordance with a result of hierarchical clustering based on a first rule and the information regarding the representative content is selected,
wherein, when a predetermined condition is satisfied, the information displayed on the display unit is changed to information regarding a representative content of each cluster extracted in accordance with a result of hierarchical clustering based on a second rule different from the first rule.
(2) The terminal apparatus according to (1), wherein the predetermined condition includes selecting information regarding a representative content of each cluster located in a lowest layer.
(3) The terminal apparatus according to (1) or (2), wherein the predetermined condition includes an operation of changing a rule having been performed.
(4) The terminal apparatus according to any one of (1) to (3),
wherein the hierarchical clustering is performed on a set of image contents, and
wherein the display unit displays a composite image obtained by combining representative images of all of clusters extracted as information regarding a representative content of each cluster.
(5) The terminal apparatus according to (4), wherein the hierarchical clustering is performed using a structural line of an object included in each image content as a feature amount.
(6) The terminal apparatus according to (4) or (5), wherein the hierarchical clustering is performed using each division image obtained by dividing each image content into regions as a unit based on a feature amount of each division image.
(7) An information processing apparatus including:
a display control unit that causes information regarding a representative content of each cluster located in lower layers of a cluster to which a selected representative content belongs to be displayed when information regarding the representative content of each cluster in a predetermined layer is displayed in accordance with a result of hierarchical clustering based on a first rule and the information regarding the representative content is selected,
wherein, when a predetermined condition is satisfied, the display control unit causes information regarding the representative content of each cluster to be displayed in accordance with a result of hierarchical clustering based on a second rule different from the first rule.
(8) The information processing apparatus according to (7), wherein the predetermined condition includes selecting information regarding a representative content of each cluster located in a lowest layer.
(9) The information processing apparatus according to (7) or (8), wherein the predetermined condition includes performing an operation of changing a rule.
(10) The information processing apparatus according to any one of (7) to (9), further including:
a clustering unit that clusters a set of contents in accordance with the first or second rule and extracts a representative content of each cluster in each layer; and
an image composition unit that combines images,
wherein the clustering unit performs the hierarchical clustering on a set of image contents,
wherein the image composition unit generates a composite image by combining representative images of all of the clusters extracted as information regarding the representative content of each cluster, and
wherein the display control unit causes the composite image to be displayed.
(11) The information processing apparatus according to (10), wherein the clustering unit performs the hierarchical clustering using a structural line of an object included in each image content as a feature amount.
(12) The information processing apparatus according to (10) or (11), wherein the clustering unit performs the hierarchical clustering using each division image obtained by dividing each image content into regions as a unit based on a feature amount of each division image.
(13) A display method including:
displaying information regarding a representative content of each cluster located in lower layers of a cluster to which a selected representative content belongs when information regarding the representative content of each cluster in a predetermined layer is displayed in accordance with a result of hierarchical clustering based on a first rule and the information regarding the representative content is selected; and
changing display information to information regarding a representative content of each cluster extracted in accordance with a result of hierarchical clustering based on a second rule different from the first rule when a predetermined condition is satisfied.
(14) A display control method including:
causing information regarding a representative content of each cluster located in lower layers of a cluster to which a selected representative content belongs to be displayed when information regarding the representative content of each cluster in a predetermined layer is displayed in accordance with a result of hierarchical clustering based on a first rule and the information regarding the representative content is selected; and
causing information regarding the representative content of each cluster to be displayed in accordance with a result of hierarchical clustering based on a second rule different from the first rule when a predetermined condition is satisfied.
(15) A method of operating at least one computing device, the method comprising:
selecting between a first clustering process and a second clustering process based on whether at least one condition is satisfied;
if the first clustering process is selected, performing the first clustering process on a first plurality of images to obtain a first clustering result, wherein the first clustering process uses first coordinate axes that were used in a previous clustering process; and
if the second clustering process is selected, performing the second clustering process on a second plurality of images to obtain a second clustering result, wherein the second clustering process uses second coordinate axes different from the first coordinate axes used in the previous clustering process.
(16) The method of (15), further comprising, prior to selecting between the first clustering process and the second clustering process:
displaying a composite image comprising a plurality of representative images that each represent a respective plurality of images; and
receiving input selecting at least one of the plurality of representative images;
wherein the at least one condition is related to the received input.
(17) The method of (15) or (16), wherein the at least one condition comprises whether the number of images in the respective plurality of images for the selected at least one of the plurality of representative images is greater than a threshold.
(18) The method of (15) to (17), wherein the at least one condition comprises whether the number of selected representative images is greater than a threshold.
(19) The method of (15) to (18), wherein the at least one condition comprises whether the received input was a particular type of operation.
(20) The method of (19), wherein the particular type of operation is a dragging operation or a pinch-out operation.
(21) The method of (19), wherein the particular type of operation is selected from the group consisting of a flicking operation, a tapping operation, a sliding operation, and a dragging operation.
(22) The method of (19), wherein the particular type of operation is a prepared gesture.
(23) The method of (15), wherein the at least one condition comprises a user's explicit selection of which clustering process to use.
(24) The method of (15) or (23), wherein the first clustering result and the second clustering result are each used to generate a respective composite image.
(25) The method of (24), further comprising:
if the first clustering process is selected, displaying the respective composite image generated from the first clustering result; and
if the second clustering process is selected, displaying the respective composite image generated from the second clustering result.
(26) The method of (16), wherein:
the first clustering process comprises a clustering process used to generate the composite image;
the second clustering process comprises a clustering process different from the clustering process used to generate the composite image.
(27) The method of (16) or (26), wherein the composite image is a result of the previous clustering processing.
(28) The method of (16) or (26) or (27), wherein the first clustering process comprises performing a clustering process on the respective plurality of images associated with the selected at least one of the plurality of representative images using the first coordinate axes that were used in the previous clustering process and using a first resolution different from a resolution used in the previous clustering process.
(29) The method of (16) or (26) to (28), wherein the second clustering process comprises performing a clustering process on at least the respective plurality of images associated with the selected at least one of the plurality of representative images using the second coordinate axes different from the first coordinate axes used in the previous clustering process.
(30) The method of (29), wherein the second clustering process is performed on the respective plurality of images associated with the selected at least one of the plurality of representative images and at least one image associated with a representative image of the plurality of representative images that was not selected by the user.
(31) An apparatus comprising:
a display unit configured to display information;
a user input device for receiving input from a user;
a processing unit configured to implement a method, the method comprising:
displaying, on the display unit, a composite image comprising a plurality of representative images that each represent a respective plurality of images; and
receiving input, via the user input device, selecting at least one of the plurality of representative images;
wherein the at least one condition is related to the received input.
(33) At least one non-transitory computer-readable storage medium comprising computer-executable instructions that, when executed, perform a method, the method comprising:
selecting between a first clustering process and a second clustering process based on whether at least one condition is satisfied;
if the first clustering process is selected, performing the first clustering process on a first plurality of images to obtain a first clustering result, wherein the first clustering process uses first coordinate axes that were used in a previous clustering process; and
if the second clustering process is selected, performing the second clustering process on a second plurality of images to obtain a second clustering result, wherein the second clustering process uses second coordinate axes different from the first coordinate axes used in the previous clustering process.
(34) The at least one non-transitory computer-readable storage medium of (33), wherein the method further comprises, prior to selecting between the first clustering process and the second clustering process:
displaying a composite image comprising a plurality of representative images that each represent a respective plurality of images; and
receiving input selecting at least one of the plurality of representative images;
wherein the at least one condition is related to the received input.
It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.
Number | Date | Country | Kind
---|---|---|---
2012-041192 | Feb 2012 | JP | national